There’s got to be some kind of licensing clarity that can be actually legislated. This is just straight-up price gouging through obscurantism.
AI finally allowing grooming at scale is the kind of thing I’d expect to be the setup for a joke about Silicon Valley libertarians, not something that’s actually happening.
Computer scientists hate him: solve the halting problem by smashing all running computers with a sledgehammer.
Sure, we’ve been laying the groundwork for this for decades, but we wanted someone from our cult of personality to undermine democracy and replace it with explicit billionaire rule, not someone with his own cult of personality.
Reading the article explains the article, my dude.
I know next to nothing about C++ but I do know that I heard that closing line in the original voice and got goosebumps.
I’m pretty sure you could download a decent Markov chain generator onto a TI-89 and do basically the same thing with a more classroom-appropriate tool, but speaking as someone with dogshit handwriting, I’m so glad to have graduated before this was a concern. Godspeed, my friend.
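(TI-BASIC left as an exercise, but for anyone curious how little machinery a word-level Markov generator actually needs, here’s a rough Python sketch; the corpus string and function names are just placeholders, not anything from the thread.)

```python
import random
from collections import defaultdict

def build_chain(text, order=1):
    """Map each word (or tuple of words) to the list of words seen to follow it."""
    words = text.split()
    chain = defaultdict(list)
    for i in range(len(words) - order):
        key = tuple(words[i:i + order])
        chain[key].append(words[i + order])
    return chain

def generate(chain, length=30):
    """Walk the chain, picking a random observed successor at each step."""
    key = random.choice(list(chain.keys()))
    out = list(key)
    for _ in range(length):
        successors = chain.get(tuple(out[-len(key):]))
        if not successors:
            break
        out.append(random.choice(successors))
    return " ".join(out)

corpus = "the quick brown fox jumps over the lazy dog and the quick dog naps"
print(generate(build_chain(corpus)))
```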
There’s a whole lot of ontological confusion going on here, and I want to make sure I’m not going too far in the opposite direction. Information, in the mathematical Shannonian sense, refers specifically to identifying one outcome out of a set of possible values. In that sense, no underlying physical state could be said to hold “more” information than any other, right? Like, depending on the encoding, a given amount of information can use a different amount of space on a channel (TRUE vs T vs 1), but just changing which arrangement of bits is currently in use doesn’t increase or decrease the total amount of information in the channel.

I’m sure there’s some interesting physics to be done about our ability to meaningfully read or write to a given amount of space (something something quantum something something), but the idea of information somehow existing independently, rather than being projected onto the probability distribution of states in the underlying physical world, is basically trying to find the physical properties of the Platonic forms or the mass of the human soul.
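To make the encoding point concrete, here’s a toy Python sketch (my own example, nothing from the article): a fair binary choice carries exactly one bit no matter whether you spell the outcomes TRUE/FALSE, T/F, or 1/0.

```python
import math

def shannon_entropy(probabilities):
    """Entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair binary choice is exactly 1 bit, however we label the symbols.
for labels in (["TRUE", "FALSE"], ["T", "F"], ["1", "0"]):
    dist = {label: 0.5 for label in labels}
    print(labels, shannon_entropy(dist.values()))  # -> 1.0 each time
```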
No V0ldek, you are the small shell script. And then V0ldek was a zombie process.
Honestly I’m more surprised to learn that this derives from actual insights being misunderstood or misapplied rather than being whole-cloth bullshit. Although the Landauer principle seems kind of self-evident to me? Like, storing a bit of data depends more on the fact that an action was performed than on the actual state being manipulated, so of course whether we’re talking about voltages or magnets or whatever other mechanism is responsible for maintaining that state, the initial “write” requires some kind of action and therefore some expenditure of energy.
Then again I had never heard of the concept before today and I’m almost certainly getting way out of my depth and missing a lot of background.
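For anyone else who hadn’t run into Landauer before today: the usual statement is that erasing one bit costs at least k_B·T·ln 2 of energy, which is a vanishingly small number at room temperature. A quick back-of-the-envelope sketch (just the standard constants, nothing from the paper in question):

```python
import math

BOLTZMANN = 1.380649e-23  # J/K (exact value under the 2019 SI definition)

def landauer_limit(temperature_kelvin):
    """Minimum energy in joules to erase one bit at the given temperature."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

# At roughly room temperature (~300 K) the bound is on the order of 3e-21 J per bit.
print(landauer_limit(300))  # ≈ 2.87e-21 J
```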
Obviously mathematically comparing suffering is the wrong framework to apply here. I propose a return to Aristotelian virtue ethics. The best shrimp is a tasty one, the best man is a philosopher-king who agrees with everything I say, and the best EA never gets past drunkenly ranting at their fellow undergrads.
I mean, that kind of suggests that you could use ChatGPT to confabulate work for his class and he wouldn’t have room to complain? Not that I’d recommend testing that, because using ChatGPT in this way is not indicative of an internally consistent worldview informing those judgements.
You’re doing the lord’s simulation-author’s work, my friend.
Since the Middle Ages we’ve reduced God’s divine realm from the glorious kingdom of heaven to an office chair in front of a computer screen, rather than an office chair behind it.
Oh the author here is absolutely a piece of work.
Here’s an interview where he’s talking about the biblical support for all of this and the ancient Greek origins of blah blah blah.
I can’t definitively predict this guy’s career trajectory, but one of those cults where they have to wear togas is not out of the question.
You’re missing the most obvious implication, though. If it’s all simulated or there’s a Cartesian demon afflicting me then none of you have any moral weight. Even more importantly if we assume that the SH is true then it means I’m smarter than you because I thought of it first (neener neener).
This feels like quackery but I can’t find a goal…
But if they both hold up to scrutiny, this is perhaps the first time scientific evidence supporting this theory has been produced – as explored in my recent book.
There it is.
Edit: oh God it’s worse than I thought
The web design almost makes me nostalgic for GeoCities fan pages. The citations, which include himself ~10 times plus the greatest hits of the last 50 years of physics, biology, and computer science, and Baudrillard, of course. The journal of which this author is the lead editor, and which includes the phrase “information as the fifth state of matter” in its scope description.
Oh God, the deeper I dig the weirder it gets. I was trying to confirm whether the Information Physics Institute is legit at all and found their list of members, one of whom lists their relevant expertise as “Writer, Roleplayer, Singer, Actor, Gamer”. Another lists “Hyperspace and machine elves”. One, very honestly, simply says “N/A”.
The Gmail address also lends the whole thing an air of authority. Like, you’ve already paid for the domain, guys.
How sneerable is the entire “infodynamics” field? Because it seems like it should be pretty sneerable. The first referenced paper on the “second law of infodynamics” seems to indicate that information has some kind of concrete energy, which brings to mind that experiment where they tried to weigh someone as they died to identify the mass of the human soul.

It also feels like a gross misunderstanding to describe a physical system as gaining or losing information in the Shannon framework, since unless the total size of the possibility space is changing there’s no change in total information. Like, all strings of 100 characters carry the same amount of information even though only a very few of them actually mean anything in a given language. I’m not sure it makes sense to talk about the amount of information in a system increasing or decreasing naturally, outside of data loss in transmission? IDK, I’m way out of my depth here, but it smells like BS and the limited pool of citations doesn’t build confidence.
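(To put a number on the “same amount of information” bit: assuming a uniform distribution over a 26-letter alphabet, specifying any one 100-character string costs the same ~470 bits whether it’s English or gibberish. Quick sanity check:)

```python
import math

# Self-information of one outcome from a uniform distribution over all
# 26**100 equally likely 100-character strings: length * log2(alphabet size).
alphabet_size = 26
length = 100
bits = length * math.log2(alphabet_size)
print(f"{bits:.1f} bits")  # ≈ 470.0 bits, regardless of which string it is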
Anyone remember when Chrome had that issue with validating nested URL-encoded characters? Anyone for John%%80%80 Doe?
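(I don’t remember the exact Chrome behaviour, but for anyone who hasn’t played with nested percent-encoding, here’s roughly what it looks like in Python; urllib.parse.unquote only strips one layer per call.)

```python
from urllib.parse import unquote

# "%25" is the percent-encoded percent sign, so "%2580" is one layer of
# encoding wrapped around the literal string "%80".
nested = "John%2580%2580 Doe"
once = unquote(nested)   # -> "John%80%80 Doe"
twice = unquote(once)    # lone 0x80 bytes aren't valid UTF-8, so they come
                         # back as U+FFFD replacement characters by default
print(once)
print(twice)
```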
You know, that 30% figure is already enough to make it hard to express the value and power that the 1% control in terms of money - the numbers just don’t seem real. In practice they will never face a financial obstacle and can treat money (or their stuff, as valued in money) as worth whatever they want it to be at the time.
In that sense, the fact that Bitcoin valuations are basically made up by whales and exchanges is pretty easy to understand.