

I'm mildly surprised that music of any kind is what's getting uploaded to something called Deezer
ChatGPT's got what intelligence craves… it's got neurons
As I noted on the YouTube video, this is doubly heinous as a lot of CA community college instructors are "freeway flyers" - working at multiple campuses, sometimes almost 100 miles apart, just to cobble together a full-time work schedule for themselves. Online, self-paced, forum-based class formats were already becoming popular even before the pandemic, and I've been in such classes where the professor indicated that I was one of maybe 3 or 4 students who bothered to show up to in-person office hours. I have to wonder if that will end up being a hard requirement at some point. The bottom rung on the higher-education ladder is already the most vulnerable, and this just makes it worse.
I have to agree. There are already at least two notable and high-profile failure stories with consequences that are going to stick around for years.
And sadly more to come. The first story is likely to continue to get a hands-off treatment in most US media for a few more years yet, but the second one is almost certainly going to generate Tacoma Narrows Bridge-level legends of failure and necessary restructuring once professionals are back in command. The kind of thing that is put into college engineering textbooks as a dire warning of what not to do.
Of course, it's up to us to keep these failures in the public spotlight and framed appropriately. The appropriate question is not, "how did the AI fail?" The appropriate question is, "how did someone abusively misapply stochastic algorithms?"
Would you invest in commercial real estate, knowing there was a non-zero chance your tenants might come in one day to discover a thoroughly intoxicated JD Vance in a compromising position with the break-room furniture?
Mesa-optimization… that must be when you rail some crushed-up Adderall XRs, boof some modafinil for good measure, and spend the night making sure your kitchen table surface is perfectly flat with no defects abrasions deviations contusions…
couldn't help myself, there are seldom more perfect opportunities to use this one
I actually think it's part-and-parcel of Yarvin's personality. As much as he rails against "the Cathedral," PMCs, whatever, he himself is a perfect example of a pathological middle manager. Somebody who wants power without having to shoulder ultimate responsibility. He craves the childishly simplified social environment of a medieval-fantasy king's court, but he doesn't want to be the king himself. He wants to be (and has been, up until now) the scheming vizier who can run his manipulation games in the background, deciding who gets in front of the king but not having to take the heat if the king makes a bad decision. (And the "kings" he works for have made plenty of bad decisions, but consequences have only just begun to catch up.)
I suspect this newfound mainstream attention is far more uncomfortable than it is validating for him. Perhaps the NYT profile was a burst of exhilaration, but the shine has worn off quickly. This tracks with the story last year about him coming back to Urbit as a "wartime CEO." If Urbit is so damn important for building his ridiculous vision, why wasn't he running it the whole time? He doesn't actually want to be CEO of anything. Power without responsibility.
He will never stop to reflect that his "philosophy," such as it is, is explicitly tailored for avaricious power-hungry narcissists, soooooo
Obvious joke is obvious, but
The essay brims with false dichotomies, logical inconsistencies, half-baked metaphors, and allusions to genocide. It careens from Romanian tractor factories to Harvard being turned "into dust. Into quarks" with the coherence of a meth-addled squirrel.
Harvard isn't already full of Quarks?
Another thread worth pulling is that biotechnology and synthetic biology have turned out to be substantially harder to master than anticipated, and it didn't seem like it was ever the primary area of expertise for a lot of these people anyway. I don't have a copy of any of Kurzweil's books at hand to look at his predicted timelines for that stuff, but they're surely way off.
Faulty assumptions about the biological equivalence of digital neural network algorithms have done a lot of unexamined heavy lifting in driving the current AI bubble and keeping the harder stuff on the fringes of the conversation. That said, I don't doubt that a few refugees from the bubble-burst will attempt to inflate the next bubble on the back of speculative biotech, and I've seen a couple of signs of that already.
For my money, 2015/16 Adams trying to sell Trump as a "master persuader" while also desperately pretending not to be an explicit Trump supporter himself was probably the most entertaining he's ever been. Once he switched from skimmable text blogging to livestreaming, though, he wanted to waste too much of my time to be interesting anymore.
"This Is What Yudkowsky Actually Believes" seems like a subtitle that would get heavy use in a future episode of South Park about Cartman dropping out after one semester at community college.
Yes, Kurzweil desperately trying to create some kind of a scientific argument, as well as people with university affiliations like Singer and MacAskill pushing EA, are what give this stuff institutional strength. Yudkowsky and LW are by no means less influential, but they're at best a student club that only aspires to be a proper curriculum. It's surely no coincidence that they're anchored in Berkeley, adjacent to the university's famous student-led DeCal program.
FWIW, my capsule summary of TPOT/"post-rationalists" is that they're people who thought that advanced degrees and/or adjacency to VC money would yield more remuneration and influence than they actually did. Equally burned out, just further along the same path.
I've been contemplating this, and I agree with most everyone else about leaning heavily into the cult angle and explaining it as a mutant hybrid between Scientology-style UFO religions and Christian dispensationalist Book of Revelation eschatology. The latter may be especially useful in explaining it to USians. My mom (who works in an SV-adjacent job) sent me this Vanity Fair article the other day about Garry Tan grifting his way into non-denominational prosperity gospel Christianity: https://www.vanityfair.com/news/story/christianity-was-borderline-illegal-in-silicon-valley-now-its-the-new-religion She was wondering if it was "just another fad for these people," and I had to explain that no, not really: their AI bullshit is so outlandish that some of them feel the need to pivot back toward something more mainstream to keep growing their following.
I also prefer to highlight Kurzweil's obsession with perpetual exponential growth curves as a central point. That's often what I start with when I'm explaining it all to somebody. It provides the foundation for the bullshit towers that Yudkowsky and friends have erected. And I also think that long-term, the historiography of this stuff will lean more heavily on Kurzweil as a source than Yudkowsky, because Kurzweil is better-organized and professionally published. It'll most likely be the main source in the lower-division undergraduate/AP high school history texts that cover this stuff as a background trend of the 2010s/2020s. Right now, we live in the peak days of the LessWrong bullshit volcano plume, but ultimately, it will probably be interpreted by the specialized upper-division texts that grow out of people's PhD theses.
Huh, 2 paradigm shifts is about what it takes to get my old Beetle up to freeway speed, maybe big Yud is onto something
It is what happened to look good in the valley between the Adderall comedown and yesterday evening's edible really starting to hit
Just had a video labeled "auto-dubbed" pop up in my YouTube feed for the first time. Not sure if it was chosen by the author or not. Too bad, it looks like a fascinating problem to see explained, but I don't think I'm going to trust an AI feature that I just saw for the first time to explain it. (And perhaps more crucially, I'm a bit afraid of what anime fans will have to say about this.)
And the photos from a previous event are an ocean of whiteness. Hard to argue that they're not, uh, cultivating a certain demographic…
I picked up a modern Fortran book from Manning out of curiosity, and hoo boy are they even worse in terms of trend-riding. Not only can you find all the AI content you can handle, there's a nice fat back catalog full of blockchain integration, smart-contract coding… I guess they can afford that if they expect the majority of their sales to be ebooks.