Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you'll near-instantly regret.
Any awful.systems sub may be subsneered in this subthread, techtakes or no.
If your sneer seems higher quality than you thought, feel free to cut'n'paste it into its own post; there's no quota for posting and the bar really isn't that high.
The post-Xitter web has spawned soo many "esoteric" right-wing freaks, but there's no appropriate sneer-space for them. I'm talking redscare-ish, reality-challenged "culture critics" who write about everything but understand nothing. I'm talking about reply-guys who make the same 6 tweets about the same 3 subjects. They're inescapable at this point, yet I don't see them mocked (as much as they should be).
Like, there was one dude a while back who insisted that women couldn't be surgeons because they didn't believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can't escape them, I would love to sneer at them.
(Credit and/or blame to David Gerard for starting this…)
:( looked in my old CS dept's discord, recruitment posts for the "Existential Risk Laboratory" running an intro fellowship for AI Safety.
Looks inside at materials, fkn Bostrom and Kelsey Piper and a whole slew of BS about alignment faking. Ofc the founder is an effective altruist getting a graduate degree in public policy.
that's CFAR cult jargon, right?
Not sure! What is CFAR?
Center For Applied Rationality. They hosted "workshops" where people could learn to be more rational. Except their methods weren't really tested. And pretty culty. And reaching the correct conclusions (on topics such as AI doom) was treated as proof of rationality.
Edit: still host, present tense. I had misremembered some news of some other rationality-adjacent institution as them shutting down; nope, they are still going strong, offering regular 4-day ~~brainwashing sessions~~ workshops.
Mesa-optimization? I'm not sure who in the lesswrong sphere coined it… but yeah, it's one of their "technical" terms that don't actually have academic publishing behind it, so jargon.
Instrumental convergence… I think Bostrom coined that one?
The AI alignment forum has a claimed origin here; is anyone on the article here from CFAR?
Why use the perfectly fine "inner optimizer" mentioned in the references when you can just ask google translate to give you the clunkiest, most pedestrian, and also wrong-part-of-speech Greek term to use in place of "in" instead?
Also natural selection is totally like gradient descent brah, even though evolutionary algorithms actually modeled after natural selection used to be their own subcategory of AI before the term just came to mean lying chatbot.
Mesa-optimization… that must be when you rail some crushed-up Adderall XRs, boof some modafinil for good measure, and spend the night making sure your kitchen table surface is perfectly flat with no defects, abrasions, deviations, contusions…
and you wrap it off with some linux 3d graphics lib hacking
Iām thinking they hired Jar-Jar Binks to the team.