I think funding and repetition are the fundamental building blocks here, rather than the human psyche itself. I have talked with otherwise bright people who have read an article by some journalist (not necessarily a rationalist) who has interviewed AI researchers (probably cultists — was it 500 million USD that was pumped into the network?) and who takes AI doom seriously.
So you have two layers of people who are, in theory, paid to evaluate and formulate the truth in order to inform readers who don't know the subject matter. Then add repetition from various directions, and people become convinced that there is definitely something there (propaganda and commercials work the same way). Claiming that it's all nonsense and cultists appears not to have much effect.
There's probably some blurring of what "AI doom" means to people. Many might come away thinking "there could be negative effects due to widespread job loss, etc." without buying into the maximalist AI doom ideas or the "torturing simulated you forever" nonsense.
And the weirdo cultists probably use that blurring to build support for their cause without revealing the weird shit they actually believe.