Nothing More Powerful Than the Anti-Cultishness Cult
False accusations of cultishness reinforce more than they weaken 🌛
With the increasing mainstream awareness of powerful AI comes increased mainstream awareness of Eliezer Yudkowsky, figurehead of the rationalist community and long-time prophet of AI doom. Yudkowsky, or Eliezer, or Yud — he is now internet-famous enough to go by many names — is known to believe that we are almost certainly headed for a catastrophic outcome, unless we correct course and pour enormous energies into making sure that upcoming superintelligences are aligned with human values and desires. By default, he says, we all die. And nothing suggests that we’re in a non-default scenario.
I hope he is wrong. He hopes he is wrong. Everyone does, which is why good arguments against his position are extremely valuable.
So, collectively, how are we doing?
… I apologize in advance for answering in a snarky, strawmanish way, but so far the main responses to Eliezer’s position seem to be:
Eliezer wears a fedora and looks like a weird nerd
Eliezer is not actually an AI specialist, he’s just a “blogger” or “forum moderator” (in addition to being a weird nerd)
Eliezer is a weird polyamorous Bay Area nerd who should just focus on his polycule (I heard this one directly from Curtis Yarvin)
The rationalist community Eliezer brought into existence is just a cult for weird nerds
Now, to be clear, this is a joke — there certainly are people who engage with his ideas and come up with totally valid criticisms. But it’s only partly a joke. My sense is that in the absence of actually strong arguments to dismiss Eliezer’s position, people resort to ad hominem attacks and other fallacies.
The problem for them is, Eliezer and his “cult” have been training for about 15 years specifically to defend against such fallacies.
Cults are bad. Everyone agrees on that. It’s less easy to agree on which things, exactly, are cults.
Are Christianity and Islam cults? No, most would say; religions are not cults. But is there really a difference between a religion and a cult, other than being well established? What about new religious movements? Political movements? Secret societies? Discord servers? Friend groups?
Strictly defining something that has a lot of corner cases is a fool’s errand, and I won’t attempt it here. I’ll just note that today, “cult” is basically a generic pejorative word for communities. If you don’t like a particular group of people, just accuse them of being a cult, and let the generic negative connotations do the rest of the job. Since no one really knows what a cult is, no one can effectively defend their group against such accusations.
That doesn’t mean they won’t try. One way to defend against cultishness accusations is to dive into what exactly causes the negative connotations of “cult.” Why are cults bad? This is easier to agree on. Members of a cult are typically excessively controlled by a charismatic, manipulative leader. It’s hard to leave a cult. A cult is organized around deviant ideas that may be harmful to society. Cults convince their adherents to give them money.
If we take a look at the rationalist / Less Wrong / Yudkowskian community, what do we find? … Not much, to be honest. Eliezer Yudkowsky is a charismatic person in some weird nerd sense, but he’s not really leading anything. He did co-found an organization, MIRI, which accepts donations. It’s certainly possible to construct a story in which MIRI is likened to an apocalyptic cult that asks people for money so that they will be favored by the all-seeing God / Superintelligence once the Second Coming / intelligence explosion happens. But the fact that this story is so easy to construct should make us suspicious. The analogy is just a little too cute. When we look into it, we quickly find that there’s no religious notion of repentance or of favor in the world to come; MIRI just tries to avoid what it thinks is the default catastrophic outcome. And it’s pretty easy to not give any money to MIRI. I’m fairly concerned about AI risk, and I’ve never given them a cent.
Is it difficult to leave the “cult”? This is best answered by pointing out that there’s a whole community of people on Twitter who call themselves “post-rationalists.” For the most part, leaving is achieved simply by no longer reading the relevant blogs. Also, unlike the typical black-and-white thinking of cults, it’s very easy to be only somewhat part of the community. There’s basically nothing more banal than being “rationalist-adjacent” or “EA-adjacent.” I myself have been somewhere between all of these — rationalist, rationalist-adjacent, and post-rationalist — for about six years, and I’ve never felt any sort of pressure to conform or stay, at least not beyond the pressure that any friend group might exert to keep its members friends.
In fact, rationalism is unusually tolerant of people who do not conform. It literally teaches people to think for themselves. Maybe it doesn’t always do an amazing job at it, but there are far better ways to manipulate people into staying than teaching them about cognitive biases and that “what can be destroyed by truth should be.”
To be fair, there are occasional stories of people who have felt pressure from rationalist communities and suffered from it. Although I don’t doubt such stories, I can’t help but wonder to what extent they’re just particular instances of a general human failure mode. Just as we can harm ourselves with something as mundane as a table or a butter knife, which doesn’t make tables and butter knives inherently dangerous, so can we harm ourselves with something as mundane as any human community. Identifying too much with a group, having friends only in that group, and becoming emotionally dependent on the group are all risky behaviors. Cults use manipulation to make those behaviors more likely. But if you’re vulnerable, you can harm yourself by becoming too strongly linked with all sorts of non-cult groups: political parties, employers, toxic friends, even your own spouse. That doesn’t mean your marriage is a cult.
(I should also mention the subgroups of rationalists — like possibly Leverage Research — that act in more straightforwardly cultish ways. I don’t think those say much about the broader movement.)
As to whether the ideas in the rationality movement are harmful to society — maybe! I personally have strong reservations about some of the ideas that are fundamental to rationalism and its offspring Effective Altruism, like an excessive focus on optimization. And I agree it’s possible that believing in AI doom is a harmful infohazard. But at this point we’re back in the land of arguing about the ideas on their merits, which is not typically what accusations of cultishness seek to elicit.
Even though generic accusations of cultishness are usually quite dumb, they can still cause damage. In fact, hurling generic accusations involving bad concepts, like racism or Nazism, is a particularly effective tactic, because it forces people to defend themselves in awkward ways without really costing the attacker anything. Making a coherent argument isn’t even required.
Like I was saying, however, the rationalist community has been preparing itself for this for a long time. Consider the short essay “Every Cause Wants To Be A Cult,” written by Yudkowsky in December 2007. Long before most people had heard anything about AI risk, rationalists had grappled with accusations of cultishness and concluded that there wasn’t, in fact, much difference between being a human being with some relationships and being in a generically defined cult.
This makes recent accusations feel particularly… uninformed. Like, sure, you can accuse rationalists of being a cult, but be warned that it makes you sound like you’re 15 years out of the loop. When this is pointed out to accusers, they sometimes counter with “fuck this, I’m not going to go read hundreds of Less Wrong posts!”, which is perfectly sensible, but it also makes them sound like they’re 15 years out of the loop and actively avoiding entering the loop, at which point the reasonable response is probably to ignore them altogether.
This happens with many other attacks on rationalists, by the way. Attacks based on Eliezer or others not having a degree in whatever they’re talking about have been taken care of by much discussion of credentialism. Attacks based on “weird nerd aesthetics” should at least reference the rationalist idea of “weirdness points” if they want to be taken seriously. Even attacks based on them being too wordy have to contend with the fact that they’re perfectly aware of that and have held literal contests to make their ideas easier to digest!
Again, it’s perfectly possible to make valid critiques of any of these. But it requires some effort. If you don’t want to put in the effort, that’s fine, but then your accusation probably won’t achieve much.
Or rather, it will achieve one thing, except it’s probably not the thing you want.
A fun but counterintuitive fact is that haters are fans too. People who obsess about your flaws and problems pay way more attention to you than normal people do. They’re “fanboys with the sign switched,” as Paul Graham writes. And just like fans, their existence increases your status and power.
A devastating, well-argued critique of some rationalist tenet may very well spell trouble for the community. But an attack that has already been refuted several times in the past 15 years won’t do much damage, while it will give rationalist ideas more visibility and credibility.
Eliezer Yudkowsky is currently being attacked on Twitter basically every single day. And he takes the time to directly answer a lot of the criticism! My guess is that he understands the hater dynamic quite well. Each time he is attacked, he grows more powerful.
Accusations of cultishness work the same when they’re aimed at organizations that aren’t particularly bad in the ways real cults are. The accusations reinforce the cultural presence of the organization, its memetic power, its mystique — all without causing so much as a scratch on its reputation. Because people can easily see for themselves how non-cultish the movement is, the accusations fall flat.
Rationalists may therefore have accidentally found an optimal place to be in: they’re anti-cultish enough that they avoid the pitfalls of cults, while having enough of the superficial characteristics of cults to receive the benefits associated with such public perception.
They’re an anti-cultish cult, which is the most powerful type of organization known to man.
I think this articulates why I suspect that rationalists (and sister movements like effective altruists) are basically winning. They’re not fully mainstream yet, but they’re getting there, and there’s basically nothing so far that is able to get in their way. All genuine attempts to argue against them either fail or are absorbed by them; all attempts to destroy them on aesthetic grounds backfire.
I would really like it if someone could refute Eliezer’s ideas about AI doom for good. I really hope he’s wrong. But every time his detractors decide that coming up with good arguments is too hard, and that talking about his physical appearance or his cultishness is the way to go instead, I grow a bit more worried.
1) Ozy Brennan of Thing of Things has mentioned the “high-intensity community” as a good concept for what we mean by cults: monasteries, militaries, some EA orgs, small Leninist vanguard orgs, and central cult examples like Heaven’s Gate are all high-intensity. Notably, you can coherently claim that a particular HIC serves a valuable purpose (or doesn’t), while in either case recognizing the known ways it can go off the rails, and it’s relatively common for the same ideology or worldview to have both more and less demanding milieux.
2) Nitpick, but the post in postrationalism is much more like the post in postmodernism than the ex in ex-Mormon.
3) I don’t know that the pre-existence of rationalist responses to the cult objection is Bayesian evidence against rationalism being a cult (though I agree it isn’t one!), since actual cults frequently have standard responses to why they aren’t cults (see the sketch after this list).
4) Really, I think a lot of this is - zooming out from cults and rationalism - an instance of the more general phenomenon that good arguments against weird positions are hard to come by, since it’s so tempting for people to fall back on “that’s weird.” If you’re considering something weird, you have to be your own red team, since often no one else will!
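To make point 3 concrete, here is a rough likelihood-ratio sketch in odds form (my own illustration, not the commenter’s; the symbols C and R are labels I’m introducing, where C stands for “is a cult” and R for “has a standard response to the cult objection”):

$$
\underbrace{\frac{P(C \mid R)}{P(\neg C \mid R)}}_{\text{posterior odds}}
\;=\;
\underbrace{\frac{P(R \mid C)}{P(R \mid \neg C)}}_{\text{likelihood ratio}}
\times
\underbrace{\frac{P(C)}{P(\neg C)}}_{\text{prior odds}}
$$

If cults and non-cults produce such responses at roughly equal rates, the likelihood ratio is close to 1, and observing R barely shifts the odds in either direction, which is the commenter’s point.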
Cultic aspects are, to me, irrelevant, as are personality, credentials, etc. A cult can state true propositions, as can evil people. Yud's credentials are those of an autodidact with very little coding skill or sophisticated logical or mathematical fluency, yet those gaps don't make anything he says useless. He is, in my opinion, a public intellectual of the kind we often see here and abroad. A philosopher at heart (one of my favorite professions), currently specializing in the many intriguing questions posed by AI. More power to him! However, I do have a caveat: his grasp of AI may be shallower than that of the scientists who toil out of the public spotlight on the astronomical complexities of coding the LLMs and who work under the hood of the engine, so to speak. Yud may be in the publicity department of the AI alignment factory, but I put more trust in the folks down on the shop floor.