Effective altruism is “a research field and practical community that aims to find the best ways to help others, and put them into practice.”
Translation: it’s a nerdy charity club. It’s like regular charity, but with lots of math and philosophical analysis. And it’s an excuse to socialize with other nerdy people who are into those things. And it’s a home for quirky causes like averting the AI apocalypse and maximizing the happiness of all sentient beings (including shrimp and denizens of the distant future).
Okay, but is it bullshit?
If you want my answer, you’ll need a refresher on some of my previous posts. First, I’ve argued that happiness is bullshit. We say we want to be happy, but we don’t actually want that. Instead of pursuing happiness, we’re pursuing the sorts of ugly things that evolution made us want. We cannot admit we want these ugly things, so we say we want happiness instead. Because it sounds better.
Second, I’ve argued that our desire to make the world a better place is bullshit. Instead of selflessly trying to help everyone on the planet, we’re mainly trying to win virtue points (while showing we don’t care about virtue points), signal support for our political allies, morally one-up our rivals, fit in, dominate people, make others feel indebted to us, and prevent our status games from collapsing. We cannot admit we want these ugly things, so we say we want to make the world a better place instead. Because it sounds better.
Now what happens when you put these two feel-good statements together—namely that we want to be happy, and we want to make the world a better place?
You get: “We want to make the world a better place by making people happy.”
This is basically the slogan of the effective altruism (EA) movement. It’s a movement about pretending to want to give people what they pretend to want. Which means, yeah, it’s pretty bullshitty.
But wait wait wait! That doesn’t mean it’s bad! I’m a fan of EA. Seriously. I think it’s a brilliant way of getting past the shortcomings of human nature. EAs circumvent their ugly motives by embedding themselves in social networks that give status to the most effectively altruistic (and impressively brainy) people. This causes them to indirectly want to make the world a better place as a means to increasing their status within the EA community, or as a means to making the EA community look morally and intellectually superior to other communities.
And that’s a good thing! I don’t see a better way of getting humans to do actual good in the world, do you?
So I’m all about pretending to want things we don’t want. It’s better than nakedly pursuing the ugly things we actually want. Where I might think twice about the EA movement is the “maximizing happiness” part.
If nobody wants to be happy, then that is a strange goal to pretend to pursue. It would be like starting a charity to maximize the number of pinecones in the universe, even though nobody wants pinecones, and nobody wants to give people pinecones.
So maybe EAs should pretend to want to minimize suffering instead? No, we don’t really want that either. Suffering is good for us: it helps us learn, grow, and achieve our goals (in fact, it’s part of what it means to have a goal). If you want to understand human motivation, you must internalize the following sentence: We don’t want stuff in our heads; we want stuff in the world.
So why is EA so fixated on stuff in our heads? Why doesn’t it focus on stuff we actually want—you know, stuff in the world, like food, sex, safety, and status? Why is it increasingly obsessed with safeguarding our blissed-out intergalactic future, instead of just crunching the numbers on how to save people’s lives right now? Why has EA been drifting toward speculative longtermism, utopianism, and visions of the apocalypse?
My best guess is that glorious futures, or apocalyptic visions, are useful for bringing people together, and EA is ultimately a group of primates that need to bond and share hero myths to be culturally stable. Utopias and dystopias, in this life or the next, are common features of religions for a reason: they’re good at rallying the tribe.
Now, this is not a criticism. Tribes need to be rallied (or else they’ll disband), and status games need to be sacralized (or else they’ll collapse). As far as tribes/status games go, EA is one of the better ones in the world right now. It’s certainly better than politics. But I think EAs could use a better sacred narrative than the hedonistic, intergalactic one they’re currently spinning.
I’m not the best at spinning sacred narratives—I’m better at poking holes in them—but maybe EAs could spin a narrative about transcending the mediocrity of our species, averting the zero-sum tragedy of the human condition, or building better social and economic incentive structures that channel our selfishness, nepotism, and groupishness toward the greater good.
Then again, maybe I’m just trying to win the Opinion Game by getting EAs to think more like me. Maybe I’m not as interested in making the world a better place, or helping EAs make the world a better place, as I am in elevating my own status at their expense. Maybe I’m pretending to want to stop people from pretending to want to give people what they pretend to want. The bullshit never ends, does it?
The truth is, we cannot choose what we ultimately want—evolution has already chosen that for us. But there’s at least one thing we can choose: what to pretend to want. That is where our freedom resides, if there is such a thing. If we can pretend to want a better world, and reward each other for pretending more convincingly than our rivals, then maybe we can create a better world.
And maybe that’s the new branding effective altruism needs: “effective pretending.” It’s a more humble, cynical vibe than EA’s current branding of “doing good better [than you],” which seems to suggest that EAs are ethical superheroes and everyone else is a virtue-signaling idiot. Maybe “effective pretending” will be less of an insult to outsiders.
Or maybe that’s a terrible idea and I should keep my mouth shut and thank the lord EA has become as successful as it has.
Either way, there’s a lesson here. We’ve got to think more carefully about what we’re pretending to want and why. We’ve got to come clean about the status games we’re playing and pretending not to play, so we can design them more intelligently, and prevent them from drifting in culty directions.
Pretend wisely.