A Big Misunderstanding

I spend a lot of time with intellectuals—writers, thinkers, social scientists, etc. If I had to sum up their worldview in one sentence, I could hardly do better than this one:
Everything that’s wrong in the world is caused by misunderstanding.
Political polarization? Misunderstanding. If only people could get over their primitive “tribalism” and “confirmation bias,” they could have reasonable discourse and work together to solve humanity’s problems.
Misinformation? Misunderstanding. If only people knew how to “vaccinate” themselves against the “virus” of fake news, they’d stop being such gullible idiots and vote for the Democrats.
Bigotry? Misunderstanding. If only people realized that members of other ethnic groups were normal, decent human beings like them, there would be no bigotry.
Stereotypes? Misunderstanding. If only people knew that stereotypes were false and pernicious, there would be no stereotypes—and no bigotry.
War? Misunderstanding. If only people knew that war is pointless and evil, a product of bigotry and misinformation, there would be world peace.
Capitalism? False consciousness. If only people knew how much greedy corporations were exploiting them, the workers of the world would unite.
Wikipedia’s list of 265 cognitive biases? 265 misunderstandings! If only people joined the rationality movement and memorized these biases in elementary school, humans would conquer the galaxy.
Ineffective altruism? Misunderstanding. If only people knew that slacktivism and virtue signaling accomplish nothing, they’d become utilitarians and donate their money to shrimp welfare or preventing the AI apocalypse.
Unhappiness? Misunderstanding. If only people learned some positive psychology, they’d stop comparing themselves to sexier people on Instagram and start meditating and gratitude journaling.
Ahh, it’s the perfect story. If all the world’s problems are caused by misunderstanding, then that makes intellectuals—the people whose job it is to understand things—the most important people ever. Just by doing what they’re doing, they’re saving the world.
Wow. Intellectuals. Saving the world. Pretty cool thing for intellectuals to believe.
There’s no misunderstanding
Stereotypes are savvy. Our beliefs about religious, ethnic, occupational, and geographic groups are pretty accurate. The accuracy of our stereotypes is one of the most robust and well-replicated findings in psychology. But this fact has been suppressed by psychologists, because they’re terrified of any information that might make them look insufficiently progressive. Also, they hate Republicans, which makes sense because...
Partisan hatred is not a whoopsie. You want to know why partisans hate each other? It’s not because they gave in to a dumb, primitive urge called “tribalism.” It’s not because they had a senior moment and forgot to check for disconfirming evidence of their propaganda. It’s because they’re locked in zero-sum competition over the coercive apparatus of the state—the thing that forcibly puts human beings in prison at gunpoint. The stakes are high. And what do we do in a high-stakes competition? We fight dirty. We demonize the competition. And we deny we’re doing this—and embellish how much the other side is doing it—because denial and embellishment are useful weapons to wield in the fight.
Bigotry is not a brain-fart. A lot of it is intertwined with competition over the coercive apparatus of the state, because ethnic minorities are accurately stereotyped as allies of the Democratic Party (1, 2, 3). So feeling threatened by ethnic minorities is related to feeling threatened by Democrats, in the same way that feeling threatened by Christian fundamentalists is related to feeling threatened by Republicans. As for the rest of bigotry, it probably comes from zero-sum competition over intergroup status. Such competition may be most acute among ethnic minorities’ closest rivals in the social hierarchy—i.e., low-status white people—which might explain why antiracism confers elite status. And it might also explain why antiracist elites resent “millionaires and billionaires”—i.e., their closest rivals in the hierarchy.
“Misinformation” is a moral panic. We have two choices. We can either define “misinformation” broadly as “misleading information,” in which case it encompasses nearly all the information we consume, including the propaganda disseminated by intellectuals allied with the Democratic Party. Or we can define the term narrowly as “fabricated information,” in which case it is neither new, nor pervasive, nor an existential threat (check out Dan Williams on this).
Humans are rational. Most cognitive “biases” aren’t really biases—they’re savvy strategies and heuristics (check out Lionel Page on this). As a matter of fact, the guy who co-discovered most of these “biases,” Daniel Kahneman, said in an interview that learning about them didn’t improve his behavior in any way. He was, tellingly, unmotivated to become more “rational.” Maybe some part of him knew that the biases he discovered were self-serving. Maybe he had an inkling that…
Confirmation bias helps us win arguments and justify our actions.
Overconfidence helps us make money, gain status, and convince people that we know what we’re doing even if we don’t.
Loss aversion (where we try to avoid losses more than pursue gains) helps us make decisions we can better justify to others in a Darwinian world where fitness losses loom larger than fitness gains.
The spotlight effect (where we overestimate how much others are judging us) is rational in a competitive social marketplace where it’s better to err on the side of caution than suffer the horrors of social exile.
The sunk cost “fallacy” (where we use sunk costs as a reason for persisting in an activity) is an honest signal of commitment and resolve in the aforementioned social marketplace—a way of saying: “I finish what I start.”
The endowment effect (where things seem more precious once we own them) is rational if we’re competing for resources and depriving you of the resource makes it more valuable for me.
The self-serving bias (where we attribute our successes to innate superiority and our failures to others conspiring against us) is, well, self-serving.
The bias bias (where we think we’re less biased than others) is… also self-serving.
Positive illusions (where we overestimate how bright our future is and how competent we are) are… you know this one.
The focusing illusion (where we overestimate how happy we’ll feel when we get what we want) is only a problem if we’re pursuing happiness. Which reminds me…
Happiness is bullshit. You want to know why you’re reading my depressing blog instead of meditating or gratitude journaling? Because you don’t actually want to be happy. The pursuit of happiness is just a story we tell ourselves. It’s a way to cover up the ugly things we’re actually pursuing, like status-enhancing opinions, moral superiority, high-status offspring, resources that others are deprived of, and control of the coercive apparatus of the state.
Altruism is effective. Evolutionary biologists have long known that animals do not evolve to care about the good of the species. Instead, they evolve to care about themselves, their families, and their allies. Of course, there is one kind of animal that can talk, called “humans.” Such humans might say they care about higher things like universal love or maximizing the welfare of sentient beings. But pretending to care is different from actually caring, and actions speak louder than words. If “effectiveness” means achieving our actual goals—like displaying our moral superiority or forging political alliances—then our altruism is very effective.
Stated motives vs. actual motives
A lot of intellectuals confuse our stated motives with our actual motives. They confuse our words with our deeds. It’s like mistaking Starbucks’ mission statement—“inspiring and nurturing the human spirit, one person, one cup, one neighborhood at a time”—with its goal of maximizing profit.
It’s easy to see how this could lead to the misunderstanding myth. If we judge ourselves according to our stated goals or “mission statements”—e.g., changing hearts and minds, making the world a better place—then yeah, we’re doing a bad job at those things. There’s been a big misunderstanding here.
But if we judge ourselves according to our actual goals—climbing social hierarchies, derogating rivals, dominating people under moralistic pretexts—then we look pretty rational. Because we are. Natural selection made us that way. Show me an animal that has succeeded in surviving and reproducing in a hostile environment for millions of years, and I will show you a rational animal.
The default assumption of every intellectual should be that the human mind is about as well-designed as the hawk’s eye, the bat’s sonar, or the cheetah’s sprint. Unfortunately, intellectuals do not make this assumption. Instead, they assume our species is broken, and they’ve been put on this earth to fix us.
It’s not much of an exaggeration to say that the majority of mainstream social science is engaged in an effort to collect misunderstandings, regardless of whether or not they exist. Social scientists aren’t so much interested in gaining insight into human nature as they are in dunking on the masses, correcting our “biases,” combating “unfounded stereotypes,” bridging divides, designing “interventions” to make us less stupid and bad, putting happy vibes inside our heads, and advancing their political agendas. The best compliment you can give a social scientist about their work is not “Wow, that’s really insightful,” but “Wow, that has major policy implications [i.e., supports the policies I already like].”
Don’t be so cynical
Yes, this is all very cynical, isn’t it? In our culture, cynicism is icky because everyone knows that cynics are meanies. There probably is a correlation between being cynical and being an asshole, and—savvy stereotypers that we are—we’ve picked up on this correlation. The result is that we spout feel-good, idealistic bullshit to signal that we’re sweeties, and it works.
Unfortunately, all this signaling puts us in a bind. If we can’t blame humanity’s problems on the cynical motives built into our brains by evolution, because that makes us look mean and icky, then what are we supposed to blame humanity’s problems on? Well, there’s a beautiful option available to us: misunderstanding.
You see, it’s not that we’re hierarchical, coalitional, self-deceiving primates—forged in the crucibles of Darwinian natural selection—don’t be so cynical! No, the problem is that other people are biased, ignorant, gullible, weak-willed, and misinformed. They don’t know what’s best for them. They need us to nudge them, raise their consciousness, purge them of misinformation, and teach them who their political enemies are—you know, the people who happen to be our closest rivals in the social hierarchy.
Saving the world, one person, one bias, one misunderstanding at a time
If you’re an intellectual with ambitions to save the world, I think you ought to ask yourself the following questions:
“What if humans are savvy animals who generally understand what they have an incentive to understand?”
“What if stupidity is usually strategic?”
“What if capitalists, bigots, warmongers, virtue signalers, and political extremists understand what they’re doing all too well?”
“What if advice is mostly bullshit?”
“What if the primary cause of humanity’s problems is not bad beliefs, but bad motives?”
While reflecting on these questions, you may arrive at an unpleasant truth: there’s nothing you can do. The world doesn’t want to be saved.
Sure, you can tell the politicians they’re “biased,” but at the end of the day, a politician’s job is to win the support of biased voters. Sure, you can tell the voters they’re “biased,” but at the end of the day, the voters have basically no incentive to be unbiased, and strong incentive to parrot their tribe’s propaganda. Sure, you can tell the press about these terrible misunderstandings, but the press will only write about them if it increases their market share of the attention economy. Sure, you can tell the consumers to stop paying attention to attention-grabbing bullshit, but therein lies the problem: they won’t pay attention to you.
If you find yourself in a hole, you can study the hole all you want. You can examine the dirt around you to the last molecule. But no matter how thoroughly you understand the hole, you’ll still be stuck in it. Not every problem has a solution. Some things cannot be fixed. And once you come to the bracing realization that we have no deep desire to fix our broken world, you’ll realize that our problem is that we have no problem. What’s broken is that nothing is broken. The study of human nature is, all too often, the study of the hole we’re stuck in.
In the end, the only misunderstanding is that there’s been a misunderstanding.



The way that I look at this issue is to consider two possible meanings of "rationality" and also the roles of rationality and the passions in directing human behavior.
I think that rationality is most often seen as a process of obtaining a true, accurate representation of the world. From this rationality-as-accurate-representation-of-reality view, misunderstanding is forming an incorrect representation of reality (e.g., stereotypes as inaccurate descriptions of various out-groups).
But, leaving aside the issue of how accurate stereotypes are, even if we did possess perfectly accurate knowledge of different groups, that knowledge alone does not tell us how to behave. A purely cognitive, information-processing description of people cannot tell us how they will use that information to choose a behavioral path. The goals we pursue are determined by our emotions, our motives, our passions. Without a push from emotional determinants, we would just sit there, contemplating our knowledge. Our rationality can only help us calculate specific behavioral paths for reaching the goals set by our passions. Therefore, making people more "rational" in the sense of possessing more accurate knowledge cannot solve recognized problems such as ethnic fighting, murder, and wars.
But note that rationality-as-accurate-representation-of-reality is not the only way to define rationality. In fact, there are problems in trying to think about rationality this way, the primary problem being that our mental maps or representations of reality can never be perfect representations of reality. The map is never the territory. Rather, our mental maps are constructions that serve the practical function of prediction *well enough* to allow creatures to achieve the goals of life *well enough* to reproduce.
A moment's thought reveals that an insect's mental world and our mental world are very different, but does it make sense to say that an insect's rationality is worse than ours? A pragmatic rather than correspondence conceptualization of rationality (a mental map that is good enough to allow an organism to reproduce before it dies) is more consistent with Kahneman's disinterest in correcting even his own biases because they are "rational" in a pragmatic, functional sense of helping him achieve his goals.
I recently came up with a different maxim: unaccountability is at the root of all evil.