The harder it is to find the truth, the easier it is to lie to ourselves
If you look at humanity, both today and throughout history, you can’t help but notice that people believe a lot of things that seem stupid and irrational. Pick your favourite example: conspiracy theories, religion, prejudice, ideology, pseudoscience, ancestor myths, people who hold different political opinions from your own, and so on.
This observation provokes a central question for the social sciences. Why do broadly rational people, people who often seem intelligent and competent in most aspects of their lives, sometimes believe highly irrational things?
One classic answer is that people are not disinterested truth seekers. In some contexts, our practical interests conflict with the aim of acquiring accurate, evidence-based beliefs. For example, we might want to believe things that make us feel good, that impose a satisfying order and certainty on a complex world, that help us persuade others that we’re noble and impressive, or that win us status and approval from our friends and allies.
Famously, when our goals come into conflict with the pursuit of truth in this way, the truth often loses out. We lie to ourselves, bury our heads in the sand, and engage in elaborate mental gymnastics. Less colloquially, we engage in what psychologists call “motivated cognition”: we—or our minds, at least—direct cognitive processes toward favoured conclusions rather than accurate ones. For example, we instinctively seek out evidence that confirms those conclusions (confirmation bias), shield ourselves from evidence against them (motivated ignorance), hold arguments we dislike to higher standards than arguments we like (biased evaluation), and remember and forget information in convenient patterns (selective forgetting).
Throughout most of history, scholars had little doubt that this tendency was a central and destructive feature of the human condition.
For Adam Smith, for example, self-deceit “is the fatal weakness of mankind” and “the source of half the disorders of human life.” For Socrates in the Cratylus, “the worst of all deceptions is self-deception.” And of course, thinkers such as Freud and Nietzsche placed motivated cognition at the centre of their understanding of human psychology.
Against Motivated Cognition
This consensus continued from the emergence of scientific psychology until relatively recently. In the last decade or so, however, some researchers have become increasingly sceptical that motivated cognition is a significant force in human affairs. There are many reasons for this, including reinterpretations