Brandolini's law
Based on Wikipedia: Brandolini's law
In January 2013, an Italian programmer named Alberto Brandolini was doing something that sounds almost comically mundane: he was reading Daniel Kahneman's book about cognitive biases while half-watching a political talk show. On the screen, former Prime Minister Silvio Berlusconi was sparring with journalist Marco Travaglio. And somewhere in that collision between behavioral science and televised political theater, Brandolini crystallized something that anyone who has ever argued with a conspiracy theorist already knows in their bones.
He posted it online, and it went viral:
The amount of energy needed to refute bullshit is an order of magnitude bigger than that needed to produce it.
That's it. That's the law. Sometimes called "the bullshit asymmetry principle," it describes one of the most frustrating features of our information ecosystem: lying is cheap, and truth is expensive.
The Mathematics of Mendacity
Let's be precise about what "an order of magnitude" means. An order of magnitude is a factor of ten. Brandolini isn't saying that debunking nonsense takes twice as long as creating it. He's saying it takes roughly ten times as long. Maybe more.
Think about that for a moment.
A person can type a false claim in fifteen seconds. Someone else needs to research whether it's true, find authoritative sources, explain the context, anticipate objections, and present the correction in a way that might actually persuade someone. That process might take hours. Or days. Or, in some cases, decades.
This isn't just a clever observation about internet debates. It's a fundamental asymmetry that shapes how information spreads through society. The economics favor the bullshitter. Every time.
The Pineapple Importer and the Journalist
During the COVID-19 pandemic, Jeff Yates, a Radio-Canada journalist who covers disinformation, encountered a perfect specimen of Brandolini's law in the wild. A YouTube video had gone viral, spreading medical misinformation about the coronavirus. The person presenting it was a pineapple importer—someone with no medical credentials, no epidemiological training, just a camera and an internet connection.
Yates described what happened next:
He makes all kinds of different claims. I had to check every single one of them. I had to call relevant experts and talk to them. I had to transcribe those interviews. I had to write a text that is legible and interesting to read. It's madness. It took this guy 15 minutes to make his video and it took me three days to fact-check.
Fifteen minutes versus three days. That's not an order of magnitude—that's closer to three hundred times the effort. And here's the kicker: by the time Yates published his fact-check, the video had already been viewed by millions of people. The lie had completed several laps around the world before the truth had finished lacing up its shoes.
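As a rough sanity check on that ratio, here is the arithmetic, taking the fifteen minutes and the three days at face value and counting the days two ways (full calendar days versus eight-hour working days, both counting assumptions on my part):

```python
# Back-of-the-envelope check on the Yates example.
# Assumptions: 15 minutes to make the video; "three days" of fact-checking,
# counted either as full 24-hour days or as 8-hour working days.
video_minutes = 15
calendar_day_minutes = 3 * 24 * 60   # 4,320 minutes
working_day_minutes = 3 * 8 * 60     # 1,440 minutes

print(calendar_day_minutes / video_minutes)  # 288.0 -> roughly three hundred times
print(working_day_minutes / video_minutes)   # 96.0  -> roughly a hundred times
```

Either way, the gap is at least two orders of magnitude, comfortably beyond the factor of ten in Brandolini's original formulation.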
An Ancient Problem with Modern Amplification
Brandolini didn't invent this observation. He just gave it a name and a memorable formulation. People have been noticing this asymmetry for centuries.
In 1710, Jonathan Swift—the same satirist who wrote Gulliver's Travels—captured it beautifully:
Falsehood flies, and truth comes limping after it; so that when men come to be undeceived, it is too late, the jest is over, and the tale has had its effect: like a man who has thought of a good repartee, when the discourse is changed, or the company parted; or, like a physician, who has found out an infallible medicine, after the patient is dead.
Notice how Swift understood not just the speed differential, but the timing problem. By the time you have the perfect comeback, the conversation has moved on. By the time you develop the cure, the patient is dead. The damage is done.
Over a century later, in 1845, the French economist Frédéric Bastiat made a similar complaint about political debate:
We must confess that our adversaries have a marked advantage over us in the discussion. In very few words they can announce a half-truth; and in order to demonstrate that it is incomplete, we are obliged to have recourse to long and dry dissertations.
The "long and dry dissertations" are the price of accuracy. A lie can be punchy and memorable. A correction requires context, nuance, caveats, and evidence. Which one do you think plays better on social media?
The Vaccine That Didn't Cause Autism
If you want to understand Brandolini's law at scale, consider the false claim that vaccines cause autism. This belief has shaped parental decisions for over two decades. It has contributed to outbreaks of measles and other preventable diseases. It has killed people.
Here's how it started: In 1998, a British doctor named Andrew Wakefield published a research paper in The Lancet, one of the world's most prestigious medical journals. The paper claimed to find a link between the MMR vaccine—which protects against measles, mumps, and rubella—and autism in children.
The paper was fraudulent. Wakefield had manipulated data. He had undisclosed financial conflicts of interest. An investigation revealed that he had been paid by lawyers who were suing vaccine manufacturers and that he had plans to profit from the panic he was creating.
The journal retracted the paper. The United Kingdom's General Medical Council revoked Wakefield's medical license. Study after study, involving millions of children, found no connection between vaccines and autism. The scientific consensus is absolute: the MMR vaccine does not cause autism.
And yet.
The lie persists. Decades of research, billions of dollars in studies, countless public health campaigns, and still a significant percentage of parents hesitate to vaccinate their children because of a fraudulent paper published in 1998. One man, one bad paper, and twenty-five years of global efforts haven't been enough to undo the damage.
That's Brandolini's law operating at civilizational scale.
The Icelandic Baby Boom That Wasn't
Not all misinformation is malicious. Sometimes it's just a joke that escapes containment.
In 2016, Iceland's national football team did something remarkable: they eliminated England from the UEFA European Championship. This was a genuine David-versus-Goliath moment. Iceland, a nation of about 330,000 people, had beaten one of football's traditional powerhouses.
Nine months later, an Icelandic doctor named Ásgeir Pétur Thorvaldsson made a joke on Twitter. He claimed that Iceland was experiencing a baby boom as a result of the victory—the implication being obvious. It was the kind of thing that sounds plausible if you don't think about it too hard. National celebration, euphoria, the math on nine months—sure, why not?
Media outlets around the world picked up the story. It made for a charming human interest piece. The problem was that it wasn't true. When statisticians actually looked at the birth data, there was no baby boom. The doctor had been joking.
But corrections don't travel as well as original stories. Somewhere out there, people still believe that Iceland's football victory led to a wave of celebratory babies. It's harmless misinformation, but it illustrates how easily false narratives take hold, even when they originate as obvious jokes.
Why We Fall for It
In 2020, researchers conducted studies on what they called "bullshit receptivity"—a technical term that sounds like an insult but is actually a measurable psychological trait. They found something important: people become more susceptible to nonsense when they're depleted.
Specifically, the study concluded that "people are more receptive to bullshit, and less sensitive to detecting bullshit, under conditions in which they possess relatively few self-regulatory resources."
Translation: when you're tired, stressed, or overwhelmed, your nonsense detector stops working properly.
This has implications for how misinformation spreads. During crises—a pandemic, an economic collapse, a war—people are exhausted and anxious in exactly the way that makes them vulnerable. And crises are precisely when misinformation tends to flourish.
During COVID-19, researchers Jevin West and Carl Bergstrom studied how false claims about hydroxychloroquine spread. This antimalarial drug was briefly promoted as a potential COVID treatment. An early clinical trial looked promising, but it was soon refuted by better evidence. The drug didn't work against COVID-19 and could cause dangerous side effects.
But the claim that hydroxychloroquine was a miracle cure continued to spread. West and Bergstrom identified the toxic combination: widespread social media coverage, high anxiety, and high uncertainty. People were scared and desperate for solutions. The truth—that we didn't have a quick fix—was less comforting than the lie.
The Firehose of Falsehood
Brandolini's law becomes even more powerful when weaponized deliberately. Instead of defending a single lie, you can simply produce lies faster than they can be debunked.
This strategy has a name: the Gish gallop, named after creationist Duane Gish, who was famous for flooding debates with so many dubious claims that his opponents couldn't possibly address them all in the allotted time. While they were carefully explaining why claim number three was wrong, claims four through forty went unchallenged.
The same principle applies to political disinformation. If a fact-checker can debunk ten claims per day and a disinformation operation can produce a hundred, simple arithmetic tells you who wins. The truth-tellers are playing a losing game.
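A minimal sketch of that arithmetic, using the ten-a-day and hundred-a-day figures from the paragraph above purely as illustrative numbers:

```python
# Illustrative backlog model: claims produced vs. claims debunked per day.
# The rates are hypothetical, chosen only to show how the gap compounds.
produced_per_day = 100
debunked_per_day = 10

backlog = 0
for day in range(1, 31):    # one month
    backlog += produced_per_day - debunked_per_day
print(backlog)              # 2700 claims still unrefuted after 30 days
```

Unless the production rate collapses or the debunking rate rises tenfold, the backlog only grows; the fact-checker never catches up.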
This is what makes Brandolini's law so dangerous in the age of social media. The technology has dramatically reduced the cost of producing and distributing nonsense while doing nothing to reduce the cost of refuting it. We've made lying cheaper and left truth-telling just as expensive as it ever was.
Gendered Disinformation
Brandolini's law doesn't affect everyone equally. Some people are targeted with more bullshit than others.
The U.S. Department of State has defined something called "gendered disinformation"—a specific category of online abuse that uses false narratives to drive women out of public life. According to their definition, it's "a subset of misogynistic abuse and violence against women that uses false or misleading gender and sex-based narratives, often with some degree of coordination, to deter women from participating in the public sphere."
When a woman enters politics, or journalism, or any other high-profile field, she becomes a target. The false claims come quickly: she slept her way to success, she's mentally unstable, she's a bad mother, she's secretly controlled by someone else. None of it needs to be true. It just needs to be repeated enough times to shape public perception.
And here's where Brandolini's law becomes particularly cruel: the woman being targeted has to choose between ignoring the lies (which allows them to spread) or spending enormous energy refuting them (which takes time away from her actual work and may draw more attention to the claims). There's no winning move.
Both foreign governments and domestic political operatives have recognized gendered disinformation as an effective tool. It's cheap to produce, expensive to counter, and it achieves the strategic goal of silencing certain voices. The asymmetry is a feature, not a bug.
Fighting Back
If Brandolini's law describes a fundamental asymmetry, is there any hope for truth-tellers? Or are we doomed to always be playing defense, always behind, always exhausted?
Researchers have identified some strategies that help.
First, preexposure warnings. If you can warn people that they're about to encounter misinformation before they see it, they become more resistant to it. This is sometimes called "prebunking" rather than debunking. It's easier to prevent someone from swallowing a lie than to extract it once they've internalized it.
Second, repeated retractions. Saying something once isn't enough. Corrections need to be repeated, through multiple channels, over time. This is exhausting and expensive—exactly as Brandolini's law would predict—but it works better than a single correction that gets buried.
Third, alternative narratives. People don't like having beliefs taken away without receiving something in return. If you just tell someone that their belief is wrong, they'll resist. But if you can offer them a different explanation—a true one—that fills the same psychological need, they're more likely to update their views.
West and Bergstrom, who have made careers out of studying nonsense, offer additional guidance for effective refutation:
- Be correct. Include all necessary information, have a friend review it, double-check your facts. Getting something wrong undermines your credibility and provides ammunition to the bullshitter.
- Be charitable. Acknowledge the possibility that you might be confused. Don't assume malice when confusion might explain things. Don't call people stupid. This might feel satisfying, but it doesn't change minds.
- Be clear. Make your argument coherent and easy to follow. Complicated refutations lose to simple lies.
- Admit mistakes. When you get something wrong, say so. This builds credibility for the things you get right.
Environmental researcher Phil Williamson of the University of East Anglia has argued that scientists have a duty to engage, despite the asymmetry. "The scientific process doesn't stop when results are published in a peer-reviewed journal," he wrote. "Wider communication is also involved, and that includes ensuring not only that information (including uncertainties) is understood, but also that misinformation and errors are corrected where necessary."
It's a losing battle in any single engagement. But Williamson's point is that scientists can't afford to cede the field entirely. If experts stay silent, the only voices people hear are the bullshitters.
The Bullshitter Versus the Bullshit
There's one interesting wrinkle to Brandolini's law that offers some hope. Sometimes, you don't have to refute every piece of bullshit. Sometimes, you can just discredit the bullshitter.
When someone lies repeatedly and gets caught repeatedly, something shifts. The bullshitter becomes more obvious than any individual piece of bullshit. Once someone has established a pattern of dishonesty, their future claims carry less weight. You don't need to fact-check every sentence if the source is known to be unreliable.
This is why credibility matters so much. It's why scientific journals retract fraudulent papers. It's why Andrew Wakefield losing his medical license was important even though it didn't immediately stop the spread of his ideas. Reputation is a form of efficiency. It lets people filter information without having to verify every claim from first principles.
Of course, this only works if people are paying attention to credibility signals. In an environment where distrust of institutions is high and anyone can build an audience online, the traditional markers of reliability don't carry the weight they once did. A disgraced former doctor can rebrand as a "truth-teller" and find a new audience that doesn't know or doesn't care about his history.
The Social Cost
There's one more dimension to Brandolini's law that often gets overlooked: the social cost of refutation.
Debunking isn't just time-consuming and exhausting. It can also be socially risky. If the bullshit you're trying to refute is popular in your community—your family, your church, your political tribe—then challenging it means challenging your relationships.
This is why false beliefs can persist even in communities full of intelligent, educated people. Everyone might privately suspect that something isn't true, but no one wants to be the person who speaks up and causes conflict. The social cost of correction exceeds the personal cost of going along with nonsense.
In this sense, Brandolini's law understates the problem. It's not just that refuting bullshit takes more energy than producing it. It's that refuting bullshit often requires courage, social capital, and a willingness to be ostracized. The asymmetry isn't just technical; it's also emotional and relational.
Living with the Asymmetry
Brandolini's law describes a feature of reality, not a solvable problem. The asymmetry between creating and refuting misinformation isn't going away. If anything, technology has made it worse.
What can individuals do?
First, be strategic about what you refute. You can't fight every lie. Choose the battles where you have leverage: where the audience is persuadable, where the stakes are high, where you have relevant expertise.
Second, invest in prebunking rather than debunking when possible. Help people develop the skills and heuristics to identify bullshit before it takes hold. Media literacy, source evaluation, understanding of cognitive biases—these are force multipliers.
Third, support institutions and people who do the expensive work of verification. Journalism, fact-checking organizations, scientific research—these are all fighting an uphill battle against Brandolini's law. They need resources and legitimacy.
Fourth, be honest about what you don't know. One of the reasons bullshit spreads so easily is that people confidently share things they haven't verified. A little epistemic humility—saying "I'm not sure" or "I should check that"—can slow the spread.
Finally, remember that Alberto Brandolini wasn't trying to counsel despair. He was trying to name something so we could recognize it. Once you see the asymmetry, you can stop blaming yourself for failing to win arguments against people who aren't constrained by truth. You can stop being surprised when lies spread faster than corrections. You can plan accordingly.
The pineapple importer will always be able to make his video faster than the journalist can fact-check it. But that doesn't mean we have to accept the pineapple importer's version of reality. It just means we need to be clear-eyed about the fight we're in.