Wikipedia Deep Dive

Iatrogenesis

Based on Wikipedia: Iatrogenesis

In 2013, an estimated 142,000 people around the world died not from their diseases, but from their treatments. This represents a fifty-one percent increase from just two decades earlier. We have a word for this phenomenon—when medicine itself becomes the source of harm—and it comes from ancient Greek: iatrogenesis, meaning "brought forth by a healer."

The irony cuts deep. The very profession sworn to "first do no harm" has, throughout history and into the present day, been a significant source of suffering and death.

The Many Faces of Iatrogenesis

When most people think of medical harm, they imagine dramatic errors—the surgeon who amputates the wrong limb, the pharmacist who fills a prescription for the wrong drug. These obvious mistakes certainly happen, but they represent only a fraction of iatrogenic harm.

Consider the patient undergoing chemotherapy. The treatment is working exactly as designed. The drugs are the correct ones, administered at the proper doses. Yet the patient loses their hair, suffers debilitating nausea, and faces an elevated risk of developing a second cancer later in life. All of this is iatrogenic—harm caused by medical intervention—even though nothing went "wrong."

This distinction matters enormously. Iatrogenic harm encompasses:

  • Adverse effects from medications and vaccines, even when properly prescribed
  • Side effects from necessary but aggressive treatments like radiation therapy
  • Loss of function from surgical removal of diseased organs (remove part of the pancreas, and you may develop diabetes)
  • Hospital-acquired infections
  • Drug interactions that nobody anticipated
  • Complications from faulty procedures or outdated surgical instruments
  • Misdiagnosis leading to inappropriate treatment

The boundaries blur further when you examine the statistics. If a patient with a ruptured aortic aneurysm dies during emergency surgery, that death counts as iatrogenic. Yet without intervention, the survival rate for such ruptures sits below twenty-five percent. The surgery that killed the patient was still, statistically speaking, their best chance at living.

The Problem of Prejudice

Some iatrogenic harm stems not from the complexity of medicine but from the biases of practitioners. Patients are sometimes brushed off, their symptoms minimized or dismissed, because of their weight, gender, sexual orientation, ethnicity, religion, or immigration status. A doctor who doesn't take a patient's complaints seriously may miss a treatable condition entirely.

This creates a vicious spiral. Patients who feel disrespected or dismissed become less likely to seek care in the future. Conditions that might have been caught early progress to dangerous stages. The prejudice of one provider can, quite literally, prove fatal—even if no specific medical error ever occurs.

Trust, it turns out, is a medical intervention in its own right. When that trust breaks down, people die.

Bacteria Fight Back

Perhaps the most troubling form of iatrogenesis unfolds not in individual patients but across entire populations. Antibiotic resistance—the evolution of bacteria that our drugs can no longer kill—is itself an iatrogenic phenomenon.

When antibiotics first became widely available in the mid-twentieth century, they seemed miraculous. Infections that had been death sentences became minor inconveniences. But every prescription of antibiotics creates evolutionary pressure. Bacteria with random mutations that confer resistance survive while their vulnerable cousins perish. Over time, through the relentless logic of natural selection, resistant strains come to dominate.

The overprescription of antibiotics accelerated this process dramatically. Doctors gave them for viral infections they couldn't treat. Patients demanded them for conditions that would resolve on their own. Agricultural operations fed them to livestock by the ton. Each unnecessary exposure was another spin of the evolutionary roulette wheel.
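The selection dynamic described above can be sketched as a toy simulation. The kill rates and starting resistant fraction below are illustrative assumptions, not epidemiological data; the point is only that each course of treatment multiplies the survival advantage of resistant strains, so even a vanishingly rare mutation can come to dominate after a handful of exposures.

```python
# Toy model of selection for antibiotic resistance.
# Assumed (illustrative) parameters: each course kills 99% of
# susceptible bacteria but only 10% of resistant ones; the
# population then regrows to capacity, preserving the new ratio.

def resistant_fraction_after(courses, start_fraction=1e-6,
                             kill_susceptible=0.99, kill_resistant=0.10):
    """Fraction of the population that is resistant after n courses."""
    frac = start_fraction
    for _ in range(courses):
        susceptible = (1 - frac) * (1 - kill_susceptible)
        resistant = frac * (1 - kill_resistant)
        frac = resistant / (susceptible + resistant)  # regrowth keeps the ratio
    return frac

for n in (0, 1, 2, 3, 4):
    print(f"after {n} courses: {resistant_fraction_after(n):.4f} resistant")
```

Starting from one resistant cell in a million, the resistant fraction passes 40 percent after three simulated courses and 98 percent after four—the "evolutionary roulette wheel" in miniature.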

Now we face bacteria that shrug off drugs that once reliably killed them. The World Health Organization considers antimicrobial resistance one of the greatest threats to global health. This is iatrogenesis on a civilizational scale—harm brought forth by healers, compounding with each passing year.

The Mind's Vulnerability

Psychiatry presents its own unique iatrogenic dangers. The field deals with conditions that lack definitive biological markers. There is no blood test for depression, no brain scan that definitively confirms schizophrenia. Diagnosis depends heavily on subjective assessment, on what patients report and how clinicians interpret those reports.

This creates ample room for error. Consider bipolar disorder in children, a diagnosis that increased dramatically in the early 2000s. Many of these children had previously been diagnosed with major depressive disorder and treated with stimulants or antidepressants—medications that can sometimes trigger manic episodes in people with bipolar tendencies. The treatment for one condition may have created another.

Historical examples prove more dramatic still. Hystero-epilepsy, once considered a genuine medical condition, is now understood to have been largely iatrogenic—a disorder created and reinforced by the very practitioners who claimed to treat it. Patients learned what symptoms were expected of them and dutifully produced them.

Chronic fatigue syndrome offers a more recent and still-contentious example. For years, many clinicians treated it as a psychiatric or psychosomatic condition. The standard recommendation—graded exercise therapy, slowly increasing physical activity—seemed reasonable enough. But for many patients, this approach caused demonstrable harm. Their symptoms worsened rather than improved. A treatment built on the assumption that their condition was primarily psychological actively damaged people who were genuinely, physically ill.

At the extreme end sits dissociative identity disorder, what was once called multiple personality disorder. A minority of researchers argue that this condition is almost entirely iatrogenic—that patients develop multiple personalities primarily because their therapists, consciously or not, expect and encourage them to do so. The bulk of diagnoses, these skeptics note, come from a tiny fraction of practitioners.

This remains controversial. But it illustrates a deeper truth: in psychiatry, the line between discovery and creation can be impossibly thin.

Ivan Illich's Three-Level Critique

In 1974, a social critic named Ivan Illich published a book called Medical Nemesis: The Expropriation of Health. It represented the most ambitious attempt to date to understand iatrogenesis not just as a medical problem but as a social and cultural one.

Illich identified three levels of harm.

The first level, clinical iatrogenesis, covers what we've already discussed: injuries from ineffective, unsafe, or erroneous treatments. This is the iatrogenesis that medical professionals themselves acknowledge and work to reduce. Remarkably, Illich articulated the need for what we now call "evidence-based medicine" two decades before that term entered common use—though the underlying concept, the idea that medical treatments should be grounded in demonstrated effectiveness rather than tradition or authority, has roots stretching back centuries.

The second level cuts deeper. Social iatrogenesis describes the medicalization of life itself—the transformation of ordinary human experiences into medical conditions requiring professional intervention. Pharmaceutical companies, medical device manufacturers, and healthcare systems all have financial incentives to expand the boundaries of treatable illness. Age-related decline becomes a disease to manage. Shyness becomes social anxiety disorder. The normal process of dying becomes a failure of medical technology.

Illich argued that medical education itself contributes to this problem. Doctors train overwhelmingly to diagnose and treat disease. They focus on what's wrong, on pathology, rather than on health or on helping people cope with the unavoidable difficulties of being human. This orientation shapes everything that follows.

The third level, cultural iatrogenesis, concerns what modern medicine has destroyed. Traditional societies had ways of understanding and coping with suffering, with death, with the limits of human existence. Medicine has progressively displaced these frameworks without providing adequate replacements. We've become dependent on institutions to manage experiences our ancestors handled within families and communities. We've lost, Illich argued, our capacity for autonomous coping.

This critique was not a rejection of modern medicine's benefits. Illich acknowledged that antibiotics save lives, that surgery can restore function, that vaccines prevent suffering on an enormous scale. His target was the unexamined expansion of medical authority into domains where it doesn't belong—the "unwarranted dependency and exploitation" that can accompany genuine healing.

Iatrogenic Poverty

There exists yet another dimension of harm that Illich didn't fully explore: the financial devastation that medical care can inflict. Researchers have coined the term "iatrogenic poverty" to describe impoverishment caused directly by healthcare expenses.

Every year, more than 100,000 households worldwide fall into poverty because of medical bills. In the United States, a 2001 study found that illness and medical debt caused half of all personal bankruptcies. This was true even for people with health insurance—the costs that insurance didn't cover proved sufficient to destroy their financial stability.

In developing countries undergoing economic transitions, the pattern often proves more devastating still. As incomes rise, people become willing to pay more for healthcare. Providers respond by offering more services, often at higher prices. But regulatory systems and consumer protections lag behind. Patients receive treatments of uncertain value, sometimes from practitioners more interested in profit than in health. They deplete their savings, sell productive assets, fall into debt, and ultimately into poverty—all while seeking help for their ailments.

The tragedy compounds: poor people get sick more often. They have less access to preventive care, worse nutrition, more dangerous working and living conditions. Poverty both results from and causes illness, creating a cycle that medicine, despite its best intentions, often fails to break.

The American Statistics

The United States provides particularly detailed data on iatrogenic harm, and the numbers are sobering. Annual deaths attributed to various forms of medical intervention include:

  • 12,000 from unnecessary surgeries
  • 7,000 from medication errors in hospitals
  • 20,000 from other hospital errors
  • 80,000 from infections acquired in hospitals (what medical professionals call nosocomial infections)
  • 106,000 from the negative effects of drugs prescribed and administered correctly

Add these figures together—excluding errors that researchers couldn't quantify—and you reach 225,000 deaths per year. Some estimates run higher. An Institute of Medicine report suggested the true figure might fall between 230,000 and 284,000 annual deaths.
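As a quick arithmetic check, the per-category figures listed above do sum to the 225,000 total quoted (category labels here simply restate the bullet list):

```python
# Verify that the quoted per-category annual death counts
# total the 225,000 cited in the text.
deaths = {
    "unnecessary surgeries": 12_000,
    "medication errors in hospitals": 7_000,
    "other hospital errors": 20_000,
    "hospital-acquired (nosocomial) infections": 80_000,
    "adverse effects of correctly prescribed drugs": 106_000,
}
total = sum(deaths.values())
print(total)  # 225000
```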

To put this in perspective: this would make iatrogenic death one of the leading causes of mortality in the United States, ranking alongside heart disease, cancer, and stroke. Medical intervention kills more Americans each year than car accidents, gun violence, and drug overdoses combined.

These statistics come with important caveats. Many people who die from medical complications would have died anyway—that ruptured aneurysm patient, remember, had less than a twenty-five percent chance even with perfect surgical execution. Medicine treats the sickest people, and sick people often die. Still, the magnitude of the numbers demands attention.

A History of Harm

The recognition that medicine can harm as well as heal stretches back to antiquity. The Hippocratic tradition, originating with the Greek physician Hippocrates around four hundred years before the common era, placed "first do no harm" (primum non nocere in Latin) at the center of medical ethics. Many ancient civilizations made iatrogenic illness or death caused by negligence a punishable offense.

Yet for most of human history, iatrogenic harm was simply unavoidable. Doctors lacked the knowledge to understand what actually worked. They bled patients who needed their blood, purged patients who needed nutrition, prescribed mercury and arsenic as medicines. Some historians argue that George Washington was essentially killed by his doctors, who drained nearly half his blood over the course of treating what was probably a simple throat infection.

One particularly grim example from the nineteenth century: puerperal fever, also called childbed fever, killed enormous numbers of women in maternity hospitals. The mortality rates were shocking—in some institutions, one in six new mothers died. The cause, eventually identified by Hungarian physician Ignaz Semmelweis, was simple: doctors were transferring pathogens from the autopsy room to the delivery room on their unwashed hands.

Semmelweis's story illustrates both the persistence of iatrogenic harm and the difficulty of eliminating it. Even after he demonstrated that hand-washing dramatically reduced maternal deaths, the medical establishment resisted his findings. His colleagues couldn't accept that they themselves had been killing their patients. Semmelweis eventually suffered a mental breakdown and died in an asylum—ironically, from an infection contracted there.

Progress and Its Limits

The twentieth century brought genuine advances. Antiseptic technique became standard. Anesthesia made surgery survivable. Antibiotics (before resistance undermined them) cured previously fatal infections. Evidence-based protocols replaced tradition and intuition. Best practices emerged from careful research rather than guesswork.

These developments saved millions of lives. They also created new opportunities for harm. More powerful interventions carry more powerful risks. Chemotherapy can cure cancer and cause cancer. Immunosuppressive drugs make organ transplants possible and leave patients vulnerable to infections. Every capability comes with a shadow.

The challenge is not to eliminate iatrogenesis—that may be impossible—but to manage it wisely. This requires honesty about the limits of medicine, about the risks inherent in every intervention, about the certainty that some people who seek help will be harmed by it. It requires patients who understand what they're agreeing to and practitioners humble enough to acknowledge what they don't know.

It requires, above all, the recognition that "first do no harm" is an aspiration rather than a guarantee. Medicine heals. Medicine also hurts. The ancient Greeks who gave us the word iatrogenesis understood this duality. Their healers brought forth both health and disease. Twenty-four centuries later, so do ours.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.