Wikipedia Deep Dive

Health care prices in the United States


Based on Wikipedia: Health care prices in the United States

Here's a peculiar thing about American healthcare: you can walk into a hospital, receive treatment, and have absolutely no idea what it will cost until a bill arrives weeks later. Try that at a restaurant, or a car dealership, or really anywhere else in the economy, and people would think you'd lost your mind.

Yet this is how the world's most expensive healthcare system operates.

The Numbers That Make Other Countries Gasp

The United States spends more on healthcare than any other wealthy nation—and it's not even close. By 2015, Americans were spending nearly ten thousand dollars per person on healthcare, totaling 3.2 trillion dollars. That works out to almost eighteen percent of the entire economy devoted to keeping people healthy (or at least trying to).

To put this in perspective: if you compare America to other countries in the Organisation for Economic Co-operation and Development—a club of mostly wealthy, industrialized nations—the United States spends at least a third more on healthcare relative to the size of its economy. Germany, France, the United Kingdom, Canada—all of them manage to cover their populations for considerably less.

The gap has been widening for decades. In 1970, healthcare consumed about six percent of the American economy. By 2015, that figure had tripled. And while the annual rate of increase has slowed somewhat in recent years, healthcare costs still rise faster than economic growth, meaning the system claims an ever-larger slice of the national pie.
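
For readers who want to check the arithmetic, here is a quick sanity check in Python. The population and GDP figures are rough 2015 approximations supplied for the calculation; they are not from the article itself.

```python
# Rough arithmetic behind the 2015 figures. Population and GDP are approximate
# values supplied for this check, not taken from the article.

total_spending = 3.2e12   # ~$3.2 trillion in US health spending, 2015
population = 321e6        # ~321 million US residents, 2015 (approximation)
gdp = 18.2e12             # ~$18.2 trillion US GDP, 2015 (approximation)

print(f"Spending per person: ${total_spending / population:,.0f}")  # about $10,000
print(f"Share of the economy: {total_spending / gdp:.1%}")          # about 17.6%
```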

Why Is Everything So Expensive?

The simplest explanation is also the most frustrating: things just cost more here.

The same medical procedure, performed with the same equipment by similarly trained doctors, costs dramatically more in America than almost anywhere else. A hip replacement that might cost fifteen thousand dollars in the Netherlands could easily run sixty thousand in the United States. An MRI scan that costs a few hundred dollars in France might be billed at over a thousand in an American hospital.

Americans also consume more healthcare—more tests, more procedures, more specialist visits. Whether this reflects genuine medical need or a system that incentivizes overtreatment is hotly debated.

But dig deeper, and you find structural reasons for the price gap:

  • Administrative complexity: The American system involves a bewildering array of private insurers, each with different rules, forms, and reimbursement procedures. Hospitals and doctors' offices employ armies of billing specialists just to navigate the paperwork. Some estimates suggest administrative costs consume nearly a third of all healthcare spending.
  • Higher incomes: Americans are richer on average than citizens of many other countries, which means they can afford to spend more—and providers charge accordingly.
  • Less government price control: Most wealthy countries use some form of government power to negotiate or dictate what healthcare should cost. The United States, by contrast, largely lets market forces determine prices—except the market doesn't work the way it does for other goods and services.

The Strange Economics of Not Knowing What Things Cost

Imagine shopping for a car this way: you visit a dealership, pick out a vehicle, drive it home, and three weeks later receive a bill for an amount determined by negotiations between the manufacturer and some third party you've never met. If you don't like the price, too bad—you already have the car.

This is essentially how healthcare works for most Americans.

A study by the California Healthcare Foundation found that only twenty-five percent of people who asked hospitals for pricing information could actually obtain it in a single visit. The prices exist, of course—they're negotiated painstakingly between insurance companies and healthcare providers—but they're treated as proprietary secrets, hidden from the very patients who will ultimately pay.

This opacity has given rise to "surprise medical bills," those heart-stopping envelopes that arrive long after a hospital stay, containing charges no one mentioned at the time. A patient might think they're covered by insurance, only to discover that the anesthesiologist who appeared briefly during their surgery was "out of network" and is now billing separately for thousands of dollars.

When Free Markets Fail

The Harvard economist N. Gregory Mankiw—no opponent of free markets—has explained why the usual rules of supply and demand break down in healthcare. His analysis is worth understanding, because it explains why this particular market resists easy fixes.

First, there are what economists call "positive externalities." When you get vaccinated against measles, you're not just protecting yourself—you're protecting everyone you might otherwise infect. But individuals, thinking only of their own costs and benefits, tend to undervalue this broader social benefit. Left to its own devices, a free market will produce fewer vaccinations than society actually needs. The same logic applies to medical research: whoever funds a breakthrough treatment can't capture all the benefits it creates, so private firms invest less than would be socially optimal.

Second, patients don't know what they need. Healthcare is technical and complex. When your doctor recommends a particular treatment, you're largely taking it on faith—you lack the expertise to evaluate whether it's truly necessary or whether a cheaper alternative might work just as well. This is fundamentally different from buying a television, where you can research specifications and read reviews. The information asymmetry between doctors and patients creates fertile ground for both over-treatment and under-treatment.

Third, healthcare spending is wildly unpredictable. Most of us will have relatively modest medical expenses in any given year, but a few of us will face catastrophic costs—cancer treatment, serious accidents, chronic conditions that require constant management. This unpredictability makes insurance essential. But insurance itself changes behavior: when someone else is paying the bills, patients and doctors alike have less incentive to economize. Economists call this "moral hazard."

Fourth, there's the problem of "adverse selection." If insurers can choose whom to cover, they'll naturally prefer healthy customers who are unlikely to file claims. But this creates a vicious cycle: as healthy people are siphoned off into cheaper plans, the remaining pool of insured people becomes sicker and more expensive to cover. Premiums rise, driving out more healthy people, until the whole system collapses—a phenomenon sometimes called a "death spiral."
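
To see the mechanics of that unraveling, here is a minimal Python sketch. The numbers are invented for illustration and the model is deliberately crude: everyone knows their expected annual cost, the insurer must charge a single premium equal to the average cost of whoever remains enrolled, and anyone whose expected cost falls below the premium walks away.

```python
# Minimal sketch of an adverse-selection "death spiral" (illustrative numbers only).
# Assumes risk-neutral buyers; real people value insurance above their expected cost,
# which slows the unraveling but does not necessarily stop it.

# 1,000 hypothetical people with expected annual costs spread evenly from $0 to $9,990.
enrolled = [10.0 * i for i in range(1000)]

for year in range(1, 9):
    if len(enrolled) <= 1:
        print(f"Year {year}: the pool has unraveled; no viable market remains.")
        break
    premium = sum(enrolled) / len(enrolled)           # priced at the pool's average cost
    print(f"Year {year}: {len(enrolled):4d} enrolled, premium ${premium:,.0f}")
    enrolled = [c for c in enrolled if c >= premium]  # the healthier half drops coverage
```

Run it and the pool roughly halves each year while the premium ratchets upward.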

This last problem led to one of the more ironic policy developments in recent American history. The individual mandate—the requirement that everyone purchase health insurance or pay a penalty—was originally proposed by the Heritage Foundation, a conservative think tank, as a market-based solution to adverse selection. Decades later, it became the most controversial element of the Affordable Care Act, denounced by many conservatives as government overreach.

The Public Programs: Medicare and Medicaid

Not all American healthcare operates in this murky quasi-market. Two massive government programs provide something closer to the single-payer systems found in other countries.

Medicare, created in 1965 under President Lyndon Johnson, covers Americans over sixty-five and people with certain disabilities. By 2017, it was spending nearly six hundred billion dollars annually to cover fifty-seven million people. Medicaid, created at the same time, covers primarily low-income children, pregnant women, and other groups deemed medically needy. It spent three hundred seventy-five billion dollars in 2017 and covered over sixty-eight million people—even more when you include the Children's Health Insurance Program.

These programs matter enormously for how the entire system works, because their sheer size gives the government leverage that individual patients lack. The Centers for Medicare and Medicaid Services—the federal agency that runs both programs—sets fee schedules that essentially tell healthcare providers what they'll be paid for each service. Want to argue? Good luck; Medicare is the single largest purchaser of medical services in the country.

The system uses something called "relative value units" to price medical procedures. Each treatment is assigned a certain number of these units based on how much labor and resources it requires. One unit translates to a dollar amount that varies by region and year—in 2005, it was roughly thirty-eight dollars. Private insurers often use these government-set values as a starting point for their own negotiations.
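
In code, the pricing step is essentially a multiplication. The sketch below is a simplification: the actual Medicare formula splits each service's units into work, practice-expense, and malpractice components and adjusts each for geography, and the service values here are made up for illustration.

```python
# Simplified sketch of fee-schedule pricing from relative value units (RVUs).
# The real formula weights separate RVU components by geographic indices;
# this version collapses all of that into a single multiplication.

CONVERSION_FACTOR = 37.90   # dollars per RVU, roughly the 2005 figure cited above

def fee(total_rvus: float, conversion_factor: float = CONVERSION_FACTOR) -> float:
    """Payment for a service: its total RVUs times the dollar conversion factor."""
    return total_rvus * conversion_factor

# Hypothetical services with invented RVU totals, purely for illustration.
for service, rvus in [("brief office visit", 1.3), ("longer office visit", 3.0), ("imaging study", 5.5)]:
    print(f"{service}: {rvus} RVUs -> ${fee(rvus):,.2f}")
```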

There's a twist, though. The committee that determines these relative value units—which ultimately shapes how much doctors get paid—is run by the American Medical Association, a trade group representing physicians. Critics have pointed out that this arrangement is a bit like letting the fox guard the henhouse: the committee has been shown to systematically inflate the value of the procedures its members perform.

The Employer-Based System: An Accidental Architecture

Most working-age Americans—about a hundred fifty-five million people—get their health insurance through their jobs. This arrangement is so familiar that few Americans question it, but it's actually quite strange. Your employer doesn't provide your car insurance or your homeowner's insurance. Why health insurance?

The answer lies in a quirk of World War Two history. During the war, the government imposed wage controls to prevent inflation. Unable to compete for workers by offering higher pay, employers began offering health insurance as a perk instead. The practice stuck, helped along by the tax code: employer-provided health insurance isn't counted as taxable income. This subsidy—worth an estimated two hundred fifty billion dollars per year—encourages employers to offer generous coverage and employees to accept it.

The average family policy cost over eighteen thousand dollars in 2016, with workers paying about five thousand and employers covering the rest. Single coverage averaged nearly sixty-five hundred dollars. These premiums have grown more slowly in recent years than they did in the 2000s, but they still outpace wage growth. From 2011 to 2016, deductibles grew by sixty-three percent while worker earnings grew just eleven percent.

Increasingly, employers are shifting costs to workers through higher deductibles—the amount you pay out of pocket before insurance kicks in. By 2016, four out of five workers had a deductible averaging nearly fifteen hundred dollars. Half of all workers faced deductibles of at least a thousand dollars, up from just ten percent a decade earlier.
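
A quick back-of-the-envelope calculation shows why that cost-shifting stings. It uses the growth rates above, sixty-three percent for deductibles and eleven percent for earnings, with starting values that are illustrative assumptions rather than survey figures.

```python
# Back-of-the-envelope: a deductible growing 63% while earnings grow 11% (2011-2016).
# The 2011 starting values are illustrative assumptions, not survey data.

deductible_2011 = 900.0     # hypothetical average deductible in 2011
earnings_2011 = 45_000.0    # hypothetical annual worker earnings in 2011

deductible_2016 = deductible_2011 * 1.63
earnings_2016 = earnings_2011 * 1.11

for year, d, e in [(2011, deductible_2011, earnings_2011), (2016, deductible_2016, earnings_2016)]:
    print(f"{year}: deductible ${d:,.0f} is {d / e:.1%} of earnings")
```

Even with a raise, the deductible's bite on a worker's pay grows by nearly half.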

The Individual Market and the Affordable Care Act

What if you don't get insurance through your job? Before 2010, you entered what was grimly known as the individual market, where insurers could reject you for pre-existing conditions, charge you whatever they wanted, and drop you when you got sick.

The Affordable Care Act—often called Obamacare—attempted to civilize this market. It created online marketplaces where individuals could compare plans, required insurers to accept everyone regardless of health status, and provided subsidies based on income to make coverage affordable.

By 2017, about twelve million people were buying insurance through these marketplaces. The subsidies work as tax credits that increase when premiums rise, creating a kind of shock absorber. In 2017, when some insurers proposed premium increases of forty to sixty percent, subsidized customers barely noticed—their after-subsidy costs stayed roughly the same because the subsidies grew proportionally.
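
The mechanism is simple enough to sketch. Under the ACA formula, a subsidized buyer's payment for the benchmark plan is capped at a set percentage of income, and the tax credit covers the rest; when the benchmark premium jumps, the credit jumps with it. The income, cap percentage, and premiums below are illustrative assumptions, not the actual statutory schedule.

```python
# Sketch of the ACA premium tax credit acting as a shock absorber.
# Income, contribution cap, and premiums are illustrative, not the statutory schedule.

def buyer_pays(benchmark_premium: float, income: float, contribution_cap: float) -> float:
    """Net annual cost of the benchmark plan for a subsidized buyer."""
    expected_contribution = contribution_cap * income              # capped share of income
    credit = max(0.0, benchmark_premium - expected_contribution)   # subsidy fills the gap
    return benchmark_premium - credit

income = 30_000.0   # hypothetical household income
cap = 0.08          # hypothetical cap: at most 8% of income for the benchmark plan

for premium in (6_000.0, 9_000.0):   # benchmark premium before and after a 50% hike
    print(f"benchmark ${premium:,.0f}/yr -> buyer pays ${buyer_pays(premium, income, cap):,.0f}/yr")
```

The premium hike lands on the subsidy rather than the subsidized buyer; anyone buying without a credit, of course, absorbs the full increase.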

But the system has vulnerabilities. In 2017, President Donald Trump discontinued a separate subsidy called "cost-sharing reductions" that helped reduce deductibles and co-payments for lower-income consumers. This move raised premiums in the marketplaces by an estimated twenty percent above what they would otherwise have been.

Prescription Drug Prices: Where America Really Stands Out

Of all the categories of healthcare spending, prescription drugs show the starkest international comparisons. In 2015, Americans spent an average of eleven hundred sixty-two dollars per person on prescription medications. Canadians spent eight hundred seven dollars. Germans spent seven hundred sixty-six. The French spent six hundred sixty-eight. The British have an even more dramatic approach: they cap annual prescription costs at about a hundred thirty-two dollars per person.

Why the gap? In most countries, the government negotiates drug prices directly with pharmaceutical companies, using its purchasing power to demand discounts. Medicare, by contrast, is actually prohibited by law from negotiating drug prices—a provision that pharmaceutical companies lobbied hard to include when the program's drug benefit was created in 2003.

The Emergency Room Paradox

There's one place in America where everyone gets treated regardless of ability to pay: the emergency room. The Emergency Medical Treatment and Active Labor Act, passed in 1986, requires hospitals to stabilize any patient with a life-threatening condition, no questions asked about insurance or finances.

This sounds humanitarian, and in many ways it is. But it creates a strange dynamic. Emergency care is the most expensive way to deliver medicine—far costlier than preventive care or routine treatment at a doctor's office. Uninsured patients who can't afford regular checkups end up in emergency rooms with conditions that could have been caught early and treated cheaply. The hospital treats them, absorbs the cost, and passes it along to other patients and their insurers through higher prices.

It's healthcare for people who've fallen through every other crack in the system—expensive, inefficient, but mandated by law as a last resort.

A System Nobody Designed

Perhaps the strangest thing about American healthcare is that nobody set out to create it. The employer-based system emerged from wartime wage controls. Medicare and Medicaid were political compromises that left the rest of the system intact. The emergency room mandate was a response to hospitals dumping patients on the street. The Affordable Care Act tried to patch the remaining holes without fundamentally restructuring anything.

The result is a Rube Goldberg machine of staggering complexity: multiple payment systems, opaque pricing, misaligned incentives, and costs that continue to grow faster than Americans' ability to pay them. It's a system where brilliant medical advances coexist with people rationing insulin because they can't afford enough of it, where some of the world's best hospitals sit in cities where life expectancy has been declining.

Other countries have shown it's possible to cover everyone for less money with outcomes that are at least as good. The question isn't whether a better system is possible. The question is whether America can summon the political will to build one.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.