Real wages
Based on Wikipedia: Real wages
The Paycheck Illusion
Here's a puzzle that affects everyone who has ever earned a wage: In 2005, the average American worker's paycheck grew by 2.7 percent. In 2015, it grew by just 2.1 percent. So workers were falling behind in 2015, right?
Wrong. They were actually getting ahead.
This counterintuitive truth reveals one of the most important—and most misunderstood—concepts in economics: the difference between what your paycheck says and what your paycheck actually buys.
Nominal Versus Real: The Money Mirage
When economists talk about wages, they distinguish between two very different things. Nominal wages are the raw numbers on your paycheck—the dollars and cents your employer deposits into your account. Real wages are something far more meaningful: they measure how much stuff you can actually buy with that money.
The gap between these two concepts is inflation.
Let's return to our puzzle. In 2005, paychecks grew 2.7 percent, but inflation was running at 3.4 percent. Prices were rising faster than wages. Workers could actually afford less at the end of the year than at the beginning—despite technically making more money. In 2015, paychecks grew a more modest 2.1 percent, but inflation was nearly zero, just 0.1 percent. Almost all of that wage increase translated into actual purchasing power.
The nominal raise in 2005 was a mirage. The smaller raise in 2015 was real.
A Simple Calculation, A Complicated Reality
The basic formula seems straightforward enough: take your nominal wage and divide it by a price index that tracks inflation, and you get your real wage. Roughly speaking, your real raise is your nominal raise minus the inflation rate. If you're making twenty thousand dollars and inflation is two percent, next year you'd need twenty thousand four hundred dollars just to stay even. Anything above that means your real wages are growing.
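To make the arithmetic concrete, here is a minimal sketch in Python using the figures quoted above. It applies the exact ratio adjustment rather than simple subtraction; which price index counts as "inflation" is, as the next section argues, the real question.

```python
def real_growth(nominal_growth, inflation):
    """Real wage growth implied by a nominal raise and an inflation rate."""
    return (1 + nominal_growth) / (1 + inflation) - 1

# The puzzle from the opening, using the figures quoted above.
print(f"2005: {real_growth(0.027, 0.034):+.2%}")   # about -0.7%: falling behind
print(f"2015: {real_growth(0.021, 0.001):+.2%}")   # about +2.0%: getting ahead

# The twenty-thousand-dollar example: the wage needed next year to stay even.
print(f"Break-even wage: ${20_000 * 1.02:,.0f}")   # $20,400
```

At rates this low, the subtraction shortcut (2.7 minus 3.4) gives nearly the same answer; the ratio form only starts to matter when inflation runs high.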
But here's where things get philosophically interesting. How exactly do we measure inflation?
Inflation isn't one number—it's an average across thousands of different goods and services. The price of gasoline might be up fifteen percent while the price of televisions is down twenty percent. Rent might be skyrocketing while food costs barely budge. Which price changes matter most depends entirely on what you personally buy.
This creates a strange situation where real wages aren't precisely defined. Different ways of calculating inflation produce different measures of how much your purchasing power actually changed. For someone who drives a lot, gas prices matter enormously. For someone who works from home, they barely register.
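A small illustration makes the point. The price changes and spending weights below are made up for the sake of the example, not sourced data; they show how the same raise can look very different to two people who buy different things.

```python
# Hypothetical price changes, echoing the examples above.
price_changes = {"gasoline": 0.15, "televisions": -0.20, "rent": 0.08, "food": 0.01}

# Two hypothetical consumers with different spending shares (each sums to 1.0).
commuter = {"gasoline": 0.30, "televisions": 0.05, "rent": 0.40, "food": 0.25}
homebody = {"gasoline": 0.02, "televisions": 0.08, "rent": 0.55, "food": 0.35}

def personal_inflation(weights):
    # Weighted average of price changes, using this person's own spending shares.
    return sum(weights[item] * change for item, change in price_changes.items())

nominal_raise = 0.03  # the same 3 percent raise for both workers
for name, weights in [("commuter", commuter), ("homebody", homebody)]:
    infl = personal_inflation(weights)
    real = (1 + nominal_raise) / (1 + infl) - 1
    print(f"{name}: personal inflation {infl:.1%}, real raise {real:+.1%}")
```

The heavy driver's raise is swallowed by gas prices; the homebody comes out roughly even. Same paycheck, different real wage.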
Economists have found one situation where real wages can be said to have unequivocally increased: when a worker can now afford every possible combination of goods they could barely afford before, and still have money left over. In that case, no matter how you calculate inflation, real wages went up. But this clear-cut scenario is relatively rare. More often, workers gain the ability to afford some things while losing the ability to afford others—and whether that counts as progress depends on whose consumption patterns you're measuring.
Time as Money: Another Way to See It
There's a clever alternative to wading through inflation adjustments: just measure how long you have to work to buy something.
Consider a television. In 1970, an average American worker might have needed to work several hundred hours to afford a decent TV set. Today, a far superior television—larger, flatter, with better resolution—might cost just a few dozen hours of labor at the average wage. By this measure, TVs have become dramatically cheaper in real terms.
The same calculation works for most consumer goods. Clothing, appliances, electronics, even cars—nearly everything takes less work time to earn today than it did decades ago. This helps explain why people who earned what seem like poverty wages by today's nominal standards often lived reasonably comfortable lives. The cost of living was genuinely lower.
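The time-cost calculation itself is about as simple as arithmetic gets. The numbers below are purely illustrative, not sourced figures; they just mirror the shape of the television example above.

```python
def hours_to_afford(price, hourly_wage):
    """Work-hours of labor needed to buy an item at a given hourly wage."""
    return price / hourly_wage

# Illustrative only: an expensive set bought on a low hourly wage then,
# versus a cheaper and better set bought on a higher hourly wage now.
print(f"Then: {hours_to_afford(600, 2.00):.0f} hours of work")   # 300 hours
print(f"Now:  {hours_to_afford(400, 28.00):.0f} hours of work")  # about 14 hours
```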
But there are important exceptions. Housing in major cities, healthcare, and higher education have all become more expensive in terms of work-hours required. A middle-class family in 1970 could often afford a home on a single income. Today, even dual-income households struggle to buy comparable housing in the same neighborhoods.
The Two Great Eras of Wage History
When economists look at the very long run—centuries rather than decades—they see two dramatically different phases of wage history.
The first phase, sometimes called the Malthusian era after the economist Thomas Malthus, lasted from ancient times until roughly 1800. During this vast span of human history, real wages barely budged. It wasn't that productivity never improved—it did, slowly. But every time people became slightly more productive, the extra resources allowed populations to grow, which pushed wages back down. More food meant more mouths, which meant the same amount of food per mouth.
This was a world where your great-great-grandparents earned roughly what their great-great-grandparents had earned. A peasant in medieval England had a standard of living not dramatically different from a peasant in ancient Rome or classical Greece. Real wages were essentially flat for millennia.
Then came the Industrial Revolution.
The second phase, sometimes called the Solow era after the economist Robert Solow, who studied economic growth, represents a complete break with prior human history. Starting around 1800 and accelerating through the nineteenth and twentieth centuries, real wages began climbing rapidly. Productivity gains now outpaced population growth. Each generation could afford things their parents could only dream of.
The difference is staggering. Over the century and a half from 1800 to 1950, real wages in industrialized countries roughly tripled. In the subsequent seventy years, they tripled again. A factory worker in 2020 could afford amenities—climate control, refrigeration, instant communication, rapid transportation—that would have seemed like magic to the wealthiest aristocrats of 1800.
The Productivity Puzzle
Something strange happened in the 1970s, at least in the United States and much of the developed world. The Solow-era pattern began to break down.
According to basic economic theory, workers should be paid roughly in proportion to what they produce. As productivity rises, wages should rise with it. And for decades after World War Two, this is exactly what happened. Workers produced more and earned more, in rough proportion.
Then the link snapped.
The Economic Policy Institute, a labor-focused think tank, calculated that between 1973 and 2013, productivity in America grew by about 74 percent while hourly compensation grew by just over 9 percent. Workers were producing dramatically more value but capturing only a small fraction of the gains.
The Heritage Foundation, a conservative think tank, disputes this analysis. Using different methods for adjusting inflation and including the rising cost of benefits like health insurance, they found that compensation grew about 77 percent while productivity grew about 100 percent over a similar period. Still a gap, but a much smaller one.
The disagreement illustrates just how much the details of measurement matter. But both analyses agree on the basic point: since the 1970s, the tight historical link between productivity and wages has weakened or broken.
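The mechanics behind that disagreement are easy to see with a toy example. The numbers below are hypothetical, not the actual EPI or Heritage series; they simply show how much the choice of price index used to deflate the same nominal pay series can move the answer.

```python
# Hypothetical cumulative growth factors over some multi-decade period.
nominal_compensation = 4.0    # nominal pay ends 4.0x where it started
consumer_price_index = 3.5    # consumer prices rise 3.5x
output_price_deflator = 3.0   # prices of what workers produce rise 3.0x

# The same nominal pay series looks very different depending on the deflator.
print(f"Deflated by consumer prices: {nominal_compensation / consumer_price_index - 1:+.0%}")
print(f"Deflated by output prices:   {nominal_compensation / output_price_deflator - 1:+.0%}")
```

Add in disputes about whether to count benefits as pay, and two careful analyses of the same economy can land tens of percentage points apart.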
Why Wages Stagnate
Economists have proposed many explanations for wage stagnation. No single factor tells the whole story, but several seem to matter.
The decline of labor unions has reduced workers' bargaining power. In 1954, about one in three American workers belonged to a union. By 2020, that figure had fallen to roughly one in ten. Unions historically pushed wages up not just for their own members but for non-union workers too, by establishing norms and creating competitive pressure.
The shrinking of manufacturing employment has eliminated many middle-wage jobs. Factory work historically paid well despite not requiring advanced credentials. As manufacturing moved overseas or became automated, workers shifted into service jobs that often pay less.
Non-compete agreements and other restrictions on job mobility have made it harder for workers to leverage outside offers for higher pay. When workers can't credibly threaten to leave, employers have less reason to raise wages.
And rising benefit costs—particularly health insurance—mean that even when total compensation rises, less of it shows up in the paycheck. American workers today often receive more of their compensation as benefits rather than cash, making wage stagnation look more severe than total compensation stagnation.
The Global Picture
Wage stagnation hasn't hit every country equally.
In the years following the Great Recession of 2008, the global average for real wage growth was about 2 percent annually. But this average hides enormous variation. Asia—particularly China—experienced robust real wage growth of over 6 percent per year from 2006 to 2013. Workers there were rapidly getting richer.
The wealthy countries of the Organisation for Economic Co-operation and Development, or OECD, saw real wages grow by just 0.2 percent in 2013. Parts of Africa, Eastern Europe, Central Asia, and Latin America also lagged far behind the global average, with growth under 0.9 percent.
The United Kingdom experienced particularly sharp pain. Between 2007 and 2015, British real wages fell by about 10 percent—a decline matched among developed countries only by Greece, which was experiencing an outright economic collapse.
The International Labour Organization, a United Nations agency focused on worker welfare, observed a broader pattern: in developed economies, labor's share of total economic output was declining while capital's share was rising. In simpler terms, workers were getting a smaller slice of the pie while owners of businesses and property were getting more.
Unemployment, Underemployment, and the Hidden Slack
One important development since the 2008 crisis: wages have become more sensitive to unemployment.
A 2014 study of the British economy found that before 2003, if the unemployment rate doubled, median wages would fall by about 7 percent. After 2003, the same doubling in unemployment would cause wages to fall by about 12 percent. Workers' bargaining position had weakened; they now suffered more during downturns and gained less during recoveries.
But unemployment statistics themselves can be misleading. A 2018 paper argued that the real culprit behind wage stagnation wasn't unemployment per se but underemployment—people working part-time who wanted full-time work, or people in jobs below their skill level.
By 2017, unemployment rates in many OECD countries had returned to their pre-2008 levels. Headline numbers looked healthy. But underemployment rates remained elevated. The labor market looked tighter than it actually was.
This hidden slack—people who technically had jobs but wanted more or better work—kept wages from rising even as official unemployment fell. The competition for scarce good jobs kept workers from demanding higher pay.
What Real Wages Tell Us
Understanding real versus nominal wages matters because it cuts through one of the most common economic illusions: the idea that rising numbers always mean rising prosperity.
When someone says that average wages today are higher than in 1950, that's obviously true in nominal terms. But the relevant question is always: what can those wages buy? A middle-class salary in 1950 could purchase a house, raise a family, and retire comfortably. Whether a middle-class salary today can do the same depends enormously on where you live and what you want to buy.
Real wages force us to think in terms of purchasing power rather than dollar amounts. They remind us that money is only a means to an end—a claim on goods and services—and that the value of that claim changes constantly.
They also reveal the limits of economic measurement. Any single number for real wages requires making choices about how to weight different price changes, and those choices embed assumptions about whose consumption patterns matter. The real wage experience of someone paying Bay Area rent is very different from the real wage experience of someone in rural Nebraska, even if their nominal wages are identical.
Perhaps most importantly, thinking about real wages over long time periods reminds us just how extraordinary the modern era is. For most of human history, real wages didn't grow. People lived and died with roughly the same purchasing power as their ancestors. The expectation that each generation will be materially better off than the last is historically unprecedented—and as the stagnation since the 1970s shows, not guaranteed to continue.