Paradigm shift
Based on Wikipedia: Paradigm shift
The Death of Scientific Certainty
In 1900, eminent physicists were advising students to pursue other careers. Physics, they said, was essentially complete. A few minor details remained—some troublesome measurements here, an unexplained phenomenon there—but the grand edifice of understanding the universe was finished. The foundation was solid. The roof was on. All that remained was interior decorating.
Within five years, a patent clerk in Switzerland would demolish that entire building.
This is the story of how science actually works—not as a steady march toward truth, but as a series of violent revolutions punctuated by long periods of comfortable certainty. It's the story of how the very ideas that make science powerful also make scientists blind. And it begins with a physicist turned philosopher named Thomas Kuhn, who looked at the history of science and saw something that made his colleagues profoundly uncomfortable.
What Kuhn Actually Said
In 1962, Kuhn published The Structure of Scientific Revolutions, a book that would itself become one of the most influential—and misunderstood—works in twentieth-century thought. His central idea was simple, even obvious in retrospect, yet it upended how we think about scientific progress.
Science, Kuhn argued, doesn't advance through the gradual accumulation of facts. Instead, it lurches forward through revolutions.
Most of the time, scientists work within what Kuhn called "normal science." They share a paradigm—a framework of assumptions, methods, and accepted truths that tells them what questions are worth asking and what answers are acceptable. The paradigm is like a pair of glasses through which scientists see the world. It brings certain things into sharp focus while rendering others invisible.
A chemist and a physicist looking at a helium atom aren't seeing the same thing. Each views it through the lens of their discipline's paradigm, with different questions and different tools. This isn't a flaw. It's what makes specialized science possible. The paradigm tells you what to measure, how to interpret your measurements, and what to do when something doesn't quite fit.
And that last part is crucial.
The Problem with Anomalies
Every paradigm produces anomalies—observations that don't quite fit the accepted framework. A measurement that's slightly off. An experiment that gives unexpected results. A phenomenon that the current theory can't explain.
Scientists are remarkably good at explaining away anomalies. They invoke experimental error, unknown variables, special circumstances. They adjust the theory around the edges. They file the troublesome data in desk drawers and move on to more productive questions.
This isn't dishonesty. It's necessity.
If scientists abandoned their paradigm every time they encountered something puzzling, science would be impossible. You'd never build anything because you'd constantly be tearing down the foundation. The paradigm provides the stability that allows detailed investigation. Most anomalies, after all, do turn out to be experimental errors or incomplete understanding.
But sometimes they don't.
Sometimes the anomalies accumulate. The desk drawers fill up. The patches on the theory become more elaborate than the theory itself. And eventually, the scientific community enters what Kuhn called a state of crisis.
When the Ground Shifts
Crisis in science looks different from crisis in other fields. It's not panic. It's more like a strange, uncomfortable loosening. The things everyone knew to be true start seeming less certain. Young researchers begin questioning assumptions their mentors never thought to examine. Philosophical debates that seemed irrelevant suddenly matter intensely.
Kuhn called the work done during crisis "extraordinary research," and he considered it more important than the paradigm shifts it eventually produces. During extraordinary research, scientists push beyond the boundaries of normal science. They try wild ideas. They resurrect abandoned theories. They design experiments that would have seemed pointless under the old paradigm.
The proliferation of competing ideas, the willingness to try anything, the explicit discontent, the recourse to philosophy and debate over fundamentals—this is science at its most creative and its most uncertain.
Eventually, someone proposes a new paradigm. A new way of seeing that makes the anomalies disappear—not by explaining them away, but by reframing the entire problem. What looked like errors in the old paradigm become features of the new one.
And then something remarkable happens.
The Gestalt Switch
You've probably seen those optical illusions—the image that's either a young woman or an old crone, the shape that's either a duck or a rabbit. You can see it either way, but you can't see it both ways simultaneously. The shift between them is sudden and complete.
Kuhn argued that paradigm shifts work the same way.
Scientists who adopt a new paradigm don't just add new information to their existing understanding. They see the world differently. The same observations that made sense under the old paradigm now mean something entirely different. It's not that they have more data. It's that they've changed what counts as data.
This is why paradigm shifts are so difficult, and why they often require generational change. The scientists who built their careers within the old paradigm genuinely cannot see what the revolutionaries are pointing at. It's not stubbornness or stupidity. It's that their intellectual glasses are ground to a different prescription.
Max Planck, himself a revolutionary in physics, put it bluntly: a new scientific truth does not triumph by convincing its opponents and making them see the light. It triumphs because its opponents eventually die, and a new generation grows up that is familiar with it.
A History of Revolutions
Once you know what to look for, you see paradigm shifts everywhere in the history of science.
Consider astronomy. For over a thousand years, educated Europeans knew that the Earth sat motionless at the center of the universe while the sun, moon, planets, and stars revolved around it. This wasn't ignorance—it was sophisticated science. The Ptolemaic system could predict planetary positions with impressive accuracy. It explained why objects fell toward the Earth's center. It aligned with common sense: we don't feel the Earth moving.
Then in 1543, Copernicus proposed that the Earth moves around the sun. This wasn't immediately better at prediction. In some ways it was worse. But it offered a different kind of elegance—a simpler explanation for why planets sometimes appear to move backward against the stars, a more unified account of celestial motion.
The shift took over a century. Galileo's telescopic observations helped. Kepler's elliptical orbits helped more. Newton's physics, published in 1687, finally provided a theoretical framework that made the sun-centered cosmos not just possible but necessary. By then, the old paradigm seemed not just wrong but almost incomprehensible. How could anyone have believed the Earth stood still?
That's the thing about paradigm shifts. They don't just change what we think is true. They change what we think is obvious.
The Chemical Revolution
In the eighteenth century, chemists explained combustion through phlogiston—a substance released when things burn. Wood contains phlogiston. When wood burns, the phlogiston escapes, leaving ash behind. Metals contain phlogiston. When metals rust, they release their phlogiston into the air.
The theory worked surprisingly well. It explained why things got lighter when they burned (they lost phlogiston). It explained why you couldn't burn things in a sealed container (the air became saturated with phlogiston). It even explained why metals, when heated with charcoal, could be recovered from their rust (the charcoal's phlogiston transferred to the metal).
There was just one problem. Some things—like metals—actually get heavier when they burn.
Antoine Lavoisier proposed a different account. Combustion, he argued, wasn't about releasing something. It was about combining with something—specifically, with a component of air he called oxygen. What we call rust isn't metal minus phlogiston. It's metal plus oxygen.
This wasn't just a different explanation. It was a different chemistry. The very concept of an element changed. The questions worth asking changed. And within a generation, phlogiston theory seemed not just wrong but absurd.
The Biological Revolution
Before Darwin, life on Earth was understood as the product of design. Species were fixed types, created for their particular roles. The exquisite fit between organisms and their environments—the eye for seeing, the wing for flying—demonstrated the wisdom of a creator.
Darwin proposed that this fit arose through a different mechanism entirely: natural selection. Organisms vary. Some variations help survival and reproduction. Those variations get passed to offspring. Over vast stretches of time, this mindless process produces the appearance of design.
The shift here was profound. It wasn't just that species changed over time—that idea had been around for decades. It was that the change was purposeless. Evolution had no goal, no direction, no improvement in mind. The apparent progress from simple to complex was an illusion produced by selection's blind tinkering.
Many people still haven't fully absorbed this shift. We still speak of species being "more evolved" or evolution "wanting" to produce certain outcomes. The old paradigm's categories persist in our language even when we consciously reject its claims.
The Physics Revolutions
The twentieth century brought paradigm shifts so rapid and so fundamental that physics became almost unrecognizable within a single generation.
Classical mechanics, perfected by Newton and refined for two centuries, described a universe of absolute space and time. Objects had definite positions and velocities. Causes preceded effects. The future was determined by the present, knowable in principle if you had enough information.
Then came quantum mechanics, which replaced certainty with probability. An electron doesn't have a definite position until you measure it. It exists as a cloud of possibility, collapsing into actuality only through observation. Einstein never accepted this. "God does not play dice with the universe," he famously objected. But the universe, it turned out, didn't care what Einstein preferred.
Simultaneously, Einstein's own relativity theory abolished absolute space and time. What you measure depends on how you're moving. Two events that seem simultaneous to one observer happen at different times for another. Space and time curve in response to matter. Gravity isn't a force—it's geometry.
These revolutions overlapped with the discovery that our galaxy is just one among billions, that the universe began in a hot dense state and has been expanding ever since, and that the fundamental constituents of matter are quarks and leptons rather than the indivisible atoms from which atomic theory takes its name.
Each shift was resisted. Each eventually triumphed. And each made the previous common sense seem quaint.
Beyond the Natural Sciences
Kuhn explicitly limited his analysis to the natural sciences, arguing that fields like philosophy and sociology lacked the single dominant paradigm that makes revolutionary change possible. In these fields, he suggested, you have ongoing debates over fundamentals rather than long periods of consensus punctuated by upheaval.
But researchers in the social sciences couldn't resist the concept. They saw paradigm shifts in their own fields—or at least claimed to.
Psychology experienced what its practitioners call the cognitive revolution, a shift from behaviorism's exclusive focus on observable actions to the study of mental processes. For decades, behaviorists argued that talk of thoughts, beliefs, and intentions was unscientific—you could only study what you could measure. Then, gradually and then suddenly, the field decided that internal mental states weren't just real but central to understanding human behavior.
Anthropology underwent its own transformation, moving from theories that ranked human societies on a scale from primitive to advanced toward a more relativistic approach that saw each culture as a coherent system worthy of understanding on its own terms. Franz Boas and his students dismantled the scientific racism of earlier anthropology, arguing that cultural differences reflected history and environment rather than biological hierarchy.
Even economics saw something like a paradigm shift in the late nineteenth century, when the marginal revolution replaced classical theories of value based on labor with new theories based on marginal utility—the value of one more unit of something depends on how much you already have.
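The idea can be sketched numerically. Using a logarithmic utility function—a standard textbook illustration, not anything drawn from the marginalists' own writings—each additional unit of a good adds less value than the one before:

```python
import math

def utility(units):
    """Total satisfaction from holding `units` of some good.
    A logarithmic curve is one conventional illustrative choice."""
    return math.log(1 + units)

def marginal_utility(units):
    """The value of one more unit, given you already have `units`."""
    return utility(units + 1) - utility(units)

# The first unit is worth far more than the tenth:
first = marginal_utility(0)   # going from 0 to 1 unit
tenth = marginal_utility(9)   # going from 9 to 10 units
print(f"first unit: {first:.3f}, tenth unit: {tenth:.3f}")
```

The numbers themselves don't matter; what matters is the shape of the curve—marginal utility falls as holdings rise, which is exactly the dependence the marginal revolution put at the center of value theory.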
Whether these qualify as true paradigm shifts in Kuhn's sense is debatable. The social sciences might be too fractured, too multi-paradigmatic, for Kuhnian revolutions. But the concept has proven irresistible.
What Paradigm Shifts Are Not
Kuhn's work is frequently misunderstood, sometimes willfully. Two misreadings are especially common and especially damaging.
The first is relativism. If paradigms are incommensurable—if scientists working in different paradigms are in some sense living in different worlds—then doesn't that mean all paradigms are equally valid? Isn't science just another belief system, no more objectively true than mythology or religion?
Kuhn vehemently rejected this conclusion. When a new paradigm replaces an old one, he insisted, the new one is always better—not just different. It explains more. It predicts more accurately. It opens new avenues for research. The shift from Ptolemy to Copernicus to Newton wasn't arbitrary. Each step represented genuine progress in our understanding of the cosmos.
The second misreading is that paradigm shifts prove science is irrational—that scientific change is merely sociological, a matter of fashion and power rather than evidence and reason. Again, Kuhn disagreed. The process is social, yes. But social processes can track truth. The reason scientists eventually adopt new paradigms is that those paradigms work better. The social dynamics are mechanisms through which evidence exerts its influence, not substitutes for evidence.
The philosopher Donald Davidson argued that truly incommensurable paradigms are impossible. If two frameworks were genuinely untranslatable, we couldn't even recognize the second as an alternative to the first. The very fact that we can compare paradigms, that we can explain what Aristotle believed and why it differs from what Newton believed, shows that they share enough common ground for rational comparison.
The Paradigm Shift as Metaphor
Perhaps inevitably, the concept of paradigm shift escaped the philosophy of science and entered everyday language. Business consultants speak of paradigm shifts in management thinking. Political commentators identify paradigm shifts in public opinion. Personal development gurus promise to help you achieve a paradigm shift in your own consciousness.
Some of this is mere buzzword inflation—a fancy way of saying "big change." But the metaphor captures something real. There are moments when our frameworks for understanding genuinely shift, when we don't just learn new facts but reorganize how we interpret all facts.
The shift from treating disease as divine punishment to understanding it as germ infection was a paradigm shift with profound implications far beyond medicine. It changed how we think about causation, responsibility, and the nature of misfortune itself.
The emergence of evidence-based medicine represents a subtler shift—from trusting clinical judgment and established practice to demanding rigorous controlled trials. This hasn't changed our understanding of disease but has transformed how we validate treatments.
Artificial intelligence is currently undergoing something similar. For decades, the dominant paradigm was knowledge-based: encode human expertise into rules that machines can follow. Since around 2010, a new paradigm has emerged: train statistical models on massive datasets and let them discover patterns humans never specified. This shift affects not just how we build AI systems but what we think intelligence is.
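The structural difference between the two paradigms can be sketched with a deliberately toy, made-up task—flagging a message as spam by its number of exclamation marks. Real systems fit millions of parameters rather than one threshold, but the contrast is the same: in one paradigm a human specifies the decision criterion, in the other it is discovered from labeled data.

```python
# Knowledge-based paradigm: a human expert encodes the rule directly.
def rule_based_spam(exclamations):
    return exclamations >= 3  # threshold chosen by the expert

# Statistical paradigm: the threshold is *learned* from examples.
def learn_threshold(examples):
    """Pick the integer threshold that misclassifies fewest examples.
    `examples` is a list of (exclamation_count, is_spam) pairs."""
    counts = [c for c, _ in examples]
    best_t, best_errors = 0, len(examples) + 1
    for t in range(max(counts) + 2):
        errors = sum((c >= t) != spam for c, spam in examples)
        if errors < best_errors:
            best_t, best_errors = t, errors
    return best_t

data = [(0, False), (1, False), (2, False),
        (4, True), (5, True), (7, True)]
t = learn_threshold(data)
learned_spam = lambda c: c >= t
print("learned threshold:", t)  # discovered from data, not specified by hand
```

Both functions end up making similar decisions on this data—the paradigm shift is in where the criterion comes from, not necessarily in what it decides.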
Living Through Revolutions
What does it feel like to live through a paradigm shift? For most people, nothing. The revolutions happen in specialized communities. The rest of us encounter only their aftermath, the new common sense that replaced the old.
But for those within the field, it's profoundly disorienting. The things you spent decades learning aren't wrong in some minor way that requires updating. They're wrong in a way that makes your expertise feel obsolete. The questions you devoted your career to answering turn out to be the wrong questions. Your hard-won intuitions, trained on the old paradigm, lead you astray.
This is why established scientists often resist new paradigms while young scientists embrace them. It's not that old scientists are stupid or stubborn. It's that they have more invested in the old framework. They've built their professional identities around it. A paradigm shift threatens everything they've accomplished.
Young scientists have less to lose. They haven't yet committed to the old paradigm. They can see the anomalies with fresh eyes, unclouded by decades of explaining them away. And they have careers to build, careers that might be built better on new foundations.
This suggests something uncomfortable about the structure of scientific progress. It advances, as Planck noted, funeral by funeral. Each generation must die before certain truths can be accepted. The institutions that transmit knowledge also resist its transformation.
The Revolution That Wasn't
Here's an irony that Kuhn himself noted. His own work—The Structure of Scientific Revolutions—caused or participated in a major shift in how historians and sociologists understand science. Before Kuhn, the history of science was often written as a triumphant march toward truth. After Kuhn, it became a more complex story of competing frameworks, social negotiations, and discontinuous change.
But did this qualify as a paradigm shift in Kuhn's technical sense? Kuhn didn't think so. The history and sociology of science, like other social sciences, never had the single dominant paradigm that makes Kuhnian revolutions possible. Scholars still use pre-Kuhnian approaches. Multiple frameworks coexist. The field is too pluralistic for the dramatic gestalt switches that characterize natural science revolutions.
This highlights a limitation of the paradigm shift concept. It applies most cleanly to mature natural sciences—physics, chemistry, biology—where strong consensus exists before being disrupted. In fields with ongoing methodological debates, in fields where multiple schools of thought persist indefinitely, the concept loses its precision.
Eventually, even historians and philosophers of science stepped back from Kuhn's sharpest claims. The revolutionary model gave way to a synthesis—yes, there are periods of rapid conceptual change, but they're connected to periods of normal science by more gradual transitions than Kuhn initially suggested. Science is neither pure accumulation nor pure revolution but something more complex.
Perhaps that's the fate of all paradigm shifts. They clarify something important about how knowledge changes. But in doing so, they oversimplify. And then, over time, a more nuanced picture emerges—one that incorporates the revolutionary insight while qualifying its more extreme implications.
Which is to say: even our understanding of paradigm shifts has shifted.