System dynamics
Based on Wikipedia: System dynamics
In 1956, a group of managers at General Electric faced a puzzle that had been haunting them for years. Employment at their appliance plants in Kentucky was swinging wildly—up, down, up again—in a mysterious three-year cycle. They blamed the business cycle at first, the way you might blame the weather for a headache. But the explanation didn't fit. The economy wasn't moving in three-year waves. Something else was going on.
Enter Jay Forrester, an electrical engineer who had just joined the newly formed Sloan School of Management at the Massachusetts Institute of Technology. Forrester had an unusual background for a business school professor. He'd spent years building flight simulators and working on early digital computers. He thought in systems—in feedback loops and delays and the way small causes could cascade into large effects.
When Forrester examined the GE plants, he didn't look at external economic forces. Instead, he looked inward, at the company's own decision-making structure. How did managers decide when to hire? When to lay off? How quickly did information flow from one part of the organization to another?
Using nothing more than paper and pencil (the simulation software that would later define the field didn't exist yet), Forrester traced the flow of decisions through the company. And there it was. The three-year employment cycle wasn't something happening to GE from the outside. It was something GE was doing to itself. The instability emerged from the internal structure of the firm: the delays in how managers perceived demand, the rules they used for hiring, the time it took to train new workers.
This revelation—that organizations could generate their own crises through their very structure—became the foundation of a new field called system dynamics.
What System Dynamics Actually Is
At its core, system dynamics is a way of understanding why complex systems behave the way they do over time. It's built on three fundamental concepts: stocks, flows, and feedback loops.
Think of a bathtub. The water in the tub is a stock—it's something that accumulates. The faucet and the drain represent flows—the rates at which water enters and leaves. If the faucet runs faster than the drain empties, the stock rises. If the drain is faster, the stock falls. Simple enough.
Now add feedback. Imagine you're trying to fill the tub to a certain level. You watch the water rise, and when it gets close to your target, you start turning down the faucet. The state of the stock (how full the tub is) influences the flow (how fast water comes in). That's a feedback loop.
The interesting part is that feedback can be positive or negative. Negative feedback—also called balancing feedback—pushes a system toward equilibrium, like a thermostat keeping your house at a constant temperature. Positive feedback—also called reinforcing feedback—amplifies change. Think of compound interest: the more money you have, the more interest you earn, which gives you more money, which earns more interest.
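Both loops fit in a few lines of code. Here is a minimal sketch in Python of the tub and the savings account; the specific rates are arbitrary choices for illustration, not anything from an actual model.

```python
# Balancing (negative) loop: filling a tub toward a target level.
# The closer the stock gets to the goal, the smaller the inflow.
level, target = 0.0, 100.0           # stock and goal, in liters
for minute in range(30):
    inflow = 0.1 * (target - level)  # the flow shrinks as the gap closes
    level += inflow                  # the stock accumulates the flow
print(f"tub after 30 minutes: {level:.1f} L (closing in on {target:.0f})")

# Reinforcing (positive) loop: compound interest.
# The bigger the stock, the bigger the flow into it.
balance, rate = 1000.0, 0.05         # stock in dollars, 5% per year
for year in range(30):
    balance += rate * balance        # the flow grows with the stock itself
print(f"balance after 30 years: ${balance:,.0f}")  # more than quadrupled
```

The balancing loop closes in on its goal and stays put; the reinforcing loop runs away from its starting point exponentially. Same ingredients, a stock and a flow that depends on it, with opposite behavior depending on the sign of the dependence.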
Most real-world systems contain both types of feedback, often interacting in complex ways. And here's where it gets tricky: these systems almost always include delays. Information takes time to travel. Decisions take time to implement. Effects take time to manifest.
Those delays are often where systems go wrong.
The Counterintuitive Nature of Complex Systems
Here's something that surprises most people when they first encounter system dynamics: in complex systems, the structure often matters more than the components. You can have perfectly competent managers making perfectly reasonable decisions, and the system can still oscillate wildly or spiral out of control. The problem isn't the people—it's how the pieces fit together.
Consider the GE case again. Each manager was acting rationally based on the information available to them. When orders increased, they hired more workers. When orders decreased, they laid off workers. Perfectly sensible.
But there were delays everywhere. It took time to recognize that orders were increasing. It took time to hire new workers. It took time for those workers to become productive. And by the time all those delays had played out, the situation had changed. The managers were always responding to yesterday's reality while living in today's.
Even worse, their responses created new problems. Hiring surges created production surges, which meant orders got filled quickly, which meant customers stopped ordering for a while, which looked like a decline in demand, which triggered layoffs. The system was chasing its own tail.
This is why Forrester called his field "system dynamics" rather than something like "better decision-making." The dynamics emerge from the system structure, not from individual choices. You can't fix a structural problem by making better decisions within the existing structure. You have to change the structure itself.
From Factories to the World
For the first decade of its existence, system dynamics remained largely confined to corporate problems. Forrester and his students built computer models—using a language called DYNAMO, short for Dynamic Models—to help companies understand their inventory systems, their hiring practices, their market dynamics.
Then, in 1968, something unexpected happened. John Collins, the former mayor of Boston, arrived at MIT as a visiting professor. Collins had spent years grappling with urban problems—housing, employment, transportation, crime—and he was frustrated. Traditional approaches to urban policy seemed to make things worse as often as they made them better.
Forrester saw a familiar pattern. Cities, like corporations, are complex systems with stocks and flows and feedback loops and delays. And cities, like corporations, often generate their own problems through their very structure.
The collaboration between Forrester and Collins produced a book called Urban Dynamics, which applied system dynamics to cities for the first time. The model was controversial—some of its conclusions challenged conventional wisdom about housing policy—but it demonstrated that the techniques developed for understanding factories could illuminate much broader questions.
The really big leap came two years later.
The Limits to Growth
In 1970, Forrester attended a meeting in Bern, Switzerland, hosted by an organization called the Club of Rome. The Club was a group of industrialists, scientists, and economists who were worried about what they called "the predicament of mankind"—the possibility that human civilization was on a collision course with the planet's finite resources.
Could system dynamics help understand this predicament? Forrester thought so. On the plane home from Switzerland, he sketched out the first draft of a model he called WORLD1—a system dynamics model of the entire global socioeconomic system.
The model tracked five main stocks: population, industrial capital, food production, nonrenewable resources, and pollution. Each influenced the others through dozens of feedback loops. Population growth increased the demand for food and resources. Industrial capital increased the ability to extract resources but also increased pollution. Pollution reduced agricultural output and life expectancy. And so on.
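To get a feel for how loops like these interact, here is a deliberately crude sketch in Python: just two of the five stocks, with invented parameters, nothing like the actual WORLD models. Industrial capital grows by reinvesting what it extracts, and extraction gets harder as the nonrenewable resource thins out.

```python
# A two-stock cartoon: capital reinvests its extraction, and extraction
# depends on how much of the resource remains. All numbers are invented.
resource, capital = 1000.0, 10.0
for step in range(60):
    extraction = 0.5 * capital * (resource / 1000.0)  # harder as the resource thins
    capital += 0.5 * extraction - 0.1 * capital       # reinvestment minus depreciation
    resource -= extraction
    if step % 6 == 0:
        print(f"step {step:2d}: capital={capital:6.1f}  resource={resource:7.1f}")
```

Run it and capital climbs while the resource is abundant, peaks as depletion bites, then decays as there is less and less left to reinvest. Growth, overshoot, and decline, all from the loop structure alone.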
Forrester refined the model and published it as WORLD2. His graduate student Dennis Meadows then led a team that expanded and updated it, producing WORLD3. That model became the basis for a 1972 book called The Limits to Growth, which may be the most famous—and most controversial—system dynamics study ever conducted.
The book's central conclusion was stark: if current trends continued, the world economy would face some form of collapse within the twenty-first century. Not because of any single factor, but because of the interaction between exponential growth and finite limits. The feedback loops that had driven industrial civilization's spectacular expansion would eventually reverse, as resources depleted and pollution accumulated and the system could no longer sustain itself.
The Limits to Growth sold millions of copies and sparked furious debate. Critics attacked its assumptions, its data, and its conclusions. Supporters argued that subsequent decades had largely validated its warnings. What's often missed in these debates is that the book wasn't primarily a prediction. It was an exploration of system structure—an attempt to understand how the feedback loops governing industrial civilization might play out over long time horizons.
How to Read a System
One of the most useful tools to emerge from system dynamics is the causal loop diagram—a visual map of how the parts of a system influence each other.
Imagine you're launching a new product. You might draw a diagram showing two feedback loops. On one side, there's a reinforcing loop: the more people adopt your product, the more they talk about it, which creates more adoption. Word of mouth amplifies growth.
On the other side, there's a balancing loop: as more people adopt, fewer potential customers remain. The pool of possible new customers shrinks as the pool of actual customers grows.
Both loops are always active. But their relative strength changes over time. Early on, when most people haven't heard of your product, the reinforcing loop dominates. Growth accelerates. Later, as you start running out of new potential customers, the balancing loop takes over. Growth slows, then stops.
This dynamic—explosive early growth followed by saturation—produces what mathematicians call an S-curve. It shows up everywhere: in product adoption, in epidemic spread, in population growth, in the diffusion of new ideas. The shape emerges not from any special quality of the thing spreading, but from the structure of the feedback loops governing its spread.
Causal loop diagrams can't tell you exactly what will happen. They're qualitative tools, good for understanding the forces at work in a system but not for making precise predictions. For that, you need to turn the diagrams into equations and run simulations.
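As a sketch of what that translation looks like, here is the two-loop adoption structure written as equations in Python; the market size and the word-of-mouth coefficient are invented for illustration.

```python
# Two loops in one equation: the flow is driven by adopters (reinforcing)
# and limited by the remaining pool (balancing). Numbers are invented.
market = 10_000        # total potential customers (assumed)
adopters = 10.0        # stock: current customers
c = 0.00003            # word-of-mouth strength per adopter-prospect pair (assumed)

for week in range(53):
    remaining = market - adopters             # the balancing loop's shrinking pool
    adoption_rate = c * adopters * remaining  # the flow into the stock
    adopters += adoption_rate
    if week % 13 == 0:
        print(f"week {week:2d}: {adopters:7.0f} adopters")
```

Early on, remaining barely changes and the flow grows almost exponentially; later, remaining collapses toward zero and growth dies out. Print the trajectory and the S-curve falls out, exactly as the diagram predicts.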
The Art of Simulation
Building a system dynamics simulation involves several steps. First, you define the boundary of your system—what's inside the model and what's outside. Then you identify the key stocks: what things accumulate or deplete over time? Then the flows: what causes those stocks to change? Then the feedback loops: how do the stocks influence the flows?
Once you've mapped the structure, you write equations. Every flow needs a mathematical relationship specifying how fast it runs under different conditions. Every feedback loop needs to be captured in those relationships.
Then you estimate parameters. How many potential customers are there? How much does word of mouth influence purchasing decisions? How long does it take for customers to start talking about the product? These estimates might come from data, from expert judgment, from market research, or from educated guesses.
Finally, you run the simulation and see what happens. Then—and this is crucial—you run it again with different parameters. And again with different initial conditions. And again with different policy choices built into the model. The point isn't to predict the future with precision. It's to understand how the system responds to different conditions and interventions.
What if we double our advertising budget? What if a competitor enters the market? What if production delays increase? By playing out these scenarios, you develop intuition for the system's behavior. You start to see which interventions have lasting effects and which get absorbed by feedback loops. You start to understand where the leverage points are—the places where a small change in structure can produce a large change in behavior.
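One way to run those questions, under the same invented assumptions as the adoption sketch above, is to wrap the model in a function and vary one thing at a time.

```python
# Scenario testing on the toy adoption model: one assumption varied per run.
def weeks_to_half_market(market=10_000, c=0.00003, ads_per_week=0.0):
    """Weeks until the toy model reaches 50% market share."""
    adopters = 10.0
    for week in range(1, 1000):
        remaining = market - adopters
        word_of_mouth = c * adopters * remaining
        adopters += word_of_mouth + min(ads_per_week, remaining)
        if adopters >= market / 2:
            return week
    return None  # never got there under these assumptions

print(weeks_to_half_market())                   # baseline
print(weeks_to_half_market(c=0.00006))          # stronger word of mouth
print(weeks_to_half_market(ads_per_week=50.0))  # add a steady paid-acquisition flow
```

In this toy, doubling word of mouth roughly halves the time to saturation, while the fixed advertising flow mostly buys speed in the early weeks, before the reinforcing loop takes over. Whether any of that holds in a real market is exactly what the simulation is meant to sharpen, not settle.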
Where Systems Go Wrong
One of the most valuable contributions of system dynamics has been cataloging the ways that complex systems misbehave. These patterns show up across wildly different domains—in businesses, in ecosystems, in economies, in organizations of all kinds.
There's oscillation: the system swings back and forth around some equilibrium point, never settling down. This often happens when there are delays in feedback loops. By the time you notice a problem and respond to it, the situation has already changed, and your response creates a new problem in the opposite direction.
There's overshoot and collapse: the system grows beyond what's sustainable, then crashes. This happens when reinforcing loops drive growth faster than balancing loops can catch up. It's the pattern at the heart of The Limits to Growth.
There's policy resistance: interventions that should improve the system get absorbed or counteracted by feedback loops, leaving the original problem unchanged. You add more highway lanes to reduce congestion, but the reduced congestion attracts more drivers, and congestion returns to its previous level. The system has a kind of immune response to change.
There's the tragedy of the commons: individually rational behavior that's collectively destructive. Each actor benefits from exploiting a shared resource, but the collective result of everyone exploiting it is the resource's destruction. Without feedback mechanisms that internalize the collective cost, the system consumes itself.
Understanding these patterns doesn't automatically tell you how to fix them. But it does help you avoid the common mistake of treating symptoms rather than causes. If you're fighting policy resistance, for example, adding more force behind your intervention won't help. You need to change the feedback structure that's resisting you.
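The first of those patterns, oscillation from delay, is easy to see in code. A minimal sketch with invented numbers: a stock managed toward a goal, where each correction takes four periods to arrive.

```python
from collections import deque

# A goal-seeking stock with a delay between ordering and receiving.
stock, goal = 0.0, 100.0
pipeline = deque([0.0] * 4)        # corrections in transit: four periods of delay
for t in range(40):
    order = 0.5 * (goal - stock)   # respond to the gap we see right now...
    pipeline.append(order)
    stock += pipeline.popleft()    # ...but receive what was ordered four periods ago
    if t % 2 == 0:
        print(f"t={t:2d}  stock={stock:7.1f}")
```

The stock shoots past the goal, swings back, and keeps swinging in widening arcs: every correction is chasing a gap that no longer exists. Set the delay to zero (start with an empty deque) and the identical decision rule converges smoothly.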
The Economist Who Saw the Crisis Coming
In the early 2000s, while most economists were confident that the era of major financial crises was over—a period some called "the Great Moderation"—an Australian economist named Steve Keen was building system dynamics models of the economy that pointed in a very different direction.
Keen's approach, which he implemented in software called Minsky (named after the economist Hyman Minsky), modeled the economy as a system of stocks and flows with feedback loops and time delays. Crucially, his models included something most mainstream economic models left out: the dynamics of private debt.
In Keen's models, periods of stability bred instability. When times were good, businesses and households took on more debt. The easy credit fueled more spending, which made times even better, which encouraged even more borrowing. A reinforcing loop. But the debt accumulated as a stock, growing larger and larger relative to income. Eventually, the burden of servicing that debt would overwhelm the system's ability to grow. The reinforcing loop would reverse, and the economy would contract—sometimes violently.
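A cartoon of that loop, with invented parameters and none of the detail of Keen's actual models, is enough to reproduce the shape the paragraph describes.

```python
# Boom then bust from one reinforcing loop and one accumulating stock.
# All parameters are invented; this is a sketch, not Keen's model.
income, debt = 100.0, 50.0
for year in range(31):
    confidence = min(1.0, max(0.0, 1.0 - debt / (3.0 * income)))  # appetite fades as leverage climbs
    borrowing = 10.0 * confidence       # flow into the debt stock
    debt += borrowing - 0.02 * debt     # new credit minus repayment
    income += borrowing - 0.05 * debt   # credit-fueled demand minus debt service
    if year % 5 == 0:
        print(f"year {year:2d}: income={income:6.1f}  debt/income={debt / income:4.2f}")
```

Income climbs for years while the debt-to-income ratio quietly ratchets up, then peaks and contracts once servicing the stock outweighs the flow of new credit. The good years are what set up the bad ones, which is the point.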
Keen predicted the 2008 financial crisis years before it happened. He wasn't the only one to see it coming, but his system dynamics approach gave him a structural understanding of why a crisis was likely, not just a hunch that something was wrong.
Thinking in Systems
You don't need computer simulations to benefit from system dynamics. Some of its most valuable applications are what practitioners call "back of the envelope"—quick, qualitative analyses that help clarify thinking about complex situations.
Drawing a causal loop diagram forces you to make your mental model explicit. What do you think causes what? Where do you think the feedback loops are? Where are the delays? Often, the act of drawing the diagram reveals hidden assumptions or contradictions in how you're thinking about a problem.
It also helps with communication. Two people arguing about a policy might simply have different mental models of how the system works. Making those models explicit—as diagrams that can be examined and discussed—can shift the argument from conclusions to assumptions. It's easier to find common ground when you can see exactly where your views diverge.
Perhaps most importantly, system dynamics teaches a certain humility about intervention. Complex systems often behave counterintuitively. The obvious intervention sometimes makes things worse. The leverage point is often not where you'd expect. Delays mean that cause and effect can be separated by years or decades, making it hard to connect actions to outcomes.
This doesn't mean you shouldn't intervene. It means you should intervene thoughtfully, with an understanding of the feedback structures you're working within, and with a readiness to adjust when the system responds in ways you didn't anticipate.
The Structure Beneath
What makes system dynamics powerful is also what makes it difficult. The insight that structure determines behavior is liberating—it means you're not just at the mercy of external forces—but it's also demanding. Changing structure is harder than changing decisions. It requires seeing the invisible architecture of policies, practices, information flows, and incentive systems that shape how a system behaves.
Jay Forrester, who died in 2016 at the age of ninety-eight, spent six decades developing and applying these ideas. His work influenced fields far beyond business, from urban planning to environmental policy to public health. The software tools for building system dynamics models have grown more sophisticated and accessible. The ideas have spread into education through programs like the one at MIT and books like Peter Senge's The Fifth Discipline.
But the core insight remains what it was in 1956, when Forrester sat down with pencil and paper to trace the flow of decisions through a Kentucky appliance factory. The behavior you see emerges from the structure you can't see. If you want to change the behavior, you have to understand the structure. And if you want to understand the structure, you have to think in systems.