Wikipedia Deep Dive

Carbon capture and storage

Based on Wikipedia: Carbon capture and storage

Here is an inconvenient truth about one of our most celebrated climate solutions: of all the carbon capture projects ever announced, roughly seventy percent never got built. In the electricity sector alone, the failure rate exceeds ninety-eight percent.

Carbon capture and storage has been discussed for decades as a way to clean up our energy system while still burning fossil fuels. The basic idea sounds elegant: instead of letting carbon dioxide escape from smokestacks into the atmosphere, capture it and bury it underground forever. Problem solved, right?

Not quite.

What Carbon Capture Actually Does

Let's start with what happens at a carbon capture facility. When you burn natural gas or coal, the exhaust contains carbon dioxide mixed with other gases. To capture that carbon dioxide, you typically pass the exhaust through a chemical solvent—usually an amine, which is a nitrogen-based compound that binds to carbon dioxide molecules like a magnet picking up iron filings.

Once the solvent has grabbed the carbon dioxide, you heat it up in a separate chamber. The heat breaks the chemical bond, releasing a concentrated stream of pure carbon dioxide. Then you compress this gas until it becomes what physicists call a supercritical fluid—a strange state of matter that's neither quite liquid nor quite gas, but something in between. In this dense, pressurized form, the carbon dioxide can be pumped through pipelines and injected deep underground.
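For reference, carbon dioxide becomes supercritical once both its temperature and pressure exceed the critical point, which sits at roughly 31 degrees Celsius and 7.4 megapascals. Here is a minimal sketch of that threshold check; the numeric values are standard reference figures, not from the article:

```python
# Critical point of CO2: ~31.0 degrees C, ~7.38 MPa (standard reference values).
CRITICAL_TEMP_C = 31.0
CRITICAL_PRESSURE_MPA = 7.38

def is_supercritical_co2(temp_c: float, pressure_mpa: float) -> bool:
    """CO2 is supercritical when both temperature and pressure
    exceed the critical point."""
    return temp_c > CRITICAL_TEMP_C and pressure_mpa > CRITICAL_PRESSURE_MPA

# Deep-reservoir-like conditions comfortably clear both thresholds;
# ordinary surface conditions clear neither.
print(is_supercritical_co2(40.0, 10.0))   # deep underground
print(is_supercritical_co2(20.0, 0.1))    # ambient surface air
```

This is why storage depth matters: below about eight hundred meters, natural temperature and pressure keep the gas in its dense supercritical form.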

This all requires enormous amounts of energy. Plants with carbon capture burn significantly more fuel than plants without it, which creates a frustrating paradox: you need to pollute more to pollute less.
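To see the shape of that paradox, consider a back-of-the-envelope sketch. The figures below are illustrative assumptions, not numbers from the article: suppose the capture unit removes ninety percent of the flue-gas carbon dioxide but forces the plant to burn twenty-five percent more fuel for the same electricity output:

```python
# Illustrative energy-penalty arithmetic (assumed figures, not from the article).
CAPTURE_RATE = 0.90      # assumed: fraction of flue-gas CO2 captured
ENERGY_PENALTY = 0.25    # assumed: extra fuel burned per unit of electricity

base_emissions = 1.0                             # tons CO2 per MWh without capture
fuel_burned = 1.0 + ENERGY_PENALTY               # relative fuel use with capture
gross_emissions = base_emissions * fuel_burned   # CO2 produced before capture
net_emissions = gross_emissions * (1 - CAPTURE_RATE)  # CO2 actually released

print(f"Gross CO2 produced with capture: {gross_emissions:.3f} t/MWh")
print(f"Net CO2 released to atmosphere:  {net_emissions:.3f} t/MWh")
```

Under these assumed numbers the plant produces a quarter more carbon dioxide overall, even though far less of it reaches the atmosphere.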

The Oil Industry's Favorite Climate Solution

Here's something that might raise an eyebrow: ninety percent of the world's carbon capture capacity is connected to the oil and gas industry. This isn't a coincidence.

The technology for separating carbon dioxide from other gases was actually patented in 1930, long before anyone worried about climate change. Natural gas companies needed it because raw natural gas contains carbon dioxide that must be removed before the gas can be sold. For decades, they simply vented this waste carbon dioxide into the atmosphere and called it a day.

Then in 1972, American oil companies discovered something lucrative. If you inject carbon dioxide into an aging oil field, it does something remarkable to the crude oil trapped in the rock. The carbon dioxide dissolves into the oil, swelling it and lowering its viscosity. It's like adding dish soap to grease—suddenly the oil can flow more easily toward your wells.

This process is called Enhanced Oil Recovery, abbreviated EOR. Today, around eighty percent of all the carbon dioxide captured worldwide gets pumped into oil fields, not to store it, but to squeeze out more petroleum. For every ton of carbon dioxide injected, you typically get about two extra barrels of oil. Burning those two barrels releases approximately one ton of carbon dioxide back into the atmosphere.
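Taken at the article's round numbers, the carbon balance of EOR is roughly a wash. A sketch (real figures vary considerably by field):

```python
# EOR carbon balance using the article's round figures.
co2_injected_tons = 1.0       # tons of CO2 pumped underground
barrels_recovered = 2.0       # extra barrels of oil per ton injected
co2_per_barrel_tons = 0.5     # implied by the article's "about one ton for
                              # two barrels"; ~0.43 t/barrel is a commonly
                              # cited combustion figure

co2_released = barrels_recovered * co2_per_barrel_tons
net_stored = co2_injected_tons - co2_released
print(f"Net CO2 kept out of the atmosphere: {net_stored:.2f} tons")
```

Under these round numbers, injection and re-release nearly cancel, which is why the climate benefit of EOR is so contested.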

So is this actually helping the climate, or is it an elaborate way for oil companies to extract more product while claiming environmental credentials? That question remains genuinely controversial.

The Promise Versus the Reality

In 2005, the Intergovernmental Panel on Climate Change—the United Nations body that assesses climate science—released a major report highlighting carbon capture as a key tool for fighting global warming. Governments got excited. Over the following years, they poured an estimated thirty billion dollars into subsidies for carbon capture and hydrogen projects based on fossil fuels.

The plans were ambitious. By 2020, one hundred forty-nine projects were supposed to be operational, capturing one hundred thirty million tons of carbon dioxide annually. These weren't just proposals—they had funding, timelines, and government backing.

What actually happened? About seventy percent of those projects were never built.

The reasons for failure were varied but instructive. Some projects received one-time grants that weren't enough to cover long-term operating costs. Others couldn't find insurance companies willing to take on liability for carbon dioxide that might leak from underground storage sites centuries in the future. Many faced public opposition from communities that didn't want to live near carbon dioxide pipelines or injection sites. And some simply couldn't make the economics work—capturing carbon turns out to be expensive when you're not also using it to pump oil.

As of 2024, carbon capture is operating at just forty-four plants worldwide. Together, they capture approximately one-thousandth of global carbon dioxide emissions. That's not a typo. After decades of development and billions in subsidies, the technology addresses roughly 0.1 percent of the problem.

How Underground Storage Works

When carbon dioxide is genuinely stored rather than used for oil extraction, it typically goes into deep saline aquifers. These are layers of porous rock, like sandstone, saturated with extremely salty water—too salty to drink or use for agriculture. Picture an underground sponge, soaked with brine, located more than eight hundred meters below the surface.

At that depth, the pressure keeps the carbon dioxide in its supercritical state, dense and unlikely to rise. Above the porous rock sits an impermeable layer called a caprock, usually made of shale or clay. This caprock acts like a lid, preventing the carbon dioxide from migrating upward.

The carbon dioxide doesn't stay still down there. It spreads through the pore spaces in the rock, somewhat like water spreading through a sponge. Over time, several processes help trap it more permanently. Some dissolves into the salty water. Some gets stuck in individual pores and can't move. Very slowly—over thousands or millions of years—some reacts with minerals in the rock to form stable carbonate deposits.

That last process, mineral carbonation, is particularly appealing because once carbon dioxide has turned into rock, it's essentially permanent. But it happens too slowly to rely on in the near term.

The global capacity for this kind of storage is vast, estimated at somewhere between eight thousand and fifty-five thousand gigatons. For perspective, humanity currently emits about forty gigatons of carbon dioxide annually. So in theory, we have room for centuries of emissions underground. In practice, only a fraction of that capacity will prove accessible, economical, and acceptable to nearby communities.
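Dividing the capacity estimates by annual emissions gives that "centuries of room" figure directly, using only the numbers quoted above:

```python
# Theoretical storage lifetime from the article's estimates.
capacity_low_gt, capacity_high_gt = 8_000, 55_000   # gigatons of CO2
annual_emissions_gt = 40                            # gigatons of CO2 per year

years_low = capacity_low_gt / annual_emissions_gt
years_high = capacity_high_gt / annual_emissions_gt
print(f"Room for {years_low:.0f} to {years_high:.0f} years "
      "of current emissions, in theory")
```

The qualifier matters: this is geological capacity, not capacity that is economical, well-sited, and politically acceptable.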

The Sleipner Story

The first large-scale project dedicated purely to climate-motivated carbon storage began in 1996 at the Sleipner natural gas field in the Norwegian North Sea. Norway had introduced a carbon tax in 1991, making it expensive to vent carbon dioxide into the atmosphere. Suddenly, the economics of storage made sense.

The Sleipner project has been operating for nearly three decades, injecting about one million tons of carbon dioxide annually into a saline formation beneath the seabed. It's often cited as proof that the technology works. And it does work—at this single site, under these specific conditions, with these particular geological formations.

But Sleipner also illustrates the limited scale of carbon capture. One million tons annually sounds impressive until you remember that global emissions exceed forty billion tons. You would need forty thousand Sleipner-sized projects to address current emissions. The world has forty-four carbon capture facilities total.
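The scale gap is easy to verify with the two numbers the article gives:

```python
# How many Sleipner-scale projects would current emissions require?
global_emissions_tons = 40e9   # ~40 billion tons of CO2 per year
sleipner_rate_tons = 1e6       # ~1 million tons injected per year at Sleipner

projects_needed = global_emissions_tons / sleipner_rate_tons
print(f"Sleipner-sized projects needed: {projects_needed:,.0f}")
print("Operating carbon capture facilities today: 44")
```

Even if every one of today's forty-four facilities matched Sleipner's rate, they would cover about a tenth of one percent of the need.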

Where Carbon Capture Might Actually Make Sense

Critics often dismiss carbon capture entirely, but that's probably too simple. The technology faces real limitations, but it may have genuine value in specific situations where no good alternatives exist.

Consider cement manufacturing. When you make cement, you heat limestone to extreme temperatures, which causes a chemical reaction that releases carbon dioxide as a byproduct. This isn't from burning fuel—it's from the limestone itself. No amount of solar panels or wind turbines can eliminate this emission. You either capture that carbon dioxide or you don't make cement.

Steel production faces similar challenges. The traditional method uses coal not just for heat but as a chemical ingredient, reacting with iron ore to remove oxygen. Again, carbon dioxide emerges from the chemistry itself, not just from combustion.

These industries—cement, steel, and certain chemical processes—represent what analysts call "hard to abate" sectors. They account for roughly twenty percent of industrial emissions. For these applications, carbon capture may be genuinely necessary, at least until entirely new production methods can be developed and scaled.

The Opportunity Cost Problem

Here's the uncomfortable question that carbon capture advocates don't like to discuss: what else could we do with that money?

Solar and wind power have become remarkably cheap over the past decade. In many parts of the world, building new renewable energy is now less expensive than running existing coal plants. Electric vehicles are rapidly approaching price parity with gasoline cars. Heat pumps can warm and cool buildings far more efficiently than furnaces and air conditioners.

All of these technologies are mature, proven, and deploying rapidly. They don't require capturing anything or burying anything underground. They simply don't produce carbon dioxide in the first place.

When governments subsidize carbon capture, they're making a choice about where to direct limited resources. Thirty billion dollars spent on carbon capture is thirty billion dollars not spent on solar farms, transmission lines, public transit, or building insulation. And the track record strongly suggests that carbon capture delivers far less climate benefit per dollar than these alternatives.

This doesn't mean carbon capture is worthless. It means policymakers should be honest about its limitations and appropriate applications rather than treating it as a general-purpose climate solution.

The Politics of Fossil Fuel Abatement

International climate negotiations have developed a curious vocabulary around carbon capture. The phrase "fossil fuel abatement" appears in various agreements, typically undefined but generally understood to mean continued use of coal, oil, and natural gas with carbon capture attached.

This language matters because it shapes what countries commit to. If "abatement" counts as climate action, then nations can continue extracting and burning fossil fuels indefinitely while claiming progress toward emissions targets. Environmental groups argue this is exactly the point—that carbon capture provides political cover for the fossil fuel industry to keep operating.

The fact that ninety percent of carbon capture capacity is connected to oil and gas production lends weight to this concern. So does the industry's enthusiasm for the technology. When ExxonMobil and Shell champion a climate solution, it's worth asking whose interests that solution primarily serves.

At the same time, dismissing carbon capture entirely may be too simplistic. If the technology can genuinely help decarbonize cement, steel, and chemicals—sectors that collectively employ millions of workers and underpin modern infrastructure—then rejecting it on ideological grounds could slow necessary transitions.

The Leakage Question

When you bury carbon dioxide underground, how confident can you be that it will stay there? This is not a trivial question. The climate benefit of carbon capture depends entirely on permanent storage. If the gas leaks back out over decades or centuries, you've spent enormous resources to merely delay emissions rather than prevent them.

Geological storage sites are selected specifically to minimize leakage risk. You need the right kind of caprock, without fractures or faults that could provide escape routes. You need to inject at pressures low enough that you don't crack the rock. You need monitoring systems to detect any migration of carbon dioxide toward the surface.

The evidence so far is reasonably encouraging. The Sleipner project has been monitoring its storage formation since 1996 and reports no significant leakage. Other sites have similarly positive track records. But these projects have only been operating for a few decades. Meaningful climate timescales stretch to centuries and millennia.

There's also the question of induced seismicity—earthquakes triggered by pumping fluids underground. This is a well-documented phenomenon in wastewater injection from oil and gas operations. Large-scale carbon dioxide injection could potentially trigger similar seismic events, which might compromise the integrity of storage formations.

None of this means geological storage is inherently unsafe. It means we're making very long-term bets based on relatively short-term experience.

A Faster Alternative: Mineral Carbonation

There's a fascinating alternative to burying carbon dioxide in its gaseous form. Certain types of rock, particularly basalt, react chemically with carbon dioxide and water to form solid carbonate minerals. Once the carbon has literally turned to stone, it cannot escape.

Iceland has pioneered this approach at a project called CarbFix. They inject carbon dioxide dissolved in water into underground basalt formations. Within two years—remarkably fast by geological standards—most of the carbon dioxide has mineralized into stable carbonates. The risk of leakage drops essentially to zero.

The catch is that this process requires large amounts of water and only works in areas with suitable basalt formations. Iceland is ideal—volcanic geology everywhere, abundant water, cheap geothermal electricity to power the operation. Scaling this approach worldwide is far more challenging.

Still, mineral carbonation represents a genuinely different value proposition. You're not asking future generations to trust that caprocks will hold for millennia. You're converting the problem into rock and moving on.

The Bottom Line

Carbon capture and storage is a real technology that genuinely works in specific applications. It is not a scam. It is also not a climate solution at the scale its proponents sometimes suggest.

After decades of development and tens of billions in government support, the technology captures about one-thousandth of global emissions. Most of that captured carbon gets used to extract more oil. The vast majority of announced projects never get built. Operating plants require significantly more energy than comparable facilities without carbon capture.

Meanwhile, solar and wind power have dropped in cost by ninety percent over the past decade. Electric vehicles are reaching mainstream adoption. Heat pumps are transforming how we condition buildings. These technologies don't require capturing anything—they simply don't produce emissions in the first place.

For industries that truly have no alternative—cement, steel, certain chemical processes—carbon capture may eventually prove essential. For everyone else, it looks increasingly like an expensive detour on the road to genuine decarbonization.

Perhaps the most important thing to understand about carbon capture is who benefits from its prominence in climate discussions. As long as we're debating how to capture emissions from fossil fuels, we're not focusing on eliminating those fuels entirely. That's a feature, not a bug, for companies whose business models depend on continued extraction. Whether it's a feature or a bug for the climate is the question that matters most.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.