Semiconductor industry
Based on Wikipedia: Semiconductor industry
Three companies. Just three companies on the entire planet can manufacture the most advanced chips that power your phone, your car, and increasingly, your toaster. Taiwan Semiconductor Manufacturing Company, Samsung, and Intel. That's it. And one of those three—Intel—is struggling to keep up.
This extraordinary concentration of capability represents one of the most remarkable industrial phenomena of our time. The semiconductor industry generates over half a trillion dollars in annual revenue, enables five trillion dollars in technology sales, and underpins twenty-nine trillion dollars in e-commerce. Yet its most advanced capabilities rest in the hands of a vanishingly small number of players.
How did we get here?
The Transistor: A Twenty-Five Thousand Dollar License That Changed Everything
In late 1947, three researchers at Bell Labs—William Shockley, Walter Brattain, and John Bardeen—invented the transistor. A transistor is essentially an electronic switch, a tiny device that can turn electrical current on or off. Before transistors, electronics relied on vacuum tubes: fragile glass bulbs that generated tremendous heat, consumed enormous power, and burned out regularly. Transistors were smaller, more reliable, used far less energy, and didn't require constant replacement.
Bell Labs was owned by AT&T, which ran the national telephone network as a regulated monopoly. The government required AT&T to license its inventions to others. So Bell offered the transistor technology to anyone willing to pay twenty-five thousand dollars—a sum that seems almost comically low given what followed.
Companies rushed in. Motorola started making transistors in 1952. Shockley himself left Bell Labs to start Shockley Semiconductor in 1955. Texas Instruments, Fairchild Semiconductor, and others quickly followed. Within a decade, the transistor had spawned an entirely new industry.
The Integrated Circuit: One Chip to Rule Them All
But the real breakthrough came in 1958, when two engineers working independently—Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor—invented the integrated circuit.
The idea was elegant. Instead of building electronic devices from many individual transistors wired together, why not manufacture multiple transistors directly onto a single piece of semiconductor material? One chip. Many transistors. No separate wiring needed.
Semiconductor material, by the way, is neither a good conductor of electricity like copper nor a good insulator like rubber. It sits somewhere in between, and its conductivity can be precisely controlled by adding tiny amounts of other elements—a process called doping. Silicon is the most common semiconductor material, which is why we call the tech industry's heartland Silicon Valley.
The integrated circuit triggered an arms race in miniaturization. Engineers discovered that if they could make transistors smaller, they could fit more of them onto each chip. More transistors meant more computing power. And something remarkable happened: the industry found ways to double the number of transistors on a chip roughly every two years.
Moore's Law: The Most Important Prediction in Technology
In 1965, Gordon Moore—one of the founders of Fairchild Semiconductor and later Intel—observed this doubling pattern and predicted it would continue. His prediction, dubbed Moore's Law, turned out to be astonishingly accurate for more than five decades.
Moore's Law isn't really a law of physics. It's more like a self-fulfilling prophecy. The entire industry organized itself around the assumption that this doubling would continue. Companies planned their product roadmaps assuming it. Customers expected it. Investors demanded it. And engineers, knowing they had to deliver, found ways to make it happen.
The results border on the incomprehensible. The original integrated circuits contained a handful of transistors. Modern processors contain billions. An iPhone today has more computing power than all the computers that existed in the world in 1960 combined.
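To get a feel for the arithmetic behind that doubling, here is a minimal back-of-the-envelope sketch in Python. It assumes a clean doubling every two years from a 1971-style baseline of about 2,300 transistors (the oft-cited figure for an early microprocessor); the base year and count are rough anchors for illustration, not data about any particular product.

```python
# Back-of-the-envelope Moore's Law arithmetic (illustrative, not product data).
# Assumption: transistor counts double every two years from a 1971 baseline.

def transistors(year, base_year=1971, base_count=2_300):
    """Estimate the transistor count in `year`, doubling every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for y in (1971, 1981, 1991, 2001, 2011, 2021):
    print(f"{y}: ~{transistors(y):,.0f} transistors")

# The 2021 estimate lands in the tens of billions, the same order of
# magnitude as today's largest chips, which is the point of the exercise.
```

The same arithmetic also shows why the curve is so hard to sustain: every additional decade multiplies the count by another factor of thirty-two.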
The Great Unbundling
For the first few decades, semiconductor companies did everything themselves. They designed chips. They invented new manufacturing processes. They refined the chemicals and purified the silicon wafers. They built their own manufacturing equipment—the furnaces, the lithography machines, the etching tools. They assembled and tested the final products.
This vertical integration made sense when the industry was young. Nobody else could supply what you needed, so you had to build it yourself.
But as the industry matured, companies began to specialize. Equipment makers emerged who did nothing but build manufacturing machinery. Chemical companies focused on producing ultra-pure materials. Testing and packaging operations spun off into separate businesses.
Then, in the mid-1980s, companies such as Xilinx and Chips and Technologies pioneered something radical: designing chips without owning any manufacturing facilities at all. These "fabless" semiconductor companies focused entirely on design, outsourcing actual production to others.
The word "fab" is industry shorthand for fabrication plant—the facility where chips are actually manufactured. A fabless company has no fab. It designs chips on computers and sends those designs to someone else to build.
The Rise of the Foundry
Initially, fabless companies had to beg their competitors for manufacturing capacity. If you designed a chip but had no fab, you might have to go to Texas Instruments or Motorola—companies that made their own competing products—and ask them to manufacture your design. This created obvious conflicts of interest.
The solution emerged from Taiwan in 1987, when Morris Chang founded Taiwan Semiconductor Manufacturing Company, or TSMC, and pioneered the "pure-play foundry" model. TSMC would manufacture chips, but it would never design its own products. It would never compete with its customers. It would be a neutral factory, open to all.
United Microelectronics Corporation, or UMC, another Taiwanese company, adopted the same approach. The foundry model transformed the industry's structure.
Suddenly, a brilliant engineer with a great chip design didn't need billions of dollars to build a factory. They could start a fabless company, focus entirely on design innovation, and outsource manufacturing to TSMC or UMC. The barriers to entry in chip design collapsed. Innovation accelerated.
The Trillion-Chip Year
By 2021, the semiconductor industry had reached staggering scale. Annual sales hit a record 555.9 billion dollars—up more than twenty-six percent from the previous year. More than 1.15 trillion individual semiconductor units shipped that calendar year.
More than a trillion chips. In a single year.
China alone consumed 192.5 billion dollars worth of semiconductors in 2021, making it by far the world's largest market. This creates obvious geopolitical tensions, given that the most advanced manufacturing capabilities reside not in China but in Taiwan—an island whose political status China disputes.
The industry projects revenues approaching 727 billion dollars by 2027, continuing decades of remarkable growth.
The Concentration Problem
But here's what should concern anyone paying attention: the foundry model that enabled so much innovation has also created extraordinary concentration.
Building a cutting-edge semiconductor fab costs an almost incomprehensible amount of money. TSMC's latest factory, capable of manufacturing chips using a three-nanometer process—three billionths of a meter, about the width of a dozen atoms—carried a price tag of 19.5 billion dollars when it was announced in 2020, and newer facilities cost even more.
These astronomical costs create insurmountable barriers. A new entrant can't just decide to compete with TSMC. The capital requirements alone would bankrupt most countries, let alone companies.
The result is that the entire world's supply of the most advanced semiconductors flows through just three companies. TSMC in Taiwan leads the pack, capable of manufacturing chips at the five-nanometer node and beyond. Samsung in South Korea has similar capabilities. Intel in the United States—once the undisputed leader—has fallen behind, currently able to produce only at the ten-nanometer level.
GlobalFoundries, an American-headquartered foundry, has essentially given up the race for the most advanced nodes. The development costs simply became too high. They now focus on twelve-nanometer chips and larger, ceding the cutting edge to the big three.
Geography Is Destiny
The semiconductor industry's geography reads like a roster of Cold War allies and strategic rivals. The United States dominates in chip design and certain types of manufacturing equipment. Taiwan and South Korea lead in advanced manufacturing. Japan excels in materials and equipment. The Netherlands—specifically one company, ASML—has a near-monopoly on the extreme ultraviolet lithography machines needed to manufacture the most advanced chips. Israel and Germany maintain significant presences in specialized areas.
American semiconductor manufacturers have spread their fabrication facilities across the globe. Just over half their plants sit in the Americas. Thirty-nine percent are in the Asia-Pacific region, including nine percent in Japan. Only nine percent are in Europe.
This geographic distribution creates profound vulnerabilities. A war in the Taiwan Strait, a natural disaster in Japan, a disruption in the Netherlands—any of these could cripple the global supply of essential components.
The Volatility Paradox
Despite its consistent long-term growth, the semiconductor industry lurches through boom-and-bust cycles that would give most executives vertigo.
Over the past two decades, annual growth has averaged around thirteen percent—an extraordinary rate for an industry of this size. But that average masks wild swings. Some years see explosive growth; others bring painful contractions.
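As a purely illustrative sketch of how a steady-sounding average can hide violent swings, the following Python snippet computes the compound annual growth rate of a revenue series whose year-to-year changes are invented for this example; none of the percentages are industry data.

```python
# Illustrative only: volatile yearly growth can still compound to a
# smooth-sounding average rate. The percentages below are invented.
yearly_growth = [0.35, -0.12, 0.28, 0.04, -0.05, 0.40, 0.22, 0.02]

cumulative = 1.0
for g in yearly_growth:
    cumulative *= 1 + g

years = len(yearly_growth)
cagr = cumulative ** (1 / years) - 1  # compound annual growth rate

print(f"Cumulative growth over {years} years: {cumulative:.2f}x")
print(f"Compound annual growth rate: {cagr:.1%}")
# Prints roughly 13 percent a year, even though individual years in the
# series swing from a 12 percent contraction to 40 percent growth.
```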
This volatility stems partly from the industry's position at the base of the technology food chain. Semiconductors go into everything: phones, computers, cars, appliances, industrial equipment, medical devices, military systems. When the economy booms, demand for all these products surges, and semiconductor makers scramble to keep up. When recession hits, demand collapses simultaneously across all these markets.
But there's a deeper issue. Semiconductor manufacturing requires enormous capital investments made years before the resulting chips reach the market. Companies must guess what demand will look like three, four, five years out, then commit billions to building capacity. Sometimes they guess right. Sometimes they don't. When multiple companies overestimate demand and build too much capacity simultaneously, prices collapse. When they underestimate, shortages ensue.
The COVID-19 pandemic illustrated both dynamics. First, chipmakers and their customers, automakers in particular, anticipated a recession and cut orders and production. Then demand for electronics exploded as everyone worked from home, learned from home, and entertained themselves from home. The resulting shortage affected everything from PlayStations to automobiles, demonstrating just how dependent the modern economy has become on this one industry.
Price Performance: The Miracle Nobody Notices
If any other industry had improved its price-to-performance ratio at the rate semiconductors have, it would be front-page news. By the familiar analogy, a car that cost fifty thousand dollars in 1970, improving at the semiconductor industry's rate, would cost a fraction of a penny today and travel at something approaching the speed of light.
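To see where "a fraction of a penny" comes from, here is a minimal worked calculation that treats Moore's Law as a pure cost curve, assuming price-performance doubles every two years between 1970 and roughly today. Both the doubling cadence and the choice of 2024 as "today" are simplifying assumptions.

```python
# Worked version of the car analogy, assuming price-performance doubles
# every two years (Moore's Law read as a pure cost curve).
price_1970 = 50_000.0           # dollars, as in the analogy
years = 2024 - 1970             # "today" taken as roughly 2024
doublings = years / 2           # one doubling every two years
improvement = 2 ** doublings    # total price-performance gain

equivalent_price = price_1970 / improvement
print(f"Improvement factor: ~{improvement:,.0f}x")
print(f"Equivalent price today: ${equivalent_price:.6f}")
# About a 134-million-fold improvement, which turns $50,000 into a few
# hundredths of a cent: a fraction of a penny, as the analogy promises.
```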
This relentless improvement drives change throughout the broader economy. Products that seemed like science fiction a decade ago—smartphones that recognize your face, cars that drive themselves, satellites that beam internet to remote areas—become possible only because semiconductors keep getting cheaper and more powerful.
The semiconductor industry doesn't just respond to market demands. It creates possibilities that other industries then rush to exploit. It's the technological foundation upon which everything else gets built.
The Structure Today
The modern semiconductor industry comprises several distinct types of companies.
Pure-play foundries like TSMC specialize exclusively in manufacturing chips designed by others. They never compete with their customers by designing their own products. Some foundries also offer ancillary services—design assistance, testing, packaging—but manufacturing remains their core business.
Integrated Device Manufacturers, or IDMs, both design and manufacture their own chips. Intel is the classic example. Samsung straddles categories—it manufactures both its own designs and, through its foundry division, designs from others.
Fabless companies design chips but own no manufacturing facilities. They send their designs to foundries for production. Qualcomm, Nvidia, AMD, and Apple's chip division all operate this way. This model allows companies to focus entirely on innovation without the capital burden of building fabs.
OSAT companies—which stands for Outsourced Semiconductor Assembly and Testing—specialize in the final stages of chip production. After wafers come out of the fab, chips must be cut apart, packaged in protective housings, and tested to ensure they work properly. Many companies outsource these steps to specialists.
Finally, there are the equipment and materials suppliers: the companies that build the machines foundries use, produce the ultra-pure chemicals required for manufacturing, and supply the silicon wafers that serve as the chips' foundation.
The Nanometer Race
When semiconductor companies talk about their manufacturing "process node"—whether they make five-nanometer chips or ten-nanometer chips—they're referring to a generation of manufacturing technology. The number once tracked the physical size of key transistor features; in recent generations it has become more of a marketing label, but a smaller number still means denser, faster, more power-efficient chips.
A nanometer is one billionth of a meter. For comparison, a human hair is roughly 80,000 to 100,000 nanometers thick. The transistors in cutting-edge chips are thousands of times smaller than a single hair.
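For a concrete sense of that ratio, here is a trivial calculation using the hair figure above and an assumed feature size of twenty nanometers, a stand-in round number chosen because node names no longer correspond exactly to physical dimensions.

```python
# Rough scale comparison between a human hair and a leading-edge chip feature.
# Both numbers are approximations for illustration, not measurements.
hair_nm = 90_000      # midpoint of the ~80,000-100,000 nm range for a hair
feature_nm = 20       # assumed stand-in for a small physical chip feature

ratio = hair_nm / feature_nm
print(f"A hair is roughly {ratio:,.0f} times thicker than a {feature_nm} nm feature.")
# => A hair is roughly 4,500 times thicker than a 20 nm feature.
```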
Smaller transistors bring multiple advantages. You can fit more of them on each chip, increasing computing power. They switch faster, enabling higher processor speeds. And they consume less power, extending battery life in mobile devices.
But making things smaller becomes exponentially harder. The physics of light begins to interfere—you're trying to draw patterns smaller than the wavelength of the light used in traditional manufacturing. Materials behave differently at these scales. Heat becomes impossible to dissipate. Quantum effects that can be ignored at larger scales suddenly matter enormously.
TSMC and Samsung have reached the five-nanometer node and are pushing toward three nanometers and beyond. Intel, once the clear leader, fell behind and currently manufactures at roughly the ten-nanometer level. The company is considering something once unthinkable: outsourcing some of its chip production to TSMC.
The Boring Chips That Matter Most
While headlines focus on the cutting-edge processors in smartphones and data centers, much of the semiconductor industry's output consists of far less glamorous chips. Power management integrated circuits, or PMICs. Microcontrollers. Analog chips. Display drivers. These components rarely appear in marketing materials, but they're everywhere.
A modern car contains thousands of semiconductors, most of them decidedly unglamorous. The chips that control your antilock brakes, regulate your battery charging, manage your power windows, and monitor your tire pressure aren't manufactured on cutting-edge five-nanometer processes. Many use technology nodes considered ancient by smartphone standards—forty nanometers, sixty-five nanometers, even older.
This matters because the shortage that crippled automobile production during the pandemic wasn't primarily a shortage of cutting-edge chips. It was a shortage of these boring, mature-node semiconductors that nobody had thought much about until they suddenly couldn't get them.
The unglamorous segments of the semiconductor industry are surprisingly concentrated too. Power electronics—the chips that manage electrical power conversion and distribution—represents a 216 billion dollar market on its own. Network and communications devices account for nearly a third of all semiconductor revenue.
Research and Development: Betting the Company
The semiconductor industry invests more in research and development than almost any other sector of the economy. In 2010, it had the highest R&D intensity of any industry in the European Union, and ranked second globally only to biotechnology when combining EU, United States, and Japanese figures.
This intense investment is not optional. The relentless pace of improvement that has characterized the industry for sixty years requires constant innovation. Stop investing in R&D and you fall behind. Fall behind and you die.
The research challenges are staggering. How do you manufacture transistors that are literally a few atoms wide? How do you inspect chips with billions of components to catch defects? How do you design systems so complex that no human can fully comprehend them?
The answer, increasingly, involves using the industry's own products to advance itself. Modern chip design is impossible without computers running specialized software. Artificial intelligence helps optimize chip layouts and predict manufacturing problems. The semiconductor industry is perhaps the purest example of a technology that bootstraps its own improvement.
Why This All Matters
The semiconductor industry's peculiar structure—its extreme capital intensity, its geographic concentration, its reliance on a tiny number of cutting-edge manufacturers—creates vulnerabilities that extend far beyond the tech sector.
Modern cars contain thousands of semiconductors controlling everything from engine timing to infotainment systems. Medical equipment depends on chips for imaging, monitoring, and diagnostics. Power grids, water treatment plants, and telecommunications networks all run on semiconductors. Military systems—guidance computers, communication equipment, radar installations—cannot function without them.
A disruption to this industry doesn't just mean delayed PlayStations. It means factories that can't build cars, hospitals that can't acquire equipment, and militaries that can't maintain their weapons systems.
Governments have begun to recognize these vulnerabilities. The United States, European Union, Japan, and others are investing billions to build domestic semiconductor manufacturing capacity. But fabs take years to construct and require expertise that cannot be easily transferred. The concentration that developed over decades will not be undone quickly.
In the meantime, the industry that began with a twenty-five thousand dollar transistor license has become the foundation upon which modern civilization rests—and that foundation is narrower and more fragile than most people realize.