Cable television in the United States
Based on Wikipedia: Cable television in the United States
The Rise and Fall of America's Information Pipeline
In 2015, three out of four American adults paid for cable or satellite television. A decade later, that number had collapsed to just over one in three. The cable television industry, which once seemed as permanent and essential as electricity or running water, is now in what looks like terminal decline.
But this isn't just a story about streaming services killing an outdated technology. It's a story about how a scrappy workaround invented by small-town entrepreneurs became a regulatory battleground, a cultural force, and eventually one of the most profitable industries in American history—before the internet came along and offered something better.
The Accidental Invention
Cable television was never supposed to exist.
When commercial television launched in the late 1940s, the assumption was simple: broadcasters would send signals through the air, and anyone with an antenna could pick them up for free. The business model depended on advertising, and the more viewers you could reach, the more you could charge advertisers. It was elegant and democratic—or at least it would have been, if not for geography.
The problem was mountains. And tall buildings. And valleys. Radio waves travel in straight lines, more or less, and anything that gets in the way blocks the signal. If you lived in a flat area within range of a broadcast tower, television was free and clear. If you lived in Mahanoy City, Pennsylvania—a small town nestled among the anthracite coal hills—you might as well have lived on the moon.
John Walson ran an appliance store in Mahanoy City. He had a problem: he couldn't sell television sets to people who couldn't receive television signals. So sometime in 1948, he climbed to the top of a nearby mountain, set up a large antenna that could pick up signals from Philadelphia, and ran cable down to his store. Then he started running that cable to his customers' homes.
This was Community Antenna Television, or CATV—a literal description of what was happening. A community was sharing a single very large antenna.
The Pioneers and Their Contested Claims
Walson's claim to being "first" has been disputed for decades. The exact date he started can't be verified, though the United States Congress and the National Cable Television Association eventually credited him with inventing cable television in the spring of 1948.
But there were others. Leroy "Ed" Parsons, a radio station owner in Astoria, Oregon, built what's often described as the first cable system to use coaxial cable, amplifiers, and a true community antenna. His origin story has an almost mythological quality to it.
In 1947, Parsons and his wife attended a broadcasters' convention where they saw their first television set. They were captivated. When Parsons learned that a Seattle radio station was about to launch a television channel, he realized he had a problem: Seattle was 125 miles away, far beyond the range of any normal antenna.
Parsons wasn't deterred. He mounted a large antenna on the roof of the Hotel Astoria, discovered he could actually pick up the distant Seattle signal, and ran coaxial cable across the street to his apartment. When the station went on the air in November 1948, Ed Parsons was the only person in Astoria who could watch television.
He quickly realized he had a business opportunity. For a $125 installation fee and $3 per month, he would connect your home to his antenna. In May 1968, Parsons was acknowledged as the "father of community antenna television."
Meanwhile, in Tuckerman, Arkansas, a movie theater manager named James Y. Davidson saw a different opportunity. In 1949, he set up a cable system to bring signals from a Memphis television station to his town, which was too far away to receive the broadcasts with ordinary antennas. For a theater manager, this might seem like an odd business move—wouldn't free television compete with his movie house? But Davidson apparently understood that the future was coming whether he participated or not.
The First Commercial System
The person who truly commercialized cable television was Robert Tarlton, a television set retailer in Lansford, Pennsylvania. Like Walson, Tarlton faced the problem of selling televisions to people who couldn't receive signals. Unlike Walson, Tarlton thought bigger.
In 1950, Tarlton organized a group of fellow television retailers to build a system that would bring Philadelphia broadcast stations to Lansford for a monthly fee. The system worked beautifully, and more importantly, it got noticed. The New York Times covered it. Newsweek covered it. The Wall Street Journal covered it.
The publicity set off a wave of cable system construction across the United States. Tarlton became a highly sought-after consultant, and he eventually went to work for Jerrold Electronics, the company whose equipment had made his system possible. Jerrold's president, Milton Shapp—who would later become governor of Pennsylvania—reorganized his entire company to serve the suddenly booming cable industry.
Tarlton spent the 1950s helping build major cable systems and training the operators who would run them. In 2003, he was inducted into the Cable Television Hall of Fame for his role in creating the first widely publicized cable television company in America.
The Threat to Free Television
To understand what happened next, you need to understand the economics of broadcast television in the 1950s.
The major networks—NBC, CBS, ABC—made their money from advertisers. Advertisers paid based on how many people watched their commercials. The more viewers, the more money. This created an incentive to reach as many households as possible with programming that appealed to the broadest possible audience.
For the networks, cable television represented both an opportunity and a threat. On one hand, cable could extend their reach into areas that couldn't receive their signals. On the other hand, what if cable operators started charging for content? What if the pay-television model proved more profitable than advertising?
The entertainment industry was already nervous about television. Free broadcast television had devastated the movie business in the 1950s by giving audiences an alternative to paying for films. Now there was talk of turning free television viewers into paying ones.
The math was tantalizing. In 1957, CBS broadcast a musical version of Cinderella that 25 million American households tuned in to watch. Executives calculated that if the network had charged just 25 cents per household, it would have earned more than $6 million—without the cost of distributing the program through local affiliates.
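The executives' back-of-the-envelope arithmetic checks out. A minimal sketch, using only the figures reported above (the 25-cent fee was hypothetical, never an actual charge):

```python
# Hypothetical 1957 pay-per-view revenue for CBS's Cinderella broadcast.
households = 25_000_000   # households that tuned in
fee_dollars = 0.25        # the hypothetical per-household charge
revenue = households * fee_dollars
print(f"${revenue:,.0f}")  # $6,250,000 -- indeed "more than $6 million"
```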
But legal, regulatory, and technological obstacles kept pay television at bay. For its first 24 years, cable in the United States was used almost exclusively to relay existing broadcast signals to remote areas. Original cable programming wouldn't arrive until 1972, when the FCC began loosening its restrictions on the industry.
The FCC Gets Involved
The Federal Communications Commission, or FCC, is the government agency responsible for regulating communications in the United States. It oversees everything from radio stations to telephone networks to the allocation of wireless spectrum.
The FCC's involvement with cable television began with a letter.
On August 1, 1949, an FCC secretary named T.J. Slowie wrote to Ed Parsons asking him to "furnish the Commission full information with respect to the nature of the system you may have developed and may be operating." This was the first recorded contact between the federal government and the fledgling cable industry.
An FCC lawyer named E. Stratford Smith initially determined that the Commission could regulate cable systems as "common carriers"—the same legal category that covered telephone companies and railroads. But the FCC didn't act on this opinion, and Smith himself later changed his mind after spending time working in the cable industry.
What followed was decades of regulatory uncertainty, as the FCC, Congress, and the courts tried to figure out what cable television actually was and who should control it.
The Battle for Control
Broadcasters saw cable as a threat. Cable systems could import distant signals, giving viewers more choices and potentially fragmenting the audience. If people could watch stations from other cities, local broadcasters would lose viewers and advertising revenue.
In the late 1950s, broadcasters tried to force the FCC to regulate cable systems as common carriers. They argued that cable undermined the Commission's goal of ensuring every community had its own television station. The case, Frontier Broadcasting Co. v. Collier, involved 288 cable systems in 36 states.
The FCC ruled against the broadcasters, deciding that cable wasn't really a common carrier since subscribers didn't choose the programming—they simply received whatever the cable operator decided to transmit.
But the broadcasters kept fighting. In the Carter Mountain case, a company that transmitted television signals by microwave to cable systems in Wyoming wanted to add new signals. A local television station objected, claiming economic damage. A hearing examiner sided with the transmission company, but the FCC overruled him and sided with the television station. The case went to appeal, and the FCC won.
This established an important principle: the FCC could restrict cable operators to protect broadcast television stations, even if no station had actually gone off the air due to cable competition. The mere plausibility of economic harm was enough to justify government intervention.
The Regulatory Framework
By 1965, the FCC had given itself formal power to regulate cable television. The Commission issued its First Report and Order on CATV, establishing two key rules.
The first rule required cable systems to carry all local stations whose signal reached the cable system's service area with reasonable quality. This was called "must-carry," and the logic was straightforward: cable was supposed to supplement broadcast television, not replace it. If people had to switch from cable to an antenna to watch local stations, they probably wouldn't bother.
The second rule prohibited cable systems from importing programs from distant stations that duplicated what local stations were showing. If a local station was going to broadcast a particular movie on Saturday night, a cable system couldn't import the same movie from a station in another city for two weeks before or after.
The FCC argued that without these protections, cable operators had an unfair advantage: they didn't have to bid against broadcasters for programming; they just picked up whatever signals they could capture.
The 1966 Second Report and Order went further. It prohibited cable systems from importing distant signals into the top 100 television markets—essentially the 100 largest metropolitan areas in the country. This meant cable systems could only really profit in areas with poor reception. Big cities, where the money was, were largely off-limits.
In 1968, the Supreme Court upheld the FCC's authority in United States v. Southwestern Cable, ruling that the Commission's power over "all interstate communications by wire or radio" included cable systems.
The Explosion of the 1980s
Everything changed in the 1980s.
In 1980, approximately 16 million American households subscribed to cable television. By 1990, that number had more than tripled to 50 million. What happened?
First, regulation loosened. The FCC lifted many of its restrictions on cable in large cities, though it added new requirements for local programming.
Second, technology improved. Better wiring and satellite technology made it possible to deliver more channels to more households. The 20-channel cable system of the 1970s gave way to systems with 50, then 100, then hundreds of channels.
Third, and most importantly, entrepreneurs figured out how to create programming specifically for cable.
In 1979, Nickelodeon launched as a children's network and ESPN launched as a sports network. In 1980, Ted Turner founded CNN, the first 24-hour news network. In 1981, MTV premiered, built entirely around music videos—a format that didn't really exist before cable made it viable. In 1988, Turner launched TNT.
These were the first networks designed from the ground up for cable rather than broadcast. They could target specific audiences—children, sports fans, news junkies, teenagers—rather than trying to appeal to everyone the way broadcast networks did.
The growth continued into the 1990s. By 2000, cable reached 67.7 million homes. In 1996, Rupert Murdoch launched Fox News, while NBC and Microsoft partnered to create MSNBC, which was notable for having a presence on both cable television and the emerging internet.
Public Access and Local Programming
One of the more unusual features of American cable television is public access television. This is the channel—usually found somewhere in the lower reaches of the dial—where local residents can create and broadcast their own programming.
The concept emerged from the FCC's 1976 regulations, which required cable systems with 3,500 or more subscribers to provide what's called PEG access: Public, Educational, and Government channels. Cable operators had to not only reserve channel capacity for these purposes but also provide facilities and equipment for local residents to use.
The result was a strange and wonderful flowering of amateur television. Local governments broadcast city council meetings. Schools created educational programming. And ordinary citizens produced whatever they wanted: cooking shows, political commentary, religious programming, avant-garde art, and occasionally things that would never be permitted on commercial television.
Public access became a cultural touchstone, often mocked for its low production values but also celebrated as a genuine form of democratic media. Anyone with something to say could, in theory, say it on television.
The Peak and the Decline
According to FCC reports, traditional cable television subscriptions in the United States peaked around the year 2000 at 68.5 million. The industry didn't know it at the time, but that was the high-water mark.
By December 2013, subscriptions had fallen to 54.4 million—a decline of 20 percent. Telephone companies had entered the market, reaching 11.3 million video subscribers, but even counting those, the total pay-television market was shrinking.
Writing in Forbes, media consultant Brad Adgate noted that cable TV "was in its zenith" in the early 2010s, with more than 100 million households subscribing to some form of pay television. Industry estimates suggested that cable would soon surpass broadcast television in viewership for the first time ever.
It never happened.
What happened instead was streaming. Netflix, which had started as a DVD-by-mail service, began offering streaming video in 2007. Amazon Prime Video launched in 2011. Hulu, originally created by the broadcast networks themselves, expanded rapidly. By the mid-2010s, it was becoming clear that the future of television wasn't cable—it was the internet.
The Demographics of Cable
One of the less discussed aspects of cable television's history is who actually subscribed to it.
Most cable viewers in the United States have traditionally lived in the suburbs and have tended to be middle class. Cable was, and remains, less common in low-income areas, urban cores, and rural communities.
This is partly technological. Running cable lines requires significant infrastructure investment, and cable companies have historically been more willing to invest in areas where customers could afford higher bills. Rural areas, with houses spread far apart, were expensive to wire and often unprofitable.
It's partly economic. Cable subscriptions, which can run $100 to $200 per month for comprehensive packages, represent a significant expense for low-income households.
And it's partly about alternatives. Urban areas often have better over-the-air reception due to proximity to broadcast towers, reducing the value proposition of cable. Rural areas, meanwhile, increasingly turned to satellite television as an alternative.
Where Things Stand Today
A 2021 Pew Research Center survey delivered alarming news: the share of American adults paying for cable or satellite television had fallen from 76 percent in 2015 to 56 percent. The 2025 follow-up found that only 36 percent still subscribed.
This is a collapse. In less than a decade, pay television went from something three-quarters of Americans had to something only about a third have. The trend shows no signs of reversing.
The cable companies themselves have largely accepted this reality. Comcast and Charter, the two largest cable providers, now emphasize their broadband internet services rather than their television offerings. The irony is almost too neat: the internet, which the cable companies helped deliver to American homes through their cable infrastructure, has become the thing that's killing their original business.
For investors considering cable companies—as the Substack article that prompted this essay discusses—the question isn't whether cable television will decline. It's how fast the decline will continue and whether broadband internet can compensate.
The Legacy
Cable television's seven-decade run transformed American culture in ways we still take for granted.
It created the 24-hour news cycle. Before CNN, news happened at 6 PM and 11 PM. After CNN, news never stopped.
It invented niche programming. Before cable, television was aimed at the broadest possible audience. After cable, there were channels for children, channels for sports fans, channels for history buffs, channels for cooking enthusiasts. The idea that media could be tailored to specific interests rather than mass audiences began with cable.
It pioneered the subscription model for video content. Before cable, television was either free (broadcast) or expensive (movie theaters). Cable normalized the idea of paying a monthly fee for access to content—the same model that streaming services use today.
And it demonstrated what happens when technology runs ahead of regulation. The FCC spent decades trying to figure out what cable was and how to control it. By the time the regulatory framework was settled, the industry had already developed in ways the regulators hadn't anticipated.
The streaming services that are now replacing cable face similar regulatory uncertainty. What rules should apply to Netflix? To YouTube? To TikTok? The lessons of cable television—how an industry can grow in the gaps between regulations, how incumbents can capture regulators, how technology can outpace policy—remain relevant.
Ed Parsons, running coaxial cable across a street in Astoria to watch a Seattle television station, probably didn't imagine he was starting a multibillion-dollar industry. He just wanted to watch TV. Sometimes the biggest changes start with someone solving a small problem in an unexpected way.