Wikipedia Deep Dive

FiveThirtyEight

Based on Wikipedia: FiveThirtyEight

On election night 2012, Nate Silver got all fifty states right. Every single one. Plus the District of Columbia. In an era when pundits confidently declared the race "too close to call," a baseball statistics nerd had built a model that saw what they couldn't.

This is the story of FiveThirtyEight—named for the 538 electors in the United States Electoral College—and how one man's obsession with prediction transformed American political journalism.

The Baseball Guy Who Crashed Politics

Nate Silver didn't start in politics. He started in baseball.

In the early 2000s, Silver was developing PECOTA, a system for projecting player performance. The name stood for Player Empirical Comparison and Optimization Test Algorithm, though Silver cheekily named it after Bill Pecota, a journeyman utility player whose statistics were almost perfectly average. The system worked by finding historical comparisons—players who looked similar at similar ages—and using their career trajectories to predict what current players might do.
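To make the comparable-player idea concrete, here is a minimal sketch. It is not PECOTA; the stat fields, the distance measure, and every number are invented purely for illustration.

```python
# A toy sketch of the comparable-player idea (not PECOTA itself): find the
# historical players who looked most similar, then average what they did next.
# Every number and stat field here is invented for illustration.
import math

# (age, batting average, home runs, home runs hit the following season)
history = [
    (27, 0.280, 22, 25),
    (27, 0.265, 30, 27),
    (28, 0.300, 18, 20),
    (27, 0.275, 24, 21),
]

def distance(a, b):
    """Crude similarity measure: smaller means more alike."""
    return math.sqrt((a[0] - b[0]) ** 2 + (100 * (a[1] - b[1])) ** 2 + (a[2] - b[2]) ** 2)

def project_next_home_runs(player, k=3):
    """Average the next-season output of the k nearest historical comparisons."""
    comps = sorted(history, key=lambda h: distance(player, h))[:k]
    return sum(h[3] for h in comps) / k

print(project_next_home_runs((27, 0.278, 23)))  # about 22, given these invented comps
```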

This approach, called sabermetrics (from the Society for American Baseball Research, or SABR), was revolutionizing how teams evaluated talent. Silver was good at it. Really good. But by 2007, he'd grown restless. He started posting political analysis on Daily Kos, a left-leaning political blog, under the pseudonym "Poblano."

His anonymity didn't last long.

Super Tuesday and the Birth of a Legend

February 5, 2008. Super Tuesday. Twenty-four states and American Samoa were holding primaries on the same day in the Democratic race between Barack Obama and Hillary Clinton. The pundit class was throwing around predictions like confetti.

Poblano—still anonymous—published a forecast. Obama would win 859 delegates. Clinton would take 829.

When the votes were counted, Obama had 847 delegates. Clinton had 834. The mysterious blogger had come within twelve delegates of perfection across two dozen simultaneous contests.

New York Times columnist William Kristol took notice, citing the "interesting regression analysis" from Daily Kos. A month later, in early March, Silver launched his own website. He called it FiveThirtyEight.

The name was both literal and symbolic. There are exactly 538 electors in the Electoral College: 435 for the House of Representatives, 100 for the Senate, and 3 for the District of Columbia. To win the presidency, you need 270—a simple majority. Silver was announcing his intentions. He wasn't just going to analyze polls. He was going to predict who would win the whole thing.

The Method Behind the Madness

What made Silver's approach different? Most poll aggregators at the time simply averaged the available polls. If three polls showed Obama up by 5, 7, and 3 points, you'd say he was up by about 5 points. Simple.

Silver thought this was lazy.

First, not all polls are created equal. Some pollsters have excellent track records. Others consistently miss. Some use rigorous sampling methods. Others cut corners. Silver began rating pollsters based on their historical accuracy, weighting their results accordingly. A poll from a historically accurate firm counted for more than a poll from a firm that had missed badly in past elections.
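As a toy illustration of that weighting, here is a minimal sketch. The accuracy ratings and poll margins are invented, and the real model's weighting scheme is far more elaborate.

```python
# A toy sketch of quality-weighted poll averaging: polls from historically
# accurate firms count for more. Ratings and margins are invented.
polls = [
    # (margin for candidate A in points, pollster's historical accuracy rating 0-1)
    (+5.0, 0.9),   # strong track record
    (+7.0, 0.4),   # frequent past misses
    (+3.0, 0.8),
]

weighted_margin = sum(margin * rating for margin, rating in polls) / sum(
    rating for _, rating in polls
)
print(round(weighted_margin, 1))  # about +4.6, pulled toward the reliable pollsters
```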

Second, polls age badly. A survey from three weeks ago tells you less than a survey from yesterday. Silver built time decay into his model, trusting recent data more than stale data.
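A minimal sketch of the same idea, assuming a simple exponential decay; the half-life and poll numbers are made up, and the actual model's decay is more nuanced than this.

```python
# A toy sketch of time decay: older polls get exponentially less weight.
import math

HALF_LIFE_DAYS = 14  # assumed: a two-week-old poll counts half as much as a fresh one

def recency_weight(age_days):
    return 0.5 ** (age_days / HALF_LIFE_DAYS)

polls = [(+5.0, 1), (+7.0, 21), (+3.0, 3)]  # (margin, age in days)
num = sum(margin * recency_weight(age) for margin, age in polls)
den = sum(recency_weight(age) for _, age in polls)
print(round(num / den, 1))  # about +4.5, dominated by the fresher polls
```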

Third—and this was the baseball insight—similar things tend to behave similarly. In PECOTA, Silver had predicted a player's future by finding comparable players from the past. Now he applied the same logic to states. Ohio and Pennsylvania share demographic similarities. If you're polling well in one but have sparse data in the other, the Ohio data can tell you something about Pennsylvania. Silver called this "nearest neighbor analysis."

He went further. He factored in national polling trends. He built regression models using demographic characteristics and historical voting patterns. In states with few or no polls, his model could still generate estimates based on what similar states were doing and how those states had voted in the past.
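Here is a hedged sketch of what borrowing across similar states might look like. The similarity weights, margins, and state choices are invented, and this stands in for the real regression machinery rather than reproducing it.

```python
# A toy sketch of estimating a sparsely polled state from similar states and the
# national picture. Similarity weights and margins are invented; the real model
# uses demographic regression rather than this simple blend.
signals = [
    (+2.0, 0.8),   # Ohio polling average, weighted by its assumed similarity to Pennsylvania
    (+4.0, 0.5),   # national polling trend, weighted by how informative it is assumed to be
]
pa_estimate = sum(margin * weight for margin, weight in signals) / sum(w for _, w in signals)
print(round(pa_estimate, 1))  # about +2.8: an inferred Pennsylvania margin
```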

The result was something like a weather forecast for elections. Not "Obama will definitely win" but "Obama has a 74.6% chance of winning." This probabilistic approach was foreign to political journalism, which preferred confident declarations. Silver was saying something subtler: here's what the data suggests, and here's how uncertain we should be about it.
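One way to see how a margin estimate becomes a probability is to assume normally distributed forecast error. The numbers below are illustrative, not drawn from any actual FiveThirtyEight forecast.

```python
# A minimal sketch of turning a point estimate into a win probability under an
# assumed normal error model. All numbers are invented for illustration.
from math import erf, sqrt

estimated_margin = 2.0   # candidate leads by 2 points in the blended average
uncertainty_sd = 3.5     # assumed standard deviation of forecast error, in points

# Probability the true margin is above zero.
win_probability = 0.5 * (1 + erf(estimated_margin / (uncertainty_sd * sqrt(2))))
print(f"{win_probability:.1%}")  # about 72% -- a forecast, not a guarantee
```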

The 2008 Triumph

By October 2008, FiveThirtyEight was getting 2.5 million visitors per week. On Election Day itself, nearly 5 million people loaded the page to see Silver's final prediction.

His model projected Obama would win 349 electoral votes (or 353, depending on which calculation method you preferred). The actual result: 365 electoral votes. Silver had correctly called 49 out of 50 states. The only miss was Indiana, which Obama won by less than one percentage point—within any reasonable margin of error.

He'd also nailed the popular vote margin within about a point, and correctly predicted every Senate race.

Overnight, Nate Silver became the most famous statistician in America.

The New York Times Years

In August 2010, FiveThirtyEight became part of The New York Times. The blog was rebranded as "FiveThirtyEight: Nate Silver's Political Calculus," and Silver gained access to the resources and audience of one of the world's most prestigious newspapers.

He used the platform to expand his ratings of pollsters, building a database of more than 4,700 election polls and developing increasingly sophisticated methods for evaluating their accuracy. When critics questioned his transparency, Silver responded with characteristic precision: his methodology explanation ran to 4,807 words with 18 footnotes.

The 2012 election cemented his reputation. While pundits argued about whether Mitt Romney had momentum, Silver's model showed Obama as a consistent favorite. On election night, Silver called all 50 states correctly. He'd done something that seemed almost impossible: he'd been more accurate than the people who studied politics for a living by using methods borrowed from baseball.

The ESPN Era and Expansion

In July 2013, Silver left The New York Times for ESPN. The sports network was betting that Silver's data-driven approach could work beyond politics. When FiveThirtyEight relaunched in March 2014, it covered sports, science, economics, and popular culture alongside its political analysis.

The site won awards. Lots of them. Bloggie Awards for Best Political Coverage in 2008 and Best Weblog about Politics in 2009. Webby Awards for Best Political Blog in 2012 and 2013. Data Journalism Website of the Year from the Global Editors Network in 2016.

That last award came during a year that would complicate Silver's legacy.

The 2016 Problem

In the 2016 presidential election, FiveThirtyEight gave Donald Trump about a 29% chance of winning on election eve. This was actually much higher than most other forecasters—The Princeton Election Consortium gave Trump less than 1%, and many pundits considered even FiveThirtyEight's 29% absurdly generous to the Republican nominee.

Then Trump won.

A 29% chance is not a prediction of defeat. Something with a 29% chance happens all the time. Roll a die; getting a 1 or 2 is about a 33% chance, and nobody is shocked when it happens. But the public, and much of the media, didn't process it that way. To many, Silver had "gotten it wrong."

Silver pushed back, arguing that his model had correctly identified the race as closer than conventional wisdom suggested. The criticism highlighted a fundamental tension in probabilistic forecasting: humans are bad at thinking in probabilities. We want to be told who will win, not given odds.

How Poll Aggregation Actually Works

To understand what Silver built, you need to understand the problem he was solving.

Polls are snapshots. They capture what a sample of people say at a particular moment. But they're noisy. A poll might interview 800 people, which means it has a margin of error of about 3.5 percentage points on each candidate's share. If the true split is 50-50, any given poll might show anything from a 7-point lead for one side to a 7-point lead for the other, just from random sampling variation, because an error in one candidate's share is mirrored in the other's.
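The 3.5-point figure comes from the standard sampling-error formula; here is a quick sketch, assuming a simple random sample at 95% confidence, which real polls only approximate.

```python
# Back-of-the-envelope margin of error for a poll of n respondents, assuming
# simple random sampling at 95% confidence (real polls are messier than this).
from math import sqrt

n = 800
moe_share = 1.96 * sqrt(0.5 * 0.5 / n)  # error on one candidate's share
print(f"{moe_share:.1%}")               # about 3.5 points
print(f"{2 * moe_share:.1%}")           # the lead between candidates can swing by roughly twice that
```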

Worse, polls have systematic errors. Maybe the pollster's sample skews older than the actual electorate. Maybe people who answer phone surveys differ from people who don't. Maybe respondents lie. These biases don't cancel out with more polls—they persist.

Silver's approach attacked both problems. Averaging multiple polls reduces random noise (though not systematic bias). Weighting by pollster quality helps account for the fact that some firms consistently do better than others. Incorporating demographic data and historical patterns provides anchoring when polls are sparse or conflicting.
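A quick simulation makes the distinction vivid. The bias and noise levels are invented, but the pattern holds generally: more polls shrink the random part, not the shared part.

```python
# Averaging many polls shrinks random noise, but a shared systematic bias
# survives the averaging. All numbers are invented for illustration.
import random

random.seed(538)
true_margin = 0.0        # the race is actually tied
systematic_bias = 2.0    # every pollster's method leans 2 points the same way
sampling_noise_sd = 3.5

for n_polls in (1, 5, 25, 100):
    avg = sum(true_margin + systematic_bias + random.gauss(0, sampling_noise_sd)
              for _ in range(n_polls)) / n_polls
    print(n_polls, round(avg, 1))  # converges toward +2.0, not toward 0.0
```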

The model also produced uncertainty estimates. Rather than saying "Obama will get 332 electoral votes," it ran thousands of simulations and said "Obama's most likely outcome is 332 electoral votes, but there's meaningful probability of results ranging from 280 to 370." This distinction mattered enormously for understanding what the forecast actually meant.
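Here is a stripped-down sketch of that kind of simulation. The states, probabilities, and "safe" electoral votes are invented, and it ignores the correlated errors across states that a serious model has to handle.

```python
# A toy Monte Carlo election simulation in the spirit described above.
# State probabilities and electoral votes are invented for illustration.
import random

random.seed(270)
states = {  # hypothetical state: (electoral votes, win probability)
    "A": (55, 0.95), "B": (38, 0.40), "C": (29, 0.60),
    "D": (20, 0.75), "E": (18, 0.30),
}
SAFE_EV = 200            # assumed electoral votes already locked in
N_SIMS = 10_000

outcomes = []
for _ in range(N_SIMS):
    ev = SAFE_EV + sum(votes for votes, p in states.values() if random.random() < p)
    outcomes.append(ev)

print("most common outcome:", max(set(outcomes), key=outcomes.count))
print("chance of reaching 270:", sum(ev >= 270 for ev in outcomes) / N_SIMS)
```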

The Rise of the Competition

Silver's success spawned imitators. The Economist built its own election model. So did The Upshot at The New York Times (Silver's former home). Political scientists developed competing approaches. By 2020, poll aggregation and probabilistic forecasting had become standard practice in political journalism.

This democratization was both a vindication and a threat. Silver had proven that rigorous quantitative analysis belonged in political coverage. But he was no longer the only game in town.

The End of an Era

In 2018, FiveThirtyEight's operations were transferred from ESPN to ABC News, both owned by Disney. The site continued, but the heady days of being a revolutionary upstart were over. FiveThirtyEight had become part of the establishment it once challenged.

In 2023, Nate Silver left the organization he'd founded. He took his forecasting model with him to a new venture called Silver Bulletin, essentially going back to his roots as an independent analyst. Disney hired G. Elliott Morris, a young political data journalist, to develop a new model for what remained of FiveThirtyEight.

On September 18, 2023, the original fivethirtyeight.com domain was shut down. Traffic was redirected to ABC News pages. The FiveThirtyEight name was shortened to just "538," and the branding was reimagined.

Then, on March 5, 2025, ABC News shut down 538 entirely. The staff were laid off. Seventeen years after Nate Silver posted his first anonymous forecast on Daily Kos, the most influential political forecasting brand of the internet era was gone.

The Legacy of Probabilistic Thinking

What did FiveThirtyEight actually change?

Before Silver, political journalism was dominated by "vibes." Pundits would watch a debate and declare a winner based on gut feeling. They'd visit a diner in Ohio and extrapolate to the entire electorate. They'd craft narratives about "momentum" that had no empirical basis.

Silver introduced accountability. His forecasts had numbers attached. You could go back and check whether he was right. This was uncomfortable for an industry that preferred unfalsifiable predictions, but it raised the standard for everyone.

He also introduced uncertainty. Traditional coverage treated elections like sporting events with inevitable outcomes. Silver's probabilistic approach reminded people that the future is genuinely unknown. A 70% favorite can still lose. Unlikely things happen.

Most importantly, he demonstrated that quantitative analysis could be accessible. Silver wrote clearly. He explained his methods. He showed his work. He proved that data journalism didn't have to be dry or academic—it could be engaging, even fun.

The Limits of Models

But FiveThirtyEight also illustrated the limits of quantitative prediction.

Models can only capture what they're designed to capture. Silver's model was excellent at aggregating polling data, but polling itself has weaknesses. Response rates have plummeted as people ignore unknown callers. Online panels may not represent the electorate. "Likely voter" screens—the methods pollsters use to identify who will actually vote—are more art than science.

There's also the observer effect. When FiveThirtyEight declared a candidate a heavy favorite, did that affect voter behavior? Did some supporters stay home because victory seemed assured? Did opponents become more motivated? These second-order effects are impossible to model.

And fundamentally, elections are rare events. We only get a presidential election every four years. The sample size for validating any model is tiny. Silver could call 49 out of 50 states correctly and still have a model that was wrong in ways that wouldn't show up until later.

The Personal Journey

There's something poignant about Silver's trajectory. He started as an anonymous blogger, writing for the joy of being right. He became famous, institutionalized, corporate. Then he left to start over, returning to independent analysis.

The internet, too, has changed. In 2008, blogs were where interesting writing happened. By 2023, newsletters and Substacks had taken their place. Silver's move to Silver Bulletin mirrored a broader shift: the unbundling of media, the return to individual voices over institutional brands.

FiveThirtyEight as a website is gone. But the approach it pioneered—rigorous, transparent, probabilistic—has become the default for serious election coverage. In that sense, Silver won. He changed how we think about political prediction. The original site may have been shut down, but its methods are everywhere.

Why 538 Electors?

One final note on the name. The number 538 comes from the Constitution's formula for the Electoral College: each state gets electors equal to its total congressional delegation (House members plus senators), and the District of Columbia gets three under the 23rd Amendment.

This means California, with its 52 House seats plus 2 senators, has 54 electoral votes. Wyoming, with its 1 House seat plus 2 senators, has 3. The system was designed as a compromise between those who wanted Congress to choose the president and those who wanted direct popular election.
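The arithmetic, spelled out with the post-2020 apportionment figures mentioned above:

```python
# The arithmetic behind the name: each state's electors equal its House seats
# plus its two senators, plus three electors for D.C.
HOUSE_SEATS, SENATE_SEATS, DC_ELECTORS = 435, 100, 3
total_electors = HOUSE_SEATS + SENATE_SEATS + DC_ELECTORS
majority_needed = total_electors // 2 + 1

print(total_electors)    # 538
print(majority_needed)   # 270
print(52 + 2, 1 + 2)     # California: 54, Wyoming: 3
```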

Whether 538 is the right number—whether the Electoral College itself is the right system—is a separate debate entirely. But for seventeen years, that number became synonymous with trying to predict what American voters would do. It became a brand, a methodology, a cultural moment.

Now it's history. The electors remain. The forecasters have moved on.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.