Internet Research Agency
Based on Wikipedia: Internet Research Agency
In a nondescript four-story building on Savushkina Street in Saint Petersburg, more than a thousand people once showed up for work each day to lie on the internet. They worked in shifts, twelve hours on and twelve hours off, churning out blog posts, social media comments, and political cartoons designed to look like they came from ordinary citizens around the world. They had quotas: one hundred comments per day, ten blog posts per shift, each at least 750 characters long. They earned about forty thousand rubles a month—roughly six hundred dollars at the time—plus bonuses for particularly effective deceptions.
This was the Internet Research Agency, though that clinical name hardly captures what it actually did. In Russian internet slang, the employees were simply called the "Trolls from Olgino," named after the Saint Petersburg district where the operation first set up shop in 2013. Americans would later call them "Kremlinbots." The organization itself sometimes went by "Glavset," which translates roughly to "Main Network."
One former employee, interviewed by The Washington Post, described the experience with a literary reference that cuts to the heart of the operation:
I immediately felt like a character in the book 1984 by George Orwell—a place where you have to write that white is black and black is white. Your first feeling, when you ended up there, was that you were in some kind of factory that turned lying, telling untruths, into an industrial assembly line.
An industrial assembly line for lies. That phrase captures something essential about what made the Internet Research Agency different from ordinary propaganda. This wasn't a handful of government officials crafting careful talking points. This was manufacturing at scale.
The Man Behind the Factory
For years, a Russian oligarch named Yevgeny Prigozhin denied any connection to the Internet Research Agency. Then, in February 2023, he decided to brag about it instead.
"I've never just been the financier of the Internet Research Agency," Prigozhin announced. "I invented it, I created it, I managed it for a long time."
By then, Prigozhin had become better known in the West for something else entirely: the Wagner Group, a private military company that deployed mercenaries to conflicts in Ukraine, Syria, Africa, and elsewhere. The same man who ran a troll factory also ran a mercenary army. In the world of Russian power, these two enterprises weren't as different as they might seem. Both were tools for projecting influence while maintaining plausible deniability for the Kremlin.
Prigozhin's journey to this point was itself remarkable. He had spent nine years in Soviet prisons for robbery and fraud before reinventing himself as a restaurateur in the chaos of post-Soviet Russia. His catering company won contracts to provide food for the Kremlin, earning him the nickname "Putin's Chef." From banquets for the Russian president, he expanded into propaganda and then private warfare.
The Internet Research Agency would outlive its usefulness. In July 2023, following the Wagner Group's brief and bizarre mutiny against the Russian military, the troll factory was shut down. Prigozhin himself died that August, two months after the mutiny, when his private jet fell from the sky under circumstances that remain officially unexplained but surprised almost no one who understood how the Kremlin deals with those who embarrass it.
How to Build a Troll Factory
The operation started small, in a white cottage in the Olgino district that employees could reach by taking the Saint Petersburg metro to Staraya Derevnia station and walking for about fifteen minutes. The workstations were set up in basement rooms. A help-wanted ad from August 2013, preserved by the newspaper Novaya Gazeta, gives a flavor of the pitch:
"Internet operators wanted! Job at chic office in Olgino! Salary 25,960 per month. Task: posting comments at profile sites in the Internet, writing thematic posts, blogs, social networks. Reports via screenshots. Individual schedule. Payments every week and free meals! Official job placement or according to contract. Tuition possible."
The salary was modest—about eight hundred dollars a month in 2013—but the promise of free meals and flexible hours attracted applicants. The "tuition possible" line hints at something telling: many of the recruits were young people who needed training in the dark arts of online deception.
As the operation grew, it moved to larger quarters. By late 2014, the trolls had relocated to the four-story building on Savushkina Street. Officially, this building was listed as an "uncompleted construction" in city records, a useful fiction that provided some cover for its actual purpose. By 2015, more than a thousand people worked in that single building, with many others working remotely.
The operation was organized with industrial efficiency. Employees worked twelve-hour shifts, rotating through schedules that kept the factory running around the clock. Bloggers were responsible for maintaining three separate accounts each, posting under false identities designed to look like authentic voices from different demographics and regions. Commenters had quotas of 126 comments per day plus two original posts per account. Artists drew political cartoons. Supervisors monitored output through screenshots that employees submitted as proof of work.
What the Trolls Actually Did
The core mission was straightforward: shape online conversations to favor Russian interests. The specific targets evolved over time, but certain themes remained constant.
The trolls criticized Alexei Navalny, the Russian opposition leader who would later be imprisoned and die in custody under suspicious circumstances. They attacked Ukraine and its government, particularly after Russia's 2014 annexation of Crimea sparked international condemnation. They praised Vladimir Putin and Russian foreign policy. They defended Bashar al-Assad, the Syrian dictator whose brutal civil war Russia had intervened to support.
But the techniques went beyond simple cheerleading and criticism. The operation sought to erode trust in institutions themselves—not just to make people believe specific things, but to make them uncertain about what they could believe at all.
One particularly creative operation involved staging an elaborate hoax in the United States. In September 2014, residents of St. Mary Parish, Louisiana, began seeing alarming social media posts about a chemical explosion at a local plant. The posts included fake screenshots of CNN and local news coverage, doctored Wikipedia pages, and even a faked video of ISIS claiming responsibility for the supposed attack. Text messages warned people of toxic fumes. None of it was real. The Columbian Chemicals Plant was operating normally. The entire panic had been manufactured from an office in Saint Petersburg.
The point wasn't to make Americans believe there had actually been a chemical explosion. The point was to demonstrate—perhaps primarily to Russian intelligence officials evaluating the operation—that American information ecosystems could be manipulated, that panic could be manufactured from across an ocean, that reality itself had become contestable.
The Technology of Surveillance
To understand the Internet Research Agency, you need to understand the system it was built to feed.
In August 2012, the Russian edition of Forbes magazine revealed the existence of something called Prisma—not to be confused with the later photo-filtering app of the same name. This Prisma was a sophisticated monitoring system that could track approximately sixty million blogs and social media accounts in near real-time, analyzing the tone and content of each with only a few minutes of delay and an estimated error rate of just two to three percent.
The system was nicknamed "Volodin's Prism" after Vyacheslav Volodin, a powerful figure in Putin's presidential administration who became one of its primary users. Volodin had received his Prisma terminal just before the December 2011 parliamentary elections—the same elections that sparked massive protests in Moscow and other cities after widespread reports of fraud. These demonstrations, sometimes called the "Snow Revolution" for the winter weather in which they occurred, represented the largest challenge to Putin's authority in years.
Prisma gave officials like Volodin the ability to watch dissent emerge in real-time and coordinate responses. Other users included the State Duma, the Ministry of Internal Affairs, Moscow City Hall, and associates of Igor Sechin, the head of the state oil giant Rosneft.
The Internet Research Agency was, in essence, the offensive counterpart to this surveillance system. Prisma could identify what people were saying and feeling. The troll factory could flood those same channels with alternative narratives, manufactured consensus, and strategic chaos.
Going Global
The 2016 United States presidential election put the Internet Research Agency on the front pages of American newspapers, but the operation had been targeting Western audiences for years before that.
In January 2017, the United States Intelligence Community—the collective term for the country's various spy agencies—released a remarkable public report assessing Russian interference in the election. The report described the Internet Research Agency as a "troll farm" and noted that its "likely financier" was "a close ally of Putin with ties to Russian intelligence." The trolls, the report found, "previously were devoted to supporting Russian actions in Ukraine" before they "started to advocate for candidate Trump as early as December 2015."
A year later, in February 2018, a federal grand jury indicted thirteen Russian nationals and three Russian organizations, including the Internet Research Agency itself, on charges of conspiracy to defraud the United States. The indictment provided extraordinary detail about the operation, describing how trolls had created fake American personas, organized real political rallies on both the left and the right, and purchased political advertisements on social media platforms.
The operation was sophisticated enough to segment its audiences. Trolls posing as American conservatives pushed divisive messages about immigration and Islam. Trolls posing as American progressives promoted Bernie Sanders during the Democratic primary and tried to suppress African American voter turnout in the general election. The goal wasn't simply to help Trump win—though the evidence suggests that was the preferred outcome—but to deepen existing divisions in American society and undermine faith in democratic institutions regardless of who won.
The Finnish Front
Not all of the Internet Research Agency's targets were major powers. The operation also went after smaller countries—and individuals who investigated it.
Jessikka Aro was a Finnish journalist who began investigating pro-Russian trolling activities in Finland. Her reporting documented how the troll factory had targeted Finnish public discourse, spreading pro-Kremlin narratives and attacking critics of Russian policy. In response, she became a target herself.
Aro faced an organized campaign of harassment, disinformation, and hate. Trolls spread false claims about her personal life. She received threats. The experience illustrated something important about how the operation worked: it didn't just spread propaganda, it attacked the people who tried to expose the propaganda. The goal was to make investigating the troll factory so personally costly that journalists would think twice before pursuing such stories.
The Theater of Authenticity
Some of the Internet Research Agency's most creative work involved manufacturing fake evidence of events that never happened.
In September 2015, a video surfaced online showing what appeared to be an American soldier shooting at a Quran with a Saiga 410K shotgun, a Russian-made weapon. The video sparked predictable outrage. A BBC investigation later determined that it had almost certainly been produced by the Olgino troll factory. The "soldier's" uniform was not one actually used by the U.S. military but was readily available for purchase in Russia, and the man in the video was apparently a bartender from Saint Petersburg with connections to troll factory employees.
Another operation targeted the Netherlands. A video attributed to Ukraine's Azov Battalion—a controversial military unit with far-right associations—showed masked soldiers threatening Dutch citizens over an upcoming referendum on a European Union association agreement with Ukraine. The citizen journalism site Bellingcat traced the video back to the same Saint Petersburg operation.
These weren't just lies. They were theatrical productions, complete with costumes, scripts, and distribution strategies designed to make fabricated events look like authentic documentation of real occurrences. The trolls weren't merely arguing that their side was right; they were manufacturing "evidence" that could be cited, shared, and used to win arguments they had effectively rigged in advance.
The Roots Run Deep
The Internet Research Agency didn't emerge from nowhere. Its methods had precedents in Soviet intelligence tradecraft that stretched back decades.
Vladimir Putin's first assignment with the KGB, in the late 1970s, was with the Fifth Department—the unit responsible for countering dissidents through disinformation and what intelligence professionals call "active measures." This department was championed by Filipp Bobkov and KGB chief Yuri Andropov, who believed in "stamping out dissent" through psychological and information operations rather than merely through physical coercion.
Active measures included spreading false rumors, forging documents, creating front organizations, and manipulating media coverage. The techniques were refined over decades of Cold War competition. What the Internet Research Agency represented was the adaptation of these Soviet-era methods to the age of social media—where the barriers to entry for manipulation had dropped dramatically and the potential reach of a single piece of content had expanded exponentially.
The Soviet Union had needed to cultivate sympathetic journalists, establish front publications, and work through a careful network of human assets to spread disinformation in the West. The Internet Research Agency could accomplish similar goals with a building full of young people, some basic training, and an internet connection.
The Assembly Line Continues
Even after the Internet Research Agency officially shut down in 2023, the model it pioneered didn't disappear. In 2024, an investigation by multiple media outlets revealed documents leaked from an organization called the "Agency of Social Design," which had produced nearly forty thousand pieces of content—memes, images, comments—over just four months. This content was deployed in targeted campaigns against the governments of France, Poland, Germany, and Ukraine.
The troll factory had been a proof of concept. It demonstrated that industrialized deception could be cost-effective, scalable, and difficult to counter. Even when one operation was exposed and shut down, the techniques and trained personnel could be dispersed to new organizations. The assembly line for lies had spawned imitators and successors.
The fundamental insight that drove the Internet Research Agency—that social media platforms could be exploited to manufacture fake grassroots movements, erode trust in institutions, and inject chaos into democratic discourse—has become common knowledge. Other governments have copied the model. Private companies have adopted similar techniques for commercial purposes. The genie, as they say, is out of the bottle.
What the Trolls Wrought
The lasting impact of the Internet Research Agency is difficult to measure precisely. Did it change the outcome of elections? Shift public opinion on specific issues? Permanently damage trust in media and democratic institutions?
The honest answer is that we don't know with certainty. Social media influence is notoriously difficult to quantify. People are exposed to thousands of messages daily, and isolating the effect of any particular campaign is methodologically challenging. The Russian operation was undoubtedly real and substantial in scale, but determining its actual effects requires distinguishing signal from noise in an environment that is almost entirely noise.
What we can say with more confidence is that the Internet Research Agency helped pioneer a new form of conflict—one in which the weapons are narratives, the battlefield is social media, and the targets are the shared epistemological foundations that allow democratic societies to function. The trolls from Olgino didn't need to convince everyone of specific lies. They just needed to make enough people uncertain about enough things that collective action became harder, institutions became less trusted, and social cohesion frayed.
A factory for lies, running in shifts, meeting quotas, earning bonuses for effective deceptions. It sounds almost too cynical to be real. But it was real, and its consequences are still unfolding.