Wikipedia Deep Dive

Digital Services Act

Based on Wikipedia: Digital Services Act

In April 2022, European Union negotiators emerged from sixteen hours of marathon talks in Brussels with something unprecedented: a comprehensive rulebook that would force the world's largest technology companies to open their black boxes. The Digital Services Act, or DSA, represents the most ambitious attempt by any government to regulate how social media platforms, search engines, and online marketplaces operate. And because companies like Meta, Google, and TikTok cannot afford to abandon a market of 450 million consumers, what happens in Brussels increasingly shapes the internet everywhere.

Why Europe Decided to Act

To understand the DSA, you need to go back to the year 2000. That's when the European Union passed its Electronic Commerce Directive, a law crafted when Google was a scrappy startup, Facebook didn't exist, and the iPhone was still seven years away. The directive was designed for an internet of static websites and email, not algorithmic feeds that can shape elections or recommendation systems that can radicalize teenagers.

By the late 2010s, the gap between law and reality had become a chasm. Individual European countries started improvising. Germany passed NetzDG in 2017, requiring platforms to remove hate speech within 24 hours or face fines up to 50 million euros. Austria followed with a similar law. France attempted its own version, the Loi Avia, though France's Constitutional Council struck down most of its core provisions.

This patchwork created headaches for everyone. Tech companies faced different rules in different countries. Users had wildly inconsistent protections depending on which side of a border they lived on. And smaller platforms that lacked armies of lawyers struggled to comply with the maze of requirements.

When Ursula von der Leyen campaigned to lead the European Commission in 2019, she promised a "new Digital Services Act" that would replace this chaos with a single, coherent framework. The law that emerged doesn't just update the 2000 directive—it fundamentally reimagines the relationship between platforms, governments, and citizens.

The Tiered System: Not All Platforms Are Treated Equal

One of the DSA's cleverest innovations is its graduated approach. Rather than applying identical rules to a neighborhood forum and to YouTube, the law creates tiers based on size and risk.

At the base level, all digital intermediary services face certain obligations. An intermediary service is any business that transmits, stores, or hosts information on behalf of others—everything from internet service providers to cloud hosting companies to online platforms.

Online platforms face additional requirements. These include social networks, app stores, online marketplaces, and yes, pornographic websites. If you host user-generated content and make it publicly available, you're an online platform under the DSA.

But the most stringent rules apply to what the law calls Very Large Online Platforms, or VLOPs, and Very Large Online Search Engines, or VLOSEs. The threshold is 45 million monthly active users in the European Union—roughly ten percent of the bloc's population. Cross that line, and you enter a regulatory category all your own.

This matters because a tiny startup with a thousand users probably can't afford the compliance infrastructure that Meta has. Identical requirements would either crush innovation or go unenforced. The tiered system lets regulators focus their heaviest artillery on the platforms with the greatest reach and potential for harm.
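
To make the tiering concrete, here is a minimal sketch in Python of how a service might be bucketed. The 45-million threshold comes from the law; the function name, arguments, and category labels are illustrative simplifications, not legal definitions.

```python
# Simplified sketch of the DSA's tiered classification.
# Categories and function names are illustrative, not legal terms of art.

VLOP_THRESHOLD = 45_000_000  # monthly active users in the EU

def classify_service(hosts_user_content: bool,
                     publicly_available: bool,
                     monthly_active_eu_users: int,
                     is_search_engine: bool = False) -> str:
    """Roughly bucket a digital service into the DSA's tiers."""
    if monthly_active_eu_users >= VLOP_THRESHOLD:
        # Largest services face the strictest obligations.
        return "VLOSE" if is_search_engine else "VLOP"
    if hosts_user_content and publicly_available:
        return "online platform"
    return "intermediary service"

# A social network with 50 million EU users lands in the top tier.
print(classify_service(True, True, 50_000_000))  # -> VLOP
print(classify_service(True, True, 2_000_000))   # -> online platform
```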

What the Big Platforms Must Do

For the tech giants designated as VLOPs or VLOSEs, the DSA imposes obligations that would have been unthinkable a decade ago.

First, they must conduct regular assessments of systemic risks. This isn't about individual pieces of illegal content—it's about examining whether the platform's fundamental design choices are harming society. Does the recommendation algorithm push users toward extremism? Does the advertising system enable discrimination? Do the platform's features facilitate harassment campaigns? Companies must analyze these questions and take action to mitigate what they find.

Second, they must submit to independent audits. Think of this as a financial audit, but for algorithms and content policies. Outside experts examine whether companies are actually doing what they claim to do.

Third, and perhaps most consequentially, they must open their data to researchers. Article 40 of the DSA requires platforms to grant access to vetted academics and nonprofits who want to study how these systems affect society. For years, researchers have begged platforms for data while being mostly rebuffed. Now it's the law. As of October 2025, the procedures for requesting this access are finally in place, and researchers can apply through an official process.

The European Commission even created a new body to make this work: the European Centre for Algorithmic Transparency. Its job is to build the technical capacity to actually understand what's happening inside these recommendation systems and content moderation engines.

The Conditional Liability Shield

Here's where things get technical, but stay with me—this is the legal foundation underlying everything else.

When someone posts something illegal on a platform, who's responsible? The person who posted it, obviously. But what about the company hosting it?

In the United States, Section 230 of the Communications Decency Act gives platforms sweeping immunity. With narrow exceptions, platforms aren't treated as the publisher of content their users create. This has enabled companies to host billions of posts without facing the legal consequences that would attach if they were, say, a newspaper responsible for every article.

The European approach is different. The DSA maintains what's called a "conditional liability exemption." Platforms aren't automatically liable for user content they host. But once they're notified that specific content is illegal—once they have knowledge—they must act expeditiously to remove it. If they don't, they lose their protection.

This creates powerful incentives. Platforms can't simply ignore illegal content and hide behind immunity. They must build systems to receive and process complaints. They must make decisions about what stays up and what comes down. And those decisions must be transparent.
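
A toy sketch of that conditional logic, under the simplifying assumption that "knowledge" means a valid notice and "expeditious action" is a simple yes-or-no, might look like this. The function and argument names are hypothetical.

```python
# Toy model of the conditional liability exemption: the host keeps its
# shield unless it has been notified of specific illegal content and
# failed to act expeditiously. Names are illustrative only.

def host_retains_exemption(notified_of_illegal_content: bool,
                           acted_expeditiously: bool) -> bool:
    """True if the hosting service keeps its liability shield in this simplified model."""
    if not notified_of_illegal_content:
        return True              # no knowledge: exemption holds
    return acted_expeditiously   # with knowledge, only prompt action preserves it

print(host_retains_exemption(False, False))  # True  - never notified
print(host_retains_exemption(True, True))    # True  - notified, removed promptly
print(host_retains_exemption(True, False))   # False - notified, did nothing
```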

Notice and Action

The DSA codifies a "notice and action" system that governs how illegal content gets flagged and removed.

Anyone can submit a notice about potentially illegal content. Platforms must provide easy-to-use mechanisms for doing so. When a notice arrives, the platform must process it, make a decision, and communicate that decision to both the person who submitted the notice and the person who posted the content.

But here's the crucial part: platforms must explain their decisions. Not with boilerplate text, but with actual reasoning. Why was this content deemed illegal, or why wasn't it? What rule did it violate? What evidence was considered?

The European Commission hosts a DSA Transparency Database where platforms submit explanations for their moderation decisions. This creates an unprecedented public record. Researchers can analyze patterns. Journalists can investigate inconsistencies. Users can understand what rules actually mean in practice.
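
As a rough illustration, a single statement of reasons submitted to such a database might carry fields like the ones below. The field names and structure are hypothetical and simplified; they are not the Transparency Database's actual schema.

```python
# Illustrative sketch of a moderation "statement of reasons" record.
# Field names are hypothetical, not the official database schema.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class StatementOfReasons:
    platform: str
    decision: str               # e.g. "content removed", "visibility restricted"
    ground: str                 # e.g. "illegal content" or "terms-of-service violation"
    legal_or_policy_basis: str  # the specific law or rule invoked
    facts_considered: str       # the evidence behind the decision
    automated_detection: bool   # whether the content was flagged automatically
    decided_at: datetime

sor = StatementOfReasons(
    platform="ExampleSocial",
    decision="content removed",
    ground="illegal content",
    legal_or_policy_basis="national hate-speech law",
    facts_considered="post reported through the notice-and-action mechanism",
    automated_detection=False,
    decided_at=datetime(2024, 3, 1, 14, 30),
)
print(sor)
```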

Fighting Back Against Platform Decisions

One of the DSA's most user-friendly innovations is its appeals process. If a platform restricts your account or removes your content, you have multiple avenues to challenge that decision.

First, you can use the platform's own internal complaint system. Platforms must have one, and they must actually review complaints promptly—not just send automated rejections.

But the DSA goes further. If you're unsatisfied with the internal process, you can turn to certified out-of-court dispute settlement bodies. These are independent organizations authorized to review platform decisions and issue recommendations.

One such body is Appeals Centre Europe, initially funded by the same trust that operates Meta's Oversight Board. It reviews decisions from Facebook, Instagram, TikTok, Pinterest, Threads, and YouTube at no cost to users. If the dispute settlement body rules in your favor, the platform must pay all fees.

These bodies can't force platforms to comply—their recommendations aren't legally binding. But platforms must engage with them in good faith. And the mere existence of external review changes the calculus. Platforms know their decisions might face scrutiny beyond their own walls.

The Stakes: What Happens If Platforms Don't Comply

European regulators have given themselves serious enforcement tools.

Companies that violate the DSA can face fines of up to six percent of their global annual turnover. For a company like Meta, with revenues exceeding 100 billion dollars, that's potentially a six-billion-dollar penalty. For ongoing violations, the Commission can impose periodic penalties of up to five percent of average daily worldwide turnover for each day of delay.
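
A back-of-the-envelope calculation shows the scale of those ceilings, using an illustrative round figure of 100 billion dollars in annual turnover.

```python
# Rough check of the penalty ceilings described above.
# The 100-billion-dollar revenue figure is an illustrative round number.
annual_turnover = 100_000_000_000            # global annual turnover, in dollars

max_fine = 0.06 * annual_turnover            # up to 6% of annual turnover
print(f"Maximum one-off fine: ${max_fine:,.0f}")          # $6,000,000,000

daily_turnover = annual_turnover / 365       # average daily worldwide turnover
max_daily_penalty = 0.05 * daily_turnover    # up to 5% per day of delay
print(f"Maximum periodic penalty per day: ${max_daily_penalty:,.0f}")  # ~$13,700,000
```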

And in extreme cases—where a violation persists, causes serious harm to users, and involves criminal offenses threatening people's lives or safety—the Commission can request temporary suspension of the service. In theory, the EU could order a platform to go dark.

Whether regulators would actually go that far remains untested. But the threat exists, and companies must take it seriously.

The Russia Factor

The DSA might have moved more slowly through the legislative process if not for Russia's invasion of Ukraine in February 2022.

As Russian forces crossed the border, Kremlin-linked accounts flooded social media with disinformation. False narratives about Ukrainian bioweapons labs spread virally. Deepfake videos appeared purporting to show President Zelensky ordering surrender. Platforms struggled to keep up with the onslaught.

European policymakers, watching this unfold in real time, felt new urgency. The invasion demonstrated that platform manipulation wasn't just about consumer protection or fair competition—it was a national security issue. Two months later, in April 2022, negotiators finalized the DSA deal.

Free Speech and the European Approach

Any discussion of content regulation must grapple with free expression. The DSA was shaped significantly by two cases from the European Court of Human Rights that illustrate the tension.

In Delfi AS v. Estonia, an Estonian news website was held liable for hate speech in its comment section. The court applied what lawyers call proportionality analysis—weighing the platform's right to free expression against the harm caused by vicious comments targeting a specific individual. Given the severity of the hate speech, the court found liability was justified.

But in Magyar Tartalomszolgáltatók Egyesülete and Index.hu Zrt v. Hungary—mercifully abbreviated as MTE and Index.hu—the court reached a different conclusion. Hungarian platforms were held liable for user comments that were offensive but didn't rise to the level of hate speech. The court found this went too far. Imposing strict liability for any negative comment, without considering context, would chill legitimate expression.

The DSA tries to thread this needle. It distinguishes between different types of illegal content, with different procedures and safeguards depending on severity. It requires platforms to consider fundamental rights when making moderation decisions. It creates appeals processes to catch overreach. The goal is preventing harm without becoming a censorship regime.

Whether this balance works in practice is something only time and enforcement will reveal.

Who Enforces All This?

The DSA creates what regulators call a "hybrid enforcement framework"—a bureaucratic term for splitting responsibilities between European and national authorities.

The European Commission directly oversees the very large platforms and search engines. This was a deliberate choice. When the law was being negotiated, Member States had concerns about whether national regulators would actually enforce rules against tech giants. Ireland, where many platforms have their European headquarters, faced particular criticism for allegedly going easy on companies that brought jobs and tax revenue to Dublin.

For smaller services, enforcement falls to national Digital Services Coordinators—independent authorities designated by each Member State. These coordinators work together through a body called the European Board for Digital Services, which coordinates enforcement across borders without having binding decision-making power itself.

The Designated Giants

On April 25, 2023, the European Commission published its first list of designated VLOPs and VLOSEs. Nineteen services made the initial cut (seventeen platforms and two search engines), including the names you'd expect: Facebook, Instagram, TikTok, Twitter (as it was still called then), YouTube, Google Search, Bing, Amazon, and several others.

These platforms had four months to comply with the full DSA requirements—until August 25, 2023. All other covered services had until February 2024.

The list has grown since then. In December 2023, three adult content platforms were added. In 2024, the fast-fashion retailer Shein and the online marketplace Temu were designated after their explosive growth pushed them past the 45-million-user threshold. The online pornography site XNXX joined in July 2024.

Some companies have fought back. Amazon and Zalando both initiated legal proceedings challenging their designations. The question of exactly how to count "active users" turns out to be surprisingly contentious.

The Broader Package

The DSA wasn't passed in isolation. It's part of a suite of digital regulations that together represent Europe's attempt to shape how the internet evolves.

The Digital Markets Act, or DMA, passed alongside the DSA and focuses on competition. It designates certain platforms as "gatekeepers" and imposes rules to prevent them from abusing their dominance—requiring interoperability, prohibiting self-preferencing, and mandating data portability.

The European Democracy Action Plan, announced alongside these proposals, addresses political advertising and electoral manipulation specifically.

Together, these laws form a regulatory architecture unlike anything that exists in the United States or China. Europe is betting that democratic societies can set rules for technology without strangling innovation, that platforms can be held accountable without becoming arms of the state, and that transparency can be mandated without destroying the business models that make these services viable.

The Brussels Effect

Why does this matter beyond Europe's borders?

Technology companies face a choice. They can create one version of their products for Europe and another for everywhere else, maintaining separate content moderation systems, separate transparency reports, and separate algorithmic approaches. Or they can apply European standards globally, because building two different systems is expensive and complicated.

This dynamic is sometimes called the "Brussels Effect." When Europe regulates, the world often follows—not because other governments adopt the same laws, but because companies find it easier to build for the strictest rules and apply those standards everywhere.

We've seen this before. European privacy regulations influenced how companies handle data worldwide. European chemical safety standards shaped global supply chains. Now European platform regulations may reshape how social media operates from São Paulo to Singapore.

The DSA represents a gamble. Europe is betting that democratic oversight of technology platforms is both possible and necessary. It's betting that the problems caused by algorithmic amplification, viral misinformation, and targeted harassment are serious enough to justify substantial intervention. And it's betting that companies will comply rather than abandon one of the world's largest and wealthiest markets.

The next few years will reveal whether that gamble pays off. Enforcement is just beginning. Legal challenges are working through courts. The very large platforms are submitting their first transparency reports and conducting their first systemic risk assessments. Researchers are starting to access data they've been seeking for years.

Whatever happens, one thing is clear: the era of minimal platform regulation in Europe is over. The question now is whether the DSA becomes a model for the world or a cautionary tale about regulatory overreach. The answer may depend less on the law's text than on how vigorously and wisely it gets enforced.

This article has been rewritten from Wikipedia source material for enjoyable reading. Content may have been condensed, restructured, or simplified.