
EU poised to impose sweeping social media regulation with Digital Services Act

The Digital Services Act would regulate everything from “dark patterns” and algorithms to public safety threats and illegal content.

The European Union is on the verge of doing what the U.S. has not done (and, in some cases, could not do) — comprehensively regulate social media platforms. Last week, the European Parliament and EU Council reached an agreement on the Digital Services Act, and while the final text has not been released, the law would impose sweeping new rules for internet platforms, regulating everything from “dark patterns” and algorithms to public safety threats and illegal content.

The DSA and its partner regulation, the Digital Markets Act, were introduced to the European Parliament in 2020. The European Commission said the regulations were intended to accomplish two goals: “create a safer digital space in which the fundamental rights of all users of digital services are protected” and “establish a level playing field to foster innovation, growth, and competitiveness, both in the European Single Market and globally.”

The DSA covers a lot of ground. The current version includes provisions prohibiting misleading interfaces that prompt users to make decisions they might not otherwise make, and it compels large platforms to comply with stricter obligations on disinformation, political ad transparency and hate speech, among other things. It also gives users the option to opt out of recommendation systems based on behavioral profiling, restricts micro-targeted advertising, including bans on targeting minors and on ads that use sensitive personal data, and requires large platforms to conduct annual analysis and reporting on what the EU says are the systemic risks of their services.

Unlike prior voluntary standards, the DSA comes with teeth: the European Commission has exclusive power to fine the largest platforms and search engines up to six percent of their global annual turnover for non-compliance. For big tech companies, that represents billions of dollars.

One of the more significant changes is the centralized governance of large platforms under the DSA. Platforms with more than 45 million monthly active users in the EU will be directly and exclusively supervised by the European Commission, rather than under the country-of-origin principle, under which an internet-based service provider established in one member state is bound only by that country’s laws. The DSA’s streamlined enforcement against large platforms responds to complaints of lax oversight under Europe’s comprehensive data regulation, the General Data Protection Regulation, which is enforced under the country-of-origin principle.

One of the final points added to the DSA was introduced in light of Russia’s recent military invasion of Ukraine and the internet’s role as a conduit for information warfare. The Crisis Response Protocol is a mechanism that would allow the European Commission, in consultation with member states, to declare a state of emergency and require content removal during such a crisis. This provision codifies the kind of action the EU recently took when it ordered platforms to take down content from the Kremlin-backed media outlets RT and Sputnik, flagging the content as state propaganda and disinformation.

Now that the European Parliament and the Council have reached a political agreement on the law, it must be formally approved by the European Parliament after technical and legal verification, and it will take effect 15 months later. And while the law only regulates internet activity in Europe, many predict that, given its scope, it will have global effects.


Like what you’ve read? Sign up to get the full This Week in Technology + Press Freedom newsletter delivered straight to your inbox!

The Technology and Press Freedom Project at the Reporters Committee for Freedom of the Press uses integrated advocacy — combining the law, policy analysis, and public education — to defend and promote press rights on issues at the intersection of technology and press freedom, such as reporter-source confidentiality protections, electronic surveillance law and policy, and content regulation online and in other media. TPFP is directed by Reporters Committee attorney Gabe Rottman. He works with Stanton Foundation National Security/Free Press Legal Fellow Grayson Clary and Technology and Press Freedom Project Legal Fellow Gillian Vernick.
