Digital Emigration

How to Build Democratic Resilience in the Age of Platform Algorithms

In the weeks leading up to the 2024 European Parliament elections, coordinated attempts were made to manipulate online political debate across the European Union. What distinguished this election cycle was not the absence of interference, but the response: several operations were identified and dismantled early, before they could escalate into the visible disruption seen in previous elections.

This shift followed the Digital Services Act taking full effect. It is a hopeful development for democratic resilience at a time when elections are increasingly exposed to interference, disinformation, and organised propaganda.

What the Digital Services Act is

The Digital Services Act (DSA) is an EU regulation adopted in 2022 and fully applicable since early 2024. It sets binding obligations for online platforms, especially very large ones such as Google, Meta, TikTok, and X, to address illegal content, systemic risks, and the societal effects of algorithmic systems. Unlike earlier voluntary codes of conduct, the DSA grants regulators inspection powers, audit rights, and the ability to impose meaningful financial penalties.

At its core, the DSA shifts responsibility. Platforms are no longer assessed only on whether they remove illegal content when notified, but on whether their systems create predictable risks for society, including risks to democratic processes.

Why Europe created it

Over the past decade, the prevailing model of platform governance proved insufficient. Repeated election interference, coordinated disinformation, opaque recommender systems, and the limits of self-regulation eroded trust in both platforms and democratic institutions. As public communication grew dependent on engagement-driven private platforms while authorities lacked visibility into how influence spread, a content problem turned into a systemic one. When risks to elections and public debate are shaped beyond democratic oversight, the issue becomes one of sovereignty rather than moderation.

The Digital Services Act is the EU’s response. It reasserts public authority over how critical digital platforms operate within the Union by reshaping incentives and imposing enforceable obligations. What it does not change is ownership, jurisdiction, or structural dependency. European public debate still runs largely on foreign-controlled systems, and regulation can constrain behaviour within those systems without removing that dependency.

Where the DSA already works

Election interference contained under the DSA (EU, 2024)

In the run‑up to the 2024 European Parliament elections, multiple coordinated networks attempted to manipulate political debate across EU member states. These operations relied on fake accounts, impersonation of voters and public figures, and artificially inflated engagement to boost political messages.

Under the Digital Services Act, very large platforms were required to assess election‑related systemic risks and demonstrate concrete mitigation measures. As a result, several platforms dismantled these networks early, at scale, and across entire account clusters rather than reacting to individual posts.

The significance of this case is not that interference ceased, but that it was contained before it escalated. The DSA shifted incentives and timing, pushing platforms to intervene earlier, at greater scale, and under regulatory oversight.

This illustrates where the current DSA is strongest. It is most effective when manipulation relies on clearly deceptive behaviour such as fake identities, coordinated inauthentic activity, or impersonation.

Transparency and inspection

A second area where the DSA has already altered platform behaviour is transparency. Very large platforms are now required to explain key aspects of their recommender systems and to provide data access to vetted researchers. While implementation remains uneven, the obligation itself has shifted the balance of power between platforms and regulators. Practices that were previously opaque are now, at least in principle, subject to inspection and challenge.

Together, these changes mark a clear break from the fragmented and largely voluntary regime that preceded the DSA.

The pushback from abroad

The DSA has not gone uncontested. Critics, particularly in the United States, have framed it as an attack on free speech or an attempt to export European regulation beyond EU borders. Large technology companies have warned about compliance costs, legal uncertainty, and conflicts with non‑EU legal traditions.

From a European perspective, this pushback reflects a more fundamental tension: a jurisdictional conflict over who sets the rules for digital public spaces. The DSA asserts that when platforms shape democratic life within the EU, they fall under European regulatory authority, regardless of where they are headquartered.

Where the DSA falls short

The limits of the DSA become visible when influence is lawful, domestic, and algorithmically amplified.

In Germany, content from the Alternative für Deutschland (AfD) achieved disproportionate reach on platforms such as TikTok, YouTube Shorts, and X during the 2024–2025 election cycle. This was not driven by bots, fake accounts, or illegal speech. It was driven by recommendation systems optimised for engagement and polarisation.

From a regulatory perspective, nothing unlawful occurred. Yet the outcome raised a difficult question: what happens when democratic discourse is reshaped not by deception, but by platform design choices that systematically reward certain types of content?

The current DSA offers few tools for this scenario. Platforms can comply with audits and transparency requirements while still amplifying lawful content in ways that materially affect political competition.

What a DSA 2.0 could include

If Europe wants to address lawful but distortive amplification, a next iteration would need to move beyond content legality and focus on algorithmic power. Concrete measures could include:

1. Enforceable algorithmic transparency
Not just high‑level descriptions, but standardised, comparable disclosures on how recommender systems rank, amplify, and suppress political content during election periods. This would include audit‑ready metrics on reach, virality, and amplification effects, with penalties for incomplete or misleading disclosures.

2. Regulation of amplification as a business model
Where recommender systems are demonstrably optimised to maximise outrage, polarisation, or extremity because doing so increases engagement and revenue, that optimisation itself should qualify as a systemic risk. DSA 2.0 could require mitigation plans or impose constraints when amplification patterns measurably skew public debate; a minimal sketch of what such disclosure and skew metrics could look like follows this list.

3. Rules for default algorithms and silent updates
Default recommender settings matter because most users never change them. A strengthened regime could require that election‑relevant algorithmic changes, including so‑called silent updates, be logged, disclosed to regulators, and subject to temporary restrictions during sensitive periods.

4. Stronger interim powers
When credible evidence suggests algorithmic distortion of democratic processes, regulators would need the ability to impose temporary safeguards while investigations continue, rather than waiting for ex‑post findings.
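To make the first two measures less abstract, here is a minimal sketch, in Python, of what a standardised amplification disclosure and a crude skew test could look like. Every name, field, and threshold in it is an illustrative assumption; the DSA defines no such record format or metric.

```python
# Illustrative sketch only: the record format, the amplification ratio, and
# the median-based threshold are assumptions for discussion, not anything
# defined in the DSA or used by any platform.
from dataclasses import dataclass


@dataclass
class AmplificationDisclosure:
    """One audit-ready record per political actor and reporting window."""
    actor: str                    # e.g. a party or campaign account cluster
    organic_impressions: int      # reach from followers and subscriptions
    recommended_impressions: int  # reach injected by the recommender system

    def amplification_ratio(self) -> float:
        """Recommender-driven reach relative to organic reach."""
        if self.organic_impressions == 0:
            return float("inf")
        return self.recommended_impressions / self.organic_impressions


def flag_skew(disclosures: list[AmplificationDisclosure],
              factor: float = 3.0) -> list[str]:
    """Flag actors whose ratio exceeds the median by a fixed factor,
    a deliberately crude stand-in for 'measurably skewed' amplification."""
    ratios = sorted(d.amplification_ratio() for d in disclosures)
    median = ratios[len(ratios) // 2]
    return [d.actor for d in disclosures
            if d.amplification_ratio() > factor * median]


# Example: one actor receives far more recommender-driven reach than peers.
records = [
    AmplificationDisclosure("party_a", 1_000_000, 900_000),
    AmplificationDisclosure("party_b", 800_000, 700_000),
    AmplificationDisclosure("party_c", 600_000, 9_000_000),
]
print(flag_skew(records))  # ['party_c']
```

A real regime would need legally defined metrics, longer observation windows, and contestable thresholds. The point of the sketch is narrower: once disclosures are standardised and machine-comparable, cross-platform audits of amplification become a routine computation rather than a negotiation.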

Potential benefits
These measures would not determine which views may be expressed. They would instead limit the ability of platforms to invisibly reshape political competition through design choices. The result would be greater predictability, accountability, and resilience of public debate.

Expected pushback
Platforms would likely argue that such measures threaten innovation, reveal trade secrets, or conflict with free‑expression norms. The underlying dispute, however, would remain the same one the DSA already raises: who decides whether profit‑driven optimisation may outweigh democratic safeguards when platforms function as core civic infrastructure.

Why Europe will need to go further

The Digital Services Act has strengthened Europe’s ability to defend its digital public space. It has increased transparency, constrained some of the most visible forms of manipulation, and restored a degree of public authority over platform behaviour.

At the same time, its limits are structural. Systems that shape democratic outcomes remain largely controlled elsewhere. Europe can influence how these systems operate, but it does not control them.

If democratic resilience is the objective, stopping at the current DSA is not sufficient. Civil society depends on shared facts, pluralism, and institutions that are not systematically undermined by engagement‑driven optimisation. When public debate is shaped by systems designed primarily to maximise profit rather than democratic stability, the long‑term outcome is predictable: concentration of power, erosion of trust, and political capture.

Europe’s choice is therefore not between regulation and free expression. It is between exercising democratic authority over critical digital infrastructure, or accepting a trajectory in which civic life formally persists while substantive power drifts towards a small number of dominant platforms.