Democratization of the internet: Baby-step, giant-step?

Do we really live in a democracy? As far as our life on the internet is concerned, the answer so far has been a resounding “no”. Giant online platforms such as Facebook, search engines like Google, or mega marketplaces like Amazon haven’t taken any responsibility for what has been going on “in their houses”.

A new regulation is in the pipeline to try to change this. The DSA – the Digital Services Act – is part of a digital package proposed by the EU Commission at the end of 2020. The original stated goal was to “create a safer digital space where the fundamental rights of users are protected”.

The law follows a fundamental principle: what is illegal offline should also be illegal online. This applies, for example, to hate speech and racist propaganda, but also to counterfeit products sold on online marketplaces. 

In the early hours of Saturday morning, April 23, 2022, EU negotiators agreed on a common legislative text for the Digital Services Act in a so-called trilogue meeting between the European Council, the EU Commission and the European Parliament. It marked the end of long rounds of negotiations.

To a large extent, the result is a success. There are improvements in many areas compared to the status quo. But the status quo basically didn’t lay down any rules for many platforms. 

“This is a good start towards more digital democracy, even if there is still a long way to go”, says Martin Schirdewan, who helped negotiate the legislative package for the Left.


Highs and lows of the result

👍Marketplaces will be obliged to check suppliers so that fewer counterfeit products end up on the internet. Manipulative “dark patterns” that push consumers toward a purchase decision will be banned; this is a huge success for the Left, which pushed for such a ban. In other respects, too, such misleading user interfaces – for example in cookie selection – will be largely banned. Sensitive data such as religious beliefs, sexual preferences or political views may only be used for targeted advertising to a limited extent. Minors will no longer receive targeted advertising.

👎But: It is disappointing that no comprehensive ban on personalized advertising has made it into the law. Corporate Europe’s interests won over user protection. 

👍Platforms must submit yearly risk assessments and examine how their services affect society. In addition, scientists and non-governmental organizations should get better data access rights to verify this, so that we can better understand how platforms work behind the scenes.

👎But: This only applies to very large platforms and search engines with more than 45 million users in the EU.

👎There is no ban on automated filtering systems for content moderation. AI systems remain a risk to freedom of expression when large platforms want to save on human staff.

👎The Digital Services Act could have made a clear statement on the protection of end-to-end encryption. This was prevented by the usual fans of mass surveillance.

Disinformation vs freedom of speech

👍In principle, major platforms must remove illegal content such as hate speech, incitement to violence or terrorist propaganda quickly when they are informed about it. Users should be able to easily report such content. They should also have the possibility to challenge the deletion decisions of the platforms and to claim compensation.

Facebook &amp; Co. must inform users directly about deleted content, shadowbans and suspensions, and offer complaint channels. In addition, there should be better complaint procedures in cases of wrongful account blocking and deletion, and clear rules governing the removal of illegal content.

👎But who decides where hate speech starts and where it ends? What constitutes illegal content differs from country to country. Statements that would be considered insulting or defamatory in Poland could possibly be permissible as free speech in Germany, and vice versa. What criteria will the EU use to decide?

👍The EU Commission, in conjunction with regulators in the 27 member states, is to be the lead agency for enforcement against the major global platforms. New regulatory authorities in the member states and a new, properly resourced EU agency could be a further step. Following the “polluter pays” principle, the major platforms are to fund their own supervision with up to 0.05% of their annual global turnover. That is a welcome step. We will push for this decision to be consistently implemented and monitored.

The final documents with the technical details of the political agreement will take weeks to be published, with the final votes in Parliament and Council possibly taking place before summer. It remains to be seen whether the new rules will really have the power to hold internet giants accountable. We need to follow implementation and enforcement closely, identify mistakes and weaknesses, and push for improvements.



Related MEPs and contact person