
Digital Services Act: The New Foundation for Europe's Digital Single Market

Image: EU flag and coffee cup with a "like". © Edited via Canva.com

Twenty years after the introduction of the European E-Commerce Directive, the European Commission yesterday presented the Digital Services Act (DSA) to complement the aging Directive. When the E-Commerce Directive, which still regulates the internal market for online services, was introduced, social media platforms as we know them today did not exist. Regulation tailored to the requirements of the platform economy is therefore urgently needed. Alongside the Digital Services Act, the Commission also presented the Digital Markets Act (DMA), which is directed at platforms with a gatekeeper function. The DMA supplements competition law and is intended to limit the market power of large digital corporations.

The E-Commerce Directive, which remains in force, contains an instrument that is extremely important for the functioning of the internet, and of platforms in particular: the so-called "notice and takedown" procedure. In short, it exempts platforms from liability for illegal third-party content as long as they have no knowledge of it. Once they are made aware of such content, however, they must act and remove it; otherwise they become liable for it. The Digital Services Act builds on the E-Commerce Directive and retains this instrument. This is very welcome, not least because a comparable provision in the USA, the so-called Section 230, is heavily disputed and President-elect Joe Biden has called for its repeal.

The Digital Services Act is intended, among other things, to better regulate illegal content on platforms and to have it removed in compliance with the European Charter of Fundamental Rights. What sounds good at first raises the same problems we already know from the German Network Enforcement Act (NetzDG): platforms are to decide for themselves what is illegal. That is a task for the judiciary, not for private companies. At the same time, the DSA does not define what constitutes "illegal content"; this is, quite rightly, regulated in other pieces of legislation at the European or national level. It is positive, however, that the DSA, like the NetzDG, requires platform operators active in the European Union to designate a contact person. Likewise, each member state must appoint a "Digital Services Coordinator" to supervise compliance with the rules in that member state; together, the coordinators will form the "European Board for Digital Services", which will assist the European Commission as an advisory body.

The Digital Services Act also brings some improvements for users of digital platforms, especially in relation to content removed under a platform's own community standards. For example, users must be told why their content has been removed from the platform. Under the DSA, the platform must offer ways to appeal the decision and provide a mechanism for dispute resolution. It is also to be welcomed that the DSA provides safeguards against misuse of the reporting function for posts; after all, false reports are often used to try to silence unpopular opinions. Platforms are therefore advised to adopt rules for temporarily blocking such users and to explain this procedure in their terms and conditions. The terms and conditions should also state, in comprehensible language, whether content is moderated by humans or by algorithms.

The Digital Services Act scales its obligations to the size of the platform. It explicitly notes that "very large platforms" have a very different impact on European societies, and defines them as platforms with more than 45 million users, equivalent to 10 percent of the EU population. The penalties provided for in the Digital Services Act are considerable: fines of up to 6 percent of annual turnover are possible for particularly serious violations.

Users should be able to better understand how the content displayed to them is composed. To this end, very large platforms must disclose the parameters of their recommender systems (e.g. the News Feed) and allow users to choose alternative settings. This includes the option of a neutral ordering of content that is not based on the preferences the platform infers for the user. Users should also be able to see why they are shown a particular advertisement, i.e. on the basis of which parameters the so-called micro-targeting was carried out, and who paid for the ad.

As already announced in the "European Democracy Action Plan" presented at the beginning of December, the Digital Services Act contains provisions intended to limit the spread of disinformation. Online platforms are required to draw up a code of conduct outlining how they intend to deal with content that is not illegal but nevertheless harmful. This includes how they handle fake accounts and bots, which often help spread disinformation and other harmful but legal content. Platforms that have no code of conduct and cannot justify its absence risk being found in breach of the DSA. The obligation to provide data for research purposes, announced in the Action Plan, is also reflected in the DSA.

It is also to be welcomed that very large platforms are to have plans in place for crisis situations such as pandemics, earthquakes or terrorist attacks. To assess and mitigate risks, these platforms are also encouraged to involve users, particularly affected persons, as well as independent experts and representatives of civil society, in their measures. This is an important step, especially in light of the genocide of the Rohingya in Myanmar, which was fuelled by disinformation and hate speech on Facebook, to which the platform failed to respond for a long time.

The Digital Services Act could also apply to channels (and possibly groups) on Telegram and would thus reach further than the German NetzDG, which contains a loophole for messengers such as Telegram even though they also enable public communication. This would mean that Telegram, too, would have to name a contact person in Europe. The DSA is not intended to apply to private communication via messenger or e-mail, only to groups that are open to the public.

The Digital Services Act is intended to create uniform regulation for the European digital single market and to counter the patchwork of national legislation that has arisen through, for example, the German NetzDG or the French "Avia Law". It does not replace national laws, however, but supplements and harmonises them. The DSA aspires to set international standards. That it seeks to do so by privatising law enforcement to the platforms, as the NetzDG already did, deserves sharp criticism. It is to be hoped that the European Parliament will push for a more sensible solution in the negotiations on the final text of the regulation.

Ann-Cathrin Riedel
Digitalisation and Global Innovation Officer at FNF