
The Digital Services Act – Ground rules for online platforms

On 23 April 2022 another important step was taken towards realizing the EU’s ambitious goal to make Europe “Fit for the Digital Age”: the EU institutions and the Member States reached a political agreement on the terms of the Digital Services Act (DSA) (COM/2020/825 final), with a target date of 1 January 2024. Whilst the recently approved Digital Markets Act (DMA) (COM/2020/842 final) seeks to restrict the power of tech giants, the DSA aims to create a safer digital space for all users of online services, in particular by obligating platforms to remove illegal content quickly.

Content is deemed illegal where it does not comply with EU law or the law of a Member State. The term therefore encompasses a broad spectrum of material, ranging from hate speech and terrorist content to copyright infringements.

The targets of the Act are online intermediaries addressing EU consumers, with the Act sub-dividing these intermediaries based on the services provided. The obligations vary according to their size, role, and impact on the online ecosystem. They become progressively more onerous along the spectrum from intermediary services that simply offer network infrastructure (e.g. internet access providers), through hosting services (e.g. cloud and webhosting services), to online platforms (e.g. online marketplaces, app stores and social media companies) and, finally, very large online platforms.

A failure to comply with the new obligations set out by the DSA can result in fines of up to 6% of the infringing company’s global annual revenue, higher than the 4% maximum under the GDPR.

Liability Regime of the e-Commerce Directive

Surprisingly, the DSA does not provide updated liability regimes for information society services, instead adopting the regimes of the e-Commerce Directive (Dir. 2000/31/EC) largely unchanged. These provide that information society services are only liable for third-party content if they fail to remove it after becoming aware of its illegality. The DSA mirrors the liability provisions almost word for word and uses them as a starting point to establish procedures for reporting and promptly removing illegal content. By maintaining this liability regime, the DSA is more lenient than the European Commission had initially suggested.

“Notice-and-Action” Procedures

The DSA imposes an obligation on all service providers to institute notice-and-action procedures (or notice-and-takedown procedures). These must be easily accessible and user-friendly, allowing users to flag any potentially illegal content. On receipt of a valid user notice, a provider is deemed aware of the illegal content, and failure to remove it in a timely fashion can give rise to liability. The provisions of the DSA specify which details must be included in such a notice to create positive knowledge.

Where user content is subsequently blocked or removed, the infringing user must be provided with sufficient reasons for this decision. Additionally, the service provider must inform both the notifying party and alleged infringer of the redress possibilities available against the decision. Where the service provider is an online platform, it must set up an internal complaint system which allows users to appeal decisions by the platform to block allegedly illegal content.

Further Obligations for all Online Platforms

On top of the duty to establish an internal complaints system, online platforms – apart from micro and small enterprises – have additional obligations.

As part of the notice-and-action procedure, mechanisms must be established to cooperate with so-called “trusted flaggers”, who regularly and reliably flag illegal content. Notices submitted by trusted flaggers must be given preferential treatment and be dealt with more rapidly than others. Users who repeatedly disseminate illegal content are to be suspended for a reasonable period of time.

The DSA further aims for transparency and protection with regard to advertising, going so far as to ban ads that target children or rely on special categories of personal data, such as ethnicity, political views or sexual orientation. Online platforms must also disclose specific information about the algorithms that determine how content is recommended to users (e.g. ranking mechanisms) and offer users choices in this regard.

Additional Obligations for “Very Large Online Platforms”

Due to their considerable reach and influence, “very large online platforms” are placed under additional obligations to mitigate the specific risks they pose regarding the spread of illegal content online. The threshold for being deemed a “very large online platform” is crossed where an online platform has more than 45 million monthly active users within the EU.

Under the DSA, such companies are subject to stringent due diligence requirements, including mandatory risk assessments and corresponding mitigation measures, as well as regular audits of the systems put in place to conduct such assessments. The appointment of a compliance officer is mandatory, as well as setting up a repository containing detailed information on advertisements displayed on their service within the last year.

Finally, such platforms are placed under the supervision of the European Commission and so-called “Digital Services Coordinators”, which are to be appointed by the Member States. For the first time, there will be a degree of independent public oversight of the platforms.


This political agreement must now be formally approved by the EU institutions. As a Regulation, the DSA will have direct effect within the EU Member States once it is adopted. It could come into force as early as 1 January 2024. An exception is included for very large online platforms, to which the DSA will apply four months after they have been designated as such.

“(…) The DSA will upgrade the ground-rules for all online services in the EU. It will ensure that the online environment remains a safe space, safeguarding freedom of expression and opportunities for digital businesses. It gives practical effect to the principle that what is illegal offline, should be illegal online. (…)” – European Commission President Ursula von der Leyen

