
The Digital Services Act – a new set of regulations for online platforms

What is the Digital Services Act?

On 16 November 2022, the Digital Services Act (hereinafter: "DSA") came into force. It targets online intermediary services and, in particular, online platforms. After the expiry of the transitional period on 17 February 2024, the DSA will be directly applicable throughout the EU.

The DSA builds on the e-Commerce Directive, which is implemented, in particular, in the German Telemedia Act (TMG). The DSA partly adopts and partly replaces the e-Commerce Directive, e.g. with regard to the exemptions from liability for user content. In other respects, the DSA supplements the e-Commerce Directive, for example with regard to information obligations for online service providers.

In large parts, however, the DSA goes beyond the provisions of the e-Commerce Directive: it contains various new provisions, especially regarding illegal user content, which has thus far scarcely been subject to legislation. The provisions of the German Network Enforcement Act (NetzDG) partly overlap with those of the DSA; however, the NetzDG essentially only applies to social networks with more than 2 million users in Germany. In any case, the provisions of the NetzDG will be superseded when the DSA becomes effective on 17 February 2024, as European law takes precedence over national provisions; the NetzDG will thus no longer be applicable.

Providers of online intermediary services, and especially online platforms, should therefore examine the impact of the DSA now, assess to what extent established processes, procedures and documentation meet the new requirements and, if necessary, update the services they offer.

Who is covered by the DSA?

The DSA applies to online intermediary services. This includes, in particular, online platforms.

The term ‘online intermediary services’ covers – as under its definition in the e-Commerce Directive – all types of information society services, such as ‘mere conduit’ (e.g. by telecommunications service providers), ‘caching’ (e.g. so-called proxy caching, where the provider temporarily stores user information so that it can be retrieved by the user or transmitted to third parties), or ‘hosting’, i.e. (permanently) storing user information.

As under the e-Commerce Directive, the ‘country of origin’ principle applies, which states that a service provider must (only) comply with the regulations of the Member State in which it is established. The DSA, however – like the General Data Protection Regulation (GDPR) – applies to all providers of online intermediary services who offer their services to users residing in the European Union (EU) (so-called ‘marketplace’ principle). The DSA applies regardless of whether the provider is established in the EU. Its scope of application is thus much broader than that of the e-Commerce Directive.

Depending on their size and the type of services offered, the tiered approach of the DSA imposes additional obligations on providers of online intermediary services. As a consequence, providers who ‘host’ user information must observe more comprehensive obligations than providers who ‘merely conduit’ or ‘cache’ user information. Where user information is not only stored but also disseminated to the public, and provided that this is not a minor and purely ancillary feature of another service, the online intermediary service is considered an ‘online platform’. Even more comprehensive obligations apply to providers of online platforms.

As the term ‘online platform’ falls within the definition of ‘online intermediary services’, it is equally broad. As such, it covers a variety of platforms that serve different functions (e.g. social networks and online marketplaces). In order to be classified as an ‘online platform’, it is usually already sufficient that an online intermediary service allows for the creation of user profiles or interaction with other users.

When are providers of online intermediary services liable for user content?

The e-Commerce Directive already contains provisions on the liability of providers for user content. These are implemented in paras. 7 et seq. of the German TMG and are adopted essentially unchanged in the DSA. They will thus apply equally under the DSA.

Following from these, providers of online intermediary services are exempted from liability for user content under certain circumstances, in particular if they do not interact with the user content or adopt it as their own. This is the case, e.g., if an online intermediary service qualifies as ‘mere conduit’ or ‘caching’ as per the definitions above. In case of ‘hosting’ of user information, this only applies if the provider of the online intermediary service, in particular of an online platform, has no actual knowledge of any illegal user content or takes immediate action as soon as it becomes aware of (evidently) illegal content (so-called notice-and-take-down procedure).

Thus, providers of online intermediary services are – still – under no general obligation to monitor their services for illegal user content.

However, as already mentioned, the DSA goes beyond the e-Commerce Directive in many respects and imposes increased requirements on providers of online intermediary services. These apply, for example, with regard to illegal user content and especially to the reporting of such content by users. These requirements must be met regardless of whether a provider of online intermediary services is exempted from liability as per paras. 7 et seq. TMG, and they go beyond the general and specific information requirements for service providers under paras. 5 et seq. TMG.

What does this mean for me as a provider (of an online platform)?

The DSA imposes increased obligations on providers of online intermediary services depending on their size and the type of services offered. These can essentially be divided into three categories:

  • Measures for dealing with (illegal) content: Illegal content is effectively dealt with through complaint and redress mechanisms that, e.g., allow the user to challenge the online platform’s content moderation decisions;
  • Transparency and reporting obligations: Transparency is increased through accountability and reporting obligations for providers of online intermediary services, e.g. on the algorithms used for recommendations;
  • Information obligations and requirements for the design of services: User autonomy and the protection of minors are strengthened through increased information obligations and requirements for the design of services.

Due to its enormous practical impact, this overview focuses on online platforms, i.e. only some of the online intermediary services covered by the DSA. Requirements which only apply to online platforms that enable consumers to conclude so-called ‘distance contracts’, i.e. contracts between a consumer and a business (B2C) where only means of distance communication are used, are also taken into account.

In principle, the provisions of the DSA apply to businesses of all sizes. For small businesses, i.e. businesses with fewer than 50 employees and a maximum annual turnover of € 10 million, certain requirements do not apply, as pointed out below.

Not dealt with in this overview are online platforms and online search engines with at least 45 million active users, i.e. about 10% of the EU population (‘very large online platforms’ or ‘very large online search engines’), and the additional requirements applicable to them.

Given the aforesaid, online platforms have to meet the following requirements:

Measures for dealing with (illegal) content 

Providers of online intermediary services must have mechanisms in place to detect and effectively deal with infringements of law, in particular with regard to illegal content. When designing and setting up its services, a provider must take into account both the fundamental rights of the user who reports illegal content and those of the user whose content is affected by such complaint. In addition, a provider must justify its moderation decision with regard to such content on a case-by-case basis.

In particular, the following obligations apply:

  • Establishing a reporting and redress mechanism;
  • Justifying moderation decisions (e.g. deletion of individual content, blocking of user accounts or failure to take appropriate action);
  • Notifying the competent authorities of detected criminal offences that might threaten the life or safety of a person.

The following obligations apply except for online platforms that are considered small businesses (see above):

  • Establishing an internal complaints management system for complaints against moderation decisions of user content;
  • Informing on access to out-of-court dispute settlement bodies regarding such complaints;
  • Blocking or suspending the processing of complaints from users in case of misuse.

Transparency and reporting obligations  

The DSA stipulates that the provider of an online intermediary service must design its service so that it is comprehensible and transparent for users. For this, infringements of law must be documented and subsequently reviewed.

In particular, the following obligations apply:

  • Designating a point of contact for national authorities;
  • Designating a point of contact for users;
  • Annually reporting on the number of administrative orders and reports/complaints by users as well as the number and outcome of content moderation decisions.

The following obligation applies except for online platforms that are considered small businesses (see above):

  • Reporting on the number of active users and infringement procedures (number of disputes submitted to out-of-court dispute settlement bodies, number of suspensions imposed and the like).

Information obligations and requirements for the design of services

Under the DSA, a provider of an online platform must inform consumers in a clear and concise manner about any measures taken (such as algorithms for recommendations) and online services offered. It therefore prohibits so-called dark patterns, i.e. practices that distort or impair the user’s ability to make an autonomous and informed choice or decision (e.g. influencing user decisions by displaying buttons for certain response options so that they appear larger and/or more prominent). If minors are involved, even more restrictive requirements apply. If an infringement of law occurs, a provider of an online platform must make available to the consumer all relevant information so as to allow the consumer to enforce their rights.

In particular, the following obligations apply:

  • Providing information in the General Terms and Conditions (GTC) on mechanisms used for content moderation and algorithms used for recommendations.

The following obligations apply except for online platforms that are considered small businesses (see above):

  • Designing of websites and mobile apps must not materially distort or impair the ability of users to make autonomous and informed choices or decisions;
  • Informing, e.g. on the identity of the advertiser, when displaying advertisements;
  • Designing of recommendation systems must allow users to recognize and influence the essential parameters;
  • Protecting minors by prohibiting advertising that typically addresses minors;
  • Informing on the identity of a business (B2C) in case of online purchases and designing websites and mobile apps in accordance with the applicable EU regulations;
  • Proactively informing affected consumers in case illegal products or services are sold via a supplier’s platform.

What are the sanctions in case the DSA is not complied with?

The Member States are responsible for enforcing the DSA. For this, each Member State must appoint a ‘Digital Services Coordinator’. The Digital Services Coordinator receives comprehensive enforcement powers, which include blocking an online service. In particular, the Digital Services Coordinator may impose fines on providers of online intermediary services. The maximum fine for violations of the DSA is 6 per cent of a provider’s global annual turnover. For violations such as providing misleading, incomplete or incorrect information or failing to tolerate an inspection, the fine can be up to 1 per cent of a provider’s annual turnover.
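For illustration, the fine ceilings can be expressed as a simple calculation. The following Python sketch uses a hypothetical provider turnover; only the percentage caps (6 per cent and 1 per cent) are taken from the DSA, and the actual fine within these ceilings is determined by the competent authority in the individual case:

```python
# Fine ceilings under the DSA (illustrative sketch only; the actual
# fine within these caps is set by the Digital Services Coordinator).
GENERAL_VIOLATION_CAP = 0.06      # up to 6% of global annual turnover
INFORMATION_VIOLATION_CAP = 0.01  # up to 1% of annual turnover (e.g. incorrect information)

def max_fine(annual_turnover_eur: float, information_violation: bool = False) -> float:
    """Return the maximum possible DSA fine for a given annual turnover."""
    cap = INFORMATION_VIOLATION_CAP if information_violation else GENERAL_VIOLATION_CAP
    return annual_turnover_eur * cap

# Hypothetical provider with EUR 500 million global annual turnover
print(max_fine(500_000_000))                              # 30000000.0 (EUR 30m ceiling)
print(max_fine(500_000_000, information_violation=True))  # 5000000.0 (EUR 5m ceiling)
```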

In addition, the DSA allows for private enforcement. For this, users of online services and certain private bodies, organizations and institutions have the right to lodge a complaint with the Digital Services Coordinator in case of a violation of the DSA. In this way, they may initiate official proceedings against an online intermediary service. Users can also claim compensation from providers of such services before national courts for damages and losses resulting from breaches of the obligations under the DSA.

Further, with regard to very large online platforms and very large online search engines, the regulations of the DSA are (also) enforced by the EU Commission. As per the DSA, the EU Commission charges an annual supervisory fee for its supervisory services, which may be up to 0.05% of a provider’s worldwide net income. By delegated regulation, the EU Commission has determined the methodology for calculating the supervisory fee for each provider in proportion to its average monthly number of ‘active recipients’; the fee additionally increases once a certain number of ‘active recipients’ is reached.

What are the next steps?

Providers of online intermediary services and, in particular, online platforms should promptly examine the services they offer and, if necessary, update their systems, terms, policies and processes in order to comply with the DSA.

For this, we recommend taking the following steps:

  • Classifying the services offered in the categories of the DSA according to type and number of users
  • Reviewing whether the applicable requirements are already met or to what extent any systems, terms, policies and processes must be updated

It should be noted that the provisions of the DSA and the requirements stated therein are very comprehensive and that there are as yet no precedents or administrative practice as to the application and interpretation of the DSA.

Please feel free to contact us if you have any questions on the DSA and/or need assistance on how to comply with its regulations. 

Tags

germany, advertising, media, publishing, digital services act, dsa