
Radical new rules for media platforms as Coimisiún na Meán publishes Ireland's first Online Safety Code

Introduction

On 21 October 2024, Coimisiún na Meán (CNaM) published Ireland's first Online Safety Code (Code).

The Code introduces a range of onerous safety measures, delivered through both new terms of service and technical measures, which platforms will need to deploy to protect users, particularly children, from harmful content. It marks the culmination of many years of political, legislative and regulatory discussion in the field of online safety. But make no mistake: the Code, and statements from CNaM, indicate an intent to radically alter the current softly-softly approach to how media platforms allow users to consume content.

Who does the Code apply to? 

The Code applies to video-sharing platform service providers (VSPs) that CNaM has previously designated and entered on its register of VSPs. The Code does not apply to all online services, though its scope may be expanded in future.

CNaM is mandated with keeping this register updated with details of any VSP under Ireland's jurisdiction. Jurisdiction is determined in accordance with the Audiovisual Media Services Directive 2018 and centres on where the VSP is established. Where a VSP is part of a group, or has a parent or subsidiary undertaking, different conditions apply. As we previously anticipated, this has given rise to jurisdictional issues, with Reddit challenging its designation as a VSP by CNaM in the High Court on jurisdictional grounds. Similar issues are likely to emerge in future, given the complexities involved and the multi-state nature of VSPs' operations. With VSPs under Ireland's jurisdiction providing services to users in Ireland and across the EU, CNaM must co-operate with other Digital Services Coordinators under the Digital Services Act (DSA) framework to ensure there is no gap in enforcement.

What does the Code require? 

The Code is split into two parts – snappily named A and B. Part A of the Code creates "general obligations" that VSPs must comply with by 19 November 2024, and Part B sets out "specific obligations" that VSPs must adhere to by 21 July 2025.

General Obligations

VSPs can achieve compliance with the general obligations set out in Part A of the Code without taking any specific measures, where they have the required protections in place on their platform and document that compliance. VSPs are required to adopt on their service:

  • Certain terms and conditions to protect users from harmful content, including content relating to bullying, eating disorders, self-harm, suicide, incitement to hatred, terrorist content and commercial communications.
  • A mechanism enabling users to declare commercial communications.
  • A transparent and user-friendly reporting system for harmful content.
  • An age verification system so young users cannot access content which could impair their physical, mental or moral development. This is an important aspect of the Code. The Code does not prescribe the method of age verification to be used. It remains to be seen what form VSPs adopt, but possible techniques could include mandating passport photos or utilising AI software to analyse facial features.
  • Parental controls to give parents and guardians access to tools to filter content which may be harmful to the physical, mental or moral development of children.
  • Rating systems to allow users to rate content which may be harmful, in a user-friendly way.
  • Complaints procedures to handle and resolve users’ complaints in relation to lack of compliance with any of the measures required by the Code.
  • Media literacy tools to ensure users have access to effective media literacy resources.
Specific Measures

Part B provides a series of definitions and elaborates on the general obligations provided for in Part A. Compliance with Part B is conditional on VSPs taking the specified measures detailed in the Code. 

The specific measures focus on controls over content related to bullying and humiliation, eating disorders, self-harm and suicide, adult-only video content, incitement to hatred, criminal content and commercial communications, as well as parental controls and users' ability to report content.

Notably, the Code does not tackle recommender systems (i.e. algorithms used to recommend content to users based on personal data such as their search history, age and location). CNaM believes that it is more appropriate for recommender systems to be regulated by the DSA rather than the Code. 

Role of CNaM

CNaM is responsible for supervising VSPs to ensure that they comply with their obligations. It has established a Contact Centre to provide advice and guidance on how users can exercise their rights. The Contact Centre is not intended to be a means to resolve user complaints regarding specific instances of harmful content, and CNaM has emphasised that users should use the complaint mechanisms set up by the VSPs as the primary method of reporting harmful content.

CNaM also has the authority to issue statutory guidance to accompany the Code in accordance with and following the procedures set out in Section 139Z of the Online Safety and Media Regulation Act 2022.

CNaM has the power to take action for non-compliance through investigation and enforcement procedures, where necessary. Non-compliance with the Code can lead to fines of up to EUR 20 million or 10% of a VSP's annual turnover, whichever is greater.

What else should media platforms be watching out for?

CNaM is edging into the final months of its first year at the forefront of Ireland's efforts to develop and boost the regulation and enforcement of online safety. In addition to the Code, a number of other developments reflect the changing landscape for all media players in Ireland:

  • Branching into the political arena: In advance of Ireland's general election on 29 November, CNaM announced that it will replace the broadcast moratorium with an Additional Care Requirement for broadcasters during the critical election period. This will oblige broadcasters to treat with extreme care any information that they believe has been circulated with the intention of misleading voters. In addition, broadcasters will be asked not to report on exit or opinion polls during polling hours. Guidelines are due to be published in the coming weeks.


  • Out-of-court settlement bodies: CNaM recently published details of the first Out-of-Court Dispute Settlement body (ODS) in Ireland. CNaM has certified the Appeals Centre Europe (ACE) as an ODS for a five-year period running from 26 September 2024. ACE aims to resolve disputes regarding content moderation between users and social media platforms in an affordable and transparent way.


  • Digital Fairness Act: During the confirmation of his appointment as Commissioner-designate for Justice, Michael McGrath said he would bring forward a Digital Fairness Act to tackle issues affecting vulnerable users of online services, including dark patterns, addictive design and personalised targeting. This responds to the Digital Fairness Fitness Check carried out by the European Commission, which identified these practices as potentially harmful to users. With the European Commission having published its Working Document on 3 October 2024, the development of this Act and its interaction with the DSA will become clearer over the coming months.


  • The Terrorist Content Regulation: In August, CNaM published a Decision Framework for the Terrorist Content Regulation (the Framework). At the time of writing, no final removal orders have been issued that would trigger CNaM's competence to make a preliminary decision under the Framework.
