On 26 May, Ofcom issued proposed draft guidance on how video-sharing platforms (VSPs) should operate in order to meet their regulatory requirements. Amongst other items, the guidance covers measures providers may adopt to protect their users from potentially harmful material. The proposal also considers how ‘harmful material’ should be defined and what types of media will fall within this category.
This article seeks to briefly summarise the consultation and proposed guidance. In doing so, it clarifies the context surrounding the proposal, what types of services fall within the category of VSPs, what proposals Ofcom has made for these platforms, and the next steps the regulator seeks to take in creating a safer environment to share and distribute content.
Why is Ofcom proposing this guidance?
In late 2020, changes to the Communications Act 2003 (the CA 2003) came into effect, introducing a number of regulatory requirements for UK-based VSPs and tasking Ofcom with their enforcement.
The main purpose of these amendments was to “protect users […] from harmful content” when engaging with VSPs. To that end, the new regime set out a number of measures that must be considered when providing video-sharing services to the general public. In particular, providers must put in place suitable measures to protect the general public from material that is likely to incite violence or hatred, and must ensure that measures are in place to protect those under the age of 18 from content that may “impair their physical, mental, or moral development”. This included a number of specific requirements to protect users from potential harms arising from advertisements on their platforms.
However, the legislation gave little direction on how this was to be achieved, leaving providers in something of a grey area. This presented Ofcom with an opportunity to clarify the current regulatory position and explain how it intends to assist VSPs in achieving compliance.
What are VSPs?
The concept of VSPs largely originates from the provisions of the European ‘Audiovisual Media Services Directive’ (the Directive), which attempted to more concretely define the rapidly growing market of services and platforms offering the ability to host and share media content online. These provisions were transposed into UK law as Part 4B of the CA 2003, thereby bridging a gap in regulation caused by the Government’s ‘Online Harms regime’ not yet being finalised.
VSPs are a type of online service that allows users to upload, share, and play back user-generated videos with other members of the platform. Well-known examples include YouTube and Vimeo. Users typically access these services through a mobile app or website, and videos generally remain accessible until they are deleted by the uploader or the VSP ceases to host the media. This ease of access and use means that VSPs are readily used as platforms for entertainment, learning, and business resources.
A notable defining aspect of VSPs under the Directive’s definition is that, unlike curated platforms, they do not hold editorial responsibility or any obligation to educate, inform, or explain to users the content they host. It is therefore the responsibility of users themselves to consider appropriately the content and information they view. Without suitable moderation by the platform provider or regulator, this understandably presents a number of issues when it comes to protecting users from harmful online content as the Government intends.
What is Ofcom proposing?
The proposal from Ofcom seeks to reflect the distinction in the CA 2003 between advertisements that are controlled by the VSP and those that are not. In other words, it seeks to distinguish between circumstances where a VSP has “specifically marketed, sold, or arranged” advertisements present on their platform and content, and when this is not the case.
In instances where the VSP does have control of these advertisements, VSPs are responsible for ensuring compliance with all relevant regulatory requirements. In such circumstances, Ofcom has proposed that regulation of these matters is to be done in tandem with the Advertising Standards Authority (the ASA).
Where the VSP does not control advertisements, it must nevertheless take the necessary steps to ensure that advertisements appearing alongside its hosted content meet all relevant requirements. In these circumstances, Ofcom has proposed that it will itself assess and determine whether the steps taken by the VSP have been sufficient to protect its users.
Ofcom has therefore requested consultation on five primary areas:
- The proposed guidance on how Ofcom will determine if a VSP has control of advertisements on their platform;
- The proposed guidance on how Ofcom seeks to regulate circumstances where VSPs have been found to control advertisements on their platform;
- The decision to involve the ASA as a co-regulator in instances where VSPs have been found to control advertisements on their platform;
- The proposed measures for VSP providers to take in order to appropriately monitor and regulate advertisements not controlled by the VSP; and
- The proposed guidance on how Ofcom seeks to regulate non-VSP controlled advertisements.
Where do we go from here?
In May of this year, the UK Government published the ‘Draft Online Safety Bill’ (the Online Safety Bill), with the aim of establishing a new regulatory framework capable of tackling harmful content online and achieving its goal of making the UK the safest place in the world to be online. When it enters into force, the Bill will supersede many of the provisions of the CA 2003, and those provisions dedicated to advertising requirements will be repealed.
In publishing the Bill, the UK Government also stated that regulation of advertisements on VSPs would remain the responsibility of the ASA and Ofcom. Ofcom has already factored these considerations into its proposals, framing them to complement any future regulation of the subject. In doing so, it seeks to foster a collaborative approach that will allow regulators and providers to work together to ensure that advertising standards are upheld in the best interests of the public and of VSP users.
The consultation closes on 28 July 2021, with a summary of Ofcom’s findings due shortly after.
Does it go far enough?
The early stage of the consultation grants Ofcom an invaluable opportunity to take stock of the regulatory landscape, and the proposals are an encouraging development towards a safer online environment for users in the UK. In particular, they do well to distinguish responsibility for advertisements according to whether control and curation of such media lies with the VSP.
While it is acknowledged that this guidance will be superseded once the Online Safety Bill comes into force, the extent to which this will be the case remains unclear. One area lacking clarity is which platforms will fall within the definition of VSPs in future, and whether Ofcom’s cooperation with other digital regulators may result in foreign-based VSPs becoming subject to regulation by virtue of allowing access to UK users.
The current proposals therefore represent a promising next step in online regulation in the UK but continue to present further opportunity for development as the draft framework becomes legislation.