The draft Online Safety Bill (the Bill), the Government’s flagship proposal for legislation regulating online content, returns to Parliament having survived four prime ministers and seven departmental secretaries on its long journey to becoming law. The proposals were originally unveiled in the Online Harms White Paper in April 2019, under Theresa May, with the aim of making the UK “the safest place in the world to be online”.
Since 2019, the scope of the Bill, which was originally focussed on tackling online abuse and harassment, has snowballed, packing in measures to tackle a wide range of other illegal and harmful content which adults and children may encounter online: for example, restrictions on fraudulent content, age verification requirements for pornography and provisions relating to “deep fakes”. It now also seeks to address child protection concerns in relation to content relating to self-harm and suicide.
After a summer of delays, the Government has sought to address criticism both as to the overall scope of the legislation and perceived threats to freedom of speech. Paul Scully, the current Minister for Tech and Digital Economy, published a statement on 30 November 2022 saying that: “The approach we are taking has three main aims. We are strengthening the protections for children in the Bill, ensuring that adults’ right to legal free speech is protected, and also creating a genuine system of transparency, accountability and control to give the British public more choice and power over their own accounts and experience.” As a result, in December 2022 the Government introduced amendments to the Bill in an attempt to smooth its remaining journey through Parliament.
Key amendments and additions
- Safety Duties Protecting Adults
The amended Bill replaces the “legal but harmful” duties with a “triple shield” for adult users. The triple shield requires platforms to remove all illegal content, remove content that is banned by their own terms and conditions, and empower adult users with tools for tailoring the type of content they see and hiding potentially harmful content if they do not want to see it on their feeds. This third element is effectively an opt-out for adults and is likely to be implemented by platforms using the now familiar cookie-consent-style toggles already found on websites and apps. [Note that the triple shield applies to adults only; children will continue to be afforded greater protection and will automatically be opted out of such content.]
The “self-empowerment” tools will be required to include features which:
- where elected by the user, reduce the likelihood of the user encountering all, or particular types, of content that is harmful to adults;
- alert users to the harmful nature of priority content harmful to adults which the user may encounter when using the service; and
- allow users to filter out non-verified users.
The platforms will now also be required to include clear and accessible provisions in their terms of service specifying which features are offered in compliance with these obligations.
- Additional criminal offences
Some content that children and adults may encounter online is already illegal in the UK. However, the Bill will force social media platforms to actively remove all types of illegal content identified in the Bill (those offences defined in the Bill as a “Priority Offence”) and to prevent such content from being uploaded on an ongoing basis, so that children and adults do not encounter it.
Under previous iterations of the Bill there was a long list of illegal content that must be removed from the platforms. This covered child sexual abuse, controlling or coercive behaviour, extreme sexual violence, fraud, hate crime, inciting violence, illegal immigration and people smuggling, promoting or facilitating suicide, promoting self-harm, revenge porn, selling illegal drugs or weapons, sexual exploitation and terrorism.
These criminal offences remain in the amendments to the Bill with additional criminal offences added to the closed list, including:
- “epilepsy trolling”, an offence relating to subjecting users to flashing imagery; and
- disclosing or threatening to disclose, private sexual (or in Scotland “intimate”) photographs and films with intent to cause distress.
- Duties to protect news publisher content
In addition to the obligations for platforms to protect journalistic content, the amended Bill includes duties to protect “news publisher content”. These include (subject to exceptions) requirements that the platform contact news publishers before taking any action in relation to their content (i.e. taking down content, restricting access or labelling the content with warning labels), with additional obligations where the platform has failed to provide such notification. For the purpose of the Bill, “news publisher content” covers any content generated on the service:
- by a “recognised news publisher” defined as the BBC, Sianel Pedwar Cymru, holders of a UK broadcast licence or any other entity which publishes, in the course of business, news material created by multiple individuals that is subject to editorial control and meets certain other requirements listed in the Bill; or
- which is uploaded by a recognised news publisher or which is shared or linked on a service and reproduced in full (not a screenshot, photograph or extract).
- Keeping children and young people safe online
The Bill already sought to protect children by:
- making social media platforms remove illegal content quickly or prevent it from appearing in the first place;
- preventing children from accessing harmful and age-inappropriate content (such as pornographic content, online abuse and bullying, and other content which is not criminal but promotes or glorifies suicide, self-harm and eating disorders); and
- enforcing age limits and age-checking measures on social media platforms.
However, additional transparency measures now impose requirements for the largest platforms to also publish summaries of the outcome of their risk assessments for content that is harmful to children.
What can we expect next?
The Public Bill Committee met on Tuesday 13 and Thursday 15 December 2022 to scrutinise the updates to the Bill on a line-by-line basis. According to the Minister’s statement above, we can expect the Bill to pass to the House of Lords for consideration in January 2023. The clock is ticking: if the Bill has not received Royal Assent by April 2023, under Parliamentary rules the legislation will be dropped entirely and the process would need to start again in a new Parliament.
Like to hear more on this topic?
We held a session on ‘Online harms and media regulation’ as part of our Media, Sport and Entertainment summit held virtually in September and October 2022:
Speakers: Duncan Calow, DLA Piper | David Cook, DLA Piper | Anika Kruse, DLA Piper | Darach Connolly, DLA Piper | and a keynote contribution by Lord Tim Clement-Jones CBE, Member of the Joint Select Committee on the Draft Online Safety Bill (2021-22)
This session, along with every other session in the programme, which covered the full spectrum of the MSE industries and provided insights on current issues and opportunities, is now available to watch online – click on the link below to access the recording.