Introduction
2024 has been tipped as not just an election year but “the election year”, with at least 64 countries plus the EU electing representatives[1].
Ahead of this collective global rush to the polls, Very Large Online Platforms and Very Large Online Search Engines (VLOPs and VLOSEs) designated under the Digital Services Act (DSA) will be eagerly reviewing the European Commission’s recently published guidelines on election-related online systemic risks (the Guidelines).
Stemming the flow of election misinformation online is a significant priority of the DSA. Negative effects on civic discourse and electoral processes are listed as a potential systemic risk requiring in-depth assessment by VLOPs and VLOSEs.
Obligations to balance the mitigation of electoral process risks against rights such as freedom of expression, freedom of association, and freedom and pluralism of the media come into sharp focus when approximately half the world’s population is being asked to cast ballots this year.
The designation threshold of at least 45 million EU-based users means that VLOPs and VLOSEs have direct access to vast amounts of detailed personal data on a significant proportion of the voting public. Given that many voters increasingly use such platforms as news and information sources and as venues of public debate, overt and subliminal campaign messaging displayed on these sites has enormous potential to sway and manipulate public opinion.
Risk mitigation measures
Pursuant to Article 34 of the DSA, providers must assess systemic risks arising from the use of their services, including actual or foreseeable negative effects on civic discourse and electoral processes. Article 35 requires that reasonable, proportionate and effective mitigation measures be put in place to address these risks.
Article 35(3) goes on to state that the Commission, in cooperation with the Digital Services Coordinators, may issue guidelines on best practices and recommended measures for meeting these systemic risk mitigation obligations.
Following a public consultation on the topic, which ran from February to March 2024, the Guidelines have now been published. The mitigation measures recommended by the Guidelines supplement the non-exhaustive list set out in Article 35(1), and include:
- Reinforcing internal processes
  - setting up dedicated internal teams with adequate resources and expertise in activities such as content moderation, fact-checking and foreign information manipulation and interference (FIMI)
  - using available analysis and information on local context-specific risks
  - taking into account elements such as the presence and activity of political actors on the service, relevant discussions on (and usage of) the platform in the context of elections, or use of the service to assist the organisation of political events
- Implementing election-specific risk mitigation measures tailored to each individual electoral period and local context
  - promoting official information on electoral processes (by way of links, banners or pop-ups)
  - implementing media literacy initiatives
  - applying inoculation measures to pre-emptively build resilience against expected manipulation techniques
  - adapting measures to provide more contextual information, such as trust marks, prompts and nudges encouraging users to consider the accuracy of content
  - adapting recommender systems to limit the amplification of deceptive material
  - reducing the monetisation and virality of content that threatens the integrity of electoral processes
  - clearly labelling political advertising in line with the upcoming Regulation (EU) 2024/900 on the transparency and targeting of political advertising (the Regulation on Political Advertising)
  - requiring influencers to declare political advertising
  - drawing on existing industry codes such as the Code of Conduct on Countering Illegal Hate Speech Online
- Mitigation measures specifically linked to generative AI
  - clearly labelling content generated by AI (such as deepfakes); a minimal illustrative sketch follows this list
  - ensuring data sources are reliable and accurate
  - conducting red-teaming exercises prior to the release of generative AI systems
  - adapting terms and conditions accordingly and enforcing them adequately
- Cooperating and exchanging information with EU-level and national authorities, academia, independent media providers and experts, NGOs, and civil society organisations
- Implementing incident response mechanisms during an electoral period (and involving senior leadership)
- Assessing the effectiveness of the measures through post-election reviews, including by publishing non-confidential versions of the review documents and providing opportunities for public feedback on the risk mitigation measures put in place.
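On the generative AI labelling point above, the following is a minimal, purely illustrative sketch of what a machine-readable “AI-generated” label on a piece of content might look like. All names here (ContentItem, label_ai_generated, the example model name) are hypothetical; the Guidelines do not prescribe an implementation, and real deployments would typically build on provenance standards such as C2PA-style content credentials or watermarking.

```python
from dataclasses import dataclass, field

@dataclass
class ContentItem:
    content_id: str
    media_url: str
    labels: dict = field(default_factory=dict)

def label_ai_generated(item: ContentItem, generator: str) -> ContentItem:
    """Attach a machine-readable disclosure that the item is AI-generated."""
    item.labels["ai_generated"] = True        # machine-readable flag
    item.labels["generator"] = generator      # which system produced the content
    item.labels["disclosure_text"] = (        # user-facing disclosure string
        "This content was generated or altered by AI."
    )
    return item

# Hypothetical usage: labelling a synthetic video before it is served.
deepfake = label_ai_generated(
    ContentItem(content_id="vid-123", media_url="https://example.com/vid-123.mp4"),
    generator="example-video-model",
)
print(deepfake.labels)
```

The design point is that the label is both machine-readable (so recommender systems, advertising systems and researchers can act on it) and carries a user-facing disclosure.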
Legal effect
The Guidelines, which are framed as best-practice recommendations, are not legally binding in the manner of a regulation or directive. They are, however, more than merely persuasive: a provider that does not follow them must prove to the Commission that the measures it has taken instead are equally effective in mitigating the risks. Additionally, many of the recommendations will remain relevant after elections have taken place.
Overlap with other adjacent regulations (such as the Regulation on Political Advertising, the AI Act and commitments made by relevant online platforms under voluntary, industry-led schemes such as the AI Pact and the Code of Practice on Disinformation) is acknowledged. The Commission explicitly notes that the Guidelines already account for the forthcoming obligations that the Regulation on Political Advertising and the AI Act will impose on providers of VLOPs and VLOSEs. The Guidelines prepare VLOPs and VLOSEs for the entry into force of those instruments, for example by including labelling requirements for transparency and by recommending that providers ensure their tools and application programming interfaces (APIs) enable research on their political advertising repositories, so that it can be analysed whether the repositories are fit for purpose and allow meaningful research into disinformation, FIMI campaigns and hateful, (violent) extremist or radicalising content disseminated to influence individuals in their electoral choices.
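By way of illustration only, the kind of researcher-facing access contemplated here might look like the sketch below. The endpoint, parameters and field names are invented for this example; each provider exposes its own ad-repository tooling. The fields mirror the categories that Article 39 of the DSA requires advertisement repositories to contain (the content of the ad, its sponsor, the period it ran, targeting parameters and reach).

```python
import requests

# Hypothetical endpoint: real VLOPs/VLOSEs each expose their own ad-library API.
BASE_URL = "https://ads.example-platform.eu/api/v1"

def fetch_political_ads(country: str, date_from: str, date_to: str) -> list[dict]:
    """Return political ads shown in a country within a date range."""
    response = requests.get(
        f"{BASE_URL}/political-ads",
        params={"country": country, "from": date_from, "to": date_to},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["ads"]

# A researcher might aggregate sponsors, spend and targeting criteria
# to study how campaign messaging was distributed during an election.
for ad in fetch_political_ads("IE", "2024-05-01", "2024-06-09"):
    print(ad["sponsor"], ad["run_period"], ad["targeting_parameters"], ad["reach"])
```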
Providers will be seeking ways to ensure that their compliance approaches under the DSA and these Guidelines are consistent with all other applicable rules. The message is that the Guidelines should not come as a surprise to VLOPs and VLOSEs and should be viewed as complementary to, and supportive of, other instruments and initiatives, empowering companies to achieve compliance.
However, complying with an ever-growing patchwork of interdependent regulations and guidelines will be challenging for VLOPs and VLOSEs, and indeed for other entities indirectly brought within scope: the Guidelines state that they will be “a source of inspiration for providers of online platforms or search engines that have not been designated as VLOPs or VLOSEs and whose services give rise to similar risks.”
Timelines and next steps
From a timeline perspective, the Commission recommends that risk mitigation measures be in place and functioning at least one to six months before an electoral period and continue for at least one month after the elections. The Commission has also made itself available, on a voluntary basis (and without prejudice to its investigatory and enforcement powers under the DSA), to periodically review the mitigation measures providers adopt, either before (ex ante) or after (ex post) specific elections.
As stated in the Guidelines, a ‘crucial test case’ is the European Parliament elections in early June this year. Entities within scope should therefore already be dedicating resources to implementing the Guidelines: political actors will doubtless already have launched their battles to win hearts and minds, click by click.