18/09/2023
Briefing

Applying the framework set out in Art. 34 of the Digital Services Act (the “DSA”), the European Commission (the “EC”) has completed an assessment of the risks arising from Russian disinformation campaigns across a number of social media platforms (the “Disinformation Report” or “Report”).

Given the timing of this publication on the eve of DSA enforcement, it is reasonable to conclude that the EC intended to convey a message to Big Tech regarding the level of detail it expects to see in the first annual systemic risk reports from providers of very large online platforms (“VLOPs”) and very large online search engines (“VLOSEs”).

Five key takeaways from the Disinformation Report

1. Early signals from regulators

Providers will not have to wait for formal regulatory guidance on their compliance obligations.

Art. 35(2) DSA requires the European Board for Digital Services to publish an annual report into the “most prominent and recurrent systemic risks” and best practices for mitigating these risks. Similarly, Art. 35(3) permits the EC, in cooperation with national Digital Services Coordinators, to publish guidance on possible risk mitigation measures.

However, as we see from the Disinformation Report, the EC clearly intends to also provide guidance via ad hoc means.

• While this guidance came too late for the first annual systemic risk reports, providers should study the EC’s publication as they continue to develop their risk management policies and procedures.
• Providers that have decided to prepare separate risk assessment and risk mitigation reports should consider whether and how they can incorporate these early signals into their first annual mitigation reports under Art. 35(1).

2. Early focus on elections

The EC’s immediate focus appears to be on risks to civic discourse and electoral processes. The Disinformation Report finds that, since the start of the Ukraine war, over one third of EU residents have been exposed to Russian disinformation online. The combination of “coordinated inauthentic behaviour” and the propagation of “deceptive, dehumanising and violent content” risks undermining the democratic process and impeding the exercise of fundamental rights.

• With European Parliamentary elections due in 2024 and a slew of national and regional elections over the next two months, VLOP and VLOSE providers should focus their immediate mitigation efforts on controls for tackling disinformation and protecting election integrity.

3. Recognition of regional and linguistic factors

The Disinformation Report found that online platform providers enforced their Terms and Conditions inconsistently, in particular in respect of content in Central and Eastern European languages. The EC noted that providers “rarely reviewed and removed more than 50 percent of the clearly violative content [the EC] flagged in … [these] languages”.

Considering the relative prominence of English and other widely spoken European languages in the online environment, it is perhaps unsurprising that existing risk detection and mitigation measures are more effective for those languages. However, Art. 34(2) DSA specifically requires providers to “take into account specific regional or linguistic aspects” when carrying out their systemic risk assessments.

• Given the ongoing war in Ukraine, whose effects are disproportionately felt in Central and Eastern Europe, providers should review their existing suite of risk mitigations with an open mind, and consider whether these need to be made more effective for certain languages. While mitigations are required to be reasonable and proportionate, further investment may be needed.

4. Baseline Framework for Risk Assessment

The Report establishes a “Baseline Framework” which researchers are invited to use to assess online risks. This Baseline Framework leverages “qualitative and quantitative evidence” to evaluate the severity of a particular risk factor, to determine its scale and intensity, and to identify whether a VLOP or VLOSE provider has designated a mitigation measure for a specific risk. The Baseline Framework further proposes the use of quantitative techniques for assessing the effectiveness of specific controls.
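By way of illustration only, the kind of quantitative technique the Baseline Framework contemplates might be sketched as follows. The Report does not prescribe any particular formula, and every metric name, weighting and figure below (other than the one-third exposure finding noted above) is a hypothetical assumption:

    # Illustrative sketch only, not the EC's methodology: all names,
    # weightings and thresholds here are hypothetical assumptions.
    from dataclasses import dataclass

    @dataclass
    class RiskObservation:
        scale: float      # e.g. share of EU users exposed (0 to 1)
        intensity: float  # e.g. engagement rate with violative content (0 to 1)

    def severity(obs: RiskObservation) -> float:
        """Combine scale and intensity into a single severity score (0 to 1)."""
        return obs.scale * obs.intensity

    def control_effectiveness(before: float, after: float) -> float:
        """Relative reduction in the prevalence of violative content
        once a mitigation measure is in place."""
        if before <= 0:
            return 0.0
        return (before - after) / before

    # Over one third of EU residents exposed (per the Report); intensity assumed.
    risk = RiskObservation(scale=0.34, intensity=0.6)
    print(f"severity score: {severity(risk):.2f}")  # 0.20

    # A control that cuts violative-content prevalence from 2.0% to 0.9%.
    print(f"effectiveness: {control_effectiveness(0.020, 0.009):.0%}")  # 55%

Any such metric would, of course, need to be grounded in the provider’s own data and weighed alongside the qualitative evidence that the Report also calls for.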

• Providers should reassess their first annual risk assessments in light of the Disinformation Report, and consider whether they have struck an appropriate balance between descriptive and data-driven analyses.
• Under Art. 40 DSA, vetted researchers are likely to play a prominent role in investigating systemic risks in the years ahead. Providers should consider the Baseline Framework proposed in the Disinformation Report and plan their Art. 40 compliance measures accordingly.

5. Use of monitoring actions

The Disinformation Report clearly illustrates the complexity of systemic risk management. The Report finds that the Russian disinformation campaign carries “risks to public safety, fundamental rights and civic discourse, as well as an increased risk of illegal content disseminating on digital services”.

The overlapping nature of these risks presents a challenge in terms of deploying “reasonable, effective and proportionate mitigation measures” as required by Art. 35(1) DSA. This challenge is heightened by the natural tension between such mitigation measures and freedom of expression, since measures that remove or demote content can negatively impact the exercise of free speech rights. While the Report refers to “clearly violative content”, the balance between free speech and other rights will not always be clear, and honest differences of opinion will emerge on a regular basis.

It is therefore likely that the EC will use the tools available to it, including Art. 72 DSA monitoring actions, to extensively investigate alleged breaches of the DSA, before seeking to impose fines.

• Art. 72(2) is of particular interest, as it allows the EC to appoint external experts and auditors “to assist… in monitoring the effective implementation and compliance with [the DSA]”.
• Providers should consider, at the outset of formal engagement with the EC, whether they can offer voluntary commitments that would satisfy the regulator, so as to reduce the potential disruption that may arise on foot of the Commission appointing its own independent auditors.

What should clients do next?

In the short term, providers of VLOP and VLOSE services should continue to diligently implement their DSA compliance programme, including in relation to any non-VLOP/VLOSE services they offer. They should continue with their ongoing work in relation to the upcoming independent audit, and should carefully document the operation of their governance arrangements, including with respect to their Art. 41 compliance functions.

The transmission of risk assessments to the EC, in accordance with Art. 42(4)(a), will lead to regulatory correspondence, and providers should engage constructively with the EC. Given the lack of meaningful guidance to date, it is to be hoped that such regulatory engagement will be cooperative in nature and focused on establishing expectations for the years ahead.

In the longer term, providers should review early signals from the EC and begin to consider what changes will be required to their approach for 2024. They should consider how they will assess the operational effectiveness of the mitigations which they propose to deploy in accordance with Art. 35(1) DSA, and feed those results into their second annual systemic risk assessment. Finally, they should remain alert for further formal and informal guidance, which will inform their approach.

If you require further information on the Digital Services Act, please contact any member of our Technology and Innovation team or your usual Arthur Cox contact.