It has been a year since the European Commission (“EC”) designated the first tranche of very large online platforms and very large online search engines (together, “VLOPSEs”). While these companies continue to navigate the first turn of the risk management lifecycle, it is worth reflecting on some of the lessons learned so far and looking ahead to what will be a very busy summer/autumn period for VLOPSE providers.

Enforcement without guidance

The EC was quick to exercise its new investigatory powers, launching the first Art. 72 monitoring actions in September 2023. By the end of 2023, it had opened formal proceedings against X, which would soon be joined by TikTok in February and April 2024. 

According to the EC’s press releases, the first two proceedings were commenced on foot of the Art. 34 systemic risk assessment reports that were submitted last September and subsequent requests for information. The third concerns the requirement to conduct an assessment prior to the deployment of functionality that is likely to have a critical impact on systemic risks. As was the case with the GDPR, these sorts of proceedings will be important in defining the contours of the DSA. However, the providers in question find themselves in the unenviable position of being the first VLOPSEs to face enforcement action in circumstances where the extent and practicalities of the relevant DSA obligations have not been supplemented by formal regulatory guidance.

A broad approach to enforcement

It is also worth reflecting on the breadth of these proceedings. While section 110 inquiries under the Irish Data Protection Acts (which implemented the GDPR enforcement regime) tend to focus on discrete areas of alleged infringement (such as insufficient transparency), the DSA proceedings that have been launched to date are quite broad. 

1. The first set of proceedings focuses on the obligation to assess and counter the dissemination of illegal content online, and to take measures to combat actual or foreseeable negative effects on civic discourse and electoral processes. The proceedings also deal with access to data by vetted researchers, alleged shortcomings in transparency in relation to content moderation and online advertising, and the alleged deceptive design of user interfaces. 

2. The second set of proceedings deals with the possible negative effects of recommender systems, including in relation to radicalisation and possible harm to physical and mental health. It also deals with the general obligation to provide a high level of privacy, safety and security to minors, in addition to concerns around advertising transparency and access to data by vetted researchers. 

3. The third set of proceedings is narrower, focusing on whether TikTok adequately assessed the likelihood and extent to which a rewards programme may have a critical impact on the systemic risks identified in Art. 34 of the DSA, prior to deploying that programme in Spain and France. Of particular note, in its pre-inquiry engagements, the EC is reported to have given TikTok just 24 hours to provide a copy of its critical impact risk assessment.

Each of these proceedings underscores the importance of contemporaneous and sufficiently detailed record-keeping, both in terms of dealing with broadly-scoped requests for information from the EC and in defending formal proceedings. Indeed, comments from Commissioner Breton that the EC was ready to trigger interim measures “unless TikTok provides compelling proof of [the programme’s] safety” provide a timely reminder of the precautionary principle that underpins EU regulation and should be kept in mind when conducting a critical impact risk assessment prior to launching a product or feature in the EU.

Common themes in enforcement

Based on these proceedings, it seems the EC is prioritising the enforcement of the DSA’s foundational risk management provisions, in focusing on the assessment of systemic risks and the sufficiency of measures to combat online harms. It is also ensuring that vetted researchers will have access to the data they need to contribute to the detection, identification and understanding of systemic risk across the EU. The participation of vetted researchers will effectively lead to a quasi-democratisation of the investigative process and may, for example, provide insights into whether and how regional and linguistic factors affect the online risk landscape. 

The outcomes of these proceedings should provide welcome clarity on key provisions, including Arts. 28, 34, 35 and 40(12). However, as most VLOPSE providers are already in the course of preparing their second annual systemic risk assessments, it is likely that they will have to wait until at least the 2025 cycle to benefit from any formal guidance that emerges from these proceedings. 

Audit Implementation Reports and product road-mapping

Of course, there will be no shortage of guidance from the first ever independent audit reports, which are due later this summer. These reports will contain findings on the extent of a provider’s compliance with their obligations under Arts. 11 to 48 of the DSA (as applicable).

Any ‘positive with comments’ or ‘negative’ findings in the independent audit reports will be accompanied by operational recommendations and recommended timeframes to achieve compliance with a particular provision. Any such findings will then trigger the requirement on the provider to prepare an audit implementation report, setting out the measures they propose to take in response.

Some of the operational recommendations may be documentary or procedural in nature, while others may require technical investment. For this latter category, providers would be well advised to ensure that their internal engineering/road-mapping processes can support the preparation and rollout of these audit implementation reports. Similarly, Art. 41 compliance functions should consider how to incorporate these implementation reports into their compliance monitoring activities.

Supporting the management body

From a governance perspective, the management body of a VLOPSE provider must approve and review, at least annually, its strategies and policies for systemic risk management and, under Arts. 41(6) and (7), must devote sufficient time to the consideration of such matters.

Given this, the DSA is likely to feature prominently on the agendas for management body meetings in the coming months, as a series of deliverables converge, including the first independent audit reports and the second annual systemic risk assessment reports. The receipt of the final audit reports will trigger tight deadlines for adopting corresponding audit implementation reports and for publishing the suite of statutory documents referred to in Art. 42(4) DSA. On top of that, management bodies may also be required to give further consideration to, and take further action in relation to, their obligations under Arts. 41(5), (6) and (7) DSA.

All in all, this is likely to require significant time and attention from the management body, and providers should take steps now to prepare. Relatively minor measures, such as providing the management body with additional training on what to expect from the independent audit reports, may help to support the process.

May you live in interesting times

We are still at the early stages of the DSA and it will likely take several more years of guidance and formal decisions to clarify the precise requirements of the regulation. By the end of this year, the first annual risk assessment reports, mitigation reports, audit reports and audit implementation reports will be made publicly available (in redacted form) and VLOPSE providers will have the opportunity to benchmark their efforts against their peers.

Of course, this may also expose providers to enhanced public scrutiny, at a time when they are already facing a mosaic of historic legal reform at EU-level, including the revised Network and Information Security Directive, the Digital Markets Act and the AI Act. For each of these EU initiatives, providers are likely to be subject to similar obligations under UK or other law.

Given the scale and overlapping nature of these regulations, it will be sensible in the medium term to adopt holistic governance arrangements that support compliance in a regulation-agnostic manner. However, for now, providers would be forgiven for focusing on the next thing on the horizon.

The authors would like to thank James Farrell for his contribution to this article.