The Digital Services Package (“DSP”) reached the political agreement stage of EU adoption in April 2022 and aims to create safer access to digital products and services while encouraging fair competition in digital markets. It comprises two pieces of legislation, the Digital Markets Act (“DMA”) and the Digital Services Act (“DSA”), which together overhaul the rules applying to online platforms.

Overview of the DMA

The DMA will regulate the conduct of the largest core platform service providers (so-called “gatekeepers”) and will restrict unfair conditions and practices which limit the contestability of markets. Many of its measures are aimed at limiting the use of personal data across gatekeepers’ products and services and at regulating how data will be shared with the businesses and consumers that use those services.

Gatekeepers will be prohibited from cross-using personal data collected via a core platform service to improve their other products and services (and vice versa). Gatekeepers’ use of personal data for online advertising services will be curtailed, and they will be required to provide advertisers and publishers with enhanced information on each advertisement placed by an advertiser or displayed on a publisher’s inventory. Advertisers, publishers and their authorised third parties will be given access to the gatekeeper’s performance measurement tools and to the data necessary to independently verify the performance of the core platform service.

End users of the core platform services will be given enhanced data portability rights, including real-time access to data generated through their use of the service. Gatekeepers will also be required to provide an independently audited description of their profiling techniques for online advertising, which will be transmitted to the European Data Protection Board.

In addition to these limitations on personal data usage, the DMA includes provisions to improve the interoperability of number-independent interpersonal communication services (such as iMessage and WhatsApp), and establishes timelines for rendering specific features interoperable (e.g. group chats, video calls and end-to-end encryption). It will also prohibit gatekeepers from self-preferencing their own products and services, require them to be more transparent with businesses (e.g. with respect to search engine rankings) and enable businesses to conclude contracts with end users of core platform services and to take payments directly.

The European Commission (“EC”) has the power to designate gatekeepers and will be solely responsible for overseeing compliance.

Overview of the DSA

The DSA represents a significant evolution in intermediary liability. While it retains the E-Commerce Directive’s safe harbour protections for ‘mere conduit’, caching and hosting services, and enables providers to carry out voluntary, good-faith measures to detect, identify and remove illegal content, it introduces significant new obligations for online platforms. Although member states will designate a Digital Services Coordinator to oversee compliance with the DSA, the very largest online platforms and search engines will be subject to additional obligations and to the direct oversight of the EC.

The DSA introduces a formal notice-and-takedown regime, with provisions for ‘trusted flaggers’, and establishes minimum standards for governmental takedown orders and orders for information. Online platforms will be required to be more transparent with users who are subject to enforcement action, including content removal and demonetisation, and will be required to put in place out-of-court dispute settlement measures.

Where an online platform becomes aware of information giving rise to a suspicion of a criminal offence involving a threat to the life or safety of a person or persons, it will be required to promptly notify the law enforcement or judicial authorities of the member state(s) concerned. For online platforms which are accessible to minors, providers will be required to put in place appropriate and proportionate measures to ensure “a high level of privacy, safety, and security of minors, on their service”. They will also be prohibited from presenting advertising on the basis of profiling, where the service provider is “aware with reasonable certainty” that the recipient is a minor.

Online platforms which use recommender systems to present content suggestions will be required to inform users of the relevant parameters that inform these systems. Online advertising platforms will be required to provide information on the natural or legal person on whose behalf an advert was presented, as well as on who paid for the advertisement. Furthermore, providers of online marketplaces will be required to ensure the traceability of traders on their platforms by collecting certain minimum information about each trader. Traders who fail to provide this minimum information, or to self-certify that the products and services they offer comply with applicable law, will be prevented from using those platforms.

Providers of very large online platforms and search engines will be required to conduct annual risk assessments to “identify, analyse and assess any systemic risks stemming from the design, functioning and use made of their services”. These assessments must consider the spread of illegal content, as well as risks to public security or to civic discourse and electoral processes.

The assessment must also have regard to the Charter of Fundamental Rights of the European Union, including the right to human dignity; respect for private and family life; data protection; freedom of expression; non-discrimination; the rights of the child; and consumer protection. Providers of these services must put in place reasonable, proportionate and effective mitigation measures, which must be independently audited on at least an annual basis.


Enforcement

The DSP concentrates enforcement powers in the EC, with national Digital Services Coordinators overseeing compliance with the DSA for smaller online platforms. This represents a departure from the one-stop-shop model under the GDPR, and it remains to be seen whether it will lead to uniform enforcement of obligations across all regulated entities.

The DSA carries fines of up to 6% of annual worldwide turnover for failing to comply with an obligation, and separate fines of up to 1% of annual worldwide turnover for providing incorrect, incomplete or misleading information. Under the DMA, the EC can impose fines of up to 10% of a company’s total worldwide annual turnover, rising to 20% in the event of repeated infringements.

In addition to these fines, the DSP contains several other measures to ensure compliance, including the appointment by the EC of independent external experts and auditors under Article 26 DMA and the power to make voluntary commitments binding under Article 56 DSA.


In the words of Commissioner Margrethe Vestager, the aim of the DSP “is to ensure a fully functioning and competitive single market for digital services based on European values”. The rules contained in the DSP will shape the development of digital services for years to come and are likely to have an effect beyond Europe. If it succeeds in its ambition, the DSP will transform the way individual users and businesses interact with some of the largest providers of online platforms and other core platform services.

While both texts are in the process of being adopted by the European Council, recent indications are that some of the large platforms are considering challenging the DMA in particular, although Amazon has described a court challenge as a “last resort”. Given the broad ramifications of the DSP and the scale of the companies it most affects, we can expect many twists and turns before the reforms take full effect.



This article was originally published in the Autumn 2022 edition of the Irish Compliance Quarterly magazine and is available on the Compliance Institute’s website. The authors would like to thank Emma Mintern for her contribution to this article.