ePrivacy: EU Regulation Introduced to Allow Companies to Tackle Online Child Sexual Abuse
Earlier this year, we published a briefing on a draft EU Regulation which would allow certain companies to continue to tackle online child sexual abuse on their services. Since then, the final text of the Regulation has been agreed (Regulation (EU) 2021/1232) and has entered into force as of 2 August 2021 (the “Regulation”). This briefing outlines key provisions of the Regulation, which should be noted by providers of interpersonal communications services.
In light of the dramatic increase in reports of child sexual abuse online over the last decade, the European Commission has declared an “urgent need to take effective action” against child sexual abuse and has described this fight as a “priority of the EU.”
However, the introduction of the European Electronic Communications Code (Directive (EU) 2018/1972) (the “Code”) into law in December 2020 presented a new hurdle to these efforts, as the Code extended the provisions of the ePrivacy Directive (“ePD”) to, among others, providers of “number-independent interpersonal communications services” (“Service Providers”).
Of particular concern were the effects of Articles 5(1) and 6(1) ePD on the child protection efforts of Service Providers. Article 5(1) prohibits the “listening, tapping, storage or other kinds of interception or surveillance of communications and the related traffic data by persons other than users, without the consent of the users concerned, except when legally authorised to do so in accordance with Article 15(1).” Article 6(1) requires the erasure or anonymisation of traffic data, once it is no longer needed to transmit a communication (subject to certain exemptions).
These changes affected some of the most widely used webmail and messaging services, and called into question the lawfulness of Service Providers’ voluntary use of technologies to fight child sexual abuse on their platforms.
The Regulation provides a temporary solution to this problem, and will allow Service Providers to continue to tackle online child sexual abuse, subject to compliance with certain measures. The Regulation will apply until 3 August 2024, by which time it is expected that more cohesive legislation will be in place.
Scope of the derogations
The Regulation enables qualifying Service Providers to invoke “temporary and strictly limited” derogations from Articles 5(1) and 6(1) ePD. These derogations allow Service Providers to scan otherwise-confidential communications to the extent strictly necessary for the purposes of detecting and reporting “online child sexual abuse,” which encompasses both “online child sexual abuse material” and the “solicitation of children.”
The scanning of audio communications is expressly out of scope of the Regulation.
Where Service Providers wish to engage in such monitoring, they must use the “least privacy-intrusive measures” and adhere to the principle of data protection by design and by default (per Article 25 of the GDPR).
Any technology used to detect online child sexual abuse must be in accordance with the state of the art in the industry, and where technologies are used to scan text, they should not be able “to deduce the substance of the content of the communications,” as they should be solely able to “detect patterns which point to possible online child sexual abuse.” Technologies must also be “sufficiently reliable,” with the most limited error rate possible, and they must be limited to the use of relevant key indicators and objectively identified risk factors.
Before using any relevant technology, Service Providers are required to conduct a data protection impact assessment (“DPIA”), and they must also engage in prior consultation with their competent supervisory authority.
Further, Article 3(1)(g) sets out key operational requirements for Service Providers who wish to avail of these derogations. These include:
- establishing internal procedures to prevent the abuse of or unauthorised access to and unauthorised transfers of personal and other data;
- ensuring human oversight of the processing of personal data and human intervention where necessary;
- requiring newly identified infringing material to undergo human confirmation prior to reporting same to law enforcement authorities and other designated organisations;
- establishing appropriate procedures and redress mechanisms to enable users to lodge complaints against the removal or reporting of their content;
- informing users, in a clear, prominent and comprehensible way, that the Service Provider is availing of the derogation from the ePD for the sole purpose of detecting and removing online child sexual abuse material, detecting the solicitation of children and making appropriate reports; of the logic behind the measures taken and their impact on the confidentiality of users’ communications; and of the possibility that their personal data may be shared with law enforcement authorities and other appropriate organisations;
- informing users of their options for seeking redress, complaining to a supervisory authority and seeking a judicial remedy;
- publishing an annual report containing prescribed information, and submitting this to the competent supervisory authority and the Commission. The first report is due by 3 February 2022, and by 31 January every year thereafter.
Article 3(1)(j) of the Regulation requires that “every case of a reasoned and verified suspicion of online child sexual abuse [be] reported without delay to the competent national law enforcement authorities or to organisations acting in the public interest against child abuse.”
In order to meet these reporting obligations, Article 3(1)(h) requires Service Providers to securely store the relevant content, traffic and personal data relating to suspected online child sexual abuse. This data may only be stored for the following purposes:
- prompt reporting;
- blocking the user’s account or suspending the provision of the service to them;
- creating a “unique, non-reconvertible digital signature (‘hash’)” which can be used to detect future online child sexual abuse material;
- enabling the user to seek redress or pursue other remedies; and/or
- responding to requests by competent authorities to assist them in preventing, detecting, investigating and/or prosecuting criminal offences.
This data must be held for no longer than strictly necessary, and must be deleted after a maximum of 12 months from the date of detecting the infringing material.
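For readers unfamiliar with the “hash” mechanism the Regulation refers to, the following is a purely illustrative sketch in Python of how hash-based matching works in principle. In practice, providers typically use specialised perceptual-hashing technologies rather than a plain cryptographic hash; the hash set and sample content below are hypothetical.

```python
import hashlib

def content_hash(data: bytes) -> str:
    # A cryptographic hash is "non-reconvertible": the original
    # content cannot be recovered from the digest.
    return hashlib.sha256(data).hexdigest()

# Hypothetical store of digests of previously confirmed material.
known_hashes = {content_hash(b"previously-confirmed sample")}

def matches_known_material(data: bytes) -> bool:
    # Matching compares digests only; the service never needs to
    # re-inspect or retain the underlying content itself.
    return content_hash(data) in known_hashes
```

The design point is that a match can be established against future uploads without storing the infringing material indefinitely, which is consistent with the limited retention purposes listed above.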
DPIAs and prior consultation
Service Providers are not required to have completed a DPIA and engaged in prior consultation with their competent supervisory authority until 3 April 2022, provided that they commence the prior consultation procedure by 3 September 2021.
List of organisations acting in the public interest
Service Providers have until 3 September 2021 to provide the European Commission with a list of the “organisations acting in the public interest against child sexual abuse” to which they intend to send reports. This list must be updated on a “regular basis” thereafter.
What Service Providers Should Do Now
Bearing in mind that monitoring for online child sexual abuse remains entirely voluntary, Service Providers who wish to rely on the Regulation should promptly consider engagement with the Data Protection Commission (where Ireland is their main establishment for the purposes of the GDPR). Service Providers must also notify the European Commission by 3 September 2021 of any public interest organisation to which the Service Provider makes reports.
Service Providers must also take steps to comply with the operational requirements of the Regulation, which may involve an in-depth review of existing practices.
The European Data Protection Board is due to issue guidance to regulators in the coming months on how to assess whether activities conducted pursuant to the Regulation are conducted in accordance with the GDPR. This guidance should assist Service Providers in identifying and addressing key concerns from a data protection perspective.
The authors would like to thank David O’Connor for his contribution to this article.