26/04/2022
Briefing

Against this backdrop, the European Data Protection Board (the “EDPB”) has published timely “Guidelines 3/2022 on Dark Patterns in Social Media Platform Interfaces: How to Recognise and Avoid Them” (the “Guidelines”). The Guidelines categorise the types of commonly used dark patterns and outline the data protection issues that arise. They provide concrete guidance in relation to every step of the user’s journey on a social media platform and will be useful to controllers who wish to assess their practices in advance of the DSA coming into force.

Dark patterns are defined in the Guidelines as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions regarding the processing of their personal data”. The purpose of dark patterns is to influence the user to make certain choices when interacting with online services.

While the Guidelines are predominantly aimed at social media platforms and users, the principles can be applied by other controllers to ensure greater data protection. The Guidelines are currently open for public consultation until 2 May 2022. We have set out some key takeaways from the Guidelines below.

Categorising Dark Patterns

The EDPB identifies the main categories of dark patterns and provides common examples for each. The Guidelines provide a useful taxonomy for social media companies in reviewing current interfaces and identifying potentially problematic practices.

Overloading: These techniques confront the user with “a mass of requests, information, options or possibilities” in order to steer them towards a choice preferable to the social media platform.
  • Continuous prompting (repeatedly asking users for data and offering arguments why they should provide it);
  • Privacy maze (pages that are difficult to navigate or too numerous);
  • Too many options.

Skipping: Designing the user interface or experience in such a way that the user forgets, or does not think about, all or some of the data protection aspects.
  • Deceptive snugness (defaulting choices to the most data-invasive features and options, nudging individuals towards approving these options);
  • Look over there (putting a data protection action or piece of information in competition with another element in order to distract the user from it).

Stirring: Affecting the choice users would make by appealing to their emotions or using visual nudges.
  • Emotional steering (framing information in an emotive manner to influence the emotional state of users so that they take an action that works against their data protection interests);
  • Hidden in plain sight (nudging users towards more invasive options, for example by presenting certain options in a smaller font or in faint grey text on a white background).

Hindering: Making the process of finding information or managing personal data difficult or impossible.
  • Dead end (a redirect link that does not work or is not available);
  • Longer than necessary (the user interface requires more steps to activate a data protection control than to activate the privacy-invasive setting);
  • Misleading information.

Fickle: An interface design that is unstable and inconsistent, making it difficult to find controls and information relating to the data processing.
  • Lacking hierarchy (data protection information lacks hierarchy, with the same information presented several times in different ways);
  • Decontextualising (providing data protection information on a page where it is out of context).

Left in the dark: Designing the interface so as to hide information or controls related to data protection, or to “leave users unsure of how data is processed and what kind of controls they might have over it”.
  • Language discontinuity (not providing information in the official language(s) of the country where users live);
  • Conflicting information;
  • Ambiguous wording of information.

The Guidelines state that the data processing principles in Article 5 of the General Data Protection Regulation (the “GDPR”), such as fair processing, transparency, data minimisation and accountability, should be used in assessing whether a design pattern constitutes a dark pattern. Accordingly, using dark patterns to hinder a data subject’s ability to protect their personal data and make conscious choices will amount to a breach of the GDPR.

Dark Patterns and the GDPR

The assessment of whether a design pattern is a dark pattern starts with the data processing principles in Article 5 GDPR, in particular fair processing, transparency and accountability. It also requires an analysis of other elements of the GDPR, such as data protection by design and by default, the definition of consent, and the obligation under Article 12 to provide information “in a concise, transparent, intelligible and easily accessible form”.

The EDPB recalls its earlier Guidelines 4/2019 on Article 25 Data Protection by Design and by Default and notes that some of their key elements become even more relevant in the context of dark patterns. These include the need to align processing with the data subject’s expectations, to ensure data subject autonomy and consumer choice, and to ensure that controllers are truthful about their processing activities. According to the EDPB, the correct application of these principles would prevent the implementation of dark patterns in the first place.

Practical Examples – Dark Patterns in Social Media Platform Interfaces

The Guidelines analyse the life-cycle of a social media account and outline how dark patterns are commonly implemented. They also provide guidance as to how social media platforms can ensure compliance with the GDPR at each step. Dark patterns are divided into content-based patterns and interface-based patterns.

Opening a Social Media Account

  1. The Guidelines stress that the focus at this stage should be on ensuring that information is provided to the user in a way that helps them understand exactly what they are signing up for. To the extent that the controller relies on consent as a legal basis, the provision of information should comply with the conditions for consent set out in Article 7 GDPR. Controllers must also ensure that the user knows how to withdraw consent, and withdrawing consent must be as easy as giving it.
  2. Continuous prompting, such as using pop-ups to ask the user for unnecessary personal information or questioning a refusal to grant consent, can undermine the user’s freely given consent – the user may simply give in and accept the pop-up. The Guidelines also note that this breaches the fairness principle, as it prolongs the sign-up process for users who do not wish to accept the pop-ups. Emotional steering can also breach the GDPR, and the Guidelines note that “influencing decisions by providing biased information to individuals can generally be considered as an unfair practice contrary to the principle of fairness of processing”. Imperative or motivational language can influence the user’s decision in an illegitimate manner, particularly where the user is a child.
  3. The interface can also be designed in such a manner as to nudge users towards more privacy-invasive options. This focus on the design of the interface is based on the idea that “even when social media providers make available all the information to be provided to data subjects under Article 13 and 14 GDPR, the way this information is presented can still infringe the overarching requirements of transparency under Article 12(1) GDPR”.

Staying Informed on Social Media

  1. Again, the Guidelines stress that “too much irrelevant or confusing information can obscure important content points or reduce the likelihood of finding them”. This can overload the user with information and amount to a breach of the transparency and fairness principles.
  2. Issues identified by the Guidelines include providing the user with conflicting information about the processing of their personal data, or implementing emotional steering by highlighting only the positive aspects of sharing information, thereby giving users “the illusion of safety and comfort with regard to the potential risks of sharing” personal data. The wording of any policies or information must also allow the average user to understand the “genuine message of the information without special knowledge”.
  3. In addition, the Guidelines state that “settings should be easily accessible and associated with relevant information for users to make an informed decision”. While a layered approach can help present the privacy notice or policy in a more readable manner, the Guidelines note that there is no one-size-fits-all approach and call for testing with users before implementation; a hypothetical sketch of a layered notice follows this list. The number of steps required for a user to exercise their rights should be kept as low as possible.
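
The Guidelines do not prescribe any particular implementation of a layered notice, but a minimal sketch may help illustrate the idea. Everything below (the NoticeLayer structure, its field names and the sample text) is hypothetical: the point is only that each layer stays short and plain, and the full detail is reachable in a single step.

```typescript
// Hypothetical sketch of a layered privacy notice (not taken from the
// Guidelines): each layer offers a short, plain-language summary, with the
// full detail reachable in a single expansion step.

interface NoticeLayer {
  heading: string; // topic of the layer, e.g. "What we collect"
  summary: string; // short first-layer text shown up front
  detail: string;  // fuller second-layer text, one step away
}

const privacyNotice: NoticeLayer[] = [
  {
    heading: "What we collect",
    summary: "Your profile details, posts and device information.",
    detail: "A full, plain-language list of data categories and sources.",
  },
  {
    heading: "Why we use it",
    summary: "To provide the service and, only with your consent, to personalise ads.",
    detail: "A full description of each purpose and its legal basis.",
  },
];

// The first layer is rendered in full; the detail is never more than one
// interaction away, keeping the number of steps to the information low.
for (const layer of privacyNotice) {
  console.log(`${layer.heading}: ${layer.summary} [expand for detail]`);
}
```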

Staying Protected on Social Media

The Guidelines examine common practices in managing consent and data protection settings and highlight those that result in breaches of the GDPR, including the following (an illustrative sketch of a clearer approach appears after the list):

  • poorly designed toggles for consent, such that the user does not know which position indicates a refusal of consent;
  • putting a data protection action or related information in competition with another element (for example, one controller reportedly included a link to a recipe for “delicious cookies” in their cookie banner);
  • requiring more steps to withdraw consent or opt-out of targeted advertising than the steps required to provide consent or to opt-in; and
  • including a large number of steps for changing settings, which can discourage users from finalising their desired change in preferences or result in the user overlooking some options.
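
The Guidelines themselves contain no code, but a minimal sketch may make the above concrete. All of the names below (ConsentManager, ConsentState and so on) are hypothetical; the sketch simply models a consent control in which the state is unambiguous, the default is privacy-friendly, and opting out takes no more steps than opting in.

```typescript
// Hypothetical sketch (not taken from the Guidelines) of a consent control
// that avoids the pitfalls listed above: the current state is spelled out in
// words rather than inferred from a toggle position, the privacy-friendly
// option is the default, and giving and withdrawing consent are symmetric
// one-step operations.

type ConsentState = "granted" | "refused";

interface ConsentRecord {
  purpose: string;
  state: ConsentState;
  updatedAt: Date;
}

class ConsentManager {
  private records = new Map<string, ConsentRecord>();

  // Consent defaults to "refused", so the most privacy-friendly option
  // applies unless the user actively chooses otherwise.
  getState(purpose: string): ConsentState {
    return this.records.get(purpose)?.state ?? "refused";
  }

  // Granting and withdrawing take exactly one call each, so withdrawal is
  // as easy as giving consent (cf. Article 7(3) GDPR).
  grant(purpose: string): void {
    this.set(purpose, "granted");
  }

  withdraw(purpose: string): void {
    this.set(purpose, "refused");
  }

  // The label states the current position in words, so the user is never
  // left guessing which toggle position means refusal.
  label(purpose: string): string {
    return this.getState(purpose) === "granted"
      ? `Consent given for ${purpose} (select to withdraw)`
      : `No consent given for ${purpose} (select to give consent)`;
  }

  private set(purpose: string, state: ConsentState): void {
    this.records.set(purpose, { purpose, state, updatedAt: new Date() });
  }
}

// Usage: both directions of the choice take a single step.
const consent = new ConsentManager();
consent.grant("personalised advertising");
console.log(consent.label("personalised advertising"));
consent.withdraw("personalised advertising");
console.log(consent.label("personalised advertising"));
```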

Data Subject Rights

The exercise of data subject rights can itself be subject to dark patterns. Broken links or ambiguous wording can make it difficult for an individual to exercise their rights or to assess any conditions attaching to them. The Guidelines stress that the controller must not use a “privacy maze” to make the user journey longer than necessary, or question the user’s motivation for exercising their rights through pop-ups and similar banners.

Leaving a Social Media Account

The Guidelines note that a request to delete a social media account must be understood as a withdrawal of consent under Article 7(3) GDPR and will likely require the erasure of personal data, in accordance with the principle of data minimisation.

While a social media platform can offer an option to “pause” an account as well as permanent erasure, the Guidelines make clear that the user must be given sufficient information about the personal data that will be processed while the account is “paused”, and that consent given during the sign-up process is unlikely to cover the period after the account is paused.

Social media platforms cannot hide the permanent erasure option in a privacy maze or utilise emotional steering, such as “your friends will forget you”. Providing a “comprehensive description of the supposedly negative consequences caused by users erasing their account [will constitute] an impediment against their decision”. Another issue identified is the addition of irrelevant steps in the process of deleting the account – such as asking for the reason why the user is deleting the account and then providing “solutions” in the form of pop-ups.

Conclusion

The Guidelines provide a comprehensive and detailed breakdown of the life-cycle of a social media account and highlight common practices which are considered by the EDPB to be a breach of the GDPR. The Guidelines are likely to inform supervisory authorities’ investigations, particularly with respect to transparency obligations, and may trigger specific regulatory interest into the use of dark patterns.

Social media companies, and controllers generally, should review their practices and user interfaces in light of these Guidelines. Because the line between legitimate content/UX design and dark patterns can be difficult to draw, controllers should take steps to demonstrate their compliance with the GDPR, including by documenting any user research they commission and the reasons for their design decisions. This will be particularly important given the DSA’s expected prohibition on the use of dark patterns.

The authors would like to thank Shay Buckley for his contribution to this article.