Dark Patterns in Privacy: Misleading the internet users

Have you ever received an email from a newsletter that you don’t remember subscribing to, noticed that the final price in your shopping cart is much higher than advertised, or ended up paying for a subscription to a service after forgetting to cancel it at the end of a free trial? If the answer is yes, you have witnessed what is called a “dark pattern”.

Dark patterns, also known as “deceptive design patterns”, are interfaces, methods or user-experience implementations that make users take unintended, unwilling and potentially harmful decisions that they would not make under other circumstances in relation to the given website, app or service.

On some occasions these methods have harmless consequences, but in most cases they can lead to the disclosure of personal data, the contracting of unwanted products or services, hidden markups or the provision of financial information.

But how do they work?

The implementation of dark patterns reveals the unethical behaviour of a company and a disrespectful attitude towards its potential or actual users, and in the long term it works against the company's or website's own purposes.

Using psychological and marketing techniques, these deceptive designs may appear at several stages of an operation, such as the registration process, the presentation of pre-contractual conditions, an advertising commercial, the check-out process, the configuration of privacy and cookie options, the unsubscription steps or the acceptance of policies.

Types of Dark Patterns:

It will be easier for the reader to identify these tricks if presented with the categories of the different uses of deceptive design. Based on the EDPB Guidelines on this specific topic and other sources, such as the Deceptive Design organisation's classifications, we can distinguish between:

  • Overloading: users are confronted with an avalanche of requests, information, options or possibilities in order to prompt them to share more data or to unintentionally allow personal data processing against the expectations of the data subject. Under this category we can also find the so-called Continuous Prompting, Privacy Maze and Too Many Options.
  • Skipping: the interface or user experience is designed in a way that users forget, or do not think about, all or some of the data protection aspects.
  • Stirring: affects the choice users would make by appealing to their emotions or using visual nudges.
  • Hindering: obstructing or blocking users in the process of becoming informed or managing their data by making the action hard or impossible to achieve.
  • Fickle: the design of the interface is inconsistent and unclear, making it hard for the user to navigate the different data protection control tools and to understand the purpose of the processing.
  • Left in the dark: the interface is designed to hide information or data protection control tools, or to leave users unsure of how their data is processed and what kind of control they have over it regarding the exercise of their rights.
  • Confirmshaming: The act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.
  • Forced continuity: When your free trial with a service comes to an end and your credit card silently starts getting charged without any warning. In some cases this is made even worse by making it difficult to cancel the membership.
  • Hidden costs: You get to the last step of the checkout process, only to discover some unexpected charges have appeared, e.g. delivery charges, tax, etc.
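Two of the patterns above, pre-selected consent (a form of Skipping) and visually asymmetric accept/decline options (a form of Stirring), are easy to spot in an interface's configuration. As a minimal illustration, the following hypothetical TypeScript sketch models a consent dialog and flags those two patterns; the data model and function names are invented for this example and are not part of any real compliance tool.

```typescript
// Hypothetical model of a consent dialog, invented for illustration.
interface ConsentOption {
  label: string;
  checkedByDefault: boolean; // valid consent should not be pre-ticked
}

interface ConsentDialog {
  options: ConsentOption[];
  acceptButtonProminent: boolean;
  declineButtonProminent: boolean;
}

// Flags two dark patterns from the list above: pre-ticked boxes
// (Skipping) and asymmetric button prominence (Stirring via visual
// nudges). A design sketch, not a legal compliance check.
function findDarkPatterns(dialog: ConsentDialog): string[] {
  const issues: string[] = [];
  if (dialog.options.some((o) => o.checkedByDefault)) {
    issues.push("pre-ticked consent box");
  }
  if (dialog.acceptButtonProminent && !dialog.declineButtonProminent) {
    issues.push("asymmetric accept/decline prominence");
  }
  return issues;
}

// Example: a dialog exhibiting both patterns.
const dialog: ConsentDialog = {
  options: [{ label: "Share data with partners", checkedByDefault: true }],
  acceptButtonProminent: true,
  declineButtonProminent: false,
};

console.log(findDarkPatterns(dialog));
// → ["pre-ticked consent box", "asymmetric accept/decline prominence"]
```

A dialog with unticked boxes and equally prominent accept and decline buttons would produce an empty list here, which is the design the guidelines discussed below point towards.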

What could we do against dark patterns?

Although these techniques are, to a certain extent, commonly known, we must draw some red lines, especially when it comes to exploiting users' data and financial matters. Something we must realise about dark patterns is that bad design is not the same as design for evil, and this type of behaviour should be rejected by consumers, reflected in bad reviews, and punished by the authorities when those red lines are crossed.

The Spanish Data Protection Agency (AEPD) deals with dark patterns in its Guidelines for Data Protection by Default, specifically in sections VI and VIII. In application of the principle of fairness set out in Article 5(1)(a) GDPR, data controllers must ensure that no dark patterns are used, at least in relation to users' decisions regarding the processing of their personal data.

Also, using the same Article 5(1)(a) as a starting point, the European Data Protection Board (EDPB) adopted for public consultation its “Guidelines on Dark patterns in social media platform interfaces: How to recognise and avoid them”, providing a number of examples as well as best-practice recommendations for avoiding dark patterns.

In any case, the principle of data protection by design and by default should be applied from the moment of conception of user interfaces and user experiences, prior to launch, to guarantee the fundamental rights and freedoms of individuals, as well as compliance with regulations.

The Guidelines contain a checklist in the Annex. This checklist includes examples of each of the categories, as well as the articles of the GDPR that are relevant and may be violated in each category. Where such violations occur, data protection authorities may sanction the use of dark patterns.
