Designing without dark patterns
A guide for designers

1. Introduction


a. Presentation of the guide

Purpose of the guide

This guide was designed after observing that there is a lack of resources for designers on the topic of dark patterns. There is significant academic research on the classification of dark patterns (deceptive designs), their automatic recognition, their impact on users, etc. On the other hand, there are very few practical resources or tools available to the designers who build these services.

The guide’s objectives:

  • to compile good practices for avoiding dark patterns;
  • to help correct our biases and habits as designers;
  • to provide alternatives that are more respectful of the user.


This guide has a reflective rather than normative aim. It provides food for thought, as well as a substantial, albeit non-exhaustive, list of best practices.


This resource is distributed under the Creative Commons CC-BY licence. You are therefore free to share, modify or reuse this content, as long as you credit the authors.

Scope

This guide covers:

  • Dark patterns (deceptive design)
  • Captology
  • Design for attention
  • Persuasive design

This guide doesn’t cover:

Target audience

This guide is intended for:

  • Digital service designers (UX designer, UI designer, Product designer, etc.);
  • Anyone involved in the design of digital services (project manager, developer, product owner, data scientist, etc.).


The aim is to give all designers, whatever their technical level, the keys to act and to adopt an approach that is more respectful of their users' free will.

b. Definitions

  • dark patterns (or deceptive design): interface elements designed to push users to do things they would not necessarily have done otherwise;
  • persuasive design: design intended to guide users towards a specific behaviour. Persuasive design is considered a specialisation of (user-centred) UX design;
  • captology: derived from the acronym CAPT (Computers As Persuasive Technologies); the study of digital technologies as tools for influencing and persuading individuals. The term was coined by the American researcher B.J. Fogg in the 1990s;
  • design for attention: design built around the user's attention (often with the aim of retaining it for as long as possible).

c. Challenges of dark patterns

Today, attention is monetised: it is placed at the heart of the economic models of social or entertainment platforms.

Our attention has always been exploited; as the CEO of the French TV channel TF1 put it in 2004, 'what we sell to Coca-Cola is available human brain time'. But the advent of smartphones and computers has produced a paradigm shift: digital services can now solicit our attention through interaction.

This leads to a general over-solicitation of users' attention. Technical capabilities (such as sending push notifications [unsolicited messages] to a smartphone) encourage an ever-increasing flow of information (advertising, etc.), as does access to ever more data (personal, demographic, etc.).

Since users' attention is limited, it becomes a scarce resource. It is often said, wrongly, that people are addicted to their smartphones. We should rather say that digital services are addicted to the attention of their users.

This over-solicitation of users creates several issues:

Free will issues

The issue of free will is the first to take into account. Deceptive or persuasive design practices most often hinder the user by pushing them to act in the interest of the service.


The different restrictions on the user's free will are described below:

Influencing consumer behaviour

Techniques for influencing consumer practices largely predate the emergence of digital technology (placement of products in supermarkets, television advertising, etc.). Faced with these practices, consumers have developed numerous avoidance strategies (such as skipping advertisements, or putting a 'Stop advertising' sticker on their mailbox). Digital technologies, however, by increasing the power of persuasion, make these mechanisms of influence much harder for users to understand.

As a result, we have seen an increase in practices such as:

  • products added to our basket without our consent;
  • hidden added fees, unauthorised subscriptions.


These practices of influence, or even manipulation of users, are increasingly difficult to detect.


A January 2023 study by the European Commission and national consumer protection authorities showed that 148 of the 399 sites audited used manipulation techniques.

Examples of deceptive practices

These examples range from dark patterns to persuasive interfaces:

  • the more you look up plane tickets from the same IP address, the higher the ticket prices become;
  • Instagram integrates barely detectable advertising content among the other posts viewed.

Excessive reactivity

Improving the user experience helps simplify the journey. But it also sometimes tends to move the user from a posture of reflection and understanding (where he/she takes the time to read and obtain information) to a posture of reactivity (where he/she reacts without thinking, without reading the content).

A study by Columbia University, Microsoft Research and INRIA (the French National Institute for Research in Digital Science and Technology) shows that 60% of content on Twitter is retweeted without having been read (Gabielkov et al., 2016).

Locked in filter bubbles

The concept of a filter bubble describes the confinement of a user in a specific content universe.

The digital services we use personalise the information transmitted to us. This personalisation is based on the connections we have, the content we watch, etc. This tends to lock us into homogenised content.

This notion is particularly apparent on TikTok. Reactions to viewed content feed the recommendation algorithms, which then determine which content to show the user.

After some time of viewing, the user only sees the tiny portion of content that corresponds to their supposed preferences.
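To make this feedback loop concrete, here is a deliberately naive sketch of an engagement-based recommender. It is an illustration under assumptions, not the actual algorithm of TikTok or any platform; the names, signals and weightings are all hypothetical.

```typescript
// Deliberately naive sketch of an engagement-driven recommender.
// All names, signals and weights are hypothetical.

interface Post {
  id: string;
  topic: string;
}

// Engagement signals collected while the user views content.
interface Reaction {
  topic: string;
  watchTimeSeconds: number;
  liked: boolean;
}

// Build a per-topic preference score from past reactions.
function topicScores(reactions: Reaction[]): Map<string, number> {
  const scores = new Map<string, number>();
  for (const r of reactions) {
    // Watch time and likes both count as positive engagement.
    const signal = r.watchTimeSeconds + (r.liked ? 30 : 0);
    scores.set(r.topic, (scores.get(r.topic) ?? 0) + signal);
  }
  return scores;
}

// Rank candidate posts purely by past engagement: the more the user has
// reacted to a topic, the more of that topic they are shown. This is the
// feedback loop that progressively narrows the content universe.
function recommend(candidates: Post[], reactions: Reaction[]): Post[] {
  const scores = topicScores(reactions);
  return [...candidates].sort(
    (a, b) => (scores.get(b.topic) ?? 0) - (scores.get(a.topic) ?? 0)
  );
}
```

Because each reaction raises the score of a topic the user already engages with, content from outside that universe sinks to the bottom of the ranking and effectively disappears from the feed.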

The approach is similar on Netflix, whose recommendations are based on content we have already seen. This leads to certain content being made invisible.

The main risk is being confined in a universe of beliefs that satisfies us without enriching us, altering our perception of the world as it is.

Influencing decisions

Another risk is being guided towards decisions that we would not naturally make (behavioural targeting).

During the Brexit campaign and the 2016 American elections, the company Cambridge Analytica published targeted advertisements on Facebook, with the aim of influencing Internet users' choice of vote (Cadwalladr, 2017).

The science fiction novel 'QualityLand' by Marc-Uwe Kling (2017) vividly depicts such a dystopian universe, in which people are trapped in filter bubbles and influenced into making particular decisions.

Attentional issues

We define attention here as the ability to concentrate, to be attentive.

The omnipresence of digital technology in our lives, combined with persuasive practices, generates multiple attentional consequences such as:

Compulsive use

We tend to use our phones compulsively: on average, we 'tap' on our phones 2,617 times per day (according to a 2016 Dscout survey). Our attention is regularly interrupted, which weighs on our ability to concentrate.

Increasingly shorter tasks

The tasks that we use computers for are becoming shorter and shorter. 75% of screen content is viewed for less than a minute (Yeykelis et al., 2014).

Our attention increasingly in the alert regime

Sociologist Dominique Boullier identifies four regimes of attention: loyalty, immersion, alert and projection.

The alert regime corresponds to all the warning signals that we receive on a daily basis (pop-up windows, notifications of unexpected winnings, etc.). This alert regime is today confronted with widespread zapping and a series of 'flashes', which undermine its foundations (Boullier, 2009). This is why it is important to reduce the number of warning signals, which generate a climate of stress; doing so helps users preserve their attention.
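One way to reduce these signals, sketched below under assumptions (the one-hour digest interval and all names are hypothetical choices, not a prescribed standard), is to queue notifications and deliver them as a periodic digest rather than interrupting the user each time.

```typescript
// Minimal sketch: batch notifications into a periodic digest instead of
// interrupting the user with each signal. Names and the 1-hour interval
// are hypothetical choices.

type Notification = { title: string; body: string };

const pending: Notification[] = [];

// Instead of displaying each notification immediately, queue it.
function notify(notification: Notification): void {
  pending.push(notification);
}

// Every hour, deliver a single digest summarising the queued signals.
setInterval(() => {
  if (pending.length === 0) return;
  const digest = pending.map((n) => `- ${n.title}`).join("\n");
  console.log(`You have ${pending.length} new notifications:\n${digest}`);
  pending.length = 0; // clear the queue
}, 60 * 60 * 1000);
```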

Health issues

Persuasive methods also pose health issues, particularly for mental health. The two consequences described here are stress and mental load, and biased perception.

It should be noted that these aspects are the subject of lively debate among health and psychology professionals.

Stress and mental load

Social networks are designed to encourage users to check them regularly (following the mechanisms presented by Nir Eyal in his book Hooked, 2017). The design practices used (attention design, persuasion) push users to check their social networks frequently; otherwise, they feel stressed.

Why?

  • for fear of missing important information (known as FOMO: fear of missing out);
  • to receive rewards in the form of likes, comments, etc.

Moreover, by frequently stimulating users, social networks and applications increase their mental load.

Twitter Example

The use of infinite scrolling and the arrival of new information 24 hours a day encourage users to check the content regularly. Each newly discovered post is a stimulation. Wanting to stay informed, the user tends to check again and again, adding to their mental load.
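A more respectful alternative, in line with this guide's aim of proposing alternatives, is to replace automatic infinite scrolling with an explicit stopping point. Below is a minimal sketch under assumptions: the element ids, the /api/posts endpoint and the fetchNextPosts helper are hypothetical.

```typescript
// Minimal sketch: replace automatic infinite scroll with an explicit
// "Load more" action, giving the user a natural stopping point.
// Element ids ("feed", "load-more") and the endpoint are hypothetical.

async function fetchNextPosts(page: number): Promise<string[]> {
  // Placeholder for a real API call returning the next page of posts.
  const response = await fetch(`/api/posts?page=${page}`);
  return response.json();
}

let currentPage = 0;

document.getElementById("load-more")?.addEventListener("click", async () => {
  currentPage += 1;
  const posts = await fetchNextPosts(currentPage);
  const feed = document.getElementById("feed");
  for (const text of posts) {
    const item = document.createElement("article");
    item.textContent = text;
    feed?.appendChild(item);
  }
  // The user decides whether to continue: no content loads without
  // an explicit action on their part.
});
```

Pagination of this kind gives the user a natural pause where they can decide to stop, instead of an endless feed that decides for them.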

Biased perception

Social media apps encourage a biased image of ourselves.

Social media invites the use of filters, which make users appear with smoother skin, a thinner face, etc. These filters propagate beauty standards.
The prevalence of such filtered images on social media contributes to altering the self-esteem of some individuals, to the point of causing body dysmorphic disorders.

Moreover, this shift in beauty standards has other consequences, such as the explosion in cosmetic surgery. According to a 2018 article in the American scientific journal JAMA Facial Plastic Surgery (Rajanala et al., 2018), the share of cosmetic surgery requests linked to the influence of social networks has quadrupled (from 13% in 2016 to 55%). The increase mainly concerns patients under 30.

This can be explained by stronger social comparison on social media: comparison to peers has been linked to body image issues (Morrison et al., 2004).

Social media apps also convey a biased image of others, such as the illusion of a perfect life. Particularly visible on Instagram, this stream of beautiful photos showing holidays and tidy, flawless homes pushes users to question their own happiness.

Other impacts

Below is a non-exhaustive list of other impacts on user health:

  • depression impacted by heavy use of social media (Lin et al., 2016) and by use before going to sleep (Lemola et al., 2014);
  • eating disorders such as orthorexia nervosa. This disorder is impacted by significant use of social networks and particularly Instagram (Turner & Lefevre, 2017);
  • attention problems and increased impulsivity with heavy smartphone use (Hadar et al., 2017);
  • language delay in children with intensive use (more than two hours per day) of screen media (Hutton et al., 2020).

Ecological issues

The ecological issue must also be taken into account in the context of dark patterns.

The weight of digital technology

Today, digital technology has a significant environmental impact (around 4% of global greenhouse gas emissions). It is therefore urgent to rethink the way we use it. Reducing this impact will inevitably involve a form of digital sobriety, in which the use of advertising and data collection will necessarily be questioned.

Defined use cases

It is also worth favouring well-defined use cases that focus on functional needs.

Example

A user wants to book a train ticket. He goes to the tool, chooses his train, views the options and pays.

During this journey the user does not see any advertising for hotels, cars, or additional options for unrequested insurance. The user journey is limited to the essentials.


d. Role of digital designers

Digital designers (UI, UX, product) are responsible for the interfaces and experiences of digital services.

Have a conscious approach

Designing means arranging information and making a journey fluid. By ordering elements, the designer decides what is important and in what order the user sees information. These design decisions influence users in two ways:

  • intentionally, with a desire to capture and direct users' attention;
  • unconsciously, by reproducing patterns that the designer has internalised.

Example

As a designer, I am asked to design a GDPR banner. Instinctively, I am likely to produce an interface similar to what I am used to seeing on other sites, for example with a prominent primary button for accepting cookies. But in doing so, I unconsciously perpetuate a pattern that influences user behaviour.
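By contrast, a conscious approach gives both choices equal visual weight. Here is a minimal sketch of such a consent banner; the class names, wording and renderConsentBanner function are hypothetical, not a reference implementation.

```typescript
// Minimal sketch of a consent banner where "Accept" and "Refuse" carry
// equal visual weight, so neither choice is nudged. Class names and
// wording are hypothetical.

function renderConsentBanner(onChoice: (accepted: boolean) => void): void {
  const banner = document.createElement("div");
  banner.className = "consent-banner";
  banner.textContent = "This site uses cookies for audience measurement.";

  // Both buttons share the same neutral style: no primary/secondary
  // hierarchy pushing the user towards acceptance.
  for (const [label, accepted] of [["Accept", true], ["Refuse", false]] as const) {
    const button = document.createElement("button");
    button.className = "consent-button"; // identical style for both choices
    button.textContent = label;
    button.addEventListener("click", () => {
      onChoice(accepted);
      banner.remove();
    });
    banner.appendChild(button);
  }

  document.body.appendChild(banner);
}

// Usage: renderConsentBanner((accepted) => console.log("Consent:", accepted));
```

The design choice is simply that neither button is styled as 'primary': accepting and refusing require the same number of clicks and carry the same visual prominence.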

Only a conscious approach by the designer can produce a digital service that respects its users' attention. He/she thus becomes a guarantor of respect for users' attention.

Train yourself

To respect users' attention, digital designers must have detailed knowledge of persuasive mechanisms, and their more respectful alternatives.

Today, attentional, persuasive or deceptive practices are regularly addressed in public debate (see the Netflix documentary The Social Dilemma (Jeff Orlowski, 2020)). But they are still rarely taught in design schools and little discussed in the professional world.

This guide allows designers to understand persuasive mechanisms in order to adopt a conscious approach.

Involve the whole team

Nevertheless, the designer is not the sole guarantor of respect for users' attention. They must involve the whole team.

This conscious approach is a collective commitment. Those who design the web are responsible for thoughtfully shaping our attentional environments. This builds good habits and good attentional hygiene.

Most of the deceptive methods in this guide relate to experiences or interfaces. However, some persuasion practices involve stakeholders beyond designers alone.

Help and contribution

The guide is intended to evolve and be enriched over time thanks to contributions from the community.

Would you like to contribute to it?

Join the #projet_dark-patterns channel on the Designers Éthiques Slack.