Chapter 30: Changes afoot in the European Union

The European Union is putting considerable energy into preventing deceptive patterns. Existing legal frameworks like the General Data Protection Regulation (GDPR) and consumer protection law, including the Unfair Commercial Practices Directive (UCPD), are already being deployed extensively against platforms that use deceptive patterns. What’s more, the EU has been working on two huge new laws that will extensively regulate big tech. By the first quarter of 2024, the Digital Services Act and the Digital Markets Act will both fully apply. Both of these laws contain specific provisions regulating deceptive patterns and manipulative design. This is a step forward from the GDPR and the UCPD, which require concepts like consent, transparency and unfairness to be interpreted and applied to deceptive patterns on a case-by-case basis.

The Digital Markets Act

The Digital Markets Act (DMA) was created in March 2022 with the goal of ensuring fair and open digital markets in the EU.1 It targets big tech companies like Microsoft, Apple, Google, Meta and Amazon. Any company with either more than 45 million monthly active EU users or over €7.5 billion in annual EU turnover may be designated a ‘gatekeeper’ and made subject to obligations under the DMA. There are also some qualitative criteria that appear to be designed to prevent influential companies from sneaking under the size requirements.2 In the words of the European Commission, the DMA aims at ‘preventing gatekeepers from imposing unfair conditions on businesses and end users and at ensuring the openness of important digital services’.3
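
To put the quantitative criteria in context, here’s a minimal sketch in Python. It’s purely illustrative and is my own simplification: it reduces gatekeeper designation to the two thresholds mentioned above, whereas the real designation process involves additional criteria and a formal decision by the European Commission.

    # Purely illustrative: gatekeeper designation reduced to the two
    # quantitative thresholds described above. The real DMA process
    # involves further criteria and a formal Commission decision.
    DMA_USER_THRESHOLD = 45_000_000             # monthly active EU users
    DMA_TURNOVER_THRESHOLD_EUR = 7_500_000_000  # annual EU turnover (€7.5bn)

    def may_be_gatekeeper(monthly_active_eu_users, annual_eu_turnover_eur):
        """Return True if a company crosses either quantitative threshold."""
        return (monthly_active_eu_users > DMA_USER_THRESHOLD
                or annual_eu_turnover_eur > DMA_TURNOVER_THRESHOLD_EUR)

    # Example: 50 million monthly active EU users is enough on its own.
    print(may_be_gatekeeper(50_000_000, 2_000_000_000))  # True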

The DMA bans deceptive patterns when they’re used to undermine the other rules in the DMA (article 13). Those other rules are very wide-ranging, so the DMA’s reach over deceptive patterns is correspondingly broad. For example, the recitals of the DMA (the part of the legislation that explains how to interpret the provisions) clearly state that deceptive patterns are forbidden if they are used by gatekeepers to do any of the following:

  • Interfere with a user’s choice to be tracked or not for targeted advertising outside the gatekeeper’s main platform (recitals 36 and 37).
  • Nag users; that is, prompt them more than once a year to give consent for data processing, having previously ignored or refused the request (recital 37).
  • Interfere with a user’s choice to install third-party apps or app stores (recital 41).
  • Interfere with a user’s choice of settings, or their choice to uninstall any pre-installed apps (recital 49).
  • Interfere with a user’s ability to export their data from the gatekeeper’s platform in a format that can be imported to third parties (recital 59).
  • Make it more difficult for a user to unsubscribe from a service than it was to subscribe (recital 63).

This demonstrates how enormously consequential the DMA will be for tech companies that get categorised as gatekeepers. If a gatekeeper breaks the rules, the sanctions are potentially huge: up to 10% of the company’s total worldwide annual turnover, or 20% if they are repeat offenders. There are other sanctions possible too, ranging all the way up to having the gatekeeper broken up or kicked out of the EU entirely.
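
To put those percentages in perspective: a gatekeeper with (to pick a purely illustrative figure) €200 billion in worldwide annual turnover could face a fine of up to €20 billion for a first offence, or up to €40 billion as a repeat offender.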

The Digital Services Act

The EU Digital Services Act (DSA) contains even more good news about deceptive patterns.4 It entered into force in November 2022 and is being rolled out gradually, with the parts about deceptive patterns becoming fully applicable by June 2023. The DSA has a layered system, with the rules becoming stricter at each successive level.

The DSA contains provisions about deceptive patterns, but they only apply to the two highest tiers: ‘online platforms’ and – something of a mouthful – ‘very large online platforms’ and ‘very large online search engines’ (VLOPs and VLOSEs):

  • Online platforms: A service ‘that, at the request of a recipient of the service, stores and disseminates information to the public’. This includes online marketplaces (like Amazon), app stores (like Apple’s App Store and Google Play), collaborative economy platforms (like Uber) and social media platforms (like Facebook).
  • Very large online platforms (VLOPs) and very large online search engines (VLOSEs): VLOPs are the same as online platforms, only bigger, with 45+ million monthly active users. VLOSEs are search engines (like Google) that have 45+ million monthly active users.

The DSA’s provisions about deceptive patterns don’t apply to the lower tiers. So, the following are excluded:

  • Micro and small enterprises: Any business with fewer than 50 employees and an annual turnover below €10 million (unless the size of its user base makes it a VLOP or VLOSE).
  • Intermediary services: Offering network infrastructure – things like VPNs, DNS services, domain name registries, registrars, VoIP services, CDNs, and so on.
  • Hosting services: Offering cloud and web hosting, such as GoDaddy or Amazon Web Services (AWS).

As you can see, the layered nature of the DSA is a bit complicated, but the main point to take away is that the provisions about deceptive patterns apply to lots of big tech companies. Apple, Amazon, Uber, Google, Facebook – they’re all regulated by the DSA in some capacity.
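
If it helps to see the tiering logic laid out, here’s a rough sketch in Python. This is my own simplification rather than anything defined in the DSA itself – the real tests turn on legal definitions and formal designation decisions, not a handful of numbers.

    # My own simplification of the DSA tiers described above; the real
    # tests turn on legal definitions and formal designation decisions.
    def dsa_tier(is_search_engine, is_online_platform,
                 monthly_active_eu_users, headcount, turnover_eur):
        # Size trumps the micro/small enterprise exemption.
        if is_search_engine and monthly_active_eu_users >= 45_000_000:
            return 'VLOSE (deceptive pattern provisions apply)'
        if is_online_platform and monthly_active_eu_users >= 45_000_000:
            return 'VLOP (deceptive pattern provisions apply)'
        if headcount < 50 and turnover_eur < 10_000_000:
            return 'micro or small enterprise (exempt)'
        if is_online_platform:
            return 'online platform (deceptive pattern provisions apply)'
        return 'intermediary or hosting service (exempt)'

    # Example: a social media platform with 60 million monthly active EU users.
    print(dsa_tier(False, True, 60_000_000, 9_000, 10_000_000_000))  # VLOP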

With that established, we can move on to the actual provisions in the DSA that regulate deceptive patterns. The term ‘dark pattern’ is defined in the DSA recitals (recital 67). To quote:

‘Dark patterns on online interfaces of online platforms are practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them. Providers of online platforms should therefore be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof. This should include, but not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests’.

It’s notable that the recital states that deceptive patterns do not have to be intentional; they only have to be shown to have an effect on users (‘either on purpose or in effect’). This is also the case for unfair commercial practices under the UCPD, and it should make enforcement more straightforward. The recital also goes on to explicitly forbid certain deceptive patterns – though bear in mind that a ‘recital’ in EU law is not legally binding; it’s just intended to clarify the law.

  • Misdirection: ‘presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory, or other components, when asking the recipient of the service for a decision’.
  • Nagging: ‘It should also include repeatedly requesting a recipient of the service to make a choice where such a choice has already been made’.
  • Hard to cancel: ‘making the procedure of cancelling a service significantly more cumbersome than signing up to it, or making certain choices more difficult or time-consuming than others, making it unreasonably difficult to discontinue purchases’.
  • Obstruction: ‘default settings that are very difficult to change, and so unreasonably bias the decision making of the recipient of the service, in a way that distorts and impairs their autonomy, decision-making and choice’ (recital 67).

Unlike recitals, ‘articles’ of an EU law are legally binding. In article 25 of the DSA, deceptive patterns are expressly forbidden:

‘Providers of online platforms shall not design, organise or operate their online interfaces in a way that deceives or manipulates the recipients of their service or in a way that otherwise materially distorts or impairs the ability of the recipients of their service to make free and informed decisions.’

Although this provision is rather brief, the article immediately goes on to state that the European Commission may issue further guidelines expanding on these rules in the future. Specifically:

‘The Commission may issue guidelines on how paragraph 1 applies to specific practices, notably: (a) giving more prominence to certain choices when asking the recipient of the service for a decision; (b) repeatedly requesting that the recipient of the service make a choice where that choice has already been made, especially by presenting pop-ups that interfere with the user experience; (c) making the procedure for terminating a service more difficult than subscribing to it.’

So, although the DSA only applies to certain kinds of business, and although the provisions about deceptive patterns are quite brief, the story isn’t over – we can expect more rules to come.

The DSA also contains some important provisions about risk assessments, audits and risk mitigation that will have a significant effect in the fight against deceptive patterns. These provisions only apply to VLOPs and VLOSEs – the giants like Amazon and Google. On the one hand, limiting the provisions to giants is quite practical: giant businesses have a big impact on EU citizens, and they’re profitable enough to absorb the regulatory burden of the extra work. On the other hand, smaller businesses escape these stringent requirements and can potentially get away with employing deceptive patterns – which underlines the challenge regulators face in policing deceptive design across platforms of all sizes. Here are some of the relevant provisions in the DSA:

  • Annual risk assessments are required, including assessment of deceptive patterns (article 34): The DSA requires VLOPs and VLOSEs to carry out risk assessments to work out what areas of their products are likely to break the rules of the DSA. The DSA includes rules about deceptive patterns, so this means they’ll have to investigate their own products and create documents explaining where there’s risk or presence of deceptive patterns. This will also shift some of the cost of investigation from the regulator to the business.
  • Risk assessment documentation must be provided to the authorities (article 34): All the supporting documents from the risk assessments have to be preserved for at least three years and given to the relevant authorities upon request. These documents will be a goldmine for regulatory investigators and enforcement officers.
  • External experts must carry out independent audits (article 37): In addition to the annual risk assessments, the business has to engage, at its own expense, an independent external organisation to carry out audits to ensure compliance with the DSA. These audits will include content about deceptive patterns and recommendations on how to get rid of them. The business must cooperate with the auditor, provide assistance and give them full access to its internal data.
  • Audit reports will be made publicly available (article 42): Audit reports are given directly to the authorities, then made available to the general public (though the public version may have commercially sensitive materials redacted). These independent audits are likely to be more objective and comprehensive than internal risk assessments.
  • Negative findings in the audit report must be actioned (article 37): Within one month of receiving recommendations in the audit report, the business has to adopt an implementation report that explains how they will make changes to comply with the DSA. This means that the independent auditors will play an important role in stamping out deceptive patterns.
  • Enforced by the member state or the European Commission (article 49): The DSA defines a new body, called a Digital Services Coordinator, which each member state in the EU will have to designate. The coordinator is responsible for all matters relating to the DSA in the member state. However, enforcement of the DSA can be carried out either at a state level or by the European Commission. This will prevent member states from using lax enforcement to entice VLOPs and VLOSEs to base their headquarters locally. If any member state is lax, the European Commission can step in.

In summary, the DSA’s risk assessment, audit and risk mitigation provisions are a really big deal in the fight against deceptive patterns – they force businesses and auditors to show when they’re using deceptive patterns, and where they might use them in the future. However, this is limited to just VLOPs and VLOSEs, so if a business squeaks in under the size criteria, they don’t have to do the risk assessments and audits described above.

All things considered, the DSA is a new and exciting addition to EU platform regulation. It packs a punch, too: fines of up to 6% of global turnover, risk mitigation measures, and even a ban from the EU in the case of repeated serious breaches.

The proposed Data Act

The Data Act was proposed in February 2022. If approved, it will apply to data sharing and data portability activities. It builds on the GDPR, providing more specific guidance for data sharing and portability. The Data Act aims to enable more data access and reuse among different actors, fostering innovation and competition and serving the public interest. Although the Data Act does not introduce a new definition or test for deceptive patterns, it contains provisions that forbid certain kinds of deceptive patterns. For example:

  • Obstructing users from exercising their data protection rights: A product cannot make it difficult for users to delete their accounts or transfer their data to another service provider by hiding the options in complex menus or requiring multiple steps to complete these actions.
  • False consent in sharing data with a third party: Any third party that receives data cannot coerce, deceive or manipulate the user in any way, by subverting or impairing the autonomy, decision-making or choices of the user, including by means of a digital interface with the user.

In summary, if it’s adopted, the Data Act will be a win for users because it will prevent companies from locking in their data and holding it hostage, and will instead allow users to take their data to a competitor. It also forbids the use of deceptive patterns to try to get around this (and various other data-related rules).

The UCPD guidance notice

Another notable piece of progress in the European Union is the recent Unfair Commercial Practices Directive guidance notice, which was published by the European Commission in December 2021.5 These sorts of guidance notices are not legally binding, but they effectively provide an instruction manual to the member states, explaining how to implement and use the directive. What’s remarkable about the guidance notice is that it has a section on ‘data driven practices and dark patterns’ (section 4.2.7), which explains how the directive forbids them. To quote:

‘If dark patterns are applied in the context of business-to-consumer commercial relationships, then the Directive can be used to challenge the fairness of such practices. […] any manipulative practice that materially distorts or is likely to distort the economic behaviour of an average or vulnerable consumer could breach the trader’s professional diligence requirements (article 5), amount to a misleading practice (articles 6–7) or an aggressive practice (articles 8–9), depending on the specific dark pattern applied.’

The guidance notice goes on to list numerous specific deceptive patterns that are forbidden under the UCPD:

  • Visual interference: ‘visually obscuring important information or ordering it in a way to promote a specific option’.
  • Obstruction: ‘e.g. one path very long, another shorter’.
  • Trick wording: ‘ambiguous language (e.g. double negatives) to confuse the consumer’.
  • Sneaking: ‘Default interface settings […] for example by using pre-ticked boxes, including to charge for additional services’.

Since the wheels of commercial law turn slowly, it’s quite possible that we haven’t yet seen the full impact of this guidance from the European Commission.

If we look at the UCPD guidance together with the DMA, the DSA and proposed Data Act, it’s quite obvious which way things are going. Deceptive patterns are a matter of focus for legislation in the EU, and we’re likely to see a lot of enforcement action in the coming years, followed by the tech industry waking up to the fact that they need to change their design practices.
