More about dark patterns …

Dark patterns are a subject close to our heart.

We’ve written about them before.

So, what are the latest developments, internationally and in Australia?

What are dark patterns?  

Firstly, what do we mean by ‘dark patterns’?

Dark patterns are design features and/or language used on websites and in apps that make it more difficult for users to provide informed consent about, or otherwise manage, their privacy. In other words, they are tricks that make you do things you didn’t mean to: buying or signing up for something, finding yourself unable to get out of something, or thinking you’ve switched something off when in fact you haven’t. Like that message “Only 1 seat available” – but is that really true?
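To see how little machinery such trickery needs, here is a purely illustrative sketch (the function and names are hypothetical, not taken from any real site) of a “fake urgency” countdown: the timer is not tied to any real deadline, it simply restarts whenever it runs out.

```javascript
// Illustrative sketch only: the internals of a "fake countdown" dark pattern.
// The displayed urgency is fabricated -- when the countdown reaches zero,
// it quietly resets, so the "offer" never really expires.
function makeFakeCountdown(seconds) {
  let remaining = seconds;
  return {
    tick() {
      remaining -= 1;
      if (remaining <= 0) {
        // A genuine deadline would end the offer here.
        // The dark-pattern variant restarts the clock instead.
        remaining = seconds;
      }
      return remaining;
    },
  };
}

const timer = makeFakeCountdown(3);
timer.tick(); // 2
timer.tick(); // 1
timer.tick(); // back to 3 -- the deadline was never real
```

The point of the sketch is that nothing in the page’s state connects the countdown to actual availability; the urgency exists only in the interface.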

The European Data Protection Board (“EDPB”) defines ‘dark patterns’ as “interfaces and user experiences implemented on social media platforms that lead users into making unintended, unwilling and potentially harmful decisions in regards to their personal data with the aim of influencing users’ behaviors”. The EDPB also defines six categories of dark patterns: (1) overloading, (2) skipping, (3) stirring, (4) hindering, (5) fickle, and (6) left in the dark, which are described in further detail in this blog post.

The common features of a ‘dark pattern’ are:

  • their manipulative or deceptive nature; and
  • the resulting negative or harmful outcome for the consumer.

A good list of some of the different types of dark patterns is available here.  The same site has put together a “Hall of Shame” including over 400 examples of dark patterns, many of which you’ll be familiar with.

We also describe some dark patterns in our blog post Dark Patterns: What are They and Why Should You Remove Them From Your Website.

What laws apply to dark patterns?

Dark patterns are potentially covered by a web of different laws. They may be regulated by consumer protection, marketing and spam laws – as well as privacy laws. A list of some of the laws that apply to the use of dark patterns is here.

 

EU Regulation

In the EU, regulators are being particularly active in taking on these practices. There are new EU laws aimed at regulating digital service providers and digital markets that specifically cover the use of dark patterns, and which work together with existing consumer protection laws that already extend to misleading and deceptive practices. For example, the new EU Digital Services Act prohibits dark patterns on online platforms. It complements the Unfair Commercial Practices Directive (UCPD) and the privacy protections under the GDPR, ensuring that no regulatory gap is left for platforms to manipulate users.

The EU Unfair Commercial Practices Directive prohibits unfair commercial practices affecting consumers’ economic interests before, during and after the conclusion of a contract. On December 29, 2021, the European Commission published guidance on the UCPD confirming that the UCPD covers dark patterns, and dedicating a section (4.2.7) to explaining how the relevant provisions of the UCPD can apply to data-driven business-to-consumer commercial practices.

Under privacy laws, the use of dark patterns raises questions of legitimacy of use, transparency and fairness.

The EU is taking further steps to protect consumers from dark patterns, including both enforcement and new laws.  Some of the new laws that will impact dark patterns further include:

  • The proposed AI Act, which sets out rules on the development, placing on the market, and use of artificial intelligence systems (“AI systems”) across the EU. While the AI Act is still undergoing the legislative process, the current proposal prohibits the use of dark patterns within AI systems;
  • The proposed Data Act, which aims to facilitate greater access to and use of data, such as allowing users to access and port to third parties the data generated through their use of connected products and services. As part of this, the third party that receives this data is under an obligation not to “coerce, deceive or manipulate the user in any way, by subverting or impairing the autonomy, decision-making or choices of the user, including by means of a digital interface with the user”. Recital 34 explains that this means that third parties should not rely on dark patterns when designing their digital interfaces, particularly in a way that manipulates consumers into disclosing more data.

EU enforcement activities include a recent sweep of EU online shopping websites; a press release notes that nearly 40% of the sites reviewed (148 out of 399) relied on manipulative practices to exploit consumers’ vulnerabilities or trick them (e.g., fake countdown timers, hidden information, and web interfaces designed to lead consumers to purchases, subscriptions or other choices). The relevant member states’ consumer protection authorities were to contact the traders concerned to rectify their websites, and to take further action if necessary.

 

US Regulation

In the USA, the Federal Trade Commission (FTC) has been most active in this space.

In September 2022, the FTC released a report showing how companies are increasingly using sophisticated design practices that can trick or manipulate consumers. The report highlighted the FTC’s efforts to combat the use of dark patterns in the marketplace.

For example, it referred to the FTC case against ABCmouse, where the FTC alleged the online learning site made it extremely difficult to cancel free trials and subscription plans despite promising “Easy Cancellation.” Consumers who wanted to cancel their subscriptions were often forced to navigate a difficult-to-find, lengthy, and confusing cancellation path on the company’s website and click through several pages of promotions and links that, when clicked, directed consumers away from the cancellation path.
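The asymmetry in that kind of “cancellation maze” can be pictured with a small hypothetical sketch (the flow names below are invented for illustration, not drawn from any real site): signing up is a single step, while cancelling routes the user through a stack of extra pages.

```javascript
// Hypothetical illustration of asymmetric friction in a "cancellation maze".
// Signing up is one step; cancelling is buried behind extra pages.
const signupFlow = ["confirm-purchase"];

const cancelFlow = [
  "find-buried-settings-link", // hard to locate in the first place
  "retention-offer",           // "Are you sure? Here's a discount!"
  "exit-survey",               // questions to answer before proceeding
  "final-confirmation",        // one more chance to give up
  "confirm-cancellation",
];

// A crude measure of friction: how many steps each journey takes.
function frictionSteps(flow) {
  return flow.length;
}

console.log(frictionSteps(signupFlow)); // 1
console.log(frictionSteps(cancelFlow)); // 5
```

A rough rule of thumb regulators keep returning to: cancelling should take no more effort than signing up did.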

The report also reiterated the agency’s commitment to taking action against tactics designed to trick and trap consumers.

Dark pattern regulation in Australia

Privacy Act

So, what is happening in Australia? While some attention has been given to dark patterns, particularly by the Australian Competition and Consumer Commission (ACCC), there has not been the sort of progress that’s been happening in the EU.

In its submissions to the Privacy Act review, the Office of the Australian Information Commissioner (OAIC) identified that “some APP entities operating online use so-called ‘dark patterns’ designed to nudge individuals to consenting to more collections and broader uses of personal information”, in a way which the OAIC considered could limit the usefulness of consent as a privacy protection.

The final report on the Privacy Act review makes some reference to dark patterns, particularly as they impact the ability to ‘opt-out’ of direct marketing. It proposes a right for individuals to opt out of targeted advertising, noting that it should be relatively easy for individuals to make this choice, taking a swipe at the sort of practices which make that difficult. Several ‘dark patterns’ are cited as being design features organisations should avoid when implementing opt-outs, including:

  • Repeatedly requesting individuals to make a certain choice.
  • Introducing barriers to service for individuals who elect to not receive targeted marketing.
  • Making the opt-out option more difficult or time-consuming to access.
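The first of those patterns – repeatedly re-asking after the user has already said no – comes down to whether a design treats a decline as final. A hypothetical sketch (the names and the `respectDecline` flag are invented for illustration):

```javascript
// Hypothetical sketch of the "repeatedly requesting" pattern above.
// A respectful design treats a decline as final; the dark-pattern
// variant ignores the stored decision and asks again on every visit.
function makeConsentPrompt({ respectDecline }) {
  let declined = false;
  return {
    decline() {
      declined = true;
    },
    shouldPrompt() {
      // Dark pattern: the user's recorded choice is never consulted.
      return respectDecline ? !declined : true;
    },
  };
}

const fair = makeConsentPrompt({ respectDecline: true });
fair.decline();
// fair.shouldPrompt() is now false: the choice is remembered.

const nagging = makeConsentPrompt({ respectDecline: false });
nagging.decline();
// nagging.shouldPrompt() is still true: the user is asked again and again.
```

The sketch makes the design choice explicit: the nagging variant stores the decline but simply never acts on it.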

However, there are no specific proposals to address the use of dark patterns, or to clarify where they are covered by the APPs. This is a little disappointing – and perhaps a missed opportunity.

Australian Consumer Law

The ACCC may be more inclined to act in this space.  It referred to dark patterns in its Digital Platforms Inquiry Final Report.

In its September 2022 interim report, the ACCC followed up, again referring to some of the online practices that are harming Australians online.

The ACCC’s pursuit of Google demonstrates its commitment to enforcement against these deceptive behaviours. In August 2022, following court action by the ACCC, the Australian Federal Court ordered Google to pay $60 million for making misleading representations to consumers about the collection and use of their personal location data on Android phones between January 2017 and December 2018. “This significant penalty imposed by the Court today sends a strong message to digital platforms and other businesses, large and small, that they must not mislead consumers about how their data is being collected and used,” ACCC Chair Gina Cass-Gottlieb said.

In April 2023, the ACCC announced it was intensifying its efforts to crack down on dark patterns, and the ways that digital platforms unfairly handle their customers.  The head of the ACCC has said that Australia should consider blanket legislation outlawing unfair trading practices, in order to stamp out new ‘dark patterns’ if and when they arise.  Probably too late for inclusion in the update of the Privacy Act?

So … while updates to the Privacy Act crawl along, it looks like the ACCC will take on the mantle of protecting individuals from the unfair practices increasingly being used by online sites and apps. If only Australia had a well-resourced privacy regulator to support these efforts.

Further reading:

Jodie is one of Australia’s leading privacy and security experts and the Founder of Privacy 108 Consulting.