
Dark Patterns: Ill-Defined but Posing a Serious Risk

Andrew Nigrinis


Consumer protection agencies are raising the stakes for companies using so-called “dark patterns” in website or app design that may “trick” consumers into taking unintended actions or incurring unintended charges.

The Federal Trade Commission’s (FTC’s) Bureau of Consumer Protection, for example, announced in November 2022 a “record-breaking settlement” of $100 million with Vonage for its supposed use of dark patterns during its cancellation process. The Consumer Financial Protection Bureau (CFPB) is similarly zeroing in on supposed dark-pattern practices. One problem with this heightened scrutiny is the lack of clear definitions to help companies know if they are at risk. Still, companies that find themselves in the middle of a dark-patterns investigation can take steps to reduce their exposure, and companies can act now to try to avoid dark-pattern allegations in the first place.

What Is a Dark Pattern?

The term “dark patterns” first surfaced in 2010 as a metaphor for deceptive designs used in the user experience on digital platforms. These tricks in websites and apps influence customers to take unintended actions such as buying things, signing up for services, or agreeing to recurring fees. Dark-pattern enforcement is effectively the application of prior unfair and deceptive-practices regulations adapted to the digital space.

For example, when current CFPB Director Rohit Chopra was at the FTC, he characterized dark patterns as “the online successor to decades of dirty dealing in direct mail marketing.” A recent FTC staff report identified four specific activities that may fall into the dark-pattern definition:

  • Misleading consumers with ads disguised as something else
  • Making cancellation difficult
  • Burying key terms and junk fees
  • Tricking consumers into sharing data

The concern regulators have with these practices, intentional or not, is that they may be “used to deceive, steer, or manipulate users into behavior that is profitable for a company but often harmful to users or contrary to their intent.” Focusing on the digital environment potentially widens the scale of any investigation, given the wide array of potential data sources, from apps to web pages that may be scrutinized.

The CFPB, now led by former FTC commissioner Chopra, is also focusing on dark patterns and has announced cases alleging their use. In support of this focus, a CFPB job posting for technologists from July 2022 states, “Our technologists will use their background in user experience or product design to help detect and deter dark patterns designed to take advantage of people.” This focus on dark patterns, first mentioned in an April 2022 news release, signals a major policy change. Conduct that may fall under the umbrella definition of “dark patterns” may be better understood by examining some recent examples where the financial penalties were substantial.

Cancellation Terms: The Vonage Settlement

The FTC’s $100 million dark-patterns settlement with Vonage related to its cancellation process. The FTC’s investigation alleged four specific business practices at Vonage, a voice over internet protocol (VoIP) provider of telephone services for consumers and small businesses:

  1. Eliminating cancellation options: Consumers were offered many ways to sign up, but after 2017, the only method to cancel the service was by phone with an agent.
  2. Making the cancellation process difficult: The cancellation number was difficult to find, agents were available for limited hours, and callers were repeatedly transferred.
  3. Levying surprise fees: In many instances, an unexpected termination fee was charged.
  4. Continuing to charge customers after cancellation: Customers were charged post-cancellation fees and, after complaining, only offered partial refunds.

The FTC labeled these behaviors as dark patterns, digital versions of low-tech practices that violate Section 5 of the FTC Act and the Restore Online Shoppers’ Confidence Act (ROSCA), and argued that they violated the standard that canceling should be at least as easy as signing up.

Data Sharing Consent: Google and Location Data

In another recent dark-pattern case, Google agreed to a $401 million settlement with 43 state attorneys general, including the attorney general for the District of Columbia. The plaintiffs alleged that Google used dark patterns to collect users’ location data, and the alleged dark pattern centered on whether consumers understood how to control their settings to prevent that collection. The state AGs alleged that consumers gave this data to Google despite a desire not to do so and a belief that they had opted out. The DC settlement will require Google, among other things, to display additional pop-up notifications that explicitly inform consumers of data collection, along with a supporting web page for consumers to review.

Online Selling: Fortnite

The FTC and DOJ settled another matter with Epic Games, Inc. for $520 million in December 2022, of which $245 million related to allegations of dark patterns surrounding in-game purchases in the popular video game Fortnite made with an in-game currency called “V-Bucks.” The complaint describes issues with the game’s interface, including the placement of buttons, that made it easy for consumers to make quick purchases.

As in the Vonage case, Epic allegedly made it easy to make purchases in Fortnite but used dark patterns to make it difficult to reverse or undo charges. Additionally, customers who complained could find their accounts locked or frozen. This case suggests the FTC and DOJ may pair dark-pattern allegations with other violations, here the Children’s Online Privacy Protection Rule, when litigating against companies. The Fortnite matter may be a harbinger of future FTC investigations.

What Can Happen in a Dark-Patterns Investigation

A company under investigation for an alleged dark-pattern practice might receive a civil investigative demand (CID), a type of administrative subpoena. While a CID is a signal of a serious investigation that should involve the company’s consumer protection counsel, it is also an opportunity to understand what type of conduct the investigator is reviewing.

A CID may include requests for emails, other communications, policy and strategy documents, and usually data. The data request will normally reveal the type of dark patterns being investigated, as the agency will request data to show the efficacy of the alleged conduct. Supplemental data may also be requested later to assess any potential subversion of consumers’ intentions. These requests may focus on data such as usage statistics, customer complaints, the timing of quitting a service or charge, and the reasons for canceling a service or charge. The CID can also be an opportunity to present supplemental data that might help defend a company’s digital practices: the data just mentioned, if the regulator did not request it, or other data that clarify those practices, such as consumer experience, post-purchase behavior, or other products and services purchased.

Data analysis often focuses on proving the effectiveness of an alleged dark pattern in influencing consumers. If such an analysis has not already been conducted in anticipation of a CID or regulatory investigation, it may be beneficial for a company to do its own analysis during the investigation to assess any possible exposure. A regulator seeking to show a company’s liability for a dark pattern may well perform a similar analysis itself.

Consider the example of hitting a “yes” button, and assume two possibilities: the company’s actual site and a government-idealized, dark-pattern-free version of that site. A survey using mockups of both can produce two sets of click-through rates, which allows an analysis of whether there is a statistically significant difference between what the government deems an environment free of the alleged dark pattern and what the company is actually using.
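
To make this concrete, the comparison of the two click-through rates can be run as a standard two-proportion test. The sketch below is purely illustrative: the survey counts are hypothetical assumptions, not figures from any actual matter, and it relies on the statsmodels library.

    # Minimal sketch: compare "yes" click-through rates from two survey mockups.
    # All counts are hypothetical and for illustration only.
    import numpy as np
    from statsmodels.stats.proportion import proportions_ztest

    yes_clicks = np.array([100, 90])      # "yes" clicks: actual page, dark-pattern-free mockup
    respondents = np.array([1000, 1000])  # respondents shown each mockup

    # Two-sided z-test of whether the two click-through rates differ
    z_stat, p_value = proportions_ztest(count=yes_clicks, nobs=respondents)
    print(f"actual page: {yes_clicks[0] / respondents[0]:.1%}, "
          f"mockup: {yes_clicks[1] / respondents[1]:.1%}")
    print(f"z = {z_stat:.2f}, p = {p_value:.3f}")  # a small p-value suggests a statistically significant gap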

In practice, multiple alleged dark-pattern practices may be working in tandem, which could require a series of surveys or tests. An additional benefit of such a survey is to help define the space for negotiating damages in a potential settlement. If a company’s analysis of its own web page finds that 10% of its customers click “yes,” while an analysis of the regulator’s preferred page finds a 9% “yes” click rate, that one-percentage-point difference can have large ramifications for a damages model.
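
The damages point can be illustrated with back-of-the-envelope arithmetic. The figures below (customers exposed, click rates, and charge per “yes”) are assumptions chosen only to show how a one-percentage-point gap scales; they are not data from any of the cases discussed above.

    # Hypothetical illustration of how a one-percentage-point difference scales.
    customers_exposed = 1_000_000    # visitors who saw the page at issue (assumption)
    actual_yes_rate = 0.10           # observed on the company's own page (assumption)
    counterfactual_yes_rate = 0.09   # estimated for the regulator's preferred page (assumption)
    avg_charge = 30.00               # assumed average charge per "yes" click, in dollars

    excess_signups = customers_exposed * (actual_yes_rate - counterfactual_yes_rate)
    implied_exposure = excess_signups * avg_charge
    print(f"excess sign-ups attributed to the design: {excess_signups:,.0f}")       # 10,000
    print(f"implied exposure at ${avg_charge:.2f} each: ${implied_exposure:,.0f}")  # $300,000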

Proactive Steps Companies Can Take

Companies need not wait until they are under investigation to address whether they have dark-pattern issues. Before a company becomes a target, it can work with its consumer protection counsel to conduct objective periodic reviews designed to surface whether its consumer-facing websites and apps may lead customers into making unintended choices. Are the digital decision points transparent and unambiguous? Are the terms of service clear? Is it as easy to opt out or cancel as it is to place an order?

One resource a company can tap as part of a self-examination is the preliminary research that goes into designing its online order forms. A common tool companies use in this process is A/B testing, which involves showing consumers different versions of how options are presented to see which format works best. Depending on the outcome of these tests, the results could help a company demonstrate it sought to avoid influencing consumers to make unintended decisions.
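
As one illustration of how such A/B test results might be summarized for a review, the sketch below compares two hypothetical page variants using the rate of later refund requests as a rough proxy for unintended purchases. The variant names, counts, and the choice of proxy are assumptions for illustration only.

    # Hypothetical A/B test summary: refund-request rate per page variant,
    # with Wilson confidence intervals, as a rough proxy for unintended purchases.
    from statsmodels.stats.proportion import proportion_confint

    variants = {
        "A": {"orders": 5000, "refund_requests": 150},
        "B": {"orders": 5000, "refund_requests": 240},
    }

    for name, v in variants.items():
        rate = v["refund_requests"] / v["orders"]
        low, high = proportion_confint(v["refund_requests"], v["orders"],
                                       alpha=0.05, method="wilson")
        print(f"variant {name}: refund rate {rate:.1%} (95% CI {low:.1%} to {high:.1%})")

Keeping a record of this kind of comparison, along with the rationale for the design ultimately chosen, is one way a company could document that it weighed consumer intent in its design decisions.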

Conclusion

The definition of impermissible dark patterns will surely evolve with future regulatory investigations and legal judgments. In the meantime, dark patterns should be viewed as a successor to deceptive marketing practices transplanted to the digital space. As such, companies should step up their reviews of their digital presence and establish guidelines for responding to an investigation or CID from regulators. Investigators, after all, will follow their own established methods for determining whether they believe a particular digital-interface feature is a dark pattern. Knowing that, companies should act preemptively by putting themselves in the regulators’ shoes as well as in the consumers’ shoes as they assess their consumer-facing websites and apps.

This article was prepared by the Business Law Section’s Consumer Financial Services Committee.
