The Dark Patterns Origin Story
In 2010, the term “dark patterns,” like many good colloquialisms, was coined by an Englishman. That gentleman, Harry Brignull, holds a Ph.D. in cognitive science and is an independent user experience consultant. He currently runs the website “darkpatterns.org” and provides consultancy services to numerous companies. He was also a participant in the FTC’s recent dark patterns workshop, discussed below. In perhaps the clearest sign that dark patterns have gone mainstream, he now advertises his work as an expert witness for “dark patterns and deceptive user experiences in digital products.” Darkpatterns.org defines dark patterns broadly (and sinisterly) as “tricks used in websites and apps that make you do things that you didn’t mean to, like buying or signing up for something.”
So what does this mean as a practical matter? As is often the case with slick marketing, the use of the term dark patterns is in part an effort at rebranding. Several of the practices the website defines as dark patterns are ones that practitioners will recognize as practices long thought to be unlawful. For example, bait and switch and disguised ads are concepts long familiar to consumer protection practitioners. Other types of dark patterns are worded in such a way that they might describe a traditional consumer protection violation, but it is hard to know for sure. For example, the “trick questions” dark pattern might describe a misleading implied claim, depending upon the manner in which and the extent to which consumers are actually “tricked.” “Sneak into Basket” might describe the practice of charging a consumer for a good or service without providing adequate notice and obtaining consent, depending upon whether disclosure language is provided. Finally, other dark patterns may be familiar as tools used by marketers, but their inclusion as practices that may also be unlawful is more surprising.
For example, “confirmshaming” is defined as “the act of guilting the user into opting into something. The option to decline is worded in such a way as to shame the user into compliance.” To date, at least, the FTC has not articulated shaming or embarrassment as a form of deception. There is also a dark pattern labeled “hidden costs” that relates to unexpected charges that appear at the last step of the checkout process. While there is currently litigation and debate over the inclusion of certain costs such as resort fees at the end of the checkout process, it would surprise nearly everyone to see “taxes” and perhaps even “delivery charges” treated as a type of dark pattern “hidden cost.”
The concept of dark patterns has also begun to appear in academic and other literature. In April 2019, Purdue University’s UX Pedagogy and Practice Lab created a website to discuss dark patterns. The Financial Times published an article in May 2019 entitled “When Manipulation is the Digital Business Model,” while the New York Times published a similar article in May 2016 entitled “When Websites Won’t Take No for an Answer.” Finally, Professors Luguri and Strahilevitz of the University of Chicago authored a paper entitled “Shining a Light on Dark Patterns,” first posted in 2019 and later published in 2021.
Regulatory Appearance
Given the increasing prominence of dark patterns in the academic and policy worlds, it was only a matter of time before the concept crossed over into the regulatory world. The term “dark patterns” appears to have first transitioned from the ivory towers of academia to the penal towers of the regulatory world in the FTC’s 2020 $10 million settlement with ABCmouse, a children’s online subscription-based learning site operated by Age of Learning, Inc. According to the FTC’s complaint, the company allegedly failed to adequately disclose that its twelve-month and 30-day free trial memberships would automatically renew unless cancelled. Although ABCmouse also promised “easy cancellation,” cancellation was allegedly anything but easy. Consumers who tried to cancel through email, phone, or customer support were instead directed to the company’s online cancellation mechanism. The FTC also alleged that the cancellation link was difficult to find and not clearly marked. Further, the FTC alleged that consumers wishing to cancel had to first navigate six to nine screens, many with links that would take consumers out of the cancellation path and many of which offered consumers inadequate information regarding how to continue the cancellation process.
Then-FTC Commissioner Chopra issued a separate statement regarding the case and the alleged use of dark patterns. Similar to darkpatterns.org, he described dark patterns as “design features used to deceive, steer, or manipulate users into behavior that is profitable for an online service, but often harmful to users or contrary to their intent.” He went on to describe them as digital tricks and traps that “involve an online sleight of hand using visual misdirection, confusing language, hidden alternatives or fake urgency to steer people toward or away from certain choices.” With regard to the FTC’s settlement with ABCmouse, he highlighted the company’s practices as falling within the “roach motel” dark pattern where it is easy to get in but almost impossible to escape. At the same time, Commissioner Chopra took to his official Twitter account and urged designers to come forward and file confidential complaints with the FTC concerning the use of dark patterns.
Not to be outdone, in March of this year, California hopped onto the dark patterns bandwagon by adopting a new regulation for its infamous California Consumer Privacy Act (“CCPA”) that prohibits the use of certain dark patterns. The newly adopted regulation prohibits businesses from using an opt-out method “that is designed with the purpose or has the substantial effect of subverting or impairing a consumer’s choice to opt-out.” The regulation then provides the following illustrative examples:
- The business’s process for submitting a request to opt-out shall not require more steps than that business’s process for a consumer to opt-in to the sale of personal information after having previously opted out. The number of steps for submitting a request to opt-out is measured from when the consumer clicks on the “Do Not Sell My Personal Information” link to completion of the request. The number of steps for submitting a request to opt-in to the sale of personal information is measured from the first indication by the consumer to the business of their interest to opt-in to completion of the request.
- A business shall not use confusing language, such as double-negatives (e.g., “Don’t Not Sell My Personal Information”), when providing consumers the choice to opt-out.
- Except as permitted by these regulations, a business shall not require consumers to click through or listen to reasons why they should not submit a request to opt-out before confirming their request.
- The business’s process for submitting a request to opt-out shall not require the consumer to provide personal information that is not necessary to implement the request.
- Upon clicking the “Do Not Sell My Personal Information” link, the business shall not require the consumer to search or scroll through the text of a privacy policy or similar document or webpage to locate the mechanism for submitting a request to opt-out.
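The first of these examples amounts to a simple symmetry test: opting out may not take more steps than opting back in. The comparison can be sketched as follows, with the caveat that the flow steps, names, and function below are hypothetical illustrations of our own; nothing in the code is drawn from the regulation beyond the step-counting rule itself.

```python
from dataclasses import dataclass, field

@dataclass
class OptFlow:
    # Ordered user actions, from the first click to completion of the request
    steps: list = field(default_factory=list)

def violates_symmetry(opt_out: OptFlow, opt_in: OptFlow) -> bool:
    """True if opting out requires more steps than opting back in."""
    return len(opt_out.steps) > len(opt_in.steps)

# Hypothetical flows for illustration only
opt_out = OptFlow(steps=[
    "Click the 'Do Not Sell My Personal Information' link",
    "Confirm identity",
    "Click through a 'reasons to stay' page",  # itself barred by the regulation
    "Submit the request",
])
opt_in = OptFlow(steps=["Click 'Allow sale'", "Submit"])

print(violates_symmetry(opt_out, opt_in))  # → True: four steps out, two back in
```

A real compliance review would of course turn on how “steps” are counted under the regulation, not on a line count in code, but the underlying rule is that mechanical.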
Although the regulation does not specifically mention dark patterns, the California Attorney General noted in a press release that the regulation bans “so-called ‘dark patterns’ that delay or obscure the process for opting out . . . .”
Additionally, Senators Warner and Fischer introduced legislation in 2019 that would have banned certain dark patterns, in particular the manipulation of consumers into providing personal information and the design of online products that lead to compulsive usage by children. Senator Warner spoke at the recent FTC Dark Patterns workshop and specifically referred to the proposed legislation, which never passed, as an effort to combat dark patterns.
FTC Workshop
It is not surprising, therefore, that when Democrats took control of the FTC following the election of President Biden, all of these roads led to a half-day workshop, held on April 29 of this year, entitled “Bringing Dark Patterns to Light.” Then-Acting Chairwoman Slaughter opened the workshop by providing a broad definition of dark patterns, similar to that previously articulated by Mr. Brignull and Commissioner Chopra: “user interface designs that manipulate consumers into taking unintended actions that may not be in their interest.” The workshop was then divided into five panels. The first panel discussed how to define dark patterns and why they are employed, and featured Mr. Brignull as well as numerous academics. The second panel explored how dark patterns affect consumers and likewise featured numerous academics. The next two panels looked at how dark patterns impact communities of color and target kids and teens, while the final session looked at potential strategies for dealing with dark patterns.
With respect to next steps, workshop participants urged numerous courses of action, including further study by the FTC, the issuance of guidance, bringing enforcement actions, and the passage of additional legislation. While there has, as yet, been no formal follow-up from the workshop, it is clear that dark patterns are still very much on the FTC’s mind. For example, in her keynote address at the October 2021 National Advertising Division Annual Conference, Commissioner Slaughter spoke out against dark patterns with respect to opting out of data collection. Additionally, the FTC’s recent Enforcement Policy Statement was touted in a press release as fighting “illegal dark patterns that trick or trap consumers into subscriptions.” As discussed in more detail below, the Enforcement Policy Statement takes aim at the fairly common practices of using hyperlinks to make disclosures and efforts to “save the sale” or collect information from consumers before permitting them to cancel subscriptions. Practitioners in this area are also aware of numerous FTC investigations focused on practices that could be classified as dark patterns. Before the FTC proceeds too far down this path, we believe that the agency should undertake further study and clarify which particular uses of dark patterns may be problematic.
Questions Surrounding Targeting Dark Patterns for Enforcement Actions
It is one thing to criticize the use of dark patterns as a policy matter and quite another to suggest that they are all unlawful. To date, however, the FTC has broadly condemned dark patterns without indicating whether it believes all of the types of dark patterns identified by darkpatterns.org and other academics violate Section 5. This difference between practices criticized by policymakers and those that are actually unlawful can be readily seen in the green marketing space. Numerous marketing practices are condemned as a policy matter by environmental activists as “greenwashing.” Some of them are in fact violations of Section 5 of the FTC Act and the FTC’s Green Guides. For example, one type of greenwashing is “the sin of fibbing,” which applies to environmental claims that are false, such as falsely claiming Energy Star certification. There is little doubt that making false environmental claims is a violation of Section 5. Other types of greenwashing may not actually be unlawful. For example, activists might argue that it is greenwashing to market an SUV as having the best fuel economy in its class because, no matter how good an SUV’s fuel economy might be, it still guzzles gas compared to less thirsty compact and subcompact vehicles. This is commonly referred to as “the sin of lesser of two evils.” Yet, as long as the comparative claims are true, there is nothing unlawful about advertising an SUV’s fuel economy as best in class. When thinking about policy concerns versus legal requirements, it is important to bear in mind that the FTC, unlike some other federal agencies, typically cannot implement its preferred policy outcomes, a point it has made frequently with regard to environmental claims. Thus, as much as the FTC may dislike certain practices, the FTC’s authority under Section 5 is limited to attacking practices that are “deceptive” or “unfair.” Each of these terms in turn has specific requirements.
A deceptive practice, for example, must be one that misleads a meaningful number of “reasonable” consumers, while an “unfair” practice must be one that causes substantial injury that is not outweighed by offsetting consumer or competitive benefits and that consumers could not reasonably avoid.
To date, as can be seen above, the FTC has simply repeated the broad definition of dark patterns articulated by policy advocates—user interfaces that harm consumers by manipulating them to take actions they did not intend. The difficulty with such a broad definition is that while there are certainly practices falling within the broad umbrella of dark patterns that clearly violate Section 5, there are likely others that traditionally have not been viewed as unlawful and perhaps many that fall somewhere in between. Some dark patterns may be nothing more than effective marketing while others may reflect practices that have only been condemned by the FTC in more extreme circumstances. Condemning them in their entirety, however, would likely require the FTC to rethink its existing guidance and interpretations with respect to these practices.
There was, unfortunately, very little discussion during the workshop as to how one differentiates between lawful and unlawful dark patterns, though one participant during the final panel did acknowledge that certain practices labeled as dark patterns may enjoy robust First Amendment protection and therefore may be difficult to legally attack.
As that panelist put it:
“Some of the dark pattern strategies, though, do fall into grayer areas. And so there’s just not a lot of case law on whether regulating obstruction would run aground of commercial free speech protections under the Central Hudson line of cases. Or is nagging protected by the First Amendment as a sales strategy? We just don’t have a lot of precedent there. We know that under certain circumstances, nagging isn’t protected by the First Amendment. Someone who asks out a coworker on a date once, probably protected, unless there’s a power disparity there. Someone who asks out a co-worker repeatedly despite refusals clearly isn’t exercising their free speech rights. They’re engaged in sexual harassment, if the requests are pervasive. So we do have some guideposts that we can look to from other areas of law. But I do think the First Amendment issues surrounding the regulation of obstruction, of nagging, of confirm shaming, and of certain kinds of subtle visual interference, those are questions that the FTC should spend some time thinking about and consulting with First Amendment experts as they try to regulate in this area.”
If the FTC intends to move forward with enforcement actions targeting dark patterns (as it almost certainly does), it will be important for the agency to specifically define which types of dark patterns it views as unlawful and which may simply be undesirable, and to articulate, for practices that may sometimes be unlawful, how one draws the line between what is permissible and what goes too far. As the FTC recently noted in its ill-fated argument to the Supreme Court in the AMG case, companies should understand what the FTC Act requires of them: “Courts are, of course, bound by principles of constitutional due process and notice. And if the court concludes that the [company] couldn’t possibly understand what was required of him, it will find that a remedy is not available.”
Providing clear guidance to advertisers and marketers, most of whom want to conduct themselves in a lawful manner, is particularly important with respect to dark patterns because, as the workshop demonstrated, many practices that were condemned by panelists as dark patterns are commonly used marketing techniques heretofore understood to be lawful. If such practices are now to be condemned as unlawful, in whole or in part, it is important to provide a clear understanding of the scope of and rationale for such condemnation. It is not only the FTC that should provide such guidance. Former FTC Commissioner Chopra is the newly confirmed head of the CFPB. Similar to the FTC, the CFPB has jurisdiction over deceptive and unfair, as well as abusive, marketing practices. Given that Director Chopra in his prior role as an FTC Commissioner spoke out numerous times against the use of dark patterns, it seems virtually certain that the CFPB will regulate to some degree the use of dark patterns in the financial services industry. One would hope that the FTC and CFPB will work together to regulate dark patterns in a uniform and consistent manner.
Below we address three key areas where guidance from the FTC, CFPB, or both agencies would be particularly welcome.
When does the use of advertising to “manipulate” consumers become an unlawful dark pattern?
The various definitions of “dark patterns” all share the common idea that consumers are manipulated into making choices that they might not otherwise have made, and that this choice results in some form of harm to the consumer. At first blush, manipulating consumers into purchasing a good or service that they hadn’t intended to purchase may sound terrible. But couldn’t this definition include much of traditional advertising? Doesn’t advertising often try to persuade consumers to purchase something they hadn’t intended to? Sometimes advertisers might accomplish that merely by listing a product’s or service’s performance characteristics or other objective criteria, but isn’t this also often accomplished through emotional appeals or other psychological forms of persuasion such as the use of certain colors or images, placement of text, or the location of goods in a brick and mortar store? After all, why does advertising so often include adorable animals or attractive people or patriotic themes? Sometimes advertising makes us feel good, and then we feel good about the goods or services the advertiser sells. And what about celebrity influencers or endorsers? Is it objectively meaningful that one of the Kardashians uses a product or is that endorsement intended to appeal emotionally or psychologically to the desire of people to emulate someone famous? Certainly, these techniques are intended in part to persuade a consumer to purchase something she otherwise did not intend to purchase and that unintended purchase could be viewed as a form of “harm” since a potentially unnecessary financial transaction has occurred. But has the consumer been manipulated in an unlawful sense or just artfully persuaded?
Of course, the increased extent to which data can be gathered with respect to individual consumers and the rise of digital advertising have likely accelerated the ability of advertisers to create personalized digital ads that feature images, colors, or content designed to specifically appeal emotionally or psychologically to their intended target. At what point, if at all, does the use of such techniques become unlawful? In what was viewed at the time as science fiction, the 2002 movie “Minority Report” featured Tom Cruise in a plot involving “precogs” who could so accurately predict the future behavior of people that individuals were arrested before they could commit a crime. Suppose someday soon advertisers could acquire so much data about individuals that sophisticated algorithms could accurately predict exactly what advertising techniques would persuade a consumer to purchase the advertiser’s good or service. Many of us might agree that the use of data and algorithms to that extent goes too far in overriding consumer autonomy and free will. Yet if showing the occasional cute puppy is okay but presenting a consumer with a digital advertisement so full of their personal emotional and psychological triggers that the consumer all but loses her autonomy goes too far, where does one draw the line between these two extremes? The FTC’s focus on dark patterns raises important questions regarding consumer manipulation but also creates an incredibly difficult line-drawing exercise.
In a related vein, when, if ever, is it relevant that a particular technique will increase the rate of consumer purchase or selection? Suppose market research shows that consumers are more likely to purchase an article of clothing described online as “forest green” rather than just green. Is the use of forest green then a dark pattern? What about the grocery store that puts the discounted diapers at the back of the store so that sleep-deprived, financially stretched parents might pick up additional items on their way to and from the back of the store? Is this a non-digital dark pattern? What about the order in which options are presented? Consumer surveys typically rotate responses to questions because of a belief that order can bias response selection. Checkout pages for goods or services, however, that may present consumers with a variety of potential add-ons don’t rotate whether the first option is to accept or reject the additional purchase. Is it an unlawful dark pattern if the advertiser presents the desired selection in an order that makes it more likely to be selected?
One of the workshop participants, Professor Strahilevitz, presented published research on the use of dark patterns and how they might impact consumer choice. In the control group, respondents were told (based upon their responses to various questions) that they had been identified as very concerned about their privacy and were given the option to accept or reject a privacy protection program. There were also two test groups, described as involving mild and aggressive dark patterns. In the mild group, the “accept” choice was now in red and labeled as the “recommended” option. In addition, respondents who did not select “accept” were then asked to confirm that they “do not want to protect my data or credit history.” In the aggressive dark patterns group, respondents went through the same initial sequence, but if they continued to reject the privacy protection program they were shown up to three screens with information about the prevalence of identity theft and the problems it causes, and each screen could only be bypassed after ten seconds had gone by. The results are probably not surprising. In the control group, only 11% of respondents accepted the privacy protection program; that percentage more than doubled to 25% in the mild dark patterns group, and rose to at least 37% in the aggressive dark patterns group.
This research demonstrates that these techniques are effective, but what, if anything, should these conclusions tell us about whether the use of these and similar practices is unlawful? As described above, advertisers have from the very beginning utilized techniques designed to impact consumer choice and cause them to buy more of the advertised product. Indeed, increased sales is the very hallmark of a successful advertising campaign. Which brings us back to the standards of “deception” and “unfairness.” Is it enough to argue that these techniques are not unlawful because they do not mislead? How much burden should consumers continue to shoulder to read advertising carefully? Darkpatterns.org faults advertisers for taking advantage of consumers who “don’t read every word on every page” but who rather “skim read and make assumptions.” Does this mean that every advertiser who takes advantage of that fact has violated Section 5 or do consumers “skim” and “make assumptions” at their own peril? Darkpatterns.org identifies the following as a form of “trick question” because the first choice is “opt out” and the second is “opt in.”
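The mixed-polarity trap underlying that “trick question” can be made concrete in code. In the sketch below, the checkbox wording and the helper function are entirely hypothetical illustrations of our own, not taken from darkpatterns.org; the point is simply that a skim-reader who leaves every box unticked does not get the outcome she assumes.

```python
# Hypothetical "trick question" form: two checkboxes with opposite polarity.
# A skim-reader assumes an unticked box always means "no marketing."
def receives_marketing(box_ticked: bool, polarity: str) -> bool:
    """Return True if the consumer ends up on the marketing list."""
    if polarity == "opt_out":   # ticking this box DECLINES marketing
        return not box_ticked
    return box_ticked           # "opt_in": ticking this box ACCEPTS marketing

# A hurried consumer leaves both boxes unticked:
print(receives_marketing(False, "opt_out"))  # → True: still gets marketing
print(receives_marketing(False, "opt_in"))   # → False
```

The same default action (doing nothing) thus produces opposite results for the two boxes, which is precisely why the pattern punishes consumers who “skim read and make assumptions.”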