September 01, 2014

Does the United States Have an Answer to the European Right to Be Forgotten?

Andrew R.W. Hughes

©2014. Published in Landslide, Vol. 7, No. 1, September/October 2014, by the American Bar Association. Reproduced with permission. All rights reserved. This information or any portion thereof may not be copied or disseminated in any form or by any means or stored in an electronic database or retrieval system without the express written consent of the American Bar Association or the copyright holder.

In May 2014, the European Court of Justice (ECJ) ruled that Google had an obligation to remove listings from its search results that revealed an individual’s personal data.1 The ruling has been hailed as a landmark for the so-called “right to be forgotten,” a European legal concept that allows individuals to have sensitive information about them “sink into oblivion.”2

The court’s decision stemmed from a request brought by Mario Costeja González, a Spanish man upset that a Google search of his name returned links to 16-year-old notices in a Spanish newspaper’s online archive concerning the foreclosure of his home.3 According to the ECJ, the information contained in those notices, by virtue of its age and sensitive nature, “appear[ed] . . . to be inadequate, irrelevant or no longer relevant, or excessive in relation” to Google’s purposes in indexing and retrieving the search results.4 As against González’s interest in keeping the details of his past private, the ECJ found that Google had failed to demonstrate “particular reasons substantiating a preponderant interest of the public in having, in the context of such a search, access to that information.”5

The ruling has excited considerable debate, with some commentators viewing it as a critical victory for the privacy rights of European Union (EU) citizens,6 and others warning that the ECJ ruling could lead to massive Internet censorship as Google and other search engines are flooded with demands to remove results.7 Unsurprisingly, the debate has not been confined to Europe, as American commentators have argued the pros and cons of the right to be forgotten, and its impact on privacy and free speech.8

But despite their differences, both supporters and detractors of the ECJ ruling agree that such a law is likely impossible in the United States.9 Putting aside cultural differences between the United States and Europe, the First Amendment, as presently understood, protects companies that publish true information lawfully obtained.10 Accordingly, for good or ill, we are unlikely to see something like the right to be forgotten take root in the United States.

However, as social media plays an increasingly important role in how individuals present themselves to the world—and as the first generation of children raised on Facebook and Twitter goes out into the professional world—striking a balance between openness on the one hand and privacy and reputational interests on the other will only grow as a concern. Faced with this dilemma, it may be insufficient to simply cry First Amendment and put one’s foot down.

One way forward, which legislatures and courts have begun to embrace, is to allow individuals to protect themselves by enabling them to better control how websites display the personal information that the users themselves post online. For example, a California law set to take effect next year will require websites and apps directed at or known to be used by minors to provide a mechanism for minor users to remove content they post.11 Under this new law, a 16-year-old who posts a thoughtless comment and then thinks better of it will, in certain circumstances, have a right to remove that comment before it becomes a permanent blemish on his or her reputation. The California law is a modest step: it does not require a website or app operator to remove content about a minor that was initially posted or has been reposted by a third party, and it does not entitle minors to request the deletion of content for which they have received compensation or other consideration.12 Moreover, most major social networking sites already allow users—minor or not—to remove content.13 Nonetheless, by providing minors a right, independent of contract, to control their publicly posted information, the California law suggests one intellectual property–like approach to protecting individual privacy and reputation online.

A similar approach can be seen in the recent case of Fraley v. Facebook, Inc.14 There, Facebook users (and putative class representatives) claimed that Facebook had violated their right of publicity by featuring their names and photos in “Sponsored Stories.”15 Before the court could rule on whether the users’ acceptance of Facebook’s terms of service and continued use of the website amounted to their consent to have their personas used in advertising, Facebook settled with the class. As part of the settlement, Facebook is required to “[c]reate a mechanism that will allow Class Members to view, on a going-forward basis, the subset of their interactions and other content on Facebook that have been displayed in Sponsored Stories (if any),” and “[d]evelop settings that will allow Class Members to prevent particular items or categories of content or information related to them from being displayed in future Sponsored Stories.”16

Both the California law and the Fraley settlement seek to allow individuals to protect their privacy and reputations online by giving them greater control over the content they post. In effect, they give individuals the right to depublish and limit republication of information, respectively. Taken together, they signal the potential for the development of a broader right to control how one presents oneself online.

As Fraley suggests, the right to control one’s online persona is related to the right of publicity. However, there are key differences between the two rights. Most notably, although the scope of the right of publicity varies by state, it generally protects individuals only against the unauthorized commercial exploitation of their identities.17 By contrast, the California “eraser button” law provides minors a right to remove content without regard to the commercial use of that content. Further, unlike the right of publicity,18 it extends beyond an individual’s likeness to encompass any content he or she posts. In this sense—the law’s emphasis on allowing individuals to control their own content—it bears some resemblance to copyright as well. But, as Fraley suggests, the right may also be extended to allow individuals to control ostensibly factual material that is not subject to copyright (such as, for instance, that Andrew Hughes likes the Chicago White Sox on Facebook). Further, unlike copyright, this new emerging right’s principal concerns are privacy and reputation, not incentivizing creative expression.

This nascent right addresses itself to much the same concern as the European right to be forgotten, but it is much more limited in scope. It only allows individuals to control what they say about themselves, not what others say about them. While this limits the scope of the right, it also means it does not raise the same serious First Amendment concerns as the European right.19 Less dramatically—but perhaps no less importantly—this difference in emphasis as compared with the European right also avoids potential issues with the broad immunity granted to social websites under the Communications Decency Act (CDA) and the policies that underlie that immunity.

Section 230 of the CDA provides that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”20 This immunity has been broadly construed to shield social websites from liability for content posted by third parties.21 The ECJ’s ruling, which puts the onus of protecting personal information on Google as the “controller” of the data,22 is, at the very least, an uncomfortable fit alongside this broad immunity. The American right to control one’s online persona, by contrast, treats an individual’s online persona as something like the individual’s property, which websites must manage at his or her request. It thus resembles the now-familiar notice-and-takedown procedures of the Digital Millennium Copyright Act.23 By requiring websites only to remove an individual’s content at the request of that individual, this right avoids the risk that websites can be held responsible for others’ conduct merely by providing a forum for online speech.24

This intellectual property–like right to control one’s online content is still in its infancy, and its exact contours will not be known for many years. If the right continues to develop, thorny questions will eventually have to be answered, such as whether people can do anything about things they post online that others repost,25 and whether people ought to have a right to know all of the information that websites know about them.26 What is clear, though, is that an approach that treats any effort to protect online privacy and reputation as a fundamental threat to freedom leaves individuals, particularly minors, open to the risk that their small indiscretions will haunt them forever. Because the Internet does not forget.


1. Case C-131/12, Google Spain SL v. Agencia Española de Protección de Datos (AEPD), ECLI:EU:C:2014:616 (May 13, 2014).

2. Id. ¶ 90; see, e.g., Google Loses “Right to Be Forgotten” Case, Al Jazeera Eng. (May 13, 2014) (article no longer available); On Being Forgotten, Economist (May 17, 2014); see also Jeffrey Rosen, The Right to Be Forgotten, 64 Stan. L. Rev. Online 88 (Feb. 13, 2012) (giving background on the right).

3. Case C-131/12, Google v. AEPD, ECLI:EU:C:2014:616.

4. Id. ¶ 94.

5. Id. ¶ 98. As the court noted, although González’s right to privacy overrode Google’s economic interest in publishing links to the notices and “the interest of the general public in finding that information upon a search” for González’s name, “that would not be the case if it appeared, for particular reasons, such as the role played by [González] in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question.” Id. ¶ 97.

6. See, e.g., Rory Cellan-Jones, EU Court Backs “Right to Be Forgotten” in Google Case, BBC News Eur. (May 13, 2014) (quoting EU Justice Commissioner Viviane Reding calling the decision “a clear victory for the protection of personal data of Europeans”); Gerd Leonhard, Twitter (May 14, 2014).

7. See, e.g., Index Blasts EU Court Ruling on “Right to Be Forgotten,” Index on Censorship (May 13, 2014) (“Today’s decision from the Court of Justice of the European Union violates the fundamental principles of freedom of expression. . . . This is akin to marching into a library and forcing it to pulp books. . . . It should send chills down the spine of everyone in the European Union who believes in the crucial importance of free expression and freedom of information.”). In the wake of the ruling, Google has launched a web page for EU citizens to submit takedown requests. See Search Removal Request under European Data Protection Law, Google (last visited June 13, 2014). A third strand of commentary has focused on the supposed unworkability of the ECJ’s ruling. See, e.g., Peter Coy, Europe’s “Right to Be Forgotten” Ruling Is Unforgettably Confusing, Bloomberg Businessweek (May 15, 2014).

8. See, e.g., Ordering Google to Forget, N.Y. Times, May 13, 2014 (indicating that the ECJ ruling “could undermine press freedoms and free speech”); Eric Posner, We All Have the Right to Be Forgotten, Slate (May 14, 2014) (arguing that the ECJ ruling struck a good balance between privacy and free speech).

9. See, e.g., Matt Ford, Will Europe Censor This Article?, Atlantic, May 13, 2014; Victor Luckerson, Americans Will Never Have the Right to Be Forgotten, Time, May 14, 2014; Posner, supra note 8.

10. See Fla. Star v. B.J.F., 491 U.S. 524 (1989).

11. Cal. Bus. & Prof. Code § 22581 (effective Jan. 1, 2015). The law also requires the operator of the website or app to notify minors of their right to request removal of any content. Id. § 22581(a)(3).

12. Id. § 22581(b).

13. See Elizabeth Barcohana, Rash California Minors Get an Online “Eraser Button,” Social Media Law Blog (Nov. 6, 2013) (article no longer available).

14. 830 F. Supp. 2d 785 (N.D. Cal. 2011).

15. Sponsored Stories are an advertising feature in which a business pays Facebook to create customized ads that display a user’s name and/or photo alongside a product or company that the user had “liked.” Id. at 791.

16. Frequently Asked Questions, Fraley v. Facebook, Inc. (last visited June 13, 2014).

17. Jonathan S. Jennings, Address at the ABA 2012 Annual Meeting: Right of Publicity Law Meets Social Media (Aug. 5, 2012) (article no longer available).

18. Sinatra v. Goodyear Tire & Rubber Co., 435 F.2d 711, 716 n.12 (9th Cir. 1970) (denying a claim by singer Nancy Sinatra for passing off stemming from the use of one of her recordings and noting that “[t]o the untrained ear the sound of the voice carried no recognition, and no confusion of source”).

19. There are potential scenarios in which requiring a website to remove user-generated content could violate the First Amendment. See Eric Goldman, California’s New “Online Eraser” Law Should Be Erased, Forbes (Sept. 24, 2013) (positing scenarios). These scenarios, however, differ from run-of-the-mill social media activity, and do not suggest any sort of fatal flaw in the California law.

20. 47 U.S.C. § 230(c). The CDA defines “interactive computer service” as “any information service, system, or access software provider that provides or enables computer access by multiple users to a computer server, including specifically a service or system that provides access to the Internet and such systems operated or services offered by libraries or educational institutions” and defines an “information content provider” as “any person or entity that is responsible, in whole or in part, for the creation or development of information provided through the Internet or any other interactive computer service.” Id. § 230(f).

21. See, e.g., Nemet Chevrolet, Ltd. v., Inc., 591 F.3d 250, 254 (4th Cir. 2009) (“[P]laintiffs may hold liable the person who creates or develops unlawful content, but not the interactive computer service provider who merely enables that content to be posted online.”); Chi. Lawyers’ Comm. for Civil Rights Under Law, Inc. v. Craigslist, Inc., 519 F.3d 666, 672 (7th Cir. 2008) (upholding summary judgment on claim that Craigslist violated the Fair Housing Act by permitting discriminatory housing ads and finding that “given § 230(c)(1) [one] cannot sue the messenger just because the message reveals a third party’s plan to engage in unlawful discrimination”).

22. Case C-131/12, Google Spain SL v. Agencia Española de Protección de Datos (AEPD), ECLI:EU:C:2014:616, ¶¶ 32–41 (May 13, 2014).

23. See Steven C. Bennett, The “Right to Be Forgotten”: Reconciling EU and US Perspectives, 30 Berkeley J. Int’l L. 161, 180 n.84 (2012) (suggesting that “the European Union and the United States might agree on some form of ‘notice-and-takedown’ of content approach, to shield website purveyors from unanticipated liability”); Cynthia Wong, Don’t Blame the Messenger: Intermediary Liability and Protecting Internet Platforms, IIP Digital (July 29, 2010) (suggesting notice-and-takedown as one approach to protect privacy online, but noting that “notice-and-takedown systems are often easily misused to silence critics, especially where it is difficult to assess whether the challenged content is actually unlawful”). The concern that a notice-and-takedown system may be abused, however, is diminished when an individual only has the power to take down his or her own content.

24. See Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997) (“The purpose of [§ 230] immunity is not difficult to discern. Congress recognized the threat that tort-based lawsuits pose to freedom of speech in the new and burgeoning Internet medium. The imposition of tort liability on service providers for the communications of others represented, for Congress, simply another form of intrusive government regulation of speech.”).

25. If a person posts a copyright-infringing video on YouTube and then another person reposts it, the reposter can be held liable for copyright infringement. Extending this analogy to the right to control one’s online persona, however, raises many of the same First Amendment concerns as the ECJ’s ruling.

26. See Jeffrey Rosen, The Web Means the End of Forgetting, N.Y. Times, July 21, 2010 (“A University of California, Berkeley, study released in April [2010] found that large majorities of people between 18 and 22 said there should be laws . . . that give people the right to know all the information Web sites know about them (62 percent) . . . .”). Similar to the Fair Credit Reporting Act, which requires credit reporting agencies to provide individuals with one free credit report per year so that they can monitor their credit history and challenge any negative or false information, this new right could include a requirement that websites provide their users, on demand, summaries of all of the information the websites have stored about them.

Andrew R.W. Hughes

Andrew R.W. Hughes is an associate at Pattishall, McAuliffe, Newbury, Hilliard & Geraldson LLP in Chicago, specializing in trademark and copyright law.