Among the clear contrasts between emerging U.S. data law and the legal fabric evolving in the wake of the EU's adoption of the GDPR is the important concept of a “right to be forgotten”: an ability to expunge any reference to an individual from a data set, whether governmental, commercial, or in the custody of some other institutional data steward. Divergent views and statutory expressions of data “ownership,” and the legal communities' responses to them, pose challenges.
February 28, 2020 Feature
The Right to Be Forgotten
By Dan Shefet
In a recent case decided in Germany (Constitutional Court, Nov. 6, 2019),1 the Right to Be Forgotten was granted to an individual who had committed murder in 1982 and was released from prison in 2002. The case illustrates the scope of the right to be forgotten.
The right to be forgotten has been an integral part of European law since the European Union Court of Justice's preliminary ruling of May 13, 2014.2 It was codified in the GDPR, which entered into effect on May 25, 2018, as article 17, which reads as follows:
Article 17. Right to erasure (“right to be forgotten”)
1. The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where one of the following grounds applies:
a. the personal data are no longer necessary in relation to the purposes for which they were collected or otherwise processed;
b. the data subject withdraws consent on which the processing is based according to point (a) of Article 6(1), or point (a) of Article 9(2), and where there is no other legal ground for the processing;
c. the data subject objects to the processing pursuant to Article 21(1) and there are no overriding legitimate grounds for the processing, or the data subject objects to the processing pursuant to Article 21(2);
d. the personal data have been unlawfully processed;
e. the personal data have to be erased for compliance with a legal obligation in Union or Member State law to which the controller is subject;
f. the personal data have been collected in relation to the offer of information society services referred to in Article 8(1).
2. Where the controller has made the personal data public and is obliged pursuant to paragraph 1 to erase the personal data, the controller, taking account of available technology and the cost of implementation, shall take reasonable steps, including technical measures, to inform controllers which are processing the personal data that the data subject has requested the erasure by such controllers of any links to, or copy or replication of, those personal data.
3. Paragraphs 1 and 2 shall not apply to the extent that processing is necessary:
a. for exercising the right of freedom of expression and information;
b. for compliance with a legal obligation which requires processing by Union or Member State law to which the controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller;
c. for reasons of public interest in the area of public health in accordance with points (h) and (i) of Article 9(2) as well as Article 9(3);
d. for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) in so far as the right referred to in paragraph 1 is likely to render impossible or seriously impair the achievement of the objectives of that processing; or
e. for the establishment, exercise or defence of legal claims.
The right to be forgotten is also found in other countries (Argentina, for instance3), and its modern European roots can be traced back to the work of Professor Viktor Mayer-Schönberger.4
The right may arguably have attained the level of a “Fundamental Right,” which means that it is deemed included in article 8 (protection of personal data) of the Charter of Fundamental Rights of the European Union.5 The Charter incorporates most of the rights in the European Convention on Human Rights into EU law, giving the European Union Court of Justice in Luxembourg jurisdiction over violations of such rights (in parallel with the European Court of Human Rights in Strasbourg6).
The idea of the right to be forgotten is very much inspired by concepts like forgiveness and rehabilitation in penal policy: once you have paid your debt to society, you should be given a second chance. Penal policy is not a question of vengeance but of rehabilitation. Alongside this idea, we find another notion taken from penal policy: the statute of limitations. In addition, a conviction will sooner or later (with very few exceptions) become spent and be stricken from the criminal record.
These well-embedded philosophical and ethical principles, however, are challenged on the Internet, where nothing is ever forgotten and certainly nothing forgiven. The right to be forgotten seeks to address this conflict between offline and online values and to ensure individuals a “second chance.” When the right was first introduced, Google in particular raised concerns about free speech and the right to information, but these concerns have proven largely exaggerated, and the exercise of the right to be forgotten has become uneventful.
Google quickly provided a standard form7 that individuals may submit with reference to article 17, requesting that specific URLs be delisted from search results on their name. Google will typically respond within a week, either granting or refusing the request (in the latter case, the matter may be submitted to the local data protection authority).
The judgment calls involved in these decisions may be challenging. Arguments against dereferencing and the right to be forgotten are often framed around a favorite example: a school hiring a bus driver has a legitimate interest in knowing whether the candidate has prior criminal convictions. Obviously, this line of reasoning fails to take into account that a potential employer still has the right to request a copy of the criminal record, and such information will be provided if the candidate consents. The employer remains free to draw its own conclusions if consent is withheld.
If a request for delisting is accepted, the search engine will no longer return the impugned content on searches of the data subject's name (the content will, however, still be accessible under other search criteria, and the content as such is not deleted). The following text will appear at the bottom of the results page: “Some information may have been deleted in accordance with EU law on data protection.” The removal notice is then recorded in the Lumen database (https://www.lumendatabase.org).
Sometimes it is difficult to determine whether a request is covered by article 16 or article 17. Article 16 deals with the right to rectification, i.e., inaccurate information, which often arises in defamation cases. The main practical difference between the two is that article 16 does not apply to search engines (which have no obligation to rectify) but only to the websites themselves (the difference between dereferencing and takedowns).
Article 16. Right to rectification
The data subject shall have the right to obtain from the controller without undue delay the rectification of inaccurate personal data concerning him or her. Taking into account the purposes of the processing, the data subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement.
Article 17 contains six different grounds on which delisting may be requested:
- The personal data are no longer necessary in relation to the search engine provider's processing (for instance, when the data have been removed from a public register).
- The data subject withdraws consent (and there is no other legal basis for the processing). Given that search engine processing is not based on the data subject's consent, this ground is largely theoretical; if consent is withdrawn vis-à-vis a particular publisher, the publisher should inform the search engine, and delisting would thereby (indirectly) be obtained.
- The data subject has exercised his or her right to object to the processing.
- The data have been unlawfully processed.
- The personal data have to be erased for compliance with a legal obligation.
- The personal data have been collected in relation to the offer of information society services to a child.8
There are exceptions, the most important being the so-called journalistic exception in article 85(2):
For processing carried out for journalistic purposes or the purpose of academic, artistic or literary expression, Member States shall provide for exemptions or derogations from Chapter II (principles), Chapter III (rights of the data subject), Chapter IV (controller and processor), Chapter V (transfer of personal data to third countries or international organisations), Chapter VI (independent supervisory authorities), Chapter VII (cooperation and consistency) and Chapter IX (specific data processing situations) if they are necessary to reconcile the right to the protection of personal data with the freedom of expression and information.
It could also be that the data subject has legitimate grounds relating to his or her particular situation.
This often gives rise to difficult judgment calls in which the right to privacy and reputation must be balanced against the right to information and freedom of expression. If the person in question has a role in public life and/or the information relates to his or her professional activities, it will be difficult to have such information delisted. If, however, the information relates to minor criminal offenses, or reflects an individual's personal opinion rather than verified fact, it is covered by the right.
How the Right to Be Forgotten May Challenge the Shield Agreement
The European Union Court of Justice (EUCJ) invalidated the Safe Harbor Agreement between the U.S. and the EU by its judgment of October 6, 2015.9 The main legal basis was the Agreement's failure to ensure “adequate” protection of European data subjects' data once it crossed the Atlantic. Such inadequacy became apparent in the wake of the Snowden revelations; the main challenge was surveillance, in particular mass surveillance. Not only is privacy protected under EU law (at the time by the 1995 Data Protection Directive), but it is also considered a “Fundamental Right” (i.e., mentioned as such in the Charter of Fundamental Rights).
The Safe Harbor Agreement was quickly replaced by a new agreement, the Shield Agreement (the EU-U.S. Privacy Shield), which entered into effect in 2016. Two new legal challenges to the current version of the U.S.-EU Shield Agreement were launched on September 16, 2016,10 and October 25, 2016.11 Both are based on Fundamental Rights; in other words, it is argued that the Shield Agreement does not provide adequate protection of these rights.
One may ask whether the right to be forgotten has acquired the status of a Fundamental Right and whether access to data amounts to a transfer of data, in which case failure to provide adequate protection to European data subjects could also expose the Shield Agreement. In Case C-311/18 (EUCJ), Data Protection Commissioner v. Facebook Ireland Limited, the Advocate General's recent opinion of December 19, 2019, corroborates this analysis:
The purpose of an adequacy decision is to find that the third country concerned ensures, as a result of the law and practices of that country, a level of protection of fundamental rights of the persons whose data are transferred essentially equivalent to that provided by the GDPR, read in the light of the Charter.12
The distinction between “access” and “transfer” of data is becoming increasingly artificial. The European Union Court of Justice decided some 15 years ago in the so-called Lindqvist case that “mere access was not transfer.”13 Much more recently, the Information Commissioner's Office in the UK issued guidance on the difference between transfer and access and seems to imply that mere access does indeed constitute transfer.14
If that is so, it may very well be that access in the U.S. to information on European data subjects in violation of article 16 and/or article 17 amounts to a violation of the existing Shield Agreement, because the territorial criterion would simply be that the data relate to a “European data subject.”
To the extent that access is deemed equivalent to data transfer, and a decision by a data protection authority or a court to take down content (article 16) involving a European data subject by virtue of Fundamental Rights (in spite of the journalistic exemption in article 85(2)) is not implemented in the U.S., it may be argued that adequate protection is not afforded and that global reach is a consequence of the Shield Agreement.
On a personal note, I may add that, as a practitioner of the GDPR in Europe, I regularly receive requests from “data subjects” living in the United States more or less desperately seeking solutions to the nefarious impact that wrong, misleading, vengeful, or outdated information has on their professional and private lives. Some of those clients are seriously contemplating moving to Europe simply to “get their life back.”
Irrespective of the territorial reach of the GDPR as defined by the EUCJ in its judgments of September 24, 2019,15 and October 3, 2019,16 or the possibility of extraterritorial reach based on the requirements of the Shield Agreement (or its successor), such extraterritoriality will continue to apply only to EU data subjects.
Endnotes
1. Agence France-Press, German Court Backs Murderer’s “Right to Be Forgotten” Online, The Guardian (Nov. 27, 2019), https://www.theguardian.com/world/2019/nov/28/german-court-backs-murderers-right-to-be-forgotten-online.
Prior to this case, the judgment of Apr. 13, 2018, from the High Court in London (NT1 & NT2 v. Google LLC [2018] EWHC 799 (QB), https://www.judiciary.uk/wp-content/uploads/2018/04/nt1-Nnt2-v-google-2018-Eewhc-799-QB.pdf) also applied the right to be forgotten to spent sentences.
2. Case C-131/12, Google Spain SL v. Agencia Española de Protección de Datos (AEPD) (May 13, 2014), available at http://curia.europa.eu/juris/document/document.jsf;jsessionid=FA8B76E27BD69D623C2AFF9CCF2DA9EE?text=&docid=152065&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=8461595.
3. Article 43 of the Argentine Constitution (1994):
Any person shall file this action to obtain information on the data about himself and their purpose, registered in public records or data bases, or in private ones intended to supply information; and in case of false data or discrimination, this action may be filed to request the suppression, rectification, confidentiality or updating of said data.
Edward L. Carter, Argentina’s Right to Be Forgotten, 27 Emory Int’l L. Rev. 23, http://law.emory.edu/eilr/content/volume-27/issue-1/recent-developments/argentinas-right-to-be-forgotten.html.
4. Viktor Mayer-Schönberger, Delete: The Virtue of Forgetting in the Digital Age 1–3 (Princeton Univ. Press 2009).
5. European Union, Charter of Fundamental Rights, art. 8, Protection of Personal Data (2000), available at https://fra.europa.eu/en/charterpedia/article/8-protection-personal-data.
6. On Apr. 5, 2013, the draft accession agreement of the European Union to the European Convention on Human Rights was finalized. EU Accession to the ECHR, Council of Eur., https://www.coe.int/en/web/human-rights-intergovernmental-cooperation/accession-of-the-european-union-to-the-european-convention-on-human-rights [last visited Dec. 19, 2019].
7. Google, EU Privacy Removal, Personal Information Removal Request Form, https://www.google.com/webmasters/tools/legal-removal-request?complaint_type=rtbf.
8. European Data Prot. Bd., Guidelines 5/2019 on the criteria of the Right to be Forgotten in the search engines cases under the GDPR (part 1), Dec. 2, 2019, available at https://edpb.europa.eu/sites/edpb/files/consultation/edpb_guidelines_201905_rtbfsearchengines_forpublicconsultation.pdf.
9. Case C-362/14, Maximillian Schrems v. Data Prot. Comm’r (Oct. 6, 2015), available at http://curia.europa.eu/juris/document/document.jsf?docid=169195&doclang=EN.
10. Case T-670/16, Action brought on 16 September 2016—Digital Rights Ireland v. Commission, 2016 O.J. (C 410) 26 (Nov. 7, 2016), available at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:62016TN0670.
11. Case T-738/16, Action brought on 25 October 2016—La Quadrature du Net and Others v. Commission, 2016 O.J. (C 6) 39 (Jan. 9, 2017), available at https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX:62016TN0738.
12. Court of Justice of European Union Press Release 165/19, Advocate General’s Opinion in Case C-311/18 Data Protection Commissioner v Facebook Ireland Limited, Maximillian Schrems (Dec. 19, 2019), available at https://curia.europa.eu/jcms/upload/docs/application/pdf/2019-12/cp190165en.pdf.
13. Case C-101/01, Reference to the Court under Article 234 EC by the Göta hovrätt (Sweden) for a preliminary ruling in the criminal proceedings before that court against Bodil Lindqvist (Nov. 6, 2003), available at http://curia.europa.eu/juris/document/document.jsf?text=&docid=48382&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=8470865.
14. Guide to the General Data Protection Regulation (GDPR), Info. Comm’r’s Office, https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/international-transfers [last visited Dec. 18, 2019].
15. Case C-136/17, GC, AF, BH, ED v. Commission nationale de l’informatique et des libertés (CNIL) (Sept. 24, 2019), available at http://curia.europa.eu/juris/document/document.jsf?docid=218106&doclang=EN.
16. Case C-18/18, Eva Glawischnig-Piesczek v. Facebook Ireland Limited (Oct. 3, 2019), available at http://curia.europa.eu/juris/document/document.jsf?text=&docid=218621&pageIndex=0&doclang=EN&mode=lst&dir=&occ=first&part=1&cid=52514.