February 28, 2020 Feature

Case Developments in Data Law

By Rick Aldrich

“Data, data everywhere, but not a thought to think.”

—John Allen Paulos, professor of mathematics

Data has increasingly become the currency of the day. Some 2.7 zettabytes1 comprise the existing digital universe, and it is constantly growing. Google searches are performed at the rate of 3.8 million per minute. Facebook posts are created at the rate of 3.3 million per minute.2 Data is also increasingly added by wearable technology (such as activity trackers, smart watches, GPS devices), smart speakers (such as the Amazon Echo, Google Home, or Apple HomePod), connected vehicles, and an increasing array of Internet of Things (IoT) devices for the home (such as thermostats, refrigerators, light bulbs, entertainment systems), plus much more. An important subset of the IoT is the Industrial Internet of Things (IIoT), which includes industrial control systems, supervisory control and data acquisition (SCADA) devices managing critical infrastructure services, and cyber-physical systems. With such a proliferation of data, the law relating to data has also had to evolve. This article briefly examines some of the most important recent case developments in a few key areas of data law.

Data Protection Laws

One of the most significant developments in data protection law was the entry into force of the European Union’s (EU) Data Protection Directive3 in October 1998 and later its more comprehensive successor, the General Data Protection Regulation (GDPR),4 on May 25, 2018. Two significant cases have emerged from the latter so far. The first was Google v. Commission Nationale de l’Informatique et des Libertés (CNIL).5 The CNIL is an independent French administrative regulatory body whose mission is to ensure that data privacy laws are applied to the collection, storage, and use of personal data. It has the authority to impose fines or issue injunctions. Under the “right to be forgotten” provision6 of the GDPR, the CNIL ruled that Google had to remove links to a person’s personal data from all of Google’s domains worldwide, also known as de-referencing.7 Google instead implemented an approach that delisted results only on EU domains (e.g., Google.de, Google.fr). Google also proposed “geo-blocking,” a technical approach that blocks delisted search results from users located within the EU even when they search from a non-EU domain. The CNIL found the approach noncompliant and fined Google €100,000. Google appealed to the Court of Justice of the European Union (CJEU) seeking a ruling on whether it was required to de-reference personal data on all its domains worldwide.
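The difference between the two approaches can be illustrated with a minimal sketch. Everything here is an assumption for illustration only: the domain list, the country codes, and the function names are hypothetical, not Google’s actual implementation.

```python
# Hypothetical sketch of the two delisting approaches described above.
# EU_DOMAINS and EU_COUNTRIES are small illustrative samples, not
# exhaustive lists, and nothing here reflects Google's real systems.

EU_DOMAINS = {"google.de", "google.fr", "google.ie"}   # sample EU ccTLD domains
EU_COUNTRIES = {"DE", "FR", "IE", "AT"}                # sample EU country codes

def delisted_by_domain(domain: str) -> bool:
    """Google's first approach: suppress the link only on EU domains."""
    return domain.lower() in EU_DOMAINS

def delisted_by_geoblock(domain: str, user_country: str) -> bool:
    """Geo-blocking: additionally suppress the link for any user located
    in the EU, regardless of which domain they searched from."""
    return delisted_by_domain(domain) or user_country.upper() in EU_COUNTRIES

# A French user searching google.com escapes domain-based delisting...
assert not delisted_by_domain("google.com")
# ...but not geo-blocking.
assert delisted_by_geoblock("google.com", "FR")
```

Under domain-based delisting alone, an EU user could simply switch to Google.com to see the delisted results; geo-blocking closes that loophole by keying on the user’s inferred location rather than the domain searched.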

The CJEU ruled that “a search engine operator cannot be required . . . to carry out a de-referencing on all the versions of its search engine.”8 It reached this conclusion after holding that the right to be forgotten “is not an absolute right” and is subject to balancing. In what appeared to be a nod to potential conflicts of law issues, the court noted that

the balance between the right to privacy and the protection of personal data, on the one hand, and the freedom of information of internet users, on the other, is likely to vary significantly around the world.9

While many hailed the decision as an appropriate limitation on the law’s reach, others expressed concern about a later part of the decision indicating that while “EU law does not currently require that the de-referencing granted concern all versions of the search engine in question, it also does not prohibit such a practice.”10

Indeed, just about 10 days later, the same court availed itself of the door it had left open. In Glawischnig-Piesczek v. Facebook Ireland Limited,11 the plaintiff, an Austrian politician, requested that Facebook remove what she considered to be disparaging posts associated with her image. The post and its associated comments called her a “lousy traitor,” a “corrupt oaf,” and a member of a “facist party.”12 Facebook refused the request, so Ms. Glawischnig-Piesczek sued in the Vienna Commercial Court and won a judgment that required Facebook to remove not just the offending post, but also all “identical” and “equivalent” posts. Facebook removed only the original post and only for users in Austria. Both parties appealed to the Austrian Supreme Court for elucidation on the exact extent of the order. The Austrian Supreme Court asked the Court of Justice of the European Union to opine on the case. The CJEU upheld the lower court’s takedown order, which extended beyond the original Facebook post to any reposts and any “equivalent” posts. It also clarified that the order was not limited to Austrian users. Rather, Facebook was ordered “to remove information covered by the injunction or to block access to that information worldwide.”13

The first part of the court’s order is problematic because it places the onus on platforms that host communications to decide what exactly is “equivalent” to the offending speech. As Professor Woods noted in a commentary on the case, “If calling someone a corrupt oaf is impermissible, would it be okay to call him or her an ineffective doofus? Or a litigious ninny? Reasonable minds might disagree.”14 Additionally problematic for advocates of free speech, social media platforms are incentivized to err on the side of removing more information rather than less, as overremoval is unlikely to result in any penalty. On the other hand, failing to remove questionable materials that the court believes should have been removed may result in fines or other penalties.

The second part of the court’s order is troubling because it appears to permit Austrian law to be imposed on other jurisdictions. If Ms. Glawischnig-Piesczek had been required to separately sue Facebook in the United States to remove the above-cited posts within the United States and for American web users, it seems quite clear that she would have lost on First Amendment grounds. While the broad language of the First Amendment that “Congress shall make no law . . . abridging the freedom of speech” has never been interpreted to be absolute, the phrases in question do not fit cleanly within any of the recognized exceptions, such as “incitements to violence, libel, obscenity, fighting words, and commercial advertising.”15 As such, U.S. courts would almost certainly find against Ms. Glawischnig-Piesczek in such an action brought within the United States. She would face the same fate if she tried to seek reciprocal enforcement of an Austrian or European Union judgment in the United States. However, because Facebook has assets in Europe, she can largely achieve the same ends by using the threat of asset seizure to force worldwide removal of the words via a CJEU decision, without ever having to involve U.S. courts.

Data and Territoriality

In United States v. Microsoft Corp.,16 federal agents believed a Microsoft-administered email account was being used to facilitate illegal drug trafficking. They sought and obtained a search warrant under a provision of the Stored Communications Act (SCA).17 The warrant directed Microsoft to disclose the contents of that email account and all other records or information associated with the account “[t]o the extent that the information . . . is within [Microsoft’s] possession, custody, or control.”18 Microsoft moved to quash the warrant with regard to the contents of the email account because the contents were stored on a Microsoft server in Ireland, and so the United States should leverage its Mutual Legal Assistance Treaties (MLATs) to seek legal process abroad. The magistrate denied Microsoft’s motion,19 as did the district court upon appeal,20 in large part because SCA “warrants” are treated more like subpoenas. As with subpoenas, SCA warrants are generally executed by requiring the Internet service provider (ISP) to hand over the information, unlike traditional warrants, where the law enforcement agents usually seize the physical property directly themselves. Subpoenas generally require the recipient to provide information over which it has control regardless of where the information is located.21 After Microsoft was held in contempt for refusing to comply with the warrant, it appealed to the Second Circuit, alleging that the United States was attempting to force Microsoft to pull the data from Ireland into the United States, which was effectively an extraterritorial application of the warrant. The Second Circuit noted that

[w]arrants traditionally carry territorial limitations: United States law enforcement officers may be directed by a court‐issued warrant to seize items at locations in the United States and in United States‐controlled areas, see Fed. R. Crim. P. 41(b), but their authority generally does not extend further.22

The court dismissed the lower court’s “hybrid” rationale by pointing to the fact that Congress used both “warrant” and “subpoena” in the same statute and therefore was presumed to know the difference. The court also took a territorial view of the data, holding that its physical location on servers in Ireland put it outside the reach of a U.S. warrant, and therefore overturned the lower court’s ruling and vacated the finding of contempt against Microsoft.

But some legal scholars have opined that “the very idea of online data being located in a particular physical ‘place’ is becoming rapidly outdated.”23 Espousing an “un-territorial” view of data, one scholar has advocated that territoriality should only matter when (1) “objects have an identifiable and stable location” and (2) “that location matters.”24 She then notes, first, that data frequently crosses legal borders with “ease, speed, and unpredictability,” which challenges the first condition, and, second, that data users generally neither know nor care where their data is stored (especially cloud data users, as in the Microsoft case) because they can generally access it from anywhere, undermining the second condition.25 Reasonable minds could differ on how these factors apply in the Microsoft case. Microsoft made a business decision to store its users’ email contents on servers in or near Ireland, giving those emails a relatively identifiable and stable location. And it chose that location to spare users the competitively unacceptable lag times that storage in the United States or similarly distant locations would have caused, which suggests that location mattered.

Just months after the Second Circuit’s Microsoft opinion, Google attempted to rely on that court’s reasoning in refusing to comply with a similar SCA warrant for extraterritorial data in a separate case.26 The magistrate ruled against Google under the somewhat controversial legal theory that there was no “search” when Google merely moved the electronic data to the United States (even though it was done explicitly at the behest of law enforcement pursuant to the legal authority of a warrant)—it only became a search when Google turned it over to the U.S. government within the United States. Thus, according to this rationale, there was no extraterritorial application of the warrant. The magistrate also seemed swayed by the different technological approach Google took. Unlike Microsoft, which kept emails on designated servers that were close to the users, Google tended to constantly move the data around, which would make even a theoretical attempt to use a Mutual Legal Assistance Treaty to seek process in another country futile. Also, Google broke up files into shards27 that could be spread among multiple servers in multiple countries, unlike Microsoft, which kept users’ emails relatively intact on a server. This meant that even if the U.S. government leveraged an MLAT to have process served on data temporarily residing within a certain country, the result might be only one shard of one file. And although not part of this case, Google has reportedly considered setting up servers at sea that would be beyond the territorial jurisdiction of any nation-state.28 Such servers would be arguably beyond the reach of U.S. law enforcement even with MLATs or the like.
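The sharding described above can be sketched in a few lines. This is a minimal illustrative model only: the round-robin placement, server names, and shard size are assumptions for the example, not Google’s actual architecture.

```python
# Illustrative sketch of sharding: a file is split into fixed-size shards
# that are dispatched to servers in different countries, so no single
# server holds the whole file. Placement here is simple round-robin;
# real systems are far more dynamic.

from typing import Dict, List, Tuple

def shard_file(data: bytes, shard_size: int,
               servers: List[str]) -> Dict[str, List[Tuple[int, bytes]]]:
    """Split data into shards and assign them round-robin across servers.
    Returns a map of server -> list of (shard index, shard bytes)."""
    placement: Dict[str, List[Tuple[int, bytes]]] = {s: [] for s in servers}
    for i in range(0, len(data), shard_size):
        idx = i // shard_size
        placement[servers[idx % len(servers)]].append((idx, data[i:i + shard_size]))
    return placement

def reassemble(placement: Dict[str, List[Tuple[int, bytes]]]) -> bytes:
    """Only the provider, pulling shards from every server, can rebuild the file."""
    shards = sorted((idx, s) for parts in placement.values() for idx, s in parts)
    return b"".join(s for _, s in shards)

email = b"From: alice@example.com\nSubject: hello\n\nSee you soon."
placement = shard_file(email, shard_size=16,
                       servers=["us-east", "ireland", "finland"])
# Legal process served on the Irish server alone would yield only a
# fragment of the message; the provider can reassemble the whole:
assert reassemble(placement) == email
```

The legal consequence follows directly: process served in any one country reaches only the shards momentarily resident there, while the complete file exists only when the provider algorithmically reassembles it.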

Although lower courts were distinguishing their outcomes from that reached by the Second Circuit’s Microsoft case, those cases had not yet been appealed to and decided by any other circuit courts of appeal, so no split in the circuits existed when the U.S. government appealed to the U.S. Supreme Court. Nevertheless, on October 16, 2017, the Court agreed to hear the appeal, and oral argument took place on February 27, 2018. Before the Court could decide the case, Congress stepped in and passed the Clarifying Lawful Overseas Use of Data (CLOUD) Act.29 Signed into law in March 2018, it amended the SCA to explicitly require email providers to disclose emails in their “possession, custody, or control” even if the emails were stored outside the United States. This effectively gave the SCA extraterritorial reach. In a per curiam opinion, the Microsoft case was dismissed as moot.30

Some other jurisdictions have dealt with the territoriality issue differently. In two separate cases in Belgium, the courts there held that “territoriality is determined based on where the data is accessed and received, not where it is located.”31

Data in Estate Planning

Data increasingly has value in the form of digital assets. “Digital asset” is defined in the Revised Uniform Fiduciary Access to Digital Assets Act (RUFADAA)32 to include “an electronic record in which an individual has a right or interest. The term does not include an underlying asset or liability unless the asset or liability is itself an electronic record.” This could include photo collections, digital subscriptions, virtual currencies (such as Bitcoin and Monero), or valuable items in virtual games (such as World of Warcraft or MapleStory) or virtual worlds (such as Second Life). Recognizing the difficulty of accessing such assets after a person’s death, the National Conference of Commissioners on Uniform State Laws (NCCUSL) originally proposed the Uniform Fiduciary Access to Digital Assets Act (UFADAA) in 2014. It attempted to “mimic the access that [executors and agents] would have to traditional property.”33 There was significant pushback from social media companies, civil rights groups, banks, fiduciaries, and others, who argued that the UFADAA provided too much access to digital assets, raised serious privacy concerns, and overrode explicit provisions in many providers’ terms of service. Because of the opposition, only a single state, Delaware, passed the UFADAA. In response, the NCCUSL revised the UFADAA to address these concerns, significantly scaling back the access that executors would have and building in protections. While the UFADAA gave personal representatives the right to access the content of an electronic communication unless the deceased explicitly denied such access, the RUFADAA reverses the presumption, requiring the deceased’s explicit consent before a representative may access the contents of those same communications. The revised version has since been enacted in 44 states and introduced in two others.34 Courts are now beginning to interpret how its provisions apply.

In In re Serrano,35 the court held that Google was required to disclose a catalogue of electronic communications sent and received by the deceased (not including the actual content of such communications) along with the deceased’s calendar. All had been requested by the executor in order to inform friends of the deceased’s passing and close out unfinished business. Google demurred pending a court order that specified “disclosure of the content [of the requested electronic information] would not violate any applicable laws, including but not limited to the Electronic Communications Privacy Act [ECPA] and any state equivalent.” The court found that the requested disclosures by Google constituted “non-content” information because they did not include the contents of any electronic communication and therefore did not violate the SCA or ECPA. It further held that New York’s implementation of the RUFADAA as applied to this case did not violate any state law, so Google was required to provide the requested information.

Massachusetts is one of the few states not to have enacted the RUFADAA, so when the personal representatives of John Ajemian requested access to his Yahoo!, Inc. (Yahoo) account after he died, to inform his friends of his passing and of a planned memorial service, Yahoo declined. The representatives sued Yahoo.36 Yahoo claimed the SCA prevented it from releasing the contents of the email account. The Probate and Family Court agreed and granted Yahoo’s motion for summary judgment. The personal representatives appealed. The Supreme Judicial Court of Massachusetts reversed the lower court, holding that “the personal representatives may provide lawful consent on the decedent’s behalf to the release of the contents of the Yahoo e-mail account.”37 This was based on the court’s determination that Congress did not intend to preempt the field with the SCA and so it should be interpreted not to conflict with state probate laws. The court also concluded that the SCA was directed at unlawful interceptions of electronic communications, not with lawfully authorized estate management. This holding that personal representatives may provide consent on the decedent’s behalf directly contradicts one of the key assumptions of the RUFADAA.38

On the Horizon

This article only touched on a few of the many areas in which data law is developing. Some emerging areas to watch are autonomous vehicles, artificial intelligence, and the admissibility of data from IoT.

As automobiles rapidly advance toward a state in which they no longer need a human driver, the amount of data created, collected, processed, and stored to support that autonomy will increase astronomically.39 The data will pertain to the occupants; the vehicle’s speed, location, and direction; and the speed, location, and direction of nearby vehicles, people, or other obstacles. Who owns all that data? Car manufacturers, car dealerships, automobile insurance companies, the owners of the apps used by the vehicle, advertisers, law enforcement, the vehicle owners and occupants, and many others all have different interests in that data. The cars of today are already estimated to collect as much as 25 gigabytes of data per hour.40 Expect future laws and courts to sort these issues out.

Researchers are increasingly attempting to leverage artificial intelligence (AI) to make decisions for humans based on data that is too massive or complicated for humans to make sense of. Who is liable for the bad decisions of AI? If humans don’t fully understand how the decision was made, how can one contest it? The GDPR gives individuals a right to contest significant decisions made by AI, such as a loan decision, and requires the company employing that AI to provide a human review of the decision.41 How will such laws fare?

And who owns the data collected by Fitbit, Amazon’s Alexa, and any number of other wearables or home IoT devices? Increasingly, law enforcement has sought access to such data to help solve crimes.42 In some cases, the data has been proposed for a use that may not have been anticipated by the manufacturer. Can data from a Fitbit be used to establish the time of death, based on when the Fitbit stopped recording a heartbeat? Can a victim’s elevated heartbeat, as measured by Fitbit, be used to establish the time of an assault by her assailant? These issues were raised in the California case of State v. Aiello but remain unresolved after the 91-year-old suspect died prior to trial.43

Conclusion

In sum, the proliferation of data, much of which is tied to private individuals, has led some jurisdictions to enact privacy laws to protect those individuals. The fact that data can be hosted anywhere and accessed from anywhere, nearly instantaneously, has complicated attempts by courts to regulate it. Determining the appropriate jurisdiction for adjudication is just one challenge. Conflict of laws concerns are another. But perhaps the biggest challenge is the explosive speed and ubiquity of data-reliant technologies, which threatens to outpace the capacity of legislatures and regulators to develop reasonable boundaries that permit the introduction of novel technologies while still protecting individual rights.

The CJEU has already issued two decisions that pull in different directions. This will prompt courts to further refine interpretations and/or legislators to redraw the lines. Similar litigation is likely to start in early 2020 when the California Consumer Privacy Act,44 which has some provisions that are similar to the GDPR, goes into force.

Laws based on the territoriality of “things” are not easily extended to arguably “unterritorial” data. This is especially true of sharded data that may not exist anywhere until the hosting provider algorithmically pulls it together. The CLOUD Act has legislatively resolved the extraterritorial application of the SCA but leaves conflict-of-laws issues open.

Data in the form of digital assets has given rise to digital executors, charged with managing those increasingly diverse and valuable assets after one’s death. The courts have just begun to weigh in on these matters but have already begun to question some of the underlying assumptions of the uniform law most states have adopted in this area.

But there are numerous additional areas where data law issues are just beginning to enter the judicial system. With data being everywhere, it is certain to give legislators and jurists plenty to think about for many years to come.

Endnotes

1. A zettabyte is one billion terabytes, or 10^21 bytes. Zettabyte, TechTerms, https://techterms.com/definition/zettabyte (last visited Nov. 11, 2019). As a reference, the entire collection of printed works in the Library of Congress is estimated to equate to only about 10 terabytes. See Nicholas Taylor, Transferring “Libraries of Congress” of Data, The Signal (July 11, 2011), https://blogs.loc.gov/thesignal/2011/07/transferring-libraries-of-congress-of-data.

2. Data facts are from NodeGraph, https://www.nodegraph.se/big-data-facts/ (last visited Nov. 11, 2019).

3. Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, 1995 O.J. (L 281) 1.

4. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119) 1.

5. Judgment of 24 Sept. 2019, Google v. CNIL, C-507/17, ECLI:EU:C:2019:772.

6. Article 17 of the GDPR, supra note 4, also referred to as the “right to erasure.” (Also see “The Right to Be Forgotten” by Dan Shefet on p. 26 of this issue.)

7. De-referencing only requires a search engine to remove links to the subject data, not actually remove the data itself, though the data host in some cases may be required to remove it in a separate action.

8. Google, C-507/17, para. 65.

9. Id. at para. 60.

10. Id. at para. 72.

11. C-18/18, ECLI:EU:C:2019:821.

12. Editorial, An E.U. Court Just Gave Countries a Free Pass for Censorship on Facebook, Wash. Post (Oct. 6, 2019), https://www.washingtonpost.com/opinions/an-eu-court-just-gave-countries-a-free-pass-for-censorship-on-facebook/2019/10/06/4fea9a7e-e6da-11e9-b403-f738899982d2_story.html.

13. Glawischnig-Piesczek, C-18/18, at paras. 14, 53 (emphasis added).

14. Andrew Keane Woods, The CJEU Facebook Ruling: How Bad Is It, Really?, Lawfare (Oct. 4, 2019), https://www.lawfareblog.com/cjeu-facebook-ruling-how-bad-it-really.

15. Daniel A. Farber, The First Amendment 15 (5th ed. 2019).

16. 138 S. Ct. 1186 (2018).

17. Codified at 18 U.S.C. §§ 2701–2712.

18. Microsoft Corp., 138 S. Ct. at 1187.

19. In re Warrant to Search a Certain E-Mail Account Controlled & Maintained by Microsoft Corp., 15 F. Supp. 3d 466 (S.D.N.Y. 2014).

20. See Microsoft Corp. v. United States (In re Warrant to Search a Certain E-Mail Account Controlled & Maintained by Microsoft Corp.), 829 F.3d 197, 204–205 (2d Cir. 2016).

21. Tiffany (NJ) LLC v. Qi Andrew, 276 F.R.D. 143, 147–48 (S.D.N.Y. 2011) (“If the party subpoenaed has the practical ability to obtain the documents, the actual physical location of the documents—even if overseas—is immaterial.”).

22. Microsoft, 829 F.3d at 201.

23. Orin S. Kerr, The Next Generation Communications Privacy Act, 162 U. Pa. L. Rev. 373, 408 (2014).

24. Jennifer Daskal, The Un-Territoriality of Data, 125 Yale L.J. 326, 329 (2015).

25. This “general” model of cloud data location can be modified contractually. For example, the Department of Defense Federal Acquisition Regulation Supplement includes a specific provision that requires storage of data within the United States or outlying areas of the United States. See 48 C.F.R. § 239.7602-2.

26. In re Search of Info. Associated with [Redacted]@gmail.com That Is Stored at Premises Controlled by Google, Inc., No. 16-mj-00757, 2017 U.S. Dist. LEXIS 92601 (D.D.C. June 2, 2017).

27. For more on sharding, see Sikha Bagui & Loi Tang Nguyen, Database Sharding: To Provide Fault Tolerance and Scalability of Big Data on the Cloud, 5 Int’l J. Cloud Applications & Computing 36 (2015).

28. Steven R. Swanson, Google Sets Sail: Ocean-Based Server Farms and International Law, 43 Conn. L. Rev. 709, 717–18 (2011).

29. CLOUD Act, Pub. L. No. 115-141, 132 Stat. 1213, div. V (2018).

30. United States v. Microsoft Corp., 138 S. Ct. 1186 (2018).

31. Jennifer Daskal, Borders and Bits, 71 Vand. L. Rev. 179, 193 (2018) (referring to Procureur-Général v. Yahoo! Inc., Hof van Cassatie [Cass.] [Court of Cassation], Jan. 18, 2011, Nr. P.10.1347.N (Belg.), translated in 8 Digital Evidence & Elec. Signature L. Rev. 216, 216–18 (2011), http://journals.sas.ac.uk/deeslr/article/view/1978/1915, and Procureur-Général v. Skype, Correctionele Rechtbanken [Corr.] [Criminal Tribunal] Antwerp, Division Mechelen, Oct. 27, 2016, No. ME20.4.1 105151-12 (Belg.)).

32. Revised Uniform Fiduciary Access to Digital Assets Act §2(10) (Nat’l Conf. of Comm’rs on Unif. State Laws 2015), https://www.uniformlaws.org/viewdocument/final-act-no-comments-33.

33. The Revised Uniform Fiduciary Access to Digital Assets Act (RUFADAA), NOLO, https://www.nolo.com/legal-encyclopedia/ufadaa.html.

34. Fiduciary Access to Digital Assets Act, Revised, Unif. L. Comm’n, https://www.uniformlaws.org/committees/community-home?CommunityKey=f7237fc4-74c2-4728-81c6-b39a91ecdf22 (last visited Nov. 11, 2019).

35. 2017 NY Slip Op 27200 (Surr. Ct. June 14, 2017).

36. Ajemian v. Yahoo! Inc., 84 N.E.3d 766 (Mass. 2017), cert. denied, No. 17-1005, 2018 WL 489291 (U.S. Mar. 26, 2018).

37. Id. at 778.

38. For a detailed analysis, see Ajemian v. Yahoo!, Inc.: Massachusetts Supreme Judicial Court Holds That Personal Representatives May Provide Lawful Consent for Release of a Decedent’s Emails, 131 Harv. L. Rev. 2081 (2018).

39. Autonomous Car Data: Future Cars Run on Data, Not Gasoline, Globalme, https://www.globalme.net/blog/autonomous-cars-data-not-gasoline (last visited Nov. 11, 2019).

40. Bill Hanvey, Your Car Knows When You Gain Weight: Vehicles Collect a Lot of Unusual Data. But Who Owns It?, N.Y. Times, May 20, 2019, https://www.nytimes.com/2019/05/20/opinion/car-repair-data-privacy.html (citing a McKinsey report).

41. GDPR, supra note 4, art. 22.

42. Holly Howell, Is Evidence Gathered from “Smart” Devices the New Way to Catch Dumb Criminals?, Am. J. Trial Advocacy (Jan. 24, 2017), https://cumberlandtrialjournal.com/is-evidence-gathered-from-smart-devices-the-new-way-to-catch-dumb-criminals/.

43. Lauren Smiley, A Brutal Murder, a Wearable Witness, and an Unlikely Suspect, Wired (Sept. 17, 2019), https://www.wired.com/story/telltale-heart-fitbit-murder.

44. Calif. Assem. B. 375 (2018).



Rick Aldrich, CISSP, CIPT, GLEG, is a Cyber Security Policy & Compliance Analyst for Booz Allen Hamilton. He works on cybersecurity, policy, metrics, cyberlaw, and privacy issues. This article was prepared by the author in his personal capacity. The views and opinions expressed in this article are those of the author and do not necessarily reflect the official policy, opinion, or position of his employer or any other entity.