§ 1.1. Introduction
We are pleased to present the second edition of the rapidly growing Chapter on Artificial Intelligence.
A few years ago, when in my capacity as the Founder and Chair of the AI Subcommittee I made the original suggestion that the Annual Review should include a new Chapter devoted entirely to AI, I understood from clients that this was an area where they were hungry for guidance. Over the last decade, AI and Machine Learning have become my passion. I continue to be fascinated by the various AI/ML issues my clients ask me to advise them on, and I have watched with interest as US regulators and the plaintiffs' bar have begun to train their sights on commercial and embedded AI. The pace of corporate deals involving companies that count AI as their innovation has also increased substantially. I have tried hard to keep up with the rapid pace of change: I have published on many aspects of the field, formulated proposed federal AI legislation that in 2018 became a House of Representatives Draft Discussion Bill, and been invited to speak and teach on AI at many institutions, including MIT Sloan, NYU, and Berkeley Law School.
In the absence of substantive federal legislation, case law plays an outsized role in helping shape the contours of the emerging legal issues associated with widespread adoption of AI and Machine Learning. Tracking relevant case developments from around the country is essential. Last year, we confidently predicted that the developments we report this year would increase exponentially year over year. Our prognostication has proven correct. This year’s Chapter marks a notable increase in reported cases in the field.
The goal of this Chapter is to serve as a useful tool for business attorneys who seek to stay up to date, on a national basis, concerning how the courts are deciding cases involving AI. As in the first edition, we included relevant enacted and pending legislation and applied the same editorial judgments as to what should be included. A notable example is facial recognition (FR). Due to the nature and complexity of the underlying technology, FR necessarily involves issues of algorithmic/artificial intelligence. However, we did not include every case that references facial recognition when the issue at bar pertained to procedural aspects such as class certification (e.g., class action lawsuits filed under the Illinois Biometric Information Privacy Act (BIPA) (740 ILCS 14)).
Finally, I want to thank my colleagues, Adam Aft and Alex Crowley, for their assistance in preparing this year’s Chapter. Adam is a knowledgeable and accomplished AI attorney with whom I frequently collaborate, and Alex is a new joiner to our team with a noted exuberance for AI.
We hope this Chapter provides useful guidance to practitioners of varying experience and expertise and look forward to tracking the trends in these cases and presenting the cases arising in the next several years.
Palo Alto, California
§ 1.2.1. United States Supreme Court
Van Buren v. United States, 141 S. Ct. 1648 (Jun. 3, 2021). The dispute underlying this case arose when a police officer violated department policy by using the computer in his patrol car to access information in a law enforcement database for a non-law-enforcement purpose. The Court held that a computer user “exceeds authorized access” under the Computer Fraud and Abuse Act of 1986 (CFAA) “when he accesses a computer with authorization but then obtains information located in particular areas of the computer—such as files, folders, or databases—that are off-limits to him.” Here, the police officer was authorized to access his patrol-car computer and the law enforcement database. Under the Court’s holding, however, the officer’s purpose in accessing the computer and database, albeit improper, was not relevant to determining liability under the CFAA. This holding resolved a circuit split about how broadly to interpret the CFAA. It avoided making “millions of otherwise law-abiding citizens” into criminals simply on the basis that they used their computers in a technically unauthorized way, such as to send personal email from a work laptop.
There were no other qualifying decisions by the United States Supreme Court. We note that the Court has heard a number of cases foreshadowing the types of issues that will soon arise with respect to artificial intelligence, such as United States v. Am. Library Ass’n (539 U.S. 194 (2003)), in which a plurality of the Court upheld the constitutionality of filtering software that libraries had to implement pursuant to the Children’s Internet Protection Act, and Gill v. Whitford (138 S. Ct. 1916 (2018)), in which, had the plaintiffs had standing, the Justices may have had to evaluate the use of sophisticated software in redistricting (a point noted again in Justice Kagan’s express reference to machine learning in her dissent in Rucho v. Common Cause (139 S. Ct. 2484 (2019))). The Court had previously concluded, in Spokeo, Inc. v. Robins (136 S. Ct. 1540 (2016)), that a “people search engine” site presenting incorrect information that prejudiced a plaintiff’s job search could be a cognizable injury under the Fair Credit Reporting Act. These cases are representative of the many cases likely to make their way to the Court in the near future that will require the Justices to contemplate artificial intelligence, machine learning, and the impact of the use of these technologies.
§ 1.2.2. First Circuit
There were no qualifying decisions within the First Circuit.
§ 1.2.3. Second Circuit
Flores v. Stanford, 2021 U.S. Dist. LEXIS 185700 (S.D.N.Y. 2021) (compelling disclosure of information related to the COMPAS software (used to assess the likelihood of recidivism and relied on by courts to inform bail amounts and sentencing) as relevant to the plaintiffs’ class certification, given that transparency and explainability regarding such information and the operation of the applicable algorithm would be potentially central to the plaintiffs’ assertions that the defendants deployed unconstitutional practices against them in the manner in which the COMPAS software informed their sentencing).
Force v. Facebook, Inc., 934 F.3d 53 (2d Cir. 2019). Victims of terrorist attacks in Israel, their estates, and their family members alleged that Facebook was a provider of terrorist postings because it developed and used algorithms designed to match users’ information with other users and content. The court held that Facebook was a publisher protected by Section 230 of the Communications Decency Act and that the term “publisher” under the Act was not so limited that Facebook’s use of algorithms to match information with users’ interests changed Facebook’s role as a publisher.
§ 1.2.3.1. Additional Cases of Note
Clark v. City of New York, 2021 U.S. Dist. LEXIS 177534 (S.D.N.Y. 2021) (denying motion to dismiss First Amendment and state law religious discrimination claims against New York City for requiring Muslim women to remove their hijabs for booking photographs after arrest. A primary motivation for the women’s complaint was that forcing them to remove their hijabs for a picture would cause the women to violate their religious belief that men outside of their immediate family were prohibited from seeing the women without their hijabs, even if only via pictures stored in facial recognition databases. The court found that “requiring the removal of a hijab does not rationally advance the City’s valid interest in readily identifying arrestees,” including via facial recognition databases.)
Nat’l Coalition on Black Civic Participation v. Wohl, 2021 U.S. Dist. LEXIS 177589, 2021 WL 4254802 (S.D.N.Y. 2021) (holding that a robocall service provider that allowed users to upload messages to a website for distribution via the service provider’s automated phone calling (i.e., robocall) system was not entitled to neutral publisher immunity under Section 230 because it was not a provider or user of an interactive computer service and the service provider allegedly knew of the discriminatory and false content of the messages and actively helped the users determine where to distribute the messages).
Nuance Communs., Inc. v. IBM, 2021 U.S. Dist. LEXIS 115228 (S.D.N.Y. 2021) (noting that “This is a breach of contract case arising under New York law. But more than that, this case is a contemporary window into the brave new world of artificial intelligence (“AI”) commercial applications” and finding after a bench trial that IBM had breached its implied covenant of good faith by updating a product outside of the scope of the parties’ agreement in order to avoid making the updates available to Nuance in relation to the product within the scope of the agreement).
Calderon v. Clearview AI, Inc., 2020 U.S. Dist. LEXIS 94926 (S.D.N.Y. 2020) (stating the court’s intent to consolidate cases against Clearview based on a January 2020 New York Times article alleging that defendants scraped over 3 billion facial images from the internet, scanned biometric identifiers from those images, and used the scans to create a searchable database, access to which defendants then allegedly sold to law enforcement, government agencies, and private entities without complying with BIPA); see also Mutnick v. Clearview AI, Inc., 2020 U.S. Dist. LEXIS 109864 (N.D. Ill. 2020).
People v. Wakefield, 175 A.D.3d 158 (N.Y. App. Div. 2019) (concluding no violation of the confrontation clause where the creator of artificial intelligence software was the declarant, not the “sophisticated and highly automated tool powered by electronics and source code.”); see also People v. H.K., 2020 NY Slip Op 20232, 130 N.Y.S.3d 890 (Crim. Ct. 2020) (following Wakefield in concluding that where software was “acting as a highly sophisticated calculator” the analyst using the software was still a declarant and the right to confrontation was preserved).
Vigil v. Take-Two Interactive Software, Inc., 235 F. Supp. 3d 499 (S.D.N.Y. 2017) (affirmed in relevant part by Santana v. Take-Two Interactive Software, Inc., 717 Fed. Appx. 12 (2d Cir. 2017)) (concluding that BIPA does not create a concrete interest in the form of a right to information, but instead operates to support the statute’s data-protection goal; therefore, defendant’s bare violations of the notice and consent provisions of BIPA were dismissed for lack of standing).
LivePerson, Inc. v. 24/7 Customer, Inc., 83 F. Supp. 3d 501 (S.D.N.Y. 2015) (determining plaintiff adequately pleaded possession and misappropriation of a trade secret where plaintiff alleged its “predictive algorithms” and “proprietary behavioral analysis methods” were based on many years of expensive research and were secured by patents, copyrights, trademarks, and contractual provisions).
§ 1.2.4. Third Circuit
Zaletel v. Prisma Labs, Inc., No. 16-1307-SLR, 2017 U.S. Dist. LEXIS 30868 (D. Del. Mar. 6, 2017). The plaintiff had a “Prizmia” photo editing app. The plaintiff alleged trademark infringement based on the defendant’s “Prisma” photo transformation app. In reviewing the Third Circuit’s likelihood of confusion factors, the court considered the competition and overlap factor. The court concluded that “while plaintiff broadly describes both apps as distributing photo filtering apps, the record demonstrates that defendant’s app analyzes photos using artificial intelligence technology and then redraws the photos in a chosen artistic style, resulting in machine generated art. Given these very real differences in functionality, it stands to reason that the two products are directed to different consumers.”
§ 1.2.4.1. Additional Cases of Note
McGoveran v. Amazon Web Servs., 2021 U.S. Dist. LEXIS 189633 (D. Del. 2021) (granting motion to dismiss a claim under Illinois’ Biometric Information Privacy Act (BIPA) brought by residents of Illinois against non-Illinois-based companies Amazon Web Services (AWS) and Pindrop Security for collecting callers’ “voiceprints,” which can be used to identify the speaker, when the residents made calls from Illinois using Illinois phone numbers to a company that used AWS and Pindrop services. The court found no “allegations involving conduct that occurred ‘primarily and substantially’ in Illinois” and that “BIPA does not apply extraterritorially.”)
Thomson Reuters Enter. Ctr. GmbH v. ROSS Intelligence Inc., 2021 U.S. Dist. LEXIS 59945 (D. Del. 2021) (denying a motion to dismiss claims of copyright infringement and tortious interference with contract against ROSS Intelligence, a legal research services company, related to ROSS’s alleged obtaining (via a third party under contract with Thomson Reuters) and use of certain Westlaw materials, notably Westlaw’s Headnotes and Key Number System, when developing ROSS’s own artificial intelligence-based legal research software. While ROSS argued “that Westlaw Content is not copyrightable under the government edicts doctrine,” the court nonetheless held that Thomson Reuters at least had a plausible claim for copyright infringement based on Thomson Reuters’ efforts to register its content with the US Copyright Office and a plausible claim of tortious interference with contract due to the manner in which ROSS allegedly obtained the Westlaw content.)
In re Valsartan, Losartan, & Irbesartan Prods. Liab. Litig., 337 F.R.D. 610 (D.N.J. 2020) (requiring the defendants to use an eDiscovery document review protocol that the parties had mostly agreed on rather than letting Teva unilaterally implement its own machine-learning-based document review protocol, suggesting that eDiscovery implementation is a collaborative effort requiring transparency in how document analysis is performed regardless of which technologies are used to conduct the analysis.)
§ 1.2.5. Fourth Circuit
Thaler v. Hirshfeld, 2021 U.S. Dist. LEXIS 167393 (E.D. Va. 2021) (holding that an artificial intelligence machine cannot be considered an “inventor” under the US Patent Act because plain reading of relevant provisions and statutes indicates that inventors must be natural persons.)
TruGreen Ltd. P’ship v. Allegis Global Sols., Inc., 2021 U.S. Dist. LEXIS 33587 (D. Md. 2021) (granting motions to dismiss counts of negligent misrepresentation and promissory estoppel premised on claims that the defendant failed to perform under the contract. The failure of the defendant’s AI chatbot recruiting tool to perform as the defendant promised was one way in which the defendant allegedly failed to meet its contractual obligations.)
Sevatec, Inc. v. Ayyar, 102 Va. Cir. 148 (Va. Cir. Ct. 2019). The court noted that matters such as data analytics, artificial intelligence, and machine learning are complex enough that expert testimony is proper and helpful and such testimony does not invade the province of the jury.
§ 1.2.6. Fifth Circuit
Aerotek, Inc. v. Boyd, 598 S.W.3d 373 (Tex. App. 2020). The court expressly acknowledged that one day courts may have to determine whether machine learning and artificial intelligence resulted in software altering itself and inserting an arbitration clause after the fact.
§ 1.2.6.1. Additional Cases of Note
Bertuccelli v. Universal City Studios LLC, No. 19-1304, 2020 U.S. Dist. LEXIS 195295 (E.D. La. 2020) (denying a motion to disqualify an expert who the court concluded was qualified to testify in a copyright infringement case after having performed an “artificial intelligence assisted facial recognition analysis” of the plaintiff’s mask and the alleged infringing mask). But see Bertuccelli v. Universal City Studios LLC, 2021 U.S. Dist. LEXIS 77784 (E.D. La. 2021) (later excluding a portion of the expert witness’s testimony on the basis that plaintiff Bertuccelli failed to timely respond to defendants’ request for additional information about the expert witness’s initial report.)
§ 1.2.7. Sixth Circuit
Cahoo v. Fast Enters. LLC, 508 F. Supp. 3d 162 (E.D. Mich. 2020) (finding that the plaintiff class had sufficiently demonstrated injury-in-fact due to “fraud determinations based on rigid application of UIA’s logic trees coupled with inadequate notice procedures.” The application of the logic trees was too rigid in that such application resulted in significant outcomes—determination of fraud—solely on the basis of plaintiff’s failure to respond to a questionnaire. Whether or not the software using the logic trees constituted artificial intelligence was of little consequence.)
Delphi Auto. PLC v. Absmeier, 167 F. Supp. 3d 868 (E.D. Mich. 2016). Plaintiff employer alleged defendant former employee breached his contractual obligations by terminating his employment with the plaintiff and accepting a job with Samsung in the same line of business. Defendant worked for the plaintiff as director of its labs in Silicon Valley, managing engineers and programmers on work related to autonomous driving. Defendant had signed a confidentiality and noninterference agreement. The court concluded that the plaintiff had a strong likelihood of success on the merits of its breach of contract claim. Therefore, the court granted the plaintiff’s motion for preliminary injunction with certain modifications (namely, limiting the applicability of the non-compete provision to the field of autonomous vehicle technology for one year because the Court determined that autonomous vehicle technology is a “small and specialized field that is international in scope” and therefore a global restriction was reasonable).
§ 1.2.7.1. Additional Cases of Note
In re C.W., 2019-Ohio-5262 (Oh. Ct. App. 2019) (noting that “[p]roving that an actual person is behind something like a social-networking account becomes increasingly important in an era when Twitter bots and other artificial intelligence troll the internet pretending to be people.”).
§ 1.2.8. Seventh Circuit
King v. PeopleNet Corp., 2021 U.S. Dist. LEXIS 207694 (N.D. Ill. 2021) (remanding plaintiff’s BIPA § 15(a) and (c) claims to state court, and denying defendant’s motion to dismiss plaintiff’s BIPA § 15(b) claim. Re the BIPA § 15(a) and (c) claims, the court found that plaintiff lacked Article III standing because she alleged only a general injury not particular to her rather than a concrete and particularized injury. Re the BIPA § 15(b) claim, the court found that plaintiff suffered a concrete and particularized injury when defendant, a third-party technology provider, actively collected plaintiff’s biometric facial scans without obtaining plaintiff’s informed consent, thereby violating § 15(b).)
Kislov v. Am. Airlines, Inc., 2021 U.S. Dist. LEXIS 194911 (N.D. Ill. 2021) (applying Bryant and Fox, among other cases, to hold that plaintiffs lacked Article III standing for their BIPA § 15(a) claim because, like the plaintiff in Bryant but unlike the plaintiff in Fox, the plaintiffs only alleged that defendant American Airlines “failed to make publicly available any policy addressing its biometric retention and destruction policies” without further alleging a failure to comply with those policies. The court remanded the case to state court.)
Jacobs v. Hanwha Techwin Am., Inc., 2021 U.S. Dist. LEXIS 139668 (N.D. Ill. 2021) (dismissing claims brought under BIPA § 15(a), (b), and (d) against a third-party technology manufacturer. Re the BIPA § 15(b) claim, the court found that defendant was not engaged in illegal collection of facial recognition data because it merely manufactured the camera and did not take any active steps to use the camera to collect or retain the data. Re the BIPA § 15(a) and (d) claims, the court found no evidence plausibly suggesting that defendant, as a mere third-party technology provider, actually possessed or disclosed plaintiff’s biometric data.)
United States v. Bebris, 4 F.4th 551 (7th Cir. 2021) (affirming a district court order quashing Bebris’s subpoena, which was premised on the claim that his Fourth Amendment rights were violated and that Facebook acted as a government agent when providing results of image recognition analysis to the National Center for Missing and Exploited Children (NCMEC) in compliance with 18 U.S.C. § 2258A(a). The court found that the district court’s holding that Facebook did not act as a government agent was not clearly erroneous because Facebook voluntarily provided the images to the NCMEC (a quasi-governmental organization), no government entity contacted Facebook about Bebris or directed Facebook to take any actions with respect to Bebris, and Facebook had an “independent business purpose in keeping its platform free of child pornography.”)
Hazlitt v. Apple Inc., 2021 U.S. Dist. LEXIS 110556 (S.D. Ill. 2021) (applying Bryant and Fox to hold that plaintiffs had Article III standing for their BIPA § 15(a) and (b) claims, and applying Thornley to hold that plaintiffs lacked Article III standing for their BIPA § 15(c) claim because they had merely alleged a regulatory violation). Compare Hazlitt v. Apple Inc., 500 F. Supp. 3d 738 (S.D. Ill. 2020) (vacated for reconsideration after the Fox and Thornley decisions were published).
Kalb v. Gardaworld Cashlink LLC, 2021 U.S. Dist. LEXIS 81325 (C.D. Ill. 2021) (finding that plaintiff had Article III standing for his BIPA § 15(a) claim because, like the plaintiff in Fox and unlike the plaintiff in Bryant, plaintiff alleged not only that defendant had failed to publish a data retention and destruction policy but also that defendant had no such policy at all. Thus, under Fox, plaintiff had alleged a concrete and particularized injury sufficient for Article III standing.)
Stein v. Clarifai, Inc., 2021 U.S. Dist. LEXIS 49516 (N.D. Ill. 2021) (finding no personal jurisdiction for a set of claims alleging that Clarifai violated BIPA § 15 by obtaining images of Illinois users from OKCupid user profiles to use in training facial recognition software. The court found that plaintiff had not alleged sufficient contacts with Illinois to bring a BIPA claim given that the only evidence of Clarifai’s contact with Illinois was obtaining a data set from an investor based in Chicago.)
Wilcosky v. Amazon.com, Inc., 517 F. Supp. 3d 751 (N.D. Ill. 2021) (holding that plaintiffs Wilcosky, Gunderson, and E.G. (a minor) had Article III standing for their BIPA § 15(a) and (b) claims against Amazon’s collection, use, and storage of plaintiffs’ voice biometric data, i.e., “voiceprint,” via the speech and voice recognition capabilities of Amazon’s Alexa virtual assistant. Under Bryant, Amazon’s failure to obtain plaintiffs’ informed consent about Amazon’s collection and storage of their voiceprints was a sufficiently concrete and particularized injury-in-fact under BIPA § 15(b). Under Fox, Amazon’s failure to publish and comply with a voiceprint data retention policy was a sufficiently concrete and particularized injury under BIPA § 15(a). The court also held that plaintiff Wilcosky’s and Gunderson’s claims were subject to arbitration related to Amazon Alexa given that they had agreed to arbitration when purchasing products from Amazon’s website.)
Thornley v. Clearview AI, Inc., 984 F.3d 1241 (7th Cir. 2021) (affirming that plaintiffs did not have Article III standing to pursue their BIPA § 15(c) claim in federal court against Clearview AI, a facial recognition company that plaintiffs alleged included the plaintiffs’ biometric identifiers or information in Clearview AI’s database. Significantly, the plaintiffs’ only alleged injury was general, statutory aggrievement under BIPA § 15(c). Because the plaintiffs had not alleged a concrete and particularized injury, the court remanded the case back to state court. This case was the court’s first opportunity to consider BIPA § 15(c).)
Fox v. Dakkota Integrated Sys., LLC, 980 F.3d 1146 (7th Cir. 2020) (finding that plaintiff had standing under BIPA § 15(a) because defendant violated plaintiff’s legal right by failing to “comply with data retention and destruction policies—resulting in the wrongful retention of her biometric data after her employment ended, beyond the time authorized by law.” The court distinguished Bryant from this case on the basis that Bryant was focused only on public disclosure of data retention and destruction protocols while this case also relied on an evaluation of compliance with those protocols. Further, the court held that unlawful retention of biometric data was a concrete and particularized injury just like unlawful collection of biometric data.)
Marquez v. Google LLC, 2020 U.S. Dist. LEXIS 199098 (N.D. Ill. 2020) (finding that plaintiff Marquez lacked Article III standing in federal court because he did not plead any particularized harm arising from defendant’s alleged violation of BIPA; rather, he had merely alleged that Google committed a public harm under BIPA § 15(a) by not publishing data retention policies. Thus, under Bryant, the court remanded the § 15(a) claim to Illinois state court.)
Bryant v. Compass Group USA, Inc., 958 F.3d 617 (7th Cir. 2020). Plaintiff vending machine customer filed a class action against the vending machine owner/operator, alleging violation of BIPA when it required her to provide a fingerprint scan before allowing her to purchase items. The district court found defendant’s alleged violations were mere procedural violations that caused no concrete harm to plaintiff and therefore remanded the action to state court. The Court of Appeals held that a violation of § 15(a) (requiring development of a written and public policy establishing a retention schedule and guidelines for destroying biometric identifiers and information) of BIPA did not create a concrete and particularized injury and plaintiff therefore lacked standing under Article III to pursue the claim in federal court. In contrast, the Court of Appeals held that a violation of § 15(b) (requiring private entities make certain disclosures and receive informed consent from consumers before obtaining biometric identifiers and information) of BIPA did result in a concrete injury (plaintiff’s loss of the power and ability to make informed decisions about the collection, storage, and use of her biometric information), and therefore she had standing and her claim could proceed in federal court.
Rosenbach v. Six Flags Entertainment Corporation, 129 N.E.3d 1197 (Ill. 2019). Rosenbach is a key Supreme Court of Illinois case answering whether one qualifies as an “aggrieved” person for purposes of BIPA and may seek damages and injunctive relief if she hasn’t alleged some actual injury or adverse effect beyond a violation of her rights under the statute. Plaintiff purchased a season pass for her son to defendant’s amusement park. Plaintiff’s son was asked to scan his thumb into defendant’s biometric data capture system and neither plaintiff nor her son were informed of the specific purpose and length of term for which the son’s fingerprint had been collected. Plaintiff brought suit alleging violation of BIPA. The Supreme Court of Illinois held that an individual need not allege some actual injury or adverse effect, beyond violation of his or her rights under BIPA, to qualify as an “aggrieved” person under the statute and be entitled to seek damages and injunctive relief. The court reasoned that requiring individuals to wait until they’ve sustained some compensable injury beyond violation of their statutory rights before they can seek recourse would be antithetical to BIPA’s purposes. The court found that BIPA codified individuals’ right to privacy in and control over their biometric identifiers and information. Therefore, the court found also that a violation of BIPA is not merely “technical,” but rather the “injury is real and significant.”
§ 1.2.8.1. Additional Cases of Note
Kloss v. Acuant, Inc., 2020 U.S. Dist. LEXIS 89411 (N.D. Ill. 2020) (applying Bryant v. Compass Group (summarized in this chapter) and concluding that the court lacked subject-matter jurisdiction over plaintiff’s BIPA § 15(a) claims because a violation of § 15(a) is procedural and thus does not create a concrete and particularized Article III injury).
Acaley v. Vimeo, 2020 U.S. Dist. LEXIS 95208 (N.D. Ill. June 1, 2020) (concluding that parties made an agreement to arbitrate because defendant provided reasonable notice of its terms of service to users by requiring users to give consent to its terms when they first opened the app and when they signed up for a free subscription plan, but the BIPA violation claim alleged by the plaintiff was not within the scope of the parties’ agreement to arbitrate because the “Exceptions to Arbitration” clause excluded claims for invasion of privacy).
Heard v. Becton, Dickinson & Co., 2020 U.S. Dist. LEXIS 31249 (N.D. Ill. 2020) (concluding that for § 15(b) to apply, an entity must at least take an active step to “collect, capture, purchase, receive through trade, or otherwise obtain” biometric data and the plaintiff did not adequately plead that defendant took any such active step where the complaint omitted specific factual detail and merely parroted BIPA’s statutory language and the plaintiff failed to adequately plead possession because he failed to sufficiently allege that defendant “exercised any dominion or control” over his fingerprint data).
Rogers v. CSX Intermodal Terminals, Inc., 409 F. Supp. 3d 612 (N.D. Ill. 2019) (denying defendant’s motion to dismiss, relying on the Illinois Supreme Court’s holding in Rosenbach (summarized in this chapter) to conclude that plaintiff’s right to privacy in his fingerprint data included “the right to give up his biometric identifiers or information only after receiving written notice of the purpose and duration of collection and providing informed written consent.”).
Neals v. PAR Technology Corp., 419 F. Supp. 3d 1088 (N.D. Ill. 2019) (concluding that BIPA does not exempt a third-party non-employer collector of biometric information when an action arises in the employment context, and rejecting defendant’s argument that a third-party vendor could not be required to comply with BIPA because only the employer has a preexisting relationship with the employees).
Ocean Tomo, LLC v. Patentratings, LLC, 375 F. Supp. 3d 915, 957 (N.D. Ill. 2019) (determining that Ocean Tomo’s training of its machine learning algorithm on PatentRatings’ patent database violated a requirement in a license agreement between the parties that prohibited Ocean Tomo from using the database (which was designated as PatentRatings’ confidential information) to develop a product for anyone except PatentRatings).
Liu v. Four Seasons Hotel, Ltd., 2019 IL App (1st) 182645, 138 N.E.3d 201 (Ill. App. Ct. 2019) (noting that “simply because an employer opts to use biometric data, like fingerprints, for timekeeping purposes does not transform a complaint into a wages or hours claim.”).
§ 1.2.9. Eighth Circuit
There were no qualifying decisions within the Eighth Circuit.
§ 1.2.10. Ninth Circuit
Klein v. Facebook, Inc., 2021 U.S. Dist. LEXIS 175738 (N.D. Cal. 2021) (resolving disputes between the parties related to the electronically stored information (ESI) protocol to be used as part of e-discovery. Notably, the court required the parties to disclose their intent to use technology-assisted review (TAR), predictive coding, or machine learning for e-discovery, to discuss how those tools would be used, and, if needed, to defend the decisions made in using the tools to produce a sufficient set of documents for review. As part of its opinion, the court cited In re Valsartan, Losartan, & Irbesartan Prods. Liab. Litig., 337 F.R.D. 610 (D.N.J. 2020), discussed previously in this chapter.)
Gonzalez v. Google LLC, 2 F.4th 871 (9th Cir. 2021) (evaluating multiple complaints alleging that Google and other social media companies such as Facebook and Twitter were directly and secondarily liable for acts of terrorism committed by ISIS because the companies’ platforms facilitated ISIS recruiting and messaging. The court held that the defendants retained publisher immunity under 47 U.S.C.S. § 230. Notably, the court stated that it did “not hold that ‘machine-learning algorithms can never produce content within the meaning of Section 230.’ We only reiterate that a website’s use of content-neutral algorithms, without more, does not expose it to liability for content posted by a third-party. Under our existing case law, § 230 requires this result.” The court also held that the plaintiffs for two of the three complaints failed to state an adequate claim that the companies were liable for aiding and abetting ISIS.)
United States v. Nelson, 2021 U.S. Dist. LEXIS 71421 (N.D. Cal. 2021) (denying a motion to exclude an expert witness’s testimony about the function of an AI-based software program based on Federal Rule of Evidence 702 and Daubert concerns because expert witnesses do not have to be experts in the algorithms used in certain software to reliably testify about the software’s outputs.)
In re Facebook Biometric Info. Privacy Litig., 522 F. Supp. 3d 617 (N.D. Cal. 2021) (approving a $650 million settlement for the Facebook biometric information privacy litigation, which involved BIPA § 15(a) and (b) claims against Facebook’s collection and retention of Illinois residents’ facial scans (biometric data) for facial recognition purposes.)
Lopez v. Apple, Inc., 519 F. Supp. 3d 672 (N.D. Cal. 2021) (dismissing claims that Apple violated multiple federal and state privacy laws when its artificial intelligence-based virtual assistant “Siri” was accidentally triggered to “listen” to conversations intended to be private. The court held that the plaintiffs lacked Article III standing because their claims were based entirely on a news article that claimed to reveal details about accidental triggering of Siri and the resulting recordings of private conversations. The court also dismissed each of the plaintiffs’ claims for a variety of reasons, including that the plaintiffs’ allegations were conclusory or outside the bounds of a given law.)
Williams-Sonoma, Inc. v. Amazon.com, Inc., 2020 U.S. Dist. LEXIS 163066 (N.D. Cal. 2020) (holding that Williams Sonoma had adequately alleged copyright infringement by Amazon because Amazon’s algorithm selects the “most attractive photos irrespective of rights” and then publishes those photos on its website without input from any other party.)
Patel v. Facebook, Inc., 932 F.3d 1264 (9th Cir. 2019). Facebook moved to dismiss plaintiff users’ complaint for lack of standing on the ground that the plaintiffs hadn’t alleged any concrete injury as a result of Facebook’s facial recognition technology. The court concluded that BIPA protects concrete privacy interests and violations of BIPA’s procedures actually harm or pose a material risk of harm to those privacy interests.
WeRide Corp. v. Kun Huang, 379 F. Supp. 3d 834 (N.D. Cal. 2019). Autonomous vehicle companies brought, inter alia, trade secret misappropriation claims against former director and officer and his competing company. The court determined the plaintiff showed it was likely to succeed on the merits of its trade secret misappropriation claims where it developed source code and algorithms for autonomous vehicles over 18 months with investments of over $45M and restricted access to its code base to on-site employees or employees who use a password-protected VPN. Plaintiff identified its trade secrets with particularity where it described the functionality of each trade secret and named numerous files in its code base because plaintiff was “not required to identify the specific source code to meet the reasonable particularity standard.”