With the Illinois Biometric Information Privacy Act (BIPA) capturing the vast majority of media coverage, as well as the attention of in-house counsel and C-suite executives, in recent years, two biometrics civil disputes currently ongoing in Texas have flown under the radar—despite their potential to have a sizeable impact on the biometrics legal landscape moving forward. Those actions—Texas v. Meta Platforms, Inc. f/k/a Facebook, Inc., No. 22-0121 (Tex. 71st Jud. Dist.), and Texas v. Google LLC, No. CV 58999 (Tex. 385th Jud. Dist.)—center on purported violations of the Lone Star State’s Capture or Use of Biometric Identifiers Act (CUBI) in connection with two major types of biometrics technology commonly deployed today, face and voice biometrics.
While the jury is still out as to how these two disputes will modify or expand the scope of legal risk and liability exposure for companies that use biometrics in their commercial operations, there are nonetheless several concrete takeaways from the CUBI proceedings that have taken place to date in these two high-profile civil actions.
The Meta CUBI Litigation
In February 2022, the Texas attorney general filed a civil lawsuit on behalf of the state of Texas against Meta Platforms, Inc. f/k/a Facebook, Inc. (Meta). The Meta litigation arises out of the social media company’s well-known (and now defunct) Tag Suggestions feature, which used face biometrics software to analyze photos and videos uploaded to the social media platform and, in turn, generate records of face geometry of both users and non-users. Though Meta discontinued its use of this technology on its Facebook platform in November 2021, the company purportedly failed to make similar changes to the other platforms and operations under its corporate umbrella, including Instagram and WhatsApp, among others.
In Meta, Texas has asserted three causes of action arising from purported violations of CUBI’s notice/consent, disclosure, and data retention/destruction requirements. A fourth cause of action has also been asserted for alleged violations of the Texas Deceptive Trade Practices Act (TDTPA) arising from allegedly false, misleading, and deceptive acts and practices engaged in by Meta in connection with its Tag Suggestions feature.
The state seeks draconian damages against Meta for these violations of CUBI and the TDTPA. Of note, the state has taken the position that each separate capture of biometric identifiers in violation of CUBI constitutes a separate, independent violation of the statute, subject to a civil penalty of up to $25,000. Theoretically speaking—and using the complaint’s allegation that Meta improperly captured Texans’ biometric identifiers over one billion times—Meta could be subjected to up to $25 trillion in total civil penalties for violations of Tex. Bus. & Com. Code § 503.001(b) alone. Moreover, the state also seeks equitable disgorgement, which would require Meta to destroy all data collected through the unlawful capture of biometric identifiers, as well as any neural networks and algorithms that were trained or improved using unlawfully captured biometric identifiers.
In February 2023, Meta moved for partial summary judgment on the TDTPA cause of action, arguing that the claim failed as a matter of law because: (1) civil penalties under the statute are available only for a defendant’s alleged violations against its consumers; and (2) the state could not seek penalties on behalf of non-users because they were not consumers under the TDTPA. On May 18, 2023, the court denied the motion but did not issue a written opinion offering the court’s reasoning for its ruling.
In June 2023, Meta moved for partial summary judgment on the state’s CUBI disclosure claim, arguing that it failed because disclosures to corporate affiliates do not constitute “disclosures” for purposes of CUBI. On September 5, 2023, the court denied the motion but did not issue a written opinion stating the court’s reasoning for its ruling.
At the time of publishing, Meta is set for trial in mid-June 2024.
The Google CUBI Litigation
In October 2022, Texas filed a second civil lawsuit alleging CUBI non-compliance against another major tech company—this time, Google LLC (Google). The Google CUBI litigation focuses on several different Google products and services that utilize face and voice biometrics.
As in the Meta litigation, the state asserts both CUBI and TDTPA causes of action against Google relating to its face and voice biometrics tools and seeks remedies in the form of injunctive relief, civil penalties, and equitable disgorgement.
In mid-2023, Google moved for partial summary judgment on the state’s claim for injunctive relief under CUBI. On August 28, 2023, the court denied the motion but did not issue a written opinion stating the court’s reasoning for its ruling.
At the time of publishing, Google is set for trial in December 2024.
Takeaways and Implications
UDAP Causes of Action
The first major takeaway pertains to the unfair or deceptive trade practices (UDAP) causes of action asserted in both Meta and Google. Unlike targeted biometric privacy laws, which remain relatively few in number today, all 50 states maintain some form of UDAP statute. State UDAP laws generally follow the FTC Act, the Uniform Deceptive Trade Practices Act (UDTPA), or the Revised Uniform Deceptive Trade Practices Act (RUDTPA). These laws ordinarily adhere to one of two main approaches in defining the scope of covered improper acts and practices: (1) articulating a broad prohibition against all such acts and practices; or (2) expressly enumerating specific prohibited acts and practices.
Importantly, the majority of UDAP statutes not only allow for state attorney general enforcement but also include private rights of action permitting individuals to pursue class action litigation against entities that run afoul of these laws. To further complicate matters, available damages under these statutes typically include (1) statutory damages; (2) treble or punitive damages (in instances of willful or knowing violations of the law); and (3) attorney’s fees and costs.
Due to the high-profile nature of these two lawsuits and the significant press they have garnered, Meta and Google may trigger new class action litigation in 2024 involving alleged violations of UDAP statutes in states that have no CUBI- or BIPA-like statute on the books. At the same time, future BIPA defendants may be targeted not just for alleged violations of Illinois’s biometric privacy statute, but also for purported violations of Illinois’s Consumer Fraud and Deceptive Business Practices Act (Illinois UDAP). For example, in addition to causes of action asserted under BIPA, class action complaints may also include Illinois UDAP claims alleging that a company has misrepresented or concealed certain uses of biometric identifiers and biometric information, such as for purposes of internal training and improving the defendant’s AI and associated algorithms. Of note, the Illinois UDAP contains a private right of action.
Data & Algorithmic Disgorgement Remedy
The second takeaway pertains to the equitable disgorgement remedy sought by the state in both Meta and Google. As indicated above, this remedy would require Meta and Google to destroy not only all data collected in violation of CUBI but also all neural networks and algorithms that were trained or improved using improperly obtained biometric identifiers. On the federal level, the Federal Trade Commission (FTC) has also clearly indicated that it intends to seek disgorgement when pursuing enforcement actions stemming from allegedly improper biometrics- and other AI-related practices.
This “fruit of the poisonous tree” remedy has particularly noteworthy implications for developers of biometric technology and other forms of AI and machine learning, especially smaller entities. The use of disgorgement in biometrics litigation and enforcement actions would pose a mortal threat to these businesses, most of which lack the resources to regroup and start from scratch if forced to delete the progress made in developing or refining their biometrics or other AI technology.
“Disclosures” under CUBI
The third takeaway pertains to the broad manner in which the term “disclosure” has been construed in the Meta litigation. As discussed above, Meta failed in its bid to procure the dismissal of the CUBI disclosure claim asserted against it, which focused on CUBI’s provision prohibiting any “person who possesses a biometric identifier of an individual that is captured for a commercial purpose” from “sell[ing],” “leas[ing],” or “otherwise disclos[ing]” those identifiers outside of a narrow set of circumstances. In the Meta litigation, the state alleged that the social media company disclosed biometric identifiers to its corporate affiliates in violation of the state’s biometrics statute.
In its summary judgment motion, Meta argued that this cause of action was fatally flawed because permitting corporate affiliates to access or use data does not result in the “disclos[ure]” of that data. Because “disclose” is not defined in the text of CUBI, Meta relied on the ordinary meaning of the term, which is to make something known or public, or to reveal it. Permitting access by a company’s own affiliates, Meta contended, did none of these things.
In its opposition to summary judgment, the state argued that corporate affiliates are “separate legal persons.” Thus, according to the state, when Meta shared Texans’ biometric identifiers with its subsidiaries, it was disclosing that data to separate legal persons. “And there is no question that under ordinary usage, when one company reveals information to another, it has disclosed that information.”
The state also argued that CUBI permits disclosure only in narrowly defined circumstances, which do not even include disclosures made with consent. This, the state contended, indicates the Texas legislature intended to ensure robust protections against the disclosure of its residents’ biometric identifiers. As such, Meta was impermissibly seeking to read new statutory exceptions into a “non-disclosure regime” that Texas lawmakers clearly intended to be comprehensive. More specifically, the state explained, the placement of “disclose” after sale and lease, and the use of “otherwise” as a modifier, made clear that the legislature intended CUBI’s disclosure restriction to have an expansive scope, without any indication of an intent to exempt parent-subsidiary disclosures.
The court did not issue a written opinion explaining its denial of Meta’s motion for summary judgment, but it is reasonable to posit that the court agreed with the reasoning set forth by the state on this issue. While the Meta court’s decision is not binding, it does indicate that other courts may construe the term “disclosure” in an expansive, plaintiff-friendly manner in future CUBI disputes. If so, this could present significant compliance challenges for companies seeking to use biometric identifiers in any productive, beneficial manner, especially considering that the vast majority of biometric modalities involve reliance on third-party vendors to provide and/or facilitate the software used to leverage the benefits of biometric data. With that said, this aspect of Texas’s biometric privacy law has not been definitively tested in court to date, and the possibility remains that courts will decline to apply the statute’s text strictly, as Texas maintains a rule of statutory interpretation that its laws will not be interpreted so as to lead to a foolish or absurd result.
“Commercial Purpose” Under CUBI
The final takeaway concerns the expansive manner in which the state has construed the term “commercial purpose” in the Google litigation.
In that action, the state alleges several distinct, independent “commercial purposes” underlying Google’s use of face and voice biometrics that allegedly ran afoul of CUBI. Most importantly, the state focuses extensively on Google’s purported use of biometric identifiers for purposes of improving its internal AI and associated algorithms. In this respect, the state alleges that Google has turned the collection and retention of Texans’ face and voice biometric identifiers “into a testing ground for AI and other products in its ever-growing, advertising revenue stream.” The state summarizes this use of biometric identifiers as follows:
[E]ach time Google’s algorithms process photos and videos to detect certain faces and objects or process voiceprints to better understand voice data, Google’s underlying AI becomes stronger, better-informed, more efficient, and more dominant. This translates into commercial benefits for Google through both increased revenue and the ability to refine and market application programming interfaces (“APIs”) to developers. As to hardware, Google similarly converts trained AI into improved hardware.
The state’s focus on internal improvement of AI and algorithms is significant, as this is a core, fundamental aspect of the operations of the vast majority of cutting-edge companies involved in the development of AI-powered products and services, but one that is sometimes inadequately disclosed in applicable biometrics-related privacy policies, notices, and similar disclosures. While this use of biometric data, by itself, does not violate CUBI (or other biometric privacy statutes) per se, using biometric data in this manner allegedly caused Google to run afoul of Texas’s biometric privacy law due to the company’s failure to disclose this particular use before capturing biometric identifiers through its various technologically advanced products.
The Final Word
Companies that develop, use, or intend to use biometrics today—even those with no operations or presence in Texas—should make sure to monitor the progress of these two lawsuits for future developments, including how the courts in these actions interpret the key provisions of Texas’s biometrics statute.
At the same time, companies should consult with experienced biometrics counsel to ensure their biometric data processing practices align with the current body of law governing biometrics, as well as industry best practices, because new legislative and regulatory proposals on the horizon are likely to make their way into law in the immediate future, potentially before the end of 2024.