
TortSource

Fall 2024

Biometrics Legal Risks and Liability Exposure

David Oberly

Summary

  • Meta Platforms recently agreed to pay $1.4 billion to end a Texas civil enforcement action alleging violations of Texas’s biometrics and UDAP statutes arising from the social media platform’s use of biometrics.
  • Meta drastically increases the scope of liability exposure arising from non-compliance with state UDAP laws, which broadly prohibit “unfair” and “deceptive” acts and practices.
  • The Texas AG’s pursuit of algorithmic disgorgement—an equitable remedy that involves the forced deletion and destruction of not only improperly collected personal data, but also all algorithms and AI models developed or enhanced through the use of such data—is representative of a broader trend in which state and federal privacy regulators continue to increase their reliance on this powerful “fruit of the poisonous tree” remedy.
  • The Texas AG’s success in extracting a ten-figure settlement for the misuse of biometric technologies—under a law with no private right of action, but a high maximum, per-violation civil penalty amount—will provide significant motivation to lawmakers in other jurisdictions in 2025.
  • Post-Meta, companies and their in-house legal teams must ensure that their organizational biometrics compliance programs align not only with the requirements of the well-known Illinois Biometric Information Privacy Act (BIPA)—but also with the larger patchwork of U.S. biometrics laws—to fully manage and mitigate applicable legal risk, which will only continue to grow as time progresses.

In late July 2024, Meta Platforms, Inc. (Meta) agreed to pay $1.4 billion to resolve a civil enforcement action brought by the Texas Attorney General (AG) involving alleged violations of the Texas Capture or Use of Biometric Identifiers Act (CUBI) and the Texas Deceptive Trade Practices Act (DTPA) arising from its use of biometrics.

The Meta CUBI settlement is a watershed moment in the biometrics space. In addition to illustrating the outsized legal risks and liability exposure arising from today’s fragmented patchwork of U.S. biometrics laws and regulations, Meta highlights the need for maintaining comprehensive, enterprise-wide biometrics compliance programs that facilitate strict, ongoing compliance with the law. More than that, the Meta CUBI settlement provides several valuable takeaways and lessons that companies and their in-house legal teams can utilize to address and mitigate the growing risks of deploying biometrics in commercial operations.

The Meta Enforcement Action and Settlement

In Meta, the Texas AG alleged that, unbeknownst to its end users, the social media giant used face biometrics tools across various platforms without providing notice or obtaining consent before collecting biometric data. The AG further alleged that Meta disclosed its end users’ biometric data to third parties in violation of CUBI and failed to destroy biometric data within the periods mandated by Texas law. Notably, the AG also claimed that Meta violated CUBI by using biometric data collected from end users internally to train its artificial intelligence (AI) models and algorithms and enhance its face biometrics tools, all without end users’ informed consent.

The Texas AG also asserted separate claims for purported non-compliance with Texas’s unfair and deceptive acts and practices (UDAP) statute, the DTPA, by (among other things): (1) misrepresenting the extent to which it collected and used biometric data; (2) failing to disclose information regarding its biometric data processing activities to its end users; and (3) publicly claiming that some of its platforms did not use any form of biometrics, when, in reality, Meta continuously performed face geometry scans on all uploaded photos and videos.

As a result of these purported violations, the Texas AG sought the maximum civil penalty amounts of $25,000 per CUBI violation and $10,000 per DTPA violation, as well as algorithmic disgorgement.

On July 30, 2024, the Texas AG announced its $1.4 billion settlement with Meta to resolve the CUBI enforcement matter. The $1.4 billion price tag is by far the most significant monetary settlement of a single action arising out of purported non-compliance with U.S. biometrics laws, surpassing the previous record of $650 million that Meta paid to end a longstanding Illinois Biometric Information Privacy Act (BIPA) class action suit, which also arose from its use of face biometrics. The $1.4 billion figure also represents the largest monetary settlement to date arising from a single state AG privacy-related enforcement action, shattering the previous record of $390 million that Google paid in 2022 to resolve claims relating to its allegedly improper use of geolocation data.

Unlike BIPA, CUBI does not provide a private right of action permitting class action litigation. Instead, the Texas AG maintains exclusive authority to enforce Texas’s biometrics statute, including by imposing civil penalties of up to $25,000 per violation. Despite the law’s high maximum civil penalty amount, many viewed CUBI, enacted in 2009, as a paper tiger due to the absence of any enforcement activity for more than a decade after its enactment. The Meta settlement, however, establishes CUBI as a source of significant liability exposure and the Texas AG as a strong biometrics enforcer.

Takeaways and Lessons

Unfair or Deceptive Acts and Practices (UDAP) Claims

The most consequential impact of Meta is the drastically increased liability exposure that now exists in connection with violations of state UDAP statutes. Unlike the relatively small patchwork of biometrics laws currently in effect, all 50 states have enacted UDAP laws, which broadly prohibit “unfair” and “deceptive” conduct. While sometimes referred to as “Little FTC Acts” due to the similarities these statutes share with Section 5 of the Federal Trade Commission Act (Section 5), state UDAP laws present markedly broader liability exposure than Section 5.

This is attributable to several significant differences between UDAP laws and Section 5, the first of which is the high civil penalty amounts allowable under UDAP statutes, which range from $1,000 to $50,000 per violation. Even in those states with relatively low maximum penalty amounts, companies still face substantial UDAP exposure because civil penalties are assessed on a “per violation” (as opposed to “per person”) basis. In addition, most UDAP laws permit state AGs to retain outside lawyers on a contingency fee basis to pursue UDAP civil actions on behalf of the state, a dynamic that creates strong financial incentives for outside attorneys to seek maximum civil penalties rather than an injunction or other corrective action that would often be more appropriate. Finally, beyond AG enforcement activity and high civil penalties, all UDAP statutes include private right of action provisions, with many further permitting class or representative actions.

The Meta CUBI settlement has garnered tremendous media coverage, placing Texas’s biometrics statute and its UDAP law front and center in the minds of lawmakers, regulators, and class action attorneys alike. Moving forward, the record monetary settlement that the Texas AG secured, at least in part, by asserting UDAP violations tied to corporate biometrics practices may motivate other state AGs to pursue UDAP enforcement actions of their own for what they view as improper biometrics practices. These risks are amplified because, in recent years, state and federal privacy regulators have significantly increased their focus on policing the collection and use of sensitive data—and biometric data in particular. At the same time, the always-enterprising plaintiff’s class action bar has likely taken note of Meta’s UDAP component, which may prompt an uptick in class action filings asserting UDAP causes of action arising from allegedly improper biometrics practices.

Algorithmic Disgorgement

A second major takeaway is the Texas AG’s pursuit of algorithmic disgorgement against Meta. Algorithmic disgorgement is an equitable remedy that involves the forced deletion and destruction of not only improperly collected personal data but also all algorithms and associated AI models developed or enhanced through such data.

In Meta, the imposition of algorithmic disgorgement would have forced the social media company to destroy not only all data obtained through methods that ran afoul of CUBI but also all algorithms and neural networks that it had developed, trained, or improved on that illegally obtained data.

From a broader perspective, this aspect of the Meta matter reflects a broader trend in which state and federal regulators continue to increase their reliance on this powerful remedy. Most notably, the Federal Trade Commission (FTC) has imposed algorithmic disgorgement in a number of recent enforcement actions implicating biometrics or other advanced technologies. As just one example, the nation’s de facto privacy and security regulator imposed algorithmic disgorgement against photo developer Everalbum for its improper use of customer data to train and enhance its internal face biometrics AI models and algorithms, forcing the company to delete not only all photos and other user data it had improperly collected, but also all face biometrics algorithms created or improved through this ill-gotten data. In addition, Colorado’s recently enacted biometrics law, HB 1130, includes disgorgement as a remedy for non-compliance. Many new consumer privacy laws likewise include disgorgement as a remedy that state AGs can impose to enforce these more comprehensive privacy regulatory regimes.

Moving forward, disgorgement is likely to become a ubiquitous tool for state AGs and the FTC alike in their future enforcement actions, particularly those that implicate the use of biometric technologies or the collection and use of biometric data.

CUBI Copycat Legislation

As discussed above, Meta’s ten-figure monetary settlement has garnered widespread attention and interest from federal, state, and municipal lawmakers. The Texas AG’s success in securing such a large settlement against one of the world’s leading technology companies—under a law with no private right of action but a high maximum per-violation civil penalty amount—will likely encourage other legislators to push to enact their own biometrics legislation modeled after CUBI.

In particular, lawmakers who are interested in enacting legislation to directly regulate the collection and use of biometric data but who are opposed to a private right of action enforcement mechanism may find significant motivation in Meta to introduce copycat legislation modeled after CUBI—which the Meta settlement showed has significant teeth despite the absence of a private right of action permitting class action litigation.

With that said, any CUBI copycat laws enacted moving forward will also likely come with their own nuances and unique compliance components, which will significantly increase the compliance burdens imposed on companies that develop, supply, or use biometric technologies. As more biometrics statutes are enacted into law, companies will see a corresponding rise in the potential legal and regulatory pitfalls associated with using biometrics.

Practical Compliance Tips & Strategies

Post-Meta, companies and their in-house legal teams must ensure that their biometrics compliance programs align not only with BIPA’s core legal obligations—but also with the more extensive patchwork of U.S. biometrics laws—to fully manage and mitigate applicable legal risk, which will only continue to rise as time progresses.

CUBI-Specific Strategies

In particular, companies should consider the following strategies to address the critical CUBI non-compliance issues discussed above and illustrated by the Meta CUBI settlement:

  • State UDAP Claims. To address and mitigate UDAP liability exposure, companies should consider implementing enhancements to the following aspects of their biometrics compliance programs: (1) privacy by design; (2) governance; (3) risk assessments; (4) transparency, including clear and conspicuous notices and public-facing disclosures that detail how biometric data is collected, used, and shared; (5) choice, including obtaining pre-collection informed and explicit consent; (6) third-party management, including updating contracts between biometric technology vendors and customers to ensure strict legal compliance and accountability; (7) data security; (8) accuracy and completeness in all organizational statements and representations; and (9) employee/contractor training.
  • Algorithmic Disgorgement. To address risks associated with algorithmic disgorgement, companies should have mechanisms for obtaining informed consent from all end users and other data subjects, which cover the company’s external and internal uses of biometric data.
  • CUBI Copycat Legislation. Companies should remain cognizant of the increased likelihood that new CUBI copycat legislation will be enacted during the 2025 legislative cycle, which would bring with it liability exposure on a scale commensurate to that of BIPA, even in the absence of a private right of action.

Universally-Applicable Strategies

From a broader perspective, companies and their in-house legal teams should consider the following universally applicable strategies to mitigate further risk and liability exposure to the greatest extent feasible:

  • Privacy Policies & Related Public-Facing Disclosures. Privacy policies and related public-facing disclosures should contain information regarding the company’s biometric data processing practices, including all current and reasonably foreseeable purposes for which the company uses or may use biometric data, both internally and externally, as well as the company’s data retention policy and schedule for permanently deleting biometric data.
  • Informed Consent. Informed consent should be obtained from all end users and data subjects before collecting biometric data. Informed consent forms should contain all information disclosed in the company’s public-facing biometrics disclosures (discussed above) and clear language stating that the end user/data subject expressly agrees and consents to the company’s use and disclosure or sharing of his or her biometric data.
  • Data Retention & Destruction. Internal policies and mechanisms should be in place to permanently dispose of biometric data within the retention period(s) mandated by applicable biometrics laws.
  • Sharing & Disclosure of Biometric Data. Mechanisms should be in place to ensure biometric data is not disclosed to or shared with any third parties unless express consent has been obtained from end users/data subjects allowing for the disclosure of their biometric data or where sufficient legal grounds exist under applicable biometrics laws that permit such disclosures.
  • Transactional Prohibition. Internal policies and mechanisms should be in place that strictly bar the company, its employees, and any related third parties from engaging in any activities that could be construed as leasing, selling, or otherwise “profiting from” biometric data.
  • Data Security. Proper security measures should be in place that satisfy the general standard imposed by biometrics laws, i.e., that biometric data is safeguarded using “reasonable care” and in a manner that is the same as or more protective than the manner in which the company stores, transmits, and protects other types of confidential and sensitive data.

Third-Party Risk Management Strategies

In addition to ensuring compliance, companies must also take proactive steps to mitigate the considerable legal risks and liability exposure that arise from third-party relationships. In particular, there are two significant risks that companies must address: (1) legal and regulatory compliance risks, which arise from a third party’s failure to satisfy all legal requirements that govern how it must handle, disclose, and store biometric data; and (2) security risks, which arise from a third party’s deficient or missing security controls. To do so, companies and their in-house legal teams should consider the following strategies:

  • Due Diligence & Vetting. Before entering into any contractual or other relationship with a vendor, customer, or other related third party, thorough due diligence and vetting of all such third parties should be conducted to identify potential risks and to ensure collaboration with only those outside entities that can maintain strict legal and regulatory compliance, while also adequately safeguarding the company’s biometric data. The following areas and issues should be prioritized during the diligence and vetting process: (1) legal and regulatory compliance programs; (2) legal and regulatory non-compliance history; (3) personnel vetting and training; (4) security and incident response programs; (5) security certifications; and (6) security incident history.
  • Contracts. All contracts with third parties that will have access to company biometric data should consider the principal issues and potential pitfalls associated with using biometric technologies. In particular, contracts should cover the following key issues: (1) compliance with applicable law; (2) indemnification; (3) limitations of liability; (4) allocation of responsibilities for satisfying biometrics-related legal obligations; (5) minimum data security standards; (6) security incident standards, cooperation, and reimbursement of remediation expenses; and (7) cyber insurance coverage.
  • Ongoing Monitoring. After entering into a contractual agreement with a third party that will handle or otherwise have access to company biometric data, ongoing monitoring must continue for the duration of the contractual relationship. All third parties should be reviewed regularly, such as through written questionnaires. Higher-risk third parties should be subject to a more thorough and exacting review, such as through external audits, to confirm ongoing legal and regulatory compliance and conformity with security-related industry best practices.

The Final Word

Post-Meta, it is imperative for companies that develop, supply, or use biometrics to double down on their compliance efforts while also applying a nationwide scope to compliance. As the Meta CUBI enforcement action shows, legal and regulatory compliance when using commercial biometrics is now a national issue, and companies can no longer approach compliance in a “one-size-fits-all” manner. By implementing the strategies discussed above, companies and their in-house legal teams can manage the complex legal pitfalls associated with deploying biometrics today while maximizing the myriad benefits that advanced biometric technologies offer.
