The private right of action is one of the most controversial aspects of the privacy laws being proposed around the country. With a private right of action, plaintiffs’ attorneys effectively enforce the privacy law, constantly seeking out potential defendants who are allegedly violating it. Without one, state attorneys general must decide whom to sue, whether the resources to sue exist, and whether suing is politically wise. Companies are often adamantly opposed to laws creating a private right of action, as such suits can result in large, complex class actions lasting for years and, potentially, very large judgments and settlements.
One result of the Illinois BIPA’s private right of action is that many companies, online and off, are either stopping their use of biometric identification or more carefully obtaining opt-in consent from their customers and employees. The door opened for class actions and large judgments in 2019, when the Illinois Supreme Court ruled in Rosenbach v. Six Flags that BIPA did not require a showing of actual damages, only a showing that a violation occurred. Then in February 2022, the Illinois Supreme Court held in McDonald v. Symphony Bronzeville Park that the Illinois Workers’ Compensation Act does not shield companies from statutory BIPA damages. McDonald involved a nursing home that collected employees’ fingerprints without their consent; the court found that the Act’s exclusivity provisions did not bar the employees’ BIPA claims for statutory damages.
In 2021, Facebook paid $650 million in a historic settlement of a BIPA lawsuit, with class members to receive at least $345 each, though the payments have been delayed. Notably, Facebook announced it would stop using facial recognition just a few months later. Plaintiffs and their attorneys have also sued other web platforms under BIPA, including TikTok, Snapchat, and Google. In 2021, TikTok announced that it had settled an Illinois class action for $92 million. Shortly thereafter, in June 2021, TikTok changed its privacy policy to state that TikTok “may collect biometric identifiers” including “faceprints and voiceprints.” Plaintiffs filed a class action against Snapchat in 2020 for violations of BIPA; the case is currently before the Seventh Circuit on the question of whether the minor plaintiff is bound by the arbitration requirement in Snapchat’s terms and conditions.
Microsoft, Amazon, and Shutterfly have also been sued for alleged BIPA violations. Reportedly, the Microsoft and Amazon cases involved photos uploaded to Flickr that IBM later used to “train” facial recognition software to more accurately identify people of color, in a project called “Diversity in Faces.” Microsoft and Amazon then used the IBM training database to improve their own facial recognition systems.
Non-web firms have also been sued under BIPA. In 2021, for example, Six Flags settled the Rosenbach class action for $36 million over fingerprints it had collected without consent.
Other States Take Action
Other states have also passed statutes limiting companies’ biometric use, but none with the “teeth” of a private right of action like Illinois’s BIPA. Texas was one of those states: in 2009, it passed the “Capture or Use of Biometric Identifier Act,” or CUBI. CUBI imposes a penalty of “not more than” $25,000 for each violation but, unlike Illinois’s BIPA, provides no private right of action. In February 2022, Texas Attorney General Ken Paxton filed suit against Facebook under CUBI, claiming that Facebook owed the state billions for collecting the biometric data of more than 20 million Texas residents without their consent.
Still other states have passed laws limiting law enforcement’s use of facial recognition and biometric data. In October 2020, Vermont passed the “Moratorium on Facial Recognition Technology,” prohibiting law enforcement from using facial recognition. The Moratorium provides that “a law enforcement officer shall not use facial recognition technology or information acquired through the use of facial recognition technology unless the use would be permitted with respect to drones….” Notably, the Vermont law expanded the definition of facial recognition to include recognition of “sentiment”:
“Facial recognition” means… the automated or semiautomated process by which the characteristics of a person’s face are analyzed to determine the person’s sentiment, state of mind, or other propensities, including the person’s level of dangerousness.
The COVID pandemic has been a busy time for new facial recognition laws. In 2021, Virginia enacted the “Facial recognition technology; authorization of use by local law-enforcement agencies” legislation (HB 2031) prohibiting local law enforcement and campus police from “purchasing or deploying” facial recognition. The Virginia statute did not, however, prevent local law enforcement from using facial recognition deployed by others. And by restricting only “local law-enforcement agencies,” the law left other Virginia law enforcement agencies free to use the technology. Interestingly, the law addressed only facial recognition, not the recognition of gait, fingerprints, voiceprints, or state of mind.
The same year, Massachusetts passed the “Facial and Other Remote Biometric Recognition” legislation limiting law enforcement’s use of facial recognition. The law’s definition of facial recognition expressly included the “characteristics of an individual’s face, head or body to infer emotion, associations, activities or the location of an individual… gait, voice or other biometric characteristic.” The law required a court order before facial recognition could be used, except in an emergency involving an immediate risk of harm to a person. And it covered all law enforcement agencies in the state, not just local agencies as in Virginia.
In 2021, Maine passed the “Act To Increase Privacy and Security by Prohibiting the Use of Facial Surveillance by Certain Government Employees and Officials,” which is similar to Massachusetts’s “Facial and Other Remote Biometric Recognition” legislation. Maine’s law, however, applied to all government employees, not just law enforcement. It also allowed government employees to use facial recognition without a court order when investigating a “serious crime,” provided there was “probable cause to believe that an unidentified individual in an image has committed the serious crime,” or under a limited number of additional exceptions. Massachusetts, by contrast, required an order from a court that issues criminal warrants. In 2021, Utah passed a law similar to Maine’s, limiting the government’s use of facial recognition to investigations where there is a “fair probability” the individual is connected to the crime.
In other states:
- New York passed a 2021 law prohibiting facial recognition in schools.
- Washington state passed a law prohibiting government agencies from using facial recognition except with a warrant or in an emergency.
- In 2014, New Hampshire restricted government agencies’ use of biometric data but allowed them to use it to solve a crime without a warrant.
- A 2020 Maryland law prohibits employers from using facial recognition during interviews without the applicant’s signed consent.
- California passed a law banning law enforcement from using facial recognition in body cameras, though not in other police surveillance cameras; the ban expires on January 1, 2023.
- Similarly, Oregon barred law enforcement from using facial recognition on body cameras.
While there appears to be a new trend in privacy rights among the states, the majority of states, like Colorado and Montana, have failed in their attempts to enact facial recognition legislation. Today, as when Justice Brandeis opined on the topic 94 years ago, we are still balancing our right to privacy from law enforcement against our fear of crime and the need to let law enforcement act freely. And while Illinois, Texas, and California are limiting private companies’ use of biometric data without prior opt-in consent, most states have not, for now, enacted regulation preventing private firms from using the technology.
While the federal government has not addressed the thorny issue of facial recognition, states are taking matters into their own hands. Both the left and the right of the political spectrum are seeking to curb law enforcement’s use of facial recognition and biometric software. And Illinois’s private right of action has produced results, keeping companies in line with regard to privacy rights. We should expect more state legislation granting private rights of action for violations of facial recognition and biometric data limits, particularly in states with strong plaintiffs’ bars.