Technology Triggers Privacy Concerns
Even if the technology is accurate, the American Civil Liberties Union and others warn that the use of facial recognition software raises privacy concerns. Law enforcement has used it to aid investigations. The FBI uses several resources, including (1) the Next Generation Identification System and (2) the Facial Analysis, Comparison, and Evaluation Services Unit. The FBI is estimated to have over 411 million photos available to search using its facial recognition technology.
What can be done with all of this biometric data out there for use by law enforcement and others? Surprisingly, there are not many restrictions.
One attempt at regulation is the Illinois Biometric Information Privacy Act (BIPA), which recognizes that "[t]he use of biometrics is growing in the business and security screening sectors and appears to promise streamlined financial transactions and security screenings." The statute covers a "retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry," and it applies to "any individual, partnership, corporation, limited liability company, association, or other group, however organized." State and local government agencies are exempt from the statute. The statute regulates the retention and use of "biometric identifiers" and creates a private right of action, with injunctive relief and attorney fees available to a successful plaintiff. Recently, Texas and Washington have also enacted statutes regulating the use of biometric data, but neither of those states provides a private right of action for aggrieved individuals.
Limited Civil Protections under Current Laws
To date, there is little civil recourse when someone misuses another person's biometric information or fails to protect it adequately, unless the responsible entity runs afoul of the Illinois statute or acts in some other tortious way. And even the BIPA has its limitations.
An early interpretation of the BIPA did not require a showing of actual damages. In Norberg v. Shutterfly, Inc., the defendants operated websites offering facial recognition capabilities that made it possible to identify and categorize photos based on the people in them. The plaintiff alleged that the defendants were using his personal face pattern to recognize and identify him in photographs posted to their websites, that he had never been a user of those websites, and that he had not consented to anyone's use of his biometric identifiers. The court found that the plaintiff had plausibly stated a claim for relief under the BIPA, including a plausible claim for statutory damages, without any discussion of actual damages.
In McCollough v. Smarte Carte, Inc., the proposed class members had used Smarte Carte's rental lockers in Illinois, which renters could open by pressing a fingertip to a touchscreen. They sought damages and injunctive relief and asserted a claim for unjust enrichment, alleging that Smarte Carte had failed to obtain advance consent before saving fingerprint data. Although the court noted that saving fingerprint data without consent is a "technical violation" of the BIPA, it held that the plaintiffs had failed to allege actual harm. Retention, without a disclosure, was not an injury, and the plaintiffs therefore lacked Article III standing.
The U.S. District Court for the Southern District of New York agreed with this analysis in Vigil v. Take-Two Interactive Software, Inc. The putative class members claimed that Take-Two had violated the BIPA by collecting biometric data for use in its video games NBA 2K15 and NBA 2K16. These Illinois residents had used a game feature called MyPlayer to scan their faces and create personalized virtual basketball players. Take-Two had not disseminated their faces or used them for any purpose outside of the game.
Although the plaintiffs agreed to the MyPlayer terms and conditions, they contended that Take-Two had not adequately informed them about how their facial data would be stored and transmitted and that their biometric data was inadequately protected. Moreover, Take-Two had provided neither a retention schedule nor instructions on how to permanently delete the saved facial data. As in McCollough, the Vigil court viewed these claims as technical violations of the BIPA but found no actual harm. While inadequate safeguarding increases the potential for harm, abstract and speculative injuries would not support Article III standing.
Since those cases, the U.S. District Court for the Northern District of Illinois has expressed doubt that the BIPA requires a showing of actual harm. In Monroy v. Shutterfly, Inc., Monroy claimed that a Shutterfly user had uploaded his photo to the website, whereupon Shutterfly automatically located his face, analyzed its geometric contours, and stored the resulting facial map in its database. Monroy himself was not a Shutterfly user.
Shutterfly moved to dismiss the complaint, arguing that Monroy had failed to plead actual damages. The court observed that the question "is a close one" but declined to require a showing of actual damages. After a close textual analysis and comparisons to other privacy statutes, it agreed with Monroy that a claim for liquidated damages is sufficient under the BIPA.
Facial recognition technology will continue to be researched and developed for a host of uses—some we have thought of, some we have not. Few statutes protect against the misuse of this information or prohibit the unauthorized use of an average person's biometric data. While the law is moving with this technology, it is moving at a much slower pace. Crafty litigators will need to stay on top of this developing technology to have the best chance of protecting their clients against the unauthorized use of their biometric information.