
Litigation News


Facial Recognition Technology Hits Privacy Rights Crossroads

David Jeremy Simmons

Summary

  • Facial recognition technology is the computerized verification of a person's identity from a digital image.
  • It has grown by leaps and bounds in the past 15 years and is being used by, among others, the New York State Department of Motor Vehicles, companies tracking how you react when watching movies, and the FBI.
  • Personal privacy rights are at issue in the gathering, retention, and use of this very personal data. 


Facial recognition technology, the computerized verification of a person's identity from a digital image, is not limited to science fiction movies and our imaginations. It will soon have a much greater impact on our lives. It has grown by leaps and bounds in the past 15 years and is being used by, among others, the New York State Department of Motor Vehicles, companies tracking how you react when watching movies, and the FBI. Now, new iPhone users will experience this technology.

Personal privacy rights are at issue in the gathering, retention, and use of this very personal data. The first step in advising clients on this new technology and its impact on their privacy is understanding where the technology is being used and how facial recognition works.

How Is Facial Recognition Technology Being Used?

Currently, facial recognition software is used primarily to identify individuals for law enforcement purposes and to pay for goods and services. But companies are developing several other uses. For instance, facial recognition software is being developed for dating sites to match people with similar facial characteristics. Schools may be using it to take attendance and to see whether students are paying attention. Companies such as Apple, Amazon, and Facebook have invested in this technology and are developing even more ways to use it. But how does this technology work?

There are several ways this specialized recognition technology can capture your face. Under one method, algorithms identify and extract landmarks or features from your face, such as the position, size, or shape of your eyes, nose, cheekbones, and jaw. Additional algorithms process a gallery of saved faces, normalizing the facial data within the gallery and extracting the features useful for comparison. The new image is normalized in the same way and then compared against the gallery to find a match.
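To make that comparison step concrete, the sketch below is a minimal, hypothetical version of landmark-based matching in Python. It assumes the landmark measurements have already been extracted into fixed-length numeric vectors; the function names, feature values, and threshold are illustrative, not any vendor's actual algorithm.

```python
# Hypothetical sketch of landmark-based face matching: normalize feature
# vectors, then find the closest gallery entry under a distance threshold.
import math

def normalize(vec):
    """Scale a feature vector to unit length so faces of different sizes are comparable."""
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec] if norm else vec

def distance(a, b):
    """Euclidean distance between two normalized feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(probe, gallery, threshold=0.25):
    """Compare a probe face against a gallery of saved faces and return the
    closest identity, or None if no gallery face is close enough."""
    probe = normalize(probe)
    best_name, best_dist = None, float("inf")
    for name, features in gallery.items():
        d = distance(probe, normalize(features))
        if d < best_dist:
            best_name, best_dist = name, d
    return best_name if best_dist <= threshold else None

# Usage: the gallery maps identities to stored feature vectors
# (e.g., eye spacing, nose width, jaw ratio); values are invented.
gallery = {
    "alice": [0.42, 0.31, 0.18, 0.09],
    "bob":   [0.35, 0.40, 0.12, 0.13],
}
print(best_match([0.41, 0.30, 0.19, 0.10], gallery))  # prints "alice"
```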

Another technology uses sensors to capture a three-dimensional image of the shape of your face and identify distinct features. Yet another method performs a skin texture analysis, scrutinizing and mapping the visual details of the skin. Skin texture analysis turns the unique lines, patterns, and spots in your skin into a mathematical representation.
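For a sense of how texture becomes numbers, the sketch below is a simplified, hypothetical illustration in the spirit of skin texture analysis: it computes a local binary pattern histogram over a tiny grayscale patch. Real systems work on far larger images with more sophisticated descriptors, and the sample pixel values are invented.

```python
# Simplified local binary pattern (LBP) histogram: each interior pixel is
# encoded by comparing it to its 8 neighbors, and the codes are histogrammed.
# The histogram is the "mathematical representation" of the texture.

def lbp_histogram(patch):
    hist = [0] * 256
    for r in range(1, len(patch) - 1):
        for c in range(1, len(patch[0]) - 1):
            center = patch[r][c]
            neighbors = [patch[r-1][c-1], patch[r-1][c], patch[r-1][c+1],
                         patch[r][c+1], patch[r+1][c+1], patch[r+1][c],
                         patch[r+1][c-1], patch[r][c-1]]
            code = 0
            for bit, value in enumerate(neighbors):
                if value >= center:
                    code |= 1 << bit
            hist[code] += 1
    return hist

# Usage: two patches of the same skin region should yield similar histograms.
patch = [
    [52, 60, 61, 58],
    [55, 70, 66, 59],
    [54, 68, 72, 61],
    [50, 57, 62, 63],
]
print(sum(lbp_histogram(patch)))  # 4 interior pixels encoded
```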

Apple's recent iPhone release uses Face ID and the TrueDepth camera to analyze more than 30,000 invisible dots and create a precise depth map of your face. Machine learning, running on the A11 Bionic chip, accounts for changes in your appearance, so you do not have to worry that your phone will not recognize you if you decide to grow a beard. But this technology is imperfect: at the iPhone X launch event, the phone failed to recognize the presenter.

Technology Triggers Privacy Concerns

Even if the technology is accurate, the American Civil Liberties Union and others warn that the use of facial recognition software raises privacy concerns. It has been used by law enforcement to aid investigations. The FBI uses several technologies, including (1) the Next Generation Identification System and (2) the Facial Analysis, Comparison, and Evaluation Services Unit. It is estimated that the FBI has over 411 million photos available to search using its facial recognition technology.

What can be done with all of this biometric data out there for use by law enforcement and others? Surprisingly, there are not many restrictions.

One attempt at regulation is the Illinois Biometric Information Privacy Act (BIPA), which recognizes that "[t]he use of biometrics is growing in the business and security screening sectors and appears to promise streamlined financial transactions and security screenings." The statute covers a "retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry," and it applies to "any individual, partnership, corporation, limited liability company, association, or other group, however organized." State and local government agencies are exempt from the statute. The statute regulates the retention and use of "biometric identifiers" and creates a private right of action, with injunctive relief and attorney fees available to a successful plaintiff. Recently, Texas and Washington have also enacted statutes regulating the use of biometric data, but neither state provides a private right of action for aggrieved individuals.

Limited Civil Protections under Current Laws

To date, there is little civil recourse if a company misuses another person's biometric information or fails to protect it adequately, unless the company runs afoul of the Illinois statute or acts in some other tortious way. And even the BIPA has its limitations.

An early interpretation of the BIPA did not require a showing of actual damages. In Norberg v. Shutterfly, Inc., the defendants operated websites that offered facial recognition capabilities, making it possible to identify and categorize photos based on the people within them. The plaintiff alleged that the defendants were using his personal face pattern to recognize and identify him in photographs posted to those websites, that he had never been a user of the defendants' websites, and that he did not consent to have his biometric identifiers used by anyone. The court found that the plaintiff had plausibly stated a claim for relief under the BIPA, including a claim for statutory damages, without any discussion of actual damages.

In McCollough v. Smarte Carte, Inc., proposed class members had used Smarte Carte's rental lockers in Illinois, which renters could open by pressing a fingertip to a touchscreen. They sought damages, injunctive relief, and unjust enrichment, claiming that Smarte Carte failed to obtain advance consent before saving fingerprint data. Although the court noted that saving fingerprint data without consent is a "technical violation" of the BIPA, it held that the plaintiffs failed to allege actual harm. Retention alone, without disclosure, was not an injury, and therefore the plaintiffs lacked Article III standing.

The U.S. District Court for the Southern District of New York agreed with this analysis in Vigil v. Take-Two Interactive Software, Inc. The putative class members claimed that Take-Two had violated the BIPA by collecting biometric data for use in its video games NBA 2K15 and NBA 2K16. These Illinois residents had used a game feature called MyPlayer to scan their faces and create personalized virtual basketball players. Take-Two had not disseminated their faces or used them for any purpose outside of the game.

Although the plaintiffs agreed to the MyPlayer terms and conditions, they contended that Take-Two had not adequately informed them about how their facial data would be stored and transmitted and that their biometric data was not adequately protected. Moreover, Take-Two had not provided a retention schedule or instructions on how to delete the saved facial data permanently. As in McCollough, the Vigil court viewed these claims as technical violations of the BIPA but found no actual harm. While inadequate safeguarding enhances the potential for harm, abstract and speculative injuries would not support Article III standing.

Since those cases, the U.S. District Court for the Northern District of Illinois expressed doubt regarding a BIPA actual harm requirement. In Monroy v. Shutterfly, Inc., Monroy claimed that a Shutterfly user had uploaded his photo to the website. In turn, Shutterfly automatically located his face, analyzed its geometric contours, and stored this facial map in its database. Monroy was not a Shutterfly user.

Shutterfly moved to dismiss the complaint, arguing that Monroy had failed to plead actual damages. The court observed that the question "is a close one" but declined to require a showing of actual damages. After a close textual analysis and comparisons to other privacy acts, it agreed with Monroy that a claim for liquidated damages is sufficient under the BIPA.

Facial recognition technology will continue to be researched and developed for a host of uses—some we have thought of, some we have not. There are few statutes protecting against the misuse of this information or prohibiting the unauthorized use of an average person's biometric data. While the law is moving with this technology, it is moving at a much slower pace. Crafty litigators will need to stay on top of this developing technology for the best chance of protecting against the unauthorized use of their clients' biometric information.
