Is your beautiful smile sufficient to pay for your next latte? Can the sound of your voice open doors? Biometric identification systems are rapidly increasing in use, as advances in sensors, readers, and software make physical characteristics easily measurable. Biometrics are simply measurements of a person’s physical being. Fingerprints, retinal or iris scans, hand geometry, facial recognition, gait analysis, voiceprint reading, and even keystroke analysis are all simple biometric ways to identify a person. DNA collection, review, and comparison may tell the most about a person, but for many functions, a quick look at the face is all that is needed.
Considering the accuracy and ease of use, it should come as no surprise that commercial use of biometrics has exploded in recent years. Biometric data is being incorporated into authentication processes for telephone calls, physical devices, and online applications. Banks regularly use voiceprint (a digitized representation of the sound of your voice) to authenticate account holders when they call customer service lines. In February 2016, MasterCard announced it would accept “selfies” in lieu of passwords for cardholders to sign in to their accounts using their faceprints (digitized representations of people’s facial features). Similarly, Amazon filed a patent, published in March 2016, for a program that will allow users to authorize purchases by taking selfies while performing actions at the program’s request, such as smiling or winking, to demonstrate the subject is the living user and not a photograph of him or her.
However, the use of biometrics for identification has pitfalls as well. For example, a person’s biometric data cannot be replaced or changed like her password or credit card number. You only get one set of fingerprints, retinal patterns, and facial features, and once compromised, the biometric measurement may be lost as an identifier. Also, biometric identifiers can be collected and used surreptitiously: faces and gaits can be measured in a crowd or at a doorway, and voices can be recorded from phone calls. Biometric information allows individual identification more easily than traditional identifiers do. All it takes to identify an individual is to collect his faceprint one time and use it forever.
Such use may lend itself to a high level of government surveillance. The FBI is currently developing the “Next Generation Identification” program, which will collect fingerprints, iris scans, DNA profiles, voiceprints, palm prints, and photographs, and will replace its current identification program, which contains only fingerprints. It is rumored that the FBI’s new program will also be interoperable with biometric databases maintained by the Department of Homeland Security and the Department of Defense, thus presenting vast possibilities for matching identifiers.
Given these advantages and risks, biometric capture and authentication could not long evade the gaze of legislatures and regulators, and legal limitations have begun to appear in earnest. This article examines the laws that govern use of our biometric data and outlines the steps businesses should follow when implementing authentication technologies that employ biometric data. Currently, two sources of law affect the use of biometric information by private and government actors: laws specifically addressing the use of biometric identifiers and broad privacy laws that include biometric information in their definitions of personal information.
Privacy Laws Specifically Targeted to Biometric Information
A few states have enacted legislation specifically to regulate third parties’ use and collection of individuals’ biometric information. State laws concerning biometric information fall roughly into one of three categories: (1) laws with respect to the collection and use of biometric information belonging to students; (2) laws dealing with collection by government actors; and (3) laws targeting the collection and use of biometric information by businesses.
Student Biometric Information
California law prohibits operators of websites geared toward K-12 school purposes from selling students’ biometric data and restricts its use. Delaware has a similar law. In North Carolina and West Virginia, student biometric data may not be kept in student data systems.
Illinois law prohibits school districts from collecting biometric information from students without parental consent. A district must stop using such information when the student graduates or leaves the district, or when the district receives a written request from the student, and all biometric information must be destroyed within 30 days of discontinued use. The school district may use biometric information only for student identification or fraud prevention and may not sell or disclose it to third parties without parental consent or pursuant to a court order. Arizona, Wisconsin, Louisiana, and Kansas have similar laws. Colorado law prohibits its Department of Education from collecting student biometric information unless required by state or federal law. A new Florida law enacted in 2014 goes even further than the foregoing state laws by prohibiting schools from collecting, obtaining, or retaining biometric information from students, their parents, or their siblings.
Government Actors Collecting Biometric Information
Missouri, Maine, and New Hampshire laws prevent state agencies from collecting, storing, or using individuals’ biometric data in connection with ID cards or driver’s licenses. Neither these laws nor any existing laws prohibit government actors from collecting or using biometric information in connection with law enforcement, immigration, border security, or national security.
Collection of Biometric Information by Businesses
The first state law to address businesses’ collection of biometric data was the Illinois Biometric Information Privacy Act, 740 ILCS 14 et seq. (BIPA), enacted in 2008, followed shortly thereafter by Texas’s biometric law, contained in Section 503.001 of the Texas Business and Commercial Code, effective in 2009. BIPA sets forth a comprehensive set of rules for companies collecting biometric data and creates a private right of action for Illinois residents whose biometric data is collected or used in violation of BIPA’s rules. Generally, BIPA is composed of five primary elements. BIPA: (1) requires informed consent prior to collection; (2) prohibits profiting from biometric data; (3) permits only a limited right to disclose; (4) mandates protection obligations and retention guidelines; and (5) creates a private right of action for individuals harmed by violators of BIPA.
First, and most notably, BIPA prohibits a business from collecting or receiving biometric data without first notifying the individual of such collection in writing. The notice must include the purpose for the collection and how long the data will be used or stored, and the business must receive the individual’s written consent to such collection. The form and content of the written release are not prescribed by BIPA, and BIPA gives no guidance as to whether electronic methods of consent are permissible. One could expect that “click-wrap” agreements, in which terms and conditions are provided electronically along with an “accept” button when a website is accessed or a service is requested on the internet, likely constitute written consent under BIPA. Conversely, BIPA’s written consent requirement is not likely satisfied by “browse-wrap” agreements, which merely place terms and conditions of use somewhere on a webpage without requiring any affirmative action to accept them.
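The consent rule described above amounts to a gate in front of any collection step. The sketch below is purely illustrative: `ConsentRecord` and `may_collect_biometric` are hypothetical names, not part of any real compliance library, and the treatment of click-wrap versus browse-wrap reflects the expectation stated above rather than settled law.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ConsentRecord:
    """A record of the written release BIPA requires before collection."""
    purpose: str                # purpose of collection disclosed to the subject
    retention_period_days: int  # how long the data will be used or stored
    mechanism: str              # e.g. "click-wrap", "browse-wrap", "paper"

def may_collect_biometric(consent: Optional[ConsentRecord]) -> bool:
    """Refuse collection unless an affirmative written release exists.

    Click-wrap (an explicit "accept" action) likely qualifies as written
    consent; browse-wrap (terms merely posted on a page) likely does not.
    """
    if consent is None:
        return False
    if not consent.purpose or consent.retention_period_days <= 0:
        return False  # the notice must state both purpose and duration
    return consent.mechanism in {"click-wrap", "paper", "e-signature"}
```

A browse-wrap record fails the final check, so collection is blocked even though some terms were technically presented to the subject.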
Second, BIPA prohibits a business from selling or otherwise profiting from biometric data it collects or stores. The law uses the vague phrase “otherwise profit” from the use of the biometric data, but is silent as to how direct the causal link must be between the profit and the data to qualify as a violation of the provision. This section of BIPA may be fodder for interesting lawsuits in the future.
Third, the Illinois law prohibits a business from disclosing an individual’s biometric data unless (i) the subject consents; (ii) the disclosure completes a financial transaction requested by the individual; (iii) the disclosure is required by Illinois law, municipal ordinance, or federal law; or (iv) the disclosure is required by a valid warrant or subpoena.
Fourth, BIPA requires a business to protect biometric data in the same manner it would other sensitive and confidential information in its possession, using the reasonable standard of care within its industry. In addition, the Illinois law requires a business in possession of biometric data to have a publicly available, written policy stating the business’s retention schedule for the data and rules governing its destruction, and the business must adhere to that policy. (BIPA, § 15(a)). A business may not store biometric data beyond the earlier of three years after the individual’s last interaction with the company or the point at which the initial purpose for collecting the data has been fulfilled.
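The retention limit reduces to a small “earlier of two dates” computation. The helper below is a hypothetical sketch, not legal advice; in particular, it approximates “three years” as 1,095 days, and the statute does not specify how to count the period.

```python
from datetime import date, timedelta
from typing import Optional

THREE_YEARS = timedelta(days=3 * 365)  # rough approximation of three years

def destruction_deadline(last_interaction: date,
                         purpose_fulfilled: Optional[date]) -> date:
    """Latest date biometric data may be kept under BIPA's retention rule:
    the earlier of (a) three years after the individual's last interaction
    with the company, or (b) the date the initial purpose of collection
    was fulfilled."""
    three_year_mark = last_interaction + THREE_YEARS
    if purpose_fulfilled is None:
        return three_year_mark
    return min(three_year_mark, purpose_fulfilled)
```

If the collection purpose is satisfied before the three-year mark, the earlier date controls and the data must be destroyed then.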
Finally, BIPA gives any person harmed by a business’s violation of BIPA a private right of action. A prevailing party is entitled to statutory damages for each violation equal to the greater of $1,000 or actual damages for a negligent violation of BIPA, or the greater of $5,000 or actual damages for an intentional or reckless violation.
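The per-violation recovery is a simple “greater of” formula, which can be stated as a one-line computation. `bipa_statutory_damages` is a hypothetical illustration of that arithmetic only; it ignores attorneys’ fees and other relief the statute may allow.

```python
def bipa_statutory_damages(actual_damages: float,
                           intentional_or_reckless: bool) -> float:
    """Per-violation recovery under BIPA's private right of action:
    the greater of $1,000 or actual damages for a negligent violation,
    or the greater of $5,000 or actual damages for an intentional or
    reckless one."""
    statutory_floor = 5_000.0 if intentional_or_reckless else 1_000.0
    return max(statutory_floor, actual_damages)
```

Because the floor applies per violation, repeated collections from many class members can multiply quickly, which helps explain the class actions discussed next.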
BIPA went largely unnoticed after its enactment, until a series of five similar class action lawsuits was brought in 2015 against Facebook and Shutterfly, businesses that allegedly collected and used biometric data belonging to Illinois residents in violation of BIPA. Four of the lawsuits, Pezen v. Facebook Inc., 1:15-cv-03484 (N.D. Ill. Apr. 21, 2015), Licata v. Facebook Inc., 1:15-cv-04022 (N.D. Ill. May 5, 2015), Patel v. Facebook Inc., 1:15-cv-04265 (N.D. Ill. May 14, 2015), and Gullen v. Facebook Inc., 1:15-cv-07861 (N.D. Ill. Aug. 31, 2015), all allege that Facebook violated BIPA with its photograph tagging suggestion feature, which captures and stores facial features, without subjects’ consent, to enable users to “tag” their friends in photographs. The plaintiffs in these cases allege they were not Facebook users at the time their faceprints were collected and, as such, did not consent to any collection of biometric data that may have been described in Facebook’s click-wrap agreement for the creation of a Facebook account.
Similarly, in Norberg v. Shutterfly, Inc., 1:15-cv-05351 (N.D. Ill. June 17, 2015), the plaintiff claims that Shutterfly’s creation, collection, and storage of millions of “face templates” from individuals whose images appear in photographs submitted to Shutterfly, many of whom are not Shutterfly users, is a violation of BIPA’s informed consent requirement. At the time of publication, the U.S. District Court had allowed Norberg’s case against Shutterfly to proceed, rejecting Shutterfly’s argument that Norberg failed to state a claim because BIPA excludes photographs from the definition of biometric identifier, and the faceprints Shutterfly creates are derived from users’ photographs. The court denied Shutterfly’s motion on the basis that BIPA’s definition of “biometric identifiers” includes scans of facial geometry.
BIPA appears to be the biometric legislation to emulate. The Texas biometric law described above contains substantive provisions similar to those of BIPA, namely requiring informed consent before a business can capture a biometric identifier, prohibiting a business’s sale of biometric data with a few exceptions, and setting forth certain security and retention requirements. (Texas Business and Commercial Code, § 503.001(b) and (c)). House Bill No. 511 was introduced in Idaho in 2014, and the text of the bill is nearly identical to BIPA. The text of House Bill No. 96, introduced in Alaska in 2015, is not identical to BIPA’s, but contains similar substantive provisions. Washington’s state legislature is currently considering bill HB 1094, which is substantially similar to BIPA, requiring notice of collection, the purposes for which biometric data will be used, and when it may be disclosed to third parties, and requiring subjects’ consent to such collection and use. Washington’s bill also limits disclosure and sale, subject to some exceptions. Note, however, that the requirements of Washington’s bill do not apply when biometric information is collected for a “security purpose,” which is “the purpose of preventing shoplifting, fraud, or any other misappropriation or theft of a thing of value.” (HB 1094 § 3). Monitor HB 1094’s status in the legislature closely because, at the time of this article’s publication, it is a hairsbreadth from becoming law.
The FTC and Biometrics
The FTC has thrown its hat into the ring as well, issuing recommended best practices for companies using facial recognition technology, though it has stopped short of creating rules or laws in this space. The FTC published “Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies” in October 2012 (the FTC Recommendations) to provide guidance to companies under its purview that currently incorporate, or seek to incorporate, facial recognition technology in their products or services.
The FTC first recommends that companies implement “privacy by design” by (i) maintaining reasonable data security protections for biometric information; (ii) establishing and maintaining appropriate retention and disposal practices for biometric information; and (iii) considering the sensitivity of biometric information when designing facial recognition technologies. In the FTC Recommendations, the FTC also suggests that companies employing facial recognition technologies should increase the transparency of their methods and provide consumers with choices, such as the opportunity to opt out of collection of their biometric information. The FTC specifically advises social networking companies to give consumers clear notice, separate from their privacy policies, that they collect faceprints, how the technology works, and how they will use the data. The FTC also advises that social networking companies should give consumers an easy way to opt out of collection, the ability to turn off the facial recognition feature at any time, and the ability to have the company delete biometric data already collected.
Lastly, the FTC recommends that companies obtain subjects’ express consent before collecting or using faceprints in two situations: (i) before using an image or faceprint in a materially different way than the company represented at the time of collection; and (ii) when using a faceprint to identify anonymous images of a subject to someone who could not otherwise identify the subject, such as in public places. The FTC Recommendations mirror BIPA’s requirements, without going so far as to advise against disclosure to third parties.
Though the FTC Recommendations are merely guidelines and will not serve as a framework for FTC enforcement actions to the extent the best practices suggest actions beyond what current laws require, the FTC hints that a company’s failure to properly notify subjects of the use of facial recognition technologies could be subject to FTC enforcement action. In a footnote to the FTC Recommendations, the FTC comments that:
If a company uses facial recognition technologies in a manner that is unfair […], or that constitutes a deceptive act or practice, the Commission can bring an enforcement action under Section 5. In contrast, in other countries and jurisdictions, such as the European Union, in certain circumstances, consumers may need to be notified and give their consent before a company can legally use facial recognition technologies.
Broad “Umbrella” Privacy Laws under Which Biometric Information May Fall
Many state laws governing data security and breach response include biometric information in their definitions of covered personal information. For example, the North Carolina Identity Theft Protection Act lists “biometric data” as an element of identifying information that, in combination with a person’s name, constitutes “personal information.” This law requires any entity conducting business in the state and maintaining personal information of a resident to take reasonable measures to protect the information against unauthorized access. Such reasonable measures must include proper disposal of the information. A security breach involving a North Carolina resident’s biometric data, if paired with his or her name, would also be governed by the North Carolina law’s data breach notification procedures. Most states’ data breach notification laws govern unauthorized access to residents’ biometric information, though the coverage may be implicit rather than explicit. South Carolina’s law, for example, does not specifically name biometric information; it defines “personal identifying information” as a person’s name plus “other numbers or information which may be used to access a person’s financial accounts or numbers or information issued by a governmental or regulatory entity that will uniquely identify an individual.”
The U.S. privacy law regime takes a sectoral approach: the primary sources of privacy law in the United States are various laws governing particular industry sectors. These industry-specific laws also govern private and public actors’ use of individuals’ biometric information through their governance of financial institutions, educational institutions, commercial entities, and health-care providers.
Financial institutions must comply with the provisions of the Gramm-Leach-Bliley Act (GLBA), enacted in 1999, addressing the privacy of personally identifiable financial and account data. The privacy requirements of GLBA, Title V apply to “financial institutions,” which are essentially any business institutions significantly engaged in financial activities. GLBA’s privacy rule applies to the collection of nonpublic personal information (NPI). GLBA’s definition of NPI does not expressly list biometric information, but the expansive definition of NPI certainly includes biometric data. NPI is defined as personally identifiable financial information: “(i) provided by a consumer to a financial institution; (ii) resulting from any transaction with the consumer or any service performed for the consumer; or (iii) otherwise obtained by the financial institution.” Under the FTC’s Privacy of Consumer Financial Information Rule, NPI is personally identifiable information or any list, description, or grouping of consumers derived using nonpublic personally identifiable information. Personally identifiable information is defined by the FTC as any information: (i) a customer provides to a financial institution to obtain a product or service; (ii) about a customer as a result of a transaction between the financial institution and customer involving the financial institution’s products or services; or (iii) otherwise obtained by the financial institution in connection with its provision of services or products to the customer. A consumer’s biometric information collected by an institution could fall under any of these definitions depending on the method and timing of its collection.
Financial privacy laws do not protect consumer biometric data as tightly as most people believe. For example, GLBA does not prohibit a financial institution from selling or profiting from consumer NPI, including biometric data. GLBA Title V is silent on financial institutions’ collection of NPI. It does set forth an “opt out” requirement, but that requirement applies to a financial institution’s disclosure, not its collection, of NPI. Under GLBA, a financial institution may not disclose NPI to a third party unless it discloses to the consumer the possibility of the disclosure, gives the consumer the opportunity to opt out of such disclosure, and tells the consumer how to opt out. Note also that disclosure to an affiliate is not subject to the notice and opt-out requirements, and there is also an exemption for nonaffiliated third-party servicers and joint marketers. GLBA provides a litany of exceptions to its opt-out and notice requirements, including when a financial institution discloses NPI to a nonaffiliated third party to prevent fraud or unauthorized transactions. Because the most prolific use of consumers’ biometric data in the financial industry is ostensibly fraud prevention, this exception to the requirements of Sections 502(a) and (b) of GLBA eviscerates the notice and opt-out requirements for most financial institutions’ use of biometric data.
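The notice-and-opt-out logic described above can be sketched as a short boolean check. `glba_disclosure_permitted` is a hypothetical simplification for illustration only: it collapses the affiliate and fraud-prevention exceptions into flags and omits the servicer/joint-marketer exemption and GLBA’s other exceptions entirely.

```python
def glba_disclosure_permitted(is_affiliate: bool,
                              fraud_prevention: bool,
                              notice_given: bool,
                              opt_out_offered: bool,
                              consumer_opted_out: bool) -> bool:
    """Simplified sketch of GLBA Title V's rule for disclosing NPI.

    Affiliate and fraud-prevention disclosures are exempt from the notice
    and opt-out requirements; otherwise the institution must have given
    notice, offered an opt-out, and the consumer must not have exercised it.
    """
    if is_affiliate or fraud_prevention:
        return True  # exemptions swallow the notice/opt-out requirements
    return notice_given and opt_out_offered and not consumer_opted_out
```

The first branch shows why the fraud-prevention exception is so consequential: when that flag is set, the consumer’s opt-out never comes into play.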
Upon establishing a relationship with a consumer, and annually thereafter, GLBA requires financial institutions to provide a notice describing their privacy policies and giving consumers the right to opt out of disclosure of certain NPI. Such notices must include the categories of parties to whom NPI may be disclosed, the categories of NPI collected, and the methods the institution employs to protect NPI. GLBA does not require that notices include the specific purpose for which data is collected or the time period for which the institution will store the data. Finally, GLBA requires financial institutions to protect the security and confidentiality of customers’ nonpublic personal information by adhering to the “appropriate standards” promulgated by their respective regulatory agencies or authorities.
The Family Educational Rights and Privacy Act, 34 CFR Part 99 (FERPA) governs the disclosure of students’ biometric information, to the extent that it is contained in student records. A student’s biometric record is included in the definition of personally identifiable information, and is a type of information that may be included in students’ education records. As such, FERPA prohibits schools from releasing students’ biometric information without parental consent, to the extent that it is contained in students’ education records, with some limited exceptions.
The health industry has been subject to detailed data rules since 1996, when Congress enacted the Health Insurance Portability and Accountability Act (HIPAA) to regulate the treatment of electronically stored or transmitted individually identifiable protected health information (PHI) by hospitals, insurers, employers, doctors, and pharmacies. If biometric information is collected in the course of treating a patient, like collecting a voiceprint to enable a patient to sign into her online hospital account, such information would be treated as PHI under HIPAA. Under HIPAA’s Privacy Rule, covered entities may only use and disclose PHI with the individual’s written consent or (i) to the individual; (ii) for treatment, payment, and healthcare operations activities; (iii) with informal permission giving the individual the opportunity to agree or object; (iv) for uses incident to permitted uses; (v) for purposes that benefit the public interest; and (vi) as a limited data set disclosed for research or public health purposes. Note that the Privacy Rule does not require consent for collection of PHI, just disclosure.
The Privacy Rule does not apply to information that has been de-identified, which may be accomplished by removing all specific identifiers from the PHI. HIPAA lists “Biometric identifiers, including finger and voice prints” as an identifier of an individual that must be removed from PHI for de-identification. The HIPAA Privacy Rule also requires covered entities to provide notices of their privacy practices to patients. Among other elements, these notices must describe how the entity will use and disclose PHI. The security rule requires covered entities and their business associates to protect electronic PHI using administrative, physical, and technical security safeguards. Finally, the Health Information Technology for Economic and Clinical Health Act (the HITECH Act) amended HIPAA to require covered entities and their business associates to notify affected individuals and the Department of Health and Human Services in the event of a breach involving unsecured PHI.
HIPAA only protects data collected by certain types of entities for the provision of health care services or payment for those services. It would not affect biometric data collected for identification purposes by a doctor or hospital, or any biometric data collected by a non-covered entity.
After weeding through the maze of sectoral laws, state laws, pending cases, and recommendations making up the patchwork of privacy rules governing the use and collection of biometric data, the only clear conclusion is that practitioners, technology developers, and privacy-conscious individuals should watch this rapidly developing legal landscape. Companies employing technologies that use biometric identifiers may want to err on the side of caution and ensure that their notification and consent processes are clear and conspicuous. Cautious businesses should employ an opt-in structure for technologies using biometric identifiers, look hard at their retention policies, and look harder at their disposal practices. But also make sure to smile, so your iPhone camera can pay for your latte.