While the majority of CSPLs apply to either in-state businesses or businesses that target state residents, the Texas Act has a much broader scope and applies to legal or natural persons that produce products or services consumed by Texas residents. Notably, the Texas Act exempts from general coverage small businesses, as defined by the U.S. Small Business Administration, a definition that turns on industry-specific revenue and employee-count thresholds. The Texas Act nevertheless prohibits small businesses from selling sensitive personal data without consumer consent.
The Florida Act applies to for-profit business organizations that conduct business in Florida, generate more than one billion dollars in global gross annual revenue, and determine the purposes and means of processing personal data about consumers. Focused on BigTech companies, the Florida Act applies only to entities that: (1) generate 50 percent or more of their global gross annual revenues from the sale of advertisements online; (2) operate an app store or digital distribution platform that offers at least 250,000 different software applications for consumers; or (3) operate certain kinds of consumer smart speaker and voice command component services.
B. The Emergence of a Common Compliance Floor
All New Acts contain a comparable basket of core consumer rights, including the right to know about, access, delete, and port personal data. Except for the Iowa and Utah Acts, all other New Acts afford residents rights to correct false or erroneous personal data. All New Acts prohibit discriminating against consumers for exercising their privacy rights. All New Acts require similar privacy notices, reliable and intuitive methods for exercising consumer rights, appeal mechanisms, and nearly identical frameworks for dealing with consumer requests.
All New Acts share similar definitions of sensitive personal information (SPI), which includes information revealing a person’s racial or ethnic origin, religious beliefs, mental or physical health diagnosis, citizenship or immigration status, and precise geolocation data, as well as additional information collected from a known child. All New Acts require implementation of reasonable data security measures. No New Act establishes a private right of action. Except for the Iowa Act, all New Acts allow consumers to opt out of use of their personal data for the purposes of sale, targeted advertising, and profiling. Except for the Iowa Act, all New Acts require covered entities to create data protection assessments.
Significant variations between CSPLs do not necessarily signal higher compliance burdens for all businesses. For example, the Florida Act, which contains numerous outlier provisions, applies only to businesses that satisfy its billion-dollar revenue threshold. One provision of the Tennessee Act, however, does increase compliance costs by creating an affirmative defense for any business that adopts a privacy program that reasonably conforms to the National Institute of Standards and Technology Privacy Framework or “other documented policies, standards, and procedures designed to safeguard consumer privacy.” Because it is not yet clear how reasonable conformity will be assessed, record keeping with respect to decisions and actions around privacy and data protection might serve to minimize risk for businesses planning to rely on this defense.
C. Three Broad Compliance Models
There are noteworthy differences among the New Acts. For example, some states define SPI more broadly than others: All New Acts define SPI to include genetic or biometric data processed for the purposes of uniquely identifying an individual, but Delaware’s definition includes any genetic or biometric data regardless of usage. While the Texas Act defines SPI to include information revealing an individual’s “sexuality,” all other New Acts use the term “sexual orientation.” Delaware extends the definition to include “sex life” (as does Montana), as well as one’s “status as transgender or nonbinary” (as does Oregon). Moreover, Oregon extends the definition to include one’s status as a victim of a crime. Enforcement and rule-making processes will ultimately determine whether these differences will amount to differences in the level of protection offered by each state.
While most New Acts require consumer consent before processing SPI, Iowa and Florida give consumers the right to notice and the opportunity to opt out from processing of SPI. Florida also provides consumers with the opportunity to opt out of the collection of personal data via voice recognition or facial recognition features.
There are commonalities across these differences as well. Commentators have noted the emergence of three broad approaches to data privacy as exemplified by the California, Virginia, and Connecticut laws. The California Act covers employment contexts, establishes a state agency for privacy protection, and creates a limited private right of action. By contrast, the Virginia Act focuses on harmonizing state laws with those global best practices that businesses might find easier to implement. The Connecticut Act comes closest to global best practices, making it easier for U.S. businesses to offer their products and services on the global market.
Of the New Acts, the Iowa, Indiana, Tennessee, and Texas Acts are said to be modeled after the Virginia Act. The Montana, Oregon, and Delaware Acts align with the Connecticut Act. Thus, for example, the Iowa, Indiana, and Tennessee Acts do not require recognition of a universal opt-out mechanism that would allow consumers to indicate their privacy choices across all websites. As an additional example, while the Iowa, Indiana, and Tennessee Acts are silent on the matter, the Florida, Texas, Oregon, Delaware, and Montana Acts prohibit the use of dark patterns for eliciting consent.
Finally, the New Acts take varying approaches to notice-and-cure periods that provide opportunity to remedy violations and avoid enforcement measures. The Iowa, Indiana, Tennessee, and Montana Acts establish permanent cure periods, ranging from thirty to ninety days. By contrast, the Delaware, Oregon, and Florida Acts offer cure periods that are either discretionary or set to expire. Businesses might triage implementation of any variances in requirements between laws to account for relevant deadlines, including sunsetting cure periods.
III. New Limited Data Privacy Acts
A. Protecting Consumer Health Data
Washington and Nevada passed limited consumer health data privacy laws. Among other things, each Act requires prior consent for collection or processing of consumer health data. Each Act includes entity- and data-level exemptions similar to those in most CSPLs and aims to fill privacy and data protection gaps in the consumer health context. Each Act defines “consumer health data” similarly to include information regarding gender-affirming care, reproductive or sexual health, and health-related location information. Those definitions also include biometric and genetic data as consumer health data. Each Act requires consumer health data privacy policies.
Under both laws, consumers have a right to deletion of their consumer health data. Under the Washington Act, obligations to delete data extend to third parties or affiliates of covered entities. To varying degrees, both laws prohibit geofencing around entities that provide health care services. Both laws require written authorizations for the sale of personal health data, with the Washington Act enumerating detailed requirements for authorization. The Washington Act, unlike the Nevada Act, creates a private right of action. Uniquely, the Nevada Act exempts from coverage information used to access or enable gameplay on a video game platform and to identify shopping habits or interests unrelated to health status.
Connecticut amended its data privacy provisions to address “consumer health data,” which it defines as “any personal data that a controller uses to identify a consumer’s physical or mental health condition or diagnosis, and includes, but is not limited to, gender-affirming health data and reproductive or sexual health data.” Connecticut further amended its definition of “sensitive data” to include “consumer health data,” as well as one’s status as a “victim of crime.” Connecticut now prohibits regulated companies from processing consumer health data without obtaining the consumer’s consent, and also prohibits the use of a geofence near a health facility for the purpose of identifying, tracking, or collecting data from any consumer regarding consumer health data. Moreover, Connecticut requires that regulated companies conduct a data protection assessment that now includes the processing of consumer health data.
B. The Digital Privacy of Minors
The California Age-Appropriate Design Code Act applies to “[b]usinesses that develop and provide online services, products, or features that children are likely to access.” It requires covered businesses to consider “the best interests of children when designing, developing, and providing [any] online service, product, or feature” and to prioritize the “privacy, safety, and well-being of children over commercial interests.” It prohibits covered businesses from using the personal information of any children “in a way that the business knows, or has reason to know, is materially detrimental to a child.” The Act requires a “high level of privacy” as a default setting for children and prohibits the use of dark patterns. It requires privacy policies and other relevant materials in language that suits the age of the children that might need to access them. It requires obvious signals to children when they are monitored or tracked online by parents, guardians, or any other consumer. The Act mandates biennial Data Protection Impact Assessments that identify the “risks of material detriment to children” that arise from the data management practices and timed plans to mitigate identified risks before children gain access. An opportunity to cure violations is available only to businesses already in substantial compliance with the law. Finally, the Act establishes the California Children’s Data Protection Working Group.
New state laws in Utah, Louisiana, and Arkansas also address the privacy of minors on social media platforms. All three laws prohibit minors from opening or holding accounts on a social media platform without the express consent of a parent or guardian. Utah’s Social Media Regulation Amendments and Louisiana’s Secure Online Child Interaction and Age Limitation Act restrict direct messaging with minors, targeted advertising, and the collection and use of minors’ personal information, and require that a parent or guardian be given a password to access the minor’s account. Uniquely, the Utah Act grants a parent or guardian sweeping access to the minor’s social media accounts. The Utah Act imposes additional requirements, such as blocking minors from accessing the platforms from 10:30PM to 6:30AM. A parent or guardian can change these settings and otherwise limit minor usage. All three laws authorize state enforcement through civil action. The Utah Act empowers the Division of Consumer Protection with rulemaking authority. The Utah Act provides a thirty-day notice-and-cure period for violators, the Louisiana Act provides a forty-five-day cure period, and Arkansas’ Social Media Safety Act does not contain any cure period.
Connecticut amended its data privacy statute to include similar new protections for minors. Moreover, Connecticut established an Internet Crimes Against Children Task Force.
Businesses that need to comply with requirements across multiple jurisdictions should take care, particularly given the extensive rights granted to parents under the Utah Act, which might conflict with the rights to privacy established for children in other states or under federal law.
C. Colorado
Colorado passed a law that requires state and local government agencies, including institutions of higher education, that use—or intend to develop, procure, or use—a facial recognition service (FRS) to file with its reporting authority a notice thereof. Moreover, Colorado created a task force for the consideration of FRSs. Additionally, on March 15, 2023, the Colorado Attorney General’s Office released the final version of its rules implementing the Colorado Privacy Act.
D. New York City Publishes Final Rules on Automated Employment Decision Tools
On April 6, 2023, the New York City Department of Consumer and Worker Protection adopted first-in-the-country rules restricting the use of automated employment decision tools for hiring. Among other things, the new rules clarify terms and the requirements for bias audits, notice and disclosure to current and prospective employees, and other obligations for the employer or employment agencies covered under the law.
IV. Updates on the Biometric Information Acts of Illinois and Texas
Decisions handed down by the Illinois courts significantly expanded and clarified the reach of the Illinois Biometric Information Privacy Act (BIPA). First, in response to a certified question from the U.S. Court of Appeals for the Seventh Circuit, the Illinois Supreme Court ruled that claims for damages under BIPA accrue on each violation, not just the first. Second, individuals now have five years after any alleged BIPA violation to bring claims under the statute’s private right of action. Together, these holdings dramatically increase the potential damages a court might award under BIPA. Third, an Illinois appellate court held that BIPA’s requirement for a written retention-and-destruction schedule is triggered on the initial date of possession, not afterward. Fourth, federal courts in Illinois have determined that educational institutions that lend funds to students directly will qualify for BIPA’s financial institution exemption in the higher education context. Fifth, a federal court in Illinois found that BIPA claims were covered under insurance policies for “personal and advertising injuries,” including any injury “arising out of an oral or written publication, including electronic publication, of material that violates a person’s right to privacy.” Finally, a federal court in Illinois granted a defendant’s motion for a new trial on damages in a BIPA class action that had initially produced a $228 million award. After the jury found 45,600 violations of BIPA, the court, rather than the jury, set the per-violation award at $5,000, for a total of $228 million; the court later held that the jury should have determined the appropriate amount of damages and granted a new trial on that issue.
Finally, the Texas attorney general sued Google under Texas’ Biometric Identifier Act, alleging the collection of Texans’ facial and voice recognition information, without explicit consent, violated that Act.
V. Conclusion
This year, states have shown clear willingness to establish safeguards on a range of privacy issues. The states’ efforts share important commonalities, which streamline compliance. At the same time, ongoing state-level debate, for example concerning the absence or length of a cure period for violations, suggests that further changes through implementation or promulgation of rules lie ahead. Given the continual emergence of new issues and the increasing speed of legislative and regulatory responses, businesses should keep abreast of new developments in this field and take measures, as soon as possible, not only to ensure compliance but also to develop privacy-sensitive work cultures in tune with the underlying principles driving privacy laws.