Helping CCPA-Regulated Businesses Prepare for California’s Upcoming Cybersecurity and Automated Decision-Making Technology Regulations
Kimberly C. Metzger
January 23, 2024 | Feature
The California Privacy Protection Agency (CPPA) has released draft regulations for cybersecurity audits, risk assessments, and the use of automated decision-making technology (ADMT). While formal rulemaking has not yet begun and enforcement is likely at least two years off, businesses subject to the California Consumer Privacy Act, as amended by the California Privacy Rights Act of 2020 (the CCPA), should start considering now whether and how these regulations may apply to them. In their present form, the regulations would impose substantial obligations on certain businesses covered by the CCPA (businesses). Identifying and filling existing gaps, and preparing your workforce and governing body for compliance, will help your business control risk by properly safeguarding and processing personal information (PI). (Processing means any operation or set of operations performed on PI, whether or not by automated means.) It will also allow you to prioritize, budget, and avoid a last-minute rush to compliance.
Is My Company a Covered “Business”?
The proposed regulations do not apply to a company—even if it processes the PI of California residents—unless it is a covered “business.” The coverage analysis can be complex: a company can be a business even if it is not located in California, and a company located in California might not be a business.
Generally, a company is a business if all of the following are true:
- It is a legal entity operated for profit.
- It does business in California (although the CCPA does not define “doing business,” it typically means directing purposeful commercial activity toward the state).
- It collects the PI of California residents—including employees, job applicants, and independent contractors—who are natural persons (consumers).
- Alone or jointly with others, it determines why and how PI is processed.
- It satisfies one or more of the following:
- As of January 1, it had annual gross revenue greater than $25 million in the preceding calendar year.
- It annually buys, sells, and/or shares the PI of 100,000 or more consumers or households. (Selling means communicating PI to a third party for money or other valuable consideration; sharing means communicating PI to a third party for cross-context behavioral advertising.)
- It derives 50 percent or more of its annual revenue from selling or sharing PI.
Certain entities that control or are controlled by a business are also covered, as are certain joint ventures and partnerships composed of businesses. Certain companies (e.g., HIPAA-covered entities) and data (e.g., information subject to HIPAA or the GLBA) are excluded from coverage.
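Operationally, the coverage test reduces to a handful of threshold checks. The sketch below is purely illustrative: the field and function names are hypothetical, and it ignores the affiliate, joint-venture, and exclusion nuances noted above, which require legal analysis.

```python
# Purely illustrative sketch of the CCPA "business" coverage test described
# above. Field names are hypothetical, and the affiliate, joint-venture, and
# HIPAA/GLBA exclusion nuances are deliberately omitted.
from dataclasses import dataclass


@dataclass
class CompanyProfile:
    for_profit_legal_entity: bool
    does_business_in_california: bool
    collects_california_consumer_pi: bool
    determines_purposes_and_means_of_processing: bool
    prior_year_gross_revenue_usd: float
    consumers_or_households_bought_sold_or_shared: int  # annual count
    revenue_share_from_selling_or_sharing_pi: float     # 0.0 to 1.0


def is_covered_business(c: CompanyProfile) -> bool:
    baseline = (
        c.for_profit_legal_entity
        and c.does_business_in_california
        and c.collects_california_consumer_pi
        and c.determines_purposes_and_means_of_processing
    )
    thresholds = (
        c.prior_year_gross_revenue_usd > 25_000_000
        or c.consumers_or_households_bought_sold_or_shared >= 100_000
        or c.revenue_share_from_selling_or_sharing_pi >= 0.50
    )
    return baseline and thresholds
```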
What Is “Personal Information”?
The proposed regulations address how a business safeguards and processes “personal information.” The CCPA defines PI broadly as “information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household.” It excludes publicly available information, certain information of public concern, and deidentified or aggregated information. Defined, but nonexclusive, categories of PI include:
- Personal identifiers (such as real name, alias, postal address, unique personal identifier, online identifier, IP address, email address, account name, Social Security number, driver’s license number, and passport number).
- Information described in California Civil Code § 1798.80(e) (name, signature, Social Security number, physical characteristics or description, address, telephone number, passport number, driver’s license or state identification card number, insurance policy number, education, employment, employment history, bank account number, credit card number, debit card number, other financial, medical, or health insurance information).
- Characteristics of protected classifications under California or federal law.
- Commercial information (such as records of personal property, products or services purchased, obtained, or considered; other purchasing or consuming histories or tendencies).
- Biometric information.
- Internet or other electronic network activity information (such as browsing history, search history, or information about a consumer’s interaction with a website, application, or advertisement).
- Geolocation data.
- Audio, electronic, visual, thermal, olfactory, or similar information.
- Professional or employment-related information.
- Education information (information that is not publicly available personally identifiable information, as defined in the Family Educational Rights and Privacy Act).
- Inferences drawn from information in other categories to create a profile about a consumer reflecting their preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
- Sensitive PI, defined to include:
- a consumer’s Social Security, driver’s license, state identification card, or passport number;
- a consumer’s account login, financial account, debit card, or credit card number, together with any required security or access code, password, or credentials allowing access to an account;
- precise geolocation;
- racial or ethnic origin, religious or philosophical beliefs, or union membership;
- genetic data;
- the contents of a consumer’s mail, email, and text messages, unless the business is the intended recipient;
- the processing of biometric information to uniquely identify a consumer;
- PI concerning a consumer’s health; and
- PI concerning a consumer’s sex life or sexual orientation.
The ADMT (Draft) Regulations
The CPPA is tasked with implementing regulations that govern access and opt-out rights regarding a business’s use of ADMT (including profiling), and require a business to provide a consumer, on request, meaningful information about the business’s use of ADMT and its effect on the consumer.
- ADMT is any system, software, or process (including one derived from machine learning, statistics, or other data-processing or artificial intelligence) that processes PI and uses computation as a whole or part of a system to make or execute a decision or facilitate human decision-making. ADMT includes profiling.
- Profiling is any form of automated processing of PI to evaluate certain personal aspects relating to a natural person and in particular to analyze or predict aspects concerning their performance at work, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.
Importantly, the proposed regulations would not apply to all uses of ADMT by a business. Rather, they would apply only to the following uses:
- Making decisions with legal or similarly significant effects on a consumer. This means decisions that result in access to, or provision or denial of, financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or independent contracting opportunities or compensation, healthcare services, or essential goods or services.
- Profiling a consumer acting in their capacity as an employee, independent contractor, job applicant, or student (for example, profiling an employee using keystroke loggers, productivity or attention monitors, video or audio recording or live-streaming, facial- or speech-recognition or -detection, automated emotion assessment, location trackers, speed trackers, and web-browsing, mobile-application, or social-media monitoring tools).
- Profiling a consumer in a publicly accessible place (for example, by using Wi-Fi or Bluetooth tracking, radio frequency identification, drones, video or audio recording or live-streaming, facial- or speech-recognition or -detection, automated emotion assessment, geofencing, location trackers, or license-plate recognition).
Each is a covered use. The CPPA is considering whether to apply the requirements to additional uses. The proposed regulations would require a business that processes PI with ADMT for a covered use to take the following steps.
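Viewed operationally, tagging each ADMT processing activity with its covered use (if any) is what drives the obligations described next: pre-use notice, opt-out, access, and, as discussed later, a risk assessment. A purely illustrative sketch, with hypothetical names and without the exceptions discussed below:

```python
# Hypothetical internal tagging of ADMT activities by covered use. The enum
# mirrors the three covered uses above; the obligations list is a
# simplification of the notice, opt-out, access, and risk-assessment
# requirements discussed in this article, ignoring the stated exceptions.
from enum import Enum, auto


class CoveredUse(Enum):
    SIGNIFICANT_DECISION = auto()       # legal or similarly significant effects
    WORK_OR_SCHOOL_PROFILING = auto()   # employees, contractors, applicants, students
    PUBLIC_PLACE_PROFILING = auto()     # publicly accessible places


def admt_obligations(uses: set[CoveredUse]) -> list[str]:
    if not uses:
        return []  # not a covered use; the draft ADMT rules would not attach
    return [
        "pre-use notice",
        "opt-out mechanism",
        "access-request handling",
        "risk assessment",
    ]
```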
Pre-use Notice
Before processing a consumer’s PI using ADMT, the regulations would require a business to provide a “Notice of Rights to Opt-Out of, and Access Information About, the Business’s Use of Automated Decision-making Technology” (a pre-use notice) to consumers, describing how the business will use ADMT, consumers’ opt-out rights, and consumers’ right to access information about the business’s uses of ADMT. They would also require the business to provide a layered notice or hyperlink by which the consumer could get additional information: ADMT logic and key parameters, intended output, how the output will be used for decision-making (including any human involvement), and whether the ADMT has been evaluated for validity, reliability, and fairness.
The notice would need to be conspicuously linked on the business’s internet homepage and any mobile app, and otherwise readily available where consumers will encounter it.
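One practical way to prepare is to treat the pre-use notice as structured content and confirm every required element is populated before the ADMT goes live. The field names below are hypothetical paraphrases of the draft requirements, not regulatory terms of art:

```python
# Hypothetical content model for a pre-use notice. Field names paraphrase the
# draft requirements described above and are not regulatory terms of art.
from dataclasses import dataclass


@dataclass
class PreUseNotice:
    how_the_business_will_use_admt: str
    opt_out_instructions: str
    access_right_instructions: str
    # "Layered" detail reachable via hyperlink:
    logic_and_key_parameters: str
    intended_output: str
    how_output_is_used_in_decisions: str   # including any human involvement
    evaluated_for_validity_reliability_fairness: bool
    more_info_url: str = "https://example.com/admt-notice"  # placeholder link
```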
Provide Opt-out Rights
The draft regulations would require a business to let consumers request to opt out of having their PI processed with ADMT for a covered use. Upon receiving an opt-out request, the business would need to (1) promptly (within 15 business days) stop processing the consumer’s PI with that ADMT, and not use or retain PI previously processed with it, and (2) notify persons to whom the business has disclosed the consumer’s PI for processing with that ADMT of the opt-out request, and instruct them to comply with it.
There are exceptions to the opt-out right for all covered uses except profiling for behavioral advertising. A business need not provide opt-out rights if the covered use is limited to and necessary for data security, fraud prevention, consumer safety, or—with some limitations—to provide goods or services specifically requested by the consumer.
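Because the draft text ties the response to a 15-business-day clock, tracking the deadline precisely matters. Below is a minimal sketch of the deadline math, assuming the clock runs from the date of receipt and that weekends are the only non-business days; holidays and any official counting conventions would need to be confirmed against the final regulations.

```python
# Minimal sketch of the 15-business-day opt-out clock. Assumes the clock runs
# from the date of receipt and that weekends are the only non-business days;
# holidays and any official counting conventions are ignored here.
from datetime import date, timedelta


def opt_out_deadline(received: date, business_days: int = 15) -> date:
    d = received
    remaining = business_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday = 0 ... Friday = 4
            remaining -= 1
    return d


# Example: a request received on Friday, February 2, 2024
print(opt_out_deadline(date(2024, 2, 2)))  # 2024-02-23
```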
Provide Access Rights
The regulations would also require a business to let a consumer request information about certain decisions resulting from ADMT. If the business uses ADMT to make a decision that results in the denial of goods, services, or opportunities (such as employment or compensation), the business would need to proactively notify the consumer that it has made such a decision, that the consumer has the right to access information about the business’s use of ADMT, how the consumer can exercise that right, and that the consumer can file a privacy complaint with the CPPA or the California Attorney General.
If the consumer submits a verifiable access request, the business would need to provide in plain language:
- The purpose for which the business used ADMT.
- The output of the ADMT with respect to the consumer.
- How the business used (or plans to use) the output to make a decision about the consumer.
- How the ADMT worked with respect to the consumer.
- How the consumer can obtain the range of possible outputs; exercise other CCPA rights; and submit a complaint to the business, the CPPA, or the California Attorney General.
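Operationally, the response can be assembled as a checklist of those items. The keys below are hypothetical paraphrases of that list; drafting each value in plain language remains a human task:

```python
# Hypothetical assembly of an ADMT access-request response. The keys
# paraphrase the items listed above and are not regulatory language.
def build_access_response(purpose: str, output: str, use_in_decision: str,
                          how_it_worked: str) -> dict[str, str]:
    return {
        "purpose_of_admt_use": purpose,
        "output_for_this_consumer": output,
        "how_output_was_or_will_be_used": use_in_decision,
        "how_the_admt_worked_for_this_consumer": how_it_worked,
        "how_to_obtain_range_of_possible_outputs": "Contact privacy@example.com",  # placeholder
        "other_rights_and_complaints": (
            "You may exercise other CCPA rights or submit a complaint to the "
            "business, the CPPA, or the California Attorney General."
        ),
    }
```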
Risk Assessment Regulations
The CPPA is also tasked with issuing regulations requiring businesses whose processing of PI poses significant risk to consumers’ privacy and security to regularly submit a processing risk assessment (RA) to the CPPA. The goal is to restrict or prohibit processing if privacy risks outweigh the benefits to the consumer, the business, other stakeholders, and the public.
Not all businesses will need to conduct RAs—just those that conduct processing activities considered particularly risky:
- Selling or sharing PI.
- Processing sensitive PI.
- Processing PI with ADMT for a covered use.
- Processing PI of consumers under 16.
- Processing PI to train AI or ADMT.
The draft regulations prescribe the contents of the RA, which would include (but not be limited to) a summary of the processing; the categories of PI to be processed; the purpose and operational elements of the processing; consumers’ reasonable expectations; the expected benefits and negative impacts of the processing; safeguards to be implemented; and—importantly—the business’s assessment of whether the negative impacts of the processing (as mitigated by the safeguards) outweigh the expected benefits. Additional elements would be required of businesses that process PI with ADMT for covered uses.
Within 24 months of the effective date of the regulations, a business would need to submit a certificate of compliance and an abridged RA to the CPPA. Annual submissions would follow, and the CPPA could request the full RA at any time. A business would be required to review and update its RA (i) before processing PI with ADMT for a covered use, (ii) after a material change in a covered processing activity, and (iii) at least every three years.
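Businesses that maintain a data inventory may find it useful to flag, for each processing activity, whether an RA is required and when it is next due for review. A purely illustrative sketch reflecting the triggers described above:

```python
# Hypothetical tracker for risk-assessment (RA) triggers and review timing.
# The activity flags mirror the draft list of higher-risk processing; the
# review logic reflects the three update triggers described above, with the
# three-year outer bound approximated in calendar days.
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class ProcessingActivity:
    sells_or_shares_pi: bool = False
    processes_sensitive_pi: bool = False
    admt_covered_use: bool = False
    processes_pi_of_consumers_under_16: bool = False
    trains_ai_or_admt: bool = False


def requires_risk_assessment(a: ProcessingActivity) -> bool:
    return any([
        a.sells_or_shares_pi,
        a.processes_sensitive_pi,
        a.admt_covered_use,
        a.processes_pi_of_consumers_under_16,
        a.trains_ai_or_admt,
    ])


def ra_review_due(last_reviewed: date, material_change: bool,
                  new_admt_covered_use: bool) -> bool:
    outer_bound = last_reviewed + timedelta(days=3 * 365)  # roughly three years
    return material_change or new_admt_covered_use or date.today() >= outer_bound
```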
Cybersecurity Audit Regulations
Finally, the CPPA is tasked with issuing regulations requiring businesses whose processing of PI “presents a significant risk to consumers’ privacy and security” to perform annual audits of their cybersecurity program (CSP). Under the current draft regulations, this requirement would apply only to:
- Businesses that, in the past year, derived 50 percent or more of gross revenue from selling or sharing PI (i.e., “data brokers”); or
- Businesses that, as of January 1, had annual revenue greater than $25 million in the preceding calendar year, and that:
- Processed the PI of 250,000 or more consumers or households;
- Processed the sensitive PI of 50,000 or more consumers; or
- Processed the PI of 50,000 or more consumers known to be under 16.
Businesses that meet these thresholds would need to conduct their first audit within 24 months of the effective date of the regulations, and annually (each calendar year) thereafter. Auditors could be internal or external to the business so long as they could exercise objective and impartial judgment.
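Like the coverage test earlier, the audit-applicability test is threshold math. A rough sketch with hypothetical field names, mirroring the figures above:

```python
# Rough sketch of the draft cybersecurity-audit applicability thresholds.
# Field names are hypothetical; the figures mirror the draft text above.
from dataclasses import dataclass


@dataclass
class AuditProfile:
    revenue_share_from_selling_or_sharing_pi: float  # 0.0 to 1.0
    prior_year_revenue_usd: float
    consumers_or_households_processed: int
    consumers_whose_sensitive_pi_processed: int
    consumers_under_16_processed: int


def cybersecurity_audit_required(p: AuditProfile) -> bool:
    data_broker_prong = p.revenue_share_from_selling_or_sharing_pi >= 0.50
    large_processor_prong = p.prior_year_revenue_usd > 25_000_000 and (
        p.consumers_or_households_processed >= 250_000
        or p.consumers_whose_sensitive_pi_processed >= 50_000
        or p.consumers_under_16_processed >= 50_000
    )
    return data_broker_prong or large_processor_prong
```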
The goal of the audits is to assess and document the applicable components of the business’s CSP and how they protect PI and consumers, and to identify gaps and weaknesses in the CSP and how the business plans to resolve them. The draft regulations identify CSP components that must be assessed “as applicable” given the business’s size and complexity, the nature and scope of its processing, and the state of the art and cost of implementation. If the business deems an identified component inapplicable, it would need to explain why the component is not necessary for the business’s protection of PI and how existing safeguards provide at least equivalent security.
The business would be required to present the audit to its board or governing body (or if none, to the highest-ranking executive responsible for its CSP), and to submit a certificate of compliance to the CPPA.
What Should My Business Be Doing Now?
The draft regulations are subject to change during the pre-rulemaking and rulemaking process. However, anticipated regulations should not be the only driver of your business’s compliance efforts. Concerning cybersecurity, for example, the CPRA itself—which is currently in effect—already demands that businesses implement “reasonable security procedures and practices appropriate to the nature of the personal information to protect the personal information from unauthorized or illegal access, destruction, use, modification, or disclosure,” as do most industry standards and best practices. Customer protection, reputation management, and litigation avoidance compel the same thing. The draft regulations do not create this requirement—they merely flesh it out.
A business that anticipates being subject to the regulations should take a risk-based approach to preparation, guided by such considerations as:
- The nature and scope of PI it processes, including whether it processes sensitive PI, the PI of minors, or PI using ADMT for a covered use.
- Available human and financial resources.
- The current state of its CSP and ADMT processing program, and what it will take (time, money, effort) to close gaps.
- Leadership, corporate, and industry mandates.
- Risk tolerance.
While businesses have time to prepare for enforcement (at least, for the forthcoming cybersecurity regulations), the future regulatory burden may be high. Assessing gaps and planning for compliance now will help businesses safeguard and properly process PI, and be well-positioned for the future.