May 12, 2021 Feature

Pitfall! Navigating the Tunnels With a Privacy-By-Design Framework

By Sean Michael Ashworth

If you are a child of the 80s, or just an enthusiast for the nostalgia of the Neon Age, you probably have fond memories of sitting in front of your parents’ Sears television set – toes gripping the shag carpet in a frantic state of anxiety – as you power your strength by mainlining Yoo-hoo and gripping the joystick of your Atari 2600. Every twenty minutes, as another session comes to an end and you release your grip on your console’s controller – as well as that carpet – you start another round of Pitfall! For the uninitiated, a player controls Pitfall Harry through a jungle of quicksand and fire, snakes and scorpions, and of course, pits and tunnels. Over the span of twenty minutes, Harry’s goal is to avoid these obstacles while accumulating as much treasure as possible in the hopes of achieving a perfect score of 114,000 points. Those precious points may be lost, however, if Harry encounters the game’s obstacles. Moreover, the game is impossible to win if Harry refuses to descend the game’s ladders into a maze of subterranean tunnels. While Harry can traverse the length of the jungle more quickly through these tunnels, they are rife with dead ends that lead to wasted time and other life-threatening obstacles. To win the game, Harry needs to understand these tunnels ahead of time in order to obtain all of the game’s available points. Without a lay of the land, he is lost in the labyrinth.

Harry may lose his life to scorpions lying in wait, but his creator, David Crane, safely avoided the obstacles that video game developers must address today. Internet of things, subscription-based services, machine learning predictive activities, and their ilk, create externalities to consumers’ privacy interests that businesses ought to reconcile. Privacy laws enacted over the past twenty years attempt to mitigate the externalities by imposing obligations on businesses and creating rights for consumers. Companies, to effectively comply with these laws, should adopt privacy-by-design principles to enhance their privacy and security practices.

This article examines privacy-by-design through its historical context and current state of law, while also consulting with jurisdictional guidance. Where appropriate, the article provides examples relating to the gaming industry. However, application of privacy-by-design principles must be administered on an industry and company specific basis. While the expense of compliance may be great, privacy-by-design frameworks should be viewed as helpful roadmaps providing value to users rather than roadblocks to innovation.

I. Foundational Principles of Privacy-By-Design and the OECD Guidelines

Privacy-by-design requires accountability of companies collecting and maintaining personal information. The concept of accountability regarding personal information, however, is not new. In 1980, the Organization for Economic Co-operation and Development (OECD) published its Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (Guidelines),1 an influential text that informs many principles found in the General Data Protection Regulation (GDPR). The Guidelines offer rules to be adopted by member states that govern data flows between countries without disrupting economic growth and employment. Due to disparate regulations within the European Community at the time, the OECD endeavored to homogenize the protection of personal information and privacy through the Guidelines. “Accountability” in the Guidelines suggests that data controllers, i.e., organizations collecting and processing personal information, should comply with measures that ensure the following “principles”:

  • Collection Limitation - limits the amount of information collected and obtains appropriate consent;
  • Data Quality – confirms that information stored is relevant, complete, and accurate;
  • Purpose Specification – informs the data subject of the intended use for which the organization is collecting the information;
  • Use Limitation – limits disclosure of personal information to its specified purpose; and
  • Security Safeguards – provides reasonable security safeguards to data subjects when collecting, using, and storing personal information.2

Of course, the technology landscape has shifted dramatically since 1980, and these principles have become considerably more nuanced. The shift in sophistication of data privacy compliance is as dramatic as the leap from 8-bit gaming to virtual reality. Instead of binary choices of consent between button A or button B, regulators now seek to change corporate culture within organizations to protect the consumers they serve. In fact, the European Union requires data controllers to embed a culture of privacy within the company’s DNA via privacy-by-design principles that mirror the Guidelines’ recommendations, as detailed below.

II. The Birth of Privacy-By-Design

At the midpoint between the Guidelines’ release and the current regulations, the concept of privacy-by-design began to emerge. In the 1990s, Ann Cavoukian, Ontario’s former Information and Privacy Commissioner, began advocating for privacy-by-design principles. She argued that privacy cannot be achieved by complying solely with regulatory frameworks (a zero-sum approach to privacy); rather, organizations must adopt privacy as a mode of operation (a positive-sum approach) through the use of Privacy-Enhancing Technologies.3 To accomplish this “positive sum” approach, Ms. Cavoukian offered the following “Seven Foundational Principles” of privacy-by-design, which have influenced much of privacy law since the turn of the millennium.

  • Proactive and Preventative – companies must anticipate privacy incidents before they happen and prevent them from occurring;
  • Privacy by Default – users should receive the maximum degree of privacy without having to request that a system provide privacy protections;
  • Embedded Privacy – IT systems and business practices should embed privacy protections without post hoc implementations;
  • Full Functionality – privacy and security should accommodate each other without unnecessary trade-offs, i.e. it is possible to have both without sacrificing one or the other;
  • End-to-End Security – security must exist prior to collecting personal information, and run with the lifecycle of the data through deletion;
  • Visibility and Transparency – companies must comply with their stated promises and objectives regarding data processing, regardless of the business practice or technology involved; and
  • Respect for User Privacy – architects and operators must keep the data subject’s privacy interests at the center of their operations.4

By applying the “Seven Foundational Principles,” companies can gain a competitive advantage in the marketplace by empowering their customers with control over their personal information. These principles, as explained by Ms. Cavoukian, will provide greater privacy to consumers when applied to (1) IT systems, (2) accountable business practices, and (3) physical and network infrastructures for day-to-day activities.5

III. The General Data Protection Regulation

With much fanfare, the European Union’s General Data Protection Regulation6 (GDPR) went into effect in May 2018. Much of the Regulation is an effort to codify the principles found in the Guidelines, as well as to build upon its predecessor, the European Data Protection Directive (EDPD). The GDPR creates a series of novel requirements, discussed in theory for years, including special rules for high-risk processing,7 control over personal information by data subjects,8 the creation of a data protection officer,9 and rules surrounding automated decision making.10 Further, to bolster the accountability suggestions in the Guidelines, the European Commission enhanced the methodology by which companies adhere to purpose limitation, data minimization, and data integrity by incorporating privacy-by-design concepts into the Regulation.

Data protection by design and by default is required under Article 25 of the GDPR. Data controllers collecting and processing personal information of European data subjects are required to implement appropriate technical and organizational measures to protect the fundamental rights of data subjects.11 The text of the Regulation, unfortunately, does not elaborate on the meaning of “appropriate” measures. However, the European Parliament provides factors for complying with Article 25. Data controllers must consider the state of the art of technology at their disposal; the cost of implementation; the nature, scope, context, and purpose of the processing; and the likelihood and severity that the rights and freedoms of natural persons will be violated by the processing.12 This analysis must occur both at the design stage of new systems and at the time personal information is processed.13

Unfortunately for data controllers, the Regulation does not specify the technical steps they must take in order to comply with Article 25. Rather, the Regulation provides a non-exhaustive list of measures that could be taken, which include: (1) pseudonymization, (2) data minimization, (3) increasing the visibility of what data is processed, and (4) allowing data subjects greater control over their data.14 Implementation of appropriate technical and organizational measures must ensure that collecting and processing personal data is necessary for its stated purpose.15 This obligation applies to the amount of data collected, the extent of processing, the period of storage, and accessibility of the data. Should a data controller wish to use personal data outside its originally stated purpose, it must obtain the individual’s consent.16

Beyond protecting the “rights and freedoms of natural persons” while processing personal data, data controllers must be able to demonstrate compliance with the Regulation based on their technical and organizational measures, by default.17 “Default” in this context aligns with common usage as defined in computer science, which “refers to the pre-existing or preselected value of a configurable setting that is assigned to a software application, computer program, or device.”18 Without factory settings addressing privacy rights, the sheer amount and esoteric nature of options would overwhelm most data subjects. According to the GDPR, the burden of simplifying privacy to protect the rights and freedoms of individuals falls on the data controller.
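
To make the “default” concept concrete in code, consider a minimal sketch (all names hypothetical, not drawn from the Regulation) of a settings object whose factory values favor the data subject: the most protective option is preselected, and broader processing requires an affirmative opt-in.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    """Factory defaults favor the data subject; broader processing is opt-in."""
    analytics_tracking: bool = False      # off until the user opts in
    personalized_ads: bool = False        # off until the user opts in
    share_with_partners: bool = False     # no third-party disclosure by default
    profile_visibility: str = "private"   # narrowest visibility preselected
    retention_days: int = 30              # shortest supported retention period

# A new account receives the protective defaults without any action by the user.
settings = PrivacySettings()
assert settings.analytics_tracking is False
```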

Companies must keep detailed records of their processing operations, which, at a minimum, will be reduced to writing and include internal policies that meet the principles of data protection by design and by default.19 Should a member state’s Data Protection Authority (DPA) request documentation of compliance with Article 25, a data controller is required to provide this documentation.20 Article 30 of the GDPR outlines the type of records that must be kept, which include, inter alia, the (1) purposes of the processing; (2) categories of data subjects and their personal data; (3) categories of recipients of personal data; (4) description of the adopted technical and organizational security measures; and (5) retention periods for deletion of personal data.
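
For illustration only, the Article 30 categories enumerated above might be captured in a record structure along the following lines; the field names are this article’s assumptions, not terms prescribed by the Regulation.

```python
from dataclasses import dataclass

@dataclass
class ProcessingRecord:
    """One entry in an Article 30 record of processing activities."""
    purpose: str                        # (1) purpose of the processing
    data_subject_categories: list       # (2) categories of data subjects...
    personal_data_categories: list      #     ...and of their personal data
    recipient_categories: list          # (3) categories of recipients
    security_measures: str              # (4) technical and organizational measures
    retention_period: str               # (5) envisaged time limits for erasure

record = ProcessingRecord(
    purpose="Order fulfillment",
    data_subject_categories=["customers"],
    personal_data_categories=["name", "shipping address", "email"],
    recipient_categories=["payment processor", "shipping carrier"],
    security_measures="TLS in transit; AES-256 at rest; role-based access",
    retention_period="Deleted 24 months after last transaction",
)
```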

Companies can take several actionable steps to comply with Article 25 of the GDPR, despite the dearth of instruction in the Regulation itself. Practitioners should advise their clients to review processing systems and operating procedures for the following items:

  • Ensure that personal data is appropriately mapped, classified, labelled, stored, and accessible in the event a data subject submits a request to access, amend, or delete personal data;
  • Design systems to automate deletion of personal data;
  • Check that data collection forms are drafted to minimize the amount of personal data collected, in keeping with principles of data minimization;
  • Pseudonymize personal data where possible (a sketch follows this list); and
  • Store personal data in a structured, machine-readable, and interoperable format to allow the company to satisfy the requirements of data portability.
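
To sketch the pseudonymization item above: one common approach (assumed here, not mandated by the Regulation) replaces a direct identifier with a keyed-hash token, keeping the key that permits re-identification separate and under stricter access controls.

```python
import hashlib
import hmac
import secrets

# The secret key is kept apart from the pseudonymized data set, under
# stricter access controls (e.g., a secrets manager or HSM).
PSEUDONYM_KEY = secrets.token_bytes(32)

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a stable keyed-hash token.

    Without PSEUDONYM_KEY, the token cannot be linked back to the person
    by recomputing hashes of guessed identifiers, which is what
    distinguishes this from plain, unkeyed hashing.
    """
    return hmac.new(PSEUDONYM_KEY, identifier.encode(), hashlib.sha256).hexdigest()

# Analytics records carry the token rather than the raw email address.
record = {"customer": pseudonymize("harry@example.com"), "total_spend": 114.00}
```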

Given the factors found in Article 25 - such as the current state of the art, costs, and the scope of processing - the above is merely a start toward compliance with Article 25. Each company’s path to compliance will be specific to its industry and abilities. However, the European Data Protection Board (EDPB) provides clarity to those seeking compliance with Article 25 through the thematic principles of the GDPR.

IV. Implementing Data Protection Principles

The EDPB, an independent body entrusted with ensuring the consistent application of the GDPR, issued its guidance on implementing Article 25 in line with the Regulation’s overarching principles in November 2019.21 Accordingly, data protection by design and by default is not a self-contained requirement in the GDPR, but achieves compliance through a holistic approach to the Regulation whereby controllers incorporate the Regulation’s principles into their privacy designs.22 Controllers must address the following principles to comply with data protection by design and by default: (1) Transparency, (2) Lawfulness, (3) Fairness, (4) Purpose Limitation, (5) Data Minimization, (6) Accuracy, (7) Storage Limitation, and (8) Integrity and Confidentiality.23 Many of these principles are embedded in Article 25; however, insight can be drawn from the examples provided by the EDPB, which we will decipher in turn. The following subsections help practitioners utilize the EDPB’s advice in order to apply these principles to their clients’ products, IT systems, and business practices.

A. Transparency

Controllers must take care to be as clear and open as possible with data subjects about the collection, use, and disclosure of personal data. According to the EDPB, it is the controller’s responsibility to ensure that data subjects understand the methods by which they can invoke their rights.24 When constructing privacy policies, according to the example in the EDPB’s guidelines, companies should refrain from providing lengthy scrolls of information that are difficult for the average reader to understand. Multi-layered notices provide clarity to the data subject, and the use of drop-down menus and links to further explanations of company policies is ideal. It is also suggested that controllers make use of different channels and media, such as video clips, to explain the most important aspects of their policies.25

Controllers must also increase the accessibility of their privacy notices to comply with by-design-and-by-default requirements.26 To ensure the satisfaction of supervisory authorities, a company should make its privacy policy available on all internal webpages, so the information is always one click away. Of course, this information should also be designed in accordance with standards of universal design to make it accessible to all. Best practices for meaningful conveyance of information include just-in-time notice, where informational snippets or pop-ups grab the data subject’s attention to explain how data will be processed and why it is necessary before any collection takes place.27

B. Lawfulness

Controllers must identify the legal basis for processing personal data and provide notice of that basis to data subjects.28 The lawful basis for collecting data should extend through the entire lifecycle of information; i.e., each processing activity should have a legal basis, and the controller must be able to identify that basis for each use. Moreover, in order for processing to be lawful, it must also be necessary. If realistic alternatives for achieving the same purpose exist, then processing information is not lawful according to the EDPB. In instances where interests in the data are divided, the controller must carry out an objectively weighed balancing of interests. The controller should disclose to data subjects its assessment of balancing these interests. Additionally, data subjects must be able to withdraw consent as easily as they gave it. If the means by which a data subject can withdraw consent do not exist, or are not easily attained, then consent is not valid. If the legal basis for processing information no longer exists, then processing that information must cease accordingly.29

An example given by the EDPB to assess the lawfulness of processing personal data can help further explain how a by-design-and-by-default assessment works in practice. A bank wants to retrieve tax data about its customers from public authorities to improve the management of loan applications.30 Of course, the bank is on the up-and-up, so it first requests permission from its customers to contact the public authorities for this information. The bank believes this practice fulfills the GDPR’s principle of lawfulness because the information is necessary for a contract. While the bank does need that information before issuing a loan to its customer, the method is not lawful despite the permission from the customer to contact public authorities, e.g., the tax administration of the member state. The method, according to the EDPB, is what is important, not necessarily the need. There is a realistic alternative to the bank’s proposed method since a loan may be granted without obtaining data directly from the tax administration. Because the customer is able to enter into a contract by providing the information from the tax administration herself, the bank must refrain from implementing its idea. The bank’s processing options, by default, should not allow retrieval of data directly from any sources other than the data subject.31

C. Fairness

Fairness requires data controllers to process personal information in a way that is not detrimental, discriminatory, unexpected, or misleading.32 The principle of fairness supports other rights, such as the right to information, the right to intervene – including data portability – and the right to limit processing. When considering the measures and safeguards necessary to embed privacy into business practices and products, data controllers should incorporate elements of autonomy over personal data for their users,33 provide data subjects the ability to communicate and exercise rights, ensure that expectations concerning the use of data are honored, and provide consumer choice to data subjects. This last element ensures that users can invoke their rights to data portability. Regardless of whether a service or good is personalized or proprietary - or both - the data controller is required to accommodate the transfer of personal data to another controller at the direction of the data subject.34
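
As a brief, non-authoritative illustration of accommodating portability, a controller might serialize a data subject’s profile into a structured, machine-readable format such as JSON; the `export_profile` helper below is hypothetical.

```python
import json
from datetime import date

def export_profile(profile: dict) -> str:
    """Serialize a data subject's profile to JSON for portability.

    JSON is one common choice of a structured, commonly used,
    machine-readable format; CSV or XML could also serve.
    """
    return json.dumps(profile, default=str, indent=2)

portable = export_profile({
    "name": "Pitfall Harry",
    "email": "harry@example.com",
    "member_since": date(2021, 5, 12),
    "purchase_history": [{"item": "jungle-map", "price_eur": 9.99}],
})
print(portable)  # hand this file to the data subject or the next controller
```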

As we enter a world where providers are beginning to offer streaming services for video games from remote Internet servers, such as Google’s Stadia, the EDPB’s guidelines on fairness concerning data protection by design and by default are timely. For controllers offering varying standards of quality based on subscription level, the prioritization of service may be tied to the type of subscription the user purchased.35 If a subscriber, for example, opts for a regular subscription rather than a premium option, the controller may prioritize premium customers over regular users. Fairness, however, dictates that all data subjects have equal rights and freedoms under the GDPR. While premium customers may be given priority when asking for customer service, they may not be given priority over regular users when invoking their rights under the Regulation.36

D. Purpose Limitation

Data controllers must design their products in a manner that processes information in a way compatible with the purposes for which it is collected.37 That design must be limited to what is necessary to achieve the purposes disclosed to the data subject at the time of collection. If a controller finds it necessary to amend the purposes of its processing, it must comply with Article 6(4)38 in an effort to obtain the appropriate consent from the data subject. The data controller must predetermine the purposes of the processing to comply with data protection by design. Such predetermined purpose must also be stated with specificity to make it explicitly clear why the data is being processed. The purpose should guide the design process and set the boundaries for processing data. And as previously stated, the specified purpose must be necessary for the processing.39

Companies should be particularly mindful of purpose limitation when collecting personal information from data subjects pursuant to fulfilling payment obligations, e.g. the data stored for purchase history: name, address, email address, and telephone number. For example, a company may license Customer Relationship Management (CRM) products to provide itself a 360-degree view of its data subjects for better customer service.40 Before licensing a CRM, a company must document the purpose of its use. However, should the company begin to use its CRM to analyze customers’ purchasing power for the purpose of a targeted advertising campaign, then the data controller exceeds the original purpose of the CRM and violates the purpose limitation principle. Of course, this does not necessarily mean the data controller is unable to use the CRM for its marketing purposes. The controller would need the provider of the product to map the different processing activities using personal data with the purposes relevant to the campaign. With a complete data map of the flow of information in hand, the data controller can begin to assess whether its activities are in line with a legitimate purpose. If necessary, the controller may need to notify data subjects of its new purpose for processing data before proceeding.41
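
As a rough illustration of such a data map (structure and names assumed, not drawn from the EDPB’s example), each processing activity can be tied to its documented purpose and legal basis, making gaps visible before a new campaign launches.

```python
# Hypothetical data map: each CRM processing activity is tied to a documented
# purpose and legal basis before any personal data flows through it.
DATA_MAP = {
    "customer_service_lookup": {
        "data": ["name", "email", "purchase_history"],
        "purpose": "customer service (the CRM's original purpose)",
        "legal_basis": "performance of a contract",
    },
    "purchasing_power_analysis": {
        "data": ["purchase_history", "payment_amounts"],
        "purpose": "targeted advertising campaign (new purpose)",
        "legal_basis": None,  # compatibility assessment and notice still needed
    },
}

def activities_needing_review(data_map: dict) -> list:
    """Flag activities that lack a documented legal basis."""
    return [name for name, entry in data_map.items()
            if entry["legal_basis"] is None]

print(activities_needing_review(DATA_MAP))  # ['purchasing_power_analysis']
```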

E. Data Minimization

Data minimization is the counterpart of the OECD’s use limitation, and the EDPB specifies that controllers should incorporate it into their data protection by design and by default analyses. As such, “[o]nly personal data that is adequate, relevant, and limited to what is necessary for the purpose shall be processed.”42 The controller is therefore responsible for predetermining the features and parameters of its processing systems and whether they are permissible. If personal data becomes irrelevant and unnecessary to its original purpose, then it should be deleted or anonymized. Here, controllers should assess the state of the art in order to verify whether technology, processes, or procedures exist that could make the purpose of processing personal data obsolete. Techniques such as pseudonymization and anonymization of data may be appropriate when deleting information is unnecessary. In situations where the initial processing requires identifiable data but the final data set does not, the controller must anonymize personal data as soon as identification is unnecessary. Likewise, if a controller needs to identify a customer for future processing activities, the data should be pseudonymized to limit the risks to the data subject.43

The EDPB advises data controllers to assess the information necessary to serve their customers.44 When collecting payment information for goods and services from data subjects, gaming developers should consider the method of delivery before providing webforms to complete a purchase. A standard contact form where all fields are required could result in collecting too much information, such as a customer’s date of birth and telephone number, when those details serve no purpose. Not all fields are necessary for the purposes of buying a product. Companies should tailor their web forms to the specific product being ordered. A customer’s address is not necessary when he orders an electronic copy of a game; the same is clearly not true when shipping a physical edition of the product. Separate webforms should be created for the e-version of the game and the physical copy.45
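
A sketch of that web-form logic, with assumed field names: the set of required fields is derived from the delivery method, so a customer ordering a download is never asked for a shipping address.

```python
# Required fields depend on how the product is delivered; collecting more
# than the delivery method needs would offend data minimization.
REQUIRED_FIELDS = {
    "digital":  ["name", "email"],                      # download link only
    "physical": ["name", "email", "shipping_address"],  # needed to ship
}

def validate_order_form(delivery: str, submitted: dict) -> None:
    required = REQUIRED_FIELDS[delivery]
    missing = [f for f in required if f not in submitted]
    extra = [f for f in submitted if f not in required]
    if missing:
        raise ValueError(f"missing required fields: {missing}")
    if extra:
        # E.g., a date of birth or address on an e-copy order exceeds
        # what is necessary for the purpose.
        raise ValueError(f"form collects unnecessary fields: {extra}")

validate_order_form("digital", {"name": "Harry", "email": "harry@example.com"})
```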

F. Accuracy

Given the slew of principles the EDPB requires data controllers to follow, the need for maintaining accurate and up-to-date data should be no surprise. Companies must follow a reasonableness standard to ensure data is correct, rectifying inaccuracies without delay.46 Of course, this principle should be viewed through the lens of the “cost, nature, scope, and risk to the rights and freedoms of data subjects” standard found in Article 25 (e.g. where the cost of correcting information is high and risk of violating the rights of the data subject is low, it may not be necessary to keep accurate records). Multinational companies with customers in European member states should take extra care to correct information that may implicate the rights and freedoms of their data subjects. Games using geolocation technology, e.g. Pokemon Go, that retain inaccurate records of their users’ location may increase the risks to a corresponding user’s freedoms if those records are subpoenaed by law enforcement.

Additionally, according to the GDPR, automated decision-making software presents risks to the rights and freedoms of data subjects.47 Companies using artificial intelligence (AI) to profile customers or predict their behavior should take particular care that the data used to achieve their results are accurate. Moreover, controllers must ensure that their use of AI is non-discriminatory. Assuming the purpose for training AI to predict results has a legal basis, data controllers need to ensure that their data is accurate and representative of the population in order to avoid bias.48
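
As one hedged illustration of checking representativeness before training, a controller could compare the demographic mix of its training data against known population shares and flag material gaps; the groups, shares, and tolerance below are assumptions made for the sketch.

```python
from collections import Counter

# Hypothetical population shares the training data should roughly reflect.
POPULATION_SHARES = {"18-34": 0.30, "35-54": 0.35, "55+": 0.35}
TOLERANCE = 0.10  # flag any group more than 10 points off its population share

def representativeness_gaps(training_age_groups: list) -> dict:
    """Return groups whose share in the training data deviates materially."""
    counts = Counter(training_age_groups)
    total = len(training_age_groups)
    gaps = {}
    for group, expected in POPULATION_SHARES.items():
        observed = counts.get(group, 0) / total
        if abs(observed - expected) > TOLERANCE:
            gaps[group] = observed - expected
    return gaps

# A sample skewed toward younger users: all three groups get flagged.
sample = ["18-34"] * 70 + ["35-54"] * 20 + ["55+"] * 10
print(representativeness_gaps(sample))
```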

G. Storage Limitation

Companies must implement adequate controls to limit the amount of time personal information is maintained on their servers.49 Accurate data mapping and inventories are crucial to determine exactly what personal data is maintained and why it is processed. According to the EDPB, the purpose of the processing determines how long personal data may be stored. Moreover, documented internal policies covering deletion procedures, automated deletion, the criteria for length of storage, and the manner of enforcing retention policies must be in place to comply with the Regulation.50

The EDPB’s example regarding storage limitation concerns personal data retained after a user cancels her membership to the controller’s services. To make deletion more effective, companies should opt for automatic systems that delete data regularly after memberships are cancelled, instead of manual deletion by employees. Systems should be configured to follow a given procedure that conforms to the company’s written policy on data retention. It is also important for the company to regularly review its retention policy and update it as needed.51
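
A minimal sketch of such an automated deletion job follows; the 30-day grace period and record layout are assumptions standing in for whatever the company’s written retention policy actually specifies.

```python
from datetime import datetime, timedelta

GRACE_PERIOD = timedelta(days=30)  # assumed figure; set by the written policy

def purge_cancelled_members(members: list, now: datetime) -> list:
    """Drop records for memberships cancelled longer ago than the grace period.

    Run on a schedule (e.g., a nightly cron job) so deletion happens
    automatically per the documented retention policy, not by hand.
    """
    def expired(m: dict) -> bool:
        cancelled = m.get("cancelled_at")
        return cancelled is not None and now - cancelled > GRACE_PERIOD

    return [m for m in members if not expired(m)]

members = [
    {"id": 1, "cancelled_at": datetime(2021, 1, 1)},  # well past the grace period
    {"id": 2, "cancelled_at": None},                  # still an active member
]
print(purge_cancelled_members(members, datetime(2021, 5, 12)))  # keeps id 2 only
```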

H. Integrity and Confidentiality

The trinity of data security principles – confidentiality, integrity, and availability – must be observed to provide adequate protection to data subjects’ information,52 and data protection by design and by default will guide companies in achieving these protections for their customers. Integrity and confidentiality, according to the EDPB, will prevent data breaches and incidents, facilitate proper data processing activities, and allow individuals to exercise their rights under the GDPR.53 Just as storage limitation calls for ongoing review of policies, companies should continually review the appropriateness of their information security measures to identify vulnerabilities in their systems. Controllers should regularly conduct risk assessments and resiliency tests, initiate proper controls for accessing personal data, implement secure data transfer procedures (e.g., encrypted transmissions), store data in a secure manner and establish routine backups of information, and put in place a security incident response procedure and team for when an event occurs.54
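
To illustrate just one control from that list – encryption of personal data at rest – here is a hedged sketch using the third-party cryptography package’s Fernet recipe (authenticated symmetric encryption); a real deployment would add key management, access controls, and backups around it.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In production the key lives in a secrets manager or HSM, never in code,
# and access to it is restricted to the roles that need it.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt before writing to storage; the ciphertext is what gets backed up,
# which also supports the breach-safe-harbor point discussed later.
ciphertext = fernet.encrypt(b"harry@example.com")

# Decrypt only inside an authorized code path.
plaintext = fernet.decrypt(ciphertext)
assert plaintext == b"harry@example.com"
```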

An information security management system55 can help manage policies and procedures for information security.56 Practitioners should impress on their clients the need for adequate and robust security, despite the pushback they will inevitably receive from management and developers regarding high costs and stifled innovation, respectively. It’s an old adage, but a salient one: security incidents are not an “if,” but a “when.” No company is immune to bad actors. An advanced security policy will limit liability and protect a controller’s reputation whenever data is compromised by unauthorized access or cyber-attacks.

V. Penalties And Fines

If a data controller violates the GDPR, it runs the risk of severe penalties. Infringements of Article 25 are subject to administrative fines of up to 10,000,000 Euros or 2% of total worldwide annual turnover for the preceding year, whichever is higher.57 However, if such violations are the result of non-compliance with a DPA’s order or temporary limitation on processing data, those fines may double.58 In short, the GDPR commands attention because of the ruinous fines it may impose on companies. Facebook or Microsoft may absorb a 2-4% revenue hit in any given year, but a startup client will most likely find itself fatally stung by a tunnel scorpion if it is not adequately prepared.
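
The arithmetic of those caps is simple but worth making explicit; the sketch below uses illustrative turnover figures only.

```python
def max_article_25_fine(turnover_eur: float, dpa_order_violated: bool = False) -> float:
    """Upper bound on an Article 25 fine under Article 83(4):
    EUR 10M or 2% of total worldwide annual turnover, whichever is higher.
    Non-compliance with a DPA order doubles the cap to EUR 20M or 4%
    (see art. 83(5)(e)).
    """
    base_cap, rate = (20_000_000, 0.04) if dpa_order_violated else (10_000_000, 0.02)
    return max(base_cap, rate * turnover_eur)

# A EUR 50B platform risks EUR 1B; a EUR 5M startup still faces the EUR 10M floor.
print(max_article_25_fine(50_000_000_000))  # 1000000000.0
print(max_article_25_fine(5_000_000))       # 10000000
```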

VI. The State of Privacy-By-Design in the United States

Multinational companies in gaming and other industries surely must keep abreast of the GDPR. Data owners with a solely United States market, however, may be tempted to ignore privacy-by-design. While privacy-by-design is not explicitly referenced in any state or federal statute, aspects of the principles outlined above are reflected in current law. The most obvious examples are the Health Insurance Portability and Accountability Act59 (HIPAA) and the Gramm-Leach-Bliley Act60 (GLBA). Both require covered entities and institutions to follow reasonable security standards to protect Protected Health Information and customer information, respectively. Of course, the gaming industry and other entertainment sectors need not worry about these statutes. They are, however, the first dominoes to fall in a trend toward protecting consumers’ personal information.

The federal government has adopted other statutes as well, in line with OECD principles, that have moved the goalposts closer to a robust set of protections for individuals. The Fair and Accurate Credit Transactions Act of 200361 (FACTA) amended the Fair Credit Reporting Act to limit the amount of data that Credit Reporting Agencies are allowed to disclose, namely through the truncation of credit and debit card numbers on receipts,62 which is in line with data minimization. FACTA also provides consumers the right to a free annual credit report from each of the three national consumer credit agencies, promoting transparency and accuracy. Perhaps most importantly for our purposes, FACTA also establishes a Disposal Rule requiring entities using consumer reports for a business purpose to discard, abandon, sell, or transfer personal information in a reasonable manner that protects against unauthorized access to consumers’ data.63 Clearly, principles of storage limitation and integrity and confidentiality are adopted under FACTA as well. While federal statutes are adopting privacy-by-design principles in a sectoral and piecemeal fashion, the trend is clearly moving in a direction where other industries should take notice.

States are moving the needle toward enhanced privacy protections for individuals more quickly than the federal government. All 50 states currently have some form of privacy legislation on their books in the form of data breach statutes, most of which at least adopt the confidentiality and integrity principle by mandating reasonable security standards.64 These states generally provide a safe harbor defense to companies that maintain consumer personal information whereby unauthorized access of encrypted data does not result in liability to the company.65

Generally considered the most restrictive data security law in the country, the Massachusetts data security regulation establishes minimum safeguards companies must extend to consumers when they maintain their personal information.66 It goes far beyond data breach notification by requiring companies to, inter alia, designate an Information Security Officer,67 anticipate risks to personal information and take mitigating steps to prevent unauthorized access of information,68 develop a comprehensive security program,69 impose penalties for violations of the program’s rules,70 establish controls to prevent former employees from accessing personal information,71 and review the security program at least annually and assess whether business changes could impact security.72 Common threads between the GDPR and the Massachusetts law are obvious. While the Massachusetts law does not explicitly reference privacy-by-design, companies engaging with Massachusetts residents must think through the risks to personal information and implement appropriate security procedures prior to collecting information. Ongoing assessments of the security standard are necessary, and without consideration of the state of the art’s effect on the current program, companies conceivably fail to adequately review their safeguards.

The California Consumer Privacy Act (CCPA), while not as demanding as the Massachusetts rules in terms of security protections, provides its citizens with a private right of action against companies that do not adequately protect their personal information.73 The California Legislature punted the specific requirements for reasonable security procedures and practices to the Attorney General (who, in turn, has yet to develop regulations on what exactly a reasonable security practice means). Be that as it may, the CCPA fully embraces the spirit of the GDPR’s desire to empower data subjects. California citizens may make various requests regarding the information companies maintain,74 giving consumers unprecedented transparency into that data. The CCPA also provides California citizens with a mechanism to request that companies “Do Not Sell” personal information,75 promoting fairness and purpose limitation, as well as a right to request that a company delete the information it possesses,76 invoking data minimization. Like the Massachusetts law, privacy-by-design is not referenced by name. However, the principles underlying the concept are clearly influencing legislation.

This trend is not slowing down either. Within a year of the CCPA going into effect, Californians saw the California Privacy Rights Act (CPRA) on their ballots in November 2020.77 The CPRA proposes many key amendments to the CCPA, not the least of which include: (1) an expanded definition of “sensitive personal information” and the ability to limit the use of such information; (2) the creation of a new enforcement agency overseeing implementation of the Act; (3) restrictions on automated decision-making and profiling; (4) a consumer right to correct data; and (5) data retention obligations based on necessity. This select list of proposed amendments goes far beyond the current state of privacy legislation found elsewhere in the United States. Companies may need to inform consumers of the purpose of their collecting and processing practices, and consumers may decline that purpose. If the necessity rule under the CPRA is to have any teeth, it’s safe to assume the new enforcement agency will, at a minimum, invoke a rationale similar to the EDPB’s in providing guidelines on implementation. Even if the new agency does not mention privacy-by-design by name, the distinction may be one without a difference.

Beyond the current or imminent state of privacy legislation in the United States, enforcement actions under state unfair competition laws and the Federal Trade Commission Act are beginning to invoke privacy-by-design principles. The Uber data breach kerfuffle78 was settled in September 2018. Among the host of restrictions and requirements in the Final Judgment and Permanent Injunction, section 16, entitled “Corporate Integrity Program,” requires Uber to “develop, implement and maintain a process, incorporating privacy-by-design principles, to review proposed changes to Uber’s applications, its products, and any other ways in which Uber uses, collects, or shares data collected from or about Riders and Drivers.”79 No definition of “privacy-by-design” is offered in the Judgment, however.

The Federal Trade Commission (FTC), fortunately, is more helpful than California regarding implementation of privacy-by-design principles. Since 2011, the FTC has settled unfair and deceptive business practice matters by requiring companies to adopt comprehensive privacy programs consistent with the Commission’s privacy-by-design approach.80 In 2012, the FTC began providing businesses with recommendations for the privacy standards it expects, and ostensibly will include in upcoming consent decrees. In the Commission’s view of an appropriate privacy framework, companies should incorporate privacy-by-design principles that include adequate data security, reasonable collection limits, sound retention practices, and data accuracy.81 While not formally adopted by statute, companies should be aware that states and the FTC expect privacy-by-design in unfair and deceptive business practice enforcement actions.

VII. Conclusion: The Path to 114,000 Points

David Crane sold his Atari cartridges without paying any mind to his consumers’ personal information. All he had to care about was the number of copies he sold and the cash pouring into his bank account. The gaming industry now has peripheral concerns well beyond content creation. With higher and higher degrees of interactivity incorporated into product design, more and more responsibility for consumer data is assumed. Content creators must be mindful of the personal information they collect and maintain. While privacy-by-design is not explicitly required under U.S. law, its principles are scattered throughout legislation. Multinational companies with a European footprint should possess a documented privacy policy that complies with Article 25 of the GDPR. As a matter of best practices, it’s advisable for purely U.S. companies to adopt the guidance provided by the EDPB in developing a comprehensive privacy and security program. Otherwise, companies may find themselves lost in the tunnels on their quest to achieve a perfect score amongst their potential customers. Empowering users to take control of their data provides a sign of good faith that companies can commodify. Practitioners should be able to explain that increased privacy and security do not necessarily stifle innovation and can bring value-added components to their clients’ products. Privacy-by-design can guide a better path to 114,000 points.

Endnotes

1. Organisation for Economic Co-operation and Development, Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (Sept. 23, 1980), http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectionofprivacyandtransborderflowsofpersonaldata.htm (accessed May 2020).

2. Id.

3. Ann Cavoukian, Privacy-by-design, revised January 2011, https://www.ipc.on.ca/wp-content/uploads/resources/7foundationalprinciples.pdf (accessed May 2020).

4. Id.

5. Id.

6. The GDPR protects the “personal data” of “data subjects” and its use by “data controllers,” i.e., companies using personal data. Personal data refers to information that relates “to an identified or identifiable natural person.” EU GDPR art. 4(1).

7. Recital 75 of the GDPR provides several examples of “high-risk processing” that include discrimination; identity theft, fraud, or financial loss; damage to reputation; reversal of pseudonymization without consent; and deprivation of rights and freedoms.

8. See art. 15 (right to access by the data subject); art. 16 (right to rectification); art. 17 (right to be forgotten); art. 18 (right to restriction of processing); art. 20 (right to data portability); and art 21 (right to object).

9. Art. 37.

10. Art. 22.

11. EU GDPR art. 25 (1).

12. Id.

13. Id.

14. Art. 25 (1) – (2).

15. Art. 25 (2).

16. Id.

17. Recital 78.

18. European Data Protection Board, Guidelines 4/2019 on Article 25 Data Protection by Design and by Default, page 10, November 13, 2019, https://iapp.org/media/pdf/resource_center/edpb_guidelines_201904_dataprotection_by_design_and_by_defauld.pdf (accessed May 2020).

19. Id.

20. EU GDPR art. 31.

21. See id. at 13.

22. See Norwegian Data Protection Authority, Software Development with Data Protection by Design and by Default, Nov. 28, 2017.

23. Id. at 13-24.

24. Id. at 14.

25. Id. at 14-15.

26. Id. at 14.

27. Id.

28. Id. at 15.

29. Id.

30. Id. at 15-16.

31. Id.

32. Id. at 16.

33. See n. 9 for examples of data subject rights.

34. European Data Protection Board, supra n. 19, at 16.

35. See id. at 17.

36. Id.

37. Id. at 18.

38. Article 6(4) provides a factor test to establish the compatibility of processing data for a purpose that exceeds its original purpose to the extent the secondary processing is lawful.

39. European Data Protection Board, supra n. 19, at 18.

40. See id.

41. Id.

42. Id. at 19. See EU GDPR art. 5(1)(c).

43. Id.

44. Id.

45. See id.

46. Id. at 21.

47. EU GDPR art. 22. See also WP29 Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679, Feb. 6, 2018, https://iapp.org/media/pdf/resource_center/WP251rev.01.pdf (accessed May 2020).

48. European Data Protection Board, supra n. 19, at 21.

49. Id. at 22.

50. Id.

51. Id. at 23.

52. ISO/IEC 27001.

53. European Data Protection Board, supra n. 19, at 23.

54. Id. at 24.

55. An information security management system is a risk management process designed to manage sensitive information. See ISO/IEC 27001.

56. European Data Protection Board, supra n. 19, at 23-24.

57. EU GDPR art. 83 (4)(a).

58. Art. 83 (5)(e).

59. 45 CFR Part 160 (2000).

60. 16 CFR Part 314 (2002).

61. 15 U.S.C. §§ 1681-1681x (2003).

62. 15 U.S.C. § 1681c (g)(1).

63. 17 CFR Part 248 (2005).

64. National Conference of State Legislatures, Security Breach Notification Laws, Mar. 8, 2020, https://www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-laws.aspx (accessed May 2020).

65. Practical Law Data Privacy Advisor, State Data Breach Laws Protected Personal Information Chart: Overview, https://content.next.westlaw.com/Document/I335629461e8b11e598db8b09b4f043e0/View/FullText.html?contexconte=(sc.Default)&transitionType=Default&firstPage=true&bhcp=1 (accessed May 2020).

66. See Massachusetts 201 CMR 17.00.

67. § 17.03 (2)(a).

68. § 17.01 (1).

69. § 17.03 (1).

70. § 17.03 (2)(d).

71. § 17.03 (2)(e).

72. § 17.03 (2)(i).

73. Cal. Civ. Code § 1798.150 (a)(1) (2018).

74. See §§ 1798.100, 1798.110, and 1798.115.

75. § 1798.120.

76. § 1798.105.

77. Caitlin Fennessey, CPRA’s Top-10 Impactful Provisions, IAPP, https://iapp.org/news/a/cpra-top-10-impactful-provisions/ (accessed May 2020).

78. In 2016, Uber experienced a data breach affecting the personal information of 57 million riders and drivers. See Kate Conger, Uber Settles Data Breach Investigation for $148 Million, New York Times, September 26, 2018, https://www.nytimes.com/2018/09/26/technology/uber-data-breach.html (accessed May 2020).

79. Final Judgment and Permanent Injunction at 10, People v. Uber Technologies, Inc., No. CGC-18-570124 (Cal. Super. Ct.) (emphasis added), https://oag.ca.gov/system/files/attachments/press-docs/uber-final-judgmentscanned_0.pdf (accessed May 2020).

80. See In the Matter of Google Inc., FTC Docket No. C-4336, FTC File No. 102-3136 (Oct. 13, 2011) (consent order) https://www.ftc.gov/sites/default/files/documents/cases/2011/10/111024googlebuzzdo.pdf (accessed May 2020).

81. Federal Trade Commission, Protecting Consumer Privacy in an Era of Rapid Change, pages 22-30, March 2012. https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf (accessed May 2020).



Sean Michael Ashworth is a graduate of Chicago-Kent College of Law and licensed to practice in Tennessee and the District of Columbia. In addition to his law credentials, he is also a Certified Information Privacy Professional in United States privacy law and Certified Information Privacy Manager through the International Association of Privacy Professionals. Mr. Ashworth develops intellectual capital in privacy and regulatory compliance for Cumberland Trust & Investment Company, where he also manages litigation issues and high net worth trust accounts as a Vice President & Trust Officer.