April 22, 2022

Creating Competitive Advantage

Privacy and Security by Design in mHealth and Digital Health Products

Alea Garbagnati, Esq. and Lauren Wu, Esq.

Introduction

The pandemic has led to unprecedented adoption of digital health tools in healthcare delivery,1 which has coincided with consumers’ increased engagement in their personal health and embrace of digital health products in their everyday lives.2 Technological advances, increased access to data, and the rapid generation and digitization of data3 have led to science fiction-like innovation. Data proliferates throughout the healthcare product pipeline: from patient-focused mHealth4 applications5 that allow individuals to self-report all aspects of their physical and mental wellbeing, to wearables measuring everything from steps and sleep to heartbeats, blood pressure, and oxygenation, to large, high-throughput molecular testing instruments and sequencers found in hospital and laboratory settings. Data is generated by and accessible to both patients and practitioners at unprecedented levels, coming from these products as well as from increased access to testing.6 This trend has been further catalyzed by the increased interoperability and direct accessibility of medical information by consumers via application programming interfaces (APIs), which enable the consumer to connect to their healthcare providers’ electronic health or medical records (EHR/EMR). For both regulators and healthcare providers, this improved accessibility is driven by the underlying goal of fostering patient engagement, i.e., patients having a better understanding of and interest in their own health and the ability to better manage their health and wellness, which - in turn - could lead to cost savings and improved outcomes.7
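
To make the API-driven accessibility described above concrete, the sketch below shows what a consumer-facing query against an EHR might look like, assuming a SMART on FHIR-style endpoint; the base URL, patient identifier, and token are hypothetical placeholders.

```python
# Minimal sketch of consumer API access to an EHR via HL7 FHIR, a standard
# commonly used for such interoperability. All endpoint details are hypothetical.
import requests  # third-party package: pip install requests

FHIR_BASE = "https://ehr.example.com/fhir"  # hypothetical provider endpoint
ACCESS_TOKEN = "token-from-oauth-consent"   # obtained via the patient's OAuth 2.0 consent

response = requests.get(
    f"{FHIR_BASE}/Observation",
    params={"patient": "example-patient-id", "category": "vital-signs"},
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Accept": "application/fhir+json",
    },
)
observations = response.json()  # a FHIR Bundle of the patient's vital-sign records
```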

With these revolutionary changes, and in line with improved awareness of the sensitivity and vulnerability of data,8 comes an increased need for cybersecurity and privacy protection. Gone are the days when privacy and security features and controls were considered “nice to have” components of a digital health product.9 The ongoing slew of cyberattacks, data breaches, increased interest from regulators,10 and headline-worthy security flaws, in combination with the increased sensitivity11 and volume of health data,12 underscores how vital these features are.

Privacy and cybersecurity features are no longer simply checkmarks on a long list of product requirements needed for launch or beta-stage testing. Rather, these features are key components of healthcare products, whether to garner customer13 trust, gain a competitive advantage, or pass regulatory scrutiny. Companies developing these health and wellness products,14 already pressured to bring products to market at an unprecedented speed, may be tempted to release products that lack the very privacy or security features that customers, regulators, and users may assume - or, worse, expect - to be integrated.15 During product design, these features may also be viewed as an impediment to the user experience rather than an enhancement of it, and thus be omitted.16 Decisions about whether and how to include privacy and security functions are fundamental to strategizing the development of new products and their maintenance throughout a product’s lifecycle. The most effective means to ensure that these capabilities and controls are integrated into digital health and mHealth products is for the manufacturers of such products to have in place fit-for-purpose privacy and security by design processes.

While the concepts of privacy and security by design are by no means new,17 the case for integrating privacy and security awareness and intention into the lifecycle of healthcare products, and of any data processed by such products, has perhaps never been stronger. For mHealth and digital health products, the time is right to ensure that privacy and security by design are embedded in product and data lifecycles and in the company’s very culture. Such processes and controls can and should be reframed from an administrative burden and creative hindrance to a reward that translates into a competitive advantage and market differentiator.

Setting the Stage: A Brief Look at the Laws and Regulations Applicable to Digital Health and mHealth Products

Manufacturers of digital health and mHealth products are subject to a wide range of laws and regulations; in the United States, those laws apply at both the state and federal levels. While some of the laws and regulations that apply to digital health and mHealth products may be obvious, in many cases manufacturers entering this area may not be aware of the varied requirements that will be imposed on them, the data they may process, or their products. The following will briefly summarize some of the laws and regulations that establish privacy and security requirements that may directly impact digital health and mHealth products, the data processed by such products, and the manufacturers of those products.18

It May be Health Data, But HIPAA Won’t Usually Apply: Digital Health and mHealth Products and Applicable Privacy Law

Privacy is a critical consideration for digital health and mHealth products. At the outset, it is important to recognize that products by themselves are generally not subject to privacy laws; rather, products must be considered in conjunction with the entities using the products, manufacturing or developing the products or, often, the data itself.19 A product with otherwise privacy-centric features may still be misused, or may not be configured appropriately upon installation, limiting the performance of those features. Similarly, a product with appropriate security features may nevertheless be used in a way where those features are not appropriately implemented by the end user, thus hindering the security of the product itself or the processing activities performed by the product. As a result, the conversation around product privacy and security should not be focused solely on how to make products “HIPAA compliant” or “GDPR compliant” to the exclusion of other considerations, because “compliance” relies in part on how the products are integrated into the end user’s environment and how the user of the product or the manufacturer processes the data. In that sense, it would be more accurate to describe products as being privacy-enabling or privacy-supporting of applicable laws and regulations.

The focus, then, for manufacturers should be on the privacy-enabling features and controls that should be incorporated into their products based on relevant privacy frameworks, as well as the obligations that processing data through the products will place on the customers, manufacturers, sub-contractors, and even the end users or patients themselves. The extent that any specific privacy law is implicated, therefore, will not always be the same, even for arguably similar products. Such differentiating factors might include the intended end user of the product (e.g., direct to consumer versus healthcare professionals), the nature of the product (e.g., on-premises instrument versus cloud-based platform versus mobile app), the interoperability of the product (e.g., data stored locally on the device or elsewhere), and also the manufacturer’s role relative to the data (e.g., whether the manufacturer determines the purpose and means of data processing (i.e., as the data controller), or acts on the instruction of others (i.e., as the data processor), with respect to the data processed by a product),20 among other factors.

HIPAA may seem like an obvious starting point for U.S.-based manufacturers or manufacturers developing products for U.S.-based customers. But while HIPAA is the law most commonly associated with health data, it is a misconception among both regulators and consumers that all companies in the health or healthtech, biotech, and pharmaceutical industry, or that handle health data, are governed by HIPAA.21 With some exceptions, product manufacturers are very rarely directly subject to HIPAA.22 Even where direct-to-consumer (DTC) health applications (apps) and products process health information, that processing will generally not be covered by HIPAA.23 Similarly, a manufacturer with a product that processes protected health information (PHI) and other sensitive data, but does not itself access or process the data, will likely not be directly subject to HIPAA. Nevertheless, it is often necessary or advisable for manufacturers to use HIPAA as a framework for privacy standards and product requirements to support the needs of customers who are regulated by HIPAA or - at a minimum - as a source of best practices.24

Manufacturers of digital health and mHealth products are more likely to be subject to other privacy and industry laws. Assuming such companies operate outside of HIPAA, in the United States these manufacturers will likely be subject to Federal Trade Commission (FTC) rules and enforcement, which include obligations for breach reporting,25 privacy and security standards for products, and obligations to ensure that statements made to consumers about the company or its products’ privacy and security practices are appropriate and not misleading.26 The FTC’s recent heightened enforcement of the Health Breach Notification Rule may add a new privacy-related spin to product development.27 While breach notification itself tends to fall outside of the purview of product development, the scope of what can be considered an “unauthorized acquisition”28 under the Rule may reinforce the need to apply appropriate access controls to digital health products.

mHealth apps directed toward children may put the manufacturer in the direct scope of another potentially unexpected law, the Children’s Online Privacy Protection Act (COPPA).29 In fact, it may surprise developers and manufacturers of digital health and mHealth products that their products or services, while not directed toward children, could come within the scope of COPPA if they fail to take appropriate steps to prevent use by children or if they know that children are using the product or service.30

State laws may introduce additional privacy and security-related requirements. Since HIPAA does not preempt more stringent state laws, there are a handful of states with health or medical information laws, some of which may be broader in scope and coverage, and may apply directly to manufacturers of digital health and mHealth products.31 Comprehensive state privacy laws, like the California Consumer Privacy Act (CCPA),32 have broad jurisdictional reach and include provisions that apply minimum privacy and security requirements and obligations with respect to data collected from the residents of the respective states.33 These requirements would likely impact any digital health and mHealth products coming to market, even if the laws only indirectly apply to product manufacturers, and notwithstanding whether exemptions built into such laws would also apply.34 Even when these comprehensive privacy laws do not apply, manufacturers would be wise to use them as a benchmark and basis from which to develop minimum product requirements to meet customer needs and expectations.

For companies doing business in the European Union and European Economic Area (EEA), the General Data Protection Regulation (GDPR) will likely have more direct and frequent application to digital health and mHealth products.35 This is due to the law’s direct requirement for companies to employ privacy by design,36 as well as the GDPR’s broad application to any data processing activities performed by companies in scope of the law, regardless of where the company is located.37 Besides privacy by design, which will be discussed at length in this article, the relevance of the GDPR to digital health and mHealth products may hinge on the products’ ability to process data subject requests,38 the manufacturer’s role with regard to any data processed (i.e., controller, processor, joint-controller, or none of the above), and whether and the extent to which the manufacturer can comply with the data transfer requirements of the GDPR.39 In the aftermath of the Schrems II decision,40 the European Commission’s implementing decision adopting the new Standard Contractual Clauses,41 and the ongoing interpretation of data transfer requirements by supervisory authorities,42 the rules governing data transfers and restricted transfers out of the European Union and EEA have increasingly been a moving target for companies in all industries. These data transfer considerations could directly impact certain digital health and mHealth products where product functionality, or the functionality of related services, requires data to flow across borders. In a similar fashion to HIPAA, another important consideration under the GDPR is how the product and the product’s processing of data determine the role that the manufacturer plays with respect to data.43

In addition to the GDPR, manufacturers might be subject to, or at least have to build in considerations for, other international privacy and data protection laws. Although this is particularly important for companies that are placing their products on the market in countries or territories outside of the United States, European Union, or EEA, it may even be necessary for companies providing companion diagnostics services, or otherwise partnering with pharmaceutical companies in global clinical trials. These laws may present unique, or at least country-specific, challenges, the most common of which are data localization44 requirements. Where data localization applies, alternative storage and hosting arrangements45 or country-specific licensure may be necessary to ensure product functionality.

The complex landscape of privacy and data protection legislation in both the United States and abroad can seem daunting and overwhelming to manufacturers. A principles-based approach to privacy, which builds a framework around the common components of global privacy and data protection laws, can help manufacturers manage this complexity. This framework would, nevertheless, require some tailoring to specific regional and national requirements. But doing so can aid a company’s efforts to navigate the often murky waters of implementing new privacy laws and cybersecurity requirements, though it should be acknowledged and accepted that there will continue to be ambiguity around certain obligations and requirements, particularly for some newer privacy laws that are published or become effective without the necessary implementation guidance being issued in a timely manner.46

A Cacophony of Compliance Considerations: Additional Laws and Regulations that May Apply

It might be surprising to find that many mHealth and digital health products are not subject to the regulatory jurisdiction of the U.S. Department of Health and Human Services, and in particular the Food and Drug Administration (FDA),47 Centers for Medicare & Medicaid Services (CMS),48 or similar governmental agencies, or their counterparts in other countries. Instead, both manufacturers and the apps and products themselves may fall under a litany of laws more focused on connectivity - like those governing internet-of-things (IoT) devices49 - or on consumer protection, such as Section 5 of the U.S. Federal Trade Commission Act (FTC Act)50 enforced by the FTC.51 Additionally, and more recently, the U.S. Department of Justice (DOJ) has expressed its intent to use the False Claims Act (FCA)52 to pursue cases of “cybersecurity-related fraud by government contractors and grant recipients,” specifically targeting the intentional misrepresentation of a company’s cybersecurity capabilities.53 It is not a stretch to assume that misrepresentation of a product’s cybersecurity capabilities would lead a manufacturer to face similar scrutiny. The DOJ’s foray into this space with the announcement of its Cyber-Fraud Initiative is especially interesting for manufacturers hoping to work as government contractors or suppliers that may not have previously thought of the FCA as applying to their business operations and transactions. It also raises a new topic area for whistleblower reporting, as well as qui tam suits, which may also impact how companies approach compliance risk.54 Finally, manufacturers may be subject to laws at the federal, state, and municipal level that, while not inherently privacy laws, regulate the data or data processing impacted by the product. This includes the influx of laws related to machine learning, DTC genetic testing, genetic data,55 and artificial intelligence.56

Don’t Forget to Check - Is it a Medical Device?

In the United States, FDA regulates only those digital health products meeting the definition of “medical device”57 or being labeled or promoted as such. With the rise in digital technologies to support health and the increasing digitalization of healthcare more generally, FDA has had to expand beyond the more traditional constructs of “medical device” regulation and has adapted its practices and processes to fit these new products and the new potential risks they pose to patients, an evolution that started in 2008 and continues today.58 FDA jurisdiction over digital health and mHealth products was clarified and limited under the 21st Century Cures Act (Cures Act),59 which was intended to help speed innovation and promote secure patient data sharing. The Cures Act included a number of exclusions from FDA regulation for certain low-risk60 software and products, including many of those that could come under the heading of general wellness, or health administration or education.61 While FDA is able to assert jurisdiction over certain digital health products that could pose a risk to a patient’s safety were the product (or device) not to function appropriately or as intended, the effect of the Cures Act helped to reshape FDA’s approach to digital health product regulation overall.62

FDA has also become more involved and interested in the cybersecurity of products, a natural extension of its duties given the increasing connectivity of medical devices. FDA has developed a number of resources on cybersecurity for medical device manufacturers, which it deems “responsible for remaining vigilant about identifying risks and hazards associated with their medical devices, including risks related to cybersecurity.”63 FDA’s cybersecurity alerts have also at times been joined by other agencies, such as the Department of Homeland Security. These alerts are a source of useful information, offering guidance on mitigation and remediation activities. Of particular concern for manufacturers, however, is the fact that such alerts have also been used as a tool to warn potential customers (both providers and patients) not to purchase potentially compromised or at-risk products.64

As is often the case with FDA’s thought leadership, its approach to mHealth and digital health products has served as a reference around the world.65 However, regulation of digital health and mHealth products is by no means novel in the European Union66 and has been codified in major regulatory changes, such as the Medical Devices Regulation (MDR).67 It remains to be seen how these holistic changes toward more comprehensive regulation of digital health products and Software as a Medical Device (SaMD) align with the emerging regulation of artificial intelligence and other, more esoteric topics.

So, while it is obvious that digital health and mHealth products are an essential part of the digitization of healthcare and the advancement of personalized care, the legal and regulatory environment in which these products are developed and deployed is fraught and complex. Industry standards, like those from the National Institute of Standards and Technology (NIST),68 while not addressed in this summary, add to the already vast volume of requirements and considerations that apply to these sorts of products and their manufacturers. Because of this complexity, and the inherent privacy and safety concerns at issue, manufacturers should shift from thinking of compliance as meeting a checklist of legal requirements to adopting a framework that allows flexibility and thoughtfulness in approaching a product’s lifecycle. Taking such a principles-based and scalable approach can only be accomplished by putting in place an effective privacy and security by design program.

Privacy and Security by Design Basics

At its most basic, privacy by design sets up default modes of operation, whether applied to a company or a specific product, to ensure privacy and enable personal control over one’s information. This approach is based on seven foundational principles, requiring that privacy be: (1) proactive, not reactive; (2) by default; (3) embedded into design; (4) fully functional by operating on the basis of positive-sum, not zero-sum; (5) secure from end-to-end throughout the full lifecycle; (6) visible and transparent; and (7) respectful of user privacy by ensuring that product privacy functionality is user-centric.69 Privacy by design fosters innovation by ensuring that the privacy measures and mechanisms in place are intentionally scaled to the sensitivity of data. A company with a culture of privacy by design has appropriately tailored privacy mechanisms established to account for the additive sensitivity of certain types of data. For instance, consider a scenario where pseudonymous data70 is collected and stored in a repository. There may not be particular sensitivity in any one dataset in the repository, but the bulk storage or combination of various datasets may increase the likelihood that the data can be reassociated with an individual.71 The collection, processing, or storage of genetic or biometric information would also need comparatively more privacy and security mechanisms and controls in place than activities where only general information, like sales activity or unique visits to a website, is similarly handled.72
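
As a minimal illustration of the pseudonymization concept referenced above (all names and keys below are hypothetical), a keyed hash can replace a direct identifier while preserving exactly the linkability that creates re-association risk when datasets are combined:

```python
import hashlib
import hmac

def pseudonymize(patient_id: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed hash (a pseudonym).

    Re-linking pseudonyms to individuals requires the key, which is why
    the key must be stored separately from the pseudonymized dataset.
    """
    return hmac.new(secret_key, patient_id.encode(), hashlib.sha256).hexdigest()

key = b"hypothetical-secret-held-by-the-controller"
record_a = {"subject": pseudonymize("patient-123", key), "sleep_hours": 6.5}
record_b = {"subject": pseudonymize("patient-123", key), "resting_hr": 58}

# The same key yields the same pseudonym, so otherwise-innocuous datasets
# can be joined - the additive re-association risk described above.
assert record_a["subject"] == record_b["subject"]
```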

The goal of security by design is to embed in the development of a product (i.e., hardware and software) measures to maintain the confidentiality, integrity, and availability of its systems; in other words, make such systems as free of vulnerabilities and as impervious to attack as possible. For a company to practice security by design means it has “a culture and methodology that constantly evaluates threats and ensures that code is robustly designed and tested to prevent known attack methods.”73 The Open Web Application Security Project (OWASP), a nonprofit foundation that works to improve the security of software through its community-led open source software projects, defines security by design principles as: (1) least privilege;74 (2) separation of duties; (3) defense in depth; (4) failing securely; (5) open design; (6) avoiding security by obscurity;75 (7) minimizing attack surface area; (8) simplicity; and (9) correct fixes.76 Much like privacy by design, security by design should be fit-for-purpose for both the manufacturer of a product and the customer or end user. It can be iterative, phased, or hardwired into the product and is intended to be symbiotic and complementary to privacy by design.
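
To ground two of the OWASP principles named above - least privilege and failing securely - the following sketch (with hypothetical roles and actions) shows the kind of deny-by-default permission check a digital health product might embed:

```python
from enum import Enum

class Role(Enum):
    CLINICIAN = "clinician"
    SUPPORT = "support"
    PATIENT = "patient"

# Explicit allow-list: each role gets only the actions it needs
# (least privilege); everything else is implicitly denied.
PERMISSIONS = {
    (Role.CLINICIAN, "read_results"),
    (Role.CLINICIAN, "order_test"),
    (Role.PATIENT, "read_own_results"),
    (Role.SUPPORT, "view_device_logs"),
}

def is_allowed(role: Role, action: str) -> bool:
    # Failing securely: an unknown role or action falls through to "deny"
    # rather than raising an error or defaulting to "allow".
    return (role, action) in PERMISSIONS

assert is_allowed(Role.CLINICIAN, "order_test")
assert not is_allowed(Role.SUPPORT, "read_results")
```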

Medical practices, healthcare delivery, and individual wellness utilize many different products and devices for a wide range of reasons and varied purposes. Even the users of such products vary - from institutional customers and healthcare practitioners to everyday consumers. Given this variation, in applying privacy and security by design to a company’s processes, it is important to recognize that one size does not fit all. This is particularly true for manufacturers of mHealth and digital health products.

Each type of health-related product poses its own unique data protection and security challenges. A product that processes the test data for one patient at a time, for example, may pose a greater risk of singling out patients than higher throughput instruments that can test hundreds of patients at once. The capacity of that high throughput instrument, however, implies that a much higher volume of data will be processed, which will likely be subject to additional privacy and security risks. Similarly, while the developer of a cloud-based healthcare platform may have to assess and mitigate risks in the cloud environment and manage concerns related to data transfers or data storage and segregation, the developer of an on-premises instrument would focus more on how to integrate the instrument into its customers’ IT infrastructure without creating a new point of vulnerability for the customer. Privacy and security by design processes for products in the health and wellness space must therefore be tailored for the unique differences inherent in these products and the categories of data processed, but also scalable across a company’s product portfolio. Manufacturers should, therefore, approach privacy and security by design process implementation to reflect the industry in which the company operates, its customers’ industries and environments, and the particularities of its products.

Providing Opportunities within Obligations by Meeting Legal and Regulatory Requirements and Customer Expectations

Effective privacy and security by design ensures that digital health products meet legal and regulatory standards, to the extent applicable. In the United States, digital health products regulated as medical devices are generally understood to include cybersecurity controls as a product requirement.77 Similarly, the MDR includes express security requirements to (at a minimum) guarantee information security according to the state of the art; require IT security; and provide instructions for use that include and describe user and operator information security requirements.78 The MDR also includes privacy-related controls requiring the manufacturer to provide descriptions of arrangements to mitigate harm in the event of a data security breach.79 And while legal and regulatory requirements are of paramount importance for company executives and legal, privacy, and security professionals, the real driver for business partners and stakeholders is meeting customer requirements and expectations.

A company that integrates both privacy and security by design into its culture, strategy, and product lifecycle can reap benefits beyond product viability. Doing so positions the company as a market leader, while simultaneously earning (and maintaining) the trust of consumers and regulators. Embedding privacy and security by design also responds to new customer and user attitudes that place importance on including such approaches in product development and maintenance. This shift is likely due to the increase in cyberattacks on IoT products, including certain digital health and mHealth products and devices.80 The trend has led more customers to require manufacturers to provide copies of their applicable privacy and security policies, procedures, and organization charts as part of vendor and supplier due diligence. Customers are also increasingly asking for inventories of the privacy and security attributes of products to ensure that a product meets the minimum standards necessary for the customer to perform its own compliance obligations. Companies are therefore incentivized to invest in their privacy and security by design processes to meet these obligations and expectations.

Achieving Privacy and Security by Design in mHealth and Digital Health Products

A viable privacy and security by design process is scalable, easily integrated, and applicable to different types of healthcare products. This is equally important for established companies with product portfolios that span different risk profiles and for newer companies with nascent processes and limited products on the market. Most privacy and security professionals have experience with implementing privacy and security by design in some form. One challenge to truly embedding privacy and security by design into a company’s culture and processes is the tendency of such professionals to focus on the detailed elements and deliverables of privacy and security by design without taking the time to first construct an appropriate framework. While a checklist-like approach can help incorporate the appropriate privacy and security by design elements, it is more characteristic of a nascent privacy and security program. By underscoring the “what” rather than the “why,” it denies the appropriate stakeholders the opportunity to internalize a culture of privacy and security, undermining the concept altogether. Setting forth a high-level structure can help to level-set with business and functional stakeholders and provide a basis on which to begin training employees and influencing company culture. After the overarching governance structure is established, the individuals implementing privacy and security by design can ensure that execution is thoughtful and the details are conceptualized for and molded to fit the company’s business and products.

The beauty of privacy and security by design is that it is entirely tailorable, allowing the process to meet the needs of different organizational structures and maturities. The framework described in this article takes into account various legal and regulatory requirements and industry best practices and distills them down into phases to help provide the context in which the specific required activities must be performed. Certainly, other privacy and security professionals may approach privacy and security by design differently. Accountability or responsibility for completing the tasks described in each phase may be centralized in a legal department or in a privacy or security group, decentralized among product teams or other functions, or even shared across both groups. Which group is responsible or accountable for any tasks or oversight, as opposed to which group is purely a consultant or advisor, comes down to who will have the appropriate knowledge and technical ability to perform the activity. By way of example, a privacy officer or in-house attorney may not have the skills or authority necessary to implement a cookie banner on a website or code multi-factor authentication (MFA) into a product, but would be expected to advise on whether such items are required and on the content of such a banner or the nature of the MFA. While privacy and security professionals bring privacy and security by design into the cultural awareness of a company, the product teams, developers, engineers, and IT professionals have the knowledge and expertise to inform and carry out the described activities.
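
As an illustration of the division of labor just described, the MFA mentioned above is the kind of control a privacy professional advises on but engineers build. A minimal sketch of one common MFA mechanism - a time-based one-time password (TOTP) per RFC 6238 - might look like the following; the enrollment secret is a hypothetical placeholder.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, period: int = 30, digits: int = 6) -> str:
    """Compute a time-based one-time password (RFC 6238)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period            # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# The user's authenticator app and the product share this secret at enrollment.
print(totp("JBSWY3DPEHPK3PXP"))  # hypothetical enrollment secret
```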

Because there are a multitude of approaches to privacy and security by design, and a number of standards from which such principles can be derived,81 it can help privacy and security professionals, and their colleagues and stakeholders, to think about privacy and security by design processes as coming down to three essential phases: Identification and Classification, Assessment and Mitigation, and Documentation and Re-evaluation. Each of these phases should consider the product’s functionality throughout its lifecycle as well as the lifecycle of any processed data. Because these activities require the expertise of various functions and stakeholders, the approach described here is intended for a wide variety of audiences and framed to help privacy and security professionals, or individuals at a company tasked with those roles,82 to socialize and implement the concept.

Phase 1: Identification and Classification

The first step is to understand the data - what data the product processes (or will process); why the data is (or will be) processed; where and how the product processes data; with whom and how the data is (or will be) shared; and the identified risks83 relevant to that data. This identification exercise is sometimes called “data mapping” (especially when the identification follows the data flow from intake to storage or transfer) or the creation of a data inventory. A preliminary risk identification and prioritization helps to evaluate any initial issues associated with the data. In this phase, both the data to be processed and the risks associated with the data are identified and classified.
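
As a sketch of what one row of such a data map might capture - the field names and the sample entry are hypothetical - each product data flow can be recorded against the what/why/where/with-whom questions listed above:

```python
from dataclasses import dataclass, field

@dataclass
class DataMapEntry:
    data_element: str                 # what data the product processes
    purpose: str                      # why the data is processed
    location: str                     # where and how the product processes it
    shared_with: list[str]            # with whom the data is shared
    classification: str               # e.g., identified, pseudonymized, aggregate
    identified_risks: list[str] = field(default_factory=list)

inventory = [
    DataMapEntry(
        data_element="heart rate readings",
        purpose="display trends to the user in the app",
        location="stored on device; synced to a cloud backup",
        shared_with=["cloud hosting provider"],
        classification="pseudonymized",
        identified_risks=["re-identification if combined with account data"],
    ),
]
```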

The identification and classification of data and risks for products are a little different from what many companies may have already performed by way of a data inventory. Data inventories are often completed for the purpose of understanding a company’s enterprise data, such as data processed for employment, finance, or other similar company functions. Such an inventory can nevertheless help inform this initial phase for products, especially if the information collected includes data derived from or processed by the company’s products. The ability to leverage existing processes, infrastructure, taxonomy, and the like is beneficial to avoid duplication of efforts and potential confusion from inconsistent approaches. As with a more generalized data inventory, the specific requirements, classification types, and considerations under review for product data identification and classification will likely differ depending on the type of product, the way the data will be processed, the geographic area from which the data is originating and/or being transferred or stored, the role the manufacturer plays with regard to the data, and other relevant factors. Completion of this phase requires establishing a system to standardize when and how a privacy assessment must be conducted. Thus, this phase can help set expectations on timing and level of effort for the next phase - Assessment and Mitigation.

mHealth and Digital Health Products Often Add Complexity

Data identification and classification become increasingly complex in the area of health products, as does determining the role the manufacturer plays in relation to the data processing. As noted previously, the type of product can play a large role in establishing whether a manufacturer may be a controller, processor, or neither (or their equivalents). This is made more complicated by the fact-based nature of the analysis. While it might be assumed that the relationship between customer and manufacturer for digital health products is that of controller-processor (or the covered entity-business associate construct established under HIPAA),84 the answer can often be more complex. As an example, an on-site diagnostic instrument may process personal data for the customer, but historically sends little - if any - data back to the manufacturer. The manufacturer’s data exposure is often limited to potential access during installation, maintenance, and support services. In these scenarios, even where the manufacturer acts as a data processor, it is in a much more limited capacity than a situation, for example, where the manufacturer might be hosting or processing data through a cloud-based service. This assessment may change, however, to the extent that data generated by the product might be used for the manufacturer’s own purposes, in which case the manufacturer may be a controller in at least a limited capacity.85 This typically occurs when the processed data is used for product improvements or analytics,86 but may also occur when the manufacturer uses the data to comply with its own legal requirements. The nature of the manufacturer’s product or services may similarly shift the manufacturer’s role to controller (or statutory equivalent), such as when the entity is performing healthcare services. Laboratories performing human testing87 are often classified as covered entities under HIPAA in relation to any PHI because they are offering healthcare treatment to patients (i.e., by providing testing services), even though the direct customer of the lab result is arguably the healthcare provider and not the patient.88 Thus, for healthcare products and services associated with such products, the positioning of the manufacturer in relation to the data directly impacts any required identification and classification activities.

Leveraging Identification and Classification Activities for Commercial Value

For a manufacturer of mHealth or digital health products, it is essential to have an in-depth understanding of the data the product will process and any related processing activities, even if the manufacturer will not be a controller or processor of the data. Only by completing such an exercise can the manufacturer have a comprehensive understanding of its risk relative to the product and - if the manufacturer will be processing data from the product - relative to such data and processing activities. All of this information is necessary for a company’s risk management activities. At a time when customers and consumers are increasingly savvy about data protection and security issues related to product usage, such information may also help persuade customers to purchase or use the product.

While the identification and classification of data and risks may seem solely administrative in nature and of limited value, creating this documentation could differentiate a product on the market. A ROPA, or record of processing activities, can provide customers with insight that may be required for vendor and product due diligence. A manufacturer’s ability to understand its potential risk relative to the data processed by its products will be a key component of discussions with potential institutional customers, and a regulator would certainly expect the manufacturer to understand that risk in the event of a mass breach, investigation, or audit.

More importantly, an inventory of processing activities is required by an increasing number of laws, like the GDPR89 and the California Privacy Rights Act (CPRA).90 Other laws implicitly require such activities to be conducted, like the CCPA.91 If the manufacturer will ultimately be a processor of data in relation to a product, then completing this phase also serves to help the company fulfill its regulatory obligations. This can include informing some of the elements required in the public-facing privacy policy and notice, or being able to complete data subject requests, such as for access, deletion, and the like.

Beyond fulfilling legal requirements, making an mHealth product available in an app store may implicitly require that the manufacturer has conducted this identification and classification exercise. As part of this review and before approval for release on the store, manufacturers must provide detailed information about the app, including what data will be processed, how and why the app will process data, permissions associated with the data, and even the app’s purpose string.92 Having an inventory of the impacted data and processing activities, while informative, does not alone provide sufficient detail for determining the features and controls necessary to ensure an appropriately functional product. For those purposes, the information must be assessed and reviewed.

Phase 2 - Assessment and Mitigation

Once the information on the data and processing activities has been gathered, it must be reviewed and evaluated to determine the disposition of the data and such activities, as well as any associated risks. The goals of the assessment are to: (1) ensure compliance with applicable legal, regulatory, and policy requirements; (2) determine the likelihood and extent of a possible privacy or security event, breach, or other incident and what effects may result (i.e., privacy risk if applied to individuals’ personal information, or security risk if applied to companies, products, assets, systems, etc. or individuals’ non-personal data); and (3) determine whether any security or privacy controls can be applied to mitigate unacceptable risks. Acceptable risks,93 and whether to expend additional resources and time to mitigate those risks, must also be considered in light of the sensitivity of the data to be processed. Beyond the specific data and processing activities, privacy risks can also be extrapolated on the product side to include the inability of the product to enable customers (in this case, likely a provider) to perform their privacy-related obligations or to enable end users (which could be a provider, but is more likely a consumer) to access or port their data. The disposition of the data during storage, both in terms of the format of the data (i.e., is it encrypted, hashed, anonymized, or pseudonymized) and its location (i.e., on the device or in the cloud), is another area where both privacy and security risk should be assessed and, if needed, mitigated.
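
As an example of one disposition named above - encryption at rest - the following sketch uses the widely used third-party cryptography package; the record contents are hypothetical:

```python
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # in practice, held in a key management service,
cipher = Fernet(key)         # not stored alongside the data it protects

record = b'{"subject": "p-123", "spo2": 97}'  # hypothetical device reading
stored = cipher.encrypt(record)               # disposition at rest: encrypted
recovered = cipher.decrypt(stored)            # recoverable only with the key
assert recovered == record
```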

Making Assessments “Fit for Purpose”

The form and nature of assessments in this phase are not set in stone. As with any other facet of privacy or security by design in healthcare products, these assessments are not one-size-fits-all. While requirements for data protection impact assessments (DPIAs) and privacy impact assessments (PIAs) have been increasingly codified into privacy and data protection laws,94 there nevertheless remains some flexibility in what must be assessed. Not all processing activities performed by a product may rise to the level of risk that would necessitate either a DPIA or PIA. In faster-paced development lifecycles, or where there is a higher likelihood of a need for patches or bug fixes, it is also likely not suitable or scalable to deploy a DPIA at the time of every product change or review cycle, particularly where the changes do not introduce new data processing activities or risks. Thus, given that these assessments are only as reliable as the quality and accuracy of the information provided, one aspect of tailoring them is to establish criteria for when and how updates should be made.95 Standardizing a product-specific privacy assessment (Product Privacy Assessment or PPA), an alternative form of privacy assessment, can help manage risk and support data governance in the absence of a PIA or DPIA. These PPAs can be scaled and tailored to the organization’s development process and made interoperable with other forms of assessments, including security assessments like a Manufacturer Disclosure Statement for Medical Device Security (MDS2).96
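
The update criteria just described can be standardized into a simple triage rule. The sketch below is hypothetical - each organization would define its own triggers - but it shows how a proposed product change could be routed to the right level of review:

```python
def review_level(change: dict) -> str:
    """Route a proposed product change to a level of privacy review.

    The trigger fields and thresholds here are illustrative only.
    """
    if change.get("new_data_categories") or change.get("new_processing_purpose"):
        return "full DPIA/PIA"      # new risk profile: reassess from scratch
    if change.get("alters_data_flows") or change.get("new_third_party"):
        return "PPA refresh"        # update the product privacy assessment
    return "no new assessment"      # e.g., a bug fix with no data impact

assert review_level({"new_data_categories": ["menstrual cycle"]}) == "full DPIA/PIA"
assert review_level({"alters_data_flows": True}) == "PPA refresh"
assert review_level({}) == "no new assessment"
```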

Completed product assessments of this sort are increasingly being requested by customers, especially those who will be using the product to process health or wellness data,97 as part of their vendor security review processes, and - if an mHealth app - may be required as part of the review and approval process for entry onto an app store. Providing such assessments gives customers assurance that the product meets certain minimum privacy and security standards. While some organizations provide their full assessment even if not required, often a summary or pared-down version of a completed assessment, or a product information page containing similar information, is sufficient for meeting customer needs for these purposes. Whether to provide a full assessment, a summary, or an abridged version can come down to a number of preferences, which may include considerations of attorney-client privilege or the potential for disclosure of proprietary or confidential information, weighed against the need to build trust through transparency. Similarly, how to provide updated information to customers and users should be part of the considerations made when making these assessments available.

Right Sizing Risk for Effective Mitigation

It is important to note that not all risks are created equal, even risks that might be categorized together as either acceptable or unacceptable. Risks should be stratified within categories based on impact (both internal and external) and likelihood. Mitigations associated with those stratified risks should similarly be valued and ranked, accounting for factors such as the time and resources required, internal and external impact, and residual risks. Mitigations in this context are the corrective actions to be taken to address or at least minimize the risks that have been identified. So, for a product, a mitigation could be fixing a vulnerability or addressing a bug. The failure to mitigate an identified risk does not necessarily mean that a product is inherently flawed or that the data processed by the product would be automatically at risk. Instead, once the risks and mitigations are appropriately identified, classified, and reviewed, a plan should be put into place for how to approach the implementation of any mitigation that needs to be performed as well as the expected outcome of the controls to be put in place and the residual risk from any mitigations not performed.
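
A common way to implement the stratification described above is a likelihood-by-impact risk matrix. The scoring bands below are hypothetical, but the sketch shows how identified risks could be ranked to drive mitigation priority:

```python
LIKELIHOOD = {"rare": 1, "possible": 2, "likely": 3}
IMPACT = {"low": 1, "moderate": 2, "severe": 3}

def risk_score(likelihood: str, impact: str) -> int:
    return LIKELIHOOD[likelihood] * IMPACT[impact]

def stratify(score: int) -> str:
    # Illustrative bands; real thresholds depend on the organization's
    # risk tolerance and the sensitivity of the data involved.
    if score >= 6:
        return "unacceptable: mitigate before release"
    if score >= 3:
        return "acceptable with a documented mitigation plan"
    return "acceptable: document the residual risk"

# A likely event with moderate impact scores 6 and must be mitigated.
assert stratify(risk_score("likely", "moderate")) == "unacceptable: mitigate before release"
```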

Mitigation need not be one and done; it can be continuously iterated to speed rollout and maximize efficiency, so long as such iterations do not introduce additional unacceptable risk. The timing of mitigations and their expected impact must be considered, as should the consequences of a mitigation never being completed. For example, if a planned mitigation to an app (e.g., a patch or bug fix) will be delivered via push notification (i.e., a message sent directly to a user’s mobile device) or more passively via an app store update,98 what would the impact on product functionality and the user be if the update is not performed? Alternatively, what would the likely impact on customer satisfaction be if the product frequently requires updates or is unavailable due to bugs or uninstalled patches? Thus, the timing of post-market mitigations can itself be a feature to be framed for the benefit (or detriment) of the end user or customer.

As with the first-phase activities, any changes to the data, processing activities, environment of deployment, performed or expected mitigations, and even progression to a different stage in a product lifecycle should necessitate a refresh of previously completed assessments and mitigation plans. Mitigation should also be considered in light of the different disposition of data, processing, and risk at the various stages of the product lifecycle, including at end of life and after the product is no longer supported. This is especially important for ensuring data accessibility and portability, particularly in digital health products where specific retention periods may apply.99 Risk mitigation is generally a good practice from a business strategy and risk management perspective, but can also be directly and indirectly required in some cases.100 Not only is mitigating risk a natural step in privacy and security by design, but the documentation of risk mitigation is required as part of the DPIA process.101

Approaching Assessment and Mitigation Relative to Digital Health Products and Data

A necessary lens to apply to assessments and mitigation activities for mHealth and digital health products is the inherent sensitivity of the data. Even for wellness apps available to the general consumer, data may be collected about a user’s mental health condition, eating and workout habits, heart rate, menstrual cycle, alcohol, tobacco, or drug consumption, and the like - all of which may be considered sensitive to the individual even though not expressly protected as sensitive outside of the healthcare setting. Such data may nevertheless be subject to some amount of regulatory oversight and enforcement.102 Decisions about whether a risk is acceptable, and thus may not be mitigated, should consider the impact on the individual whose data may be affected (i.e., a patient) as well as on the customer user (e.g., a provider, researcher, or office administrator) who may not be the data subject. Once risk assessment and mitigation have been completed, the focus should then shift to the third phase - documentation and re-evaluation.

Phase 3 - Documentation and Re-evaluation

The documentation phase is arguably more important than the performance of the prior two phases, as it is where decisions related to the activities conducted in those phases are recorded and retained. Documentation is not only required to meet many legal and regulatory obligations, but likely also to meet internal reporting and quality requirements. Each of the phases described in this article should be documented in a manner that is - at a minimum - in accordance with applicable law and - as applicable - with internal corporate cybersecurity and/or quality standards and policies, like those maintained for quality management systems, codes of ethics, and other similar requirements. This phase, therefore, is not just about creating a reference for product support and maturation, or for data disposition at different stages of its lifecycle, whether in the product or as part of the processing activities; it should also be approached with the mindset that these decisions may be subject to scrutiny in the event of a downstream security or privacy event, inspection, or audit. To quote a well-known regulatory maxim, “If it wasn’t documented, it didn’t happen.” Though perhaps trite, this maxim is appropriate for these processes, as only through documentation can a company demonstrate compliance and leverage its privacy and security by design activities for competitive advantage.

The method, format, and location of privacy and security by design process documentation should be strategic, taking into account the manufacturer’s risk tolerance and maturity, the data and products themselves, and the position, strategy, and approach of similarly situated companies. Unsurprisingly, there is an incredible range and variation of options and solutions available to companies for how to document these activities. As an example, for a regulated digital health product, a full DPIA or PPA may not be suitable for inclusion in the product’s design history file (DHF), given that the intended audience of the DHF will be medical device regulatory authorities and bodies.103 However, the inclusion of a truncated version of the assessment (or even a statement validating satisfactory completion of the assessment) can help indicate in the product’s formal records that an assessment has been completed. This documentation thus ensures the inclusion of privacy and security in the development process. Companies intending to document these activities should look to industry standards for models of appropriate documentation and treatment, including whether to apply attorney-client privilege, in addition to what is needed to meet legal and regulatory requirements.

Regardless of the form or nature of the documentation completed, it is crucial to treat such documentation as a living document. Especially in the mHealth space, where frequent updates and bug fixes are expected, the updates must also be documented - though not necessarily to the same degree as the underlying product. Whether there is a minor patch rollout or a complete version change, as previously noted, any change to the data, processing activities, environment of deployment, performed mitigations, or progression to a different stage in a product lifecycle requires that previously completed documentation be reviewed and updated. In some instances, if the product is regulated as a medical device, these updates may also be required to be reported to the applicable regulatory agency.104

Leveraging Privacy and Security by Design to Build Trust

Even a stripped-down PPA form may not be an appropriate document for distribution outside of the manufacturer. Although a PPA can be the source file for a number of other core documents and materials related to the product, there are many compelling reasons why the assessment itself should only be accessible to a limited - and often internal - audience. A PPA could contain proprietary or confidential information that a manufacturer may not feel comfortable providing externally unless required by law or regulation to do so. However, to withhold all of the information in a PPA would fly in the face of the trust building that needs to exist between a customer or user and the manufacturer.

Given the emphasis on transparency with regard to privacy and security in products and processes, companies have needed to find new approaches to ensure that such information is accessible to those who need or want it. Manufacturers have approached this in a number of ways. One is the “knowledge center,” a tool that companies have long used to let customers obtain information about products at their convenience for general support needs, or to curate internal resources for employees. These centers have more recently been adapted to include privacy and security elements.105 In some instances, these knowledge centers can be made accessible either generally or exclusively with customer credentials, and in others they can be released as internal learning centers to train and empower the company’s employees to speak to privacy and security by design. One way to populate these centers is to convert or summarize formal PPA documentation into audience-centric documents that provide sufficient information to meet the purpose of the disclosure while allowing the company to avoid making publicly available materials that are considered sensitive.

Another way to manage transparency is to maintain a repository of appropriate responses to requests for information and other inquiries that have been vetted in advance by the privacy and security functions. An FAQ or similar document can serve some of these purposes, even if used passively by internal team members to respond to customers’ “vendor security review” questionnaires, requests, or audits. Maintaining a privacy or security statement that offers general information about the manufacturer’s products or services can also help provide transparency to customers, regulators, and the public while mitigating any perceived risks of making too-specific information publicly available. To this end, some companies - mostly those in more traditional technology and software as a service (SaaS) environments - have even created polished brochures to highlight their privacy and security functionality and processes, often incorporating aspects of their SOC 2106 compliance reports.

In a regulated space like healthcare, however, a crucial element of this process is to ensure that a product’s privacy and security features are not misconstrued or over-exaggerated. Recent enforcement by the FTC, as well as increased interest by the U.S. Federal Communications Commission (namely in the area of telehealth)107 and the DOJ, highlight the importance not only of ensuring that appropriate security and privacy controls and mechanisms are in place, but also that such functions are not described with misleading or deceptive language or puffery.108 As a result, it is important for privacy and security functions to work with the company’s sales and marketing functions, and legal, to ensure that any public statements made about the capabilities of either the company or its products are accurate.109

This highlights an essential point that has not yet been covered, which is that privacy and security by design is an exercise in company-wide partnership, requiring the involvement of stakeholders from a range of functions and at all levels, from the top down.

Don’t Do It Alone! Identifying the Stakeholders and Establishing Processes to Ensure Optimization

An essential component of an optimized privacy and security by design process is to ensure that appropriate stakeholders are identified, engaged, and empowered to support and implement it. The activities described in the phased approach discussed here cannot be accomplished in the ivory tower of the privacy office or isolated among information security personnel. Instead, at each stage of the product lifecycle - from conceptualization and build, to deployment and obsolescence, whether regarding business development pitches, marketing and promotion, or even maintenance and support - there should be individuals from different functional areas who can be actively involved in performing the activities necessary to complete each phase of the privacy and security by design process and who are empowered as decision-makers. While it is advisable - and in many cases required - for all company employees to have some basic knowledge of privacy and security for internal purposes, this becomes even more important for manufacturers of mHealth and digital health products, where the sensitivity and volume of the data are integral to the product.

Leveraging other stakeholders to build and implement privacy and security by design can help scale and mature these processes. By deepening a culture of privacy and security within the organization, other stakeholders can be empowered as privacy champions and liaisons who play a more substantive role in the process.110 Having stakeholders participate as a first line of defense in product assessments can alleviate pressure on the privacy and security teams, allowing them to focus on strategic and more complicated questions. Further, these champions and liaisons can help a company’s privacy and security professionals promote privacy and security by design initiatives and awareness and articulate the rationale for these activities to their respective functions and teams.

Equally important is that the processes established to implement privacy and security by design be easily replicated, digestible, and scalable. There is no need to create parallel processes when a single process can be leveraged to accomplish multiple tasks. Relatedly, if designated processes are overly cumbersome or redundant, they will not be performed accurately or effectively. Just as incorporating privacy and security by design into the product lifecycle is a truly cross-functional exercise, requiring input from stakeholders throughout the organization, so too should be the building and codification of the processes that implement this framework. Engaging and communicating with stakeholders frequently, and approaching those interactions with openness and curiosity, can help build the rapport and trust that will serve a company well when things go wrong. Established, trusting relationships with stakeholders can also differentiate a company from its competitors and ensure that it is ready to respond to inquiries, requests, and audits from customers, regulators, and end users.

Conclusion

Privacy and security by design have become essential elements of the product lifecycle, particularly in light of rapidly changing technology, the increasing digitization of data, and the ever-changing regulatory landscape. This is especially true for mHealth apps and digital health products, where the privacy and security of users’ - including patients’ - health data are at stake. Managing privacy and security risk in health products can be challenging because the risks are often specific to the nature of the product, its implementation, and the amount of data processed. A scalable and digestible approach to privacy and security by design can not only help a company manage these risks and comply with regulatory obligations, but can also be a business differentiator. Such an approach need not be overly complicated, nor does it demand “best in class” technology; rather, it can be tailored to the resources, size, and maturity of a company’s privacy and security program. The three-phase approach of Identification and Classification, Assessment and Mitigation, and Documentation and Re-evaluation has enough inherent flexibility that its governance structure can be implemented to meet the needs of various types of companies. An effective privacy and security by design process engages and empowers impacted functions and stakeholders throughout the organization at each of these phases to ensure that information is accurate, up to date, and appropriately documented, both for internal use and to respond to customer, user, and regulator requests.

The benefits of implementing robust privacy and security by design processes and practices, while easily lauded by privacy and security professionals, can sometimes seem overly burdensome to the product development process and an administrative nightmare for legacy and end-of-life products. Reframing privacy and security by design as necessary to support the product end to end and throughout its lifecycle - thereby meeting customer expectations and legal obligations - can help make the case that such activities are essential to obtaining and retaining a competitive advantage. Especially for mHealth and digital health products, where the data at issue is highly personal and likely sensitive, the importance of ensuring that appropriate privacy and security controls are in place at all stages of a product’s lifecycle has never been more apparent. Indeed, by incentivizing business functions beyond privacy, security, and legal - focusing on the value that privacy and security controls and features bring to a company - what may seem burdensome can instead drive innovation. Reframing and incentivizing privacy and security by design is not sufficient on its own, however; it must also be folded organically into the various stages of the product lifecycle and the company culture by working directly with stakeholders to ensure that any developed processes are fit-for-purpose, repeatable, and effective. Manufacturers of mHealth and digital health products can reach the next level only by embracing the opportunity that privacy and security by design processes provide, rather than employing the bare minimum of privacy and security practices. By simplifying these processes to create a company culture where privacy and security are a forethought and not a reaction, privacy and security professionals can add value and set themselves apart, for the ultimate benefit of their company and its products, as well as customers, end users, and patients.

Endnotes: 

  1. Hackett, M., The digital transformation in healthcare has just begun, according to Accenture report, Mobi Health News (June 21, 2021), https://www.mobihealthnews.com/news/digital-transformation-healthcare-has-just-begun-according-accenture-report.
  2. IQVIA, Institute Report, Digital Health Trends 2021: Innovation, evidence, regulation, and adoption (July 22, 2021), https://www.iqvia.com/insights/the-iqvia-institute/reports/digital-health-trends-2021.
  3. Dash, S., et al., Big Data in Healthcare: Management, Analysis and Future Prospects, 6 J. of Big Data 54 (2019) (indicating that the amount of data generated between 2005 and 2017 rose from 130 exabytes (EB) to 16,000 EB, and is expected to rise an additional 60% to 40,000 EB in 2020, which would amount to about 5,200 gigabytes (GB) of data generated per individual).
  4. Defined by the World Health Organization (WHO) as “[t]he use of mobile and wireless technologies to support the achievement of health objectives.” See WHO, mHealth: New Horizons for Health Through Mobile Technologies, Global Observatory for eHealth series - Vol. 3 (2011), https://www.who.int/goe/publications/goe_mhealth_web.pdf.
  5. Software applications (or apps) that run on mobile or web-based platforms, but are optimized for mobile devices. 
  6. Even before the COVID-19 pandemic and the unprecedented availability of direct-to-consumer (DTC) Covid screening tests, there was an increase in genetic tests, including some that are DTC, coming to market as well as consumer appetite for such tests. See Phillips, K.A., et al., Genetic Test Availability and Spending: Where are we now? Where are we going?, 37 Health Aff. 710-716 (2018) (finding that as of August 1, 2017, there were “approximately 75,000 genetic tests on the market, with about ten new tests entering the market daily.”).
  7. See generally Johnson, C.,  Richwine, C., &  Patel, V., Individuals’ Access and Use of Patient Portals and Smartphone Health Apps, 2020, ONC Data Brief No. 57, The Off. of the Nat’l Coordinator for Health Info. Tech. (September 2021), https://www.healthit.gov/sites/default/files/page/2021-09/HINTS_2020_Consumer_Data_Brief.pdf. 
  8. Auxier, B.,  et al., Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information, Pew Rsch. Center (Nov. 15, 2019), https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/ (finding that 70% of adults believe their personal data is less secure than five years ago and only six percent believe their data is more secure). 
  9. For this article, “digital health product” will mean a digital technology that uses computing platforms, connectivity, software, or sensors for healthcare and related uses, which can be for anything from applications in general wellness to applications as a regulated medical device. See FDA, Digital Health, https://www.fda.gov/medical-devices/digital-health-center-excellence/what-digital-health (last accessed Feb. 15, 2022).  mHealth typically relates solely to mobile health and wellness technologies, whereas digital health (or eHealth) products is a broader umbrella category, and will be used throughout this article to refer to both types of products. 
  10. See infra and accompanying notes (discussing the DOJ’s Cyber-Fraud Initiative, the FTC’s policy stance and recent enforcement of the Breach Notification Rule, and various new laws applying to these products). 
  11. See generally Sariyar, M., Suhr, S., & Schlunder, I., How Sensitive Is Genetic Data?, 15 Biopreservation and Biobanking 6 (2017) (arguing that whole genome sequences may “require special protection, whereas other genetic data … should be treated in a similar manner to other clinical data.”).
  12. See Dash, supra n. 3, at 5 (discussing the impact of digitization of healthcare and big data). 
  13. Throughout this article “customer” will be used to refer to an institutional provider or purchaser, e.g., hospital system, healthcare professional or provider, or laboratory, or similar, which companies and entities will ultimately purchase the product via a vendor or supplier relationship. “Customer” should also be read to include pharmaceutical, biotech, or medical device companies that may be working with digital health product manufacturers or mHealth developers for the purpose of research or support patient initiatives. “User” or “end user” will typically refer to consumers. 
  14. For this article, companies developing or manufacturing these products may be referred to interchangeably as “manufacturer” and “developer” given the broad range of health-related digital products, some of which may be considered a regulated medical device, including software as a medical device (SaMD). 
  15. Even outside of the health and wellness arena, speed to market is often juxtaposed against building security and privacy into products, and only recently have companies attempted to rebalance this relationship. As noted by Prescott Winter, the former Chief Information Officer and Chief Technology Officer at the National Security Agency, in a 2010 interview, “[American] IT systems are commercial systems, which were built with basically market penetration and speed time to market as the major objective in the design and development of the products, they were not built with security as the number one criteria. They are going to continue to have flaws. … [S]ome of the companies … have begun to take this threat very seriously and spent a lot of time and effort in learning how to write better-secured code.” Chabrow, E., IT Built for Speed to Market, Not Security: Ex-NSA CIO Explains Why Key IT Systems Remain at Risk, ISMG GovInfoSecurity (Aug. 16, 2010), https://www.govinfosecurity.com/built-for-speed-to-market-security-a-2844.
  16. See  Ho, J., Corporate boards: Don’t underestimate your role in data security oversight, FTC: Business Blog (Apr. 28, 2021), https://www.ftc.gov/business-guidance/blog/2021/04/corporate-boards-dont-underestimate-your-role-data-security-oversight (discussing the role of corporate boards in ensuring that consumer data is protected, and stating “A strong data security program should never be reduced to a ‘check the box’ approach geared toward meeting compliance obligations and requirements. Instead, boards should ensure that their security programs are tailored to their companies’ unique needs, priorities, technology, and data.”). For instance, many users are likely used to having multi-factor authentication (MFA), and in particular two-factor authentication (2FA), for their email, EMR, and banking accounts, and have accepted those security controls and mechanisms as necessary to protect their data, even if such controls are inconvenient. Similarly, consumers may be familiar with use of “captcha” (i.e.,  the acronym for Completely Automated Public Turing test to tell Computers and Humans Apart) when setting up or making changes to an account. However, many health and wellness apps, and even some apps for EMRs, may not use MFA or even 2FA when it might otherwise be expected due to the sensitivity of the data being processed. And some developers may underestimate the value consumers place on such practices. 
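By way of a hedged illustration of the 2FA mechanics referenced in note 16, the sketch below implements the time-based one-time password (TOTP) algorithm of RFC 6238 - the scheme behind many authenticator apps - using only the Python standard library. The function names and the example secret are invented for illustration; this is a minimal sketch, not a production implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # current 30-second time step
    msg = struct.pack(">Q", counter)                 # counter as 8 big-endian bytes
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def verify(submitted: str, secret_b32: str) -> bool:
    """Constant-time comparison to avoid leaking timing information."""
    return hmac.compare_digest(submitted, totp(secret_b32))

# Illustrative only; real per-user secrets are randomly generated and stored securely.
print(totp("JBSWY3DPEHPK3PXP"))
```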
  17. Ann Cavoukian, for example, has been advocating for privacy and security by design since the 1990s. See  Cavoukian, A., PhD, Privacy by Design: The 7 Foundational Principles, Information and Privacy Commissioner of Ontario (2009, revised 2011).   
  18. Additional laws and regulations, such as those relating to healthcare and fraud and abuse (e.g., in the United States, the federal Anti-Kickback Statute (42 U.S.C. § 1320a-7b(b)), the Physician Self-Referral Law, known as the Stark Law (42 U.S.C. § 1395nn), the False Claims Act (31 U.S.C. §§ 3729-3733), the Civil Monetary Penalties and Exclusions Statute (42 U.S.C. § 1320a-7a and 42 U.S.C. § 1320a-7), and each of their state equivalents, as applicable), practice of medicine laws, securities and employment laws, transparency (aka “sunshine”) laws, products liability, and intellectual property laws, among others, may also apply to the companies manufacturing or developing digital health and mHealth products, or to the products themselves.
  19. See e.g., Health Insurance Portability and Accountability Act of 1996, Pub.L. 104–191, as amended by the Health Information Technology for Economic and Clinical Health (HITECH) Act, 42 U.S.C. § 139w-4(0)(2) (2009), as part of the American Recovery and Reinvestment Act (ARRA) of 2009, Public Law 111-5 (2009) [together with their implementing regulations, collectively, hereinafter HIPAA]. HIPAA, for example, directly regulates the entities using and disclosing protected health information (PHI) (i.e., covered entities and business associates) rather than the platforms those entities might use. 
  20. See infra (analyzing the application of the controller / processor construct as applied to manufacturers of digital health products). 
  21. See  Garbagnai, A.,  Wu, L.,  &  Van Doninck, D., Building a Pragmatic Framework to Advance Data-Driven Healthcare Research and Innovation, American Bar Association, the ABA Health eSource, n. 43 and accompanying text (September 2020), https://www.americanbar.org/groups/health_law/publications/aba_health_esource/2020-2021/september-2020/bui-pra/ (discussing applicability of HIPAA to medical device manufacturers and pharmaceutical companies). 
  22. See e.g., infra (discussing CLIA laboratories as covered entities under HIPAA). Manufacturers may also be considered covered entities under HIPAA where they perform healthcare services as a provider and electronically transmit health information in relation to such services as part of a transaction, e.g., as part of their patient assistance programs. Examples of where a manufacturer might fulfill the role of a HIPAA business associate include products with EMR/EHR integration, cloud-based products, and products that deploy service technicians with access to PHI, remotely or in-person, or when the company provides services on behalf of a covered entity. See also e.g., Parghi, I., Digital Health 101: There’s No Regulator-Free Path to the Digital Health Market, Bloomberg L. (July 6, 2016), https://news.bloomberglaw.com/health-law-and-business/digital-health-101-theres-no-regulator-free-path-to-the-digital-health-market. See also Bechtel, C., et al., Why Aren’t More Patients Electronically Accessing Their Medical Records (Yet)?, Health Aff. Forefront (Jan. 13, 2020), https://www.healthaffairs.org/do/10.1377/forefront.20200108.82072/full/ (“HIPAA, the predominant legal framework for health data, is already wildly insufficient for protecting health data, … because HIPAA applies only to a set of ‘covered entities,’ which do not always include many of the parties developing and using new health apps and services.”).
  23. See Parghi, supra n. 22.
  24. See Garbagnai, supra n. 21 (discussing HIPAA-adjacent companies and products and potential treatment under various privacy laws, and emphasizing the use of existing legal frameworks, like HIPAA, to meet internal, industry and/or regulatory standards and expectations and build trust, rather than only when required by law). 
  25. See Health Breach Notification Rule, 16 C.F.R. Part 318, 74 Fed. Reg. 42962 (Aug. 25, 2009). 
  26. See infra  (discussing application of Section 5 of the FTC Act to ensure that businesses are not overstating or misrepresenting their privacy and security practices). See also FTC, Business Guidance Resources, Data Breach Response: A Guide for Business, https://www.ftc.gov/business-guidance/resources/data-breach-response-guide-business (last updated February 2021) (providing helpful resources for companies that may experience a data breach). 
  27. See FTC, FTC Warns Health Apps and Connected Device Companies to Comply With Health Breach Notification Rule (Sept. 15, 2021), https://www.ftc.gov/news-events/press-releases/2021/09/ftc-warns-health-apps-connected-device-companies-comply-health.
  28. See FTC, Complying with FTC’s Health Breach Notification Rule, https://www.ftc.gov/tips-advice/business-center/guidance/complying-ftcs-health-breach-notification-rule (last accessed Mar. 10, 2022).   
  29. 15 U.S.C. §§ 6501 et seq. COPPA defines children as being under the age of 13 years. Of particular note, COPPA was the center of a recent FTC enforcement action against an mHealth company, where part of the settlement required the company to destroy any algorithms derived from children’s data. See FTC Press Release, FTC Takes Action Against Company Formerly Known as Weight Watchers for Illegally Collecting Kids’ Sensitive Health Data, Fed. Trade Comm’n (Mar. 4, 2022), https://www.ftc.gov/news-events/news/press-releases/2022/03/ftc-takes-action-against-company-formerly-known-weight-watchers-illegally-collecting-kids-sensitive.
  30. See FTC, Business Guidance Resources, Complying with COPPA: Frequently Asked Questions, https://www.ftc.gov/business-guidance/resources/complying-coppa-frequently-asked-questions (last updated July 2020) (stating that COPPA “applies to operators of general audience websites or online services with actual knowledge that they are collecting, using, or disclosing personal information from children under 13, and to websites or online services that have actual knowledge that they are collecting personal information directly from users of another website or online service directed to children”). 
  31. See e.g., California Medical Information Act, Cal. Civ. Code §§ 56 et seq. [hereinafter CMIA]. The CMIA applies to - among other parties - businesses that offer software or hardware, “including a mobile application or related device” designed to maintain medical information. See id. at § 56.06. 
  32. 2018 Cal. Legis. Serv. Ch. 55 (A.B. 375) (WEST) [hereinafter CCPA].  See also, generally Prop. 24: The California Privacy Rights Act of 2020, codified at Cal. Civ. Code § 1798.199.40 (2020) [hereinafter CPRA];  Virginia Consumer Data Protection Act, VA Code Ann. §§ 59.1-571 - 59.1-581 (2021) [hereinafter VCDPA]; and Colorado Privacy Act, CO. Rev. Stat. §§ 6-1-1301 et seq. (2021) [hereinafter CPA] [collectively, hereinafter Comprehensive State Privacy Laws]. Note that CPRA, VCDPA, and CPA all have an effective date of January 1, 2023. 
  33. See e.g., CCPA at § 1798.150. More recent comprehensive state privacy laws include language similar to that found in the CPRA, which states “A business that collects a consumer’s personal information shall implement reasonable security procedures and practices appropriate to the nature of the personal information to protect the personal information from unauthorized or illegal access, destruction, use, modification, or disclosure.…” CPRA at § 1798.100(e). See also VCDPA at § 59.1-574 and CPA at § 6-1-1305(4).
  34. See e.g., CCPA, supra note 32, at §§ 1798.145(c), 1798.146 (exempting from application of many of CCPA’s requirements data processed by HIPAA covered entities and business associates, PHI, and information de-identified in accordance with HIPAA or derived from information originally collected, created, transmitted or maintained by an entity regulated by HIPAA or CMIA, so long as such information is not re-identified; as well as data from research subject to U.S. and international regulations and standards governing the conduct of research and clinical studies). It should be noted that the CCPA’s notice requirements, prohibitions on re-identification, and certain contract language requirements may still apply. Id. 
  35. See generally European Parliament and Council of European Union (2016) Regulation (EU) 2016/679 [hereinafter GDPR].   
  36. Id. at Article 25. 
  37. See id. at Article 3 (describing the application of GDPR to organizations established both in and outside of the European Union). 
  38. This includes the consideration of whether the manufacturer might be prohibited from performing certain requests due to competing legal and regulatory obligations. See European Data Protection Board, Guidelines 10/2020 on restrictions under Article 23 GDPR (Oct. 13, 2021), https://edpb.europa.eu/system/files/2021-10/edpb_guidelines202010_on_art23_adopted_after_consultation_en.pdf.   
  39. See GDPR, supra n. 35, at Article 46.   
  40. Case C-311/18, Data Protection Commissioner v. Facebook Ireland Ltd. and Maximilian Schrems, ECLI:EU:C:2020:559 (July 16, 2020). 
  41. Commission Implementing Decision (EU) 2021/914 of 4 Jun. 2021 on standard contractual clauses for the transfer of personal data to third countries pursuant to Regulation (EU) 2016/679 of the European Parliament and of the Council C/2021/3972, O.J. (L199).   
  42. Zanfir-Fortuna, G., Understanding Why the First Pieces Fell in the Transatlantic Transfers Domino, Future of Priv. F. (Jan. 27, 2022), https://fpf.org/blog/understanding-why-the-first-pieces-fell-in-the-transatlantic-transfers-domino/ (analyzing the broader impact of Member State enforcement against Google Analytics at the end of 2021 and first quarter of 2022 and the implications of said enforcement on EU-US data transfers in general). 
  43. See supra n. 20 (discussing the role of manufacturer as a controller / processor). 
  44. Data localization occurs when a country’s or territory’s law prohibits or restricts the transfer of personal data outside of the country. 
  45. Luo, Y.,  Yu, Z., &  Liu, V., The Future of Data Localization and Cross-Border Transfer in China: A Unified Framework or a Patchwork of Requirements, Int’l Ass’n of Priv. Pro. (June 22, 2021), https://iapp.org/news/a/the-future-of-data-localization-and-cross-border-transfer-in-china-a-unified-framework-or-a-patchwork-of-requirements/ (describing the complexity of implementing the data localization requirements introduced by the Chinese Cyber Security Law, the Data Security Law, and the Personal Information Protection Law).  
  46. For example, South Africa’s Protection of Personal Information Act went into effect in 2020, but the grace period for implementation was extended after implementation guidance was not issued by the government until the middle of 2021. See  Britton, N.,  Swart K., & Van Der Vyver, E., South Africa POPIA: Information Regulator Issues Two New Guidance Notes on Authorizations for Special Personal Information and Children’s Personal Information, Clyde & Co. (June 29, 2021), https://www.clydeco.com/en/insights/2021/06/south-africa-popia-information-regulator-issues-tw. It should be noted that this also occurs in the United States. For example, in February 2022, the California Privacy Protection Agency announced that the CPRA implementation regulations would be delayed to the second half of 2022, despite the January 2023 effective date, but no extension of the expected enforcement (currently scheduled to begin July 1, 2023) has been announced. See Duball, J., CPRA Regulations Delayed Past July 1 Deadline, Expected Q3 or Q4, Int’l Ass’n of Priv. Pro. (Feb. 23, 2022), https://iapp.org/news/a/cpra-regulations-delayed-past-july-1-deadline-expected-q3-or-q4/ (also noting a similar occurrence for the CCPA; stating, “The California attorney general's office went past its deadline to produce regulations for the California Consumer Privacy Act in 2020 as those regulations took effect more than a month later. In that instance, the attorney general's office opted against any sort of enforcement delay while noting companies had ample time to complete compliance activities despite the delay on regulations.”). 
  47. The U.S. agency that administers and enforces the Federal Food, Drug, and Cosmetic Act, as amended. 21 U.S.C. §§ 301-392 (Suppl. 5 1934) [hereinafter FDCA].
  48. The U.S. agency that oversees and regulates all laboratory testing for humans through the Clinical Laboratory Improvement Amendments, though some research is exempted from CMS oversight. 42 U.S.C. § 263a (1988) [hereinafter CLIA]. 
  49. See generally e.g., European Union Agency for Cybersecurity, Baseline Security Recommendations for IoT (Nov. 20, 2017), https://www.enisa.europa.eu/publications/baseline-security-recommendations-for-iot; U.S. Internet of Things Cybersecurity Improvement Act of 2020, Pub. L No 116-207 (2020); and California Internet of Things Cybersecurity Improvement Act of 2017, Cal. Civ. Code § 1798.91.04 et seq. (2020). 
  50. 15 U.S.C. §§ 45 et seq. (1914), as amended. 
  51. The FTC is charged with ensuring that companies with consumer-facing products do not engage in “unfair or deceptive” trade practices, which have for over a decade often included a product’s or company’s security and/or privacy disposition. See generally FTC, Privacy and Security Enforcement, https://www.ftc.gov/news-events/media-resources/protecting-consumer-privacy/privacy-security-enforcement (last visited Mar. 3, 2022). 
  52. 31 U.S.C. §§ 3729-3733.
  53. DOJ, Office of Public Affairs, Deputy Attorney General Lisa O. Monaco Announces New Civil Cyber-Fraud Initiative (Oct. 6, 2021), https://www.justice.gov/opa/pr/deputy-attorney-general-lisa-o-monaco-announces-new-civil-cyber-fraud-initiative. In healthcare, FCA cases and settlements often involve providers or suppliers who make false or fraudulent claims under Medicare and other federal programs,  bill for services not rendered, or are the giver or recipient of a kickback related to a referral. With this recent initiative, the DOJ has broadened the reach of the FCA, as demonstrated by a recent example where a provider billed the State Department to build an EMR system that then was not used for the data related to the provided medical services and “had not consistently stored patients’ medical records on a secure EMR system.” See U.S. Attorney’s Office, E.D. NY., Contractor Pays $930,000 to Settle False Claims Act Allegations Relating to Medical Services Contracts at State Department and Air Force Facilities in Iraq and Afghanistan, U.S. Dept. of Just. (Mar. 8, 2022), https://www.justice.gov/usao-edny/pr/contractor-pays-930000-settle-false-claims-act-allegations-relating-medical-services. 
  54. See Wilson, D., DOJ Atty Says Cybersecurity Plan Is A Call To Whistleblowers, Law360 (Feb. 23, 2022), https://www.law360.com/articles/1467843/doj-atty-says-cybersecurity-plan-is-a-call-to-whistleblowers. As summarized succinctly by the DOJ, “In addition to allowing the United States to pursue perpetrators of fraud on its own, the FCA allows private citizens to file suits on behalf of the government (called “qui tam” suits) against those who have defrauded the government.  Private citizens who successfully bring qui tam actions may receive a portion of the government’s recovery.  Many Fraud Section investigations and lawsuits arise from such qui tam actions.” DOJ, Fraud Section, The False Claims Act (updated Feb. 2, 2022), https://www.justice.gov/civil/false-claims-act.
  55. It is worth noting  that while the Genetic Information Nondiscrimination Act (Pub. L. 110-233 (2008)) (GINA) is often discussed in the context of genetic data, the law is rarely in scope of product development efforts. 
  56. See generally supra n. 49 (IoT). See also Engler, A., The EU and U.S. are starting to align on AI regulation, The Brookings Inst.: TechTank (Feb. 1, 2022), https://www.brookings.edu/blog/techtank/2022/02/01/the-eu-and-u-s-are-starting-to-align-on-ai-regulation/ (noting that “[s]ince 2017, at least 60 countries have adopted some form of artificial intelligence policy, a torrent of activity that nearly matches the pace of modern AI adoption.”); FDA, CDRH, Direct to Consumer Tests, https://www.fda.gov/medical-devices/in-vitro-diagnostics/direct-consumer-tests (last updated Dec. 20, 2019); and Jillson, E., Selling genetic testing kits? Read on., FTC: Bus. Blog (Mar. 21, 2019), https://www.ftc.gov/business-guidance/blog/2019/03/selling-genetic-testing-kits-read.
  57. Defined as “an instrument, apparatus, implement, machine, contrivance, implant, in vitro reagent, or other similar or related article, including any component, part, or accessory, which is: … intended for use in the diagnosis of disease or other conditions, or in the cure, mitigation, treatment, or prevention of disease … or intended to affect the structure or any function of the body of man or other animals....” See FDCA, supra n. 47, at § 201(h).
  58. See  Lievevrouw, E.,  Marelli, L., & Van Hoyweghen, I., The FDA’s standard‑making process for medical digital health technologies: co‑producing technological and organizational innovation, BioSocieties at 8, Fig. 1 (May 2021). FDA’s Center for Devices and Radiological Health (CDRH), the branch of FDA charged with overseeing medical devices, has made a concerted effort in recent years to help ease the regulatory burden placed on manufacturers of digital health products, including by creating a Digital Health Innovation Action Plan, issuing a number of guidance documents, and creating a “Digital Health Center of Excellence” (DHCoE), whose stated goal is to “[e]mpower stakeholders to advance health care by fostering responsible and high-quality digital health innovation.” See generally FDA, Digital Health Center of Excellence, https://www.fda.gov/medical-devices/digital-health-center-excellence (last visited Mar. 9, 2022) (providing links to draft and final guidance documents issued since 2005 that may apply to digital health products, including the Action Plan). The DHCoE is intended to further partner with industry, share knowledge, and establish innovative regulatory review processes, such as the Digital Health Software Precertification (Pre-Cert) Program pilot. Id. See also FDA, Digital Health Software Precertification (Pre-Cert) Program, https://www.fda.gov/medical-devices/digital-health-center-excellence/digital-health-software-precertification-pre-cert-program (last visited Mar. 9, 2022) (a pilot program intended to “help inform the development of a future regulatory model that will provide more streamlined and efficient regulatory oversight of software-based medical devices”). 
  59. 21st Century Cures Act, Pub.L. 114–255 (2016) [hereinafter Cures Act]. 
  60. Generally, those not reasonably likely to have serious adverse health consequences.  To make this determination, FDA must consider the following factors:  likelihood and severity of patient harm if the software were to not perform as intended; extent to which the software function is intended to support the clinical judgment of a healthcare professional; whether there is a reasonable opportunity for a healthcare professional to review the basis of the information or treatment recommendation provided by the software function; and the intended user and user environment. Id. at § 3060. 
  61. See id. at § 3060(a). 
  62. See e.g., FDA Digit. Health Ctr. of Excellence, General Wellness: Policy for Low Risk Devices, Guidance Document (2016, reissued 2019) [hereinafter General Wellness Guidance] and FDA Digital Health Center of Excellence, Policy for Device Software Functions and Mobile Medical Applications, Guidance (2013, reissued 2019) (superseding “Mobile Medical Applications” dated Feb. 9, 2015).
  63. See FDA, CDRH, Digital Health Center of Excellence: Cybersecurity (last visited Mar. 13, 2022), https://www.fda.gov/medical-devices/digital-health-center-excellence/cybersecurity [hereinafter Cyber Center of Excellence] (providing playbook templates, guidances, papers, communications, and other resources both for manufacturers and also healthcare delivery organizations, like doctors’ offices). 
  64. See e.g., FDA, CDRH, LifeCare PCA3 and PCA5 Infusion Pump Systems by Hospira: FDA Safety Communication - Security Vulnerabilities, Safety Alerts for Human Medical Products (May 13, 2015), http://wayback.archive-it.org/7993/20170722144742/https://www.fda.gov/MedicalDevices/Safety/AlertsandNotices/ucm446809.htm. As of publication, only 13 such alerts have been communicated, with the intended audience varying from patients and consumers to manufacturers, and also to healthcare providers and delivery organizations. See Cyber Center of Excellence, supra n. 63, at Cybersecurity Safety Communications and Other Alerts. 
  65. See Garbagnati, supra n. 21, at 7. 
  66. See e.g., European Commission, Communication from the Commission to the European Parliament, The Council, The European Economic and Social Committee and the Committee of the regions, eHealth Action Plan 2012–2020—Innovative healthcare for the 21st century, https://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52012DC0736&from=EN (Apr. 12, 2012). 
  67. European Parliament and Council of European Union Regulation (EU) 2017/745 [hereinafter MDR] (providing a similar definition of medical device as under the FDCA in Article 2(1) and including application to software throughout).
  68. See NIST, infra n. 74. 
  69. See generally Cavoukian, supra n. 17.
  70. “Pseudonymous data” is generally recognized to refer to data scrubbed of identifiers that may link to a data subject, thus reducing linkability (e.g., via hashing or tokenization).
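As a minimal sketch of the keyed-hashing flavor of tokenization mentioned in note 70, the Python example below derives a stable pseudonym from a direct identifier. The secret name, the identifier format, and the helper function are hypothetical; in practice the key would be generated randomly and stored separately from the pseudonymized data set.

```python
import hashlib
import hmac

# Hypothetical secret "pepper"; in practice it would be generated randomly and
# held in a key vault, separate from the pseudonymized data set.
PEPPER = b"replace-with-a-randomly-generated-secret"

def pseudonymize(identifier: str) -> str:
    """Derive a stable pseudonym from a direct identifier (e.g., an MRN).

    Keyed hashing (HMAC-SHA256) reduces linkability: without the key, the
    token cannot feasibly be reversed or regenerated, yet the same input
    always maps to the same token, preserving joinability across records.
    """
    return hmac.new(PEPPER, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same patient identifier yields the same pseudonym in every record.
assert pseudonymize("MRN-0012345") == pseudonymize("MRN-0012345")
```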
  71. See generally  Simon, G., Assessing and Minimizing Re-Identification Risk in Research Data Derived from Health Care Records, 7 J. Elec. Health Data and Methods 6 (Mar. 29, 2019). 
  72. See Sariyar, supra n. 11. 
  73. Open Web Application Security Project (OWASP), A04:2021 – Insecure Design,  https://owasp.org/Top10/A04_2021-Insecure_Design/ (last visited Feb. 25, 2022). 
  74. See  Nieles, M., et al., National Institutes of Standards and Technology (NIST), U.S. Dept. of Comm., An Introduction to Information Security, NIST Special Pub. 800-12 Rev. 1 (June 2017). NIST defines “least privilege” as a principle that “allow[s] only authorized access for users or processes acting on behalf of users that is necessary to accomplish assigned tasks in accordance with organizational missions and business functions.” Id. at § 5.4.3. 
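The least-privilege principle quoted in note 74 can be illustrated with a deliberately simple, default-deny authorization check; the roles and permission strings below are hypothetical and stand in for whatever access model a product actually uses.

```python
# Each role is granted only the permissions needed for its assigned tasks;
# anything not expressly granted is denied by default.
ROLE_PERMISSIONS = {
    "support_agent": {"read:ticket"},
    "clinician": {"read:record", "write:record"},
    "auditor": {"read:record", "read:audit_log"},
}

def is_authorized(role: str, permission: str) -> bool:
    # Default-deny: unknown roles and unlisted permissions are refused.
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_authorized("clinician", "write:record")
assert not is_authorized("support_agent", "write:record")  # least privilege in action
```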
  75. Security by obscurity refers to accomplishing security mainly by hiding important information (e.g., by obfuscating file names or hiding hardcoded passwords in code). There is a misperception that hiding such information is sufficient to minimize the risk of a cyberattack. However, this is the electronic equivalent of hiding your house key on a rafter above the door or under a rock, or setting a lock combination on luggage or a phone to 1234 or ABCD. See OWASP Juice Shop, Security through Obscurity, https://pwning.owasp-juice.shop/part2/security-through-obscurity.html (last visited Mar. 11, 2022).
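To make the anti-pattern in note 75 concrete, the sketch below contrasts a credential hidden in source code with one kept out of the code entirely; the variable names are illustrative, and a production system would more likely call a dedicated secrets manager than read a bare environment variable.

```python
import os

# Anti-pattern: the credential is merely hidden in the source code; anyone with
# access to the repository or the shipped artifact can recover it.
DB_PASSWORD = "1234"  # "hidden" in plain sight

# Preferable: keep the secret out of the code entirely and fail loudly
# if it has not been provisioned.
def get_db_password() -> str:
    password = os.environ.get("DB_PASSWORD")  # or a call to a secrets manager
    if password is None:
        raise RuntimeError("DB_PASSWORD has not been configured")
    return password
```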
  76. Sveikauskas, D., Security By Design Principles According To OWASP, https://patchstack.com/security-design-principles-owasp/ (last updated June 16, 2021) (summarizing the OWASP Development Guide). 
  77. See FDA, Guidance Document, Postmarket Management of Cybersecurity in Medical Devices (December 2016) [hereinafter FDA Postmarket Cybersecurity Guidance]. See also FDA, FDA Fact Sheet, The FDA’s Role in Medical Device Cybersecurity, Undated, https://www.fda.gov/media/123052/download (last viewed Mar. 3, 2022) [hereinafter FDA Fact Sheet]. 
  78. See MDR, supra n. 67, at Annex I, 17.2 and 17.4. 
  79. Id. at Annex XV, 4.5. 
  80. Landi, H., 82% of healthcare organizations have experienced an IoT-focused cyberattack, survey finds, Fierce Healthcare (Aug. 29, 2019), https://www.fiercehealthcare.com/tech/82-healthcare-organizations-have-experienced-iot-focused-cyber-attack-survey-finds (stating “Almost all of the healthcare organizations surveyed agree that a security solution should be an enabler of new business models, not just a cost, which suggests attitudes towards IoT security are changing for the better as IoT devices proliferate throughout the sector.”). 
  81. See generally e.g., Ross, R., et al., NIST, U.S. Dept. of Comm., Systems Security Engineering: Considerations for a Multidisciplinary Approach in the Engineering of Trustworthy Secure Systems, SP 800-160 Vol. 1 (November 2016 (updated March 2018)) and International Organization for Standardization (ISO), Information technology — Security techniques — Catalogue of architectural and design principles for secure products, systems and applications, ISO/IEC TS 19249:2017 (October 2017). 
  82. This could be anyone, but typically would be someone in legal, IT, product development, or even quality. 
  83. See generally NIST, Computer Security Resource Center, Glossary, https://csrc.nist.gov/glossary (last visited Mar. 7, 2022) [hereinafter NIST Glossary] (defining “risk” to be “[a] measure of the extent to which an entity is threatened by a potential circumstance or event, and typically a function of: (i) the adverse impacts that would arise if the circumstance or event occurs; and (ii) the likelihood of occurrence”; “security risk” to be “the level of impact on agency operations (including mission functions, image, or reputation), agency assets, or individuals resulting from the operation of an information system given the potential impact of a threat and the likelihood of that threat occurring”; and “privacy risk” to be “the likelihood that individuals will experience problems resulting from data processing, and the impact should they occur”). In practice, in-house privacy and security professionals may also consider “risk” in this context to include the inability of a product to permit users to exercise their privacy rights or for customers to perform their privacy or security compliance obligations. 
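One common, simplified way to operationalize the NIST framing of risk in note 83 - risk as a function of impact and likelihood - is a qualitative scoring matrix. The 1-5 scales, thresholds, and function name below are illustrative assumptions, not a method prescribed by NIST.

```python
def risk_rating(likelihood: int, impact: int) -> str:
    """Bucket a likelihood x impact product into a coarse rating for triage."""
    if not (1 <= likelihood <= 5 and 1 <= impact <= 5):
        raise ValueError("likelihood and impact must be on a 1-5 scale")
    score = likelihood * impact
    if score >= 15:
        return "high"
    if score >= 8:
        return "medium"
    return "low"

# Example: an event that is unlikely (2) but severe if it occurs (5).
print(risk_rating(2, 5))  # -> "medium"
```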
  84. See 45 C.F.R. § 160.103 (including definitions of the key organizational roles covered by the law, that is, covered entities and business associates). 
  85. See GDPR supra n. 35, at Article 4 (defining a controller as “determin[ing] the purposes and means of the processing of personal data”). 
  86. It should be noted, however, that some EU Member States have advised on how manufacturers across industries can still use data for their own purposes as a data processor. In either scenario, it is clear that the intended uses of the data should be made transparent to the customer and/or end user. See Maldoff, G. & Tene, O., CNIL Sets Parameters for Processors’ Reuse of Data for Product Improvements, Int’l Ass’n of Priv. Pro. (Jan. 13, 2022), https://iapp.org/news/a/cnil-sets-parameters-for-processors-reuse-of-data-for-product-improvement/
  87. See generally CLIA, supra n. 48 (applying to lab testing for humans, but excluding certain testing conducted solely for research purposes).
  88. While patients have the right to obtain their test results directly, lab tests are often still required to be ordered by and the results primarily directed to the treating practitioner. 
  89. GDPR, supra n. 35, at Article 30 (setting forth a requirement that companies maintain and report “records of processing”). 
  90. CPRA, supra n. 32, at § 1798.199.40(b) (giving power to the California Privacy Protection Agency to promulgate regulations for “record keeping requirements for businesses,” which is typically understood to refer at a minimum to records of processing activities). 
  91. See CCPA, supra n. 32.  While CCPA does not explicitly require a ROPA, it would be extremely difficult to comply with the law without one. CCPA does require companies to understand and demonstrate how they use data and to present that information in their public-facing privacy policies and notices.   
  92. See generally Apple Developer Portal, App Review, https://developer.apple.com/app-store/review/ (last visited Mar. 2, 2022) and Apple Developer Portal, Protecting the User’s Privacy, https://developer.apple.com/documentation/uikit/protecting_the_user_s_privacy (last visited Mar. 2, 2022) [collectively, hereinafter App Store Review]. 
  93. See generally, NIST Glossary, supra n. 83 (last visited Mar. 7, 2022) (defining “residual risk” as the portion of risk remaining after security measures have been applied, and “acceptable risk” as a level of residual risk - i.e., potential loss of data or disruption to the system - that an organization deems reasonable and thus “acceptable”). Whether a risk is acceptable is entirely relative and in proportion to a number of factors, including the type of data at issue, other protections in place, the industry or sector, company maturity, and likelihood of bad actors. By way of an example, an acceptable risk for some products may be to not require MFA or 2FA for making changes to an account or redeeming points or rewards. This approach may be acceptable to a company where the types of data at risk are not sensitive or inherently identifiable, the potential value of redemption is low, or where there is a trust architecture in place to ensure that if the risk becomes unacceptable, there are other security controls in place.
  94. See e.g., GDPR, supra n. 35, at Article 35.  See also Article 29 Working Party, Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679 (Oct. 4, 2017), https://ec.europa.eu/newsroom/article29/items/611236.  See also Comprehensive State Laws supra n. 32, CPRA at § 1798.185(a)(15)(B); VDCPA at § 59.1-576(A)-(C); and CPA at § 6-1-1309(2)(a)-(c). Typically, PIAs focus on how an entity collects, uses, shares, and maintains personally identifiable information, and analyzes the information related to existing risks. PIAs are conducted to assess new business processes, acquisitions, or product launches or updates, or when a company’s activities will be in a new country or region. DPIAs instead are used to identify and minimize risk associated with the processing of personal data.  PIAs are used routinely for everyday business practices; in contrast, DPIAs are specified under applicable privacy laws as being required only when there is a high risk to the rights and freedoms of a data subject associated with a processing activity. Thus, the findings of a PIA (or similar assessment) typically will trigger the need for a DPIA to be conducted. 
  95. See infra (discussing the need to re-evaluate and refresh documentation). 
  96. Created by the Medical Imaging & Technology Alliance (MITA), the Manufacturer Disclosure Statement for Medical Device Security (MDS2) is a voluntary standard used for risk management purposes and to standardize information on the security control features integrated into medical devices.
  97. See e.g., Mayo Clinic, Medical and Research Device Risk Assessment: Vendor Packet Instructions (Oct. 1, 2020), https://www.mayoclinic.org/documents/medical-device-vendor-instructions/doc-20389647 (including a robust list of required documentation vendors must submit for review as part of Mayo Clinic’s process for assessing new medical devices, including requiring the manufacturer to provide a number of assessments and other internal artifacts to support their privacy and security features and controls as well as a copy of a current MDS2 form). See also Thomson Reuters, How in-house counsel can help the business maintain data security in vendor relationships  (Jul. 13, 2021), https://legal.thomsonreuters.com/en/insights/articles/maintaining-your-data-security-in-vendor-relationships (describing the role of in-house counsel in supporting their client in reviewing potential vendors and noting that accepting vendor self-assessments, third-party assessments, and certifications can be a necessary part of due diligence as well as continued oversight). 
  98. See generally App Store Review, supra n. 92. 
  99. For example, HIPAA requires that documentation of compliance records (e.g., authorizations, policies) be retained for six years (see HIPAA § 164.316(b)(2)(i)), but intentionally defers to state laws as to the duration of retention for actual medical records and information, which vary by state and depend on, among other things, the type of provider, age of the patient, types of records, and purpose of the records. If an mHealth app or digital health product is used in a clinical trial for the purposes of regulatory submission to FDA or similar authority, the duration of any retention period should then be consistent with regulatory and quality records keeping requirements for clinical trial data. Such data will likely also be required to be retained to meet contractual obligations between the manufacturer and the customer (in this case, likely a pharmaceutical, medical device, or biotech company). 
  100. See generally e.g., FDA, CDRH, Content of Premarket Submissions for Management of Cybersecurity in Medical Devices, Draft Guidance at 9 and 22-23 (Oct. 18, 2018) (including in its recommendation for manufacturers “determination of risk levels and suitable mitigation strategies” and requiring “[a] specific list and justification for all cybersecurity controls that were established for your device. This should include all risk mitigations and design considerations pertaining to intentional and unintentional cybersecurity risks associated with your device…”). See also supra n. 94 (discussing the GDPR’s requirement to document processing activities as well as risks and mitigations).
  101. See e.g., GDPR, supra n. 35, at Article 35. 
  102. See e.g., FTC, supra nn. 25, 27, and 28 (discussing the FTC’s recent activities reinforcing strict breach notification obligations to consumer-generated health data not otherwise regulated by HIPAA). 
  103. In the United States, for digital health products regulated as a medical device, manufacturers must “comply with federal regulations. Part of those regulations, called quality system regulations (QSRs), requires that medical device manufacturers address risks, including cybersecurity risk.” See FDA Fact Sheet, supra n. 77. Such a requirement also demonstrates the necessity of performing the activities described in this article. See also supra  (discussing MDR cybersecurity and privacy requirements). 
  104. See e.g., FDA Postmarket Cybersecurity Guidance, supra n. 77.
  105. There are a host of knowledge base and knowledge management software solutions available to companies, such as Slite, Zendesk, HelpJuice, Atlassian, HelpCrunch, Document360, and HubSpot Service Hub (among others), which can be deployed for either or both internal and external use. These solutions can range in capabilities from fairly basic to also including AI configuration for searches, advanced analytics, and other functions. See e.g., The Best Knowledge Base Software, Slite (Oct. 11, 2021), https://slite.com/learn/knowledge-base-softwares. Note that the authors of this article are not recommending these solutions, which are being listed solely as examples. 
  106. System and Organizational Controls (or SOC) 2 is a voluntary compliance standard intended to ensure the safety and privacy of data processed by technology service providers and SaaS companies that handle or store customer data. The SOC 2 standard sets forth a framework for protecting customer data based on the following principles: security, availability, processing integrity, confidentiality, and privacy. The standard is meant to be adaptable to each company’s own business needs.  Once the governance structure and foundation are established, the company’s program is then audited by a credentialed third party, which produces a report on the company’s compliance. See  Monahan, G., What SOC 2 Type II Certification Means, L. Tech. Today (Jul. 31, 2014), https://www.lawtechnologytoday.org/2014/07/soc-2-type-ii-certification-means/. 
  107. As demonstrated by the recent “connect2health” initiative and COVID-19 Telehealth Program, the latter of which was funded by the Coronavirus Aid, Relief, and Economic Security (CARES) Act, among many other initiatives. See generally FCC, Connecting Americans to Health Care, https://www.fcc.gov/connecting-americans-health-care (last visited Mar. 7, 2022). 
  108. See e.g., FTC, Press Release, FTC Requires Zoom to Enhance its Security Practices as Part of Settlement (Nov. 9, 2020), https://www.ftc.gov/news-events/news/press-releases/2020/11/ftc-requires-zoom-enhance-its-security-practices-part-settlement (discussing a settlement with Zoom Video Communications, Inc., and citing the complaint against Zoom, in which the FTC alleged that the company misled users by touting in various mediums the security controls in place and giving users "a false sense of security," lacked any offsetting measures to protect users' security, left ghost software behind even after app deletion, and failed to store recorded sessions securely, as was stated in its product literature). See also FTC, Press Release, FTC Approves Final Settlement With Facebook (Aug. 10, 2012), https://www.ftc.gov/news-events/press-releases/2012/08/ftc-approves-final-settlement-facebook (approving a consent order resolving charges that Facebook deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public; in 2019, the FTC imposed a $5 billion penalty and “sweeping” new privacy and security restrictions and corporate structures on Facebook for violating that order).
  109. See Weller, A. &  Leach, E., How to Build a Culture of Privacy, Int’l Ass’n of Priv. Pro. (Feb. 25, 2020), https://iapp.org/news/a/how-to-build-a-culture-of-privacy/.

Alea Garbagnati

Adaptive Biotechnologies, South San Francisco, CA

Alea Garbagnati, Esq. is Head of Privacy at Adaptive Biotechnologies, where she leads a team of amazing privacy professionals and provides strategic counsel on privacy compliance, product privacy by design, research, and data strategy. She is passionate about finding practical and ethical solutions to support data use in health research and innovation and has co-authored articles and presented on the subject. She has almost a decade of experience in international and domestic privacy, data protection, and cybersecurity law. Prior to joining Adaptive, she was privacy counsel at a multinational diagnostics and pharmaceutical company, where she was involved in a number of privacy, cybersecurity, and data related initiatives at varying levels of the organization. She was also an advisory consultant at a Big Four accounting firm, where she provided privacy and data protection support to companies across industries.

Ms. Garbagnati is an active member of the International Association of Privacy Professionals, serving as a 2022-2023 member of the Publications Advisory Board and having spoken at multiple IAPP events. She is an alumna of the University of California, Irvine (zot zot!), and Hastings College of the Law. When she’s not a privacy Jedi master, she enjoys spending time with her family and is an avid traveler, (very) amateur photographer, musical theater aficionado and passionate soccer fan (go Quakes!).  She may be reached at [email protected].

Lauren Wu

Evidation Health, San Mateo, CA

Lauren Wu, Esq., CIPP/US, is Head of Privacy and Senior Director of Legal - Regulatory & Compliance at Evidation Health, a California-based digital health company that helps individuals and partnering life science companies and healthcare organizations measure health in everyday life, enabling anyone to participate in ground-breaking research and health programs. She is a privacy industry influencer; a passionate privacy, regulatory, and compliance attorney; and an enabler of ethical data stewardship to advance health innovation. Before joining Evidation, she served as Senior Counsel for Roche Molecular Solutions (RMS) (now Roche Diagnostic Solutions and Roche Information Solutions), focusing on regulatory and healthcare fraud and abuse law matters, as well as reimbursement and privacy. While at Roche, Ms. Wu led the RMS Data Protection & Privacy team, serving as both U.S. Privacy Officer and Data Protection Officer, while also advising on regulatory issues such as transitioning to IVDR/MDR, as well as laboratory developed test and CLIA regulation. Prior to joining Roche, she worked as Senior Corporate Counsel, U.S. Privacy Officer, and Interim U.S. Compliance Officer at Genomic Health (now Exact Sciences), and was a Healthcare associate at Ropes & Gray LLP (SF) and a Healthcare and FDA legal / legislative assistant at Sidley Austin LLP (DC). She is an alumna of the University of Southern California (Fight on, Trojans!) and Northwestern University School of Law (Go Wildcats!). Ms. Wu also serves as a Board member for the Surveillance Technology Oversight Project (S.T.O.P.) and recently co-authored an article on privacy’s impact on healthcare research and innovation. A former ballerina and USC football recruiter, she enjoys cooking, wine, travel (@ least pre-COVID), gardening, and spending time with her husband and three beautiful daughters (one of whom could say "GDPR" before the age of 1). She may be reached at [email protected].
