Privacy Challenges of Data Sharing
While new interoperability tools create new opportunities to aggregate and analyze data, significant privacy challenges remain. At the federal level, there are primarily four privacy laws governing health information: (1) the Standards for Privacy of Individually Identifiable Health Information (Privacy Rule), promulgated pursuant to the Health Insurance Portability and Accountability Act of 1996 (HIPAA); (2) the Confidentiality of Substance Use Disorder Patient Records at 42 C.F.R. Part 2 (the “Part 2 Rule”); (3) Section 5 of the Federal Trade Commission (FTC) Act, prohibiting unfair and deceptive trade practices; and (4) the FTC’s Health Breach Notification Rule.
Privacy Rule
The Privacy Rule readily permits use and disclosure of protected health information (PHI) for treatment, payment, and healthcare operations. The Privacy Rule generally requires individuals’ authorizations, however, for HIPAA covered entities or business associates to use or disclose PHI for research. The primary exception to this authorization requirement is that an institutional review board (IRB) or privacy board that meets certain criteria may waive or alter the Privacy Rule’s authorization requirement in certain circumstances. Relying on an IRB’s waiver of authorization is one of the most frequent means of collecting large volumes of PHI for research. Some of the challenges associated with this permission, however, are questions as to what constitutes “research” (for example, whether commercial, non-published research is considered “research” under the Privacy Rule) and potential politics surrounding IRBs (for example, a covered entity may only be willing to disclose PHI pursuant to its own IRB’s waiver).
Another basis to aggregate health data under the Privacy Rule is through de-identification. The Privacy Rule permits a covered entity to de-identify PHI (or authorize its business associate to do so) as part of its permitted healthcare operations, whether or not the de-identified information is to be used by the covered entity. There are two methods for de-identification. The first is the “Safe Harbor Method,” in which 18 categories of direct and quasi-identifiers have been removed and the covered entity does not have actual knowledge that the remaining information could be used alone or in combination with other information to identify an individual who is a subject of the information. The second is the “Expert Determination Method,” in which a statistical expert determines and documents that “the risk is very small that the information could be used, alone or in combination with other reasonably available information, by an anticipated recipient to identify an individual who is a subject of the information.” De-identification is often a means for creating large data sets for research and analytics. The primary challenges with de-identification are:
- The use of the Safe Harbor method may substantially deteriorate the value of the data because of loss of dates more specific than year, loss of most geographic data about individuals more specific than state, and loss of unique identifiers that allow for data linkages across data sets.
- Engaging a statistical expert under the Expert Determination can be expensive and time consuming, and the expert’s determination may last for a limited duration.
- De-identification of unstructured data is often difficult, with identifiers potentially slipping through.
- Business associates often have access to large quantities of health data spanning multiple covered entities, but such covered entities may not grant the business associates de-identification rights.
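The data-degradation trade-off described in the first bullet can be seen in a minimal Python sketch of Safe Harbor-style generalization. This is an illustration, not legal advice: the field names and simplified rules are hypothetical stand-ins for the Safe Harbor Method's 18 identifier categories (which, for example, permit retaining the first three ZIP digits only when the corresponding geographic unit exceeds 20,000 people, and require aggregating ages over 89).

```python
# Hypothetical sketch of Safe Harbor-style generalization (not legal advice).
# Field names and rules are simplified stand-ins for the 18 identifier categories.

DIRECT_IDENTIFIERS = {"name", "ssn", "mrn", "email", "phone"}

def safe_harbor_generalize(record: dict) -> dict:
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue  # remove direct identifiers entirely
        if field.endswith("_date"):
            out[field] = value[:4]  # keep only the year, losing month and day
        elif field == "zip":
            out[field] = "000"  # coarsen geography (simplified from the ZIP rule)
        elif field == "age" and int(value) > 89:
            out[field] = "90+"  # ages over 89 must be aggregated
        else:
            out[field] = value
    return out

record = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "birth_date": "1987-04-12",
    "zip": "94110",
    "age": "36",
    "diagnosis": "J45.909",
}
print(safe_harbor_generalize(record))
# {'birth_date': '1987', 'zip': '000', 'age': '36', 'diagnosis': 'J45.909'}
```

Note how the output retains clinical value (the diagnosis) but loses exact dates, fine-grained geography, and every unique identifier that could link this record to the same patient in another data set, which is precisely why Safe Harbor output is often poorly suited to longitudinal or cross-institutional analytics.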
Another option under the Privacy Rule for creating large data sets is through the creation of a limited data set that may be used or disclosed pursuant to a data use agreement. Some benefits of limited data sets are that they permit the use of PHI for research without the need for an authorization or an IRB or privacy board’s waiver of authorization and, unlike de-identified data, they can include dates, more specific geographic information such as zip codes of individuals, and unique identifiers that may allow for data linkages. Some of the challenges of limited data sets, however, are that they are limited to healthcare operations, public health, and research (with questions surrounding the scope of what constitutes “research”); they are subject to additional Privacy Rule restrictions such as the prohibition on selling PHI; and covered entities rarely give business associates the right to create, use, and disclose limited data sets.
The Privacy Rule also includes a few general limitations that often hinder disclosures that contribute to compilation of health data. One is the “minimum necessary” standard, under which a covered entity or business associate must make reasonable efforts to limit the amount of PHI disclosed to the minimum necessary to accomplish the intended purpose of the disclosure. Parties involved in creating data sets may disagree on what constitutes the minimum necessary PHI, and the covered entity may need to expend resources to redact some PHI that is unnecessary. Another limitation is the Privacy Rule’s prohibition on the sale of PHI. Even if a disclosure is otherwise permissible under the Privacy Rule, a covered entity or business associate generally cannot receive remuneration (including non-financial remuneration like services or intellectual property rights) beyond the cost of preparing and transmitting the protected health information.
Finally, another challenge with the Privacy Rule is that it often serves as a convenient excuse to not share data. As articulated above, the Privacy Rule includes various permissions that covered entities potentially can utilize to disclose PHI, contributing to big data analytics. But doing so involves navigating various Privacy Rule criteria, taking on the legal risk that a regulator disagrees with the covered entity’s interpretation of the Privacy Rule, and risking the reputational harm associated with disclosing PHI for purposes unrelated to treatment. As a result, covered entities often default to stating that the Privacy Rule does not permit a disclosure of PHI, rather than undertaking the deeper analysis, and attendant risk, of navigating the Privacy Rule’s permissions.
The Part 2 Rule
While the Privacy Rule is generally treated as the primary federal law governing health information, 42 C.F.R. Part 2 (the Part 2 Rule) has actually been around for decades longer. The Part 2 Rule governs certain federally assisted “Part 2 programs” that provide substance use disorder (SUD) services. A “program” is: (1) an individual or entity, other than a general medical facility, that holds itself out as providing and provides SUD services; (2) an identified unit within a general medical facility that holds itself out as providing and provides SUD services; or (3) medical personnel or other staff in a general medical facility whose primary function is the provision of SUD services and who are identified as such a provider. What constitutes “federal assistance” is broadly defined to include tax-exempt status, reimbursement from federal health insurance programs, or federal licensure, certification, or registration. The Part 2 Rule governs individually identifiable records that originate from a Part 2 program and identify a patient as having or having had a SUD.
The Part 2 Rule includes more stringent requirements on SUD records than HIPAA, potentially curtailing the disclosure of such records for data analytics. The most relevant permissions under the Part 2 Rule are: (1) the Part 2 program or a “lawful holder” of Part 2 records may de-identify the information in the same manner as under HIPAA; or (2) the Part 2 Rule permits disclosures of SUD records for research without patient consent if: (i) the recipient is subject to HIPAA, the Common Rule governing protection of human subjects (45 C.F.R. part 46), or Food and Drug Administration regulations regarding human subjects (21 C.F.R. parts 50 and 56) or (ii) the disclosing Part 2 program or lawful holder of the records is subject to and complies with the Privacy Rule with respect to the disclosure. Absent those options, a Part 2 program or lawful holder of Part 2 records may need to exclude such SUD records from any disclosure of health data. Accordingly, conducting big data analytics involving SUDs can be particularly challenging.
Section 5 of the FTC Act
Section 5 of the FTC Act prohibits unfair and deceptive trade practices. The FTC interprets this as prohibiting practices that violate the privacy of individuals’ personal information. The FTC has brought enforcement actions against healthcare entities for violating their online privacy policies or other privacy statements, treating such practices as deceptive. The FTC also has brought enforcement actions where it alleged a privacy practice to be “unfair” because the act or practice causes or is likely to cause substantial injury to consumers that consumers cannot readily avoid themselves and that is not outweighed by countervailing benefits to consumers or competition. The FTC does not have jurisdiction over genuine non-profit entities under Section 5.
Historically, the FTC’s Section 5 enforcement actions in the healthcare space have involved issues such as improper disposal of PHI, allegations of lax security, or alleged disclosure of personal information collected from healthcare websites to third-party advertising platforms. The FTC has not brought any enforcement actions related to large disclosures of health data for big data analytics. Nevertheless, for-profit entities involved in disclosing health data should be very cognizant of the FTC’s authority and ensure that their disclosures are fully consistent with their privacy policies and do not cause any unfair harm to consumers.
The FTC’s Health Breach Notification Rule
Finally, the FTC has a breach notification rule governing personal health records. The FTC refers to this rule as the Health Breach Notification Rule (HBNR). The HBNR was promulgated in 2009. In September 2021, the FTC issued a policy statement clarifying the scope of the HBNR in two particular ways. First, the FTC clarified the scope of “personal health records,” broadly interpreting the definition to encompass nearly any health and wellness application. Second, the FTC clarified that a “breach” for purposes of the HBNR is not limited to external cyber attacks but also includes any use or disclosure of personal health records without users’ authorizations. After this policy statement and over a decade without enforcement of the HBNR, the FTC brought two enforcement actions under the HBNR in 2023.
As the federal government has pushed for increased interoperability and greater access to health data through APIs, there is significant potential to aggregate health data through collection and disclosure by consumers and consumer applications. Pursuant to the FTC’s broad interpretation of the HBNR, however, consumer application developers must ensure that they are transparent about their disclosure practices and have users’ authorizations for disclosures of their health data. Otherwise, the FTC will treat their disclosures as reportable breaches and may impose penalties for failures to notify users of such breaches.
State Privacy Laws
In addition to federal laws governing privacy of health data, each state has its own unique set of privacy laws that potentially limit the disclosure and aggregation of health data. These laws can generally fall under five categories.
First, a small but growing number of states have enacted comprehensive data privacy laws. The leading example of this type of law is the California Consumer Privacy Act (CCPA). Since California enacted CCPA, other states, such as Colorado, Connecticut, Indiana, Iowa, Utah, and Virginia, have similarly passed comprehensive privacy laws. These state laws also include exemptions for certain types of health information, such as PHI governed by HIPAA. But the details of these exemptions can vary. For example, the Utah Consumer Privacy Act exempts PHI generally, while CCPA only exempts PHI that is collected by a covered entity or business associate. These laws can impact the disclosure of health data, especially when such health data is outside of the governance of HIPAA. Additionally, CCPA is unique in requiring certain contractual provisions for the sale or licensing of de-identified health data that originated from PHI.
Second, many states have medical privacy laws governing the disclosure (and sometimes use) of health information. An example of this is the California Confidentiality of Medical Information Act. These laws may be more stringent than HIPAA with respect to limiting the extent to which healthcare providers may disclose health data for purposes of data analytics in areas such as research. Some of the state laws may extend beyond healthcare providers, potentially governing recipients of health data.
Third, almost every state has laws governing certain sensitive conditions or treatments, such as HIV status, genetic information, and alcohol and drug abuse information. These laws tend to require authorizations and, unlike HIPAA and the more general state medical privacy laws, usually do not include many exceptions to patient authorization requirements.
Fourth, every state has a breach notification law. These laws do not address when health data may be disclosed, but may require breach notifications to affected individuals and regulators if an entity finds that health data was improperly disclosed.
Finally, Washington state recently passed a rather unique law, the My Health My Data Act. This law provides numerous privacy rights to consumers with respect to “consumer health data,” which is broadly defined. PHI is exempt. Generally, a consumer’s consent is required to collect or disclose consumer health data for purposes that are not necessary to provide a product or service that the consumer has requested.
This complex patchwork of state privacy laws constitutes a substantial challenge to amassing large quantities of health data for research and analytics. Entities must review each state’s laws and may need to exclude certain categories of identifiable health data for which authorizations are required.
Big Data Projects
The vast majority of entities in the healthcare sector that own robust data sets (data owners) are now or will soon face questions and projects related to the research and commercialization of health data, much of it identifiable to specific individuals, which may include consumers of all types, such as patients, their family members, beneficiaries, research subjects, employees, and others. Many of these questions and projects relate to important human subjects research, healthcare sector patient care and utilization improvements, unique research and development (R&D), workforce efficiencies, population health, and many other critical healthcare issues. However, all of these questions and projects create significant legal risk for the data owners involved, which may also implicate their vendors or create liability for those vendors as well.
Data Privacy and Security Violations and Enforcement
The HHS Office for Civil Rights, FTC, and State Attorneys General all have (and have exercised) authority to enforce privacy rules and promises made to individual consumers (data subjects), including as discussed above. These regulators regularly settle cases related to breaches involving identifiable consumer information and to alleged violations of state and federal data privacy and security requirements involving advertising, marketing, sale, and other improper or prohibited uses and disclosures of data subjects’ identifiable information. For example, both state and federal regulators take the position that the disclosure of identifiable consumer information without advance written consent in exchange for any direct remuneration (money) or indirect remuneration (other goods and services, including favorable license terms and website analytics) constitutes a sale of such identifiable information, which is prohibited by state and federal law. As such, entities developing these types of projects should be acutely aware of any risks involved with using or disclosing identifiable consumer information as part of such projects.
Upstream Contractual Breaches
Most contracts drafted or revised in the last few months or years contain provisions addressing allowable uses and disclosures of identifiable information, along with licenses related to such information from the data owners. For example, in the healthcare sector, most health insurance payor program participation contracts include significant limitations on data aggregation and de-identification or anonymization of identifiable information of beneficiaries covered by such contracts. As such, downstream secondary uses and disclosures of data can run afoul of upstream contractual limitations, leading to disputes and damages under the contracts at issue, as well as to breaches of such information or other regulatory inquiries related to such information.
Increased Risk of Data Breach
Given that these questions and projects usually involve large data sets or other sources of aggregated information, such as “data lakes” or “data warehouses,” the entities maintaining these sources of aggregated information may become targets for both insider and external threat actors or may be vulnerable in other ways. For example, information used for all of the types of projects discussed above is extremely valuable and may be stolen by criminals in a variety of ways. Additionally, given that many of the projects utilizing these sources of information involve emerging technologies and R&D related to all types of software applications and devices, human error during the development or implementation of such projects often results in data privacy and security incidents. Any privacy or security incident could result in exposure or theft of all of the aggregated information involved in any project, with serious consequences under the state and federal data breach requirements discussed above, as well as exposure to class action lawsuits based on state law consumer protection and data breach claims.
Reputational Damage
Would your grandmother approve? While many entities undertake projects involving consumer identifiable information or anonymized information after rigorous legal analysis, some of these projects may still look improper to an outsider, including a consumer whose information may be involved in the project, a consumer who does not understand the scope or purposes of the project, or a consumer who believes their consent should be required for any uses or disclosures of their information, identifiable or otherwise. At the end of the day, even where no law or regulation is violated, the court of public opinion may frown upon questionable or unethical uses and disclosures of consumer information, both identifiable and anonymized.
Conclusion
Again, the bigger the data, the higher the risks. New interoperability tools create exciting new opportunities for aggregation and analysis of big data sets that can lead to unprecedented improvements in healthcare. Attorneys negotiating or reviewing transactions involving healthcare data, however, must be familiar not only with business goals and project details, but also with a complex patchwork of federal and state privacy laws governing the data involved. While these complex laws can be navigated, doing so requires carefully analyzing the interaction of business needs with a continuously growing number of federal and state laws. Given all of these issues, a team approach to these projects, questions, and answers may result in the best outcome for all.