
Privacy of medical records and patient information: Big Data, AI, expanded EHR Safe Harbors and consumer privacy

There is no doubt that medical and other personal information is more exposed to cybersecurity threats than ever. When the Office of Inspector General (OIG) announced new proposed amendments to the existing Electronic Health Records (EHR) Safe Harbors in October 2019, perhaps OIG and the Centers for Medicare and Medicaid Services (CMS) were simply acknowledging the serious privacy and cybersecurity challenges lurking in the vast scope of ostensibly legitimate sharing of PHI. Various state legislatures also are enacting far-reaching cybersecurity and privacy laws.

Electronic Health Records

By way of background, OIG first adopted the Safe Harbor for the donation of EHR software and training by hospitals to physicians in 2005, but placed limitations intended to discourage inducement or remuneration for referrals by requiring that physicians pay at least 15% of the cost and establishing a sunset provision, i.e., stating that the EHR Safe Harbor would expire Dec. 31, 2013. Apparently, OIG thought the problem would be resolved by that time. However, in 2013, OIG extended the safe harbor until 2021.

OIG has apparently abandoned the misguided assumption that the need for better digital infrastructure would ever expire. In the face of ever-increasing privacy and cybersecurity threats, it has released a proposal not only to eliminate the sunset date entirely, but also to eliminate the 15% cost-sharing requirement and to include cybersecurity software in the deal.

Breaches and piracy of medical records and all kinds of personal data are at an all-time high. Modern Healthcare reported Jan. 27, 2020, that hospitals are the fourth-most common target for ransomware attacks. The Office for Civil Rights (OCR), which enforces HIPAA, has reported that 2019 was another record-breaking year for PHI data breaches, with more than 400 breaches each affecting more than 500 records. When OIG announced proposed rules to add cybersecurity technology services and software to the EHR exception, it acknowledged that the price of cybersecurity technology and related services has increased dramatically, to the point where individual providers no longer have the resources to fight this battle.

In an interesting juxtaposition regarding the privacy of PHI, the Wall Street Journal reported several new deals in January 2020 in which hospitals granted Microsoft, IBM and Amazon access to identifiable patient health information (PHI) in ventures designed to allow Big Data to crunch millions of healthcare records, ostensibly in the name of quality assurance, research and improving diagnostics via Artificial Intelligence (AI). Meanwhile, contemporaneously with reports that Google is exploring similar deals with the Mayo Clinic and Ascension Healthcare, the U.S. Department of Justice announced in February 2020 the expansion of its antitrust inquiry into how Google is using its online tools, and the information collected from those tools, to consolidate its market power and leverage. Across the pond, the EU is expanding its antitrust inquiry into Facebook's use of user data to stifle competition.

Are these deals exceptions to the HIPAA privacy rules, which we know allow the minimum necessary disclosure for purposes of "treatment, payment and operations"? Or is the integration of all of these healthcare records, making them accessible not only for patient care but for research, itself part of Treatment, Payment and Healthcare Operations (TPO)? Will these giant systems simply rely upon the TPO exception, or will the HIPAA privacy disclosure that all patients are required to sign now, but rarely read, include Big Data disclosures? Is the creation of artificial intelligence software, which requires the entry of huge amounts of data to "teach" the software systems to perform diagnoses, part of TPO?

Concurrently, the Office of the National Coordinator for Health Information Technology (ONC), a department of HHS, proposed a rule March 4, 2019, to implement certain provisions of the 21st Century Cures Act (Cures Act) designed to advance interoperability; support the access, exchange and use of electronic health information; and make patients' electronic health information (EHI) more electronically accessible through the adoption of standards and certifications for mobile digital applications (apps). The proposed regulations are being studied by the White House. The major app makers, i.e., Google, Apple, Microsoft, etc., the very industry giants seeking the access deals mentioned herein, believe interoperable health information apps should be as easily loaded as any other mobile app, but many regulators are concerned about the privacy and security of this data.

One of the critical issues is interoperability, and whether an app developer can program restrictions into an app that prohibit the sharing of that information through other systems. Such restrictions are fairly common in commercial apps, which do not contain PHI and therefore do not interfere with a patient's management of their own healthcare, or with management by or sharing with other systems. In the healthcare context, however, that commercial practice is viewed as incompatible with the idea of improving healthcare delivery through the use of mobile apps.

Consumer privacy

Partially in response to the increased use and monetization of customer data, state legislatures are imposing substantial new privacy and security obligations for customer data. For example, the California Consumer Privacy Act (CCPA), which came into effect Jan. 1, 2020, significantly expands the right of California residents to know exactly who has received their personal information and the purposes for which their personal information is used. Not only must businesses delete personal information at the consumer's request, they also must ensure that the personal information is subject to "reasonable security" procedures to avoid its inadvertent disclosure or misuse. Perhaps purposefully, the CCPA does not define what qualifies as "reasonable security." Although adopting such a vague requirement leaves many businesses uncertain about their legal obligations, it also allows the standard of care for consumer data to evolve based on what is "reasonable" according to current technology; security measures that might seem technologically difficult or cost prohibitive in 2020 may very well become mainstream in a few years.

Though California is leading the way in the United States, the CCPA contains many of the same data privacy rights and security obligations as the EU's General Data Protection Regulation (GDPR), which came into force in 2018. Other state legislatures, including in New York, Washington, Texas and Nevada, have proposed laws similar to the CCPA and GDPR. It is almost certain that, within a few years, all businesses that retain and use data concerning any resident of the United States or Europe will be subject to substantially greater legal and regulatory obligations regarding the use, disclosure and security of personal information. Of course, the volume of data being collected and shared also will grow and will be increasingly vulnerable to misuse and security breaches.


The release of the proposed EHR Safe Harbor, although a legal development, is just the proverbial tip of the iceberg in this issue. As Scott McNealy, then CEO of Sun Microsystems, put it in 1999, "You have zero privacy anyway. Get over it." At about the same time, Oracle CEO Larry Ellison observed, "The privacy you are concerned about is largely an illusion. All you have to give up is your illusions, not any of your privacy."
