
Editorial: Google’s health information grab is another example of privacy malpractice

[Photo: A Google sign on the company’s campus in Mountain View, Calif., Sept. 24, 2019. Google is working with the large health care system Ascension in a partnership intended to use artificial intelligence to find patterns that could help doctors, but some are concerned about privacy and protecting patients’ sensitive health information. (AP Photo/Jeff Chiu, File)]

Published: 11/20/2019 10:10:14 PM

The Wall Street Journal reported recently that Google has embarked on a project to collect and crunch the detailed personal health information of some 50 million Americans. What could possibly go wrong?

We’ll get to that shortly, but first the alarming details. The effort was begun last year in partnership with St. Louis-based Ascension, the nation’s second-largest health system with 2,600 hospitals, doctors’ offices, nursing homes and other facilities in 21 states. The Journal reports that the information being collected includes names and dates of birth; illnesses diagnosed; results of lab tests; medication and hospitalization records; and certain billing claims.

Remarkably, neither patients nor their doctors were informed of the arrangement. Even more remarkably, privacy experts told the Journal that the effort appears to pass HIPAA muster. The Health Insurance Portability and Accountability Act of 1996, which contains extensive privacy and security protections, allows the sharing of patient data with business partners without having to inform patients if it is used only to help the health system “carry out its health-care functions.”

The goal of the initiative, according to company information reviewed by the Journal, is to create an entire personal health record on Google’s cloud-computing system and then use artificial intelligence to ask questions and provide answers about a given treatment plan, suggest changes of treatment and predict the outcome of procedures and medications.

This effort is code-named Project Nightingale. Project Nightmare is more like it.

The first of several problems is that the project was conceived and implemented in secrecy and confirmed only after the Journal reported on it. It is certainly an abuse of privacy to share personally identifiable health records — among the most intimate information that exists about anyone — with a third party without the patient’s knowledge. If this is indeed permitted under federal law, then the law should be changed to require notification and to permit patients to opt out.

Both health care providers and patients should worry that Project Nightingale’s recommendations will override the judgment of the clinicians who are supposed to be in charge of patient care. Presumably clinicians would retain nominal authority to diagnose and treat as they see fit, but the pressure to conform, especially if their compensation is tied to use of the Nightingale platform, could be overwhelming. And who can guarantee that AI is up to the task in life-and-death situations? Is Project Nightingale going to carry malpractice insurance?

For its part, Ascension, a nonprofit Catholic health care system, seeks to improve patient care, but also hopes to use the data to generate more revenue, for instance by zeroing in on more tests that might be necessary in a given case, according to the Journal. For Google, the aim is to create the template for a lucrative tool that can be sold to other health systems. In other words, both partners have a serious financial interest in patient data — and financial interest, as has been demonstrated repeatedly, is not always synonymous with best interest.

The gravest concern, of course, is that privacy could be breached. At least 150 Google employees already have access to much of the information. And there’s always the danger that the data could be hacked and patient information compromised. Moreover, as Sen. Mark Warner, D-Va., has pointed out, Google is already in trouble for serious violations of privacy and security, having agreed in September to pay a $170 million fine after investigations into complaints that its YouTube video platform illegally gathered data on children to target ads to them.

It is also relevant that Google has made a $2.1 billion deal to acquire Fitbit Inc., maker of watches that track wearers’ health information, and in September inked a 10-year deal to store the medical, genetic and financial information of the Mayo Clinic hospital system. This, combined with Project Nightingale, suggests that the company is staking a claim to a big segment of the health information market, something that should draw the scrutiny of antitrust lawyers.

Taken together, these factors strongly suggest that federal regulators should immediately halt Project Nightingale, at least until they can satisfy themselves and Congress that robust privacy protections are in place and that allowing a tech giant to gain a stranglehold on personal health data is in the best interests of patients and the nation as a whole. That’s an appropriately high bar when the stakes are this great.
