Google, University of Chicago named in suit charging misuse of patient data

The class action complaint alleges that, although the records were deidentified, Google's expertise in data mining and AI makes it "uniquely able to determine the identity" of the patients whose medical records the university shared with it.
By Nathan Eddy
04:03 PM

Google, the University of Chicago Medical Center and University of Chicago are listed as defendants in a class action suit that alleges they failed to properly de-identify sensitive patient medical data.

WHY IT MATTERS
The complaint, filed by Matt Dinerstein in the U.S. District Court for the Northern District of Illinois, claims UChicago "promised in its patient admission forms that it would not disclose patients' records to third parties, like Google, for commercial purposes."

Instead, the university "did not notify its patients, let alone obtain their express consent, before turning over their confidential medical records to Google for its own commercial gain," the document states.

The suit alleges that "Google and the university claimed the medical records were de-identified. But that’s incredibly misleading. The records the University provided Google included detailed datestamps and copious free-text notes."

Google's expertise in data mining and artificial intelligence, Dinerstein charges, means it is "uniquely able to determine the identity of almost every medical record the university released."

In addition to seeking monetary compensation, the suit calls for an injunction requiring the University of Chicago to comply with all HIPAA de-identification regulations and enjoining the organization from disclosing identifiable patient medical records to third parties without first obtaining consent.

It also calls for an injunction prohibiting Google from using patient records obtained from U of C and an order requiring Google to delete all patient records received from the university.

Since electronic health records contain highly sensitive and detailed information, revealing not only a patient's height, weight and vital signs but also whether they suffer from certain diseases or have undergone particular medical procedures, the university's release of EHR data violates HIPAA, Dinerstein's suit alleges.

"The personal medical information obtained by Google is the most sensitive and intimate information in an individual's life, and its unauthorized disclosure is far more damaging to an individual's privacy," the lawsuit states.

THE LARGER TREND
The use of de-identified patient data has been common practice for years, but it hasn't been without scrutiny.

Back in 2010, the Office of the National Coordinator for Health IT launched a study on how to manage the privacy risks of using health information that had been stripped of personal identifiers.

Even as researchers and technology developers noted that such de-identified data is a must-have for population health and other purposes, ONC was seeking "consensus on what risk we can tolerate for identification and then what level of removal, what kinds of removal of information, are required to get to that level of risk," then-National Coordinator Dr. David Blumenthal told Congress.

More recently, de-identified data has become essential to the training and development of new artificial intelligence algorithms that are impacting every corner of healthcare. Among them are technologies from Google's DeepMind, which the class action suit names as one way the company could more easily "find connections between various data points" and compromise privacy, even with de-identified data.

In 2017, Healthcare IT News reported on a DeepMind initiative, Verifiable Data Audit, that was exploring a blockchain-like service that "could give mathematical assurance about what is happening with each individual piece of personal data, without possibility of falsification or omission."

The goal was to give providers and patients real-time insight into where and how data is being used.

"For example, an organization holding health data can’t simply decide to start carrying out research on patient records being used to provide care, or repurpose a research dataset for some other unapproved use," according to DeepMind.

ON THE RECORD
"We believe our healthcare research could help save lives in the future, which is why we take privacy seriously and follow all relevant rules and regulations in our handling of health data," said Google in a statement responding to the lawsuit.

"In particular, we take compliance with HIPAA seriously, including in the receipt and use of the limited data set provided by the University of Chicago."

Nathan Eddy is a healthcare and technology freelancer based in Berlin.
Email the writer: nathaneddy@gmail.com
Twitter: @dropdeaded209