Traces of malicious code in the registry were discovered in July. According to the article, the code was believed to have been left in 2007. Approximately 46,000 of the 160,000 total entries were coded in a way that does not reveal personal health information. While that coding is standard procedure now, the other 114,000 entries may have been compromised. University officials notified those patients in the last week of September.
The notification delay was due to the time it took to go through the Carolina Mammography Registry, which is one of the largest data warehouses for breast cancer research, according to officials. Paul Molina, MD, the vice chair of the radiology department at UNC, was quoted as saying, "Contacting patients unnecessarily was something we wanted to avoid."
The story goes on, but I want to stop here for a moment. What if, during the months it took the university to go through the registry, personal health information had been compromised and one of the participants had been victimized by the breach? Three words: Public Relations Nightmare. Is it better to contact patients first and alert them that there may be - emphasize "may be" - a potential privacy breach? If I were one of those patients, I'd rather be alerted and then told later that all is well than not be told at all and read about the potential breach in the media.
EHR adoption hinges on acceptance by both patients and physicians. Gain patient trust by being upfront soon after a breach - even a potential breach - is discovered. Remember when Odwalla recalled its products containing apple juice in 1996 after traces of E. coli bacteria were found? Soon after health officials found the link, Odwalla announced the nationwide recall for the "safety and health" of its customers. That case has come to be known as a model case study in crisis management. The healthcare industry should take note.
Back to the UNC story. There was some discussion of whether university servers are more at risk of security breaches because they're decentralized. John Travis, a senior director at Cerner, was quoted as saying that what's critical is the way the data is spread across multiple systems and the security of those systems. Fair enough. He called for hospitals to conduct regular audits of who has access and who saw what, when. If you think this should already be standard procedure, well, guess again. Hospitals and health systems that don't have these audits in place should start making plans to do so.