“The average hospital generates 665 TB (of data) a year.”
So said John Quackenbush, a professor of Biostatistics and Computational Biology at the Dana Farber Cancer Institute and Harvard School of Public Health, at the recent HIMSS and Healthcare IT News Big Data & Healthcare Analytics Forum.
“We’re awash in data,” he added. “The challenge is what to do with it.”
But while the healthcare industry is amassing more data than at any time in history, much of it is running on top of legacy technology. Vik Nagjee, CTO of Pure Storage, said many hospitals still operate on legacy IT, and he pointed to one integrated delivery network that maintains a staggering 18,000 applications.
With that in mind, Nagjee said hospitals should evaluate where they stand today in order to move forward into the world of big data, analytics, AI and machine learning.
“Look at any other industry,” Nagjee said. “Finance is risk-averse and they figured out ways to not have such fragile infrastructures and applications. We need to come together to make this happen in healthcare.”
Harvard’s Quackenbush, meanwhile, said healthcare organizations have to focus on delivering the right data, including current information about the state of a patient in front of clinicians as well as outcomes data.
And he cautioned against the common misconception that simply throwing a lot of shiny new technology at the problem will solve it, because biology and healthcare are hard.
Moreover, hospitals have to overcome the challenge of integrating data from a variety of existing and emerging sources -- and today’s crop of electronic health records can be problematic.
“EHRs are not designed to be a strategic repository to drive better care, they’re designed to optimize billing,” said Adrian Zai, MD, research director of Partners eCare. “How to connect external data? We all know EHRs are not good at it. Finding ways for all data to work together is one of the challenges.”