Public health's 5 big data hurdles

By Kate Spies
08:51 AM

Public health entities are inevitably sitting on massive data sets. Growing archives of stored patient records, population reports, and lab results are thrusting data volume measures into the petabyte scale.

Agencies, on average, currently store data that could require more than "20 million four-drawer filing cabinets filled with text," according to MeriTalk's recent report, "The Big Data Gap."

The sheer volume of big data needs no clarification, but its significance does: as health entities work to implement EHRs, convert to ICD-10, and reach meaningful use, the importance of grappling with big data must be defined.

Amid the growing list of projects facing the public health sector, what are big data's challenges, and what are its benefits?

Here are our top five obstacles:

1. The health industry lags behind on big data
Chris Belmont, CIO and vice president of Ochsner Health System, a non-profit delivery system located in southeast Louisiana, pointed to a cultural reliance on paper as chief among the reasons healthcare is lagging other industries.

“We have a culture of being more retrospective and reactive: as in, ‘give me a report and I’ll act on it;’ that’s so 1990’s, and before… So I guess our culture still is, ‘give me a two-dimensional piece of paper with data on it, and I’ll find out things.’”

With some entities essentially addicted to paper, and others engaging with EMRs, fluency across the industry is interrupted, Belmont said.

[Related: Tapping big data for early indications of preventable conditions.]

EHR use has increased from about 20 percent a few years ago to nearly 60 percent today, but "it's really still a paper-based system," said Roger Foster, senior director of DRC's high performance technologies group, which provides technology solutions to government programs.

Other industries, Foster explained, have maximized, or are working to maximize, the potential of stored data. He pointed to NOAA, which has been working to coordinate data across sites to outline weather trends and climate patterns.

2. Mobilizing data to reach across systems
Fluency doesn't always extend across an organization itself; there is often disparity among the data collection methods within a single organization.

“There’s such a variation of how data is collected,” Foster said, “it’s hard to drive those methods to a meaningful outcome.”

Belmont and Foster agreed that harnessing big data can help reach that end. And Ochsner Health System can attest. The organization has been working to mobilize big data using a platform engineered by Informatica, a data integration company.

One of the main reasons for mobilizing their data, Belmont said, is Ochsner’s need for communication across internal systems. “We have a total of about 225 information systems that we support here at Ochsner. And all of these systems, more or less, keep their data in islands,” Belmont explained. “The real value is not necessarily reporting out of those individualized transactional systems, it’s the ability to aggregate and correlate that data horizontally across those organizations.”

[See also: With eye on public health, Delaware, Michigan roll out clouds.]

IBM's global healthcare ambassador Lorraine Fernandes expanded on the differences in data forms, citing the growth of unstructured data as another reason public health organizations ought to mobilize, the sooner the better. IBM has been providing clients such as Harvard Medical School and WellPoint with big data analytics technology.

“With big data you have the ability to analyze that unstructured data,” Fernandes said, including reports from clinic equipment, telehealth devices, and home health monitors. “The reality is there’s a tremendous amount of data that is unstructured. There is a variety of data in healthcare, while you see that in all industries, it takes many different forms in healthcare.”

As Belmont echoed, reining in big data will generate fluidity across a system's varying data forms.


3. Big data and meaningful use
What's more, this communication aligns with another health IT initiative, meaningful use, according to Ochsner project architect Jonathan Stevenson.

The public health sector will come to be dominated by the EHR, he said. “In IT, our core systems are our electronic medical records. How we can improve the experience in the electronic medical record for both our providers and our patients, as well as provide the predictive analytics, is going to be core and critical to differentiating ourselves in the future.”

And if a system like Ochsner has its sights set on incentives, Belmont said, “we have to show meaningful use and that’s going to require a lot of data collaborated from a lot of different systems.”

[See also: 4 tips for setting an enterprise information management strategy.]

The shift demands promptness, too. Belmont is looking to ward off the penalties threatened if Stage 1 isn't met by streamlining Ochsner's EMR and mobilizing stored data. "A lot of information does not exist in one of the core platforms," he said. "So people are going to have to create big data structures so that they can do this [meaningful use] reporting. Remember that meaningful use is not just during the incentive period."

4. Big data and cost reduction
Beyond aligning with meaningful use goals, the mobilization of big data could result in marked cost reductions for public health entities, DRC’s Foster said.

“Ultimately, it’s about understanding the drivers behind cost and quality,” Foster continued. “If you actually understand all the data in your system, you can use that to measure outcomes to improve quality, and reduce cost.”

The predictive analytics tied to big data mastery can drive care improvement as well, Ochsner CIO Belmont added. And that quality increase can save money.

“If I can keep you from coming into the hospital, I therefore don’t incur cost, and now I have a bigger margin,” Belmont said. “So if you’re paying me to take care of you, and you don’t come to the hospital, I can make more money on that.”

Tapping into big data can "improve quality and reduce cost," IBM's Fernandes agreed. "We truly can control costs at the same time we're trying to improve patient care."

5. Big data and the changing healthcare model

This potential seesaw relationship between quality and cost – one goes up, the other goes down – mirrors the evolving model of public healthcare.

"Healthcare in itself is changing," Ochsner's Stevenson said. "It's morphing into a different market essentially, where we're going to be held responsible not just for the volume of patients that we manage but also for the actual outcomes relative to the volume of patients."

That means health entities like Ochsner will "be compensated for taking care of a population," Belmont explained, "not compensated for each visit that a patient has, so that's a different change in our model."

Stevenson thinks the key to matching this model is big data. Enhancing patient care demands access to population data, predictive metrics, patient records, and information from care devices – and, above all, the ability to integrate this diverse data across an organization.

"The magic in making that successful for any organization," Stevenson said, "is the data."

Weighing the decision
But is harnessing big data worth the cost and effort? Worth breaking through the gridlock of siloed data, shedding paper, streamlining industry systems to EHRs, and shifting the retrospective view of the medical culture?

[Q&A: How Ochsner is 'Amazon-izing' itself with big data.]

Mobilization will take a lot of work, Belmont admitted, especially when it comes to shifting the culture and educating providers about the technology. Foster cited the variation in data collection methods as a major hurdle. And Fernandes pointed to the sheer volume of big data as something to overcome, as well as the disparities between structured and unstructured data.

Indeed, there are definite challenges tied to big data, ones as hard to ignore as its elephantine, petabyte-scale presence in the health industry. Working through the challenges of harnessing big data, however, will allow a focus on patient care, clear cost reductions, and the ability to meet the changing demands of the healthcare industry.

“So why are we undertaking this strategy?” Stevenson asked. “It’s because we have to, not because we really have any choice relative to the way the market is moving.”