
As health data changes, data storage needs to change, too

New types of data, says one expert, are driving the development of different storage environments focused on capturing, storing and partially analyzing large amounts of data prior to transmission.

Jeff Rowe | Dec 20, 2017 12:00 am

Organizations of all types are finding new uses for new data as part of their overall digital transformations. In healthcare, that data may be coming from centralized sources such as EHRs, or it may be coming from the “edges” of the Internet of Things in a diverse array of formats.

Writing recently at Network World, tech writer Joan Wrabetz notes that what she calls “new data” is both “transactional and unstructured, publicly available and privately collected, and its value is derived from the ability to aggregate and analyze it. Loosely speaking we can divide this new data into two categories: big data – large aggregated data sets used for batch analytics – and fast data – data collected from many sources that is used to drive immediate decision making.”
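To make the distinction concrete, here is a minimal Python sketch (illustrative only, not code from Wrabetz’s article): the big-data path aggregates an already-stored data set in batch, while the fast-data path drives a decision from each event as it arrives. The readings and alert threshold are hypothetical.

    # Big data: batch analytics over a large, already-collected data set.
    def batch_average(readings):
        return sum(readings) / len(readings)

    # Fast data: drive an immediate decision from each incoming event.
    def on_event(reading, alert_threshold=120):
        if reading > alert_threshold:
            return "alert"   # e.g., flag for immediate clinical review
        return "ok"

    stored_heart_rates = [72, 75, 71, 118, 69]   # historical data set
    print(batch_average(stored_heart_rates))     # batch result: 81.0
    print(on_event(130))                         # immediate result: "alert"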

She’s not writing exclusively about healthcare, but her analysis certainly applies to the sector, and there’s little doubt “new data” is driving healthcare IT managers to seek new storage options.

As she sees the data landscape, several key challenges are reshaping how data is collected and managed.

First, she says, data capture is driving edge-to-core data center architectures. “New data is captured at the source . . . (and) the volume of data collected at the source will be several orders of magnitude higher than we are familiar with today.”
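One common answer to that volume problem is to condense data at the edge and ship only summaries to the core. The sketch below is a hypothetical illustration of the idea, not an architecture from the article: an edge node collapses a window of 1,000 raw readings into a single five-field record before transmission.

    import statistics

    # Hypothetical edge node: condense a window of raw readings into a
    # small summary record before sending it to the core data center.
    def summarize_window(raw_readings, device_id):
        return {
            "device": device_id,
            "count": len(raw_readings),
            "mean": statistics.mean(raw_readings),
            "min": min(raw_readings),
            "max": max(raw_readings),
        }

    window = [98.6, 98.7, 99.1, 98.5] * 250   # 1,000 raw samples
    summary = summarize_window(window, device_id="monitor-42")
    # 1,000 raw values reduced to one record before hitting the network.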

For example, “in the bioinformatics space, data is exploding at the source. In the case of mammography, the systems that capture those images are moving from two-dimensional images to three-dimensional images.” She points out that the shift from 2-D to 3-D calls for dramatically greater storage capacity.
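A back-of-the-envelope calculation shows the effect. The pixel dimensions, bit depth, and slice count below are assumed round numbers for illustration, not figures from the article:

    # Assumed, illustrative parameters -- real scanners vary.
    width, height = 4096, 3328    # pixels per image
    bytes_per_pixel = 2           # 16-bit grayscale
    slices_3d = 60                # slices in one 3-D tomosynthesis volume

    size_2d = width * height * bytes_per_pixel   # one 2-D image
    size_3d = size_2d * slices_3d                # one 3-D volume

    print(f"2-D image:  {size_2d / 1e6:.0f} MB")    # ~27 MB
    print(f"3-D volume: {size_3d / 1e9:.1f} GB")    # ~1.6 GB

Under these assumptions, a single 3-D study consumes roughly 60 times the storage of its 2-D counterpart, before compression or retention policies are even considered.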

Second, “data scale is driving data center automation: The scale of large cloud providers is already such that they must invest heavily in automation and intelligence for managing their infrastructures.”

Other factors impacting storage considerations are the increased value of data over a longer time frame, which translates into a need for more effective long-term storage, and the rapid rise of data analytics. “The compute intensive nature of analytics is driving many new ways to store and access data,” she points out, “from in-memory databases to 100 petabyte scale object stores.”
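One pattern that spectrum implies is tiering: keep recently touched data in memory for fast analytics and demote the rest to cheaper bulk storage. The sketch below is a toy illustration of such a policy; the “object store” here is just a stand-in dictionary, not a real backend.

    import time

    # Toy hot/cold tiering policy: hot records stay in memory; records
    # not accessed within HOT_TTL_SECONDS are demoted to bulk storage.
    HOT_TTL_SECONDS = 3600

    hot_tier = {}    # in-memory: {key: (value, last_access_timestamp)}
    cold_tier = {}   # stand-in for an object store

    def put(key, value):
        hot_tier[key] = (value, time.time())

    def get(key):
        if key in hot_tier:
            value, _ = hot_tier[key]
            hot_tier[key] = (value, time.time())   # refresh recency
            return value
        return cold_tier.get(key)                  # slower cold path

    def demote_cold_records():
        now = time.time()
        for key in list(hot_tier):
            value, last_access = hot_tier[key]
            if now - last_access > HOT_TTL_SECONDS:
                cold_tier[key] = value             # move to bulk storage
                del hot_tier[key]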

Given these and other changes, Wrabetz says, the bottom line is that “intelligent architectures need to develop that have an understanding of how to incrementally process the data while taking into account the tradeoffs of data size, transmission costs, and processing requirements.”
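What such an “intelligent architecture” might actually weigh can be sketched in a few lines. The cost model below is entirely hypothetical, with made-up rates; the point is only that the edge-versus-core decision turns on exactly the tradeoffs Wrabetz names: data size, transmission costs, and processing requirements.

    # Hypothetical cost model for the edge-vs-core tradeoff.
    # All rates are made-up, illustrative numbers.
    def plan_processing(raw_bytes, reduction_ratio,
                        transfer_cost_per_gb=0.09,   # assumed network rate
                        edge_cpu_cost_per_gb=0.02):  # assumed compute rate
        gb = raw_bytes / 1e9
        ship_raw = gb * transfer_cost_per_gb
        process_then_ship = (gb * edge_cpu_cost_per_gb
                             + gb * reduction_ratio * transfer_cost_per_gb)
        return "process at edge" if process_then_ship < ship_raw else "ship raw"

    # A 1.6 GB imaging volume that edge preprocessing shrinks to 10% of its size:
    print(plan_processing(1.6e9, reduction_ratio=0.10))   # -> process at edge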