
Analytics adds new performance metrics to flash assessments

Beyond costs and speeds and feeds, though, IT innovators are beginning to discover a critical new set of attributes they can use to manage their data more strategically than ever before.

Jeff Rowe | Nov 15, 2017 02:18 pm

In healthcare, as in most sectors of the economy, speed matters, particularly when it comes to how quickly a provider can access data relating to a particular patient. But while that concern for speed, writ large, has driven many decisions about how best to store data, some stakeholders are pointing to a new array of metrics that organizations should consider factoring into their IT decisions.

Writing recently at Network World, Lance L. Smith, CEO of Primary Data, a cloud and data migration provider, notes that the “critical importance of storage performance led us all to fixate on latency and how to minimize it through intelligent architectures and new technologies.”

That importance isn’t going to disappear, Smith says, especially given the popularity of flash memory in storage, but, in his view, a number of other metrics are rapidly rising in importance to IT teams. 

“IT leaders are recognizing the importance of having intelligence about their data,” he explains, “so that they can create informed strategies rather than blindly throwing speeds and feeds into their infrastructure. Metadata analytics can now tell us the importance of a file by analyzing when it was last accessed, when it was last changed, who accessed it and more – giving IT the intelligence needed to determine strategies for meeting data requirements without overprovisioning and overspending.”
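
To make that concrete, the sketch below is a purely illustrative example, not Primary Data's product; the directory path and the 30-day cutoff are assumptions. It shows how the kind of file metadata Smith mentions (last access, last modification, owner) can be gathered with standard tools and used to flag hot versus cold files:

```python
import time
from pathlib import Path

HOT_THRESHOLD_DAYS = 30  # assumed cutoff: files touched within 30 days count as "hot"

def scan_metadata(root):
    """Walk a directory tree and report the per-file metadata used for tiering decisions."""
    now = time.time()
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        st = path.stat()
        idle_days = (now - st.st_atime) / 86400  # days since last access
        yield {
            "file": str(path),
            "size_bytes": st.st_size,
            "last_accessed_days_ago": round(idle_days, 1),
            "last_modified": time.ctime(st.st_mtime),
            "owner_uid": st.st_uid,  # owner is a rough proxy; most file systems don't log who read a file
            "suggested_tier": "hot" if idle_days <= HOT_THRESHOLD_DAYS else "cold",
        }

if __name__ == "__main__":
    for record in scan_metadata("/data"):  # "/data" is a placeholder path
        print(record)
```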

One way of looking at the situation, he says, is that “applications and storage have long been oblivious to each other’s capabilities and needs. The majority, if not nearly all, of today’s enterprise applications do not know the attributes of the storage where its data resides. Applications cannot tell if the storage is fast or slow, premium or low cost. They are also unaware of storage’s proximity, and factors like network congestion between storage and the application server, which can significantly impact latency.”

At the same time, storage does not know what data is the most important to an application. “It only knows what was recently accessed, and uses that information to place data in caching tiers, which will increase performance if that same data happens to be accessed again.”
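
That recency-based placement behaves much like a classic least-recently-used cache. The toy model below is a hypothetical sketch of the mechanism, not how any particular storage array is implemented: recently read blocks stay in the fast tier, and the least recently used block is evicted when capacity runs out.

```python
from collections import OrderedDict

class LRUCacheTier:
    """Toy model of a recency-based caching tier: recently read blocks stay in
    fast storage, and the least recently used block is evicted at capacity."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.blocks = OrderedDict()  # insertion order doubles as recency order

    def read(self, block_id, fetch_from_backend):
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)  # cache hit: mark as most recently used
            return self.blocks[block_id]
        data = fetch_from_backend(block_id)    # cache miss: go to the slower backend
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)    # evict the least recently used block
        return data
```

The limitation Smith points to is visible in the model: eviction depends only on recency, not on how important a block actually is to the application.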

The answer? In his view, it’s metadata software. “Since metadata-intelligence software enables admins to see when files have been last opened, how often, by whom, when they were modified and more, admins can now manage their storage resources more efficiently. Cold data can move to archival tiers automatically and without disruption, while hot data is placed on storage systems that meet business needs for performance and price. It’s even possible to create policies that make use of this data and to automate data movement through scripts or software, freeing IT to focus on more strategic tasks.”
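
As a rough illustration of the kind of automated policy Smith describes, the sketch below is hypothetical; the mount points, the 90-day window, and the script itself are assumptions, not any vendor's software. It demotes files that have sat untouched past a policy window to an archive tier:

```python
import shutil
import time
from pathlib import Path

PRIMARY = Path("/data/primary")  # placeholder for a fast (flash) tier
ARCHIVE = Path("/data/archive")  # placeholder for a low-cost archival tier
COLD_AFTER_DAYS = 90             # assumed policy window

def apply_tiering_policy():
    """Demote files not accessed within the policy window to the archive tier."""
    now = time.time()
    for path in list(PRIMARY.rglob("*")):  # snapshot the listing before moving files
        if not path.is_file():
            continue
        idle_days = (now - path.stat().st_atime) / 86400
        if idle_days > COLD_AFTER_DAYS:
            dest = ARCHIVE / path.relative_to(PRIMARY)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))  # cold data moves down a tier
            print(f"archived {path} (idle {idle_days:.0f} days)")

if __name__ == "__main__":
    apply_tiering_policy()
```

A production policy would presumably also promote re-warmed data back to fast storage, but even this simple demotion loop captures the idea of letting metadata, rather than guesswork, drive placement.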

None of this, of course, is to suggest that storage decisions won’t remain a key consideration for IT managers. But as flash and other storage media present stakeholders with an expanding array of options, it will be increasingly important for managers to have as much information about their data as possible in order to know where, and how best, to store it.
