"We're being swamped by a tsunami of data."
That rather frank assessment of the challenge facing enterprises across the economy, including in healthcare, comes from Jim D'Arezzo, CEO of storage software provider Condusiv Technologies, in a recent interview with TDWI.
"Data center consolidation and updating is a challenge," he continued. "There are issues such as data access speeds, compatibility with previous architectures, replication and backup, and cost of data storage. We run into cases where organizations do consolidation on a 'forklift' upgrade basis, simply dumping new storage and hardware into the system as a solution. Shortly thereafter, they often realize that performance has degraded. A bottleneck has been created that needs to be handled with optimization."
To be sure, D’Arezzo isn’t speaking about healthcare per se, but few IT managers would deny the applicability of his observations. Indeed, D’Arezzo’s comments about AI and machine learning could come directly from any number of recent reports on the storage challenges related to new technologies.
"The thing about AI is that it requires massive amounts of data, compute power, and I/O," he says. "This is due to the amount of data necessary to have a proper AI solution. AI also needs speedy access to sufficient compute resources and has very intensive I/O."
Along with D'Arezzo, Mark Gaydos of Nlyte Software offered the interviewer, tech writer Brian J. Dooley, a more generalized perspective, but his observations are equally applicable to healthcare.
"Ultimately you need to aggregate data from various systems in varying formats and then normalize that data into information that is actionable," he says. "You then need to tie this information into where workloads are running so you have a physical-to-virtual-to-logical view of your data center. Only then can you begin to make informed decisions about consolidation."
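The aggregate-and-normalize step Gaydos describes can be illustrated with a minimal sketch. The feeds, field names, and units below are hypothetical, standing in for the "various systems in varying formats" he mentions; the point is simply mapping heterogeneous records onto one common schema before any consolidation analysis.

```python
import csv
import io
import json

# Hypothetical sample feeds in two different formats: one system
# exports CSV with usage in GB, another exports JSON with usage in MB.
csv_feed = "host,used_gb\nsrv-01,120\nsrv-02,310\n"
json_feed = '[{"server": "srv-03", "storage_used_mb": 45000}]'

def normalize(record, host_key, used_key, to_gb=1.0):
    """Map one raw record onto the common schema (host, used_gb)."""
    return {"host": record[host_key],
            "used_gb": float(record[used_key]) * to_gb}

def aggregate(csv_text, json_text):
    """Pull records from both feeds and normalize them into one list."""
    rows = [normalize(r, "host", "used_gb")
            for r in csv.DictReader(io.StringIO(csv_text))]
    rows += [normalize(r, "server", "storage_used_mb", to_gb=1 / 1024)
             for r in json.loads(json_text)]
    return rows

inventory = aggregate(csv_feed, json_feed)
```

In a real data center the inputs would come from monitoring and DCIM systems rather than inline strings, but the shape of the problem is the same: per-source field mappings and unit conversions feeding a single actionable view.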
In short, Dooley sums up the challenge to IT infrastructures across the economy: "With the numerous considerations of storage and I/O, it is clear that organizations need to carefully study the effects of big data, advanced analytics, and artificial intelligence on infrastructure choices. Without sufficient infrastructure in place, the system will be unable to efficiently cope with the fire hose of data, and bottlenecks will emerge that could affect other systems."