By James D’Arezzo, CEO of Condusiv Technologies
Healthcare interoperability issues can’t be solved until IT performance is addressed.
The overburdened IT infrastructure is already struggling to meet current needs, including processing increasing amounts of data.
The proposed U.S. Core Data for Interoperability, mandated by the 2016 21st Century Cures Act, calls for the standardization of electronic health records, while the American Medical Informatics Association recommends focusing on data sharing first.
The real crisis is not interoperability, however—it’s lagging system throughput.
There’s no question that the healthcare industry has plenty of reason to be concerned with interoperability and standardization:
- Turf wars over proprietary interfaces and protocols are having a major impact on healthcare IT budgets.1
- Non-compatible electronic health records contribute significantly to the fact that computerized record-keeping consumes more than 50 percent of the average physician’s workday, which now stretches to more than 11 hours.2
- Healthcare organizations struggling to process this tsunami of data are frustrated by the number and variety of analytics tools they are forced to use.3
But interoperability and standardization cannot be effectively solved without first addressing systems management and capacity.
The advancements in data collection, analytics and electronic health records have created many challenges for the healthcare sector’s IT departments.
The sheer amount of healthcare data has skyrocketed from even a decade ago. Through advanced predictive analytics, this data can save lives by fostering the diagnosis, treatment and prevention of disease at a highly personalized level.
But to maximize the benefits all of this information can offer, healthcare organizations will need to make significant investments in data storage and infrastructure. With simple software fixes, many healthcare IT departments could free up half their bandwidth, in effect doubling the value of their IT budgets, by using the infrastructure already in place more efficiently.
Healthcare Data Demands
Healthcare institutions must comply with 629 different regulatory mandates across nine domains, costing the average community hospital between $7.6 million and $9 million. Much of that spending is associated with meaningful use requirements, the government standards for how patient records and data are stored and transmitted.
Due to the demands of healthcare record-keeping and continued advancements in medical technology, IT spending is rising exponentially.
Along with that, medical research and development is booming to the point that institutions can’t keep up with the amount of data that needs to be stored and analyzed. Pharmaceutical and healthcare systems developers are also affected by the gap between data acquisition and analysis. Life sciences companies are launching products faster and in a greater number of therapy areas.
This fast-paced technological evolution places even more pressure on healthcare IT departments to deliver both innovation and efficiency.
Performance degrades over time as the efficiency of input/output (I/O), the movement of data between the storage layer and the compute and presentation layers, declines. This degradation is particularly prevalent in the Windows environment. Fortunately, targeted software solutions exist that can improve system throughput by up to 50 percent without additional hardware.
If I/O drags, performance across the entire system slows. The impact falls hardest on systems running Microsoft SQL Server, one of the most widely used databases in the world, and the Windows operating system itself is notoriously inefficient with I/O.
In fact, I/O degradation is much more common than most organizations realize. More than a quarter of organizations surveyed last year reported that poor performance from I/O-heavy applications was slowing systems down.
Data analytics requires a computer system to access multiple databases, pulling information together through millions of individual I/O operations.
- Data analytics capabilities are dependent on IT system efficiency, which is dependent on the computer’s operating environment.
- The most widely used operating system, Microsoft Windows, is in many ways the least efficient.
- No matter the storage environment, Windows penalizes performance through inefficiencies in how the operating system hands data off to storage.
- The average Windows-based system pays a 30 to 40 percent penalty in overall throughput capability due to I/O degradation.
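The arithmetic behind this penalty is straightforward: the same volume of data costs far more I/O operations when fragmentation forces the system to issue many small requests instead of a few large ones. The following minimal Python sketch illustrates the effect; the function name and the request sizes are illustrative assumptions, not figures from any vendor benchmark.

```python
import math

def io_ops_needed(total_bytes: int, request_size: int) -> int:
    """Number of I/O requests required to move total_bytes
    when each request carries at most request_size bytes."""
    return math.ceil(total_bytes / request_size)

# Hypothetical example: scanning a 64 MB table.
# Contiguous data can be read in large 1 MB sequential requests:
contiguous = io_ops_needed(64 * 1024 * 1024, 1024 * 1024)
# Fragmented data forces small 4 KB requests for the same bytes:
fragmented = io_ops_needed(64 * 1024 * 1024, 4 * 1024)

print(contiguous)  # 64 requests
print(fragmented)  # 16384 requests
```

The same scan requires 256 times as many I/O operations once requests shrink from 1 MB to 4 KB, and each operation carries fixed per-request overhead, which is why fragmented, random I/O erodes throughput even on fast hardware.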
The good news is that I/O degradation is a software problem, and relatively inexpensive software solutions exist for it. Dealing with it does not require, and ultimately is not helped by, major investments in new computational and storage hardware; in the Windows environment, such software can improve overall system throughput by 30 to 50 percent or more.
To handle this escalating volume of data, and to reap the enormous promise of impending medical developments, the healthcare sector’s IT chiefs need to stay focused on the basics of what they are being asked to do. While investments in storage and infrastructure will be helpful to a degree, big data is primarily a matter of processing a certain volume of information at a certain speed. The ability to do that is fundamentally dependent on the overall system’s I/O efficiency, which can be affected only to a limited extent by additional hardware. New hardware can promise more I/O operations per second, but if the data being processed is broken into small, fractured, random I/O, the workload quickly becomes unmanageable.
While additional hardware can temporarily mask this degradation, targeted software can improve system throughput by up to 50 percent (or more) and should be part of the IT toolkit for any large-scale healthcare organization. Appropriate system software is just as important as hardware.
Data analysis is inherently slower than data acquisition. It can be made a great deal faster by optimizing the performance of existing servers and storage.
The American healthcare industry is already wasting up to $1.2 trillion a year on unnecessary processes, $88 billion of it due to the ineffective use of existing technology alone.
The first place many healthcare administrators can look to manage unnecessary spending is their current IT performance: I/O fixes can easily double system performance and speeds, creating more breathing room while accommodating increasing data storage needs. Maximizing the capabilities of your existing systems is also far more economical than expensive hardware upgrades.
About the Author:
James D’Arezzo is CEO of Condusiv Technologies, the world leader in software-only storage performance solutions for virtual and physical server environments.
1. “Interoperability: Sharing patient information to coordinate care,” Elation Health.
2. Finnegan, Joanne, “Primary care doctors spend more than 50% of workday on EHR tasks, American Medical Association study finds,” Fierce Healthcare, September 13, 2017.
3. Donovan, Fred, “Healthcare Organizations Stymied by Health Data Analytics Tools,” HIT Infrastructure, March 14, 2019.