Yale New Haven Health transformed its AI and analytics with a unified data lake to support clinicians and patients
All enterprises today must find ways to use their massive data sets to deliver better outcomes, and the healthcare industry is at the forefront of this transformation. Analytics and artificial intelligence streamline paperwork, speed the detection of disease, and narrow the gap between research and realization. However, licensing the right tools, implementing processes, and retaining talent are all barriers to successfully harnessing data. Yale New Haven Health, the largest healthcare provider in Connecticut, had a large and rapidly growing data set and knew that its existing infrastructure and data pipeline wouldn’t support its future goals. The details of this transformation are covered in a new IDC perspective.
“Over the last 2 years, we've been figuring out how to migrate off of Hadoop and into something more cost effective and agile.” - Wade Schulz, MD, PhD, Director, CORE Center for Computational Health, Center for Outcomes Research & Evaluation
Yale needed to centralize and integrate data from numerous sources so that it could be easily used on any data-science platform to research inpatient care and healthcare outcomes. COVID-19, with its variants and surging clinical caseload, created a huge influx of data that exposed the shortcomings of the existing big-data infrastructure. Yale needed a new foundation to support its influential research on the disease and its treatment.
Before selecting NetApp®, Yale ran a mostly on-premises analytics environment on the Hadoop-based Hortonworks Data Platform for capturing, processing, and analyzing data. However, rising licensing costs became a concern, and managing the data lifecycle grew increasingly difficult. Budget constraints and failing hardware made it hard to justify expanding the existing environment. It was time for a new computational health platform. Enter NetApp.
“NetApp was selected because of its price/performance, scalability, and the availability of tools that facilitate the transfer of data between platforms.” - Dr. Wade Schulz
The goal was to create a more flexible, future-proof system that was cloud ready, and NetApp came in with a plan to deliver it.
Yale’s computational health platform is built on an agile, disaggregated architecture better suited to its artificial intelligence and machine learning workflows. Memory and storage are now used far more efficiently, and licensing costs dropped by $500K. The data lake also makes it easier to adopt technologies such as GPUs.
With NVIDIA DGX Foundry, data scientists can get to work quickly without IT having to build a complex AI infrastructure. The platform is fully managed by NVIDIA and NetApp, freeing Yale to focus on research and outcomes rather than on infrastructure.
“The complete spectrum of tools and capability is built within the platform.” - Dr. Wade Schulz
To learn more, read the full IDC report: IDC Perspective: Yale New Haven Health.
Sean is the Market Strategist for Modern Analytics Solutions at NetApp. He joined the marketing team at NetApp in 2019 and has held roles on the Private Cloud and SAP solutions teams. Prior to NetApp, he spent 9 years in the financial industry, occupying various roles. Outside of work you’ll likely find him skiing, golfing, singing loudly in his car, or enjoying the Colorado mountains with his wife and dog.