Unlocking Epic EHR Data for a Large Healthcare System

Built a modern data platform on Snowflake and AWS spanning four subsidiary organizations, functioning as a “data hub.”

Implemented dynamic ETL pipelines for Epic Clarity and Caboodle tables and other data sources, laying the foundations for near-real-time data ingestion and scalable data governance.

Deployed Hakkoda accelerator IP for automated role provisioning and security framework rollout.

Challenge

The customer is a large healthcare system that brings together teaching hospitals, community hospitals, and academic medical centers. The system employs over 4,800 physicians and 38,000 employees who strive together to expand access to care while advancing groundbreaking research and education.

As a newcomer to Snowflake, Amazon Web Services, and Matillion, the healthcare system was looking to retire its on-prem legacy implementation and migrate its data warehouse components to the cloud. Though the migration plan involved data from a variety of sources, the complexity of working with proprietary Epic data in particular was a major blocker for the risk-averse organization.

With four different entities rolled up under the system’s umbrella, the enterprise needed to ingest and analyze data from a wide gamut of EHR and non-EHR sources.

Solution

Over the course of a three-month initial engagement, Hakkoda planned and implemented a centralized cloud data platform built on Snowflake and AWS that gave the organization the dynamic ETL tools and enablement it needed at scale.

Hakkoda’s Healthcare & Life Sciences team also leveraged its deep expertise in Epic EHR data to enable dynamic ingestion of Epic Clarity and Caboodle data into Snowflake. The customer’s small but highly skilled internal data team is now using the pipelines and best practices put in place for Epic to incorporate other data sources across its ecosystem, powering system-wide analytics inside and outside the patient record.

Leveraging a Hakkoda accelerator to manage role-based access and automatically apply its governance and security frameworks, the client’s internal team is also working to onboard as many as 500 users this year. The team has also begun using the Snowflake AI Data Cloud to pursue its long-term goal of establishing a research hub.

The Model

Technology Used:

Snowflake

AWS (EC2 Instances, S3 Storage)

Matillion

Microsoft Azure

Full-Time Resources:

1 Engagement Manager

1 Data Architect

2 Data Engineers

Project Duration:

3 Months

“Hakkoda’s work with the large healthcare system started with the need to build a centralized cloud data platform on Snowflake and AWS that could integrate critical Epic data with all of the other information driving operations. Their internal team is highly skilled and deeply capable, so the other major piece of the puzzle was enablement: making sure they had the pipelines and best practices in place to integrate additional data sources as they continue to scale and innovate.”
- Victor Wilson, Director of Data Engineering at Hakkoda
