Epic Cogito is Moving to Azure: What are the Perks and What are the Drawbacks?

Big changes are coming to Cogito in the next two years. What does it mean for your organization, and what can you do to prepare?
September 20, 2023

At their 2023 UGM conference in August, Epic made the long-awaited announcement that they will move Cogito, their analytics suite comprising Clarity, Caboodle, Cosmos, Nebula, Reporting Workbench, and other modules, to a Microsoft Azure-native “lakehouse” architecture within the next two years. If you missed the announcement when it was made, a former colleague wrote a fantastic summary of what moving to Azure means for Epic customers.

For those who have followed developments at Epic over the last few years, their selection of Microsoft Azure as their cloud platform of choice comes as no surprise. That said, it does have important implications for the future of healthcare organizations’ data and analytics strategies. 

At Hakkoda, we work with many Epic customers in the healthcare sector and have a deep understanding of how different organizations have incorporated Epic systems into their daily operations. We have clients using a single instance of Epic, others with multiple instances, still others that host their data both internally and externally, and even some that use Epic alongside competing EHRs.

Needless to say, the implications of moving to Azure look slightly different across these use cases. There is, in fact, a long list of angles from which to approach this announcement, many of which we will explore in depth in future articles as details about Epic’s plans continue to emerge.

In the meantime, this piece will go over some of the benefits and drawbacks of moving to Azure from Hakkoda’s point of view.

The Benefits of Cogito Moving to Azure

Azure is better than an on-premise SQL server. This goes without saying, but it’s about time Epic found a more scalable solution for Cogito. Moving Clarity, Caboodle, Cosmos, and Nebula to Azure means bringing data out of the dark, on-premise database deployments that Epic has predominantly used since it first built Clarity on Microsoft SQL Server decades ago. Procuring hardware, tuning SQL Server queries, and dealing with resource contention during month-end reporting are considerable challenges for existing Cogito users still trapped in on-premise systems, and the move to a cloud-based solution will usher in significant quality-of-life improvements.

Moving to hourly ingestion from Chronicles to Clarity is a game changer. Analytics consumers regularly ask for real-time data, which often turns out to be unnecessary or needed only in small batches (e.g., by the minute or hour). Historically, that need had to be met primarily through HL7v2 messages or extracts from Reporting Workbench, but Epic’s move to Azure unlocks an easier path for feeding analytics tools and machine learning models with more timely data.
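To make the micro-batch idea concrete, here is a minimal Python sketch that groups timestamped events into hourly buckets. The record shape and field names are hypothetical stand-ins for illustration only, not Epic’s actual feed format.

```python
from collections import defaultdict
from datetime import datetime

def bucket_by_hour(events):
    """Group (iso_timestamp, payload) pairs into hourly micro-batches.

    The tuples below stand in for rows landed by an hourly
    Chronicles-to-Clarity feed; the shape is a placeholder assumption.
    """
    batches = defaultdict(list)
    for ts, payload in events:
        # Truncate the timestamp to the top of the hour to form the batch key.
        hour = datetime.fromisoformat(ts).replace(minute=0, second=0, microsecond=0)
        batches[hour].append(payload)
    return dict(batches)

events = [
    ("2023-09-20T08:05:00", {"patient_id": "A1", "event": "admit"}),
    ("2023-09-20T08:47:00", {"patient_id": "B2", "event": "lab_result"}),
    ("2023-09-20T09:02:00", {"patient_id": "A1", "event": "discharge"}),
]
batches = bucket_by_hour(events)
# Two hourly buckets: 08:00 holds two events, 09:00 holds one.
```

Delivering data in windows like these, rather than as a continuous stream, is usually enough for dashboards and most machine learning features while being far cheaper to operate.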

Having data in the cloud will help organizations more easily make the shift to an all-cloud data and analytics stack later in their modernization journey. As more and more healthcare organizations make the leap to cloud-based data stacks, moving Cogito to Azure means they will not have to invest in separate resources to maintain on-premise Epic analytics alongside cloud-native solutions. This simplifies training for BIDs (Epic reporting analysts) and Epic administrators, and allows organizations to be more strategic about the talent they hire and retain.

These benefits also extend to Snowflake customers who have already migrated their data to the cloud. Moving Cogito to Azure will simplify these customers’ technology stacks, allowing batch and near real-time use cases to benefit from all the data already living in the cloud. Cloud hosting and more frequent intra-day updates will let Snowflake customers work with the data even faster by reducing the need for the replication and extract processes often necessary today. They will also be able to ingest data via Snowpipe or query newly supported Iceberg tables, with both the data and metadata staying in Azure.
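As a rough sketch of the two ingestion patterns named above, the snippet below assembles illustrative Snowflake DDL as Python strings so the two shapes can be compared side by side. Every object name (pipe, stage, external volume, catalog) is a placeholder assumption, not a confirmed Epic or Snowflake configuration.

```python
# Path 1: continuous micro-batch loading with Snowpipe from an Azure stage.
# All object names are hypothetical placeholders.
snowpipe_ddl = """
CREATE PIPE clarity_orders_pipe
  AUTO_INGEST = TRUE
AS
  COPY INTO raw.clarity_orders
  FROM @azure_clarity_stage
  FILE_FORMAT = (TYPE = PARQUET);
""".strip()

# Path 2: an externally managed Iceberg table whose data and metadata
# both remain in Azure storage, queried in place from Snowflake.
iceberg_ddl = """
CREATE ICEBERG TABLE raw.caboodle_encounters
  EXTERNAL_VOLUME = 'azure_lakehouse_volume'
  CATALOG = 'epic_iceberg_catalog'
  CATALOG_TABLE_NAME = 'encounters';
""".strip()

print(snowpipe_ddl)
print(iceberg_ddl)
```

The design difference matters: Snowpipe copies files into Snowflake-managed storage on arrival, while the Iceberg route leaves the data where it lands in Azure and queries it in place.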

It’s a huge win for Snowflake customers that the move to Azure won’t make their lives harder. Quite to the contrary, these developments will make bringing Epic data into Snowflake easier than ever before.

The Drawbacks of Cogito Moving to Azure

The move to Azure brings a litany of interoperability concerns. Providers, payers, life sciences companies, and major healthtech-focused organizations have bet big on Snowflake, including CMS, Oracle Cerner, Salesforce, ServiceNow, IQVIA, Optum, and Workday, to name a few. Meanwhile, some research organizations are dabbling with GCP or Palantir.

Whatever solutions Azure promises to bring to the table, it will need to integrate with other platforms and tools across the healthcare data stack to give healthcare providers a unified source of truth. Failure to do so will quickly introduce new obstacles into Cogito’s interoperability and scalability and cause fresh headaches for organizations with platforms and tools other than those Epic provides directly. Without comment from Epic on how they plan to address these interoperability concerns, healthcare administrators may see a vendor lock-in problem on the near horizon.

The whole move hinges on Microsoft Fabric, an amalgamation of existing Azure technologies that requires tuning and optimization. The reliability of Microsoft Fabric has not yet been proven, and earlier Azure technologies (e.g., Azure Synapse, Azure Data Lake Storage) have required considerable tuning and extensive RBAC configuration to function well. Microsoft’s partnership with OpenAI may be leading the pack at the present moment, but the race for artificial intelligence is volatile, and it is difficult to predict how long Microsoft will be able to maintain its lead.

It will also be interesting to see how Epic’s Slicer Dicer is affected by this move. Today, Slicer Dicer is driven by Epic Cogito’s Caboodle, an on-premise Microsoft SQL Server database. The tool has been widely adopted over the last handful of years, and its future-state architecture, whether cloud-native or hybridized with on-premise elements, will be one to watch.

Cogito moving to Azure is a gamble on other technologies in the Microsoft ecosystem which have a somewhat storied past and which are up against steep competition in multiple arenas.

Some Closing Thoughts

At Hakkoda, we bet our whole company on the evolution of the Snowflake Data Cloud, which we believe is uniquely positioned to support your organization’s growth with interoperability, innovation, and scale in mind. 

Epic’s announcement that Cogito will be moving to Azure leaves us more convinced than ever that cloud-based platforms and solutions are at the heart of healthcare’s future. We also remain steadfast in our conviction that Snowflake will continue to outperform its competitors with best-in-class interoperability across the modern data stack while providing its customers with future-proof, scalable solutions that break down silos and open pathways to data innovation without locking them into any one vendor’s toolkit.

While we continue to believe that organizations should maximize their investment in Epic and use it where appropriate, we also continue to caution them against aligning their data strategies too closely with Epic’s strategy, roadmap, and technology choices. Many organizations choose modern BI tools or open-source cohort builders like OHDSI ATLAS and i2b2, since those tools can work with more data across the care continuum and can support requirements like non-clinical data in research. Snowflake is a great place to run ATLAS and i2b2, since data from Epic and beyond can be blended for deeper analysis and cohort slicing and dicing.

Choosing Snowflake and Hakkōda for Your Data Innovation Journey

Hakkoda’s data teams bring certification from across the modern data stack to help your organization architect cutting-edge data solutions that leverage the security and sophistication of Epic software while preventing silos or vendor lock-ins that can put you right back into a state of data chaos. 

We offer highly flexible solutions that seamlessly automate the integration of Epic records with additional data sources, putting your organization’s data house in order so that you can focus your resources on what comes next.

Ready to learn more about how Snowflake can help your organization incorporate disparate data sources and enrich EHRs at scale? Let’s talk.
