Extending Your SAP Data Analytics in Snowflake in Five Steps

Discover the five-step process that can help you leverage Snowflake’s unique capabilities to extract more value from your SAP data analytics.
October 10, 2024

Matt Florian

Director, SAP
Hakkoda


Benjamin Deaver

Senior Solution Architect
SNP Group


Joe Bartoletti

Global Alliance Director
SNP Glue

SAP remains the heavy hitter in enterprise resource planning. According to SAP’s own statistics, 99 of the top 100 companies in the world run SAP. Enterprises have made substantial investments in customizing their SAP R/3 and ECC systems, and every customization made during an SAP implementation drives additional investment in the analytics landscape as well.

For enterprises that rely on SAP BW, this investment involves creating analytic models honed to reflect each customer’s unique business operating model. However, the BW models often need to be extended with analytics using other enterprise data, such as CRM, CDP, supply chain management, or plant IoT sensor data. While BW is well suited for SAP data, extending the existing analytic data models with external data is often cost-prohibitive and requires a redesign of the infrastructure to support the data.


Why Snowflake Makes Sense for Expanding Your SAP Analytics Model

There is an alternative to rebuilding the BW models, however: enterprises can quickly extend their existing SAP analytic data models within the Snowflake AI Data Cloud. Snowflake is cost-effective primarily because of its unique architecture and billing model. A critical difference between the SAP BW or HANA architectures and Snowflake is the investment required. The SAP technology stack demands significant upfront spending on hardware plus ongoing maintenance costs. In stark contrast, Snowflake requires little upfront investment thanks to its pay-as-you-go consumption model, which means you only pay for the compute and storage resources you use.

Another key differentiator is how the two platforms scale up and out. Snowflake’s multi-cluster shared data architecture allows multiple data sources to be integrated without the significant architecture changes that a similar exercise in SAP would require. This scalability ensures that as your data grows, so does your ability to maintain high-performance analytics without costly hardware upgrades or additional administrative overhead.

Let’s examine how you can implement this in your data ecosystem today using Snowflake, SNP Glue Connect for Snowflake, and Coalesce.io.


Let’s start with a scenario we at Hakkoda have seen frequently: an enterprise uses SAP delivery and inventory analytical data models from Business Warehouse (BW) to manage and analyze delivery schedules, inventory levels, and order fulfillment metrics, and it is looking to integrate data from a third-party logistics (3PL) partner.

The use case objective is to integrate the 3PL shipping data, including package tracking numbers, delivery statuses, and timestamps, with SAP’s internal data to improve visibility into in-transit shipments for customers.

Step 1: Ingest Data Into Snowflake

The first step for getting SAP analytics into Snowflake is to identify the best strategy and tooling to move the data. For this scenario, the best solution is to use the SNP Glue Native App for Snowflake. 

SNP Glue drastically reduces the latency of SAP data integration with the Snowflake Data Cloud, accelerating how quickly SAP data becomes available in Snowflake for insights, applications, and innovation. Through an optimized native connection to Snowflake for SAP data integration, data from SAP can be used immediately in the data cloud by harnessing the new Data Streaming for SAP.

The 3PL data arrives as JSON messages that provide regular updates on shipping status. In this scenario, let’s assume the data flows through the 3PL’s API layer, which can push JSON files directly to Snowflake, triggering Snowflake to load the data into the staging area almost instantaneously.
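As a rough sketch of what that landing zone could look like, the Snowflake SQL below creates a raw staging table, an external stage, and a Snowpipe that auto-ingests the 3PL’s JSON files. The bucket URL, storage integration, and object names are illustrative assumptions and sit alongside, not inside, the SNP Glue configuration.

```sql
-- Hypothetical landing zone for the 3PL JSON messages (all names are illustrative).
CREATE OR REPLACE TABLE staging.raw_3pl_shipments (
    payload   VARIANT,                                   -- full JSON message as received
    file_name STRING,                                    -- source file, useful for auditing
    loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()  -- load time in Snowflake
);

-- External stage pointing at the bucket the 3PL's API layer writes to (assumed).
CREATE OR REPLACE STAGE staging.stg_3pl_shipments
    URL = 's3://example-3pl-bucket/shipments/'
    STORAGE_INTEGRATION = my_s3_integration
    FILE_FORMAT = (TYPE = 'JSON');

-- Snowpipe picks up new files automatically via cloud event notifications.
CREATE OR REPLACE PIPE staging.pipe_3pl_shipments
    AUTO_INGEST = TRUE
AS
COPY INTO staging.raw_3pl_shipments (payload, file_name)
FROM (
    SELECT $1, METADATA$FILENAME
    FROM @staging.stg_3pl_shipments
);
```

Because the raw layer stores each message as a VARIANT, downstream transformations can read the JSON directly without an upfront schema definition.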

Step 2: Harmonize and Transform Into a Common Granularity

Data from the two sources needs to be harmonized before it can be combined. This involves ensuring all the data follows the same format and structure where necessary. For example, date and time stamps from SAP and the 3PL partner should be in the same format (e.g., YYYY-MM-DD HH:MM). Similarly, units of measurement, currency values, and geographical identifiers should be standardized. With Coalesce, we can create consistent, repeatable, and testable data flows for transforming data using native Snowflake functionality.

The information from the 3PL needs to be converted and combined into datasets that align with the level of detail in SAP BW’s current models. For example, if SAP BW monitors daily inventory levels for each SKU and warehouse location, the 3PL data should be summarized into a daily aggregate. Tools built natively on Snowflake, such as Coalesce, can efficiently create testable data flows, simplifying complex data transformation tasks and making them more manageable.
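A minimal sketch of that harmonization and roll-up in plain Snowflake SQL might look like the following; the JSON paths, field names, and target table are assumptions for illustration, and in practice these transformations would typically be modeled as Coalesce nodes rather than hand-written statements.

```sql
-- Flatten the raw JSON, standardize formats, and roll up to the daily grain
-- used by the SAP BW inventory model (object and field names are illustrative).
CREATE OR REPLACE TABLE harmonized.daily_3pl_shipments AS
SELECT
    TO_DATE(payload:event_timestamp::STRING)              AS event_date,     -- align to daily grain
    payload:sku::STRING                                    AS sku,
    UPPER(payload:warehouse_id::STRING)                    AS warehouse_id,   -- standardized identifier
    COUNT(DISTINCT payload:tracking_number::STRING)        AS shipments,
    SUM(IFF(payload:delivery_status::STRING = 'DELIVERED', 1, 0)) AS delivered_count,
    AVG(DATEDIFF('hour',
        TO_TIMESTAMP_NTZ(payload:shipped_at::STRING),
        TO_TIMESTAMP_NTZ(payload:delivered_at::STRING)))   AS avg_hours_in_transit
FROM staging.raw_3pl_shipments
GROUP BY 1, 2, 3;
```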


Step 3: Build Integrated Analytical Models

The transformed data from the 3PL is then integrated into the ingested SAP analytic models within Snowflake. This involves merging the 3PL delivery and shipment data with SAP data, ensuring that related identifiers, such as order IDs and product SKUs, are aligned and accurately referenced.

Efficient schema design is crucial to optimizing data storage and access within Snowflake. When integrating SAP and 3PL data, a star schema is often a good fit. In this schema, a central fact table holds transactional data (e.g., deliveries and inventory movements), while surrounding dimension tables contain related information such as product details, warehouse data, and customer information. This approach simplifies queries and improves performance by reducing the complexity of joins.
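To make the design concrete, a simplified version of such a star schema might look like the sketch below; all table and column names are hypothetical and illustrate the pattern rather than a prescribed model.

```sql
-- Simplified star schema for the integrated delivery data (names are illustrative).
CREATE OR REPLACE TABLE analytics.dim_product (
    product_key   NUMBER AUTOINCREMENT PRIMARY KEY,
    sku           STRING,
    product_name  STRING,
    product_group STRING                      -- e.g., SAP material group
);

CREATE OR REPLACE TABLE analytics.dim_warehouse (
    warehouse_key NUMBER AUTOINCREMENT PRIMARY KEY,
    warehouse_id  STRING,
    region        STRING
);

-- Central fact table combining SAP BW deliveries with the harmonized 3PL metrics.
CREATE OR REPLACE TABLE analytics.fct_delivery (
    order_id              STRING,             -- shared key with the SAP order fact
    product_key           NUMBER REFERENCES analytics.dim_product (product_key),
    warehouse_key         NUMBER REFERENCES analytics.dim_warehouse (warehouse_key),
    delivery_date         DATE,
    planned_delivery_date DATE,               -- from SAP BW
    delivered_count       NUMBER,             -- from the 3PL feed
    avg_hours_in_transit  FLOAT               -- from the 3PL feed
);
```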

A new, more detailed model sourced primarily from the 3PL data enhances the analysis capabilities and is intended to support data drill-through functionality in the upcoming stages of the project. Your analytics engineers can establish a data narrative that connects the two fact tables by leveraging a common dimensional attribute, such as the Order ID, improving the ability to derive meaningful insights from the data.
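The drill-through itself then becomes a straightforward join across the two fact tables on that shared attribute, as in this sketch, which reuses the hypothetical object names from above and assumes an SAP-sourced order fact.

```sql
-- Drill from an SAP BW order down to the 3PL-sourced delivery detail,
-- connected through the shared Order ID (all names are illustrative).
SELECT
    o.order_id,
    o.material             AS sku,
    o.order_quantity,
    d.delivery_date,
    d.delivered_count,
    d.avg_hours_in_transit
FROM analytics.fct_sap_order AS o          -- fact ingested from SAP BW
JOIN analytics.fct_delivery  AS d          -- fact built from the 3PL feed
  ON o.order_id = d.order_id
WHERE o.order_id = '4500012345';           -- example order selected in a dashboard
```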

Step 4: Analytical Model Extension

With the integrated data from SAP BW and the 3PL in Snowflake, we can now leverage the enriched dataset to extend the existing analytic models. This integration enables a unified view of inventory management alongside delivery and logistics metrics, adding value to the enterprise by making it possible to assess the overall effectiveness of the supply chain. In addition, we can create views that combine attributes from SAP’s inventory data, such as stock levels and order history, with analytics derived from the 3PL data, such as on-time delivery metrics, shipment dates, delivery statuses, and logistical costs.
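One way to expose that unified picture is a simple Snowflake view along these lines, assuming the hypothetical objects sketched earlier plus an SAP-sourced inventory snapshot fact.

```sql
-- View combining SAP inventory attributes with delivery performance from the 3PL
-- (object names are illustrative, not a prescribed model).
CREATE OR REPLACE VIEW analytics.v_supply_chain_effectiveness AS
SELECT
    i.sku,
    i.warehouse_id,
    i.snapshot_date,
    i.stock_level,                                         -- from SAP BW inventory
    i.open_order_quantity,                                 -- from SAP BW order history
    d.shipments,                                           -- from the 3PL feed
    d.avg_hours_in_transit,                                -- from the 3PL feed
    d.delivered_count / NULLIF(d.shipments, 0)             AS delivered_share
FROM analytics.fct_inventory_snapshot     AS i
LEFT JOIN harmonized.daily_3pl_shipments  AS d
  ON  i.sku = d.sku
  AND i.warehouse_id = d.warehouse_id
  AND i.snapshot_date = d.event_date;
```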

Moreover, this integrated data opens up the ability to create models that correlate inventory turnover rates with delivery performance metrics. This analytical model can identify patterns or discrepancies in how quickly inventory moves in relation to delivery speeds and reliability. Supply chain and demand analysts can use this model to assess which products are underperforming due to logistical delays or inefficiencies, enabling proactive adjustments to shipping methods or vendor selections. Additionally, the model has the potential to pinpoint bottlenecks in the supply chain that affect customer satisfaction and operational efficiency.
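A first, hedged pass at that correlation analysis can run directly in Snowflake SQL using the built-in CORR aggregate; the turnover proxy below is deliberately crude, and the object names again assume the hypothetical model from the previous steps.

```sql
-- Rough correlation between how fast inventory turns and how long shipments take,
-- per product group (names assume the hypothetical objects sketched earlier).
SELECT
    p.product_group,
    CORR(v.shipments / NULLIF(v.stock_level, 0),   -- crude daily turnover proxy
         v.avg_hours_in_transit)                    AS turnover_vs_transit_corr,
    COUNT(*)                                        AS observations
FROM analytics.v_supply_chain_effectiveness AS v
JOIN analytics.dim_product                  AS p
  ON v.sku = p.sku
GROUP BY p.product_group
ORDER BY turnover_vs_transit_corr;
```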


Step 5: Analytics Democratization and Visualization

By harnessing the integrated data within Snowflake, we can now build sophisticated data visualizations and artificial intelligence models. For instance, dynamic visualizations comparing planned versus actual delivery times provide actionable insight into delivery performance. These visual tools make it easier for stakeholders to digest complex data and identify trends at a glance.

The integrated model also makes it possible to use AI models through Snowflake Cortex to conduct deep dives into how delivery delays impact inventory levels. These AI-driven analyses can predict potential disruptions and quantify their effects on operations, enabling proactive adjustments to inventory management. By feeding these insights into existing demand forecasting models, we can refine production planning processes to prevent overproduction or out-of-stock situations. The result is production schedules closely aligned with forecasted inventory needs and delivery capacities, optimizing the balance between supply and demand.
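As one hedged illustration of such a Cortex-assisted deep dive, the query below hands a small aggregate of the integrated model to SNOWFLAKE.CORTEX.COMPLETE for a narrative summary. The model name, prompt, and source view are assumptions, and a production forecasting workflow would more likely build on Snowflake’s ML-based forecasting functions than on a single prompt.

```sql
-- Ask an LLM available through Snowflake Cortex to summarize how delivery delays
-- relate to stock levels, using a small aggregate of the integrated model
-- (the model name, prompt, and source view are illustrative assumptions).
WITH delay_impact AS (
    SELECT
        sku,
        AVG(avg_hours_in_transit) AS avg_transit_hours,
        AVG(stock_level)          AS avg_stock_level
    FROM analytics.v_supply_chain_effectiveness
    GROUP BY sku
    ORDER BY avg_transit_hours DESC
    LIMIT 20
)
SELECT SNOWFLAKE.CORTEX.COMPLETE(
    'mistral-large',
    'Summarize how delivery delays appear to affect inventory levels in this data, '
    || 'and flag SKUs that look at risk of stock-outs: '
    || TO_JSON(ARRAY_AGG(OBJECT_CONSTRUCT(*)))
) AS delay_impact_summary
FROM delay_impact;
```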

Accelerating Time to Value with SNP and Hakkōda

SAP’s prominent position at the center of the ERP universe isn’t likely to change anytime soon. But enterprises know they can only do so much when their ERP data exists in a vacuum. Snowflake’s unique architecture and scalable, consumption-based billing model make moving your analytics workloads an attractive alternative to costly rebuilds of existing BW models.

For enterprises still working through persistent myths about SAP and unsure how best to get their workloads into the cloud, the good news is that the right data partner and tooling can drastically simplify the process. Together, SNP Glue and Hakkoda enhance ERP analytics within Snowflake by streamlining the integration of SAP data with external sources, enabling enterprises to gain deeper insights and drive better decision-making. SNP Glue’s optimized connection minimizes latency, ensuring that SAP data is readily available for analysis, while Hakkoda’s expertise in data transformation with best-in-breed tools like Coalesce ensures that the data is harmonized and ready for actionable insights.

Together, they empower organizations to extend their SAP analytic models seamlessly, democratize data access, and visualize complex metrics effectively. This integration not only improves supply chain visibility but also enhances operational efficiency, allowing businesses to adapt swiftly to changing market conditions.

Ready to extend your SAP analytics by leveraging the scalability and integration capabilities of the Snowflake AI Data Cloud? Join Hakkoda and the SNP Group for a live webinar Thursday, October 24th, to learn more about how you can accelerate your ERP insights with Snowflake.
