Zero-copy, Three Doors, One Backbone: The New SAP + Snowflake Architecture

Discover how SAP’s Business Data Cloud and Snowflake’s AI Data Cloud enable zero-copy data sharing, unified semantics, and governed bidirectional workflows.
November 18, 2025

When I say “zero-copy, three doors, one backbone,” I’m talking about enterprise data products that travel with their meaning. Think of web services, reimagined for an agentic, multi-platform world.  

The pattern clicked for me watching planners stitch together Excel models from SAP, Workday, and Salesforce, then push results back into SAP by hand or via brittle file uploads.  

That’s not a process issue. That’s an architecture gap the SAP and Snowflake partnership announcement finally addresses by keeping semantically rich SAP context intact as it’s consumed in Snowflake’s AI Data Cloud.

What Zero-copy Really Means 

Zero-copy is not “faster pipes.” It is the ability to share SAP Business Data Cloud (BDC) data products with Snowflake without duplicating underlying datasets, so the business meaning travels with the product while each platform computes on its own side. With the SAP Snowflake solution extension, customers get bidirectional, zero-copy access to semantically rich SAP data products.

SAP and Snowflake both describe this as bringing semantically rich SAP products into Snowflake’s governed AI environment, anchored in SAP’s business data fabric. 

How it plays out in practice: 

  • Two directions. SAP’s BDC Connect enables secure, bidirectional, zero-copy sharing of data products with Snowflake. 
  • What travels. You share the product’s schema, semantic knowledge, and business metadata, not a bulk export. That’s how Snowflake can query SAP products with meaning intact while SAP remains the source of record. 
  • Where compute runs. Snowflake queries run in Snowflake. SAP usage runs in SAP. Access is decoupled from physical copies. SAP positions this as an “SAP Snowflake solution extension for BDC”  so customers get Snowflake’s managed AI and governance with SAP context preserved. 

A simple loop: your team blends a fulfillment allocation plan in Snowflake using SAP products plus non-SAP signals. An analyst approves it. The approved plan is shared back to SAP as a governed product, without tossing giant files around, so execution proceeds in the system of record. 

The Three Doors 

Now the “three doors” makes sense. Fiori is where operators work with granular SAP data. SAP Analytics Cloud (SAC) is where line leaders review and approve, fed by governed BDC data products.

Snowflake is where cross-functional teams extend those products with non-SAP data and where AI co-pilots live.  

The glue is the backbone: package data + semantics + governance in one layer. BDC publishes products with business context, while Snowflake Horizon keeps discovery, lineage, and policy visible and enforceable for everything downstream. 

Before → After 

Before: one-way ETL to build reports, followed by “copy, tweak, drift.” Different answers for the same KPI. 

After: publish a few authoritative BDC data products (for example, Sales Documents, Deliveries, Billings, Customer), then extend in Snowflake without stripping SAP semantics. SAP’s news and docs outline installing and working with products in a cataloged, governed way, which is exactly the foundation we want.  

Preserve Operational Data Provisioning (ODP) deltas and the Operational Delta Queue instead of re-inventing change logic in SQL (SAP Help: ODP overview). 

Mechanics That Prevent Drift 

Start with the products everyone cares about. Sales + Delivery build trust fast and let workstreams run in parallel: analytics teams enrich with enterprise data in Snowflake; an agent proposes fulfillment allocations for a human to approve. SAP frames these as business-ready, semantically rich products, which is the point.  

Keep base currency and UoM conversions in BDC so Fiori, SAC, and Snowflake match (aligned to SAP conversion policy and TCUR* tables; see SAP Help on currency conversion). If you need analytics-only conversions, compute them in Snowflake but read the same rate references and rules to avoid forking policy. For extensions: if SAC or Fiori will use it, extend in BDC; if it’s only for Snowflake, curate it in Snowflake and register lineage in Horizon. 
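As a sketch of what “read the same rate references” looks like in practice, here is an analytics-only conversion in Snowflake that resolves rates from the SAP-maintained exchange-rate reference (the TCUR* family) exposed through a shared data product. All schema, table, and column names below are illustrative assumptions, not the actual product layout:

```sql
-- Hedged sketch: convert document currency to a group reporting currency
-- by reading the SAP-maintained rate reference, not a forked rate table.
-- Schema/table/column names are illustrative; align to your shared products.
SELECT
    s.sales_document,
    s.document_currency,
    s.net_value,
    r.exchange_rate,
    s.net_value * r.exchange_rate AS net_value_group_currency
FROM sap_products.sales_documents s
JOIN sap_products.exchange_rates r          -- e.g., a view over TCURR
  ON r.from_currency = s.document_currency
 AND r.to_currency   = 'USD'
 AND r.rate_type     = 'M'                  -- SAP's standard average rate type
 AND r.valid_from    = (
       SELECT MAX(r2.valid_from)            -- latest rate on or before the doc date
       FROM sap_products.exchange_rates r2
       WHERE r2.from_currency = s.document_currency
         AND r2.to_currency   = 'USD'
         AND r2.rate_type     = 'M'
         AND r2.valid_from   <= s.billing_date
 );
```

Because Fiori and SAC resolve rates through the same BDC-governed policy, pointing this query at the identical reference keeps all three doors in numeric agreement instead of forking conversion logic.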

Why the Newest Releases Make the Backbone Real 

If zero-copy is the promise, recent releases are the parts that make it operational. On the Snowflake side, Horizon gives one place to discover and govern data, apps, and models with built-in lineage, policy, and privacy controls, so meaning published upstream stays visible and enforceable downstream. Pair that with Snowflake Intelligence, which brings governed natural-language access and agent workflows inside Snowflake’s security perimeter.  

On the SAP side, BDC Connect formalizes secure, zero-copy sharing of data products into enterprise platforms, and Data Product Studio (announced; GA planned) gives builders a visual workspace to model and manage custom products with robust metadata.  

Put together, “one backbone” stops being a slide and becomes a system: publish governed products in BDC; use zero-copy to consume them in Snowflake; rely on Horizon to keep ownership and lineage clear; and let Intelligence or agents propose actions that, once approved, write back through BDC Connect so decisions turn into execution without breaking clean core. 

Zero-copy in Both Directions 

With BDC Connect for Snowflake, this is not a one-way street. You can propose a plan in Snowflake, keep a human in the loop, and then share the approved plan back as a governed product for SAP to execute.

Press and industry coverage highlight bidirectional, zero-copy access as the focus of the announcement. 

Agent Safety and Audit Trails 

My non-negotiables are simple: scope and policy must be explicit, PII masked where required, approvals recorded, and every agentic step must leave an immutable audit trail that ties decisions to inputs and lineage.  

Snowflake’s governance primitives in Horizon and the platform direction around Intelligence support that “connected and trusted” model. 

FinOps: Freshness Follows Decisions 

Freshness should match the decision window, not habit. That is why we use Dynamic Tables and set a target_lag per product and persona: sub-hourly for operations, hourly for near-real-time fulfillment, nightly for planning. Snowflake documents how target lag drives refresh and cost, so latency and spend move together. 
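A minimal sketch of per-product freshness tuning, assuming Dynamic Tables built over the shared SAP products. `TARGET_LAG` and `WAREHOUSE` are standard Snowflake Dynamic Table parameters; the object and warehouse names are illustrative:

```sql
-- Operations view: refreshed sub-hourly for the fulfillment desk.
CREATE OR REPLACE DYNAMIC TABLE ops.fulfillment_allocations
  TARGET_LAG = '15 minutes'   -- freshness matched to the decision window
  WAREHOUSE  = ops_wh         -- illustrative warehouse name
AS
SELECT d.delivery_id, d.plant, d.requested_qty, o.confirmed_qty
FROM sap_products.deliveries d
JOIN sap_products.sales_documents o
  ON o.sales_document = d.sales_document;

-- Planning view: same logic, nightly lag, so refresh cost tracks the cadence.
CREATE OR REPLACE DYNAMIC TABLE plan.fulfillment_allocations
  TARGET_LAG = '24 hours'
  WAREHOUSE  = plan_wh
AS
SELECT * FROM ops.fulfillment_allocations;
```

The design choice is that each persona gets its own lag, so loosening the planning table from 15 minutes to 24 hours directly reduces refresh compute without touching the operations view.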

The 90-day Starter 

Start small with orders and deliveries. Refactor the logic as BDC data products. Extend what is truly cross-functional in Snowflake.  

Let a co-pilot propose a fulfillment plan, keep a human in the loop, and write back after approval. Measure build time, parity across the three doors, and the operating cost with target-lag tuning. If your architecture simplifies and secures at the same time, it is worth your attention. 

Let AI be the analyst’s assistant. Keep decisions accountable, then scale what works. 

Ready to get started? Let’s talk. 

