SAP Business Data Cloud is set to redefine enterprise data management, addressing key gaps in SAP’s D&A portfolio and adding unique value over its competitors. This 2-part blog series explores its real-world value proposition, the challenges businesses face in becoming data-driven, and how SAP is closing the gaps.
SAP Business Data Cloud – What is it and why should I care – Part 2

If you read my last blog in this series (link), you will have a good understanding of the key D&A goals of many modern organisations and the 4 key technical challenges that SAP’s D&A tools (prior to Feb 2025) could not quite address. These challenges prevented many organisations from adopting SAP’s D&A technologies as their central enterprise data platform – a key asset and enabler of the “data-driven organisation”.
In this 2nd blog, I am going to show you how, with the advent of SAP Business Data Cloud (BDC for short), SAP has not only caught up with its main competitors but also jumped ahead in a key area. In the process, it has addressed all the technical challenges we discussed in blog 1.
Since a picture paints a thousand words, over your next few minutes of reading I am going to make regular reference to the SAP Business Data Cloud architecture overview diagram below. In particular, I will refer to the components highlighted in red, green and orange, since these are the game-changing parts.
SAP Business Data Cloud architecture overview
The Data Lake
The red box in the architecture diagram highlights a new, central component called the Object Store, aka Foundation Services, within BDC. It should really be called the Data Lake, because that’s what it is. Since BDC can and will be deployed in all the major hyperscalers, the Object Store is implemented using the underlying object store of each hyperscaler, e.g. S3 in AWS, ADLS Gen 2 in Azure, etc. The Object Store implements a Data Lake that is truly scalable, cost-effective and caters for fast ingestion of all data varieties. Furthermore, it conforms to the Delta table standard, so it is Data Lakehouse ready. All future Data Products delivered by SAP will be ingested and modelled into the Object Store following a medallion-style architecture, at least up to a Bronze and Silver level (but more on this a bit later).
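To make the medallion idea concrete, here is a minimal, plain-Python sketch of the kind of Bronze-to-Silver shaping described above. This is purely illustrative, not SAP code: the layer logic, record fields and business key are all hypothetical.

```python
# Hypothetical sketch of a medallion-style Bronze -> Silver flow.
from datetime import datetime, timezone

def to_bronze(raw_records):
    """Bronze: land records as-is, stamped with ingestion metadata."""
    ts = datetime.now(timezone.utc).isoformat()
    return [{**r, "_ingested_at": ts} for r in raw_records]

def to_silver(bronze_records):
    """Silver: deduplicate on the business key and standardise values."""
    latest = {}
    for r in bronze_records:
        latest[r["order_id"]] = r  # last record wins in this simple sketch
    return [
        {"order_id": r["order_id"],
         "currency": r["currency"].upper().strip(),  # standardise codes
         "amount": round(float(r["amount"]), 2)}     # normalise numerics
        for r in latest.values()
    ]

raw = [
    {"order_id": "A1", "currency": " eur", "amount": "100.456"},
    {"order_id": "A1", "currency": "EUR ", "amount": "100.46"},  # duplicate key
    {"order_id": "B7", "currency": "usd", "amount": "55"},
]
silver = to_silver(to_bronze(raw))
```

In a real lakehouse these layers would of course be Delta tables written by a distributed engine, not Python lists, but the shape of the work – land raw, then deduplicate and standardise – is the same.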
For the more technical reader, a short geek-out about the red arrow labelled “APRS”. This stands for ABAP Push Replication Services: an all-new data replication method between S/4HANA and the Object Store. APRS pushes data from ABAP CDS views to files in the Object Store. The service relies on several background jobs to manage data transfer, including Access Plan Calculation, Observer, Transfer, and Health Check jobs. Most importantly, APRS optimises data transfer by parallelising tasks, which is particularly useful for handling large datasets.
I am almost (but not quite) 100% sure of this: it seems APRS will be used exclusively by SAP for replication of data from S/4HANA PCE, i.e. it will not be available for customer-managed S/4HANA.
The key point to note is that this layer is not a bolt-on, side-bar thing. It’s central and crucial to the BDC architecture. Hence, its existence means that Architectural Challenges #1 and #2 (mentioned in my first blog) are now solved! Customers now have a low-cost, scalable data lake that handles big data, and the freedom to adopt one, or a hybrid, of the contemporary architectural patterns prevalent in our industry.
All personas covered
Another unique and highly publicised component of BDC is the inclusion of SAP Databricks (highlighted in orange). Although the Databricks partnership was mentioned a while back with the introduction of SAP Datasphere, this is the first time we’ve seen the real fruit and depth of this partnership. In short, BDC now includes an OEM version of Databricks (neatly called SAP Databricks😊) which brings all the Lakehouse/AI richness, depth and leadership of Databricks to every SAP customer. Overnight, SAP has gone from competing with Databricks to being one with Databricks. A great match!
There are a few key implications to note here (apologies if this part gets a little geeky and technical again):
- Since SAP’s Data Products are implemented in the Object Store using the Delta table format, they can be shared with SAP Databricks via the open-source Delta Sharing protocol – securely and without data replication, duplication or transfer! To the astute reader, this is a big deal, and it will become a bigger deal once you’ve read the section on Data Products. Simply put, Data Engineers and Data Scientists using Databricks now have instant access to all the rich SAP application data without the effort of data replication or the cost of data integration (e.g. Outbound Premium for Replication Flow in SAP Datasphere). Furthermore, all the enrichment performed in SAP Databricks can be easily shared back to SAP Datasphere, once again with zero-copy sharing.
- For those organisations already heavily invested in standalone Enterprise Databricks, you are still covered (i.e. you don’t have to introduce a new instance of SAP Databricks). By the time BDC is generally available (Q2 in Australia), you will be able to take advantage of the same tightly integrated, bi-directional, zero-copy Delta Sharing mechanism too.
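As a small illustration of the Delta Sharing consumption model mentioned above: the open-source `delta-sharing` Python client addresses a shared table with a coordinate of the form `<profile-file>#<share>.<schema>.<table>`. The share, schema and table names below are hypothetical, not real BDC Data Product names.

```python
# Illustrative sketch of how a Delta Sharing consumer addresses a shared table.
def sharing_url(profile_path: str, share: str, schema: str, table: str) -> str:
    """Build a table coordinate in the format the delta-sharing client expects."""
    return f"{profile_path}#{share}.{schema}.{table}"

url = sharing_url("config.share", "sap_data_products", "sales", "sales_orders")

# With the `delta-sharing` package installed and a real share profile file,
# the table can then be read remotely without copying it into your platform:
#   import delta_sharing
#   df = delta_sharing.load_as_pandas(url)
```

The point of the protocol is exactly what the bullet above describes: the consumer reads the provider’s Delta files in place, so there is no replication pipeline to build or maintain.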
The introduction of SAP Databricks means that the challenge of full persona coverage within an organisation is now solved. This was the 3rd challenge I mentioned in blog #1. SAP Databricks provides a contemporary data engineering experience using notebooks and popular data processing languages like Python and Scala that many modern data engineers look for. Furthermore, access to all contemporary AI models and data processing libraries ticks the box for Data Scientists.
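To make that notebook experience a little more tangible, here is a toy, notebook-style Python enrichment of the kind a data engineer might write over shared order data. Everything here – the rows, field names and the feature itself – is a hypothetical illustration.

```python
# Toy enrichment: a per-customer recency feature over hypothetical order rows.
from datetime import date

orders = [
    {"customer": "C1", "order_date": date(2025, 1, 5)},
    {"customer": "C1", "order_date": date(2025, 2, 20)},
    {"customer": "C2", "order_date": date(2025, 2, 1)},
]

def days_since_last_order(rows, today):
    """Days since each customer's most recent order."""
    last = {}
    for r in rows:
        c = r["customer"]
        if c not in last or r["order_date"] > last[c]:
            last[c] = r["order_date"]
    return {c: (today - d).days for c, d in last.items()}

features = days_since_last_order(orders, today=date(2025, 3, 1))
```

In practice this would be written against distributed Delta tables (e.g. in PySpark) rather than Python lists, and the resulting feature table could then be shared back to Datasphere as described above.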
The Data Product (and Insight Apps)
The Object Store (data lake) and SAP Databricks inclusions have addressed 3 out of 4 key technical challenges and, when combined with SAC and Datasphere, bring BDC right up to par with the best-in-class modern data platforms. But it’s the boxes highlighted in green in the architecture diagram that see SAP jump ahead of the competition with a unique offering that only SAP can provide: the Data Products and Insight Apps.
The final technical challenge from the first blog was the SAP application data accessibility challenge. This has been an issue for every data platform out there, with no easy solution – up till now, that is. SAP is so serious about productising its data in a manner that is easily accessible and consumable by all personas that it has mandated that all its applications (S/4HANA, Ariba, Concur, SuccessFactors, etc.) must deliver a new, comprehensive suite of standardised, easy-to-use Data Products and associated Insight Apps. So, before going into the details, we should recognise that this ticks the final box: the SAP data accessibility challenge.
Now let’s look a little closer at the properties of these Data Products and Insight apps to understand their true value:
- No matter which application they come from, they are built to a clear, consistent standard (SAP has even released an open source standard called ORD – Open Resource Discovery, to drive this standardisation)
- They are easily discoverable in a catalogue in BDC and installed directly from there. No need to understand technical tables, fields, or ABAP CDS view names.
- Insight Apps are delivered on top of the Data Products (in many cases via SAC) and provide a visible, tangible analytical output out of the box. But it’s the underlying Data Products where the real gold lies!
- The Data Products are deployed into the Object Store and modelled in a contemporary way using Bronze- and Silver-layered Delta tables – this unlocks bi-directional, zero-copy integration with Databricks! SAP Datasphere becomes more of a semantic modelling tool on top of the Data Products. This is quite a shift in data modelling approach.
- The Data Products are completely managed by SAP, i.e. once installed, the initial data load and regular delta loads are managed and monitored by SAP! No longer does the customer support team need to worry about this! The new APRS replication method is used to push data from S/4HANA into the Object Store.
- Given the point above, it becomes clear why these Data Products are only available for customers in S/4HANA PCE and/or the other LoB SaaS solutions, e.g. SuccessFactors, Ariba etc. SAP needs to run these applications so that they can provide the Data Product as a service.
After digesting all of this, you begin to realise why the SAP data accessibility challenge is resolved – and why this is a unique offering that no other vendor can bundle with their modern data platform. This is why it’s the “leapfrog” value item. For those who have tried to integrate and model SAP data into any cloud data platform in the past, you will understand just how exciting well-defined, standardised data products as a service, across all SAP applications, really are!
Conclusion – SAP BDC is cool!
Hopefully, after a somewhat lengthy and occasionally technical read, you can see how BDC is really very different and not just an incremental development of SAP Datasphere and HANA Cloud. The new SaaS cloud data platform has all the components in it to compete with the best, and through the Data Products, adds value that only SAP can. With so many organisations around the world running SAP applications, housing their crucial enterprise data, SAP’s pivot to emphasising data (and AI) in their Business Suite seems to be a master stroke. Now it’s all about good execution, which we’ll know more about once BDC is Generally Available in a few weeks’ time.
One final comment I’d like to make as we close this blog series.
I run a Data and Analytics Practice at NTT DATA, which is somewhat unique in the market in that we are a truly agnostic practice. We do lots of work with SAP D&A technologies, but we do just as much work in Microsoft D&A technologies (most notably MS Fabric and Azure Databricks) and Snowflake. It is hence my job and duty to be eagle-eyed and somewhat critical of all D&A tools in the market and to always keep our customers’ goals central when helping them select a modern data platform. To this end, I am not “obliged” to promote SAP Business Data Cloud. However, I hope, after reading this blog, you can understand why I am truly excited about BDC and why, for the first time in a long time, I feel that SAP really has a truly competitive Enterprise-wide Modern Data Platform that can compete with the best in class.
Important disclaimer (I hate these, but needs must ☹): SAP BDC is very new, and everything I am sharing is based on what I’ve managed to learn about a brand-new product that was only launched a few weeks ago. I’ve made every effort to ensure the accuracy of my facts, but there’s always a possibility that some finer details might not be entirely precise or subject to change.