Yesterday we saw SAP present their long-awaited announcement at the SAP Business Unleashed 2025 event: SAP Business Data Cloud is coming! What can we expect from this new solution, and what does it mean for existing Datasphere customers?
First, let's take a look at how we got here. Two years ago, SAP used the same event (back then called “Data Unleashed”) to announce their new offering - SAP Datasphere. While a few major features were added over time, this offering was essentially a rebrand of their existing “Data Warehouse Cloud” (DWC) solution. Since the tool was still seen as quite immature, most existing SAP BW customers preferred to stick with their current setup, as they didn’t want to give up their investments and the feature set they were used to. Non-SAP customers viewed DWC as too far behind competitors like Databricks and Snowflake.
SAP Datasphere, however, has matured over the last two years. SAP proclaimed it the strategic successor of SAP BW, worked on feature parity and unique selling points like enabling Citizen Developers and seamless S/4HANA and SAP Analytics Cloud (SAC) integration, and entered various partnerships with leading vendors. These partnerships helped patch up weaknesses in SAP’s own offering and underlined the paradigm shift of opening up the platform to third-party solutions. One of these partnerships was with Databricks.
Databricks can be seen as the leader among modern data platform offerings, standing out especially with its cost-efficient lakehouse architecture and sophisticated machine learning & AI support. While SAP has tried to catch up with Datasphere, it’s an uphill battle. The last two years of this cooperation have unfortunately not yielded any noteworthy results for customers, but it seems SAP and Databricks have quietly been working on something. Since yesterday we know: they were working on the Business Data Cloud.
Have we come full circle?
SAP announces Business Data Cloud product launch
Overview of the functional spectrum of SAP Business Data Cloud and the included components Datasphere, Analytics Cloud, DW, and SAP Databricks. Diagram published by SAP.
Source: https://news.sap.com/2025/02/sap-databricks-open-bold-new-era-data-ai/
The Business Data Cloud combines several products that can be used together to connect all SAP and non-SAP data of a company.
SAP Datasphere:
SAP Datasphere will remain mostly unchanged - any prior investments companies have made in Datasphere stay safe. Datasphere can continue to be used with its proven Citizen Developer capabilities to share data within an organization and build central data models where it makes sense.
Analytics and planning:
SAP Business Data Cloud will leverage SAP Analytics Cloud and its planning capabilities.
SAP Business Warehouse:
SAP Business Data Cloud provides on-premise SAP BW customers with an easy path to the cloud. With native integration and Delta Sharing from the object store, BW data can seamlessly be turned into data products - simplifying modernization and maximizing your BW investment.
SAP Databricks:
Some of Databricks’ core components will be integrated into the SAP Business Data Cloud product suite: Mosaic AI and the ML workspace, the lakehouse governance layer Unity Catalog, and the SQL Warehouse query engine. From what we have seen in the short demo, this includes Databricks core capabilities like Jupyter-style notebooks as a development environment as well as Apache Spark and the Photon engine as scalable distributed compute resources (Serverless only!). Recent additions to the vast Databricks feature scope like the AI/BI and Genie dashboarding tools, Lakeflow ETL, and notably Delta Live Tables are not part of the integrated offering.
SAP data models are to be seamlessly available in the Databricks catalog and vice-versa, promising efficient, zero-copy data integration between established SAP business domain data and third-party systems or even unstructured data. This integration is technologically based on Databricks’ native data lakehouse innovations like the Delta Lake table format and the Delta Sharing exchange protocol.
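To illustrate the mechanics behind this: the Delta Sharing protocol addresses a shared table through a small credential profile plus a three-part name (share, schema, table). The following sketch shows how a consumer might reference a shared SAP data product - note that the endpoint, token, and all share/schema/table names are purely hypothetical placeholders, and the actual read via the open-source `delta-sharing` client is shown commented out since it requires real provider credentials:

```python
import json
import tempfile

# A Delta Sharing profile is a small JSON credential file issued by the
# data provider (endpoint URL and bearer token below are placeholders).
profile = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://sharing.example.com/delta-sharing/",
    "bearerToken": "<token-from-provider>",
}

def table_url(profile_path: str, share: str, schema: str, table: str) -> str:
    # Delta Sharing addresses tables as <profile-file>#<share>.<schema>.<table>
    return f"{profile_path}#{share}.{schema}.{table}"

# Persist the profile so a sharing client could pick it up
with tempfile.NamedTemporaryFile("w", suffix=".share", delete=False) as f:
    json.dump(profile, f)
    profile_path = f.name

# Hypothetical SAP data product exposed as a shared Delta table
url = table_url(profile_path, "sap_bdc_share", "finance", "gl_line_items")

# With real credentials, the client reads the table without any ETL copy:
#   import delta_sharing                     # pip install delta-sharing
#   df = delta_sharing.load_as_pandas(url)
```

The "zero-copy" aspect comes from the protocol itself: the client fetches the underlying Delta Lake/Parquet files directly from the provider's object storage via short-lived signed URLs, so no replication pipeline sits between producer and consumer.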
SAP and Databricks or SAP Databricks?
Current state: SAP and Databricks
High-level comparison of how we advise our customers to combine SAP Datasphere as the leading Enterprise Business Intelligence framework with the stand-alone Databricks data lakehouse platform. Databricks scales more efficiently for large volumes of data and offers state-of-the-art Machine Learning and AI development capabilities that SAP lacks.
Recently we held our own webinar on the topic of combining SAP Datasphere and Databricks.
Watch the recording of our webinar:
"SAP Datasphere and the Databricks Lakehouse Approach"
In short: SAP Datasphere is convincing as a business layer in a standardized data warehouse approach. It offers seamless integration with SAP systems and provides a powerful semantic layer for self-service business intelligence and reporting with SAP Analytics Cloud. Databricks, on the other hand, is particularly strong in the area of machine learning and enables a data lakehouse architecture that can manage large amounts of data cost-effectively. By combining these strengths, companies can build a modern, scalable, and intelligent data platform.
If you are interested in our approach - check out our recent article on this.
So even before this announcement, the combination of SAP Datasphere & Databricks was a strong choice for many companies. SAP seems to have had the same idea, since yesterday’s information reinforces this approach even further.
Unleashed? SAP Databricks
Databricks’ role in the new SAP Business Data Cloud landscape - as presented by Databricks in their blog post,
source: https://www.databricks.com/blog/introducing-sap-databricks
SAP is finally filling a glaring hole in their cloud product portfolio regarding Machine Learning, Generative, and Agentic AI development by offering an integrated Databricks workspace. The underlying support of Delta Lake and Delta Sharing technology enables more efficient collaboration across cloud and vendor boundaries. SAP finally steps onto the playing field of the open data lakehouse, where major players like Google BigQuery, Snowflake, and AWS Redshift already offer technological compatibility with the Databricks-led ecosystem.
The big additions to the functional spectrum of the SAP cloud offering are:
- Fully-featured ML/AI experimentation and operations framework to leverage the latest technological advances
- Zero-copy data exchange between SAP Datasphere and Databricks
- Unified data management over both structured (non-SAP) and unstructured data in the Unity Catalog and underlying data lakehouse
- Better integration with non-SAP data sources
- Interoperability with third-party vendors, products, and tools through Delta Sharing
Further side effects of the extended Databricks partnership might be interesting for power users who felt limited by native SAP tooling in the past:
- Python- and Spark-based analytics are now available in a code-first environment
- Unity Catalog offers virtually unlimited storage capabilities
- Distributed compute engines can handle enormous data volumes for analytics and AI model training
- Historically locked-away SAP data models become available for advanced analytics workflows
- Integration with third-party BI tools like Power BI and Tableau
It remains to be seen how seamless and efficient the integration of Datasphere models and Databricks Unity Catalog tables will truly be in practice. The two separate technological philosophies remain as such under the new banner of Business Data Cloud: in-memory database vs. data lakehouse. We see potential for SAP to open the door to competing approaches and parallel developments within its own intended one-stop-shop solution. Will all data objects from Datasphere eventually be accessible in Databricks and vice-versa? How will data in lakehouse storage be accessed for in-memory queries on the Datasphere side of things?
Pricing and product roadmap are additional topics of discussion where we need to keep an eye on announcements in the near future. While SAP Databricks pricing will be somewhat similar to the established Databricks Unit (DBU) consumption model, that consumption is to be converted into SAP Business Data Cloud Capacity Units. Availability dates for the Business Data Cloud and SAP Databricks have not yet been announced and depend on which hyperscaler is chosen for hosting: offerings on AWS EU/Frankfurt and AWS US-East are to be launched first - the Microsoft Azure release is expected to come last.
Our first interpretation of how SAP Business Data Cloud covers the spectrum of modern data platform capabilities: BDC cleverly integrates Databricks for all the previously missing or under-developed functional areas like lakehouse storage, stream data processing, and ML/AI development.
We are hard at work bringing you more information and answering your pressing questions, so stay tuned. Let us know what you think about the announcement and raise any questions you might have in the form below the blog article.
If you would like to find out what the new solution means for your company, please do not hesitate to contact us. As an experienced Databricks and SAP partner, we can help you to exploit the potential of the Business Data Cloud and the new opportunities offered by this partnership.
Find a time to talk in our calendars:
Sebastian Uhlig: https://www.nextlytics.com/meetings/sebastian-uhlig
Irvin Rodin: https://www.nextlytics.com/meetings/irvin-rodin
Machine Learning, SAP Data Warehouse, Datasphere
