In this lesson, you'll learn how the provided SAP Databricks component is integrated into SAP Business Data Cloud.
The Purpose of Databricks
Databricks is a Data Intelligence Platform that brings data and Artificial Intelligence (AI) together. Its key features include the following:
Artificial Intelligence / Machine Learning
Support for the full machine learning (ML) lifecycle, from experimentation to production, including generative AI and large language models.
Data Science
A collaborative and unified data science environment based on serverless computing, with integration into common integrated development environments (IDEs) and built-in visualization tools.
Data Engineering
Data ingestion and transformation: automated Extract, Transform, Load (ETL) processing, observability, and monitoring in a single stack.
Data Governance
Governance of structured and unstructured data, machine learning models, notebooks, dashboards, and files through Databricks Unity Catalog.
The Purpose of Databricks in the Context of SAP Business Data Cloud
By deploying Insight Apps, you can already gather many insights into your SAP data. But what if you need to run machine learning algorithms on top of that data?
To save you the trouble and cost of copying the data into another machine learning platform, SAP provides Databricks as a service within SAP Business Data Cloud. Because it is tailored to SAP-specific needs and does not include the complete Databricks architecture and capabilities, this component is called SAP Databricks.

Databricks supports the Delta Sharing protocol. Thanks to this functionality, you can go directly into your SAP Business Data Cloud cockpit and share a Data Product with SAP Databricks. Keep in mind that in this context, you don't need an SAP Datasphere component.
Note
The concept of Delta Sharing is different from the concept of sharing in the context of SAP Datasphere Spaces or the SAP Datasphere Marketplace.

You'll then be able to execute machine learning notebooks and get the corresponding results. Those results are stored in SAP Business Data Cloud, so they never leave your secured SAP environment.
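Under the hood, Delta Sharing is an open REST protocol: the recipient receives a profile file containing a sharing endpoint and a bearer token, and then calls endpoints such as `GET /shares` to discover shared data. The following minimal sketch, using only the Python standard library, shows how such authenticated requests are built. All values (endpoint URL, token, and share/schema names) are illustrative placeholders, not actual SAP Business Data Cloud values.

```python
import urllib.request

# Hypothetical Delta Sharing profile, as a recipient would receive it.
# The endpoint and token below are placeholders for illustration only.
PROFILE = {
    "shareCredentialsVersion": 1,
    "endpoint": "https://sharing.example.com/delta-sharing",
    "bearerToken": "example-token",
}


def sharing_request(profile: dict, path: str) -> urllib.request.Request:
    """Build an authenticated request against a Delta Sharing REST path."""
    url = profile["endpoint"].rstrip("/") + path
    return urllib.request.Request(
        url,
        headers={"Authorization": f"Bearer {profile['bearerToken']}"},
    )


# List all shares visible to this recipient (GET /shares).
list_shares = sharing_request(PROFILE, "/shares")

# List the tables of one schema in a share
# (GET /shares/{share}/schemas/{schema}/tables).
list_tables = sharing_request(
    PROFILE, "/shares/sales_share/schemas/default/tables"
)

print(list_shares.full_url)
# A real call would then be executed with urllib.request.urlopen(...).
```

In practice, you wouldn't build these requests by hand: a Delta Sharing connector (for example, the open-source `delta-sharing` Python library) wraps this protocol and loads shared tables directly into a DataFrame inside your notebook.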
The SAP Databricks component integrated into SAP Business Data Cloud is only the Databricks computing engine; no dedicated data storage is provided.
If you want to process those results further, you can again use Delta Sharing. You can then load this data into the SAP Datasphere tenant of SAP Business Data Cloud and, for example, create a new SAP Analytics Cloud story on top of it.
In this lesson, you've learned how the provided SAP Databricks component is integrated into SAP Business Data Cloud. You've learned that you can implement machine learning scenarios and share the results in SAP Business Data Cloud for further modeling and application development.