Let's start with the positioning and use of SAP Datasphere as a business data fabric and explore the navigation areas on the landing page.
Business Data Fabric
Understanding the business context within SAP Datasphere is crucial for leveraging its full potential in your data management strategy, whether you are an enterprise architect, a data engineer, or a data analyst.
SAP Datasphere enables a business data fabric architecture that uniquely harmonizes mission-critical data throughout the organization, unleashing business experts to make the most impactful decisions. It combines previously discrete capabilities into a unified service for data integration, cataloging, semantic modeling, and data warehousing.
Note
You may have heard the term "SAP Data Warehouse Cloud" and wondered how it relates to SAP Datasphere.
In 2019, SAP launched its first cloud-native data warehousing solution under the branding SAP Data Warehouse Cloud. This software-as-a-service solution was built on SAP HANA Cloud and integrated with SAP Analytics Cloud, enabling real-time analytics and modern reporting capabilities.
In 2023, SAP Datasphere was released as the logical successor, providing additional data integration and cataloging features. SAP Datasphere:
- is designed to combine federation and ingestion of SAP and non-SAP data.
- preserves the full meaning and context of SAP data across systems and clouds.
- integrates with other data vendors' platforms, delivering seamless and scalable access to one authoritative source for your most valuable enterprise data.
- leverages existing data investments as it does not require moving data into yet another data store.
- radically simplifies your data landscape, ensuring inherent governance throughout the data lifecycle.
SAP Datasphere is based on SAP HANA Cloud and follows a clear data warehouse as a service (DWaaS) approach in the public cloud, with fast release cycles.

SAP Datasphere provides the following capabilities for a complete business data fabric architecture:
Connectivity: Define connections to SAP and non-SAP sources. To extend connectivity to additional data sources, you can use partner tools such as Adverity and Precog.
Data federation, data replication and data transformation: Define remote tables and flows for various data acquisition scenarios.
Semantic onboarding: Import semantically-rich objects from your SAP systems and the content network, as well as the public data marketplace and other marketplace contexts.
Space management: Define spaces as secure virtual work environments where members can acquire, prepare, and model data.
Administration & security: Manage settings on system level, such as connectivity and data integration settings, security, auditing, and monitoring settings.
Data & business modeling: Use graphical low-code or no-code tools, or powerful built-in SQL for modeling needs. Enrich existing datasets with external data, coming from the data marketplace, CSV uploads, and third-party sources.
Catalog: Collect and organize data and metadata, enabling business and technical users to make confident data-driven decisions. The catalog improves productivity and efficiency by building trust in enterprise metadata through consistent data quality and governance.
Data marketplace: Enable businesses to easily integrate third-party data. You can search and purchase analytical data from data providers. The data comes in the form of objects packaged as data products, which you can use in spaces of your SAP Datasphere tenant.
SAP Datasphere, BW Bridge: Migrate your legacy SAP BW system to the cloud. You can reuse existing investments by transferring ABAP code, extractors, and ETL processes, while leveraging SAP HANA Cloud for modern, scalable data warehousing and integration with SAP applications.
Note
Modernizing SAP BW or SAP BW/4HANA to SAP Business Data Cloud is now the strategic direction.
Object store & Apache Spark runtime: Store large amounts of data files and use the Apache Spark runtime for large-scale data processing in file spaces (SAP HANA Data Lake Files), where it serves as the primary execution engine for transformation flows and data preparation tasks.
CLI: Use the command-line interface (CLI) to connect to SAP Datasphere and manage certain types of objects (for example, spaces, users, and connections).
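As a hedged illustration, a typical CLI session might look like the following command sketch. The npm package name and subcommands reflect the publicly documented CLI, but exact flags vary by CLI version (check `datasphere --help`), and the tenant URL is a placeholder.

```shell
# Install the CLI (distributed as an npm package) and list spaces.
# Subcommand and flag names should be verified against your CLI version;
# the host URL below is a placeholder for your own tenant.
npm install -g @sap/datasphere-cli
datasphere login                                          # interactive OAuth login
datasphere spaces list --host "https://<tenant>.hcs.cloud.sap"
```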
API: Use the SQL API to enable external tools to create artifacts and write data using Open SQL Schema.
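To make the Open SQL Schema idea concrete, here is a small sketch of the kind of DDL an external tool might generate for a staging table. The schema name pattern (`SPACE#SUFFIX`), table, and columns are illustrative assumptions; a real tool would execute the statement through a SAP HANA client such as hdbcli, using a database user created for the space.

```python
# Sketch: DDL an external tool could issue against an Open SQL schema.
# Schema and table names are hypothetical; a real tool would run the
# statement through a SAP HANA driver (e.g. hdbcli) with a database user.

def staging_table_ddl(schema: str, table: str) -> str:
    """Build a CREATE TABLE statement for a staging table in the schema."""
    return (
        f'CREATE TABLE "{schema}"."{table}" '
        '("ID" INTEGER, "AMOUNT" DECIMAL(15,2))'
    )

print(staging_table_ddl("SALES#ETL", "ORDERS_STAGING"))
```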
Analytics: Consume SAP Datasphere data in SAP Analytics Cloud stories via a live connection.
Planning: Leverage seamless planning with SAP Analytics Cloud for planning.
External consumption: Consume exposed data in clients, tools, and apps via an OData service or JDBC/ODBC.
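As a sketch of OData-based consumption: a client issues a GET request against the consumption endpoint of an exposed view and receives a JSON document whose rows sit under the `value` property (standard OData V4 JSON format). The entity and field names below are illustrative, and the payload is parsed offline rather than fetched from a real tenant.

```python
# Sketch: parsing an OData V4 JSON response from a consumption endpoint.
# A real client would first GET the exposed view's URL with OAuth or basic
# auth; the payload here is a hand-made stand-in for such a response.
import json

payload = json.dumps({
    "@odata.context": "$metadata#SalesOrders",   # hypothetical entity set
    "value": [
        {"OrderID": 1, "Amount": 100.5},
        {"OrderID": 2, "Amount": 99.0},
    ],
})

rows = json.loads(payload)["value"]              # OData V4 rows live under "value"
total = sum(row["Amount"] for row in rows)
print(f"{len(rows)} rows, total {total}")
```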
Open data ecosystem partners:
- Collibra: Enable bi-directional metadata exchange between the SAP Datasphere catalog and the Collibra data catalog, facilitating efficient metadata management and robust data governance across your entire data ecosystem.
- Databricks: Integrate SAP data with a data lakehouse platform, especially for machine learning scenarios.
- Confluent: Set your data in motion with real-time event and streaming data.
- DataRobot: Leverage augmented intelligence with automated machine learning.
- Google Cloud: Leverage Google Cloud services on SAP Datasphere data.
Landing Page

The landing page of SAP Datasphere shows the following navigation areas:
- Repository Explorer: Search, filter, and manage existing objects or create new ones.
- Catalog & Marketplace: Quickly find and access data products and assets across all connected sources.
- Semantic Onboarding: Import semantically-rich objects from SAP systems, the content network, the public data marketplace, and other marketplace contexts.
- Business Builder: Model business scenarios in the business layer, independent of the data layer. Note that this tool is now deprecated. For business-specific requirements, it is recommended to create analytic models in the Data Builder.
- Data Builder: Define data models using graphical tools or the SQL editor. Data engineers can model, combine, and harmonize data in a standardized way, leveraging flows for real-time replication, federation, and transformation.
- Data Integration Monitor: Schedule, run, and monitor data ingestion, replication, and transformation tasks within a space with a centralized tool.
- Connections: Manage connections to remote sources within your spaces.
- Space Management: Create and manage spaces for data modeling. You can define space size, storage type, and priority, as well as add users and connect sources. While spaces are decoupled, objects can be shared between them.
- Translation: Manage multilingual metadata, such as business names and column labels, primarily for consumption in SAP Analytics Cloud.
- Security: Manage authorizations by maintaining users, custom roles, and scoped roles.
- Transport: Import and export objects using packages.
- System Monitor: Monitor system resources, including disk storage, elastic compute nodes, task logs, and statement logs.
- Data Sharing Cockpit: Manage data provider profiles and share data products.
- System: Perform configuration and administration tasks.