Configuring Data Ingestion

Objective

After completing this lesson, you will be able to set up data ingestion and identify the technical prerequisites and restrictions that apply.

Data Ingestion Setup and Configuration

  • Reduce implementation time and project risk with preconfigured integration content, data models, and dedicated APIs
  • Decrease operational cost by removing redundant data copies through a shared data layer used by multiple consumers
  • Improve customer trust with data validation and end-to-end integration monitoring
  • Increase quality and scale with inner-source, decentralized data ownership models
  • Simplify maintenance and support by avoiding multiple point-to-point integration channels
Diagram illustrating data access APIs and integrations for SAP systems, featuring various modules like Order Scheduling and Predictive Replenishment.

SAP S/4HANA Entity Enablement

A growing number of industry cloud solutions leverage SAP S/4HANA ODM-compliant business entities via the Data Replication Framework (DRF) and the Data Ingestion (DI) framework.

Table showing DRF and Cloud-enabled entities for SAP S4/HANA, with details on available and consuming models.

Data Ingestion for Industry Clouds

Unlock the power of SAP S/4HANA, SAP ECC, and SAP CARAB with accelerated data integration to SAP cloud-native industry cloud solutions.

Flowchart illustrating the process from Enterprise Data Sources through Data Ingestion and Provisioning to Data Consumption, with integration monitoring.

Data Ingestion has been set up to receive data from SAP and non-SAP data sources through open APIs. For SAP S/4HANA, the data replication is delivered as out-of-the-box content, both on the SAP S/4HANA side and within Cloud Integration with predefined iFlows. The data processing flow is then common across the entities. Along the processing chain, monitoring is provided with SAP Cloud ALM, where content is available for Cloud Integration, for the processing within data ingestion, and for the connected industry cloud solutions.
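As an illustration of the non-SAP path, the sketch below pushes an entity payload to a data ingestion Open API endpoint. It is a minimal sketch only: the host, endpoint path, entity name, and payload fields are hypothetical placeholders, and the bearer token is assumed to have been obtained from the service key described later in this lesson.

```python
import json
import urllib.request

# Hypothetical values -- replace with the endpoint and entity exposed by your
# data ingestion subscription and a token obtained via the service key.
DI_BASE_URL = "https://<your-subdomain>.example.hana.ondemand.com"  # placeholder host
ENTITY_PATH = "/v1/dataingestion/Product"                           # illustrative path
ACCESS_TOKEN = "<bearer-token-from-service-key>"

payload = {
    "sourceSystem": "QX1CLNT100",  # illustrative: the source system maintained in data ingestion
    "records": [
        {"productId": "MAT-4711", "description": "Sample product"}  # illustrative fields
    ],
}

request = urllib.request.Request(
    DI_BASE_URL + ENTITY_PATH,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    print(response.status, response.read().decode("utf-8"))
```

The same pattern applies to any non-SAP data source: authenticate against data ingestion, then post the entity payload to the corresponding Open API.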

Data Ingestion For Industry Clouds - Solution Overview

This diagram shows the processing flow in greater detail. Data Ingestion manages persistence and data consistency from the import of data through the Open APIs, through Kafka processing, to persistence in a data lake. From there, the industry cloud solutions fetch the relevant data.
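The internal processing is handled by the service itself; purely as a conceptual illustration of the Kafka-to-data-lake pattern described above, the sketch below consumes entity messages from a topic and appends them to entity-partitioned files. The topic name, message format, and storage layout are assumptions for illustration and do not reflect the actual implementation.

```python
import json
import pathlib

from confluent_kafka import Consumer  # third-party client: pip install confluent-kafka

# All names below are illustrative assumptions, not the service's internal configuration.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "entity-persistence",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["ingested-entities"])   # hypothetical topic carrying validated entities

lake_dir = pathlib.Path("data-lake")        # stand-in for the actual data lake storage
lake_dir.mkdir(exist_ok=True)

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        entity = json.loads(msg.value())
        # Partition the lake by entity type so consumers can fetch only what they need.
        target = lake_dir / f"{entity.get('entityType', 'unknown')}.jsonl"
        with target.open("a", encoding="utf-8") as f:
            f.write(json.dumps(entity) + "\n")
finally:
    consumer.close()
```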

Diagram illustrating data integration from non-SAP solutions to SAP Intelligent Suite, focusing on data quality, security, and cloud services.
  1. SAP Data Ingress - DRF
  2. Non-SAP Data Ingress - OpenAPI
  3. Source Authentication
  4. Data Quality Validation
  5. Data Encryption
  6. Data Persistency (Lake)
  7. Controlled Data Access
  8. Integration Monitoring

Setting Up Data Ingestion in Your Subaccount

To enable data ingestion in your SAP BTP account, you need to subscribe to the application in the same subaccount where the industry cloud solutions are also deployed. It is not possible for an industry cloud solution to access data from a data ingestion instance running in a different subaccount.

Add the subscription and instance of data ingestion to your subaccount

SAP BTP Cockpit interface showing account overview, entitlements, and details for a Cloud Foundry environment.

Assign the entitlements to your target subaccount

User interface showing the assignment of service plans for a subaccount called CH_CIC_EU with options for data ingestion services.

Assign the entitlements to your target subaccount

Screenshot of SAP BTP Cockpit showing Global Account and Entity Assignments overview with service and quota details.

After adding the subscription and the instance for data ingestion to your subaccount, you also need to generate a service key. This service key is required to configure access to data ingestion from external systems and from Cloud Integration. When the Open APIs for the data entities are called, authorization is handled through this service key.
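As a rough illustration of how the service key is used, the sketch below reads the client ID, client secret, and token URL that a BTP service key for an XSUAA-protected service typically contains and requests an OAuth access token via the client-credentials grant. The file name and the exact structure of your service key are assumptions; verify the field names against the key generated in your subaccount.

```python
import base64
import json
import urllib.parse
import urllib.request

# Load the service key downloaded from the BTP cockpit (file name is an assumption).
with open("data-ingestion-service-key.json", encoding="utf-8") as f:
    key = json.load(f)

# Typical field names in an XSUAA-based service key; the nesting may differ in your key.
token_url = key["uaa"]["url"] + "/oauth/token"
client_id = key["uaa"]["clientid"]
client_secret = key["uaa"]["clientsecret"]

credentials = base64.b64encode(f"{client_id}:{client_secret}".encode()).decode()
body = urllib.parse.urlencode({"grant_type": "client_credentials"}).encode()

request = urllib.request.Request(
    token_url,
    data=body,
    headers={
        "Authorization": f"Basic {credentials}",
        "Content-Type": "application/x-www-form-urlencoded",
    },
    method="POST",
)

with urllib.request.urlopen(request) as response:
    access_token = json.loads(response.read())["access_token"]

# The token is then passed as a Bearer token when calling the data ingestion Open APIs
# (see the sketch earlier in this lesson).
print(access_token[:20], "...")
```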

Add the subscription and instance of data ingestion to your subaccount

SAP BTP Cockpit showing subscriptions and instances for a subaccount, highlighting Data Ingestion for Industry Cloud Solutions.

Configuring Data Ingestion

When you open the data ingestion app from your subaccount, you need to maintain the subdomain name of your tenant. You can also save a bookmark in your browser to access the data ingestion app directly within your tenant.

Open the data ingestion app.

Login page for SAP with a username field and a Sign in with OpenID button.

To set up data ingestion, you first need to create a source system in data ingestion. When connecting to SAP S/4HANA, it is recommended to use the business system ID of the respective SAP S/4HANA system as the source system. This is mandatory for a direct connection and optional if the processing goes through Cloud Integration; we therefore recommend following this convention to stay flexible within the project. Once the source system is set and data has been replicated, it is no longer possible to change the source system identifier.

Get the source system name from SAP S/4HANA by running the function module LCR_GET_OWN_BUSINESS_SYSTEM.

SAP interface showing function module details for LCR_GET_OWN_BUSINESS_SYSTEM, with parameters and test results visible.

Add this system as a source system.

Popup window to add a system in SAP, displaying fields for System ID and System Name with existing system details below.

Each entity required for the respective industry cloud solution needs to be associated with the source system and then activated. As a result, the entity is set to Activated and the source system shows the source of choice.

Check which entities are relevant for your industry cloud solution under Available Data Entities for Data Ingestion for Industry Cloud Solutions on the SAP Help Portal.

Enable the required entities in data ingestion.

SAP interface showing the activated AtpSnapshot data entity with details on products and system status.
