Introduction

Overview

In this exercise, you will learn about the high-level architecture of the artefacts used in this workshop and how they are connected. You will also learn about the pre-configured elements that SAP has prepared to facilitate the workshop. Lastly, you will find a short introduction to the most important product functionalities involved.

  • 1 - Architecture
  • 2 - SAP Datasphere - Data Builder and S/4 Actual Data
  • 3 - SAP Datasphere - Data Marketplace
  • 4 - Bi-directional Integration for Planning
  • 5 - Modelling in SAP Analytics Cloud - Liquidity Planning Model
  • 6 - Stories in SAP Analytics Cloud
  • 7 - What else will you learn?

1. Architecture

This workshop takes place in SAP Datasphere and SAP Analytics Cloud. First, internal data is combined with external data from the Data Marketplace. A connection to SAP Analytics Cloud is then set up using the SAP Datasphere public consumption API. After loading the data from the previously created union into SAP Analytics Cloud, a story is enriched with the respective tables for data entry and simulation, and the required objects, such as the predictive scenario, multi actions, and data actions, are created. After running the prediction and saving the results into a planning version, a connection is set up and the data is loaded back to SAP Datasphere using the Data Export Service offered by SAP Analytics Cloud.

2. SAP Datasphere - Data Builder and S/4 Actual Data

In the Data Builder of SAP Datasphere, you can define data models for your data with a technical, model-driven approach in the graphical tools or in the powerful SQL editor of the data layer. Two Data Builder artefacts have been pre-created for this workshop.

  • A table containing cash flow statement actuals from S/4HANA
  • A view on top of this table

They will be available to you via Shared Objects when you create a new view.

3. SAP Datasphere - Data Marketplace

Data Marketplace is fully integrated into SAP Datasphere. It’s tailored for businesses to easily integrate third-party data. You can search and purchase analytical data from data providers. The data comes in the form of objects packaged as data products that can be used in one or several spaces of your SAP Datasphere tenant.

Data products are either provided for free or require the purchase of a license at a certain cost. Some data products are available as one-time shipments, while others are regularly updated by their data providers.

To get data products into your SAP Datasphere tenant and consume them, you can follow a simple workflow.

4. Bi-directional Integration for Planning

SAP Datasphere and SAP Analytics Cloud are better together. This is why SAP delivered the bi-directional integration between SAP Datasphere and SAP Analytics Cloud for planning. This enables you to load fact data and master data from SAP Datasphere to SAP Analytics Cloud. Similarly, you can seamlessly retract fact data, master data, and audit data from SAP Analytics Cloud models and use it in SAP Datasphere.

Most importantly, this enables you to use actual data from SAP Datasphere in your planning tables in SAP Analytics Cloud. Additionally, you can join plan and actual data from multiple sources in common views in SAP Datasphere that you can then use for live reporting or any other kind of downstream processing of your plan data. You can also meet corporate requirements to store all steering-relevant data in one data warehouse as a single source of truth.

It is important to understand that the use case of bi-directional integration between SAP Analytics Cloud and SAP Datasphere is enabled by two public APIs:

  • SAP Datasphere provides a public OData API (Controlled Release with wave 11) to pull data from Datasphere and integrate it into SAP Analytics Cloud using the OData Services Connection.
  • SAP Analytics Cloud provides the Data Export Service (DES) API (generally available with QRC2.2022) to pull data from SAC and make it available in Datasphere using remote tables or data flows.

The following image gives a good high-level overview of our scenario:

Data transfer from SAP Datasphere to SAP Analytics Cloud - Public API for Data Consumption

The SAP Datasphere Public API can be used to replicate fact and master data from SAP Datasphere into SAP Analytics Cloud for planning purposes. By using this API, you can authenticate against SAP Datasphere and get access to its data.

In particular, the API has the following key characteristics:

  • It supports deployed Data Layer entities which are marked as “expose for consumption”. This includes views of semantic type Analytical Dataset and Dimension.
  • It supports standard OData v4 query parameters ($select, $filter, $top, $skip, $orderby, $count, etc.).
  • It supports business user access and authentication via OAuth 2.0 authorization.
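As an illustration, the query options above can be combined into a single request URL. The following sketch only constructs the URL; the tenant host, space, and view names are placeholders rather than values from this workshop, and the exact path segments should be checked against your tenant's consumption API documentation:

```python
from urllib.parse import quote

# Placeholder tenant host and exposed view -- substitute your own values.
BASE_URL = "https://<tenant>.hcs.cloud.sap/api/v1/dwc/consumption/relational"
SPACE, ASSET = "WORKSHOP_SPACE", "V_CASHFLOW_ACTUALS"

# Standard OData v4 system query options supported by the API.
params = {
    "$select": "CompanyCode,LiquidityItem,Amount",
    "$filter": "CompanyCode eq '1010'",
    "$orderby": "LiquidityItem",
    "$top": "100",
}

# Percent-encode the values only; the $-prefixed option names stay literal.
query = "&".join(f"{key}={quote(value)}" for key, value in params.items())
query_url = f"{BASE_URL}/{SPACE}/{ASSET}/{ASSET}?{query}"
print(query_url)
```

Sending this URL with a valid OAuth 2.0 bearer token would return the filtered view data as OData JSON.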

Moreover, we strongly recommend that the data in question is either replicated (via remote table replication or data flow) or that a snapshot of the respective view’s data is taken in SAP Datasphere.

The detailed technical set-up is described in this blog post.

Data transfer from SAP Analytics Cloud to SAP Datasphere - Data Export Service

With QRC2.2022, we generally released the new SAP Analytics Cloud Data Export Service (DES) to all SAP Analytics Cloud customers. In a nutshell, this is a generic OData-based pull API that can be triggered from other applications and platforms, including third-party tools. There is no dedicated UI for this functionality, as the API simply facilitates the extraction of SAP Analytics Cloud planning model data from an external platform.

The API comprises two services: the Administration service and the Provider service. The former returns a list of the models on your SAP Analytics Cloud system, and the latter retrieves information about a specific planning model. DES has the following key characteristics:

  • Extracts fact data, master data, and audit data.
  • Supports delta extraction for fact data and basic filtering capabilities.
  • Highly performant.
  • Supports business and technical user access.
  • Available for Cloud Foundry tenants.
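A minimal sketch of the two-step flow described above (Administration service to list the models, Provider service to read one model's data) could look as follows. The tenant host and model ID are placeholders, and the exact endpoint paths are an assumption to be verified against the DES documentation:

```python
# Placeholder SAC tenant host -- substitute your own.
SAC_HOST = "https://<tenant>.sapanalytics.cloud"
API_ROOT = f"{SAC_HOST}/api/v1/dataexport"

def administration_url() -> str:
    """Administration service: lists the models (providers) on the tenant."""
    return f"{API_ROOT}/administration/Namespaces(NamespaceID='sac')/Providers"

def provider_url(model_id: str, entity: str = "FactData") -> str:
    """Provider service: reads FactData, MasterData, or AuditData of one model."""
    return f"{API_ROOT}/providers/sac/{model_id}/{entity}"

print(administration_url())
print(provider_url("<model-id>", "FactData"))
```

A consumer such as an SAP Datasphere data flow would first call the Administration service to discover the model, then page through the Provider service entities for that model.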

The Data Export Service enables a wide range of possible scenarios. It will mostly be used to simplify the downstream processing of plan data that was generated in SAP Analytics Cloud.

The detailed technical set-up in combination with SAP Datasphere is described in this blog post. Further examples of using DES can be found in this blog post. Bear in mind that DES can also be used by other SAP and third-party applications!

5. Modelling in SAP Analytics Cloud - Liquidity Planning Model

A model is a representation of the business data of an organization or business segment. You can use a model as the basis for your story. The Modeler is the place where you can create, maintain, and load data into models.

Planning models are prepared and preconfigured to help you perform business planning tasks such as forecasting, supporting and streamlining the planning process with many off-the-shelf features that give you a quick start. When working with this type of model in a story, planning users can use a variety of features to update values in the model and create new ones. Out of the box, planning models come with:

  • Categories for budgets, plans, and forecasts.
  • Default time periods that you can quickly adjust to suit your data.
  • Auditing features for traceability.
  • Security features that make it possible for you to restrict access to specific values in data grids to named individuals.

More information on modeling can be found in the previous lesson under the Learn More section.

For our workshop, we have prepared a simple planning model that you will copy later on to load and manipulate data.

It consists of:

  • One measure, “AMOUNT”.
  • A dimension of type Account, “SAP_FI_LIQUIDITY_ITEM”, holding the accounts of our cash flow statement and accounts for our external data.
  • A dimension of type Version containing the different data versions for actual, budget, plan, and forecast data.
  • A dimension of type Date, “Time”, containing our planning periods and time hierarchy.
  • A dimension of type Generic, “SAP_ALL_COMPANY_CODE”, containing our company’s legal entities.
  • A dimension of type Generic, “Currency”, containing the involved transaction currencies.
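For orientation, the model structure listed above can be captured in a small data sketch. The dimension and measure names come from this workshop; the name of the Version dimension is not given in the text, so "Version" is an assumption here:

```python
# Structure of the prepared liquidity planning model, as described above.
liquidity_planning_model = {
    "measures": ["AMOUNT"],
    "dimensions": {
        "SAP_FI_LIQUIDITY_ITEM": "Account",  # cash flow statement + external data accounts
        "Version": "Version",                # actual, budget, plan, forecast (name assumed)
        "Time": "Date",                      # planning periods and time hierarchy
        "SAP_ALL_COMPANY_CODE": "Generic",   # our company's legal entities
        "Currency": "Generic",               # involved transaction currencies
    },
}

print(liquidity_planning_model["measures"])
```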

6. Stories in SAP Analytics Cloud

An SAP Analytics Cloud story is a presentation-style document that uses charts, visualizations, text, images, and pictograms to describe data. You can also plan and change data via stories. Once you create or open a story, you can add and edit pages, sections, and elements as you like. The story toolbar is divided into different categories such as File, Insert, Data, and Tools to help you find options and perform tasks more efficiently.

We created one story with two pages to be copied and edited as part of our workshop. The first page contains visualizations of our data in a dashboarding sense while the second page contains tables and actions for planning purposes.

7. What else will you learn?

Throughout the session, you will also be guided through the following product components.

In SAP Analytics Cloud:

In SAP Datasphere:

Summary

You can now move on to Combining External and Internal Data in SAP Datasphere.