In this exercise, we will copy the planning model from the sample content into our own folder and import the actuals and influencer data from SAP Datasphere. For this, we will use an existing connection to SAP Datasphere and create a data import job in the planning model.
First, we open the Files folder by clicking on Files on the left navigation panel.

Next, we navigate to the Public folder, where we can find the sample content. This folder contains the story and data model which we will use as the basis for the next exercises.

Click on the folder SAC_Datasphere_EnergyCost.

Select both the Model and Story contained in the folder, and then click the Copy icon.

In the Copy 2 Files to… pop-up window, click on the down arrow in the file path and select My Files.

Click OK.

You will see a confirmation at the bottom of the screen that the two files have been copied to My Files.

At the top of the screen, click on Files to return to the location where the two files have now been copied. Select the first file COPY_SAP__FI_CLM_IM_LIQUIDITY_PLANNING, and then click the edit icon.

Change the title of the file to NEW_SAP__FI_CLM_IM_LIQUIDITY_PLANNING, and then click Save.

Now let’s load the data from SAP Datasphere into our SAP Analytics Cloud (SAC) model and build a table on top of it. Navigate back from the Public folder to My Files and open the model NEW_SAP__FI_CLM_IM_LIQUIDITY_PLANNING.

We switch to the Data Management View of the model where we can create the Import Job.

Click on “Import Data”.

In the pop-up window we choose OData Services.

Next, we are prompted to select a connection from the dropdown list. We choose the connection we defined during Exercise 2 and click on “Next”.

Now we create a new query for the OData service, set its name to V_Union_Actuals_and_Influencer, and select the corresponding table. Then click Next to proceed.

In this view we can see the dimensions and measures that the table consists of. We could select individual dimensions and measures in the query; in this case, however, we want to include the whole table. Simply drag the table to the Selected Data area and drop it there. Now all 10 dimensions and measures are included in the result set. To confirm, click Create. The system now sets up the import job in the background and notifies us as soon as it is created.
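Behind the scenes, the query we just built corresponds to an OData request against the Datasphere service. The sketch below shows how such a request URL could be assembled; the service root is a placeholder, and only the entity name V_Union_Actuals_and_Influencer comes from this exercise:

```python
from typing import Optional

def build_odata_url(base_url: str, entity: str, top: Optional[int] = None) -> str:
    """Construct a simple OData GET URL for an entity set."""
    url = f"{base_url.rstrip('/')}/{entity}"
    params = []
    if top is not None:
        params.append(f"$top={top}")  # limit rows, e.g. for a quick preview
    params.append("$format=json")
    return url + "?" + "&".join(params)

# Hypothetical service root; the entity name matches the view in this exercise.
url = build_odata_url("https://example.datasphere.cloud/odata/v4/space",
                      "V_Union_Actuals_and_Influencer", top=5)
print(url)
```

SAC issues equivalent requests for us once the import job runs, so this is purely to illustrate what the connection and query definition describe.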

Once the import job is created click on Set Up Import.

The first view shown is the data preparation step. Here we could resolve data quality issues before the mapping step, and also wrangle the data and make edits such as renaming columns or creating transformations. In our case, we can skip this step by clicking Next and proceed with the mapping.

In the next step, the mapping between source and target columns is defined. The system has already done most of the mapping for us, so we only have to map the TIMEMONTH column to the Time column. This can be done by dragging the TIMEMONTH column onto the blank source space of the Time column. Once this is done, click Next.
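Conceptually, this mapping step is just a rename of source columns to target columns applied to every imported row. A minimal sketch of that idea (only the TIMEMONTH-to-Time mapping comes from this exercise; the other column name is illustrative):

```python
# Source-to-target column mapping, expressed as a dict.
# Only TIMEMONTH -> Time is defined here; unmapped columns pass through unchanged.
COLUMN_MAPPING = {"TIMEMONTH": "Time"}

def apply_mapping(row: dict, mapping: dict) -> dict:
    """Rename the keys of one source row according to the mapping."""
    return {mapping.get(col, col): value for col, value in row.items()}

# "AMOUNT" is an illustrative source column, not taken from the exercise.
row = {"TIMEMONTH": "202401", "AMOUNT": 1250.0}
print(apply_mapping(row, COLUMN_MAPPING))
```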

In the Dimension Properties view, we can add descriptions by mapping the description members of our source columns to the respective target columns. The mapping is shown in the screenshot below. Drag and drop TRANSACTION_CURRENCYDESCRIPTION and AMOUNTUNIT to the source column. Once this is done, click Next. The system now validates the mappings, and if no errors are found, we can proceed to run the import.

Click on Run Import.

Then, in the Run Import pop-up that appears, click Finish.

The data is now imported into SAC. Once the process is finished, the status bar on the right shows the Data Timeline information: the date and duration of the import and the number of rows that have been imported.

Congratulations! You have successfully imported data from SAP Datasphere to SAP Analytics Cloud. Any yellow warnings can be ignored; they appear due to null values in one of the files uploaded to SAP Datasphere and have no negative impact going forward. Now let’s have a look at how we can build a table on top of that data to carry out our Liquidity Planning.