Export Forecast Data

We now want to export the generated plan data from SAP Analytics Cloud to SAP Datasphere, where it can be used for downstream processing and combined plan vs. actual reporting.

An OAuth Client has already been set up in SAP Analytics Cloud. The necessary credentials and the SAC system’s token URL are included in this lesson for you to complete the export.

  1. In SAP Datasphere, go to Connections and click Create.

  2. Choose connection type Cloud Data Integration.

  3. Depending on the region in your SAP Analytics Cloud URL (EU10, US10, or AP11), enter the relevant settings.

EU10

If your SAP Analytics Cloud URL includes EU10 (e.g. https://academy-sac.eu10.hcs.cloud.sap/) enter these details:

US10

If your SAP Analytics Cloud URL includes US10 (e.g. https://academy-sac-2.us10.hcs.cloud.sap/) enter these details:

AP11

If your SAP Analytics Cloud URL includes AP11 (e.g. https://academy-sac-1.ap11.hcs.cloud.sap/) enter these details:

  • OAuth Client ID: sb-c96ff847-bf9c-45de-9d45-ce4916fa28de!b852|client!b23

  • OAuth Client Secret: c705978f-c670-45da-b7ef-443c88863ffb$Wp9osWXzPKOhEAolnMKyoGDzZq-8MYg1PJixCOOg2ls=
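Behind the scenes, the connection authenticates against the SAC tenant with the OAuth 2.0 client-credentials grant: the client ID and secret are exchanged at the token URL for an access token. SAP Datasphere does this for you, but as a rough sketch of what such a token request contains (the token URL and helper function below are illustrative placeholders, not values from your tenant):

```python
import base64
import urllib.parse

def build_token_request(token_url, client_id, client_secret):
    """Assemble the pieces of an OAuth 2.0 client-credentials token request.

    Returns the URL, headers, and form-encoded body that a client would
    POST to the token endpoint. This is a sketch for illustration only;
    SAP Datasphere performs the real exchange internally.
    """
    credentials = f"{client_id}:{client_secret}".encode()
    headers = {
        # HTTP Basic auth with client ID and secret, per RFC 6749
        "Authorization": "Basic " + base64.b64encode(credentials).decode(),
        "Content-Type": "application/x-www-form-urlencoded",
    }
    body = urllib.parse.urlencode({"grant_type": "client_credentials"})
    return token_url, headers, body

# Hypothetical token URL -- substitute the one provided in your lesson.
url, headers, body = build_token_request(
    "https://example.authentication.eu10.hana.ondemand.com/oauth/token",
    "sb-c96ff847-bf9c-45de-9d45-ce4916fa28de!b852|client!b23",
    "<your OAuth client secret>",
)
print(body)  # grant_type=client_credentials
```

The response to such a POST would carry a JSON payload with an `access_token` field that is then sent as a bearer token on subsequent API calls.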

  4. Click Next Step.

  5. Name the connection SAC_Datasphere_Export_<your_userid> and select Create Connection.

  6. Navigate to the Data Builder (as in the first lessons) and create a new Data Flow.

  7. In the panel on the left-hand side, go to Sources and find the newly created connection.

  8. Click the Import from Connection icon beside the newly created connection.

  9. Go back to your local copy of the model in SAP Analytics Cloud, NEW_SAP__FI_CLM_IM_LIQUIDITY_PLANNING, by following these steps:

  • Click on the SAP Analytics Cloud tab.
  • Click on Modeler in the side menu.
  • Open your model from your Recent Files list.
  • Copy your unique model ID from its URL.

You can now search for your model back in the SAP Datasphere Data Flow.
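If you want to pull the model ID out of the URL programmatically rather than by hand, a small helper like the following could do it. This is a sketch under the assumption that the ID appears as a 32-character hexadecimal token (with or without hyphens); check your own URL to confirm the actual shape, as the example URL below is hypothetical.

```python
import re

def extract_model_id(url):
    """Return the first UUID-like hex token found in a URL, or None.

    Assumes the model ID is a 32-character hex string, optionally
    hyphen-separated -- an assumption about the URL format, so verify
    it against your own tenant's URLs.
    """
    match = re.search(
        r"[0-9A-Fa-f]{8}(?:-?[0-9A-Fa-f]{4}){3}-?[0-9A-Fa-f]{12}", url
    )
    return match.group(0) if match else None

# Hypothetical URL shape, for illustration only.
print(extract_model_id(
    "https://academy-sac.eu10.hcs.cloud.sap/app.html#/modeler"
    "&/m/model/C96FF847BF9C45DE9D45CE4916FA28DE"
))
```

Paste whatever token your own URL contains into the search box in the next step.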

  1. Go back to SAP Datasphere, expand the connection name, and then expand the sac (SAC Namespace) node.

  2. Paste your model ID in the search box and hit return.

  3. Select the model ID to display a list of the technical objects belonging to the model.

  4. Select FactData and click Next.

  5. On the Import Objects from Connection screen, select FactData and click Add Selection to begin the import.

  6. The fact data is now available on the canvas.

The Data Flow must write the data into a table.

  1. Click on the table icon to add a table.

  2. Name the table T_FactData_<your_userid> and click Create and Deploy Table.

  3. Click anywhere in the white space in the middle panel to show the Data Flow properties, then change the name of the Data Flow to DF_FactData_Liquidity_<your_userid>.

  4. Click Deploy.

  5. Click Save.

  6. Click Run to run the Data Flow. Once it has started, click Data Integration Monitor in the left panel to check the run status.

  7. Refresh the status in the monitor periodically until the Data Flow run has completed.

  8. Click the back arrow to return to the Data Flow.

  9. Click the table and toggle the data preview to view the data.
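The run-and-poll pattern from steps 6 and 7 above can be sketched in code. The status labels and the `get_status` callable below are illustrative stand-ins, not the exact states or APIs that SAP Datasphere exposes; in the exercise you perform the same loop manually by refreshing the Data Integration Monitor.

```python
import time

def wait_for_completion(get_status, poll_seconds=10, timeout_seconds=600):
    """Poll get_status() until it reports a terminal state.

    get_status is any callable returning a status string. The state
    names "COMPLETED"/"FAILED" are assumptions for this sketch, not
    the literal labels used by the Data Integration Monitor.
    """
    deadline = time.monotonic() + timeout_seconds
    while time.monotonic() < deadline:
        status = get_status()
        if status in ("COMPLETED", "FAILED"):
            return status
        time.sleep(poll_seconds)  # wait before refreshing the status
    raise TimeoutError("Data Flow run did not finish in time")

# Simulated status source standing in for the monitor.
statuses = iter(["RUNNING", "RUNNING", "COMPLETED"])
print(wait_for_completion(lambda: next(statuses), poll_seconds=0))  # COMPLETED
```

A fixed polling interval is fine for a short-running Data Flow; for longer runs, an exponential back-off between refreshes would reduce unnecessary checks.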

Summary

Congratulations! You have completed all exercises.

In a real scenario, you could now feed this data into your corporate plan vs. actual reporting and other planning processes.

Note that when using a Data Provisioning Agent and remote tables, you could also enable delta replication for the extraction of SAP Analytics Cloud data to SAP Datasphere.