Importing and Preparing Transaction Data in a Model

Objectives

After completing this lesson, you will be able to:

  • Explain data transformation functions in an import model.
  • Import data into a model.

Import Transaction Data

Just as with public dimensions, data can be imported into models from files as well as source systems. However, note that the Date and Version dimension data is system-generated; therefore, there is no import option.

Master data is normally imported before transaction data is imported into the model. When transaction data is imported, it is stored in the model’s fact table. A model can receive data from one or several source systems depending on the scenario.

The workflow for importing transaction data into a model is similar to the workflow for importing master data into a dimension. One key difference is the Set Up Import activity, which is made up of four work steps.

The following figure shows the workflow for importing transaction data into the model.


Data Import Workflow

The workflow below describes the import process for a measure-based model, not an account-based model.

Workflow to Import Transaction Data from an SAP Source System

When you import data into a model from an SAP source system, you will need to select the fields to import. In addition, you can set a filter for the import.

  1. In the model, choose the Data Management workspace.
  2. Choose Import Data, then Data Source.
  3. Select the Source System and then the Connection.
  4. Choose the data source.
  5. Select fields to be imported; add filter(s) if necessary. When the import is created, it appears in the data management workspace of the model with a hyperlink: Set Up Import.

    The following figure illustrates how you select the data to import into the model structure.

    1. Available data: The source system fields are shown on the left side of the dialog. You can scroll through the list or search for specific fields that are required.
    2. Selected data: The fields that have been selected to be imported are displayed here.
    3. Filters: Filter the data where needed. In the example, week 53 is not required so the filter is set to Less than or equal to 52.
    Edit SAP Universe Query dialog with Available Data, Selected Data and Filters showing. The filter is set to Less than or equal to 52 weeks.
  6. Set up the import:
    1. Prepare the data.
    2. Map the source to the target.
    3. Map properties.
    4. Review the import.
  7. Run the import and read the log. If records fail, debug and re-import.

To run the import again, either repeat the process manually or schedule it to run on a regular basis.

Prepare Transaction Data

As we have already seen, data used in SAP Analytics Cloud models may need to be modified for reporting and analysis purposes. For example, data imported into a model from a source system may store employee names in two columns, First Name and Last Name. For stories and analytic applications, however, designers may want a single column that combines the first and last names.
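To make the idea concrete, here is a minimal sketch of that kind of column-combining transformation, written in Python with pandas purely for illustration. It is not SAP Analytics Cloud itself, and the combined column name Employee Name is an assumption.

```python
import pandas as pd

# Illustrative only: the equivalent combine logic in pandas, outside
# SAP Analytics Cloud. "Employee Name" is an assumed target column name.
source = pd.DataFrame({
    "First Name": ["Ana", "Ben"],
    "Last Name":  ["Ruiz", "Koch"],
})

# Combine the two source columns into the single column designers want.
source["Employee Name"] = source["First Name"] + " " + source["Last Name"]

print(source["Employee Name"].tolist())  # ['Ana Ruiz', 'Ben Koch']
```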

In the Prepare Data step, you can do quality checks, perform simple and complex transformations, and change the job settings.

  1. Make simple data changes with smart transformations and quick actions directly from the data table.
  2. Use the overview pane to perform a quality check on the data.
  3. Perform complex transformations with the Custom Expression Editor.
  4. Change the job settings.
The Prepare Data step with highlights on the data table for quick actions, the overview pane, the custom expressions editor icon, and change job settings icon.

Custom Expression Editor

Just as when importing data into a dimension, simple transformations are available via smart transformations and quick actions when importing into a new model. If more complex transformations are needed, the Custom Expression Editor is available.

In the Before data table, you can see that the source data contains the year and the week as separate columns. SAP Analytics Cloud requires weeks 1-9 in two-digit format, for example 08 instead of 8, as seen below.

The solution is to:

  1. Use the Custom Expression Editor to add a leading 0 when the week number is less than 10.
  2. Use the Transformation Bar to concatenate the Year and Week values with no separator into the required YYYYWK format.

These results are shown in the After data table.

At the top, a data table with Year and Week in separate columns and a single-digit week shown as 8. Below it, a concatenation formula putting the data into Year_Wk format.
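For readers who prefer to see the logic spelled out, the following sketch reproduces the same two steps in Python with pandas. It is not how the Custom Expression Editor is implemented; the column names Year, Week, and Year_Wk follow the example above, and the sample values are invented.

```python
import pandas as pd

# Illustrative sketch of the transformation above, outside SAP Analytics Cloud.
before = pd.DataFrame({"Year": [2012, 2012, 2012], "Week": [8, 9, 52]})

# Step 1: add a leading 0 when the week number is less than 10 (08 instead of 8).
week_padded = before["Week"].astype(str).str.zfill(2)

# Step 2: concatenate Year and Week with no separator into the YYYYWK format.
before["Year_Wk"] = before["Year"].astype(str) + week_padded

print(before)
#    Year  Week Year_Wk
# 0  2012     8  201208
# 1  2012     9  201209
# 2  2012    52  201252
```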

Job Settings

During the import process, the Job Settings can be maintained:

  • Import method: These selections are important when importing data again for the same data region.
  • Reverse sign by account type: If the source system stores revenue, equity, and liabilities with positive values, then they can be reversed based on the account type property in the account dimension.
  • Update local dimensions with new members: If the model contains a private dimension, then its members can be updated with new members during the model import.
  • Conditional validation: This is used to validate members to make sure they are not parent members in a hierarchy.
Job setting options showing Import Settings and Validation Settings sections.
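As an illustration of what Reverse sign by account type means in practice, here is a small sketch of the equivalent logic in pandas. The type codes shown (INC for income, LEQ for liabilities and equity, EXP for expense) follow SAP Analytics Cloud's account Type property, but the account members, values, and the exact set of reversed types are assumptions made for the example.

```python
import pandas as pd

# Illustrative sketch only: sign reversal driven by an account-type property.
records = pd.DataFrame({
    "Account": ["Revenue", "Equity", "Travel Expense"],   # assumed members
    "Type":    ["INC",     "LEQ",    "EXP"],              # account Type property
    "Value":   [1000.0,    500.0,    200.0],              # all positive in the source
})

# Reverse the sign for income and liability/equity account types only.
reverse_types = {"INC", "LEQ"}
records.loc[records["Type"].isin(reverse_types), "Value"] *= -1

print(records)
```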

Map to Target

In the Map to Target step, the system maps fields automatically where possible. If the source and target field names are different, the mapping can be completed using drag and drop. When importing from an SAP source system, the mapping can be edited if needed.

The Map to Target screen. On the left are source columns, in the middle are unmapped target columns, and on the right are mapped target columns.
  1. Source fields.
  2. Unmapped target fields. You can:
    • Map to source fields.
    • Set to a constant.
    • Set to unassigned (#).
  3. Mapped fields.
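The rules just listed (map to a source field, set to a constant, or set to unassigned) can be pictured as a simple lookup, as in the hedged sketch below. All field and member names are invented for illustration and do not come from the example model.

```python
# Illustrative sketch of the Map to Target rules; names are invented.
field_mapping = {
    "Store":     {"source": "STORE_NAME"},       # mapped to a source field
    "Version":   {"constant": "public.Actual"},  # set to a constant
    "Sales Org": {"unassigned": True},           # set to unassigned (#)
}

def build_target_record(source_row: dict, mapping: dict) -> dict:
    """Build one target record from a source row using the mapping rules."""
    target = {}
    for field, rule in mapping.items():
        if "source" in rule:
            target[field] = source_row[rule["source"]]
        elif "constant" in rule:
            target[field] = rule["constant"]
        else:
            target[field] = "#"  # unassigned member
    return target

print(build_target_record({"STORE_NAME": "e-Fashion New York"}, field_mapping))
# {'Store': 'e-Fashion New York', 'Version': 'public.Actual', 'Sales Org': '#'}
```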

Map Properties

The Map Properties step can be used to map the properties of both public and private dimensions, if there are any. Note that the only way to import data into a private dimension is during the model import. Also, when importing into an analytic model, the Map Properties step indicates that all records have been assigned to the public Actual version.

Note

All target fields must be mapped, set to a default value, or set to Unassigned. Some source fields may be left unmapped if they are not needed.

Review Import

Before you complete the import, you must review any issues. In the Review Import step, if there are no issues the Dimension Restrictions area will be blank. If you have any issues, choose the Prepare Dataset workspace, correct the issues, re-map if necessary, and return to the Review Import workspace.

For example, in the following image you can see the issue that requires review. The U00 Stores dimension does not include e-Fashion in the store name, so the imported data does not match the model dimension's data.

The review import step

Import Query Options

After the import process is complete, there are several important options in the Data Management workspace:

  • Edit allows you to change the import query.
  • Schedule is where you can set up the import query to run periodically (for example, daily).
  • Select import method is used to change the import method (for example, from Update to Clean and Replace).
  • Refresh is used to run the import again.
The Data Management workspace is open, with the Edit, Schedule, and Select import method icons at the top right of the screen highlighted. The More options menu has been selected for the import job to show the Refresh option.

Combine Data from Two Sources: Account-Based Model Only

In an account-based model you can combine data from two sources. For example, you may have a data source that contains a dimension for Entity and another one with the same Entity and a Profit Center dimension. You can combine both data sources into one model so that all the members for both Entity dimensions use the same Profit Center.
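Conceptually, the combination behaves like a lookup-style join on the shared Entity dimension, as in the following pandas sketch. The entity and profit center members are invented for illustration and are not taken from the video.

```python
import pandas as pd

# Illustrative sketch of combining two sources on a shared Entity dimension.
source_a = pd.DataFrame({"Entity": ["1000", "2000"], "Amount": [250.0, 400.0]})
source_b = pd.DataFrame({"Entity": ["1000", "2000"], "Profit Center": ["PC10", "PC20"]})

# Bring the Profit Center from the second source onto every Entity row.
combined = source_a.merge(source_b, on="Entity", how="left")

print(combined)
#   Entity  Amount Profit Center
# 0   1000   250.0          PC10
# 1   2000   400.0          PC20
```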

Watch this video to learn how to prepare data by combining data from two sources, in this case Microsoft Excel.

Update Methods

When data must be imported regularly into SAP Analytics Cloud models, you can use import method options to prevent data duplication. In addition, when importing from SAP systems, the import is typically filtered to the current period.

A choice of four import methods is available:

  1. Update: Updates the existing data and adds new entries to the target model. The scope of this update is based on a combination of all dimensions in the target model.

    For a more refined scope, use either the Clean and replace selected version data or Clean and replace subset of data update options.

  2. Append: Leaves the existing data unchanged and adds only new entries to the target model. When using append there are two options:
    1. Reject duplicated records in the model.
    2. Aggregate duplicated rows in the model.
  3. Clean and replace selected version data: Deletes the existing data and adds new entries to the target model only for the versions that you specify in the import. You can choose to use either the existing version or specify a new version under Version.

    For example, if you specify to import data for the Actual version, only the data in the Actual version is cleaned and replaced. Other versions, for example Planning, are not affected.

  4. Clean and replace subset of data: Replaces existing data and adds new entries to the target model for a defined subset of the data based on a scope of selected versions using either the Existing Version or New Version buttons. You can also limit the scope to specific dimensions. To define a scope based on a combination of dimensions, select + Add Scope and use the Select a dimension field to specify a dimension.

    When a Date dimension is defined in the scope, the time range in the source data (calculated based on the target model's granularity) combined with other dimensions in the scope will determine what existing data is cleaned and replaced.

    If, for example, Date and Region dimensions are defined as part of a scope, only entries that fall within the time range and match Region from the source data will be replaced in the target model. Existing data that does not match the scope will be kept unchanged.
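The following sketch illustrates the Clean and replace subset of data behaviour described in option 4, using a Date and Region scope. It shows the conceptual outcome only, not the actual import engine, and all member names and values are invented.

```python
import pandas as pd

# Illustrative sketch: clean and replace only the Date + Region scope.
existing = pd.DataFrame({
    "Date":   ["202401", "202401", "202402"],
    "Region": ["EMEA",   "APJ",    "EMEA"],
    "Amount": [100.0,     80.0,     120.0],
})
incoming = pd.DataFrame({
    "Date":   ["202401"],
    "Region": ["EMEA"],
    "Amount": [110.0],
})

# The scope is the incoming time range combined with the incoming Region members.
in_scope = existing["Date"].isin(incoming["Date"]) & existing["Region"].isin(incoming["Region"])

# Entries inside the scope are cleaned and replaced; everything else is kept.
result = pd.concat([existing[~in_scope], incoming], ignore_index=True)

print(result)
#      Date Region  Amount
# 0  202401    APJ    80.0
# 1  202402   EMEA   120.0
# 2  202401   EMEA   110.0
```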

Comparison: Import Methods

Examine the example below. Be sure to review the incoming data and see how the resulting data changes depending on the import method selected.

A data table containing examples of the data that has already been imported, the data to be imported and three versions of the data depending on if you choose Update, Append or Clean and replace.

Since the incoming data file only includes actual data, there will be no impact on the existing budget data.

  1. Update method:
    • Benefits increase to 1600.
    • A new record for Direct Labor 160 is inserted.
    • All other records remain the same.
  2. Append method:
    • Benefits increase to 3100.
    • A new record for Direct Labor 160 is inserted.
    • All other records remain the same.
  3. Clean and replace method:
    • Benefits increase to 1600.
    • A new record for Direct Labor 160 is inserted.
    • Actual records not in the incoming file are removed.
    • All other records remain the same.
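The three outcomes above can be reproduced with a small pandas sketch. The existing Benefits value of 1,500 is implied by the figures (1,500 + 1,600 = 3,100 for Append with aggregation); the remaining existing records are assumptions included only to show that they stay unchanged.

```python
import pandas as pd

# Illustrative sketch of the three import methods on the example data.
existing = pd.DataFrame({
    "Version": ["Actual",   "Actual",   "Budget"],
    "Account": ["Benefits", "Salaries", "Benefits"],
    "Amount":  [1500.0,      900.0,      1700.0],
})
incoming = pd.DataFrame({                      # incoming file: Actual data only
    "Version": ["Actual",   "Actual"],
    "Account": ["Benefits", "Direct Labor"],
    "Amount":  [1600.0,      160.0],
})
keys = ["Version", "Account"]

# Update: incoming rows overwrite matching rows; new rows are inserted.
unmatched = (existing.merge(incoming[keys], on=keys, how="left", indicator=True)
                     .query("_merge == 'left_only'").drop(columns="_merge"))
update = pd.concat([unmatched, incoming], ignore_index=True)        # Benefits -> 1600

# Append (aggregate duplicated rows): matching rows are summed.
append = (pd.concat([existing, incoming])
            .groupby(keys, as_index=False)["Amount"].sum())         # Benefits -> 3100

# Clean and replace selected version data: existing Actual rows are removed
# first, then the incoming Actual data is loaded; Budget is untouched.
clean_replace = pd.concat(
    [existing[existing["Version"] != "Actual"], incoming], ignore_index=True)

print(update, append, clean_replace, sep="\n\n")
```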

Debug Transaction Data

When some incoming records are correct but others have issues, the good records are imported and the bad records are identified in a rejection summary, which appears in the Data Management workspace.

Note

If there is an issue with a scheduled import, the schedule status log can be accessed via Connections in the navigation bar.

Rejected records: In this example, 1,116 rows were rejected. The rejection summary shows that a dimension member does not exist. You must update the dimension and then refresh the import so the data loads into the model.

The model's Data Management screen with a rejection summary. The rejection summary lists the issues along with how to fix them.

Import and Prepare Data in an Analytic Model

Task 1: Populate the Model with Data from a Query from an SAP Universe

Business Example

You need to acquire weekly data from an SAP Universe into an analytic model.

In this practice exercise, you will:

  • Select the model for the data import and the connection to be used
  • Build the SAP Universe query

Task 2: Set Up the Import Job to Wrangle and Map the Query Data to the Model

Business Example

You need to set up an automatic job that will import and wrangle the weekly data from an SAP Universe into a model.

In this practice exercise, you will:

  • Use the Custom Expression Editor to transform data
  • Map data
  • Run the import
  • View the data in the model
