Loading Master Data

Objective

After completing this lesson, you will be able to load master data.

Loading Attributes and Texts

The company's analysts want to combine transactional data with historical or current master data. Master data must be consistent across different data sets and available at all times. Therefore, you regularly load master data into the centrally defined InfoObjects of SAP BW bridge, and only then load transactional data into SAP BW bridge so that the implemented consistency checks can take effect.

Suppose you have already created the DataSource, the InfoObject, and the transformation. To load data, you need an additional BW bridge object: the data transfer process (DTP). It allows you to add filter criteria and error-tracking properties. For each combination of source, target, and restriction, you create a separate DTP.

The image shows the purpose of the data transfer process (DTP): it transports records from a source to a target, and you can specify a filter condition. At the bottom of the image, you see three source records. At the top, you see separate target tables for attributes and texts. After the filter is applied, each of them contains two rows.
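The following Python snippet is only a minimal conceptual sketch of this idea, not an SAP API: the record and field names and the filter condition are invented. A filter is applied to some source records, and the remaining rows are written to separate attribute and text targets.

```python
# Conceptual sketch of what a DTP does; all names and data are invented.
source_records = [
    {"product": "P01", "category": "A", "text": "Bike"},
    {"product": "P02", "category": "B", "text": "Helmet"},
    {"product": "P03", "category": "C", "text": "Lock"},
]

# The filter condition restricts which source records reach the target.
def dtp_filter(record):
    return record["product"] in ("P01", "P02")   # example restriction

filtered = [r for r in source_records if dtp_filter(r)]

# One DTP per combination of source, target, and restriction:
# here, one target for attributes and one for texts.
attribute_target = [{"product": r["product"], "category": r["category"]} for r in filtered]
text_target = [{"product": r["product"], "text": r["text"]} for r in filtered]

print(attribute_target)  # two of the three source records reach the attribute target
print(text_target)       # two of the three source records reach the text target
```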

In this unit, you will learn how to correctly load master data into SAP BW bridge. A new record overwrites an existing one with the same key value, so you must avoid that the most recent change is overwritten by an older one. It is therefore important to preserve the order of changes.
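As a conceptual illustration (with invented product records, not actual SAP BW bridge code), the following sketch shows why the order of changes matters when records are overwritten by key:

```python
# Conceptual sketch: master data is keyed, so a new record overwrites an
# existing one with the same key. The example data is invented.
master_data = {}  # key: product, value: attribute record

def load(record):
    master_data[record["product"]] = {"category": record["category"]}

changes = [
    {"product": "P01", "category": "A"},  # oldest change
    {"product": "P01", "category": "B"},
    {"product": "P01", "category": "C"},  # most recent change
]

# Applied in the order in which the changes happened, the newest value wins.
for change in changes:
    load(change)
print(master_data["P01"])  # {'category': 'C'} -> correct

# If an older change is applied last, it overwrites the newer one.
for change in reversed(changes):
    load(change)
print(master_data["P01"])  # {'category': 'A'} -> wrong
```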

For SAP BW bridge as a target, Operational Data Provisioning is the central infrastructure for data extraction and data replication from SAP (ABAP) applications. It encompasses the Operational Delta Queue (ODQ), which ensures that the target applications (called subscribers) retrieve the data from the delta queue in the correct order and continue processing the data.
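The following sketch is only a rough illustration of the delta queue idea, with invented names and none of the real ODQ mechanics: changes are queued in the order they occur, and each subscriber retrieves the changes it has not yet read, in that same order.

```python
# Conceptual sketch of a delta queue with subscribers; invented names only.
from collections import deque

delta_queue = deque()      # changes arrive in the order in which they happened
subscriber_position = {}   # each subscriber remembers how far it has read

def enqueue(change):
    delta_queue.append(change)

def read_new_changes(subscriber):
    start = subscriber_position.get(subscriber, 0)
    changes = list(delta_queue)[start:]          # changes not yet retrieved
    subscriber_position[subscriber] = len(delta_queue)
    return changes                               # delivered in original order

enqueue({"product": "P01", "category": "B"})
enqueue({"product": "P01", "category": "C"})
print(read_new_changes("BW bridge"))  # both changes, in the order they occurred
```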

The image shows why the order is important when loading master data, using an example that loads the category of a product. In the source table, the category changes from A to B and then to C. The data load is split into two packages, and record B of package 1 is processed more slowly than record C of package 2. As a result, value C is overwritten and value B is stored at the end. The image shows the wrong value B in the target table.

SAP BW bridge can split large sets of records into subsets called packages. In this case, you must make sure that these packages are not loaded in parallel for the same target. This is a setting on the DTP.

The image shows the same example: in the source table, the category of a product changes from A to B and then to C. When the data load is split into two packages, you can avoid the issue by loading the packages in sequence. The image shows the correct value C in the target table.
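The sketch below illustrates the difference with invented data (it is not how SAP BW bridge processes packages internally): if two packages for the same key run in parallel and the package with the older change finishes last, the older value wins; processing the packages in sequence preserves the newest value.

```python
# Conceptual sketch: why packages for the same target must not run in parallel.
package_1 = [{"product": "P01", "category": "B"}]  # older change
package_2 = [{"product": "P01", "category": "C"}]  # newer change

target = {"P01": {"category": "A"}}  # current value in the target

def process(package):
    for record in package:
        target[record["product"]] = {"category": record["category"]}

# Parallel processing, simulated by completion order: package 1 is slower,
# finishes last, and overwrites the newer value C with the older value B.
process(package_2)       # finishes first
process(package_1)       # finishes last
print(target["P01"])     # {'category': 'B'} -> wrong

# Serial processing (the DTP setting): packages are processed in the order
# in which they were created, so the newest value C wins.
target = {"P01": {"category": "A"}}
for package in (package_1, package_2):
    process(package)
print(target["P01"])     # {'category': 'C'} -> correct
```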

However, if the packages contain disjoint key values, you can load them in parallel. To achieve this, you can either define different DTPs with different filters or use the group-by extraction setting. With this setting, all records with the same semantic key are handled in the same package.

The image shows the same example: in the source table, the category of a product changes from A to B and then to C. When the data load is split into two packages, you can also avoid the issue by defining a semantic extraction key for splitting the packages. The image shows the correct value C in the target table.
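The following sketch illustrates the grouping idea with invented records and a hypothetical key field; it is not the actual extraction algorithm. When records are distributed to packages by their semantic key, each key ends up in exactly one package, so different packages never touch the same target record and can be loaded in parallel.

```python
# Conceptual sketch: group records into packages by a semantic key.
from collections import defaultdict

source_records = [
    {"product": "P01", "category": "A"},
    {"product": "P02", "category": "X"},
    {"product": "P01", "category": "B"},
    {"product": "P01", "category": "C"},
    {"product": "P02", "category": "Y"},
]

# Records with the same product key stay together and keep their original
# order within their package.
packages = defaultdict(list)
for record in source_records:
    packages[record["product"]].append(record)

for key, package in packages.items():
    print(key, package)
# Because each key appears in exactly one package, the packages can be
# processed in parallel without overwriting each other's target records.
```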

Creating a Data Transfer Process

Now that you have learned why the loading order is important, you probably want to learn how to load data in SAP BW bridge. Let's start with a simple example for master data (attributes).

Launch a video to learn how to create and start a DTP with the correct settings.

The request is an instance that is generated at the runtime of the Data Transfer Process. The request is processed in the steps that have been defined for the Data Transfer Process (extraction, filter, and transformation). The monitor for the Data Transfer Process request shows the header information, request status, and the status and messages for the individual processing steps.

When you execute the Data Transfer Process (DTP), the system asks you whether you want to check the monitor, which is part of the SAP BW/4HANA Cockpit. In the monitor, you find all of the information about the loading process, like the number of records and the total runtime.

Note that for the same InfoObject, you should load attributes first, and then texts or hierarchies. After executing the DTP, the master data-bearing characteristic contains the loaded attributes, texts, or hierarchies in the associated database tables.

Independent targets, such as texts and hierarchies, can be updated in parallel.

Now, after loading data, you want to check if the result is stored correctly in the master data tables. Launch a video to see how you can start different DTPs in parallel and how you can review the loaded values.

To sum it up, you can check the loaded data in the following ways:

  • From the BW Modeling tools, from a DTP, choose Show Monitors → Show Last Request (log on if necessary), wait until the BW bridge cockpit opens, then choose Manage Request and Storage: Active Data → Contents.
  • From the BW Modeling tools, from an InfoObject, choose Miscellaneous → Master Data Maintenance (log on if necessary), and wait until the BW bridge cockpit opens.
  • From the BW Modeling tools, from the Properties view, DDIC section, next to the relevant table, choose Display Data (log on if necessary), and wait until the BW bridge cockpit opens.
  • From anywhere inside the BW bridge cockpit, search and open the InfoObject Data Maintenance app, then enter the name of the basic characteristic you want to review.

Suppose the DTP works as expected. When you want to automate the loading process, the recommendation is to create a process chain. We will cover this topic in unit 5.
