The Data Transfer Process (DTP) is used to transfer data into and within SAP BW/4HANA. Transformations and filters can be applied during the transfer.
The following table lists the possible sources and targets of a DTP:
Sources and Targets of a DTP
Object Type | Usable as DTP Source | Usable as DTP Target |
---|---|---|
DataSource | yes | no |
CompositeProvider | yes | no |
Query | yes | no |
SAP HANA Analysis Process | yes | no |
InfoSource | no | no |
Characteristic InfoObject (Attribute, Text, Hierarchy, or XXL Attribute) | yes | yes |
DataStore Object (advanced) | yes | yes |
Open Hub Destination | no | yes |
In a Data Transfer Process in SAP BW/4HANA, the setting Extraction Mode has the following options:
Full: extracts the complete (filtered) data set on every execution. Therefore, use it only if the data volume is small and existing records are overwritten in the target. Full can also be used to "repair" the data of a specific semantic filter that was already extracted in a previous delta run with a semantically wrong rule in a transformation.
Delta: loads only the source records that have not yet been loaded with this DTP. The first delta execution is an initialization step that retrieves all records.
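The difference between the two modes can be illustrated with a small conceptual sketch. The request bookkeeping shown here is hypothetical and only mirrors the described behavior; it is not how SAP BW/4HANA implements delta handling internally.

```python
# Conceptual sketch of Full vs. Delta extraction (not actual BW/4HANA code).
# A "request" is simply a batch of source records identified by an ID.

source_requests = {
    "REQ_001": [{"product": "P01", "amount": 100}],
    "REQ_002": [{"product": "P02", "amount": 250}],
}

loaded_by_this_dtp = set()  # delta bookkeeping: requests already transferred by this DTP

def extract(mode: str) -> list:
    """Return the records one DTP execution would transfer in the given mode."""
    if mode == "FULL":
        # Full always reads everything that matches the filter.
        return [rec for recs in source_requests.values() for rec in recs]
    if mode == "DELTA":
        # Delta reads only requests not yet transferred by this DTP. The first
        # run (empty bookkeeping) acts as initialization and retrieves all records.
        new = [rid for rid in source_requests if rid not in loaded_by_this_dtp]
        loaded_by_this_dtp.update(new)
        return [rec for rid in new for rec in source_requests[rid]]
    raise ValueError(mode)

print(extract("DELTA"))  # first delta run: all requests (initialization)
print(extract("DELTA"))  # second delta run: nothing new -> []
```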
The Delta DTP

By default, the DTP is set to run in Delta mode when the underlying source supports a delta mechanism.
DTP Settings for Delta Mechanisms

In a DTP, various settings related to delta loads can be defined:
- Filter: Restricts the amount of data read from the source. For filters, use characteristics whose values are not changed in the source records. Filter conditions of different full DTPs may overlap, but filter conditions of different delta DTPs of the same target must be completely disjoint to avoid duplicated records; a delta DTP with overlapping conditions cannot be activated (see the sketch after this list).
- Only Retrieve last request: Use this option if the source contains a sequence of full loads and the latest request contains all the information.
- Extract all green requests: When reading from a Staging DataStore Object that has only the inbound table, only requests that are older than the first yellow or red request are extracted. If you set this flag, green requests with yellow or red requests between them are extracted, too. The leading practice is to not set it, that is, to keep the request sequence.
- Only get delta once: With a delta-enabled DataSource, data is loaded only once, even if it has been deleted from the target. This can be needed if the source does not provide a deletion record when source data is deleted: the SAP BW/4HANA data can then be aligned with the source by dropping data sets in the target before loading the source data again, and this setting prevents outdated historical data from being reloaded. The leading practice is to deselect this option in usual scenarios (requests are only loaded once anyway).
- Perform Delta Initialization Without Data: Marks the source data as already transferred without actually loading it. Use this option if the target already contains the data, or if no historical data is needed.
- Parallel Processing: The DTP extracts and processes different packages in parallel. The leading practice is to avoid it if consistency depends on the sequence of records; otherwise, use it.
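To illustrate the disjointness rule mentioned for the Filter setting, the following sketch checks whether the filters of two delta DTPs feeding the same target overlap. The company-code values and the check itself are purely illustrative; in SAP BW/4HANA this check happens when you try to activate the delta DTP.

```python
# Illustrative check of the "disjoint filters" rule for delta DTPs
# that feed the same target (not actual BW/4HANA logic).

dtp_1_filter = {"COMP_CODE": {"1000", "2000"}}   # hypothetical company codes
dtp_2_filter = {"COMP_CODE": {"3000", "4000"}}
dtp_3_filter = {"COMP_CODE": {"2000", "3000"}}   # shares values with both others

def overlaps(f1: dict, f2: dict) -> bool:
    """Two single-value filters overlap if they share at least one value on
    every commonly restricted characteristic (simple set intersection)."""
    return all(f1[c] & f2[c] for c in f1.keys() & f2.keys())

print(overlaps(dtp_1_filter, dtp_2_filter))  # False -> both delta DTPs can be active
print(overlaps(dtp_1_filter, dtp_3_filter))  # True  -> records would be duplicated
```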
Note
For further reading, refer to this blog: https://blogs.sap.com/2012/09/16/all-about-data-transfer-process-dtp-sap-bw-7/
To improve the performance of the data transfer process, you can combine the following options for parallel processing:
Splitting Data Loads
Option | Filter condition / split criterion | What is processed? |
---|---|---|
Processing mode on the DTP | Determined based on the key for "Extraction grouped by" and package size | Different packages in one request |
Different delta DTPs to the same target | Only allowed with non-overlapping conditions | Several requests |
Different full DTPs to the same target | Technically any filter conditions (but danger of overwriting records) | Several requests |
Different delta or full DTPs to different targets | Any filter conditions | Several requests |
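The first option in the table, splitting one request into packages, can be sketched as follows. The grouping key and package size are hypothetical values standing in for the DTP settings "Extraction grouped by" and the package size.

```python
# Sketch of package building within one request: records that share the
# grouping key ("Extraction grouped by") stay in the same package, and a
# package is closed once the size limit is reached (values are hypothetical).

from itertools import groupby

records = [
    {"customer": "C1", "amount": 10},
    {"customer": "C1", "amount": 20},
    {"customer": "C2", "amount": 30},
    {"customer": "C3", "amount": 40},
]

def build_packages(recs: list, group_key: str, package_size: int) -> list:
    packages, current = [], []
    sorted_recs = sorted(recs, key=lambda r: r[group_key])
    for _, group in groupby(sorted_recs, key=lambda r: r[group_key]):
        group = list(group)
        # Close the current package if the next group would exceed the limit;
        # records with the same key are never split across packages.
        if current and len(current) + len(group) > package_size:
            packages.append(current)
            current = []
        current.extend(group)
    if current:
        packages.append(current)
    return packages

for i, pkg in enumerate(build_packages(records, "customer", 2), start=1):
    print(f"Package {i}: {pkg}")   # packages can then be processed in parallel
```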

When you load data from different sources, or in different packages, there might be updates (provided as after images). If so, it is important to control the sequence of updates to the active table of the target. In these cases, make sure that you use a Standard DataStore Object, because it provides the inbound table as intermediate storage. The stored records can be sorted by request (timestamp), package, and record. Thus, there is a clear sequence of sources and records, and the current value is finally persisted.
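The effect of this sorted intermediate storage can be shown with a short sketch. The field names and values are hypothetical; the point is only that sorting by request, package, and record makes the last after image per key win.

```python
# Sketch: the inbound table of a Standard DataStore Object is sorted by
# request, package, and record, so the most recent after image per semantic
# key is the one persisted in the active table (field names are hypothetical).

inbound = [
    {"request": 1, "package": 1, "record": 5, "order": "4711", "status": "OPEN"},
    {"request": 2, "package": 1, "record": 2, "order": "4711", "status": "SHIPPED"},
    {"request": 2, "package": 3, "record": 9, "order": "4711", "status": "BILLED"},
]

active_table = {}
for rec in sorted(inbound, key=lambda r: (r["request"], r["package"], r["record"])):
    active_table[rec["order"]] = rec["status"]   # later after image overwrites earlier one

print(active_table)  # {'4711': 'BILLED'} -> the current value is persisted
```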
Selective Full Extraction

If the source does not provide a delta mechanism, use smart selection criteria in a full DTP to reduce the amount of data to be loaded. For instance, if you know that only some records for a new product have been added, load only these values. Similarly, you can schedule a daily DTP with a "moving filter" for the date of the previous day. Filter values can be derived through an ABAP routine, and this is supported even in the SAP HANA runtime.
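In a real DTP, such a moving filter is implemented as a filter routine; the following snippet only sketches the underlying logic of deriving yesterday's date. The field name CALDAY and the range-like structure (SIGN, OPTION, LOW) are used here for illustration and are not generated DTP code.

```python
# Sketch of a "moving filter": derive yesterday's date in the internal
# date format YYYYMMDD as a single-value selection (illustrative only).

from datetime import date, timedelta

def previous_day_filter() -> dict:
    low = (date.today() - timedelta(days=1)).strftime("%Y%m%d")
    # Roughly equivalent to a selection row with SIGN = 'I', OPTION = 'EQ'.
    return {"fieldname": "CALDAY", "sign": "I", "option": "EQ", "low": low}

print(previous_day_filter())
```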
If records of the current month may be changed, deleted or added, but records of the previous month cannot be changed for business reasons, implement a load … delete … load scenario. Use a filter for the current month. This scenario is shown on the right side of the preceding figure. The deletion is necessary if you want to reflect the deletion of records in the source, but the source doesn't send a corresponding record mode. Deletion is fast if the DataStore Object (advanced) is partitioned by month. (The database executes a DROP PARTITION statement instead of a DELETE FROM statement.)
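A rough sketch of this load … delete … load pattern is shown below. The in-memory dictionary merely stands in for the monthly partitions of the DataStore object (advanced); in the real scenario the deletion is a selective or partition-based deletion in the target.

```python
# Sketch of the "load ... delete ... load" pattern for the current month
# (illustrative only; in BW/4HANA the deletion hits the monthly partition,
# which the database can execute as a fast DROP PARTITION).

from datetime import date

target_partitions = {}   # month -> records; stands in for monthly partitions

def load_current_month(source_records: list) -> None:
    month = date.today().strftime("%Y%m")
    # Step 1: delete the current month's data (fast, like DROP PARTITION).
    target_partitions.pop(month, None)
    # Step 2: reload the current month with a full DTP filtered on that month.
    target_partitions[month] = [r for r in source_records if r["calmonth"] == month]

load_current_month([{"calmonth": date.today().strftime("%Y%m"), "amount": 42}])
print(target_partitions)
```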