Introducing Filter and Rule Options

Objectives

After completing this lesson, you will be able to:
  • Understand rule assignment.
  • Describe filtering.

Filter and Rule Options

There are several reasons why you might want to filter and transform your data, such as the following:

  • To retrieve only the data of the last fiscal year

  • To save memory and transfer time by filtering

  • To avoid duplicates during N:1 replication

  • To recalculate fields on the fly

  • To scramble HR data

There are different tools to solve challenges such as those listed. Some of them are easier to use, while others require more technical knowledge and are thus for experts only. The different tools are described later.

Diagram showing data flow from ABAP/non-ABAP system to SAP HANA via SAP LT Replication Server, with replication strategy matrix recommending expert settings or views.

Filter and Rule Options

Diagram showing the SAP LT Replication Server process to send data portions from an ABAP/non-ABAP system to SAP HANA, with steps for loading, looping, and adding records.

The figure outlines the concept of transformation rules, which can be explained as follows:

  • The data for transformation is split into portions.

    The default is as follows:

    • Load: 10,000 lines
    • Replication: 5,000 lines
  • The portions are processed successively, mapped, and transferred to the receiver.

  • The extension of functionality, for example, data conversion, is possible at several points of the process.

  • The implementation of filter or data transformation occurs as ABAP coding.

On one screenshot, Rule Assignment and then Add Table are selected. The other screenshot shows that the table ZSFLIGHT_00 is selected and that you can choose between two rule types: Event-Related Rule or Field-Related Rule.

A rule can filter or transform data during the load and replication process.

In transaction LTRS, choose Rule Assignment and add the relevant table. A new user interface appears, and you can create a new rule for the table.

You can choose between two different rule types:

  1. Event-Related Rule

    • More flexible
    • Requires knowledge of data processing within the SAP Landscape Transformation Replication Server
    • Has access to all fields of a record
  2. Field-Related Rule

    • Less flexible
    • Easy to create by using parameters

As briefly discussed, there are different use cases for rules. However, you can group them into two different types:

  1. Data Transformation Rule

    For example, the SID field is newly created and does not exist in the source table, so no value is available for it.

    • You want to fill the SID field with the unique value T40 so that, in SAP HANA, you can see which system the record came from.
    • This is especially important in an N:1 scenario, where it helps you avoid duplicates and identify the origin of each record.
  2. Filter Rule

    For example, within the table, you have several carriers, such as Lufthansa (LH), American Airlines (AA), and many more.

    You want to move only the data of the carrier Lufthansa (CARRID = 'LH') to SAP HANA.

Note

Remember that a rule is a small piece of ABAP code that can be defined as a separate ABAP include (or also defined directly in LTRS depending on the SLT version).

For cluster tables (such as BSEG), create a transformation or filter for BSEG for the initial load, and a transformation or filter for RFBLG for the replication.

Field-Related Rule

Field-based rules are used to calculate a target field from up to three source fields. Field-based rules are easier to use, but offer less flexibility than event-based rules.

Screenshot of SAP LT Replication Server interface showing Advanced Replication Settings. A field-related rule is set for table ZSFLIGHT_00 targeting PLANETYPE. Import fields are empty.

In the latest SLT versions, for specifying custom coding, an include program must be defined locally in the SAP Landscape Transformation Replication Server system. However, in previous SLT versions, it was also possible to store a line of code directly under the Rule Assignment folder.

Note that each line of code is restricted to 72 characters. This means that only simple mapping rules can be realized. (When saving the settings, a syntax check is performed.) In contrast, include programs do not have any size limitations and can be used to implement complex mapping rules with the support of the ABAP editor. You specify the name of the include program in the Include Name field.

When you create a new rule, it has the status New. To view the import parameter fields, a rule must have the status Released. You can also set the status of a rule to Obsolete.

Import Parameter

Import parameters are used within ABAP code to apply values. These can be the values of fields in the sender record or literals.

In the case of source fields, the three import parameters have the following format:

  • i_<fieldname_1>_1
  • i_<fieldname_2>_2
  • i_<fieldname_3>_3

The sections <fieldname_1>, <fieldname_2>, and <fieldname_3> are the names of the fields that are entered as import parameters, for example, i_mandt_1.

If you have defined a literal, the technical name is as follows:

  • i_p1
  • i_p2
  • i_p3

Export Parameter

The export parameter specifies the field name of the receiver structure, which has to be filled within the parameter rule. The parameter of the code has the format 'e_<fieldname>', where <fieldname> is the name of the field entered as the assignment target field, for example, e_mandt.
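To illustrate how the import and export parameters fit together, the following is a minimal sketch of the code of a field-related rule, based on the PLANETYPE example from the screenshot above. The source field PLANETYPE is assumed to be assigned as the first import parameter; the literal i_p1 (used as a fallback default) is an assumption for illustration only:

```abap
* Sketch of a field-related rule for table ZSFLIGHT_00 that fills the
* target field PLANETYPE.
* i_planetype_1: first import parameter (source field PLANETYPE).
* i_p1:          literal defined in the rule (assumed fallback value).
IF i_planetype_1 IS INITIAL.
  e_planetype = i_p1.          " no source value: use the literal
ELSE.
  e_planetype = i_planetype_1. " otherwise, take over the source value
ENDIF.
```

Because the code runs once per record and per target field, such rules are best kept to simple value mappings; anything more complex is a candidate for an event-related rule.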

Note

Field-based rules are not supported for migration objects referring to INDX-like tables that are to be declustered.

Event-Related Rule

For event-related rules, custom coding can be inserted at eight predefined points of processing (the events). When you create a new event-related rule, the system displays a dialog box where you can choose the event, for example, Begin of Processing (BOP). All fields of source and target tables can be accessed.

Three screenshots showing this process: End of Record is selected as Target Event under Create Event-Related Rule; data is displayed under Rule Overview; ABAP code editor is displayed with highlighted code snippet assigning value 'T40' to the SID field in structure wa_r_zsflight_00.

The data of source and target work areas can be accessed through field symbols in the Begin of Record (BOR) and End of Record (EOR) events.

The field symbols follow a generic naming convention, where tablename denotes the source table name, as follows:

  • <wa_s_tablename> for the source work area

    For example, <wa_s_zsflight>-sid

  • <wa_r_tablename> for the target work area

    For example, <wa_r_zsflight>-sid

In the case of event-related rules, even though the UI allows assigning parameters, this is not supported yet.
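The transformation shown in the screenshot above (assigning the value 'T40' to the new SID field) could be implemented in an End of Record (EOR) rule include along the following lines. This is a sketch; the field symbol name follows the convention just described for table ZSFLIGHT_00:

```abap
* Sketch of an End of Record (EOR) rule include for table ZSFLIGHT_00:
* after field mapping, fill the new SID field of the target record
* with the system ID 'T40' to mark the origin of the record.
<wa_r_zsflight_00>-sid = 'T40'.
```

In an N:1 scenario, each source system would use its own system ID here, so that records remain distinguishable in the target table.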

Note that for migration objects referring to INDX-like tables that are to be declustered, not all events are available. For such migration objects, only events BOP, BOT, EOT, and EOP are available.

The image shows the process flow of the event-related rules: the text provides the detail.

The figure shows the process flow of a data replication and the points in the code (events) where you can apply rules. The events are defined as follows:

BOP (Begin of Processing)
  • Processed only once, before the data transfer really starts

  • Can be used to initialize certain global fields that might be used in subsequent events, for example, fill internal mapping tables

EOP (End of Processing)
  • Processed only once, after the data transfer is completed

BOT (Begin of Block)
  • Access to all data records of a portion read from the sender system

EOT (End of Block)
  • Access to all data records, immediately before they are passed to the receiver system

BOL (Begin of Loop)
  • As with BOT, if only one table is included in the conversion object
  • In the case of objects with multiple tables, it can be applied to each specific table

EOL (End of Loop)
  • As with EOT, if only one table is included in the conversion object
  • In the case of objects with multiple tables, it can be applied to each specific table

BOR (Begin of Record)
  • Processed before the field mapping of the individual fields is started

EOR (End of Record)
  • Processed after the field mapping of the individual fields of a certain data record has finished

Note that the Begin of Block (BOT) and Begin of Loop (BOL) events, as well as the End of Loop (EOL) and End of Block (EOT) events, are executed directly after each other for simple structures, such as transparent tables.

For more complex structures, such as cluster tables, the Begin of Block (BOT) and End of Block (EOT) events are triggered around the loop over the clusters, while the loop events, Begin of Loop (BOL) and End of Loop (EOL), are executed around the loop over the entries of the logical tables in a cluster. The Begin of Record (BOR) and End of Record (EOR) events behave in the same way if no parameter rules are defined.

The naming convention to access tables for the block and loop events (BOT, EOT, BOL, EOL) is as follows:

  • <it_s_tablename> for the source table

  • <it_r_tablename> for the target table
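As a sketch of how these internal tables can be used, the following hypothetical End of Block (EOT) include removes all records of carriers other than Lufthansa from the current portion before it is passed to the receiver system. The table name ZSFLIGHT_00 and the field CARRID are taken from the examples in this lesson; treat the include as an assumption for illustration:

```abap
* Hypothetical End of Block (EOT) rule include for table ZSFLIGHT_00:
* drop all records of the current portion whose carrier is not 'LH',
* so that only Lufthansa records are passed on to the receiver system.
DELETE <it_s_zsflight_00> WHERE carrid NE 'LH'.
```

Filtering a whole portion at once in EOT can be more efficient than skipping records one by one, because the internal table is processed in a single statement.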

Skip Rule Filtering

You only want to transfer a subset of the records in the source table. For instance, you only want to transfer specific records belonging to a specific company code or plant. For this purpose, you can use the SLT macro SKIP_RECORD in a transformation rule.

Screenshot of SAP LT Replication Server showing rule assignment for table ZFLIGHT_00. Rule details include type, status, and implementation. Coding snippet involves a conditional skip record logic.

For example, for the filter rule, you work with the export parameter for the carrier (E_CARRID) and compare it with the value of Lufthansa ('LH'). If the value of CARRID is not equal (NE) to 'LH', then the system will execute SKIP_RECORD, and the record will not be transferred to the target.

Code Snippet
IF E_CARRID NE 'LH'.
  SKIP_RECORD.
ENDIF.

Screenshot showing SAP LT Replication Server configuration for table ZSFLIGHT_00, highlighting Z_SKIP_RULE in ABAP Editor where records with carrid LH are skipped.

You can also use the SKIP_RECORD macro in the ABAP includes of event-related rules.

You specify the filter code as follows:

Code Snippet
IF <wa_s_zsflight_00>-carrid NE 'LH'.
  SKIP_RECORD.
ENDIF.

Add a New Field to Your Target Structure in the SLT

Business Example

In this exercise, you will add a new field to a target table by making changes to the table structure definition in SAP SLT within the Advanced Replication Settings.

Fill a New Field using an Event-Related Rule

Business Example

In this exercise, you create an include that can be used in an event-related rule in the Advanced Replication Settings to fill a new field of a table.

Initial Load Filter

If you need only a subset of the data from your source system, you can improve the performance of the initial load and the replication process by transferring only the data that you need.

To do this, specify the filter conditions for the table in transaction LTRS under Performance Options. You can specify filter conditions for key fields and for non-key fields. You can filter the initial load, the replication process, or both. You can also save the filters without using them.

Screenshot showing filter options with three modes: filter initial load and replication processes (selected), filter initial load only, and filter replication process only. Two filters are set, each setting MANDT with values '000' and '650'.

Note

If the source system is an SAP S/4HANA system (release 2020 or higher), then the filter conditions are used for both the initial load and the replication process. It is not possible to use filters for just the initial load or for just the replication process.

Note

If you want to specify filter conditions for fields, we recommend that an index exists for the fields (to optimize performance).

For example, table ZSFLIGHT_00 contains 6 billion records. There are six different clients (the key field in position 1 of the table), and the records are distributed evenly across all clients. According to your business requirements, you only require the data from two clients in the target system. You specify that only two clients should be included in the initial load. SLT then only transfers 2 billion records to the target system, which also significantly improves the performance of the initial load:

  • Filter 1: MANDT = '000'

  • Filter 2: MANDT = '650'

Only records from clients 000 and 650 are transferred to the target system.

Note

You cannot define filters for tables that use reading type 3, Primary Key Order.

You can define several filters and, for each filter, define multiple filter conditions. For each filter condition, you can define a value range for every key field, such that only specific values are replicated to the target.

For any additional settings to take effect, you must save your changes in the table before you start the initial load.

Note the following general aspects when you specify filters:

  • The filter options are not available for objects with a name that is longer than 20 characters. The Filter Options tab is hidden for such objects.
  • When specifying filters for a field, you should consider creating an index for the field to improve performance.
  • Using filters on a non-key field for the initial load can negatively impact performance. You can either create an index for the field or use reading type 4 or 5 to improve performance.
  • If you have manually specified filters that are not compliant with the dialog logic or you have specified different filters for the initial load and for the replication process, you can view these filters on the Filter Options tab but cannot edit them.
  • For INDX-like tables, filter conditions can be specified for the replication process only. It is not possible to use filters for just the initial load or for the initial load and replication process. In addition, it is only possible to specify filter conditions that use the key fields of the INDX-like table.
  • For cluster tables, the filter conditions are used for both the initial load and the replication process. It is not possible to use filters for just the initial load or for just the replication process.

Note the following aspect with regards to filtering the initial load:

It is not possible to filter the initial load if you use reading type 3.

Note the following aspects with regards to filtering the replication process:

  • When a filter is specified on a non-key field, the filter does not work on data which is deleted or archived.
  • When a filter is specified for a table being part of a cluster table, for example BSEG of RFBLG cluster, the filter does not work on data which is deleted or archived. If you specify the filter on the cluster table (RFBLG), the filter will only apply to delete and archive operations.

    Caution

    Due to the way SLT deals with cluster tables, which is to first delete all data for the key of a respective record of the cluster table (RFBLG) and then insert the data of the table (BSEG), putting a filter on the cluster table may lead to inconsistent data or errors in the replication process.
  • Filters for the replication process are not supported in the following cases:
    • when replicating data to SAP BW by using the ODP framework
    • when replicating data to Central Finance

If you still want to filter the data of the table although one of the limitations mentioned here applies, you can use your own rules to filter the data, which can be specified in the Rule Assignment area of transaction LTRS (except when replicating data to Central Finance).

If you need multiple value ranges for the same key field, you must create a separate filter for each range, because each key field can only be used once in the filter conditions of a single filter. When data is selected from the source system, the conditions within a filter are combined with an AND operator, so only those records that fit the defined filter conditions are transferred.

Trigger Options

Filtering using Trigger Options

The syntax for the filter condition depends on the database type of the source system:

  • SAP ASE, Microsoft SQL

    Syntax: Field names must be placed in double quotation marks.

    Example: "BUKRS" = 1000

  • MaxDB, IBM DB2, IBM DB4, IBM DB6

    Syntax: Field names must be placed in double quotation marks and prefixed with the string "___." (three underscores and a period).

    Example: ___."BUKRS" = 1000

  • SAP HANA, Oracle

    Syntax: Field names must be placed in double quotation marks and prefixed with the string ":___." (a colon, three underscores, and a period).

    Example: :___."BUKRS" = 1000

You can also create more complex filters by using the available syntax for a WHERE clause of a SELECT statement (for example AND, OR, LIKE).

SLT captures all updates, inserts, and deletes for a table that is in replication. If your configuration does not use the new subscription-based CDC recording mechanism, and you expect a high change rate on the table but your use case requires only a subset of the records, you can minimize the replication volume with a trigger filter. Compared to replication filters (explained previously), where data is filtered on the SLT server, in this case the data is already filtered in the trigger. Therefore, you might gain performance benefits by filtering the data directly in the source system.

Note

This option is intended for expert users only. We strongly recommend that you contact SAP for support before using this feature because incorrect use can result in serious data inconsistencies.

Any filters that you define here are only applicable for the replication process; they do not affect the initial load. To filter the initial load and replication, additional filter rules have to be defined.

Within the trigger options in LTRS, you can specify a trigger code that will be added when SLT creates the database trigger on the source system. Ensure that the trigger condition is correct. The trigger option is an expert function and the code is database-specific. Remember that with the wrong settings you can harm your source system.

There are two options for changing the settings for triggers:

  • You can specify general settings that apply to all triggers in the source system.
  • You can also specify table-specific settings. Table-specific settings take priority over any general settings.

Note

You cannot customize the trigger code for database triggers for pool tables.

For 1:N replication scenarios, triggers are shared between all configurations. This means that the trigger options are used for the configuration that created the trigger and will also be used for all other active configurations.

If you need to redefine an existing trigger, see SAP Note 2082199.

If you select the Read from Single Client option during the configuration creation, the client is also hard-coded into the trigger.

Views

Creating a projection view with a subset of table fields means filtering on the source system. Only the data of the fields selected for the view is transferred by SLT into the target system. The table in the target system has the same name as in the source system, but only contains the subset of fields.

Diagram of data replication from Table A in Source System to SAP HANA via SAP LT Replication Server, utilizing projection views to filter and read view C for replication.

You can specify different views for initial load and replication:

View for Initial Load

You can use a database view to select data from the source table during the initial load. For example, you can use a view to filter columns so that not all columns are loaded to the target system, or you can use a view to join fields from other tables.

The view that you specify here must contain at least all of the key fields from the source table.

View for Replication

You can use a database view to select data from the source table during the replication process. For example, you can use a view to filter columns so that not all columns are loaded to the target system. Or you can use a view to join fields from other tables.

The view that you specify here must contain at least all of the key fields from the source table.

Note

When the source system is SAP S/4HANA 2020 or higher (capable of replicating CDS views), it is not possible to specify separate database views for the initial load and for replication. Refer to SAP Note 3222382 (View for Initial Load/Replication is not available in LTRS Table Settings when the source system is S/4HANA 2020 or higher - SLT).

Note

If the source system is connected to the SAP Landscape Transformation Replication Server system by means of a database connection, you must enter the name of a database table or view in the Structure Template field. This database table or view must contain all of the key fields from the source system table.

If you enter a database view in the Structure Template field, ensure that it exists in the SAP Landscape Transformation Replication Server system. The value of the Template Location field must be SLT System.

As of DMIS 2011 SP08, the replication of views is possible for non-ABAP sources.

Diagram showing data flow: Source System to SAP LT Replication Server to SAP HANA, loading Table A from view C, integrating fields from Table B.

As well as projection views, you can also load data from views that join two tables. If you join fields of different tables into one view, one table (A) is the leading table, and fields from other tables can be added.

In the SLT system, the deviating table structure has to be defined as a transformation of table A or by using the Structure Template option. The table keeps its name from the source system, but is enlarged with the additional fields.

Filter Data Using a Skip Rule

Business Example

You want to replicate a table into the target, but want to restrict the data that is being transferred to a specific field value. For instance, you want to replicate data for a specific company code or operating concern only. In this exercise, you will filter data using a skip rule.

Load from the DB View

Business Example

You want to load the ZSFLIGHT_V## table into SAP HANA, and enrich the information by joining carrier information from the ZSCARR_V## table. In this exercise, you will load from the DB view.