Performing Verification

Objective

After completing this lesson, you will be able to perform verification.

Test Execution

Before you can perform a verification, you must execute your test.

During the test execution, a message extracted from a source system is injected into a target system and is processed by the standard pipeline on the target system. The PIT tool collects the processing results from the target system and stores them locally as execution results. Simultaneously, a message exchange with all available log versions is retrieved from the target system. The result of each execution run is a collection of correlated and comparable source and target messages.

In the target system, a message is always injected into the XI-pipeline after the sender adapter. Therefore, the processing starts after the sender channel module chain.

You can choose whether to skip the processing in the receiver adapter (the message processing then ends before the receiver module chain) or whether the message is delivered to the current receiver.

Monitoring

Test messages injected into a target system by the PIT tool are sent via the dedicated test connection PITTEST. Therefore, you can use this connection type as a filter in the PI Message Monitor of the target system to find all test messages in this system. The messages are displayed in the Message Status Overview. The column Test Message in the table Selection Details indicates whether the processed messages are test messages. You can include or exclude test messages from the display list by selecting or deselecting Include test messages in the Custom Settings of the Table Settings dialog.

Different Ways to Run a Test

There are different ways to trigger a test execution:

  • Trigger the direct execution by using a run configuration.
  • Schedule a test to be run once or regularly at a specified time. Scheduling requires a run configuration or a test suite run configuration and is supported as of SP17.
  • Launch a test using a test case object without a run configuration. Use this option for test execution before SP17.

Alerting Emails

In the NWDS, you can receive an automatic alert if a problem occurs while running tests.

You can use these notifications in combination with schedules. When a test is executed regularly in the background as part of a schedule and an error occurs, you automatically receive an email describing the error. You don't have to check the system regularly for errors: errors occurring during both the execution and the verification phase are recognized, and you receive an email. Currently, email is the only supported notification channel.

Note

As a prerequisite for the email alert, the administrator of the PIT system has to configure the Java Mail Client.

To create an Alert Configuration, perform the following steps:

  1. Go to the Alerts view.
  2. Choose Create new alert configuration.
  3. Enter an Alert Name and a Mail To address.

    Note

    The name must be unique, and its length is limited to 250 characters.
  4. Choose Finish.

Mastering the Verification of Messages

Test execution is only the first step of a successful test. Even if message processing was successful in the target system, you don't know whether your expectations are met. Configurations can be different in the two systems; for example, mappings, value mappings, or routing conditions can differ. A verification step determines whether a message is processed as expected in the target system.

During the verification process every message exchange from the target system is compared to the corresponding (reference) message exchange from the source system.

Verification responds to the following questions:

  • Is the message structure the same?
  • Is the number of messages generated during a receiver or a mapping split identical to the expected number of messages?
  • Do the message headers match?
  • Which message in the target matches with a given source message?
  • Are all dynamic headers present in the target message and are their values the same?
  • Are the payloads of the reference and the target message the same?
  • Is the number of attachments, their content-type, length, and content the same?

The questions are addressed with the verification steps in the test case configuration.

The structure verification is mandatory and is always executed at the beginning of the verification chain. The verification fails if the structure is different (for example, if three messages were produced in the target system instead of the expected two), or if messages can't be matched (for example, the source system produced message A and message B, but the target system produced message B and message C). In these cases, the subsequent verification steps are skipped.

You can skip verification steps during configuration by disabling the steps in the test case verification.

In some cases, you want to accept two messages as equivalent, even if these messages aren’t the same, for example, if you have a time-stamp field in the payload. Here, the values always differ and you never get a successful verification result. Therefore, the PIT tool offers the possibility to handle such minor deviations through exemptions and replacement rules (exempt the time-stamp field from verification).
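To make the time-stamp example concrete, the sketch below shows in plain Python, purely as an illustration and not as PIT's actual implementation, how an exemption can remove a field from both payloads before they are compared. The Order/Timestamp structure and the payloads_equivalent helper are made-up assumptions.

```python
# Illustrative sketch only - not PIT's implementation. It mimics an XPath
# exemption by removing an exempted node (here: the Timestamp field) from
# both payloads before comparing them. All element names are invented.
import xml.etree.ElementTree as ET

def payloads_equivalent(source_xml, target_xml, exempt_tag):
    source, target = ET.fromstring(source_xml), ET.fromstring(target_xml)
    for root in (source, target):
        for parent in list(root.iter()):
            for child in list(parent):
                if child.tag == exempt_tag:
                    parent.remove(child)  # drop the exempted node
    # with the exempted nodes removed, compare the remaining serialized XML
    return ET.tostring(source) == ET.tostring(target)

source = "<Order><Id>42</Id><Timestamp>2020-01-01T10:00:00</Timestamp></Order>"
target = "<Order><Id>42</Id><Timestamp>2021-05-05T09:30:00</Timestamp></Order>"
```

Without the exemption, the differing Timestamp values would make every verification fail; with the Timestamp node exempted, the two payloads verify as equivalent.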

Configuring Verification

The verification procedure compares the message exchanges from the tested integration scenario with the corresponding reference data.

A test message exchange is equivalent to its reference if both message exchanges have a similar structure and all comparable message parts have an equivalent content. You are then able to relate individual messages to their reference messages.

The verification consists of several steps that are executed consecutively. You can skip any step except for the structure comparison. You can also customize individual verification steps and add your own equivalence conditions.

Verification steps:

  1. First, the structure comparison is done. During this mandatory step, the system checks that the number of outgoing messages (after mapping) in the source is equal to the number of messages (after mapping) in the target.

    In addition, message headers (sender party and service, interface and interface namespace, receiver party and service) are compared to find matching pairs in source and target.

    If the sender or the receiver system is a business system, the header values of the same message can differ between the two systems. This is due to the business system mapping defined in the System Landscape Directory (SLD). In this case, the structure comparison always fails by default. To avoid this behavior, you can define a Replacement Rule. A replacement rule specifies which values are handled as equivalent in different systems. You can define replacement rules for a party, a service, an interface, or an interface namespace.

    If any deviation occurs during the structure comparison, further verification steps are skipped. The verification ends with an error.

  2. During the header comparison, the dynamic headers of the source and target messages are compared. Some header attributes are excluded from comparison by default because they're internal or always different, such as PIT-specific headers and all XI-internal headers in the namespace http://sap.com/xi/XI/Message/30/.

    You can also manually exclude some headers from comparison if the values are different by design (see header exemptions).

  3. During the payload comparison, the actual payloads of the matching messages are compared. Depending on its MIME type, a payload is compared using the XML, text, or binary comparator.

    For XML payloads, you can exclude some nodes from verification by providing an XPath exemption.

  4. During the comparison of the message attachments, the system checks attachments with the same content ID and MIME type by comparing their size and content.
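The steps above can be sketched as a single chain. The following Python model is a deliberately simplified assumption (message dicts, one replacement rule, pairwise matching by position), not PIT's actual logic; it only illustrates that the mandatory structure comparison runs first and that a failure there skips the remaining steps.

```python
# Simplified, assumed model of the verification chain - not PIT's code.
# A replacement rule declares two header values as equivalent, for example
# business systems renamed along the SLD transport path (names invented).
REPLACEMENT_RULES = {("BS_ERP_DEV", "BS_ERP_QA")}

def headers_match(source_value, target_value):
    return source_value == target_value or (source_value, target_value) in REPLACEMENT_RULES

def verify(source_msgs, target_msgs):
    # Step 1: mandatory structure comparison (message count and header match).
    if len(source_msgs) != len(target_msgs):
        return "Message Exchange Structure Differs"  # later steps are skipped
    for s, t in zip(source_msgs, target_msgs):
        if not headers_match(s["receiver"], t["receiver"]):
            return "Message Exchange Structure Differs"
    # Steps 2-4: header, payload, and attachment comparison on the matched
    # pairs (collapsed here into a plain payload check).
    for s, t in zip(source_msgs, target_msgs):
        if s["payload"] != t["payload"]:
            return "Messages Differ"
    return "No Differences"
```

In this model, a run whose receiver changed from BS_ERP_DEV to BS_ERP_QA still verifies cleanly, because the replacement rule declares the two values equivalent; a different message count or an unmatched header stops the chain at step 1.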

Verification Process: Execution

You can execute a verification process for a successful execution task either from the job browser or from the action log.

Before triggering an execution of a test based on a run configuration (via direct or scheduled execution), you can decide to run the verification automatically right after the execution (default setting).

It's possible to execute a verification process multiple times for the same execution task. During development, it's best practice to execute, maintain exemptions, and execute again.

Test Case Verification Results Editor

To open the verification results editor, perform one of the following steps:

  • Double-click a verification job or a verification task in the job browser or the action log.
  • Select the context menu function Show Verification Results.

The editor provides details for all test messages in the run and their recurrences, including the verification status and, if applicable, verification problems.

In the Overview tab, you get basic information about the verification, such as the test case run configuration (if applicable), the test case, and the test dataset used. It also shows whether the verification is executed as part of a test suite execution. The number displayed in square brackets behind the object name shows the version used during execution. Select the label link to display the content of the corresponding version.

In the Summary section of the overview, you see the number of messages processed during the execution run and the verification run and whether the runs were successful. Select the number link to navigate to the Message Overview to display the details.

The Message Overview tab lists all messages in the run as pairs (incoming source and corresponding incoming target message). For each message pair, you get the execution and verification status. You also get the number of differences found during the verification, if any.

If the execution of a test message failed, you find the corresponding error message in the Test Execution Problems view. If the execution was successful, the target message is verified and the verification status has one of the following values:

Verification Status

  • Message Exchange Structure Differs: Source and target message exchanges have fundamental differences. Therefore, the comparator wasn’t able to correlate and compare child messages and their content.
  • Messages Differ: The comparator was able to correlate and compare child messages. Some messages show content differences in their payload, headers, or attachments.
  • No Differences: The comparator was able to correlate and compare child messages and didn’t detect any deviations. Source and target message exchanges are equivalent in structure and content.
  • Skipped: The message execution failed and the verification was skipped.

The Error Overview tab shows the complete list of all verification problems and the number of occurrences in the target messages. When you select an item in the table, the affected incoming target message identifiers are shown in the following table. By double-clicking a target message identifier, you navigate to the Exchange Structure Overview.

The Exchange Structure Overview enables you to inspect and compare the original source message exchange with the target message exchange. Select Browse, to display a specific pair of source and target message exchange. Choose one of the incoming target message identifiers, and confirm with OK. You can also display a specific message in the Exchange Structure Overview by double-clicking the message identifier in the Message Overview or the Error Overview tab.

In the split window, you see the source and target trees, both showing the respective message exchange with all relevant message versions. You can display the results in the Compare Mode. The compare mode is the default view for non-structure-related differences. In this mode, the selection in both trees is synchronized.

Verification Errors

Verification errors can have many causes. Often, the failure is caused by a slightly different configuration in the source and target system:

Reasons for Verification Errors:

  • Different message mapping.
  • Different user-defined functions in a mapping.
  • Different data provider for mapping look-ups.
  • Different routing condition.
  • Differences in value mapping.

The best way to avoid verification errors is to prevent differences between the systems upfront, or to align both configurations in the course of testing.

If a correction is not possible, or a deviation is by design (for example, a business system in the message header), the PIT tool offers you the means to handle small deviations by using exemptions or replacement rules.

The following sections describe some common verification problems and how to handle them.

Verification Levels

Verification Level: Structure

Structure – Missing

Error Type: Structure – Missing
Error Message Example: Unable to find matching AM version in target msg
Reason: The message headers in source and target can’t be matched because:
  • The sender or receiver communication component is different (for example, a business system is replaced according to the SLD transport path)
  • The name and the namespace of the service interface are different
  • The communication party values are different
Best Practice: Define a replacement rule for each deviating pair of values.

Structure – Difference

Error Type: Structure – Difference
Error Message Example: Number of request messages (expected/current): 1 / 2
Reason: The number of outgoing messages in the source and target message exchange is different. The source structure, for example, has only one outgoing message, but the target structure has two. Possible causes are a difference in the interface mapping (split mapping) or a difference in routing conditions.
Best Practice: Align the configurations in the Integration Directory/SAP Process Integration Designer in the source or in the target system. As a further step, trigger Synchronize in the test case for the affected configuration objects. It is not possible to exclude these differences by means of the PIT tool.

Verification Level: Header

Content – Difference

Error Type: Content – Difference
Error Message Example: Unexpected value for header http://bestfood.it Pizza; expected: Tonno, current: Prosciutto
Reason: Dynamic header values differ between the source and target message.
Best Practice:
  • Check if the message mapping sets dynamic headers using the function library, and align the implementation.
  • Check if a value is produced or manipulated by a module chain, and align the configuration.
  • If the value is always different by design, define a header exemption.

Structure – Additional

Error Type: Structure – Additional
Error Message Example: Unexpected header in result message: http://bestfood.it Spaghetti
Reason: A dynamic header is available in the target message, but not in the source message.
Best Practice:
  • Check if the message mapping sets dynamic headers using the function library, and align the implementation.
  • Check if a value is produced or manipulated by a module chain, and align the configuration.
  • If the value is always different by design, define a header exemption.

Structure – Missing

Error Type: Structure – Missing
Error Message Example: Unexpected header in result message: http://bestfood.it Pizza
Reason: A dynamic header is available in the source message, but is missing in the target message.
Best Practice:
  • Check if the message mapping sets dynamic headers using the function library, and align the implementation.
  • Check if a value is produced or manipulated by a module chain, and align the configuration.
  • If the value is always different by design, define a header exemption.

Verification Level: Payload/Attachment

Content – Difference

Error Type: Content – Difference
Error Message Example: Byte content deviations. Byte-Length (expected/current): 1088734 / 1432622
Reason: A payload or an attachment of type binary differs. Instead of showing the actual difference (not useful for binary content), the byte size is displayed.
Best Practice: Check both payloads/attachments manually, align them, and make sure they are the same.

Content – Missing

Error Type: Content – Missing
Error Message Example: Missing element person.
Reason:
  • The source message contains an XML tag or XML structure that is not available in the target message.
  • The service interfaces on source and target differ.
Best Practice:
  • Check the service interfaces in source and target and align them.
  • In case of compatible enhancements, exclude XML tags from verification using an XPath exemption for the source structure.

Content – Additional

Error Type: Content – Additional
Error Message Example: Unexpected element PIT_PatternMessage2
Reason:
  • The target message contains an XML tag or XML structure that is not available in the source message.
  • The service interfaces on source and target differ.
Best Practice:
  • Check the service interfaces in source and target and align them.
  • In case of compatible enhancements, exclude XML tags from verification using an XPath exemption for the target structure.

Content – Additional/Missing

Error Type: Content – Additional / Content – Missing
Error Message Examples:
  • Missing element <...>
  • Unexpected element <...>
Reason:
  • Target and source message do not match.
  • One message has a prefix and namespace defined, the other doesn't.
Best Practice: Check if the service interfaces in source and target are equal.

Content – Difference

Error Type: Content – Difference
Error Message Example: Values of element Name differ; expected: Dr. Fred Flyer, current: Doktor Fred Flyer
Reason: The values for an XML tag are different.
Best Practice: Check the message mapping or value mapping for differences in the source and target system; also inspect mapping look-ups. If the value is always different by design, define an XPath exemption.

General

Error Type: General
Error Message Example: Unable to verify Payload. Reason: Exemptions contain invalid XPath expressions.
Reason: An invalid XPath expression was maintained as an exemption for a test case.
Best Practice: Enter a correct XPath expression.

Verification Level: Attachment

Structure – Additional/Missing

Error Type: Structure – Additional / Structure – Missing
Error Message Examples:
  • Additional Attachment found in target: payload-63aa748e167911eab8d300000040dc7e@sap.com, application/octet-stream1
  • Attachment not found in target: <…>
Reason:
  • The target message contains an attachment that is not available in the source, or vice versa.
  • The attachment Content-ID or Content-Type do not match.
Best Practice: Check in the source and target configuration why the attachments are not equal, and align them.

Verification Exemption Wizard

While inspecting the results of a test case verification, you sometimes encounter minor deviations you would like to ignore. For these cases, the verification exemption wizard offers a quick way to add exemptions to the test case.

To add exemptions to the test case:

  1. Select a difference in the Test Verification Problems view, and choose Add Exemption. You can invoke the function via the context menu, the toolbar button, or by pressing CTRL + 1.
  2. Select one or more verification steps for the exemption.
  3. For the exemption type XPath Expression:
    1. Select either source or target XPath (the source and the target expressions usually differ).
    2. Modify the XPath by changing predicates, if necessary.
  4. For the exemption type Header all fields are already prefilled by the wizard.
  5. Choose Finish to add the exemption to the test case.
  6. The test editor opens in a dirty state (the tab header is marked with an asterisk *). That means the wizard doesn't save the test case automatically. Save the changes, or close the editor without saving to revert the changes.

Trigger a Verification Run

Business Scenario

You want to verify the result of your successful migration test. Therefore, start a verification run.

Exercise Information

Note

In this exercise, when values include ##, replace these characters with a two-digit number (01–30).

Exercise Options

You can perform this exercise in two ways:

  1. Live Environment: choose Start Exercise, and from the entry page choose Open PDF Document. Follow the steps described in this PDF in your own system landscape.
  2. Simulation: choose Start Exercise, and from the entry page choose Start Tutorial. Watch the step-by-step instructions within the simulation.

Note

We recommend running the simulation first.

Task 1: Launch a Verification

Steps

  1. Launch a Verification.

    1. In the Job Browser, select the latest successfully processed Test Run.

    2. Open the context menu (right-click) of this successfully executed Test Run, and choose Launch Verification.

    3. Once the job has finished, you see a corresponding entry in the Job Browser.

Task 2: Check the Verification Results

Steps

  1. Check your Verification Results.

    1. Double-click on the finished Verification Job.

    2. On the Overview tab of the Test Case Verification Result window, you see the Verification Run information as well as the Summary.

    3. Open the Message Overview tab to see your processed messages.

    4. Double-click on one of your messages.

    5. In the Exchange Structure Overview tab, you can now compare the Source Message with the Target Message.

    6. Go through the Source Message line by line and verify the values with the Target Message.
