Understanding the Integration Flow

Objectives

After completing this lesson, you will be able to:
  • Deploy the integration flow.
  • Use the integration package and integration flows.
  • Use Postman Collection.
  • Use Generic Receiver.
  • Use a converter.
  • Use mapping.

How to Work with Example Integration Flows

Understanding Example Integration Flows

Example integration flows help you quickly grasp key integration concepts. These flows are simple, easy to set up, and executable in a short time.

Accessing Integration Packages

The example integration flows are available in dedicated integration packages on SAP Business Accelerator Hub. These packages contain all the necessary components. To use them, you must:

  • Copy the Integration Package
  • Deploy Integration Flows

For detailed steps, refer to Copying the Integration Package and Deploying the Integration Flows.

Using an HTTP Client as the Sender Component

Many integration flows are triggered via HTTP calls. You can use any HTTP client, such as SAP Business Application Studio, or third-party tools like Postman or Insomnia, to execute these flows.

To simplify execution, Postman collections are provided, allowing you to send HTTP requests with minimal effort. These collections are included in the integration packages. They contain sets of pre-configured example content, which can be used to test integration flows.

Note

We are going to use SAP Business Application Studio as the HTTP client for the following exercises. You can read more about SAP Business Application Studio on the SAP Help Portal.

Using the Generic Receiver

To streamline integration testing, a Generic Receiver is provided. This eliminates the need for configuring an actual receiver system.

How the Generic Receiver Works

  • It is implemented as a separate integration flow called Generic Receiver.
  • It logs each incoming request by creating a data store entry.
  • Each integration package includes its own Generic Receiver flow with a unique package-specific address.

Prerequisite for Running Example Integration Flows

Before running an example integration flow, you must first deploy the Generic Receiver integration flow from the same package.

OData Calls to an External Component

Some integration flows retrieve data from an external Webshop application for training and demonstration purposes.

Summary

Example integration flows help users understand key integration concepts through simple, easy-to-execute scenarios. These flows are available in SAP Business Accelerator Hub within dedicated integration packages, which must be copied and deployed before use.

Many flows are triggered via HTTP calls, and Postman collections are provided to simplify execution. A Generic Receiver integration flow is included to eliminate the need for configuring a receiver system. Some flows also interact with an external Webshop application for training purposes.

Copying the Integration Package and Deploying the Integration Flows

To work with the example integration flows, follow these steps:

Copy the Integration Package

Each example integration flow is part of a dedicated integration package on SAP Business Accelerator Hub. To begin:

  • Copy the integration package that corresponds to the guideline you want to explore into your workspace.
  • Open the package in the Design section of the Web UI.

Locate the Integration Flows

  • Integration flows are found in the Artifacts tab.
  • Postman collections are available in the Documents tab.
  • You can identify relevant flows by searching for their names, which follow this pattern: CategoryGuidanceExtension.

Deploy the Integration Flows

  • Deploy each required integration flow in the package.
  • Ensure that the Generic Receiver integration flow is also deployed, as it serves as a shared receiver for all flows.

Summary

To work with example integration flows, first copy the relevant integration package from SAP Business Accelerator Hub into your workspace. Open the package in the Design section, where integration flows are in the Artifacts tab and Postman collections in the Documents tab.

Next, deploy the required integration flows, ensuring that the Generic Receiver integration flow is also deployed, as it serves as a shared receiver.

By following these steps, you can efficiently set up and run example integration flows for different guidelines.

Read more here: Copying the Integration Package and Deploying the Integration Flows | SAP Help Portal

Working with Postman Collections

Working with Postman Collections

Postman collections simplify the execution of example integration flows by providing pre-configured HTTP requests. Each guideline or pattern has an associated integration package that contains both the example integration flows and a Postman collection to trigger them.

Note

Postman collections are made with the third-party tool Postman in mind, but they can also be used with other tools that can send HTTP requests to your integration flow.

Downloading the Postman Collection

Each Postman collection follows this naming pattern: GuidelineName_PostmanCollection.

To download and use a collection, perform the following steps:

  • Navigate to the Documents tab of the integration package.
  • Download the compressed file and extract it to a folder.
  • Import the extracted .json file into SAP Business Application Studio, Postman, or Insomnia.

We will learn how to use content from Postman collections in SAP Business Application Studio later.

Handling CSRF Protection in HTTP Requests

If the integration flow has Cross-Site Request Forgery (CSRF) protection enabled in the HTTPS sender adapter, extra steps are required for POST requests:

  • Step 1: Send a GET request to fetch the X-CSRF-Token from the target system.
  • Step 2: Send the POST request, including the retrieved token.

Postman collections that require CSRF protection include both requests automatically.
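As a sketch, the two-step token exchange can be expressed in Python. This is illustrative code, not part of the packages: it works with any HTTP client object exposing get and post methods (such as a requests.Session), and the URL in the usage note is a placeholder.

```python
def post_with_csrf(session, url, payload, auth=None):
    """Two-step POST against a CSRF-protected HTTPS sender adapter."""
    # Step 1: a GET with the header "X-CSRF-Token: Fetch" asks the
    # target system to return a token in its response headers.
    probe = session.get(url, auth=auth, headers={"X-CSRF-Token": "Fetch"})
    token = probe.headers.get("X-CSRF-Token", "")
    # Step 2: the POST repeats the retrieved token in its own headers.
    return session.post(url, data=payload, auth=auth,
                        headers={"X-CSRF-Token": token})
```

With the requests library you would pass a requests.Session() as session, so that the cookies from the GET are reused by the POST; the token is typically only valid within that session.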

For more details, refer to HTTPS Sender Adapter and Sending HTTP Requests and Processing Integration Flows.

Timer-Triggered Integration Flows

Some integration flows are not triggered via HTTP requests but instead start via a Timer event. In these cases:

  • The message body is created within the integration flow using a Content Modifier step.
  • No manual HTTP request is required; deployment automatically triggers execution.

Summary

Postman collections help execute example integration flows by providing pre-configured HTTP requests. Each integration package includes a corresponding Postman collection, which can be downloaded from the Documents tab and imported into SAP Business Application Studio, Postman or an equivalent tool.

Before execution, a Postman collection must be set up with connection parameters (username, password, and host details). The collection is then run using the Collection Runner (depending on the tool you're using, this process might be slightly different).

For CSRF-protected flows, a GET request fetches the X-CSRF-Token, which is then used in the POST request. Some flows are triggered by Timer events and do not require HTTP requests.

By following these steps, you can efficiently test and execute integration flows using Postman or an equivalent tool.

Working with a Generic Receiver

Working with a Generic Receiver

The Generic Receiver is a dedicated integration flow (Generic Receiver) that simplifies receiver system setup by creating a data store entry for each incoming call.

Purpose of the Generic Receiver

Configuring sender and receiver systems in integration scenarios can be complex and time-consuming.

Even simple setups, such as connecting an e-mail account via the Mail adapter, require additional steps, such as:

  • Setting up security configurations for the e-mail account.
  • Uploading the e-mail provider's server certificate to the keystore.

To avoid these complexities and let users focus on the integration flow guidelines, the Generic Receiver has been integrated into certain example flows as a preconfigured receiver component.

How Does the Generic Receiver Work?

  • The Generic Receiver integration flow is called whenever an integration flow requires a receiver.
  • It uses the ProcessDirect adapter to receive calls at a fixed address.
  • Upon receiving a request, it creates a data store entry on your Cloud Integration tenant.

Prerequisites for Using the Generic Receiver

To read data store content, your user must have the AuthGroup.BusinessExpert authorization assigned.

By using the Generic Receiver, users can bypass complex receiver setup and focus on learning and implementing integration flow best practices.

The Generic Receiver integration flow.

Key Headers Used in the Generic Receiver

  • context: Identifies the pattern or guideline used in the calling integration flow. If no context value is provided, the Check Context Script step assigns the default value "Result".
  • receiver (optional): Specifies the target receiver for the message.

Storing Data in the Data Store

When the Generic Receiver flow is called, the Data Store Write step creates an entry with the following attributes.

  • Data Store Name: Dynamically set using ${header.context}, based on the provided context header.
  • Entry ID: generated from ${header.receiver}. If no receiver value is provided, a unique identifier is created automatically.
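The entry-naming rules above can be paraphrased in a few lines of Python. This is only an illustration of the logic; the actual flow implements it with a Groovy script (Check Context Script) and the Data Store Write step.

```python
import uuid

def data_store_target(headers):
    """Return (data store name, entry ID) for an incoming call."""
    # Check Context Script: default the context to "Result" if absent.
    context = headers.get("context") or "Result"
    # Entry ID comes from the receiver header; otherwise a unique
    # identifier is generated automatically.
    entry_id = headers.get("receiver") or uuid.uuid4().hex
    return context, entry_id

print(data_store_target({"context": "ContentBasedRouting", "receiver": "DE"}))
# → ('ContentBasedRouting', 'DE')
```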

Handling Multiple Receivers

If an integration flow routes messages to multiple receivers (for example, Content-based Routing scenarios), multiple data store entries can be created. The Entry ID is determined dynamically based on the receiver header value at runtime.

Integration Package-Specific Addresses

Each integration package has its own Generic Receiver integration flow with a package-specific address. This ensures that the Generic Receiver correctly processes calls within different integration scenarios.

Example: Content-Based Routing

In the Pattern: Content-based Routing - Ignore If No Receiver integration flow, the context and receiver headers are set dynamically:

  • context: ContentBasedRouting-IgnoreIfNoReceiver
  • receiver: Based on the shippingCountry element in the message

This approach allows integration flows to store and categorize messages efficiently while supporting dynamic routing logic.

Summary

The Generic Receiver integration flow is triggered by a sender integration flow via the ProcessDirect adapter. It processes messages using two key headers:

  • context: Identifies the integration pattern or guideline.
  • receiver (optional): Specifies the target receiver.

If no context is provided, a default value "Result" is assigned. The Data Store Write step creates an entry using these headers, ensuring efficient data storage.

For multireceiver scenarios (for example, Content-based Routing), multiple data store entries can be created.

Each integration package has a package-specific Generic Receiver flow to manage requests within its scope.

Modeling Basic - Generic Receiver

Business Scenario

Understand the role of the Generic Receiver integration flow and how it simplifies the receiver system setup in SAP Cloud Integration.

Task Flow

In this exercise, you will perform the following tasks:

  1. Create an integration package and an Integration Flow artifact.
  2. Create an Integration Flow with a Generic Receiver containing a Groovy Script and a Data Store operation.

Prerequisites

You have access to SAP Integration Suite and the Cloud Integration capability, e.g. in your SAP BTP Trial account. Alternatively, you can watch the step-by-step demo below.

Exercise Outcome

You have gained initial experience with building a basic integration flow.

What will you learn from this exercise?

You will gain hands-on knowledge of how the Generic Receiver works and how it helps to manage integration flows more effectively.

Working with CSV to XML Converter

Understanding the CSV to XML Converter

In this lesson, you will learn how to convert a CSV file into XML format using the CSV to XML converter. This process is essential for transforming structured data into a format suitable for integration with other systems.

Configuring the CSV to XML Converter

To effectively use the converter, you need to understand how to configure it. This section guides you through the setup process, ensuring that your CSV data is correctly formatted for XML transformation.

Practical Implementation: Converting CSV to XML

To demonstrate the capabilities of the converter, let's use a simple example. In this scenario, product information is stored in a CSV file (comma-separated values) and needs to be transformed into an XML message.

The Integration Flow, Modeling Basic - CSV to XML Converter, is structured as follows:

  1. Input Data: A CSV file containing product details.
  2. Processing: The CSV to XML Converter processes the data and maps it into XML format for further use.
  3. Output Data: A structured XML message ready for further processing.
SAP integration process diagram showing CSV to XML conversion. Flow: Sender via HTTPS, Start, CSV to XML Converter, Define context for monitoring, End, then to Receiver via ProcessDirect.

How does the example scenario work?

  1. Receiving Data:
    • The integration flow begins by receiving a message via an HTTPS adapter.
    • This message contains a text file with product information.
  2. Understanding the Data Structure:
    • The first row of the file contains the parameter names, which define the structure of the data.
    • Each subsequent row contains the details of one product, following that structure.
Spreadsheet with columns: Category, ProductId, DimensionWidth, WeightUnit, DimensionUnit, DimensionHeight, DimensionDepth, Weight, Name, displaying data for notebooks, PDAs, scanners, printers, and speakers.

Selecting an XSD File

To ensure the correct mapping of CSV data to XML, an XSD (XML Schema Definition) file is required. This file defines the structure of the XML output.

  • Choose the Select button to upload an XSD file from your local system.
  • Once uploaded, the XSD file is added to the resources of the integration flow.

Defining the XML Structure

Each row in the CSV file represents a product entry. To properly structure this in XML:

  • The XPath expression Products/Product is used as the Path to Target Element in XSD.
  • This ensures that each product entry is placed inside a <Product> node within the <Products> root element.

Filtering Data by Category

In this example, only products from the Notebooks category are to be included in the output.

Configuring the CSV Format

  • The CSV data fields are separated by a semicolon ( ; ).
  • To match this format, the Field Separator in CSV is set to Semicolon ( ; ).

Handling Headers

The first row in the CSV file contains column names, not actual product data. To prevent it from being converted into an XML entry, the Exclude First Line Header option is enabled. This ensures that only actual data is processed.

Mapping CSV Values to XML Elements

To correctly assign values to their corresponding XML Elements, the Configure CSV Headers to match the XSD Elements option is selected from the drop-down menu. This maps CSV column headers to the appropriate elements in the XSD structure.

CSV to XML Converter screen showing options for XML schema, path, record marker, field separator, excluding header, and configuring CSV headers. Processing tab is selected.

Once the CSV data has been successfully converted into XML, the message body is structured as follows.

XML file with products data including ID, name, category, dimensions (depth, width, height), and weight under UTF-8 encoding version 1.0.

By following these steps, you can successfully transform structured CSV data into a well-formed XML message.
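To make the converter settings concrete, the transformation described above can be approximated in Python. This is a sketch only: the real converter is configured rather than coded and validates against the uploaded XSD. The element names and the Notebooks filter mirror the example's settings; the sample data is invented for illustration.

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_products_xml(csv_text):
    # Semicolon is the field separator; the first line is the header row
    # (cf. "Exclude First Line Header") and supplies the element names
    # (cf. "Configure CSV Headers to match the XSD Elements").
    reader = csv.DictReader(io.StringIO(csv_text), delimiter=";")
    # Path to Target Element: Products/Product.
    root = ET.Element("Products")
    for row in reader:
        # Category filter: keep only Notebooks entries.
        if row.get("Category") != "Notebooks":
            continue
        product = ET.SubElement(root, "Product")
        for name, value in row.items():
            ET.SubElement(product, name).text = value
    return ET.tostring(root, encoding="unicode")

sample = ("Category;ProductId;Name\n"
          "Notebooks;HT-1000;Notebook Basic 15\n"
          "Speakers;HT-2000;Blaster Extreme\n")
print(csv_to_products_xml(sample))
```

Running the sketch on the sample shows the header row consumed, the Speakers row filtered out, and one <Product> node per remaining row.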

CSV to XML Converter

Business Scenario

Understand the CSV to XML Converter by using it in an integration flow.

Task Flow

In this exercise, you will perform the following tasks:

  1. Create an Integration Flow with a CSV to XML Converter.
  2. Configure the CSV to XML Converter.

Prerequisites

  • You have access to SAP Integration Suite and Cloud Integration.
  • You have a basic understanding of the CSV and XML file formats.
  • You understand how XSD (XML Schema Definition) files define the XML structure.

Exercise Outcome

After completing this exercise, you will have successfully transformed a CSV file into a structured XML format, configured an XSD file to define the XML structure, mapped CSV headers to corresponding XML elements, and applied filtering to process data based on category.

What will you learn from this exercise?

Through this exercise, you will gain knowledge in:

  • Data Transformation - Understanding how to structure data for different formats.
  • XML Configuration - Using XSD files to define XML structures.
  • Data Filtering - Applying category-based filtering to extract relevant records.
  • Integration Flow - Understanding how CSV data can be converted into XML for system integration.

This hands-on exercise prepares you for working with structured data in enterprise applications and integrations.

Set up Authentication to Send the Messages

Business Scenario

You want to send a message via the POST method to your integration flow.

Task Flow

In this exercise, you will perform the following tasks:

  1. Set up SAP Business Application Studio with all the needed environment variables.
  2. Send a message to your integration flow.

Prerequisites

  • You have access to SAP Integration Suite and Cloud Integration.
  • You have a working and deployed CSVtoXML Integration Flow.
  • You have a working and deployed Generic Receiver Integration Flow.
  • You have access to SAP Business Application Studio.
  • Authentication with clientID and clientsecret is set for your account.

Note

If you don't have clientID and clientsecret set up in your subaccount, please refer to the Basic Authentication with clientsecret and clientid for Integration Flow Processing documentation.

Note

If you are working with a SAP BTP Trial account, you need to set up SAP Business Application Studio first. For more information, refer to the Set Up SAP Business Application Studio tutorial.

Exercise Outcome:

After completing this exercise, you can successfully send a message via SAP Business Application Studio to your deployed Generic Receiver and CSVtoXML integration flows.

What will you learn from this exercise?

Through this exercise, you will gain knowledge in:

  • Using the provided Postman collection.
  • Setting up SAP Business Application Studio to send authenticated requests.
  • Sending a message to your integration flow in the SAP Integration Suite.

This hands-on exercise prepares you for working with SAP Business Application Studio and the Postman collection to prove that your integration flow is working.

Task 1: Set Up Your Dev Space in SAP Business Application Studio

Steps

  1. Set up the SAP Business Application Studio and create a dedicated Dev Space for development. A Dev Space in SAP Business Application Studio is an isolated, preconfigured development environment that provides all required tools, runtimes, and extensions for building and running SAP applications.

    Note

    If you are using a SAP BTP trial account, you need to first enable and subscribe to SAP Business Application Studio. Check out the Set Up SAP Business Application Studio tutorial.
    1. Click on Instances and Subscriptions. Search for the SAP Business Application Studio application and click on it.

      SAP Business Application Studio
    2. SAP Business Application Studio opens in a new tab. Click on Create Dev Space.

      Create Dev Space
    3. Now, enter a name for your Dev Space. Scroll down and select Basic and confirm by clicking on Create Dev Space in the lower right corner.

      Enter Dev Space Name
    4. Wait for your Dev Space to reach the status Running and then click on it to open.

      Dev Space Running
  2. Create a file in the SAP Business Application Studio to configure the authorization by combining the required credentials and secrets, enabling the generation of a new bearer token whenever needed.

    1. We are now on the overview page of our SAP Business Application Studio Dev Space. We want to create a file to generate a bearer token for further authorization. Click on the hamburger menu in the top left corner and select File > New File...

      Create a new file
    2. Name the new file GetToken.http and hit enter. Then click OK to confirm.

      Enter file name
    3. Your newly created file opens with an editor view. Paste the following code into the editor. We will fill out the Token URL and the Auth Client Credentials in the next steps.

      Code Snippet
      ### Request OAuth Token
      POST <Token URL>
      Authorization: Basic <Auth Client Credentials>
      Content-Type: application/x-www-form-urlencoded

      grant_type=client_credentials
      Code for OAuth Token
    4. We now need the ClientID, Client Secret, and the Token URL for our SAP Integration Suite. Navigate to the cockpit of your SAP BTP subaccount. Go to Instances and Subscriptions and find the default_integration_flow instance. Click on 3 keys to see the credentials.

      default_integration_flow instance
    5. Select the Form view. Here you find the clientID, clientsecret, and the tokenurl that we will need for the following steps. You can just leave the Credentials tab open and get back to it as needed.

      Credentials
    6. Now we want to merge the clientID and the clientsecret into one Base64-encoded string. This will be our value for <Auth Client Credentials>. Navigate back to your SAP Business Application Studio Dev Space. Click on the hamburger menu and open Terminal > New Terminal.

      Open a New Terminal
    7. In the terminal, enter the following command, replacing <clientID> and <clientsecret> with your respective credentials from the previous steps. Hit enter. You will receive a long string of numbers and letters. Copy this string and paste it into your GetToken.http file in place of <Auth Client Credentials>.

      Code Snippet
      echo -n '<clientID>:<clientsecret>' | base64

      Create Base64-coded string
    8. Enter the token URL next to POST and then click on Send Request. If everything works, you get a response code 200 OK and you will find your freshly generated bearer/access token in the response file. We will need this access token in the coming exercises to send HTTP requests to our integration flows. This token will expire after a few minutes, so keep in mind that you have to repeat this step after a while to get a fresh token.

      response access token
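As an alternative to the terminal command in step 7, the Base64 value for <Auth Client Credentials> can be computed with a short Python snippet. The credentials shown here are placeholders, not real values:

```python
import base64

def basic_auth_value(client_id, client_secret):
    """Equivalent of: echo -n '<clientID>:<clientsecret>' | base64"""
    raw = f"{client_id}:{client_secret}".encode("utf-8")
    return base64.b64encode(raw).decode("ascii")

# Example with placeholder credentials:
print(basic_auth_value("id", "secret"))  # aWQ6c2VjcmV0
```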

Send a Message to the CSV_to_XML Integration Flow

Business Scenario

Understand the concept of using SAP Business Application Studio and the Postman collection to send messages to your integration flow.

Task Flow

In this exercise, you perform the following tasks:

  1. Check and verify that the HTTPS adapter address is correctly set in your integration flow.
  2. Send a message to your integration flow.

Prerequisites

  • You have access to SAP Integration Suite, Cloud Integration, and to SAP Business Application Studio.
  • You have a working and deployed CSVtoXML Integration Flow.
  • You have a working and deployed Generic Receiver Integration Flow.

Exercise Outcome

After completing this exercise, you will have successfully sent a message via SAP Business Application Studio to your Generic Receiver and CSVtoXML integration flow.

What will you learn from this exercise?

Through this exercise, you will gain knowledge in:

  • Using the provided Postman collection.
  • Sending a message to your integration flow in SAP Integration Suite.

This hands-on exercise prepares you for working with SAP Business Application Studio and the Postman collection to prove that your integration flow is working.

Task 1: Send and Receive Messages

Steps

  1. Ensure that the Generic_Receiver integration flow and the CSVtoXML integration flow are both deployed. Set the log level to Trace.

    1. In your SAP Integration Suite account, navigate to Integrations and APIs and your recently created Basic_Modeling Package. Check your Generic_Receiver integration flow and make sure it is still deployed.

      Basic_Modeling Integration Package
    2. Navigate to your CSV_to_XML integration flow and open it. Check the deployment status and click on Navigate to Manage Integration Content.

      CSV_to_XML Integration Flow
    3. Set the log level to Trace; we will need this later for our monitoring. Deploy your integration flow again and navigate back to the Manage Integration Content page. There you will see your endpoint address. We will need this address to send a message to this integration flow via SAP Business Application Studio. You can either copy and paste it to an editor, or leave the tab open and come back later to copy it.

      Note

      It can take some time for the endpoint address of your integration flow to appear after deployment. If it is not visible yet, wait a few minutes and then check again.
      Endpoint Address
    4. Now we take a look at our Postman collection. Download the Postman collection called ModelingBasics.postman_collection.zip and unpack it. You will then have a .json file on hand.

      Download Postman Collection
    5. Double-click on the .json file to open it. This Postman collection contains several sets of data. For this exercise, we want to use the CsvToXml data set. Open the search bar and search for csvto. You will see the CsvToXml data set section. For our exercise payload, we need the content after the raw section. You can copy and paste it to an editor or leave the file open and come back to it later.

      Note

      Make sure that you only copy the text inside the quotation marks.
      csv to xml data set
  2. Create a new .http file in the SAP Business Application Studio and add the message content to it. This file is then used to send the message request to the integration flow.

    1. Create a .http file to send messages to your integration flow. Navigate to your recently created Dev Space in SAP Business Application Studio. Click on the hamburger menu in the top left corner and click on File > New File...

      create a new .http file
    2. Name it CSVtoXML.http and hit enter. Confirm again with OK.

      Name the .http file
    3. Your .http file is now created and open. Paste the following code into the editor. We will fill out the <Endpoint URL>, the <Bearer Token>, and the <PAYLOAD> in the following steps.

      Code Snippet
      ### Send Message to Iflow
      POST <Endpoint URL>
      Authorization: Bearer <Bearer Token>

      <PAYLOAD>
      code block
    4. Copy and paste the content that we selected from the Postman collection .json file to the <PAYLOAD> section. (Replace the <PAYLOAD> text completely.)

      Paste payload into the .http file
    5. The payload still contains escaped line breaks (the characters \r\n as literal text) that we need to convert into real line breaks so that the CSV content is in the correct format. We do this using the search and replace function of the editor. Open it with Ctrl+F. Type \\r\\n|\\n into the search bar and activate the regex mode (.*). In the replace field, enter \n. Then click the Replace All symbol.

      Remove Line Breaks
    6. Your payload should now look like a block and not like a string. Now enter the endpoint address from your CSV_to_XML integration flow and enter your bearer token. Make sure the bearer token is still valid. If not, create a fresh one using your GetToken.http file.

      Fill in the missing elements
    7. Now click on Send Request to send the message to your integration flow. If everything went well, you will receive a 200 OK response.

      send the message
  3. Check the monitoring to verify the message processing. Review the payload within the integration flow and also take a look at the Data Store.

    1. Go back to your SAP Integration Suite account and navigate to the Monitor Message Processing view of your CSV_to_XML integration flow. Click on Trace to get a detailed monitoring view.

      Monitor Message Processing View
    2. Select the HTTPS element and navigate to the Message Content and the Payload. Here you can see the original payload that you sent to your integration flow.

      message content before
    3. Now, select the End element and check out the payload. You can now see that it has been converted to XML.

      message content after
    4. Our CSV_to_XML integration flow sends the payload through the Generic_Receiver integration flow to a data store. Navigate to Monitor > Integration and APIs > Manage Stores > Data Stores.

      Navigate to data store
    5. You can now see that a data store entry has been created and that it has a unique ID and Message ID.

      Data store view
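The line-break cleanup performed in the editor (step 2.5 above) can also be reproduced programmatically. A minimal sketch, assuming the payload was copied from the collection's JSON export and therefore contains the characters \r\n as literal text:

```python
import re

def unescape_breaks(payload):
    # The collection's JSON stores line breaks as the literal characters
    # "\r\n" or "\n"; convert them into real newlines. This mirrors the
    # editor search \\r\\n|\\n with replacement \n in regex mode.
    return re.sub(r"\\r\\n|\\n", "\n", payload)

raw = "Category;ProductId\\r\\nNotebooks;HT-1000\\r\\n"
print(unescape_breaks(raw))
```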

Mapping Context

Understanding the Mapping Context

Before implementing message mapping, it is crucial to understand the concept of mapping context. The mapping context ensures that source values are correctly assigned to target fields, particularly when the source and target structures differ in terms of hierarchy levels and occurrences.

In this concept, we will explore the significance of setting the mapping context correctly to avoid data loss or incorrect mappings.

Exploring the Use Case

Consider an incoming XML message containing product classification information structured as follows:

A screenshot of XML code depicting product hierarchy, with main category Printers and Scanners. It includes multifunction printers and scanners, each with respective products listed.

Our goal is to flatten this structure into a product list while assigning the main category as a node attribute, resulting in:

Screenshot of XML code snippet with namespace and product descriptions for categories Printers and Scanners featuring Multi Print, Multi Color, Power Scan, and Photo Scan.

Implementing the Mapping Context

To achieve this transformation, follow these steps:

  1. Map the source field ns1:ProductHierarchy\MainCategory\Category\Product to the target field ns1:Products\Product.
  2. Set the context for each field appropriately in the mapping expression editor.
  3. Ensure that the product list remains complete by selecting the root node as the context in the source field.

How Context Affects the Output

  • If the context is set to Category, only the first category's products appear.
  • If set to MainCategory, only the first main category's products are included.
  • If set to the root ProductHierarchy, the entire product list is included.
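The effect of the three context settings can be illustrated with a small Python model of the sample hierarchy. This only mimics the observable queue behavior; it is not how the mapping runtime is implemented:

```python
# Sample source structure: MainCategory -> Category -> products,
# mirroring the Printers and Scanners example above.
hierarchy = {
    "Printers and Scanners": {
        "Multifunction Printers": ["Multi Print", "Multi Color"],
        "Scanners": ["Power Scan", "Photo Scan"],
    },
}

def mapped_products(context):
    """Which Product values reach the target for a given context setting."""
    first_main = next(iter(hierarchy.values()))
    if context == "Category":
        # Only the products before the first context change survive,
        # that is, the first category's products.
        return list(next(iter(first_main.values())))
    if context == "MainCategory":
        # All products of the first main category.
        return [p for prods in first_main.values() for p in prods]
    # Root node ProductHierarchy: the complete product list.
    return [p for main in hierarchy.values()
            for prods in main.values() for p in prods]

print(mapped_products("Category"))               # ['Multi Print', 'Multi Color']
print(len(mapped_products("ProductHierarchy")))  # 4
```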

Implementation

Now, we look at the implementation of message mapping within an Integration Flow. Understanding how to structure and process message transformations ensures accurate data assignment between the source and target fields.

SAP integration process flow diagram showing sender to receiver communication via HTTPS, message mapping, context definition, and direct processing end.

The example Integration Flow Modeling Basics - Mapping Context follows a structured approach:

  1. Receiving the Message: The Integration Flow receives an incoming message through an HTTPS adapter.
  2. Message Mapping Step: The message undergoes transformation in a mapping step to fit the target structure.
  3. Defining Field Mappings: The mapping is established as follows:

    The source field ns1:ProductHierarchy\MainCategory\Category\Product is mapped to the target field ns1:Products\Product.

Diagram shows a mapping tool interface with ProductHierarchy on the left, Products on the right, and a mapping expression at the bottom connecting Product to Product. Functions panel is on the left.

In the mapping expression editor, you can define the context for each field within the source structure individually. To ensure that the target structure contains a complete list of products, the message context must be set correctly. This is done by selecting the message root node as the context in the context menu of the source field.

Dropdown menu with options: Copy Expression, Contexts, Return as XML, Display Queue, Find Field, Information. Context submenu shows ns1:ProductHierarchy, MainCategory, and Category.

Understand the Rationale behind Context Settings

To grasp why this setting is necessary, let's analyze how message processing behaves:

  1. Message Import and Context Separation

    The XML instance is imported into the processing queues before the target field mapping takes effect. Context changes within the queue determine how data is grouped.

  2. Root Node Generation

    The root node ns1:Products is created.

  3. Mapping Values to the Target Structure

    Within the current context, values are mapped to the target node ns1:Products\Product.

  4. Context Closure and Completion

    Once the first context is processed, it closes along with the ns1:Products context in the target structure. The transformation is finalized once all fields in the target structure have been processed.

Impact of Context Selection on Output

If the context of the product source field is set at the category level, only a subset of the product list is generated. Specifically, only the products within the first context, separated by a context change, are included.

This means that the way you set the context in the source structure directly impacts the result.

Example Outcomes Based on Context Selection:

  1. Context Set to Category Level (ns1:ProductHierarchy\MainCategory\Category\Product = Category).

    This is the default setting when creating a field mapping. It results in only the products of the first category, Multifunction printers, being included:

    Code Snippet

    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:Products xmlns:ns0="http://demo.sap.com/mapping/context">
        <Product MainCategory="Printers and Scanners">Multi Print</Product>
        <Product MainCategory="Printers and Scanners">Multi Color</Product>
    </ns0:Products>
  2. Context Set to Main Category Level (ns1:ProductHierarchy\MainCategory\Category\Product = MainCategory).

    This setting includes all products within the first main category, Printers and Scanners:

    Code Snippet

    <?xml version="1.0" encoding="UTF-8"?>
    <ns0:Products xmlns:ns0="http://demo.sap.com/mapping/context">
        <Product MainCategory="Printers and Scanners">Multi Print</Product>
        <Product MainCategory="Printers and Scanners">Multi Color</Product>
        <Product MainCategory="Printers and Scanners">Power Scan</Product>
        <Product MainCategory="Printers and Scanners">Photo Scan</Product>
    </ns0:Products>
  3. Context Set to Root Level (ns1:ProductHierarchy\MainCategory\Category\Product = ns1:ProductHierarchy).

    Selecting the root context ensures that the complete list of products is included.

Screenshot of a data mapping and transformation interface, displaying product hierarchy structure, mapping expressions, and functions used to transform and merge specific data fields.

Using the useOneAsMany Function for Mapping

To map the MainCategory attribute of ns1:Products\Product, the standard function useOneAsMany is applied. This function ensures that the main category name is duplicated correctly.

  • First argument: Specifies the value to be passed to the target (in this case, the Name attribute of the MainCategory node).
  • Second argument: Determines how many times the value should be repeated in the target.
  • Third argument: Defines the context change in the target.

Both the second and third arguments are derived from the source field ns1:ProductHierarchy\MainCategory\Category\Product. By carefully setting the context, you can control how data is grouped and processed, ensuring accurate transformation results.
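As a rough mental model of these three arguments (a simplification, not the actual mapping runtime), a queue can be represented as a list of contexts and each context as a list of values:

```python
def use_one_as_many(in1, in2, in3):
    """Simplified model of useOneAsMany.

    in1: one value per context (the value to replicate, e.g. the main
         category name)
    in2: drives how often each value is repeated (once per value)
    in3: drives where context changes occur in the target (one target
         context per context in in3)
    """
    # Repeat each in1 value once per value in the matching in2 context.
    repeated = [v for c1, c2 in zip(in1, in2) for v in [c1[0]] * len(c2)]
    # Regroup the flat value stream by the context sizes of in3.
    out, i = [], 0
    for c3 in in3:
        out.append(repeated[i:i + len(c3)])
        i += len(c3)
    return out

# Main category name repeated once per product in that main category,
# with a context change when the main category changes.
in1 = [["Printers and Scanners"], ["Computer systems"]]
in2 = [["Multi Print", "Multi Color", "Power Scan", "Photo Scan"],
       ["Notebook Basic 15", "Notebook Pro 15"]]
print(use_one_as_many(in1, in2, in2))
```

In this model, "Printers and Scanners" is emitted four times (once per product in the first context) before the context change, then "Computer systems" twice.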

Two screenshots show dropdown menus with options like Copy Expression and Contexts. The menu connects to ns1:ProductHierarchy with options MainCategory and Category being selected.

To correctly assign the appropriate main category name to the products, the context for both the Name attribute of the MainCategory node and the Product field must be set to MainCategory.

The output of the standard function useOneAsMany must include a context change after each value. Without this, the second and third products would be incorrectly assigned to the Computer systems and Computer components main categories, respectively, while the remaining products would have an empty MainCategory attribute.

A user interface showing an options menu with Copy Expression, Display Queue, and Information choices. The cursor points to Display Queue.

To prevent this issue, we insert the standard function splitByValue between the product source field and the third input argument, using the Context Change on Each Value option. As mentioned earlier, the third argument determines the context change in the target structure.
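Under the same simplified queue model used above (an illustration only, not the actual runtime), splitByValue with the Context Change on Each Value option can be sketched like this:

```python
def split_by_value_each(queue):
    """Model of splitByValue with 'Context Change on Each Value':
    every value ends up in its own context."""
    return [[value] for context in queue for value in context]

# One context holding four products becomes four single-value contexts,
# so useOneAsMany inserts a context change after each value in the target.
products = [["Multi Print", "Multi Color", "Power Scan", "Photo Scan"]]
print(split_by_value_each(products))
# [['Multi Print'], ['Multi Color'], ['Power Scan'], ['Photo Scan']]
```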

A flowchart showing the node splitByValue with sections input and Advanced. The text Context Change on Each value is displayed with a dropdown menu button.

The message queue of the useOneAsMany function appears as follows. It shows that the main category, Printers and Scanners, is repeated until a context change occurs in the second input argument. Based on the context settings, this change happens correctly when the main category shifts.

A table titled Function Message Queue, showing columns useOneAsMany:in1 to useOneAsMany:out1 with various entries related to printers, computer systems, components, USB sticks, and webcams.

Get more information at:

Mapping Context

Business Scenario

Understand the mapping context by using the Message Mapping artifact in the integration flow to ensure that source values are correctly assigned to target fields, particularly when the source and target structures differ in terms of hierarchy levels and occurrences.

Task Flow

In this exercise, you perform the following tasks:

  1. Create an integration flow with a message mapping.
  2. Configure the Mapping in the integration flow.

Prerequisites

  • You have access to SAP Integration Suite and Cloud Integration.
  • Basic understanding of message mapping.

Exercise Outcome

By completing this exercise, you learn how to set and apply the mapping context to ensure correct data transformation, prevent data loss, and accurately assign source values to target fields in message mapping.

What will you learn from this exercise?

Through this exercise, you will gain knowledge in:

  • Understanding Mapping Context: How mapping context affects data transformation and ensures correct assignment of source values to target fields.
  • Context Settings and their impact: How different context levels (Category, MainCategory, Root) influence the structure and completeness of the output.
  • Applying Standard Functions: How to use useOneAsMany and splitByValue to correctly handle data replication and context changes.
  • Ensuring Accurate Data Mapping: Best practices for preventing incorrect assignments and missing values by properly setting the mapping context.
  • Implementing Message Mapping: Step-by-step guidance on defining field mappings and processing message transformation in an integration flow.

By the end of this exercise, you will be able to confidently apply mapping context settings to achieve accurate and complete data transformation in message mapping.

Send the Message and Check the Integration Flow

Business Scenario

Understand how to validate your integration flows for correctness using SAP Business Application Studio with the provided Postman collection.

Task Flow

In this exercise, you will perform the following tasks:

  1. Check the deployment status of your integration flows.
  2. Set up testing environments in SAP Business Application Studio and create test requests.

Prerequisites

  • Access to SAP Integration Suite and Cloud Integration.
  • Basic understanding of message mapping.
  • Access to SAP Business Application Studio to send a message to your integration flow.

Exercise Outcome

You will be able to confidently test and validate your integration flows using SAP Business Application Studio, ensuring they behave correctly, handle data as expected, and return the proper responses in various scenarios.

What will you learn from this exercise?

Through this exercise, you will be able to:

  • Understand how to use SAP Business Application Studio for API testing.
  • Create and execute test cases for your integration flows.
  • Identify and fix issues in your integration logic.
  • Deploy or maintain integration flows in production.

Task 1: Check the Deployment Status of Your Integration Flows

Steps

  1. Ensure that the Generic_Receiver integration flow and the Mapping_Context integration flow are both deployed. Set the log level to Trace.

    1. In your SAP Integration Suite account, navigate to Integrations and APIs and your recently created Basic_Modeling package. Check your Generic_Receiver integration flow and make sure it is still deployed.

    2. Navigate to your Mapping_Context integration flow and open it. Check the deployment status and click on Navigate to Manage Integration Content. Set the log level to Trace; we will need this later for our monitoring. Deploy your integration flow again and navigate back to the Manage Integration Content page, where you will see your endpoint address. We need this address to send a message to this integration flow via SAP Business Application Studio. You can either copy and paste it to an editor, or leave the tab open and come back later to copy it.

      Endpoint Address
    3. Go to the Postman collection .json file that we downloaded in the previous exercise and double-click it to open it. For this exercise, we want to use the MappingContext data set. Open the search bar, search for mapping, and take a look at the MappingContext data set section. In this case, we are not going to copy it to our .http file, because cleaning up the line breaks is more extensive here. To save time, you can copy the formatted XML payload from here:

      Mapping Context Data Set
      Code Snippet

      <?xml version="1.0" encoding="UTF-8"?>
      <ns0:ProductHierarchy xmlns:ns0="http://demo.sap.com/mapping/context">
          <MainCategory Name="Printers and Scanners">
              <Category Name="Multifunction printers">
                  <Product>Multi Print</Product>
                  <Product>Multi Color</Product>
              </Category>
              <Category Name="Scanners">
                  <Product>Power Scan</Product>
                  <Product>Photo Scan</Product>
              </Category>
          </MainCategory>
          <MainCategory Name="Computer systems">
              <Category Name="Computer system accessories">
                  <Product>Notebook Lock</Product>
                  <Product>USB Stick 1 GB</Product>
                  <Product>USB Stick 2 GB</Product>
                  <Product>Web cam reality</Product>
              </Category>
              <Category Name="Notebooks">
                  <Product>Notebook Basic 15</Product>
                  <Product>Notebook Pro 15</Product>
              </Category>
              <Category Name="PCs">
                  <Product>Gaming Monster</Product>
                  <Product>Gaming Monster Pro</Product>
              </Category>
          </MainCategory>
          <MainCategory Name="Computer components">
              <Category Name="Graphic cards">
                  <Product>Hurricane GX</Product>
                  <Product>Gladiator MX</Product>
              </Category>
              <Category Name="Telecommunication">
                  <Product>Modem Hyper Speed</Product>
                  <Product>ADSL progress T1</Product>
                  <Product>ISDN direct</Product>
              </Category>
          </MainCategory>
      </ns0:ProductHierarchy>

      Note

      You can also find this code block on GitHub for easier copying.
  2. Create a new .http file in the SAP Business Application Studio and add the message content to it. This file is then used to send the message request to the integration flow.

    1. Navigate to your recently created dev space in SAP Business Application Studio. Click on the burger menu in the top left corner and select File → New File....

      create new .http file
    2. Name it MappingContext.http and press Enter. Confirm again with OK.

      Name the .http file
    3. Your .http file is now created and open. Paste the following code into the editor. We will fill out the <Endpoint URL>, the <Bearer Token>, and the <PAYLOAD> in the following steps.

      Code Snippet

      ### Send Message to Iflow
      POST <Endpoint URL>
      Authorization: Bearer <Bearer Token>

      <PAYLOAD>
      code block
    4. Copy and paste the formatted XML content to the <PAYLOAD> section. (Replace the <PAYLOAD> text completely.)

    5. Now enter the endpoint address from your Mapping_Context integration flow and add your bearer token. Make sure the bearer token is still valid; if not, create a fresh one using your GetToken.http file.

    6. Click on Send Request to send the message to your integration flow. If everything went well, you will receive a 200 OK response. The response also shows what the XML looks like after the mapping in the integration flow.

      send the message
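As an alternative to the .http file, the same request can be sent from a terminal with curl. This is a sketch only: the placeholders are the same as above, and the payload.xml file name is an assumption (save the formatted XML payload under any name and reference it here).

```shell
# <Endpoint URL> and <Bearer Token> are placeholders; payload.xml is an
# assumed file name containing the MappingContext data set.
curl -X POST "<Endpoint URL>" \
  -H "Authorization: Bearer <Bearer Token>" \
  -H "Content-Type: application/xml" \
  --data-binary @payload.xml
```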
  3. Check the monitoring to verify the message processing. Review the payload within the integration flow and also take a look at the Data Store.

    1. Go back to your SAP Integration Suite account and navigate to the Monitor Message Processing view of your Mapping_Context integration flow. Click on Trace to get a detailed monitoring view.

    2. Select the Message Mapping element and navigate to the Message Content and the Payload. Here you can see the original payload that you sent to your integration flow.

    3. Now, select the End element and check out the payload. You can now see that the payload has been transformed.

      message content after
    4. Our Mapping_Context integration flow sends the payload through the Generic_Receiver integration flow to a data store. Navigate to Monitor → Integration and APIs → Manage Stores → Data Stores.

      Navigate to data store
    5. You can now see that a data store entry has been created and that it has a unique ID and Message ID.

      Data Store View

Use a Timer to Send Messages

Use a Timer and a Content Modifier instead of an HTTP Client

In the previous exercises, we have used SAP Business Application Studio as an HTTP client to send a message to our iflow. This required setting up authentication in SAP Business Application Studio and using a sender element in our iflow.

If you just want to quickly test the functions of your iflow, you can use a timer element and a content modifier containing your payload to simulate messages coming into your iflow. This is useful, for example, when you have not yet decided where your messages will be sent from, or when you need a quick solution without setting up SAP Business Application Studio or another HTTP client.

  1. Remove the sender element
    Remove the sender element from your integration flow. Click on the sender and then select the trash can symbol.
  2. Add Content Modifier
    Add a Content Modifier element to your integration flow.
  3. Configure the Content Modifier
    Double-click on the Content Modifier element. In the menu, select the Message Body tab and add your payload in the Body field. Make sure that the payload has the right format for your iflow; if not, further mapping steps may be required. For this example, we are using the CSVtoXML sample payload, since we are working with the CSV to XML converter iflow. Also check that the Type is set to Constant.
  4. Add a Timer element
    Now add a Timer element. You could change its name, but there is no need to edit it further for our iflow example. This way, the Timer element triggers the integration flow each time we deploy it.
  5. Remove the Start element
    Since we are using the Timer element instead, we need to remove the Start element that is there by default. Simply click on it and then select the trash can symbol.
  6. Save your iflow, deploy it, and then check the monitoring to verify that everything worked.