In this lesson, we focus on a realistic example of how to upload data from a source system to SAP BRH, using different tools.
Specifically, we start from a data source consisting of a Microsoft Excel sheet and end with the final result available in SAP BRH.

The reader is expected to be familiar with the content of the previous lesson, which also covers the topics of communication methods, data storage, and others.

As we have seen previously, regardless of the communication method (asynchronous or synchronous), the payload is always the same.
In this case we consider a Microsoft Excel sheet as the data source; however, the same would apply if the data were imported into a database or delivered through an API call of the originating source system.
In this example we have a table, and it is essential to compare both the semantics and the syntax of its content in order to ensure that the data can be appropriately represented in SAP BRH.

We are focusing on a list of batches that we want to upload to SAP BRH.
For the current stage of our integration work, we will only focus on the following aspects:
- Which kind of information is handled by the SAP BRH API for the Batch
- The syntax used to specify it
Later we will come back to the same page in order to extract other information needed for the integration, such as:
- Where to post the data
- What to expect in case of failure
- What to expect in case of success
Single source of truth
The single source of truth for the Batches is provided by the SAP API Business Hub.
The following table shows the current specification of the payload for the creation of a Batch.
For the sake of simplicity, the objects releaseCountries, clinicalTrial and clinicalTrialStudies are not discussed further.

The most difficult part of the process is how to "map" the content of the source system to the API specification.
This can only be performed manually, and the quality of this manual work determines the success of everything that follows.
The steps that must be executed are:
- Semantic mapping: which property in the source system matches the meaning of which property in the target system? This might seem trivial, but often it is not, since the same property can be named differently in different systems, or the same name can carry a different meaning. For example, a property called "Name" in one system could be called "Last name" in another system.
- Granularity: some systems might have a lower level of granularity, which then requires generating new data. For example, in a source system the property "Name" could be "Jane Smith", whereas the target system requires "First name" and "Last name". In this example, the mapping process would then have to generate "Jane" as "First name" and "Smith" as "Last name".
- Allowed values: we already covered the handling of "nullable" items in the previous lesson, and we have seen that in SAP BRH most of the input values for the Batch are "nullable". However, there is also another perspective, namely which values are actually allowed.
- String: We have already seen in the previous lesson that a "String" is a series of characters enclosed in double quotes, which can contain any of the allowed UTF-16 characters. In the previous lesson (and also in the screenshot above) we showed that SAP BRH is able to handle several character sets for any of the "string" properties.
- Number: Here one has to be very careful. A "Number" is not simply a series of digits; it could also contain "separators" such as a dot or a comma. In the case of SAP BRH, the number must be "dot" separated and not "comma" separated, since the comma, when it is not part of a string, is used as a property separator (JSON data format). In English-speaking source systems, the number one million and one cent is written as 1,000,000.01. In SAP BRH that would cause serious issues, and the resulting error message can be difficult to understand. Instead it should be 1000000.01; please be careful not to specify it as 1000000,01 (a comma separator is not allowed).
- Date: As you can see, in SAP BRH there is no "Date" type, since the underlying JSON format does not provide one. Therefore all "dates" and "date and time stamps" are simply strings in SAP BRH. This poses a significant problem in debugging, since the format has to be checked very carefully.
- Date and Time: For a few properties not only the date but also the precise time is required. The format of the time is very specific, and it is recommended to strictly follow the example provided in the insert (the black area in the visualization shown above) of the API Hub. A problem that clearly remains open when using times is the matter of time zones (a small conversion sketch follows this list).
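To make the syntactic part of these considerations concrete, the following is a minimal sketch in plain JavaScript (the helper names are hypothetical, not part of any SAP API) of the kind of conversions a mapping typically has to perform before the payload is built:

```javascript
// Minimal sketch of typical conversions discussed above (hypothetical helpers, not SAP code).

// Granularity: split a combined "Name" into first and last name.
function splitName(fullName) {
    const parts = fullName.trim().split(/\s+/);
    return { firstName: parts[0], lastName: parts.slice(1).join(" ") };
}

// Number: convert an English-formatted number ("1,000,000.01") into a JSON number.
function toJsonNumber(text) {
    return Number(text.replace(/,/g, ""));        // 1000000.01 - dot as decimal separator
}

// Date and time: produce an ISO-like string, since JSON (and therefore SAP BRH) stores dates as strings.
// Time zone handling still has to be agreed upon for the specific integration.
function toTimestampString(date) {
    return date.toISOString();                    // e.g. "2017-04-13T15:51:04.000Z"
}

console.log(splitName("Jane Smith"));             // { firstName: "Jane", lastName: "Smith" }
console.log(toJsonNumber("1,000,000.01"));        // 1000000.01
console.log(toTimestampString(new Date(Date.UTC(2017, 3, 13, 15, 51, 4))));
```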

By applying the previously described analysis to our starting table, we realize that there are a few problems:
- The "ID" of one of the batches is too long (>10 characters) and therefore must be truncated. Is this something that will still allow to identify it properly in the whole process? Likely not, then a new ID will have to be generated and a mapping will have to be created and maintained.
- Some information (Material Description, Material Color) cannot be used and needs to be dropped
- Other information needs to be converted, in this case "Material UoM" needs to be converted into a 3 digit representation. In SAP S/4HANA the topic of "Unit of Measurement" is very well detailed. However, SAP BRH has been created in a SAP-agnostic way in order to also allow customers that do not use SAP S/4HANA . Therefore a choice of the unit of measurement must be applied to the specific SAP BRH instance in a consistent way for all objects of the customer
- Missing values: SAP BRH requires some parameters to be present in the payload; otherwise the payload is not accepted. These are symbolized with a red mark in the original documentation, and with a "y" in the table above. For our example we see that we are missing the values for several properties:
- Material_ID
- plant_ID
- Status_ID
- sourceModifiedBy
- sourceIdentifier
- sourceModifiedAt
However, at the same time, some of those mandatory values are nullable, so one can provide "null" as their value. For the sake of our example we will add those values to the sample data. Clearly, in an integration project the decision on how to handle missing values is crucial and has to be analyzed carefully, together with all its implications.
Payload
ID | quantity | UnitOfMeasure_ID | Material_ID | plant_ID | Status_ID | sourceModifiedBy | sourceIdentifier | sourceModifiedAt |
---|---|---|---|---|---|---|---|---|
"Batch 0123" | 1 | "KG" | "M0001" | "P0001" | "A" | "string" | "ER1CLNT100" | 2017-04-13T15:51:04.0000000Z |
"01234567" | 10 | null | "M0002" | "P0001" | "A" | "string" | "ER1CLNT100" | 2017-04-13T15:51:04.0000000Z |
"123456" | null | null | "M0003" | "P0001" | "A" | "string" | "ER1CLNT100" | 2017-04-13T15:51:04.0000000Z |
In the above table we see the final content of the payload after having performed the appropriate semantic and syntactic mapping.
For those fields where we did not have any value, the values shown in the example of the SAP API Hub Help are used.
This is the content that we will load into SAP BRH.
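For illustration, a sketch of the JSON body for the first row of the table is shown below as a JavaScript object. The property names are taken verbatim from the table; the exact spelling and casing (for example Material_ID versus material_ID) must always be checked against the SAP API Business Hub specification.

```javascript
// Sketch of the payload for the first batch, derived from the table above.
// Verify property names and casing against the API specification before use.
const firstBatch = {
    ID: "Batch 0123",
    quantity: 1,
    UnitOfMeasure_ID: "KG",
    Material_ID: "M0001",
    plant_ID: "P0001",
    Status_ID: "A",
    sourceModifiedBy: "string",
    sourceIdentifier: "ER1CLNT100",
    sourceModifiedAt: "2017-04-13T15:51:04.0000000Z"   // dates are plain strings in SAP BRH
};
console.log(JSON.stringify(firstBatch, null, 2));
```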

As we have seen previously, several different approaches can be taken to transfer the data from an existing system into SAP BRH and to get the release decisions from SAP BRH back into the source system.
This flexibility is due to the architectural decision to avoid proprietary protocols, and instead use open API approaches.
In this lesson, we will use synchronous communication (ODATA4 protocol) as an example to demonstrate the implementation of SAP BRH APIs for data transfer.
There are several tools available for this purpose, and the choice among them depends on a variety of factors.
In the rest of this lesson we will use the tool "Postman". There are several tutorials on how to install and use Postman, so we will omit that information here.
The "Postman" tool is commonly used for prototyping and testing purposes and offers a free usage tier, which is the version used in these examples. For technical consultants, the examples shown in Postman can be replicated in any other prototyping tool, as the concept remains the same.
Video
Microlearning on How you can use Postman Part 1 (SAP Media Share)
The above video summarizes how to install and use Postman; however, it is not specific to SAP BRH.

In the figure above one can see how a POST call should be formatted, with the example data from the first row of the table above.
In order to make the above example operational, one needs to specify the variable "host" and the authorization (AUTH) as well.
Let's see what happens if we do not specify the AUTH and simply proceed with the "defaults" that Postman provides.

This is what will happen if one executes that command with the payload but without having provided a proper authorization.
How to Authorize with SAP BRH?
This is specified in the Administration Guide for SAP Batch Release Hub for Life Sciences in the chapter Providing Access to APIs for SAP Batch Release Hub for Life Sciences:
"Use the above details to perform a POST operation for generating the token, which is valid for a specified number of hours. Using the token you can perform the create, read, update, and delete (CRUD) operations."
In that part of the documentation, the help does not explain what the token is, how to get it, or how to use it; however, this is described in other chapters, for example in "Configure Landscape" (a small sketch follows the list below):
- Client ID: Match the clientid in the service-key
- Client Secret: Match the clientsecret in the service-key
- Token Service URL: Match the url in the service-key with the suffix /oauth/token?grant_type=client_credentials
- Token Service User: Same as Client ID
- Token Service Password: Same as Client Secret
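As a minimal sketch (assuming the service-key JSON contains a uaa section with url, clientid and clientsecret, as mentioned later for the brh-integration instance), these settings can be derived as follows; the url value is the example used further below:

```javascript
// Minimal sketch: deriving the landscape settings from a service key (assumption about its structure).
const serviceKey = {
    uaa: {
        url: "https://api.authentication.eu10.hana.ondemand.com",
        clientid: "…",        // take the value from your own service key
        clientsecret: "…"
    }
};
const tokenServiceUrl  = serviceKey.uaa.url + "/oauth/token?grant_type=client_credentials";
const tokenServiceUser = serviceKey.uaa.clientid;       // same as Client ID
const tokenServicePwd  = serviceKey.uaa.clientsecret;   // same as Client Secret
console.log(tokenServiceUrl);
```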
However, before we look into the specific details, we need to take a step back and look at the high-level data flows.

The picture above describes the actions that are performed by the three main components of the system, whereby we are using both shape and color coding (black = non-SAP, blue = SAP) to specify the origin of the systems.
1. The Integration APP issues a request for authorization to the SAP Authorization Server. It uses the ClientID and ClientSecret that the BTP administrator has provided. This information is created by the BTP Administrator and is referred to in the SAP BRH help as "service keys". How to create them is specified in the already mentioned chapter Providing Access to APIs for SAP Batch Release Hub for Life Sciences of the SAP BRH Help.
2. The SAP Authorization server verifies that the provided credentials are valid, and then returns an OAuth2 token formatted according to the open standard RFC 7519, "JSON Web Token" (JWT). We will look into that later; for now let's focus on the processing flow. If the authorization fails, it returns a corresponding message with "some" hint on the reason and how to fix it; we will look at some examples of that later.
3. With the token, the Integration APP issues calls to the SAP BRH Application server.
4. The SAP BRH Application server verifies the validity of the token by sending it to the Authentication server.
5. The Authentication server verifies that the token is valid and then returns either an "OK" message or an error message.
6. If the token was valid, the SAP BRH Application server processes the payload content and then returns either a "success" message or an error message.
The process of generating a token can be time consuming, and therefore it is important that the Integration App reuses the same token as much as possible. It is recommended to configure the integration app so that, in case step 6 returns a "token expiration" result, it requests a new token (step 1); otherwise it proceeds directly to step 3 (see the sketch below).
As can be seen above, the whole process follows the synchronous pattern, with one payload as input and two possible payloads (success or error) as output for each of those calls. The integration app must be able to handle all of these cases properly.
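The recommendation in the previous paragraph can be sketched as follows in Postman-style JavaScript; the environment variables token and token_expires_at are hypothetical names used only for this illustration:

```javascript
// Hedged sketch of token reuse (hypothetical variable names, not SAP-provided logic).
const token     = pm.environment.get("token");
const expiresAt = Number(pm.environment.get("token_expires_at"));   // stored when the token was fetched

if (!token || Date.now() >= expiresAt) {
    // Step 1: no valid token - request a new one before calling SAP BRH
    console.log("Token missing or expired: request a new token first");
} else {
    // Step 3: reuse the cached token and call SAP BRH directly
    console.log("Reusing cached token");
}
```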
Now that the overarching process is described let's look into the details.

SAP BRH is built on top of the SAP Business Technology Platform (SAP BTP), which, like any SAP software, is built and operated with a "security first" mindset and credo.
The BTP Help provides all details about the platform, including how to develop with it.
The chapter API Access Control of the BTP Help describes in detail the principle, the architecture and the components used. There you will find, amongst others, the following explanation:
"The User Account and Authentication (UAA) component is the central infrastructure component of the runtime platform for authentication and authorization management. … The UAA acts as an OAuth authorization server and issues an appropriate access token. It enables the business system to directly access an application in the runtime container. Runtime containers act as OAuth resource servers, using the container security API of the relevant container (for example, Java) to validate the token issued by the OAuth authorization server."
How to proceed is shown hereafter step by step:
Information needed to get the authorization token
Parameters | Example |
---|---|
url | https://api.authentication.eu10.hana.ondemand.com |
clientid | aa-bb-cccc11c1-d222-333e-44f4-g5g55ggg555g!a6666 |
clientsecret | aA1B2CcCCC3dDd+ee444fFF5ggG= |
Note
In this documentation, we use red variables enclosed in curly brackets to refer to the values of the corresponding parameters. For example, the value of the "url" parameter is hereafter referred to as {{url}}. This is also the convention used in Postman, where such variables can be managed through the so-called "Environment". The final environment configuration will be shown at the end of the lesson and will be downloadable for direct use in your local Postman installation. The values of the examples in this table are taken from the chapter Access SAP Authorization and Trust Management Service APIs.
This information is provided through the service keys that the BTP Administrator has created in order to provide access to the SAP BRH API.
That task is specified in the chapter Providing Access to APIs for SAP Batch Release Hub for Life Sciences of the Administration Guide for SAP Batch Release Hub for Life Sciences.

It is important to understand in detail what happens if one does not provide all the required parameters.
Let's start with the first step:
- url
- clientid
- clientsecret
The screen above shows how the basic configuration of the request for creating a token in Postman looks with only these three parameters.
The value of the URL to be used is the value of the "url" parameter within the uaa section of the integration keys that the customer's subaccount BTP Administrator has created for the brh-integration instance.

At this point, if you execute the POST, you will get a "400 Bad Request"; however, the server is very friendly and will also tell you why: "Missing grant type".
The error is returned from the /oauth/token endpoint of the authentication service. The already mentioned chapter Access SAP Authorization and Trust Management Service APIs explains that the OAuth call uses the client credentials flow, and it also shows that in order to get the token one needs to specify "grant_type=client_credentials".
In Postman this is done by:
- Setting the header Content-Type: application/x-www-form-urlencoded
- Adding the key:value pair grant_type:client_credentials (a script-based sketch of the same request follows)
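Putting these pieces together, a hedged sketch of the same token request issued from a Postman script is shown below; the environment variables url, clientid and clientsecret correspond to the parameters listed in the table above, and the use of Basic authentication with the client credentials is an assumption based on the standard OAuth client-credentials flow:

```javascript
// Hedged sketch: requesting the OAuth token from a Postman script.
const credentials = Buffer.from(
    pm.environment.get("clientid") + ":" + pm.environment.get("clientsecret")
).toString("base64");

pm.sendRequest({
    url: pm.environment.get("url") + "/oauth/token",
    method: "POST",
    header: {
        "Content-Type": "application/x-www-form-urlencoded",
        "Authorization": "Basic " + credentials
    },
    body: { mode: "urlencoded", urlencoded: [{ key: "grant_type", value: "client_credentials" }] }
}, (err, res) => {
    if (!err && res.code === 200) {
        pm.environment.set("token", res.json().access_token);   // store the token for reuse
    } else {
        console.log(err || res.code + " " + res.status);
    }
});
```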
JWT
{
  "access_token": "eyJhbGciOiJSUzI1N...",
  "token_type": "bearer",
  "expires_in": 43199,
  "scope": "xs_user.write uaa.resource xs_authorization.read xs_idp.write xs_user.read xs_idp.read xs_authorization.write",
  "jti": "be340353ac694b4cb504c6823f938647"
}
Executing that POST results in a "200 OK" and returns a JSON document similar to the above example (taken from the chapter "Access SAP Authorization and Trust Management Service APIs" of the SAP Business Technology Platform documentation).
That content is the JSON representation of the JWT token, previously described.
Of all the values shown there, the only one that is constant is the token_type.
The other values are application dependent and therefore for SAP BRH they are different from the above example.
All those parameters are described in Getting an application access token from the SAP Business Technology Platform online manual.
A good source of detailed knowledge on JWT is the Auth0 JSON Web Tokens page; however, it is not needed in order to proceed with the objective of posting the content of our table into SAP BRH.

In order to store the token for later reuse, a common approach in Postman is to use the test capability. In the example code shown above, we use the Postman JavaScript API to process the response and automatically load the token content into the variable {{token}} for later reuse in subsequent calls - namely, for the purpose of this tutorial, to post the content of our semantically and syntactically corrected batches.
The full documentation of the Postman API is available in the Postman JavaScript reference, and several videos can be found describing it in full.
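Since the figure with the script itself is not reproduced here, the following is a minimal sketch of such a "Tests" script, assuming the token response has the structure shown in the JWT example above:

```javascript
// Minimal sketch of a "Tests" script on the token request: store the token for reuse.
const response = pm.response.json();
pm.environment.set("token", response.access_token);
// Optional: remember when the token expires, so a new one can be requested in time
pm.environment.set("token_expires_at", Date.now() + response.expires_in * 1000);
```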

Now that we have obtained the token with the first call to the authentication service, it is time to use it and see it working with our payload.
As done previously to obtain the authorization token, we use the HEADER of the POST, however with different keys and values.
In order to add the authorization token to the call, we use the approach shown above, whereby the variable {{token}} holds the value of the token collected through the previous call.
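For reference, a hedged sketch of the same call issued from a script is shown below; the resource path {{host}}/BatchesStage is an assumption based on the endpoints used later in this lesson, and the shortened payload is for illustration only:

```javascript
// Hedged sketch of the batch POST with the bearer token (path and payload are assumptions).
const payload = {
    ID: "Batch 0123", Material_ID: "M0001", plant_ID: "P0001",
    sourceModifiedAt: "2017-04-13T15:51:04.0000000Z"
};
pm.sendRequest({
    url: pm.environment.get("host") + "/BatchesStage",
    method: "POST",
    header: {
        "Content-Type": "application/json",
        "Authorization": "Bearer " + pm.environment.get("token")   // token from the previous call
    },
    body: { mode: "raw", raw: JSON.stringify(payload) }
}, (err, res) => console.log(err ? err : res.code + " " + res.status));
```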
Video
SAP Uses OData (SAP Media Share)
In order to properly understand the interaction with the synchronous API provided by SAP BRH, it is important to dive a bit deeper into one detail that was mentioned in the previous lesson, namely the fact that the synchronous services use OData.
Video
Programming Model Quickie: Message Handling - OData (SAP Media Share)
The video above explains in a few minutes some of the technical details of OData that are useful for a solid understanding of the underlying message handling and for properly processing the responses from the SAP BRH API.

The posting was successful. This can be recognized by:
- The numerical code "201", which coincides with the documented success code
- The text version of the response code: "Created"
Additionally, the body of the response reports the same data we provided in the POST, plus the following fields:
- correlationIdentifier: A unique identifier across records provided in the same request context to SAP Batch Release Hub (internal).
- createdAt: The timestamp when the record was created in SAP Batch Release Hub (internal).
- createdBy: The user who triggered the creation of the record in SAP Batch Release Hub (internal).
- modifiedAt: The timestamp when the record was modified in SAP Batch Release Hub (internal).
- modifiedBy: The user who triggered the modification of the record in SAP Batch Release Hub (internal).
These fields are described in the Batches schema: https://api.sap.com/api/BatchesService/schema.
It is important to notice that:
- @odata.context: this parameter is not specified in the SAP BRH Help, since it is assumed that developers are familiar with OData. A good tutorial is available (see the OData links at the end of this lesson), and some information was provided in the video on the preceding page.
- createdAt and modifiedAt are the timestamps when the record was created or modified in SAP Batch Release Hub (internal); they therefore differ from sourceModifiedAt, which is the date and time when the record was created or changed in the source system.
- createdBy and modifiedBy contain the technical user that is executing the service, not the "sourceModifiedBy", since the latter is "A unique identifier for the user who created or changed a record."
- orderType: Purchase. This is information that the server created automatically; it is the "default" for the type of order that triggered the batch creation (an illustrative response sketch follows this list).
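For illustration only, the shape of such a 201 response is sketched below; all values are invented, and the exact @odata.context depends on the service metadata:

```javascript
// Illustration only: invented values showing the shape of a 201 "Created" response body.
const exampleResponse = {
    "@odata.context": "$metadata#BatchesStage/$entity",    // assumption: typical OData V4 context
    ID: "Batch 0123",
    Material_ID: "M0001",
    plant_ID: "P0001",
    sourceModifiedAt: "2017-04-13T15:51:04.0000000Z",
    sourceModifiedBy: "string",
    sourceIdentifier: "ER1CLNT100",
    orderType: "Purchase",                                  // default added by the server
    correlationIdentifier: "…",                             // internal, generated by SAP BRH
    createdAt: "2023-01-01T10:00:00.000Z",                  // invented timestamps
    modifiedAt: "2023-01-01T10:00:00.000Z",
    createdBy: "technical-user",                            // the technical user executing the service
    modifiedBy: "technical-user"
};
console.log(exampleResponse);
```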
How to verify that the data is truly in SAP BRH?
/BatchesStage
/BatchesStage(ID='Batch 0123',material_ID='M0001',plant_ID='P0001',sourceModifiedAt=2017-04-13T15:51:04.000Z)
/BatchesStage?$filter=ID eq 'Batch 0123'
In order to verify that the batch we sent was also received, we can query SAP BRH using the API as specified in https://api.sap.com/api/BatchesService/resource.
One of the advantages of using ODATA4 is the extended query capability that is automatically provided.
In this case, for example, one can:
- Get all the Batches that are in the "Stage" area
- Get all the Batches matching a specific property or property combination
- Get only the Batch that we have just created (second call)

By executing a GET call to the same endpoint we used before, and simply setting the Authorization header with the appropriate Bearer {{token}} value, you will receive all the Batches that are currently in the Stage area. This is very useful, since at any time you can see which data is currently loaded in the system. This applies to ALL objects in SAP BRH.
Clearly, executing that call will consume quite some time and network bandwidth, and therefore it should be done with care.

When we simply want to check that our POST of "Batch 0123" worked properly, we can use the same query as above, applying the OData $filter operation.
This is shown in the slide above. This possibility is *not* documented in the SAP BRH API, since it is a native feature of ODATA4; therefore you are encouraged to refer to the official ODATA4 documentation, which you can find in OData Version 4.0 and the additional pages referenced therein.

In the case of SAP BRH Batches, there could be multiple batches with the same ID that come from different plants, or the information of the same batch could have been modified. As you already know, in SAP BRH nothing gets deleted - at most it can be modified - and all changes are kept in the system for reference purposes.
The unique way to identify a Batch is documented (implicitly) in the API and is given by four mandatory parameters:
- Batch ID
- Material ID
- Plant ID
- sourceModifiedAt
This is precisely shown in the figure above.
Variables needed:
- the {{host}} to query
- the {{token}} to authenticate
- the ID
- material ID, plant ID and sourceModifiedAt
Response: exactly what we asked for, with additional @odata.context.

As shown, one can verify that the data is in SAP BRH by logging in directly, opening Manage Staging and Active Data via the Data Monitoring tile, and searching there, for example, by Batch ID.
A "goodie" provided by the standard SAP UI is the possibility to obtain a link to the user interface for this specific record and share it with others (who must have been granted access to SAP BRH) by clicking on the symbol shown at the top right.
https://YOURSERVER/cp.portal/site#DsoApp-display?sap-ui-app-id-hint=iron.dsoapp&/BatchesStage(ID='Batch%25200123',material_ID='M0001',plant_ID='P0001',sourceModifiedAt=2017-04-13T15%253A51%253A04.000Z)
Note
This is a sample link; in order for it to work, please replace the URL with the one specific to your tenant.
Performance first
Query | Time (milliseconds) |
---|---|
All Batches in Stage | 550 |
Use $filter | 1939 |
Only the specific Batch | 383 |
As you can see from the above examples, there is always an "ms" indication next to the "OK" result, showing the total time required in milliseconds.
Since we are operating on a multitenant system, the overall performance can vary slightly depending on system load; therefore, before taking any decisions, one should perform multiple queries and calculate an average (a small measurement sketch follows the list below).
However, from the single queries we completed on the very limited number of batches in the Stage area (24), we can already draw some lessons:
- Querying with all four key parameters gives a much faster response than anything else.
- Getting all batches and then filtering inside the client requires 50% more time than getting only the specified batch.
- Using $filter consumes a lot of time on the server. This increases the server load and is something to watch out for. Nevertheless, $filter is the only choice for collecting the records that match some other criteria without downloading the whole content of all records.
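If you want to collect such numbers systematically, a small sketch of a "Tests" script that accumulates the response time of each run is shown below; the global variable name response_times is hypothetical:

```javascript
// Hedged sketch: collecting response times over several runs to compute an average.
const times = JSON.parse(pm.globals.get("response_times") || "[]");
times.push(pm.response.responseTime);                       // current call's time, in milliseconds
pm.globals.set("response_times", JSON.stringify(times));

const average = times.reduce((sum, t) => sum + t, 0) / times.length;
console.log("Average over " + times.length + " calls: " + average.toFixed(0) + " ms");
```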

In lesson 2, we described the data object statuses and the ways to bring data in SAP BRH into the active stage:
- manually, through the user interface
- all at once from the staging area, through a separate API call
- automatically, by activating them when doing the POST
Hereafter, detailed examples are shown for each of those scenarios.

This is the originally intended approach of staging in data warehousing.
Namely, the data is loaded automatically and then, after human intervention, a decision is made as to whether the data was loaded correctly and which data should really be used.
In this case we:
- Create a new Batch (Batch ID: "Batch 0124") and load it into the Staging area.
- Activate it manually.
- Review the results: the Batch is now in the "Active" area and no longer in the "Staging" area.

In addition to manually activating a batch, one can also use a programmatic approach to move all batches from staging status to active status.
For example, a typical use case would be that all manual data cleansing checks are performed in a system preparation period during the week, and the users doing so do not have the required role to clear data from staging. This may be due to segregation of duties in a validated environment. With an automated POST call created in Postman, the process can be triggered and executed automatically.

In this example we do not show the "BODY", since it is the same as before; instead we focus our attention on:
- The header:
Here we see that the only difference from the previous POST to the staging area is the addition of the header key "x-sapbrh-autoactivate" = true (sketched below, after the query results).
- The response
It looks the same as before, unchanged.
If we now query the system, we will find:
- No presence of that batch in the Batches Stage area
- Presence of that batch in the Batches Active area.
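In script form, the only change compared with the earlier POST sketch is the additional header; everything else (endpoint, authorization, body) stays the same:

```javascript
// Hedged sketch: request headers for a POST with automatic activation.
const headers = {
    "Content-Type": "application/json",
    "Authorization": "Bearer " + pm.environment.get("token"),
    "x-sapbrh-autoactivate": "true"      // the only difference from the earlier POST to staging
};
console.log(headers);
```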

Now we have all the parts needed to accomplish the task of uploading valid content into the SAP BRH staging area via Postman.
- Export from Excel into CSV
Postman is not able to read Excel files automatically; these need to be exported in CSV or JSON format. Excel cannot export to JSON, so in this example we simply use the export to comma-separated values format.
- Configure Postman to process all records in the table
The general approach is described in Using CSV and JSON Data Files in the Postman Collection Runner.
- Execute the collection

const body = {};
// Only add properties that are present in the current CSV row
if (pm.iterationData.get("ID") != "") {
    body.ID = "" + pm.iterationData.get("ID");
}
…
if (pm.iterationData.get("sourceIdentifier") != "") {
    body.sourceIdentifier = pm.iterationData.get("sourceIdentifier");
}
// Store the JSON string so that it can be referenced as {{body}} in the request body
pm.globals.set("body", JSON.stringify(body));
Postman is not only capable of automatically processing the output of a request (as shown previously for the handling of the token), but it can also operate on data before it is sent to the server.
We are going to use this feature to automatically extract the values from the CSV file exported from Excel and, for each row, create a new Batches object.
- Dynamic body
There are two main approaches in Postman to automatically process an external file and use it as the BODY of a POST: either using a predefined template in the BODY, or automatically creating the BODY in a pre-request script. The second approach is a bit more cumbersome; however, it is the best way to appropriately handle missing or incomplete data that would otherwise break the flow of the POST. The first change is therefore to the BODY of the Postman call: previously it stored the required key:value pairs in JSON format, whereas now it only contains the placeholder {{body}}, which is filled with structure and values by the pre-request script.
- Pre-request
The pre-request script in this case simply constructs an empty body object and then populates it with the key:value pairs that are not empty or null. At the end, it sets the value of the "body" variable to the appropriately formatted content (in this case it needs to be converted from a JSON object to a string). The full script for this example is provided as part of the collection, together with the input samples and the Postman environment file.

Once the POST is ready, simply click on the Collection item and select "Run". That will open the "Runner" tab, where you only need to select:
- Request Token
- One Batch in Stage from CSV
- Select the input file (here called Lesson.csv)
- Select the appropriate Environment (not yet selected in the screenshot above)
Click "run lesson4".
All the content of the CSV will be read, appropriately processed, and you will then see the results in the ways we described previously.

Here is the result, filtering for the batches that we just created (the only ones present in the Stage area for the plant "P0001").

Congratulations, you have reached the end of this lesson.
At this point you have all the knowledge required to automatically push data from any source of interest into SAP BRH.
In this specific case we described the use of Postman; in the future, other tools may be shown.
Hereafter, the links mentioned in this lesson are provided, grouped by topic. These should be considered "starting points" rather than a "full list", and the reader is encouraged to extend her/his knowledge by exploring the other information sources that those pages reference or are referenced by.
SAP BRH API
BRH API | |
---|---|
About This Administration Guide | https://help.sap.com/docs/BATCH_RELEASE_HUB_LS_CLOUD/5a6fd89b06e54ea5b82bd8da0e38c9a7/1ed947ea700f41f495eb9c3fa98b77bc.html |
Providing Access to APIs for SAP Batch Release Hub for Life Sciences | https://help.sap.com/docs/BATCH_RELEASE_HUB_LS_CLOUD/5a6fd89b06e54ea5b82bd8da0e38c9a7/84e2370497154d94a3e82fbcbd9f8d48.html |
API Reference - Batches - Staging | https://api.sap.com/api/BatchesService/resource |
Batches - Schema View | https://api.sap.com/api/BatchesService/schema |
OData
OData | |
---|---|
Programming Model Quickie: Message Handling - OData | https://video.sap.com/media/t/1_neik1fbn |
SAP Uses OData | https://video.sap.com/media/t/1_cyoegdhc |
OData - Advanced Tutorial | https://www.odata.org/getting-started/advanced-tutorial/ |
OData Version 4.0. | https://docs.oasis-open.org/odata/odata/v4.0/odata-v4.0-part1-protocol.html |
SAP Security
SAP Security | |
---|---|
Auth0 JSON Web Tokens | https://auth0.com/learn/json-web-tokens |
Authorization and Access Control | https://cap.cloud.sap/docs/guides/authorization |
Physical and Environmental Layer | https://help.sap.com/docs/CP_CONNECTIVITY/cca91383641e40ffbe03bdc78f00f681/a8bae56b47184ecaa54823327fe3ce49.html |
Cloud Infrastructure Layer | https://help.sap.com/docs/CP_CONNECTIVITY/cca91383641e40ffbe03bdc78f00f681/a30325af0903450993d9afb0128d7bd2.html |
SAP Business Technology Platform
SAP Business Technology Platform | |
---|---|
Overview | https://help.sap.com/docs/BTP/65de2977205c403bbc107264b8eccf4b/6a2c1ab5a31b4ed9a2ce17a5329e1dd8.html |
Access SAP Authorization and Trust Management Service APIs
Access SAP Authorization and Trust Management Service APIs | |
---|---|
Access SAP Authorization and Trust Management Service APIs | https://help.sap.com/docs/BTP/65de2977205c403bbc107264b8eccf4b/ebc9113a520e495ea5fb759b9a7929f2.html |
SAP Cloud Management Service - Service Plans | https://help.sap.com/docs/BTP/65de2977205c403bbc107264b8eccf4b/a508b724bf6d457ca7ac024b8e4b8457.html |
Getting an Application Access Token | https://help.sap.com/docs/BTP/65de2977205c403bbc107264b8eccf4b/6391b5dfe4704c6c8b71a32126828e9c.html |
https://help.sap.com/docs/BTP/65de2977205c403bbc107264b8eccf4b/3a23d64d74574a18a551d13fe11b94b1.html |
Multitenancy on BTP Cloud Foundry environment
Multitenancy on BTP Cloud Foundry environment | |
---|---|
Developing Multitenant Applications in the Cloud Foundry Environment | https://help.sap.com/docs/BTP/65de2977205c403bbc107264b8eccf4b/5e8a2b74e4f2442b8257c850ed912f48.html |
Multitenancy | https://help.sap.com/docs/BTP/65de2977205c403bbc107264b8eccf4b/5310fc31caad4707be9126377e144627.html |
Multitenancy (Classic) | https://cap.cloud.sap/docs/java/multitenancy |
Postman
Postman | |
---|---|
Microlearning on How you can use Postman Part 1 | https://video.sap.com/media/t/1_kibgjhu4 |
Using CSV and JSON Data Files in the Postman Collection Runner | https://blog.postman.com/using-csv-and-json-files-in-the-postman-collection-runner/ |
Postman JavaScript reference | https://learning.postman.com/docs/writing-scripts/script-references/postman-sandbox-api-reference/ |