Integrating SAP HANA Cloud, Data Lake Relational Engine as Cold Store Option

Objective

After completing this lesson, you will be able to integrate SAP HANA Cloud, data lake Relational Engine with SAP BW/4HANA.

Data Tiering Optimization and SAP HANA Cloud, Data Lake Relational Engine

The data tier options are as follows:

  • Hot store (standard tier): based on SAP HANA memory.
  • Warm store (extension tier): based on an SAP HANA extension node or SAP HANA Native Storage Extension (NSE).
  • Cold store (external tier): based on SAP IQ, Apache Hadoop, or SAP HANA Cloud, data lake Relational Engine.

Data-tiering optimization (DTO) helps you to classify the data in your DataStore object as hot, warm, or cold, depending on how frequently it is accessed.

Depending on this classification and on how the data is used, the data is stored in different storage areas. DTO provides a central UI where all storage options can be set. Partitions of the SAP HANA database tables can be used for this purpose.

Note

Up to now, you have had the option of storing data using the data-archiving process. However, for a DataStore object you can use only one of these options: either DTO or the data-archiving process. Some scenarios require a data-archiving process, for example, archiving with time characteristics that are not contained in the key, or multi-dimensional partitioning in the external tier (cold).

You define which temperatures or memory areas an object supports in the editing screen for the DataStore object. The actual information about which data is stored at which temperature or in which memory area is not specified when modeling the DataStore object. It is specified in a separate definition, saved on the local system, of the object temperature or partition temperature (not a transport property), for example, in the data-tiering option maintenance app in the SAP BW/4HANA Cockpit or in transaction RSOADSODTO.

You have the following options:

  • Hot (SAP HANA standard nodes): The data is stored in SAP HANA.
  • Warm (SAP HANA Extension Nodes or SAP HANA Native Storage Extension): The data is stored on disk, either in an SAP HANA extension node or managed with the SAP HANA Native Storage Extension.
  • Cold (external cold store): The data is stored externally (in SAP HANA Cloud, data lake Relational Engine or in SAP IQ).
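
To make the warm option more concrete: SAP HANA Native Storage Extension works with load units at table or partition level. The following SAP HANA SQL is a minimal illustrative sketch only, assuming a hypothetical table "MYSCHEMA"."MYTABLE"; in SAP BW/4HANA you do not issue these statements yourself, because DTO sets the load units based on the temperature maintenance.

Code Snippet
-- Illustration only: in SAP BW/4HANA, DTO manages these settings for you.
-- "MYSCHEMA"."MYTABLE" is a hypothetical table used for this sketch.
ALTER TABLE "MYSCHEMA"."MYTABLE" PAGE LOADABLE;                      -- whole table managed by NSE (warm)
ALTER TABLE "MYSCHEMA"."MYTABLE" ALTER PARTITION 2 PAGE LOADABLE;    -- only partition 2 managed by NSE (warm)
ALTER TABLE "MYSCHEMA"."MYTABLE" ALTER PARTITION 2 COLUMN LOADABLE;  -- partition 2 back fully in memory (hot)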

The integration of SAP HANA Cloud, data lake Relational Engine with SAP BW/4HANA allows for efficient data management by separating frequently accessed data from rarely accessed data. Using SAP HANA Cloud, data lake Relational Engine as the cold store, you can reduce resource demands and costs while outsourcing hardware and software management to the cloud, making it an attractive option for optimizing your data storage strategy.

Setting Up the Cold Store with SAP HANA Cloud, Data Lake Relational Engine

Before starting with SAP HANA Cloud, data lake Relational Engine as cold store, apply the corrections provided with SAP Note 3570462.

Perform the following steps:

  1. Create a user: In SAP HANA Cloud, data lake Relational Engine, to create a new user, use the following SQL statement:
    Code Snippet
    CREATE USER "<SchemaUser>" IDENTIFIED BY '<password>';
    (For more information, see Creating a New User in the SAP HANA Cloud, Data Lake Security for Data Lake Relational Engine documentation on the SAP Help Portal: http://help.sap.com.)
  2. Grant a role: Grant the user the SYS_DL_CUSTOMER_ADMIN_ROLE role with the WITH ADMIN option using the following SQL statement:
    Code Snippet
    GRANT ROLE SYS_DL_CUSTOMER_ADMIN_ROLE TO <SchemaUser> WITH ADMIN OPTION;
    (For more information, see Creating an Administration Recovery User in the SAP HANA Cloud, Data Lake Security for Data Lake Relational Engine documentation on the SAP Help Portal: http://help.sap.com.)
  3. Set up a database connection: In SAP BW/4HANA, create a database connection to SAP HANA Cloud, data lake Relational Engine.
  4. Go to transaction DBCO and choose New Entry.
  5. Enter a name in the DB Connection field.
  6. Enter SYB in the DBMS (Database Management System) field.
  7. Enter the user name and password that you created earlier in SAP HANA Cloud, data lake Relational Engine.
  8. Save your entries.
  9. Using SQL syntax with the database user of the SAP BW/4HANA system, add the connection information to the database connection as follows:

    Code Snippet
    UPDATE dbcon SET CON_ENV = 'SYBASE_CONTYPE=IQ SIQ_CON_STR=host=<Datalake-SQL endpoint>;DatabaseName=iqaas;ENC=tls(fips=NO;tls_type=rsa;skip_certificate_name_check=1;direct=yes);logfile=/usr/sap/<SID>/D00/work/cli.out;dump=/usr/sap/<SID>/D00/work' WHERE CON_NAME = '<your connection name>';

Note

You must add the information using SQL syntax because of length restrictions in the user interface of transaction DBCO.
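
After completing these steps, you may want to verify the configuration. The following statements are a minimal sketch of such a check, assuming the connection name and schema user from the steps above; the SYS.SYSUSER catalog view used here is an assumption and may differ in your release.

Code Snippet
-- Run in the SAP BW/4HANA database (with the same database user as in step 9):
-- confirm that the connection string was stored for your DBCON entry.
SELECT CON_NAME, CON_ENV FROM dbcon WHERE CON_NAME = '<your connection name>';

-- Run in SAP HANA Cloud, data lake Relational Engine:
-- confirm that the schema user created in step 1 exists.
SELECT user_name FROM SYS.SYSUSER WHERE user_name = '<SchemaUser>';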

For more information, refer to the Help pages:

SAP HANA Cloud, SAP HANA Database User Connections to Data Lake Relational Engine | SAP Help Portal
