SAP HANA Cloud, Data Lake Files

It’s also possible to use SQL commands to import and export data directly from SAP HANA Cloud data lake file storage. This requires the path to the file store, along with the appropriate credentials for users who need to access the data lake.

Overview

A valid credential name and the path to the file store are required in the import and export statements. To access SAP HANA Cloud data lake file storage, the WITH CREDENTIAL clause is mandatory in IMPORT and EXPORT statements. The credential references the public certificate of the CA used to access SAP HANA Cloud, and this certificate is uploaded to a PSE certificate store. This avoids the need to repeatedly supply the certificate as plain text in every IMPORT/EXPORT statement.
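As background, the PSE and certificate setup involves statements along the following lines. This is a sketch only: the certificate name MY_CA_CERT is illustrative, the PEM text is elided, and the PSE name MYPSE matches the one used later in this exercise.

```sql
-- Create a certificate object from the CA's public certificate (PEM text elided)
CREATE CERTIFICATE MY_CA_CERT FROM '-----BEGIN CERTIFICATE-----
...
-----END CERTIFICATE-----';

-- Create a PSE certificate store and add the certificate to it
CREATE PSE MYPSE;
ALTER PSE MYPSE ADD CERTIFICATE MY_CA_CERT;
```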

Note

The process of setting up the secure store and configuring access to the Data Lake is outside the scope of this workshop and has already been completed for you. Further details on this process can be found in the SAP HANA Cloud documentation.

The general syntax for exporting a table in SAP HANA Cloud is as follows:

Code Snippet

export <TABLE_NAME> as parquet
    into 'hdlfs://<HDLFS endpoint>/hdb_export_directory'
    WITH CREDENTIAL 'myCredential';

The security certificates and credentials have been pre-configured and are referenced via the ‘myCredential’ credential name.
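If you want to confirm that the credential is in place before running an import or export, the SYS.CREDENTIALS system view can be queried. This check is optional, and the column list assumes the standard HANA system view layout:

```sql
-- List credentials whose purpose matches the one used in this workshop
SELECT USER_NAME, COMPONENT, PURPOSE, TYPE
  FROM SYS.CREDENTIALS
 WHERE PURPOSE = 'myCredential';
```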


Try it out!

The following exercise demonstrates how to export objects from SAP HANA Cloud to the HANA Data Lake file system and import them back.

Configure Data Lake Connection

The first step is to create the connection to the HANA Data Lake. A REST API endpoint, along with a client certificate and key, is required for this.

  1. In the Database Explorer, select the highlighted ‘+’ icon to add a new instance.

  2. In the Add Instance dialog box that pops up, click the drop-down and select Data Lake Files.

  3. Paste the following string into the REST API Endpoint field:

{placeholder|api_endpt_dl}

  4. Paste the following certificate details into the Client Certificate box:

-----BEGIN CERTIFICATE-----
MIIEqjCCApICCQCxEZQEPvZasTANBgkqhkiG9w0BAQsFADAXMRUwEwYDVQQDDAx0
dXRvcmlhbHVzZXIwHhcNMjUwNjMwMTIxOTIxWhcNMzUwNjI4MTIxOTIxWjAXMRUw
EwYDVQQDDAx0dXRvcmlhbHVzZXIwggIiMA0GCSqGSIb3DQEBAQUAA4ICDwAwggIK
AoICAQDA5S1heV9jgMHXGE3c5VJpxDntyfFlsEFTjdS221+JgF4gPD5YguiF3G7x
oNvSPU9EMacfujA308Msxqr0ebyu006ZuENI8UoLtNlDyZIEjPGILeUaQycESx+C
QBKlFwlL8KzE5NEq+AQOJfWp6vEQMnozE66SNdkPD8qNhehi5mBMDK0CrWTrsewb
EA6p50up1bedFsHRnIZ2AueMdUls36BiM06tkGY179zxcidQGiy24klCFLLrPelT
vj2W3tKf9OkGLrTmpEn/2dSat6yGXm7JbcaD70j0a9HhsuzLr7weVKjDhyTKlYi+
yt0UlQ0LU2Il0F5bXAVZbzGUergrXMwFEHZWUXSSVTYf6+XHo5vU4msK8FSVXe6P
CfFlqmV+ZTTEnLqnpoOROShVlUvVNNSSdt8Qq64GnKFogpNaAKfZYezM4iC6p6MC
FmFbh9q3ZIaVEDWJZ3X/CzrgxXiydVaQfRo66XEFH0GOO8/pU3rwdYNqUucwN0ZB
BX8GVRH+G4XIBQAe387z+14imztdEQcJds1eFuMqPmChgo9Tv3md8iNtE9mOXhhl
5JUur6B2DdIKXHiBqXCGQ26lh/EwgNb1Pymtf1nU6Db9sGw5mJiNFcvEcB5iY32k
cDpp/hwM/jFut5DPpa6Nx1eUF1G52e3DuVt8MNbbqiP4TLsM7wIDAQABMA0GCSqG
SIb3DQEBCwUAA4ICAQCk3B7xg2jYffEiWaqD2+lMcZAKFDMV6fY5o5leWQvo+rP6
r3DtvGnStNqSSwIIhr2/3c5lqiinpvXtrdB/N+C9PchOmhp4Mq3J5sFe4HyXsgWg
SaT/4uX1+UgOSROMO110lgRUejMn/XtRobrQ43N9jPrv2bAQp7mD7olC9ZOZxPIo
ABMb3HcOVPEq+9Lbl8QCPyLjauTOQnjRkyfAdSirWuTOPFBFSiEFolxy+lfszKm7
mfBEAzKcLfXEoD8DMxLJcUthezPns7C0GBfbkBDmAUSXB9kTZMS2ezJDgrjeeirM
YY0VH1k9pW9H8TfXo/Ox5wFSP9LO4m71ubBn6VUq2ISkAWSe0d/YHURqdBg/BfTe
bmqLYsShRgWQWmw9s4KfdNPfh3oJlHn4KddeHKNGz62gGVDzZROI8n/sK876elST
lPtkJ6BScfJvnyyqDhuWNuso+2VGxuqHMhLqxf7sgdQd1lS1EpS4POmXMNZaZpo/
E3GxUZ3m/j0bjXvWmm20vvK7m0O5wy71t4FB2qjz7luUYjDjD+j1LWP5oIbvznOW
0wZDaOXvVDgew5zak6cbAtcaPtsWNKZ/0QZ+yoB15aMswbR4Wk9DC9bT6vMuUnxk
qYbZeIm7SvGPssg0Y5aidV8uSrg6u5VJYipejkoVDf86yvFoKGsAr1u4LdTrTQ==
-----END CERTIFICATE-----

  5. Paste the following private key into the Client Key box:

-----BEGIN PRIVATE KEY-----
MIIJQgIBADANBgkqhkiG9w0BAQEFAASCCSwwggkoAgEAAoICAQDA5S1heV9jgMHX
GE3c5VJpxDntyfFlsEFTjdS221+JgF4gPD5YguiF3G7xoNvSPU9EMacfujA308Ms
xqr0ebyu006ZuENI8UoLtNlDyZIEjPGILeUaQycESx+CQBKlFwlL8KzE5NEq+AQO
JfWp6vEQMnozE66SNdkPD8qNhehi5mBMDK0CrWTrsewbEA6p50up1bedFsHRnIZ2
AueMdUls36BiM06tkGY179zxcidQGiy24klCFLLrPelTvj2W3tKf9OkGLrTmpEn/
2dSat6yGXm7JbcaD70j0a9HhsuzLr7weVKjDhyTKlYi+yt0UlQ0LU2Il0F5bXAVZ
bzGUergrXMwFEHZWUXSSVTYf6+XHo5vU4msK8FSVXe6PCfFlqmV+ZTTEnLqnpoOR
OShVlUvVNNSSdt8Qq64GnKFogpNaAKfZYezM4iC6p6MCFmFbh9q3ZIaVEDWJZ3X/
CzrgxXiydVaQfRo66XEFH0GOO8/pU3rwdYNqUucwN0ZBBX8GVRH+G4XIBQAe387z
+14imztdEQcJds1eFuMqPmChgo9Tv3md8iNtE9mOXhhl5JUur6B2DdIKXHiBqXCG
Q26lh/EwgNb1Pymtf1nU6Db9sGw5mJiNFcvEcB5iY32kcDpp/hwM/jFut5DPpa6N
x1eUF1G52e3DuVt8MNbbqiP4TLsM7wIDAQABAoICAHY/R118mPQXZzyjYn/qeGXr
65MQe5HZxYIh7OcXfe59qbXbGGEMBtygb8Zh1L+B5Aiq44d4jLx8bSYSt5Pc2MTX
NlTXdy/M8pZQi+uZMpDlkd7oO4yo488m+i7Pdl6Ly6Ao7tBYcL0Fgv7lWD6MjrBe
MTkEdLyek+w+wu2UzAtsrFWSY7DDeJ1f2ZwRcNvItUJsh/2diL0lVewcBbO95n/+
aoQoba/KWE3yN6AvWVgqY4TNHRmMNM4uis1STtEg0T9LVFn84dqNletoroMV24YX
d5S1Xb9dPXzQZ74TAEQw+4IgZcBPJUem+Uf0dih64p8xw3Xo0R7yPzk/B8a8oA6r
uokc4Ighf4/rVBCyUOUwQ6JjiseqmTqU9zs3izsYarmr9DSEF6SKS8cdkQL2lZod
4MJ3ns9FAedbFsvQhHqemm15V6id5AELg59qxCDOr2K77P5mRIIAMriQB9D7PGT/
5Ef58KTmLqFsSuis7wlchws8XCyg0GccfYhp9z+k18cbj2zKuacWXusnllbOGrj7
KBXfkRxnpY8LCOui2oVzRAMZ6vPFRvKIFWypNlirJ7qlyh+e+7elcr7NL8DJrefE
mb2I1/g8UBLkh/QJv1B/fNELMUvVkFSemypDkwRedlTjBDVRkEozZddLhbaaT6KR
GvsAp2w/pXPSRdfiiG3pAoIBAQDv26vOMO7B+7SLTXA0oMVlZ4FxOAgPbsawUyZq
5r+bVDsz++GvwBMvmVC9V0nZYCBxgE8Xm1+dVwFjpmEQ9nL/QWU/IJREqA0gidi+
MUu1grFykoLszbL6GNj+rS92U/AE4e+FVtSfFdtB3E7kAQuRTsXQSbhr+xBfDNKc
F9W500rxpK2FNa7/D09dQT2eT/FLOEvr+XvVkSDSF10oOUWAlQwyxnny61J0J0pa
NkhzTsrr6p6upB5U3RdWXgsenKfaCBUNIFvdE/EhPZwV7RM6I0SaJ+rCBCHpYK+n
OWTbQMY76WMk+6jRghqVGDId7QVw9iIa3FsFwPzLwwKSt/VbAoIBAQDN4GtcBEmr
KtZKFzPlkm7mPqhJLmcik2iE4K3JI6yeNkCH4KLLdeBgQPYK1oqmcw7pqINfDWiB
E0NofXrSYmsMTXfTrJv8IBSMhEb90jqJAX/4ItcUboUvPMlA32Bz9ZXQQJUGakr3
yGP952k1dtO+ia9zWsKLzQKCEBIJzERXHi1G2zhEzr1JS8Tevf3x2q/voJzHcHw5
7dSu+8I0J0UD8qFYz0/rAvHU0yxISrAQIdcxPj6aGXHvKT2EXGmXhG4gqlMMmnTJ
fc9oeF4eryFhgIOIipqSTMgYZEEziabGVDEKXl747YJPGRSoO9bJq8lZuMHE0NpH
K1XU4UscU1b9AoIBAAPsY/MLedxScbGVorA5sQDnoIZplDgsZ7OaqJY3EAtKZW2E
QTzOt4Gf0GJIezzDBcTVhHWuZPH96lfm8M+GOHtUZoOR1NsKpKxTlOgHhmWBrcVi
ACI/ex38KhCAZLZaySJLT8+ui+vOP7fkEqZIHhhSYYn2qr2ZbbsgSpGjEivn+OHa
/7ueFS624WV1G7zrMdc5JITHn+7uYMsO9schqLFhvsFV3cjLcJ3VxrGTEbxRHaCe
Rdo1dQmOVENlZmkpxtUaLL36OYZ0PQ1+JH93UUkyKn0bCgwCZUub+UOS/MNZW+S+
/JAuTaSfrC+kJa23os8mN4iY8iQ18+yhU/lyaOECggEAQ0qgLMJmBeGJi0wMNXRo
dS9zt//HzCBcYFpD02Oqr0fSTddZvqD3xt5UzXQUe5VOAYeH8RCj4NSYCtZAx/VN
QgcsLG7VyVH/BXLS9XG6mj94QXW50T7Sn38B1QduMljKL/h7/U3O1TCVpk72L3h4
e2XriAvZXPKUaEiybKtshwEhEXDj8nGUoxl17sww7dVp/iwEMpgMsq6O/ubhMkTH
pSOwIyQX3hvnSYYAilyHMa6t9LWApsO8XsyfA91k0haCHIEvhgYSOC28FkyJjwP1
KDZmR8HuBfW8zWQvf5X1ggvQv6/a4QhYuFvdSaPNb7uLo5zlENynC+4ZLuy4rTh8
TQKCAQEAsbVdLMx6yB8s2DEWD5gkIBuvW9VxBpwXxVQ9h9ToaFZ/Jwur0teM2w3Z
CdR/+PxT/0g4MzUifwDC87TSwjZhgimI7fRakaOo682wqLjxbSD+VvDu6pAPTEvj
Rxo+bbuhLQwizazcIitDgXDiSVrTvmQDGhmOnAds62gYrqCCwWNISGnJysV2SXGY
866heKuq118zFakmVzR5G0y16EQX878MmjwXe8Ke9aSLpkJ7auuAo7hxcSfsjmGk
xEqX7215vUnqqpCYQrVdJHh3Uoi481NpfN+k3pFTQEMKiP+ay7tSAcrinp7IjnCt
sMMPzbptb4+2mxaMqGV8Zbgs0DEoug==
-----END PRIVATE KEY-----

  6. Change the Display Name to -HDLFS and click OK.

  7. The connection to the HANA Data Lake is now visible in the Database Explorer.

Export table to Parquet Files

In the following examples, the GX_SALES table from the schema is used, but the same process can be applied to any object in the database.

  1. In the Database Explorer window, open a new SQL console window.

Note

If the SQL console icon is greyed out, click on the HANA Cloud database hostname to enable it.

Now export the table into the SAP HANA Data Lake as a parquet file.

  2. Copy the SQL query below, paste it into the console, and click the green play icon to execute it.
Code Snippet

-- Create required Credential
CREATE CREDENTIAL FOR USER {placeholder|userid}
    COMPONENT 'SAPHANAIMPORTEXPORT'
    PURPOSE 'myCredential'
    TYPE 'X509' PSE 'MYPSE';

-- Export table as parquet file
export GX_SALES as parquet
    into 'hdlfs://{placeholder|api_endpt_dl}/{placeholder|userid}_hdl_file_export'
    WITH CREDENTIAL 'myCredential';
  3. Verify that the table is now in the data lake. Expand the -HDLFS --> _hdl_file_export --> index --> --> GX and select the GX_SALES table. The parquet file, along with some others, should be visible in the lower pane.

  4. In the same (or a new) SQL console window, paste the following query and run it to check the number of records currently in the local GX_SALES table.
Code Snippet

-- Get No. of records from GX_SALES table
select count(*) from GX_SALES;

  5. Now delete all records from this table. Run the following query:
Code Snippet

-- Truncate all records in GX_SALES
truncate table GX_SALES;
  6. Check the number of records now in the table. The result should be 0.
Code Snippet

-- Get No. of records from GX_SALES table
select count(*) from GX_SALES;

  7. Now import the GX_SALES table from HANA Data Lake back into HANA Cloud. Run the following SQL:
Code Snippet

-- Import the table from HANA Data Lake
import from parquet file 'hdlfs://{placeholder|api_endpt_dl}/{placeholder|userid}_hdl_file_export/index/{placeholder|userid}/GX/GX_SALES/parquetData'
    into GX_SALES
    WITH CREDENTIAL 'myCredential';

  8. Run the count statement again to check the number of records now in the table. The records should have been re-populated from the parquet file in the data lake.
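The same count query from the earlier step can be reused for this check:

```sql
-- Get No. of records from GX_SALES table (should match the original count)
select count(*) from GX_SALES;
```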

Finally, let’s remove the data lake connection and clean up the files created there.

  9. Right-click on the file export folder _hdl_file_export and select Delete.

  10. Confirm deletion in the pop-up by clicking Delete.

Well done! This lesson demonstrated how to export objects and their data to the HANA Data Lake in parquet file format, and how to import these objects back into HANA Cloud.