Identifying Data Tiering Options in SAP HANA Cloud


After completing this lesson, you will be able to:

  • Describe the SAP HANA Cloud data tiering options

SAP HANA Cloud Storage Options

Lesson Overview

In this lesson, you'll learn about the different data tiering temperatures available for SAP HANA Cloud, gain knowledge about SAP HANA Cloud, native storage extension (NSE), and the integrated SAP HANA Cloud, Data Lake.

Overview of Data Tiering Options in SAP HANA Cloud

Where you store your data strongly depends on how often you access it, its type, its operational usefulness, and security requirements. Another critical factor when assessing the value of data is time. As your data ages, it may become less relevant for analytics and it will be accessed less frequently. When you combine these different data values with limited IT budgets, there’s a need for cost-effective data management strategies that prioritize high-value data.

Data tiering in SAP HANA Cloud gives you a cost-effective storage solution that allows you to assign data to different storage and processing tiers so that you can fulfill your data management strategies.

In SAP HANA Cloud, you can split data between different temperature tiers: hot, warm, and cold. As the data’s value changes, you can move it between each tier.

  • Hot data: Frequently accessed and changed, and stored in memory. Hot storage is ideal for data that is used for real-time processing and analytics. In SAP HANA Cloud, hot data is both the highest-performance and highest-TCO storage option.
  • Warm data: Changed less frequently and therefore stored on disk, which in SAP HANA Cloud is the native storage extension (NSE). Warm data isn’t fully loaded into memory, so this option is more cost-effective than hot storage while still offering very low latency. It’s best for data that doesn’t need to be accessed frequently.
  • Cold data: Rarely accessed and therefore can be located in cold storage. Cold data is managed separately from the SAP HANA Cloud database but can still be accessed through data virtualization. In SAP HANA Cloud, we recommend that you store cold data in the integrated data lake. This combines the massive storage capacity of a data lake (up to the petabyte scale) with a structure that simplifies and accelerates data analysis. The SAP HANA Cloud relational data lake ensures that applications can access data rapidly despite massive data volumes.
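As a concrete illustration of the hot and warm tiers, the load unit of a table controls whether it is held fully in memory or paged from disk via NSE. The following minimal Python sketch builds the corresponding SQL statements; the table name is hypothetical, and the `COLUMN LOADABLE`/`PAGE LOADABLE` clauses follow SAP HANA's `ALTER TABLE` load-unit syntax:

```python
def load_unit_sql(table: str, tier: str) -> str:
    """Build the ALTER TABLE statement that assigns a table to a tier.

    'hot'  -> COLUMN LOADABLE (fully in-memory column store)
    'warm' -> PAGE LOADABLE   (native storage extension, paged from disk)
    """
    clauses = {"hot": "COLUMN LOADABLE", "warm": "PAGE LOADABLE"}
    if tier not in clauses:
        raise ValueError(f"unsupported tier: {tier}")
    return f'ALTER TABLE "{table}" {clauses[tier]}'

# Example: demote a hypothetical SALES_2019 table to the warm tier.
print(load_unit_sql("SALES_2019", "warm"))
```

Moving cold data into the data lake works differently (the data leaves the SAP HANA database entirely), which is covered later in this lesson.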

SAP HANA Cloud, Native Storage Extension (NSE)

SAP HANA native storage extension is a general-purpose, built-in warm data store in SAP HANA that lets you manage less-frequently accessed data without fully loading it into memory. It integrates disk-based or flash-drive-based database technology with the SAP HANA in-memory database for an improved price-performance ratio.

The capacity of a standard SAP HANA database equals the amount of hot data held in memory. The capacity of a database with NSE, by contrast, is the amount of hot data in memory plus the amount of warm data on disk. The SAP HANA native storage extension is enabled by default in an SAP HANA Cloud, SAP HANA database instance.
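The capacity relationship above is a simple sum, sketched here with made-up illustrative sizes:

```python
def nse_capacity_gb(hot_in_memory_gb: float, warm_on_disk_gb: float) -> float:
    """Total data capacity of an SAP HANA database with NSE:
    hot data held in memory plus warm data kept on disk."""
    return hot_in_memory_gb + warm_on_disk_gb

# Illustrative numbers only: 1 TB of hot data plus 4 TB of NSE warm data.
print(nse_capacity_gb(1024.0, 4096.0))  # -> 5120.0
```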

SAP HANA native storage extension is integrated with other SAP HANA functional layers, such as query optimizer, query execution engine, column store, and persistence layers. The key highlights of the SAP HANA native storage extension include the following:

  • A substantial increase in SAP HANA data capacity, with good performance for high-data volumes.
  • The ability to co-exist with the SAP HANA in-memory column store, preserving SAP HANA memory performance.
  • An enhancement of existing in-memory paging capabilities, supporting compression, dictionaries, and partitioning.
  • An intelligent buffer cache that manages memory pages in SAP HANA native storage extension column store tables.
  • The ability to monitor and manage buffer cache statistics via system views.
  • The ability to support any SAP HANA application.
  • A simple system landscape with high scalability that covers a large spectrum of data sizes.
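The buffer cache statistics mentioned in the list above are exposed through system views. The sketch below shows a monitoring query based on the `M_BUFFER_CACHE_STATISTICS` view, plus a small helper for interpreting the numbers; the selected column names are assumptions from that view's documented layout and should be verified against your SAP HANA release:

```python
# Sketch: monitoring query for the NSE buffer cache. The column list is an
# assumption based on the M_BUFFER_CACHE_STATISTICS system view and may
# differ by release.
BUFFER_CACHE_SQL = (
    "SELECT HOST, PORT, CACHE_NAME, MAX_SIZE, USED_SIZE, HIT_RATIO "
    "FROM M_BUFFER_CACHE_STATISTICS"
)

def cache_usage_pct(used_size: int, max_size: int) -> float:
    """Percentage of the NSE buffer cache currently in use."""
    return 100.0 * used_size / max_size if max_size else 0.0

print(BUFFER_CACHE_SQL)
print(cache_usage_pct(3 * 1024, 10 * 1024))  # 3 GB used of a 10 GB cache
```

A consistently full buffer cache or a low hit ratio suggests that too much warm data is being accessed too often, which is a signal to revisit which tables belong in the warm tier.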

SAP HANA Cloud, Data Lake

Big data is here, no matter the goals of your organization. Sometimes data isn't only generated in large volumes; it must also be stored for long periods of time, for example to comply with data-related rules and regulations. We call this cold data: data that is typically written once, rarely updated, and analyzed for patterns. Organizations face the challenge of finding a sustainable, cost-effective, and efficient solution for all this data that mostly doesn't need to be accessed or used often, but still must be stored.

This is where the SAP HANA Cloud data lake can be very effective. It’s a data lake that can reach petabyte scale and store both structured and unstructured data. This means you can store data without structuring it first and then use it for analytics when necessary. Built on SAP Relational Engine, the data lake provides excellent performance for analytics across large volumes of data. With the SAP HANA data lake, you avoid the risk of creating an uncontrollable, unusable data dump.

The data lake in SAP HANA Cloud is integrated with, but at the same time independent from, the SAP HANA database when it comes to storage and compute. In SAP HANA Cloud, you can choose to use a data lake with each SAP HANA Cloud instance you create. Enabling a data lake for an instance takes a single click, either when you first create the instance or later, as needed. You can also take advantage of cloud elasticity to scale your data lake storage and compute up or down whenever necessary.

For the initial release of SAP HANA Cloud, a data lake starts with 1 TB of storage and can grow to 90 TB, in 1-TB increments. The maximum compute capability of a data lake is 162 vCPUs. To access your data lake, you can use the existing SAP HANA tools, such as the Database Explorer. The data lake supports ingestion of data from object storage locations (for example, Azure blob store, Amazon S3) at a rate of 1 TB per day per dedicated vCPU per table, up to a maximum of 16 TB per day per table.
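One common access path is to run SQL in the data lake Relational Engine directly from the SAP HANA database. The sketch below assumes the default `SYSRDL#CG` container schema and its `REMOTE_EXECUTE` procedure (verify both against your instance); the cold-store table definition is hypothetical:

```python
def remote_execute_call(data_lake_sql: str) -> str:
    """Wrap a data lake SQL statement in the HANA-side procedure call.
    Single quotes are doubled so the statement survives as a SQL literal."""
    escaped = data_lake_sql.replace("'", "''")
    return f"CALL SYSRDL#CG.REMOTE_EXECUTE('{escaped}')"

# Hypothetical cold-store table for rarely accessed sensor readings.
ddl = "CREATE TABLE SENSOR_ARCHIVE (ID INT, READING DOUBLE)"
print(remote_execute_call(ddl))
```

You would submit the resulting statement from an SAP HANA SQL console (for example, in the Database Explorer mentioned above); the data lake then creates and owns the table, while the SAP HANA database can reach it through data virtualization.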

Another important point is that the SAP HANA Cloud data lake automatically inherits the security and data protection available for all of your SAP HANA Cloud instances, without any action needed from you.
