SAP Data Lifecycle Manager

Objectives
After completing this lesson, you will be able to:

  • Explain the functionality of the SAP Data Lifecycle Manager

SAP Data Lifecycle Manager

Multi-temperature Data Management

In data management, depending on the data volume and access frequency, we talk about hot, warm, and cold data:

  • Hot data: frequently accessed, high-value, high query performance
  • Warm data: less frequently accessed, lower value, reasonable query performance
  • Cold data: rarely accessed, low-value, low query performance
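The temperature split can be illustrated as a simple tiering rule on access recency. The following Python sketch is a conceptual model only; the thresholds and the function name are illustrative assumptions, not DLM settings (real aging rules are modeled in DLM itself):

```python
from datetime import date, timedelta

def temperature(last_access: date, today: date,
                warm_after_days: int = 30, cold_after_days: int = 365) -> str:
    """Classify a record as hot, warm, or cold by access recency.

    Thresholds are illustrative assumptions, not DLM defaults.
    """
    age_days = (today - last_access).days
    if age_days < warm_after_days:
        return "hot"
    if age_days < cold_after_days:
        return "warm"
    return "cold"

today = date(2024, 1, 1)
print(temperature(today - timedelta(days=5), today))    # hot
print(temperature(today - timedelta(days=90), today))   # warm
print(temperature(today - timedelta(days=800), today))  # cold
```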

DLM: Hot Data Store

Hot data store:

The hot data store is managed in the in-memory database tables of SAP HANA. This is the default location for data and provides the best possible performance but at the highest cost.

DLM: Warm Data Store

Warm data store:

  • One SAP HANA slave node in the scale-out landscape is reserved for warm data storage and processing
  • DLM applies specified aging rules to move complete table partitions between the hot store and the warm store
  • No DLM-generated SAP HANA view (pruning/UNION) is required, because the data resides in a single partitioned table
  • Data access, including partition pruning, is managed by SAP HANA
  • SAP HANA SQL schema and HDI container scenarios are supported
  • There is no impact on data update/delta handling, because each record is moved to a unique table partition based on the partitioning criteria
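Because hot and warm data live in one partitioned table, a query with a selective predicate only scans the partitions that can match. The sketch below models that pruning behavior in Python; the partition layout, year ranges, and row shape are illustrative assumptions, not SAP HANA syntax:

```python
from dataclasses import dataclass, field

@dataclass
class Partition:
    store: str          # "hot" (in-memory node) or "warm" (reserved node)
    years: range        # partitioning criterion: range of years covered
    rows: list = field(default_factory=list)

# One logical table, physically split into partitions across the two stores.
partitions = [
    Partition("warm", range(2015, 2022), rows=[(2018, "old order")]),
    Partition("hot",  range(2022, 2026), rows=[(2024, "recent order")]),
]

def query(year: int) -> list:
    """Scan only partitions whose range can contain the requested year
    (partition pruning); other partitions are skipped entirely."""
    hits = []
    for p in partitions:
        if year in p.years:  # prune partitions that cannot match
            hits += [row for row in p.rows if row[0] == year]
    return hits

print(query(2024))  # touches only the hot partition
```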

DLM: Cold Data Store

Cold data store:

  • DLM uses the SAP HANA Spark Controller to move data bi-directionally between the HANA in-memory store and Hadoop
  • Cold data is managed by the HANA Spark Controller and is only accessible via HANA Smart Data Access (SDA)
  • DLM provides pruning views to optimize access to cold data
  • Cross-schema, cross-HDI, and local HDI scenarios are supported
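A pruning view over cold data behaves like a UNION of the in-memory table and the remote virtual table, where a filter on the aging criterion skips whichever branch cannot contain matching rows, so the remote Hadoop store is only queried when cold rows are actually needed. This Python sketch models that idea; the table contents, the year-2020 boundary, and all names are illustrative assumptions:

```python
hot_rows = [(2023, "recent")]      # rows still in the SAP HANA in-memory store
cold_rows = [(2012, "archived")]   # rows relocated to Hadoop, reachable via SDA
COLD_BOUNDARY = 2020               # aging criterion: rows before this year are cold

def pruning_view(year: int) -> list:
    """UNION of hot and cold branches, with the non-matching branch pruned
    so the expensive remote access is avoided when possible."""
    if year >= COLD_BOUNDARY:
        # prune the remote branch: only scan the in-memory table
        return [row for row in hot_rows if row[0] == year]
    # prune the in-memory branch: only the remote virtual table is read
    return [row for row in cold_rows if row[0] == year]

print(pruning_view(2023))
print(pruning_view(2012))
```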

Data Lifecycle Manager

Data Lifecycle Manager (DLM) functionality:

  • Integration with custom-built or insert-only SAP HANA applications, such as IoT and sensor data (Big Data)
  • Modeling aging rules on persistence objects
  • Generated pruning views for optimized access across source tables and target tables
  • Modeling relocation rules on remote source tables
  • Automated and orchestrated data relocation between data stores
  • Scheduling data relocation using SAP HANA task chains
  • Generated stored procedures to perform mass data relocation, both in and out
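A relocation run can be pictured as: an aging rule selects matching rows in the source store, and a generated procedure moves them to the target store (insert into target, then delete from source). The sketch below is a conceptual model of one such run; the table layout, the rule predicate, and the `relocate` helper are illustrative assumptions, not DLM's generated SQL:

```python
hot_table = [{"id": 1, "year": 2012}, {"id": 2, "year": 2024}]
cold_table = []

def relocate(rule) -> int:
    """Move all rows matching the aging rule from the hot store to the
    cold store; returns the number of relocated rows."""
    moved = [row for row in hot_table if rule(row)]
    cold_table.extend(moved)           # write rows into the target store
    for row in moved:
        hot_table.remove(row)          # then remove them from the source
    return len(moved)

# Example aging rule: relocate everything older than 2020.
count = relocate(lambda row: row["year"] < 2020)
print(count, hot_table, cold_table)
```

In DLM such runs are packaged as stored procedures and can be scheduled via SAP HANA task chains, so relocation happens as a recurring, orchestrated job rather than a manual step.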
