Data Aging on SAP HANA

Is there a need to adapt the current system replication configuration to include cold partition(s)?

No changes are required to the current system replication configuration: the cold partitions belong to the table itself, like all other partitions, and are replicated to the secondary site together with the rest of the table.

Is there a need to adapt the current backup strategy to include cold partition(s)?

No changes are required to the current backup strategy: the cold partitions belong to the table itself, like all other partitions, and are backed up together with the rest of the table.

Why is the uniqueness check for cold data switched off in SAP HANA per default?

For ABAP applications using the Data Aging Framework, SAP HANA sets the default to “no uniqueness check”. Unique constraints for primary keys or unique indexes would normally have to be checked across all partitions, because the partitioning is done on the artificial temperature column, which is not part of the table’s primary key…
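The cost that this default avoids can be illustrated with a toy model (plain Python, not SAP code; the table, key columns, and partition layout below are invented for illustration). Because `_DATAAGING` is not part of the primary key, the same key can legally appear in a hot and a cold partition, so a full uniqueness check would have to span all partitions:

```python
# Illustrative sketch: why a primary-key check must span ALL partitions when
# the table is range-partitioned on the artificial _DATAAGING column, which
# is not part of the primary key. Table and column names are made up.
partitions = {
    "hot":  [{"MANDT": "100", "BELNR": "42", "_DATAAGING": "00000000"}],
    "cold": [{"MANDT": "100", "BELNR": "42", "_DATAAGING": "20150101"}],
}

PRIMARY_KEY = ("MANDT", "BELNR")  # note: _DATAAGING is NOT part of the key

def pk(row):
    return tuple(row[c] for c in PRIMARY_KEY)

def is_unique_within_partitions(parts):
    # Cheap check: uniqueness verified only inside each single partition.
    return all(len({pk(r) for r in rows}) == len(rows) for rows in parts.values())

def is_unique_across_partitions(parts):
    # Full check: uniqueness over the whole table, i.e. across all partitions.
    keys = [pk(r) for rows in parts.values() for r in rows]
    return len(set(keys)) == len(keys)

print(is_unique_within_partitions(partitions))  # True  - each partition alone looks fine
print(is_unique_across_partitions(partitions))  # False - duplicate key across hot/cold
```

Switching the check off avoids the expensive cross-partition lookup; the ABAP layer guarantees that a record moved to cold no longer exists in hot.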

When to create a new partition for the cold data?

A new partition range is created when the data to be aged is not covered by any existing partition range in the historical area. A new partition range can also be created when an existing partition is about to reach its maximum capacity threshold.
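The two conditions can be sketched as a small decision function (illustrative Python; the warning threshold of 90% is an assumption, not an SAP parameter, and only the two-billion-record limit per partition comes from SAP HANA):

```python
# Hypothetical sketch of the "create a new cold partition range?" decision.
# Range bounds are _DATAAGING date strings; thresholds are illustrative.
existing_ranges = [("20100101", "20150101"), ("20150101", "20200101")]
MAX_ROWS_PER_PARTITION = 2_000_000_000  # two-billion-record limit per partition
CAPACITY_WARNING = 0.9                  # "reached soon" threshold (assumed)

def needs_new_range(aging_date, row_counts):
    """True if a new partition range should be created in the historical area."""
    covered = any(lo <= aging_date < hi for lo, hi in existing_ranges)
    if not covered:
        return True  # data to be aged falls outside all existing ranges
    # also create a new range when an existing partition is nearly full
    return any(n >= CAPACITY_WARNING * MAX_ROWS_PER_PARTITION for n in row_counts)

print(needs_new_range("20220301", [1_000, 2_000]))  # True: date not covered
print(needs_new_range("20160101", [1_000, 2_000]))  # False: covered, plenty of room
```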

Should Aging partitions be on first level or on second level?

Data Aging partitioning can be done on the first level without any second level, or used as a second level below a hash or range partitioning. If there is any risk that the number of records remaining in the “current” partition after Data Aging could approach the two-billion-record limit in the future, choose a two-level approach with aging…
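A toy model of the two-level layout (plain Python, not HANA internals; column names, the hash function, and the number of hash partitions are invented for illustration) shows how a record is routed first by HASH, then by time selection on `_DATAAGING`:

```python
import zlib

# Illustrative two-level routing: HASH on a key column at the first level,
# time selection (RANGE on _DATAAGING) at the second level. This is a toy
# model of the partition layout, not SAP HANA's implementation.
N_HASH = 4
COLD_RANGES = [("20100101", "20200101")]

def route(row):
    # First level: deterministic hash spreads "current" data over N_HASH parts,
    # keeping each well below the two-billion-record limit.
    first = zlib.crc32(row["BELNR"].encode()) % N_HASH
    # Second level: time selection separates hot from cold.
    aging = row["_DATAAGING"]
    if aging == "00000000":
        second = "current"
    else:
        second = next(f"{lo}-{hi}" for lo, hi in COLD_RANGES if lo <= aging < hi)
    return (first, second)

print(route({"BELNR": "42", "_DATAAGING": "00000000"}))
print(route({"BELNR": "42", "_DATAAGING": "20150101"}))
```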

Is it possible to add a different partitioning on top of Data Aging?

It is possible to add another partitioning on top of Data Aging, i.e. after Data Aging has been introduced. However, the preferred way is to start the other way round, in order not to move massive amounts of aged data again: start with hash (or range) on the first level, then add the time selection partitioning using the…

How to create partitions for tables participating in Data Aging?

The SAP HANA database offers a special time selection partitioning scheme, also called aging. Time selection, or aging, allows SAP Business Suite application data to be horizontally partitioned into different temperatures such as hot and cold. The partitioning of tables that participate in Data Aging is administered on the ABAP layer (transaction DAGPTM). Partitioning needs to…

What does quick partitioning mean?

Initially, all tables participating in Data Aging have an additional column _DATAAGING, which is the basis for the time selection partitioning. The default value of this column is ‘00000000’. When the time selection (RANGE) partitioning is performed on this column, all records remain in the hot partition – no data has to be moved…
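Why no data moves can be demonstrated with a minimal sketch (illustrative Python, not SAP code): since every record initially carries `_DATAAGING = '00000000'`, range partitioning on that column assigns all of them to the hot partition.

```python
# Sketch of "quick partitioning": RANGE partitioning on _DATAAGING where
# every record still has the default value, so nothing leaves the hot
# ("current") partition. Column names besides _DATAAGING are made up.
HOT_VALUE = "00000000"

def target_partition(row, cold_ranges):
    if row["_DATAAGING"] == HOT_VALUE:
        return "current"
    for lo, hi in cold_ranges:
        if lo <= row["_DATAAGING"] < hi:
            return f"cold {lo}-{hi}"
    raise ValueError("no matching partition range")

table = [{"ID": i, "_DATAAGING": HOT_VALUE} for i in range(5)]
moves = sum(1 for row in table
            if target_partition(row, [("20100101", "20200101")]) != "current")
print(moves)  # 0 - no record needs to be relocated
```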

Will the Data Aging Framework decide to move the data from the hot partition to the cold partition(s) automatically?

The application logic determines when current data turns historical, using its knowledge of the object’s life cycle. It validates the conditions at the object level from a business point of view, based on the status, existence checks, and cross-object dependencies. The framework then executes the move.
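The division of labour can be sketched as follows (toy Python; the status field, the business checks, and the object structure are invented for illustration – real checks are application-specific ABAP logic):

```python
# Toy sketch: the APPLICATION decides which objects may turn historical
# (business checks), the FRAMEWORK performs the move by setting _DATAAGING,
# which relocates the record into the matching cold partition range.

def application_can_age(obj):
    # Business-level validation: status, existence checks,
    # cross-object dependencies (all fields here are illustrative).
    return obj["status"] == "CLOSED" and not obj["open_followups"]

def framework_move(obj, aging_date):
    # The framework executes the move: the new _DATAAGING value routes the
    # record from the hot partition to the cold partition covering that date.
    obj["_DATAAGING"] = aging_date
    return obj

docs = [
    {"id": 1, "status": "CLOSED", "open_followups": [], "_DATAAGING": "00000000"},
    {"id": 2, "status": "OPEN",   "open_followups": [], "_DATAAGING": "00000000"},
]
aged = [framework_move(d, "20150101") for d in docs if application_can_age(d)]
print([d["id"] for d in aged])  # [1] - only the closed document is aged
```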

How and when will the data be moved from the hot partition to the cold partition(s)?

The application logic determines when current data turns historical, using its knowledge of the object’s life cycle. It validates the conditions at the object level from a business point of view, based on the status, existence checks, and cross-object dependencies. The data will be moved during a Data…

How to restrict the memory consumption used by the cold data (paged attributes)?

As of SAP HANA SPS 09, the memory footprint of paged attributes has been improved: the amount of memory used by page-loadable columns can be configured. The parameter for the lower limit is page_loadable_columns_min_size (page_loadable_columns_min_size_rel), which by default is set to 5% of the effective allocation limit of the indexserver. The parameter…
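As a worked example of the default relative limit (the allocation-limit figure below is invented for illustration; only the 5% default comes from the text above):

```python
# Worked example: the default lower limit for memory used by page-loadable
# columns is 5% (page_loadable_columns_min_size_rel) of the indexserver's
# effective allocation limit. The 400 GB figure is purely illustrative.
effective_allocation_limit_gb = 400   # example indexserver allocation limit
min_size_rel_percent = 5              # default relative lower limit

page_cache_min_gb = effective_allocation_limit_gb * min_size_rel_percent / 100
print(page_cache_min_gb)  # 20.0 (GB)
```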

How to monitor the memory consumption used by the cold data (paged attributes)?

On table level, the monitoring capabilities only show how much data is loaded into memory per partition and per column. Since Data Aging in SAP HANA technically uses paged attributes for cold partitions, this data needs to be loaded into memory only while it is being accessed. There is a dedicated cache area with…
