Preparing Impact Analysis to Determine Potential Restrictions for Business


After completing this lesson, you will be able to:

  • Describe the concept of Impact Analysis.
  • Export statistical usage data from the production system.

Concept of Impact Analysis

This section introduces the Impact Analysis. The Impact Analysis combines (production) statistical data with Software Update Manager (SUM) table classification data to help determine potential impacts on business processes during a Zero Downtime Option (ZDO) maintenance event.

Running an upgrade using a downtime-optimization approach can affect ongoing business operations on the bridge subsystem. Hence, the Impact Analysis is an important step in the preparation of the ZDO procedure.

In the case of ZDO, the following impacts are checked by the Impact Analysis:

  • Read-only restrictions for end users on the bridge instance
  • Database triggers that might have to be removed from certain tables, or tables that need an initial load after the upgrade is completed
  • Additional database space needed for the clone tables
  • Tables that will be smart-switched but have a high number of changes

To prevent such impacts from occurring unexpectedly during the maintenance event on your production system, you should identify them in advance. This can be achieved by exporting table statistics from your production system and providing them to the Software Update Manager in your very first cycle, running in a sandbox environment. The Impact Analysis is equipped with the capabilities to give a projection of what would happen if the defined upgrade scope were applied in the production system.

It is important to understand that the results of the Impact Analysis are based on these facts:

  • The defined scope (the list of software components as well as the source and target levels of the software component versions) must be identical in all systems that are upgraded in your project. When the stack definition changes, the results will differ and need to be interpreted again.
  • The exported table statistics should be representative of the time the business operations will work on the bridge subsystem. All relevant business processes should be captured accordingly in the exported statistics file.

SAP Notes Related to Impact Analysis for Zero Downtime Option:

  • SAP Note 2402270 - Export of Table Statistics for SUM Impact Analysis
  • SAP Note 2471883 - SUM Impact Analysis for ZDO
  • SAP Note 3092738 - Software Update Manager Toolbox - Central SAP Note

The analysis is based on statistical data retrieved from the production system, as real end-user activity happens only there. To export the statistical data, the Software Update Manager (SUM) Toolbox has to be available in the production system. The SUM Toolbox is delivered via a TCI SAP Note (TCI: transport-based correction instructions). In addition, the tool is shipped with the regular software delivery of SAP_BASIS. The SUM Toolbox includes a tool that exports the statistical table access data. Additionally, a dialog version of the Impact Analysis is included.


As the SUM Toolbox is not only used for the export of statistical data, but also to perform the Impact Analysis using SAP GUI in dialog mode, it is important to deploy the TCI before SUM is started. The TCI should then be transported to all systems in the landscape, including the sandbox system.

The data that is exported using the SUM Toolbox is later analyzed by SUM in phase RUN_IMPACT_ANALYSIS_ZDO. Furthermore, the analysis can be performed in dialog mode using the SUM Toolbox, as described in the following lessons.

This figure shows the procedure flow at a high level. All steps will be explained in separate lessons during this unit.

One part of the upgrade procedure is the table classification, which determines how each table is handled during the upgrade. This includes both customer-created and SAP-delivered tables. Technically, the table classification happens in phase RUN_RSPTBFIL_ZDM_CLASSIFY.

The most important table classification types are as follows:

  • Share [the upgrade does not touch the table or just adds a new non-key field]: These tables will not be cloned. No restrictions apply.
  • Clone [the upgrade delivers table content or a structural change]: Additional database space is required for the table clone and for change recording and replay. No restrictions apply.
  • Clone read-only [the upgrade delivers a complex structural change or an XPRA accesses the table]: Additional database space is required for the table clone and for change recording and replay. The table will be read-only for business users while they work on the bridge subsystem.

Schematic Illustration of the Impact Analysis

Following the example, Table-B will be cloned and set to read-only for the bridge subsystem. This could lead to a business impact for end users working on the bridge subsystem, since read-only tables are blocked against write accesses. If the table statistics file (ZDIMPANA.ZIP) contains the information that Table-B changed, the Impact Analysis will raise an error. The result must be interpreted to determine whether the bridge subsystem really needs to write into the table. It is therefore important to figure out which business processes write into the read-only tables. Once the processes are identified, verify with the responsible business process owner and key users whether the read-only constraint can become critical for the business.

Table-D will be cloned, as the upgrade performs a change on the table. However, Table-D will be fully available with read and write access for the bridge subsystem. As Table-D will be cloned but has database triggers of type SLT, an error is displayed. After SUM is finished, an initial load is needed for cloned tables with database triggers (for details, see lesson SLT Database Trigger Handling in ZDO of unit 3).

Lastly, Table-D will be smart-switched, since the table is being cloned. Tables that are smart-switched are renamed on-the-fly in phase EU_SWITCH_ZDM. Depending on the load, this could impact the SUM tool, since renaming tables requires an exclusive table lock on database level.
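The matching logic described in the example above can be sketched as follows. This is an illustrative Python sketch only, not SUM code: the real check runs inside SUM in phase RUN_IMPACT_ANALYSIS_ZDO, and all table names, classification values, and statistics below are hypothetical.

```python
# Hypothetical SUM table classification (PUTTB_SHD.ZIP in the real procedure).
classification = {
    "TABLE_A": "share",            # not cloned, no restrictions
    "TABLE_B": "clone_read_only",  # cloned, read-only on the bridge subsystem
    "TABLE_D": "clone",            # cloned, read/write on the bridge subsystem
}

# Hypothetical exported production statistics (ZDIMPANA.ZIP in the real procedure).
statistics = {
    "TABLE_B": {"changes": 1500, "slt_trigger": False},
    "TABLE_D": {"changes": 40,   "slt_trigger": True},
}

def analyze(classification, statistics):
    """Cross-reference classification with production statistics and
    report potential business impacts per table."""
    findings = []
    for table, stats in statistics.items():
        cls = classification.get(table, "share")
        # A table that is changed in production but read-only on the bridge
        # subsystem is a potential business impact.
        if cls == "clone_read_only" and stats["changes"] > 0:
            findings.append((table, "ERROR: table is changed in production "
                                    "but will be read-only on the bridge"))
        # A cloned table with an SLT trigger needs an initial load afterwards.
        if cls in ("clone", "clone_read_only") and stats["slt_trigger"]:
            findings.append((table, "ERROR: cloned table has an SLT trigger; "
                                    "initial load needed after the upgrade"))
    return findings

for table, message in analyze(classification, statistics):
    print(table, "-", message)
```

In this sketch, Table-B is flagged because it is changed in production yet read-only on the bridge, and Table-D is flagged because of its SLT trigger, mirroring the two error situations described above.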

Further details on how the result of the Impact Analysis should be evaluated and interpreted can be found in lesson Results of the Impact Analysis in unit 4.

Overview of Reports Related to the Impact Analysis

The following tools are delivered with the SUM Toolbox (SAP Note 3092738) as well as with the SUM 2.0 tool import (where they are deleted after the upgrade):

  • Export data for impact analysis: Export of table statistics of the production system to a compressed file (that is, ZDIMPANA.ZIP)
  • Export of SUM classification data: Export of table classification data of the Software Update Manager (that is, PUTTB_SHD.ZIP)
  • Impact Analysis: Dialog version of the Impact Analysis, which can be used after completion of phase RUN_RSPTBFIL_ZDM_CLASSIFY in any system that is upgraded using ZDO with the same software stack

Further Information

For more information and FAQ, refer to the following blog posts in the SAP Community:

Prepare and Export Statistical Data from Production

This section shows how to retrieve statistical table data from the production system.

Exporting the table statistics is the first step in running the Impact Analysis. The export report retrieves all table changes (updates, inserts, and deletes) as well as size and database trigger information. The output is stored in a compressed archive named ZDIMPANA.ZIP.
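Since the export result is a standard ZIP archive, a quick sanity check after the export is to list its members and sizes. The sketch below assumes nothing about the archive's internal file names, only that ZDIMPANA.ZIP is a valid ZIP file; the example path is the one used later in this lesson.

```python
import zipfile

def inspect_export(path):
    """Return (member name, uncompressed size) pairs for an exported
    statistics archive, to verify the export is non-empty and readable."""
    with zipfile.ZipFile(path) as archive:
        return [(info.filename, info.file_size) for info in archive.infolist()]

# Example call (path as used in this lesson):
# inspect_export("C:/Temp/ADM330e/ZDIMPANA.ZIP")
```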


In order to export table size information, make sure that the database statistics are up to date. Outdated database statistics may result in inaccurate size values. In addition, it is important that the report that exports the statistical data is executed in the production system.

The Impact Analysis, as well as the tool to export the table statistics, is part of the Software Update Manager (SUM) Toolbox, which has to be present in the production system. The export works as follows:

  1. Call the transaction code SUMTOOLBOX in the production system.
  2. Select the tool Export data for Impact Analysis in the navigation tree.
  3. Use the F4 help to provide a file name and path, as in this example (C:\Temp\ADM330e\ZDIMPANA.ZIP). In addition, enter a start and end date for the data that should be displayed in the result screen.


    The start and end date fields are optional. If no dates are entered, all available periods are shown in the ALV grid.
  4. The button Display Period will query the data available in the system and update the ALV grid below. Important: This action will not start the export into the ZIP file.


    The column Contains Imports indicates that during that time, transports or other software changes, such as updates of software components, were imported into the system. If you have, for instance, daily imports running into the system, all lines will have the flag set in this column, meaning that such lines have to be exported anyway. This can potentially lead to false-positive results, as the import of transports or other software changes also changes data in the tables. Further information on how to reduce the number of false-positive results in that case can be found in the lesson 'Best Practices to run the Impact Analysis' of this unit.
  5. Now, select the time frame that should be exported, as in this example the weekly data from March 7, 2022 to March 13, 2022. Hold the CTRL key on the keyboard to select multiple lines.


    The exported table statistics should be representative of the time when the upgrade is running. In particular, it is important to consider the time when business users operate the bridge subsystem. The ideal time frame is a data set that captures all activities and business processes, for example, the same time of the week in which SUM performs the upgrade.
  6. Finally, choose the Export button to start exporting the table statistics for the selected period(s) into the compressed file.

Provide the exported file ZDIMPANA.ZIP to the SUM/abap/save directory. The directory is not available right after extracting SUM, as SUM creates it as part of the preprocessing activities. Once the save directory appears as a subdirectory of SUM/abap, the file can be provided.
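The provisioning step above can be scripted; a minimal sketch, assuming a local path to the extracted SUM installation, which waits until SUM has created the save directory before copying the file. The paths and polling interval are illustrative assumptions, not fixed values.

```python
import os
import shutil
import time

def provide_statistics(export_file, save_dir, timeout=600, poll=10):
    """Copy the exported statistics file into SUM/abap/save once SUM has
    created that directory during its preprocessing activities."""
    waited = 0
    while not os.path.isdir(save_dir):
        if waited >= timeout:
            raise TimeoutError(f"{save_dir} not created within {timeout}s")
        time.sleep(poll)  # SUM creates the directory during preprocessing
        waited += poll
    return shutil.copy(export_file, save_dir)

# Hypothetical paths -- adjust to your landscape:
# provide_statistics("ZDIMPANA.ZIP", os.path.join("SUM", "abap", "save"))
```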

Export Statistical Data from Production

Select Start Exercise to start the simulation.


  1. Log on to the productive SAP S/4HANA 2020 system.

    1. Log on to the system with user DDIC_DEV. The password is already pre-filled.

  2. Start the export tool in SUM Toolbox.

    1. Enter transaction code SUMTOOLBOX in the transaction box.

    2. Double-click the tool Export data for Impact Analysis in the navigation tree on the left-hand side.

  3. Select the evaluation period March 28, 2022 to April 10, 2022.

    1. Select the From Date 28.03.2022. Use the F4 help date picker to fill the field.

    2. Select the To Date 10.04.2022. Use the F4 help date picker to fill the field.

    3. Choose the button Display Periods to load the result list.


      In this example, the productive SAP S/4HANA system receives weekly transport imports. Hence, the column Contains Imports is checked for the weekly data.

  4. Export the time periods for calendar weeks 14 and 15.

    1. Select the first line in the result list with the start date 28.03.2022 and end date 03.04.2022. This period represents calendar week 14.

    2. Change the file name from ZDIMPANA.ZIP to ZDIMPANA_CW14.ZIP by opening the file dialog using the F4 help icon. Confirm the dialog by choosing the Save button.

    3. Start the export into the ZIP file ZDIMPANA_CW14.ZIP by choosing the Export button.

    4. Repeat the last four steps for the start date 04.04.2022 and end date 10.04.2022. Choose the file name ZDIMPANA_CW15.ZIP.
