Explaining Master Data Quality: Process and Rules

Objective

After completing this lesson, you will be able to outline the core processes of data quality management with SAP Master Data Governance on SAP S/4HANA.

Data Quality: Process Flow

The figure illustrates the data quality management process in SAP Master Data Governance.

Define Quality
Define requirements based on your company's business processes.
Set priorities according to value, impact, and quality evolution.
Enter Quality
Ensure quality at point of entry.
Consider all entry points: single changes, mass changes, load scenarios, and so on, both in daily business and in projects.
Monitor Quality
Operational motivation: Detect issues before processes fail.
Tactical motivation: Ensure progress and performance of current activities.
Strategic motivation: Enable achievements, define new initiatives.
Improve Quality
Correct data and drive the correction process.
Fix data entry processes.
Evolve the definition of quality.
The figure shows the main process steps to improve the data quality. The steps are: Define Quality, Enter Quality, Monitor Quality, and Improve Quality. The figure also shows the Data Quality Evaluation Overview values for Business Partner and Product, as well as the Evaluation Results for both domains.

Note

All Data Quality Management features of the classic mode in SAP Master Data Governance are supported in the cloud-ready mode.

Business Value

SAP Master Data Governance is the central place for master data quality rules. It provides transparency on the business aspects, uses, and technical implementations of rules, as well as a consistent quality definition and continuous evaluation and monitoring. The following are examples of the business value of SAP Master Data Governance:

  • Business partner and product master data covered as packaged applications, and platform for custom-defined objects
  • Collaboratively describe, catalog, and implement rules for data quality evaluation
  • Schedule quality evaluations, analyze evaluation results, and initiate correction of erroneous data
  • Get an overview of current data quality status and KPIs
  • Enable drill-down analysis of data quality scores across multiple dimensions

Data Quality Rules: Management

Managing data quality rules in a single place for multiple purposes offers the following benefits:

Innovations
With SAP S/4HANA, you can manage your rules for master data quality in a single place:
  • Repository to catalog and define data-quality rules
  • Comprehensive description of rules, including business aspects, ownership, and rule implementations
  • Collaboration and status handling during the lifecycle of rules, from creation to obsolescence
Business value
  • Central place for master data quality rules
  • Transparency on business aspects, uses, and technical implementations of rules
The screenshot shows the Data Quality Rule start screen for rule ZMDQ_MVKE_KONDM_2_SPART with the General Information Tab.

Properties of Data Quality Rules are:

General
  • Rule name, ID, and Status
  • Checked field - Field that is the subject of the rule (see later)
  • Base table - Basis for the evaluation of the rule (see later)
Business Details
  • Description - Detailed description of what the rule checks
  • Reason - Explanation of why the rule is needed and the impact if data does not comply with the rule
  • Scope - Description of the data set to which the rule is applicable
  • Link - URL link to further information that is stored elsewhere
Contacts
  • Rule Owner - A user who is responsible for the rule
  • Implementation Expert - A user who will do the implementation
  • Business Contact - Free text, for example, the rule requesters
  • Data Owner - A free text field used to identify the user that must, and can, correct the data, for example, the responsible purchasing department

Rule Status

Given below are the uses of the status of a data quality rule, with the corresponding UI actions to set each status.

Use of the status of a data quality rule:

  • Represents its life-cycle status
  • Determines if the rule is considered during data quality evaluation

Status transitions:

  • Pre-conditions are checked, for example, all BRFplus expressions must be active before a rule can be set to Approved status.
  • Authorizations are required.

This allows flexible configuration, for example:

  • One user does everything.
  • There is a strict segregation of duties.

The figure shows the possible statuses of a rule: New, To Be Implemented, To Be Tested, Approved, and Disabled.

Note

Only straightforward status transitions are shown in the figure. Further transitions are possible.

Authorization Object MDQRM_RULE

Facts about the Authorization Object MDQRM_RULE

  • The values of the authorization field MDQ_RULSTS correspond to the status of the data quality rule.

  • When a rule is created, the system sets its status to New. The status of a rule can only be changed by users with the corresponding activity (authorization field ACTVT; see the table below).

  • You can use the authorization field MDQ_RULSTS to enforce the segregation of duties in the rule definition and activation process.

Note

Status Active and activity Activate are obsolete with SAP S/4HANA 1909.

UI Action and Required Activity

UI Action                  Required Activity
Edit                       (02) Change
Delete                     (06) Delete
Send for Implementation    (49) Request
Send for Testing           (98) Mark for Release
(Create Expressions)       (64) Generate
Approve                    (F1) Approve
Disable                    (H1) Deactivate
Create                     (01) Add or Create
Display                    (03) Display
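As an illustrative sketch (not SAP code), the mapping in the table above can be modeled programmatically to check whether a user's granted activities cover a given UI action:

```python
# Illustrative model of the UI-action-to-ACTVT mapping from the table above.
# The authorization check itself is a simplified sketch, not SAP's implementation.

UI_ACTION_ACTVT = {
    "Create": "01", "Edit": "02", "Display": "03", "Delete": "06",
    "Send for Implementation": "49", "Create Expressions": "64",
    "Send for Testing": "98", "Approve": "F1", "Disable": "H1",
}

def is_allowed(user_actvts, ui_action):
    """Check whether the user's granted activities cover the UI action."""
    return UI_ACTION_ACTVT[ui_action] in user_actvts

tester = {"03", "98"}  # may display rules and send them for testing
print(is_allowed(tester, "Send for Testing"))  # True
print(is_allowed(tester, "Approve"))           # False
```

In a real system, these checks are performed by the authorization objects MDQRM_RULE and ACTVT, not by application code.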

Rule Status

Rule Status    Description
NEW            New
APPROVED       Approved
TOBEIMPL       To Be Implemented
TOBETST        To Be Tested
DISABLED       Disabled

Collaborating on Data Quality Rules

Sharing links to rules
  • Copy and paste from the browser.
  • Send e-mail from the app.
Collaboration with SAP CoPilot
  • Write personal notes directly in the app.
  • Add users to start a conversation.
  • Link rules and other objects.
  • Add screenshots, including edit marks.

Import and Export of Data Quality Rules

With SAP Master Data Governance on SAP S/4HANA 2020, the import and export of Data Quality Rules were enhanced:

  • Export data quality rules, including their BRFplus implementation, and download them in OpenOffice XML format.
  • Validate the import file.
  • Import new data quality rules, including BRFplus implementation.

To export data quality rules, complete the following steps:

  1. Select the rules for export in the Manage Rules app.
  2. Describe the export.
  3. Start the export, enabling the system to create a file.
  4. Download the exported rules as a file in OpenOffice XML (XLS) format.

BRFplus information is stored in a .zip file.

The figure shows the three steps to export Data Quality Rules.

To import Data Quality Rules, complete the following steps:

  1. Start the Import app from the launchpad.
  2. Create an import and add a file.
  3. The system examines the file content.
  4. Initiate the import of the rules.
The figure shows the three steps to import a Data Quality Rule.

Validation Rules Simulation

With the Validation Rules app, you can collaboratively describe, catalog, and implement rules for data quality using a central rule repository. These validation rules can be used for data quality evaluations and to check data in change requests, in consolidation and in mass processing for products and business partners.

As a Master Data Steward, you can use this app to get structured and comprehensive access to rules as well as to link rules from the rule mining process to validation rules in the central rule repository. You can also trigger the rule export process from this app.

Key Features

  • Define rules including business aspects, responsibilities, and usages.
  • Define status handling and lifecycle of rules.
  • Define the technical implementation and execution of rules with BRFplus.
  • Assign one or several rule usages to be able to proceed with the rule implementation.
  • Configure a message that will appear in the master data processes where the rule is applied.
  • Link rules from the rule mining process to validation rules in the central rule repository. If automatic implementation is supported, the BRFplus implementation is automatically added.
  • Start the export of rules.
  • Enable or disable the usage for each master data process.
  • Simulate rules to check if they work as intended.
  • Display the history of a validation rule by selecting Show Audit Trail in the Administrative Data section.

Simulation

Use the Simulation function to judge the effect of validation rules. Simulations can be used to verify your implementation to ensure its correctness. Simulations can also give you a preview of the rule results so you can see the potential impact on your data quality.

The Simulation function can be accessed through the Simulations tab in the Validation Rules app.

Perform the Following Steps to Create a Simulation

  1. In the Validation Rules app for your chosen domain, choose a rule and, in the Simulations section, choose Create.
  2. Enter a description for your simulation and choose Save. The screenshot shows the Create Simulation window, with a description for the simulation.
  3. Select the data to which you want the rule to apply. The screenshot shows the select data screen in the simulation process.
  4. Optionally, navigate to the Simulate Active Version and Simulate Inactive Version tabs to change the package size and set checkpoints before starting the simulation. The default package size used for processing is 500. Reduce the package size if you are dealing with complex records, to avoid timeout issues.
  5. Choose Start. The simulation runs based on both the active and the inactive version of the rule implementation. If a rule does not have an active version or an inactive version, that step is skipped.

    The simulation stops for review once finished. The selected data is evaluated against the active version of the rule's BRFplus implementation, and then against the inactive version.

    Note

    An active version of a rule has its BRFplus implementation saved and activated. An inactive version of a rule has changes in the BRFplus that have been saved but not yet activated. There can only be one active and one inactive version of a rule at any one time.
  6. Review the results shown for the rule simulation. The screenshot shows the simulation results screen.
  7. Navigate back to the Simulations tab in the validation rule to get an overview of the simulation results for the active and inactive versions of that rule. Decide if you want to rework the rule implementation or send it for approval.
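The package-based processing mentioned in step 4 can be sketched as follows. This is an illustrative Python model of chunked processing, not SAP's implementation:

```python
# Sketch of package-based processing: records are evaluated in chunks
# ("packages"); the default package size of 500 can be reduced for
# complex records to avoid timeouts.

def packages(records, package_size=500):
    """Yield the records in consecutive chunks of at most package_size."""
    for i in range(0, len(records), package_size):
        yield records[i:i + package_size]

record_ids = list(range(1, 1201))  # 1200 records selected for simulation
sizes = [len(p) for p in packages(record_ids)]
print(sizes)  # [500, 500, 200]
```

Smaller packages mean more round trips but a lower risk that a single package of complex records exceeds the processing time limit.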

Data Quality Rules: Usage and Implementation

Data Quality Rules can be used for the following purposes:

Data quality evaluation

Rules are applied to active data in the system.

Check in change requests

Data processed with change requests is validated.

Check in consolidation and mass processing

Rules are used in the validation and activation steps of these processes.

Individual enablement and disablement of usages

The usages of a rule are independent; each can be enabled or disabled on its own.

Reuse of the same rule implementation

The same rule implementation can be used for multiple usages.

Note

To enable usage, the user needs the authorization object MDQRM_RLUS with the MDQ_RULUSG field. The authorization can be granted per usage.

Data Quality Management with SAP Master Data Governance

The figure illustrates the fields in Data Quality Management that are used to add rules to change request processing.

Rule Implementation with BRFplus

Facts about rule implementation with BRFplus

Implementation with BRFplus:
Business Rule Framework plus (BRFplus) is used for implementing and executing rules.
Powerful, versatile, and assisted:
  • Implementation with BRFplus from simple "field is not initial" to complex calculations
  • Powerful and context-aware BRFplus workbench
  • Robust and proven rules engine with code generation
  • BRFplus structuring artifacts created automatically with Prepare, once a usage is added
The screenshot shows the usage of data quality rules in change requests. In this case, the rule checks whether the two entered search terms are identical and raises a warning.

Base Table Meaning

Facts about the meaning of the base table

  • Semantically, Base Table defines the level of the information hierarchy of a product to which the rule applies.

  • Example 1: If Basic Data (MARA) is used as the base table, the rule is applied to the basic data of a product only. Consequently, there is only one outcome of this rule for a product.

    Usage example: A check if MARA-NTGEW (Net Weight) is not initial.

  • Example 2: If Plant Data (MARC) is used as the base table, the rule is applied to the data of every plant of a product. Here, there is an outcome of the rule for each plant for which the product is available.

    Usage example: A check if MARC-DISMM (MRP Type) is not initial.

Note

The base table must be determined when the first usage is prepared, and is fixed afterwards.
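The effect of the base table on the number of rule outcomes can be sketched as follows. This is an illustrative Python model (not SAP code) using the two examples above; the data values are invented:

```python
# Illustrative model: the base table determines how many outcomes a rule
# produces for one product. Field and table names follow the examples
# above (MARA = basic data, MARC = plant data); the values are invented.

product = {
    "MARA": {"MATNR": "P-100", "NTGEW": 12.5},  # basic data: one record
    "MARC": [                                   # plant data: one record per plant
        {"WERKS": "1000", "DISMM": "PD"},
        {"WERKS": "2000", "DISMM": ""},
    ],
}

# Base table MARA: the rule runs once, so there is one outcome per product.
mara_outcome = "OK" if product["MARA"].get("NTGEW") else "NOT OK"

# Base table MARC: the rule runs once per plant, so there is one outcome
# for each plant for which the product is available.
marc_outcomes = {
    plant["WERKS"]: ("OK" if plant.get("DISMM") else "NOT OK")
    for plant in product["MARC"]
}

print(mara_outcome)   # one outcome for the product
print(marc_outcomes)  # one outcome per plant
```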

Scope Expression and Condition Expression

The figure illustrates the split of logic:

Split of Logic

The implementation is split into Scope and Condition for clarity and simpler re-use.

If the data is not in the scope of the rule, no outcome is provided.

Scope

The Scope Expression is used by the system to determine if a product (or parts of a product as specified by Base Table) is in the scope of the rule and therefore checked with the Condition Expression.

Condition Expression

The Condition Expression determines the outcome of the rule, which is either OK or Not OK.

The figure shows the split of logic.

The figure illustrates a typical example.

The figure explains the Rule Scope and the Rule Condition. Rule Scope: The first part of the rule, "For all products of type finished goods (MARA-MTART = FERT)", is called the rule scope. The scope defines to which product master data the rule is applied. For all product data in the scope of the rule, the condition is evaluated, which produces the outcome of either OK or Not OK. For all product data that is not in the scope of the rule, the outcome is undefined. Rule Condition: The second part of the rule, "the field Division (MARA-SPART) must be filled", is called the rule condition. This is the actual check of the rule, and the system reports whether products comply with or violate it. When the rule is evaluated, the rule condition determines the outcome of either OK or Not OK.
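The scope/condition split of this example can be sketched in Python. This is an illustrative model of the evaluation logic, not the BRFplus implementation; a product is represented as a simple dict of MARA fields:

```python
# Sketch of the scope/condition split for the example rule:
# scope:     product type is finished goods (MARA-MTART = FERT)
# condition: the Division (MARA-SPART) must be filled

def evaluate_rule(product):
    """Return 'OK', 'NOT OK', or None (undefined: product out of scope)."""
    # Scope expression: only finished goods are checked at all.
    if product.get("MTART") != "FERT":
        return None  # no outcome for data outside the rule's scope

    # Condition expression: determines the outcome OK / Not OK.
    condition_ok = bool(product.get("SPART", "").strip())
    return "OK" if condition_ok else "NOT OK"

print(evaluate_rule({"MTART": "FERT", "SPART": "07"}))  # OK
print(evaluate_rule({"MTART": "FERT", "SPART": ""}))    # NOT OK
print(evaluate_rule({"MTART": "ROH"}))                  # None (out of scope)
```

Keeping the scope separate from the condition makes both parts easier to reuse: the same scope can gate several conditions, and the same condition can be applied under different scopes.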

To implement it, perform the following:

BRFplus Workbench
  • Clicking on the scope expression opens the generated expression in its initial version in the BRFplus workbench.
  • Use the Edit Operand button to implement the logic.
Simple Example

For all products of type finished goods (MARA-MTART = FERT)…

Extended Example

For more complex use cases and to re-use already implemented logic, you can also extend the expression and use all BRFplus capabilities.

The screenshot shows different examples of implementing the scope expression. On the first screenshot, you can see the initial version. The screenshot in the middle shows a simple example. And the last screenshot shows an extended example.

Note

Implementing the condition expression follows the same pattern.

Analyze Data Quality in Company Code Data for Customers and Suppliers

The analysis of data quality offers you the following functionality:

  • Analyze data quality issues with customer/supplier company code attributes.
  • Correct business partner data using both single and mass processing.
  • Export incorrect data.
The screenshot shows the analysis screens of company code data for customer and supplier.

Data Quality Rules in Mass Processing and Consolidation

You can integrate Data Quality Rules in the Mass Processing and Consolidation Process. This occurs in the validation and activation phase of the process. With SAP S/4HANA 2020, you can create predefined messages that you can use to inform the user in the mass processing and consolidation process.

Predefined Message

  • Maintain predefined messages for data quality rules.
  • Use message variables from the rule context and master data context.
  • Define message severity.
The screenshot shows a predefined message for data quality rules.
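The idea of a predefined message with variables from the rule and master data context can be sketched as follows. This is an illustrative Python model using `string.Template` placeholder syntax; the field names and message text are invented, and the actual SAP variable syntax differs:

```python
# Hypothetical sketch of a predefined message with context variables.
# $MATNR and $VKORG stand in for master data context fields; the real
# message definition and variable syntax in SAP MDG are different.
from string import Template

message = Template("Division is missing for product $MATNR in sales org $VKORG")
severity = "W"  # message severity: warning

context = {"MATNR": "P-100", "VKORG": "1000"}
print(f"{severity}: {message.substitute(context)}")
```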

In the next step, create a mass processing or consolidation process and use your data quality rule in the process.

Data Quality Rule in Mass Processing or Consolidation

  • Use a data quality rule in mass processing and consolidation.
  • Use one implementation for different master data processes.
  • Use a predefined message to inform users on the consolidation and mass processing UIs.
The figure shows the screenshots of Data Quality Rules used in Mass Processing and Consolidation processes.
The figure shows a screenshot showing warnings based on implemented Data Quality Rules in a mass processing process.

Analyze Data Quality Using the Data Quality Dashboard in SAP Analytics Cloud

The Data Quality Dashboard in SAP Analytics Cloud provides an easy-to-use dashboard for end users, from chief data officers to master data specialists. These dashboards are delivered in the content network of SAP Analytics Cloud. To use them, you must import the SAP Master Data Governance package into your SAP Analytics Cloud tenant and set up the connection to your SAP S/4HANA Cloud 2008 tenant or SAP S/4HANA 2020 system. The dashboards can then be used out-of-the-box for product data and business partner data.

This dashboard offers you the following possibilities:

  • Get a real-time overview of master data quality and historical trends.
  • Customize your dashboard for greater flexibility to create your own charts.
  • Benefit from a cloud-based central analytics platform.
  • Explore data quality score by category, dimension, and data quality rule.
The figure shows some screenshots of SAP Analytics Cloud and some configuration steps.

The dashboard enables you to get the information you need and provides more drill-down functions out-of-the-box. Through the integration with data quality management in SAP Master Data Governance, you can easily navigate to your back-end system and perform even deeper drill-downs and corrections there.

The Data Quality Dashboard uses live data connections to the data source in SAP S/4HANA systems, with integrated authorization checks. These checks ensure that users cannot see data for which they are not authorized.

The figure shows some screenshots of the Data Quality Dashboard in SAP Analytics Cloud. You can see Scores by Category, Scores By Category Dimension and Score By Data Quality Rule.
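The drill-down scores shown in the figure can be sketched as a simple aggregation. This is an illustrative Python model (not the dashboard's implementation); the dimension names and evaluation results are invented:

```python
# Illustrative sketch of a data quality score: the share of OK outcomes,
# aggregated per dimension. Dimension names and results are invented.

results = [
    {"rule": "R1", "dimension": "Completeness", "outcome": "OK"},
    {"rule": "R1", "dimension": "Completeness", "outcome": "NOT OK"},
    {"rule": "R2", "dimension": "Consistency",  "outcome": "OK"},
    {"rule": "R2", "dimension": "Consistency",  "outcome": "OK"},
]

def score(rows):
    """Percentage of OK outcomes among the given evaluation results."""
    ok = sum(1 for r in rows if r["outcome"] == "OK")
    return round(100 * ok / len(rows), 1)

by_dim = {}
for r in results:
    by_dim.setdefault(r["dimension"], []).append(r)

print({d: score(rows) for d, rows in by_dim.items()})
# {'Completeness': 50.0, 'Consistency': 100.0}
```

The same grouping applied to other keys (category, rule) yields the other drill-down levels shown in the dashboard.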

Custom Fields Enabled Data Quality Management

SAP S/4HANA allows you to maintain business-relevant information that is not part of the SAP standard by defining custom fields. Key users can use the Custom Fields app to manage these fields.

Note

For more details on how to use this app, see: Extending Data Quality Management for Your Custom Fields | SAP Blogs.

The figure describes the High Level Scope and Features as well as the Benefits and the Business Value of Custom Fields enabled Data Quality Management.

To use Data Quality Management for Custom Fields, complete the following steps:

  1. Define Validation Rule for Custom Field.
  2. Apply Validation to your Custom Field in Master Data Processes.

    Quality checks on custom fields can be executed in consolidation and mass processing applications.

  3. Evaluate Custom Fields and Remediate Data Quality Issue.

    In addition to data checking in the data change process, Master Data Governance Data Quality Management provides the quality evaluation function, which supports you in discovering erroneous master data based on your defined validation rules. Custom fields are also evaluated. After evaluation, the custom fields are also available in the evaluation results apps for you to review the results.

  4. Automation – Define Derivation Rule (SAP S/4HANA 2021 FSP1 and above).

    In Data Quality Management, derivation scenarios group derivation rules together to automate the value(s) of single or multiple fields. Custom fields can be used as Condition and Result fields in derivation rules.

  5. Automation – Apply Derivation Rules in Master Data Processes (SAP S/4HANA 2021 FSP1 and above).

    Derivation scenarios can then be used in mass processing and consolidation, for example, to automate the extension of materials to plants or to create customer data for a business partner.
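The idea of a derivation rule, with condition fields determining result fields, can be sketched as follows. This is an illustrative Python model; the mapping from material type to division is invented, and custom fields could appear on either side in the same way:

```python
# Hypothetical sketch of a derivation rule: condition field values
# determine result field values automatically. The MTART -> SPART
# mapping below is invented for illustration.

def derive_division(product):
    """Derive Division (SPART) from Material Type (MTART) - a sample rule."""
    derivations = {"FERT": "07", "HALB": "05"}  # condition value -> result value
    mtart = product.get("MTART")
    if mtart in derivations:
        product["SPART"] = derivations[mtart]  # automate the field value
    return product

print(derive_division({"MATNR": "P-100", "MTART": "FERT"}))
# {'MATNR': 'P-100', 'MTART': 'FERT', 'SPART': '07'}
print(derive_division({"MATNR": "P-200", "MTART": "ROH"}))
# {'MATNR': 'P-200', 'MTART': 'ROH'}  (no derivation applies)
```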