Data Crossroads has published the Data Management Maturity Assessment Review for the second year in a row. In the first article of the series, we explained the methodology for measuring maturity. We then demonstrated the general trends in data management, data governance, data modeling, information systems architecture, and data chains. This article focuses on trends in the optimization of data quality.
In this article, we will:
- Explain the structure of this capability
- Demonstrate the general trends in data quality maturity
- Investigate changes in four key performance indicators
Data quality
In the “Orange” model of data management, data quality is a business capability that enables the delivery of data and information of the required quality.
Four components enable this capability. These are process, role, data/deliverable, and tool. Each component consists of a set of items. The complete list of items per component can be seen in Figure 1.
The logic behind this model is simple. For instance, the data quality capability delivers data quality checks and controls along data chains. These checks and controls are one of the deliverables of this capability, produced by a process to develop, document, and maintain them. Data quality analysts, IT engineers, and business data stakeholders participate in this process and deliver the checks and controls. A data quality application and a business rules repository assist in documenting them.
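The capability structure described above (a capability built from process, role, data/deliverable, and tool components, each holding a set of items) can be sketched as a small data model. This is a minimal illustration, not the official model: the item lists below are paraphrased from the text, not taken from Figure 1.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    # One of the four component types: process, role, data/deliverable, tool.
    name: str
    items: list[str] = field(default_factory=list)

@dataclass
class Capability:
    name: str
    components: list[Component] = field(default_factory=list)

    def items_of(self, component_name: str) -> list[str]:
        """Return the items of a named component, or an empty list."""
        for component in self.components:
            if component.name == component_name:
                return component.items
        return []

# Illustrative instance for the data quality capability.
data_quality = Capability(
    name="data quality",
    components=[
        Component("process", ["develop, document, and maintain checks and controls"]),
        Component("role", ["data quality analyst", "IT engineer", "business data stakeholder"]),
        Component("data/deliverable", ["data quality checks and controls"]),
        Component("tool", ["data quality application", "business rules repository"]),
    ],
)

print(data_quality.items_of("role"))
```

The point of the structure is only that each capability decomposes into the same four component types, which makes capabilities comparable across the maturity scan.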
Now let us look at the general trends in data quality maturity.
General trends in data quality maturity
In the first article of this series, we saw the general changes in data management maturity. Data quality maturity slightly worsened in 2020 compared to 2019, as shown in Figure 2.
Some movements between the maturity levels presented in Figure 3 provide possible explanations.
In general, the trend seems optimistic. The number of companies at the lowest “uncontrolled” level declined. The number of respondents at the “in development” and “capable” maturity levels has grown. The number of companies with “ad-hoc” status remains the same.
The “Orange” maturity scan uses four performance indicators to assess the maturity of this capability. Let us take an in-depth look at each of them.
Indicator 1: “information for decision-making.”
Data management supports the decision-making processes at different management levels. This is one of the key data management business values. To make correct business decisions, managers should get the required information.
As demonstrated in Figure 4, the situation with information delivery has improved drastically.
The number of companies at the three lowest maturity levels has fallen significantly. At the same time, more companies have reached the “capable” and “effective” levels of maturity.
The required information should not only be available; it should also be delivered at the right time.
Indicator 2: “on-time data and information delivery.”
Rapid changes in the business environment require appropriate and fast responses. For that, decision-makers should have information in (near) real-time. To deliver such information, corresponding data should be available for processing also in (near) real-time.
The situation with timely delivery has also improved significantly, as seen in Figure 5.
The percentage of respondents at the three highest maturity levels has increased, while the number of companies at the two lowest maturity levels has decreased.
Information should not only be delivered on time. For correct decision-making, it must also be at the required level of quality.
Indicator 3: “data at the required level of quality.”
The quality of data makes sense only in a particular context: it should be measured against requirements. Until business users set up data requirements, no claims can be made about data quality. This indicator demonstrates positive trends similar to the previous indicators, as shown in Figure 6.
The number of companies at the three highest maturity levels (“in development,” “capable,” and “effective”) has increased. Fewer companies have demonstrated the “uncontrolled” and “ad-hoc” maturity levels.
An established data quality framework maintains data quality at the required level.
Indicator 4: “data quality roles and processes.”
An established data quality framework includes data quality-related roles and processes. Such a framework enables the effective functioning of the data quality capability.
The trends of the data quality framework indicator are positive to some extent, as seen in Figure 7.
The number of companies that assessed their level of maturity as “capable” and “effective” has grown. Few companies have no data quality roles and processes at all. Still, a significant percentage of respondents (32%) have roles and processes only at the “ad-hoc” level.
Conclusion
- The overall level of data quality maturity slightly worsened in 2020 compared to 2019.
- At the same time, trends remain positive.
- The following facts express the positive trends:
- The situation with the availability and timely delivery of data has improved significantly
- The delivery of data at the required quality has also demonstrated significant development
- The established data quality framework has demonstrated some positive shifts between the maturity levels
Development tips
To improve the situation, companies should pay attention to the following:
- Prioritizing data quality initiatives focused on data critical for business operations
- Defining data quality requirements
- Designing and implementing data quality checks and controls along key data value chains
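The last tip, implementing checks and controls along data chains, can be illustrated with a minimal rule-based sketch. Everything here is an assumption for demonstration: the field names, the rules, and the record structure are invented placeholders, not part of the “Orange” model.

```python
from typing import Callable

# A check maps a record (a dict of field values) to True (pass) or False (fail).
Check = Callable[[dict], bool]

# Hypothetical data quality rules, derived from made-up business requirements.
checks: dict[str, Check] = {
    "customer_id is present": lambda r: bool(r.get("customer_id")),
    "email contains '@'": lambda r: "@" in r.get("email", ""),
    "age is plausible": lambda r: 0 < r.get("age", -1) < 130,
}

def run_checks(records: list[dict]) -> dict[str, int]:
    """Count failures per check across all records."""
    failures = {name: 0 for name in checks}
    for record in records:
        for name, check in checks.items():
            if not check(record):
                failures[name] += 1
    return failures

records = [
    {"customer_id": "C1", "email": "a@example.com", "age": 34},
    {"customer_id": "",   "email": "not-an-email",  "age": 250},
]

print(run_checks(records))
# The second record fails all three checks, so each failure count is 1.
```

The design choice worth noting is that each check encodes an explicit, named requirement. This reflects the point made under indicator 3: quality can only be claimed once the requirements themselves are defined.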
This is the last article of the “Maturity Trends 2020” series.
Compare the maturity status of your company by performing a Data Management Maturity Scan.
For more insights, visit the Data Crossroads Academy site.