Data Crossroads has published the Data Management Maturity Assessment Review for the second year in a row. In the first article of the series, we explained the methodology for measuring maturity. Then we demonstrated the general trends in data management, data governance, data modeling, information systems architecture, and data chains. This article focuses on trends in the optimization of data quality.
In this article, we will:
- Explain the structure of this capability
- Demonstrate the general trends in the optimization of data quality
- Investigate changes in four key performance indicators
Four components enable this capability. These are process, role, data/deliverable, and tool. Each component consists of a set of items. The full list of items per component can be seen in Figure 1.
The logic behind this model is simple. For instance, the data quality capability delivers data quality checks and controls along data chains. These checks and controls are one of the deliverables of this capability. A process to develop, document, and maintain data quality checks and controls delivers this artifact. Data quality analysts, IT engineers, and business data stakeholders take part in this process and deliver the checks and controls. A data quality application and a business rules repository assist in documenting them.
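The component structure described above can be sketched as a simple data model. This is a hypothetical illustration only; the class and field names are assumptions, not the actual schema of the "Orange" maturity scan.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the four-component capability model described above.
# All names are illustrative assumptions, not the actual "Orange" scan schema.

@dataclass
class Capability:
    name: str
    processes: list = field(default_factory=list)     # e.g. develop and maintain DQ checks
    roles: list = field(default_factory=list)         # e.g. data quality analyst
    deliverables: list = field(default_factory=list)  # e.g. documented checks and controls
    tools: list = field(default_factory=list)         # e.g. data quality application

data_quality = Capability(
    name="data quality",
    processes=["develop, document, and maintain data quality checks and controls"],
    roles=["data quality analyst", "IT engineer", "business data stakeholder"],
    deliverables=["data quality checks and controls along data chains"],
    tools=["data quality application", "business rules repository"],
)
```

Each component is a list of items, mirroring Figure 1: a process produces the deliverable, roles participate in the process, and tools support it.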
Now let us take a look at the general trends in data quality maturity.
General trends in data quality maturity
In the first article of this series, we saw the general changes in data management maturity. The maturity of data quality slightly worsened in 2020 compared to 2019, as shown in Figure 2.
Some movements between the maturity levels presented in Figure 3 provide possible explanations.
In general, the trend seems positive. The number of companies at the lowest “uncontrolled” level declined. The number of respondents at the “in development” and “capable” levels of maturity has grown. The number of companies that have “ad-hoc” status essentially remains the same.
The “Orange” maturity scan uses four performance indicators to assess the maturity of this capability. Let us take an in-depth look at each of them.
Indicator 1: “information for decision-making”
Data management supports the decision-making processes at different management levels. This is one of the key data management business values. To make correct business decisions, managers should get the required information.
As demonstrated in Figure 4, the situation with the delivery of information has improved drastically.
The number of companies at the three lowest levels of maturity has decreased significantly. Simultaneously, more companies have reached the “capable” and “effective” levels of maturity.
The required information should not only be available; it should also be delivered at the right time.
Indicator 2: “on-time data and information delivery”
Rapid changes in the business environment require appropriate and fast responses. For that, decision-makers should have information in (near) real time. To deliver such information, the corresponding data should also be available for processing in (near) real time.
The situation with timely delivery has also improved significantly, as can be seen in Figure 5.
The percentage of respondents at the three highest maturity levels has increased. The number of companies at the two lowest levels of maturity has decreased.
Information should not only be delivered on time. For correct decision-making, it must also be at the required level of quality.
Indicator 3: “data at the required level of quality”
The quality of data makes sense only in a particular context. The quality of data should be measured against requirements. Until business users set up data requirements, no claims can be made about the quality of data. This indicator demonstrates positive trends similar to the previous indicators, as shown in Figure 6.
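The idea that quality only becomes measurable once a requirement exists can be made concrete with a minimal sketch. The rule, field names, and threshold below are invented examples, not part of any actual assessment framework.

```python
# Minimal illustration: data quality is measured against a business-defined
# requirement. Field names and the 90% threshold are invented examples.

records = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": 2, "email": None},
    {"customer_id": 3, "email": "c@example.com"},
]

# A business-defined requirement: at least 90% of records must have an email.
requirement = {"field": "email", "min_completeness": 0.90}

filled = sum(1 for r in records if r[requirement["field"]] is not None)
completeness = filled / len(records)

meets_requirement = completeness >= requirement["min_completeness"]
print(f"completeness={completeness:.2f}, meets requirement: {meets_requirement}")
```

Without the `requirement` dictionary, the `completeness` figure is just a number; only the business-set threshold turns it into a quality verdict.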
The number of companies at the “in development,” “capable,” and “effective” levels of maturity has increased. Fewer companies have demonstrated the “uncontrolled” and “ad-hoc” levels of maturity.
An established data quality framework maintains the quality of data at the required level.
Indicator 4: “data quality roles and processes”
An established data quality framework includes data quality-related roles and processes. Such a framework enables the effective functioning of the data-quality capability.
The trends of the data quality framework indicator are positive to some extent, as seen in Figure 7.
The number of companies that assessed their level of maturity as “capable” and “effective” has grown. Fewer companies have no data quality roles and rules at all. Still, a significant percentage of respondents (32%) have roles and rules only at the “ad-hoc” level.
Key takeaways
- The overall level of data quality maturity slightly worsened in 2020 compared to 2019.
- At the same time, trends remain positive.
- The following facts express the positive trends:
  - The situation with the availability and timely delivery of data has improved significantly.
  - The delivery of data at the required quality has also demonstrated significant development.
  - The established data quality framework has demonstrated some positive shifts between the levels of maturity.
To improve the situation, companies should pay attention to:
- Prioritizing data-quality initiatives focused on data critical for business operations
- Defining data-quality requirements
- Designing and implementing data-quality checks and controls along key data value chains.
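The last recommendation can be illustrated with a small sketch of a data quality control embedded as a gate in one step of a data chain. The function and field names are hypothetical; real implementations would typically live in a pipeline or data quality tool.

```python
# Hypothetical sketch: embedding a data quality control inside a data chain.
# Records failing the check are quarantined instead of flowing downstream.

def dq_check_non_negative_amount(record):
    """Example control: order amounts must not be negative."""
    return record["amount"] >= 0

def run_chain_step(records, check):
    """Apply a data quality check at one step of the data value chain."""
    passed, rejected = [], []
    for record in records:
        (passed if check(record) else rejected).append(record)
    return passed, rejected

orders = [
    {"order_id": 1, "amount": 120.0},
    {"order_id": 2, "amount": -5.0},
    {"order_id": 3, "amount": 40.0},
]

clean, quarantined = run_chain_step(orders, dq_check_non_negative_amount)
```

Placing such gates at the handover points between systems is one way to implement checks and controls "along key data value chains" rather than only at the point of consumption.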
This is the last article of the “Maturity Trends 2020” series.
Compare the maturity status of your company by performing a Data Management Maturity Scan.