Data Crossroads has published the Data Management Maturity Assessment Review for the second year in a row. In the first article of the series, we explained the methodology for measuring maturity. Then we demonstrated the general trends in data management, data governance, data modeling, information systems architecture, and data chains. This article focuses on trends in the optimization of data quality.
In this article, we will:
Explain the structure of this capability
Demonstrate the general trends in the optimization of data chains
Investigate changes in four key performance indicators
Four components enable this capability. These are process, role, data/deliverable, and tool. Each component consists of a set of items. The full list of items per component can be seen in Figure 1.
Figure 1: Data quality components in detail.
The logic behind this model is simple. For instance, the data quality capability delivers data quality checks and controls along data chains. These checks and controls are one of the deliverables of this capability. A process to develop, document, and maintain data quality checks and controls delivers this artifact. Data quality analysts, IT engineers, and business data stakeholders take part in this process and deliver the checks and controls. A data quality application and a business rules repository assist in documenting them.
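To make the idea of a check-and-control deliverable concrete, here is a minimal sketch of one such check in Python. The dataset, field names, and the 95% threshold are hypothetical illustrations, not part of the Data Crossroads model; a real control would live in a data quality application and be documented in a business rules repository.

```python
# Illustrative only: a minimal data quality control.
# The records, the "email" field, and the 95% threshold are hypothetical.

def completeness(records, field):
    """Share of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

customers = [
    {"id": 1, "email": "a@example.com"},
    {"id": 2, "email": ""},
    {"id": 3, "email": "c@example.com"},
]

# A documented control: email completeness must reach 95%.
score = completeness(customers, "email")
passed = score >= 0.95
print(f"email completeness: {score:.0%} -> {'pass' if passed else 'fail'}")
```

The point of the sketch is the division of labor the article describes: a business data stakeholder sets the threshold, a data quality analyst defines the rule, and an engineer wires it into the data chain.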
Now let us take a look at the general trends in data quality maturity.
General trends in data quality maturity
In the first article of this series, we saw the general changes in data management maturity. The maturity of data quality slightly worsened in 2020 compared to 2019, as shown in Figure 2.
Figure 2: The comparison of the maturity levels per DM sub-capability.
Some movements between the maturity levels presented in Figure 3 provide possible explanations.
Figure 3: The changes per maturity level.
In general, the trend seems positive. The number of companies at the lowest “uncontrolled” level declined. The number of respondents at the “in development” and “capable” levels of maturity has grown. The number of companies with “ad-hoc” status remains essentially the same.
The “Orange” maturity scan uses four performance indicators to assess the maturity of this capability. Let us take an in-depth look at each of them.
Indicator 1: “information for decision-making”
Data management supports the decision-making processes at different management levels. This is one of the key data management business values. To make correct business decisions, managers should get the required information.
As demonstrated in Figure 4, the situation with the delivery of information has improved drastically.
Figure 4: Trends in information availability for decision-making.
The number of companies at the three lowest levels of maturity has dropped significantly. Simultaneously, more companies have reached the “capable” and “effective” levels of maturity.
The required information should not only be available; it should also be delivered at the right time.
Indicator 2: “on-time data and information delivery”
Rapid changes in the business environment require appropriate and fast responses. For that, decision-makers should have information in (near) real time. To deliver such information, the corresponding data should also be available for processing in (near) real time.
The situation with timely delivery has also improved significantly, as can be seen in Figure 5.
Figure 5: Trends in on-time data and information delivery.
The percentage of respondents at the three highest maturity levels has increased. The number of companies at the two lowest levels of maturity has dropped.
Information should not only be delivered on time; for correct decision-making, it must also be at the required level of quality.
Indicator 3: “data at the required level of quality”
The quality of data makes sense only in a particular context. The quality of data should be measured against requirements: until business users specify data requirements, no claims can be made about the quality of data. This indicator demonstrates positive trends similar to the previous indicators, as shown in Figure 6.
Figure 6: Trends in the delivery of data at the required level of quality.
The number of companies at the three higher maturity levels “in development,” “capable,” and “effective” has increased. Fewer companies have demonstrated the “uncontrolled” and “ad-hoc” levels of maturity.
An established data quality framework maintains the quality of data at the required level.
Indicator 4: “data quality roles and processes”
An established data quality framework includes data quality-related roles and processes. Such a framework enables the effective functioning of the data-quality capability.
The trends of the data quality framework indicator are positive to some extent, as seen in Figure 7.
Figure 7: Trends in the establishment of data quality framework.
The number of companies that assessed their level of maturity as “capable” and “effective” has grown. Fewer companies have no data quality roles and rules at all. Still, a significant percentage of respondents (32%) have roles and rules only at the “ad-hoc” level.
The overall level of data quality maturity slightly worsened in 2020 compared to 2019.
At the same time, trends remain positive.
The following facts express the positive trends:
The situation with the availability and timely delivery of data has improved significantly
The delivery of data at the required quality has also demonstrated significant development
The data quality framework indicator has demonstrated some positive shifts between the levels of maturity.
To improve the situation, companies should pay attention to:
Prioritizing data-quality initiatives focused on data critical for business operations
Defining data-quality requirements
Designing and implementing data-quality checks and controls along key data value chains.
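The three recommendations above can be sketched together: express data quality requirements as declarative rules and apply them at a step of a data value chain. This is a minimal illustration, not the assessment's method; the field names, rules, and the batch of records are hypothetical.

```python
# Illustrative only: data quality requirements expressed as declarative
# rules, checked at one step of a data value chain. Field names, rules,
# and records are hypothetical.

import re

RULES = [
    ("customer_id is present", lambda r: r.get("customer_id") is not None),
    ("email looks valid",
     lambda r: bool(re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", r.get("email", "")))),
]

def run_checks(records, rules):
    """Return {rule name: pass rate} for a batch of records."""
    results = {}
    for name, check in rules:
        passed = sum(1 for r in records if check(r))
        results[name] = passed / len(records) if records else 0.0
    return results

batch = [
    {"customer_id": 1, "email": "a@example.com"},
    {"customer_id": None, "email": "not-an-email"},
]

for name, rate in run_checks(batch, RULES).items():
    print(f"{name}: {rate:.0%}")
```

Keeping the rules as data rather than scattered `if` statements is what makes them a documentable deliverable: the same rule list can be reviewed by business stakeholders and reused at several points along the chain.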
This is the last article of the “Maturity Trends 2020” series.
Irina is a data management practitioner with more than 10 years of experience. The key areas of her professional expertise are the implementation of data management frameworks and data lineage.
Throughout the years, she has worked for global institutions as well as large- and medium-sized organizations in different sectors, including but not limited to financial institutions, professional services, and IT companies.