Data Crossroads has published the Data Management Maturity Assessment Review for the second year in a row. In the previous article of this series, we explained the methodology for measuring maturity and demonstrated the general trends in data management and data governance. This article examines the trends in data modeling maturity.
In this article, we will:
- Explain the structure of this capability
- Demonstrate the general trends in data modeling maturity
- Investigate changes in four key performance indicators
In the “Orange” model of data management, data modeling is a business capability that describes data using data models. Data models assist in defining information and data requirements, and documenting data structures at different levels of abstraction.
Four components enable this capability. These are process, role, data/deliverable, and tool. Each component consists of a set of items. The full list of items per component can be seen in Figure 1.
The logic behind this model is simple. For example, a conceptual data model is a deliverable of this capability. A process to develop, document, and maintain data models delivers this artifact. Data modelers take part in this process and produce conceptual models. A data modeling application assists in designing, integrating, and maintaining these models.
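The relationship between the four components can be sketched as a simple data structure. This is only an illustration of the logic described above; the item names below are hypothetical examples, not the full list from Figure 1.

```python
from dataclasses import dataclass, field

@dataclass
class Capability:
    """A business capability with the four components of the "Orange" model."""
    name: str
    processes: list = field(default_factory=list)
    roles: list = field(default_factory=list)
    deliverables: list = field(default_factory=list)
    tools: list = field(default_factory=list)

# Data modeling expressed in terms of the four components.
data_modeling = Capability(
    name="Data modeling",
    processes=["Develop, document, and maintain data models"],
    roles=["Data modeler"],
    deliverables=["Conceptual data model"],
    tools=["Data modeling application"],
)
```

Reading the structure back gives the chain from the text: the process delivers the conceptual model, the data modeler performs the process, and the tool supports it.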
Now let us take a look at the general trends in data modeling capability maturity.
General trends in data modeling maturity
In the first article of this series, we saw the general changes in data management maturity. The maturity of data modeling reached a slightly higher level in 2020 compared to 2019, as shown in Figure 2.
The results in Figure 3 demonstrate some shifts between the levels of maturity.
The overall maturity level of data modeling has improved for two reasons. First, the number of respondents at the two lowest levels of maturity has decreased. Second, many more companies have moved to the “in development” or “capable” stages.
The “Orange” maturity scan uses four performance indicators to assess the maturity of this capability. Let us take an in-depth look at each of them.
Indicator 1: “business glossary”
A business glossary assists in creating a common language within a company. It makes it easier, for example, to gather data requirements and to resolve data quality issues.
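A minimal sketch of how a glossary creates a common language: each term carries an agreed definition and the synonyms used across departments, so every variant resolves to the same meaning. The entries and function below are hypothetical illustrations, not part of the “Orange” model.

```python
# Toy business glossary: term -> agreed definition and known synonyms.
glossary = {
    "customer": {
        "definition": "A party that has purchased at least one product.",
        "synonyms": ["client", "account holder"],
    },
}

def lookup(term: str) -> str:
    """Resolve a term, or any of its synonyms, to the agreed definition."""
    term = term.lower()
    for name, entry in glossary.items():
        if term == name or term in entry["synonyms"]:
            return entry["definition"]
    raise KeyError(f"'{term}' is not defined in the glossary")
```

With such a lookup, “client” and “customer” resolve to one definition, which is exactly the ambiguity-reducing effect the text describes.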
Figure 4 demonstrates the trends in the development of a business glossary. The situation at the two lowest maturity levels, “uncontrolled” and “ad hoc,” has improved: the number of respondents at these levels has decreased. The number of companies at the highest level of maturity has increased significantly.
A business glossary is only the first step in assessing the maturity of data modeling. Data models are the next step.
Indicator 2: “data models”
Data models describe data at different levels of abstraction. Data elements and the relationships between them form the models’ content. Data models serve several purposes. First, they allow for structuring data and reducing ambiguity in data elements and relationships. Second, they assist in finding dependencies between information and data requirements.
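The idea of a model as elements plus relationships can be shown with a toy conceptual model. The entities and relationship names here are illustrative assumptions, chosen only to demonstrate how dependencies can be traced from such a structure.

```python
# A toy conceptual data model: entities (data elements) and the
# named relationships between them.
entities = {"Customer", "Order", "Product"}
relationships = [
    ("Customer", "places", "Order"),
    ("Order", "contains", "Product"),
]

def depends_on(entity: str) -> set:
    """Entities that the given entity directly relates to."""
    return {target for source, _, target in relationships if source == entity}
```

Tracing `depends_on("Customer")` shows that customer information depends on order data, which is the kind of dependency between information and data requirements the text mentions.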
Figure 5 shows a marked improvement in the situation with data models.
The decrease in respondents at the two lowest levels and the increase at the three higher levels confirm this positive trend.
Data models serve to ease the gathering of information and data requirements.
Indicator 3: “information and data requirements”
The data value creation cycle demonstrates the dependency between information and data. To provide the required information, data management delivers and transforms corresponding source data. Information and data models serve to link information to the data required to produce it. Therefore, gathering information and data requirements is an important maturity indicator.
The trends with data and information requirements follow the trends with data models. Figure 6 demonstrates these trends.
The number of participants within the three lowest levels has decreased, and there is a substantial increase in participants at the highest levels of maturity.
Identifying critical data allows a company to scope data management initiatives effectively. The use of this technique demonstrates the maturity of the data modeling capability.
Indicator 4: “critical data identified”
Critical data is a means to scope and prioritize data management initiatives. Data elements have different levels of business criticality within diverse contexts. For example, personal data is the most critical for any data initiative related to compliance with personal data regulations. The same data can be less important in the finance context, where it may not be needed.
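Context-dependent criticality can be sketched as a simple rating table that an initiative queries to decide its scope. The data elements, contexts, and ratings below are hypothetical examples echoing the text, not survey data.

```python
# Hypothetical criticality ratings: (data element, business context) -> level.
criticality = {
    ("personal_data", "compliance"): "high",
    ("personal_data", "finance"): "low",
    ("ledger_entries", "finance"): "high",
}

def scope(context: str, threshold: str = "high") -> list:
    """Data elements to prioritize for an initiative in the given context."""
    return [elem for (elem, ctx), level in criticality.items()
            if ctx == context and level == threshold]
```

The same element (`personal_data`) is in scope for a compliance initiative but not for a finance one, mirroring the example in the paragraph above.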
Figure 7 shows that the situation with critical data improved in 2020 compared to 2019.
The patterns of change are similar to those discussed above for other indicators.
These results bring us to the following conclusions.
- The general level of data modeling maturity grew in 2020 compared to 2019.
- The changes in data modeling demonstrate positive trends. More companies have paid attention to the development of business glossaries and data models. They have also put more effort into gathering and aligning information and data requirements. Companies used the concept of critical data more often in 2020.
We are in the era of big data and big changes in data application technologies, a situation that requires new methodologies to model data. These changes motivate companies to put more effort and resources into the development of data modeling. Therefore, companies should:
- Continue the development of a company-wide business glossary to improve business communication
- Review existing methods and implement new data modeling techniques
- Optimize your data requirements by designing data models for the sets of critical data
In the following article, we will investigate the trends in information systems architecture maturity.
Compare the maturity status of your company by performing a Data Management Maturity Scan.