Once upon a time, there was a company whose data management practices were based on a particular model. One day, they invited external consultants to assess the maturity of their data management. The consulting company began by comparing different data management maturity models but ended up using its own model for the assessment. I think you can imagine how effective this assessment turned out to be and how accurate the results were.

No one doubts that a maturity assessment can be very useful. It can be used for multiple purposes, such as:

  • evaluating your company’s current performance
  • defining your ‘to be’ situation
  • assessing the gap between the current and the ‘to be’ situation, producing results that form the foundation for an action plan (a minimal sketch of such a gap calculation follows this list)
  • comparing your company with its peers and identifying the best practices in your industry.
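
The third purpose, gap assessment, is simple enough to show in code. Below is a minimal sketch in Python; the domain names and the scores are invented purely for illustration.

```python
# Hypothetical maturity scores per subject domain (scale 1-5).
current = {"Data governance": 2, "Data quality": 1, "Metadata": 3}
target  = {"Data governance": 4, "Data quality": 3, "Metadata": 3}

# The gap per domain is the raw input for an action plan.
gaps = {domain: target[domain] - current[domain] for domain in current}

# Address the biggest gaps first.
for domain, gap in sorted(gaps.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{domain}: close {gap} level(s)")
```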

In data management (DM), we have plenty of maturity models; the most well-known are DAMA-DMBOK [1], DCAM [2], the Data Management Maturity (DMM) model by CMMI [3], the IBM Data Governance Council Maturity Model [4], the Stanford Data Governance Maturity Model [5], and Gartner’s Enterprise Information Management Maturity Model [6].

When I look at all of these models as a data management practitioner, two questions pop into my mind:

  1. What are the key components (or metamodel) of a maturity model for data management?
  2. Are all of these models compatible with each other? Can we compare our results and performance if I use one model and my peer uses another one?


The key components of a data management maturity model

In order to compare different models, we need to agree on our understanding of the metamodel of a DM maturity model. As a reference, I took the papers on maturity models by Carnegie Mellon University [7] and one by the Institute of Internal Auditors [8].

From the information in these sources, I have identified four key components that comprise the metamodel (sketched in code after the list):

  • Levels
    …that are progression stages in data management.
  • Subject domains and sub-domains
    …which I specify as DM business capabilities. According to the Open Group definition, a capability is ‘an ability [..] that a business may possess or exchange to achieve a specific purpose or outcome’ and is constituted from roles, processes, information, and tools [9].
  • Subject domain dimensions
    …that characterize each (sub-)domain and are used as assessment criteria.
  • Artifacts
    …that are benchmarks or examples of practices or deliverables for domain dimensions.
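
Put together, the four components form a small data model. Below is a minimal sketch in Python; the class and field names are my own illustration and do not come from any of the models discussed.

```python
from dataclasses import dataclass, field

@dataclass
class Artifact:
    """A benchmark or example deliverable for one dimension at one maturity level."""
    dimension: str    # e.g. 'Processes'
    level: int        # the maturity level this artifact evidences
    description: str

@dataclass
class SubjectDomain:
    """A DM business capability, possibly split into sub-domains."""
    name: str
    dimensions: list[str] = field(default_factory=list)  # roles, processes, information, tools, ...
    sub_domains: list["SubjectDomain"] = field(default_factory=list)
    artifacts: list[Artifact] = field(default_factory=list)

@dataclass
class MaturityModel:
    """Levels plus domains (with their dimensions and artifacts) make up the metamodel."""
    name: str
    levels: list[str]             # ordered progression stages
    domains: list[SubjectDomain]
```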

Now we are equipped to compare the maturity models mentioned above.


Comparing DM maturity models

Before going deeper, I would like to draw your attention to two observations that affect this overview of the models:

  1. Information about the models is not always publicly available on the internet in the full scope required for the comparison.
  2. Some models, like DAMA, DCAM, and CMMI, are applied to data management as a whole, while the ones by IBM, Stanford, and Gartner focus on data/information governance aspects. The reasons for these differences in scope are not entirely clear.

DM maturity model levels

The first component of the metamodel of DM maturity is the number and names of the maturity levels. While the DAMA and DCAM models offer 6 levels, the IBM and Stanford models go for 5. Only the IBM and Stanford models agree on the names; DAMA and DCAM each take their own approach. No information is available about the CMMI and Gartner models. So, as we can see, the first component of the metamodel is already not mutually agreed upon.
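
Even the level scales themselves are mechanically incompatible. A naive fix is to rescale a score from one scale onto another linearly, as in the sketch below (my own illustration); note that it says nothing about whether the level definitions actually match.

```python
def rescale_level(level: int, from_levels: int, to_levels: int) -> float:
    """Linearly map a score from one maturity scale onto another."""
    return 1 + (level - 1) * (to_levels - 1) / (from_levels - 1)

# Level 4 on a 6-level scale lands between levels 3 and 4 of a 5-level scale:
print(rescale_level(4, from_levels=6, to_levels=5))  # 3.4
```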

DM maturity model subject domains

The next component of the DM maturity metamodel is subject domains and sub-domains.

The first big challenge is the type of these domains. In the DAMA model, it is a Knowledge Area. In DCAM, it is a business capability. The CMMI and Stanford models take a Process as the basis, and IBM discusses Competency. Since the definitions of these domain types are not aligned, you get the feeling that you are comparing ‘apples’ with ‘pears.’

The second challenge is the number of domains, which also shows striking differences. Here is a quick overview:

The key question, of course, is: what about the content of these domains and sub-domains? Does it vary just as much as their number?

The third challenge is the content of the domains. I have used the metamodel of data management introduced in one of my earlier articles and made the comparison by matching the domains from different models against the DM metamodel; a simplified version of such a crosswalk is sketched below.
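
In code, such a comparison boils down to a crosswalk table. The sketch below is deliberately simplified; the domain names and their mapping onto subject areas are hypothetical and do not reproduce my actual comparison.

```python
# Hypothetical crosswalk: each model's domains mapped onto the
# subject areas of the common DM metamodel.
crosswalk = {
    "DAMA": {"Data Governance": "Governance",
             "Data Quality": "Data Quality"},
    "DCAM": {"Data Management Strategy": "Governance",
             "Data Quality Program": "Data Quality"},
}

# Invert the mapping to see which models cover a given subject area.
coverage: dict[str, set[str]] = {}
for model, domains in crosswalk.items():
    for domain, subject_area in domains.items():
        coverage.setdefault(subject_area, set()).add(model)

for subject_area, models in coverage.items():
    print(f"{subject_area}: covered by {sorted(models)}")
```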


At first glance, you can see seven key Subject Areas where the Subject domains are located. These are:

  • Data
  • Data and System Design
  • Technology
  • Governance
  • Data Quality
  • Security
  • Related Capabilities.

You can see that the differences in approaches to defining the key Domains are rather significant. It is not the purpose of this article to deliver a detailed analysis. Still, there is one striking observation I would like to share: the Subject domains and the deliverables of these domains get mixed. For example, let us look at Data governance. The domain ‘Data governance’ exists in four different models, whereas ‘Data management strategy,’ which appears as a domain in three models, is considered a deliverable of the Data governance domain in others, such as the DAMA model.

Such a big difference of opinions on key Subject domains is somewhat confusing.

Subject domain dimensions

Subject domain dimensions are characteristics of (sub-) domains. It is essential to know them as they form a base for describing Domain artifacts and maturity levels. For a business capability model, the key dimensions are roles, processes, information, and tools.

Information about the dimensions is available only for the DAMA and Stanford models. The DAMA model specifies Activity, Tools, Standards, and People and Resources; Stanford has come up with People, Policies, and Capabilities. So, you can see that there is no alignment even between these two models.
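
The mismatch is easy to verify if you take the published dimension names literally: the two sets do not overlap at all. Here is a tiny check in Python (treating ‘People’ and ‘People and Resources’ as distinct, which is itself a judgement call):

```python
dama     = {"Activity", "Tools", "Standards", "People and Resources"}
stanford = {"People", "Policies", "Capabilities"}

# The literal intersection is empty -- aligning the two sets of
# dimensions requires human judgement, not string matching.
print(dama & stanford)  # set()
```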

DM maturity artifacts

The artifact is the last component of the DM maturity model. A description of artifacts, or in other words, a description of the level of maturity per Subject (sub-)domain and dimension, could only be found in the DCAM and Stanford models. Considering that the Subject domains and their dimensions differ between models, it is hardly possible to compare the outcomes of different models.


Conclusions
  1. Several Data management/governance maturity models are available but can hardly be compared. The differences can be found in each of the four maturity metamodel components: levels, subject (sub-)domains, subject domain dimensions, and artifacts.
  2. Each company that would like to assess its maturity should first align the DM model it uses with the DM maturity model chosen for the assessment.
  3. The current situation with maturity models hardly allows us to reach one of their key goals: creating universal benchmarks across different companies.

For more insights, visit the Data Crossroads Academy site: https://academy.datacrossroads.nl.

———————————————

References:

  1. DAMA-DMBOK
  2. DCAM
  3. https://cmmiinstitute.com/data-management-maturity
  4. https://www-935.ibm.com/services/uk/cio/pdf/leverage_wp_data_gov_council_maturity_model.pdf
  5. https://web.stanford.edu/dept/pres-provost/cgi-bin/dg/wordpress/wp-content/uploads/2011/11/StanfordDataGovernanceMaturityModel.pdf
  6. https://blogs.gartner.com/andrew_white/files/2016/10/On_site_poster.pdf
  7. https://resources.sei.cmu.edu/library/asset-view.cfm?assetid=58916
  8. https://www.iia.nl/SiteFiles/IIA_leden/PG%20Maturity%20Models.pdf
  9. The Open Group Guide ‘Business Capabilities.’
