“Maturity is a high price to pay for growing up” 
– Tom Stoppard

Once upon a time, there was a company. Their data management practices were based on a certain model. One day, they invited external consultants to assess the maturity of their data management. The consulting company used its own data management model for the assessment. I think you can imagine how effective this assessment turned out to be and how accurate the results were.

No one doubts that a maturity assessment can be very useful. It can be used for multiple purposes, such as:

  • evaluating your company’s current performance
  • defining your ‘to be’ situation
  • assessing the gap between the current and the ‘to be’ situation, producing results that form the foundation for an action plan (see the sketch after this list)
  • comparing your company with its peers and specifying the best practices in your industry.
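
To make the third purpose concrete, here is a minimal sketch of such a gap assessment in Python; the domain names, the 1-5 level scale and the scores are purely hypothetical and not taken from any specific model.

    # Hypothetical maturity levels (1-5) per subject domain; both the
    # domain names and the scores are illustrative only.
    current = {'Data Governance': 2, 'Data Quality': 3, 'Data Security': 1}
    target = {'Data Governance': 4, 'Data Quality': 4, 'Data Security': 3}

    # Gap per domain = target level minus current level; the largest
    # gaps are natural candidates for the action plan.
    gaps = {domain: target[domain] - current[domain] for domain in current}
    for domain, gap in sorted(gaps.items(), key=lambda item: -item[1]):
        print(f'{domain}: gap of {gap} level(s)')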

In data management (DM), we have plenty of maturity models; the best known are: DAMA-DMBOK1, DCAM2, the Data Management Maturity (DMM) model by CMMI Institute3, the IBM Data Governance Council Maturity Model4, the Stanford Data Governance Maturity Model5 and Gartner’s Enterprise Information Management Maturity Model6.

When I look at all of these models as a data management practitioner, two questions pop into my mind:

  1. What are the key components (or metamodel) of a maturity model for data management?
  2. Are all of these models compatible with each other? If I use one model and my peer uses another one, can we compare our results and performance?

 

The key components of a data management maturity model

In order to compare different models, we need to agree on our understanding of the metamodel of a DM maturity model. As a reference, I took a paper on maturity models by Carnegie Mellon University7 and one by the Institute of Internal Auditors8.

From the information I found in these sources, I have identified four key components that comprise the metamodel:

  • Levels
    …that are progression stages in data management.
  • Subject domains and sub-domains
    …which I specify as DM business capabilities. According to the Open Group definition, a capability is ‘an ability […] that a business may possess or exchange to achieve a specific purpose or outcome’ and which is constituted from roles, processes, information and tools.9
  • Subject domain dimensions
    …that characterize each (sub-)domain and are used as assessment criteria.
  • Artifacts
    …that are benchmarks or examples of practices or deliverables for domain dimensions.
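
To make these four components tangible, here is a minimal sketch of the metamodel as Python data classes; the class and field names are my own illustrative assumptions, not a structure prescribed by any of the models discussed.

    from dataclasses import dataclass, field

    @dataclass
    class Artifact:
        # A benchmark or example of a practice or deliverable
        # for a domain dimension.
        name: str
        description: str

    @dataclass
    class Dimension:
        # An assessment criterion characterizing a (sub-)domain,
        # e.g. roles, processes, information or tools.
        name: str
        artifacts: list = field(default_factory=list)

    @dataclass
    class SubjectDomain:
        # A DM business capability; may be split into sub-domains.
        name: str
        dimensions: list = field(default_factory=list)
        sub_domains: list = field(default_factory=list)

    @dataclass
    class MaturityModel:
        # Ordered progression stages plus the subject domains
        # they are assessed against.
        name: str
        levels: list = field(default_factory=list)
        domains: list = field(default_factory=list)

Read this way, a maturity assessment scores each dimension of each (sub-)domain against the artifacts defined for every level.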

Now we are equipped to compare the maturity models mentioned above.

 

Comparing DM maturity models

Before going deeper, I would like to draw your attention to two observations that affect the comparison of the models:

  1. Full information about the models, which would be required for the comparison, is not always publicly available.
  2. Some models, such as DAMA, DCAM and CMMI, address data management as a whole, while those by IBM, Stanford and Gartner focus on data/information governance aspects. The reasons for these differences in scope are not quite clear.

DM maturity model levels

The first component of the metamodel of DM maturity is the number and the names of the maturity levels. While the DAMA and DCAM models offer 6 levels, the IBM and Stanford models go for 5. Only the IBM and Stanford models agree on the names; the DAMA and DCAM models each take their own approach. For the CMMI and Gartner models, no information is available. So even the first component of the metamodel is not mutually agreed upon.

DM maturity model subject domains

The next component of the DM maturity metamodel is subject domains and sub-domains.

The first big challenge is the type of these domains. In the DAMA model, it is a Knowledge Area; in DCAM, a business capability. The CMMI and Stanford models take a Process as the basis, and IBM talks about a Competency. Since the definitions of these domain types are not aligned, you get the feeling that you are comparing ‘apples’ with ‘pears’.

The second challenge is the number of domains and sub-domains, which also shows striking differences between the models.

The key question, of course, is: what about the content of these domains and sub-domains? Does it vary just as much as their number?

The third challenge is the content of the domains. To compare it, I used the metamodel of data management that I introduced in one of my earlier articles and matched the domains from the different models against it.
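
As an illustration of this matching exercise, here is a minimal sketch; the abbreviated domain lists and the Subject Area mapping are illustrative assumptions, not the complete content of either model.

    from collections import defaultdict

    # Illustrative only: a few domains from two models, mapped onto
    # Subject Areas of the DM metamodel; the real mapping is far larger.
    model_domains = {
        'DAMA': ['Data Governance', 'Data Quality', 'Data Security'],
        'DCAM': ['Data Governance', 'Data Management Strategy', 'Data Quality'],
    }
    subject_area = {
        'Data Governance': 'Governance',
        'Data Management Strategy': 'Governance',
        'Data Quality': 'Data Quality',
        'Data Security': 'Security',
    }

    # Group each model's domains by Subject Area to make overlaps and
    # divergences between the models explicit.
    by_area = defaultdict(dict)
    for model, domains in model_domains.items():
        for domain in domains:
            by_area[subject_area[domain]].setdefault(model, []).append(domain)

    for area in sorted(by_area):
        print(area, by_area[area])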

 

At first glance, you can see that there are seven key Subject Areas in which the Subject domains are located. These are:

  • Data
  • Data and System Design
  • Technology
  • Governance
  • Data Quality
  • Security
  • Related Capabilities.

You can see that the differences in approaches to defining the key Domains are rather big. It is not the purpose of this article to deliver a detailed analysis, but there is one striking observation I would like to share: the Subject domains and the deliverables of these domains are mixed with one another. For example, let us have a look at Data governance. The domain ‘Data governance’ exists in four different models, while ‘Data management strategy’ appears as a domain in three models but is considered a deliverable of the Data governance domain in others, for example in the DAMA model.

Such a big difference of opinions on the key Subject domains is rather confusing.

Subject domain dimensions

Subject domain dimensions are characteristics of (sub-)domains. It is important to know them, as they form the basis for the description of Domain artifacts and maturity levels. For a business capability model, for example, the key dimensions are roles, processes, information and tools.

Information about the dimensions is available only for the DAMA and Stanford models. The DAMA model specifies Activity, Tools, Standards, and People and Resources; Stanford has come up with People, Policies and Capabilities. So even when you compare just these two models, there is no alignment.

DM maturity artifacts

Artifacts are the last component of the DM maturity metamodel. The description of artifacts, or in other words, the description of the level of maturity per Subject (sub-)domain and Subject domain dimension, could only be found in the DCAM and Stanford models. Taking into account that the Subject domains and their dimensions differ between models, it is hardly possible to compare the outcomes of different models.

 

Conclusions
  1. There are several Data management / governance maturity models available, but they can hardly be compared. The differences can be found in each of the four maturity metamodel components, which are: levels, subject (sub-)domains, subject domain dimensions and artifacts.
  2. Each company that would like to assess its maturity should first align its own DM model with the maturity model used for the assessment.
  3. The current situation with maturity models hardly allows us to reach one of their key goals: creating universal benchmarks for comparing different companies.

 

———————————————

References:

  1. DAMA
  2. DCAM
  3. https://cmmiinstitute.com/data-management-maturity
  4. https://www-935.ibm.com/services/uk/cio/pdf/leverage_wp_data_gov_council_maturity_model.pdf
  5. https://web.stanford.edu/dept/pres-provost/cgi-bin/dg/wordpress/wp-content/uploads/2011/11/StanfordDataGovernanceMaturityModel.pdf
  6. https://blogs.gartner.com/andrew_white/files/2016/10/On_site_poster.pdf
  7. https://resources.sei.cmu.edu/library/asset-view.cfm?assetid=58916
  8. https://www.iia.nl/SiteFiles/IIA_leden/PG%20Maturity%20Models.pdf
  9. The Open Group Guide ‘Business Capabilities’.