In the previous articles of this series, we discussed how to build a company-specific data management maturity assessment and how to benchmark the results for the data modeling sub-capability. Now we will focus on information systems architecture maturity.
In this article, I will share an in-depth approach to measuring and benchmarking the maturity level of the information systems architecture sub-capability. The benchmark results used in this article are based on the ‘Data Management Maturity Review 2019.’
We will cover the following four topics:
- Definition of the ‘information systems architecture’ sub-capability and its dimensions
- Specification of indicators (KPIs) for measuring the performance
- Benchmarking results based on a set of indicators
- Development tips
Information Systems Architecture sub-capability and its dimensions
Information Systems Architecture is one of the five sub-capabilities of the ‘Orange’ model of data management, which is explained in Data Management Maturity 101: What is a data management maturity assessment and why does a company need it?. An overview of the model is shown in Figure 1.
In the context of the ‘Orange’ model, ‘information systems architecture’ is a business capability that ensures the deliverables of data and application architecture required for designing the ‘data and information value chain,’ as specified in ‘Data Management Maturity Review 2019.’
A (sub-)capability is enabled by the following dimensions: role, process, data (input and output), and tools.
In Figure 2, each dimension of the information systems architecture sub-capability is described in detail.
Figure 2. The dimensions of the information systems architecture sub-capability.
In our context, ‘data’ stands for the formal deliverables/artifacts of the data management sub-capability. The key deliverables of this sub-capability relate to the rules and roles that ensure the operation of the data management function.
It is worth mentioning that not all information systems architecture artifacts (as specified by TOGAF 9.2) are required for data management. The key deliverables should include a repository of reports, dashboards, and report flows. Reports and dashboards are located at the end of the data chain and represent only one type of data/information usage. The analysis and optimization of data/information usage are important for data chain design.
Data chains can be documented at different levels. Usually, documentation starts with the data flows between systems and applications. Data sets reside in systems and applications, so documenting data flows at the level of data sets, with links to the corresponding applications, is recommended. Catalogs of systems, applications, and data sets should be created as the first step of such documentation.
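As an illustration, this first documentation step can be sketched as a set of simple catalogs in which data flows are recorded at the data-set level, with each data set linked to its application and system. The class names, field names, and sample entries below are hypothetical, not part of any specific tool:

```python
from dataclasses import dataclass

# Hypothetical minimal catalog entries for the first documentation step:
# applications (linked to systems), data sets (linked to applications),
# and the data flows between data sets.

@dataclass
class Application:
    name: str
    system: str          # the system the application belongs to

@dataclass
class DataSet:
    name: str
    application: str     # the application in which the data set resides

@dataclass
class DataFlow:
    source: str          # name of the source data set
    target: str          # name of the target data set

# A tiny invented example catalog
applications = [Application("CRM", "Sales Platform"),
                Application("DWH", "Analytics Platform")]
data_sets = [DataSet("customers", "CRM"), DataSet("dim_customer", "DWH")]
flows = [DataFlow("customers", "dim_customer")]

def trace(data_set_name: str) -> str:
    """Trace a data set back to its application and system via the catalogs."""
    ds = next(d for d in data_sets if d.name == data_set_name)
    app = next(a for a in applications if a.name == ds.application)
    return f"{ds.name} -> {app.name} -> {app.system}"

print(trace("dim_customer"))  # dim_customer -> DWH -> Analytics Platform
```

Documenting flows at this level of detail is what later makes integration with other data management deliverables possible.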
‘Process’ signifies a data management-related business process at different levels of abstraction.
The processes related to information systems architecture focus on the development, analysis, and maintenance of catalogs of systems, applications, reports, data sets, etc.
‘Role’ describes the participation of people in business operations. It can represent business units, functional jobs, a set of data management-related accountabilities and responsibilities (in RACI context), etc.
Several important groups of roles are involved in performing information systems architecture.
First, there are data management-related roles such as data, application, and solution architects. They are accountable for creating the repositories of systems, applications, and interfaces. For report catalogs, business subject matter experts and business analysts have to be involved; data management professionals may only assist with creating these catalogs/repositories.
‘Tools’ include information technology systems and applications, as well as resources required for performing the data management function, e.g., budget.
The most important requirement for the documentation of information systems artifacts is the ability to integrate the catalogs with other data management deliverables.
Specification of indicators (KPIs) to measure the performance
Each of the sub-capability dimensions described above can serve as a specific indicator (KPI) to measure performance.
By assigning maturity levels to the chosen indicators, each company can create its own maturity assessment.
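To make this concrete, such an assessment can be sketched as a simple mapping of indicators to maturity levels. The indicator names, the level names, and the use of a plain average as the overall score are illustrative assumptions, not the article's published methodology:

```python
# Hypothetical sketch of a company-specific maturity assessment:
# each chosen indicator is scored on a five-level scale.
# Level names are assumed (CMM-style), not taken from the article.
MATURITY_LEVELS = {1: "initial", 2: "repeatable", 3: "defined",
                   4: "managed", 5: "optimized"}

# Invented example scores for the four indicators discussed below.
assessment = {
    "report optimization practices": 2,
    "architecture optimization practices": 3,
    "master/reference data management": 3,
    "formal architecture function": 4,
}

# One simple way to summarize: average the indicator scores.
overall = sum(assessment.values()) / len(assessment)

for indicator, level in assessment.items():
    print(f"{indicator}: level {level} ({MATURITY_LEVELS[level]})")
print(f"overall maturity: {overall:.1f}")  # overall maturity: 3.0
```

A company could equally well report the minimum score instead of the average if it wants the weakest dimension to drive attention.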
I will demonstrate four indicators as examples. These indicators have been used as the foundation of our Data Management Maturity Scan:
Indicator 1 (process): the presence of report (flow) optimization practices
The creation of report catalogs is not a purpose in itself. This task serves two goals.
The first goal is to optimize the number of reports by reducing them to the information that is actually required. Such optimization may take some time, and it is not a one-off exercise: information needs and the development of reports evolve continuously. The second goal is to establish a process that enables such optimization permanently.
Indicator 2 (process): the presence of optimization practices for the information systems architecture
Everything said above about report optimization practices also applies to the optimization of systems, application, and data architecture.
Indicator 3 (process): the process to manage master and reference data
One of the important focus areas of data management is the management of reference and master data. Recognizing master and reference data is only the first step in the process. The optimization of information systems architecture is closely related to the optimization of reference and master data architecture. Therefore, the process to manage master and reference data is of high importance.
Indicator 4 (role): the presence of a formal information systems architecture function
The formality of this capability means that certain functional roles within an organization take accountability for performing architecture-related tasks.
Benchmarking results based on a set of indicators
Below you will find the benchmarking results for the four above-mentioned indicators (KPIs). You can use these four indicators to quickly benchmark your company's situation against these results.
Each indicator has been evaluated at one of five maturity levels that indicate its degree of development.
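As a sketch of how benchmark figures like those below can be derived, the percentage for an indicator is simply the share of respondents at or above a given maturity level. The sample scores and the threshold choice here are invented for illustration and are not the survey's actual data:

```python
# Hypothetical benchmark aggregation: each respondent scores an indicator
# at one of five maturity levels (1-5); the benchmark reports the share
# of respondents at or above a chosen threshold level.

def share_at_or_above(scores, threshold):
    """Fraction of respondents scoring at or above `threshold`."""
    return sum(1 for s in scores if s >= threshold) / len(scores)

# Invented sample: maturity scores of 10 respondents for one indicator.
scores = [1, 1, 2, 3, 3, 3, 4, 4, 5, 5]

# e.g., share of respondents with at least some defined practice (level >= 2)
print(f"{share_at_or_above(scores, 2):.0%}")  # 80%
```

The same aggregation, run per indicator, yields the kind of percentages discussed in the conclusions below.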
The results presented in Figure 3 led us to the following conclusions:
- Almost 40% of companies don't have any process for optimizing either reports or the information systems architecture. Still, more companies pay attention to implementing and maintaining information systems architecture practices than reporting practices.
- Remarkably, more than 65% of respondents pay attention to establishing the management of master and reference data.
- More than 50% of companies have formalized the information systems architecture function. It is interesting to compare the results of the second and fourth indicators: the percentage of companies that have not formalized this function is larger than the percentage that don't have an information systems optimization process in place.
Development tips
To improve the situation with the information systems architecture, companies should:
- align the activities of the data modeling and information systems architecture capabilities. Data modeling focuses on recognizing data and information needs; information systems architecture ensures the implementation of these needs in practice.
- specify critical data elements. This will make the optimization of reporting and architecture practices feasible and successful.
- invest in the development of repositories for information systems architecture artifacts, in order to be able to design and optimize the information systems architecture.
In the next article, I will provide an analysis of the data quality capability.