In the previous articles of this series, we discussed how to build a company-specific data management maturity assessment and how to benchmark the results for the data modeling sub-capability. Now we will focus on information systems architecture maturity.
In this article, I will share an in-depth approach to measuring and benchmarking the maturity level of the information systems architecture sub-capability. The benchmark results used in this article are based on the ‘Data Management Maturity Review 2019.’
We will cover the following four topics:
- Definition of the ‘information systems architecture’ sub-capability and its dimensions
- Specification of indicators (KPIs) for measuring performance
- Benchmarking results based on a set of indicators
- Development tips
Information Systems Architecture sub-capability and its dimensions
Information Systems Architecture is one of the five sub-capabilities of the ‘Orange’ data management model explained in Data Management Maturity 101: What is a data management maturity assessment, and why does a company need it? The overview of the model is shown in Figure 1.
In the context of the ‘Orange’ model, ‘information systems architecture’ is a business capability that ensures the deliverables of data and application architecture required for designing the ‘data and information value chain,’ as specified in the ‘Data Management Maturity Review 2019.’
Four dimensions enable a (sub-)capability: role, process, data (input and output), and tools.
In Figure 2, each dimension of the information systems architecture sub-capability is described in detail.
Figure 2. The dimensions of the information systems architecture sub-capability.
In our context, ‘data’ stands for the formal deliverables/artifacts of a data management sub-capability. The key deliverables of this sub-capability relate to the rules and roles that ensure the operation of the data management function.
It is worth mentioning that not all information systems architecture artifacts (as specified by TOGAF 9.2) are required for data management purposes. The key deliverables should include a repository of reports, dashboards, and report flows. Reports and dashboards are located at the end of the data chain and represent only one type of data/information usage. Analyzing and optimizing the use of data/information is important for data chain design.
Data chains can be documented at different levels. Usually, documentation starts with the data flows between systems and applications. Data sets reside in systems and applications, so documenting data flows at the level of data sets, with links to the corresponding applications, is recommended. Catalogs of systems, applications, and data sets should be created as the first step of such documentation.
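The catalog structure described above can be sketched in code. The following is a minimal, hypothetical Python sketch (the entity names and fields are illustrative assumptions, not prescribed by the ‘Orange’ model or TOGAF): data sets link to applications, data flows are documented at the data-set level, and application-level flows can then be derived automatically.

```python
from dataclasses import dataclass, field

# Hypothetical catalog entries; names and fields are illustrative only.
@dataclass
class Application:
    app_id: str
    name: str
    system: str  # the system hosting this application

@dataclass
class DataSet:
    dataset_id: str
    name: str
    app_id: str  # link to the application that holds the data set

@dataclass
class DataFlow:
    source_dataset: str
    target_dataset: str

@dataclass
class Catalog:
    applications: dict = field(default_factory=dict)
    datasets: dict = field(default_factory=dict)
    flows: list = field(default_factory=list)

    def add_application(self, app):
        self.applications[app.app_id] = app

    def add_dataset(self, ds):
        self.datasets[ds.dataset_id] = ds

    def add_flow(self, flow):
        self.flows.append(flow)

    def flows_between_applications(self):
        """Derive application-level flows from data-set-level documentation."""
        pairs = set()
        for f in self.flows:
            src = self.datasets[f.source_dataset].app_id
            tgt = self.datasets[f.target_dataset].app_id
            pairs.add((src, tgt))
        return pairs

# Example: one flow documented at data-set level yields one app-level flow.
catalog = Catalog()
catalog.add_application(Application("CRM", "Customer CRM", "SystemA"))
catalog.add_application(Application("DWH", "Data Warehouse", "SystemB"))
catalog.add_dataset(DataSet("DS1", "Customer master", "CRM"))
catalog.add_dataset(DataSet("DS2", "Customer dimension", "DWH"))
catalog.add_flow(DataFlow("DS1", "DS2"))
print(catalog.flows_between_applications())  # {('CRM', 'DWH')}
```

Documenting flows once, at the data-set level, and deriving the coarser application-level view avoids maintaining two separate documents that can drift apart.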
‘Process’ signifies a data management-related business process at different levels of abstraction.
The processes related to the information systems architecture will focus on developing, analyzing, and maintaining catalogs of systems, applications, reports, data sets, etc.
‘Role’ describes the participation of people in business operations. It can represent business units, functional jobs, a set of data management-related accountabilities and responsibilities (in the RACI context), etc.
Several important groups of roles are involved in performing information systems architecture activities.
First, there are data management-related roles such as data, application, and solution architects. They are accountable for creating the repositories of systems, applications, and interfaces. Business subject matter experts and business analysts must be involved in creating report catalogs; data management professionals may only assist with creating these catalogs/repositories.
‘Tools’ include information technology systems, applications, and resources required to perform the data management function, e.g., budget.
The most important requirement for documenting information systems artifacts is the ability to integrate the catalogs with other data management deliverables.
Specification of indicators (KPIs) to measure the performance
Each of the sub-capability dimensions described above can serve as a specific indicator (KPI) for measuring performance.
Each company can create its own maturity assessment by assigning maturity levels to its chosen indicators.
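The assessment mechanics above can be illustrated with a short sketch. This is a hypothetical example, not the actual scoring method of the Data Management Maturity Scan: indicator names and levels are illustrative, and a simple average is just one possible aggregation.

```python
# Hypothetical scoring sketch; indicator names and levels are illustrative.
MATURITY_LEVELS = range(1, 6)  # five maturity levels, 1 (lowest) .. 5 (highest)

indicator_scores = {
    "report (flow) optimization practices": 2,
    "information systems architecture optimization": 3,
    "master and reference data management": 3,
    "formal architecture function": 4,
}

def overall_maturity(scores: dict) -> float:
    """Validate each indicator's level and return a simple average."""
    for name, level in scores.items():
        if level not in MATURITY_LEVELS:
            raise ValueError(f"{name}: maturity level must be 1..5, got {level}")
    return sum(scores.values()) / len(scores)

print(overall_maturity(indicator_scores))  # 3.0
```

A company might instead weight indicators by business priority, or report the per-indicator profile rather than a single number; the key point is that each chosen indicator receives an explicit maturity level.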
I will demonstrate four indicators as examples. These indicators have been used as the foundation of our Data Management Maturity Scan:
Indicator 1 (process): the presence of report (flow) optimization practices
Creating report catalogs is not a goal in itself; the task serves two purposes.
The first is to optimize the number of reports by reducing them to the information actually required. Such optimization takes time, and it is not a one-off exercise: information needs evolve, and report development is an ongoing process. The second purpose is to establish a process that permanently enables this optimization.
Indicator 2 (process): the presence of the optimization practices of information systems architecture
Everything said above about report optimization practices also applies to optimizing systems, application, and data architecture.
Indicator 3 (process): the process of managing master and reference data
One of the important attention areas of data management is the management of reference and master data. Identifying master and reference data is only the first step in the process. Optimizing information systems architecture is closely related to optimizing reference and master data architecture. Therefore, the approach to managing master and reference data is highly important.
Indicator 4 (role): the presence of the formal function of information systems architecture
The formality of this capability means that some functional roles within an organization should take accountability for performing tasks related to the architecture.
Below are the benchmarking results for the four indicators (KPIs) mentioned above. You can use these indicators to quickly benchmark your company’s situation.
Each indicator has been evaluated at one of five maturity levels, reflecting its level of development.
The results presented in Figure 3 led us to the following conclusions:
- Almost 40% of companies don’t have any process for optimizing reports or information systems architecture. Yet, more companies focus on implementing and maintaining information systems architecture practices than reporting practices.
- Remarkably, more than 65% of respondents pay attention to establishing the management of master and reference data.
- More than 50% of companies have formalized the information systems architecture function. It is interesting to compare the results of the second and fourth indicators: the percentage of companies that have not formalized this function is larger than the percentage that lack an information systems optimization process.
To improve the situation with the information systems architecture, companies should:
…align the activities of the data modeling and information systems architecture capabilities. Data modeling focuses on recognizing data and information needs; information systems architecture ensures these needs are implemented in practice.
…specify critical data elements. This will make the optimization of reporting and architecture practices feasible and successful.
…invest in developing repositories for information systems architecture artifacts to design and optimize the information systems architecture.
The following article will provide an analysis of the Data Quality capability.
For more insights, visit the Data Crossroads Academy site.