This article describes trending topics in Data Quality (DQ), Master Data Management (MDM), data risk, and data culture in 2025.

This is the fifth and last article in a series where I share top trending topics, my impressions, and key insights gathered during the #DGIQ and #EDW2025 conference, hosted by @Dataversity. This series offers a general summary and does not focus on any specific presentation from the conference.

In this article, I will focus on several data management capabilities discussed at the conference:

  • Data Quality
  • Master Data Management
  • Operational Resilience and Risk Preparedness
  • Data Culture and Skills

Data Quality

Data quality risks are a leading cause of project failure, yet they remain consistently underestimated.

Despite growing awareness, many organizations still overlook the complexity and severity of data-related risks. Common issues include poor documentation, unrealistic assumptions about existing data quality, and an overemphasis on systems and processes at the expense of the actual state of the data itself. These risks often remain hidden until late in a project, when they are far more costly to address. Other risk factors include data duplication, semantic mismatches, and gaps between source and target data models. The impact extends beyond technical failures: delays, budget overruns, compliance violations, and reputational damage are all possible consequences. Organizations must recognize that proactively addressing data quality is not optional; it is essential for delivering business value and avoiding failure.

Data quality metrics and scores serve distinct purposes—metrics measure, while scores summarize.

Effective data quality management depends on both granular metrics and high-level scores. Metrics evaluate specific aspects of data quality—such as completeness, accuracy, or consistency—by applying predefined rules and logic. These detailed measurements enable organizations to pinpoint areas where issues exist and track progress over time. However, raw metrics can be challenging to interpret, especially for non-technical stakeholders. This is where scores come in. A data quality score aggregates multiple metrics into a single, weighted number that reflects the overall health or trustworthiness of a dataset. While metrics inform technical action, scores support broader communication and prioritization. Together, they provide the clarity and visibility needed to build trust, monitor change, and drive continuous improvement. Choosing the right balance between metrics and scores—and ensuring both are explainable and actionable—is key to embedding data quality in decision-making.
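
To make the distinction concrete, here is a minimal sketch; the metric names, rules, and weights are illustrative assumptions rather than a prescribed method:

```python
# Minimal sketch: granular DQ metrics rolled up into one weighted score.
# Metric names, rules, and weights are illustrative assumptions.

records = [
    {"id": 1, "email": "a@example.com", "country": "DE"},
    {"id": 2, "email": None,            "country": "Germany"},
    {"id": 3, "email": "c@example.com", "country": "DE"},
]

def completeness(rows, field):
    """Share of rows where the field is populated."""
    return sum(r[field] is not None for r in rows) / len(rows)

def consistency(rows, field, allowed):
    """Share of rows whose value conforms to a reference list."""
    return sum(r[field] in allowed for r in rows) / len(rows)

# Granular metrics: point data teams at specific, fixable problems.
metrics = {
    "email_completeness": completeness(records, "email"),
    "country_consistency": consistency(records, "country", {"DE", "FR", "US"}),
}

# Weighted roll-up: a single number for communication and prioritization.
weights = {"email_completeness": 0.4, "country_consistency": 0.6}
score = sum(value * weights[name] for name, value in metrics.items())

print(metrics)                     # per-metric detail for technical action
print(f"DQ score: {score:.0%}")    # summary for non-technical stakeholders
```

The design point is that the score is derived from, and explainable by, the underlying metrics: a stakeholder who asks why the score dropped can always be shown the specific metric responsible.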

Managing data quality across the Medallion architecture requires targeted controls at every layer.

A Medallion architecture organizes data into three layers: Bronze (raw ingestion), Silver (cleansed and standardized), and Gold (refined, analytics-ready data). Each layer has distinct data quality requirements and risks. At the Bronze stage, the priority is to detect ingestion errors and basic anomalies before they pollute downstream processes. In the Silver layer, quality checks become more complex, targeting data consistency, schema integrity, and conformance to business logic. By the time data reaches the Gold layer, advanced validation ensures that aggregations, KPIs, and analytical outputs are accurate and trustworthy. Automated tests, continuous monitoring, and regression testing are essential throughout the pipeline. This layered approach enables early detection, faster resolution, and builds trust in the data products delivered to end users.
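
As an illustration of this layered approach (the specific checks below are assumptions; real pipelines would typically use a dedicated DQ framework), quality gates might look like progressively stricter validation functions wired between layers:

```python
# Illustrative sketch of layer-specific quality gates in a Medallion pipeline.
# The concrete checks are assumptions, not a reference implementation.

def bronze_checks(raw_rows):
    """Bronze: catch ingestion errors and basic anomalies early."""
    assert raw_rows, "empty ingestion batch"
    return [r for r in raw_rows if r.get("order_id") is not None]

def silver_checks(rows):
    """Silver: enforce schema integrity and business logic."""
    for r in rows:
        assert isinstance(r["amount"], (int, float)), "amount must be numeric"
        assert r["amount"] >= 0, "negative order amount"
    return rows

def gold_checks(kpis):
    """Gold: validate aggregations before publishing analytics."""
    assert abs(kpis["revenue"] - sum(kpis["revenue_by_region"].values())) < 1e-6, \
        "regional revenue does not reconcile with total"
    return kpis

raw = [{"order_id": 1, "amount": 100.0, "region": "EU"},
       {"order_id": None, "amount": 50.0, "region": "US"}]  # dropped at Bronze

clean = silver_checks(bronze_checks(raw))
gold = gold_checks({"revenue": 100.0, "revenue_by_region": {"EU": 100.0}})
```

Running checks like these at every layer, rather than only at the end, is what enables the early detection and faster resolution described above.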

Data quality directly enables confident decisions, operational efficiency, and strategic trust.

When data is complete, accurate, and timely, business users reduce manual work, identify issues more quickly, and act with greater confidence. Trusted data eliminates guesswork, supports automation, and builds a foundation for broader governance and analytics initiatives. In short, data quality isn’t just a technical necessity—it’s a business enabler.

Master Data Management

MDM is a foundation for ethical, effective AI adoption.

Master Data Management (MDM) provides the trusted, high-quality datasets AI models need. Without clearly defined and governed master data—like customer, product, or location—organizations risk building AI systems on inaccurate or biased inputs. MDM ensures consistency, privacy alignment, and transparency by linking data governance and architecture to AI use cases. It helps meet regulatory requirements (e.g., EU AI Act) and assess whether sensitive data should be used at all. Aligning MDM with AI governance allows businesses to innovate responsibly while reducing operational and reputational risks.

MDM success depends on aligning data, process, and governance.

Many MDM programs fail not due to technology, but because of misalignment between IT solutions, business processes, and governance. A successful MDM initiative requires collaboration between data architects, process owners, and data stewards. Tools like business process models, customer journey maps, and CRUD matrices help map data creation and usage across workflows. Governance structures must clearly define stewardship roles, establish data quality rules, and outline decision-making responsibilities. This alignment ensures that master data remains consistent and valid across domains, systems, and teams—ultimately improving business performance and operational reliability.
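
As a hypothetical illustration of one of these tools, a CRUD matrix can be represented as a simple mapping from business processes to the operations they perform on each master data entity, which makes conflicting ownership and missing creation points easy to spot; the process and entity names here are invented for the example:

```python
# Hypothetical CRUD matrix: which business process Creates/Reads/Updates/Deletes
# which master data entity. Process and entity names are illustrative.

crud_matrix = {
    ("Customer onboarding", "Customer"): "CRU",
    ("Order fulfilment",    "Customer"): "R",
    ("Order fulfilment",    "Product"):  "R",
    ("Catalog management",  "Product"):  "CRUD",
}

def creators(entity):
    """Processes allowed to create a given master data entity."""
    return [proc for (proc, ent), ops in crud_matrix.items()
            if ent == entity and "C" in ops]

print(creators("Customer"))   # ['Customer onboarding'] -> one point of creation
```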

MDM orchestration delivers trusted, fit-for-purpose data across ecosystems.

Modern MDM is no longer a centralized, static database; it is an orchestrated framework of coordinated processes, stewardship, and data exchange. It spans multiple domains (Who, What, Where), lifecycles, and systems. To deliver business value, MDM must integrate match-and-merge rules, survivorship logic, validation workflows, regulatory safeguards, and interoperability standards from bodies such as GS1 or ISO. Governance plays a constitutional role in defining and enforcing rules across the master data universe. An orchestrated approach makes MDM sustainable and adaptable, enabling organizations to scale and synchronize high-quality data across internal systems and trading partners.
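
To make terms like match-and-merge and survivorship logic tangible, here is a minimal sketch of how a golden record might be assembled from matched duplicates; the source trust ranking and tie-breaking rule are assumptions for illustration:

```python
# Minimal survivorship sketch: pick the "surviving" value per attribute from
# matched duplicate records. Source trust ranking and rules are assumptions.

SOURCE_TRUST = {"CRM": 3, "ERP": 2, "web_form": 1}  # higher = more trusted

matched_records = [
    {"source": "web_form", "name": "ACME GmbH", "phone": None,         "updated": "2025-01-10"},
    {"source": "CRM",      "name": "ACME GmbH", "phone": "+49 30 123", "updated": "2024-11-02"},
    {"source": "ERP",      "name": "Acme",      "phone": "+49 30 999", "updated": "2025-03-01"},
]

def survive(records, attribute):
    """Most-trusted non-null value wins; recency breaks ties."""
    candidates = [r for r in records if r[attribute] is not None]
    best = max(candidates,
               key=lambda r: (SOURCE_TRUST[r["source"]], r["updated"]))
    return best[attribute]

golden = {attr: survive(matched_records, attr) for attr in ("name", "phone")}
print(golden)   # {'name': 'ACME GmbH', 'phone': '+49 30 123'}
```

In practice these rules are defined and enforced by governance, not hard-coded; the sketch only shows the shape of the logic.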

Operational Resilience and Risk Preparedness

Tabletop exercises help organizations uncover security gaps before real incidents occur.

These scenario-based simulations are not live drills, but structured, collaborative discussions that reveal weaknesses in incident response, communication, and policy alignment. Whether simulating ransomware, data breaches, or infrastructure failures, tabletop exercises help clarify roles, test escalation paths, and reinforce legal and regulatory readiness. Their value lies in surfacing assumptions, improving documentation, and strengthening overall preparedness across both technical and business teams.

Data risk should be treated as a standalone domain within an enterprise risk management framework.

Data risks span multiple functions, including operations, compliance, and cybersecurity, and often go unrecognized within broader risk categories. Mature organizations define data risk as a distinct capability, governed by its own policies, controls, and taxonomies. It covers data quality issues, data loss, leakage, misuse, and regulatory violations. Elevating data risk enables more targeted controls, better accountability, and cross-functional alignment across the three lines of defense.

Embedding data risk into operational processes ensures ongoing visibility and accountability.

To operationalize data risk, organizations must define specific control objectives, such as encryption, access control, backup validation, and root cause analysis for quality issues. These controls should be assigned to process owners, integrated into day-to-day workflows, and monitored through defined metrics, such as breach frequency or data recovery times. This integration turns risk management from a periodic checklist into a continuous, proactive practice.
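
As a toy illustration of such embedded monitoring (the control names, owners, and thresholds are invented for this example), each control's metric can be evaluated against a threshold and breaches routed to the control's owner:

```python
# Toy sketch: data-risk controls with owners, metrics, and thresholds.
# Control names, owners, and threshold values are invented for illustration.

controls = [
    {"objective": "backup validation", "owner": "infra team",
     "metric": "restore_test_success_rate", "threshold": 0.99, "direction": "min"},
    {"objective": "access control",    "owner": "security team",
     "metric": "stale_accounts",            "threshold": 0,    "direction": "max"},
]

observed = {"restore_test_success_rate": 0.97, "stale_accounts": 0}

for c in controls:
    value = observed[c["metric"]]
    breached = (value < c["threshold"] if c["direction"] == "min"
                else value > c["threshold"])
    if breached:
        print(f"ALERT -> {c['owner']}: {c['objective']} ({c['metric']}={value})")
```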

Cross-functional leadership and metrics are essential for sustaining data risk programs.

Data risk touches nearly every discipline—from product development and compliance to IT and analytics. Sustaining a program requires shared ownership among CDOs, CROs, CIOs, and business leaders. Risk committees, KPIs/KRIs, and performance reporting must be aligned with strategic goals. With executive buy-in, organizations can move beyond theory, demonstrating value through early wins and consistent monitoring of known vulnerabilities.

Data Culture and Skills

Everyone is a data user, and that’s where data literacy begins.

Data is not just numbers or IT systems. It’s documents, dashboards, emails, images—everyday work. Building a data-literate organization starts with acknowledging that everyone uses data. A good literacy program addresses this by demystifying data, connecting it to people’s roles, and enabling them to act meaningfully on the data. The goal isn’t technical mastery, but the ability to use data to generate value. Relevance, storytelling, and hands-on engagement—not formal training alone—are essential to success. And remember: people must feel empowered, not intimidated, by the idea of “doing data.”

Building data discipline is more about behavior than tools.

At its core, data discipline means putting people, not data, at the center. Whether the data is large or small, mundane or mission-critical, discipline is needed whenever it supports a business process. This involves clear ownership, awareness of the data life cycle, and a shared understanding of what the data means and how it is used. Culture must prioritize effectiveness over efficiency, and leaders must partner across functions to embed shared standards and behaviors. Success comes from uniting governance, standardization, and skill-building: not from perfection, but from consistency and collaboration.

Competency frameworks link skills to data lifecycle roles.

The Government of Canada’s Data Competency Framework shows how to connect specific behaviors and skills to distinct phases of the data lifecycle. It defines five data roles—from contributor to trustee—and associates them with specific responsibilities. This structured approach improves workforce planning, training design, and even job descriptions. By embedding competencies into lifecycle activities, organizations clarify expectations, reduce ambiguity, and ensure the right skills are developed where they matter most. It’s a foundational step in building a truly data-driven workforce.

Culture change starts with regular people and clear roles.

True data transformation doesn’t begin with CDOs or dashboards—it starts when “regular people” recognize themselves as data creators and customers. These roles form the backbone of data quality and trust. When empowered, they drive improvement by identifying root causes and building better habits. Provocateurs—those who challenge the status quo—play a crucial role in driving progress. Data teams must shift from being reactive problem-solvers to proactive coaches. A healthy data culture depends on everyone stepping up, starting small, and staying consistent.

Recommendations for Advancing Data and AI Governance in Practice

Treat data quality as a business-critical risk.

Recognize data quality risks early and embed checks throughout the data lifecycle. Use structured metrics and meaningful scores to identify issues, establish trust, and foster continuous improvement.

Align MDM with business processes and governance roles.

Master Data Management must go beyond tools. Ensure alignment between data models, business workflows, and stewardship to deliver consistent, trusted data across domains.

Integrate data risk into enterprise risk frameworks.

Define data risk as a distinct domain with dedicated controls, metrics, and ownership. Regular exercises and embedded monitoring increase resilience and ensure preparedness.

Invest in data literacy and competency development.

Make data relevant to everyday roles. Use clear responsibilities, behavioral frameworks, and accessible tools to foster a shared understanding and a stronger data culture.

Coordinate data efforts through cross-functional leadership.

Data governance, quality, MDM, and risk management all depend on collaboration. Establish shared objectives and decision-making structures across business, IT, and compliance functions to keep these efforts aligned.