Effective master data management

This article is intended as a quick overview of what effective master data management means in today’s business context, in terms of the risks, challenges and opportunities it presents for companies and decision makers. It is structured around two main areas: the importance of an effective master data management implementation and the methodology to get there. The article closes with a real-life case study from one of our clients and some lessons learned from our day-to-day projects.

Introduction

How can we implement master data management (MDM) effectively within our ERP system? I use master data (MD) throughout multiple systems, but how can I ensure its consistency? How can proper MDM mitigate risks within our organization? These are only a few of the questions business managers have started to ask in recent years, as more and more companies show a growing interest in MDM and the benefits (both financial and organizational) that effective MDM can bring.

A number of developments have placed MDM back on the agenda, such as a focus on cost savings, the investigation of centralization options, and the drive to minimize process inefficiencies. In addition, market and compliance regulations such as SOx, Basel II and Solvency II, which all in some way address control over data integrity and reliable reporting, can be triggers for MDM initiatives.

This article is intended as an overview of the MDM concept. It includes some of the challenges companies might face due to improper MDM, as well as KPMG’s experience in this field and the approach we propose for successful MDM.

How bad master data management impacts good business

MDM, in a nutshell, refers to the processes, governance structures, systems and content in place to ensure consistent and accurate source data for transaction processes (such as the management of customer master data, vendor master data, materials, products, services, employees and benefits, etc.). It has emerged in recent years as a hot topic on the IT and business integration agenda, partly because of companies’ wish for improved efficiency and cost savings, and partly because of the numerous issues encountered in daily activities, the compliance problems that arose, and the opportunities that were missed due to the lack of a good set of data.

Because master data is often used by multiple applications and processes, an error in master data can have a huge effect on the business processes.

Decision making in the context of bad data

Many companies have invested in business intelligence solutions in recent years, with goals such as better insight into process performance, customer and product profitability, market share, etc. These reporting insights are often the basis for key decisions; however, the quality of the reporting is directly determined by the quality of the data. Bad data quality leads to misinformed or under-informed decisions (most often, setting the wrong priorities). In addition, the return on costly investments in business intelligence is partly lost if the source data is corrupt or if too few characteristics are recorded in the master data.

Operational impact of bad master data

A major component of any company’s day-to-day business is the data that is used in business operations and is available to the operational staff. If this data is missing, out of date, or incorrect, the business may suffer delays or financial losses. For example, the production process may be halted due to incorrect material or vendor information. There are known cases where incorrect product master data ended up on consumer product labels, resulting in the rejection of a whole shipment destined for import into the target market and, ultimately, considerable financial and reputational losses.

Every time wrong data is detected in the system, a root-cause analysis must be performed and the issues remediated. This, together with the associated process rework and corrective actions, takes considerable time and organizational resources. Addressing and integrating MDM from the start should therefore be part of any operational excellence initiative, in order to remove part of these process inefficiencies.

Compliance

The growing number of quality standards and regulations (industry specific or not) has also drawn attention to MDM. In order to comply with these requirements, companies must meet certain criteria which are directly or indirectly impacted by the quality of data in the systems. Bad MDM exposes companies to a range of compliance risks:

  • SOx risks occur in maintaining reporting structures and processing critical master data such as vendor bank accounts, fixed-asset data, contracts and contract conditions.
  • Healthcare, pharmaceutical or food & beverage companies that are regulated by federal health and safety standards may have significant exposure to legal risk and could even lose their operating licenses if their master records are incorrect with respect to expiration dates, product composition, storage locations, recording of ingredients, etc.
  • Fiscal liabilities, such as VAT, carry their own risk: the VAT remittance may be incorrect if the relevant master data fields are not appropriately managed, for example leading to inaccurate VAT percentages on intercompany sales.

Overview of the master data management environment

In the current business environment, companies often don’t have a precise overview of their customers, products, suppliers, inventory or even employees. Whenever companies add new enterprise applications to “manage” data, they unwittingly contribute to the increased complexity of data management. As a result, the concept of MDM – creating a single, unified view of key source data in an organization – is growing in importance.

Definitions

MDM is a complex topic, as it combines both strategic components (organization & governance) and highly detailed activities (rules for master data items on field level, control points to achieve completeness & uniqueness of MD). Below we detail some widely known industry views on MDM:

  • “The discipline in IT that focuses on the management of reference or master data that is shared by several disparate IT systems and groups” – Wikipedia
  • “MDM is much more than a single technology solution; it requires an ecosystem of technologies to allow the creation, management, and distribution of high-quality master data throughout the organization” – Forrester
  • “MDM is a workflow-driven process in which business units and IT collaborate to harmonize, cleanse, publish and protect common information assets that must be shared across the enterprise.” – Gartner

Scope of master data management

There are some very well-understood and easily identified master data items, such as “customer” and “product.” Most people define master data by simply reciting a commonly agreed upon master data object list, such as customer, material, vendor, employee and asset. But identifying the data objects that should be managed by an MDM system is much more complex and defies such rudimentary definitions. In fact, there is a lot of confusion around what should be considered master data and how it is qualified, necessitating a more comprehensive treatment.

Figure 1: Characteristics of master data

However, there is no easy, universal view on what master data is; how master data is perceived differs from organization to organization and from system to system. Take, for example, sales prices. Certain organizations consider them master data and handle them according to the specific master data flows, while others consider them transactional data and handle them accordingly. This may depend on the frequency of change, the nature of the product being sold, the level of customer interaction, etc. In some businesses, sales prices are configuration data, maintained by a technical department because they change only once a year; in other businesses, sales prices change frequently and are managed by the business, so they are considered master data.

The KPMG approach to master data management

The benefits of and reasons for optimizing MDM have been addressed above. This section addresses how to implement effective MDM within an organization. A number of MDM models exist, such as that of DataFlux ([Losh08]), which focuses on a single view of data, and that of Gartner ([Radc09]), which uses building blocks for its MDM model.

The KPMG MDM model is based on KPMG’s in-depth knowledge of MDM and experience gained during the design and implementation of MDM models and processes for complex organizations with integrated IT landscapes in a range of industries. The next section will explain the reasoning behind the KPMG model, how it should be used and where it deviates from existing MDM models.

Master data management touches every aspect of an organization

Different building blocks of master data management

The MDM model is composed of four elements (governance, process, content and systems) within the various levels of an organization (strategic, tactical and operational), which ensures that the model includes every aspect of the organization. These four elements are interconnected and each of them needs to reach a similar level of growth and improvement in order to produce well-balanced MDM within an organization.

Figure 2: Different building blocks of master data management

Maturity model for master data management

In order to assess an organization’s MDM maturity and the progress of an MDM quality improvement project, MDM has been envisioned as a model with five maturity levels. This maturity model makes it possible to measure the status of MDM within an organization, based on predefined elements. The KPMG model uses governance, process, content and systems as the key elements for this purpose.

The MDM maturity-level model consists of five levels. At level 1 (the initial level), there is no ability to manage data quality, but there is some recognition that data duplication exists within the organization. At the reactive level (level 2), some attempts are made to resolve data quality issues and initiate consolidation. At the managed level (level 3), organizations have multiple initiatives to standardize and improve quality, and a mature understanding of the implications of master data for business processes. When the organization has a well-managed framework and KPIs (key performance indicators) to maintain high-quality data, the pro-active level (level 4) is reached. An organization is at the strategic performance level (level 5) if all applications refer to a single comprehensive master data repository, if the quality of master data is a KPI for all process and data owners, and if synchronization, duplication checks and validations are embedded in tooling.
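
As an illustration only (the scoring scale and the aggregation rule below are our assumptions, not a formal part of the model), such a maturity assessment can be captured in a very small data structure: each building block is scored from 1 to 5, and the overall maturity is conservatively taken as the lowest building-block score, reflecting the idea that the four elements must grow in balance.

```python
# Minimal sketch of recording an MDM maturity assessment.
# The level names follow the text; the "weakest link" aggregation is an illustrative assumption.
MATURITY_LEVELS = {
    1: "initial",
    2: "reactive",
    3: "managed",
    4: "pro-active",
    5: "strategic performance",
}

def overall_maturity(scores: dict[str, int]) -> int:
    """Overall maturity is limited by the weakest building block."""
    return min(scores.values())

# Example assessment of the four building blocks (hypothetical values).
assessment = {"governance": 3, "process": 2, "content": 3, "systems": 4}
level = overall_maturity(assessment)
print(f"Current maturity: level {level} ({MATURITY_LEVELS[level]})")
```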

At the start of an MDM project, the ambition level should be set, indicating what maturity level the organization aims to reach (for example, maturity level 4: pro-active). This gives a target to work towards in the MDM implementation. Figure 3 shows the different ambition levels, explaining what reaching level 4 would involve.

Figure 3: Maturity levels of MDM

Master data management model implementation approach

Although an MDM implementation is much more than just tooling and configuring system functionality, the phases commonly found in existing system-implementation methodologies can also be used for an MDM implementation. Based on experience and good practices with MDM implementations, the following phased approach has been developed. In the remainder of this section we describe, for each phase, the steps required when implementing an MDM model within an organization.

Initiation: agree on business need, scope, definitions and approach

In this phase the initial business case for master data management is defined. It is important to address all business areas here, including “IT demand,” “IT supply,” “business” and “finance and reporting.” All these business domains benefit from solid master data management.

In addition to typical project start-up activities, the following should be addressed when implementing master data management:

  • Determine the system and organizational scope, and agree which data elements are considered master data and therefore fall within the project’s scope.
  • Define common names for the master data objects within the project’s scope, independently of the system in which they occur. This is very important, since similar master data objects can be named differently in different systems as well as throughout the company. For example, is a vendor the same as a supplier? And what do we consider customer master data: only the buyer, or also the shipping location? A minimal naming-alignment sketch follows this list.
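
The sketch below illustrates such a naming alignment, with hypothetical system and object names: every system-specific term is mapped to one agreed master data object name, so that “vendor” and “supplier”, for example, resolve to the same object regardless of the source system.

```python
# Hypothetical mapping of system-specific names to agreed master data object names.
CANONICAL_OBJECTS = {
    ("ERP", "vendor"): "supplier",
    ("procurement_tool", "supplier"): "supplier",
    ("CRM", "account"): "customer",
    ("ERP", "customer"): "customer",
    ("ERP", "material"): "material",
}

def canonical_name(system: str, local_name: str) -> str:
    """Resolve a system-specific object name to the common project name."""
    try:
        return CANONICAL_OBJECTS[(system, local_name)]
    except KeyError:
        raise ValueError(f"No agreed definition for '{local_name}' in {system}")

print(canonical_name("CRM", "account"))  # -> customer
```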

Assessment: determine the current situation and set the right priorities

The primary deliverable of the assessment phase is a detailed implementation plan indicating all design, implementation and monitoring activities that will be put into place to make the MDM organization work. To be able to draft this plan, a comprehensive review of the current MDM organization is necessary, in relation to the defined maturity level. The implementation plan should contain, for each building block and classified per master data object, the steps needed to close the gap between the current maturity level and the desired maturity level.

The assessment itself consists mainly of conducting interviews and reviewing existing documentation. This will be combined with data analyses to get insight into the current quality of data as a benchmark that can be referred to during the course of (and after) the project, to measure its success.

The goal of the assessment phase is to prioritize the objects that make up master data management. Prioritizing the different master data objects can be done by looking at criteria such as use of master data, distribution over systems, impact on business processes, strategic and operational requirements, current data quality and issues, other projects, complexity and volume.

This assessment phase results in a “heatmap,” in which the different master data objects are plotted based on their current MDM maturity level, so they can be compared to the desired maturity level and the applicable decision criteria. The heatmap can be used to cluster master data objects with similar current data quality and a comparable level of complexity. This grouping enables a phased prioritization approach, possibly with different implementation waves, as illustrated in Figure 4.

Figure 4: Heatmap example
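
The sketch below illustrates how such a prioritization could be supported; the objects, criteria, weights and the two-objects-per-wave grouping are illustrative assumptions, not part of the method itself.

```python
# Hypothetical prioritization of master data objects into implementation waves.
# Criteria, weights and scores (1 = low, 5 = high) are illustrative assumptions.
objects = {
    "customer": {"business_impact": 5, "data_quality_issues": 4, "complexity": 3},
    "material": {"business_impact": 4, "data_quality_issues": 5, "complexity": 4},
    "vendor":   {"business_impact": 3, "data_quality_issues": 2, "complexity": 2},
    "asset":    {"business_impact": 2, "data_quality_issues": 2, "complexity": 1},
}
weights = {"business_impact": 0.5, "data_quality_issues": 0.3, "complexity": 0.2}

def priority(scores: dict[str, int]) -> float:
    """Weighted priority score for one master data object."""
    return sum(weights[criterion] * score for criterion, score in scores.items())

ranked = sorted(objects, key=lambda name: priority(objects[name]), reverse=True)
waves = [ranked[i:i + 2] for i in range(0, len(ranked), 2)]  # two objects per wave
for number, wave in enumerate(waves, start=1):
    print(f"Wave {number}: {', '.join(wave)}")
```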

Design: how to reach the desired master data management maturity level

This phase is focused on agreeing on the design of the planned MDM structure.

A central question in this phase is which activities, if any, will be centrally or de-centrally governed. This is not about where the activities will be performed (in a central department or distributed throughout the organization), but only about whether MDM activities are standardized and centrally steered or left up to the business. In other words: what will be the scope and reach of the central MDM structure, and where will business interpretation and administration be allowed? A number of factors may play a role in this decision:

  • What kinds of objects are already centrally managed? If the company is already used to central management for certain data objects, then it is not advisable to change this.
  • What is the frequency of changes and the process criticality? Certain objects are changed frequently and have a strong process impact. For example, a master plan or routing in a production environment can determine which production lines are involved and in which order the product is developed. If a production line fails, the plan must be adjusted on the spot to re-route production over alternative lines.
  • What is the impact of a change, and in which environment does the object operate? If a master data object is part of an isolated system, barely influencing other master data, other business units or reporting, then it could be managed de-centrally.
  • Local laws and regulations. For some master data objects, country-specific laws and regulations may apply. In these cases it may be more efficient to leave the governance over the related data attributes to the national level.
  • How is the organization structured: which countries, business lines, shared services or (outsourced) third parties are involved? The complexity of the organization should not be the deciding factor for central or de-central management, but it can influence the decision.

Based on this outcome, the first design action should be the governance structure and organizational plan.

A second important step, related to the design of the planned MDM structure, is the appointment of master data owners, who will be ultimately responsible for their master data objects. The master data owner will, in the course of the MDM project, act as a change agent, taking decisions and making sure that, for his or her master data object, roles are assigned to employees.

A third step is the design of processes and models. These include the standard MDM maintenance processes (to create, change, block, remove, update, etc.), the MDM incident and issues management processes, guidelines for monitoring and compliance, templates around content and quality (e.g. template for data rule books), the MDM governance model and role model, and other common MDM themes like an MDM portal.
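
As a simple illustration of such a maintenance process, the sketch below models a master data change request that moves from request, via approval by the data owner, to execution; the statuses, roles and transition rules are assumptions made for the example, not prescribed by the model.

```python
# Illustrative master data change-request workflow; statuses and roles are assumptions.
from dataclasses import dataclass, field

ALLOWED_TRANSITIONS = {
    "requested": {"approved", "rejected"},
    "approved": {"executed"},
}

@dataclass
class ChangeRequest:
    object_type: str          # e.g. "customer", "material"
    action: str               # e.g. "create", "change", "block"
    requested_by: str
    status: str = "requested"
    history: list = field(default_factory=list)

    def move_to(self, new_status: str, actor: str) -> None:
        """Move the request to a new status, keeping an audit trail."""
        if new_status not in ALLOWED_TRANSITIONS.get(self.status, set()):
            raise ValueError(f"Cannot go from {self.status} to {new_status}")
        self.history.append((self.status, new_status, actor))
        self.status = new_status

request = ChangeRequest("customer", "create", requested_by="sales_nl")
request.move_to("approved", actor="customer_data_owner")
request.move_to("executed", actor="mdm_shared_service")
print(request.status, request.history)
```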

Implementation: getting there

As with most implementations, organizational support and sponsorship are important elements in realizing change. This starts with awareness and, consequently, a change in the mindset of the master data owners. As mentioned before, the master data owners are key in facilitating and realizing the change from the current (as-is) to the new (to-be) MDM model for their specific master data object. They will not be able to fulfil this role effectively if they do not fully understand the centrally designed and adapted organizational and process model. The implementation phase should therefore start with awareness and training workshops for the master data owners and their team members. The objective of these meetings is to change the mindset and get full buy-in for the newly designed concepts.

After that, the master data owners will be in the driver’s seat and will start communicating with the other stakeholders, who will be informed and, whenever necessary, trained in the use of object-specific master data processes, rules, templates, etc. Although master data owners usually have the seniority to carry this process, the involvement and support of senior management (C-level) is necessary to underline the importance of effective MDM for the organization.

Implementing MDM includes “soft” implementation activities (such as aligning processes, assigning roles and responsibilities, deciding on quality criteria and service levels), but also technical “hard” implementation activities. These include: implementing (or extending) the use of workflow, aligning system authorizations with the MDM role design, developing reports and data-quality dashboards, implementing technical data validation rules, automating interfaces and migrating data to one source.
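
As a minimal sketch of what such technical data validation rules can look like, the example below checks a few attributes of a record before it is accepted into the master data; the field names, formats and reference values are hypothetical.

```python
# Illustrative field-level validation rules for a vendor record (field names and formats assumed).
import re

RULES = {
    "name":          lambda v: bool(v and v.strip()),
    "country_code":  lambda v: bool(re.fullmatch(r"[A-Z]{2}", v or "")),
    "iban":          lambda v: bool(re.fullmatch(r"[A-Z]{2}\d{2}[A-Z0-9]{1,30}", (v or "").replace(" ", ""))),
    "payment_terms": lambda v: v in {"N30", "N45", "N60"},
}

def validate(record: dict) -> list[str]:
    """Return the list of fields that fail their validation rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

vendor = {"name": "Acme B.V.", "country_code": "NL",
          "iban": "NL91ABNA0417164300", "payment_terms": "N30"}
print(validate(vendor))  # -> [] when all rules pass
```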

Figure 5 gives an overview of the different “hard” and “soft” implementation activities for becoming a level-4 “pro-active” MDM organization.

Figure 5: Graphical overview of level 4 MDM maturity

An organization can decide to implement specific MDM systems and tooling. There are a great number of software suppliers offering specific MDM systems that provide the functionalities described above (and many more). Some believe that MDM issues can be solved by selecting and implementing an MDM tool. That, however, is a misconception. Yes, somewhere down the line organizations may need technology for extraction, transformation, load and data monitoring. Effective MDM, however, starts with a clear and concise governance and organizational model. No tool alone is going to solve an enterprise’s data problems. Organizations must understand that improving their data quality – and building the foundation for an effective MDM implementation – requires them to address internal disagreements and broken processes, and that it is not necessarily a technology-focused effort but a strategic, tactical and operational initiative.

Monitoring: ensuring we stay there

Once the implementation phase has been completed, the next step is putting in place the tools and techniques to actively monitor the quality of the data and the quality of the processes. The objective is to sustain and improve the MDM processes over time. The main activities in this phase are monitoring the data quality and the request processes of the master data objects (for example, against KPIs or service level agreements).

Often considerable time and effort is spent on data cleansing actions, while less attention is paid to maintaining good data quality. In order to continuously improve the master data process and data quality, efficient monitoring processes should be in place, covering the four building blocks of MDM. Some examples of what can be monitored are given below (a small measurement sketch follows the list):

  • Governance performance: review of issue and problem management processes
  • Process performance: process response times (time to approve, administer, etc.), percentage of approved and rejected changes, number of emergency changes, number of incidents and time of resolution, metrics around meeting agreed service levels
  • Content and quality: data completeness (empty fields, number of pending transactions because of incomplete MD, missing critical data, etc.), data accuracy (data not matching business rules, incorrect hierarchy assignment, incorrect data over multiple systems), data validity (checks on outdated unused records), data accessibility (number of unauthorized changes, role assignment, temporary authorizations, etc.), data redundancy (double records, double recording in multiple systems)
  • Systems and tooling: interface processing (timely/untimely interface processes, number of issues), unauthorized MD object attribute changes (e.g. adding fields).
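
The sketch below works out two of the content-quality measurements above, using hypothetical records and field names: completeness as the share of required fields that are actually filled, and redundancy as duplicate values of a key field.

```python
# Illustrative data-quality measurements; records and required fields are assumptions.
from collections import Counter

records = [
    {"id": "C001", "name": "Acme B.V.", "vat_number": "NL001234567B01"},
    {"id": "C002", "name": "Acme B.V.", "vat_number": ""},
    {"id": "C003", "name": "Globex",    "vat_number": "NL009876543B01"},
]
REQUIRED = ["name", "vat_number"]

def completeness(records, required):
    """Share of required fields that are actually filled, across all records."""
    filled = sum(1 for record in records for field in required if record.get(field))
    return filled / (len(records) * len(required))

def duplicates(records, field):
    """Values of a field that occur in more than one record."""
    counts = Counter(record[field] for record in records if record.get(field))
    return [value for value, n in counts.items() if n > 1]

print(f"Completeness: {completeness(records, REQUIRED):.0%}")
print(f"Possible duplicates on 'name': {duplicates(records, 'name')}")
```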

The initial implementation of a typical MDM project will end here. However, knowing that today’s organizations are dynamic and that they are frequently improving their processes, setting up an effective MDM structure is never a one-time exercise.

Client case: Master data management at an international consumer company

In early 2008, this company started an initiative to improve the MDM structure by moving towards a more pro-active level that would allow MDM to be one of the enabling processes in realizing strategic business goals. For this initiative, a centralized approach was chosen, where a central MDM body would govern the master data processes of all operating companies in the group. At the group level, a new business MDM department was formed.

A clear example of the benefit realized through this project was the standardization of the brand codes used. When all systems were aligned according to the central data standard, a clear and consistent way of reporting and comparing between different countries and operating companies was established.

Master data management in the roll-out of a new central sales system

With the development and roll-out of a new sales system, the MDM approach was completely integrated from the start of the project. This direct approach within the project resulted in a solid embedding of the data standards and MDM processes in the new sales environment.

During the blueprint phase, the MDM custodians were able to define how the master data objects were to be interpreted in accordance with the standards. During the realization phase of the project, the data definitions were aligned with the systems already existing within the company. As part of the data migration of customer, material and vendor master data, specific validations were executed to ensure the data followed the central data standards. The integration of the MDM processes within the project reduced the go-live risks of the system significantly, as the company was comfortable with the quality of the configuration, organizational and migrated master data.

Improvement opportunities for the next roll-out project

During the project, a number of issues came to light when project consultants proposed solutions slightly deviating from the master data standards. The tension between functionality, project timeline and data standards required support from top management to ensure that central standards were met.

As part of integrating MDM into the implementation project for the new sales system, the MDM support organization for the period after go-live also needed to be developed. When MDM procedures are not clear or easily available, the central standards tend to give way to local interpretations. A central support tool to register, approve and execute master data change requests proved to be critical in this respect. Subsequently, the right level of training was provided to the local master data organization, which ensured solid embedding of the data standards.

Tooling to extract, report and monitor data quality was developed during the project and provided insight into the use of the data standards in both the local and centrally maintained master data objects.

Lessons learned

When looking at recent MDM optimization and implementation projects, there are a number of key messages that we would like to share:

  • MDM cannot be effective without proper data governance. If no one is accountable for data quality, then there is no place for escalating issues, setting data standards or monitoring data quality. The difficulty in MDM optimization projects is often finding the right balance between centralized and decentralized maintenance and assigning the right responsibilities to the right people. Master data ownership should be taken seriously, and the people assigned this responsibility should be encouraged, and monitored, in taking full responsibility.
  • MDM should not be implemented as an IT project, but rather as a business improvement project. When the focus is too much on IT (e.g. building workflows, building reports) the actual project success factors are overlooked.
  • Although it may seem a redundant activity, it is very important to have a uniform view, per master data object, of what is actually meant by the object (its definition). For example, we have seen that a master data object named “product” can be interpreted in a number of ways, which results in a range of different issues that may in fact not relate to the same master data object.
  • Do not approach MDM from a systems angle. Instead place the master data object front and center. System ownership has its place and function within an organization, but can conflict with proper MDM. The goal of MDM is to cross boundaries such as business lines, processes and systems. The master data owner issues the standard which should be adopted, irrespective of the system.
  • MDM is a complex topic, as it combines strategic components (organization & governance) with highly detailed activities (rules for master data items at field level, control points to achieve completeness & uniqueness of MD). This also requires the right mix of technical expertise and business-process knowledge in the project team.
  • Use a phased approach. In addressing all master data objects in a company when implementing or optimizing MDM, one basically touches almost all business functions. In order to spread the workload internally (in the project team) and also throughout the company it is advisable to implement the new MDM organization through implementation waves.
  • Consider interrelated connections between master data objects. Although a wave approach is advised (see the previous bullet), the master data quality of related objects should be improved in parallel or at least with only small time gaps between waves. For example, it is of little value to improve sales contract administration while your customer master data is still of poor quality.

Through this article, we hope to have clarified that MDM is an important topic in the current business environment. Even though it will take some precious time away from other vital initiatives, the benefits will be felt throughout the organization in a relatively short time. The best businesses do run best-in-class MDM processes.

Literature

[Bigg08] S.R.M. van den Biggelaar, S. Janssen and A.T.M. Zegers, VAT and ERP: What a CIO should know to avoid high fines, Compact 2008/2.

[Butl09] D. Butler and B. Stackowiak, MDM, An Oracle White Paper, June 2009.

[Dubr10] Vitaly Dubravin, 7 Pillars of a Successful MDM Implementation, 11 April 2010.

[Fish07] Tony Fisher, Demystifying MDM, 20 April 2007.

[IBMM07] IBM, IBM MDM: Effective data governance, 11 November 2007.

[Kast10] Vasuki Kasturi, Impact of Bad Data, 27 February 2010.

[Laws10] Loraine Lawson, MDM: Exercise for Your Data, 16 April 2010.

[Losh08] D. Loshin, MDM Components and the Maturity Model, A DataFlux White Paper 8 October 2008.

[Radc09] J. Radcliffe, The Seven Building Blocks of MDM: A Framework for Success, Research 27 May 2009.

[SAPM03] SAP, SAP® MDM, 2003.

[Sunm08] SUN, SUN™ MDM SUITE, White Paper June 2008.

[Wolt06] Roger Wolter and Kirk Haselden, The What, Why, and How of MDM, Microsoft Corporation, November 2006.
