
Master Data Services – What’s the big deal?

Hi, Kirk Haselden here.

I’m the product unit manager for Master Data Services, responsible for the master data management strategy, team and product delivery at Microsoft. I manage the day-to-day aspects of the team and drive execution and overall product development. I have written two technical references for Sams Publishing on Integration Services and numerous technical articles, and I have 25 patents or patents pending. My team is a collection of talented developers, testers and program managers in Redmond and Denver.

In June of 2007, Microsoft acquired a master data management software vendor known as Stratature. Since that time, the master data management team has been quietly working away, turning the core capabilities of the Stratature product into a Microsoft product. We’ve kept all the important features of the product and added some critical capabilities, such as a fully featured Windows Communication Foundation (WCF) web service. We’ve optimized the code and database. We’ve done the security work that all Microsoft products must do to ship. We’ve also added a few other capabilities that customers have requested. As was announced at TechEd Monday, we’re now getting ready to ship the product as Master Data Services in the SQL Server 2008 R2 release. Let me explain what Master Data Services (MDS) is and why it’s critical in today’s enterprise.
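To give a feel for what a web service opens up, here’s a minimal sketch of pulling a master list over SOAP from Python using the zeep library. The WSDL URL, operation name and message shape are all invented for illustration; they aren’t the actual MDS service contract, and a real WCF endpoint would likely need binding and security configuration beyond this sketch.

```python
# A hypothetical sketch of consuming a SOAP/WCF master data service from
# Python with the zeep library. The WSDL URL, operation name and message
# shape are illustrative assumptions, not the actual MDS contract.
from zeep import Client

# Assumes the service exposes a plain SOAP 1.1 (basicHttpBinding) endpoint.
client = Client("http://mds.example.com/service/service.svc?wsdl")

# Hypothetical operation: fetch the members of a "Customer" entity in a
# "Sales" model, so every application reads the same authoritative list.
response = client.service.GetEntityMembers(ModelName="Sales",
                                           EntityName="Customer")

for member in response.Members:
    print(member.Code, member.Name)
```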

As corporate information ecosystems become more complex, so does the challenge of managing the company’s master lists. Master lists are used, accessed, managed, changed and unfortunately corrupted in myriad ways throughout the company, across divisions and locations. For example, customer data may be used in the front office for direct contact with customers or in the back office in marketing, billing and other business processes. ERP and CRM systems do a great job of providing the functionality around how you use, say, the product master or customer master. That’s what they were designed to do. However, they don’t do a great job of actually managing the forces that impact the master data itself. Four of the most important of those forces are decay, conflict, corruption and inconsistency; the sketch after the list below shows how some of them might surface in an actual data set.

  • Decay is when the world changes around the data. The data was accurate at one time, but the world changed and the data wasn’t updated to reflect that change. The classic example is when a customer moves.
  • Conflict is when duplicate data is introduced into the system. The duplicate may exactly match or differ substantially from established, validated data; in either case, it is usually, although not always, undesirable. A good example is when two identical parts manufactured in different locations have different part numbers.
  • Corruption is when established, accurate and validated data is changed in some way that invalidates it. For example, if the price of a PVC elbow joint is misquoted as $550 instead of $5.50, it won’t sell much.
  • Inconsistency is when the data is formed in unpredictable, erratic or incompatible ways. For example, a company may have a different naming convention for its product codes depending on the division or country where the product is generated. Not using the same canonical address format is another typical example.
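
To make these forces concrete, here is a toy sketch of validation checks over a small product list. The record shape, rules and thresholds are invented for illustration; none of this is MDS logic.

```python
# Toy checks for the forces acting on a master product list. The record
# shape, rules and thresholds here are invented for illustration; none of
# this is MDS logic.
import re

products = [
    {"part_no": "PVC-ELBOW-90", "name": "PVC elbow joint", "price": 5.50},
    {"part_no": "PVCELBOW90",   "name": "PVC elbow joint", "price": 5.50},   # conflict: same part, different code
    {"part_no": "PVC-TEE-01",   "name": "PVC tee joint",   "price": 550.0},  # corruption: $5.50 mis-keyed as $550
    {"part_no": "pvc_cap_02",   "name": "PVC cap",         "price": 1.25},   # inconsistency: off-convention code
]

# Conflict: two records that describe the same thing under different keys.
by_name = {}
for p in products:
    by_name.setdefault(p["name"], []).append(p["part_no"])
conflicts = {name: codes for name, codes in by_name.items() if len(codes) > 1}

# Corruption: values outside a plausible range for this product category.
corrupted = [p["part_no"] for p in products if not 0.10 <= p["price"] <= 100.0]

# Inconsistency: codes that don't follow the (assumed) canonical format.
canonical = re.compile(r"^[A-Z]+(-[A-Z0-9]+)+$")
inconsistent = [p["part_no"] for p in products if not canonical.match(p["part_no"])]

# Decay can't be detected from the data alone; it takes an outside signal
# that the world changed, e.g. a change-of-address feed for customers.

print("conflicts:", conflicts)
print("corrupted:", corrupted)
print("inconsistent:", inconsistent)
```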

Traditional applications manage master data and these forces insufficiently. What is needed to solve these problems is an application that focuses on managing these forces, with a set of tools specifically designed to ensure that the master data remains authoritative, is available in a secure way across the enterprise and is integrated with the applications that use it. Often the term “one source of the truth” is used to describe what master data management applications deliver. I prefer the term “authoritative source of master data”. Having one source isn’t always desirable, nor is it always realistic. Master data isn’t truth. It’s simply data that represents the world at a given time slice. The trick, and it’s a difficult one, is to keep the master data authoritative. Authoritative data is reliable, represents the state of the world at the time it’s referenced and is delivered in a secure, integrated and performant way.

Why now? Why has master data management emerged as one of the top two or three initiatives on the average CIO’s mind? The answer lies somewhere in the perfect storm of economic downturn, corporate information ecosystem complexity, strict governmental regulations such as HIPAA and Sarbanes-Oxley, service-oriented architectures, and the recognition that solving the problem through custom solutions is extremely difficult. Investing in master data management solutions, especially in today’s difficult economic times, makes dollars and sense. Master data management is an investment in cost savings, revenue recovery, human resource optimization and capital investment efficiency.

Unlike transactional data, master data doesn’t capture one-time transactions. It’s longer lasting and slowly changing, making it difficult to manage in one-time processes. Master data management is a continuing effort that must be a part of the entire lifecycle of the master data, from inception to deletion. Master data is involved with transactional data everywhere, including business intelligence, mandatory financial reports, billing, marketing and other critical business activities. If the master data, the gas that fuels the corporate engine, is dirty, the company doesn’t run very efficiently, or at all. Find a way to clean up the fuel and that corporate engine will run more efficiently.
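
Because master data is long-lived and slowly changing, one common way to manage that lifecycle is to version each record with effective dates instead of overwriting it, in the spirit of a type 2 slowly changing dimension. Here’s a minimal sketch of the idea; the record shape is an invented example, not how MDS stores its versions.

```python
# A minimal sketch of versioning a master record with effective dates, in
# the spirit of a type 2 slowly changing dimension. The record shape is an
# invented example, not how Master Data Services stores versions.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class CustomerVersion:
    code: str
    address: str
    valid_from: date
    valid_to: Optional[date] = None  # None means this is the current version

def update_address(history, code, new_address, as_of):
    """Close the current version and append a new one instead of overwriting,
    so the record's state at any past date stays queryable."""
    current = next(v for v in history if v.code == code and v.valid_to is None)
    current.valid_to = as_of
    history.append(CustomerVersion(code, new_address, as_of))

def address_as_of(history, code, when):
    """Return the address that was authoritative on a given date."""
    for v in history:
        if v.code == code and v.valid_from <= when and (v.valid_to is None or when < v.valid_to):
            return v.address
    return None

history = [CustomerVersion("CUST-001", "12 Elm St, Redmond", date(2005, 3, 1))]

# The classic decay example: the customer moves. We add a version rather
# than editing history, so both old and new reports stay consistent.
update_address(history, "CUST-001", "400 Pine St, Denver", date(2009, 6, 1))
print(address_as_of(history, "CUST-001", date(2008, 1, 1)))  # 12 Elm St, Redmond
print(address_as_of(history, "CUST-001", date(2009, 7, 1)))  # 400 Pine St, Denver
```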

In addition to the DPI blog, I’ll be blogging through the coming months leading up to our release. On Monday, our product team site also went live: http://www.microsoft.com/sqlserver/2008/en/us/mds.aspx

We’re excited to be working on a solution to a problem that exists in every substantial company in the world.

Thanks,

Kirk