Save Operational MDM for Phase 2

In my final installment on the 5 Critical Success Factors for Initiating Master Data Management, I want to discuss why tackling Operational MDM is valuable, and when to do it. 

A major contributor to disruptions in MDM projects is a lack of stakeholder agreement about what the team is trying to accomplish with MDM.  It’s important to get everyone on the team clear on the two purposes of MDM.

The first purpose is to facilitate better reporting (Reporting MDM).  The goal of Reporting MDM is to gather and aggregate data (sales, invoices, purchase orders, etc.) so that, in an Enterprise Reporting context, all of the data from a set of source systems is included.  A Reporting MDM system does this by Matching Master Data records from each of these systems.  It then provides Views of these matches (groups of master records) to subscribing systems and users to consume for their own purposes.  It sounds simple, and in fact, it is pretty simple.
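As an illustration, the Matching step can be sketched in a few lines of Python. The system names, fields, and the (deliberately crude) match rule below are hypothetical — real MDM tools use far more sophisticated fuzzy matching:

```python
# Reporting MDM sketch: match master records from several source systems
# on a normalized key, then expose the match groups as a "view".
from collections import defaultdict

def normalize(name: str) -> str:
    """Crude match key: lowercase, strip punctuation and whitespace."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

def match_masters(records):
    """Group master records from all source systems by their match key."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize(rec["name"])].append(rec)
    return dict(groups)

# One vendor represented slightly differently in two LOB systems.
records = [
    {"system": "ERP", "id": "V-100", "name": "Acme Corp."},
    {"system": "CRM", "id": "8812",  "name": "ACME CORP"},
    {"system": "ERP", "id": "V-200", "name": "Widget Works"},
]

view = match_masters(records)   # the "view" a subscriber would consume
```

The two Acme records land in one group; a subscribing report can then count Acme as a single vendor.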

The second purpose is to improve the overall data quality in each operational system (Operational MDM).  The goal of Operational MDM is to ensure that each representation of the same “thing” (e.g. a Vendor) is the same in every system which houses master data for that “thing”.  An Operational MDM system does this by Matching Master Data records from each source and then Harmonizing the records (i.e. making all of the master records in a group “line up”).  Finally, it Distributes the harmonized data back to the source systems.  Imagine knowing that your most valued customers are verifiably represented in a logically consistent way in all of your AR systems.
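The Harmonize and Distribute steps can be sketched the same way. The survivorship rule here (most recently updated record wins) and the field names are hypothetical — your Data Governance team would define the real rules:

```python
# Operational MDM sketch: pick a "survivor" value for an attribute within a
# matched group, then emit one update per source record that disagrees.
def harmonize(group, field="address"):
    """Choose the surviving value: the most recently updated record wins."""
    survivor = max(group, key=lambda r: r["updated"])
    return survivor[field]

def distribute(group, field="address"):
    """Yield (system, id, new_value) updates for records out of line."""
    golden = harmonize(group, field)
    for rec in group:
        if rec[field] != golden:
            yield (rec["system"], rec["id"], golden)

group = [
    {"system": "ERP", "id": "V-100", "address": "1 Main St",
     "updated": "2015-04-01"},
    {"system": "CRM", "id": "8812", "address": "1 Main Street, Suite 2",
     "updated": "2015-05-01"},
]

updates = list(distribute(group))
# One update flows back: the ERP record is brought in line with the CRM value.
```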

Visually, an Operational MDM synchronization process might look like this.

Operational MDM in Action

Once we have those two concepts solidly understood, the question becomes: can we have both?  Yes, you can have both.  However, if delivering value to the business quickly is a consideration (it should be), I recommend that you tackle Reporting MDM first.  Reporting MDM has fewer technology hurdles, initiates a Data Governance program, and delivers real value quickly.

Here is what Operational MDM will take:

  1. A Data Bus – you’ll need an integration solution which can handle connections to all of the LOB systems which you want to synchronize.  My team uses Microsoft BizTalk Server for this.
  2. Subject Matter Expertise – you’ll need access to the people who understand the target systems extremely well.  Often they will need to expose APIs to the MDM team so that synchronization can be “real-time” (a change is made in MDM and the change event propagates to all of the affected systems).
  3. Business Process Review – your Data Governance team will likely need to consider the full lifecycle of the master data: creation, maintenance, and archival.
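The data bus pattern in item 1 can be reduced to a toy sketch: a change event raised against the MDM hub propagates to every subscribed LOB system. In practice this role is played by an integration server such as BizTalk; the classes below are purely illustrative:

```python
# Minimal in-memory publish-subscribe bus: one master-data change event
# fans out to every subscribed target system.
class DataBus:
    def __init__(self):
        self.subscribers = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def publish(self, event):
        for handler in self.subscribers:
            handler(event)

# Two hypothetical target systems apply the same master-data change.
erp, crm = {}, {}
bus = DataBus()
bus.subscribe(lambda e: erp.update({e["key"]: e["value"]}))
bus.subscribe(lambda e: crm.update({e["key"]: e["value"]}))

bus.publish({"key": "vendor:V-100:address",
             "value": "1 Main Street, Suite 2"})
```

After the publish, both systems hold the same value for the vendor’s address — which is exactly the “line up” guarantee Operational MDM is after.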

In summary, Operational MDM is achievable and yields tremendous value.  But first, build the foundation and “put some points on the board”.  If you build a Federated Data Model, Keep MDM Separate, Flip the Script and Formulate your Governance Plan, Phase 1 will be successful, and you’ll get funding for Operational MDM in Phase 2.

Good luck!

Formulate Essential Data Governance Practices

The creation of a  Data Governance function at your organization is a critical success factor in implementing Master Data Management.  Just like any machine on a factory floor, Master Data is an Asset.  An Asset implies ownership, maintenance and value creation: so too with Master Data.  To borrow an analogy from the Manufacturing world, Transactional data is the Widget, and Master data is one of the machines that makes the Widget.  It is part of your organization’s Value Chain.

Unfortunately, firms starting on the road to MDM fall prey to one of two pitfalls, at either extreme of the Data Governance mandate.  The first pitfall is to treat MDM as a one-time project, not a program.  Projects have an end date; Programs are continual.  Have you ever heard of an Asset Maintenance Project?  That’s a recipe for crisis.  Firms which maintain their assets as a Program do far better.

The second pitfall is to ask the Data Governance team to do too much, too fast.  Governance cannot do much without some asset to govern.  Have you ever heard of a Machinery Maintenance Program instituted before you figured out what type of machinery you needed, what the output requirements were, or before you made the capital purchase?  I haven’t either.  First, you acquire the capital.  You do so with the expectation that you will maintain it.  Then you put it into production.  Then you formulate the maintenance schedule and execute that plan.

In order to successfully stand up a Data Governance function for your Master Data Program, you’ll need to understand these essential roles in Data Governance: Executive Steering, Data Owners, Data Stewards. 

Follow these Do’s and Don’ts:

Do establish an Executive Steering Committee for all Master Data practices in your enterprise, focused upon strategic requirements, metrics and accountability.

Do establish Data Quality Metrics.   Tie them to strategic drivers of the business. Review them regularly.  Your MDM toolset should provide analytics or dashboards to provide this view.
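To make the metrics concrete, here is a sketch of two measures a governance team commonly tracks — completeness of a required field and the duplicate rate after matching. The field names and sample data are invented; in practice these numbers would come from your MDM toolset’s dashboards:

```python
# Two simple data quality metrics over a set of master records.
def completeness(records, field):
    """Fraction of records with a non-empty value for `field`."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

def duplicate_rate(records, key):
    """Fraction of records that share a key value with another record."""
    distinct = len({r[key] for r in records})
    return (len(records) - distinct) / len(records)

customers = [
    {"tax_id": "11-111", "email": "a@example.com"},
    {"tax_id": "11-111", "email": ""},              # duplicate, missing email
    {"tax_id": "22-222", "email": "c@example.com"},
    {"tax_id": "33-333", "email": "d@example.com"},
]

email_completeness = completeness(customers, "email")   # 0.75
tax_id_dupes = duplicate_rate(customers, "tax_id")      # 0.25
```

Tying thresholds on figures like these to strategic drivers gives the Steering Committee something measurable to review.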

Don’t ask the Steering Committee to own the data model or processes – that is the Data Ownership role.

Do establish a Data Ownership group for each domain of Master Data.  Ownership teams are typically cross-functional, not simply the AR manager for Customer Master Data, or the HR manager for Employee Master Data.  As you evolve down the maturity path, you will find that master data has a broad set of stakeholders – Do be ready to be inclusive.

Do establish regular status meetings where Data Ownership meets with the Executive Steering Committee to review priorities and issues.

Don’t require that Data Owners “handle the data”.  That is the Data Stewardship role.

Do formalize a Data Stewardship team for each domain of Master Data.  Data Stewards are “data people” – business people who live in the data, but with no technical skills required, per se (though technology people can contribute to a Data Stewardship team).

Don’t restrict Data Stewards to just  the people who report to the Data Owner – think cross-functional!

Do anticipate conflicts – Data Owners should have some political skills.  The reality is that Master Data is valuable to a broad set of constituencies within an enterprise.  Be practical as it relates to one faction’s “Wish List” and keep moving the ball forward.

Without a Data Governance function, MDM tends to be a one-time project (“Clean it Once”) and fails to deliver real value.  Without a clear vision of how Data Governance supports MDM, it can hold things up.  A rational Data Governance function does not need to hold up the execution of a Master Data project – it supports it.  Keep Data Governance strategic, cross-functional, and flexible.  Then, let the MDM technology team deliver the tools.

Face API and Power BI

At last week’s Build 2015 developer conference, Microsoft demonstrated many great new tools. One demo which got quite a bit of attention was the How Old Am I? app. The demo allows users to upload pictures and let the service “guess” the age and gender of the individuals in the photo. Within a few hours, the demo went viral, with over 210,000 images uploaded to the site from all over the world. The result was a dashboard of requests from all over the globe.

Power BI

This solution shows off the use of a number of powerful technologies.

Face API – Project Oxford is a set of Artificial Intelligence APIs and REST services which developers can use today to build Intelligent Systems. In addition to Facial Recognition, the Project Oxford AI services include Speech Recognition, Vision (Image Recognition and OCR), and Language Understanding Intelligent Services – leveraging the technology capabilities of Bing and Cortana.
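A call to the face detection endpoint is a simple REST request. The sketch below builds such a request with only the standard library; the endpoint and parameters reflect the Project Oxford API as documented at the time, while the subscription key and image URL are placeholders:

```python
# Build a Project Oxford Face API detect request asking for age and gender.
import json
import urllib.parse
import urllib.request

def build_detect_request(image_url, subscription_key):
    params = urllib.parse.urlencode({"returnFaceAttributes": "age,gender"})
    return urllib.request.Request(
        "https://api.projectoxford.ai/face/v1.0/detect?" + params,
        data=json.dumps({"url": image_url}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Ocp-Apim-Subscription-Key": subscription_key,
        },
        method="POST",
    )

req = build_detect_request("https://example.com/photo.jpg", "YOUR_KEY")
# faces = json.load(urllib.request.urlopen(req))
# each face carries a faceRectangle plus faceAttributes (age, gender)
```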

Azure Event Hubs –  a highly scalable publish-subscribe ingestor that can ingest millions of events per second. The Event Hubs API is used to stream a JSON document from the web page when the user uploads a picture.
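Sending such an event over the Event Hubs REST endpoint requires a Service Bus shared access signature. The namespace, hub name, policy, and key below are placeholders; the SAS-token scheme is the standard, documented one:

```python
# Build a SAS token and the JSON event to POST to an Event Hub.
import base64, hashlib, hmac, json, time
import urllib.parse

def make_sas_token(uri, key_name, key, ttl_seconds=3600):
    """Standard Service Bus shared access signature for `uri`."""
    expiry = str(int(time.time()) + ttl_seconds)
    to_sign = urllib.parse.quote_plus(uri) + "\n" + expiry
    signature = base64.b64encode(
        hmac.new(key.encode("utf-8"), to_sign.encode("utf-8"),
                 hashlib.sha256).digest()
    )
    return "SharedAccessSignature sr={}&sig={}&se={}&skn={}".format(
        urllib.parse.quote_plus(uri),
        urllib.parse.quote_plus(signature),
        expiry,
        key_name,
    )

endpoint = "https://mynamespace.servicebus.windows.net/howold/messages"
token = make_sas_token(endpoint, "SendPolicy", "PLACEHOLDER_KEY")
event = json.dumps({"country": "US", "age": 45, "gender": "male"})
# POST `event` to `endpoint` with headers:
#   Authorization: <token>, Content-Type: application/json
```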

Azure Stream Analytics – a fully managed, low-latency, high-throughput stream processing solution. Azure Stream Analytics lets you write your stream processing logic in a very simple SQL-like language.  This allows the solution to measure, every 10 seconds, how many requests arrived, from which countries, and of which gender and age.  These measurements become Facts for your analysis.
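The 10-second tumbling-window aggregation that the SQL-like query expresses can be sketched in plain Python (event timestamps are in seconds and the data is invented):

```python
# Bucket events into 10-second tumbling windows and count requests
# by (window, country, gender) -- the Facts the dashboard reads.
from collections import Counter

def tumbling_counts(events, window_seconds=10):
    counts = Counter()
    for ts, country, gender in events:
        window = ts - (ts % window_seconds)   # start of the 10s window
        counts[(window, country, gender)] += 1
    return counts

events = [
    (3,  "US", "male"),
    (7,  "US", "male"),
    (9,  "FR", "female"),
    (12, "US", "male"),   # falls in the next window
]

facts = tumbling_counts(events)
```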

Power BI – the team chose Power BI as the output of the Stream Analytics job, then selected the dataset and table created by Azure Stream Analytics. No additional coding is needed to create real-time dashboards.

The only downside to this is that my worst fears have been confirmed – I look older than I actually am by over 10 years! :(
How old do I look?!?!

The Business Value of Microsoft Azure – Part 5 – Notification Hubs

This article is part 5 of a series of articles that focus on the Business Value of Microsoft Azure. Microsoft Azure provides a variety of cloud based technologies that can enable organizations in a number of ways. Rather than focusing on the technical aspects of Microsoft Azure (there’s plenty of that content out there) this series will focus on business situations and how Microsoft Azure services can benefit.

In our last article we focused on virtualization and the use of virtual machines as part of an Infrastructure as a Service (IaaS) solution. While this is a great approach for traditional server workloads, there has been a significant shift in the way individuals interact with and consume information, suggesting the need for something different. Specifically, mobile devices have overtaken the PC in unit sales per year, and this presents an opportunity that many municipalities can tap into.

Let’s think back to our fictional town of Gamehendge. A hurricane is approaching and Mayor Wilson needs to warn the town’s citizens. Handling the scale required to communicate in this fashion would require a significant notification infrastructure. Why pay for this type of scale when it’s only needed on occasion? Microsoft Azure Notification Hubs is a massively scalable mobile push notification engine for quickly sending millions of messages to iOS, Android, Windows, or Kindle devices. It’s possible to tailor notifications to specific citizens or entire groups with just a few lines of code, and do it across any platform.

Further, Gamehendge has a population that doesn’t speak English as their native language. Traditional communications often go misunderstood. The templates feature of Notification Hubs provides a handy way to send localized push notifications, so you’re speaking to citizens in their own language. Templates also eliminate the hassle of storing the localization settings for each group.
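A template send through the Notification Hubs REST API boils down to one POST with a couple of headers. The namespace, hub name, tag, and message below are hypothetical; each device’s registered template substitutes the `$(alert)` property in its own language:

```python
# Assemble a Notification Hubs template-notification request.
# The Authorization value is a Service Bus SAS token (placeholder here).
import json

hub_url = ("https://gamehendge-ns.servicebus.windows.net"
           "/alerts/messages/?api-version=2015-01")

headers = {
    "Authorization": "<SAS token for the hub's Send policy>",
    "Content-Type": "application/json",
    "ServiceBusNotification-Format": "template",
    # Target only citizens registered with this tag, e.g. a locale group:
    "ServiceBusNotification-Tags": "locale_es",
}

# Template property; each registration renders it with its own template.
body = json.dumps({"alert": "Hurricane warning: seek shelter by 6 PM"})
# POST `body` to `hub_url` with `headers` to fan the message out.
```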

By combining the scalability and configurability of Notification Hubs with its ability to work with either on-premises or cloud-based systems, your municipality gains the ability to notify citizens of anything that can prepare and inform them – whether in an emergency or as part of a more generalized community awareness system. While Notification Hubs is just one small component of the Azure platform, it can have a significant impact in your community.

As a partner with BlumShapiro Consulting, Michael Pelletier leads our Technology Consulting Practice. He consults with a range of businesses and industries on issues related to technology strategy and direction, enterprise and solution architecture, service oriented architecture and solution delivery.