Archive for Brian Berry

6 Steps For Creating Golden Records

If you are an organization seeking to improve the quality of the data in your business systems, begin by automating the creation of Golden Records. What is a Golden Record? A Golden Record is the most accurate, complete and comprehensive representation of a master data asset (e.g., Customer, Product, Vendor). Golden Records are created by pulling together incomplete data about some “thing” from the systems in which it was entered. The System of Entry for a customer record may be a Customer Relationship Management (CRM) or Enterprise Resource Planning (ERP) system. Having multiple systems of entry for customer data can lead to poor quality of customer master data – even giving your employees bad information to work from.

But why not simply integrate the CRM and ERP systems, so that each system has the same information about each customer? In theory, this is a perfect solution; in practice, it can be difficult to achieve. Consider these problems:

  1. What if there are duplicate records in the CRM? Should two records be entered into each ERP? Or the reverse: what if one CRM customer should generate two customers in the ERP (each with different pricing terms, for example)?
  2. What if one or more ERP systems require data to create a record, but that data is not typically (or ever) collected in the CRM? If the integration process fails, what will the remediation process be?
  3. What if one of your ERP systems cannot accommodate the data entered in CRM or other systems? For example, what if one of your ERP systems cannot support international postal codes? Are you prepared to customize or upgrade that system?

There are many more compatibility issues that can occur. The more Systems of Entry you must integrate, the more obstacles are likely to stand between you and full integration. If your business process assumptions change over time, the automated nature of systems integration itself can become a source of data corruption, as mistakes in one system are automatically mirrored in others.

Golden Record Management, by contrast, offers a significantly less risky approach. Golden Records are created in the Master Data Management (MDM) system, not in the business systems. This means that corrections and enhancements to the master data can be made without impacting your current operations.

6 Steps For Creating Golden Records

At a high level, the process of creating Golden Records looks like this:

  1. Create a model for your master data in the master data management system. This model should include all the key attributes MDM can pull from Systems of Entry that could be useful for creating a Golden Record.
  2. Load data into the model from the variety of SOEs (Systems of Entry) available. These can be business systems, spreadsheets, or external data sources. Maintain the identity of each record, so that you know where the data came from and how the SOE identifies it (for example, the System ID for the record).
  3. Standardize the attributes that will be used to create clusters of records. For Customers and Vendors, location and address information should be standardized.
  4. If possible, verify attributes that will be used to create clusters of records.
  5. Create clusters of records by matching key attributes, grouping related master data records together. The cluster identifier will be the Golden Record identifier. You can also think of this in terms of a hierarchy: the Golden Record is the Parent and the source records are the Children.
  6. Populate the Golden Record, created in MDM, with attributes from the records in its cluster (the source data). This final step, called Survivorship, requires a deeper understanding of how the source data was entered than the previous five steps. We want to create a Golden Record that contains all the best data. Therefore, we need to make some judgments about which of the SOEs is also the best System of Record for a given attribute (or set of attributes); a sketch of this step follows the list.
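
To make steps 5 and 6 a bit more concrete, here is a minimal Survivorship sketch in Python with pandas, assuming the clusters from step 5 have already been assigned. The sample records, column names and precedence rules are illustrative assumptions, not a prescribed schema; a commercial MDM platform would typically expose survivorship rules through configuration rather than code.

```python
import pandas as pd

# Hypothetical consolidated master data: one row per source record,
# already assigned to a cluster (golden_id) in step 5.
records = pd.DataFrame([
    {"golden_id": "G-001", "source_system": "CRM", "source_id": "C-17",
     "name": "Acme Corp", "phone": None, "postal_code": "06103"},
    {"golden_id": "G-001", "source_system": "ERP", "source_id": "10042",
     "name": "ACME CORPORATION", "phone": "860-555-0100", "postal_code": "06103"},
])

# Survivorship rule (an assumption for illustration): per attribute, prefer the
# system judged to be the best System of Record, falling back to any non-null value.
PRECEDENCE = {"name": ["CRM", "ERP"], "phone": ["ERP", "CRM"], "postal_code": ["ERP", "CRM"]}

def survive(cluster: pd.DataFrame) -> dict:
    golden = {"golden_id": cluster["golden_id"].iloc[0]}
    for attr, systems in PRECEDENCE.items():
        value = None
        for system in systems:
            candidates = cluster.loc[cluster["source_system"] == system, attr].dropna()
            if not candidates.empty:
                value = candidates.iloc[0]
                break
        golden[attr] = value
    return golden

golden_records = pd.DataFrame(
    [survive(group) for _, group in records.groupby("golden_id")]
)
print(golden_records)
```

In practice, precedence alone is rarely enough; survivorship rules often also weigh recency, completeness and data steward overrides.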

Great! We’ve consolidated our master data, entered from a variety of systems, into one system which also contains a reference to a parent record, called the Golden Record. This Golden Record is our best representation of the “thing” we need to understand better.

But wait! The systems of entry, the systems your business USES to operate, have not been updated. Can you still take advantage of these Golden Records?

The answer is “yes” – you can take advantage of the Golden Records in two ways:

  1. As the basis for reporting, because each Golden Record is also a “roll-up” of real system records that are referenced by orders, returns, commissions, etc. Golden Records provide a foundation for consistent Enterprise Reporting (see the sketch after this list).
  2. As the basis for data quality improvements in each system of entry, assuming these systems can import a batch of data and update existing records that match a system ID.
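
To illustrate the first point, the sketch below assumes the MDM system publishes a cross-reference of source-system customer IDs to Golden Record IDs; the table and column names are made up for this example. Orders keep their original customer references, and the roll-up happens at report time.

```python
import pandas as pd

# Hypothetical cross-reference produced by the MDM matching process:
# every source customer record points at its Golden Record.
xref = pd.DataFrame([
    {"source_system": "CRM", "source_id": "C-17",  "golden_id": "G-001"},
    {"source_system": "ERP", "source_id": "10042", "golden_id": "G-001"},
    {"source_system": "ERP", "source_id": "10077", "golden_id": "G-002"},
])

# Orders still reference the customer IDs of the system they were entered in.
orders = pd.DataFrame([
    {"source_system": "ERP", "source_id": "10042", "amount": 1200.00},
    {"source_system": "ERP", "source_id": "10077", "amount":  450.00},
    {"source_system": "CRM", "source_id": "C-17",  "amount":  300.00},
])

# Roll up transactional activity to the Golden Record for consistent reporting.
report = (
    orders.merge(xref, on=["source_system", "source_id"], how="left")
          .groupby("golden_id", as_index=False)["amount"].sum()
)
print(report)
```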

These benefits of Golden Records are gained without the high risk and high costs that come with systems integration. Further, if you have modeled your master data correctly, it is possible to automate the data quality benefits of Golden Records Management by updating these systems in real time. See how BlumShapiro can help with your master data needs and Golden Record creation.

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.

The Value of Golden Records

Running multiple ERP systems simultaneously can be quite painful for any mid-size organization. Since each ERP maintains its own chart of accounts, financial consolidation and reporting can become all-consuming for the finance teams. When each ERP has its own Customer Master, sales team visibility into strategic accounts is limited, while smaller accounts receive terms that can become big problems for AR. These separate ERP systems create issues for other departments as well: marketing wants a single comprehensive product master; supply chain managers want a single comprehensive vendor master.

Obviously, there is hyperbole involved in my description. However, these are some of the many reasons executive management would like all business units working from a single ERP, with integrated financial reporting, consistent business processes for the whole company and lowered costs of operations.

So, you initiated a multi-year ERP implementation / migration / consolidation project.

At the outset, each ERP specialist is skeptical of the consolidation strategy. “Our ERP is tailored to our business unit” is a common argument for keeping each ERP running. When asked, “How’s the quality of the data?” the same ERP specialists may complain that the data quality is poor. Unfortunately, data problems don’t get better by maintaining the status quo.

Severe master data quality problems present an obstacle to an efficient ERP transition. Let’s think about the customer: if you were to bring all customer master records into a new system wholesale, you’d have many duplicated accounts. You’d have diverse naming convention issues. You’d have some accounts that refer to distribution centers, some to end users, some to drop ship locations. You’d have a wide variety of payment terms.

Get your ERP ambitions moving again, and focus on data quality in a way that enables the final goal—centralized and integrated business processes. Here’s how:

  1. Build Golden Records for Customer. A Golden Record is a representation of your master data that contains the fullest, cleanest and most accurate information available. Golden Records are created by consolidating master data from multiple Systems of Record (ERPs and other systems), standardizing that data, verifying its accuracy where possible, and then building clusters of similar records. This process of matching facilitates the creation of Golden Records, which contain the best information from all the master data in the cluster.
  2. Do the same for Product.
  3. Do the same for Vendor.

Are you sensing a pattern? Provided your systems of record have a reasonable amount of data characterizing each record, similarity clusters can be built. Inaccurate, non-standard data makes the process a little harder, but it remains feasible. Accounting Master Data (i.e., GL Accounts) further benefits from a Uniform Chart of Accounts, to which all other systems may be mapped.
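
As a rough illustration, here is how similarity clusters for Customer might be formed by standardizing a couple of attributes and grouping on the resulting match key. The field names and normalization rules are assumptions for the example; real MDM matching engines use richer fuzzy and probabilistic techniques, but the shape of the process is the same.

```python
import re
import pandas as pd

customers = pd.DataFrame([
    {"source_system": "ERP-A", "source_id": "10042", "name": "ACME Corporation", "postal_code": "06103-1205"},
    {"source_system": "ERP-B", "source_id": "7733",  "name": "Acme Corp.",       "postal_code": "06103"},
    {"source_system": "ERP-B", "source_id": "8120",  "name": "Globex, Inc.",     "postal_code": "06002"},
])

LEGAL_SUFFIXES = r"\b(corp|corporation|inc|incorporated|llc|co|company)\b"

def match_key(row: pd.Series) -> str:
    # Standardize the attributes used for matching: strip punctuation,
    # drop legal suffixes, and keep only the 5-digit postal code.
    name = re.sub(r"[^\w\s]", "", row["name"].lower())
    name = re.sub(LEGAL_SUFFIXES, "", name).strip()
    postal = str(row["postal_code"])[:5]
    return f"{name}|{postal}"

customers["match_key"] = customers.apply(match_key, axis=1)

# Each distinct match key becomes a cluster; the cluster ID becomes the Golden Record ID.
customers["golden_id"] = "G-" + customers.groupby("match_key").ngroup().astype(str).str.zfill(3)
print(customers[["source_system", "source_id", "name", "golden_id"]])
```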

Golden Records Management is a non-intrusive, low-risk tool for accelerating the ERP migration process. Building Golden Records is repeatable for many types of master data and provides a means for preparing the best possible data for import into any new system. In Part 2, I’ll talk about how Golden Records and Master Data Management deliver a perpetual framework for Data Quality, extending the lifetime of legacy systems.

Want to learn more about the impact of master data on your organization? Join us on December 6 in Hartford, CT for our half-day workshop Discovering the Value in Your Data. Hear from data governance experts from BlumShapiro Consulting and Profisee as they address key topics for business, finance and technology leaders on data and master data management.


5 Tips to Make Power BI Easy on the Brain

How you present your data-driven insight is important! Unfortunately, analysts can sometimes forget to tell their story effectively, leaping from data exploration to a dashboard without giving much thought to how audiences will receive the information. Your insights can get lost in a messy report. If the information is critical, shouldn’t the communication medium be crisp, clean and understood at a glance?

Advanced Analytics tools, such as Qlik and Power BI, are fantastic for creating interactive dashboards and reports you can use to explore large datasets, understand trends, track key ratios and indicators, and then share insights with your colleagues, or your boss. What makes these tools useful? They take data and refine it into information by placing a visualization over it, thereby helping our visually oriented brains make sense of the numbers. When it comes to understanding the meaning behind the numbers, a data table or Excel report can leave a brain very, very tired.

Here are some quick tips for making your next analytics report “Easy on the brain”!

Our 5 Power BI Tips:

Respect the Rim

Before my career in technology began, I worked as a waiter, and I worked at some pretty classy spots. If you have never had the pleasure, let me share with you that before each plate makes it to its appointed destination, it is briefly inspected. If any sauces, herbs, or actual food have errantly landed on the rim of the plate, they are removed. “Respect the Rim,” my mentor once told me. The same is true for your data and information. Enforce a thin empty “margin” around each of your Power BI reports. By using the “Snap to Grid” feature, make sure that each visual on the report is aligned to your self-imposed margin. The analysis will look sharper and more credible.

What’s the Headline?

Most reports have data points which are essential, such as Key Performance Indicators (KPIs) defined by management, or other indicators such as “Bottom Line” financials, Net Income or EBITDA. These essential measurements are the Headline for the report. Don’t bury the Headline – always place key information in the upper left-hand corner of the report, either in a Power BI Card or KPI visual.

Once you have done that, you can further segment visual information into key categories, and keep them segmented into groups. For example, beneath your KPI you may want to provide leading indicators or contributing factors. Another option may be to provide a group of categorical breakdowns together, or key ratios that contribute to the success or challenges of your headline. You may want to provide a group of visuals providing detailed exposition – a table or other visual with detailed categories. The information should flow from the Headline to the Exposition, just as a newspaper story would.


Have a Perspective

Reports that provide multiple ways to filter and slice the data are very helpful to data analysts, data scientists and casual explorers. These are tools to help you with data exploration. Once you’ve found insights worth sharing, focus the audience’s attention on that information by removing the extraneous bits. Don’t worry – the underlying Power BI data model remains for exploration of new insights later.

Make It Mobile

I always make a point of creating a phone layout for my reports, because it is very easy in Power BI. Due to the smaller form factor, phone layouts require you to be even more selective about which information is essential. However, if you’ve followed my advice, then you already know what the headline is. Simply drag the headlines onto the phone layout for your report before publishing. Among many outstanding features, the Power BI mobile app allows users to get notifications and data-driven alerts. End users can even mark up phone reports, distribute them with their annotations and launch a conversation.

Intentional Style

Use good judgment and don’t get carried away with excessive colors, logos, or background images. I find that a company logo can be helpful for some audiences. However, when it comes to colors, be mindful of some universal rules.

  1. Some colors convey information, intended or otherwise (Red, Yellow and Green, for example)
  2. Company branding adds a professional touch – use the color scheme
  3. Use no more than 10 data colors, no more than 3 backgrounds and no more than 4 fonts

I recommend saving a Custom Theme for Power BI that reflects these guidelines. Save your company theme once for later reuse; any changes can then be applied globally.

Your Most Important Information, Quickly Understood

Critical information is more valuable when it is quickly understood. Indeed, each visualization available conveys information in its own manner; therefore, it is important for professionals who prepare analysis with advanced analytics tools to be mindful of the strengths and weaknesses of each. Regardless of audience, these five guidelines apply.

Advanced analytics tools are a critical component to digital transformation, because they enable data-driven decision making. Among other things, data-driven decision making requires the creation of information from data. Often, that data is massive, or moving quite rapidly. To extract insights and reduce business uncertainty, talk to BlumShapiro about our analytics service offerings. We’ll provide a road map to data-driven decision making, enabling digital information at your fingertips. And you can focus on your business.


Our 5 Rules of Data Science

In manufacturing, the better the raw materials, the better the product. The same goes for data science, where a team cannot be effective unless the raw materials of data science are available to them. In this realm, data is the raw material which produces a prediction. However, raw materials alone are not sufficient. Business people who oversee machine learning teams must demand that best practices be applied; otherwise, investments in machine learning will produce dubious business results. These best practices can be summarized into our five rules of data science.

For the purpose of illustration, let’s assume the data science problem our team is working on is related to the predictive maintenance of equipment on a manufacturing floor. Our team is working on helping the firm predict equipment failure, so that operations can replace the equipment before it impacts the manufacturing process.

Our 5 Rules of Data Science

1. Have a Sharp Question

A sharp question is specific and unambiguous. Computers do not appreciate nuance. They are not able to classify events into yes/no buckets if the question is: “Is Component X ready to fail?” Nor does the question need to concern itself with causes. Computers do not ask why – they calculate probability based upon correlation. “Will component X overheat?” is a question posed by a human who believes that heat contributes to equipment failure. A better question is: “Will component X fail in the next 30 minutes?”
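
Part of what makes the sharper question better is that it translates directly into a yes/no label that can be computed from a log of confirmed failure events. The sketch below assumes a hypothetical sensor-reading table and failure log; the 30-minute horizon comes from the question itself.

```python
from datetime import timedelta
import pandas as pd

HORIZON = timedelta(minutes=30)

# Hypothetical inputs: timestamped sensor readings for component X,
# plus a log of confirmed failure events for the same component.
readings = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-05-01 10:00", "2023-05-01 10:10", "2023-05-01 10:40"]),
    "temperature": [71.2, 88.5, 90.1],
    "vibration": [0.02, 0.09, 0.11],
})
failures = pd.to_datetime(pd.Series(["2023-05-01 10:35"]))

# "Will component X fail in the next 30 minutes?" becomes a yes/no label:
# 1 if any confirmed failure occurs within HORIZON after the reading.
readings["fails_within_30_min"] = readings["timestamp"].apply(
    lambda t: int(((failures > t) & (failures <= t + HORIZON)).any())
)
print(readings)
```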

2. Measure at the Right Level

Supervised learning requires real examples from which a computer can learn. The data you use to produce a successful machine learning model must demonstrate cases where failure has occurred. It must also demonstrate examples where equipment continues to operate smoothly. We must be able to unambiguously identify events that were failure events, otherwise, we will not be able to train the machine learning model to classify data correctly.
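
Once failure and non-failure examples are unambiguously labeled, training a classifier is the straightforward part. Here is a minimal sketch using scikit-learn, with synthetic data standing in for the labeled sensor windows; the feature set and model choice are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical training set: one row of sensor measurements per time window,
# labeled 1 if a confirmed failure occurred within the following 30 minutes.
rng = np.random.default_rng(42)
X = rng.normal(size=(500, 3))            # e.g. temperature, vibration, load
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=500) > 1.2).astype(int)

# Hold out part of the data so the model is judged on examples it has not seen.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```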

3. Make Sure Your Data is Accurate

Did a failure really occur? If not, the machine learning model will not produce accurate results. Computers are naïve – they believe what we tell them. Data science teams should be more skeptical, particularly when they believe they have made a breakthrough discovery after months of false starts. Data science leaders should avoid getting caught up in the irrational exuberance of a model that appears to provide new insight. As in any scientific endeavor, test your assumptions, beginning with the accuracy and reliability of the observations you started with to create the model.

4. Make Sure Your Data is Connected

The data used to train your model may be anonymized, because the factors that correlate closely with machine failure are measurements, not identifiers. However, once the model is ready to be used, the new data must be connected to the real world – otherwise, you will not be able to take action. If you have no central authoritative record of “things,” you may need to develop a master data management solution before a predictive maintenance solution built on the Internet of Things can yield value. Your response to a prediction should also be connected: once a prediction of failure has been obtained, management should already know what needs to happen – use insights to take swift action.

5. Make Sure You Have Enough Data

The accuracy of predictions improves with more data. Make sure you have sufficient examples of both positive and negative outcomes; otherwise, it will be difficult to be certain that you are truly gaining information from the exercise.
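
A quick sanity check before investing further: count the positive and negative examples in your labeled data. The threshold below is an arbitrary placeholder; what counts as “enough” depends on the problem and the model.

```python
import pandas as pd

# Hypothetical labeled training data, as produced in the labeling step above.
labels = pd.Series([0, 0, 0, 1, 0, 0, 1, 0, 0, 0], name="fails_within_30_min")

counts = labels.value_counts()
positives = int(counts.get(1, 0))
negatives = int(counts.get(0, 0))
print(f"failure examples: {positives}, non-failure examples: {negatives}")

# Arbitrary illustrative floor: warn if either class is too rare to learn from.
MIN_EXAMPLES_PER_CLASS = 100
if min(positives, negatives) < MIN_EXAMPLES_PER_CLASS:
    print("Warning: not enough examples of one class; collect more data "
          "or rebalance before trusting the model.")
```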

The benefits of predictive maintenance, and other applications of machine learning, are being embraced by businesses everywhere. For some, the process may appear a bit mysterious, but it needn’t be. The goal is to create a model which, when fed real-life data, improves the decision making of the humans involved in the process. To achieve this, data science teams need the right data and the right business problem to solve. Management should work to ensure that these five rules are addressed to their satisfaction before investing in data science activities.

Not sure if you have the right raw materials? Talk to BlumShapiro Consulting about your machine learning ambitions. Our technology team is building next generation predictive analytics solutions that connect to the Internet of Things. We are helping our clients along each step of their digital transformation journey.
