Archive for Digital Transformation

6 Steps For Creating Golden Records

If you are an organization seeking to improve the quality of the data in your business systems, begin by automating the creation of Golden Records. What is a Golden Record? A Golden Record is the most accurate, complete and comprehensive representation of a master data asset (i.e. Customer, Product, Vendor). Golden Records are created by pulling together incomplete data about some “thing” from the systems in which that data was entered. The System of Entry for a customer record may be a Customer Relationship Management (CRM) or Enterprise Resource Planning (ERP) system. Having multiple systems of entry for customer data can lead to poor-quality customer master data, leaving your employees working from bad information.

But why not simply integrate the CRM and ERP systems, so that each system has the same information about each customer? In theory, this is a perfect solution; in practice, it can be difficult to achieve. Consider these problems:

  1. What if there are duplicate records in the CRM? Should two records be entered into each ERP? Or the reverse: what if one CRM customer should generate two customers in the ERP (each with different pricing terms, for example)?
  2. What if one or more ERP systems require data to create a record, but that data is not typically (or ever) collected in the CRM? If the integration process fails, what will the remediation process be?
  3. What if one of your ERP systems cannot accommodate the data entered in CRM or other systems? For example, what if one of your ERP systems cannot support international postal codes? Are you prepared to customize or upgrade that system?

There are many more compatibility issues that can occur. The more Systems of Entry you must integrate, the more likely you are to have many obstacles standing between you and full integration. If your business process assumptions change over time, the automated nature of systems integration itself can become a source of data corruption, as mistakes in one system are automatically mirrored in others.

Golden Record Management, by contrast, offers a significantly less risky approach. Golden Records are created in the Master Data Management (MDM) system, not in the business systems. This means that corrections and enhancements to the master data can be made without impacting your current operations.

6 Steps For Creating Golden Records

At a high level, the process of creating Golden Records looks like this:

  1. Create a model for your master data in the master data management system. This model should include all the key attributes MDM can pull from the Systems of Entry that could be useful in creating a Golden Record.
  2. Load data into the model from the variety of SOEs available. These can be business systems, spreadsheets, or external data sources. Maintain the identity of each record, so that you know where the data came from and how the SOE identifies it (for example, the System ID for the record).
  3. Standardize the attributes that will be used to create clusters of records. For Customers and Vendors, location and address information should be standardized.
  4. If possible, verify attributes that will be used to create clusters of records.
  5. Create clusters of records by matching key attributes, so that records describing the same master data asset are grouped together. The cluster identifier will be the Golden Record identifier. You can also think of this in terms of a hierarchy: the Golden Record is the Parent and the source records are the Children.
  6. Populate the Golden Record, created in MDM, with attributes from the records in its cluster (the source data). This final step, called Survivorship, requires a deeper understanding of how the source data was entered than the previous five steps. We want to create a Golden Record that contains all the best data. Therefore, we need to make some judgements about which of the SOEs is also the best System of Record for a given attribute (or set of attributes). A sketch of steps 3 through 6 follows this list.
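
To make steps 3 through 6 concrete, here is a minimal sketch in Python. The field names, source systems, sample values and survivorship precedence below are illustrative assumptions, not the behavior of any particular MDM product; a real implementation would use fuzzier matching and richer survivorship rules.

```python
# Minimal sketch of standardize -> match/cluster -> survivorship (steps 3-6).
# All field names, source systems and precedence rules are invented for illustration.

# Step 2 output: records loaded from each System of Entry, keyed by source + system ID.
records = [
    {"source": "CRM", "system_id": "C-101",  "name": "Acme Corp.",       "postal": "06103", "terms": None},
    {"source": "ERP", "system_id": "900441", "name": "ACME CORP",        "postal": "6103",  "terms": "NET30"},
    {"source": "ERP", "system_id": "900512", "name": "Acme Corporation", "postal": "06103", "terms": "NET45"},
]

def standardize(rec):
    """Step 3: normalize the attributes that will drive matching."""
    rec["match_name"] = rec["name"].upper().rstrip(".").replace("CORPORATION", "CORP")
    rec["match_postal"] = rec["postal"].zfill(5)
    return rec

def cluster(recs):
    """Step 5: group records whose match keys agree; the cluster ID becomes the Golden Record ID."""
    golden_ids = {}
    for rec in recs:
        key = (rec["match_name"], rec["match_postal"])
        if key not in golden_ids:
            golden_ids[key] = f"GR-{len(golden_ids) + 1:04d}"
        rec["golden_id"] = golden_ids[key]
    return recs

# Step 6: survivorship - pick the winning source per attribute.
# Assumed precedence: the CRM is the best System of Record for the name,
# the ERP for payment terms. Real rules would also resolve conflicts between
# records from the same source (e.g., most recently updated wins).
PRECEDENCE = {"name": ["CRM", "ERP"], "terms": ["ERP", "CRM"]}

def survive(members):
    golden = {"golden_id": members[0]["golden_id"]}
    for attr, order in PRECEDENCE.items():
        for source in order:
            value = next((r[attr] for r in members if r["source"] == source and r.get(attr)), None)
            if value is not None:
                golden[attr] = value
                break
    return golden

recs = cluster([standardize(r) for r in records])
clusters = {}
for r in recs:
    clusters.setdefault(r["golden_id"], []).append(r)

for golden_id, members in clusters.items():
    print(survive(members))   # one Golden Record (Parent) per cluster of source records (Children)
```

Even at this scale the key ideas are visible: the cluster key becomes the Golden Record identifier, and survivorship selects a winning source per attribute rather than per record.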

Great! We’ve consolidated our master data, entered from a variety of systems, into one system which also contains a reference to a parent record, called the Golden Record. This Golden Record is our best representation of the “thing” we need to understand better.

But wait! The systems of entry, the systems your business USES to operate, have not been updated. Can you still take advantage of these Golden Records?

The answer is “yes” – you can take advantage of the Golden Records in two ways:

  1. As the basis for reporting, because each Golden Record is also a “roll-up” of real system records that are referenced by orders, returns, commissions, etc. Golden Records provide a foundation for consistent Enterprise Reporting.
  2. As the basis for data quality improvements in each system of entry, assuming these systems can import a batch of data and update existing records that match a system ID (see the sketch after this list).
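
Here is a minimal sketch of such a batch update, assuming a simple CSV hand-off; the file layout, column names and sample values are invented for illustration, and the only real requirement is that each row carries the target system’s own system ID.

```python
# Sketch of pushing Golden Record attributes back to a System of Entry as a batch file.
# The CSV layout and values are assumptions; rows are keyed by the ERP's own system ID
# so the target system can update its existing records.

import csv

golden_records = {
    "GR-0001": {"name": "Acme Corp.", "terms": "NET30"},
}

# Cross-reference kept in the MDM model: which source record belongs to which Golden Record.
xref = [
    {"source": "ERP", "system_id": "900441", "golden_id": "GR-0001"},
    {"source": "ERP", "system_id": "900512", "golden_id": "GR-0001"},
]

with open("erp_customer_updates.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["system_id", "name", "terms"])
    writer.writeheader()
    for row in xref:
        if row["source"] == "ERP":
            writer.writerow({"system_id": row["system_id"], **golden_records[row["golden_id"]]})
```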

These benefits of Golden Records are gained without the high risk and high costs that come with systems integration. Further, if you have modeled your master data correctly, it is possible to automate the data quality benefits of Golden Records Management, by updating these systems in real-time. See how BlumShapiro can help with your master data needs and golden record creation.

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.

The Value of Golden Records

Running multiple ERP systems simultaneously can be quite painful for any mid-size organization. Since each ERP maintains its own chart of accounts, financial consolidation and reporting can become all-consuming for the finance teams. When each ERP has its own Customer Master, sales team visibility into strategic accounts is limited, while smaller accounts receive terms that can become big problems for AR. These separate ERP systems create issues for other departments as well: marketing wants a single comprehensive product master; supply chain managers want a single comprehensive vendor master.

Obviously, there is hyperbole involved in my description. However, these are some of the many reasons executive management would like all business units working from a single ERP, with integrated financial reporting, consistent business processes for the whole company and lowered costs of operations.

So, you initiate a multi-year ERP implementation / migration / consolidation project.

At the outset, each ERP specialist is skeptical of the consolidation strategy. “Our ERP is tailored to our business unit” is a common argument for keeping each ERP running. When asked, “How’s the quality of the data?” the same ERP specialists may complain that the data quality is poor. Unfortunately, data problems don’t get better by maintaining the status quo.

Severe master data quality problems present an obstacle to an efficient ERP transition. Let’s think about the customer: if you were to bring all customer master records into a new system wholesale, you’d have many duplicated accounts. You’d have diverse naming convention issues. You’d have some accounts that refer to distribution centers, some to end users, some to drop ship locations. You’d have a wide variety of payment terms.

Get your ERP ambitions moving again, and focus on data quality in a way that enables the final goal—centralized and integrated business processes. Here’s how:

  1. Build Golden Records for Customer. A Golden Record is the fullest, cleanest and most accurate representation of your master data available. Golden Records are created by consolidating master data from multiple Systems of Record (ERPs and other systems), standardizing that data, verifying its accuracy where possible, and then building clusters of similar records. This process of matching facilitates the creation of Golden Records, which contain the best information from all the master data in the cluster.
  2. Do the same for Product.
  3. Do the same for Vendor.

Are you sensing a pattern? Provided your systems of record have a reasonable amount of data characterizing each row, similarity clusters can be built. Inaccurate, non-standard data makes the process a little harder, but it remains feasible. Accounting Master Data (i.e., GL Accounts) benefits further from a Uniform Chart of Accounts, to which all other systems may be mapped.
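
The chart of accounts mapping can be as simple as a lookup table maintained alongside the accounting master data. The sketch below illustrates the idea; the system names and account codes are made up.

```python
# Illustrative mapping of each ERP's local GL accounts onto a Uniform Chart of Accounts.
# System names and account numbers are invented for the example.

UNIFORM_COA = {
    "4000": "Revenue - Product Sales",
    "5000": "Cost of Goods Sold",
}

GL_MAP = {
    ("ERP-East", "400-10"):  "4000",
    ("ERP-West", "SALES01"): "4000",
    ("ERP-East", "500-10"):  "5000",
    ("ERP-West", "COGS01"):  "5000",
}

def to_uniform(system, local_account):
    """Translate a local GL account into the uniform chart of accounts for consolidation."""
    uniform = GL_MAP.get((system, local_account))
    return (uniform, UNIFORM_COA[uniform]) if uniform else (None, None)

print(to_uniform("ERP-West", "SALES01"))   # ('4000', 'Revenue - Product Sales')
```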

Golden Records Management is a non-intrusive, low-risk tool for accelerating the ERP migration process. Building Golden Records is repeatable for many types of master data and provides a means for preparing the best possible data for import into any new system. In Part 2, I’ll talk about how Golden Records and Master Data Management deliver a perpetual framework for Data Quality, extending the lifetime of legacy systems.

Want to learn more about the impact of master data on your organization? Join us on December 6 in Hartford, CT for our half-day workshop Discovering the Value in Your Data. Hear from data governance experts from BlumShapiro Consulting and Profisee as they address key topics for business, finance and technology leaders on data and master data management.


See the Impact Digital Transformation Can Have on Your Bottom Line

Digital Transformation has become an industry buzzword. We’re here to clarify what it means in dollars and cents.

Digital transformation represents an organizational change in which data becomes relevant and valuable. Once transformed, these organizations use data to improve decision making, connect with their customers, improve vendor relationships and allow employees to contribute higher-level skills and value to the organization.

Digitally transformed organizations think about their products and services in both a physical and a digital space, use technology to improve customer service, and often have an enhanced perspective of their market and of how their business model operates within that redefined market.

We believe that digital transformation is a qualification to compete in today’s business environment.

The question people often ask next is: how much does this cost? We posit that the answer is nothing. The cost (and risk) is in remaining stagnant. Digital transformation uncovers assets previously underutilized by the organization. The proper investments in digital transformation will empower your organization to survive and thrive, and they should yield an immediate, direct ROI that returns value to the organization.

We’ve developed a Digital Transformation Accounting Worksheet ROI calculator for you to experiment with. Punch in your numbers and let us know what you think. We would be happy to discuss your digital transformation in more detail.
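
The worksheet is the place to run your real figures; the sketch below only shows the general shape of the arithmetic behind an ROI calculation, with placeholder numbers rather than values from our calculator.

```python
# Rough shape of an ROI calculation. All inputs are placeholders.

annual_benefit = 250_000       # e.g., hours saved, fewer errors, faster decisions (assumed)
one_time_investment = 120_000  # implementation cost (assumed)
annual_run_cost = 30_000       # licensing and support (assumed)

first_year_roi = (annual_benefit - annual_run_cost - one_time_investment) / one_time_investment
payback_months = 12 * one_time_investment / (annual_benefit - annual_run_cost)

print(f"First-year ROI: {first_year_roi:.0%}")         # 83%
print(f"Payback period: {payback_months:.1f} months")  # 6.5 months
```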

About Noah:

Noah is the Director of Business Development for BlumShapiro’s Technology Consulting Group. He brings over 25 years of business experience, from entrepreneurial start-ups to more than a decade at Microsoft in various sales, marketing and business development roles. Noah has launched Windows XP, Office XP, Tablet PC, Media Center PC, MSN Direct Smartwatches (an early IoT attempt), several video games, a glove controller, and a wine import company/brand. Noah spent three years living overseas building out Microsoft’s Server and Tools business in Eastern Europe, working with the IT Pro and developer communities. He considers himself a futurist, likes science fiction and loves applying what was recently science fiction to real-world problems and opportunities.

5 Tips to Make Power BI Easy on the Brain

How you present your data-driven insights is important! Unfortunately, analysts can sometimes forget to tell their story effectively, leaping from data exploration to a dashboard without giving much thought to how audiences will receive the information. Your insights can get lost in a messy report. If the information is critical, shouldn’t the communication medium be crisp, clean and understood at a glance?

Advanced Analytics tools, such as Qlik and Power BI, are fantastic for creating interactive dashboards and reports you can use to explore large datasets, understand trends, track key ratios and indicators, and then share insights with your colleagues, or your boss. What makes these tools useful? They take data and refine it into information by placing a visualization over it, thereby helping our visually oriented brains make sense of the numbers. When it comes to understanding the meaning behind the numbers, a data table or Excel report can leave a brain very, very tired.

Here are some quick tips for making your next analytics report “Easy on the brain”!

Our 5 Power BI Tips:

Respect the Rim

Before my career in technology began, I worked as a waiter, and I worked at some pretty classy spots. If you have never had the pleasure, let me share with you that before each plate makes it to its appointed destination, it is briefly inspected. If any sauces, herbs, or actual food have errantly landed on the rim of the plate, they are removed. “Respect the Rim,” my mentor once told me. The same is true for your data and information. Enforce a thin, empty margin around each of your Power BI reports. Using the “Snap to Grid” feature, make sure that each visual on the report is aligned to your self-imposed margin. The analysis will look sharper and more credible.

What’s the Headline?

Most reports have data points which are essential, such as Key Performance Indicators (KPIs) defined by management, or other indicators such as “Bottom Line” financials like Net Income or EBITDA. These essential measurements are the Headline for the report. Don’t bury the Headline: always place key information in the upper left-hand corner of the report, either in a Power BI Card or KPI visual.

Once you have done that, you can further segment visual information into key categories, and keep them segmented into groups. For example, beneath your KPI you may want to provide leading indicators or contributing factors. Another option may be to provide a group of categorical breakdowns together, or key ratios that contribute to the success or challenges of your headline. You may want to provide a group of visuals providing detailed exposition – a table or other visual with detailed categories. The information should flow from the Headline to the Exposition, just as a newspaper story would.


Have a Perspective

Reports that provide multiple ways to filter and slice the data are very helpful to data analysts, data scientists and casual explorers. These are tools to help you with data exploration. Once you’ve found insights worth sharing, focus the audience’s attention on that information by removing the extraneous bits. Don’t worry – the underlying Power BI data model remains for exploration of new insights later.

Make It Mobile

I always make a point of creating a phone layout for my reports, because it is very easy in Power BI. Due to the smaller form factor, phone layouts require you to be even more selective about the essential information. However, if you’ve followed my advice, then you already know what the headline is. Simply drag the headlines onto the phone layout for your report before publishing. Among many outstanding features, the Power BI mobile app allows users to get notifications and data-driven alerts. End users can even mark up phone reports, distribute them with their annotations and launch a conversation.

Intentional Style

Use good judgement and don’t get carried away with excessive colors, logos, or background images. I find that a company logo can be helpful for some audiences. However, when it comes to colors, be mindful of some universal rules.

  1. Some colors convey information, intended or otherwise (Red, Yellow and Green, for example)
  2. Company branding adds a professional touch – use the color scheme
  3. Use no more than 10 data colors, no more than 3 backgrounds and no more than 4 fonts

I recommend saving a custom theme for Power BI that reflects these guidelines. Save off your company theme for later reuse; any changes to it can then be applied globally.
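
A Power BI report theme is just a JSON file. The sketch below writes a minimal one that stays within the guidelines above; the colors are placeholders for a company palette, and only a handful of the available theme properties are shown.

```python
# Writes a minimal Power BI report theme JSON that keeps the palette within the guidelines above.
# Color values are placeholders; swap in your company branding.

import json

theme = {
    "name": "Company Theme",
    # No more than 10 data colors; Power BI cycles through these in order.
    "dataColors": ["#1F4E79", "#2E75B6", "#9DC3E6", "#C55A11", "#ED7D31",
                   "#FFC000", "#70AD47", "#A9D18E", "#7F7F7F", "#D9D9D9"],
    "background": "#FFFFFF",
    "foreground": "#252423",
    "tableAccent": "#1F4E79",
}

with open("company_theme.json", "w") as f:
    json.dump(theme, f, indent=2)

# In Power BI Desktop: View > Themes > Browse for themes, then pick company_theme.json.
```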

Your Most Important Information, Quickly Understood

Critical information is more valuable when it is quickly understood. Indeed, each visualization available conveys information in its own manner; therefore, it is important for professionals who prepare analysis with advanced analytics tools to be mindful of the strengths and weaknesses of each. Regardless of audience, these five guidelines apply.

Advanced analytics tools are a critical component to digital transformation, because they enable data-driven decision making. Among other things, data-driven decision making requires the creation of information from data. Often, that data is massive, or moving quite rapidly. To extract insights and reduce business uncertainty, talk to BlumShapiro about our analytics service offerings. We’ll provide a road map to data-driven decision making, enabling digital information at your fingertips. And you can focus on your business.
