Archive for Brian Berry

5 Tips to Make Power BI Easy on the Brain

How you present your data-driven insights is important! Unfortunately, analysts sometimes forget to tell their story effectively, leaping from data exploration to a dashboard without giving much thought to how audiences will receive the information. Your insights can get lost in a messy report. If the information is critical, shouldn’t the communication medium be crisp, clean and understood at a glance?

Advanced analytics tools such as Qlik and Power BI are fantastic for creating interactive dashboards and reports you can use to explore large datasets, understand trends, track key ratios and indicators, and then share insights with your colleagues or your boss. What makes these tools useful? They take data and refine it into information by placing a visualization over it, helping our visually oriented brains make sense of the numbers. When it comes to understanding the meaning behind the numbers, a data table or Excel report can leave a brain very, very tired.

Here are some quick tips for making your next analytics report “Easy on the brain”!

Our 5 Power BI Tips:

Respect the Rim

Before my career in technology began, I worked as a waiter, and I worked at some pretty classy spots. If you have never had the pleasure, let me share with you that before each plate makes it to its appointed destination, it is briefly inspected. If any sauces, herbs or actual food have errantly landed on the rim of the plate, they are removed. “Respect the Rim,” my mentor once told me. The same is true for your data and information. Enforce a thin, empty “margin” around each of your Power BI reports. Using the “Snap to Grid” feature, make sure that each visual on the report is aligned to your self-imposed margin. The analysis will look sharper and more credible.

What’s the Headline?

Most reports have data points which are essential, such as Key Performance Indicators (KPIs) defined by management, or other indicators such as “Bottom Line” financials like Net Income or EBITDA. These essential measurements are the Headline for the report. Don’t bury the Headline – always place key information in the upper left-hand corner of the report, either in a Power BI Card or KPI visual.

Once you have done that, you can segment the remaining visuals into key categories and keep those groups together. For example, beneath your KPI you may want to provide leading indicators or contributing factors. Another option is to group categorical breakdowns together, or key ratios that contribute to the success or challenges of your headline. You may also want a group of visuals for detailed exposition – a table or other visual with detailed categories. The information should flow from the Headline to the Exposition, just as a newspaper story would.


Have a Perspective

Reports that provide multiple ways to filter and slice the data are very helpful to data analysts, data scientists and casual explorers. These are tools to help you with data exploration. Once you’ve found insights worth sharing, focus the audience’s attention on that information by removing the extraneous bits. Don’t worry – the underlying Power BI data model remains for exploration of new insights later.

Make It Mobile

I always make a point of creating a phone layout for my reports, because it is very easy to do in Power BI. Due to the smaller form factor, phone layouts require you to be even more selective about which information is essential. However, if you’ve followed my advice, then you already know what the headline is. Simply drag the headlines onto the phone layout for your report before publishing. Among many outstanding features, the Power BI mobile app allows users to get notifications and data-driven alerts. End users can even mark up phone reports, distribute them with their annotations and launch a conversation.

Intentional Style

Use good judgement and don’t get carried away with excessive colors, logos, or background images. I find that a company logo can be helpful for some audiences. However, when it comes to colors, be mindful of some universal rules.

  1. Some colors convey information, intended or otherwise (red, yellow and green, for example)
  2. Company branding adds a professional touch – use your company’s color scheme
  3. Use no more than 10 data colors, no more than 3 backgrounds and no more than 4 fonts

I recommend saving a Custom Theme for Power BI that reflects these guidelines. Save your company theme once and reuse it; any changes to the theme can then be applied globally.
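
To make that concrete, here is a rough sketch of what a saved theme might look like, expressed as a small Python script that writes the theme JSON file. The palette, file name and property values are placeholders for illustration, not an official color scheme, and only a basic subset of theme properties is shown.

    import json

    # Illustrative Power BI report theme: a small, branded data color palette,
    # one background color and a table accent. Hex values are placeholders.
    theme = {
        "name": "Company Standard",
        "dataColors": [
            "#1F4E79", "#2E75B6", "#9DC3E6", "#767171", "#C00000",
            "#70AD47", "#FFC000", "#7030A0", "#264478", "#636363",
        ],
        "background": "#FFFFFF",
        "foreground": "#252423",
        "tableAccent": "#1F4E79",
    }

    # Save the theme once; import it into Power BI Desktop as a custom theme
    # so palette changes can later be applied to reports globally.
    with open("company_theme.json", "w") as f:
        json.dump(theme, f, indent=2)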

Your Most Important Information, Quickly Understood

Critical information is more valuable when it is quickly understood. Each available visualization conveys information in its own manner; therefore, professionals who prepare analysis with advanced analytics tools should be mindful of the strengths and weaknesses of each. Regardless of audience, these five guidelines apply.

Advanced analytics tools are a critical component to digital transformation, because they enable data-driven decision making. Among other things, data-driven decision making requires the creation of information from data. Often, that data is massive, or moving quite rapidly. To extract insights and reduce business uncertainty, talk to BlumShapiro about our analytics service offerings. We’ll provide a road map to data-driven decision making, enabling digital information at your fingertips. And you can focus on your business.

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.

Our 5 Rules of Data Science

In manufacturing, the better the raw materials, the better the product. The same goes for data science, where a team cannot be effective unless the raw materials of data science are available to them. In this realm, data is the raw material which produces a prediction. However, raw materials alone are not sufficient. Business people who oversee machine learning teams must demand that best practices be applied; otherwise, investments in machine learning will produce dubious business results. These best practices can be summarized in our five rules of data science.

For the purpose of illustration, let’s assume our data science team is working on the predictive maintenance of equipment on a manufacturing floor: helping the firm predict equipment failure so that operations can replace the equipment before it disrupts the manufacturing process.

Our 5 Rules of Data Science

1. Have a Sharp Question

A sharp question is specific and unambiguous. Computers do not appreciate nuance. They are not able to classify events into yes/no buckets if the question is: “Is Component X ready to fail?” Nor does the question need to concern itself with causes. Computers do not ask why – they calculate probability based upon correlation. “Will component X overheat?” is a question posed by a human who believes that heat contributes to equipment failure. A better question is: “Will component X fail in the next 30 minutes?”

2. Measure at the Right Level

Supervised learning requires real examples from which a computer can learn. The data you use to produce a successful machine learning model must include cases where failure has occurred, as well as examples where equipment continues to operate smoothly. We must be able to unambiguously identify which events were failure events; otherwise, we will not be able to train the machine learning model to classify data correctly.
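
To make this concrete, here is a minimal sketch of how the sharp question from Rule 1 becomes an unambiguous training label. The column names and values are hypothetical; the point is that each sensor reading is marked 1 only if a confirmed failure of that component occurred within the following 30 minutes.

    import pandas as pd

    # Hypothetical sensor readings and a log of confirmed failure events.
    readings = pd.DataFrame({
        "timestamp": pd.to_datetime(["2017-05-01 08:00", "2017-05-01 08:10",
                                     "2017-05-01 08:20", "2017-05-01 08:40"]),
        "component_id": ["X", "X", "X", "X"],
        "temperature": [71.2, 74.8, 83.1, 69.5],
    })
    failures = pd.DataFrame({
        "component_id": ["X"],
        "failed_at": pd.to_datetime(["2017-05-01 08:30"]),
    })

    # Label = 1 when the component fails within 30 minutes after the reading.
    horizon = pd.Timedelta(minutes=30)
    labeled = readings.merge(failures, on="component_id", how="left")
    labeled["will_fail_30m"] = (
        (labeled["failed_at"] > labeled["timestamp"])
        & (labeled["failed_at"] <= labeled["timestamp"] + horizon)
    ).astype(int)
    print(labeled[["timestamp", "temperature", "will_fail_30m"]])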

3. Make Sure Your Data is Accurate

Did a failure really occur? If not, the machine learning model will not produce accurate results. Computers are naïve – they believe what we tell them. Data science teams should be more skeptical, particularly when they believe they have made a breakthrough discovery after months of false starts. Data science leaders should avoid getting caught up in the irrational exuberance of a model that appears to provide new insight. As with any scientific endeavor, test your assumptions, beginning with the accuracy and reliability of the observations you used to create the model.

4. Make Sure Your Data is Connected

The data used to train your model may be anonymized, because the factors that correlate closely with machine failure are measurements, not identifiers. However, once the model is ready to be used, the new data must be connected to the real world – otherwise, you will not be able to take action. If you have no central, authoritative record of “things,” you may need to develop a master data management solution before your Internet of Things predictive maintenance solution can yield value. Your response to a prediction should be connected as well: once a prediction of failure has been obtained, management should already know what needs to happen – use insights to take swift action.
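
A minimal sketch of the “connected” part follows; the sensor IDs, asset tags and threshold are hypothetical. Joining anonymized model output back to a master record of equipment is what turns a probability into a work order.

    import pandas as pd

    # Hypothetical scoring output: anonymized sensor IDs and failure probabilities.
    scores = pd.DataFrame({
        "sensor_id": ["S-104", "S-221"],
        "failure_probability": [0.87, 0.12],
    })

    # Master data mapping each sensor to a real asset and production line.
    asset_master = pd.DataFrame({
        "sensor_id": ["S-104", "S-221"],
        "asset_tag": ["PRESS-07", "CNC-03"],
        "line": ["Line 2", "Line 5"],
    })

    # Connect predictions to the real world so operations knows what to service.
    actionable = scores.merge(asset_master, on="sensor_id", how="left")
    print(actionable[actionable["failure_probability"] >= 0.8])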

5. Make Sure You Have Enough Data

The accuracy of predictions improves with more data. Make sure you have sufficient examples of both positive and negative outcomes; otherwise it will be difficult to be certain that you are truly gaining information from the exercise.
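
One quick, illustrative check (the counts below are made up) is simply to look at how many examples of each outcome the training set contains before trusting any accuracy figure:

    import pandas as pd

    # Hypothetical label column from Rule 2: 9,800 healthy readings, 200 failures.
    labels = pd.Series([0] * 9_800 + [1] * 200, name="will_fail_30m")
    counts = labels.value_counts()
    print(counts)
    print(f"positive rate: {counts[1] / len(labels):.1%}")

    # With so few failures, raw accuracy is misleading: a model that always
    # predicts "no failure" is 98% accurate and completely uninformative.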

The benefits of predictive maintenance, and other applications of machine learning, are being embraced by businesses everywhere. For some, the process may appear a bit mysterious, but it needn’t be. The goal is to create a model which, when fed real-life data, improves the decision making of the humans involved in the process. To achieve this, data science teams need the right data and the right business problem to solve. Management should work to ensure that these five rules are satisfied before investing in data science activities.

Not sure if you have the right raw materials? Talk to BlumShapiro Consulting about your machine learning ambitions. Our technology team is building next generation predictive analytics solutions that connect to the Internet of Things. We are helping our clients along each step of their digital transformation journey.


Microsoft Announces Power BI Premium: Removes Functionality on Free Version

Many of our clients come to us looking for solutions to help them achieve “Business Intelligence for Everyone” in their organization while avoiding the pitfalls of reporting in Excel. Our response is simple: Microsoft Power BI is an easy-to-use, non-technical business intelligence tool which is far more robust than Microsoft Excel for reporting. End users who rely upon Excel for reporting often view Power BI as a logical step up. With Power BI, users can automate mundane data transformation steps, connect to a broad range of data sources and securely collaborate with colleagues – all within an environment that looks and feels just like Excel. Our clients have reported that Power BI’s free edition includes enough functionality to get started on any reporting initiative, automate data extraction and transformation activities and share the results with a team of executives, analysts, managers and colleagues. However, as Power BI data and report volumes grow, organizations may choose to step up to Power BI Pro, which upgrades users from 1 GB to 10 GB of data and enables complex analytics sharing capabilities, even outside the organization.

Finding a Solution for Larger Organizations

The current Power BI service does present some challenges to larger, more sophisticated organizations. Some of the issues include:

  • Sharing and collaboration features can become complex and difficult to manage at scale
  • Compute resources are shared, not dedicated, and there is no ability to provision additional compute resources
  • Structured, paginated reporting needs are not well served by the interactive reports and “single pane of glass” dashboards delivered in Power BI

These issues begged for a simpler, more manageable model for large organizations.

Introducing Power BI Premium

In early May 2017, Microsoft announced its intention to introduce a new licensing level for Power BI, Power BI Premium. Power BI Premium is designed to address the shortcomings of Power BI Pro. Here are three things to know about Power BI Premium:

  1. Power BI Premium Edition will support Power BI Apps. Power BI Apps replace Content Packs and Power BI Embedded. Organizations that currently share Power BI content externally with Power BI Embedded should plan to migrate to Power BI Premium Edition.
  2. Power BI Premium Edition offers dedicated capacity for organizations that need more control. Instead of paying strictly per user, Power BI Premium is licensed on a combined capacity and usage model. This enables organizations that struggle with the per-user data limits enforced on Free and Pro Edition users (1 GB and 10 GB maximums, respectively) to load data models that are much larger. As with other Azure services, organizations can scale capacity up and down as their needs change.
  3. Power BI Premium Edition includes a license for Power BI Report Server – a full-featured on-premises solution supporting both Power BI (interactive) reports and Reporting Services (paginated, structured) reports.

Important Note for Power BI Free Edition Users

Power BI Free Edition became quite attractive because many users within the same organization could share content without paying any fee. Unfortunately, Power BI Free Edition functionality will be changing soon. Users on the Free Edition will no longer be able to share dashboards with colleagues, other than by printing them out, or showing others their “personal dashboard” in a browser. As of June 1, users enjoying dashboard sharing will no longer be able to do so under the Free Edition.

June 1st is right around the corner, and some organizations have built fully functional company dashboards using Free Edition licenses. These organizations now face the prospect of having to either upgrade to Power BI Pro Edition ($10/user/month) or lose vital collaboration features. This is why Microsoft is offering a 1-year trial of Power BI Pro licenses to users who have previously signed up for Power BI Free Edition. This allows organizations to carefully consider which users need Power BI Pro for data model, report and dashboard creation and collaboration, and which do not. Some organizations will stay on the Free Edition and simply share their BI content via PowerPoint. Others will look at Power BI Pro or Premium licensing and continue to see value.

Next Steps

Microsoft has stated that general availability of Power BI Premium is on the horizon, but no specific release date has been communicated. If your organization has many users creating reports and dashboards with the Free Edition, here are some things you can do to get ready for the change.

  1. Take advantage of the 1-Year Power BI Pro trial – encourage users to respond to any email communication from Microsoft and take advantage of the grace period
  2. Download the Power BI Report Server and take it for a spin
  3. Review the Power BI Premium Calculator to understand what your costs would look like under the Power BI Premium model

For more information on how to achieve high performance analytics and reporting with Power BI, contact Brian Berry and our Data Analytics team at bberry@blumshapiro.com, or by phone at 860.570.6368.


Using Real Time Data Analytics and Visualization Tools to Drive Your Business Forward

Business leaders need timely information about the operations and profitability of the businesses they manage to help make informed decisions. But when information delivery is delayed, decision makers lose precious time to adjust and respond to changing market conditions, customer preferences, supplier issues or all three. When thinking about any business analytics solution, a critical question to ask is: how frequently can we (or should we) update the underlying data? Often, the first answer from the business stakeholders is “as frequently as possible.” The concept of “real time analytics,” with data being provided up-to-the minute, is usually quite attractive. But there may be some confusion about what this really means.

While the term real time analytics does refer to data which is frequently changing, it is not the same as simply refreshing data frequently. Traditional analytics packages which take advantage of data marts, data warehouses and data cubes are often collectively referred to as a Decision Support System (DSS). A DSS helps business analysts, management and ownership understand historical trends in their business, perform root cause analysis and enable strategic decisions. Whereas a DSS aggregates and analyzes sales, costs and other transactions, a real time analytics system ingests and processes events. One can imagine a $25 million business recording 10,000 transactions a day. Now imagine that same business recording events on its website: logins, searches, shopping cart adds, shopping cart deletes, product image zoom events. If the business is 100% online, how many events would that be? The answer may astonish you.
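
The arithmetic below is purely illustrative (the conversion rate and events-per-session figures are assumptions, not benchmarks), but it shows why event volumes dwarf transaction volumes:

    # Back-of-envelope estimate; every assumption here is illustrative.
    transactions_per_day = 10_000
    conversion_rate = 0.03          # assume ~3% of web sessions end in a purchase
    events_per_session = 25         # logins, searches, cart adds/deletes, image zooms

    sessions_per_day = transactions_per_day / conversion_rate
    events_per_day = sessions_per_day * events_per_session
    print(f"{events_per_day:,.0f} events per day")   # ~8.3 million vs. 10,000 transactions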

Why Real Time Analytics?

DSS solutions answer questions such as “What was our net income last month?”, “What was our net income compared to the same month last year?” or “Which customers were most profitable last month?” Real time analytics answers questions such as “Is the customer experience positive right now?” or “How can we optimize this transaction right now?” In the retail industry, listening to social media channels to hear what customers are saying about their experience in your stores can drive service level adjustments or pricing promotions. When that analysis is real-time, store managers can adjust that same day for optimized profitability. Some examples:

  1. Social media sentiment analysis – addressing customer satisfaction concerns
  2. Eliminating business disruption costs with equipment maintenance analytics
  3. Promotion and marketing optimization with web and mobile analytics
  4. Product recommendations throughout the shopping experience, online or “brick and mortar”
  5. Improved health care services with real time patient health metrics from wearable technology

In today’s world, customers expect world class service. Implicit in that expectation is the assumption that companies with whom they do business “know them,” anticipate their needs and respond to them. That’s easy to say, but harder to execute. Companies that must meet that expectation need technology leaders who are aware of three concepts critical to making real time analytics a reality.

The first is the Internet of Things, or IoT. The velocity and volume of data generated by mobile devices, social media, factory floor sensors and the like are the basis for real time analytics. “Internet of Things” refers to devices or sensors which are connected to the internet, providing data about their usage or simply about the physical environment where the device is powered on. Like social media and mobile devices, IoT sensors can generate enormous volumes of data very, very quickly – this is the “big data” phenomenon.

The second is Cloud Computing. The massive scale of IoT and big data can only be achieved with cloud scale data storage and cloud scale data processing. Unless your company’s name is Google, Amazon or Microsoft, you probably cannot keep up. So, to achieve real-time analytics, you must embrace cloud computing.

The third is Intelligent Systems. IBM’s “Watson” computer achieved a significant milestone by out-performing humans on Jeopardy. Since then, companies have been integrating artificial intelligence (AI) into large scale systems. AI in this sense is simply a mathematical model which calculates the probability that data represents something a human would recognize: a supplier disruption, a dissatisfied customer about to cancel their order, an equipment breakdown. Using real time data, machine learning models can recognize events which are about to occur. From there, they can automate a response, or raise an alert to the humans involved in the process. Intelligent systems help humans make nimble adjustments to improve the bottom line.
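
As a minimal sketch of that loop (the model, features and alert threshold are hypothetical stand-ins, not a production design), an intelligent system scores each incoming event and raises an alert when the calculated probability crosses a threshold:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Stand-in for a model trained offline on historical events (temperature,
    # vibration), labeled 1 when the event preceded a breakdown.
    X_train = np.array([[70.0, 0.2], [72.0, 0.3], [88.0, 1.4], [91.0, 1.6]])
    y_train = np.array([0, 0, 1, 1])
    model = LogisticRegression().fit(X_train, y_train)

    def handle_event(temperature: float, vibration: float, threshold: float = 0.8):
        """Score one real time event and alert the humans when risk is high."""
        probability = model.predict_proba([[temperature, vibration]])[0, 1]
        if probability >= threshold:
            print(f"ALERT: breakdown risk {probability:.0%} - notify operations")
        else:
            print(f"risk {probability:.0%} - no action needed")

    handle_event(temperature=90.5, vibration=1.5)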

What technologies will my company need to make this happen?

From a technology perspective, a clear understanding of cloud computing is essential. When evaluating a cloud platform, CIOs should look for breadth of capability and support for multiple frameworks. As a Microsoft Partner, BlumShapiro Consulting works with Microsoft Azure and its Cortana Intelligence platform. This gives our clients cloud scale, low cost and a wide variety of real time and big data processing options.


Cortana Intelligence comprises a suite of Azure resources. The most relevant for real time analytics are:

  1. Event Hubs ingests high velocity streaming data sent by event providers (i.e., sensors and devices); a minimal ingestion sketch follows this list
  2. Data Lake Store provides low cost cloud storage with no practical limits
  3. Stream Analytics performs in-flight processing of streaming data
  4. Machine Learning, or AzureML, supports the design, evaluation and integration of predictive models into the real-time pipeline
  5. Cognitive Services are out-of-the-box artificial intelligence services, addressing a broad range of common machine intelligence scenarios
  6. Power BI supports streaming datasets made visible in a dashboard context
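
To ground the ingestion step, here is a minimal sketch of a device or gateway pushing one telemetry event to Event Hubs using the azure-eventhub Python package. The connection string, hub name and event fields are placeholders, and the SDK calls shown reflect the current Python package rather than anything specific to the Cortana Intelligence bundle.

    import json
    from azure.eventhub import EventHubProducerClient, EventData

    # Placeholders: use your own Event Hubs namespace connection string and hub name.
    CONNECTION_STR = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."
    EVENT_HUB_NAME = "equipment-telemetry"

    producer = EventHubProducerClient.from_connection_string(
        CONNECTION_STR, eventhub_name=EVENT_HUB_NAME
    )

    event = {"device_id": "press-07", "temperature": 88.4, "vibration": 1.3}

    # Send the event; Stream Analytics can then process it in flight and
    # forward results to AzureML scoring or a Power BI streaming dataset.
    with producer:
        batch = producer.create_batch()
        batch.add(EventData(json.dumps(event)))
        producer.send_batch(batch)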

Four Steps to Get Started with Real Time Analytics

Start with the Eye Candy – If you do not have a dashboard tool which supports real-time data streaming, consider solutions such as Power BI. Even if you are not ready to implement an IoT solution, Power BI makes monitoring social media or customer marketing campaigns much more feasible. Power BI can connect to databases, data marts, data warehouses and data cubes, and is valuable as a dashboard and visualization tool for existing DSS systems. Without visualization, it will be very difficult to turn any kind of data, slow or fast, into human insight and action.
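
For the real-time piece specifically, Power BI streaming datasets expose a push URL to which rows can be posted as they happen. The sketch below assumes a streaming dataset has already been created in the Power BI service; the push URL and row fields are placeholders, and the exact payload shape should be confirmed against the sample code Power BI generates for the dataset.

    from datetime import datetime, timezone
    import requests

    # Placeholder: copy the real push URL from the streaming dataset's API info page.
    PUSH_URL = "https://api.powerbi.com/beta/<tenant-id>/datasets/<dataset-id>/rows?key=<key>"

    rows = [{
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "store": "Hartford",
        "sentiment_score": 0.78,
    }]

    # Dashboard tiles bound to the streaming dataset update within seconds.
    response = requests.post(PUSH_URL, json=rows)
    response.raise_for_status()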

Get to the Cloud – Cloud storage costs and cloud processing scale are the only mechanisms by which real time analytics is economically feasible (for most companies). Learn how investing in technologies like Cloud Computing can really help move your business forward.

Embrace Machine Intelligence – To make intelligent systems a reality, you will need to understand machine learning technologies, if only at a high level. Historically, this has meant developing a team of data scientists, many of whom have PhDs in mathematics or statistics, and using open source tools like R or Python. Today, machine learning is much more accessible than it has ever been. AzureML helps to fast track both the evaluation and operationalization of predictive models.
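
As a small illustration of that accessibility (the data here is synthetic and the model deliberately simple), a first predictive model can be trained and evaluated in a handful of lines with open source tooling before deciding whether to operationalize it in AzureML:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic, imbalanced dataset standing in for historical telemetry.
    X, y = make_classification(n_samples=2_000, n_features=8,
                               weights=[0.9, 0.1], random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

    model = LogisticRegression(max_iter=5_000).fit(X_train, y_train)
    print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))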

Find the Real-Time Opportunity – As the technology leaders in their organizations, CIOs will need to work closely with other business leaders to understand where real-time information can increase revenue, decrease costs or both. This may require imagination. Start with the question – what would we like to know faster? If we knew our customer was going to do this sooner, how would we respond? If we knew our equipment was going to fail sooner, how would we respond? If we knew there was an opportunity to sell more, how would we respond?

