Archive for Predictive Analytics

Two Key Benefits of HR Analytics

In my last article, I wrote about the definition of HR Analytics and the skills needed to be successful in this field. In this article, I want to discuss two key benefits of HR analytics to the HR function in an organization and to the business: Evidence Based Decisions and Reducing Human Bias.

HR professionals want to be strategic partners with business leaders, not simply a cost center designed to maintain policies and procedures. While these policies are important, analytics provides HR with a means to demonstrably improve the efficiency of a company’s people resources. It does this in several ways.

Evidence Based Management Decisions

Through its dependence upon data and facts, HR Analytics delivers evidence, and evidence trumps intuition. To support these benefits, I’ll ask two questions:

Is your interview process optimized to find the best candidate for a position?

If you have ever participated in an interview process from the hiring perspective, you may be aware that at many companies, interviewing candidates can be an informal, non-standardized process. At worst, interviewers are simply asked by HR, “What did you think?” More sophisticated HR methodologies define a standardized process for whom the candidate meets and what questions are asked. At each stage, feedback is collected and quantified, typically in the form of ratings. Are these ratings predictive of future performance in the job role to be filled? HR Analytics can tell you the factors that are predictive of high performers in certain job roles (or tell you that you don’t know, and that you should either change your process or collect different data points).
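To make this concrete, here is a minimal sketch of how an analytics team might test whether stage-by-stage interview ratings predict later success. All of the data, stage names and thresholds below are synthetic and invented purely for illustration.

```python
# Hypothetical sketch: do interview ratings predict on-the-job success?
# All data here is synthetic; stage names are assumptions for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 200
# Simulated ratings (1-5) from three stages of a standardized interview process
ratings = rng.integers(1, 6, size=(n, 3)).astype(float)
# Simulated outcome: success loosely driven by the second stage's rating
success = (ratings[:, 1] + rng.normal(0, 1, n) > 3).astype(int)

model = LogisticRegression()
scores = cross_val_score(model, ratings, success, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")

# Coefficients near zero for a stage suggest its ratings carry little signal
model.fit(ratings, success)
print(dict(zip(["stage_1", "stage_2", "stage_3"], model.coef_[0].round(2))))
```

If no stage's ratings carry predictive weight, that is itself a finding: either the process or the data being collected needs to change.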

Does internal employee training improve company performance?

Most HR professionals would say “Yes, employee training is a good thing and we need to do it.” Many top companies spend precious resources to train their sales staff or send aspiring leaders to leadership training. Does this training have a material impact on performance? On the company’s bottom line? HR Analytics aspires to quantify that benefit. To do this, we may need to pull together data from several systems, such as on-the-job performance data, financial data and data collected during the training process. We should define the performance metrics that are most important in that job role. We must also consider a baseline of performance (i.e., comparable employees who were not able to take the training). By taking a more scientific approach, we can quantify the benefit and produce evidence of impact. We may also demonstrate that certain training is ineffective.
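A simple version of that baseline comparison can be sketched as a two-sample test. The numbers below are synthetic stand-ins for a real performance metric, used only to show the shape of the analysis.

```python
# Illustrative sketch (synthetic data): compare a performance metric for
# employees who took a training against a comparable untrained baseline.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
trained = rng.normal(105, 10, 40)    # e.g., monthly sales after training
untrained = rng.normal(100, 10, 40)  # comparable peers, no training

t_stat, p_value = stats.ttest_ind(trained, untrained)
lift = trained.mean() - untrained.mean()
print(f"Average lift: {lift:.1f}, p-value: {p_value:.3f}")
# A small p-value is evidence of a real difference; a large one suggests the
# training's benefit cannot be distinguished from noise.
```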

Reducing Human Bias

If you have read Michael Lewis’s book The Undoing Project, then you know about the work done by psychologists in the last 50 years to explain how bias interrupts the human mind’s ability to perceive information. Literally, our personal bias leads us to see things that simply are not there. We all have expectations, and these expectations are based upon hard-won human experience—most of which has served us very well in life. But in the case of making HR judgments, or indeed any judgment requiring us to process large amounts of information, bias is quite detrimental.

In the questions/examples provided above, we see the opportunity for human bias to creep into common HR processes and potentially undermine them. First, let’s examine the interviewing process. As people, we may have expectations about how a qualified candidate dresses, how they speak, and which personality traits are most prominent in a good candidate. These are likely informed by our own experience and by colleagues who may have made a deep impression on us. Just as likely, information contradicting that bias is dismissed. This means that our human minds are not able to process large amounts of information in a uniform and objective manner. When applied correctly, HR analytics can do this much better. For example, an HR analytics team would consider data collected during the evaluation phase and performance data for successful applicants; in other words, before and after hire. Hopefully, many applicants become very successful at your firm, but you also know that many do not. We can apply a label to each candidate profile, recording that the candidate either was or was not successful. We can then train our analytics algorithms to learn what a successful employee will look like, mathematically, at hire time and reduce our human bias. Bear in mind that bias can still creep into the process if interviewers fail to recognize the need for standardization and quantification.

Similarly, as it relates to evaluating training against performance, we see an opportunity for bias to lead to conclusions that are false, or at least for which there is no evidence. Business leaders can (and should) demand this evidence from HR, so that they know that capital is being deployed correctly in support of the firm’s financial well-being. To be clear, it can be very difficult to prove causation between training and financial ratios (i.e., that training causes an increase in Net Income). However, HR should be able to provide evidence demonstrating correlation between employees who perform well on the job (whether that metric is sales figures or on-time delivery) and those who attend certain training activities. When HR provides evidence of this correlation, it becomes a strategic partner with business leaders, helping them see and understand the patterns in human behavior.

See Differently, Know the Facts

Analytics offers HR professionals an opportunity to approach decision making differently. Measurements and quantification of candidate and employee characteristics and performance can provide evidence of correlation between the policies HR is supporting and the outcomes the business seeks to drive. By thinking differently about HR, we can reduce our propensity to see things that are not there, replacing that vision with a clear-eyed, scientific, data-driven approach.

Want to learn more about the world of HR Analytics? We are speaking at this year’s CBIA Human Resources Conference on the topic. We hope to see you there!

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.

See the Impact Digital Transformation Can Have on Your Bottom Line

Digital Transformation has become an industry buzzword. We’re here to clarify what it means in dollars and cents.

Digital transformation represents an organizational change where data becomes relevant and valuable. Once transformed, these organizations use data to improve decision making, connect with their customers, improve vendor relationships and allow employees to provide higher level skills and value to the organization.

Digitally transformed organizations think about their products and services in both a physical and a digital space, use technology to improve customer service, and often have an enhanced perspective of their market and how their business model operates within that redefined market.

We believe that digital transformation is a qualification to compete in today’s business environment.

The question people often ask next is: how much does this cost? We posit that the answer is nothing; the cost (and risk) lies in remaining stagnant. Digital transformation uncovers assets previously underutilized by the organization. When done properly, investments in digital transformation empower your organization to survive and thrive, yielding an immediate and direct ROI for the organization.

We’ve developed a Digital Transformation Accounting Worksheet ROI calculator for you to experiment with. Punch in your numbers and let us know what you think.  We would be happy to discuss your digital transformation in more detail.
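At its core, the arithmetic behind any ROI calculation is simple. The sketch below is a generic illustration of that formula, not the contents of the worksheet itself, and the example figures are invented.

```python
# A minimal sketch of the arithmetic behind an ROI calculation.
# The input figures below are placeholders, not numbers from the worksheet.
def roi(gain_from_investment: float, cost_of_investment: float) -> float:
    """Return ROI as a fraction: (gain - cost) / cost."""
    return (gain_from_investment - cost_of_investment) / cost_of_investment

# Example: $150,000 of new value against a $100,000 investment
print(f"ROI: {roi(150_000, 100_000):.0%}")  # ROI: 50%
```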

About Noah:

Noah is the Director of Business Development for BlumShapiro’s Technology Consulting Group. He brings over 25 years of business experience, from entrepreneurial start-ups to over a decade of working at Microsoft in various sales, marketing and business development roles. Noah has launched Windows XP, Office XP, Tablet PC, Media Center PC, MSN Direct Smartwatches (an early IoT attempt), several video games, a glove controller, and a wine import company/brand. Noah spent three years living overseas building out Microsoft’s Server and Tools business in Eastern Europe, working with the IT Pro and developer communities. He considers himself a futurist, likes science fiction and loves applying what was recently science fiction to real world problems and opportunities.

Our 5 Rules of Data Science

In manufacturing, the better the raw materials, the better the product. The same goes for data science, where a team cannot be effective unless the raw materials of data science are available to them. In this realm, data is the raw material which produces a prediction. However, raw materials alone are not sufficient. Business people who oversee machine learning teams must demand that best practices be applied, otherwise investments in machine learning will produce dubious business results. These best practices can be summarized into our five rules of data science.

For the purpose of illustration, let’s assume the data science problem our team is working on is related to the predictive maintenance of equipment on a manufacturing floor. Our team is working on helping the firm predict equipment failure, so that operations can replace the equipment before it impacts the manufacturing process.

Our 5 Rules of Data Science

1. Have a Sharp Question

A sharp question is specific and unambiguous. Computers do not appreciate nuance. They are not able to classify events into yes/no buckets if the question is: “Is Component X ready to fail?” Nor does the question need to concern itself with causes. Computers do not ask why – they calculate probability based upon correlation. “Will component X overheat?” is a question posed by a human who believes that heat contributes to equipment failure. A better question is: “Will component X fail in the next 30 minutes?”

2. Measure at the Right Level

Supervised learning requires real examples from which a computer can learn. The data you use to produce a successful machine learning model must demonstrate cases where failure has occurred. It must also demonstrate examples where equipment continues to operate smoothly. We must be able to unambiguously identify events that were failure events, otherwise, we will not be able to train the machine learning model to classify data correctly.
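Turning "a sharp question" into labeled training examples is often a matter of windowing raw timestamps. Here is a hedged sketch of deriving yes/no labels for "will this component fail within 30 minutes?"; the timestamps are invented for illustration.

```python
# Hedged sketch: derive yes/no labels for "will this component fail within
# 30 minutes?" from sensor reading times and known failure timestamps.
# All timestamps below are invented for illustration.
from datetime import datetime, timedelta

failures = [datetime(2024, 1, 1, 12, 0)]  # known failure events
readings = [
    datetime(2024, 1, 1, 11, 15),
    datetime(2024, 1, 1, 11, 45),  # within 30 minutes of a failure
    datetime(2024, 1, 1, 13, 0),
]

def label(ts, failures, horizon=timedelta(minutes=30)):
    """1 if a failure occurs within `horizon` after this reading, else 0."""
    return int(any(timedelta(0) <= f - ts <= horizon for f in failures))

labels = [label(ts, failures) for ts in readings]
print(labels)  # [0, 1, 0]
```

Note that this only works if the failure events themselves are unambiguously recorded, which is exactly the point of this rule.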

3. Make Sure Your Data is Accurate

Did a failure really occur? If not, the machine learning model will not produce accurate results. Computers are naïve – they believe what we tell them. Data science teams should be more skeptical, particularly when they believe they have made a breakthrough discovery after months of false starts. Data science leaders should avoid getting caught up in the irrational exuberance of a model that appears to provide new insight. Like any scientific endeavor, test your assumptions, beginning with the accuracy and reliability of the observations you started with to create the model.

4. Make Sure Your Data is Connected

The data used to train your model may be anonymized, because factors that correlate closely to machine failure are measurements, not identifiers. However, once the model is ready to be used, the new data must be connected to the real world – otherwise, you will not be able to take action. If you have no central authoritative record of “things”, you may need to develop a master data management solution before your Internet of Things predictive maintenance solution can yield value. Also, your response to a prediction should be connected. Once a prediction of failure has been obtained, management should already know what needs to happen – use insights to take swift action.
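In practice, "connected" often means joining model output back to an authoritative asset registry. The schema and values below are assumptions used only to illustrate the join.

```python
# Sketch (assumed schema): connect anonymized model output back to an
# authoritative asset registry so operations can act on a prediction.
import pandas as pd

predictions = pd.DataFrame({
    "sensor_id": ["s-101", "s-102"],
    "failure_probability": [0.91, 0.07],
})
# Master data mapping sensors to real-world equipment and locations
assets = pd.DataFrame({
    "sensor_id": ["s-101", "s-102"],
    "equipment": ["Press 4", "Conveyor 2"],
    "location": ["Line A", "Line B"],
})

actionable = predictions.merge(assets, on="sensor_id")
at_risk = actionable[actionable["failure_probability"] > 0.8]
print(at_risk[["equipment", "location"]].to_string(index=False))
```

Without the `assets` table, a high failure probability for "s-101" is just a number; with it, operations knows which press on which line to service.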

5. Make Sure You Have Enough Data

The accuracy of predictions improves with more data. Make sure you have sufficient examples of both positive and negative outcomes, otherwise it will be difficult to be certain that you are truly gaining information from the exercise.
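A quick class-balance check catches this early. The counts below are invented; the point is the habit of looking at the minority class before training anything.

```python
# Quick sketch: verify you have enough examples of each outcome before
# training. The counts below are invented for illustration.
from collections import Counter

labels = [0] * 980 + [1] * 20  # 980 healthy readings, 20 failures
counts = Counter(labels)
minority_share = min(counts.values()) / len(labels)
print(counts, f"minority share: {minority_share:.1%}")
if minority_share < 0.05:
    print("Warning: too few failure examples; consider collecting more data "
          "or resampling before trusting the model.")
```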

The benefits of predictive maintenance, and other applications of machine learning, are being embraced by businesses everywhere. For some, the process may appear a bit mysterious, but it needn’t be. The goal is to create a model which, when fed real-life data, improves the decision making of the humans involved in the process. To achieve this, data science teams need the right data and the right business problem to solve. Management should work to ensure that these five rules are followed to their satisfaction before investing in data science activities.

Not sure if you have the right raw materials? Talk to BlumShapiro Consulting about your machine learning ambitions. Our technology team is building next generation predictive analytics solutions that connect to the Internet of Things. We are helping our clients along each step of their digital transformation journey.


Using Real Time Data Analytics and Visualization Tools to Drive Your Business Forward

Business leaders need timely information about the operations and profitability of the businesses they manage to help make informed decisions. But when information delivery is delayed, decision makers lose precious time to adjust and respond to changing market conditions, customer preferences, supplier issues or all three. When thinking about any business analytics solution, a critical question to ask is: how frequently can we (or should we) update the underlying data? Often, the first answer from the business stakeholders is “as frequently as possible.” The concept of “real time analytics,” with data being provided up-to-the-minute, is usually quite attractive. But there may be some confusion about what this really means.

While the term real time analytics does refer to data which is frequently changing, it is not the same as simply refreshing data frequently. Traditional analytics packages which take advantage of data marts, data warehouses and data cubes are often collectively referred to as a Decision Support System (DSS). A DSS helps business analysts, management and ownership understand historical trends in their business, perform root cause analysis and enable strategic decisions. Whereas a DSS system aggregates and analyzes sales, costs and other transactions, a real time analytics system ingests and processes events. One can imagine a $25 million business recording 10,000 transactions a day. One can also imagine that same business recording events on their website: logins, searches, shopping cart adds, shopping cart deletes, product image zoom events. If the business is 100% online, how many events would that be? The answer may astonish you.
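A back-of-envelope sketch shows why events dwarf transactions. Every rate below is an assumption chosen for illustration, not a figure from any real business.

```python
# Back-of-envelope sketch; every number here is an assumption, chosen only
# to illustrate how event counts dwarf transaction counts.
transactions_per_day = 10_000
conversion_rate = 0.02     # assume 2% of site visits end in a purchase
events_per_visit = 40      # logins, searches, cart adds, image zooms...

visits_per_day = transactions_per_day / conversion_rate
events_per_day = visits_per_day * events_per_visit
print(f"{events_per_day:,.0f} events/day vs "
      f"{transactions_per_day:,} transactions/day")
```

Under these assumptions, 10,000 daily transactions imply tens of millions of daily events, which is a fundamentally different data engineering problem.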

Why Real Time Analytics?

DSS solutions answer questions such as “What was our net income last month?”, “What was our net income compared to the same month last year?” or “Which customers were most profitable last month?” Real time analytics answers questions such as “Is the customer experience positive right now?” or “How can we optimize this transaction right now?” In the retail industry, listening to social media channels to hear what customers are saying about their experience in your stores can drive service level adjustments or pricing promotions. When that analysis is real-time, store managers can adjust that day for optimized profitability. Some examples:

  1. Social media sentiment analysis – addressing customer satisfaction concerns
  2. Eliminating business disruption costs with equipment maintenance analytics
  3. Promotion and marketing optimization with web and mobile analytics
  4. Product recommendations throughout the shopping experience, online or “brick and mortar”
  5. Improved health care services with real time patient health metrics from wearable technology

In today’s world, customers expect world class service. Implicit in that expectation is the assumption that companies with whom they do business “know them”, anticipate their needs and respond to them. That’s easy to say, but harder to execute. Companies who must meet that expectation need technology leaders to be aware of three concepts critical to making real time analytics a real thing.

The first is Internet of Things or IoT. The velocity and volume of data generated by mobile devices, social media, factory floor sensors, etc. is the basis for real time analytics. “Internet of Things” refers to devices or sensors which are connected to the internet, providing data about usage or simply their physical environment (where the device is powered on). Like social media and mobile devices, IoT sensors can generate enormous volumes of data very, very quickly – this is the “big data” phenomenon.

The second is Cloud Computing. The massive scale of IoT and big data can only be achieved with cloud scale data storage and cloud scale data processing. Unless your company’s name is Google, Amazon or Microsoft, you probably cannot keep up. So, to achieve real-time analytics, you must embrace cloud computing.

The third is Intelligent Systems. IBM’s “Watson” computer achieved a significant milestone by out-performing humans on Jeopardy. Since then, companies have been integrating artificial intelligence (AI) into large scale systems. AI in this sense is simply a mathematical model which calculates the probability that data represents something a human would recognize: a supplier disruption, a dissatisfied customer about to cancel their order, an equipment breakdown. Using real time data, machine learning models can recognize events which are about to occur. From there, they can automate a response, or raise an alert to the humans involved in the process. Intelligent systems help humans make nimble adjustments to improve the bottom line.
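The "raise an alert" step can be sketched very simply: a model scores each incoming event, and scores above a threshold page a human. The scoring function below is a stub standing in for a trained model; the events and threshold are invented.

```python
# Minimal sketch of the alerting step: a model (stubbed here as a fixed
# rule) scores each event, and scores above a threshold alert a human.
def failure_probability(event: dict) -> float:
    # Stand-in for a trained model; a real system would call a scoring service
    return 0.95 if event["temp_c"] > 90 else 0.05

def handle(event: dict, threshold: float = 0.8) -> str:
    p = failure_probability(event)
    return "ALERT: likely failure" if p >= threshold else "ok"

stream = [{"temp_c": 72}, {"temp_c": 95}, {"temp_c": 68}]
print([handle(e) for e in stream])  # ['ok', 'ALERT: likely failure', 'ok']
```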

What technologies will my company need to make this happen?

From a technology perspective, a clear understanding of cloud computing is essential. When evaluating a cloud platform, CIOs should look for breadth of capability and support for multiple frameworks. As a Microsoft Partner, BlumShapiro Consulting works with Microsoft Azure and its Cortana Intelligence platform. This gives our clients cloud scale, low cost and a wide variety of real time and big data processing options.

[Diagram: the Azure resources comprising Cortana Intelligence]

This diagram describes the Azure resources which comprise Cortana Intelligence. The most relevant resources for real time analytics are:

  1. Event Hubs ingests high-velocity streaming data sent by event providers (i.e., sensors and devices)
  2. Data Lake Store provides low-cost cloud storage with no practical limits
  3. Stream Analytics performs in-flight processing of streaming data
  4. Machine Learning, or AzureML, supports the design, evaluation and integration of predictive models into the real-time pipeline
  5. Cognitive Services are out-of-the-box Artificial Intelligence services, addressing a broad range of common machine intelligence scenarios
  6. Power BI supports streaming datasets made visible in a dashboard context
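The "in-flight processing" idea from the list above can be illustrated without any Azure services at all: a streaming engine like Stream Analytics aggregates events over time windows as they arrive. This pure-Python tumbling-window sketch uses invented readings purely to show the concept.

```python
# Conceptual sketch of what a streaming engine does: aggregate in-flight
# events over tumbling time windows. Readings are invented for illustration.
from collections import defaultdict

# (seconds_since_start, temperature) readings from a device
events = [(2, 70), (15, 72), (31, 90), (44, 95), (61, 71)]
window_seconds = 30

windows = defaultdict(list)
for ts, temp in events:
    windows[ts // window_seconds].append(temp)  # assign to a tumbling window

for w in sorted(windows):
    temps = windows[w]
    print(f"window {w}: avg temp {sum(temps) / len(temps):.1f}")
```

In a real deployment this aggregation runs continuously in the cloud and its output feeds dashboards or alerts rather than a print statement.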

Four Steps to Get Started with Real Time Analytics

Start with the Eye Candy – If you do not have a dashboard tool which supports real-time data streaming, consider solutions such as Power BI. Even if you are not ready to implement an IoT solution, Power BI makes any social media or customer marketing campaign much more feasible. Power BI can be used to connect to databases, data marts, data warehouses and data cubes, and is valuable as a dashboard and visualization tool for existing DSS systems. Without visualization, it will be very difficult to provide human insights and actions for any kind of data, slow or fast.

Get to the Cloud – Cloud storage costs and cloud processing scale are the only mechanisms by which real time analytics is economically feasible (for most companies). Learn how investing in technologies like Cloud Computing can really help move your business forward.

Embrace Machine Intelligence – To make intelligent systems a reality, you will need to understand machine learning technologies, if only at a high level. Historically, this has meant developing a team of data scientists, many of whom have PhDs in Mathematics or Statistics, and open source tools like R or Python. Today, machine learning is much more accessible than it has ever been. AzureML helps to fast track both the evaluation and operationalization of predictive models.
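How accessible the train-and-deploy loop has become is easy to see in a few lines of scikit-learn. The data is synthetic and the serialization step is a stand-in for whatever model store a real scoring service would use.

```python
# Sketch of the train-and-persist loop with scikit-learn. Data is synthetic;
# pickling stands in for publishing the model to a scoring service.
import pickle
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
X = rng.normal(size=(100, 3))
y = (X[:, 0] > 0).astype(int)  # synthetic binary outcome

model = LogisticRegression().fit(X, y)
blob = pickle.dumps(model)       # in practice, write to a model store
restored = pickle.loads(blob)    # ...and load it inside the scoring service
print("same predictions:", (restored.predict(X) == model.predict(X)).all())
```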

Find the Real-Time Opportunity – As the technology leader in the organization, CIOs will need to work closely with other business leaders to understand where real-time information can increase revenue, decrease costs or both. This may require imagination. Start with the question – what would we like to know faster? If we knew our customer was going to do this sooner, how would we respond? If we knew our equipment was going to fail sooner, how would we respond? If we knew there was an opportunity to sell more, how would we respond?

