Two Key Benefits of HR Analytics

In my last article, I wrote about the definition of HR Analytics and the skills needed to be successful in this field. In this article, I want to discuss two key benefits of HR analytics to the HR function in an organization and to the business: Evidence Based Decisions and Reducing Human Bias.

HR professionals want to be strategic partners with business leaders, not simply a cost center designed to maintain policies and procedures. While these policies are important, analytics provides HR with a means to demonstrably improve the efficiency of a company’s people resources. It does this in several ways.

Evidence Based Management Decisions

Through its dependence upon data and facts, HR Analytics delivers evidence, and evidence trumps intuition. To illustrate this benefit, I’ll ask two questions:

Is your interview process optimized to find the best candidate for a position?

If you have ever participated in an interview process from the hiring perspective, you may be aware that at many companies, interviewing candidates can be an informal, non-standardized process. At worst, interviewers are simply asked by HR, “What did you think?” More sophisticated HR methodologies define a standardized process for who the candidate meets and what questions are asked. At each stage, feedback is collected and quantified, typically in the form of ratings. Are these ratings predictive of future performance in the job role to be filled? HR Analytics can tell you which factors are predictive of high performers in certain job roles (or tell you that you don’t yet know, and that you should either change your process or collect different data points).

Does internal employee training improve company performance?

Most HR professionals would say “Yes, employee training is a good thing and we need to do it.” Many top companies spend precious resources to train their sales staff or send aspiring leaders to leadership training. Does this training have a material impact on performance? On the company’s bottom line? HR Analytics aspires to quantify that benefit. To do this, we may need to pull together data from several systems, such as on-the-job performance data, financial data and data collected during the training process. We should define the performance metrics that matter most in that job role. We must also establish a baseline of performance (i.e., comparable employees who did not take the training). By taking a more scientific approach, we can quantify the benefit and produce evidence of impact. We may also demonstrate that certain training is ineffective.
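
As a rough illustration of the comparison just described, here is a minimal sketch in Python, assuming a single hypothetical extract (the file and column names are invented for illustration) that combines training attendance with a sales performance metric. In practice, the baseline group should first be matched on role, tenure and territory.

```python
# Minimal sketch: compare a performance metric between employees who attended
# training and a comparable baseline group. File and column names below are
# hypothetical placeholders (attended_training is assumed to be a 0/1 flag).
import pandas as pd
from scipy import stats

performance = pd.read_csv("employee_performance.csv")  # hypothetical extract
attended = performance["attended_training"].astype(bool)

trained = performance.loc[attended, "quarterly_sales"]
baseline = performance.loc[~attended, "quarterly_sales"]

# Welch's t-test: is the difference in average performance statistically meaningful?
t_stat, p_value = stats.ttest_ind(trained, baseline, equal_var=False)

print(f"Trained mean:   {trained.mean():,.0f}")
print(f"Baseline mean:  {baseline.mean():,.0f}")
print(f"p-value for the difference: {p_value:.3f}")
```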

Reducing Human Bias

If you have read Michael Lewis’s book The Undoing Project, then you know about the work done by psychologists over the last 50 years to explain how bias distorts the human mind’s ability to perceive information. Quite literally, our personal biases lead us to see things that simply are not there. We all have expectations, and these expectations are based upon hard-won human experience—most of which has served us very well in life. But in the case of making HR judgments, or indeed any judgment requiring us to process large amounts of information, bias is quite detrimental.

In the questions/examples provided above, we see the opportunity for human bias to creep into common HR processes and potentially undermine them. First, let’s examine the interviewing process. As people, we may have expectations about how a qualified candidate dresses, how they speak, and which personality traits are most prominent in a good candidate. These expectations are likely informed by our own experience and by colleagues who made a deep impression on us. Just as likely, information that contradicts those expectations is dismissed. In other words, our human minds are not able to process large amounts of information in a uniform and objective manner. When applied correctly, HR analytics can do this much better. For example, an HR analytics team would consider data collected during the evaluation phase and performance data for successful applicants; in other words, before and after hire. Hopefully, many applicants become very successful at your firm, but you also know that many do not. We can apply a label to each candidate profile, recognizing that the candidate either was or was not successful. We can then train our analytics algorithms to learn what a successful employee looks like, mathematically, at hire time and reduce our human bias. Bear in mind that bias can still creep into the process if interviewers fail to recognize the need for standardization and quantification.
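
To make that idea concrete, here is a minimal sketch, assuming a hypothetical table of historical candidates with standardized interview ratings and a success label; the file, column names and feature choices are placeholders, not a prescribed model.

```python
# Minimal sketch: learn, from historical data, which hire-time ratings are
# associated with later success. File, column and feature names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

candidates = pd.read_csv("historical_candidates.csv")   # hypothetical extract
features = ["technical_rating", "communication_rating", "panel_rating"]

X_train, X_test, y_train, y_test = train_test_split(
    candidates[features], candidates["successful_hire"],
    test_size=0.3, random_state=42,
)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# If the ratings carry no signal, AUC will hover near 0.5 -- a cue to change
# the interview process or collect different data points.
auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"Out-of-sample AUC: {auc:.2f}")
```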

Similarly, as it relates to evaluating training against performance, we see an opportunity for bias to lead to conclusions that are false, or at least unsupported by evidence. Business leaders can (and should) demand this evidence from HR, so that they know capital is being deployed correctly in support of the firm’s financial well-being. To be clear, it can be very difficult to prove causation between training and financial ratios (i.e., that training causes an increase in Net Income). However, HR should be able to provide evidence of correlation between employees who perform well on the job (whether that metric is sales figures or on-time delivery) and those who attend certain training activities. When HR provides evidence of this correlation, it becomes a strategic partner with business leaders, helping them see and understand the patterns in human behavior.
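
As a simple illustration of that kind of correlation evidence, the sketch below assumes a hypothetical extract containing a training-attendance flag and an on-time delivery metric; the names are placeholders.

```python
# Minimal sketch: point-biserial correlation between a 0/1 training-attendance
# flag and an on-the-job performance metric. Names are hypothetical placeholders.
import pandas as pd
from scipy import stats

performance = pd.read_csv("employee_performance.csv")   # hypothetical extract

r, p_value = stats.pointbiserialr(
    performance["attended_training"].astype(int),
    performance["on_time_delivery_rate"],
)
print(f"Correlation between training attendance and performance: {r:.2f} (p = {p_value:.3f})")
```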

See Differently, Know the Facts

Analytics offers HR professionals an opportunity to approach decision making differently. Measurement and quantification of candidate and employee characteristics and performance can provide evidence of correlation between the policies HR is supporting and the outcomes the business seeks to drive. By thinking differently about HR, we can reduce our propensity to see things that are not there, replacing that vision with a clear-eyed, scientific, data-driven approach.

Want to learn more about the world of HR Analytics? We are speaking at this year’s CBIA Human Resources Conference on the topic. We hope to see you there!

About Brian: Brian Berry leads the Microsoft Business Intelligence and Data Analytics practice at BlumShapiro. He has over 15 years of experience with information technology (IT), software design and consulting. Brian specializes in identifying business intelligence (BI) and data management solutions for upper mid-market manufacturing, distribution and retail firms in New England. He focuses on technologies which drive value in analytics: data integration, self-service BI, cloud computing and predictive analytics.

5 Critical Skillsets for HR Analytics

Increasingly, companies are applying analytics and data science procedures to new areas of their business. Human Resources (HR) management, with its central role in managing the people in a business, is one such area. HR Analytics is a fact-based approach to managing people, one that helps organizations validate their assumptions about how best to manage them. This makes good business sense: on average, companies spend 70% of their budget on personnel expenses.

Using data and statistical methods, HR may look to examine people-oriented questions, such as:

  • Can we better understand employee absenteeism rates at a labor-intensive business, such as retail, food service or industrial manufacturing? Can we predict it?
  • Do our compensation realities reflect fair and balanced job classification policies? Asked differently, which factors are most predictive of compensation: ones we want to reward (e.g., education level, on-the-job performance) or ones we need to ignore (e.g., gender, age or race)? A sketch of this kind of analysis follows this list.
  • What is our real employee churn rate? Can we identify employees headed out the door and take preventive steps?
  • Are our service response times keeping pace with spikes in customer demand?
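
For example, the compensation question above could be explored with a regression sketch like the one below; the extract, column names and model form are assumptions for illustration, not a definitive methodology.

```python
# Minimal sketch: which factors actually predict compensation? Extract, column
# names and model form are hypothetical illustrations, not a prescribed method.
import pandas as pd
import statsmodels.formula.api as smf

comp = pd.read_csv("compensation.csv")   # hypothetical extract

model = smf.ols(
    "salary ~ education_years + performance_score + C(gender) + age",
    data=comp,
).fit()

# If the coefficients on gender or age are large and statistically significant
# while performance adds little, the classification policy deserves a review.
print(model.summary())
```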

These questions, and many more, can be answered with datasets, data science and statistics. But how? Analytics involves skill sets that go beyond those considered “traditional.” Knowledge of recruitment, hiring, firing and compensation is key to understanding HR processes. However, HR professionals often struggle to answer these questions in a data-driven manner, because they lack the diverse skills required to perform advanced analytics. These skills include statistical and data analytical techniques, data aggregation, and mathematical modelling. Finding the right data can be another challenge. Data analytics requires data, and that data is likely to reside in several different systems, which is why IT professionals play a critical role. Finally, communication to the business is a key skill. HR Analytics projects may produce analysis and models that contradict conventional wisdom. Acting on these insights requires the team to communicate the what, why and how of data science.

HR Analytics projects require five distinct skillsets to create value for an organization.

  • Without Business input, HR Analytics projects may answer questions with no value added to the organization.
  • Without Marketing input, insights from HR Analytics will fail to be adopted by the business.
  • Without HR input, the team will struggle to recognize relevant data and interpret the outcomes.
  • Without Data Analytics input, analysis will be “stuck in first gear” – producing basic descriptive statistics (e.g., averages and totals), but never advancing to diagnostic (i.e., root cause) or predictive (e.g., machine learning) models.
  • Without IT input, the team struggles to acquire relevant data in a usable format.

HR leaders must engage all the required perspectives and skillsets to be successful with analytics. Business, marketing, HR and IT are common perspectives found in most organizations. But Data Analytics professionals, able to cleanse data, identify candidate predictive models and evaluate model output, are typically lacking. We encourage HR professionals interested in learning more about The Power of Data to reach out to our Data Analytics Advisory Services team. Our goal is to help you understand the data science process, identify business opportunities, and potentially offer analytics services that fill in the missing pieces of your puzzle.


How Much is Your Data Worth?

Data is the new currency in today’s modern businesses. From the largest international conglomerate down to the smallest neighborhood Mom-and-Pop shop, data is EVERYTHING! Without data, you don’t know who to bill for services, or for how much. You don’t know how much inventory you need on hand, or who to buy it from if you run out. Seriously, if you lost all of your data, or even a small but vitally important piece of it, could your company recover? I’m guessing not.

“But,” you say, “We have a disaster recovery site we can switch to!”

That’s fine. If your racks melt down into a pool of heavy metals on the server room floor, then yes, by all means switch over to your disaster recovery site, because molten discs certainly qualify as a “disaster!” Databases hosted on private or public cloud virtual machines are less susceptible, but not immune, to hardware failures. But what about a failure of a lesser nature? What if one of your production databases gets corrupted by a SQL injection attack, cleaned out by a disgruntled employee, or accidentally purged because a developer thought he was working against the DEV environment? Inadvertent changes to data are no respecter of where, or how, such data is stored! And, sorry to say, clustering or other HADR solutions (High Availability/Disaster Recovery, such as SQL Server Always On technology) may not be able to save you in some cases. Suppose some data gets deleted or modified in error. These ‘changes’, be they accidental or on purpose, may get replicated to the inactive node of the cluster before the issue is discovered. After all, the database system doesn’t know whether it should stop such changes from happening when the command to modify data is issued. How can it tell an ‘accidental purge’ from regular record maintenance? So the system replicates those changes to the failover node, and you end up with TWO copies of an incorrect database instead of one good one and one bad! Worse yet, depending on your data replication latency from your primary site to the disaster recovery site, and how quickly you stop the DR site from replicating, THAT copy may get hosed too if you don’t catch it in time!

Enter the DATABASE BACKUP AND RESTORE, the subject of this article. Database backups have been around as long as Relational Database Management Systems (RDBMS). In my humble opinion, a product cannot be considered a full-featured RDBMS unless it can perform routine backups and allows for granular restore to a point in time. (Sorry, but Microsoft Excel and Access simply do not qualify.) Being a Microsoft guy, I’m going to zero in on their flagship product, SQL Server, but Oracle, SAP, IBM and many others have similar functionality. (See the Gartner Magic Quadrant for database systems for a quick look at the various vendors, including Microsoft, a clear leader in this Magic Quadrant.)

So what is a BACKUP? “Is it not simply a copy of the database?” you say, “I can make file copies of my Excel spreadsheet. Isn’t that the same as a backup?” Let me explain how database backups work and then you can decide the answer to that question.

First of all, you’ll need the system to create a FULL database backup. This is a file generated by the database server, stored on the file system, in a format proprietary to that system. Typically, full backups are taken once per night for a moderately sized database (say, under 100 GB) and should be handled via an automated scheduling service such as SQL Agent.

Next, you’ll need TRANSACTION LOG backups. Log backups, as they are known, record every single change in the database that has occurred since the last full or log backup. A good starting point is scheduling log backups at least every hour, with possible tightening down to every few minutes if the database is extremely active.

Now, to restore a database in the event of a failure, you need to do one very important step first: back up the transaction log one last time if you want any hope of restoring to a recent point. To perform the actual restore, you’ll need what is known as the ‘chain of backups’: the most recent full backup and every subsequent log backup. During the restore, you can specify a point in time anywhere from the time of the full backup to the time of the latest log backup, right down to the second or millisecond.
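
To make the chain concrete, here is a minimal sketch of the full, log and point-in-time restore sequence issued from Python via pyodbc. The server, database name, file paths and STOPAT timestamp are hypothetical, and in practice these commands are normally scheduled through SQL Agent jobs rather than ad-hoc scripts.

```python
# Minimal sketch of the full / log / point-in-time restore chain, issued from
# Python via pyodbc. Server, database, paths and the STOPAT timestamp are
# hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=myserver;Trusted_Connection=yes;",
    autocommit=True,  # BACKUP/RESTORE cannot run inside a user transaction
)
cursor = conn.cursor()

def run(tsql: str) -> None:
    """Execute a backup/restore command and drain its informational messages."""
    cursor.execute(tsql)
    while cursor.nextset():  # wait for the command to run to completion
        pass

# Nightly full backup, then hourly (or more frequent) transaction log backups.
run("BACKUP DATABASE [SalesDb] TO DISK = N'D:\\Backups\\SalesDb_full.bak' WITH INIT")
run("BACKUP LOG [SalesDb] TO DISK = N'D:\\Backups\\SalesDb_log1.trn'")

# After a failure: back up the tail of the log one last time, restore the chain,
# and stop at a point in time just before the bad change.
run("BACKUP LOG [SalesDb] TO DISK = N'D:\\Backups\\SalesDb_tail.trn' WITH NO_TRUNCATE")
run("RESTORE DATABASE [SalesDb] FROM DISK = N'D:\\Backups\\SalesDb_full.bak' "
    "WITH NORECOVERY, REPLACE")
run("RESTORE LOG [SalesDb] FROM DISK = N'D:\\Backups\\SalesDb_log1.trn' WITH NORECOVERY")
run("RESTORE LOG [SalesDb] FROM DISK = N'D:\\Backups\\SalesDb_tail.trn' "
    "WITH STOPAT = '2024-05-01T14:35:00', RECOVERY")
```

Note that the connection performing the restore must not be using the database being restored; running the restore from the master database context is the usual approach.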

So we’re all set, right? Almost. The mantra of Database Administrators the world over regarding backups is this: “The backups are only as good as the last time we tested the RESTORE capability.” In other words, if you haven’t tested your ability to restore your database to a particular point in time, you can’t be sure you’re doing it right. Case in point: I once saw a backup strategy where the FULL backups were written directly to a tape drive every night; then, first thing in the morning, the IT guys would dutifully eject the tapes and immediately ship them out to an off-site storage location. How can you restore a database if your backups are not on hand? Case two: the IT guys, not understanding SQL backup functionality and benefits, used a third-party tool to take database backups but didn’t bother with the logs. After four years of this, they had a log that was 15 times the size of the database! So big, in fact, that there was no space available to hold its backup. About a year after I got the situation straightened out, with regular full AND transaction log backups running, the physical server (virtualization was not common practice then) experienced a debilitating hardware failure and the whole system was down for three days. Once running again, the system (a financials software package with over 20,000 tables!) was restored to a point in time right before the failure. Having the daily FULL backups saved the financials system (and the company), but having the log backups as well saved many people a day’s worth of work that would have been lost had we only been able to restore the latest FULL backup.

So, what’s your data worth? If your data is critical to your business, it is critical that you properly back up the data. Talk to us to learn how we can help with this.

About Todd: Todd Chittenden started his programming and reporting career with industrial maintenance applications in the late 1990s. When SQL Server 2005 was introduced, he quickly became certified in Microsoft’s latest RDBMS technology and has added certifications over the years. He currently holds an MCSE in Business Intelligence. He has applied his knowledge of relational databases, data warehouses, business intelligence and analytics to a variety of projects for BlumShapiro since 2011.

Formulate Essential Data Governance Practices

The creation of a Data Governance function at your organization is a critical success factor in implementing Master Data Management. Just like any machine on a factory floor, Master Data is an Asset. An Asset implies ownership, maintenance and value creation: so too with Master Data. To borrow an analogy from the manufacturing world, transactional data is the Widget, and Master Data is one of the machines that makes the Widget. It is part of your organization’s value chain.

Unfortunately, firms starting on the road to MDM fall prey to one of two pitfalls at either extreme of the Data Governance mandate. The first pitfall is to treat MDM as a one-time project, not a program. Projects have an end date; programs are continual. Have you ever heard of an Asset Maintenance Project? That’s a recipe for crisis. Firms which maintain their assets as a program do far better.

The second pitfall is asking the Data Governance team to do too much, too fast. Governance cannot do much without an asset to govern. Have you ever heard of a machinery maintenance program instituted before you figured out what type of machinery you needed, what the output requirements were, or before you made the capital purchase? I haven’t either. First, you acquire the capital. You do so with the expectation that you will maintain it. Then you put it into production. Then you formulate the maintenance schedule and execute that plan.

In order to successfully stand up a Data Governance function for your Master Data program, you’ll need to understand three essential roles in Data Governance: Executive Steering, Data Owners and Data Stewards.

Follow these Do’s and Don’ts:

Do establish an Executive Steering Committee for all Master Data practices in your enterprise, focused upon strategic requirements, metrics and accountability.

Do establish Data Quality Metrics. Tie them to strategic drivers of the business. Review them regularly. Your MDM toolset should provide analytics or dashboards to support this view; a minimal sketch of such metrics appears after this list of do’s and don’ts.

Don’t ask the Steering Committee to own the data model or processes – that is the Data Ownership role.

Do establish a Data Ownership group for each domain of Master Data.  Ownership teams are typically cross-functional, not simply the AR manager for Customer Master Data, or the HR manager for Employee Master Data.  As you evolve down the maturity path, you will find that master data has a broad set of stakeholders – Do be ready to be inclusive.

Do establish regular status meetings where Data Ownership meets with the Executive Steering Committee to review priorities and issues.

Don’t require that Data Owners “handle the data”.  That is the Data Stewardship role.

Do formalize a Data Stewardship team for each domain of Master Data.  Data Stewards are “data people” – business people who live in the data, but with no technical skills required, per se (though technology people can contribute to a Data Stewardship team).

Don’t restrict Data Stewards to just the people who report to the Data Owner – think cross-functional!

Do anticipate conflicts – Data Owners should have some political skills.  The reality is that Master Data is valuable to a broad set of constituencies within an enterprise.  Be practical as it relates to one faction’s “Wish List” and keep moving the ball forward.
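
As promised above, here is a minimal sketch of the kind of Data Quality Metrics a Data Stewardship team might track, computed with Python against a hypothetical customer master extract; a real MDM toolset would surface similar measures through its own dashboards.

```python
# Minimal sketch: data quality metrics for a hypothetical customer master
# extract. Column names and the natural key are placeholders for illustration.
import pandas as pd

customers = pd.read_csv("customer_master.csv")   # hypothetical extract

metrics = {
    # Completeness: share of records with key attributes populated.
    "completeness_tax_id": customers["tax_id"].notna().mean(),
    "completeness_postal_code": customers["postal_code"].notna().mean(),
    # Uniqueness: share of records not flagged as duplicates of an earlier
    # record on the assumed natural key.
    "uniqueness_customer": 1 - customers.duplicated(subset=["name", "postal_code"]).mean(),
}

for name, value in metrics.items():
    print(f"{name}: {value:.1%}")
```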

Without a Data Governance function, MDM tends to be a one-time project (“Clean it Once”) and fails to deliver real value. But without a clear vision of how Data Governance supports MDM, governance itself can hold things up. A rational Data Governance function does not need to hold up the execution of a Master Data project – it supports it. Keep Data Governance strategic, cross-functional, and flexible. Then, let the MDM technology team deliver the tools.