The most useful analytics begin with a clear question that is relevant to the business. For example: does it pay to keep recruiting at small universities, or should we stick with the big ones? This gives the analytics team clear direction, and they know their work will produce an actionable recommendation that matters to the business.
Often, it’s helpful to frame the question as a specific hypothesis, such as: “We hypothesize that the cost per hire at small universities is high because we recruit fewer people per campus visit.” This isn’t an argument; it’s a statement to be tested against data. Once we have a specific hypothesis, it usually becomes clear what data or other evidence we need to determine whether it holds.
However, I was asked a good question at a recent mini-workshop in Barrie, Ontario: What if there are too many hypotheses?
Consider the case of customer complaints about front line staff. We can easily think of many different hypotheses for what actions might reduce complaints: hire staff who are more agreeable, have a longer onboarding period to teach customer service skills, adjust incentives to alter behavior, change how managers are trained — the list goes on.
How is an analytics team meant to study all these hypotheses? Aren’t we forced to go back to relying on intuition about what to change?
It’s true that we can’t study every hypothesis, but we don’t need to give up on analytics. The starting point is that people directly in the business probably have a pretty good sense of which hypotheses are most relevant. Listen to them, and at least we’ll know where to start rather than trying to address the whole list. The approach isn’t perfect, since managers in the business will at times be shortsighted or have pet theories unsupported by data, but lack of perfection isn’t a reason not to move forward.
This brings us to a broader point. In HR analytics we can never test all the options, nor can we analyze data to the point of an indisputable conclusion. We have to rely on intuition and experience to point us toward hypotheses worth testing. We should remember that our goal is not to make perfect decisions, just better ones within the practical constraints we face. As analytics professionals in a business we can’t be purists, and that’s okay; making things better is a worthy accomplishment.
Special thanks to our community of practice for these insights. The community is a group of leading organizations that meets monthly to discuss analytics and evidence-based decision making in the real world. If you’re interested in moving down the path towards a more effective approach to people analytics, then email me at dcreelman@creelmanresearch.com.