What Do You Do When You Have Too Many Hypotheses?

The most useful analytics begin with a clear question that is relevant to the business. For example: does it pay to keep recruiting at small universities, or should we stick with the big ones? A clear question gives the analytics team direction and the assurance that their work will produce an actionable recommendation that matters to the business.

Often, it’s helpful to frame the question as a specific hypothesis, such as: “We hypothesize the cost per hire at small universities is high because we recruit fewer people per campus visit.” This isn’t an argument; it’s a statement to be tested against data. Once we have a specific hypothesis, it usually becomes clear what data or other evidence we need to see whether it holds.

However, I was asked a good question at a recent mini-workshop in Barrie, Ontario: What if there are too many hypotheses?

Consider the case of customer complaints about front-line staff. We can easily think of many different hypotheses for what actions might reduce complaints: hire staff who are more agreeable, extend onboarding to teach customer service skills, adjust incentives to alter behavior, change how managers are trained — the list goes on.

How is an analytics team meant to study all these hypotheses? Aren’t we forced to go back to relying on intuition about what to change?

It’s true that we can’t study every hypothesis, but we don’t need to give up on analytics. The starting point is to talk directly to people in the business: they usually have a good sense of which hypotheses are most relevant. Listening to them tells us where to start, rather than trying to work through the whole list. This approach isn’t perfect — managers will at times be shortsighted, or hold pet theories unsupported by data — but lack of perfection isn’t a reason not to move forward.


This brings us to a broader point. In HR analytics we can never test every option, nor can we analyze data to the point of an indisputable conclusion. We do have to rely on intuition and experience to point us to hypotheses worth testing. Our goal is not to make perfect decisions, just better ones within the practical constraints we face. As analytics professionals in a business we can’t be purists, and that’s okay: making things better is a worthy accomplishment.

Special thanks to our community of practice for these insights. The community is a group of leading organizations that meets monthly to discuss analytics and evidence-based decision making in the real world. If you’re interested in moving toward a more effective approach to people analytics, email me at dcreelman@creelmanresearch.com.

David Creelman

David Creelman, CEO of Creelman Research, is a globally recognized thinker on people analytics and talent management. Some of his more interesting projects include:

  • Conducted workshops around the world on the practical aspects of people analytics
  • Took business leaders from Japan’s Recruit Co. on a tour of US tech companies (Recruit eventually bought Indeed.com for $1 billion)
  • Studied the relationship between Boards and HR (won Walker Award)
  • Spoke at the World Bank in Paris on HR reporting
  • Co-authored Lead the Work: Navigating a world beyond employment with John Boudreau and Ravin Jesuthasan. The book was endorsed by the CHROs of IBM, LinkedIn and Starbucks.
  • Worked with Dr. Wanda Wallace on “Leading when you are not the expert” which topped the “Most Popular List” on the Harvard Business Review’s blog.
  • Worked with Dr. Henry Mintzberg on peer coaching; David’s learning modules are among the most popular topics.

Currently David is helping organizations to get on-track with people analytics.

This work led to him being made a Fellow of the Centre for Evidence-Based Management (Netherlands) for his contributions to the field.
