What do you think are the areas where people often over-generalise findings? (that people should be aware of)

As organisations start playing with these tools and practices, I think it’s pretty easy for people to take a small finding or result and generalise it to the rest of their organisation, where it may not hold true.

For example, I’m thinking of a time when a pattern emerged from survey result data suggesting that people in Finance had a certain experience that was slightly more positive (a few percentage points) than the average. And what was a small difference became “finance people are much more engaged/stronger in leadership/etc. than the rest of the business”.

I’d be curious whether your experience points towards a few key ways that people can be a tad more cautious when making sense of their people analytics insights?

Fantastic observation and question, Chris. You’re absolutely right. In a thirst for certainty, analysts as well as those consuming insight often see variability as a salient truth or evidence of a certain dynamic. This is dangerous because, in most cases, analytics grabs a snapshot at a certain point in time or over a period of time. In either case we’re looking into history, and the dynamics involved in that view into the past have almost certainly shifted in some way. Does this mean the data and/or insights are worthless? Of course not. It does, however, mean they need to be interpreted properly.

Let’s say in your example that the variability in Finance’s survey results was consistent over three quarters. Do those few percentage points validate the notion that “finance people are much more engaged”? They may, yet they may not. How many people responded in Finance? How is engagement being defined? What is the variability among respondents? What is the variability in responses period over period (quarter over quarter in this example)? What is the tenure of the leader or leadership team? In other words, there are ways to provide context to the data that will help ascertain whether or not that difference relative to other functions is meaningful and actionable.
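To make the "how many people responded?" question concrete, here's a minimal sketch of a two-proportion z-test, one standard way to ask whether a gap between two survey favourability rates is bigger than sampling noise alone would explain. The numbers are entirely hypothetical (a 78% favourable Finance team of 40 vs. 74% for 900 others), not drawn from the example above:

```python
import math

def two_prop_z(p1, n1, p2, n2):
    """Two-proportion z-test: is the gap between two favourability
    rates larger than sampling noise alone would explain?"""
    # Pooled favourability rate across both groups
    p_pool = (p1 * n1 + p2 * n2) / (n1 + n2)
    # Standard error of the difference under the null (no real gap)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical numbers: Finance at 78% favourable from 40 respondents,
# the rest of the business at 74% from 900 respondents.
z, p = two_prop_z(0.78, 40, 0.74, 900)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With a team of only 40 respondents, a four-point gap typically doesn't come close to statistical significance, which is exactly why "how many people responded?" is the first question to ask.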

Given the above, here are a few ways analysts, HR professionals, leaders, and others can interpret people-related insights more appropriately (using that in place of “cautious”):

  1. Understand we’re looking for insights that elevate confidence. We’re not looking for “answers”. Answers imply certainty, and if you want that, get into chemistry or some other more linear discipline. In People Analytics we’re talking about gaining insights into individual, team, group, and organizational behavior over time. This work is not for the faint of heart. We’re merely looking to brighten the light on what’s happening and, in some cases, what’ll likely happen in the future.
  2. Start with the story. This is a bit controversial, yet it’s been working well for me for many years. In short, what is the narrative that inspires a certain decision/change? Do the data or insights being considered inspire confidence in that story, discredit that story, or are they totally irrelevant to the story? If the latter, why are they taking people’s time and mindshare? If the story is discredited, that’s good. What was learned, and what’s the best way forward? If the story is supported, how much has it elevated confidence? Is there more that can be done? Additional data? More appropriate variables? Different types of analysis? With few notable exceptions, I do not advocate trying to wrap a story around a certain measure or segmented analysis. Capturing the outline of the story, the set of hypotheses, and what internal customers are already thinking gives the related data context and, as a result, the insight is likely to be better received and acted upon.
  3. This last one is a bit more mundane yet no less important: Record, say in video or in short use cases, how to interpret data. Give your internal customers experiences that show them what false positives look like, what bias looks like, what meaningful (and un-meaningful) variability looks like. Many are said to be intimidated by data, analytics, statistics, story-telling, etc. While I’m compassionate to this, it’s nearly a non-negotiable in our personal lives as well as our professional lives. It’s also easier than many people think; and while I certainly appreciate different ways of being and varying strengths person to person, I do know that learning how to interpret data is very analogous to learning how to read. Yes, it’s a different language, yet the logic flows much like sentences and stories. In the end, like learning anything, to learn means being engaged in what’s being learned – practice. We need to practice this work and make mistakes, to the extent possible, in controlled environments. Again, for better or worse, data interpretation, communication, and change facilitation are now the norm. All professionals inside and outside of HR must grow accordingly.
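On the "show them what false positives look like" point, a simulation makes the lesson tangible. This sketch (with made-up parameters: 12 teams, 30 respondents each) gives every team the exact same true 75% favourability, then shows how large the apparent spread between "best" and "worst" teams can look from sampling noise alone:

```python
import random

random.seed(42)  # fixed seed so the demo is reproducible

# Hypothetical setup: every team has the SAME true favourability rate,
# so any gap between teams below is purely sampling noise.
TRUE_RATE, TEAM_SIZE, N_TEAMS = 0.75, 30, 12

rates = []
for _ in range(N_TEAMS):
    # Simulate each respondent answering "favourable" with probability 0.75
    favourable = sum(random.random() < TRUE_RATE for _ in range(TEAM_SIZE))
    rates.append(favourable / TEAM_SIZE)

spread = max(rates) - min(rates)
print(f"best team {max(rates):.0%}, worst {min(rates):.0%}, spread {spread:.0%}")
```

Even though no team is genuinely different, the top and bottom teams typically land ten or more percentage points apart – a vivid way to teach internal customers why a few-point gap on a small team shouldn't become a headline.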

Thanks for your question, Chris; and hope this helps!


That is a thoroughly awesome answer, thanks @Al_Adamsen :+1: Some really practical tips in there