What’s the deal with ‘evidence-based practice’? Aren’t we all evidence based?

To an extent, yes - we all use evidence. But as we argue in our positioning paper, In search of the best available evidence, there are two big things we tend to get wrong when using evidence to inform decisions.

First, we’re often not great at gauging the quality of evidence we’re looking at. There is a well-established hierarchy of scientific evidence on cause-and-effect relationships. The ‘gold standard’ is randomised controlled trials, which carry a good deal more weight than, say, simple before-and-after studies, and far more than surveys run at one point in time. If we take note of this hierarchy when looking at evidence, we are well on the way to making better, more reliable decisions.

Second, we tend to cherry pick evidence that supports our pet theory. It feels great when you find a piece of research that confirms what you long suspected. But barring the ridiculous, the chances are you’ll be able to find research – even good quality research – to back your opinion whatever it is. To find out if a technique is really worth replicating, we should look at the wider body of evidence. So sitting above the hierarchy of single studies, we have a ‘platinum standard’ of systematic reviews and meta-analyses.
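
For readers who like things concrete, the hierarchy described above can be sketched as a simple ordered list. This is my own illustration in Python, not from the positioning paper; the labels are a common simplification, and real appraisal also weighs quality, relevance and sample size, not just study design:

```python
# A common simplification of the evidence hierarchy, strongest first.
# (Illustrative only: real appraisal also weighs study quality,
# relevance and sample size, not just design.)
EVIDENCE_HIERARCHY = [
    "systematic review / meta-analysis",   # the 'platinum standard'
    "randomised controlled trial",         # the 'gold standard'
    "before-and-after study",
    "cross-sectional survey",
    "expert opinion",
]

def stronger_design(a: str, b: str) -> str:
    """Return whichever of two study designs ranks higher in the hierarchy."""
    return min(a, b, key=EVIDENCE_HIERARCHY.index)

print(stronger_design("cross-sectional survey", "randomised controlled trial"))
# prints: randomised controlled trial
```

The point is just the ordering: a single cross-sectional survey should never trump an RCT, and a systematic review sits above any single study.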

Evidence-based HR means being anti-fad and willing to face uncomfortable truths. But it’s hugely important. Relying on weak evidence can feel politically expedient (staying in line with what your boss expects to see) or compelling (in tune with the zeitgeist, intuitively right), yet at its worst it gives little more than a 50% chance of success, the equivalent of flipping a coin. If a decision is important, it’s worth testing the available evidence in this light: how much more scientific is it than a coin toss?
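
To make the coin-toss comparison vivid, here's a quick simulation sketch. It is entirely my own illustration, and the 'validity' figures are made up for the example: evidence with zero predictive validity succeeds about half the time, while even moderately trustworthy evidence pulls well clear.

```python
# Illustrative simulation (not from the post): how often a decision
# succeeds when the evidence behind it has a given predictive validity.
import random

def success_rate(validity: float, trials: int = 100_000) -> float:
    """Share of decisions that succeed when, with probability `validity`,
    the evidence points at the genuinely better option; otherwise the
    choice is effectively a 50/50 guess."""
    wins = 0
    for _ in range(trials):
        if random.random() < validity:
            wins += 1            # evidence identified the better option
        elif random.random() < 0.5:
            wins += 1            # otherwise: a coin toss
    return wins / trials

random.seed(42)
print(f"no real evidence: {success_rate(0.0):.2f}")   # ~0.50
print(f"trustworthy-ish:  {success_rate(0.6):.2f}")   # ~0.80
```

The exact numbers don't matter; the point is that acting on untrustworthy evidence is statistically indistinguishable from flipping a coin.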

There are plenty of thorny questions in evidence-based HR, but the basic principles are pretty simple and, more importantly, worth striving for. Our hope is that this forum will help people put these principles to work and grapple with the challenges. Thoughts?

  • Part of the challenge is that it can be hard to find practitioner-focused evidence. A lot of academic research is aimed at other academics (rather than people managers) and adopts a language that many HR professionals find hard to understand. Also, the research often follows fads and fashions, so we don't necessarily get the evidence base in the right areas of people management.
  • Yes I completely agree Charles. A lot of academic research is not practitioner-focused, and for good reasons. Similarly the language is not helpful - often not even to other academics! And fads and fashions are just as evident in academic research.

    On the other hand, academic research is just one source of evidence and, like any source of evidence, it needs to be judged for its relevance and trustworthiness. And it's also about using the best available evidence. So even if the evidence isn't great, it's still worth checking out.

    As you probably know, more and more academics are becoming concerned about scientific (mal)practices. There's a great summary here by John Antonakis where he discusses what he calls the five diseases of academic publishing:

    retractionwatch.com/.../
  • I hope we'll hear both in this forum. The challenges people are grappling with are an ideal starting point, but it's definitely good for practitioners to be talking about evidence too.
    What aspects of people management are most problematic for your organisation?
    On what basis are these identified as issues?
    What evidence would help you progress on these issues?
  • Rob, to your point of doing this for 15 years, you've clearly had a lot of influence, so don't underestimate that. But clearly it's not easy to get traction with evidence-based practice. I think we have to reflect on this & ask why it's the case. For example:
    1. EBP can feel like a technical point of methodology and I think many people are happy leaving that to the experts. The line of thought being: surely you've got to at least trust the scientists.
    2. Practitioners need to feel they are central to the solution of EBHR, not just consumers of research. So as you suggest, to ask: what problems do you grapple with? But also: what would help HR in your org be more discerning of evidence, or make better use of it? And: what tools would you like to see to help?

  • There's a great quote from Daniel Kahneman that I think goes some way to answering that question. He suggests evidence-based solutions will always struggle to land within the complexity of big business:

    "Experts who acknowledge the full extent of their ignorance may expect to be replaced by more confident competitors, who are better able to gain the trust of clients. An unbiased appreciation of uncertainty is a cornerstone of rationality--but it is not what people and organisations want."

    There's a difficult reality that follows for me - how much compromise (if any) should there be between progress and rigour/credibility? We can say none: we stick to the gold standard of RCTs and communicate with plainly presented objectivity. But we should recognise that the competition from non-evidence-based solutions is highly sophisticated and has evolved to be 'sellable':

    Take the '10 questions' approach - the strength of non-EBHR is to answer them with everything we know people lean towards - certainty (this is the one answer that works), norms + loss aversion (everyone else is doing it, you're falling behind), messenger (this guru says it's right) etc etc. Understanding why it isn't easy to get traction with EBHR is partly about understanding why the alternative is so compelling...

    The question then evolves to: how should we compete? How can we be heard? I imagine the subsequent debate would be similar to the one around a respected academic writing a best-selling psychology book... where's the line?
  • What aspects of people management are most problematic for your organisation?
    On what basis are these identified as issues?
    What evidence would help you progress on these issues?

    Excellent questions, Jonny... and a very helpful lens through which I can view many of the questions and challenges posed elsewhere on this Community by practitioners. Also a very useful mindset for HR professionals to have when considering some of their own work projects and professional objectives - e.g. see this thread.

    A key challenge for me (and others) is to try to 'unlock' and curate the collective experience of why a particular action/set of actions/route was 'successful' (and less successful)... and explore how this might translate to the many different work contexts that are out there.

  • Perhaps we should ask the HR community what 10 pieces of evidence would be most useful for them and then pass this on to the research community?

    Now you've got me thinking...

  • Thanks James - insightful points and I agree, it's not enough to trash woolly thinking; we need to understand why people look to quick and overconfident solutions. This is a great TED Talk from Stuart Firestein on 'the pursuit of ignorance', which speaks to one way we might seek to change our mindsets: www.ted.com/.../stuart_firestein_the_pursuit_of_ignorance

    At a more practical level, I think it will be important to understand how EBHR can be made appealing and do-able (e.g. what are the tools or resources needed?). This is something for which the views of practitioners will be invaluable.

    Like your point on popular science books. In my mind, it's pretty clear that, while there are defined methods one can follow in evidence-based practice, it's also a continuum. We can stretch ourselves to be MORE discerning of evidence, especially when it comes to major decisions or conclusions. Don't know if you saw it, but Kahneman himself gave a mea culpa on parts of his bestseller Thinking, Fast and Slow, admitting “I placed too much faith in underpowered studies”. Fantastic lesson in honesty & humility. retractionwatch.com/.../
  • Feels like this is on a trajectory already Rob.
    I like the idea of seeking out the 10 most difficult problems or questions - maybe CIPD could survey members to determine.

    Another idea for helping EBP support HR, would be for the world of academia to lend their support and services directly to HR practitioners 'in the field' through knowledge transfer partnerships.
  • Thanks Jonny, I haven't seen the TED Talk - one for the commute home. I had seen Kahneman's mea culpa (#mancrush). What a guy ;)

    I reckon you're spot on - appealing and do-able are great words (as are desirable and feasible from the Rubicon Model). I also reckon the tendency of evidence-based peeps is to lean towards the latter (i.e. tools and resources), and the tendency of less credible solutions is to lean towards the former (for obvious reasons).

    So, 'appealing'... how far do you go? Likening 'test, learn and adapt' methodology to the agility of startup cultures that big corporates so admire? Tie up with publications like HBR? Go full on with branding academic research like this from Francesca Gino... hbr.org/.../let-your-workers-rebel.

    I know this will make many academics squirm, but I also suspect that practitioners subscribing to the continuum approach may sympathise. I saw a criticism of an IOPsych approach as 'old wine in new bottles' recently, and thought, 'if the old wine is good and the new bottles sell more of it, what's the problem?'

    So maybe the 10 questions will work, but maybe that offers an alternative answer. What are the solutions that EBHR is most confident about? Which will be easiest to land in a big organisation? How can we repackage those in the sweet spot to be attractive to a practitioner audience?
  • Another Kahneman fanboi, here. Great to see James raise (and Jonny acknowledge) his important contribution to this field, especially in commercial HR.

    I recently sat down with Jacques Quinio of Manpower Group, who's an evangelist for EBHR and has some great insights on how organizations can make practical use of big data, but I challenged him a bit on how much harder it is for SMEs to acquire people data on the scales necessary to make really evidence-based decisions. Both in terms of the quantity of data we have available and the time within which decisions have to be taken, we don't have that luxury. HR practitioners in SMEs, with small teams, most of whose time is occupied with transactional practice, have to straddle the line between data-led and instinct-led decision-making, and Kahneman is our guru for this.

    Our instincts are fallible, but time spent considering the available evidence will inform and improve our instinctive decision-making.

    I'd take issue with Jonny's assertion that "it gives little more than a 50% chance of success". Not on the 50% figure (I'll take that), but on the idea that HR decisions are fail/success binary options. We aren't financial market traders for whom the buy/sell decision is a straightforward win/lose binary state. Our decisions are far more nuanced. Do we hire X or hire Y? If we make the "wrong" decision, we still end up with a qualified, capable employee (most of the time), just one who might not have been as good as the alternative.

    That's not to say that HR is incapable of making business-breaking decisions, but they are usually at the tail-end of a series of failures made at executive level (q.v. BHS).

    As a parting thought, if we assume that EBHR is the pathway to the most effective decision making, to what extent will it therefore be possible for even the most nuanced and sophisticated HR decisions to be automated, given sufficient data of the appropriate quality?
  • Robey said:

    "...if we assume that EBHR is the pathway to the most effective decision making, to what extent will it therefore be possible for even the most nuanced and sophisticated HR decisions to be automated, given sufficient data of the appropriate quality?"

    That is a question, Robey.

    I recall having this exchange about 'empathy' with Peter Cheese and (deleted blog).

  • Nice comments. Making better decisions is the key thing. And not every problem needs a meta-analysis to solve, let alone big data. But there are other things we can do on a smaller scale.
    So, the 50% argument is basically descriptive: technically speaking, some evidence is no better at predicting outcomes than the toss of a coin. It's to recognise that we all like to latch on to evidence that supports our pet theory, especially if it tells an intuitive story, but we shouldn't kid ourselves about being evidence based if the evidence is not trustworthy.
    It's a very different point from marginal gains and what decisions are important. Gary Klein has done some really interesting work on this - he argues that some decisions probably ARE best made by the toss of a coin, if you've eliminated the no-nos and are quibbling over minutiae. I enjoyed this book of his wordery.com/the-power-of-intuition-how-to-use-your-gut-feelings-to-make-better-decisions-at-work-gary-klein-9780385502894 . Incidentally, Klein wrote a paper with & gets name-checked by Kahneman in T,F&S. He's good, even if his book has a corny title.
    Don't think Kahneman is a champion of instinct-led decision-making, though. If anything, the opposite!
    One other thing I'd flag is the different sources of evidence - in particular, there's published scientific research as well as organisational data and expertise. Someone will be posting more on that here soon, I think...
  • Hi Robey.  I'd love to know if Jacques Quinio is an evangelist for evidence-based HR in the way evidence-based practice is generally understood.  Does he just mean big data (which is not what evidence-based HR is about) or does he actually mean this:

  • Sorry that was way too small!

  • I'd hate to speak for Jacques on the basis of a 20-minute seminar and 15-minute face-to-face. Looking over my notes, though, his point on Big Data seems to have been less that it was the answer to all our woes and more that HR practitioners are ignoring its potential, seeing it as a sales/marketing tool rather than a people tool. If I look at the other topics we covered, and particularly at the approach to employee-led performance appraisal, I'd say that the overall push was in the direction of, but not identical to, the EBP infographic.

    For example, he was quite a fan of psychometrics as a tool for employee development, but my understanding of the literature on that subject from practising academics in neurology and psychology (as opposed to former academics in those fields who've developed a product to sell) is that psychometrics are basically pseudoscience.

    However, when dealing with any consultant - especially one at the top of his game - you always have to remember that they too have a product to sell.

    Your point about the emphasis on practice may be semantic, but it's very important and not a perspective I'd thought of before.
  • Thanks, interesting points. I'm not sure I agree that psychometrics = pseudoscience, as there are so many different types of psychometrics with varying validity and reliability.

    Yes, it's ALL about practice and not about evidence as such. But it's also about doing stuff first to discover the evidence about what the real problems/opportunities may be, and only then thinking about what you might do. Things like 'big data' are sold as sort of non-solutions to non-problems, though of course data can help.