What’s the deal with ‘evidence-based practice’? Aren’t we all evidence-based?

To an extent, yes - we all use evidence. But as we argue in our positioning paper, In search of the best available evidence, there are two big things we tend to get wrong when using evidence to inform decisions.

First, we’re often not great at gauging the quality of evidence we’re looking at. There is a well-established hierarchy of scientific evidence on cause-and-effect relationships. The ‘gold standard’ is randomised controlled trials, which carry a good deal more weight than, say, simple before-and-after studies, and far more than surveys run at one point in time. If we can take note of this hierarchy in looking at evidence, we are well on the way to making more reliable, better decisions.

Second, we tend to cherry pick evidence that supports our pet theory. It feels great when you find a piece of research that confirms what you long suspected. But barring the ridiculous, the chances are you’ll be able to find research – even good quality research – to back your opinion whatever it is. To find out if a technique is really worth replicating, we should look at the wider body of evidence. So sitting above the hierarchy of single studies, we have a ‘platinum standard’ of systematic reviews and meta-analyses.

Evidence-based HR means being anti-fad and willing to face uncomfortable truths. But it’s hugely important. Relying on weak evidence can feel politically expedient (staying in line with what your boss expects to see) or compelling (in tune with the zeitgeist, intuitively right), yet at its worst it gives little more than a 50% chance of success, the equivalent of flipping a coin. If a decision is important, it’s worth testing the available evidence in this light: how much more scientific is it than a coin toss?

There are plenty of thorny questions in evidence-based HR, but the basic principles are pretty simple and, more importantly, worth striving for. Our hope is that this forum will help people put these principles to work and grapple with the challenges. Thoughts?

  • Part of the challenge is that it can be hard to find practitioner-focused evidence. A lot of academic research is aimed at other academics (rather than people managers) and adopts a language that many HR professionals find hard to understand. Also, the research often follows fads and fashions, so we don't necessarily get the evidence base in the right areas of people management.
  • Yes, I completely agree, Charles. A lot of academic research is not practitioner-focused, and for good reasons. Similarly, the language is not helpful - often not even to other academics! And fads and fashions are just as evident in academic research.

    On the other hand, academic research is just one source of evidence and, like any source of evidence, it needs to be judged for its relevance and trustworthiness. And it's also about using the best available evidence. So even if the evidence isn't great, it's still worth checking out.

    As you probably know, more and more academics are becoming concerned about scientific (mal)practices. There's a great summary here by John Antonakis where he discusses what he calls the five diseases of academic publishing:

    retractionwatch.com/.../
  • Hi Robey.  I'd love to know if Jacques Quinio is an evangelist for evidence-based HR in the way evidence-based practice is generally understood.  Does he just mean big data (which is not what evidence-based HR is about) or does he actually mean this:

  • Sorry that was way too small!

  • I'd hate to speak for Jacques on the basis of a 20-minute seminar and 15-minute face-to-face. Looking over my notes, though, his point on Big Data seems to have been less that it was the answer to all our woes and more that HR practitioners are ignoring its potential, seeing it as a sales/marketing tool rather than a people tool. If I look at the other topics we covered, and particularly at the approach to employee-led performance appraisal, I'd say that the overall push was in the direction of, but not identical to, the EBP infographic.

    For example, he was quite a fan of psychometrics as a tool for employee development, but my understanding of the literature on that subject from practising academics in neurology and psychology (as opposed to former academics in those fields who've developed a product to sell) is that psychometrics are basically pseudoscience.

    However, when dealing with any consultant - especially one at the top of his game - you always have to remember that they too have a product to sell.

    Your point about the emphasis on practice may be semantic, but it's very important and not a perspective I'd thought of before.
  • What a great example. We often think a big question - like 'how do we improve learning in our organisation?' - needs a big answer - like 'a £1m LMS'. There's a great book out by Owain Service from the BIT that aims to debunk just that...

    www.behaviouralinsights.co.uk/.../

    I bet there is plenty of 'small steps' work out there that never gets captured or shared. Especially when competing with the glossy campaigns of big corporate HR and their consultancies.

    Perhaps you could outline one of your experiments, Liam? What approach did you take to measuring your interventions and their impact on learning? What were your results?

    Jonny/Rob - the salesman in me thinks that retelling and heroising some of these stories/examples alongside tools and resources might be impactful.
  • James, I find your arguments compelling - here and above ('How should we compete? How can we be heard?'). I think part of the challenge is being evidence-based ourselves.

    Most of CEBMa's resources, tools and training are based on the hard science of how people learn and build skills: https://www.researchgate.net/publication/260178378_The_Science_of_Training_and_Development_in_Organizations_What_Matters_in_Practice

    So we may not sell evidence itself with overstatements, but can we sell a path to understanding and using better-quality evidence, with confidence about specific outcomes? I think we could begin by defining what outcomes we aim to achieve, and at what level of analysis - maybe beginning at the individual level?

  • Exactly. In the absence of strong or reliable evidence you can still proceed but, as you say, more cautiously than if that wasn't the case.
  • I agree with most but not the bit about defining outcomes. First you need evidence to define and identify problems (or opportunities)! In other words, I think that starting with 'outcomes' in mind (e.g., let's improve performance) can be very unhelpful and misleading. Clearly, those interested in evidence-based practice do not yet have compelling evidence that there is a 'problem' which EBP can help fix.
  • Thanks, interesting points. I'm not sure I agree that psychometrics = pseudoscience, as there are so many different types of psychometrics with varying validity and reliability.

    Yes, it's ALL about practice and not about evidence as such. But it's also about doing stuff first to discover the evidence about what the real problems/opportunities may be, and only then thinking about what you might do. Things like 'big data' are sold as a sort of non-solution to non-problems, though of course data can help.
  • Thanks, Pietro – I’m pleased to hear that, I’ve been blown away by the work you and the team are doing at ScienceForWork.

    The paper on training and development is great. The area I’m most intrigued by is described as ‘individual differences’ - and in particular the components of self-efficacy, goal orientation and motivation. It’s an oversimplification, but I would position these as providing an intent to learn and practice EBHR. Until you have that, there’s little drive to use the tools and resources that are available.

    We know quite a lot about amplifying intent – take the messenger effect I mentioned above – we’re influenced by who communicates information; we’re more likely to act when that person or institution has authority, we trust them, and they're ‘like us’.

    There’s a problem with the latter. EBHR advocates will typically tick the first two boxes, but the approach feels very unfamiliar to an ‘everyday’ HR practitioner. The language is different and the expertise of an HR professional (and therefore their self-efficacy) may feel under threat or undermined (particularly when we spend so much time ‘debunking’ concepts that are so widely used in the community).

    Now debunking should of course continue, and the language of academia is here to stay. But how can you also make advocates of messengers like Liam, who are taking small, accessible steps in the field, and whom HR practitioners are able to relate to on a day-to-day basis…? How do you begin to hold early-adopter practitioners up as heroes? As pioneers, even?

    Just an example of course - and there’s far more than the messenger effect to take into account - but that’s where I’m going with the whole feasible and desirable thing. If we want to change the behaviour of HR professionals, we should use the science of behavioural change.
  • James, thank you for your comment about ScienceForWork - we are delighted to hear that.

    I agree that our effort can be oriented to empowering individuals rather than encumbering them, and we have good science supporting how people can begin and sustain the path forward.

    I also acknowledge the importance of celebrating successes in applying the EB approach. We need a balance between highlighting malpractice and showing/understanding the benefits of good practice.

    Noted down the book from BIT that you've suggested, thanks for sharing.

    And about debunking claims and oversimplifications, I guess it takes time and a willingness to explain things. I particularly liked these lessons from the medical field: www.vox.com/.../fight-fake-news-doctors-medical-community
  • Thanks Pietro - that's a powerful share. There's no quick fix for sure.

    I'll continue to enjoy and use your work: keep it coming...