What’s the deal with ‘evidence-based practice’? Aren’t we all evidence based?

To an extent, yes - we all use evidence. But as we argue in our positioning paper, In search of the best available evidence, there are two big things we tend to get wrong when using evidence to inform decisions.

First, we’re often not great at gauging the quality of evidence we’re looking at. There is a well-established hierarchy of scientific evidence on cause-and-effect relationships. The ‘gold standard’ is randomised controlled trials, which carry a good deal more weight than, say, simple before-and-after studies, and far more than surveys run at a single point in time. If we take note of this hierarchy when looking at evidence, we are well on the way to making better, more reliable decisions.

Second, we tend to cherry pick evidence that supports our pet theory. It feels great when you find a piece of research that confirms what you long suspected. But barring the ridiculous, the chances are you’ll be able to find research – even good quality research – to back your opinion whatever it is. To find out if a technique is really worth replicating, we should look at the wider body of evidence. So sitting above the hierarchy of single studies, we have a ‘platinum standard’ of systematic reviews and meta-analyses.

Evidence-based HR means being anti-fad and willing to face uncomfortable truths. But it’s hugely important. Relying on weak evidence can feel politically expedient (staying in line with what your boss expects to see) or compelling (in tune with the zeitgeist, intuitively right), yet at its worst it gives little more than a 50% chance of success, the equivalent of flipping a coin. If a decision is important, it’s worth testing the available evidence in this light: how much more scientific is it than a coin toss?

There are plenty of thorny questions in evidence-based HR, but the basic principles are pretty simple and, more importantly, worth striving for. Our hope is that this forum will help people put these principles to work and grapple with the challenges. Thoughts?

  • Hi David, I think the motivation issue is an important one, but no more so for this approach than for any other. Any approach demands taking the time to build influence and engagement, and I would argue that an evidence-based approach is self-serving in that respect, as it brings confidence and therefore control, which we know to be crucial to influence.

    What I sense from comments from practitioners about getting to grips with evidence-based practice is that they experience a kind of paralysis when it comes to getting started with working in this way.

    What I understand Rob, Johnny et al. to be advocating is a simple set of considerations for gathering evidence in the first place (e.g. whatever is available from various sources of evidence such as professional experience, organisational data...), and then another set for putting them into practice (e.g. identifying a clear set of goals to focus on), combined with a self-aware attitude to analysing and presenting the data one obtains.

    I sense that practitioners are stymied by reading, from the academic origins of this approach, that there is a 'proper' way of doing things, when actually there's a good element of just getting one's hands dirty and trying stuff out - essentially what organisations do anyway, but with slightly (and increasingly) more method to the madness.

    I also think that the fashionable focus on 'big data' is part of the issue. It is overwhelmingly challenging to work with unless you have the resources of a tech co like Citymapper whose business model is focused on exploiting it.

    We need to get over that hurdle to move this forward. I think a great starting point is for HR practitioners to talk to colleagues in marketing, for example, who are used to working with both quantitative and qualitative data in their work, which like HR is not an exact science.
  • Thanks Johnny. Time's a little tight at the moment but I'd be happy to write a post.
  • Hi David,

    The blog reminded me of Adam Grant, who does a great line on the 'narcissism of small differences' - parties who share similar views but bicker relentlessly on the details, and therefore halt the progress of their shared goals.

    Evidence-based HR vs. 'Experimental HR' or 'Scientific HR' strikes me as a conceit - and a potentially damaging debate to hold in front of an already confused audience. I don't think anyone's suggesting there can be certainty in human behaviour; everything is inconclusive. But this model is about trying to reduce that empirically, and - as Rob says - typically finding out we didn't know as much as we thought along the way.

    You're right that there are no perfect answers or models, and much will depend on context. But we can still strive for better, more informed ways to gather knowledge and make judgements in our workplaces.

    The scientific method has done us alright elsewhere - I hope we could all agree we should take from it everything we can.
  • Thanks Pietro - that's a powerful share. There's no quick fix for sure.

    I'll continue to enjoy and use your work: keep it coming...
  • Lots of really interesting ideas in the discussion. But something's missing - how does this link with basic training of HR professionals, and the CIPD syllabus?

    All CIPD members are exposed to this, and it (should) provide some linkage between academic and managerial practice in the areas of HR, organisational behaviour etc. So there needs to be a clear space in the curriculum for EB perspectives; tutors need to bring this into the way they deliver sessions, and so on. We need to find ways to encourage tutors to understand what EBP means, to incorporate it into their teaching, and to start encouraging students to adopt a more critical stance towards what they encounter as information for decision-making.

    The initial steps in Toulmin's scheme for analysing arguments, for example, draw students' attention (a) to the fact that they have made a 'claim', a statement of fact/opinion, which (b) needs or rests on data to substantiate it. Once they have absorbed these ideas (which takes time), they can address questions such as 'what's the quality of the data?' And so on.
  • Bloomin' Google - always one step ahead!

    Nice story about accessible EBHR...

    rework.withgoogle.com/.../