
What’s the deal with ‘evidence-based practice’? Aren’t we all evidence based?

Jonny


CIPD Staff

25 May, 2017 14:55

To an extent, yes - we all use evidence. But as we argue in our positioning paper, In search of the best available evidence, there are two big things we tend to get wrong when using evidence to inform decisions.

First, we’re often not great at gauging the quality of the evidence we’re looking at. There is a well-established hierarchy of scientific evidence on cause-and-effect relationships. The ‘gold standard’ is the randomised controlled trial, which carries a good deal more weight than, say, a simple before-and-after study, and far more than a survey run at one point in time. If we take note of this hierarchy when weighing up evidence, we are well on the way to making better, more reliable decisions.

Second, we tend to cherry-pick evidence that supports our pet theory. It feels great when you find a piece of research that confirms what you long suspected. But barring the ridiculous, the chances are you’ll be able to find research – even good-quality research – to back your opinion, whatever it is. To find out if a technique is really worth replicating, we should look at the wider body of evidence. So sitting above the hierarchy of single studies, we have a ‘platinum standard’: systematic reviews and meta-analyses.
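To make the hierarchy concrete, here's a toy sketch of how you might encode it when sifting through a pile of studies. The ranks are purely illustrative, not a formal weighting scheme:

    # Toy illustration only: encode the evidence hierarchy and sort studies
    # so the strongest designs come first. The ranks are illustrative, not a
    # formal weighting scheme.
    EVIDENCE_HIERARCHY = [
        "systematic review / meta-analysis",  # the 'platinum standard'
        "randomised controlled trial",        # the 'gold standard'
        "before-and-after study",
        "cross-sectional survey",             # run at one point in time
    ]

    def evidence_rank(design: str) -> int:
        """Lower rank = stronger evidence on cause and effect."""
        return EVIDENCE_HIERARCHY.index(design)  # ValueError if design unknown

    studies = [
        ("engagement survey", "cross-sectional survey"),
        ("goal-setting meta-analysis", "systematic review / meta-analysis"),
        ("new appraisal form pilot", "before-and-after study"),
    ]
    studies.sort(key=lambda s: evidence_rank(s[1]))  # strongest evidence first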

Evidence-based HR means being anti-fad and willing to face uncomfortable truths, but it’s hugely important. Relying on weak evidence can feel politically expedient (staying in line with what your boss expects to see) or compelling (in tune with the zeitgeist, intuitively right), yet at its worst it gives little more than a 50% chance of success - the equivalent of flipping a coin. If a decision is important, it’s worth testing the available evidence in this light: how much more scientific is it than a coin toss?

There are plenty of thorny questions in evidence-based HR, but the basic principles are pretty simple and, more importantly, worth striving for. Our hope is that this forum will help people put these principles to work and grapple with the challenges. Thoughts?

  • In reply to Pietro:

    Thanks, Pietro – I’m pleased to hear that; I’ve been blown away by the work you and the team are doing at ScienceForWork.

    The paper on training and development is great. The area I’m most intrigued by is described as ‘individual differences’ - and in particular the components of self-efficacy, goal orientation and motivation. It’s an oversimplification, but I would position these as providing an intent to learn and practice EBHR. Until you have that, there’s little drive to use the tools and resources that are available.

    We know quite a lot about amplifying intent – take the messenger effect I mentioned above – we’re influenced by who communicates information; we’re more likely to act when that person or institution has authority, when we trust them, and when they’re ‘like us’.

    There’s a problem with the last of these. EBHR advocates will typically tick the first two boxes, but the approach feels very unfamiliar to an ‘everyday’ HR practitioner. The language is different, and the expertise of an HR professional (and therefore their self-efficacy) may feel under threat or undermined (particularly when we spend so much time ‘debunking’ concepts that are so widely used in the community).

    Now debunking should of course continue, and the language of academia is here to stay. But how can you also make advocates of messengers like Liam, who are taking small, accessible steps in the field, and whom HR practitioners are able to relate to on a day-to-day basis…? How do you begin to hold early-adopter practitioners up as heroes? As pioneers, even?

    Just an example of course - and there’s far more than the messenger effect to take into account - but that’s where I’m going with the whole feasible and desirable thing. If we want to change the behaviour of HR professionals, we should use the science of behavioural change.
  • In reply to James:

    Thanks James.

    In my view, one thing to consider is that, when you start working in this way at least, your methodology doesn't need to be solid; you just need to be discerning in how you work with the data you get. You can then use what you learn to develop a more robust methodology.

    For example, one simple thing we did was to flag MOOCs in our monthly L&D mail-out, using bitly.com to track click-throughs. It was quite rough and ready, but we got a sense of the topics people were curious about, and that online learning was still worth pursuing. We also saw a significant increase this year in people proposing online learning as a solution to learning needs in their personal development plans (PDPs). That might be due to various things, but combining it with anecdotal evidence and the click-through data gave a better indication that people are keen to know more. On the one hand, we now have a (slightly) better sense of the kinds of things to focus our time on sourcing; on the other, we know we should trial providing support to help people research and work with these learning resources themselves.
    The question of how to measure use of external resources is still a challenge, but in the process of writing this I've realised we can evolve our PDP process this year into an L&D survey, which should help with that - so thanks for asking your question!
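    For anyone curious, here's roughly the sort of tallying involved - a minimal sketch in Python, assuming a CSV export of per-link clicks (the file and column names are hypothetical):

        # Minimal sketch: tally bitly-style click counts per topic from a CSV
        # export and rank topics by click-through rate. File and column names
        # ("topic", "clicks", "recipients") are hypothetical.
        import csv
        from collections import defaultdict

        clicks = defaultdict(int)
        sent = defaultdict(int)

        with open("mailout_links.csv", newline="") as f:
            for row in csv.DictReader(f):
                clicks[row["topic"]] += int(row["clicks"])
                sent[row["topic"]] += int(row["recipients"])  # assumes > 0

        # Rank topics by click-through rate to see what people are curious about.
        for topic in sorted(clicks, key=lambda t: clicks[t] / sent[t], reverse=True):
            print(f"{topic}: {clicks[topic] / sent[topic]:.1%} click-through")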
  • In reply to Liam:

    Possibly just to stir the pot by linking to this:

    www.aconventional.com/.../is-evidence-based-hr-another-fad.html

    Just musing that a lot (most?) of management - including but not 'just' HRM - involves the motivation of human beings. Whilst I'm all for this being based on (social-)scientific inquiry and evidence, I think the relevant / available evidence will usually tend to conflict and / or be inconclusive or arguable - simply because, in the present state of our knowledge about human motivation, there exists only a motley body of conflicting theories and by no means any adequate scientific model. Even if one existed, applying it to particular, unique situations would raise many judgement calls too?

  • In reply to James:

    James, thank you for your comment about ScienceForWork, we are delighted to hear that.

    I agree that our efforts can be oriented to empowering individuals rather than encumbering them, and we have good science to support how people can begin and sustain the path forward.

    I also acknowledge the importance of celebrating successes in applying the EB approach. We need a balance between highlighting malpractice and showing/understanding the benefits of good practice.

    I've noted down the book from BIT that you suggested - thanks for sharing.

    And on debunking claims and oversimplifications, I guess it takes time and a willingness to explain things. I particularly liked these lessons from the medical field: www.vox.com/.../fight-fake-news-doctors-medical-community
  • Jonny


    CIPD Staff

    3 Jun, 2017 16:01

    In reply to Liam:

    Liam, I completely agree with James - examples like yours show how practitioners really can be evidence-based. Thanks for sharing. If you are up for it, a new post or blog recounting your journey in taking an EB approach to L&D would be brilliant. Email me if I can help: j.gifford@cipd.co.uk

  • In reply to David:

    Hi David, I think the motivation issue is an important one, but no more so for this approach than for any other - any approach demands taking the time to build influence and engagement. I would argue that an evidence-based approach is self-serving in that respect, as it brings confidence and therefore control, which we know to be crucial to influence.

    What I sense from comments from practitioners about getting to grips with evidence-based practice is that they experience a kind of paralysis when it comes to getting started with working in this way.

    What I understand Rob, Jonny et al. to be advocating is a simple set of considerations for gathering evidence in the first place (e.g. whatever is available from various sources such as professional experience, organisational data...), then another set for putting it into practice (e.g. identifying a clear set of goals on which to focus), combined with a self-aware attitude to analysing and presenting the data one obtains.

    I sense that practitioners are stymied by the impression, given the academic origins of this approach, that there is a 'proper' way of approaching things, when actually there's a good element of just getting one's hands dirty and trying stuff out - essentially what organisations do anyway, but with slightly (and increasingly) more method to the madness.

    I also think that the fashionable focus on 'big data' is part of the issue. It is overwhelmingly challenging to work with unless you have the resources of a tech company like Citymapper, whose business model is focused on exploiting it.

    We need to get over that hurdle to move this forward. I think a great starting point is for HR practitioners to talk to colleagues in marketing, for example, who are used to working with both quantitative and qualitative data in their work, which, like HR, is not an exact science.
  • In reply to Jonny:

    Thanks, Jonny. Time's a little tight at the moment, but I'd be happy to write a post.
  • In reply to David:

    Hi David,

    The blog reminded me of Adam Grant, who does a great line on the 'narcissism of small differences' - parties who share similar views but bicker relentlessly over the details, and so stall progress towards their shared goals.

    Evidence-based HR vs. 'Experimental HR' or 'Scientific HR' strikes me as a conceit - and a potentially damaging debate to hold in front of an already confused audience. I don't think anyone's suggesting there can be certainty in human behaviour; everything is inconclusive to a degree. But this model is about trying to reduce that uncertainty empirically, and - as Rob says - typically finding out along the way that we didn't know as much as we thought.

    You're right there are no perfect answers or models, and much will depend on context. But we can still strive to better, more informed ways to gather knowledge and make judgements in our workplaces.

    The scientific method has done us alright elsewhere - I hope we can all agree we should take from it everything we can.
  • In reply to Pietro:

    Thanks Pietro - that's a powerful share. There's no quick fix for sure.

    I'll continue to enjoy and use your work: keep it coming...
  • Lots of really interesting ideas in the discussion. But something's missing - how does this link with basic training of HR professionals, and the CIPD syllabus?

    All CIPD members are exposed to this, and it (should) provide some linkage between academic and managerial practice in the areas of HR, organisational behaviour and so on. So there needs to be a clear space in the curriculum for EB perspectives; tutors need to bring this into the way they deliver sessions, and so on. We need to find ways to encourage tutors to understand what EBP means, to incorporate it into their teaching, and to start encouraging students to adopt a more critical stance towards the information they encounter for decision-making.

    The initial steps in Toulmin's scheme for analysing arguments, for example, draw students' attention (a) to the fact that they have made a 'claim', a statement of fact or opinion, that (b) needs, or rests on, data to substantiate it. Given these things (which take time for students to absorb), they can then address questions such as 'what's the quality of the data?' and so on.
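    For what it's worth, a rough sketch of how those first steps might be represented for students - the field names are my own shorthand, not Toulmin's full scheme:

        # Rough sketch of the first Toulmin steps: (a) a claim, and (b) the
        # data it rests on, plus the follow-up question about data quality.
        # Field names are shorthand, not Toulmin's full terminology.
        from dataclasses import dataclass, field

        @dataclass
        class Claim:
            statement: str                                 # (a) the claim made
            data: list[str] = field(default_factory=list)  # (b) what it rests on

            def quality_questions(self) -> list[str]:
                # The next stage: interrogate each piece of supporting data.
                return [f"What's the quality of: {d}?" for d in self.data]

        c = Claim("Appraisals improve performance",
                  data=["one before-and-after study", "manager anecdotes"])
        print(c.quality_questions())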
  • In reply to Jonny:

    Bloomin' Google - always one step ahead!

    Nice story about accessible EBHR...

    rework.withgoogle.com/.../