
The behavioral evidence hub

One of the most valuable things I learned from Rob Briner (of this parish) is that there are many types of evidence. It doesn't always have to be a peer-reviewed academic paper with findings based on randomized controlled trials.

In the domain of behavioural science, a relatively new site has emerged, known as the Behavioral Evidence Hub, or B-hub for short. I think it's a pretty cool resource which, in its own words, seeks to bridge "the longstanding gap between research and the real world by collecting evidence-based, behaviorally-informed solutions and bringing them directly to the people who can put them to work."

I really like the idea behind it: showcasing actual interventions and their effects in the real world. Seeing what others have done, and where, how and why it worked (or indeed didn't work!), can be as inspiring as an academic study. It can stimulate experimentation and the sharing of insights arising from those experiments.

Imagine how something similar for management-related interventions might energize the debate around EBM. What would it take to establish a Management Evidence Hub?

Check out the B-Hub here:

  • Thank you, Koen, for sharing this invaluable resource.

    Based on my experience, in management we have some practices where the use of rigorous experimental or quasi-experimental designs is established: for instance, in the domains of goal setting, developmental feedback, and other performance-related practices. "Translating" this type of evidence in an accessible way would be worth doing. I agree this would be an incentive to creativity and imagination for field experimentation. However, the business field is filled with extraordinary claims unsupported by extraordinary evidence. So what are the entry requirements to ensure integrity?

    Not strictly related to your arguments, in my opinion we would need an entity, like the Cochrane Collaboration in medicine, to:
    - set up standards of rigour for research (where rigour depends on how appropriate a research design is for answering a given type of question) and ensure the questions themselves are relevant.
    - provide funding to teams of researchers/practitioners who conduct and publish applied research (regardless of statistical significance).
    - engage independent teams of "science communicators" to write plain language summaries of research syntheses, single studies, etc.
    - make the whole idea of research accessible to a wide audience through various means.
    - engage the public (employees in organizations) in the production of scientific evidence, where the outcomes are directly relevant and useful to improve working lives.

    The motto here is that a utopia is a utopia for most people until somebody starts somewhere. Then the dream becomes a project, which is something immensely bigger.
    (credit: Adriano Olivetti, a century too early www.amazon.it/.../B00G4MDMP8)
  • In reply to Pietro:

    You mention something I had only vaguely considered: the 'translating' of existing, robust scientific evidence. I think this too could be very useful (in the same way that reviews of scientific papers are often helpful to summarize the findings for those who do not have the time or inclination to read the actual papers). One way to kill two birds with one stone would be if the 'translation' were accompanied by a description of an actual implementation (whether or not it was the subject of an experiment).

    You raise a good point about the quality control of any submissions of empirical evidence. My concern is that making the criteria too onerous in the eyes of the practitioners might discourage them.

    Bearing in mind Richard Thaler's 3-word summary of using behavioural economics to influence behaviour, "make it easy", it seems to me it is important to strike the right balance between rigour and feasibility from the viewpoint of the practitioner. In certain cases, when circumstances and budgets allow, a more formal approach may well be favoured. In others, neither the time nor the resources may be available, but that doesn't mean the intervention cannot produce valuable insight.

    Imagine, for example, the manager or consultant with 20 or 25 years' worth of experience, hardly any of which has been properly codified. Yet if you could ask such a person for their advice about a particular problem you're facing, or about whether they'd ever tried something you have in mind -- they might well provide inspiration and wisdom... and evidence.

    Great quote!