Of course, there is more to be done – far too many in the academic community and in funding bodies still think that the key to ensuring that academic research has a societal impact is for people to shout louder, and ideally in shorter sentences.
But there is a growing awareness that raising the visibility and accessibility of research, while necessary, is insufficient; and that if the aim is for policy makers to use academic research routinely, something else is needed.
The importance of relationships, understanding context and integrating different types of knowledge
The WCPP has developed a model that shows what this ‘something else’ could look like. We have created a demand-led approach to knowledge mobilisation that starts with the problems that decision makers are grappling with and develops projects that bring together the best available evidence and expertise.
Underpinning this is a framework for thinking about the practices that support the mobilisation of research evidence into policy. This emphasises the importance of building effective, trusting relationships with decision makers; using these to build a nuanced understanding of the context or landscape in which a decision maker is operating; and then drawing on this to translate and contextualise research evidence by bringing it into dialogue with the other forms of knowledge and insight that shape policy decisions.
This framework is rooted in the literature, but also in our own practice and in the lessons that we have learnt from a decade of doing this work. Here I’ve chosen my ‘top three’ lessons. These may sound obvious, but I think that they are yet to be fully integrated into the way that universities and research funders operate:
Academic evidence does not speak for itself
I don’t just mean this in terms of the accessibility of journal articles. More often than not, research falls short of providing ‘the answer’ that a decision maker is looking for. That could be because the research was designed to answer a different question. It could be because the findings require interpretation: “it worked over there, but how can I make it work here?”. It is often because the research concludes some version of “we don’t really know, it depends, and more research is needed”.
Research evidence can be incredibly useful to decision makers, even when it only offers a partial and caveated answer to the question at hand. But it is rarely useful without some effort being made to contextualise the findings so that they are relevant to decision makers.
Academic evidence is not the only relevant information for policy makers
A few years ago, one of our sister organisations in the What Works Network developed a graphic that showed the different types of evidence that are relevant to local decision makers. The ‘evidence pie’, as it became known, includes academic evidence alongside impact evaluation, different types of data (population data, organisational data), lived experience, and professional judgement. The point was that even a decision informed by academic research will incorporate other forms of knowledge and insight. And this is legitimate, especially in a democratic society, and especially when academic evidence only offers partial answers to the questions that policy makers are grappling with.
Report + event ≠ impact
This is an easy trap to fall into – some interesting research has been done, it is packaged up in an accessible format, and then you ‘launch’ it with some kind of event. From the outside, it can seem that ‘successful’ impact work amounts to little more than this.
But for most research projects that succeed in having an impact, there is a lot of invisible work that happens before the shiny report and the launch event, and it is that work, rather than the visible outputs, that gives the research its impact.
This ‘invisible work’ is the relationship building, translation and contextualisation that we wrap around the evidence synthesis, and it is what gives us a better chance of supporting evidence-informed decision making.
Some questions for the next phase
Having offered some lessons from our work to date, I wanted to conclude by talking about some of the things that are on our agenda now and are shaping where we head next:
- Codifying practice and (re)defining impact: having developed this framework, and some associated tools for capturing our practices and their impact, I’m looking forward to being able to share these, and to share the learning that they are generating. My hope is that this will shine a light on the ‘invisible work’ and make it explicit. And that it will support a shift in thinking about ‘impact’ – what it means, how you achieve it and why, even if you do everything ‘right’, you still might not succeed.
- Continuing to innovate and develop new knowledge mobilisation practices: we are currently experimenting (among other things) with integrating implementation science into our work, and with our approach to collaborating with experts by experience. Beyond this, I am interested to see how AI tools change how we work. I’m also interested to learn from others who are innovating, including those who are experimenting with bringing creativity and imagination into the policy process.
- Closer integration of knowledge mobilisation into ‘core business’: at the moment, what we do is quite separate from the two principal foci of the university – teaching and research. I believe that this shouldn’t be the case, and that there is huge potential for greater integration between research, knowledge mobilisation, and teaching.
If any of this sparks your interest, please sign up to our newsletter, or get in touch.