Posted 2022-05-31 12:13:22 by Matthew Flinders

But how do researchers know how to engage with policy-makers? Where are the docking points? How might engaging with policy-makers at different levels of government or in different policy areas demand different skills? What are the risks of engagement as science advice increasingly comes under pressure? What does ‘efficient’ engagement look like and how do we make all processes ‘EDI aware’?

With what some scholars have labelled ‘the tyranny of impact’ shaping higher education over the last decade (as seen through REF impact case studies, impact prizes, knowledge-exchange fellows, impact acceleration accounts, etc.), an interested intellectual observer from a distant galaxy might expect a fairly clear and robust evidence base to exist around these questions.

They would be wrong. 

In fact, as Roger Pielke – author of The Honest Broker: Making Sense of Science in Policy and Politics – argued at a recent conference on ‘Science under Pressure’ organised by the European Commission’s Scientific Advice Mechanism, we urgently need more ‘science on the science of effective policy advice’.

The need for this investment was recently underlined by the findings of a major review of policy-engagement-focused initiatives and investments by Kathryn Oliver and her colleagues. Despite a huge expansion in research-policy engagement, the main conclusion was that the evidence base for actual impact was alarmingly thin.

As they put it: ‘These initiatives tend not to draw on existing evidence and theory, and are mostly unevaluated. The rudderless mass of activity therefore fails to provide useful lessons for those wishing to improve evidence use, leading to wasted time and resources. Future initiatives should draw on existing evidence about what works, seek to contribute to this evidence base, and respond to a more realistic picture of the decision-making context.’

We don’t really know what works when it comes to promoting policy-engagement. And we don’t know because we have not studied or collected the data at a systemic level. 

That’s not to suggest that the research community has not engaged or that translational structures have not been built. The Universities Policy Engagement Network brings together over a hundred universities in the UK to promote and support policy engagement. But it is to suggest that the insights of policy analysis, diffusion and evaluation have not generally been applied to the process of research-led policy engagement itself.

The What Works network provides a case in point. It’s hard to deny that across a range of policy areas – from ageing to homelessness, from wellbeing to economic growth, and from youth policy to education – these centres have achieved multiple and far-reaching impacts. Their founding rationale included an explicit emphasis on building capacity for taking research into policy. Yet nearly a decade after the network was first established, it’s possible to suggest that where the centres have been less successful is in building an evidence base about ‘what works’ when it comes to policy engagement, and in disseminating those insights across the wider research community in order to inform and orientate activity and investment.

The What Works centres evolved almost like isolated islands of (knowledge-exchange) excellence, whereas a focus on the connective tissue needed to catalyse and drive performance across the science base might have helped prevent the ‘rudderless mass of activity’ that Oliver and her colleagues identify.

The same learning potential is to be found within other major investments such as the UK in a Changing Europe initiative, the International Public Policy Observatory and the Economics Observatory – all pathbreaking investments operating at the intersection of research and policy. Although they are very different organisations, a similar argument could be made for the Institute for Fiscal Studies, the Institute for Government and the Institute for Community Studies – but again, little connective or catalysing capacity exists to extract, scale up and scale out the vast reservoirs of understanding about ‘what works’ when it comes to policy engagement that undoubtedly exist. The brilliant new ESRC Policy Fellows – to continue the theme – will quickly develop skills, understandings and useful lessons about navigating the intersection between research and policy that need to be harvested and promoted in order to upskill the wider science base.

Although it’s probably the most obvious and predictable conclusion for an academic to reach, what’s needed is more ‘research on research’. There is, of course, a major and co-funded Research on Research Institute (RoRI), founded on the mantra that ‘we can’t unlock the full potential of investment in research systems unless we have the evidence and tools to understand them’. But where RoRI focuses on world-leading big data and meta-science to gauge the performance of research systems, understanding policy engagement demands a different, more qualitative and contextualised approach if the full tone and texture of such an endeavour is to be captured.

You can’t navigate without a rudder, so let’s understand ‘what works’ when it comes to research-led policy engagement. 

Matthew Flinders is Professor of Politics at the University of Sheffield and is Chair of the Universities Policy Engagement Network. @politicalspike