Dutch lessons for an impact agenda that satisfies all parties

The Netherlands has a workable system from which the UK could learn, says Paul Benneworth. But, be warned, it will involve compromise

November 17, 2011




Lord Rees is the latest eminent scientist to add his voice to the chorus of disapproval facing the UK's impact agenda. According to Times Higher Education's report of his speech at the University of Cambridge last week, the Astronomer Royal bluntly stated that it was unfeasible to assess impact in funding proposals, and attempts to do so risked compromising the UK's research strengths.

Lord Rees joins a growing opposition to funding and research councils' practical efforts to respond to government demands to show the value of research investments. Along with the research excellence framework's impact element, the impact statement now required in funding proposals is being resolutely condemned by many who are sensitive to its potentially negative effects.

As a Netherlands-based scholar, I find some of this opposition slightly surprising. For the past decade, the Royal Netherlands Academy of Arts and Sciences, the Association of Universities in the Netherlands, and the Netherlands Organisation for Scientific Research have put considerable effort into developing techniques to deal with impact at both the proposal and evaluation stage.

As in the UK, all research council funding applications must include an impact statement as part of proposed dissemination activity. Interestingly, at least one recent funding call referenced Research Councils UK's Pathways to Impact document as useful for academics preparing this statement.

There is also a post hoc research impact programme of subject-based five-yearly reviews. In contrast to the UK, this is a rolling programme carrying no direct monetary reward. The Standard Evaluation Protocol, as it is known, has evolved and now measures both social and scientific impact. Trials are under way for the social impact element to comprise half of departmental grades.

The system is broadly similar to the UK's, which is no coincidence given how closely Dutch science policymakers have worked with European partners to improve their scientific productivity. But it differs in how uncontroversial the whole process has been; surprisingly so for a country where the tradition of academic autonomy is arguably even more deeply entrenched than in the UK.

So the past decade offers useful lessons for the UK's impact agenda, particularly for those striving to improve impact without compromising excellence.

First, and typically Dutch, the process has been both experimental and consensual. In the mid-2000s, the Rathenau Institute - a government-funded laboratory researching science policy - oversaw the Evaluating Research in Context programme. As a result, subsequent discussions have been strongly rooted in a sensible, researcher-driven understanding of what impact is and how it is created.

Unsuccessful experiments have been abandoned or modified at the pilot stage, and lessons have been learned. The net effect is that everyone involved in the process, from grant writers through to funders' project officers and proposal reviewers, has a realistic understanding of the expectations of, and limits to, achieving impact.

Second, as Lord Rees points out, real societal impact may take place many years in the future, and the Dutch process recognises that. What matters instead is that researchers plan their dissemination in ways that make their findings as usable by non-scientists as by their scientific peers.

The impact section on a funding application does not attempt to predict what the impact will be, any more than the research questions in the proposal attempt to predict the scientific findings. UK funding bids already show how researchers plan to make their findings accessible to their peers, identifying conferences to attend, papers and books to write, and workshops to organise. Likewise, Dutch-style impact statements allow researchers to showcase how they will ensure that their expensive, publicly funded research can also benefit non-academic users.

This is linked to the third element: a clear understanding of where the responsibilities lie in this process. Dutch scientists are not responsible for exploiting their research findings. They must simply ensure that their research findings can eventually be exploited, creating opportunities to change the way other scholars think or other users behave.

That is very different from trying to predict research's impact, or trying to use research to "create" impact. Rather, it is a declaration by researchers that they take seriously their duties to their peers and society by making their research as accessible as possible.

The overarching lesson for the UK is that it is now time for a compromise between funders, government and researchers, with researchers agreeing to make their research potentially accessible to users, and funders accepting the limited influence that academics have over its uptake.

Compromise is vital to restore confidence in the system and to make impact a valued and accepted consideration in academic practice. But compromise requires concessions from all sides. The critical question for the UK now is: who will move first?
