All of the REF – not just 25% of it – must drive a better research culture

Everyone, from funders to individuals, has a role to play in building a research system that operates with integrity and robustness, says Alexandra Freeman

December 13, 2023

The year’s delay just announced to the next Research Excellence Framework (REF) gives everyone in UK higher education time to help rethink the way the exercise assesses research. And that time is sorely needed.

The 2029 REF proposes a 25 per cent weighting (up from 15 per cent in the previous exercise) on “research culture and environment” – but that rather misses the point that 100 per cent of the REF creates the incentives that drive that culture.

Over the past few years, research culture has become a focus of anxiety worldwide. In the UK, projects by organisations such as the Wellcome Trust, the Royal Society and, most recently (with my involvement), UK Research and Innovation, the University of Bristol and Jisc have highlighted how it affects not just researchers but also the quality of research itself. And while the media tends to focus on fraudulent articles, the more insidious cultural issues affect researchers, and research, daily.

But research culture isn’t some fact of nature we just need to live with. We have complete control over it. For instance, if universities and funders assess people based solely on their publications, the natural result is papers that convey researchers’ credentials but don’t always accurately convey the research they’ve done, or its soundness.

Research funders (like society in general) clearly signal that they want robust and carefully done research – “significance” and “rigour” in the REF’s case. They recognise that they need to reward exactly that. But how to assess research is where everything seems to fall down.

Firstly, we must recognise that good work comes in various forms, many of which – such as data sets or software – are not found in traditional articles or monographs. We must also recognise that very little research results in immediately world-changing findings – and that pressure to chase this unicorn takes researchers into dark forests. Those we interviewed for the UKRI-funded survey told us how pressure for novelty, broad audience appeal and good storytelling means that what gets researched and published is biased.

It’s easy to see how encouraging “impactful” findings can be seen as a way of prioritising socially useful work. But excellence should also be recognised in experimental or instrumental design, data curation, analytical technique and software development. All these are also useful, and their creation and distribution should be encouraged – after all, only shared outputs can be assessed.

The REF is trying to broaden what is assessed, and it has the power to drive real change – but researchers’ employers need to follow suit (my own project, octopus.ac, aims to help by making it easy to share all kinds of research outputs and have them quality-assessed).

Another fly in the research ointment is the prominence of favour-exchanging. We heard from researchers how important personal contacts were to getting their names on publications or getting hold of data they needed. In many fields, an unofficial barter-exchange system exists because resources are scarce and there is no incentive to share. At the very least, this means early career researchers have to spend time networking and winning the favour of more senior researchers. At worst, it can result in exploitation.

We must ensure that access to data, publication, jobs and research resources is not controlled by unofficial, unregulated “gatekeepers”. Funders can be more insistent that data or other outputs of research they have funded are made as open as possible and conform to the FAIR sharing standards. Publishers can insist on mechanisms to minimise bias and favouritism. Institutions can ensure that hiring and promotion practices are as equitable and transparent as possible.

A final bugbear is the rampant competition that supposedly drives productivity but, in reality, hinders the breakthroughs that could arise through collaboration. In my experience, specialisation and professionalisation result in the best overall outcomes, rather than individuals trying to do everything themselves. Rewarding excellence in all kinds of research work will help, but a culture shift towards open working and collaborative thinking needs encouragement as well.

The best way to stimulate that shift is to model it. We know that researchers tend to mimic the way their bosses and colleagues behave – resulting in bubbles of good and bad practice regardless of institutional prestige. We need to establish a global sense of what best practice is and develop accessible resources to reinforce that and to help deal with examples of poor behaviour and practice.

While funders have the most power over what, and how, research is done, universities also have huge influence and responsibility – even if they don’t always recognise it. I know that many institutions don’t like “laying down the law”, particularly when that might be seen to infringe academic freedom. What I hear, though, is researchers crying out to be incentivised to do what they know to be right, rather than having to compete in an environment where “cheats” prosper.

Individuals, too, can play their part. Organisations like the Reproducibility Networks are growing, including via grassroots initiatives such as ReproducibiliTea, where researchers come together to discuss ideas, share knowledge and spread best practice. Those who sit on promotion and hiring panels, advise funders and lead teams are ultimately responsible for the culture around them.

Everyone can and should do their bit to build a research system that operates with the integrity and robustness we all rely on for better health, economic, environmental and educational outcomes. That means researchers, funders and every institution that employs them.

Alexandra Freeman is executive director of the Winton Centre for Risk and Evidence Communication at the University of Cambridge and founder of research platform Octopus.ac.
