‘Change fatigue’ dampens desire to scrap journal impact factors

Turbulence caused by pandemic may have reduced appetite for radical reform of research assessment and rewards, suggests study

March 28, 2022

Only about one in five academics say abolishing journal impact factors would be the best way to reform how research quality is measured, according to a global survey suggesting the sector could be “losing its appetite” for alternative metrics.

This year the movement to end the use of journal impact factors (JIFs) in hiring, promotion and funding decisions will celebrate its 10th anniversary, with more than 19,000 individuals and 2,500 organisations having since put their names to the San Francisco Declaration on Research Assessment (Dora).

That enthusiasm for dropping JIFs – a citation-based metric often used to denote the prestige of a journal, and by extension the quality of the researchers who publish in it – was not reflected in a survey by Emerald Publishing, which runs more than 300 journals.

Of the 2,128 academics who responded, 70 per cent said that they were assessed on the basis of journal citations and impact factors, but only 18 per cent felt dropping citation metrics such as JIFs was the main change they would like to see.

Many more respondents (47 per cent) said they would instead prefer the introduction of additional metrics beyond citation metrics as the main way to change how research quality is measured. Nearly half (49 per cent), however, worried about what measures would replace the ranking of journals to assess quality.

Academics were also slightly less keen to lead change than they were in previous years when polled by Emerald, with 34 per cent saying they were very open to change compared with 38 per cent in 2019. That may suggest “waning support for new impact measures”, the report says.

Sally Wilson, publishing director at Emerald, which is a Dora signatory, said she believed academics were still keen to reform how research quality was assessed but “there is a bit of change fatigue” caused by the pandemic, which might explain why enthusiasm for new assessment systems was not as high as expected.

“Given the enormous changes that universities have undergone in the pandemic, maybe there is an acceptance of what is now in place, even if it is not always reliable [as a proxy for quality],” she said.

Nevertheless, Ms Wilson believed that it was important to look at other types of publication metrics and measurements that could more accurately reflect the impact of researchers’ work.

“Journal impact factors have been around for so long and have been used to assess researchers and shape careers, but they do not always make the best incentives for research,” she said. “The pandemic has shown more than ever how research directly affects people’s lives and what impact might look like.”

jack.grove@timeshighereducation.com
