Social science impact demands faster publishing and more reproducibility

Timeliness and rigour are vital, but the publishing and incentives systems are not set up to deliver them, says Meron Wondemaghen

April 12, 2024
A souped-up old car, symbolising speeded-up publication. Source: iStock/Asim Ali

Research impact is a highly valued commodity these days. And while its definition can be a little hard to pin down, studies that advance current knowledge theoretically while improving professional practice or informing the policy process would seem to hit the bullseye.

Achieving such outcomes can depend enormously on getting papers out in a timely manner, when they are most relevant. But that timeliness is being undermined by the current overproduction of research, by reviewing backlogs and by a lack of transparency in editorial decision-making. Add to that the vast amounts of time researchers waste preparing submissions that never pass to review, the unnecessary pre-review back-and-forth on formatting and style, and the sometimes long post-acceptance wait for publication, and it is amazing that anything sees the light of day in time to have a real-world impact – at least outside the science subjects that have adopted preprint servers.

Take a recent article of mine on media framing of Syrian and Ukrainian refugees. This has already been in the review process for more than a year. It has been submitted to many journals, several of which took up to two months to explain that they were having to desk-reject “even excellent submissions” because of their backlogs, while offering no indication of how other excellent manuscripts are passed to review anyway.

Even when my article was finally accepted for review, it took two months for the editor to find reviewers, despite my further nudging. In the meantime, another war has started in Gaza and the media framing of the Ukraine conflict may have shifted, undermining the paper’s relevance.

Dismay at this tardiness is exacerbated by concerns about who is reading and editing our manuscripts. Journals will often trumpet the importance of editorial independence, but the flip side is a lack of transparency that makes it hard to trust the system. Assurances of peer-review quality assume that editors are consistently neutral and guided solely by the merit of a manuscript, which isn’t the case.

Consider the following. Late last year, a manuscript of mine came back with reviewer comments requiring me to restructure and resubmit because my manuscript “offers potential to cover new intellectual territory” about the NHS. I repeatedly asked for clarification of some comments, but no reply was forthcoming for several weeks until I received a reminder that I had only a fortnight left to resubmit.

I tried to do so the day before the deadline, only to find that the portal was already closed. So I made a new submission, with a cover letter adding this context. The editor – who was now replying to emails within minutes – informed me that he wouldn’t send the reworked paper to the reviewer nor count it as a new submission because his decision to reject it was final.

No further rationale was offered. I had held up my end by not submitting the manuscript elsewhere during those now-wasted months, but I received little more than an apology. If there is no adherence to basic professional and contractual standards between editors and authors, it’s easy to imagine that professional favours and personal contacts are the deciding factors.

Nor are editorial tardiness and opacity the only barriers to research impact. Another major issue is replicability. Without this, publications have no epistemic authority or real-world relevance; as Karl Popper argued in 1959, “non-replicable single occurrences are of no significance to science”. Yet while impact is highly prized rhetorically, researchers are given little incentive to prioritise replicability, as opposed to the kind of novelty that secures papers in top journals, high citation numbers and external funding.

This is true even of the UK’s Research Excellence Framework (REF), on the basis of which billions of pounds in research funding are distributed. The REF assigns a 25 per cent weighting to impact, but its scores are still predominantly determined on the basis of papers considered to be “internationally excellent” (3*) or “world leading” (4*) in originality and rigour. This singular theoretical and/or methodological quality is all too often not replicable.

Moreover, in many social science fields, impact starts out local, but locally focused papers generally score poorly. So do papers reporting negative results – so much so that they are rarely published at all, skewing the evidence on the social issue in question.

I suggest we halt many studies in health and social sciences until timely dissemination and replicability are addressed. Instead, systematic reviews and meta-analyses should be prioritised to identify what we’ve researched so far; its impact on community, policy and practice; and the recurrent gaps in knowledge. This would allow us to understand what research is required to address those gaps and to probe real-world problems in collaboration with practitioners and policymakers.

Granted, these reviews will not be without flaws, but this would be a start at capturing the scale of knowledge lacunae across disciplines and the characteristics of non-replicable research. Journals could temporarily function as they do when putting out special issues, setting out specific parameters that papers must meet, including the incorporation of systematic reviews and meta-analyses to further productivity and collaboration, rather than unhealthy competition.

Reviewers could eventually be financially incentivised to help restart the review process by prioritising and clearing the unnecessary backlog of research. My hope is that this will usher in a genuine era of impact, built on timely dissemination, replicable research and oversight of editors that ensures that acceptance and rejection decisions are meritocratic and rubric-based.

Meron Wondemaghen is a senior lecturer in criminology at the University of Hull.
