For all the attempts to rein it in, publish or perish culture still prevails in academia.
The game is more accurately described, however, as “get cited or perish”. Comparing individuals on “productivity and impact” statistics, such as Google Scholar’s various metrics, may be statistically and ethically questionable, but it is just too convenient for most managers to resist.
So what to do? Some cynics will, of course, try to game the system. You don’t want to be a cynic. But we are all competing against them for jobs and promotions – and, besides, we all want our work to be read and referenced.
So is it possible to improve your citation metrics without compromising either your research or your principles? I think it is. Puzzled about why some of my best papers (in my view) attracted only modest citations while others that verged on what academics like to dismiss as “somewhat trivial” did well, I did some research into scientometrics and studied the citation records of my friends. My conclusion is that while chance plays a role, and heavy-handed attempts to boost one’s metrics tend to backfire, small changes in focus and strategy can reap serious rewards.
Below are the 10 observations I now make to my psychology PhD students. Some may not apply in the arts or hard sciences, but I think most do.
- Publishing in high-impact journals may impress peers and committees, but it in no way guarantees high citations. None of my 50 top-cited papers are in the highest impact journals in their area.
- Relatedly, innovative studies, which are often highly cited, are more difficult to get published. The “top journals” are often deeply conservative, particularly with respect to methodology. Only a minority of editors are prepared to risk publishing papers that open up new areas of research or use unorthodox methods.
- While journals do not determine citations, it is also true that the narrower a journal is in terms of focus, method or theory, the fewer citations its contents are likely to attract. Such journals are usually read only by those particularly interested in that area – who can exhibit rather cult-like behaviours. For the greatest impact, choose general journals in better-cited disciplines.
- Don’t put all your eggs in one basket, either. Some academics begin the whole research process with a particular journal in mind. Others publish most of their work in the same journal. This may be characterised as a “targeting approach”: trying to reach those most interested in the topic in the venues where it is most discussed. But a change of editor or board can radically affect a journal’s focus and destabilise its readership. Better to spread papers across journals, publishers and countries: they will attract new readers.
- Beware predatory journals, though. There has been a staggering growth in open-access journals whose often impressive titles and promises belie their total lack of meaningful quality control. Reputable citation databases rightly exclude them, so it is worth scrutinising a journal before submitting, to ensure that your paper (and your reputation) does not disappear into a black hole. I have failed to do this a few times and regretted it.
- Aim also to publish good reviews as frequently as good papers. Summing up and critiquing a whole area of research, via a meta-analysis, structured review or historical analysis, is typically better cited even than an important new empirical contribution to that field. Five of my top 20 most highly cited papers fall into this category.
- Another good tactic is to develop, validate and publish a measure in a new area (even one that others then have to buy). Researchers often have a choice of tools and analytic methods – in my world, tests, procedures and statistical methods – but in new fields they are sometimes at a loss. Why not fill the void? For years, for instance, the most-cited work in psychology was a statistics textbook. My most-cited paper concerns the development of a test, published at the right time and on the right topic.
- By contrast, be wary about wading into a new field too far behind the curve. However good they are, papers published at the tail end of a disciplinary fashion, fad or folderol are less cited. To gauge what stage a trend is at, plot the number of papers published on the topic each year (a rough sketch of how this might be done follows this list). Don’t invest if the graph is too steep: you will get lost in the avalanche.
- Be a little wary of books, too. Many authorities stress that books and chapters are “not peer-reviewed”, so are not worthy grounds for recruitment or promotion. Nor do they generally accrue lots of citations. Only a handful of the more than 90 books I have written and only one of my 100 or so chapters are among my 100 most-cited publications.
- Books are also hard, lonely work. A less onerous route to citations, especially in early career, is to join some productive teams and become known as a reliable team player with useful expertise.
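On the earlier point about gauging trends, here is a rough sketch of the kind of plot and growth check I mean. The annual paper counts below are invented purely for illustration; in practice they would come from a year-by-year keyword search of a citation database such as Scopus or Web of Science.

```python
# A minimal sketch for gauging how far along a research trend is.
# The counts are hypothetical; in practice you would take them from
# an annual search of a citation database (papers per year matching
# a topic keyword).

import matplotlib.pyplot as plt

# Hypothetical number of papers published on a topic, per year
years = list(range(2015, 2025))
papers = [12, 18, 30, 55, 95, 160, 240, 310, 350, 365]

# Year-on-year growth: a run of steep increases suggests the
# "avalanche" phase, when late entrants are easily buried
growth = [(later - earlier) / earlier for earlier, later in zip(papers, papers[1:])]
for year, g in zip(years[1:], growth):
    print(f"{year}: {g:+.0%} growth")

# Plot the raw counts to see the shape of the curve
plt.plot(years, papers, marker="o")
plt.xlabel("Year")
plt.ylabel("Papers published on the topic")
plt.title("Is the trend still climbing steeply?")
plt.show()
```

A curve that is still climbing steeply, or year-on-year growth that keeps accelerating, is the warning sign described above: a late entry risks being buried.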
As for teamwork: the fact that you get cited no matter how many co-authors a paper has does attract social loafers, of course – but cynics are always around in academia. To avoid being beaten by them, become familiar with the game and play it as far as you can while still keeping the pursuit of knowledge, not credit points, as your primary motivation.
Adrian Furnham is an adjunct professor in the department of leadership and organisational behaviour at the Norwegian Business School. He has an h-index of 188.