The research assessment exercise in its current form has outlived its usefulness, so the review of it is welcome. However, it is sad that its initial stages seem to be concentrating on "methods of assessment". This is an important part of the process, but it is not the only or the most urgent one.
It is difficult to see how there can be any sensible discussion about the introduction of a new form of research assessment without a proper appreciation of how the RAE has operated or of the impact that it has had on issues such as incentives. I have seen no assessment by the Higher Education Funding Council for England or the other funding councils of the operation of the RAE as a whole, nor have I seen anything published.
The RAE system has brought some benefits. It raised the profile of research in some universities, so that research and individual researchers became more valued. External validation of the research endeavour has encouraged a greater emphasis on good research.
But there is now a whole range of dysfunctional aspects. The RAE adopted a science model that did not fit some disciplines, and it applied a "one-size-fits-all" approach despite the considerable differences between disciplines and the huge diversity between institutions in the types of research carried out, employment patterns and sources of external funding. It also rested on a number of assumptions - that quality of output equates to quality of research, and about who and what it was appropriate to assess.
The RAE was geared entirely towards the individual, not the team. This perpetuated the now-outdated academic model of the lone scholar. It gave departmental heads and directors of research centres no incentive to manage the work of the staff within their remit, to build teams whose output was greater than the sum of the individual parts, or to develop junior staff. Research needs to be seen as a complex activity that is often carried out in teams and that forms part of a wider context including teaching and knowledge transfer. In addition, the parameters of judgement used in assessment were in essence academic. Despite Hefce's efforts to encourage equal treatment of all forms of output in the 2001 RAE, applied work did not feature very frequently. And the financial cost and administrative burden of the RAE have been considerable - the 2001 exercise was estimated to have cost £36 million.
Finally, the RAE system has created a range of "unintended consequences": people seeking funds who have little talent for research, the quantity of output becoming more important than research quality, and the proliferation of small-circulation journals on esoteric subjects. Sophisticated gamesmanship has also developed around the definition of who is "research active".
But there are more fundamental questions that need to be asked. The funding councils must clarify which activities they wish their funding to support before a great deal of time is spent devising a new assessment method. If the money is intended as part of the dual-support system, a key aim is to "top up" funds from other bodies. But how does this link with other government objectives such as transferring knowledge, nurturing staff and supporting new subjects and blue-skies research?
A second question relates to the purpose of the RAE. Its outcomes are used crudely: the exercise produces a grade that goes into a funding formula, which in turn produces a sum of money for a university. The RAE has become too detailed and elaborate to be "fit for purpose". Instead of focusing on alternative methods of assessment, it would be better to ask: "How do we arrive at a grade for the funding formula that reasonably reflects the quality of the work being carried out?"
Janet Lewis is the former research director of the Joseph Rowntree Foundation and was a member of the social policy and social work panel for the 2001 research assessment exercise