Five reasons why the REF is not fit for purpose

Peter Scott recently characterised the research excellence framework (REF) as "a monster, a Minotaur that must be appeased by bloody sacrifices". With the publication of the results of REF 2014 on December 18, discussion will inevitably concentrate on winners and losers, on who has moved up and down in the all-important research rankings. But before we lose sight of the wood for the trees, some more fundamental questions can usefully be asked. In short, is Scott's "monster" fit for purpose? Here are five reasons why I believe it is not.

1) It costs too much

About 1,100 of the UK's top scientists and scholars have spent the last year grading 191,232 research outputs submitted to REF 2014. They will have had little time to do anything else. This is just the tip of an iceberg. Universities have commandeered countless more hours of academics' time in preparing their REF submissions. As the demands of successive research assessment exercises (RAEs) have grown, so have the internal bureaucracies devoted to gaming the system.

REF upped the ante further by requiring all submissions to include "impact case studies" as well as outputs. The official bill for this six-yearly academic Battle Royale is around £47m spent within universities and a further £12m in Hefce's administrative costs – most of it taxpayers' money. But the more significant opportunity cost is that this is all time that could have been spent in the lecture theatre, the library, or the lab, doing what the public thinks it pays us to do.

2) It is not peer review anyway

In 2007-09 the academic establishment persuaded the government to abandon plans to replace the 2008 RAE with a cheaper and less time-consuming metrics-based assessment, pleading that rigorous research evaluation required "expert peer review". But the REF falls very far short of the peer-reviewing standards expected in other academic contexts, such as publication, research funding, or promotions.

The 36 REF disciplinary subpanels that assess outputs rely entirely on in-house assessment, by panellists drawn overwhelmingly from British universities. On some panels just one assessor may read each output. While panellists are undoubtedly eminent in their disciplines, they frequently lack the expert knowledge to assess many of the outputs falling under their remit – a problem compounded by a reduction in the number of panels from 67 in the RAE to 36 in this year's REF.

Hefce's prohibitions on using citations or the perceived quality of journals and publishers reinforce the dependency on panellists' subjective judgments. Finally, panellists do not have the time to do a proper job anyway. One RAE panellist told Times Higher Education that it would require "two years' full-time work, while doing nothing else" to read the 1,200 journal articles he had been allocated.

3) It undermines collegiality

If REF panels' evaluative competence is questionable, the procedures used to select staff for REF submissions within individual universities appear sometimes to have been worse still. Hefce's withdrawal of funding from outputs ranked below 3* in 2010 led many institutions to develop "internal REFs" to filter likely lower-scoring work out of their submissions.

Staff selection has proved highly contentious in REF, with widespread accusations that in their zeal to second-guess REF panels' grades, universities have ridden roughshod over Hefce's requirements of transparency, accountability, consistency and inclusiveness. Given the secrecy surrounding these selection procedures, such accusations are hard to prove, but the suspicion will remain that some individuals have been unfairly excluded from the REF. The resulting damage to collegial relations and staff morale has been immense.

4) It discourages innovation

On the way to collect his 2013 Nobel prize in physics, Peter Higgs told the Guardian he doubted whether "a similar breakthrough could be achieved in today's academic culture, because of the expectations on academics to collaborate and keep churning out papers." When each impact case study counts for a substantially greater portion of the overall score than the outputs of any individual academic, universities may prioritise research likely to have measurable short-term impact.

The most innovative work – the research that breaks moulds, shifts paradigms and redefines fields – may not even make it into the REF at all, because universities tailor their submissions to what they think REF panels want, and REF panels reflect disciplinary hierarchies. Panel chairs have to be endorsed by relevant professional associations, and chairs then "advise" on the appointment of other panellists. Interdisciplinary research is most obviously in jeopardy here, but the deeper problem is that the REF's panels give extraordinary gatekeeping power to a disproportionately older, male, white – and overwhelmingly Russell Group and former 1994 Group – academic elite.

5) It is redundant

Eleven British universities made the top 100 in the 2013-14 Times Higher Education World University Rankings, which do not use RAE/REF data and rely heavily on the citation metrics that Hefce rejected for the REF. Eight of these universities were in the top 10 in the RAE and the other three in the top 20. Other rankings show similar correlations. Almost 85% of Hefce's quality-related funding in 2013 went to Russell Group and former 1994 Group universities, and nobody expects the results of REF 2014 to change this significantly.

In short, not only is the REF an expensive, cumbersome and divisive process that is more likely to inhibit innovative research than to foster research excellence, but it mostly tells us what we already know. It is time it was replaced by something whose costs are more proportionate to its benefits, both for universities and for the taxpayer.

Derek Sayer is a professor of history at Lancaster University – follow him on Twitter @coastofbohemia.
