Journal articles make up an ever-higher proportion of the outputs submitted for assessment. If this is really the best of UK research, we are becoming a dull lot, says Jonathan Adams.
The impact case studies submitted to the Research Excellence Framework are not just another tool of assessment. Their introduction has changed the way UK research is viewed and valued.
The balance of output types in REF 2014 may have been a signal of that shift. What researchers submit for assessment should be evidence of their best achievements, and this has changed over time: the journal article now dominates, and arts and humanities researchers rely more on publications and less on outputs such as performance, exhibitions and design.
In the 1992 Research Assessment Exercise, for example, engineers and social scientists relied on conferences and monographs. In 1996, however, they submitted an increased share of journal articles, which rose further in 2001 and again in 2008.
Some saw this as the adoption of a ‘science paradigm’, with the article, preferably in a journal with a high impact factor, seen as the most direct evidence of academic performance. By contrast, arts and humanities remained firmly anchored to the book.
REF 2014 revealed the journal article’s continuing ascendance (see table). In 1996, journal articles accounted for three of every five outputs submitted across all units of assessment. In 2014, they accounted for four out of five.
In the science units of assessment (a mix of main REF panels A and B), more than 99 per cent of outputs were journal articles. The same went for 90 per cent of engineering outputs (the rest of panel B). In social sciences (panel C), books and chapters fell to less than half the share they had in 1996, while articles climbed from fewer than 50 per cent to more than 80 per cent of submissions.
In the humanities and arts there were complex changes. Comparing the reshaped units of assessment across main REF panel D with those in the 2008 RAE, the total volume of submitted outputs fell in every area except philosophy. Within this, the proportion of books and chapters held up at about 46 per cent: pretty much the same as in the last four assessment cycles.
So, no major change from RAE to REF in these disciplines? Not so. Book share held up for panel D, but journal articles rose from 30 to 40 per cent of outputs. The decline was in the ‘other’ categories, spanning reports, performance, exhibitions, digital media and design.
The rich diversity of outputs at the core of visual and performing art, for example, was down from nearly 20 per cent in 2008 to just 12.5 per cent of 2014's submitted material. In art, music and communication there has been a shift from 'devices' to 'digital media', which might just be relabelling. But the number of art exhibitions halved between 2008 and 2014, the number of design submissions fell to one-fifth of its 2008 level, and conference material also decreased. Meanwhile, even though the volume of outputs in art and design was only about two-thirds that of 2008, journal articles were up by 10 per cent.
These shifts raise questions about assessment culture and the consequences of introducing impact.
An important question is whether the REF-submitted outputs really represent the best of UK research. Maybe they do, but if so the loss of output diversity across all fields since 1992 suggests we are becoming a dull lot.
No RAE panel ever suggested that it preferred articles to anything else. Other work pressures must have led journal articles to displace well-honed but more onerous kinds of contribution; this shift towards journals deserves further analysis.
One possibility is that institutions chose to submit their more diverse outputs as impact case studies rather than research outputs. Perhaps there was uncertainty about whether the same things could be submitted as evidence of both research achievement and research impact.
The research references in the case studies will give clues as to whether this was the case. From today, they, along with the rest of the REF impact case studies, will be available online in an indexed, searchable database developed by Digital Science. This will allow research producers and users to explore a unique and rich account of what UK research delivers.
We must keep an eye on the pervasive effects of assessment culture. RAE and REF data analysis usually focuses on immediate outcomes: how much money did we win? Longer-term trends are just as important, and maintaining research diversity is as valuable as past excellence and impact.
More to say? Email comment@ResearchResearch.com
Jonathan Adams founded Evidence Ltd and is now chief scientist at Digital Science.
This article also appeared in Research Fortnight