The papers with the greatest academic impact also have the most societal impact, say Jonathan Adams and his colleagues.
In the 1990s, efforts to measure the impact of research drew mostly on publication and citation data. More highly cited work was taken as an indicator of academic excellence, which was widely pursued as a public policy goal.
Since then, the agenda has shifted, with societal and technological priorities enshrined in many policy and funding programmes. This has driven evaluation exercises such as the UK Research Excellence Framework (REF) and Excellence in Research for Australia to assess how well research meets these goals.
This raises the question of what the indicators of socioeconomic impact might be, and whether they are the same as those of academic impact. In a recent study, we tackled these questions.
We examined both the citation counts and the measures of online attention known as altmetrics for publications submitted to REF 2014. We also compared these scores with the judgements of the REF’s peer-review panels.
We found that altmetric scores do not reflect wider economic, social, cultural or policy impact. More importantly, we found that papers included as both research outputs and case studies—the two routes for submitting publications—showed relatively high impact both academically and societally. This confirms that there is every reason to expect research that meets high academic standards to have societal value as well.
Judging by the large number of impact types and descriptions in the REF database, a broad consensus on how to analyse and interpret socioeconomic impact is many years away. Fortunately, the REF data allow other tests, given the two routes for submitting publications. Papers can serve as evidence of a researcher’s academic achievement (the ‘excellence’ route) and as part of a case study showing evidence of socioeconomic impact (the ‘impact’ route).
There is substantial overlap between the two pools. There are 120,784 journal papers in the REF database of submitted research outputs and 11,822 journal papers among the case study references. Of these, 5,703 papers feature in both. Interestingly, overlap is higher in applied research areas than in basic research areas.
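The overlap figure above is a straightforward set intersection of the two submission pools. As a minimal illustrative sketch (the paper identifiers here are toy stand-ins, not real REF records):

```python
# Illustrative sketch: counting papers that appear in both REF routes.
# The ID sets below are hypothetical stand-ins, not actual REF data.
excellence_route = {"paper_a", "paper_b", "paper_c", "paper_d"}  # submitted research outputs
impact_route = {"paper_c", "paper_d", "paper_e"}                 # case-study references

both_routes = excellence_route & impact_route  # papers submitted via both routes
print(len(both_routes))  # 2
```

In the study itself, the same operation over the real records yields 5,703 papers common to the 120,784 research outputs and 11,822 case-study references.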
We predicted that journal papers submitted via the excellence route would gather more citations, whereas those taking the impact route should attract wider social recognition, trackable through Altmetric.com, which collates mentions on Twitter, Wikipedia, Facebook, policy-related documents, news items and blogs. We also predicted that citations and altmetric scores for the two categories would correlate positively with the quality judgements of REF review panels, which award each submission between one and four stars.
Some of these predictions were confirmed. Papers used as evidence of academic excellence have higher average bibliometric citation impact. Papers used in case studies, in contrast, are mentioned significantly more often on Wikipedia, and especially in policy-related documents. The same goes for news reports, although here the difference between the two routes is smaller.
For Twitter counts, however, the difference between excellence papers and impact papers is close to zero. Nor do mentions on Twitter correlate with citations. Tweets, in other words, do not appear to reflect any serious form of impact.
Across all indicators, papers submitted via both routes showed the highest scores on both bibliometric and altmetric measures. These publications were as highly cited as those that were only in the excellence set, and had higher altmetric scores—on every source—than papers that were only in the impact set.
In the second part of the study, we compared these metrics with the scores of REF reviewers for each disciplinary panel and submitting institution. We found that the REF scores of impact case studies correlated only weakly with altmetrics—suggesting that these measures should not be used as markers of impact. Peers can identify when research has been of benefit to society, based on descriptions in case studies, but altmetrics cannot.
Perhaps our most interesting results are the relatively high scores—across the board—given to publications that were submitted as evidence of both academic excellence and societal impact. Some outputs evidently have the capacity for impact both among other researchers and through wider application.
These results show that there is no necessary gap between academic and societal value. This is not a new conclusion—Vannevar Bush said much the same in his influential 1945 report for the United States government, Science, the Endless Frontier. Research that follows high academic standards can also be expected to have a broader societal value.
Jonathan Adams is from the Institute for Scientific Information, Clarivate Analytics. Lutz Bornmann is from the Division for Science and Innovation Studies, Max Planck Society. Robin Haunschild is from the Max Planck Institute for Solid State Research.
This article also appeared in Research Europe.