
Time to stand back, then step forward on research evaluation

Ioana Galleron and Geoffrey Williams describe a project to improve how the social sciences and humanities are evaluated across Europe.

It is a truism that we live in an evaluation culture, which presumably means that rigorous, transparent procedures are everywhere. But, while all areas of research activity and academic life are under constant scrutiny, a rigorous, transparent and open culture of research evaluation remains to be built. This is what the European Network for Research Evaluation in the Social Sciences and Humanities (ENRESSH) will try to achieve.

ENRESSH, which will launch on 8 April, seeks to tackle social sciences and humanities research evaluation first by standing back and looking critically at what is happening in countries throughout Europe. Then it will step forward with proposals for improving practices. Researchers in the social sciences and humanities will work with evaluation specialists, policymakers and others to compare outlooks and determine how best to evaluate the disciplines.

Evaluation remains a controversial area in these disciplines, especially in the humanities. Two extremes dominate the debate: those who oppose evaluation altogether and those who put absolute faith in bibliometrics as a gauge of the quality of academic work.

The anti-evaluation school is against focusing on individual researchers or linking assessment to funding. Anxieties arise from not knowing who is evaluating or why. There is a suspicion that excellence is not the real concern and that the true goal is cutting costs. This may be partially true.

On the bibliometric side, the myth is that such analysis is efficient and cheap, and that even if it does not work well in some disciplines, it can only get better. Most bibliometric specialists take a more nuanced and critical view, but the myth is strong among some policymakers.

True, bibliometrics can be improved and altmetrics, based on data other than traditional citations, have their place too. But much remains to be done to see just how useful quantitative analyses are and where their limits lie, especially in the social sciences and humanities. The argument that these disciplines should adapt to better fit bibliometric analysis does not stand up. That is the policy tail wagging the research dog.

Between the extremes, there is much ground to be explored and this is where ENRESSH comes in. The project brings together evaluation specialists, users, policymakers and researchers to improve the understanding of how the social sciences and humanities generate knowledge; to observe scientific and societal interactions in these disciplines; and to understand and explain patterns of dissemination. Social impact and innovation are central issues in all fields of research, and not just in the social sciences and humanities. 

A one-size-fits-all approach is clearly neither possible nor desirable. Local conditions are important and reflect local research culture; what is needed is to see how these conditions sit with European and international criteria.

Large-scale evaluations such as the UK’s Research Excellence Framework and the Netherlands’ Standard Evaluation Protocol work in countries where large amounts of funding are in play. Databases of research activity such as Cristin in Norway can work elsewhere, but have to be adapted to local needs.

In France, direct state funding to research groups is minimal and university autonomy is recent. Thus, the assessments made by France’s High Council for Evaluation of Research and Higher Education (HCERES) are closer to monitoring and gentle nudging than to evaluation, as there are no clear indicators and evaluators have no real training. However, HCERES is important in creating a culture that will make true evaluation possible in years to come.

Europe has its needs and individual countries have theirs; the result is a growing jungle of European needs, national needs and national solutions. ENRESSH aims to stimulate informed discussion of what is done where and why, and then seek means of improving systems to better take into account both the aspirations of the disciplines and the objectives of policymakers.

Thirty countries have signed up for ENRESSH, which will run for four years and is funded by the European Cooperation in Science and Technology programme. Early plans include a strong presence at the September Science and Technology Indicators conference in Valencia, with a special track on evaluation in the social sciences and humanities. An invitation-only event designed to collect information and ensure outreach from the outset is scheduled for January. Summer 2017 will see a special session at the international conference on Research Evaluation in the Social Sciences and Humanities in Antwerp.

By standing back to study existing evaluation procedures, it will be possible to step forward with proposals for improvement. This will open the way for a wider consultation of all academic and research disciplines and strengthen the incentive to find ways to valorise the diversity and wealth of social science and humanities research.

Ioana Galleron is at the Université Grenoble Alpes. Geoffrey Williams is president of the EvalHum Initiative.

More to say? Email comment@ResearchResearch.com

This article also appeared in Research Europe