Ten years of Dora

The conversation on research assessment has shifted, but the work goes on, says Stephen Curry

In December 2012, around 20 editors and publishers of scholarly journals joined an impromptu gathering at the American Society for Cell Biology annual meeting in San Francisco. They came together to vent their frustration at how the misuse of journal impact factors in research assessment had helped to create a hyper-competitive culture driven by targets, where the focus was on academic outputs rather than the breadth of researcher contributions.

The minutes of that ASCB meeting show that much of the talk was technical and devoted to how to replace JIFs with “a more intelligent, relevant and transparent metric of journal value”. But at the heart of the discussion burned a desire to tackle the roots of the problem, by developing and promoting alternative methods for evaluating researchers—even if there was little talk of how best to do that or how long it might take to effect change.

It took six more months to thrash out the text of what has become known as the San Francisco Declaration on Research Assessment, or Dora, which was posted on the ASCB website on 16 May 2013. Dora proclaims, first and foremost, that JIFs should not be used “as a surrogate measure of the quality of individual research articles, to assess an individual scientist’s contributions, or in hiring, promotion or funding decisions”.

Less well known are the declaration’s 17 concrete recommendations for action by different stakeholders—funding agencies, research institutions, publishers, data providers and researchers themselves. (Journal Impact Factor is a Clarivate product. Research Professional News is an editorially independent part of Clarivate.)

Few if any of those at that first meeting can have imagined they were lighting a flame that would still be burning more than 10 years later. Only one of those present, Bernd Pulverer of the European Molecular Biology Organisation (Embo), is still directly involved in Dora as a member of the steering committee.

But in that decade, Dora has won support from organisations and individuals across different disciplines, including funders, universities, publishers and learned societies. So far, the declaration has attracted more than 23,000 signatories from 160 countries.

Dora began as a loose organisation supported by the ASCB and relying almost entirely on volunteer efforts, limiting its activities to promoting the declaration. Frustration that such a worthwhile initiative was struggling to build momentum within the research community prompted several strong allies of Dora to work with the original steering committee to raise financial support from a group of funders and publishers. This allowed Dora to appoint Anna Hatch as its first community manager in late 2017, a role that evolved into programme director.

Working together

I joined the new steering committee as chair the same year. In 2018, a reinvigorated Dora published an action plan that aimed to increase awareness of the declaration, extend its global and disciplinary reach, and, crucially, develop and promote best practice in research assessment. Dora morphed from a document telling people what they should be doing into an organisation that rolled up its sleeves to work alongside the community on figuring out solutions.

The action plan led to workshops and conferences that sought to understand the cultural and institutional barriers to reforming research assessment and develop alternative approaches. Dora also produced articles and briefings, often in collaboration with other organisations and advocates for change; its website became a rich repository of good practice and case studies; and its staff and steering committee members spoke to and consulted with audiences far and wide.

Anyone involved in research assessment knows how hard it is to evaluate impact but, 10 years on, Dora’s influence can be seen across the globe. Its principles for guiding research assessment are baked into the policies of funders such as the Wellcome Trust and Research England; they are a lynchpin of the Plan S initiative to drive open access publication; and they run through recommendations, policy documents and actions from the European Commission, the Global Research Council, Unesco and others, including last week’s report on reproducibility and research integrity from the House of Commons science and technology committee.

While Dora has emerged as an authoritative voice, its influence owes a great deal to interactions with many other like-minded groups and initiatives, including the Dutch Science in Transition movement, the Leiden Manifesto, the Metric Tide report, the Hong Kong Principles, the Latin American Forum on Research Assessment, and, most recently, the Coalition for Advancing Research Assessment.

Dora has amplified its impact through its readiness to collaborate, for example working with funders to develop and test the narrative CV, which it has long supported as a way to capture richer combinations of qualitative and quantitative information about the diverse dimensions and qualities of scholarly work.

Bumpy journey

Dora’s work is far from complete. Even after a decade, the declaration is not yet known in every corner of academia. Nor has it all been plain sailing. A perennial criticism is that for many organisations signing the declaration is performative rather than transformative.

There is some substance to this, not least because Dora does not have the resources or the wish to police compliance, relying instead on a pragmatic mix of public and private community engagement, recently enhanced by a more robust policy formulation. Elsewhere, institutional inertia has stymied healthier research evaluation.

The problems with JIFs were already well understood in 2013, if not widely appreciated. The JIF is the arithmetic mean of an often highly skewed distribution of citations to a journal's individual papers, a reductive simplification.

Worse, that skew makes a journal's JIF meaningless as a guide to the quality of its individual research papers. But even today, some academics and university administrators cling to the notion that the law of averages means the impact factor can be applied to every paper a journal publishes.

So the work must go on. Though tied to a declaration written in 2013, Dora has always reflected critically on its mission. The 2018 action plan, for example, led to the formation of an international advisory board with representatives from every continent.

Expanded vision

The board brought new perspectives, enabling us in 2020 to expand our vision by articulating the intersections between Dora’s mission to reform research assessment and related movements to advance open scholarship and address long-standing inequities in academia and scholarly communication.

That led to a lowering of the financial barriers for organisations in developing economies wishing to support Dora. It also led to a merger of the steering committee and advisory board in 2022 to create a single centre of authority that shares power worldwide.

We will allow ourselves a few moments to commemorate 10 years of work that have shifted the global conversation on research assessment practices, but Dora remains focused on the journey ahead. March, for example, saw the publication of a new strategic plan for the next three years, developed during a year of consultation.

The plan’s content is not particularly new, because the problem and the emerging solutions are already well known. What is new is a greater focus on implementing and evaluating innovative approaches to research assessment. In concert with many other reformers, we recognise that the time for moaning about the problem is past. The task now is to work with organisations, institutions and advocates to enact change.

This will entail culture change—and that, as Dora’s founders realised, requires “sustained effort”. But the steps to reform are in sight. The first is a clear articulation of the values of the assessing organisation, so that the full range of desired research contributions can be defined when making decisions on recruitment, retention, promotion and funding.

Just as critically, organisations should set clear and achievable standards of performance, to supplant the platitudes about ‘excellence’ that researchers still too often internalise as meaning big grants and papers in ‘top’ journals.

How those standards are structured and reviewed is just as important. For this, narrative CVs can help capture the qualities being assessed in a concise and consistently formatted manner. These qualities can include scholarly impact, research integrity, teaching and mentoring, team building, the ability to build and maintain collaborations, departmental and disciplinary citizenship, or contributions to entrepreneurship, public engagement or policy-making.


To be sure, the narrative CV is not a silver bullet and there are implementation issues to be resolved—for example, how best to blend quantitative and qualitative information; how to separate style from substance; how to configure the most efficient ways to structure requests for information to minimise the burden on applicants and reviewers—but none of these are insurmountable. There is no one right way to implement new forms of research assessment, but concerns that the narrative CV is uniquely subjective and prone to bias are unfounded.

Values and complexities

No process of assessment can be fully objective. All approaches—including metrics—are constructed from social processes, vulnerable to the imperfections of human judgement and, as often as not, striving to evaluate research activities that are not directly comparable. Only by being clear-sighted about the values and complexities embedded in assessment can reform lead to better research and a healthier research culture.

This destination should not be taken as a given. Dora has endured these past 10 years because it embodies an aspiration with worldwide appeal and, dare I say, because it has always been open to the dialogue necessary to illuminate the liminal space, so familiar to academics, between ideals and reality. More than 10 years on from that first gathering of minds, the flame of research assessment reform burns brighter than ever.

Stephen Curry is professor of structural biology at Imperial College London and chair of the Dora steering committee

A version of this article appeared in Research Fortnight and Research Europe