Open infrastructure needs a level playing field

Funders’ approach leaves scholar-led projects at a disadvantage against commercial players, says Benedikt Fecher

The 2002 Budapest Open Access Initiative envisaged restructuring access to scientific research, with the help of the internet. This would accelerate scientific progress, lessen the power of commercial publishers and help academics regain autonomy over their infrastructure in light of the rocketing cost of scholarly journals. 

Twenty years on, the number of open-access papers, journals and preprints has increased, yet the promised autonomy over the infrastructure of scholarly communication has not materialised. In fact, the dependence on commercial players has reproduced itself in the digital realm.

Digital tools have become indispensable in researchers’ everyday lives; they help in searching and summarising the literature, storing and commenting on data, communicating science with external stakeholders and tracking post-publication impact.

This “platformisation” occurs at every stage of research. Most of these tools and services are commercially owned, and many have been developed or bought by the major publishers. The 20th-anniversary recommendations from the Budapest initiative reflect the concerns around this trend, and its likely acceleration. 

There are alternatives: academic-led initiatives such as the search engine BASE, the discovery tool Open Knowledge Maps, Crossref, a registration agency for Digital Object Identifiers, and many community-driven repositories serving particular places or disciplines. 

Researchers surely want a strong presence for such open, non-commercial, diverse infrastructures. We would like to avoid lock-in and domination by commercial entities, and to preserve genuinely open services and the more niche infrastructures that are sustained by academic rather than commercial logic. 

For a study published last year, my colleagues and I compared 33 commercial and non-commercial services at various points in the research lifecycle. Almost all aim to make research more transparent, inclusive, accessible and effective. 

We found that publicly funded infrastructures are at a disadvantage, largely due to the way they are funded. There are three main reasons for this.

Rigid funding logics 

Research infrastructures today are mainly software projects. Software development demands constant adaptation to user needs, whereas public funding for research infrastructure is usually linear and rigid. Every change to a project becomes a laborious bureaucratic act. 

That means an idea from, say, 2018 gets a funding decision in 2019; after a short user survey, it is implemented as proposed. The result is a software product in 2022 that is obsolete on arrival. 

Under-resourced teams

Compared with commercial services, public and non-commercial efforts usually have fewer programmers and sales personnel and a larger proportion of researchers. This hinders their ability to understand and respond to users’ demands. 

A lack of career opportunities in science for highly qualified non-scientific personnel compounds the problem. Why work for a publicly funded infrastructure project if you do not get credit for it and can earn twice as much in private industry?

Misguided funding

Public funding often supports new services instead of sustaining those that are already operating successfully. At best, new initiatives are expected to have developed a business model by the end of the funding period. This is almost cynical, considering projects often fulfil niche needs that cannot be monetised. Many of these initiatives stop as soon as the money runs out. Infrastructure work is not sexy, but it is necessary for science to function.

This helps to explain why public and non-commercial scholar-led infrastructures struggle to succeed, and why the successful non-commercial efforts mentioned above are the exception. Policymakers need to rethink their approach: there is a danger that short-sighted policy will leave science in the digital age neoliberal rather than open. 

We need to level the playing field, providing scholar-led and open projects with more sustainable funding and ways of working, and regulating commercial products to ensure open values and avoid lock-in. 

Initiatives such as Invest in Open, the Global Sustainability Coalition for Open Science Services and OpenAIRE are important advocates for open values in tomorrow's science infrastructure. 

One way to achieve this might be to treat infrastructure as universities' fourth mission, alongside teaching, research and knowledge transfer. With better funding and training in coding, librarians would be well suited to take on this role. 

Benedikt Fecher heads the Knowledge and Society programme at the Alexander von Humboldt Institute for Internet and Society in Berlin

This article also appeared in Research Europe