The good fight


RPN Live: Winning the war against research misconduct

In the words of Sarah Richardson, editor-in-chief of Research Professional News, speaking at an RPN Live webinar on research integrity, there is “growing talk of a reproducibility crisis in research and growing awareness that systemic issues within research environments—pressure to publish, insecure environments, insufficient institutional oversight, to name just a few—are jeopardising trust in the reliability of research”.

A serious situation, but it’s not all doom and gloom. A fightback is underway, as the six panellists at the webinar laid out. Here are the four main themes that developed during the discussion.

Misconduct is a serious problem

The opening presentation by Elisabeth Bik, a science integrity consultant, was a bracing demonstration of how widespread misconduct is.

In 2014-15, Bik examined more than 20,000 papers for image manipulation, flagging 782 of them to journals as problematic. She estimated that half of those involved deliberate manipulation rather than honest error, equating to roughly 2 per cent of all the papers assessed.

When asked whether she thought that figure indicated the level of deliberate falsification of results in the contemporary wider scientific record, she said that was a tough question and that she was only able to catch “very dumb” errors, where little or no effort had been made to cover up the mistake. 

“It would also be very hard to detect any data manipulation that is not in a photo,” she added. “So the real percentage of misconduct might be much higher than 2 per cent.”

Bik believes that research misconduct is becoming more prevalent, especially with the rise of generative artificial intelligence and its likely use by ‘paper mills’—organisations that produce and sell fraudulent manuscripts that resemble genuine research.

Sabina Alam, director of publishing ethics and integrity at the academic publisher Taylor & Francis, agreed, adding: “The paper mills are one thing, but we can also see people who are part of what we refer to as ‘cartels’… They’re working with each other to boost their publication numbers and h-indexes [a measure of the significance of a researcher’s published work] to levels never seen before.”

Honest mistakes happen

As Alam noted, careless errors and deliberate misconduct “are being lumped into one box”, which can deter researchers who have spotted a problem in their own paper from requesting a retraction—something that itself leads to the perpetuation of errors in the scientific record.

Outcomes for researchers who have made honest mistakes and those who have engaged in deliberate misconduct need to be different, stressed Marcus Munafò, chair of the UK Reproducibility Network.

He said: “Yes, the [investigation] process needs to protect the rights of the individual accused, but once the process is complete and it’s known that misconduct did take place, that needs to be made clear—and often it isn’t.”

Sector-wide response needed

The need for a sector-wide response was a point of apparent universal agreement, although panellists put different emphasis on the various stakeholders in academic research. 

For Munafò, academic institutions must take the lead on this topic, partly due to “issues around the legalities of misconduct”. But he said it was a challenge to ensure that institutions did not “mark their own homework”, and he encouraged “cross-institutional partnerships that would allow for cases to be reviewed independently” to meet that challenge.

Nandita Quaderi, editor-in-chief of Web of Science*, sounded a note of caution while still agreeing about the sector-wide response: “When we talk about shared responsibility, we need to make sure that doesn’t allow people to shirk their responsibility. Each stakeholder has to have a very clear sense of what bit of the process they’re responsible for.”

A ‘just culture’ is a way forward

Munafò described what an integrated system that predisposed researchers towards honest behaviour might look like—“in some sectors, this is called a ‘just culture’”, he said.

One way of creating this just culture could be to shift the focus from outputs to process. He gave the example of ‘registered reports’, a publishing model in which researchers submit their work at the protocol stage, before any data have been collected, thus rewarding robust methodology and well-posed research questions rather than eye-catching results.

Like other panellists, Miles Padgett, a member of the UK Committee on Research Integrity, stressed the need for transparency but added that total transparency was not always possible. As a physical scientist, Padgett said that the data he typically dealt with were “basically collections of numbers taken from instruments. If your data is the output of an interview, that needs to be treated very differently in terms of transparency.”  

*Web of Science is a Clarivate product. Research Professional News is an editorially independent part of Clarivate.

Listen to the webinar recording

This is an extract from an article in Research Professional’s Funding Insight service. To subscribe contact sales@researchresearch.com