Efforts to root out scientific misconduct need more support, says Elisabeth Bik
Almost every published scientific paper builds on the past work of others. Whether scientists start a new project or expand upon their previous work, they read, cite and base new ideas on existing studies. But what if this work is unreliable? What if it contains errors or, worse, fraudulent data?
This disturbing scenario seems to be more common than one would hope. A couple of years ago, I started systematically searching for duplicated and manipulated images published in biomedical journals.
After scanning an initial set of 20,621 papers, I found that 1 in 25 had problematic figures, half of which appeared to be the result of deliberate manipulation. This 4 per cent might seem a relatively small proportion, but with millions of scientific papers published each year, it amounts to a huge number of publications.
What’s more, this 4 per cent is probably the tip of the iceberg. I looked for problematic images only by eye and did not search for fraud in other types of data, so I probably missed photos duplicated between papers, as well as well-crafted but fabricated line graphs.
Scientific photos and plots can exhibit several types of problem. If two photos are taken of the same tissue sample under a microscope, with the sample shifted slightly between shots, the result is a pair of overlapping images. When such overlapping photos are also rotated or mirrored so that they appear to represent different experiments, it suggests an intention to mislead. In other cases, photos contain duplicated cells or protein bands, suggestive of manipulation using software such as Photoshop.
One suspect paper can threaten to undermine an entire scientific field. A recent investigation by the journal Science, for example, reported allegations of manipulated photos in research carried out at the University of Minnesota supporting the hypothesis that Alzheimer’s disease is caused by buildups of amyloid-beta protein in the brain.
This hypothesis has attracted millions of dollars in funding, but questions about its validity have been growing. Concern over the possibility of manipulation has in turn raised concern about wasted research money and potentially crushed hope for Alzheimer’s patients and their caregivers.
Manipulation is often not caught during peer review. The system of peer review for grants and manuscripts is based on trust and is not designed to detect fraud. Errors and concerns about misconduct are often raised only after publication, when a wider audience gains access to the study. Small errors can be addressed with a correction, but if misconduct is suspected then a paper should be retracted.
As of today, I have examined over 100,000 scientific articles and found thousands containing problematic images, plagiarism, suspect data or ethical concerns. My findings have resulted in 957 corrected and another 923 retracted papers. But more than half of the papers I’ve reported to the journal editors have yet to be acted upon.
There might be many reasons behind this lack of response, such as editors’ uncertainty about how to respond, corresponding authors whose email addresses have changed, missing original data, the long duration of institutional investigations, or even authors who threaten to sue a journal if it retracts their paper. Universities and funders might try to keep these investigations confidential because they fear potential damage to their reputations.
This tardiness, however, is harmful to readers of these papers. Many will be scientists looking for ideas on which to base their own work. They may be unaware that concerns have been raised about a paper and that it is under investigation.
Frustrated by the slow pace of change, several ‘science detectives’, myself included, have taken to reporting concerns around photos and other data to PubPeer, an online platform for post-publication peer review, as a way to warn other scientists that there could be problems with these papers.
But this relies on researchers using PubPeer. What’s really needed is more and faster action from publishers, research institutions, funders and regulatory bodies. Papers containing manipulated or fabricated data should be retracted much more quickly, and preferably not published in the first place.
Must do better
Incentives also need to change. There needs to be less emphasis on novel results, publication metrics and impact factors, and funders need to do more to support activities that focus on quality and reproducibility in science. Getting funding for research is tough, but getting funding for finding errors and misconduct in scientific papers is virtually impossible.
Most perpetrators of scientific misconduct probably face few, if any, consequences, while those, like me, who seek to expose it find themselves attacked online and threatened with lawsuits.
Winning the 2021 John Maddox Prize was wonderful recognition for my work on scientific integrity. But awards and grants for people who work on the robustness and reliability of science are rare. The research community must do better at supporting and funding work on research integrity.
Elisabeth Bik is a science integrity consultant based in California.
A version of this article also appeared in Research Europe