The Biggest Retraction Stories of 2025 (And What They Mean for Your Research)
Roughly 10,000 papers get retracted every year now, growing at 22% annually. Here's what that means for literature reviews.
If you're conducting literature reviews in 2025, here's a number you need to know: 10,000.
That's roughly how many papers get retracted every year now. The growth rate is 22% annually - far outpacing the 6% growth rate of overall publications.
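To see why those two growth rates matter, here is a quick back-of-the-envelope projection (a sketch: the 10,000 and the 22%/6% rates come from the figures above; the total publication volume and the ten-year horizon are illustrative assumptions):

```python
# Back-of-the-envelope: how fast does the retraction *rate* grow when
# retractions compound at 22%/yr but publications at only 6%/yr?
# 10,000 retractions/yr and the growth rates are the article's figures;
# the 5M annual publication volume is an illustrative assumption.
retractions = 10_000.0
publications = 5_000_000.0

for year in range(11):
    rate_per_10k = retractions / publications * 10_000
    print(f"year {year:2d}: {retractions:>9,.0f} retractions, "
          f"{rate_per_10k:5.1f} per 10K papers")
    retractions *= 1.22
    publications *= 1.06
```

The relative rate compounds at 1.22 / 1.06 ≈ 1.15 per year, so under these assumptions the per-10K retraction rate roughly quadruples within a decade. That divergence is why screening workflows matter a little more every year.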
Last week's Friday Lab covered the biggest retraction stories of 2025 and the broader patterns they reveal. If you missed it, here's what you need to know.
Retraction rates vary significantly by field. The highest rates are in electrical engineering and computer science (33 per 10K papers), followed by clinical and life sciences (14 per 10K), and engineering and materials science (8 per 10K).
One key finding from recent research: rapid publication growth in a field correlates with non-linear increases in retractions. In other words, fields experiencing publication surges see disproportionately higher retraction rates.
Based on recent data through 2024-2025, here's what's driving retractions:
The Dana-Farber Cancer Institute agreed to a $15 million settlement after allegations that researchers reused or misrepresented images in NIH grant applications. Dozens of papers required retractions or corrections.
The case highlighted systemic oversight failures: flawed data passed peer review, supported federal funding, and persisted for years despite being publicly detectable.
The whistleblower received $2.63M under the False Claims Act, a U.S. law allowing private citizens to sue on behalf of the government when federal funds are obtained through fraud.
"A Bacterium That Can Grow by Using Arsenic Instead of Phosphorus" was finally retracted in July 2025, fifteen years after publication.
The retraction focused on whether the study's results actually supported its conclusions. The bacteria weren't using arsenic in place of phosphorus; the arsenic-containing growth medium was simply contaminated with small amounts of phosphate.
This reflects a widening scope of what triggers retractions beyond fraud and misconduct to include inadequate methodology.
"The economic commitment of climate change" was retracted in December 2025 following criticism of its data and methodology.
The paper claimed climate change would reduce GDP by 62% by 2030, roughly three times larger than prior estimates. A single data source was found to have driven the inflated estimate.
Frontiers retracted 122 articles after discovering a coordinated network of authors engaged in systematic misconduct. The investigation identified over 4,000 additional articles linked to the same network across 7 different publishers.
The key fraud mechanisms were citation manipulation (artificially boosting citation counts) and reviewing papers without disclosing conflicts of interest.
Multiple papers were retracted in 2025 for obvious AI artifacts, such as leftover chatbot phrases (e.g., "As an AI language model...") appearing verbatim in the published text.
AI has reduced the marginal cost of adding citations to near zero. Researchers can scan summaries and understand a paper's directional finding without reading the full text.
This means bad papers - including those eventually retracted - propagate more quickly through citation networks. The lower barrier to citation increases the risk of citing flawed or fraudulent work.
Formal screening processes, similar to those used in systematic reviews, help mitigate this risk.
The Problematic Paper Screener (PPS), created by Guillaume Cabanac, scans the literature at scale for signals of problematic papers, such as "tortured phrases" (mangled paraphrases of standard terminology) and citations to already-retracted work.
Other useful tools include the Retraction Watch Database (the authoritative source for retraction tracking), LibKey (which flags retracted papers automatically and is integrated into moara.io), and PubPeer for community-driven post-publication peer review.
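As a rough illustration of what automated retraction screening amounts to under the hood, here is a minimal sketch that checks a batch of papers against a local list of retracted DOIs. The `flag_retracted` helper and the sample data are hypothetical; in practice the list would come from a source like the Retraction Watch Database, and tools like LibKey handle this lookup for you.

```python
# Minimal sketch: flag papers in a review batch whose DOI appears in a
# local list of retracted DOIs. The sample data below is made up for
# illustration; a real list would come from e.g. a Retraction Watch export.
def flag_retracted(papers, retracted_dois):
    """Return the input papers annotated with a boolean 'retracted' flag."""
    retracted = {doi.strip().lower() for doi in retracted_dois}
    return [
        {**paper, "retracted": paper.get("doi", "").strip().lower() in retracted}
        for paper in papers
    ]

# Hypothetical review batch and retraction list.
batch = [
    {"title": "Paper A", "doi": "10.1000/alpha"},
    {"title": "Paper B", "doi": "10.1000/BETA"},  # DOI case varies in the wild
    {"title": "Paper C", "doi": "10.1000/gamma"},
]
retracted_list = ["10.1000/beta"]

for paper in flag_retracted(batch, retracted_list):
    marker = "RETRACTED" if paper["retracted"] else "ok"
    print(f'{paper["title"]}: {marker}')
```

Here "Paper B" gets flagged despite the case mismatch, since DOIs are case-insensitive; normalizing before comparison is the main practical wrinkle in a check like this.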
A few practical steps based on what we've learned from 2025's retractions:
We integrate with LibKey to automatically flag retracted papers during screening. In the Friday Lab demo, we uploaded a retracted arsenic paper (the one mentioned above), enriched it to find the DOI, and watched the retraction notice appear automatically in the screening interface.
No manual checking required. The system flags retracted papers as part of the normal screening workflow.
Literature reviews accumulate outdated information over time. As retractions increase, the need for versioning and updating becomes more apparent.
The challenge is developing workflows that can incorporate new information - including retractions - into existing reviews without requiring complete reconstruction.
Try moara.io free: https://moara.io
Upcoming Friday Labs: