Addressing Researcher Fraud: Retrospective, Real-Time, and Preventive Strategies — Including Legal Points and Data Management That Prevents Fraud
Plain English Summary
What if the biggest threat to science isn't bad hypotheses but researchers faking their data? Kennedy tackles this head-on, drawing from decades of hands-on experience -- including running an actual sting operation back in the 1970s and spending 15 years in FDA-regulated drug trials where data integrity is literally a legal requirement. He lays out three ways to fight research fraud. First, the usual approach: investigating after someone blows the whistle. Problem is, by then the evidence is murky, standards are loose, and you often can't even prove who did it. Second, catching cheaters in the act with sting operations -- beautifully conclusive but almost impossible to pull off in practice. Third, and this is the real gem: preventive measures borrowed from the pharmaceutical world that most academic researchers have simply never adopted. Kennedy spells out eight specific practices -- things like keeping tamper-proof audit trails, locking down who can touch the raw data, validating your software, running independent audits, and making datasets public. He even audited a flagship parapsychology experiment (the Transparent Psi Project) and found a significant software bug lurking in what was supposed to be a gold-standard registered study. That's a wake-up call. He proposes a new badge system so readers can instantly see which studies actually meet these standards. And here's the kicker for the future: AI is about to make fraud dramatically harder to detect, raising the stakes for getting these safeguards in place now.
Abstract
Researcher fraud is often easy and enticing in academic research, with little risk of detection. Cases of extensive fraud continue to occur. The amount of fraud that goes undetected is unknown and may be substantial. Three strategies for addressing researcher fraud are (a) retrospective investigations after allegations of fraud have been made, (b) sting operations that provide conclusive evidence of fraud as it occurs, and (c) data management practices that prevent the occurrence of fraud. Institutional and regulatory efforts to address researcher fraud have focused almost exclusively on the retrospective strategy. The retrospective approach is subject to controversy due to the limitations of post-hoc evidence in science, the difficulty of establishing who actually committed the fraud in some cases, the application of a legal standard of evidence that is much lower than the usual standards of evidence in science, and the lack of legal expertise among scientists investigating fraud. The retrospective strategy may be reliably effective primarily in cases of extensive, careless fraud. Sting operations can overcome these limitations and controversies, but are not feasible in many situations. Data management practices that are effective at preventing researcher fraud and unintentional errors are well-established in clinical trials regulated by government agencies, but appear to be largely unknown or unimplemented in most academic research. Established data management practices include: archiving secure copies of the raw data, audit trails, restricted access to the data and data collection processes, software validation, quality control checks, blinding, preregistration of data processing and analysis programs, and research audits that directly address fraud. Current discussions about data management in academic research focus on sharing data with little attention to practices that prevent intentional and unintentional errors.
A designation or badge such as error-controlled data management could be established to indicate research that was conducted with data management practices that effectively address intentional and unintentional errors.
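To make the audit-trail idea concrete, the sketch below (not from the paper; a generic hash-chain design, with hypothetical names like `AuditTrail`) shows one simple way a data-change log can be made tamper-evident: each entry records the hash of the previous entry, so retroactively editing or deleting any record breaks the chain and is detectable on verification.

```python
import hashlib
import json
import time


def sha256_hex(data: bytes) -> str:
    """Hex SHA-256 digest, used as the chain link between log entries."""
    return hashlib.sha256(data).hexdigest()


class AuditTrail:
    """Append-only change log (hypothetical illustration).

    Each entry stores the hash of the previous entry; altering any
    earlier record invalidates every hash that follows it."""

    GENESIS = "0" * 64  # placeholder "previous hash" for the first entry

    def __init__(self):
        self.entries = []

    def append(self, user: str, action: str, detail: str) -> None:
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        record = {
            "timestamp": time.time(),
            "user": user,
            "action": action,
            "detail": detail,
            "prev_hash": prev_hash,
        }
        # Hash the record body deterministically, then store the hash in it.
        record["hash"] = sha256_hex(json.dumps(record, sort_keys=True).encode())
        self.entries.append(record)

    def verify(self) -> bool:
        """Recompute every hash and chain link; False means tampering."""
        prev_hash = self.GENESIS
        for record in self.entries:
            body = {k: v for k, v in record.items() if k != "hash"}
            if record["prev_hash"] != prev_hash:
                return False
            if record["hash"] != sha256_hex(
                json.dumps(body, sort_keys=True).encode()
            ):
                return False
            prev_hash = record["hash"]
        return True
```

Regulated clinical trials require audit trails of this general kind (who changed what, and when); the hash chain here is just one lightweight way to make the log itself tamper-evident, complementing the paper's other recommendations such as restricted data access and secure archives of the raw data.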
Links
Related Papers
Same Research Program
- Experimenter Fraud: What Are Appropriate Methodological Standards? — Kennedy, J. E. (2017)
- Can Parapsychology Move Beyond the Controversies of Retrospective Meta-Analyses? — Kennedy, J. E. (2013)
- Is the Methodological Revolution in Psychology Over or Just Beginning? — Kennedy, J. E. (2016)
- Conclusions about Paranormal Phenomena — Kennedy, J. E. (2013)
- Planning Falsifiable Confirmatory Research — Kennedy, J. E. (2024)
Cites
- False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant — Simmons, Joseph P. (2011)
- An Agenda for Purely Confirmatory Research — Wagenmakers, Eric-Jan (2012)
- Editors' Introduction to the Special Section on Replicability in Psychological Science: A Crisis of Confidence? — Pashler, Harold (2012)
More in Methodology
Paranormal belief, conspiracy endorsement, and positive wellbeing: a network analysis
Quantum Aspects of the Brain-Mind Relationship: A Hypothesis with Supporting Evidence
Paranormal beliefs and cognitive function: A systematic review and assessment of study quality across four decades of research
Experimental evidence of non-classical brain functions
Self-Ascribed Paranormal Ability: Reflexive Thematic Analysis
📋 Cite this paper
Kennedy, James E (2024). Addressing Researcher Fraud: Retrospective, Real-Time, and Preventive Strategies — Including Legal Points and Data Management That Prevents Fraud. Frontiers in Research Metrics and Analytics. https://doi.org/10.3389/frma.2024.1397649
@article{kennedy_2024_research_fraud,
title = {Addressing Researcher Fraud: Retrospective, Real-Time, and Preventive Strategies — Including Legal Points and Data Management That Prevents Fraud},
author = {Kennedy, James E},
year = {2024},
journal = {Frontiers in Research Metrics and Analytics},
doi = {10.3389/frma.2024.1397649},
}