
The ability of different peer review procedures to flag problematic publications

Author

Listed:
  • S. P. J. M. Horbach

    (Radboud University
    Leiden University)

  • W. Halffman

    (Radboud University)

Abstract

There is a mounting worry about erroneous and outright fraudulent research that gets published in the scientific literature. Although peer review’s ability to filter out such publications is contentious, several peer review innovations attempt to do just that. However, there is very little systematic evidence documenting the ability of different review procedures to flag problematic publications. In this article, we use survey data on peer review in a wide range of journals to compare the retraction rates of specific review procedures, using the Retraction Watch database. We were able to identify which peer review procedures were used since 2000 for 361 journals, publishing a total of 833,172 articles, of which 670 were retracted. After addressing the dual character of retractions, signalling both a failure to identify problems prior to publication and a willingness to correct mistakes, we empirically assess review procedures. With considerable conceptual caveats, we were able to identify peer review procedures that seem able to detect problematic research better than others. Results were checked for disciplinary differences and for variation in reasons for retraction. This leads to informed recommendations for journal editors about strengths and weaknesses of specific peer review procedures, allowing them to select review procedures that address issues most relevant to their field.
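The study's core comparison can be read as a simple rate calculation: for each peer review procedure, the number of retracted articles is divided by the number of articles published in journals using that procedure. The sketch below illustrates that bookkeeping in Python; the procedure labels and counts are hypothetical placeholders, not the paper's data or code.

    from collections import defaultdict

    # Each record: (review procedure reported by the journal,
    #               articles published since 2000, articles retracted).
    # Figures are illustrative placeholders, not the study's data.
    journals = [
        ("single-blind review", 250000, 240),
        ("double-blind review", 180000, 130),
        ("open review", 40000, 20),
    ]

    totals = defaultdict(lambda: [0, 0])  # procedure -> [articles, retractions]
    for procedure, n_articles, n_retracted in journals:
        totals[procedure][0] += n_articles
        totals[procedure][1] += n_retracted

    for procedure, (n_articles, n_retracted) in totals.items():
        rate = n_retracted / n_articles
        print(f"{procedure}: {n_retracted}/{n_articles} retracted ({rate:.4%})")

Interpreting such rates requires the caveat the abstract raises: a higher retraction rate can signal weaker screening before publication, but also a greater willingness to correct the record afterwards.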

Suggested Citation

  • S. P. J. M. Horbach & W. Halffman, 2019. "The ability of different peer review procedures to flag problematic publications," Scientometrics, Springer;Akadémiai Kiadó, vol. 118(1), pages 339-373, January.
  • Handle: RePEc:spr:scient:v:118:y:2019:i:1:d:10.1007_s11192-018-2969-2
    DOI: 10.1007/s11192-018-2969-2

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11192-018-2969-2
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11192-018-2969-2?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a location where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    2. Brandli Stitzel & Gary A. Hoover & William Clark, 2018. "More on Plagiarism in the Social Sciences," Social Science Quarterly, Southwestern Social Science Association, vol. 99(3), pages 1075-1088, September.
    3. Hopp, Christian & Hoover, Gary A., 2017. "How prevalent is academic misconduct in management research?," Journal of Business Research, Elsevier, vol. 80(C), pages 73-81.
    4. Ludo Waltman & Nees Jan van Eck, 2012. "A new methodology for constructing a publication‐level classification system of science," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 63(12), pages 2378-2392, December.
    5. Tianwei He, 2013. "Retraction of global scientific publications from 2001 to 2010," Scientometrics, Springer;Akadémiai Kiadó, vol. 96(2), pages 555-561, August.
    6. Azoulay, Pierre & Bonatti, Alessandro & Krieger, Joshua L., 2017. "The career effects of scandal: Evidence from scientific retractions," Research Policy, Elsevier, vol. 46(9), pages 1552-1569.
    7. Marion Schmidt, 2018. "An analysis of the validity of retraction annotation in pubmed and the web of science," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 69(2), pages 318-328, February.
    8. Solmaz Filiz Karabag & Christian Berggren, 2016. "Misconduct, Marginality and Editorial Practices in Management, Business and Economics Journals," PLOS ONE, Public Library of Science, vol. 11(7), pages 1-25, July.
    9. Carole J. Lee & Cassidy R. Sugimoto & Guo Zhang & Blaise Cronin, 2013. "Bias in peer review," Journal of the American Society for Information Science and Technology, Association for Information Science & Technology, vol. 64(1), pages 2-17, January.
    10. Vincent Larivière & Stefanie Haustein & Philippe Mongeon, 2015. "The Oligopoly of Academic Publishers in the Digital Era," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-15, June.
    11. Daniele Fanelli & Rodrigo Costas & Vincent Larivière, 2015. "Misconduct Policies, Academic Culture and Career Stage, Not Gender or Pressures to Publish, Affect Scientific Integrity," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-18, June.
    12. Yıldırım Beyazıt ÇİÇEN, 2017. "II. International Strategic Research Congress," Journal of Economics and Political Economy, KSP Journals, vol. 4(4), pages 423-424, December.
    13. Daniele Fanelli, 2009. "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLOS ONE, Public Library of Science, vol. 4(5), pages 1-11, May.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Martin Reinhart & Cornelia Schendzielorz, 2024. "Peer-review procedures as practice, decision, and governance—the road to theories of peer review," Science and Public Policy, Oxford University Press, vol. 51(3), pages 543-552.
    2. Jasper Brinkerink, 2023. "When Shooting for the Stars Becomes Aiming for Asterisks: P-Hacking in Family Business Research," Entrepreneurship Theory and Practice, , vol. 47(2), pages 304-343, March.
    3. Lingzi Feng & Junpeng Yuan & Liying Yang, 2020. "An observation framework for retracted publications in multiple dimensions," Scientometrics, Springer;Akadémiai Kiadó, vol. 125(2), pages 1445-1457, November.
    4. Sida Feng & Lingzi Feng & Fang Han & Ye Zhang & Yanqing Ren & Lixue Wang & Junpeng Yuan, 2024. "Citation network analysis of retractions in molecular biology field," Scientometrics, Springer;Akadémiai Kiadó, vol. 129(8), pages 4795-4817, August.
    5. Catalin Toma & Liliana Padureanu & Bogdan Toma, 2022. "Correction of the Scientific Production: Publisher Performance Evaluation Using a Dataset of 4844 PubMed Retractions," Publications, MDPI, vol. 10(2), pages 1-25, April.
    6. José Luis Ortega, 2022. "Classification and analysis of PubPeer comments: How a web journal club is used," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 73(5), pages 655-670, May.
    7. Alessandro Checco & Lorenzo Bracciale & Pierpaolo Loreti & Stephen Pinfield & Giuseppe Bianchi, 2021. "AI-assisted peer review," Palgrave Communications, Palgrave Macmillan, vol. 8(1), pages 1-11, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a sketch of this matching logic follows the list.
    1. Dag W. Aksnes & Liv Langfeldt & Paul Wouters, 2019. "Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories," SAGE Open, , vol. 9(1), pages 21582440198, February.
    2. Emilija Stojmenova Duh & Andrej Duh & Uroš Droftina & Tim Kos & Urban Duh & Tanja Simonič Korošak & Dean Korošak, 2019. "Publish-and-Flourish: Using Blockchain Platform to Enable Cooperative Scholarly Communication," Publications, MDPI, vol. 7(2), pages 1-15, May.
    3. Salandra, Rossella, 2018. "Knowledge dissemination in clinical trials: Exploring influences of institutional support and type of innovation on selective reporting," Research Policy, Elsevier, vol. 47(7), pages 1215-1228.
    4. A. I. M. Jakaria Rahman & Raf Guns & Loet Leydesdorff & Tim C. E. Engels, 2016. "Measuring the match between evaluators and evaluees: cognitive distances between panel members and research groups at the journal level," Scientometrics, Springer;Akadémiai Kiadó, vol. 109(3), pages 1639-1663, December.
    5. Horbach, S.P.J.M.(Serge) & Halffman, W.(Willem), 2019. "The extent and causes of academic text recycling or ‘self-plagiarism’," Research Policy, Elsevier, vol. 48(2), pages 492-502.
    6. Marlo M Vernon & E Andrew Balas & Shaher Momani, 2018. "Are university rankings useful to improve research? A systematic review," PLOS ONE, Public Library of Science, vol. 13(3), pages 1-15, March.
    7. Gary A. Hoover & Christian Hopp, 2017. "What Crisis? Taking Stock of Management Researchers' Experiences with and Views of Scholarly Misconduct," CESifo Working Paper Series 6611, CESifo.
    8. Gonzalo Marco-Cuenca & José Antonio Salvador-Oliván & Rosario Arquero-Avilés, 2021. "Fraud in scientific publications in the European Union. An analysis through their retractions," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 5143-5164, June.
    9. Tariq Ahmad Shah & Sumeer Gul & Saimah Bashir & Suhail Ahmad & Assumpció Huertas & Andrea Oliveira & Farzana Gulzar & Ashaq Hussain Najar & Kanu Chakraborty, 2021. "Influence of accessibility (open and toll-based) of scholarly publications on retractions," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(6), pages 4589-4606, June.
    10. M. D. Ribeiro & S. M. R. Vasconcelos, 2018. "Retractions covered by Retraction Watch in the 2013–2015 period: prevalence for the most productive countries," Scientometrics, Springer;Akadémiai Kiadó, vol. 114(2), pages 719-734, February.
    11. Caroline Lievore & Priscila Rubbo & Celso Biynkievycz Santos & Claudia Tânia Picinin & Luiz Alberto Pilatti, 2021. "Research ethics: a profile of retractions from world class universities," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(8), pages 6871-6889, August.
    12. Fei Shu & Xiaojian Wang & Sichen Liu & Junping Qiu & Vincent Larivière, 2023. "Global impact or national accessibility? A paradox in China’s science," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(1), pages 263-277, January.
    13. Mohan, Vijay, 2019. "On the use of blockchain-based mechanisms to tackle academic misconduct," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    14. Yanto Chandra, 2018. "Mapping the evolution of entrepreneurship as a field of research (1990–2013): A scientometric analysis," PLOS ONE, Public Library of Science, vol. 13(1), pages 1-24, January.
    15. Nadine Desrochers & Adèle Paul‐Hus & Jen Pecoskie, 2017. "Five decades of gratitude: A meta‐synthesis of acknowledgments research," Journal of the Association for Information Science & Technology, Association for Information Science & Technology, vol. 68(12), pages 2821-2833, December.
    16. Diego Chavarro & Puay Tang & Ismael Rafols, 2014. "Interdisciplinarity and research on local issues: evidence from a developing country," Research Evaluation, Oxford University Press, vol. 23(3), pages 195-209.
    17. María Pinto & Rosaura Fernández-Pascual & David Caballero-Mariscal & Dora Sales, 2020. "Information literacy trends in higher education (2006–2019): visualizing the emerging field of mobile information literacy," Scientometrics, Springer;Akadémiai Kiadó, vol. 124(2), pages 1479-1510, August.
    18. K. J. Wang & J. Widagdo & Y. S. Lin & H. L. Yang & S. L. Hsiao, 2016. "A service innovation framework for start-up firms by integrating service experience engineering approach and capability maturity model," Service Business, Springer;Pan-Pacific Business Association, vol. 10(4), pages 867-916, December.
    19. Jürgen Janger & Nicole Schmidt-Padickakudy & Anna Strauss-Kollin, 2019. "International Differences in Basic Research Grant Funding. A Systematic Comparison," WIFO Studies, WIFO, number 61664, March.
    20. Juan Miguel Campanario, 2018. "Are leaders really leading? Journals that are first in Web of Science subject categories in the context of their groups," Scientometrics, Springer;Akadémiai Kiadó, vol. 115(1), pages 111-130, April.
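    As noted above, the "Most related items" list is built from overlap in citation links: items that cite the same works as this article, or are cited by the same works. The following is a minimal sketch of that matching logic, using hypothetical item identifiers and a simple overlap count rather than RePEc's actual data or ranking algorithm.

        # Hypothetical reference/citer sets for a target item and two candidates;
        # relatedness is scored as shared cited works plus shared citing works.
        def relatedness(target, candidate):
            shared_refs = len(target["refs"] & candidate["refs"])
            shared_citers = len(target["citers"] & candidate["citers"])
            return shared_refs + shared_citers

        target = {"refs": {"ioannidis2005", "fanelli2009"}, "citers": {"feng2020"}}
        candidates = {
            "item-a": {"refs": {"ioannidis2005"}, "citers": set()},
            "item-b": {"refs": {"fanelli2009"}, "citers": {"feng2020"}},
        }

        ranked = sorted(candidates, key=lambda k: relatedness(target, candidates[k]),
                        reverse=True)
        print(ranked)  # candidates sharing the most references and citers come first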

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:scient:v:118:y:2019:i:1:d:10.1007_s11192-018-2969-2. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.