IDEAS home Printed from https://ideas.repec.org/a/spr/qualqt/v56y2022i3d10.1007_s11135-021-01196-6.html

Improving the reproducibility of findings by updating research methodology

Author

Listed:
  • Joseph Klein (Bar-Ilan University)

Abstract

The literature discusses causes of the low reproducibility of scientific publications. Our article adds another major cause: uncritical adherence to accepted research procedures. This is evident in: (1) anachronistically requiring researchers to ground their work in a theoretical background even when the studies cited were never tested for reproducibility; (2) conducting studies that suffer from a novelty-effect bias; (3) forcing researchers who use data-mining methods and field-based theory, with no preliminary theoretical rationale, to present a theoretical background that allegedly guided their work as a precondition for publication of their findings. Research validity with respect to the above problems can be increased by the following means: (1) conducting a longitudinal study on the same participants, and only on them; (2) shortening the interval between laboratory experiments and those on humans, based on cost–benefit considerations anchored in ethical norms; (3) reporting the theoretical background in a causal, modular format; (4) giving incentives to those who meet the above criteria while moderating the pressure for fast output.

Suggested Citation

  • Joseph Klein, 2022. "Improving the reproducibility of findings by updating research methodology," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(3), pages 1597-1609, June.
  • Handle: RePEc:spr:qualqt:v:56:y:2022:i:3:d:10.1007_s11135-021-01196-6
    DOI: 10.1007/s11135-021-01196-6

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s11135-021-01196-6
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s11135-021-01196-6?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Ava Kiai, 2019. "To protect credibility in science, banish “publish or perish”," Nature Human Behaviour, Nature, vol. 3(10), pages 1017-1018, October.
    2. Neal S Young & John P A Ioannidis & Omar Al-Ubaydli, 2008. "Why Current Publication Practices May Distort Science," PLOS Medicine, Public Library of Science, vol. 5(10), pages 1-5, October.
    3. Marcus R. Munafò & George Davey Smith, 2018. "Robust research needs many lines of evidence," Nature, Nature, vol. 553(7689), pages 399-401, January.
    4. C. Glenn Begley & Lee M. Ellis, 2012. "Raise standards for preclinical cancer research," Nature, Nature, vol. 483(7391), pages 531-533, March.
    5. Belton, Cameron A. & Sugden, Robert, 2018. "Attention and novelty: An experimental investigation of order effects in multiple valuation tasks," Journal of Economic Psychology, Elsevier, vol. 67(C), pages 103-115.
    6. Monya Baker, 2016. "1,500 scientists lift the lid on reproducibility," Nature, Nature, vol. 533(7604), pages 452-454, May.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Peter Harremoës, 2019. "Replication Papers," Publications, MDPI, vol. 7(3), pages 1-8, July.
    2. Ana Cecilia Quiroga Gutierrez & Daniel J. Lindegger & Ala Taji Heravi & Thomas Stojanov & Martin Sykora & Suzanne Elayan & Stephen J. Mooney & John A. Naslund & Marta Fadda & Oliver Gruebner, 2023. "Reproducibility and Scientific Integrity of Big Data Research in Urban Public Health and Digital Epidemiology: A Call to Action," IJERPH, MDPI, vol. 20(2), pages 1-15, January.
    3. Mark D Lindner & Richard K Nakamura, 2015. "Examining the Predictive Validity of NIH Peer Review Scores," PLOS ONE, Public Library of Science, vol. 10(6), pages 1-12, June.
    4. Daniel Homocianu, 2024. "Life Satisfaction: Insights from the World Values Survey," Societies, MDPI, vol. 14(7), pages 1-41, July.
    5. Kiran Sharma & Satyam Mukherjee, 2024. "The ripple effect of retraction on an author’s collaboration network," Journal of Computational Social Science, Springer, vol. 7(2), pages 1519-1531, October.
    6. Jinzhou Li & Marloes H. Maathuis, 2021. "GGM knockoff filter: False discovery rate control for Gaussian graphical models," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 83(3), pages 534-558, July.
    7. Obsa Urgessa Ayana & Jima Degaga, 2022. "Effects of rural electrification on household welfare: a meta-regression analysis," International Review of Economics, Springer;Happiness Economics and Interpersonal Relations (HEIRS), vol. 69(2), pages 209-261, June.
    8. Hussinger, Katrin & Pellens, Maikel, 2019. "Guilt by association: How scientific misconduct harms prior collaborators," Research Policy, Elsevier, vol. 48(2), pages 516-530.
    9. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    10. Dennis Bontempi & Leonard Nuernberg & Suraj Pai & Deepa Krishnaswamy & Vamsi Thiriveedhi & Ahmed Hosny & Raymond H. Mak & Keyvan Farahani & Ron Kikinis & Andrey Fedorov & Hugo J. W. L. Aerts, 2024. "End-to-end reproducible AI pipelines in radiology using the cloud," Nature Communications, Nature, vol. 15(1), pages 1-9, December.
    11. Omar Al-Ubaydli & John A. List, 2015. "Do Natural Field Experiments Afford Researchers More or Less Control than Laboratory Experiments? A Simple Model," NBER Working Papers 20877, National Bureau of Economic Research, Inc.
    12. Anya Topiwala & Kulveer Mankia & Steven Bell & Alastair Webb & Klaus P. Ebmeier & Isobel Howard & Chaoyue Wang & Fidel Alfaro-Almagro & Karla Miller & Stephen Burgess & Stephen Smith & Thomas E. Nicho, 2023. "Association of gout with brain reserve and vulnerability to neurodegenerative disease," Nature Communications, Nature, vol. 14(1), pages 1-9, December.
    13. Belton, Cameron A. & Lunn, Peter D., 2020. "Smart choices? An experimental study of smart meters and time-of-use tariffs in Ireland," Energy Policy, Elsevier, vol. 140(C).
    14. Andreoli-Versbach, Patrick & Mueller-Langer, Frank, 2014. "Open access to data: An ideal professed but not practised," Research Policy, Elsevier, vol. 43(9), pages 1621-1633.
    15. Kirmayer, Laurence J., 2012. "Cultural competence and evidence-based practice in mental health: Epistemic communities and the politics of pluralism," Social Science & Medicine, Elsevier, vol. 75(2), pages 249-256.
    16. Colin F. Camerer & Anna Dreber & Felix Holzmeister & Teck-Hua Ho & Jürgen Huber & Magnus Johannesson & Michael Kirchler & Gideon Nave & Brian A. Nosek & Thomas Pfeiffer & Adam Altmejd & Nick Buttrick , 2018. "Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015," Nature Human Behaviour, Nature, vol. 2(9), pages 637-644, September.
    17. Aarti Iyer & Gwilym Pryce, 2024. "Theorising the causal impacts of social frontiers: The social and psychological implications of discontinuities in the geography of residential mix," Urban Studies, Urban Studies Journal Limited, vol. 61(5), pages 782-798, April.
    18. Irsova, Zuzana & Doucouliagos, Hristos & Havranek, Tomas & Stanley, T. D., 2023. "Meta-Analysis of Social Science Research: A Practitioner’s Guide," EconStor Preprints 273719, ZBW - Leibniz Information Centre for Economics.
    19. Fabian Scheidegger & Andre Briviba & Bruno S. Frey, 2023. "Behind the curtains of academic publishing: strategic responses of economists and business scholars," Scientometrics, Springer;Akadémiai Kiadó, vol. 128(8), pages 4765-4790, August.
    20. Bettina Bert & Céline Heinl & Justyna Chmielewska & Franziska Schwarz & Barbara Grune & Andreas Hensel & Matthias Greiner & Gilbert Schönfelder, 2019. "Refining animal research: The Animal Study Registry," PLOS Biology, Public Library of Science, vol. 17(10), pages 1-12, October.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:qualqt:v:56:y:2022:i:3:d:10.1007_s11135-021-01196-6. See general information about how to correct material in RePEc.

If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.