Printed from https://ideas.repec.org/a/nat/nathum/v9y2025i2d10.1038_s41562-024-01961-1.html

Predicting the replicability of social and behavioural science claims in COVID-19 preprints

Authors
  • Alexandru Marcoci

    (University of Cambridge
    University of Nottingham)

  • David P. Wilkinson

    (University of Melbourne)

  • Ans Vercammen

    (University of Melbourne
    The University of Queensland
    Curtin University)

  • Bonnie C. Wintle

    (University of Melbourne)

  • Anna Lou Abatayo

    (Wageningen University and Research)

  • Ernest Baskin

    (Saint Joseph’s University)

  • Henk Berkman

    (University of Auckland)

  • Erin M. Buchanan

    (Harrisburg University of Science and Technology)

  • Sara Capitán

    (Swedish University of Agricultural Sciences)

  • Tabaré Capitán

    (Swedish University of Agricultural Sciences)

  • Ginny Chan

    (Rhizom Psychological Services LLC)

  • Kent Jason G. Cheng

    (The Pennsylvania State University)

  • Tom Coupé

    (University of Canterbury)

  • Sarah Dryhurst

    (University of Cambridge
    University College London)

  • Jianhua Duan

    (Statistics New Zealand)

  • John E. Edlund

    (Rochester Institute of Technology)

  • Timothy M. Errington

    (Center for Open Science)

  • Anna Fedor

    (Independent researcher)

  • Fiona Fidler

    (University of Melbourne)

  • James G. Field

    (West Virginia University)

  • Nicholas Fox

    (Center for Open Science)

  • Hannah Fraser

    (University of Melbourne)

  • Alexandra L. J. Freeman

    (University of Cambridge)

  • Anca Hanea

    (University of Melbourne)

  • Felix Holzmeister

    (University of Innsbruck)

  • Sanghyun Hong

    (University of Canterbury)

  • Raquel Huggins

    (Harrisburg University of Science and Technology)

  • Nick Huntington-Klein

    (Seattle University)

  • Magnus Johannesson

    (Stockholm School of Economics)

  • Angela M. Jones

    (Texas State University)

  • Hansika Kapoor

    (Monk Prayogshala
    University of Connecticut)

  • John Kerr

    (University of Cambridge
    University of Otago)

  • Melissa Kline Struhl

    (Massachusetts Institute of Technology)

  • Marta Kołczyńska

    (Polish Academy of Sciences)

  • Yang Liu

    (University of California, Santa Cruz)

  • Zachary Loomas

    (Center for Open Science)

  • Brianna Luis

    (Center for Open Science)

  • Esteban Méndez

    (Central Bank of Costa Rica)

  • Olivia Miske

    (Center for Open Science)

  • Fallon Mody

    (University of Melbourne)

  • Carolin Nast

    (University of Stavanger, School of Business and Law)

  • Brian A. Nosek

    (Center for Open Science
    University of Virginia)

  • E. Simon Parsons

    (Center for Open Science)

  • Thomas Pfeiffer

    (Massey University)

  • W. Robert Reed

    (University of Canterbury)

  • Jon Roozenbeek

    (University of Cambridge)

  • Alexa R. Schlyfestone

    (Harrisburg University of Science and Technology)

  • Claudia R. Schneider

    (University of Cambridge
    University of Canterbury)

  • Andrew Soh

    (University of Hawaii at Manoa)

  • Zhongchen Song

    (New Zealand Institute of Economic Research (NZIER))

  • Anirudh Tagat

    (Monk Prayogshala)

  • Melba Tutor

    (Independent researcher)

  • Andrew H. Tyner

    (Center for Open Science)

  • Karolina Urbanska

    (Independent researcher)

  • Sander van der Linden

    (University of Cambridge)

Abstract

Replications are important for assessing the reliability of published findings. However, they are costly, and it is infeasible to replicate everything. Accurate, fast, lower-cost alternatives such as eliciting predictions could accelerate assessment for rapid policy implementation in a crisis and help guide a more efficient allocation of scarce replication resources. We elicited judgements from participants on 100 claims from preprints about an emerging area of research (COVID-19 pandemic) using an interactive structured elicitation protocol, and we conducted 29 new high-powered replications. After interacting with their peers, participant groups with lower task expertise (‘beginners’) updated their estimates and confidence in their judgements significantly more than groups with greater task expertise (‘experienced’). For experienced individuals, the average accuracy was 0.57 (95% CI: [0.53, 0.61]) after interaction, and they correctly classified 61% of claims; beginners’ average accuracy was 0.58 (95% CI: [0.54, 0.62]), correctly classifying 69% of claims. The difference in accuracy between groups was not statistically significant and their judgements on the full set of claims were correlated (r(98) = 0.48, P …).
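The two headline measures in the abstract, the share of claims classified correctly and the between-group correlation r(98), can be sketched in a few lines. This is a minimal illustration, assuming judgements are elicited as replication probabilities scored against binary replication outcomes; the function names and the 0.5 threshold are illustrative assumptions, not the authors' actual pipeline.

```python
import math

def classification_rate(probs, outcomes, threshold=0.5):
    """Share of claims whose forecast (probability > threshold) matches
    the observed binary replication outcome (assumed scoring rule)."""
    hits = sum((p > threshold) == bool(o) for p, o in zip(probs, outcomes))
    return hits / len(probs)

def pearson_r(xs, ys):
    """Pearson correlation between two groups' judgements.
    With n paired claims the degrees of freedom are n - 2, which is why
    100 claims yield the r(98) reported in the abstract."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

For example, forecasts `[0.9, 0.2, 0.7]` against outcomes `[1, 0, 0]` give a classification rate of 2/3, since the third claim is predicted to replicate but does not.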

Suggested Citation

  • Alexandru Marcoci & David P. Wilkinson & Ans Vercammen & Bonnie C. Wintle & Anna Lou Abatayo & Ernest Baskin & Henk Berkman & Erin M. Buchanan & Sara Capitán & Tabaré Capitán & Ginny Chan & Kent Jason, 2025. "Predicting the replicability of social and behavioural science claims in COVID-19 preprints," Nature Human Behaviour, Nature, vol. 9(2), pages 287-304, February.
  • Handle: RePEc:nat:nathum:v:9:y:2025:i:2:d:10.1038_s41562-024-01961-1
    DOI: 10.1038/s41562-024-01961-1

    Download full text from publisher

    File URL: https://www.nature.com/articles/s41562-024-01961-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1038/s41562-024-01961-1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    As the access to this document is restricted, you may want to search for a different version of it.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Felix Holzmeister & Magnus Johannesson & Colin F. Camerer & Yiling Chen & Teck-Hua Ho & Suzanne Hoogeveen & Juergen Huber & Noriko Imai & Taisuke Imai & Lawrence Jin & Michael Kirchler & Alexander Ly , 2025. "Examining the replicability of online experiments selected by a decision market," Nature Human Behaviour, Nature, vol. 9(2), pages 316-330, February.
    2. Bossaerts, Frederik & Yadav, Nitin & Bossaerts, Peter & Nash, Chad & Todd, Torquil & Rudolf, Torsten & Hutchins, Rowena & Ponsonby, Anne-Louise & Mattingly, Karl, 2024. "Price formation in field prediction markets: The wisdom in the crowd," Journal of Financial Markets, Elsevier, vol. 68(C).
    3. Adler, Susanne Jana & Röseler, Lukas & Schöniger, Martina Katharina, 2023. "A toolbox to evaluate the trustworthiness of published findings," Journal of Business Research, Elsevier, vol. 167(C).
    4. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    5. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    6. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    7. Kasy, Maximilian & Frankel, Alexander, 2018. "Which findings should be published?," MetaArXiv mbvz3_v1, Center for Open Science.
    8. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    9. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    10. Christophe Pérignon & Olivier Akmansoy & Christophe Hurlin & Anna Dreber & Felix Holzmeister & Juergen Huber & Magnus Johanneson & Michael Kirchler & Albert Menkveld & Michael Razen & Utz Weitzel, 2022. "Reproducibility of Empirical Results: Evidence from 1,000 Tests in Finance," Working Papers hal-03810013, HAL.
    11. Fanelli, Daniele, 2020. "Metascientific reproducibility patterns revealed by informatic measure of knowledge," MetaArXiv 5vnhj, Center for Open Science.
    12. Markku Maula & Wouter Stam, 2020. "Enhancing Rigor in Quantitative Entrepreneurship Research," Entrepreneurship Theory and Practice, , vol. 44(6), pages 1059-1090, November.
    13. Frederik Bossaerts & Nitin Yadav & Peter Bossaerts & Chad Nash & Torquil Todd & Torsten Rudolf & Rowena Hutchins & Anne-Louise Ponsonby & Karl Mattingly, 2022. "Price Formation in Field Prediction Markets: the Wisdom in the Crowd," Papers 2209.08778, arXiv.org.
    14. Fišar, Miloš & Greiner, Ben & Huber, Christoph & Katok, Elena & Ozkes, Ali & Collaboration, Management Science Reproducibility, 2023. "Reproducibility in Management Science," OSF Preprints mydzv_v1, Center for Open Science.
    15. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    16. Page, Lionel & Noussair, Charles & Slonim, Robert, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," OSF Preprints 8abyu_v1, Center for Open Science.
    17. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    18. Cristina Blanco-Perez & Abel Brodeur, 2019. "Transparency in empirical economic research," IZA World of Labor, Institute of Labor Economics (IZA), pages 467-467, November.
    19. Anthony Doucouliagos & Hristos Doucouliagos & T. D. Stanley, 2024. "Power and bias in industrial relations research," British Journal of Industrial Relations, London School of Economics, vol. 62(1), pages 3-27, March.
    20. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.