IDEAS home Printed from https://ideas.repec.org/p/osf/metaar/apdxk_v1.html

A scoping review on metrics to quantify reproducibility: a multitude of questions leads to a multitude of metrics

Author

Listed:
  • Heyard, Rachel
  • Pawel, Samuel

    (University of Zurich)

  • Frese, Joris
  • Voelkl, Bernhard
  • Würbel, Hanno

    (University of Bern)

  • McCann, Sarah
  • Held, Leonhard
  • Wever, Kimberley E. PhD

    (Radboud university medical center)

  • Hartmann, Helena

    (University Hospital Essen)

  • Townsin, Louise

Abstract

*Background:* Reproducibility is recognized as essential to scientific progress and integrity. Replication studies and large-scale replication projects, aiming to quantify different aspects of reproducibility, have become more common. Since no standardized approach to measuring reproducibility exists, a diverse set of metrics has emerged and a comprehensive overview is needed. *Methods:* We conducted a scoping review to identify large-scale replication projects that used metrics and methodological papers that proposed or discussed metrics. The project list was compiled by the authors. For the methodological papers, we searched Scopus, MEDLINE, PsycINFO and EconLit. Records were screened in duplicate against predefined inclusion criteria. We extracted demographic information on the included records as well as information on the reproducibility metrics used, suggested or discussed. *Results:* We identified 49 large-scale projects and 97 methodological papers, and extracted 50 metrics. The metrics were characterized by type (formulas and/or statistical models, frameworks, graphical representations, studies and questionnaires, algorithms), required input, and appropriate application scenarios. Each metric addresses a distinct question. *Conclusions:* Our review provides a comprehensive resource in the form of a “live”, interactive table for future replication teams and meta-researchers, offering support in selecting the metrics most appropriate to their research questions and project goals.
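To make the notion of a "reproducibility metric" concrete, below is a minimal illustrative sketch (not taken from the paper) of two of the simplest metrics commonly discussed in this literature: significance agreement (original and replication both significant, with effects in the same direction) and whether the replication estimate falls inside the original study's 95% confidence interval. All function names and the numeric example are assumptions for illustration only.

```python
import math

def p_value(est, se):
    """Two-sided p-value from a normal z-test (z = est / se)."""
    z = est / se
    return math.erfc(abs(z) / math.sqrt(2))

def significance_agreement(orig_est, orig_se, rep_est, rep_se, alpha=0.05):
    """Both studies significant at level alpha, with effects in the same direction."""
    return (p_value(orig_est, orig_se) < alpha
            and p_value(rep_est, rep_se) < alpha
            and orig_est * rep_est > 0)

def within_original_ci(orig_est, orig_se, rep_est, z_crit=1.96):
    """Replication estimate falls inside the original 95% confidence interval."""
    return abs(rep_est - orig_est) < z_crit * orig_se

# Hypothetical study pair: original b = 0.40 (SE 0.10), replication b = 0.25 (SE 0.12)
print(significance_agreement(0.40, 0.10, 0.25, 0.12))  # True
print(within_original_ci(0.40, 0.10, 0.25))            # True
```

As the review emphasizes, each such metric answers a different question: the two checks above can easily disagree for the same study pair, which is one reason a multitude of metrics exists.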

Suggested Citation

  • Heyard, Rachel & Pawel, Samuel & Frese, Joris & Voelkl, Bernhard & Würbel, Hanno & McCann, Sarah & Held, Leonhard & Wever, Kimberley E. PhD & Hartmann, Helena & Townsin, Louise, 2024. "A scoping review on metrics to quantify reproducibility: a multitude of questions leads to a multitude of metrics," MetaArXiv apdxk_v1, Center for Open Science.
  • Handle: RePEc:osf:metaar:apdxk_v1
    DOI: 10.31219/osf.io/apdxk_v1

    Download full text from publisher

    File URL: https://osf.io/download/6745e4637ab0b3495aac21b6/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/apdxk_v1?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Leonhard Held, 2020. "A new standard for the analysis and design of replication studies," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(2), pages 431-448, February.
    2. Shirley V. Wang & Sushama Kattinakere Sreedhara & Sebastian Schneeweiss, 2022. "Reproducibility of real-world evidence studies using clinical practice data to inform regulatory and coverage decisions," Nature Communications, Nature, vol. 13(1), pages 1-11, December.
    3. Hanousek, Jan & Hajkova, Dana & Filer, Randall K., 2008. "A rise by any other name? Sensitivity of growth regressions to data source," Journal of Macroeconomics, Elsevier, vol. 30(3), pages 1188-1206, September.
    4. Andrew C. Chang & Phillip Li, 2022. "Is Economics Research Replicable? Sixty Published Papers From Thirteen Journals Say “Often Not”," Critical Finance Review, now publishers, vol. 11(1), pages 185-206, February.
    5. Charlotte Micheloud & Fadoua Balabdaoui & Leonhard Held, 2023. "Assessing replicability with the sceptical p-value: Type-I error control and sample size planning," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 77(4), pages 573-591, November.
    6. Samuel Pawel & Leonhard Held, 2022. "The sceptical Bayes factor for the assessment of replication success," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(3), pages 879-911, July.
    7. Shahram M. Amini & Christopher F. Parmeter, 2012. "Comparison Of Model Averaging Techniques: Assessing Growth Determinants," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 27(5), pages 870-876, August.
    8. Francesco Pauli, 2019. "A Statistical Model to Investigate the Reproducibility Rate Based on Replication Experiments," International Statistical Review, International Statistical Institute, vol. 87(1), pages 68-79, April.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Heyard, Rachel & Pawel, Samuel & Frese, Joris & Voelkl, Bernhard & Würbel, Hanno & McCann, Sarah & Held, Leonhard & Wever, Kimberley E. PhD & Hartmann, Helena & Townsin, Louise, 2024. "A scoping review on metrics to quantify reproducibility: a multitude of questions leads to a multitude of metrics," MetaArXiv apdxk, Center for Open Science.
    2. Konstantinos Bourazas & Guido Consonni & Laura Deldossi, 2024. "Bayesian sample size determination for detecting heterogeneity in multi-site replication studies," TEST: An Official Journal of the Spanish Society of Statistics and Operations Research, Springer;Sociedad de Estadística e Investigación Operativa, vol. 33(3), pages 697-716, September.
    3. Charlotte Micheloud & Fadoua Balabdaoui & Leonhard Held, 2023. "Assessing replicability with the sceptical p-value: Type-I error control and sample size planning," Statistica Neerlandica, Netherlands Society for Statistics and Operations Research, vol. 77(4), pages 573-591, November.
    4. Piotr Dybka & Bartosz Olesiński & Marek Rozkrut & Andrzej Torój, 2023. "Measuring the model uncertainty of shadow economy estimates," International Tax and Public Finance, Springer;International Institute of Public Finance, vol. 30(4), pages 1069-1106, August.
    5. Freuli, Francesca & Held, Leonhard & Heyard, Rachel, 2022. "Replication success under questionable research practices – a simulation study," MetaArXiv s4b65_v1, Center for Open Science.
    6. Mark F. J. Steel, 2020. "Model Averaging and Its Use in Economics," Journal of Economic Literature, American Economic Association, vol. 58(3), pages 644-719, September.
    7. Roman Horvath & Ali Elminejad & Tomas Havranek, 2020. "Publication and Identification Biases in Measuring the Intertemporal Substitution of Labor Supply," Working Papers IES 2020/32, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    8. Randall K. Filer & Dragana Stanišić, 2016. "The Effect of Terrorist Incidents on Capital Flows," Review of Development Economics, Wiley Blackwell, vol. 20(2), pages 502-513, May.
    9. Tomas Havranek & Zuzana Irsova & Lubica Laslopova & Olesia Zeynalova, 2020. "Skilled and Unskilled Labor Are Less Substitutable than Commonly Thought," Working Papers IES 2020/29, Charles University Prague, Faculty of Social Sciences, Institute of Economic Studies, revised Sep 2020.
    10. Gric, Zuzana & Bajzík, Josef & Badura, Ondřej, 2023. "Does sentiment affect stock returns? A meta-analysis across survey-based measures," International Review of Financial Analysis, Elsevier, vol. 89(C).
    11. Rockey, James & Temple, Jonathan, 2016. "Growth econometrics for agnostics and true believers," European Economic Review, Elsevier, vol. 81(C), pages 86-102.
    12. Brodeur, Abel & Esterling, Kevin & Ankel-Peters, Jörg & Bueno, Natália S & Desposato, Scott & Dreber, Anna & Genovese, Federica & Green, Donald P & Hepplewhite, Matthew & de la Guardia, Fernando Hoces, 2024. "Promoting Reproducibility and Replicability in Political Science," Department of Economics, Working Paper Series qt23n3n3dg, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    13. Schnabel, Isabel & Körner, Tobias, 2010. "Public Ownership of Banks and Economic Growth - The Role of Heterogeneity," CEPR Discussion Papers 8138, C.E.P.R. Discussion Papers.
    14. Havranek, Tomas & Irsova, Zuzana & Laslopova, Lubica & Zeynalova, Olesia, 2020. "The Elasticity of Substitution between Skilled and Unskilled Labor: A Meta-Analysis," MPRA Paper 102598, University Library of Munich, Germany.
    15. Ehrenbergerova, Dominika & Bajzik, Josef & Havranek, Tomas, 2021. "When Does Monetary Policy Sway House Prices? A Meta-Analysis," MetaArXiv npeqs_v1, Center for Open Science.
    16. Andrew C. Chang & Trace J. Levinson, 2023. "Raiders of the lost high‐frequency forecasts: New data and evidence on the efficiency of the Fed's forecasting," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(1), pages 88-104, January.
    17. Piotr Dybka & Bartosz Olesiński & Marek Rozkrut & Andrzej Torój, 2020. "Measuring the uncertainty of shadow economy estimates using Bayesian and frequentist model averaging," KAE Working Papers 2020-046, Warsaw School of Economics, Collegium of Economic Analysis.
    18. Petar Stankov, 2018. "Deregulation, Economic Growth and Growth Acceleration," Journal of Economic Development, Chung-Ang University, Department of Economics, vol. 43(4), pages 21-40, December.
    19. Henderson, Daniel J. & Papageorgiou, Chris & Parmeter, Christopher F., 2013. "Who benefits from financial development? New methods, new evidence," European Economic Review, Elsevier, vol. 63(C), pages 47-67.
    20. Dominika Ehrenbergerova & Josef Bajzik, 2020. "The Effect of Monetary Policy on House Prices - How Strong is the Transmission?," Working Papers 2020/14, Czech National Bank.

    More about this item

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:metaar:apdxk_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/metaarxiv .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.