Printed from https://ideas.repec.org/p/osf/osfxxx/n7a4d.html

Push button replication: Is impact evaluation evidence for international development verifiable?

Author

Listed:
  • Wood, Benjamin
  • Müller, Rui
  • Brown, Annette Nicole

Abstract

Objective: In past years, research audit exercises conducted across several fields of study have found a high prevalence of published empirical research that cannot be reproduced using the original dataset and software code (replication files). The failure to reproduce arises either because the original authors refuse to make replication files available or because third-party researchers are unable to produce the published results using the provided files. Both causes create a credibility challenge for empirical research, because the published findings are not verifiable. In recent years, increasing numbers of journals, funders, and academics have embraced research transparency, which should reduce the prevalence of failures to reproduce. This study reports the results of a research audit exercise, known as the push button replication (PBR) project, which tested a sample of studies published in 2014 that use similar empirical methods but span a variety of academic fields.

Methods: To draw our sample of articles, we used the 3ie Impact Evaluation Repository to identify the ten journals that published the most impact evaluations (experimental and quasi-experimental intervention studies) from low- and middle-income countries from 2010 through 2012. This set includes health, economics, and development journals. We then selected all articles in these journals published in 2014 that met the same inclusion criteria. We developed and piloted a detailed protocol for conducting push button replication and determining the level of comparability of the replication findings to the original. To ensure all materials and processes for the PBR project were transparent, we established a project site on the Open Science Framework. We divided the sample of articles across several researchers, who followed the protocol to request data and conduct the replications.
Results: Of the 109 articles in our sample, only 27 are push button replicable, meaning the provided code run on the provided dataset produces comparable findings for the key results in the published article. The authors of 59 of the articles refused to provide replication files. Thirty of these 59 articles were published in journals that had replication file requirements in 2014, meaning these articles are non-compliant with their journal requirements. For the remaining 23 articles, we confirmed that three had proprietary data, we received incomplete replication files for 15, and we found minor differences in the replication results for five. We found open data for only 14 of the articles in our sample.
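The outcome counts reported above partition the full sample, which can be verified with a short tally. This is an illustrative sketch only; the category labels are my own shorthand for the abstract's descriptions, not terms from the PBR protocol:

```python
# Illustrative tally of the push button replication (PBR) outcomes
# reported in the abstract; category names are my own shorthand.
sample_size = 109

outcomes = {
    "push_button_replicable": 27,  # provided code and data reproduce key results
    "authors_refused_files": 59,   # no replication files provided
    "proprietary_data": 3,         # confirmed proprietary datasets
    "incomplete_files": 15,        # replication files provided but incomplete
    "minor_differences": 5,        # replication results differ slightly
}

# The five categories partition the sample of 109 articles.
assert sum(outcomes.values()) == sample_size

# Of the 59 refusals, 30 appeared in journals that already had
# replication file requirements in 2014 (non-compliant articles).
non_compliant_refusals = 30
assert non_compliant_refusals <= outcomes["authors_refused_files"]

share_replicable = outcomes["push_button_replicable"] / sample_size
print(f"Replicable share: {share_replicable:.1%}")  # prints "Replicable share: 24.8%"
```

The tally confirms that the reported categories are exhaustive and mutually exclusive (27 + 59 + 3 + 15 + 5 = 109), and that only about a quarter of the sampled articles were push button replicable.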

Suggested Citation

  • Wood, Benjamin & Müller, Rui & Brown, Annette Nicole, 2018. "Push button replication: Is impact evaluation evidence for international development verifiable?," OSF Preprints n7a4d, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:n7a4d
    DOI: 10.31219/osf.io/n7a4d

    Download full text from publisher

    File URL: https://osf.io/download/5b29734804fe52000ecb92ac/
    Download Restriction: no




    Citations

    Cited by:

    1. Reed, W. Robert, 2019. "Takeaways from the special issue on The Practice of Replication," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 13, pages 1-11.
    2. Brown, Annette N. & Wood, Benjamin Douglas Kuflick, 2018. "Which tests not witch hunts: A diagnostic approach for conducting replication research," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 12, pages 1-26.
    3. Piers Steel & Sjoerd Beugelsdijk & Herman Aguinis, 2021. "The anatomy of an award-winning meta-analysis: Recommendations for authors, reviewers, and readers of meta-analytic reviews," Journal of International Business Studies, Palgrave Macmillan;Academy of International Business, vol. 52(1), pages 23-44, February.
    4. Alipourfard, Nazanin & Arendt, Beatrix & Benjamin, Daniel Jacob & Benkler, Noam & Bishop, Michael Metcalf & Burstein, Mark & Bush, Martin & Caverlee, James & Chen, Yiling & Clark, Chae, 2021. "Systematizing Confidence in Open Research and Evidence (SCORE)," SocArXiv 46mnb, Center for Open Science.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Colliard, Jean-Edouard & Hurlin, Christophe & Pérignon, Christophe, 2019. "Reproducibility Certification in Economics Research," HEC Research Papers Series 1345, HEC Paris.
    2. Valérie Orozco & Christophe Bontemps & Elise Maigné & Virginie Piguet & Annie Hofstetter & Anne Lacroix & Fabrice Levert & Jean‐Marc Rousselle, 2020. "How To Make A Pie: Reproducible Research For Empirical Economics And Econometrics," Journal of Economic Surveys, Wiley Blackwell, vol. 34(5), pages 1134-1169, December.
    3. Valérie Orozco & Christophe Bontemps & Élise Maigné & Virginie Piguet & Annie Hofstetter & Anne Marie Lacroix & Fabrice Levert & Jean-Marc Rousselle, 2017. "How to make a pie? Reproducible Research for Empirical Economics & Econometrics," Post-Print hal-01939942, HAL.
    4. Hensel, Przemysław G., 2021. "Reproducibility and replicability crisis: How management compares to psychology and economics – A systematic review of literature," European Management Journal, Elsevier, vol. 39(5), pages 577-594.
    5. Nicolas Vallois & Dorian Jullien, 2017. "Replication in experimental economics: A historical and quantitative approach focused on public good game experiments," Université Paris1 Panthéon-Sorbonne (Post-Print and Working Papers) halshs-01651080, HAL.
    6. Fišar, Miloš & Greiner, Ben & Huber, Christoph & Katok, Elena & Ozkes, Ali & Management Science Reproducibility Collaboration, 2023. "Reproducibility in Management Science," Department for Strategy and Innovation Working Paper Series 03/2023, WU Vienna University of Economics and Business.
    7. Hernández Alemán, Anastasia & León, Carmelo J., 2018. "La Réplica en el Análisis Económico Aplicado/Replication in Applied Economic Analysis," Estudios de Economia Aplicada, Estudios de Economia Aplicada, vol. 36, pages 317-332, Enero.
    8. Nicolas Vallois & Dorian Jullien, 2017. "Replication in Experimental Economics: A Historical and Quantitative Approach Focused on Public Good Game Experiments," GREDEG Working Papers 2017-21, Groupe de REcherche en Droit, Economie, Gestion (GREDEG CNRS), Université Côte d'Azur, France.
9. Sylvérie Herbert & Hautahi Kingi & Flavio Stanchi & Lars Vilhuber, 2021. "The Reproducibility of Economics Research: A Case Study," Working papers 853, Banque de France.
    10. Teresa Molina Millán & Karen Macours, 2017. "Attrition in randomized control trials: Using tracking information to correct bias," FEUNL Working Paper Series novaf:wp1702, Universidade Nova de Lisboa, Faculdade de Economia.
    11. Grolleau, Gilles & Ibanez, Lisette & Mzoughi, Naoufel, 2020. "Moral judgment of environmental harm caused by a single versus multiple wrongdoers: A survey experiment," Ecological Economics, Elsevier, vol. 170(C).
    12. Nick Huntington‐Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    13. Roy Chen & Yan Chen & Yohanes E. Riyanto, 2021. "Best practices in replication: a case study of common information in coordination games," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 2-30, March.
    14. Mark J. McCabe & Frank Mueller-Langer, 2019. "Does Data Disclosure Increase Citations? Empirical Evidence from a Natural Experiment in Leading Economics Journals," JRC Working Papers on Digital Economy 2019-02, Joint Research Centre.
    15. Amélie Charles & Olivier Darné, 2019. "Volatility estimation for Bitcoin: Replication and robustness," International Economics, CEPII research center, issue 157, pages 23-32.
    16. Alecos Papadopoulos, 2022. "Trade liberalization and growth: a quantile moderator for Hoyos’ (2021) replication study of Estevadeordal and Taylor (2013)," Empirical Economics, Springer, vol. 63(1), pages 549-563, July.
    17. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    18. Molina Millán, Teresa & Macours, Karen, 2017. "Attrition in Randomized Control Trials: Using Tracking Information to Correct Bias," IZA Discussion Papers 10711, Institute of Labor Economics (IZA).
    19. Reed, W. Robert, 2019. "Takeaways from the special issue on The Practice of Replication," Economics - The Open-Access, Open-Assessment E-Journal (2007-2020), Kiel Institute for the World Economy (IfW Kiel), vol. 13, pages 1-11.
    20. Dela‐Dem Fiankor & Fabio G. Santeramo, 2023. "Revisiting the impact of per‐unit duties on agricultural export prices," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 45(3), pages 1472-1492, September.



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.