
How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics

Author

Listed:
  • Coville, Aidan
  • Vivalt, Eva

Abstract

Under-powered studies combined with low prior beliefs about intervention effects increase the chances that a positive result is overstated. We collect prior beliefs about intervention impacts from 125 experts to estimate the false positive and false negative report probabilities (FPRP and FNRP) as well as Type S (sign) and Type M (magnitude) errors for studies in development economics. We find that the large majority of studies in our sample are generally credible. We discuss how more systematic collection and use of prior expectations could help improve the literature.
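The quantities named in the abstract can be illustrated with a small Monte Carlo sketch. This is a hypothetical example, not the authors' actual estimation procedure: it assumes an estimator distributed Normal(true_effect, se), a two-sided 5% test, and a single prior probability `prior_null` that the true effect is zero (the paper instead elicits full prior distributions from experts). Type S error is the share of significant estimates with the wrong sign, Type M (exaggeration ratio) is the average absolute significant estimate relative to the true effect, and FPRP follows the standard formula combining the prior, the test size, and power.

```python
import random
from statistics import mean

def design_errors(true_effect, se, prior_null=0.5, alpha=0.05,
                  n_sims=200_000, seed=0):
    """Monte Carlo sketch of power, Type S, Type M, and FPRP for a
    study whose estimator is ~ Normal(true_effect, se)."""
    rng = random.Random(seed)
    z_crit = 1.96  # two-sided 5% critical value
    sig = []       # significant estimates, simulated under the true effect
    for _ in range(n_sims):
        est = rng.gauss(true_effect, se)
        if abs(est) > z_crit * se:
            sig.append(est)
    power = len(sig) / n_sims
    # Type S: probability a significant estimate has the wrong sign.
    type_s = mean(e * true_effect < 0 for e in sig)
    # Type M: expected exaggeration of a significant estimate.
    type_m = mean(abs(e) for e in sig) / abs(true_effect)
    # FPRP: share of significant results that are false positives,
    # given the prior probability that the true effect is zero.
    fprp = alpha * prior_null / (alpha * prior_null + power * (1 - prior_null))
    return power, type_s, type_m, fprp

# An under-powered design (true effect equal to its standard error):
power, type_s, type_m, fprp = design_errors(true_effect=1.0, se=1.0)
```

With `true_effect = se`, power is only about 0.17, so even a skeptical 50-50 prior pushes the FPRP well above the nominal 5% level and significant estimates overstate the true effect by a factor of roughly two, which is the mechanism the abstract describes.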

Suggested Citation

  • Coville, Aidan & Vivalt, Eva, 2017. "How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics," MetaArXiv 5nsh3, Center for Open Science.
  • Handle: RePEc:osf:metaar:5nsh3
    DOI: 10.31219/osf.io/5nsh3

    Download full text from publisher

    File URL: https://osf.io/download/59910090b83f690252e29b22/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/5nsh3?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a version of this item that you can access through your library subscription

    References listed on IDEAS

    1. Sarojini Hirshleifer & David McKenzie & Rita Almeida & Cristobal Ridao‐Cano, 2016. "The Impact of Vocational Training for the Unemployed: Experimental Evidence from Turkey," Economic Journal, Royal Economic Society, vol. 126(597), pages 2115-2146, November.
    2. Eva Vivalt, 2020. "How Much Can We Generalize From Impact Evaluations?," Journal of the European Economic Association, European Economic Association, vol. 18(6), pages 3045-3089.
    3. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    4. David McKenzie & Christopher Woodruff, 2014. "What Are We Learning from Business Training and Entrepreneurship Evaluations around the Developing World?," The World Bank Research Observer, World Bank, vol. 29(1), pages 48-82.
    5. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jürgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeister, Felix, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    6. Matthew Groh & Nandini Krishnan & David McKenzie & Tara Vishwanath, 2016. "The impact of soft skills training on female youth employment: evidence from a randomized experiment in Jordan," IZA Journal of Labor & Development, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 5(1), pages 1-23, December.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    2. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    3. Fox, Louise & Kaul, Upaasna, 2018. "The evidence is in: how should youth employment programs in low-income countries be designed?," Policy Research Working Paper Series 8500, The World Bank.
    4. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    5. Beber, Bernd & Dworschak, Regina & Lakemann, Tabea & Lay, Jann & Priebe, Jan, 2021. "Skills Development and Training Interventions in Africa: Findings, Challenges, and Opportunities," RWI Projektberichte, RWI - Leibniz-Institut für Wirtschaftsforschung, number 247426.
    6. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    7. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    8. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    9. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    10. Calderone, Margherita & Fiala, Nathan & Melyoki, Lemayon Lemilia & Schoofs, Annekathrin & Steinacher, Rachel, 2022. "Making intense skills training work at scale: Evidence on business and labor market outcomes in Tanzania," Ruhr Economic Papers 950, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    11. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    12. Girum Abebe & Stefano Caria & Marcel Fafchamps & Paolo Falco & Simon Franklin & Simon Quinn, 2016. "Curse of Anonymity or Tyranny of Distance? The Impacts of Job-Search Support in Urban Ethiopia," CSAE Working Paper Series 2016-10, Centre for the Study of African Economies, University of Oxford.
    13. Lucas C. Coffman & Muriel Niederle & Alistair J. Wilson, 2017. "A Proposal to Organize and Promote Replications," American Economic Review, American Economic Association, vol. 107(5), pages 41-45, May.
    14. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    15. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    16. Dreber, Anna & Johannesson, Magnus, 2023. "A framework for evaluating reproducibility and replicability in economics," Ruhr Economic Papers 1055, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    17. Maitra, Pushkar & Mani, Subha, 2017. "Learning and earning: Evidence from a randomized evaluation in India," Labour Economics, Elsevier, vol. 45(C), pages 116-130.
    18. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    19. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    20. Marcel Fafchamps & Julien Labonne, 2016. "Using Split Samples to Improve Inference about Causal Effects," NBER Working Papers 21842, National Bureau of Economic Research, Inc.


    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.