
How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics

Author

Listed:
  • Coville, Aidan
  • Vivalt, Eva

Abstract

Under-powered studies combined with low prior beliefs about intervention effects increase the chances that a positive result is overstated. We collect prior beliefs about intervention impacts from 125 experts to estimate the false positive and false negative report probabilities (FPRP and FNRP) as well as Type S (sign) and Type M (magnitude) errors for studies in development economics. We find that the large majority of studies in our sample are generally credible. We discuss how more systematic collection and use of prior expectations could help improve the literature.
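The false positive and false negative report probabilities mentioned in the abstract follow from applying Bayes' rule to a test's significance level, its power, and the prior probability that the intervention has a real effect. A minimal sketch of this relationship, with illustrative numbers that are not taken from the paper:

```python
# Illustrative sketch of the FPRP/FNRP concepts from the abstract.
# All priors, power levels, and significance thresholds below are
# made-up examples, not estimates from the paper.

def fprp(prior_true: float, power: float, alpha: float = 0.05) -> float:
    """P(no true effect | significant result)."""
    prior_null = 1.0 - prior_true
    false_pos = alpha * prior_null   # significant despite no effect
    true_pos = power * prior_true    # significant with a real effect
    return false_pos / (false_pos + true_pos)

def fnrp(prior_true: float, power: float, alpha: float = 0.05) -> float:
    """P(true effect | non-significant result)."""
    prior_null = 1.0 - prior_true
    false_neg = (1.0 - power) * prior_true  # missed real effect
    true_neg = (1.0 - alpha) * prior_null   # correctly non-significant
    return false_neg / (false_neg + true_neg)

# A low prior belief (10%) combined with low power (30%) inflates
# the FPRP relative to a well-powered study of a plausible effect:
print(fprp(prior_true=0.1, power=0.3))
print(fprp(prior_true=0.5, power=0.8))
```

This captures the abstract's opening claim: when power is low and prior beliefs about the effect are pessimistic, a statistically significant result is much more likely to be a false positive.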

Suggested Citation

  • Coville, Aidan & Vivalt, Eva, 2017. "How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics," MetaArXiv 5nsh3_v1, Center for Open Science.
  • Handle: RePEc:osf:metaar:5nsh3_v1
    DOI: 10.31219/osf.io/5nsh3_v1

    Download full text from publisher

    File URL: https://osf.io/download/59910090b83f690252e29b22/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/5nsh3_v1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. David McKenzie & Christopher Woodruff, 2014. "What Are We Learning from Business Training and Entrepreneurship Evaluations around the Developing World?," The World Bank Research Observer, World Bank, vol. 29(1), pages 48-82.
    2. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    3. Matthew Groh & Nandini Krishnan & David McKenzie & Tara Vishwanath, 2016. "The impact of soft skills training on female youth employment: evidence from a randomized experiment in Jordan," IZA Journal of Labor & Development, Springer;Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 5(1), pages 1-23, December.
    4. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Coville, Aidan & Vivalt, Eva, 2017. "How Often Should We Believe Positive Results? Assessing the Credibility of Research Findings in Development Economics," MetaArXiv 5nsh3, Center for Open Science.
    2. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    3. Calderone, Margherita & Fiala, Nathan & Melyoki, Lemayon Lemilia & Schoofs, Annekathrin & Steinacher, Rachel, 2022. "Making intense skills training work at scale: Evidence on business and labor market outcomes in Tanzania," Ruhr Economic Papers 950, RWI - Leibniz-Institut für Wirtschaftsforschung, Ruhr-University Bochum, TU Dortmund University, University of Duisburg-Essen.
    4. Gechert, Sebastian & Mey, Bianka & Opatrny, Matej & Havranek, Tomas & Stanley, T. D. & Bom, Pedro R. D. & Doucouliagos, Hristos & Heimberger, Philipp & Irsova, Zuzana & Rachinger, Heiko J., 2023. "Conventional Wisdom, Meta-Analysis, and Research Revision in Economics," EconStor Preprints 280745, ZBW - Leibniz Information Centre for Economics.
    5. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    6. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    7. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    8. Kasy, Maximilian & Frankel, Alexander, 2018. "Which findings should be published?," MetaArXiv mbvz3_v1, Center for Open Science.
    9. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    10. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    11. Fernando Hoces de la Guardia & Sean Grant & Edward Miguel, 2021. "A framework for open policy analysis," Science and Public Policy, Oxford University Press, vol. 48(2), pages 154-163.
    12. Christoph Siemroth, 2024. "Economics Peer-Review: Problems, Recent Developments, and Reform Proposals," The American Economist, Sage Publications, vol. 69(2), pages 241-258, October.
    13. Thibaut Arpinon & Marianne Lefebvre, 2024. "Registered Reports and Associated Benefits for Agricultural Economics," Post-Print hal-04635986, HAL.
    14. Kaiser, Tim & Lusardi, Annamaria & Menkhoff, Lukas & Urban, Carly, 2022. "Financial education affects financial knowledge and downstream behaviors," Journal of Financial Economics, Elsevier, vol. 145(2), pages 255-272.
    15. Fišar, Miloš & Greiner, Ben & Huber, Christoph & Katok, Elena & Ozkes, Ali & Management Science Reproducibility Collaboration, 2023. "Reproducibility in Management Science," Department for Strategy and Innovation Working Paper Series 03/2023, WU Vienna University of Economics and Business.
    16. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    17. Page, Lionel & Noussair, Charles & Slonim, Robert, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," OSF Preprints 8abyu_v1, Center for Open Science.
    18. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    19. Cristina Blanco-Perez & Abel Brodeur, 2019. "Transparency in empirical economic research," IZA World of Labor, Institute of Labor Economics (IZA), pages 467-467, November.
    20. Kasy, Maximilian & Andrews, Isaiah, 2018. "Identification of and correction for publication bias," MetaArXiv 49yst_v1, Center for Open Science.

