
Which findings should be published?

Author

Listed:
  • Kasy, Maximilian
  • Frankel, Alexander

Abstract

Given a scarcity of journal space, what is the socially optimal rule for whether an empirical finding should be published? Suppose that the goal of publication is to inform the public about a policy-relevant state. Then journals should publish extreme results, meaning ones that move beliefs sufficiently. For specific objectives, the optimal rule can take the form of a one- or a two-sided test comparing a point estimate to the prior mean, with critical values determined by a cost-benefit analysis. An explicit consideration of future studies may additionally justify the publication of precise null results. If one insists that standard inference remain valid, however, publication must not select on the study’s findings (but may select on the study’s design).
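The publication rule described in the abstract can be sketched in a few lines. This is a minimal illustration, not code from the paper: `critical_value` stands in for the threshold that the paper's cost-benefit analysis would determine, and the function names are hypothetical.

```python
def should_publish(estimate, prior_mean, critical_value, two_sided=True):
    """Sketch of the abstract's publication rule: publish a finding
    only if it is 'extreme', i.e. it moves beliefs sufficiently far
    from the prior mean.

    critical_value is a hypothetical stand-in for the threshold a
    cost-benefit analysis would pin down; it is not taken from the paper.
    """
    deviation = estimate - prior_mean
    if two_sided:
        # Two-sided rule: publish surprising results in either direction.
        return abs(deviation) >= critical_value
    # One-sided rule: publish only sufficiently positive surprises.
    return deviation >= critical_value

# A result far from the prior mean clears the bar; one near it does not.
print(should_publish(estimate=2.5, prior_mean=0.0, critical_value=1.96))  # True
print(should_publish(estimate=0.3, prior_mean=0.0, critical_value=1.96))  # False
```

Under this rule a precise estimate close to the prior mean is rejected, which is exactly the selection on findings that, per the abstract, invalidates standard inference.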

Suggested Citation

  • Kasy, Maximilian & Frankel, Alexander, 2018. "Which findings should be published?," MetaArXiv mbvz3, Center for Open Science.
  • Handle: RePEc:osf:metaar:mbvz3
    DOI: 10.31219/osf.io/mbvz3

    Download full text from publisher

    File URL: https://osf.io/download/5bfec2f748140b0018df3e52/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/mbvz3?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    References listed on IDEAS

    1. Alberto Abadie, 2020. "Statistical Nonsignificance in Empirical Economics," American Economic Review: Insights, American Economic Association, vol. 2(2), pages 193-208, June.
    2. John P A Ioannidis, 2005. "Why Most Published Research Findings Are False," PLOS Medicine, Public Library of Science, vol. 2(8), pages 1-1, August.
    3. Michaillat, Pascal & Akerlof, George, 2017. "Beetles: Biased Promotions and Persistence of False Belief," CEPR Discussion Papers 12514, C.E.P.R. Discussion Papers.
    4. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    5. Emeric Henry & Marco Ottaviani, 2019. "Research and the Approval Process: The Organization of Persuasion," American Economic Review, American Economic Association, vol. 109(3), pages 911-955, March.
    6. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    7. Aleksey Tetenov, 2016. "An economic theory of statistical testing," CeMMAP working papers CWP50/16, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    8. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
9. van der Vaart, A. W., 2000. "Asymptotic Statistics," Cambridge Books, Cambridge University Press, number 9780521784504, January.
    10. Boyan Jovanovic, 1982. "Truthful Disclosure of Information," Bell Journal of Economics, The RAND Corporation, vol. 13(1), pages 36-44, Spring.
    11. Richard McElreath & Paul E Smaldino, 2015. "Replication, Communication, and the Population Dynamics of Scientific Discovery," PLOS ONE, Public Library of Science, vol. 10(8), pages 1-16, August.
    12. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    13. Edward L. Glaeser, 2006. "Researcher Incentives and Empirical Methods," NBER Technical Working Papers 0329, National Bureau of Economic Research, Inc.
    14. Nicola Persico, 2000. "Information Acquisition in Auctions," Econometrica, Econometric Society, vol. 68(1), pages 135-148, January.
    15. David Card & Stefano DellaVigna, 2013. "Nine Facts about Top Journals in Economics," Journal of Economic Literature, American Economic Association, vol. 51(1), pages 144-161, March.
    16. Isaiah Andrews & Jesse M. Shapiro, 2021. "A Model of Scientific Communication," Econometrica, Econometric Society, vol. 89(5), pages 2117-2142, September.
    17. Verrecchia, Robert E., 1983. "Discretionary disclosure," Journal of Accounting and Economics, Elsevier, vol. 5(1), pages 179-194, April.

    Citations

Citations are extracted by the CitEc Project.


    Cited by:

    1. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    2. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
3. Abhirup Bhunia, 2021. "Evidence-based Policy in India: Crossing the Long, Uphill Bridge," Journal of Development Policy and Practice, vol. 6(2), pages 137-143, July.
    4. Maximilian Kasy & Jann Spiess, 2024. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 1058, University of Oxford, Department of Economics.
    5. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    6. David M. Kaplan, 2019. "Unbiased Estimation as a Public Good," Working Papers 1911, Department of Economics, University of Missouri.
7. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    8. Graham Elliott & Nikolay Kudrin & Kaspar Wuthrich, 2022. "The Power of Tests for Detecting $p$-Hacking," Papers 2205.07950, arXiv.org, revised Apr 2024.
    9. Garg, Prashant & Fetzer, Thiemo, 2024. "Causal Claims in Economics," I4R Discussion Paper Series 183, The Institute for Replication (I4R).
    10. Drazen, Allan & Dreber, Anna & Ozbay, Erkut Y. & Snowberg, Erik, 2021. "Journal-based replication of experiments: An application to “Being Chosen to Lead”," Journal of Public Economics, Elsevier, vol. 202(C).
    11. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Jul 2024.
    12. Davide Viviano & Kaspar Wuthrich & Paul Niehaus, 2021. "A model of multiple hypothesis testing," Papers 2104.13367, arXiv.org, revised Apr 2024.
    13. Isaiah Andrews & Jesse M. Shapiro, 2021. "A Model of Scientific Communication," Econometrica, Econometric Society, vol. 89(5), pages 2117-2142, September.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Jul 2024.
    2. Furukawa, Chishio, 2019. "Publication Bias under Aggregation Frictions: Theory, Evidence, and a New Correction Method," EconStor Preprints 194798, ZBW - Leibniz Information Centre for Economics.
    3. Abel Brodeur & Nikolai Cook & Anthony Heyes, 2020. "Methods Matter: p-Hacking and Publication Bias in Causal Analysis in Economics," American Economic Review, American Economic Association, vol. 110(11), pages 3634-3660, November.
    4. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    5. Felix Chopra & Ingar Haaland & Christopher Roth & Andreas Stegmann, 2024. "The Null Result Penalty," The Economic Journal, Royal Economic Society, vol. 134(657), pages 193-219.
    6. Abel Brodeur & Nikolai Cook & Carina Neisser, 2024. "p-Hacking, Data type and Data-Sharing Policy," The Economic Journal, Royal Economic Society, vol. 134(659), pages 985-1018.
    7. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    8. Doucouliagos, Hristos & Hinz, Thomas & Zigova, Katarina, 2022. "Bias and careers: Evidence from the aid effectiveness literature," European Journal of Political Economy, Elsevier, vol. 71(C).
    9. Maximilian Kasy & Jann Spiess, 2024. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 1058, University of Oxford, Department of Economics.
    10. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    11. Patrick Vu, 2022. "Can the Replication Rate Tell Us About Publication Bias?," Papers 2206.15023, arXiv.org, revised Jul 2022.
    12. Bergemann, Dirk & Ottaviani, Marco, 2021. "Information Markets and Nonmarkets," CEPR Discussion Papers 16459, C.E.P.R. Discussion Papers.
    13. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    14. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
15. Maximilian Kasy & Jann Spiess, 2022. "Rationalizing Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Economics Series Working Papers 975, University of Oxford, Department of Economics.
    16. Abel Brodeur & Scott Carrell & David Figlio & Lester Lusher, 2023. "Unpacking P-hacking and Publication Bias," American Economic Review, American Economic Association, vol. 113(11), pages 2974-3002, November.
    17. Anna Dreber & Magnus Johannesson & Yifan Yang, 2024. "Selective reporting of placebo tests in top economics journals," Economic Inquiry, Western Economic Association International, vol. 62(3), pages 921-932, July.
    18. Brodeur, Abel & Cook, Nikolai & Heyes, Anthony, 2018. "Methods Matter: P-Hacking and Causal Inference in Economics," IZA Discussion Papers 11796, Institute of Labor Economics (IZA).
    19. Mueller-Langer, Frank & Fecher, Benedikt & Harhoff, Dietmar & Wagner, Gert G., 2019. "Replication studies in economics—How many and which papers are chosen for replication, and why?," Research Policy, Elsevier, vol. 48(1), pages 62-83.
    20. Cristina Blanco-Perez & Abel Brodeur, 2020. "Publication Bias and Editorial Statement on Negative Findings," The Economic Journal, Royal Economic Society, vol. 130(629), pages 1226-1247.

    More about this item

    JEL classification:

    • D61 - Microeconomics - - Welfare Economics - - - Allocative Efficiency; Cost-Benefit Analysis
    • D82 - Microeconomics - - Information, Knowledge, and Uncertainty - - - Asymmetric and Private Information; Mechanism Design
    • D83 - Microeconomics - - Information, Knowledge, and Uncertainty - - - Search; Learning; Information and Knowledge; Communication; Belief; Unawareness
    • L82 - Industrial Organization - - Industry Studies: Services - - - Entertainment; Media

