Printed from https://ideas.repec.org/p/nbr/nberwo/26993.html

In Praise of Moderation: Suggestions for the Scope and Use of Pre-Analysis Plans for RCTs in Economics

Author

Listed:
  • Abhijit Banerjee
  • Esther Duflo
  • Amy Finkelstein
  • Lawrence F. Katz
  • Benjamin A. Olken
  • Anja Sautmann

Abstract

Pre-Analysis Plans (PAPs) for randomized evaluations are becoming increasingly common in Economics, but their definition remains unclear and their practical applications therefore vary widely. Based on our collective experiences as researchers and editors, we articulate a set of principles for the ex-ante scope and ex-post use of PAPs. We argue that the key benefits of a PAP can usually be realized by completing the registration fields in the AEA RCT Registry. Specific cases where more detail may be warranted include when subgroup analysis is expected to be particularly important, or a party to the study has a vested interest. However, a strong norm for more detailed pre-specification can be detrimental to knowledge creation when implementing field experiments in the real world. An ex-post requirement of strict adherence to pre-specified plans, or the discounting of non-pre-specified work, may mean that some experiments do not take place, or that interesting observations and new theories are not explored and reported. Rather, we recommend that the final research paper be written and judged as a distinct object from the “results of the PAP”; to emphasize this distinction, researchers could consider producing a short, publicly available report (the “populated PAP”) that populates the PAP to the extent possible and briefly discusses any barriers to doing so.

Suggested Citation

  • Abhijit Banerjee & Esther Duflo & Amy Finkelstein & Lawrence F. Katz & Benjamin A. Olken & Anja Sautmann, 2020. "In Praise of Moderation: Suggestions for the Scope and Use of Pre-Analysis Plans for RCTs in Economics," NBER Working Papers 26993, National Bureau of Economic Research, Inc.
  • Handle: RePEc:nbr:nberwo:26993
    Note: DEV LS PE

    Download full text from publisher

    File URL: http://www.nber.org/papers/w26993.pdf
    Download Restriction: no

    References listed on IDEAS

    1. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    2. Alexandre Belloni & Victor Chernozhukov & Christian Hansen, 2014. "High-Dimensional Methods and Inference on Structural and Treatment Effects," Journal of Economic Perspectives, American Economic Association, vol. 28(2), pages 29-50, Spring.
    3. Humphreys, Macartan & Sanchez de la Sierra, Raul & van der Windt, Peter, 2013. "Fishing, Commitment, and Communication: A Proposal for Comprehensive Nonbinding Research Registration," Political Analysis, Cambridge University Press, vol. 21(1), pages 1-20, January.
    4. Victor Chernozhukov & Mert Demirer & Esther Duflo & Iván Fernández-Val, 2018. "Generic Machine Learning Inference on Heterogeneous Treatment Effects in Randomized Experiments, with an Application to Immunization in India," NBER Working Papers 24678, National Bureau of Economic Research, Inc.
    5. Amy Finkelstein & Sarah Taubman & Bill Wright & Mira Bernstein & Jonathan Gruber & Joseph P. Newhouse & Heidi Allen & Katherine Baicker, 2012. "The Oregon Health Insurance Experiment: Evidence from the First Year," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(3), pages 1057-1106.
    6. Alexandre Belloni & Victor Chernozhukov & Christian Hansen, 2014. "Inference on Treatment Effects after Selection among High-Dimensional Controls," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 81(2), pages 608-650.
    7. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Sarah A. Janzen & Jeffrey D. Michler, 2021. "Ulysses' pact or Ulysses' raft: Using pre‐analysis plans in experimental and nonexperimental research," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 43(4), pages 1286-1304, December.
    2. Alexandre Belloni & Victor Chernozhukov & Denis Chetverikov & Christian Hansen & Kengo Kato, 2018. "High-dimensional econometrics and regularized GMM," CeMMAP working papers CWP35/18, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    3. Alexander Frankel & Maximilian Kasy, 2022. "Which Findings Should Be Published?," American Economic Journal: Microeconomics, American Economic Association, vol. 14(1), pages 1-38, February.
    4. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    5. Stefano DellaVigna & Elizabeth Linos, 2022. "RCTs to Scale: Comprehensive Evidence From Two Nudge Units," Econometrica, Econometric Society, vol. 90(1), pages 81-116, January.
    6. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).
    7. Michael C. Knaus, 2021. "A double machine learning approach to estimate the effects of musical practice on student’s skills," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(1), pages 282-300, January.
    8. Michael C Knaus & Michael Lechner & Anthony Strittmatter, 2021. "Machine learning estimation of heterogeneous causal effects: Empirical Monte Carlo evidence," The Econometrics Journal, Royal Economic Society, vol. 24(1), pages 134-161.
    9. Christopher Snyder & Ran Zhuo, 2018. "Sniff Tests as a Screen in the Publication Process: Throwing out the Wheat with the Chaff," NBER Working Papers 25058, National Bureau of Economic Research, Inc.
    10. Andrew Y Chen & Tom Zimmermann & Jeffrey Pontiff, 2020. "Publication Bias and the Cross-Section of Stock Returns," The Review of Asset Pricing Studies, Society for Financial Studies, vol. 10(2), pages 249-289.
    11. Everding, Jakob & Marcus, Jan, 2020. "The effect of unemployment on the smoking behavior of couples," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 29(2), pages 154-170.
    12. Patrick Vu, 2022. "Can the Replication Rate Tell Us About Publication Bias?," Papers 2206.15023, arXiv.org, revised Jul 2022.
    13. Zigraiova, Diana & Havranek, Tomas & Irsova, Zuzana & Novak, Jiri, 2021. "How puzzling is the forward premium puzzle? A meta-analysis," European Economic Review, Elsevier, vol. 134(C).
    14. Briole, Simon & Gurgand, Marc & Maurin, Eric & McNally, Sandra & Ruiz-Valenzuela, Jenifer & Santín, Daniel, 2022. "The Making of Civic Virtues: A School-Based Experiment in Three Countries," IZA Discussion Papers 15141, Institute of Labor Economics (IZA).
    15. Chris Doucouliagos & Jakob de Haan & Jan-Egbert Sturm, 2022. "What drives financial development? A Meta-regression analysis [A new database of financial reforms]," Oxford Economic Papers, Oxford University Press, vol. 74(3), pages 840-868.
    16. Maximilian Kasy & Jann Spiess, 2022. "Optimal Pre-Analysis Plans: Statistical Decisions Subject to Implementability," Papers 2208.09638, arXiv.org, revised Jul 2024.
    17. Graham Elliott & Nikolay Kudrin & Kaspar Wüthrich, 2022. "Detecting p‐Hacking," Econometrica, Econometric Society, vol. 90(2), pages 887-906, March.
    18. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    19. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    20. Kasy, Maximilian & Frankel, Alexander, 2018. "Which findings should be published?," MetaArXiv mbvz3_v1, Center for Open Science.

    More about this item

    JEL classification:

    • A0 - General Economics and Teaching - - General

    NEP fields

    This paper has been announced in the following NEP Reports:

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:nbr:nberwo:26993. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the person in charge (email available below). General contact details of provider: https://edirc.repec.org/data/nberrus.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.