Printed from https://ideas.repec.org/a/spr/jesaex/v9y2023i1d10.1007_s40881-022-00123-1.html

A practical guide to Registered Reports for economists

Authors

  • Thibaut Arpinon (CREM, University of Rennes 1)
  • Romain Espinosa (CIRED, CNRS)
Abstract

The current publication system in economics has encouraged the inflation of positive results in empirical papers. Registered Reports, also called Pre-Results Reviews, are a new submission format for empirical work that takes pre-registration one step further. In a Registered Report, researchers write their paper before running the study and commit to a detailed data-collection process and analysis plan. After a first-stage review, a journal can grant an In-Principle Acceptance guaranteeing that the paper will be published if the authors carry out their data collection and analysis as pre-specified. Here we propose a practical guide to Registered Reports for empirical economists. We illustrate the major problems that Registered Reports address (p-hacking, HARKing, forking, and publication bias), and present practical guidelines on how to write and review Registered Reports (e.g., the data-analysis plan, power analysis, and correction for multiple-hypothesis testing), with accompanying R and Stata code. We provide specific examples for experimental economics, and show how research design can be improved to maximize statistical power. Lastly, we discuss some tools that authors, editors, and referees can use to evaluate Registered Reports (checklist, study-design table, and quality assessment).
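The prospective power analysis the abstract mentions can be illustrated with a minimal sketch. This is in Python rather than the paper's own R/Stata code, uses the standard normal approximation to a two-sample mean comparison (not the exact t distribution), and the effect size, alpha, and power target below are illustrative assumptions, not values from the paper:

```python
import math
from statistics import NormalDist

def two_sample_power(d: float, n_per_group: int, alpha: float = 0.05) -> float:
    """Approximate power of a two-sided, two-sample mean comparison for a
    standardized effect size d (Cohen's d), via the normal approximation.
    The negligible lower-tail rejection probability is ignored."""
    z = NormalDist()
    z_crit = z.inv_cdf(1 - alpha / 2)      # critical value for two-sided alpha
    ncp = d * math.sqrt(n_per_group / 2)   # noncentrality of the test statistic
    return z.cdf(ncp - z_crit)

def n_for_power(d: float, target: float = 0.80, alpha: float = 0.05) -> int:
    """Smallest per-group sample size reaching the target power for d."""
    n = 2
    while two_sample_power(d, n, alpha) < target:
        n += 1
    return n
```

For a medium effect (d = 0.5) at alpha = 0.05 and 80% power, this reproduces the textbook answer of roughly 63 participants per group; at the pre-registration stage, d would be set to the smallest effect size of interest rather than an optimistic estimate.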

Suggested Citation

  • Thibaut Arpinon & Romain Espinosa, 2023. "A practical guide to Registered Reports for economists," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 9(1), pages 90-122, June.
  • Handle: RePEc:spr:jesaex:v:9:y:2023:i:1:d:10.1007_s40881-022-00123-1
    DOI: 10.1007/s40881-022-00123-1

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s40881-022-00123-1
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.


As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Joseph P. Romano & Michael Wolf, 2005. "Stepwise Multiple Testing as Formalized Data Snooping," Econometrica, Econometric Society, vol. 73(4), pages 1237-1282, July.
    2. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    3. Charles Bellemare & Luc Bissonnette & Sabine Kröger, 2016. "Simulating power of economic experiments: the powerBBK package," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 2(2), pages 157-168, November.
    4. Daniele Fanelli, 2010. "“Positive” Results Increase Down the Hierarchy of the Sciences," PLOS ONE, Public Library of Science, vol. 5(4), pages 1-10, April.
    5. Benjamin A. Olken, 2015. "Promises and Perils of Pre-analysis Plans," Journal of Economic Perspectives, American Economic Association, vol. 29(3), pages 61-80, Summer.
    6. Abel Brodeur & Mathias Lé & Marc Sangnier & Yanos Zylberberg, 2016. "Star Wars: The Empirics Strike Back," American Economic Journal: Applied Economics, American Economic Association, vol. 8(1), pages 1-32, January.
    7. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.
    8. Necker, Sarah, 2014. "Scientific misbehavior in economics," Research Policy, Elsevier, vol. 43(10), pages 1747-1759.
    9. Ziliak, Stephen T. & McCloskey, Deirdre N., 2004. "Size matters: the standard error of regressions in the American Economic Review," Journal of Behavioral and Experimental Economics (formerly The Journal of Socio-Economics), Elsevier, vol. 33(5), pages 527-546, November.
    10. Damian Clarke & Joseph P. Romano & Michael Wolf, 2020. "The Romano–Wolf multiple-hypothesis correction in Stata," Stata Journal, StataCorp LP, vol. 20(4), pages 812-843, December.
    11. Eliot Abrams & Jonathan Libgober & John List, 2020. "Research Registries: Facts, Myths, and Possible Improvements," Artefactual Field Experiments 00703, The Field Experiments Website.
    12. Roy Chen & Yan Chen & Yohanes E. Riyanto, 2021. "Best practices in replication: a case study of common information in coordination games," Experimental Economics, Springer;Economic Science Association, vol. 24(1), pages 2-30, March.
    13. John A. List & Azeem M. Shaikh & Yang Xu, 2019. "Multiple hypothesis testing in experimental economics," Experimental Economics, Springer;Economic Science Association, vol. 22(4), pages 773-793, December.
    14. Nicholas Swanson & Garret Christensen & Rebecca Littman & David Birke & Edward Miguel & Elizabeth Levy Paluck & Zenan Wang, 2020. "Research Transparency Is on the Rise in Economics," AEA Papers and Proceedings, American Economic Association, vol. 110, pages 61-65, May.
    15. Edward Miguel, 2021. "Evidence on Research Transparency in Economics," Journal of Economic Perspectives, American Economic Association, vol. 35(3), pages 193-214, Summer.
    16. Valentin Amrhein & Sander Greenland & Blake McShane, 2019. "Scientists rise up against statistical significance," Nature, Nature, vol. 567(7748), pages 305-307, March.
    17. Deirdre N. McCloskey & Stephen T. Ziliak, 1996. "The Standard Error of Regressions," Journal of Economic Literature, American Economic Association, vol. 34(1), pages 97-114, March.
    18. Lionel Page & Charles N. Noussair & Robert Slonim, 2021. "The replication crisis, the rise of new research practices and what it means for experimental economics," Journal of the Economic Science Association, Springer;Economic Science Association, vol. 7(2), pages 210-225, December.
    19. Rachel Glennerster & Kudzai Takavarasha, 2013. "Running Randomized Evaluations: A Practical Guide," Economics Books, Princeton University Press, edition 1, number 10085.
    20. Paul J. Ferraro & Pallavi Shukla, 2020. "Feature—Is a Replicability Crisis on the Horizon for Environmental and Resource Economics?," Review of Environmental Economics and Policy, University of Chicago Press, vol. 14(2), pages 339-351.
    21. Christopher D. Chambers & Loukia Tzavella, 2022. "The past, present and future of Registered Reports," Nature Human Behaviour, Nature, vol. 6(1), pages 29-42, January.
    22. Camerer, Colin & Dreber, Anna & Forsell, Eskil & Ho, Teck-Hua & Huber, Jurgen & Johannesson, Magnus & Kirchler, Michael & Almenberg, Johan & Altmejd, Adam & Chan, Taizan & Heikensten, Emma & Holzmeist, 2016. "Evaluating replicability of laboratory experiments in Economics," MPRA Paper 75461, University Library of Munich, Germany.
    23. Daniele Fanelli, 2009. "How Many Scientists Fabricate and Falsify Research? A Systematic Review and Meta-Analysis of Survey Data," PLOS ONE, Public Library of Science, vol. 4(5), pages 1-11, May.
    24. Jonathan W. Schooler, 2014. "Metascience could rescue the ‘replication crisis’," Nature, Nature, vol. 515(7525), pages 9-9, November.
    25. Romano, Joseph P. & Wolf, Michael, 2016. "Efficient computation of adjusted p-values for resampling-based stepdown multiple testing," Statistics & Probability Letters, Elsevier, vol. 113(C), pages 38-40.
26. Romain Espinosa & Nicolas Treich, 2021. "Moderate Versus Radical NGOs," American Journal of Agricultural Economics, John Wiley & Sons, vol. 103(4), pages 1478-1501, August.
    27. Marjan Bakker & Coosje L S Veldkamp & Marcel A L M van Assen & Elise A V Crompvoets & How Hwee Ong & Brian A Nosek & Courtney K Soderberg & David Mellor & Jelte M Wicherts, 2020. "Ensuring the quality and specificity of preregistrations," PLOS Biology, Public Library of Science, vol. 18(12), pages 1-18, December.

    Citations

Citations are extracted by the CitEc Project.
    Cited by:

    1. Luca A. Panzone & Natasha Auch & Daniel John Zizzo, 2024. "Nudging the Food Basket Green: The Effects of Commitment and Badges on the Carbon Footprint of Food Shopping," Environmental & Resource Economics, Springer;European Association of Environmental and Resource Economists, vol. 87(1), pages 89-133, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Thibaut Arpinon & Romain Espinosa, 2023. "A Practical Guide to Registered Reports for Economists," Post-Print halshs-03897719, HAL.
    2. Heckelei, Thomas & Huettel, Silke & Odening, Martin & Rommel, Jens, 2021. "The replicability crisis and the p-value debate – what are the consequences for the agricultural and food economics community?," Discussion Papers 316369, University of Bonn, Institute for Food and Resource Economics.
    3. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Heath, Davidson & Ringgenberg, Matthew C. & Samadi, Mehrdad & Werner, Ingrid M., 2019. "Reusing Natural Experiments," Working Paper Series 2019-21, Ohio State University, Charles A. Dice Center for Research in Financial Economics.
    6. Garret Christensen & Edward Miguel, 2018. "Transparency, Reproducibility, and the Credibility of Economics Research," Journal of Economic Literature, American Economic Association, vol. 56(3), pages 920-980, September.
    7. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    8. Igor Asanov & Christoph Buehren & Panagiota Zacharodimou, 2020. "The power of experiments: How big is your n?," MAGKS Papers on Economics 202032, Philipps-Universität Marburg, Faculty of Business Administration and Economics, Department of Economics (Volkswirtschaftliche Abteilung).
    9. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.
    10. Bruns, Stephan B. & Asanov, Igor & Bode, Rasmus & Dunger, Melanie & Funk, Christoph & Hassan, Sherif M. & Hauschildt, Julia & Heinisch, Dominik & Kempa, Karol & König, Johannes & Lips, Johannes & Verb, 2019. "Reporting errors and biases in published empirical findings: Evidence from innovation research," Research Policy, Elsevier, vol. 48(9), pages 1-1.
    11. Bruno Ferman & Cristine Pinto & Vitor Possebom, 2020. "Cherry Picking with Synthetic Controls," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 39(2), pages 510-532, March.
    12. Hermes, Henning & Mierisch, Fabian & Peter, Frauke & Wiederhold, Simon & Lergetporer, Philipp, 2023. "Discrimination on the Child Care Market: A Nationwide Field Experiment," IZA Discussion Papers 16082, Institute of Labor Economics (IZA).
    13. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    14. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    15. Hermes, Henning & Krauß, Marina & Lergetporer, Philipp & Peter, Frauke & Wiederhold, Simon, 2022. "Early Child Care and Labor Supply of Lower-SES Mothers: A Randomized Controlled Trial," IZA Discussion Papers 15814, Institute of Labor Economics (IZA).
    16. Roggenkamp, Hauke C., 2024. "Revisiting ‘Growth and Inequality in Public Good Provision’—Reproducing and Generalizing Through Inconvenient Online Experimentation," OSF Preprints 6rn97, Center for Open Science.
    17. Hermes, Henning & Lergetporer, Philipp & Peter, Frauke & Wiederhold, Simon, 2021. "Behavioral Barriers and the Socioeconomic Gap in Child Care Enrollment," IZA Discussion Papers 14698, Institute of Labor Economics (IZA).
    18. Henning Hermes & Philipp Lergetporer & Frauke Peter & Simon Wiederhold, 2021. "Application Barriers and the Socioeconomic Gap in Child Care Enrollment," CESifo Working Paper Series 9282, CESifo.
    19. Fabo, Brian & Jančoková, Martina & Kempf, Elisabeth & Pástor, Ľuboš, 2021. "Fifty shades of QE: Comparing findings of central bankers and academics," Journal of Monetary Economics, Elsevier, vol. 120(C), pages 1-20.
    20. Brodeur, Abel & Cook, Nikolai & Hartley, Jonathan & Heyes, Anthony, 2022. "Do Pre-Registration and Pre-analysis Plans Reduce p-Hacking and Publication Bias?," MetaArXiv uxf39, Center for Open Science.

    More about this item

    Keywords

    Registered Reports; Practical guide; Pre-registration; p-hacking; HARKing; Multiple-hypothesis testing; Power analysis; The smallest effect size of interest;

    JEL classification:

    • A10 - General Economics and Teaching - - General Economics - - - General
    • C12 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods and Methodology: General - - - Hypothesis Testing: General
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.