
Declaring and Diagnosing Research Designs

Authors

Listed:
  • Blair, Graeme
  • Cooper, Jasper
  • Coppock, Alexander
  • Humphreys, Macartan

Abstract

Researchers need to select high-quality research designs and communicate those designs clearly to readers. Both tasks are difficult. We provide a framework for formally “declaring” the analytically relevant features of a research design in a demonstrably complete manner, with applications to qualitative, quantitative, and mixed methods research. The approach to design declaration we describe requires defining a model of the world (M), an inquiry (I), a data strategy (D), and an answer strategy (A). Declaration of these features in code provides sufficient information for researchers and readers to use Monte Carlo techniques to diagnose properties such as power, bias, accuracy of qualitative causal inferences, and other “diagnosands.” Ex ante declarations can be used to improve designs and facilitate preregistration, analysis, and reconciliation of intended and actual analyses. Ex post declarations are useful for describing, sharing, reanalyzing, and critiquing existing designs. We provide open-source software, DeclareDesign, to implement the proposed approach.
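
As a concrete illustration of the workflow the abstract describes, the sketch below declares a simple two-arm randomized experiment as an M-I-D-A object using the authors' DeclareDesign package (in R) and diagnoses it by Monte Carlo simulation. The function names follow the package's documented interface, but the specific quantities (100 units, an assumed constant treatment effect of 0.2, 500 simulations) are illustrative assumptions rather than values taken from the article, and the interface may differ across package versions.

    library(DeclareDesign)

    # M: model of the world -- 100 units with noise U and potential outcomes
    # under an assumed constant treatment effect of 0.2 (illustrative values)
    design <-
      declare_model(
        N = 100,
        U = rnorm(N),
        potential_outcomes(Y ~ 0.2 * Z + U)
      ) +
      # I: inquiry -- the average treatment effect
      declare_inquiry(ATE = mean(Y_Z_1 - Y_Z_0)) +
      # D: data strategy -- complete random assignment of half the units,
      # then measurement of the outcome revealed by that assignment
      declare_assignment(Z = complete_ra(N, prob = 0.5)) +
      declare_measurement(Y = reveal_outcomes(Y ~ Z)) +
      # A: answer strategy -- estimate the ATE by regressing Y on Z
      declare_estimator(Y ~ Z, inquiry = "ATE")

    # Monte Carlo diagnosis of properties ("diagnosands") such as bias,
    # power, and coverage
    diagnosis <- diagnose_design(design, sims = 500)
    diagnosis

Because the design is an ordinary object in code, it can be modified (for example, by changing N or the assumed effect size) and re-diagnosed before data collection, which is the ex ante improvement and preregistration use the abstract points to; the same declaration can be written ex post to reanalyze or critique an existing design.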

Suggested Citation

  • Blair, Graeme & Cooper, Jasper & Coppock, Alexander & Humphreys, Macartan, 2019. "Declaring and Diagnosing Research Designs," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, vol. 113(3), pages 838-859.
  • Handle: RePEc:zbw:espost:202031
    DOI: 10.1017/S0003055419000194

    Download full text from publisher

    File URL: https://www.econstor.eu/bitstream/10419/202031/1/f-22301-full-text-Blair-et_al-Designs-v3.pdf
    Download Restriction: no

    File URL: https://libkey.io/10.1017/S0003055419000194?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. James J. Heckman & Sergio Urzua & Edward Vytlacil, 2006. "Understanding Instrumental Variables in Models with Essential Heterogeneity," The Review of Economics and Statistics, MIT Press, vol. 88(3), pages 389-432, August.
    2. Dunning,Thad, 2012. "Natural Experiments in the Social Sciences," Cambridge Books, Cambridge University Press, number 9781107017665, November.
    3. Hug, Simon, 2013. "Qualitative Comparative Analysis: How Inductive Use and Measurement Error Lead to Problematic Inference," Political Analysis, Cambridge University Press, vol. 21(2), pages 252-265, April.
    4. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    5. Fairfield, Tasha & Charman, Andrew, 2017. "Explicit Bayesian analysis for process tracing: guidelines, opportunities, and caveats," LSE Research Online Documents on Economics 69203, London School of Economics and Political Science, LSE Library.
    6. Junni L. Zhang & Donald B. Rubin, 2003. "Estimation of Causal Effects via Principal Stratification When Some Outcomes are Truncated by "Death"," Journal of Educational and Behavioral Statistics, vol. 28(4), pages 353-368, December.
    7. Coppock, Alexander, 2019. "Avoiding Post-Treatment Bias in Audit Experiments," Journal of Experimental Political Science, Cambridge University Press, vol. 6(1), pages 1-4, April.
    8. Muller, Keith E. & Peterson, Bercedis L., 1984. "Practical methods for computing power in testing the multivariate general linear hypothesis," Computational Statistics & Data Analysis, Elsevier, vol. 2(2), pages 143-158, August.
    9. Peter M. Aronow & Cyrus Samii, 2016. "Does Regression Produce Representative Estimates of Causal Effects?," American Journal of Political Science, John Wiley & Sons, vol. 60(1), pages 250-267, January.
    10. Kosuke Imai & Gary King & Elizabeth A. Stuart, 2008. "Misunderstandings between experimentalists and observationalists about causal inference," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 171(2), pages 481-502, April.
    11. Fairfield, Tasha & Charman, Andrew E., 2017. "Explicit Bayesian Analysis for Process Tracing: Guidelines, Opportunities, and Caveats," Political Analysis, Cambridge University Press, vol. 25(3), pages 363-380, July.
    12. Imbens, Guido W. & Rubin, Donald B., 2015. "Causal Inference for Statistics, Social, and Biomedical Sciences," Cambridge Books, Cambridge University Press, number 9780521885881, September.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Karthik Muralidharan & Mauricio Romero & Kaspar Wüthrich, 2019. "Factorial Designs, Model Selection, and (Incorrect) Inference in Randomized Experiments," NBER Working Papers 26562, National Bureau of Economic Research, Inc.
    2. McKenzie, David & Mohpal, Aakash & Yang, Dean, 2022. "Aspirations and financial decisions: Experimental evidence from the Philippines," Journal of Development Economics, Elsevier, vol. 156(C).
    3. Brodeur, Abel & Esterling, Kevin & Ankel-Peters, Jörg & Bueno, Natália S & Desposato, Scott & Dreber, Anna & Genovese, Federica & Green, Donald P & Hepplewhite, Matthew & de la Guardia, Fernando Hoces, 2024. "Promoting Reproducibility and Replicability in Political Science," Department of Economics, Working Paper Series qt23n3n3dg, Department of Economics, Institute for Business and Economic Research, UC Berkeley.
    4. Dawid, Philip & Humphreys, Macartan & Musio, Monica, 2022. "Bounding Causes of Effects With Mediators," EconStor Open Access Articles and Book Chapters, ZBW - Leibniz Information Centre for Economics, issue OnlineFir, pages 1-1.
    5. Chad Hazlett & Tanvi Shinkre, 2024. "Demystifying and avoiding the OLS "weighting problem": Unmodeled heterogeneity and straightforward solutions," Papers 2403.03299, arXiv.org, revised Nov 2024.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. David A. Bateman & Dawn Langan Teele, 2020. "A developmental approach to historical causal inference," Public Choice, Springer, vol. 185(3), pages 253-279, December.
    2. Haoge Chang & Joel Middleton & P. M. Aronow, 2021. "Exact Bias Correction for Linear Adjustment of Randomized Controlled Trials," Papers 2110.08425, arXiv.org, revised Oct 2021.
    3. Clara Bicalho & Adam Bouyamourn & Thad Dunning, 2022. "Conditional Balance Tests: Increasing Sensitivity and Specificity With Prognostic Covariates," Papers 2205.10478, arXiv.org.
    4. Rocio Titiunik, 2020. "Natural Experiments," Papers 2002.00202, arXiv.org.
    5. Öberg, Stefan, 2018. "Instrumental variables based on twin births are by definition not valid (v.3.0)," SocArXiv zux9s, Center for Open Science.
    6. Sloczynski, Tymon, 2018. "A General Weighted Average Representation of the Ordinary and Two-Stage Least Squares Estimands," IZA Discussion Papers 11866, Institute of Labor Economics (IZA).
    7. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    8. Adel Daoud, 2020. "The wealth of nations and the health of populations: A quasi-experimental design of the impact of sovereign debt crises on child mortality," Papers 2012.14941, arXiv.org.
    9. Tymon Słoczyński, 2018. "Interpreting OLS Estimands When Treatment Effects Are Heterogeneous: Smaller Groups Get Larger Weights," Papers 1810.01576, arXiv.org, revised May 2020.
    10. Hyunseung Kang & Laura Peck & Luke Keele, 2018. "Inference for instrumental variables: a randomization inference approach," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 181(4), pages 1231-1254, October.
    11. Ian D. Gow & David F. Larcker & Peter C. Reiss, 2016. "Causal Inference in Accounting Research," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 477-523, May.
    12. Committee, Nobel Prize, 2021. "Answering causal questions using observational data," Nobel Prize in Economics documents 2021-2, Nobel Prize Committee.
    13. Tarek Azzam & Michael Bates & David Fairris, 2019. "Do Learning Communities Increase First Year College Retention? Testing Sample Selection and External Validity of Randomized Control Trials," Working Papers 202002, University of California at Riverside, Department of Economics.
    14. Kirk Bansak, 2021. "Estimating causal moderation effects with randomized treatments and non‐randomized moderators," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(1), pages 65-86, January.
    15. Pablo Geraldo Bastías, 2024. "Credible causal inference beyond toy models," Papers 2402.11659, arXiv.org.
    16. Alberto Abadie & Susan Athey & Guido W. Imbens & Jeffrey M. Wooldridge, 2020. "Sampling‐Based versus Design‐Based Uncertainty in Regression Analysis," Econometrica, Econometric Society, vol. 88(1), pages 265-296, January.
    17. Parker Hevron, 2018. "Judicialization and Its Effects: Experiments as a Way Forward," Laws, MDPI, vol. 7(2), pages 1-21, May.
    18. Alejandro Avenburg & John Gerring & Jason Seawright, 2023. "How do social scientists reach causal inferences? A study of reception," Quality & Quantity: International Journal of Methodology, Springer, vol. 57(1), pages 257-275, February.
    19. Tymon Słoczyński, 2022. "Interpreting OLS Estimands When Treatment Effects Are Heterogeneous: Smaller Groups Get Larger Weights," The Review of Economics and Statistics, MIT Press, vol. 104(3), pages 501-509, May.
    20. Susan Athey & Raj Chetty & Guido Imbens, 2020. "Combining Experimental and Observational Data to Estimate Treatment Effects on Long Term Outcomes," Papers 2006.09676, arXiv.org.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:zbw:espost:202031. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item and to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: ZBW - Leibniz Information Centre for Economics (email available below). General contact details of provider: https://edirc.repec.org/data/zbwkide.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.