Printed from https://ideas.repec.org/p/osf/osfxxx/bqmws_v1.html

How Much Should We Trust Modern Difference-in-Differences Estimates?

Author

Listed:
  • Weiss, Amanda

Abstract

When do modern difference-in-differences (DID)-style methods work for empirical political science? Scholars exploit the staggered roll-out of policies like election regulation, civil service reform, and healthcare across places to estimate causal effects - often using the two-way fixed effects (TWFE) estimator. However, recent literature has highlighted the TWFE estimator's bias in the presence of heterogeneous treatment effects and its tendency to make "forbidden comparisons" between treated units. In response, scholars have increasingly turned to modern DID estimators that promise greater robustness to real-world data problems. This paper asks how well these modern methods work for the empirical settings and sample sizes commonly used in political science, with the U.S. states as the running example. In particular, it provides a simulation study of the performance of seven DID methods under either constant or heterogeneous effects, in an N=50 setting that mimics the American federalism natural experiment. I find that many modern methods (1) produce confidence intervals that do not include the true average effect at the specified rate and (2) are underpowered. I show that many of the coverage problems with modern DID estimators can be addressed by using the block bootstrap to estimate standard errors. However, I also show that even where identification and estimation are straightforward, the fifty-state sample poses a power problem without large average effect sizes - at least 0.5 standard deviations. I illustrate the challenges of DID research with the fifty-state panel in the case of estimating the effects of strict voter identification laws on voter turnout.
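
As a concrete illustration of the exercise the abstract describes, the sketch below simulates a staggered-adoption panel of fifty "states" with heterogeneous treatment effects, estimates the TWFE coefficient, and computes a block (by-state) bootstrap standard error. This is a minimal Python/NumPy illustration, not the paper's replication code; the panel length, the adoption process, the degree of effect heterogeneity, and the noise scale are assumptions chosen only to make the example runnable.

    import numpy as np

    rng = np.random.default_rng(0)
    N, T = 50, 20                # 50 states; 20 periods is an assumption for illustration
    TRUE_EFFECT = 0.5            # average effect in outcome SDs (the power threshold noted above)

    def simulate_panel(rng):
        """Staggered adoption: roughly half the states adopt at a random period, the rest never do."""
        state_fe = rng.normal(0.0, 1.0, N)
        time_fe = rng.normal(0.0, 1.0, T)
        adopt = np.where(rng.random(N) < 0.5, rng.integers(5, T, N), T + 1)  # T+1 = never treated
        effect = TRUE_EFFECT + rng.normal(0.0, 0.25, N)    # heterogeneous effects across states
        s = np.repeat(np.arange(N), T)                     # state index for each row
        t = np.tile(np.arange(T), N)                       # period index for each row
        d = (t >= adopt[s]).astype(float)                  # treatment indicator
        y = state_fe[s] + time_fe[t] + d * effect[s] + rng.normal(0.0, 1.0, N * T)
        return s, t, d, y

    def twfe(s, t, d, y):
        """TWFE via dummy-variable OLS: y ~ d + state dummies + period dummies."""
        X = np.column_stack([
            d,
            (s[:, None] == np.arange(1, N)).astype(float),   # state dummies (state 0 omitted)
            (t[:, None] == np.arange(1, T)).astype(float),   # period dummies (period 0 omitted)
            np.ones_like(d),
        ])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta[0]                                       # coefficient on the treatment dummy

    s, t, d, y = simulate_panel(rng)
    point = twfe(s, t, d, y)

    # Block bootstrap: resample whole states with replacement and re-estimate each time,
    # relabeling the resampled states 0..N-1 so repeated states get distinct fixed effects.
    B = 200
    s_b = np.repeat(np.arange(N), T)
    draws = []
    for _ in range(B):
        sampled = rng.integers(0, N, N)
        idx = np.concatenate([np.where(s == j)[0] for j in sampled])
        draws.append(twfe(s_b, t[idx], d[idx], y[idx]))
    se = np.std(draws, ddof=1)
    print(f"TWFE estimate = {point:.3f}, block-bootstrap SE = {se:.3f}")

Repeating the simulate-and-estimate loop many times and recording whether confidence intervals built from these bootstrap standard errors cover the true average effect is the kind of coverage-and-power check the paper reports across estimators; the number of bootstrap draws and all data-generating parameters above are placeholders.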

Suggested Citation

  • Weiss, Amanda, 2024. "How Much Should We Trust Modern Difference-in-Differences Estimates?," OSF Preprints bqmws_v1, Center for Open Science.
  • Handle: RePEc:osf:osfxxx:bqmws_v1
    DOI: 10.31219/osf.io/bqmws_v1

    Download full text from publisher

    File URL: https://osf.io/download/66ce13425d1af01bb8b0328e/
    Download Restriction: no

    File URL: https://libkey.io/10.31219/osf.io/bqmws_v1?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to a copy you can access through your library subscription


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Weiss, Amanda, 2024. "How Much Should We Trust Modern Difference-in-Differences Estimates?," OSF Preprints bqmws, Center for Open Science.
    2. Eli Ben‐Michael & Avi Feller & Jesse Rothstein, 2022. "Synthetic controls with staggered adoption," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 84(2), pages 351-381, April.
    3. MacKinnon, James G. & Nielsen, Morten Ørregaard & Webb, Matthew D., 2023. "Cluster-robust inference: A guide to empirical practice," Journal of Econometrics, Elsevier, vol. 232(2), pages 272-299.
    4. Bai, Jushan & Choi, Sung Hoon & Liao, Yuan, 2024. "Standard errors for panel data models with unknown clusters," Journal of Econometrics, Elsevier, vol. 240(2).
    5. Hwang, Jungbin, 2021. "Simple and trustworthy cluster-robust GMM inference," Journal of Econometrics, Elsevier, vol. 222(2), pages 993-1023.
    6. Riccardo D'Adamo, 2018. "Cluster-Robust Standard Errors for Linear Regression Models with Many Controls," Papers 1806.07314, arXiv.org, revised Apr 2019.
    7. James G. MacKinnon & Morten Ørregaard Nielsen & Matthew D. Webb, 2023. "Fast and reliable jackknife and bootstrap methods for cluster‐robust inference," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 38(5), pages 671-694, August.
    8. James G. MacKinnon, 2019. "How cluster-robust inference is changing applied econometrics," Canadian Journal of Economics, Canadian Economics Association, vol. 52(3), pages 851-881, August.
    9. Roth, Jonathan & Sant’Anna, Pedro H.C. & Bilinski, Alyssa & Poe, John, 2023. "What’s trending in difference-in-differences? A synthesis of the recent econometrics literature," Journal of Econometrics, Elsevier, vol. 235(2), pages 2218-2244.
    10. Seonho Shin, 2021. "Were they a shock or an opportunity?: The heterogeneous impacts of the 9/11 attacks on refugees as job seekers—a nonlinear multi-level approach," Empirical Economics, Springer, vol. 61(5), pages 2827-2864, November.
    11. Bruno Ferman, 2019. "Assessing Inference Methods," Papers 1912.08772, arXiv.org, revised Oct 2022.
    12. James G. MacKinnon & Morten Ørregaard Nielsen & Matthew D. Webb, 2021. "Wild Bootstrap and Asymptotic Inference With Multiway Clustering," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 39(2), pages 505-519, March.
    13. Taylor K. Odle, 2022. "Free to Spend? Institutional Autonomy and Expenditures on Executive Compensation, Faculty Salaries, and Research Activities," Research in Higher Education, Springer;Association for Institutional Research, vol. 63(1), pages 1-32, February.
    14. Marson, Marta & Migheli, Matteo & Saccone, Donatella, 2023. "Free to die: Economic freedoms and influenza mortality," Economics & Human Biology, Elsevier, vol. 49(C).
    15. Jeffrey D. Michler & Anna Josephson, 2022. "Recent developments in inference: practicalities for applied economics," Chapters, in: A Modern Guide to Food Economics, chapter 11, pages 235-268, Edward Elgar Publishing.
    16. Albert Chiu & Xingchen Lan & Ziyi Liu & Yiqing Xu, 2023. "Causal Panel Analysis under Parallel Trends: Lessons from A Large Reanalysis Study," Papers 2309.15983, arXiv.org, revised Nov 2024.
    17. James G. MacKinnon & Matthew D. Webb, 2020. "When and How to Deal with Clustered Errors in Regression Models," Working Paper 1421, Economics Department, Queen's University.
    18. A. Colin Cameron & Douglas L. Miller, 2010. "Robust Inference with Clustered Data," Working Papers 106, University of California, Davis, Department of Economics.
    19. Shibashish Mukherjee & Sorin M.S. Krammer, 2024. "When the going gets tough: Board gender diversity in the wake of a major crisis," Post-Print hal-04522722, HAL.
    20. Irsova, Zuzana & Bom, Pedro R. D. & Havranek, Tomas & Rachinger, Heiko, 2023. "Spurious Precision in Meta-Analysis of Observational Research," MetaArXiv 3qp2w_v1, Center for Open Science.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:osf:osfxxx:bqmws_v1. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: OSF (email available below). General contact details of provider: https://osf.io/preprints/ .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.