IDEAS home Printed from https://ideas.repec.org/a/bla/jorssa/v184y2021i1p227-254.html

Did you conduct a sensitivity analysis? A new weighting‐based approach for evaluations of the average treatment effect for the treated

Author

Listed:
  • Guanglei Hong
  • Fan Yang
  • Xu Qin

Abstract

In non‐experimental research, a sensitivity analysis helps determine whether a causal conclusion could be easily reversed in the presence of hidden bias. A new approach to sensitivity analysis on the basis of weighting extends and supplements propensity score weighting methods for identifying the average treatment effect for the treated (ATT). In its essence, the discrepancy between a new weight that adjusts for the omitted confounders and an initial weight that omits them captures the role of the confounders. This strategy is appealing for a number of reasons including that, regardless of how complex the data generation functions are, the number of sensitivity parameters remains small and their forms never change. A graphical display of the sensitivity parameter values facilitates a holistic assessment of the dominant potential bias. An application to the well‐known LaLonde data lays out the implementation procedure and illustrates its broad utility. The data offer a prototypical example of non‐experimental evaluations of the average impact of job training programmes for the participant population.
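The core idea the abstract describes — comparing an initial weight that omits a confounder with a new weight that adjusts for it — can be illustrated with a small simulation. The sketch below is not the authors' estimator (their method parameterizes the weight discrepancy with a small set of sensitivity parameters rather than observing the confounder); it simply shows, under assumed data-generating choices (coefficients, a single omitted confounder U), how ATT weighting that omits U yields a biased estimate while weighting on the full propensity score recovers the true effect.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Observed covariate X and an omitted confounder U (both hypothetical)
x = rng.normal(size=n)
u = rng.normal(size=n)

# Treatment assignment depends on both X and U
p_full = 1 / (1 + np.exp(-(0.5 * x + 0.8 * u)))
z = rng.binomial(1, p_full)

# Outcome: true treatment effect is 2.0; U also raises the outcome
y = 2.0 * z + x + u + rng.normal(size=n)

def att_weighted(y, z, p):
    """Hajek-style ATT estimate: treated units get weight 1,
    control units get the odds p / (1 - p)."""
    w = p / (1 - p)
    treated_mean = y[z == 1].mean()
    control_mean = np.average(y[z == 0], weights=w[z == 0])
    return treated_mean - control_mean

# Initial weight: based on P(Z=1 | X) only, marginalizing out U.
# Approximated here by Monte Carlo over draws of U.
u_draws = rng.normal(size=(400, 1))
p_x = (1 / (1 + np.exp(-(0.5 * x + 0.8 * u_draws)))).mean(axis=0)

att_initial = att_weighted(y, z, p_x)     # omits U: biased
att_adjusted = att_weighted(y, z, p_full) # adjusts for U

print(f"ATT omitting U:  {att_initial:.2f}")
print(f"ATT adjusting U: {att_adjusted:.2f}  (true effect = 2.00)")
```

Because U pushes both treatment take-up and the outcome upward, the initial weight fails to balance U between the treated group and the weighted controls, and the omitting-U estimate overstates the effect; the gap between the two weighting schemes is exactly the kind of discrepancy the paper's sensitivity parameters are designed to bound.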

Suggested Citation

  • Guanglei Hong & Fan Yang & Xu Qin, 2021. "Did you conduct a sensitivity analysis? A new weighting‐based approach for evaluations of the average treatment effect for the treated," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(1), pages 227-254, January.
  • Handle: RePEc:bla:jorssa:v:184:y:2021:i:1:p:227-254
    DOI: 10.1111/rssa.12621

    Download full text from publisher

    File URL: https://doi.org/10.1111/rssa.12621
    Download Restriction: no

    File URL: https://libkey.io/10.1111/rssa.12621?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Manski, Charles F, 1990. "Nonparametric Bounds on Treatment Effects," American Economic Review, American Economic Association, vol. 80(2), pages 319-323, May.
    2. LaLonde, Robert J, 1986. "Evaluating the Econometric Evaluations of Training Programs with Experimental Data," American Economic Review, American Economic Association, vol. 76(4), pages 604-620, September.
    3. Andrea Ichino & Fabrizia Mealli & Tommaso Nannicini, 2008. "From temporary help jobs to permanent employment: what can we learn from matching estimators and their sensitivity?," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 23(3), pages 305-327.
    4. Guido W. Imbens, 2003. "Sensitivity to Exogeneity Assumptions in Program Evaluation," American Economic Review, American Economic Association, vol. 93(2), pages 126-132, May.
    5. Heckman, James, 2013. "Sample selection bias as a specification error," Applied Econometrics, Russian Presidential Academy of National Economy and Public Administration (RANEPA), vol. 31(3), pages 129-137.
    6. Newey, Whitney K., 1984. "A method of moments interpretation of sequential estimators," Economics Letters, Elsevier, vol. 14(2-3), pages 201-206.
    7. Peng Ding & Tyler J. VanderWeele, 2014. "Generalized Cornfield conditions for the risk difference," Biometrika, Biometrika Trust, vol. 101(4), pages 971-977.
    8. Petra E. Todd & Jeffrey A. Smith, 2001. "Reconciling Conflicting Evidence on the Performance of Propensity-Score Matching Methods," American Economic Review, American Economic Association, vol. 91(2), pages 112-118, May.
    9. Ashenfelter, Orley & Card, David, 1985. "Using the Longitudinal Structure of Earnings to Estimate the Effect of Training Programs," The Review of Economics and Statistics, MIT Press, vol. 67(4), pages 648-660, November.
    10. Heckman, James J. & Vytlacil, Edward J., 2007. "Econometric Evaluation of Social Programs, Part I: Causal Models, Structural Models and Econometric Policy Evaluation," Handbook of Econometrics, in: J.J. Heckman & E.E. Leamer (ed.), Handbook of Econometrics, edition 1, volume 6, chapter 70, Elsevier.
    11. Xu Qin & Guanglei Hong & Jonah Deutsch & Edward Bein, 2019. "Multisite causal mediation analysis in the presence of complex sample and survey designs and non‐random non‐response," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 182(4), pages 1343-1370, October.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    2. Alberto Abadie & Guido W. Imbens, 2002. "Simple and Bias-Corrected Matching Estimators for Average Treatment Effects," NBER Technical Working Papers 0283, National Bureau of Economic Research, Inc.
    3. Peter Hull & Michal Kolesár & Christopher Walters, 2022. "Labor by design: contributions of David Card, Joshua Angrist, and Guido Imbens," Scandinavian Journal of Economics, Wiley Blackwell, vol. 124(3), pages 603-645, July.
    4. Matthew A. Masten & Alexandre Poirier & Linqi Zhang, 2024. "Assessing Sensitivity to Unconfoundedness: Estimation and Inference," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 42(1), pages 1-13, January.
    5. David H. Dean & Robert C. Dolan & Robert M. Schmidt, 1999. "Evaluating the Vocational Rehabilitation Program Using Longitudinal Data," Evaluation Review, , vol. 23(2), pages 162-189, April.
    6. Susan Athey & Guido W. Imbens, 2017. "The State of Applied Econometrics: Causality and Policy Evaluation," Journal of Economic Perspectives, American Economic Association, vol. 31(2), pages 3-32, Spring.
    7. Guido W. Imbens, 2004. "Nonparametric Estimation of Average Treatment Effects Under Exogeneity: A Review," The Review of Economics and Statistics, MIT Press, vol. 86(1), pages 4-29, February.
    8. Jones A.M & Rice N, 2009. "Econometric Evaluation of Health Policies," Health, Econometrics and Data Group (HEDG) Working Papers 09/09, HEDG, c/o Department of Economics, University of York.
    9. van der Klaauw, Bas, 2014. "From micro data to causality: Forty years of empirical labor economics," Labour Economics, Elsevier, vol. 30(C), pages 88-97.
    10. James J. Heckman, 2005. "Micro Data, Heterogeneity and the Evaluation of Public Policy Part 2," The American Economist, Sage Publications, vol. 49(1), pages 16-44, March.
    11. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    12. Sascha O. Becker & Marco Caliendo, 2007. "Sensitivity analysis for average treatment effects," Stata Journal, StataCorp LP, vol. 7(1), pages 71-83, February.
    13. Tommaso Nannicini, 2007. "Simulation-based sensitivity analysis for matching estimators," Stata Journal, StataCorp LP, vol. 7(3), pages 334-350, September.
    14. Sokbae Lee & Bernard Salanié, 2018. "Identifying Effects of Multivalued Treatments," Econometrica, Econometric Society, vol. 86(6), pages 1939-1963, November.
    15. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.
    16. Regner, Hakan, 2002. "A nonexperimental evaluation of training programs for the unemployed in Sweden," Labour Economics, Elsevier, vol. 9(2), pages 187-206, April.
    17. Metcalf, Charles E., 1997. "The Advantages of Experimental Designs for Evaluating Sex Education Programs," Children and Youth Services Review, Elsevier, vol. 19(7), pages 507-523, November.
    18. Xu Qin & Jonah Deutsch & Guanglei Hong, 2021. "Unpacking Complex Mediation Mechanisms And Their Heterogeneity Between Sites In A Job Corps Evaluation," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 40(1), pages 158-190, January.
    19. Guido W. Imbens, 2022. "Causality in Econometrics: Choice vs Chance," Econometrica, Econometric Society, vol. 90(6), pages 2541-2566, November.

    More about this item

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:bla:jorssa:v:184:y:2021:i:1:p:227-254. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Wiley Content Delivery (email available below). General contact details of provider: https://edirc.repec.org/data/rssssea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.