
Testing for treatment effect twice using internal and external controls in clinical trials

Authors
  • Yi Yanyao

    (Global Statistical Sciences, Eli Lilly and Company, Indianapolis, IN 46285, United States)

  • Zhang Ying

    (Global Statistical Sciences, Eli Lilly and Company, Indianapolis, IN 46285, United States)

  • Du Yu

    (Global Statistical Sciences, Eli Lilly and Company, Indianapolis, IN 46285, United States)

  • Ye Ting

    (Department of Biostatistics, University of Washington, Seattle, WA 98195, United States)

Abstract

Leveraging external controls – relevant individual patient data under control from external trials or real-world data – has the potential to reduce the cost of randomized controlled trials (RCTs) while increasing the proportion of trial patients given access to novel treatments. However, due to lack of randomization, RCT patients and external controls may differ with respect to covariates that may or may not have been measured. Hence, after controlling for measured covariates, for instance by matching, testing for treatment effect using external controls may still be subject to unmeasured biases. In this article, we propose a sensitivity analysis approach to quantify the magnitude of unmeasured bias that would be needed to alter a study conclusion reached under the presumption that employing external controls introduces no unmeasured biases. Whether leveraging external controls increases power depends on the interplay between sample sizes and the magnitudes of the treatment effect and unmeasured biases, which may be difficult to anticipate. This motivates a combined testing procedure that performs two highly correlated analyses, one with and one without external controls, with a small correction for multiple testing using the joint distribution of the two test statistics. The combined test provides a new method of sensitivity analysis designed for data fusion problems, which anchors at the unbiased analysis based on the RCT only and spends a small proportion of the type I error to also test using the external controls. In this way, if leveraging external controls increases power, the power gain compared to the analysis based on the RCT only can be substantial; if not, the power loss is small. The proposed method is evaluated theoretically and through power calculations, and applied to a real trial.
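The correction for multiple testing described in the abstract can be sketched numerically. The following is a minimal illustration, not the authors' implementation: it assumes two one-sided z-statistics — one from the RCT-only analysis, one from the analysis pooling external controls — that are bivariate standard normal under the null with a known correlation `rho`, and the helper name `combined_critical_value` is hypothetical.

```python
from scipy.optimize import brentq
from scipy.stats import multivariate_normal, norm

def combined_critical_value(rho, alpha=0.025):
    """Critical value c such that P(Z1 > c or Z2 > c) = alpha when
    (Z1, Z2) are standard bivariate normal with correlation rho.

    Z1: statistic from the RCT-only analysis (unbiased by design).
    Z2: statistic from the analysis leveraging external controls.
    Rejecting when either statistic exceeds c controls the one-sided
    type I error at alpha, assuming Z2 is also unbiased under the null.
    """
    joint = multivariate_normal(mean=[0.0, 0.0],
                                cov=[[1.0, rho], [rho, 1.0]])
    # Size of the union rejection region minus the target level.
    excess = lambda c: 1.0 - joint.cdf([c, c]) - alpha
    # The root lies between the unadjusted z-cutoff (rho = 1 case)
    # and the Bonferroni cutoff (worst case), so bracket it there.
    return brentq(excess, norm.ppf(1 - alpha), norm.ppf(1 - alpha / 2))
```

Because the two analyses share the RCT data, `rho` is typically close to 1, so the corrected cutoff sits only slightly above the unadjusted one-sided cutoff — which is consistent with the abstract's claim that the power loss relative to the RCT-only test is small.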

Suggested Citation

  • Yi Yanyao & Zhang Ying & Du Yu & Ye Ting, 2023. "Testing for treatment effect twice using internal and external controls in clinical trials," Journal of Causal Inference, De Gruyter, vol. 11(1), pages 1-13.
  • Handle: RePEc:bpj:causin:v:11:y:2023:i:1:p:13:n:1008
    DOI: 10.1515/jci-2022-0018


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Pritchett Lant & Sandefur Justin, 2014. "Context Matters for Size: Why External Validity Claims and Development Practice do not Mix," Journal of Globalization and Development, De Gruyter, vol. 4(2), pages 161-197, March.
    2. Lant Pritchett & Justin Sandefur, 2013. "Context Matters for Size: Why External Validity Claims and Development Practice Don't Mix," Working Papers 336, Center for Global Development.
    3. Xinkun Nie & Guido Imbens & Stefan Wager, 2021. "Covariate Balancing Sensitivity Analysis for Extrapolating Randomized Trials across Locations," Papers 2112.04723, arXiv.org.
    4. Paolo Naticchioni & Silvia Loriga, 2011. "Short and Long Term Evaluations of Public Employment Services in Italy," Applied Economics Quarterly (formerly: Konjunkturpolitik), Duncker & Humblot, Berlin, vol. 57(3), pages 201-229.
    5. Leonardo Becchetti & Pierluigi Conzo & Alessandro Romeo, 2014. "Violence, trust, and trustworthiness: evidence from a Nairobi slum," Oxford Economic Papers, Oxford University Press, vol. 66(1), pages 283-305, January.
    6. Doko Tchatoka, Firmin Sabro, 2012. "Specification Tests with Weak and Invalid Instruments," MPRA Paper 40185, University Library of Munich, Germany.
    7. Fabian Kosse & Thomas Deckers & Pia Pinger & Hannah Schildberg-Hörisch & Armin Falk, 2020. "The Formation of Prosociality: Causal Evidence on the Role of Social Environment," Journal of Political Economy, University of Chicago Press, vol. 128(2), pages 434-467.
    8. Becchetti, Leonardo & Ciciretti, Rocco & Hasan, Iftekhar, 2015. "Corporate social responsibility, stakeholder risk, and idiosyncratic volatility," Journal of Corporate Finance, Elsevier, vol. 35(C), pages 297-309.
    9. Wendy Chan, 2018. "Applications of Small Area Estimation to Generalization With Subclassification by Propensity Scores," Journal of Educational and Behavioral Statistics, , vol. 43(2), pages 182-224, April.
    10. Michael A. Clemens & Claudio Montenegro & Lant Pritchett, 2016. "Bounding the Price Equivalent of Migration Barriers," Growth Lab Working Papers 67, Harvard's Growth Lab.
    11. Mequanint B. Melesse & Amos Nyangira Tirra & Yabibal M. Walle & Michael Hauser, 2023. "Understanding the Determinants of Aspirations in Rural Tanzania: Does Financial Literacy Matter?," The European Journal of Development Research, Palgrave Macmillan;European Association of Development Research and Training Institutes (EADI), vol. 35(6), pages 1294-1321, December.
    12. Colnet Bénédicte & Josse Julie & Varoquaux Gaël & Scornet Erwan, 2022. "Causal effect on a target population: A sensitivity analysis to handle missing covariates," Journal of Causal Inference, De Gruyter, vol. 10(1), pages 372-414, January.
    13. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2006. "Moving the Goalposts: Addressing Limited Overlap in the Estimation of Average Treatment Effects by Changing the Estimand," NBER Technical Working Papers 0330, National Bureau of Economic Research, Inc.
    14. Richard K. Crump & V. Joseph Hotz & Guido W. Imbens & Oscar A. Mitnik, 2009. "Dealing with limited overlap in estimation of average treatment effects," Biometrika, Biometrika Trust, vol. 96(1), pages 187-199.
    15. Battistin, Erich & Chesher, Andrew, 2014. "Treatment effect estimation with covariate measurement error," Journal of Econometrics, Elsevier, vol. 178(2), pages 707-715.
    16. Tommaso Nannicini, 2007. "Simulation-based sensitivity analysis for matching estimators," Stata Journal, StataCorp LP, vol. 7(3), pages 334-350, September.
    17. Stéphane Bonhomme & Martin Weidner, 2020. "Minimizing Sensitivity to Model Misspecification," CeMMAP working papers CWP37/20, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    18. Shen, Chung-Hua & Wu, Meng-Wen & Chen, Ting-Hsuan & Fang, Hao, 2016. "To engage or not to engage in corporate social responsibility: Empirical evidence from global banking sector," Economic Modelling, Elsevier, vol. 55(C), pages 207-225.
    19. Stéphane Bonhomme & Martin Weidner, 2022. "Minimizing sensitivity to model misspecification," Quantitative Economics, Econometric Society, vol. 13(3), pages 907-954, July.
    20. Myoung Lee & Sang Lee, 2009. "Sensitivity analysis of job-training effects on reemployment for Korean women," Empirical Economics, Springer, vol. 36(1), pages 81-107, February.
