
Causal Decision Making and Causal Effect Estimation Are Not the Same…and Why It Matters

Author

Listed:
  • Carlos Fernández-Loría

    (Department of Information Systems, Business Statistics, and Operations Management, HKUST Business School, Hong Kong University of Science and Technology, New Territories, Hong Kong)

  • Foster Provost

    (Department of Technology, Operations, and Statistics, NYU Stern School of Business, New York University, New York, New York 10012; Compass Inc., New York, New York 10011)

Abstract

Causal decision making (CDM) at scale has become a routine part of business, and increasingly, CDM is based on statistical models and machine learning algorithms. Businesses algorithmically target offers, incentives, and recommendations to affect consumer behavior. Recently, we have seen an acceleration of research related to CDM and causal effect estimation (CEE) using machine-learned models. This article highlights an important perspective: CDM is not the same as CEE, and counterintuitively, accurate CEE is not necessary for accurate CDM. Our experience is that this is not well understood by practitioners or most researchers. Technically, the estimand of interest is different, and this has important implications both for modeling and for the use of statistical models for CDM. We draw on recent research to highlight three implications. (1) We should carefully consider the objective function of the causal machine learning, and if possible, optimize for accurate “treatment assignment” rather than for accurate effect-size estimation. (2) Confounding affects CDM and CEE differently. The upshot here is that for supporting CDM it may be just as good or even better to learn with confounded data as with unconfounded data. (3) Causal statistical modeling may not be necessary at all to support CDM because a proxy target for statistical modeling might do as well or better. This third observation helps to explain at least one broad common CDM practice that seems “wrong” at first blush—the widespread use of noncausal models for targeting interventions. The last two implications are particularly important in practice, as acquiring (unconfounded) data on both “sides” of the counterfactual for modeling can be quite costly and often impracticable. These observations open substantial research ground. We hope to facilitate research in this area by pointing to related articles from multiple contributing fields, most of them written in the last five years.
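The abstract's central point, that accurate effect-size estimation is not necessary for accurate treatment assignment, can be illustrated with a minimal simulation. The sketch below is not from the article; the data-generating process, the biased estimator, and all numbers are arbitrary assumptions chosen only to show that an estimator with poor effect-size accuracy (high MSE) can still support near-perfect treat/don't-treat decisions, because only the sign of the estimated effect drives the decision.

```python
# Illustrative sketch (assumptions, not the authors' method): a treatment-effect
# estimator that is badly calibrated for effect-SIZE estimation can still support
# accurate treatment-assignment decisions, since the decision depends only on the
# sign of the estimated effect.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
x = rng.uniform(-1, 1, size=n)            # a single covariate
tau = 2.0 * x                             # assumed true individual treatment effect (CATE)

# A deliberately biased estimator: it shrinks effects toward zero and adds noise,
# so its effect-size estimates have high error...
tau_hat = 0.3 * tau + rng.normal(scale=0.2, size=n)
mse = np.mean((tau_hat - tau) ** 2)

# ...but the decision "treat if the estimated effect is positive" still agrees
# with the decision based on the true effect for most units.
decision_true = tau > 0
decision_hat = tau_hat > 0
decision_accuracy = np.mean(decision_true == decision_hat)

print(f"effect-size MSE:        {mse:.3f}")              # large relative to the effect scale
print(f"treatment-decision acc: {decision_accuracy:.3f}") # close to 1
```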

Suggested Citation

  • Carlos Fernández-Loría & Foster Provost, 2022. "Causal Decision Making and Causal Effect Estimation Are Not the Same…and Why It Matters," INFORMS Journal on Data Science, INFORMS, vol. 1(1), pages 4-16, April.
  • Handle: RePEc:inm:orijds:v:1:y:2022:i:1:p:4-16
    DOI: 10.1287/ijds.2021.0006

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/ijds.2021.0006
    Download Restriction: no

    File URL: https://libkey.io/10.1287/ijds.2021.0006?utm_source=ideas
    LibKey link: If access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    References listed on IDEAS

    1. Yingqi Zhao & Donglin Zeng & A. John Rush & Michael R. Kosorok, 2012. "Estimating Individualized Treatment Rules Using Outcome Weighted Learning," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 107(499), pages 1106-1118, September.
    2. Duncan Simester & Artem Timoshenko & Spyros I. Zoumpoulis, 2020. "Efficiently Evaluating Targeting Policies: Improving on Champion vs. Challenger Experiments," Management Science, INFORMS, vol. 66(8), pages 3412-3424, August.
    3. Maytal Saar-Tsechansky & Foster Provost, 2007. "Decision-Centric Active Learning of Binary-Outcome Models," Information Systems Research, INFORMS, vol. 18(1), pages 4-22, March.
    4. Stefan Wager & Susan Athey, 2018. "Estimation and Inference of Heterogeneous Treatment Effects using Random Forests," Journal of the American Statistical Association, Taylor & Francis Journals, vol. 113(523), pages 1228-1242, July.
    5. Susan Athey & Raj Chetty & Guido Imbens, 2020. "Combining Experimental and Observational Data to Estimate Treatment Effects on Long Term Outcomes," Papers 2006.09676, arXiv.org.
    6. Brett R. Gordon & Florian Zettelmeyer & Neha Bhargava & Dan Chapsky, 2019. "A Comparison of Approaches to Advertising Measurement: Evidence from Big Field Experiments at Facebook," Marketing Science, INFORMS, vol. 38(2), pages 193-225, March.
    7. Debopam Bhattacharya & Pascaline Dupas, 2012. "Inferring welfare maximizing treatment assignment under budget constraints," Journal of Econometrics, Elsevier, vol. 167(1), pages 168-196.
    8. Charles F. Manski, 2004. "Statistical Treatment Rules for Heterogeneous Populations," Econometrica, Econometric Society, vol. 72(4), pages 1221-1246, July.
    9. Susan Athey & Raj Chetty & Guido Imbens & Hyunseung Kang, 2016. "Estimating Treatment Effects using Multiple Surrogates: The Role of the Surrogate Score and the Surrogate Index," Papers 1603.09326, arXiv.org, revised Aug 2024.
    10. Aurélie Lemmens & Sunil Gupta, 2020. "Managing Churn to Maximize Profits," Marketing Science, INFORMS, vol. 39(5), pages 956-973, September.
    11. Susan Athey & Guido W. Imbens, 2019. "Machine Learning Methods Economists Should Know About," Research Papers 3776, Stanford University, Graduate School of Business.
    12. Susan Athey & Stefan Wager, 2021. "Policy Learning With Observational Data," Econometrica, Econometric Society, vol. 89(1), pages 133-161, January.
    13. Tyler J. VanderWeele, 2013. "Surrogate Measures and Consistent Surrogates," Biometrics, The International Biometric Society, vol. 69(3), pages 561-565, September.
    14. Ali Tafti & Galit Shmueli, 2020. "Beyond Overall Treatment Effects: Leveraging Covariates in Randomized Experiments Guided by Causal Structure," Information Systems Research, INFORMS, vol. 31(4), pages 1183-1199, December.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Philipp Schwarz & Oliver Schacht & Sven Klaassen & Daniel Grunbaum & Sebastian Imhof & Martin Spindler, 2024. "Management Decisions in Manufacturing using Causal Machine Learning -- To Rework, or not to Rework?," Papers 2406.11308, arXiv.org.
    2. Margrét Vilborg Bjarnadóttir & Louiqa Raschid, 2023. "Modeling Financial Products and Their Supply Chains," INFORMS Journal on Data Science, INFORMS, vol. 2(2), pages 138-160, October.
    3. Yu Xia & Ali Arian & Sriram Narayanamoorthy & Joshua Mabry, 2023. "RetailSynth: Synthetic Data Generation for Retail AI Systems Evaluation," Papers 2312.14095, arXiv.org.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Carlos Fernández-Loría & Foster Provost & Jesse Anderton & Benjamin Carterette & Praveen Chandar, 2023. "A Comparison of Methods for Treatment Assignment with an Application to Playlist Generation," Information Systems Research, INFORMS, vol. 34(2), pages 786-803, June.
    2. Augustine Denteh & Helge Liebert, 2022. "Who Increases Emergency Department Use? New Insights from the Oregon Health Insurance Experiment," Working Papers 2201, Tulane University, Department of Economics.
    3. Christopher Adjaho & Timothy Christensen, 2022. "Externally Valid Policy Choice," Papers 2205.05561, arXiv.org, revised Jul 2023.
    4. Yan Liu, 2022. "Policy Learning under Endogeneity Using Instrumental Variables," Papers 2206.09883, arXiv.org, revised Mar 2024.
    5. Carlos Fernández-Loría & Foster Provost & Jesse Anderton & Benjamin Carterette & Praveen Chandar, 2020. "A Comparison of Methods for Treatment Assignment with an Application to Playlist Generation," Papers 2004.11532, arXiv.org, revised Apr 2022.
    6. Michael Lechner, 2023. "Causal Machine Learning and its use for public policy," Swiss Journal of Economics and Statistics, Springer;Swiss Society of Economics and Statistics, vol. 159(1), pages 1-15, December.
    7. Tobias Cagala & Ulrich Glogowsky & Johannes Rincke & Anthony Strittmatter, 2021. "Optimal Targeting in Fundraising: A Machine-Learning Approach," Economics working papers 2021-08, Department of Economics, Johannes Kepler University Linz, Austria.
    8. Toru Kitagawa & Weining Wang & Mengshan Xu, 2022. "Policy Choice in Time Series by Empirical Welfare Maximization," Papers 2205.03970, arXiv.org, revised Jun 2023.
    9. Johannes Haushofer & Paul Niehaus & Carlos Paramo & Edward Miguel & Michael W. Walker, 2022. "Targeting Impact versus Deprivation," NBER Working Papers 30138, National Bureau of Economic Research, Inc.
    10. Toru Kitagawa & Guanyi Wang, 2021. "Who should get vaccinated? Individualized allocation of vaccines over SIR network," CeMMAP working papers CWP28/21, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    11. Kitagawa, Toru & Wang, Guanyi, 2023. "Who should get vaccinated? Individualized allocation of vaccines over SIR network," Journal of Econometrics, Elsevier, vol. 232(1), pages 109-131.
    12. Tobias Cagala & Ulrich Glogowsky & Johannes Rincke & Anthony Strittmatter, 2021. "Optimal Targeting in Fundraising: A Causal Machine-Learning Approach," Papers 2103.10251, arXiv.org, revised Sep 2021.
    13. Daniel F. Pellatt, 2022. "PAC-Bayesian Treatment Allocation Under Budget Constraints," Papers 2212.09007, arXiv.org, revised Jun 2023.
    14. Chunrong Ai & Yue Fang & Haitian Xie, 2024. "Data-driven Policy Learning for a Continuous Treatment," Papers 2402.02535, arXiv.org.
    15. Gabriel Okasa, 2022. "Meta-Learners for Estimation of Causal Effects: Finite Sample Cross-Fit Performance," Papers 2201.12692, arXiv.org.
    16. Kyle Colangelo & Ying-Ying Lee, 2019. "Double debiased machine learning nonparametric inference with continuous treatments," CeMMAP working papers CWP72/19, Centre for Microdata Methods and Practice, Institute for Fiscal Studies.
    17. Eric Mbakop & Max Tabord‐Meehan, 2021. "Model Selection for Treatment Choice: Penalized Welfare Maximization," Econometrica, Econometric Society, vol. 89(2), pages 825-848, March.
    18. Combes, Pierre-Philippe & Gobillon, Laurent & Zylberberg, Yanos, 2022. "Urban economics in a historical perspective: Recovering data with machine learning," Regional Science and Urban Economics, Elsevier, vol. 94(C).
    19. Bokelmann, Björn & Lessmann, Stefan, 2024. "Improving uplift model evaluation on randomized controlled trial data," European Journal of Operational Research, Elsevier, vol. 313(2), pages 691-707.
    20. Garbero, Alessandra & Sakos, Grayson & Cerulli, Giovanni, 2023. "Towards data-driven project design: Providing optimal treatment rules for development projects," Socio-Economic Planning Sciences, Elsevier, vol. 89(C).

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:orijds:v:1:y:2022:i:1:p:4-16. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.