
Correlated Cluster-Based Randomized Experiments: Robust Variance Minimization

Author

Listed:
  • Ozan Candogan

    (Booth School of Business, University of Chicago, Chicago, Illinois 60637)

  • Chen Chen

    (New York University Shanghai, Shanghai 200124, China)

  • Rad Niazadeh

    (Booth School of Business, University of Chicago, Chicago, Illinois 60637)

Abstract

Experimentation is prevalent in online marketplaces and social networks for assessing the effectiveness of new market interventions. To mitigate interference among users in an experiment, a common practice is to use a cluster-based experiment, in which the designer partitions the market into loosely connected clusters and assigns all users in the same cluster to the same variant (treatment or control). Given such an experiment, we assume an unbiased Horvitz–Thompson estimator is used to estimate the total market effect of the treatment. We consider the optimization problem of choosing (correlated) randomized assignments of clusters to treatment and control to minimize the worst-case variance of the estimator, under the constraint that the marginal assignment probability is q ∈ ( 0 , 1 ) for all clusters. This problem can be formulated as a linear program in which both the number of decision variables and the number of constraints are exponential in the number of clusters—and hence is generally computationally intractable. We develop a family of practical experiments that we refer to as independent block randomization (IBR) experiments. Such an experiment partitions clusters into blocks so that each block contains clusters of similar size. It then treats a fraction q of the clusters in each block (chosen uniformly at random) and does so independently across blocks. The optimal cluster partition can be obtained tractably using dynamic programming. We show that these policies are asymptotically optimal when the number of clusters grows large and no cluster size dominates the rest. In the special case where cluster sizes take values in a finite set and the number of clusters of each size is a fixed proportion of the total number of clusters, the loss is only a constant that is independent of the number of clusters. Beyond the asymptotic regime, we show that the IBR experiment provides a good approximation for any problem instance as long as q is not too small. We also examine the performance of the IBR experiments on data-driven numerical examples, including examples based on Airbnb and Facebook data.
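The IBR design described in the abstract can be sketched in a few lines. The sketch below is illustrative only: it takes a fixed block size supplied by the caller (the paper instead obtains the optimal cluster partition via dynamic programming), and the function names and the accompanying Horvitz–Thompson estimator are our own constructions, not code from the paper.

```python
import random

def ibr_assign(cluster_sizes, q, block_size):
    """Independent block randomization (IBR) sketch: sort clusters by size,
    group them into blocks of similar-sized clusters, and within each block
    treat a q-fraction chosen uniformly at random, independently across blocks.
    Returns a dict mapping cluster index -> 1 (treatment) or 0 (control)."""
    # Sort cluster indices by size so each block holds clusters of similar size.
    order = sorted(range(len(cluster_sizes)), key=lambda i: cluster_sizes[i])
    assignment = {}
    for start in range(0, len(order), block_size):
        block = order[start:start + block_size]
        # Treat (approximately) a q-fraction of this block, uniformly at random.
        n_treat = round(q * len(block))
        treated = set(random.sample(block, n_treat))
        for c in block:
            assignment[c] = 1 if c in treated else 0
    return assignment

def horvitz_thompson(outcomes, assignment, q):
    """Horvitz-Thompson estimate of the total treatment effect when every
    cluster's marginal treatment probability is q. `outcomes` maps each
    cluster index to its observed total outcome."""
    treated = sum(y for c, y in outcomes.items() if assignment[c] == 1)
    control = sum(y for c, y in outcomes.items() if assignment[c] == 0)
    return treated / q - control / (1 - q)
```

Because each block treats exactly a q-fraction of its clusters, every cluster's marginal treatment probability is q, while assignments are correlated within a block and independent across blocks—the structure the paper exploits to control the worst-case variance.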

Suggested Citation

  • Ozan Candogan & Chen Chen & Rad Niazadeh, 2024. "Correlated Cluster-Based Randomized Experiments: Robust Variance Minimization," Management Science, INFORMS, vol. 70(6), pages 4069-4086, June.
  • Handle: RePEc:inm:ormnsc:v:70:y:2024:i:6:p:4069-4086
    DOI: 10.1287/mnsc.2021.02741

    Download full text from publisher

    File URL: http://dx.doi.org/10.1287/mnsc.2021.02741
    Download Restriction: no

    File URL: https://libkey.io/10.1287/mnsc.2021.02741?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Dean Eckles & Brian Karrer & Johan Ugander, 2017. "Design and Analysis of Experiments in Networks: Reducing Bias from Interference," Journal of Causal Inference, De Gruyter, vol. 5(1), pages 1-23, March.
    2. Ramesh Johari & Hannah Li & Inessa Liskovich & Gabriel Y. Weintraub, 2022. "Experimental Design in Two-Sided Platforms: An Analysis of Bias," Management Science, INFORMS, vol. 68(10), pages 7069-7089, October.
    3. Stefan Wager & Kuang Xu, 2021. "Experimenting in Equilibrium," Management Science, INFORMS, vol. 67(11), pages 6694-6715, November.
    4. Ruomeng Cui & Jun Li & Dennis J. Zhang, 2020. "Reducing Discrimination with Reviews in the Sharing Economy: Evidence from Field Experiments on Airbnb," Management Science, INFORMS, vol. 66(3), pages 1071-1094, March.
    5. David Holtz & Sinan Aral, 2020. "Limiting Bias from Test-Control Interference in Online Marketplace Experiments," Papers 2004.12162, arXiv.org.
    6. Iavor Bojinov & David Simchi-Levi & Jinglong Zhao, 2023. "Design and Analysis of Switchback Experiments," Management Science, INFORMS, vol. 69(7), pages 3759-3777, July.
    7. Michael G. Hudgens & M. Elizabeth Halloran, 2008. "Toward Causal Inference With Interference," Journal of the American Statistical Association, American Statistical Association, vol. 103, pages 832-842, June.
    8. David Holtz & Ruben Lobel & Inessa Liskovich & Sinan Aral, 2020. "Reducing Interference Bias in Online Marketplace Pricing Experiments," Papers 2004.12489, arXiv.org.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Ruoxuan Xiong & Alex Chin & Sean J. Taylor, 2024. "Data-Driven Switchback Experiments: Theoretical Tradeoffs and Empirical Bayes Designs," Papers 2406.06768, arXiv.org.
    2. Iavor Bojinov & David Simchi-Levi & Jinglong Zhao, 2023. "Design and Analysis of Switchback Experiments," Management Science, INFORMS, vol. 69(7), pages 3759-3777, July.
    3. Ali Goli & Anja Lambrecht & Hema Yoganarasimhan, 2024. "A Bias Correction Approach for Interference in Ranking Experiments," Marketing Science, INFORMS, vol. 43(3), pages 590-614, May.
    4. Ruohan Zhan & Shichao Han & Yuchen Hu & Zhenling Jiang, 2024. "Estimating Treatment Effects under Recommender Interference: A Structured Neural Networks Approach," Papers 2406.14380, arXiv.org, revised Jul 2024.
    5. Nian Si, 2023. "Tackling Interference Induced by Data Training Loops in A/B Tests: A Weighted Training Approach," Papers 2310.17496, arXiv.org, revised Apr 2024.
    6. Evan Munro & David Jones & Jennifer Brennan & Roland Nelet & Vahab Mirrokni & Jean Pouget-Abadie, 2023. "Causal Estimation of User Learning in Personalized Systems," Papers 2306.00485, arXiv.org.
    7. Luofeng Liao & Christian Kroer, 2024. "Statistical Inference and A/B Testing in Fisher Markets and Paced Auctions," Papers 2406.15522, arXiv.org, revised Aug 2024.
    8. Denis Fougère & Nicolas Jacquemet, 2020. "Policy Evaluation Using Causal Inference Methods," SciencePo Working papers Main hal-03455978, HAL.
    9. Zhaonan Qu & Ruoxuan Xiong & Jizhou Liu & Guido Imbens, 2021. "Semiparametric Estimation of Treatment Effects in Observational Studies with Heterogeneous Partial Interference," Papers 2107.12420, arXiv.org, revised Jun 2024.
    10. Ariel Boyarsky & Hongseok Namkoong & Jean Pouget-Abadie, 2023. "Modeling Interference Using Experiment Roll-out," Papers 2305.10728, arXiv.org, revised Aug 2023.
    11. Shaina J. Alexandria & Michael G. Hudgens & Allison E. Aiello, 2023. "Assessing intervention effects in a randomized trial within a social network," Biometrics, The International Biometric Society, vol. 79(2), pages 1409-1419, June.
    12. Vivek F. Farias & Andrew A. Li & Tianyi Peng & Andrew Zheng, 2022. "Markovian Interference in Experiments," Papers 2206.02371, arXiv.org, revised Jun 2022.
    13. Michael P. Leung, 2022. "Causal Inference Under Approximate Neighborhood Interference," Econometrica, Econometric Society, vol. 90(1), pages 267-293, January.
    14. Stefan Wager & Kuang Xu, 2021. "Experimenting in Equilibrium," Management Science, INFORMS, vol. 67(11), pages 6694-6715, November.
    15. Davide Viviano & Jess Rudder, 2020. "Policy design in experiments with unknown interference," Papers 2011.08174, arXiv.org, revised May 2024.
    16. Shan Huang & Chen Wang & Yuan Yuan & Jinglong Zhao & Jingjing Zhang, 2023. "Estimating Effects of Long-Term Treatments," Papers 2308.08152, arXiv.org.
    17. Elizabeth L. Ogburn & Ilya Shpitser & Youjin Lee, 2020. "Causal inference, social networks and chain graphs," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 183(4), pages 1659-1676, October.
    18. Hannah Li & Geng Zhao & Ramesh Johari & Gabriel Y. Weintraub, 2021. "Interference, Bias, and Variance in Two-Sided Marketplace Experimentation: Guidance for Platforms," Papers 2104.12222, arXiv.org.
    19. Steven Wilkins Reeves & Shane Lubold & Arun G. Chandrasekhar & Tyler H. McCormick, 2024. "Model-Based Inference and Experimental Design for Interference Using Partial Network Data," Papers 2406.11940, arXiv.org.
    20. Davide Viviano, 2020. "Experimental Design under Network Interference," Papers 2003.08421, arXiv.org, revised Jul 2022.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:inm:ormnsc:v:70:y:2024:i:6:p:4069-4086. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Chris Asher (email available below). General contact details of provider: https://edirc.repec.org/data/inforea.html .

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.