Randomized Sequential Individual Assignment in Social Experiments: Evaluating the Design Options Prospectively

Author

Listed:
  • Sharon L. Lohr
  • Xiaoshu Zhu

Abstract

Many randomized experiments in the social sciences allocate subjects to treatment arms at the time the subjects enroll. Desirable features of the mechanism used to assign subjects to treatment arms are often (1) equal numbers of subjects in intervention and control arms, (2) balanced allocation for population subgroups and across covariates, (3) ease of use, and (4) inability for a site worker to predict the treatment arm for a subject before he or she has been assigned. In general, a trade-off must be made among these features: Many mechanisms that achieve high balance do so at the cost of high predictability. In this article, we review methods for randomized assignment of individuals that have been discussed in the literature, evaluating the performance of each with respect to the desirable design features. We propose a method for controlling the amount of predictability in a study while achieving high balance across subgroups and covariates. The method is applicable when a database containing the subgroup membership and covariates of each potential participant is available in advance. We use simple simulation and graphical methods to evaluate the balance and predictability of randomization mechanisms when planning the study and describe a computer program implemented in the R statistical software package that prospectively evaluates candidate randomization methods.

Suggested Citation

  • Sharon L. Lohr & Xiaoshu Zhu, 2017. "Randomized Sequential Individual Assignment in Social Experiments: Evaluating the Design Options Prospectively," Sociological Methods & Research, vol. 46(4), pages 1049-1075, November.
  • Handle: RePEc:sae:somere:v:46:y:2017:i:4:p:1049-1075
    DOI: 10.1177/0049124115621332

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0049124115621332
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0049124115621332?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Tong Wang & Wei Ma, 2021. "The impact of misclassification on covariate‐adaptive randomized clinical trials," Biometrics, The International Biometric Society, vol. 77(2), pages 451-464, June.
    2. Jiang, Liang & Phillips, Peter C.B. & Tao, Yubo & Zhang, Yichong, 2023. "Regression-adjusted estimation of quantile treatment effects under covariate-adaptive randomizations," Journal of Econometrics, Elsevier, vol. 234(2), pages 758-776.
    3. Guiteras, Raymond P. & Levine, David I. & Polley, Thomas H., 2016. "The pursuit of balance in sequential randomized trials," Development Engineering, Elsevier, vol. 1(C), pages 12-25.
    4. Liang Jiang & Oliver B. Linton & Haihan Tang & Yichong Zhang, 2022. "Improving Estimation Efficiency via Regression-Adjustment in Covariate-Adaptive Randomizations with Imperfect Compliance," Papers 2201.13004, arXiv.org, revised Jun 2023.
    5. Yichong Zhang & Xin Zheng, 2020. "Quantile treatment effects and bootstrap inference under covariate‐adaptive randomization," Quantitative Economics, Econometric Society, vol. 11(3), pages 957-982, July.
    6. Ting Ye & Jun Shao, 2020. "Robust tests for treatment effect in survival analysis under covariate‐adaptive randomization," Journal of the Royal Statistical Society Series B, Royal Statistical Society, vol. 82(5), pages 1301-1323, December.
    7. Yamada, Katsunori & Sato, Masayuki, 2013. "Another avenue for anatomy of income comparisons: Evidence from hypothetical choice experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 89(C), pages 35-57.
    8. Lechthaler, Wolfgang & Ring, Patrick, 2021. "Labor force participation, job search effort and unemployment insurance in the laboratory," Journal of Economic Behavior & Organization, Elsevier, vol. 189(C), pages 748-778.
    9. Heinicke, Franziska & Rosenkranz, Stephanie & Weitzel, Utz, 2019. "The effect of pledges on the distribution of lying behavior: An online experiment," Journal of Economic Psychology, Elsevier, vol. 73(C), pages 136-151.
    10. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    11. Jean-Marc Bourgeon & José de Sousa & Alexis Noir-Luhalwe, 2022. "Social Distancing and Risk Taking: Evidence from a Team Game Show," SciencePo Working papers Main hal-03792423, HAL.
    12. Mariconda, Simone & Lurati, Francesco, 2015. "Does familiarity breed stability? The role of familiarity in moderating the effects of new information on reputation judgments," Journal of Business Research, Elsevier, vol. 68(5), pages 957-964.
    13. Ingar Haaland & Christopher Roth & Johannes Wohlfart, 2023. "Designing Information Provision Experiments," Journal of Economic Literature, American Economic Association, vol. 61(1), pages 3-40, March.
    14. Simon Gächter & Lingbo Huang & Martin Sefton, 2016. "Combining “real effort” with induced effort costs: the ball-catching task," Experimental Economics, Springer;Economic Science Association, vol. 19(4), pages 687-712, December.
    15. Masha Shunko & Julie Niederhoff & Yaroslav Rosokha, 2018. "Humans Are Not Machines: The Behavioral Impact of Queueing Design on Service Time," Management Science, INFORMS, vol. 64(1), pages 453-473, January.
    16. L. Mundaca & H. Moncreiff, 2021. "New Perspectives on Green Energy Defaults," Journal of Consumer Policy, Springer, vol. 44(3), pages 357-383, September.
    17. Sandro Ambuehl & B. Douglas Bernheim & Annamaria Lusardi, 2022. "Evaluating Deliberative Competence: A Simple Method with an Application to Financial Choice," American Economic Review, American Economic Association, vol. 112(11), pages 3584-3626, November.
    18. Chen, Daniel L. & Schonger, Martin & Wickens, Chris, 2016. "oTree—An open-source platform for laboratory, online, and field experiments," Journal of Behavioral and Experimental Finance, Elsevier, vol. 9(C), pages 88-97.
    19. Abel Brodeur & Nikolai M. Cook & Anthony Heyes, 2022. "We Need to Talk about Mechanical Turk: What 22,989 Hypothesis Tests Tell Us about Publication Bias and p-Hacking in Online Experiments," LCERPA Working Papers am0133, Laurier Centre for Economic Research and Policy Analysis.
    20. Guenther, Isabel & Tetteh-Baah, Samuel Kofi, 2019. "The impact of discrimination on redistributive preferences and productivity: experimental evidence from the United States," VfS Annual Conference 2019 (Leipzig): 30 Years after the Fall of the Berlin Wall - Democracy and Market Economy 203652, Verein für Socialpolitik / German Economic Association.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:somere:v:46:y:2017:i:4:p:1049-1075. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.