
A New Perspective on the Issue of Selection Bias into Randomized Controlled Field Experiments

Authors

  • Michele Belot (University of Edinburgh)
  • Jonathan James (University of Bath)

Abstract

Many randomized controlled trials require participants to opt in. Such self-selection could bias results if only the most optimistic individuals choose to participate. We revisit this prediction. We argue that in many situations, the experimental intervention competes with alternative interventions that participants could pursue themselves outside the experiment. Since participants face a chance of being assigned to the control group, participation carries a direct opportunity cost, which is likely to be higher for optimists. We propose a model of self-selection and show that both pessimists and optimists may opt out of the experiment, leading to an ambiguous selection bias.
(This abstract was borrowed from another version of this item.)

Suggested Citation

  • Michele Belot & Jonathan James, 2014. "A New Perspective on the Issue of Selection Bias into Randomized Controlled Field Experiments," Department of Economics Working Papers 23/14, University of Bath, Department of Economics.
  • Handle: RePEc:eid:wpaper:39843

    Download full text from publisher

    File URL: https://purehost.bath.ac.uk/ws/files/62506384/23_14.pdf
    File Function: Final published version
    Download Restriction: no

    References listed on IDEAS

    1. Bruno Crépon & Esther Duflo & Marc Gurgand & Roland Rathelot & Philippe Zamora, 2013. "Do Labor Market Policies have Displacement Effects? Evidence from a Clustered Randomized Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(2), pages 531-580.
    2. Justine S. Hastings & Jeffrey M. Weinstein, 2008. "Information, School Choice, and Academic Achievement: Evidence from Two Experiments," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 123(4), pages 1373-1414.
    3. Roland G. Fryer, 2011. "Financial Incentives and Student Achievement: Evidence from Randomized Trials," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 126(4), pages 1755-1798.
    4. Esther Duflo & Pascaline Dupas & Michael Kremer, 2011. "Peer Effects, Teacher Incentives, and the Impact of Tracking: Evidence from a Randomized Evaluation in Kenya," American Economic Review, American Economic Association, vol. 101(5), pages 1739-1774, August.
    5. Oriana Bandiera & Iwan Barankay & Imran Rasul, 2007. "Incentives for Managers and Inequality among Workers: Evidence from a Firm-Level Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(2), pages 729-773.
    6. Oriana Bandiera & Iwan Barankay & Imran Rasul, 2013. "Team Incentives: Evidence From A Firm Level Experiment," Journal of the European Economic Association, European Economic Association, vol. 11(5), pages 1079-1114, October.
    7. Belot, Michèle & James, Jonathan, 2016. "Partner selection into policy relevant field experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 123(C), pages 31-56.
    8. Ernst Fehr & Lorenz Goette, 2007. "Do Workers Work More if Wages Are High? Evidence from a Randomized Field Experiment," American Economic Review, American Economic Association, vol. 97(1), pages 298-317, March.
    9. Malani, Anup, 2008. "Patient enrollment in medical trials: Selection bias in a randomized experiment," Journal of Econometrics, Elsevier, vol. 144(2), pages 341-351, June.
    10. Edward Vytlacil & James J. Heckman, 2001. "Policy-Relevant Treatment Effects," American Economic Review, American Economic Association, vol. 91(2), pages 107-111, May.
    11. List, John A. & Rasul, Imran, 2011. "Field Experiments in Labor Economics," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 2, pages 103-228, Elsevier.
    12. Pascaline Dupas, 2011. "Do Teenagers Respond to HIV Risk Information? Evidence from a Field Experiment in Kenya," American Economic Journal: Applied Economics, American Economic Association, vol. 3(1), pages 1-34, January.
    13. Pieter A. Gautier & Bas van der Klaauw, 2012. "Selection in a field experiment with voluntary participation," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 27(1), pages 63-84, January.
    14. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    15. Jessica Wisdom & Julie S. Downs & George Loewenstein, 2010. "Promoting Healthy Choices: Information versus Convenience," American Economic Journal: Applied Economics, American Economic Association, vol. 2(2), pages 164-178, April.
    16. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.

    Citations

    Cited by:

    1. Belot, Michèle & James, Jonathan, 2016. "Partner selection into policy relevant field experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 123(C), pages 31-56.
    2. Naphtal Habiyaremye & Nadhem Mtimet & Emily A. Ouma & Gideon A. Obare, 2023. "Consumers' willingness to pay for safe and quality milk: Evidence from experimental auctions in Rwanda," Agribusiness, John Wiley & Sons, Ltd., vol. 39(4), pages 1049-1074, October.
    3. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    4. Michael Sanders & Aisling Ní Chonaire, 2015. "“Powered to Detect Small Effect Sizes”: You keep saying that. I do not think it means what you think it means," The Centre for Market and Public Organisation 15/337, The Centre for Market and Public Organisation, University of Bristol, UK.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Belot, Michèle & James, Jonathan, 2016. "Partner selection into policy relevant field experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 123(C), pages 31-56.
    2. Peters, Jörg & Langbein, Jörg & Roberts, Gareth, 2016. "Policy evaluation, randomized controlled trials, and external validity—A systematic review," Economics Letters, Elsevier, vol. 147(C), pages 51-54.
    3. Altmann, Steffen & Falk, Armin & Jäger, Simon & Zimmermann, Florian, 2018. "Learning about job search: A field experiment with job seekers in Germany," Journal of Public Economics, Elsevier, vol. 164(C), pages 33-49.
    4. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    5. Maibom, Jonas, 2021. "The Danish Labor Market Experiments: Methods and Findings," Nationaløkonomisk tidsskrift, Nationaløkonomisk Forening, vol. 2021(1), pages 1-21.
    6. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    7. Omar Al-Ubaydli & John List & Claire Mackevicius & Min Sok Lee & Dana Suskind, 2019. "How Can Experiments Play a Greater Role in Public Policy? 12 Proposals from an Economic Model of Scaling," Artefactual Field Experiments 00679, The Field Experiments Website.
    8. Rüdiger Wapler & Daniel Werner & Katja Wolf, 2018. "Active labour market policies in Germany: do regional labour markets benefit?," Applied Economics, Taylor & Francis Journals, vol. 50(51), pages 5561-5578, November.
    9. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    10. Mergele, Lukas & Weber, Michael, 2020. "Public employment services under decentralization: Evidence from a natural experiment," Journal of Public Economics, Elsevier, vol. 182(C).
    11. Anna Aizer & Shari Eli & Adriana Lleras-Muney & Keyoung Lee, 2020. "Do Youth Employment Programs Work? Evidence from the New Deal," NBER Working Papers 27103, National Bureau of Economic Research, Inc.
    12. Jonathan M.V. Davis & Sara B. Heller, 2017. "Rethinking the Benefits of Youth Employment Programs: The Heterogeneous Effects of Summer Jobs," NBER Working Papers 23443, National Bureau of Economic Research, Inc.
    13. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    14. Pieter Gautier & Paul Muller & Bas van der Klaauw & Michael Rosholm & Michael Svarer, 2018. "Estimating Equilibrium Effects of Job Search Assistance," Journal of Labor Economics, University of Chicago Press, vol. 36(4), pages 1073-1125.
    15. Chatri, Abdellatif & Hadef, Khadija & Samoudi, Naima, 2021. "Micro-econometric evaluation of subsidized employment in Morocco: the case of the "Idmaj" program," Journal for Labour Market Research, Institut für Arbeitsmarkt- und Berufsforschung (IAB), Nürnberg [Institute for Employment Research, Nuremberg, Germany], vol. 55, pages 1-17.
    16. Martin Biewen & Bernd Fitzenberger & Aderonke Osikominu & Marie Paul, 2014. "The Effectiveness of Public-Sponsored Training Revisited: The Importance of Data and Methodological Choices," Journal of Labor Economics, University of Chicago Press, vol. 32(4), pages 837-897.
    17. Gosnell, Greer & Metcalfe, Robert & List, John A, 2016. "A new approach to an age-old problem: solving externalities by incenting workers directly," LSE Research Online Documents on Economics 84331, London School of Economics and Political Science, LSE Library.
    18. Charness, Gary & Kuhn, Peter, 2011. "Lab Labor: What Can Labor Economists Learn from the Lab?," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 4, chapter 3, pages 229-330, Elsevier.
    19. Marios Michaelides & Peter Mueser & Jeffrey Smith, 2019. "Youth Unemployment and U.S. Job Search Assistance Policy during the Great Recession," University of Cyprus Working Papers in Economics 13-2019, University of Cyprus Department of Economics.
    20. Fallesen, Peter & Geerdsen, Lars Pico & Imai, Susumu & Tranæs, Torben, 2018. "The effect of active labor market policies on crime: Incapacitation and program effects," Labour Economics, Elsevier, vol. 52(C), pages 263-286.

    More about this item

    JEL classification:

    • C4 - Mathematical and Quantitative Methods - - Econometric and Statistical Methods: Special Topics
    • C9 - Mathematical and Quantitative Methods - - Design of Experiments



    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.