Printed from https://ideas.repec.org/p/edn/sirdps/537.html

Partner Selection into Policy Relevant Field Experiments

Author

Listed:
  • Belot, Michèle
  • James, Jonathan

Abstract

This study investigates the self-selection of stakeholders into participation and collaboration in policy-relevant experiments. We document and test the implications of self-selection in the context of a randomised policy experiment we conducted in primary schools in the UK. The main questions we ask are (1) is there evidence of selection on key observable characteristics likely to matter for the outcome of interest, and (2) does selection matter for the estimates of treatment effects. The experimental work consists in testing the effects of an intervention aimed at encouraging children to make healthier choices at lunch. We recruited schools through local authorities and randomised schools across two incentive treatments and a control group. We document the selection taking place both at the level of local authorities and at the school level. Overall, we find mild evidence of selection on key observables such as obesity levels and socio-economic characteristics. We find evidence of selection along indicators of involvement in healthy lifestyle programmes at the school level, but the magnitude is small. Moreover, we do not find significant differences in the treatment effects of the experiment along variables which, albeit to a mild degree, are correlated with selection into the experiment. To our knowledge, this is the first study providing direct evidence on the magnitude of self-selection in field experiments.
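The two checks described in the abstract lend themselves to a simple empirical illustration: a balance comparison of participating versus non-participating schools on observables, and an interaction test for whether treatment effects differ along a variable correlated with participation. The sketch below is illustrative only; it uses simulated data and hypothetical variable names (obesity_rate, free_meal_share, healthy_choice_share), is not the authors' code, and does not reproduce their estimates.

```python
# Illustrative sketch only, not the authors' code or data. On simulated
# school-level data it mimics the two checks described in the abstract:
# (1) a balance test for selection on observables between participating and
# non-participating schools, and (2) a test of whether treatment effects
# differ along a variable correlated with participation.
# All variable names (obesity_rate, free_meal_share, ...) are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500  # hypothetical number of schools approached

schools = pd.DataFrame({
    "obesity_rate": rng.normal(0.10, 0.03, n),
    "free_meal_share": rng.normal(0.20, 0.08, n),
})
# Participation is made mildly correlated with one observable
p_take_up = 1 / (1 + np.exp(-(-1.0 + 2.0 * schools["free_meal_share"])))
schools["participated"] = (rng.random(n) < p_take_up).astype(int)

# (1) Selection on observables: compare participants with non-participants
for var in ["obesity_rate", "free_meal_share"]:
    res = smf.ols(f"{var} ~ participated", data=schools).fit()
    print(f"{var}: difference = {res.params['participated']:.4f}, "
          f"p = {res.pvalues['participated']:.3f}")

# (2) Heterogeneous treatment effects among participating schools:
# randomise treatment, then interact it with the selection-related variable
part = schools[schools["participated"] == 1].copy()
part["treated"] = rng.integers(0, 2, len(part))
part["healthy_choice_share"] = (0.30 + 0.05 * part["treated"]
                                + rng.normal(0, 0.10, len(part)))
het = smf.ols("healthy_choice_share ~ treated * free_meal_share",
              data=part).fit()
print(het.summary().tables[1])
```

An insignificant interaction term in the second regression is the analogue of the paper's finding that treatment effects do not differ significantly along variables correlated with selection into the experiment.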

Suggested Citation

  • Belot, Michele & James, Jonathan, 2013. "Partner Selection into Policy Relevant Field Experiments," SIRE Discussion Papers 2013-112, Scottish Institute for Research in Economics (SIRE).
  • Handle: RePEc:edn:sirdps:537

    Download full text from publisher

    File URL: http://hdl.handle.net/10943/537
    Download Restriction: no


    References listed on IDEAS

    1. Heather Royer & Mark Stehr & Justin Sydnor, 2015. "Incentives, Commitments, and Habit Formation in Exercise: Evidence from a Field Experiment with Workers at a Fortune-500 Company," American Economic Journal: Applied Economics, American Economic Association, vol. 7(3), pages 51-84, July.
    2. Roland G. Fryer, 2011. "Financial Incentives and Student Achievement: Evidence from Randomized Trials," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 126(4), pages 1755-1798.
    3. Esther Duflo & William Gale & Jeffrey Liebman & Peter Orszag & Emmanuel Saez, 2006. "Saving Incentives for Low- and Middle-Income Families: Evidence from a Field Experiment with H&R Block," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 121(4), pages 1311-1346.
    4. repec:hal:pseose:halshs-00942662 is not listed on IDEAS
    5. Frijters, Paul & Kong, Tao Sherry & Liu, Elaine M., 2015. "Who is coming to the artefactual field experiment? Participation bias among Chinese rural migrants," Journal of Economic Behavior & Organization, Elsevier, vol. 114(C), pages 62-74.
    6. Raj Chetty & Adam Looney & Kory Kroft, 2009. "Salience and Taxation: Theory and Evidence," American Economic Review, American Economic Association, vol. 99(4), pages 1145-1177, September.
    7. Glenn W. Harrison & John A. List, 2004. "Field Experiments," Journal of Economic Literature, American Economic Association, vol. 42(4), pages 1009-1055, December.
    8. Francesco Avvisati & Marc Gurgand & Nina Guyon & Eric Maurin, 2014. "Getting Parents Involved: A Field Experiment in Deprived Schools," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 81(1), pages 57-83.
    9. Jens Ludwig & Jeffrey R. Kling & Sendhil Mullainathan, 2011. "Mechanism Experiments and Policy Evaluations," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 17-38, Summer.
    10. Hunt Allcott, 2015. "Site Selection Bias in Program Evaluation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1117-1165.
    11. Belot, Michèle & James, Jonathan & Nolen, Patrick, 2016. "Incentives and children's dietary choices: A field experiment in primary schools," Journal of Health Economics, Elsevier, vol. 50(C), pages 213-229.
    12. Roland G. Fryer, Jr., 2014. "Injecting Charter School Best Practices into Traditional Public Schools: Evidence from Field Experiments," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 129(3), pages 1355-1407.
    13. Esther Duflo & Michael Greenstone & Nicholas Ryan, 2013. "Truth-telling by Third-party Auditors and the Response of Polluting Firms: Experimental Evidence from India," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 128(4), pages 1499-1545.
    14. Vivi Alatas & Abhijit Banerjee & Rema Hanna & Benjamin A. Olken & Julia Tobias, 2012. "Targeting the Poor: Evidence from a Field Experiment in Indonesia," American Economic Review, American Economic Association, vol. 102(4), pages 1206-1240, June.
    15. Glenn W. Harrison & Morten I. Lau & E. Elisabet Rutström, 2007. "Estimating Risk Attitudes in Denmark: A Field Experiment," Scandinavian Journal of Economics, Wiley Blackwell, vol. 109(2), pages 341-368, June.
    16. Dean Karlan & John A. List, 2007. "Does Price Matter in Charitable Giving? Evidence from a Large-Scale Natural Field Experiment," American Economic Review, American Economic Association, vol. 97(5), pages 1774-1793, December.
    17. Ernst Fehr & Lorenz Goette, 2007. "Do Workers Work More if Wages Are High? Evidence from a Randomized Field Experiment," American Economic Review, American Economic Association, vol. 97(1), pages 298-317, March.
    18. Suresh de Mel & David McKenzie & Christopher Woodruff, 2013. "The Demand for, and Consequences of, Formalization among Informal Firms in Sri Lanka," American Economic Journal: Applied Economics, American Economic Association, vol. 5(2), pages 122-150, April.
    19. Esther Duflo & Pascaline Dupas & Michael Kremer, 2011. "Peer Effects, Teacher Incentives, and the Impact of Tracking: Evidence from a Randomized Evaluation in Kenya," American Economic Review, American Economic Association, vol. 101(5), pages 1739-1774, August.
    20. Belot, Michèle & James, Jonathan, 2011. "Healthy school meals and educational outcomes," Journal of Health Economics, Elsevier, vol. 30(3), pages 489-504, May.
    21. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    22. Kate Ambler & Diego Aycinena & Dean Yang, 2015. "Channeling Remittances to Education: A Field Experiment among Migrants from El Salvador," American Economic Journal: Applied Economics, American Economic Association, vol. 7(2), pages 207-232, April.
    23. Justine S. Hastings & Jeffrey M. Weinstein, 2008. "Information, School Choice, and Academic Achievement: Evidence from Two Experiments," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 123(4), pages 1373-1414.
    24. Oriana Bandiera & Iwan Barankay & Imran Rasul, 2007. "Incentives for Managers and Inequality among Workers: Evidence from a Firm-Level Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 122(2), pages 729-773.
    25. Heckman, James J. & Lalonde, Robert J. & Smith, Jeffrey A., 1999. "The economics and econometrics of active labor market programs," Handbook of Labor Economics, in: O. Ashenfelter & D. Card (ed.), Handbook of Labor Economics, edition 1, volume 3, chapter 31, pages 1865-2097, Elsevier.
    26. Pascaline Dupas & Jonathan Robinson, 2013. "Why Don't the Poor Save More? Evidence from Health Savings Experiments," American Economic Review, American Economic Association, vol. 103(4), pages 1138-1171, June.
    27. Xavier Gine & Jessica Goldberg & Dean Yang, 2012. "Credit Market Consequences of Improved Personal Identification: Field Experimental Evidence from Malawi," American Economic Review, American Economic Association, vol. 102(6), pages 2923-2954, October.
    28. Blair Cleave & Nikos Nikiforakis & Robert Slonim, 2013. "Is there selection bias in laboratory experiments? The case of social and risk preferences," Experimental Economics, Springer;Economic Science Association, vol. 16(3), pages 372-382, September.
    29. Jessica Wisdom & Julie S. Downs & George Loewenstein, 2010. "Promoting Healthy Choices: Information versus Convenience," American Economic Journal: Applied Economics, American Economic Association, vol. 2(2), pages 164-178, April.
    30. Pascaline Dupas, 2011. "Do Teenagers Respond to HIV Risk Information? Evidence from a Field Experiment in Kenya," American Economic Journal: Applied Economics, American Economic Association, vol. 3(1), pages 1-34, January.
    31. Erica Field & Rohini Pande & John Papp & Natalia Rigol, 2013. "Does the Classic Microfinance Model Discourage Entrepreneurship among the Poor? Experimental Evidence from India," American Economic Review, American Economic Association, vol. 103(6), pages 2196-2226, October.
    32. Gary Burtless, 1995. "The Case for Randomized Field Trials in Economic and Policy Research," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 63-84, Spring.
    33. Orazio Attanasio & Britta Augsburg & Ralph De Haas & Emla Fitzsimons & Heike Harmgart, 2015. "The Impacts of Microfinance: Evidence from Joint-Liability Lending in Mongolia," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 90-122, January.
    34. Joseph Hotz, V. & Imbens, Guido W. & Mortimer, Julie H., 2005. "Predicting the efficacy of future training programs using past experiences at other locations," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 241-270.
    35. Belot, Michèle & James, Jonathan, 2014. "A new perspective on the issue of selection bias in randomized controlled field experiments," Economics Letters, Elsevier, vol. 124(3), pages 326-328.
    36. Moussa P. Blimpo, 2014. "Team Incentives for Education in Developing Countries: A Randomized Field Experiment in Benin," American Economic Journal: Applied Economics, American Economic Association, vol. 6(4), pages 90-109, October.
    37. Henrik Jacobsen Kleven & Martin B. Knudsen & Claus Thustrup Kreiner & Søren Pedersen & Emmanuel Saez, 2011. "Unwilling or Unable to Cheat? Evidence From a Tax Audit Experiment in Denmark," Econometrica, Econometric Society, vol. 79(3), pages 651-692, May.
    38. Slonim, Robert & Wang, Carmen & Garbarino, Ellen & Merrett, Danielle, 2012. "Opting-In: Participation Biases in the Lab," IZA Discussion Papers 6865, Institute of Labor Economics (IZA).
    39. Menno Pradhan & Daniel Suryadarma & Amanda Beatty & Maisy Wong & Arya Gaduh & Armida Alisjahbana & Rima Prama Artha, 2014. "Improving Educational Quality through Enhancing Community Participation: Results from a Randomized Field Experiment in Indonesia," American Economic Journal: Applied Economics, American Economic Association, vol. 6(2), pages 105-126, April.
    40. Leonardo Bursztyn & Robert Jensen, 2015. "How Does Peer Pressure Affect Educational Investments?," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 130(3), pages 1329-1367.
    41. John A. List, 2011. "Why Economists Should Conduct Field Experiments and 14 Tips for Pulling One Off," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 3-16, Summer.
    42. Armin Falk, 2007. "Gift Exchange in the Field," Econometrica, Econometric Society, vol. 75(5), pages 1501-1511, September.
    43. Daniel S. Nagin & James B. Rebitzer & Seth Sanders & Lowell J. Taylor, 2002. "Monitoring, Motivation, and Management: The Determinants of Opportunistic Behavior in a Field Experiment," American Economic Review, American Economic Association, vol. 92(4), pages 850-873, September.
    44. David Card & Stefano DellaVigna & Ulrike Malmendier, 2011. "The Role of Theory in Field Experiments," Journal of Economic Perspectives, American Economic Association, vol. 25(3), pages 39-62, Summer.
    45. Malani, Anup, 2008. "Patient enrollment in medical trials: Selection bias in a randomized experiment," Journal of Econometrics, Elsevier, vol. 144(2), pages 341-351, June.
    46. Hongbin Cai & Yuyu Chen & Hanming Fang, 2009. "Observational Learning: Evidence from a Randomized Natural Field Experiment," American Economic Review, American Economic Association, vol. 99(3), pages 864-882, June.
    47. Bruce Shearer, 2004. "Piece Rates, Fixed Wages and Incentives: Evidence from a Field Experiment," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 71(2), pages 513-534.
    48. repec:feb:artefa:0110 is not listed on IDEAS
    49. Lant Pritchett, 2002. "It pays to be ignorant: A simple political economy of rigorous program evaluation," Journal of Economic Policy Reform, Taylor & Francis Journals, vol. 5(4), pages 251-269.
    50. Pieter A. Gautier & Bas van der Klaauw, 2012. "Selection in a field experiment with voluntary participation," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 27(1), pages 63-84, January.
    51. Emily Oster & Rebecca Thornton, 2011. "Menstruation, Sanitary Products, and School Attendance: Evidence from a Randomized Evaluation," American Economic Journal: Applied Economics, American Economic Association, vol. 3(1), pages 91-100, January.
    52. Damon Jones, 2010. "Information, Preferences, and Public Benefit Participation: Experimental Evidence from the Advance EITC and 401(k) Savings," American Economic Journal: Applied Economics, American Economic Association, vol. 2(2), pages 147-163, April.
    53. Slonim, Robert & Wang, Carmen & Garbarino, Ellen & Merrett, Danielle, 2013. "Opting-in: Participation bias in economic experiments," Journal of Economic Behavior & Organization, Elsevier, vol. 90(C), pages 43-70.
    54. Eric P. Bettinger & Bridget Terry Long & Philip Oreopoulos & Lisa Sanbonmatsu, 2012. "The Role of Application Assistance and Information in College Decisions: Results from the H&R Block Fafsa Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 127(3), pages 1205-1242.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Belot, Michèle & James, Jonathan & Nolen, Patrick, 2016. "Incentives and children's dietary choices: A field experiment in primary schools," Journal of Health Economics, Elsevier, vol. 50(C), pages 213-229.
    2. Belot, Michèle & James, Jonathan, 2014. "A new perspective on the issue of selection bias in randomized controlled field experiments," Economics Letters, Elsevier, vol. 124(3), pages 326-328.
    3. repec:esx:essedp:753 is not listed on IDEAS
    4. Michael Sanders & Aisling Ní Chonaire, 2015. "“Powered to Detect Small Effect Sizes”: You keep saying that. I do not think it means what you think it means," The Centre for Market and Public Organisation 15/337, The Centre for Market and Public Organisation, University of Bristol, UK.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Peters, Jörg & Langbein, Jörg & Roberts, Gareth, 2016. "Policy evaluation, randomized controlled trials, and external validity—A systematic review," Economics Letters, Elsevier, vol. 147(C), pages 51-54.
    2. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    3. Belot, Michèle & James, Jonathan, 2014. "A new perspective on the issue of selection bias in randomized controlled field experiments," Economics Letters, Elsevier, vol. 124(3), pages 326-328.
    4. Günther Fink & Margaret McConnell & Sebastian Vollmer, 2014. "Testing for heterogeneous treatment effects in experimental data: false discovery risks and correction procedures," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 6(1), pages 44-57, January.
    5. Eszter Czibor & David Jimenez‐Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    6. Karthik Muralidharan & Paul Niehaus, 2017. "Experimentation at Scale," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 103-124, Fall.
    7. Dolan, P. & Hallsworth, M. & Halpern, D. & King, D. & Metcalfe, R. & Vlaev, I., 2012. "Influencing behaviour: The mindspace way," Journal of Economic Psychology, Elsevier, vol. 33(1), pages 264-277.
    8. Hunt Allcott & Richard L. Sweeney, 2017. "The Role of Sales Agents in Information Disclosure: Evidence from a Field Experiment," Management Science, INFORMS, vol. 63(1), pages 21-39, January.
    9. Damgaard, Mette Trier & Nielsen, Helena Skyt, 2018. "Nudging in education," Economics of Education Review, Elsevier, vol. 64(C), pages 313-342.
    10. Stefano DellaVigna, 2009. "Psychology and Economics: Evidence from the Field," Journal of Economic Literature, American Economic Association, vol. 47(2), pages 315-372, June.
    11. Omar Al-Ubaydli & John List, 2016. "Field Experiments in Markets," Artefactual Field Experiments j0002, The Field Experiments Website.
    12. Manuela Angelucci & Silvia Prina & Heather Royer & Anya Samek, 2015. "When Incentives Backfire: Spillover Effects in Food Choice," NBER Working Papers 21481, National Bureau of Economic Research, Inc.
    13. Eric Floyd & John A. List, 2016. "Using Field Experiments in Accounting and Finance," Journal of Accounting Research, Wiley Blackwell, vol. 54(2), pages 437-475, May.
    14. Matteo M. Galizzi & Daniel Navarro-Martinez, 2019. "On the External Validity of Social Preference Games: A Systematic Lab-Field Study," Management Science, INFORMS, vol. 65(3), pages 976-1002, March.
    15. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    16. Glenn W. Harrison & Morten I. Lau & Hong Il Yoo, 2020. "Risk Attitudes, Sample Selection, and Attrition in a Longitudinal Field Experiment," The Review of Economics and Statistics, MIT Press, vol. 102(3), pages 552-568, July.
    17. Gosnell, Greer & Metcalfe, Robert & List, John A, 2016. "A new approach to an age-old problem: solving externalities by incenting workers directly," LSE Research Online Documents on Economics 84331, London School of Economics and Political Science, LSE Library.
    18. Mirco Tonin & Michael Vlassopoulos, 2015. "Corporate Philanthropy and Productivity: Evidence from an Online Real Effort Experiment," Management Science, INFORMS, vol. 61(8), pages 1795-1811, August.
    19. Figlio, D. & Karbownik, K. & Salvanes, K.G., 2016. "Education Research and Administrative Data," Handbook of the Economics of Education,, Elsevier.
    20. Karthik Muralidharan & Mauricio Romero & Kaspar Wüthrich, 2019. "Factorial Designs, Model Selection, and (Incorrect) Inference in Randomized Experiments," NBER Working Papers 26562, National Bureau of Economic Research, Inc.

    More about this item

    Keywords

    Selection; Field Experiments; Randomised controlled trials; External Validity;

    JEL classification:

    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • I18 - Health, Education, and Welfare - - Health - - - Government Policy; Regulation; Public Health
    • J13 - Labor and Demographic Economics - - Demographic Economics - - - Fertility; Family Planning; Child Care; Children; Youth

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:edn:sirdps:537. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Research Office. General contact details of provider: https://edirc.repec.org/data/sireeuk.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.