Printed from https://ideas.repec.org/p/crs/wpaper/2009-05.html

Sample Attrition Bias in Randomized Experiments: A Tale of Two Surveys

Author

Listed:
  • Luc BEHAGHEL

    (Crest)

  • Bruno CREPON

    (Crest)

  • Marc GURGAND

    (Crest)

  • Thomas LE BARBANCHON

    (Crest)

Abstract

The randomized trial literature has helped to renew the field of microeconometric policy evaluation by emphasizing identification issues raised by endogenous program participation. Measurement and attrition issues have perhaps received less attention. This paper analyzes the dramatic impact of sample attrition in a large job search experiment. We take advantage of two independent surveys on the same initial sample of 8,000 persons. The first is a long telephone survey that had a strikingly low and unbalanced response rate of about 50%. The second is a combination of administrative data and a short telephone survey targeted at those leaving the unemployment registers; this enriched data source has a balanced and much higher response rate (about 80%). With naive estimates that neglect non-response, these two sources yield puzzlingly different results. Using the enriched administrative data as a benchmark, we find evidence that estimates from the long telephone survey lack external and internal validity. We turn to existing methods to bound the effects in the presence of sample selection, and we extend them to the context of randomization with imperfect compliance. The bounds obtained from the two surveys are compatible, but those from the long telephone survey are somewhat uninformative. We conclude with the consequences for data collection strategies.
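The bounding methods mentioned in the abstract are, in this literature, typically trimming bounds in the spirit of Lee (2009) for outcomes observed only among survey respondents. As a rough illustration only, and not the authors' estimator (which is further extended to randomization with imperfect compliance), a minimal Python sketch of such bounds on simulated data might look as follows; the arrays z (assignment), r (response), y (outcome) and the 55%/45% response rates are hypothetical.

    import numpy as np

    def lee_bounds(z, r, y):
        # Trimming bounds in the spirit of Lee (2009) on an intention-to-treat
        # effect when the outcome y is observed only for respondents (r == 1).
        # z is the 0/1 random assignment; response is assumed monotone in z.
        # This sketch trims the arm with the higher response rate (here, treatment).
        z, r, y = map(np.asarray, (z, r, y))
        q1, q0 = r[z == 1].mean(), r[z == 0].mean()
        assert q1 >= q0, "swap the roles of the arms if control responds more often"
        p = (q1 - q0) / q1                            # share of treated respondents to trim
        y1 = np.sort(y[(z == 1) & (r == 1)])          # treated respondents' outcomes, sorted
        y0 = y[(z == 0) & (r == 1)]                   # control respondents' outcomes
        k = int(np.floor(p * len(y1)))                # number of observations to trim
        upper = y1[k:].mean() - y0.mean()             # drop the k smallest values -> upper bound
        lower = y1[:len(y1) - k].mean() - y0.mean()   # drop the k largest values  -> lower bound
        return lower, upper

    # Illustrative call on simulated data (not the experiment's data).
    rng = np.random.default_rng(0)
    z = rng.integers(0, 2, 8000)                           # random assignment
    r = rng.random(8000) < np.where(z == 1, 0.55, 0.45)    # unbalanced response rates across arms
    y = rng.normal(0.1 * z, 1.0)                           # outcome, used only where r == 1
    print(lee_bounds(z, r, y))

The width of such an interval grows with the gap between the two arms' response rates, which helps explain why bounds from the unbalanced long telephone survey are reported as less informative than those from the balanced administrative source.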

Suggested Citation

  • Luc BEHAGHEL & Bruno CREPON & Marc GURGAND & Thomas LE BARBANCHON, 2009. "Sample Attrition Bias in Randomized Experiments: A Tale of Two Surveys," Working Papers 2009-05, Center for Research in Economics and Statistics.
  • Handle: RePEc:crs:wpaper:2009-05

    Download full text from publisher

    File URL: http://crest.science/RePEc/wpstorage/2009-05.pdf
    File Function: Crest working paper version
    Download Restriction: no


    References listed on IDEAS

    1. David Card & Raj Chetty & Andrea Weber, 2007. "The Spike at Benefit Exhaustion: Leaving the Unemployment System or Starting a New Job?," American Economic Review, American Economic Association, vol. 97(2), pages 113-118, May.
    2. Duflo, Esther & Glennerster, Rachel & Kremer, Michael, 2008. "Using Randomization in Development Economics Research: A Toolkit," Handbook of Development Economics, in: T. Paul Schultz & John A. Strauss (ed.), Handbook of Development Economics, edition 1, volume 4, chapter 61, pages 3895-3962, Elsevier.
    3. Ashenfelter, Orley & Ashmore, David & Deschenes, Olivier, 2005. "Do unemployment insurance recipients actively seek work? Evidence from randomized trials in four U.S. States," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 53-75.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Luc Behaghel & Bruno Crépon & Marc Gurgand, 2014. "Private and Public Provision of Counseling to Job Seekers: Evidence from a Large Controlled Experiment," American Economic Journal: Applied Economics, American Economic Association, vol. 6(4), pages 142-174, October.
    2. Dmitry Taubinsky & Alex Rees-Jones, 2018. "Attention Variation and Welfare: Theory and Evidence from a Tax Salience Experiment," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 85(4), pages 2462-2496.
    3. Glenn W. Harrison & Morten I. Lau & Hong Il Yoo, 2020. "Risk Attitudes, Sample Selection, and Attrition in a Longitudinal Field Experiment," The Review of Economics and Statistics, MIT Press, vol. 102(3), pages 552-568, July.
    4. Damon Jones & Aprajit Mahajan, 2015. "Time-Inconsistency and Saving: Experimental Evidence from Low-Income Tax Filers," NBER Working Papers 21272, National Bureau of Economic Research, Inc.
    5. Nadia Siddiqui & Vikki Boliver & Stephen Gorard, 2019. "Reliability of Longitudinal Social Surveys of Access to Higher Education: The Case of Next Steps in England," Social Inclusion, Cogitatio Press, vol. 7(1), pages 80-89.
    6. Markus Dertwinkel-Kalt & Katrin Köhler & Mirjam R. J. Lange & Tobias Wenzel, 2017. "Demand Shifts Due to Salience Effects: Experimental Evidence," Journal of the European Economic Association, European Economic Association, vol. 15(3), pages 626-653.
    7. Gortazar, Lucas & Hupkau, Claudia & Roldán-Monés, Antonio, 2024. "Online tutoring works: Experimental evidence from a program with vulnerable children," Journal of Public Economics, Elsevier, vol. 232(C).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Bart COCKX & Muriel DEJEMEPPE & Andrey LAUNOV & Bruno VAN DER LINDEN, 2011. "Monitoring, Sanctions and Front-Loading of Job Search in a Non-Stationary Model," LIDAM Discussion Papers IRES 2011042, Université catholique de Louvain, Institut de Recherches Economiques et Sociales (IRES).
    2. Morescalchi Andrea & Paruolo Paolo, 2020. "Too Much Stick for the Carrot? Job Search Requirements and Search Behaviour of Unemployment Benefit Claimants," The B.E. Journal of Economic Analysis & Policy, De Gruyter, vol. 20(1), pages 1-21, January.
    3. Cockx, Bart & Dejemeppe, Muriel, 2010. "The Threat of Monitoring Job Search: A Discontinuity Design," IZA Discussion Papers 5337, Institute of Labor Economics (IZA).
    4. Pathric Hägglund, 2014. "Experimental Evidence From Active Placement Efforts Among Unemployed in Sweden," Evaluation Review, vol. 38(3), pages 191-216, June.
    5. Petrongolo, Barbara, 2009. "The long-term effects of job search requirements: Evidence from the UK JSA reform," Journal of Public Economics, Elsevier, vol. 93(11-12), pages 1234-1253, December.
    6. Patrick Arni & Rafael Lalive & Jan C. Van Ours, 2013. "How Effective Are Unemployment Benefit Sanctions? Looking Beyond Unemployment Exit," Journal of Applied Econometrics, John Wiley & Sons, Ltd., vol. 28(7), pages 1153-1178, November.
    7. Jan Boone & Jan Ours, 2012. "Why is There a Spike in the Job Finding Rate at Benefit Exhaustion?," De Economist, Springer, vol. 160(4), pages 413-438, December.
    8. Cederlöf, Jonas, 2020. "Extended unemployment benefits and the hazard to employment," Working Paper Series 2020:25, IFAU - Institute for Evaluation of Labour Market and Education Policy.
    9. Eduardo Fernández-Arias & Ricardo Hausmann & Ugo Panizza, 2020. "Smart Development Banks," Journal of Industry, Competition and Trade, Springer, vol. 20(2), pages 395-420, June.
    10. Ashish Arora & Michelle Gittelman & Sarah Kaplan & John Lynch & Will Mitchell & Nicolaj Siggelkow & Aaron K. Chatterji & Michael Findley & Nathan M. Jensen & Stephan Meier & Daniel Nielson, 2016. "Field experiments in strategy research," Strategic Management Journal, Wiley Blackwell, vol. 37(1), pages 116-132, January.
    11. Patrick Guillaumont, 2011. "Aid effectiveness for poverty reduction: macroeconomic overview and emerging issues," CERDI Working papers halshs-00554285, HAL.
    12. Carattini, Stefano & Gillingham, Kenneth & Meng, Xiangyu & Yoeli, Erez, 2024. "Peer-to-peer solar and social rewards: Evidence from a field experiment," Journal of Economic Behavior & Organization, Elsevier, vol. 219(C), pages 340-370.
    13. María Laura Alzúa & Guillermo Cruces & Carolina Lopez, 2016. "Long-Run Effects Of Youth Training Programs: Experimental Evidence From Argentina," Economic Inquiry, Western Economic Association International, vol. 54(4), pages 1839-1859, October.
    14. Inna Petrunyk & Christian Pfeifer, 2022. "Diverse effects of shorter potential unemployment benefit duration on labor market outcomes in Germany," LABOUR, CEIS, vol. 36(3), pages 367-388, September.
    15. Fevang, Elisabeth & Hardoy, Inés & Røed, Knut, 2013. "Getting Disabled Workers Back to Work: How Important Are Economic Incentives?," IZA Discussion Papers 7137, Institute of Labor Economics (IZA).
    16. Antonia Grohmann & Lukas Menkhoff & Helke Seitz, 2022. "The Effect of Personalized Feedback on Small Enterprises’ Finances in Uganda," Economic Development and Cultural Change, University of Chicago Press, vol. 70(3), pages 1197-1227.
    17. Pedro Carneiro & Sokbae Lee & Daniel Wilhelm, 2020. "Optimal data collection for randomized control trials," The Econometrics Journal, Royal Economic Society, vol. 23(1), pages 1-31.
    18. Cairo, Sofie & Mahlstedt, Robert, 2021. "Transparency of the Welfare System and Labor Market Outcomes of Unemployed Workers," IZA Discussion Papers 14940, Institute of Labor Economics (IZA).
    19. Peter Z. Schochet & Ronald D'Amico & Jillian Berk & Sarah Dolfin & Nathan Wozny, "undated". "Estimated Impacts for Participants in the Trade Adjustment Assistance (TAA) Program Under the 2002 Amendments," Mathematica Policy Research Reports 582d8723f6884d4eb7a3f95a4, Mathematica Policy Research.
    20. Stanley, T. D. & Doucouliagos, Chris, 2019. "Practical Significance, Meta-Analysis and the Credibility of Economics," IZA Discussion Papers 12458, Institute of Labor Economics (IZA).

    More about this item

    JEL classification:

    • C31 - Mathematical and Quantitative Methods - - Multiple or Simultaneous Equation Models; Multiple Variables - - - Cross-Sectional Models; Spatial Models; Treatment Effect Models; Quantile Regressions; Social Interaction Models
    • C93 - Mathematical and Quantitative Methods - - Design of Experiments - - - Field Experiments
    • J68 - Labor and Demographic Economics - - Mobility, Unemployment, Vacancies, and Immigrant Workers - - - Public Policy

    Statistics

    Access and download statistics

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:crs:wpaper:2009-05. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Secretariat General (email available below). General contact details of provider: https://edirc.repec.org/data/crestfr.html.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.