Printed from https://ideas.repec.org/a/sae/evarev/v37y2013i2p63-108.html

A Multilevel Analysis of the Impacts of Services Provided by the U.K. Employment Retention and Advancement Demonstration

Author

Listed:
  • Richard Dorsett
  • Philip K. Robins

Abstract

Background: The United Kingdom Employment Retention and Advancement (U.K. ERA) demonstration was the largest and most comprehensive social experiment ever conducted in the United Kingdom. It examined the extent to which a combination of postemployment advisory support and financial incentives could help lone parents on welfare find sustained employment with prospects for advancement. ERA was tested experimentally across more than 50 public employment service offices and, within each office, individuals were randomly assigned to either a program (treatment) group eligible for ERA or a control group that was not. Method: This article presents the results of a multilevel nonexperimental analysis that examines the variation in office-level impacts and attempts to identify which services provided in the offices tend to be associated with those impacts. Results: The analysis suggests that impacts were greater in offices that emphasized in-work advancement, support while working, and financial bonuses for sustained employment, and in offices that assigned more caseworkers to ERA participants. Offices that encouraged further education had smaller employment impacts. Conclusion: The analysis yields plausible results identifying the particular implementation features linked to stronger ERA impacts. The methodology also allows identification of the services associated with the employment and welfare receipt of control-group families receiving benefits under the traditional New Deal for Lone Parents program.
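The abstract's two-stage logic can be illustrated with a minimal simulation sketch: estimate an experimental impact within each office, then relate those office-level impact estimates to an office implementation feature. All data, parameter values, and the variable `advancement_emphasis` below are hypothetical illustrations, not the authors' actual model or estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n_offices, n_per_office = 50, 400

# Hypothetical standardized office-level implementation feature,
# e.g. the office's emphasis on in-work advancement.
advancement_emphasis = rng.normal(size=n_offices)

# Assumed true office impacts that rise with the feature (illustrative
# numbers only, chosen to mirror the paper's qualitative finding).
true_impact = 0.05 + 0.03 * advancement_emphasis

office_impacts = np.empty(n_offices)
for j in range(n_offices):
    treat = rng.integers(0, 2, size=n_per_office)   # random assignment within office
    base_rate = rng.normal(0.4, 0.05)               # office-specific employment rate
    employed = rng.random(n_per_office) < base_rate + true_impact[j] * treat
    # Stage 1: experimental treatment-control difference for this office.
    office_impacts[j] = employed[treat == 1].mean() - employed[treat == 0].mean()

# Stage 2: regress estimated office impacts on the implementation feature.
X = np.column_stack([np.ones(n_offices), advancement_emphasis])
coef, *_ = np.linalg.lstsq(X, office_impacts, rcond=None)
print(f"estimated association of advancement emphasis with impact: {coef[1]:.3f}")
```

The paper's actual analysis is a multilevel model estimated jointly rather than this literal two-step procedure, but the sketch conveys why within-office randomization identifies office impacts while the cross-office stage remains nonexperimental.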

Suggested Citation

  • Richard Dorsett & Philip K. Robins, 2013. "A Multilevel Analysis of the Impacts of Services Provided by the U.K. Employment Retention and Advancement Demonstration," Evaluation Review, vol. 37(2), pages 63-108, April.
  • Handle: RePEc:sae:evarev:v:37:y:2013:i:2:p:63-108
    DOI: 10.1177/0193841X13517383

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X13517383
    Download Restriction: no


    References listed on IDEAS

    1. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.
    2. James Heckman & Neil Hohmann & Jeffrey Smith & Michael Khoo, 2000. "Substitution and Dropout Bias in Social Experiments: A Study of an Influential Social Experiment," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 115(2), pages 651-694.
    3. Dehejia, Rajeev H, 2003. "Was There a Riverside Miracle? A Hierarchical Framework for Evaluating Programs with Grouped Data," Journal of Business & Economic Statistics, American Statistical Association, vol. 21(1), pages 1-11, January.
    4. David Greenberg & Philip K. Robins, 2011. "Have Welfare-to-Work Programs Improved over Time in Putting Welfare Recipients to Work?," ILR Review, Cornell University, ILR School, vol. 64(5), pages 910-920, October.

    Citations

    Citations are extracted by the CitEc Project.


    Cited by:

    1. Dorsett, Richard, 2014. "The effect of temporary in-work support on employment retention: Evidence from a field experiment," Labour Economics, Elsevier, vol. 31(C), pages 61-71.
    2. Richard Hendra & James Riccio & Richard Dorsett & Philip Robins, 2015. "Breaking the low pay, no pay cycle: the effects of the UK Employment Retention and Advancement programme," IZA Journal of Labor Policy, Springer; Forschungsinstitut zur Zukunft der Arbeit GmbH (IZA), vol. 4(1), pages 1-32, December.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    2. Carlos A. Flores & Oscar A. Mitnik, 2013. "Comparing Treatments across Labor Markets: An Assessment of Nonexperimental Multiple-Treatment Strategies," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1691-1707, December.
    3. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    4. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    5. Gueorgui Kambourov & Iourii Manovskii & Miana Plesca, 2020. "Occupational mobility and the returns to training," Canadian Journal of Economics/Revue canadienne d'économique, John Wiley & Sons, vol. 53(1), pages 174-211, February.
    6. David Card & Pablo Ibarrarán & Ferdinando Regalia & David Rosas-Shady & Yuri Soares, 2011. "The Labor Market Impacts of Youth Training in the Dominican Republic," Journal of Labor Economics, University of Chicago Press, vol. 29(2), pages 267-300.
    7. Hunt Allcott, 2012. "Site Selection Bias in Program Evaluation," NBER Working Papers 18373, National Bureau of Economic Research, Inc.
    8. Jesse Rothstein & Till von Wachter, 2016. "Social Experiments in the Labor Market," NBER Working Papers 22585, National Bureau of Economic Research, Inc.
    9. Chung Choe & Alfonso Flores-Lagunes & Sang-Jun Lee, 2015. "Do dropouts with longer training exposure benefit from training programs? Korean evidence employing methods for continuous treatments," Empirical Economics, Springer, vol. 48(2), pages 849-881, March.
    10. Judith M. Gueron & Gayle Hamilton, 2023. "Using Multi-Arm Designs to Test Operating Welfare-to-Work Programs," Evaluation Review, vol. 47(1), pages 71-103, February.
    11. Alfonso Flores‐Lagunes & Arturo Gonzalez & Todd Neumann, 2010. "Learning But Not Earning? The Impact Of Job Corps Training On Hispanic Youth," Economic Inquiry, Western Economic Association International, vol. 48(3), pages 651-667, July.
    12. James J. Heckman, 1991. "Randomization and Social Policy Evaluation Revisited," NBER Technical Working Papers 0107, National Bureau of Economic Research, Inc.
    13. Riddell, Chris & Riddell, W. Craig, 2016. "When Can Experimental Evidence Mislead? A Re-Assessment of Canada's Self Sufficiency Project," IZA Discussion Papers 9939, Institute of Labor Economics (IZA).
    14. Alan B. Krueger, 2002. "Inequality, Too Much of a Good Thing," Working Papers 845, Princeton University, Department of Economics, Industrial Relations Section.
    15. John Gibson & Steven Stillman & David McKenzie & Halahingano Rohorua, 2013. "Natural Experiment Evidence On The Effect Of Migration On Blood Pressure And Hypertension," Health Economics, John Wiley & Sons, Ltd., vol. 22(6), pages 655-672, June.
    16. Dhaval Dave & Hope Corman & Nancy Reichman, 2012. "Effects of Welfare Reform on Education Acquisition of Adult Women," Journal of Labor Research, Springer, vol. 33(2), pages 251-282, June.
    17. David McKenzie & Steven Stillman & John Gibson, 2010. "How Important is Selection? Experimental VS. Non‐Experimental Measures of the Income Gains from Migration," Journal of the European Economic Association, European Economic Association, vol. 8(4), pages 913-945, June.
    18. Benjamin Lu & Eli Ben-Michael & Avi Feller & Luke Miratrix, 2023. "Is It Who You Are or Where You Are? Accounting for Compositional Differences in Cross-Site Treatment Effect Variation," Journal of Educational and Behavioral Statistics, vol. 48(4), pages 420-453, August.
    19. Chakravorty, Bhaskar & Arulampalam, Wiji & Bhatiya, Apurav Yash & Imbert, Clément & Rathelot, Roland, 2024. "Can information about jobs improve the effectiveness of vocational training? Experimental evidence from India," Journal of Development Economics, Elsevier, vol. 169(C).
    20. A. Smith, Jeffrey & E. Todd, Petra, 2005. "Does matching overcome LaLonde's critique of nonexperimental estimators?," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 305-353.

