
Using Experiments to Evaluate Performance Standards: What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?

Author

Listed:
  • John V. Pepper

Abstract

This paper examines how experimental demonstrations can be used to inform planners about the efficacy of social programs in light of a performance standard. The problem is illustrated by considering the situation faced by state governments attempting to design programs to meet the new federal welfare-to-work standards. Data from experimental evaluations alone allow only limited inferences about the labor market outcomes of welfare recipients. Combined with prior information on the selection process, however, these data are informative, suggesting either that the long-run federal requirements cannot be met or that these standards will only be met under special circumstances.
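
To illustrate the kind of inference described above, the sketch below computes Manski-style worst-case bounds on an employment rate when some experimental outcomes are unobserved and compares them with a performance threshold. It is not code from the paper; the counts and the 50 percent standard are purely hypothetical.

    # Minimal sketch (not from the paper): worst-case bounds on the employment
    # rate of program participants when some experimental outcomes are missing,
    # compared against a hypothetical performance standard. All counts below
    # are illustrative assumptions, not data from the article.

    def worst_case_bounds(n_employed, n_not_employed, n_missing):
        """Bound P(employed) when outcomes are unobserved for some participants."""
        n = n_employed + n_not_employed + n_missing
        lower = n_employed / n                # missing cases all not employed
        upper = (n_employed + n_missing) / n  # missing cases all employed
        return lower, upper

    if __name__ == "__main__":
        # Hypothetical treatment-group counts from a welfare-to-work demonstration.
        lower, upper = worst_case_bounds(n_employed=300, n_not_employed=400, n_missing=300)
        standard = 0.50  # hypothetical federal work-participation standard
        print(f"Employment rate bounds: [{lower:.2f}, {upper:.2f}]")
        if lower >= standard:
            print("Standard is met under any resolution of the missing data.")
        elif upper < standard:
            print("Standard cannot be met even under the most optimistic assumption.")
        else:
            print("The experimental data alone cannot determine whether the standard is met.")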

Suggested Citation

  • John V. Pepper, 2003. "Using Experiments to Evaluate Performance Standards: What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Journal of Human Resources, University of Wisconsin Press, vol. 38(4), pages 860-880.
  • Handle: RePEc:uwp:jhriss:v:38:y:2003:i:4:p860-880

    Download full text from publisher

    File URL: http://jhr.uwpress.org/cgi/reprint/XXXVIII/4/860
    Download Restriction: A subscription is required to access PDF files. Pay per article is available.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Charles F. Manski & John Newman & John V. Pepper, 2002. "Using Performance Standards to Evaluate Social Programs with Incomplete Outcome Data," Evaluation Review, vol. 26(4), pages 355-381, August.
    2. James J. Heckman & Jeffrey A. Smith, 1995. "Assessing the Case for Social Experiments," Journal of Economic Perspectives, American Economic Association, vol. 9(2), pages 85-110, Spring.
    3. John V. Pepper, 2002. "To Train or Not To Train: Optimal Treatment Assignment Rules Using Welfare-to-Work Experiments," Virginia Economics Online Papers 356, University of Virginia, Department of Economics.
    4. Michael Wiseman, 1991. "Research and policy: A symposium on the family support act of 1988," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 10(4), pages 588-589.
    5. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2000. "The Long-Term Gains from GAIN: A Re-Analysis of the Impacts of the California GAIN Program," NBER Working Papers 8007, National Bureau of Economic Research, Inc.
    6. Hausman, Jerry A. & Wise, David A. (ed.), 1985. "Social Experimentation," National Bureau of Economic Research Books, University of Chicago Press, number 9780226319407, September.
    7. Jerry A. Hausman & David A. Wise, 1985. "Social Experimentation," NBER Books, National Bureau of Economic Research, Inc, number haus85-1.
    8. Mark C. Berger & Dan Black & Jeffrey Smith, 2000. "Evaluating Profiling as a Means of Allocating Government Services," University of Western Ontario, Departmental Research Report Series 200018, University of Western Ontario, Department of Economics.
    9. Charles F. Manski, 1996. "Learning about Treatment Effects from Experiments with Random Assignment of Treatments," Journal of Human Resources, University of Wisconsin Press, vol. 31(4), pages 709-733.
    10. V. Joseph Hotz & Guido W. Imbens & Julie H. Mortimer, 1999. "Predicting the Efficacy of Future Training Programs Using Past Experiences," NBER Technical Working Papers 0238, National Bureau of Economic Research, Inc.
    11. Jerry A. Hausman & David A. Wise, 1985. "Introduction to "Social Experimentation"," NBER Chapters, in: Social Experimentation, pages 1-10, National Bureau of Economic Research, Inc.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Oscar Mitnik, 2008. "How do Training Programs Assign Participants to Training? Characterizing the Assignment Rules of Government Agencies for Welfare-to-Work Programs in California," Working Papers 0907, University of Miami, Department of Economics.
    2. Guido W. Imbens, 2003. "Sensitivity to Exogeneity Assumptions in Program Evaluation," American Economic Review, American Economic Association, vol. 93(2), pages 126-132, May.
    3. Robert Lemke & Claus Hoerandner & Robert McMahon, 2006. "Student Assessments, Non-test-takers, and School Accountability," Education Economics, Taylor & Francis Journals, vol. 14(2), pages 235-250.
    4. Craig Gundersen & Brent Kreider & John Pepper & Valerie Tarasuk, 2017. "Food assistance programs and food insecurity: implications for Canada in light of the mixing problem," Empirical Economics, Springer, vol. 52(3), pages 1065-1087, May.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dehejia, Rajeev H., 2005. "Program evaluation as a decision problem," Journal of Econometrics, Elsevier, vol. 125(1-2), pages 141-173.
    2. Abhijit Banerjee & Rukmini Banerji & James Berry & Esther Duflo & Harini Kannan & Shobhini Mukerji & Marc Shotland & Michael Walton, 2017. "From Proof of Concept to Scalable Policies: Challenges and Solutions, with an Application," Journal of Economic Perspectives, American Economic Association, vol. 31(4), pages 73-102, Fall.
    3. Robert Moffitt, 2002. "The role of randomized field trials in social science research: a perspective from evaluations of reforms of social welfare programs," CeMMAP working papers 23/02, Institute for Fiscal Studies.
    4. Mekonnen, Tigist, 2017. "Financing rural households and its impact: Evidence from randomized field experiment data," MERIT Working Papers 2017-009, United Nations University - Maastricht Economic and Social Research Institute on Innovation and Technology (MERIT).
    5. Richard Blundell & Monica Costa Dias, 2009. "Alternative Approaches to Evaluation in Empirical Microeconomics," Journal of Human Resources, University of Wisconsin Press, vol. 44(3).
    6. Eduard Marinov, 2019. "The 2019 Nobel Prize in Economics," Economic Thought journal, Bulgarian Academy of Sciences - Economic Research Institute, issue 6, pages 78-116.
    7. Gustavo Canavire-Bacarreza & Luis Castro Peñarrieta & Darwin Ugarte Ontiveros, 2021. "Outliers in Semi-Parametric Estimation of Treatment Effects," Econometrics, MDPI, vol. 9(2), pages 1-32, April.
    8. Levitt, Steven D. & List, John A., 2009. "Field experiments in economics: The past, the present, and the future," European Economic Review, Elsevier, vol. 53(1), pages 1-18, January.
    9. Guido W. Imbens, 2010. "Better LATE Than Nothing: Some Comments on Deaton (2009) and Heckman and Urzua (2009)," Journal of Economic Literature, American Economic Association, vol. 48(2), pages 399-423, June.
    10. Committee, Nobel Prize, 2019. "Understanding development and poverty alleviation," Nobel Prize in Economics documents 2019-2, Nobel Prize Committee.
    11. Meyer, Bruce D, 1996. "What Have We Learned from the Illinois Reemployment Bonus Experiment?," Journal of Labor Economics, University of Chicago Press, vol. 14(1), pages 26-51, January.
    12. Manski, Charles F., 2013. "Public Policy in an Uncertain World: Analysis and Decisions," Economics Books, Harvard University Press, number 9780674066892, Spring.
    13. Charles F. Manski, 2015. "Randomizing Regulatory Approval for Adaptive Diversification and Deterrence," The Journal of Legal Studies, University of Chicago Press, vol. 44(S2), pages 367-385.
    14. Sianesi, Barbara, 2017. "Evidence of randomisation bias in a large-scale social experiment: The case of ERA," Journal of Econometrics, Elsevier, vol. 198(1), pages 41-64.
    15. Guido W. Imbens & Jeffrey M. Wooldridge, 2009. "Recent Developments in the Econometrics of Program Evaluation," Journal of Economic Literature, American Economic Association, vol. 47(1), pages 5-86, March.
    16. John V. Pepper, 1999. "What Do Welfare-to-Work Demonstrations Reveal to Welfare Reformers?," Virginia Economics Online Papers 317, University of Virginia, Department of Economics.
    17. Peter R. Mueser & Kenneth R. Troske & Alexey Gorislavsky, 2007. "Using State Administrative Data to Measure Program Performance," The Review of Economics and Statistics, MIT Press, vol. 89(4), pages 761-783, November.
    18. Djebbari, Habiba & Smith, Jeffrey, 2008. "Heterogeneous impacts in PROGRESA," Journal of Econometrics, Elsevier, vol. 145(1-2), pages 64-80, July.
    19. Robert A. Pollak, 1998. "Notes on How Economists Think...," JCPR Working Papers 35, Northwestern University/University of Chicago Joint Center for Poverty Research.
    20. V. Joseph Hotz & Guido W. Imbens & Jacob A. Klerman, 2006. "Evaluating the Differential Effects of Alternative Welfare-to-Work Training Components: A Reanalysis of the California GAIN Program," Journal of Labor Economics, University of Chicago Press, vol. 24(3), pages 521-566, July.


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:uwp:jhriss:v:38:y:2003:i:4:p860-880. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact the person in charge (email available below). General contact details of provider: http://jhr.uwpress.org/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.