
2020 Rossi Award Lecture: The Evolving Art of Program Evaluation

Author

Listed:
  • Randall S. Brown

Abstract

Evaluation of public programs has undergone many changes over the past four decades since Peter Rossi coined his "Iron Law" of program evaluation: "The expected value of any net impact assessment of any large-scale social program is zero." While that assessment may be somewhat overstated, the essence still holds. The failures far outnumber the successes, and the estimated favorable effects are rarely sizeable. Despite this grim assessment, much can be learned from "failed" experiments, and from ones that are successful in only some sites or subgroups. Advances in study design, statistical models, data, and how inferences are drawn from estimates have substantially improved our analyses and will continue to do so. However, the most valuable learning about "what works" (and why, when, and where) is likely to come from gathering more detailed and comprehensive data on how the intervention was implemented and attempting to link those data to estimated impacts. Researchers need detailed data on the target population served, the content of the intervention, and the process by which it is delivered to participating service providers and individuals. Two examples presented here illustrate how researchers drew useful broader lessons from impact estimates for a set of related programs. Rossi posited three reasons most interventions fail: wrong question, wrong intervention, poor implementation. Speeding the accumulation of wisdom about how social programs can best help vulnerable populations will require that researchers work closely with program funders, developers, operators, and participants to gather and interpret these detailed data about program implementation.

Suggested Citation

  • Randall S. Brown, 2023. "2020 Rossi Award Lecture: The Evolving Art of Program Evaluation," Evaluation Review, vol. 47(2), pages 209-230, April.
  • Handle: RePEc:sae:evarev:v:47:y:2023:i:2:p:209-230
    DOI: 10.1177/0193841X221121241

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X221121241
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X221121241?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item


    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Vellore Arthi & James Fenske, 2018. "Polygamy and child mortality: Historical and modern evidence from Nigeria's Igbo," Review of Economics of the Household, Springer, vol. 16(1), pages 97-141, March.
    2. Isaiah Andrews & Maximilian Kasy, 2019. "Identification of and Correction for Publication Bias," American Economic Review, American Economic Association, vol. 109(8), pages 2766-2794, August.
    3. Burlig, Fiona, 2018. "Improving transparency in observational social science research: A pre-analysis plan approach," Economics Letters, Elsevier, vol. 168(C), pages 56-60.
    4. Maurizio Canavari & Andreas C. Drichoutis & Jayson L. Lusk & Rodolfo M. Nayga, Jr., 2018. "How to run an experimental auction: A review of recent advances," Working Papers 2018-5, Agricultural University of Athens, Department Of Agricultural Economics.
    5. Nick Huntington-Klein & Andreu Arenas & Emily Beam & Marco Bertoni & Jeffrey R. Bloem & Pralhad Burli & Naibin Chen & Paul Grieco & Godwin Ekpe & Todd Pugatch & Martin Saavedra & Yaniv Stopnitzky, 2021. "The influence of hidden researcher decisions in applied microeconomics," Economic Inquiry, Western Economic Association International, vol. 59(3), pages 944-960, July.
    6. Eszter Czibor & David Jimenez-Gomez & John A. List, 2019. "The Dozen Things Experimental Economists Should Do (More of)," Southern Economic Journal, John Wiley & Sons, vol. 86(2), pages 371-432, October.
    7. James Mahoney & Andrew Owen, 2022. "Importing set-theoretic tools into quantitative research: the case of necessary and sufficient conditions," Quality & Quantity: International Journal of Methodology, Springer, vol. 56(4), pages 2001-2022, August.
    8. Cloos, Janis & Greiff, Matthias & Rusch, Hannes, 2019. "Geographical Concentration and Editorial Favoritism within the Field of Laboratory Experimental Economics," Research Memorandum 029, Maastricht University, Graduate School of Business and Economics (GSBE).
    9. George W. Norton, 2020. "Lessons from a Career in Agricultural Development and Research Evaluation," Applied Economic Perspectives and Policy, John Wiley & Sons, vol. 42(2), pages 151-167, June.
    10. Kaiser, Tim & Lusardi, Annamaria & Menkhoff, Lukas & Urban, Carly, 2022. "Financial education affects financial knowledge and downstream behaviors," Journal of Financial Economics, Elsevier, vol. 145(2), pages 255-272.
    11. Fatoumata Nankoto Cissé, 2022. "How impact evaluation methods influence the outcomes of development projects? Evidence from a meta-analysis on decentralized solar nano projects," Documents de travail du Centre d'Economie de la Sorbonne 22008, Université Panthéon-Sorbonne (Paris 1), Centre d'Economie de la Sorbonne.
    12. Jiří Gregor & Aleš Melecký & Martin Melecký, 2021. "Interest Rate Pass-Through: A Meta-Analysis of the Literature," Journal of Economic Surveys, Wiley Blackwell, vol. 35(1), pages 141-191, February.
    13. Jindrich Matousek & Tomas Havranek & Zuzana Irsova, 2022. "Individual discount rates: a meta-analysis of experimental evidence," Experimental Economics, Springer;Economic Science Association, vol. 25(1), pages 318-358, February.
    14. Abel Brodeur & Nikolai Cook & Carina Neisser, 2022. "P-Hacking, Data Type and Data-Sharing Policy," ECONtribute Discussion Papers Series 200, University of Bonn and University of Cologne, Germany.
    15. Anthony Doucouliagos & Hristos Doucouliagos & T. D. Stanley, 2024. "Power and bias in industrial relations research," British Journal of Industrial Relations, London School of Economics, vol. 62(1), pages 3-27, March.
    16. Fatoumata Nankoto Cissé, 2022. "How impact evaluation methods influence the outcomes of development projects? Evidence from a meta-analysis on decentralized solar nano projects," Post-Print halshs-03623394, HAL.
    17. Guillaume Coqueret, 2023. "Forking paths in financial economics," Papers 2401.08606, arXiv.org.
    18. Felix Holzmeister & Magnus Johannesson & Robert Böhm & Anna Dreber & Jürgen Huber & Michael Kirchler, 2023. "Heterogeneity in effect size estimates: Empirical evidence and practical implications," Working Papers 2023-17, Faculty of Economics and Statistics, Universität Innsbruck.
    19. Sarah Tahamont & Zubin Jelveh & Aaron Chalfin & Shi Yan & Benjamin Hansen, 2019. "Administrative Data Linking and Statistical Power Problems in Randomized Experiments," NBER Working Papers 25657, National Bureau of Economic Research, Inc.
    20. Ankel-Peters, Jörg & Fiala, Nathan & Neubauer, Florian, 2023. "Do economists replicate?," Journal of Economic Behavior & Organization, Elsevier, vol. 212(C), pages 219-232.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:47:y:2023:i:2:p:209-230. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.