
Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England

Author

  • Arnaud Vaganay

Abstract

For pilot or experimental employment programme results to apply beyond their test bed, researchers must select ‘clusters’ (i.e. the job centres delivering the new intervention) that are reasonably representative of the whole territory. More specifically, this requirement must account for conditions that could artificially inflate the effect of a programme, such as the fluidity of the local labour market or the performance of the local job centre. Failure to achieve representativeness results in Cluster Sampling Bias (CSB). This paper makes three contributions to the literature. Theoretically, it approaches the notion of CSB as a human behaviour. It offers a comprehensive theory, whereby researchers with limited resources and conflicting priorities tend to oversample ‘effect-enhancing’ clusters when piloting a new intervention. Methodologically, it advocates for a ‘narrow and deep’ scope, as opposed to the ‘wide and shallow’ scope, which has prevailed so far. The PILOT-2 dataset was developed to test this idea. Empirically, it provides evidence on the prevalence of CSB. In conditions similar to the PILOT-2 case study, investigators (1) do not sample clusters with a view to maximise generalisability; (2) do not oversample ‘effect-enhancing’ clusters; (3) consistently oversample some clusters, including those with higher-than-average client caseloads; and (4) report their sampling decisions in an inconsistent and generally poor manner. In conclusion, although CSB is prevalent, it is still unclear whether it is intentional and meant to mislead stakeholders about the expected effect of the intervention or due to higher-level constraints or other considerations.

Suggested Citation

  • Arnaud Vaganay, 2016. "Cluster Sampling Bias in Government-Sponsored Evaluations: A Correlational Study of Employment and Welfare Pilots in England," PLOS ONE, Public Library of Science, vol. 11(8), pages 1-21, August.
  • Handle: RePEc:plo:pone00:0160652
    DOI: 10.1371/journal.pone.0160652

    Download full text from publisher

    File URL: https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0160652
    Download Restriction: no

    File URL: https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0160652&type=printable
    Download Restriction: no


    References listed on IDEAS

    1. Robert B. Olsen & Larry L. Orr & Stephen H. Bell & Elizabeth A. Stuart, 2013. "External Validity in Policy Evaluations That Choose Sites Purposively," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 32(1), pages 107-121, January.
    2. Michael Woolcock, 2013. "Using Case Studies to Explore the External Validity of ‘Complex’ Development Interventions," CID Working Papers 270, Center for International Development at Harvard University.
    3. Adrian Gheorghe & Tracy E Roberts & Jonathan C Ives & Benjamin R Fletcher & Melanie Calvert, 2013. "Centre Selection for Clinical Trials and the Generalisability of Results: A Mixed Methods Study," PLOS ONE, Public Library of Science, vol. 8(2), pages 1-9, February.
    4. Woolcock, Michael, 2013. "Using Case Studies to Explore the External Validity of 'Complex' Development Interventions," Working Paper Series rwp13-048, Harvard University, John F. Kennedy School of Government.
    5. David Greenberg & Burt S. Barnow, 2014. "Flaws in Evaluations of Social Programs," Evaluation Review, vol. 38(5), pages 359-387, October.
    6. Megan L Head & Luke Holman & Rob Lanfear & Andrew T Kahn & Michael D Jennions, 2015. "The Extent and Consequences of P-Hacking in Science," PLOS Biology, Public Library of Science, vol. 13(3), pages 1-15, March.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Katherine Casey & Rachel Glennerster & Edward Miguel & Maarten Voors, 2023. "Skill Versus Voice in Local Development," The Review of Economics and Statistics, MIT Press, vol. 105(2), pages 311-326, March.
    2. Quentin Stoeffler & Michael Carter & Catherine Guirkinger & Wouter Gelade, 2022. "The Spillover Impact of Index Insurance on Agricultural Investment by Cotton Farmers in Burkina Faso," The World Bank Economic Review, World Bank, vol. 36(1), pages 114-140.
    3. Jörg Peters & Jörg Langbein & Gareth Roberts, 2018. "Generalization in the Tropics – Development Policy, Randomized Controlled Trials, and External Validity," The World Bank Research Observer, World Bank, vol. 33(1), pages 34-64.
    4. Hakiman, Kamran & Sheely, Ryan, 2023. "Unlocking the Potential of Participatory Planning: How Flexible and Adaptive Governance Interventions Can Work in Practice," OSF Preprints kucjs, Center for Open Science.
    5. Gary Milante & Michael Woolcock, 2017. "New Approaches to Identifying State Fragility," Journal of Globalization and Development, De Gruyter, vol. 8(1), pages 1-10, June.
    6. Woolcock, Michael, 2014. "Engaging with Fragile and Conflict-Affected States," Working Paper Series rwp14-038, Harvard University, John F. Kennedy School of Government.
    7. Florence Jany‐Catrice, 2022. "A political economy of social impact measurement," Annals of Public and Cooperative Economics, Wiley Blackwell, vol. 93(2), pages 267-291, June.
    8. Ngoc Thi Minh Tran & Michael P. Cameron & Jacques Poot, 2019. "What are migrants willing to pay for better home country institutions?," Letters in Spatial and Resource Sciences, Springer, vol. 12(3), pages 257-268, December.
    9. Florent Bédécarrats & Isabelle Guérin & François Roubaud, 2019. "All that Glitters is not Gold. The Political Economy of Randomized Evaluations in Development," Development and Change, International Institute of Social Studies, vol. 50(3), pages 735-762, May.
    10. Reed, M.S. & Ferré, M. & Martin-Ortega, J. & Blanche, R. & Lawford-Rolfe, R. & Dallimer, M. & Holden, J., 2021. "Evaluating impact from research: A methodological framework," Research Policy, Elsevier, vol. 50(4).
    11. Florence Jany-Catrice, 2020. "Une économie politique des mesures d’impact social," CIRIEC Working Papers 2014, CIRIEC - Université de Liège.
    12. Ton, Giel & Klerkx, Laurens & de Grip, Karin & Rau, Marie-Luise, 2015. "Innovation grants to smallholder farmers: Revisiting the key assumptions in the impact pathways," Food Policy, Elsevier, vol. 51(C), pages 9-23.
    13. Edwards, David M. & Meagher, Laura R., 2020. "A framework to evaluate the impacts of research on policy and practice: A forestry pilot study," Forest Policy and Economics, Elsevier, vol. 114(C).
    14. Andrews, Matt & Pritchett, Lant & Woolcock, Michael, 2017. "Building State Capability: Evidence, Analysis, Action," OUP Catalogue, Oxford University Press, number 9780198747482.
    15. Mendoza Alcantara, Alejandra & Woolcock, Michael, 2014. "Integrating qualitative methods into investment climate impact evaluations," Policy Research Working Paper Series 7145, The World Bank.
    16. Cameron, Lisa & Olivia, Susan & Shah, Manisha, 2019. "Scaling up sanitation: Evidence from an RCT in Indonesia," Journal of Development Economics, Elsevier, vol. 138(C), pages 1-16.
    17. Michael Woolcock, 2014. "Engaging with Fragile and Conflict-Affected States: An Alternative Approach to Theory, Measurement and Practice," WIDER Working Paper Series wp-2014-097, World Institute for Development Economic Research (UNU-WIDER).
    18. Lucia Mýtna Kureková, 2015. "Policy Puzzles with Roma Employment in Slovakia," Discussion Papers 34, Central European Labour Studies Institute (CELSI).
    19. Heinemann, E. & Van Hemelrijck, A. & Guijt, I., 2017. "IFAD RESEARCH SERIES 16 - Getting the most out of impact evaluation for learning, reporting and influence," IFAD Research Series 280054, International Fund for Agricultural Development (IFAD).
    20. Aki Nagano, 2022. "Value Propositions for Small Fashion Businesses: From Japanese Case Studies," Sustainability, MDPI, vol. 14(6), pages 1-15, March.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:plo:pone00:0160652. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: plosone (email available below). General contact details of provider: https://journals.plos.org/plosone/.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.