Printed from https://ideas.repec.org/a/sae/evarev/v42y2018i5-6p491-514.html

Using Administrative Data to Explore the Effect of Survey Nonresponse in the UK Employment Retention and Advancement Demonstration

Author

Listed:
  • Richard Dorsett
  • Richard Hendra
  • Philip K. Robins

Abstract

Background: Even a well-designed randomized controlled trial (RCT) can produce ambiguous results. This article highlights a case in which full-sample results from a large-scale RCT in the United Kingdom differ from results for the subsample of survey respondents. Objectives: Our objective is to ascertain the source of the discrepancy in inferences across data sources and, in doing so, to highlight important threats to the reliability of causal conclusions derived from even the strongest research designs. Research design: The study analyzes administrative data to shed light on the source of the differences between the estimates. We explore the extent to which heterogeneous treatment impacts and survey nonresponse might explain these differences. We suggest checks that assess the external validity of survey-measured impacts, which in turn provide an opportunity to test the effectiveness of different weighting schemes for removing bias. The subjects were 6,787 individuals who participated in a large-scale social policy experiment. Results: Our results are not definitive but suggest that nonresponse bias is the main source of the inconsistent findings. Conclusions: The results caution against overconfidence in drawing conclusions from RCTs and highlight the need for great care in data collection and analysis. In particular, given the modest size of impacts expected in most RCTs, small discrepancies between data sources can alter the results. Survey data remain an important source of information on outcomes not recorded in administrative data, but linking survey and administrative data is strongly recommended whenever possible.
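The nonresponse-weighting check the abstract describes can be sketched with a small simulation (a hypothetical illustration only; none of the numbers or variable names come from the study): when survey response depends on the outcome, the naive respondent mean is biased relative to the full-sample mean available in administrative data, and inverse-probability weighting by response rates removes that bias.

```python
import random

random.seed(0)

# Hypothetical full experimental sample with a binary administrative
# outcome (e.g., employed = 1). All rates below are made up.
n = 10_000
employed = [1 if random.random() < 0.5 else 0 for _ in range(n)]

# Survey response depends on the outcome: employed people respond
# at 80%, non-employed at 40% -- the source of nonresponse bias.
responded = [1 if random.random() < (0.8 if e else 0.4) else 0
             for e in employed]

# Benchmark: the full-sample mean from administrative records.
full_mean = sum(employed) / n

# Naive estimate: mean outcome among survey respondents only.
resp_outcomes = [e for e, r in zip(employed, responded) if r]
naive_mean = sum(resp_outcomes) / len(resp_outcomes)

# Inverse-probability weights from the (known, in this toy setup)
# response rates; respondents with low response propensity get
# larger weights.
weights = [1 / (0.8 if e else 0.4)
           for e, r in zip(employed, responded) if r]
ipw_mean = (sum(w * e for w, e in zip(weights, resp_outcomes))
            / sum(weights))
```

In practice the response propensities are unknown and must be modeled from covariates observed for respondents and nonrespondents alike, which is where linked administrative data becomes valuable.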

Suggested Citation

  • Richard Dorsett & Richard Hendra & Philip K. Robins, 2018. "Using Administrative Data to Explore the Effect of Survey Nonresponse in the UK Employment Retention and Advancement Demonstration," Evaluation Review, vol. 42(5-6), pages 491-514, October.
  • Handle: RePEc:sae:evarev:v:42:y:2018:i:5-6:p:491-514
    DOI: 10.1177/0193841X16674395

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X16674395
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X16674395?utm_source=ideas

    References listed on IDEAS

    1. John M. Abowd & Martha H. Stinson, 2013. "Estimating Measurement Error in Annual Job Earnings: A Comparison of Survey and Administrative Data," The Review of Economics and Statistics, MIT Press, vol. 95(5), pages 1451-1467, December.
    2. Kornfeld, Robert & Bloom, Howard S, 1999. "Measuring Program Impacts on Earnings and Employment: Do Unemployment Insurance Wage Reports from Employers Agree with Surveys of Individuals?," Journal of Labor Economics, University of Chicago Press, vol. 17(1), pages 168-197, January.
    3. Burt S. Barnow & David Greenberg, 2015. "Do Estimated Impacts on Earnings Depend on the Source of the Data Used to Measure Them? Evidence From Previous Social Experiments," Evaluation Review, vol. 39(2), pages 179-228, April.
    4. Peter Z. Schochet & John Burghardt & Sheena McConnell, 2006. "National Job Corps Study and Longer-Term Follow-Up Study: Impact and Benefit-Cost Findings Using Survey and Summary Earnings Records Data," Mathematica Policy Research Reports 8074f4e4499d4e2ab1de13747, Mathematica Policy Research.
    5. Ori Heffetz & Matthew Rabin, 2013. "Conclusions Regarding Cross-Group Differences in Happiness Depend on Difficulty of Reaching Respondents," American Economic Review, American Economic Association, vol. 103(7), pages 3001-3021, December.
    6. Richard Dorsett & Deborah Smeaton & Stefan Speckesser, 2013. "The Effect of Making a Voluntary Labour Market Programme Compulsory: Evidence from a UK Experiment," Fiscal Studies, Institute for Fiscal Studies, vol. 34, pages 467-489, December.
    7. Abhijit Banerjee & Esther Duflo & Rachel Glennerster & Cynthia Kinnan, 2015. "The Miracle of Microfinance? Evidence from a Randomized Evaluation," American Economic Journal: Applied Economics, American Economic Association, vol. 7(1), pages 22-53, January.
    8. Arie Kapteyn & Jelmer Y. Ypma, 2007. "Measurement Error and Misclassification: A Comparison of Survey and Administrative Data," Journal of Labor Economics, University of Chicago Press, vol. 25(3), pages 513-551.
    9. Esmeralda A. Ramalho & Richard J. Smith, 2013. "Discrete Choice Non-Response," The Review of Economic Studies, Review of Economic Studies Ltd, vol. 80(1), pages 343-364.

    Citations

    Citations are extracted by the CitEc Project.

    Cited by:

    1. Quinn Moore & Irma Perez-Johnson & Robert Santillano, 2018. "Decomposing Differences in Impacts on Survey- and Administrative-Measured Earnings From a Job Training Voucher Experiment," Evaluation Review, vol. 42(5-6), pages 515-549, October.
    2. Reuben Ford & Douwêrê Grékou & Isaac Kwakye & Taylor Shek-wai Hui, 2018. "The Sensitivity of Impact Estimates to Data Sources Used: Analysis From an Access to Postsecondary Education Experiment," Evaluation Review, vol. 42(5-6), pages 575-615, October.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Quinn Moore & Irma Perez-Johnson & Robert Santillano, 2018. "Decomposing Differences in Impacts on Survey- and Administrative-Measured Earnings From a Job Training Voucher Experiment," Evaluation Review, vol. 42(5-6), pages 515-549, October.
    2. Fredrik Andersson & Harry J. Holzer & Julia I. Lane & David Rosenblum & Jeffrey Smith, 2024. "Does Federally Funded Job Training Work? Nonexperimental Estimates of WIA Training Impacts Using Longitudinal Data on Workers and Firms," Journal of Human Resources, University of Wisconsin Press, vol. 59(4), pages 1244-1283.
    3. Burt S. Barnow & David Greenberg, 2015. "Do Estimated Impacts on Earnings Depend on the Source of the Data Used to Measure Them? Evidence From Previous Social Experiments," Evaluation Review, vol. 39(2), pages 179-228, April.
    4. Adrian Chadi, 2019. "Dissatisfied with life or with being interviewed? Happiness and the motivation to participate in a survey," Social Choice and Welfare, Springer;The Society for Social Choice and Welfare, vol. 53(3), pages 519-553, October.
    5. Zachary H. Seeskin, 2016. "Evaluating the Use of Commercial Data to Improve Survey Estimates of Property Taxes," CARRA Working Papers 2016-06, Center for Economic Studies, U.S. Census Bureau.
    6. Michele Lalla & Patrizio Frederic & Daniela Mantovani, 2022. "The inextricable association of measurement errors and tax evasion as examined through a microanalysis of survey data matched with fiscal data: a case study," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 31(5), pages 1375-1401, December.
    7. Michele Lalla & Maddalena Cavicchioli, 2020. "Nonresponse and measurement errors in income: matching individual survey data with administrative tax data," Department of Economics 0170, University of Modena and Reggio E., Faculty of Economics "Marco Biagi".
    8. Dieter Vandelannoote & André Decoster & Toon Vanheukelom & Gerlinde Verbist, 2016. "Evaluating The Quality Of Gross Incomes In SILC: Compare Them With Fiscal Data And Re-calibrate Them Using EUROMOD," International Journal of Microsimulation, International Microsimulation Association, vol. 9(3), pages 5-34.
    9. Jenkins, Stephen P. & Rios-Avila, Fernando, 2021. "Reconciling Reports: Modelling Employment Earnings and Measurement Errors Using Linked Survey and Administrative Data," IZA Discussion Papers 14405, Institute of Labor Economics (IZA).
    10. Bruce D. Meyer & Nikolas Mittag, 2015. "Using Linked Survey and Administrative Data to Better Measure Income: Implications for Poverty, Program Effectiveness and Holes in the Safety Net," Upjohn Working Papers 15-242, W.E. Upjohn Institute for Employment Research.
    11. Stefan Angel & Richard Heuberger & Nadja Lamei, 2018. "Differences Between Household Income from Surveys and Registers and How These Affect the Poverty Headcount: Evidence from the Austrian SILC," Social Indicators Research: An International and Interdisciplinary Journal for Quality-of-Life Measurement, Springer, vol. 138(2), pages 575-603, July.
    12. Jenkins, Stephen P. & Rios-Avila, Fernando, 2020. "Modelling errors in survey and administrative data on employment earnings: Sensitivity to the fraction assumed to have error-free earnings," Economics Letters, Elsevier, vol. 192(C).
    13. Hyslop, Dean R. & Townsend, Wilbur, 2017. "Employment misclassification in survey and administrative reports," Economics Letters, Elsevier, vol. 155(C), pages 19-23.
    14. Whitaker, Stephan D., 2018. "Big Data versus a survey," The Quarterly Review of Economics and Finance, Elsevier, vol. 67(C), pages 285-296.
    15. Ha Trong Nguyen & Huong Thu Le & Luke Connelly & Francis Mitrou, 2023. "Accuracy of self‐reported private health insurance coverage," Health Economics, John Wiley & Sons, Ltd., vol. 32(12), pages 2709-2729, December.
    16. Robert Moffitt & John Abowd & Christopher Bollinger & Michael Carr & Charles Hokayem & Kevin McKinney & Emily Wiemers & Sisi Zhang & James Ziliak, 2022. "Reconciling Trends in U.S. Male Earnings Volatility: Results from Survey and Administrative Data," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 41(1), pages 1-11, December.
    17. Burt S. Barnow & Jeffrey Smith, 2015. "Employment and Training Programs," NBER Chapters, in: Economics of Means-Tested Transfer Programs in the United States, Volume 2, pages 127-234, National Bureau of Economic Research, Inc.
    18. Madeira, Carlos & Margaretic, Paula, 2022. "The impact of financial literacy on the quality of self-reported financial information," Journal of Behavioral and Experimental Finance, Elsevier, vol. 34(C).
    19. Stephen P. Jenkins & Fernando Rios-Avila, 2023. "Finite mixture models for linked survey and administrative data: Estimation and postestimation," Stata Journal, StataCorp LP, vol. 23(1), pages 53-85, March.
    20. Robert Moffitt & Sisi Zhang, 2022. "Estimating Trends in Male Earnings Volatility with the Panel Study of Income Dynamics," Journal of Business & Economic Statistics, Taylor & Francis Journals, vol. 41(1), pages 20-25, December.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:evarev:v:42:y:2018:i:5-6:p:491-514. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact SAGE Publications.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.