
The Impact of Summer Learning Loss on Measures of School Performance

Author

Listed:
  • Andrew McEachin

    (RAND Corporation, Santa Monica, CA 90401)

  • Allison Atteberry

    (School of Education, University of Colorado Boulder, Boulder, CO 80309)

Abstract

State and federal accountability policies are predicated on the ability to estimate valid and reliable measures of school impacts on student learning. The typical spring-to-spring testing window potentially conflates the amount of learning that occurs during the school year with learning that occurs during the summer. We use a unique dataset to explore the potential for students’ summer learning to bias school-level value-added models used in accountability policies and research on school quality. The results of this paper raise important questions about the design of performance-based education policies, as well as schools’ role in the production of students’ achievement.
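The conflation the abstract describes can be illustrated with a small simulation. The sketch below is not the authors' model or data; it is a minimal, hypothetical setup (simulated test scores, made-up parameters, and a simple mean-gain "value-added" measure) showing how a spring-to-spring testing window folds school-level differences in summer learning into the estimated school effect, while a fall-to-spring window does not.

    # Minimal sketch (illustrative assumptions only, not the paper's model):
    # compare school value-added computed over spring-to-spring vs. fall-to-spring
    # testing windows when summer learning differs across schools.
    import numpy as np

    rng = np.random.default_rng(0)
    n_schools, n_students = 50, 200

    # True school-year impact of each school (what accountability aims to measure).
    true_school_effect = rng.normal(0.0, 0.20, n_schools)
    # School-level summer learning change (e.g., losses of varying size),
    # assumed here to be unrelated to the true school-year effect.
    summer_effect = rng.normal(-0.10, 0.15, n_schools)

    vam_spring_to_spring = np.empty(n_schools)
    vam_fall_to_spring = np.empty(n_schools)
    for s in range(n_schools):
        prior_spring = rng.normal(0.0, 1.0, n_students)                     # spring, year t-1
        fall = prior_spring + summer_effect[s] + rng.normal(0.0, 0.10, n_students)
        spring = fall + true_school_effect[s] + rng.normal(0.0, 0.10, n_students)
        vam_spring_to_spring[s] = (spring - prior_spring).mean()            # includes summer
        vam_fall_to_spring[s] = (spring - fall).mean()                      # school year only

    # The spring-to-spring measure absorbs the summer term; the fall-to-spring
    # measure tracks the true school-year effect much more closely.
    print("corr(spring-to-spring VAM, true effect):",
          round(float(np.corrcoef(vam_spring_to_spring, true_school_effect)[0, 1]), 2))
    print("corr(fall-to-spring VAM, true effect):  ",
          round(float(np.corrcoef(vam_fall_to_spring, true_school_effect)[0, 1]), 2))
    print("corr(spring-to-spring VAM, summer term):",
          round(float(np.corrcoef(vam_spring_to_spring, summer_effect)[0, 1]), 2))

Under these assumptions, schools whose students lose more over the summer look weaker on a spring-to-spring measure even when their school-year impact is identical, which is the bias channel the paper investigates.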

Suggested Citation

  • Andrew McEachin & Allison Atteberry, 2017. "The Impact of Summer Learning Loss on Measures of School Performance," Education Finance and Policy, MIT Press, vol. 12(4), pages 468-491, Fall.
  • Handle: RePEc:tpr:edfpol:v:12:y:2017:i:4:p:468-491

    Download full text from publisher

    File URL: http://www.mitpressjournals.org/doi/pdf/10.1162/EDFP_a_00213
    Download Restriction: Access to PDF is restricted to subscribers.
    ---><---

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Erik Hanushek & Stephen Machin & Ludger Woessmann (ed.), 2011. "Handbook of the Economics of Education," Handbook of the Economics of Education, Elsevier, edition 1, volume 4, number 4, June.
    2. Thomas J. Kane & Douglas O. Staiger, 2008. "Estimating Teacher Impacts on Student Achievement: An Experimental Evaluation," NBER Working Papers 14607, National Bureau of Economic Research, Inc.
    3. Raj Chetty & John N. Friedman & Nathaniel Hilger & Emmanuel Saez & Diane Whitmore Schanzenbach & Danny Yagan, 2011. "How Does Your Kindergarten Classroom Affect Your Earnings? Evidence from Project Star," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 126(4), pages 1593-1660.
    4. Canice Prendergast, 1999. "The Provision of Incentives in Firms," Journal of Economic Literature, American Economic Association, vol. 37(1), pages 7-63, March.
    5. Daniel F. McCaffrey & Tim R. Sass & J. R. Lockwood & Kata Mihaly, 2009. "The Intertemporal Variability of Teacher Effect Estimates," Education Finance and Policy, MIT Press, vol. 4(4), pages 572-606, October.
    6. Cassandra M. Guarino & Mark D. Reckase & Jeffrey M. Wooldridge, 2014. "Can Value-Added Measures of Teacher Performance Be Trusted?," Education Finance and Policy, MIT Press, vol. 10(1), pages 117-156, November.
    7. Raj Chetty & John N. Friedman & Jonah E. Rockoff, 2014. "Measuring the Impacts of Teachers I: Evaluating Bias in Teacher Value-Added Estimates," American Economic Review, American Economic Association, vol. 104(9), pages 2593-2632, September.
    8. Joshua D. Angrist & Jörn-Steffen Pischke, 2009. "Mostly Harmless Econometrics: An Empiricist's Companion," Economics Books, Princeton University Press, edition 1, number 8769.
    9. Joshua D. Angrist & Peter D. Hull & Parag A. Pathak & Christopher R. Walters, 2017. "Erratum to “Leveraging Lotteries for School Value-Added: Testing and Estimation”," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 132(4), pages 2061-2062.
    10. Dan Goldhaber & Michael Hansen, 2013. "Is it Just a Bad Class? Assessing the Long-term Stability of Estimated Teacher Performance," Economica, London School of Economics and Political Science, vol. 80(319), pages 589-612, July.
    11. Imberman, Scott A. & Lovenheim, Michael F., 2016. "Does the market value value-added? Evidence from housing prices after a public release of school and teacher value-added," Journal of Urban Economics, Elsevier, vol. 91(C), pages 104-121.
    12. Raj Chetty & John N. Friedman & Jonah E. Rockoff, 2014. "Measuring the Impacts of Teachers II: Teacher Value-Added and Student Outcomes in Adulthood," American Economic Review, American Economic Association, vol. 104(9), pages 2633-2679, September.
    13. David J. Deming, 2014. "Using School Choice Lotteries to Test Measures of School Effectiveness," American Economic Review, American Economic Association, vol. 104(5), pages 406-411, May.
    14. Joshua D. Angrist & Peter D. Hull & Parag A. Pathak & Christopher R. Walters, 2017. "Leveraging Lotteries for School Value-Added: Testing and Estimation," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 132(2), pages 871-919.
    15. Ladd, Helen F. & Walsh, Randall P., 2002. "Implementing value-added measures of school effectiveness: getting the incentives right," Economics of Education Review, Elsevier, vol. 21(1), pages 1-17, February.
    16. Kata Mihaly & Daniel F. McCaffrey & J. R. Lockwood & Tim R. Sass, 2010. "Centering and reference groups for estimates of fixed effects: Modifications to felsdvreg," Stata Journal, StataCorp LP, vol. 10(1), pages 82-103, March.
    17. Figlio, David N. & Kenny, Lawrence W., 2009. "Public sector performance measurement and stakeholder support," Journal of Public Economics, Elsevier, vol. 93(9-10), pages 1069-1077, October.
    18. C. Kirabo Jackson, 2012. "Non-Cognitive Ability, Test Scores, and Teacher Quality: Evidence from 9th Grade Teachers in North Carolina," NBER Working Papers 18624, National Bureau of Economic Research, Inc.
    19. George Baker, 2000. "The Use of Performance Measures in Incentive Contracting," American Economic Review, American Economic Association, vol. 90(2), pages 415-420, May.
    20. Holmstrom, Bengt & Milgrom, Paul, 1991. "Multitask Principal-Agent Analyses: Incentive Contracts, Asset Ownership, and Job Design," The Journal of Law, Economics, and Organization, Oxford University Press, vol. 7(0), pages 24-52, Special Issue.
    21. Will Dobbie & Roland G. Fryer Jr., 2015. "The Medium-Term Impacts of High-Achieving Charter Schools," Journal of Political Economy, University of Chicago Press, vol. 123(5), pages 985-1037.
    22. Petra E. Todd & Kenneth I. Wolpin, 2003. "On The Specification and Estimation of The Production Function for Cognitive Achievement," Economic Journal, Royal Economic Society, vol. 113(485), pages 3-33, February.
    23. Cory Koedel & Mark Ehlert & Eric Parsons & Michael Podgursky, 2012. "Selecting Growth Measures for School and Teacher Evaluations," Working Papers 1210, Department of Economics, University of Missouri.
    24. Sean F. Reardon & Stephen W. Raudenbush, 2009. "Assumptions of Value-Added Models for Estimating School Effects," Education Finance and Policy, MIT Press, vol. 4(4), pages 492-519, October.

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Thompson, Paul N., 2021. "Is four less than five? Effects of four-day school weeks on student achievement in Oregon," Journal of Public Economics, Elsevier, vol. 193(C).
    2. Daniel McNeish & Denis Dumas, 2021. "A seasonal dynamic measurement model for summer learning loss," Journal of the Royal Statistical Society Series A, Royal Statistical Society, vol. 184(2), pages 616-642, April.
    3. Alban Conto, Carolina & Akseer, Spogmai & Dreesen, Thomas & Kamei, Akito & Mizunoya, Suguru & Rigole, Annika, 2021. "Potential effects of COVID-19 school closures on foundational skills and Country responses for mitigating learning loss," International Journal of Educational Development, Elsevier, vol. 87(C).
    4. Ludger Wößmann, 2020. "Folgekosten ausbleibenden Lernens: Was wir über die Corona-bedingten Schulschließungen aus der Forschung lernen können" [The follow-on costs of lost learning: What research can tell us about the coronavirus-related school closures], ifo Schnelldienst, ifo Institute - Leibniz Institute for Economic Research at the University of Munich, vol. 73(06), pages 38-44, June.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Koedel, Cory & Mihaly, Kata & Rockoff, Jonah E., 2015. "Value-added modeling: A review," Economics of Education Review, Elsevier, vol. 47(C), pages 180-195.
    2. M. Caridad Araujo & Pedro Carneiro & Yyannú Cruz-Aguayo & Norbert Schady, 2016. "Teacher Quality and Learning Outcomes in Kindergarten," The Quarterly Journal of Economics, President and Fellows of Harvard College, vol. 131(3), pages 1415-1453.
    3. Gershenson, Seth & Holt, Stephen B. & Papageorge, Nicholas W., 2015. "Who Believes in Me? The Effect of Student-Teacher Demographic Match on Teacher Expectations," IZA Discussion Papers 9202, Institute of Labor Economics (IZA).
    4. Seth Gershenson & Diane Whitmore Schanzenbach, 2016. "Linking Teacher Quality, Student Attendance, and Student Achievement," Education Finance and Policy, MIT Press, vol. 11(2), pages 125-149, Spring.
    5. Stacy, Brian & Guarino, Cassandra & Wooldridge, Jeffrey, 2018. "Does the precision and stability of value-added estimates of teacher performance depend on the types of students they serve?," Economics of Education Review, Elsevier, vol. 64(C), pages 50-74.
    6. Naven, Matthew, 2019. "Human-Capital Formation During Childhood and Adolescence: Evidence from School Quality and Postsecondary Success in California," MPRA Paper 97716, University Library of Munich, Germany.
    7. Nirav Mehta, 2014. "Targeting the Wrong Teachers: Estimating Teacher Quality for Use in Accountability Regimes," University of Western Ontario, Centre for Human Capital and Productivity (CHCP) Working Papers 20143, University of Western Ontario, Centre for Human Capital and Productivity (CHCP).
    8. Nirav Mehta, 2019. "Measuring quality for use in incentive schemes: The case of “shrinkage” estimators," Quantitative Economics, Econometric Society, vol. 10(4), pages 1537-1577, November.
    9. Stacy, Brian, 2014. "Ranking Teachers when Teacher Value-Added is Heterogeneous Across Students," EconStor Preprints 104743, ZBW - Leibniz Information Centre for Economics.
    10. Goel, Deepti & Barooah, Bidisha, 2018. "Drivers of Student Performance: Evidence from Higher Secondary Public Schools in Delhi," GLO Discussion Paper Series 231, Global Labor Organization (GLO).
    11. Hinnerich, Björn Tyrefors & Vlachos, Jonas, 2017. "The impact of upper-secondary voucher school attendance on student achievement. Swedish evidence using external and internal evaluations," Labour Economics, Elsevier, vol. 47(C), pages 1-14.
    12. Dhushyanth Raju, 2017. "Public School Teacher Management in Sri Lanka," South Asia Economic Journal, Institute of Policy Studies of Sri Lanka, vol. 18(1), pages 39-63, March.
    13. Eric Parsons & Cory Koedel & Li Tan, 2019. "Accounting for Student Disadvantage in Value-Added Models," Journal of Educational and Behavioral Statistics, vol. 44(2), pages 144-179, April.
    14. David Blazar, 2018. "Validating Teacher Effects on Students’ Attitudes and Behaviors: Evidence from Random Assignment of Teachers to Students," Education Finance and Policy, MIT Press, vol. 13(3), pages 281-309, Summer.
    15. Susanna Loeb & Michael S. Christian & Heather Hough & Robert H. Meyer & Andrew B. Rice & Martin R. West, 2019. "School Differences in Social–Emotional Learning Gains: Findings From the First Large-Scale Panel Survey of Students," Journal of Educational and Behavioral Statistics, vol. 44(5), pages 507-542, October.
    16. Mookerjee, Sulagna & Slichter, David, 2023. "Test scores, schools, and the geography of economic opportunity," Journal of Urban Economics, Elsevier, vol. 137(C).
    17. Jean-William Laliberté, "undated". "Long-term Contextual Effects in Education: Schools and Neighborhoods," Working Papers 2019-01, Department of Economics, University of Calgary.
    18. Aedin Doris & Donal O'Neill & Olive Sweetman, 2019. "Good Schools or Good Students? The Importance of Selectivity for School Rankings," Economics Department Working Paper Series n293-19.pdf, Department of Economics, National University of Ireland - Maynooth.
    19. Backes, Ben & Cowan, James & Goldhaber, Dan & Koedel, Cory & Miller, Luke C. & Xu, Zeyu, 2018. "The common core conundrum: To what extent should we worry that changes to assessments will affect test-based measures of teacher performance?," Economics of Education Review, Elsevier, vol. 62(C), pages 48-65.
    20. Seth Gershenson, 2016. "Performance Standards and Employee Effort: Evidence From Teacher Absences," Journal of Policy Analysis and Management, John Wiley & Sons, Ltd., vol. 35(3), pages 615-638, June.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:tpr:edfpol:v:12:y:2017:i:4:p:468-491. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Kelly McDougall (email available below). General contact details of provider: https://direct.mit.edu/journals.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.