Printed from https://ideas.repec.org/a/spr/jlabre/v42y2021i2d10.1007_s12122-021-09317-8.html

Taking PISA Seriously: How Accurate are Low-Stakes Exams?

Author

Listed:
  • Pelin Akyol

    (Bilkent University)

  • Kala Krishna

    (Penn State University, CES-IFO and NBER)

  • Jinwen Wang

    (Bates White Economic Consulting)

Abstract

PISA is widely seen as the gold standard for evaluating educational outcomes worldwide. Yet, because it is a low-stakes exam, students may not take it seriously, which biases scores downward and distorts country rankings. This paper provides a method to identify and account for non-serious behavior in low-stakes exams by leveraging information in the computer-based assessments of PISA 2015. Our method corrects for non-serious behavior by fully imputing scores for items not taken seriously. We compare the scores and rankings produced by our method with those obtained by giving zero points to skipped items, and with those obtained by treating skipped items at the end of the exam as if they were not administered, which is the procedure followed by PISA. We show that a country can improve its ranking by up to 15 places by encouraging its students to take the exam seriously, and that the PISA approach corrects for only about half of the bias generated by non-seriousness.
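To make the comparison concrete, the sketch below contrasts the three scoring rules discussed in the abstract on a toy response matrix. It is an illustrative reconstruction, not the authors' code: the skip pattern, the trailing-run rule for "not administered" items, and the simple accuracy-based imputation are assumptions standing in for the paper's model-based procedure.

```python
import numpy as np

# Toy data: 1 = correct answer; "skipped" marks items left blank.
rng = np.random.default_rng(0)
n_students, n_items = 5, 10
correct = rng.integers(0, 2, size=(n_students, n_items)).astype(float)
skipped = rng.random((n_students, n_items)) < 0.2

# Flag trailing runs of skipped items at the end of the booklet.
end_run = np.zeros_like(skipped)
for i in range(n_students):
    j = n_items
    while j > 0 and skipped[i, j - 1]:
        j -= 1
    end_run[i, j:] = True

# 1) Zero-scoring: every skipped item counts as wrong.
score_zero = np.where(skipped, 0.0, correct).mean(axis=1)

# 2) PISA-style: trailing skips treated as "not administered" and dropped;
#    other skips count as wrong.
admin = ~end_run
pisa_pts = np.where(skipped, 0.0, correct) * admin
score_pisa = pisa_pts.sum(axis=1) / np.maximum(admin.sum(axis=1), 1)

# 3) Imputation (stand-in for the paper's method): replace every skipped item
#    with the student's accuracy on the items answered seriously.
serious = ~skipped
p_hat = (correct * serious).sum(axis=1) / np.maximum(serious.sum(axis=1), 1)
score_imputed = np.where(skipped, p_hat[:, None], correct).mean(axis=1)

print(np.round(np.c_[score_zero, score_pisa, score_imputed], 2))
```

Under these assumptions, zero-scoring gives the lowest scores, the PISA rule recovers part of the gap by dropping trailing skips, and full imputation recovers the rest, mirroring the ordering described in the abstract.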

Suggested Citation

  • Pelin Akyol & Kala Krishna & Jinwen Wang, 2021. "Taking PISA Seriously: How Accurate are Low-Stakes Exams?," Journal of Labor Research, Springer, vol. 42(2), pages 184-243, June.
  • Handle: RePEc:spr:jlabre:v:42:y:2021:i:2:d:10.1007_s12122-021-09317-8
    DOI: 10.1007/s12122-021-09317-8

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s12122-021-09317-8
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s12122-021-09317-8?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

As access to this document is restricted, you may want to look for a different version below or search for a different version elsewhere.

    Other versions of this item:

    References listed on IDEAS

    1. Eric A. Hanushek & Ludger Wössmann, 2006. "Does Educational Tracking Affect Performance and Inequality? Differences-in-Differences Evidence Across Countries," Economic Journal, Royal Economic Society, vol. 116(510), pages 63-76, March.
    2. Jalava, Nina & Joensen, Juanna Schrøter & Pellas, Elin, 2015. "Grades and rank: Impacts of non-financial incentives on test performance," Journal of Economic Behavior & Organization, Elsevier, vol. 115(C), pages 161-196.
    3. Attali, Yigal & Neeman, Zvika & Schlosser, Analia, 2011. "Rise to the Challenge or Not Give a Damn: Differential Performance in High vs. Low Stakes Tests," Foerder Institute for Economic Research Working Papers 275743, Tel-Aviv University > Foerder Institute for Economic Research.
    4. Hanushek, Eric A. & Link, Susanne & Woessmann, Ludger, 2013. "Does school autonomy make sense everywhere? Panel estimates from PISA," Journal of Development Economics, Elsevier, vol. 104(C), pages 212-232.
    5. Victor Lavy, 2015. "Do Differences in Schools' Instruction Time Explain International Achievement Gaps? Evidence from Developed and Developing Countries," Economic Journal, Royal Economic Society, vol. 125(588), pages 397-424, November.
    6. Lounkaew, Kiatanantha, 2013. "Explaining urban–rural differences in educational achievement in Thailand: Evidence from PISA literacy data," Economics of Education Review, Elsevier, vol. 37(C), pages 213-225.
    7. Uri Gneezy & John A. List & Jeffrey A. Livingston & Xiangdong Qin & Sally Sadoff & Yang Xu, 2019. "Measuring Success in Education: The Role of Effort on the Test Itself," American Economic Review: Insights, American Economic Association, vol. 1(3), pages 291-308, December.
    8. Gema Zamarro & Collin Hitt & Ildefonso Mendez, 2019. "When Students Don’t Care: Reexamining International Differences in Achievement and Student Effort," Journal of Human Capital, University of Chicago Press, vol. 13(4), pages 519-552.
    9. Ghazala Azmat & Caterina Calsamiglia & Nagore Iriberri, 2016. "Gender Differences In Response To Big Stakes," Journal of the European Economic Association, European Economic Association, vol. 14(6), pages 1372-1400, December.
    Full references (including those not matched with items on IDEAS)

    Citations

    Citations are extracted by the CitEc Project; subscribe to its RSS feed for this item.


    Cited by:

    1. Lucas Gortazar, 2019. "¿Favorece el sistema educativo español la igualdad de oportunidades?," Studies on the Spanish Economy eee2019-17, FEDEA.
    2. Brunello, Giorgio & Kiss, David, 2022. "Math scores in high stakes grades," Economics of Education Review, Elsevier, vol. 87(C).
    3. Filmer, Deon & Rogers, Halsey & Angrist, Noam & Sabarwal, Shwetlena, 2020. "Learning-adjusted years of schooling (LAYS): Defining a new macro measure of education," Economics of Education Review, Elsevier, vol. 77(C).
    4. Bobba, Matteo & Frisancho, Veronica & Pariguana, Marco, 2016. "Perceived Ability and School Choices: Experimental Evidence and Scale-up Effects," TSE Working Papers 16-660, Toulouse School of Economics (TSE), revised Jul 2024.
    5. Griselda, Silvia, 2024. "Gender gap in standardized tests: What are we measuring?," Journal of Economic Behavior & Organization, Elsevier, vol. 221(C), pages 191-229.
    6. Hai-Anh Dang & Paul Glewwe & Khoa Vu & Jongwook Lee, 2021. "What Explains Vietnam’s Exceptional Performance in Education Relative to Other Countries? Analysis of the 2012 and 2015 PISA Data," Working Papers 580, ECINEQ, Society for the Study of Economic Inequality.
    7. Hanushek, Eric A. & Kinne, Lavinia & Lergetporer, Philipp & Woessmann, Ludger, 2020. "Culture and Student Achievement: The Intertwined Roles of Patience and Risk-Taking," Rationality and Competition Discussion Paper Series 249, CRC TRR 190 Rationality and Competition.
    8. de Hoyos, Rafael & Estrada, Ricardo & Vargas, María José, 2021. "What do test scores really capture? Evidence from a large-scale student assessment in Mexico," World Development, Elsevier, vol. 146(C).
    9. Sarkar, Dipanwita & Sarkar, Jayanta & Dulleck, Uwe, 2024. "The effects of private and social incentives on students’ test-taking effort," Economic Modelling, Elsevier, vol. 135(C).
    10. De Hoyos Navarro,Rafael E. & Estrada,Ricardo & Vargas Mancera,Maria Jose, 2021. "Do Large-Scale Student Assessments Really Capture Cognitive Skills ?," Policy Research Working Paper Series 9537, The World Bank.
    11. Ana Balsa & Alejandro Cid & Ana Laura Zardo, 2022. "Providing academic opportunities to vulnerable adolescents: a randomised evaluation of privately managed tuition-free middle schools in Uruguay," Journal of Development Effectiveness, Taylor & Francis Journals, vol. 14(4), pages 340-379, October.
    12. Hanson, Gordon & Liu, Chen, 2023. "Immigration and occupational comparative advantage," Journal of International Economics, Elsevier, vol. 145(C).
    13. Dang, Hai-Anh H & Glewwe, Paul & Vu, Khoa & Lee, Jongwook, 2021. "What Explains Vietnam's Exceptional Performance in Education Relative to Other Countries? Analysis of the 2012 and 2015 Pisa Data," IZA Discussion Papers 14315, Institute of Labor Economics (IZA).
    14. Pelin Akyol, 2021. "Comparison of Computer-based and Paper-based Exams: Evidence from PISA," Bogazici Journal, Review of Social, Economic and Administrative Studies, Bogazici University, Department of Economics, vol. 35(2), pages 137-150.
    15. Bau, Natalie & Das, Jishnu & Yi Chang, Andres, 2021. "New evidence on learning trajectories in a low-income setting," International Journal of Educational Development, Elsevier, vol. 84(C).
    16. Ofek-Shanny, Yuval, 2024. "Measurements of performance gaps are sensitive to the level of test stakes: Evidence from PISA and a Field Experiment," Economics of Education Review, Elsevier, vol. 98(C).
    17. Eric A. Hanushek & Lavinia Kinne & Philipp Lergetporer & Ludger Woessmann, 2022. "Patience, Risk-Taking, and Human Capital Investment Across Countries," The Economic Journal, Royal Economic Society, vol. 132(646), pages 2290-2307.
    18. Silvia Griselda, 2020. "Different Questions, Different Gender Gap: Can the Format of Questions Explain the Gender Gap in Mathematics?," 2020 Papers pgr710, Job Market Papers.
    19. Maria Zumbuehl & Stefanie Hof & Stefan C. Wolter, 2020. "Private tutoring and academic achievement in a selective education system," Economics of Education Working Paper Series 0169, University of Zurich, Department of Business Administration (IBW), revised Oct 2022.
    20. Franco, Catalina & Povea, Erika, 2024. "Innocuous Exam Features? The Impact of Answer Placement on High-Stakes Test Performance and College Admissions," Discussion Paper Series in Economics 4/2024, Norwegian School of Economics, Department of Economics.
    21. Giorgio Brunello & Angela Crema & Lorenzo Rocco, 2021. "Some Unpleasant Consequences of Testing at Length," Oxford Bulletin of Economics and Statistics, Department of Economics, University of Oxford, vol. 83(4), pages 1002-1023, August.
    22. Francesca Borgonovi & Alessandro Ferrara & Mario Piacentini, 2020. "From asking to observing. Behavioural measures of socio-emotional and motivational skills in large-scale assessments," DoQSS Working Papers 20-19, Quantitative Social Science - UCL Social Research Institute, University College London.
    23. Robert Rudolf & Dirk Bethmann, 2023. "The Paradox of Wealthy Nations’ Low Adolescent Life Satisfaction," Journal of Happiness Studies, Springer, vol. 24(1), pages 79-105, January.

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one; a minimal sketch of this overlap rule follows the list.
    1. Clare Leaver & Renata Lemos & Daniela Scur, 2019. "Measuring and explaining management in schools: new approaches using public data," CEP Discussion Papers dp1656, Centre for Economic Performance, LSE.
    2. Ludger Woessmann, 2016. "The Importance of School Systems: Evidence from International Differences in Student Achievement," Journal of Economic Perspectives, American Economic Association, vol. 30(3), pages 3-32, Summer.
    3. Uri Gneezy & John A. List & Jeffrey A. Livingston & Xiangdong Qin & Sally Sadoff & Yang Xu, 2019. "Measuring Success in Education: The Role of Effort on the Test Itself," American Economic Review: Insights, American Economic Association, vol. 1(3), pages 291-308, December.
    4. Brunello, Giorgio & Kiss, David, 2022. "Math scores in high stakes grades," Economics of Education Review, Elsevier, vol. 87(C).
    5. José M. Cordero & Víctor Cristóbal & Daniel Santín, 2018. "Causal Inference on Education Policies: A Survey of Empirical Studies Using PISA, TIMSS and PIRLS," Journal of Economic Surveys, Wiley Blackwell, vol. 32(3), pages 878-915, July.
    6. Griselda, Silvia, 2024. "Gender gap in standardized tests: What are we measuring?," Journal of Economic Behavior & Organization, Elsevier, vol. 221(C), pages 191-229.
    7. Daniela Del Boca & Chiara Monfardini & Sarah Grace See, 2022. "Early Childcare Duration and Students' Later Outcomes in Europe," Working Papers 2022-021, Human Capital and Economic Opportunity Working Group.
    8. Calsamiglia, Caterina & Loviglio, Annalisa, 2019. "Grading on a curve: When having good peers is not good," Economics of Education Review, Elsevier, vol. 73(C).
    9. Tim Klausmann, 2021. "Feedback in Homogeneous Ability Groups: A Field Experiment," Working Papers 2114, Gutenberg School of Management and Economics, Johannes Gutenberg-Universität Mainz.
    10. Catherine Haeck & Pierre Lefebvre, 2020. "The Evolution of Cognitive Skills Inequalities by Socioeconomic Status across Canada," Working Papers 20-04, Research Group on Human Capital, University of Quebec in Montreal's School of Management.
    11. Jerrim, John & Lopez-Agudo, Luis Alejandro & Marcenaro-Gutierrez, Oscar D. & Shure, Nikki, 2017. "What happens when econometrics and psychometrics collide? An example using the PISA data," Economics of Education Review, Elsevier, vol. 61(C), pages 51-58.
    12. Burgess, Simon & Metcalfe, Robert & Sadoff, Sally, 2021. "Understanding the response to financial and non-financial incentives in education: Field experimental evidence using high-stakes assessments," Economics of Education Review, Elsevier, vol. 85(C).
    13. Gust, Sarah & Hanushek, Eric A. & Woessmann, Ludger, 2024. "Global universal basic skills: Current deficits and implications for world development," Journal of Development Economics, Elsevier, vol. 166(C).
    14. Maria A. Cattaneo & Chantal Oggenfuss & Stefan C. Wolter, 2017. "The more, the better? The impact of instructional time on student performance," Education Economics, Taylor & Francis Journals, vol. 25(5), pages 433-445, September.
    15. Bach, Maximilian & Fischer, Mira, 2020. "Understanding the response to high-stakes incentives in primary education," ZEW Discussion Papers 20-066, ZEW - Leibniz Centre for European Economic Research.
    16. Xiqian Cai & Yi Lu & Jessica Pan & Songfa Zhong, 2019. "Gender Gap under Pressure: Evidence from China's National College Entrance Examination," The Review of Economics and Statistics, MIT Press, vol. 101(2), pages 249-263, May.
    17. Hanushek, Eric A., 2021. "Addressing cross-national generalizability in educational impact evaluation," International Journal of Educational Development, Elsevier, vol. 80(C).
    18. Jo Blanden & Matthias Doepke & Jan Stuhler, 2022. "Education inequality," CEP Discussion Papers dp1849, Centre for Economic Performance, LSE.
    19. Maria Cotofan & Ron Diris & Trudie Schils, 2019. "The Heterogeneous Effects of Early Track Assignment on Cognitive and Non-cognitive Skills," Tinbergen Institute Discussion Papers 19-038/V, Tinbergen Institute.
    20. Clara-Christina E. Gerstner & Emmanuel S. Tsyawo, 2022. "Policy spillover effects on student achievement: evidence from PISA," Letters in Spatial and Resource Sciences, Springer, vol. 15(3), pages 523-541, December.
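    As a rough illustration of the overlap rule described above (shared references plus shared citing works), here is a hypothetical scoring function; the actual ranking method used by IDEAS is not specified on this page, so treat this purely as a sketch.

    ```python
    # Hypothetical relatedness score: items are "most related" when they cite,
    # and are cited by, many of the same works. This is an illustrative rule,
    # not the actual IDEAS/RePEc algorithm.
    def relatedness(refs_a: set, refs_b: set, cited_by_a: set, cited_by_b: set) -> int:
        """Count shared references plus shared citing works."""
        return len(refs_a & refs_b) + len(cited_by_a & cited_by_b)

    # Example: two items sharing two references and one citing work score 3.
    print(relatedness({"w1", "w2", "w3"}, {"w2", "w3"}, {"c1"}, {"c1", "c2"}))
    ```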

    More about this item

    Keywords

    Low-stakes exams; Computer-based assessments; PISA; Biased rankings; Item response data

    JEL classification:

    • C53 - Mathematical and Quantitative Methods - - Econometric Modeling - - - Forecasting and Prediction Models; Simulation Methods
    • I20 - Health, Education, and Welfare - - Education - - - General
    • I21 - Health, Education, and Welfare - - Education - - - Analysis of Education


    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:jlabre:v:42:y:2021:i:2:d:10.1007_s12122-021-09317-8. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows us to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.