
Jointly exploring mathematics ability and speed in large-scale computer-based testing

Author

Listed:
  • Luca Bungaro

    (University of Bologna)

  • Marta Desimoni

    (National Institute for the Evaluation of the Education and Training System (INVALSI))

  • Mariagiulia Matteucci

    (University of Bologna)

  • Stefania Mignani

    (University of Bologna)

Abstract

In large-scale tests, computer-based testing (CBT) makes it possible to automatically collect data not only on students' response accuracy (RA), based on their item responses, but also on their response times (RTs). RTs can provide a more comprehensive view of a test-taker's performance than correct responses alone. In this paper, a joint approach is considered to improve the estimation of ability scores from the complex data produced by computer-based test administration. The study focuses on analysing data from the Italian grade 10 mathematics national assessment administered by the National Institute for the Evaluation of the Education and Training System (INVALSI). In addition, a bivariate multilevel regression on the speed and ability estimates obtained from the joint model is developed, including individual covariates to evaluate the contribution of individual and contextual variables in predicting test-taking speed and ability. Overall, the main results indicate that mathematics ability and speed are significantly and negatively correlated, and that the hierarchical data structure (students nested within classes) should be taken into account when explaining the dependency of ability and speed on explanatory variables such as prior achievement, test anxiety, sociodemographic covariates, class compositional variables, school tracks and geographical area.
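
To make the modeling approach concrete, the sketch below writes out a joint hierarchical model for accuracy and speed in the spirit of van der Linden (2007), cited in the references: an item response theory (IRT) model for the responses, a lognormal model for the response times, and a bivariate normal distribution linking ability and speed, whose correlation is the quantity reported as negative above. The 2PL and lognormal forms, the covariate vector x, and the class-level random effects are illustrative assumptions, not the paper's exact specification.

    % Sketch of a joint hierarchical model for accuracy and speed
    % (in the spirit of van der Linden, 2007; functional forms are assumptions).
    \[
      \Pr(Y_{pi} = 1 \mid \theta_p)
        = \frac{\exp\{a_i(\theta_p - b_i)\}}{1 + \exp\{a_i(\theta_p - b_i)\}}
      \quad \text{(response accuracy, 2PL)}
    \]
    \[
      \log T_{pi} \sim \mathcal{N}\!\left(\beta_i - \tau_p,\; \alpha_i^{-2}\right)
      \quad \text{(response time, lognormal)}
    \]
    \[
      (\theta_p, \tau_p)^{\top} \sim \mathcal{N}_2(\boldsymbol{\mu}_P, \boldsymbol{\Sigma}_P),
      \qquad \rho = \Sigma_{\theta\tau} / (\sigma_\theta \sigma_\tau)
    \]
    % Second stage: bivariate multilevel regression of the estimated scores
    % for student p in class c on covariates x, with class-level random effects.
    \[
      \hat{\theta}_{pc} = \mathbf{x}_{pc}^{\top}\boldsymbol{\gamma}_{\theta} + u_{c}^{(\theta)} + e_{pc}^{(\theta)},
      \qquad
      \hat{\tau}_{pc} = \mathbf{x}_{pc}^{\top}\boldsymbol{\gamma}_{\tau} + u_{c}^{(\tau)} + e_{pc}^{(\tau)}
    \]

Here \beta_i plays the role of an item's time intensity and \tau_p of a student's speed, so a negative \rho means that higher estimated ability tends to go with lower test-taking speed, i.e. longer response times.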

Suggested Citation

  • Luca Bungaro & Marta Desimoni & Mariagiulia Matteucci & Stefania Mignani, 2024. "Jointly exploring mathematics ability and speed in large-scale computer-based testing," Statistical Methods & Applications, Springer;Società Italiana di Statistica, vol. 33(5), pages 1429-1450, November.
  • Handle: RePEc:spr:stmapp:v:33:y:2024:i:5:d:10.1007_s10260-024-00762-0
    DOI: 10.1007/s10260-024-00762-0

    Download full text from publisher

    File URL: http://link.springer.com/10.1007/s10260-024-00762-0
    File Function: Abstract
    Download Restriction: Access to the full text of the articles in this series is restricted.

    File URL: https://libkey.io/10.1007/s10260-024-00762-0?utm_source=ideas
    LibKey link: if access is restricted and your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item.

    As access to this document is restricted, you may want to search for a different version of it.

    References listed on IDEAS

    1. Wim van der Linden, 2007. "A Hierarchical Framework for Modeling Speed and Accuracy on Test Items," Psychometrika, Springer;The Psychometric Society, vol. 72(3), pages 287-308, September.
    2. Maria Bolsinova & Paul Boeck & Jesper Tijmstra, 2017. "Modelling Conditional Dependence Between Response Time and Accuracy," Psychometrika, Springer;The Psychometric Society, vol. 82(4), pages 1126-1148, December.
    3. Pau Balart & Matthijs Oosterveen, 2019. "Females show more sustained performance during test-taking than males," Nature Communications, Nature, vol. 10(1), pages 1-11, December.
    4. Fox, Jean-Paul & Klein Entink, Rinke & van der Linden, Wim, 2007. "Modeling of Responses and Response Times with the Package cirt," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 20(i07).
    Full references (including those not matched with items on IDEAS)
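
    To illustrate the mechanics behind references 1 and 4 above, and the negative ability-speed correlation reported in the abstract, the following minimal Python sketch simulates data from the joint model written out after the abstract and recovers the sign of the correlation from crude person-level summaries. All parameter values and variable names are illustrative assumptions, not the paper's estimates.

        import numpy as np

        rng = np.random.default_rng(0)

        # --- Illustrative parameters (assumptions, not the paper's estimates) ---
        n_students, n_items = 2000, 30
        rho = -0.35                                  # negative ability-speed correlation
        cov = np.array([[1.0, rho], [rho, 1.0]])

        # Person parameters: ability theta and speed tau, bivariate normal
        theta, tau = rng.multivariate_normal([0.0, 0.0], cov, size=n_students).T

        # Item parameters: 2PL (a, b) for accuracy; lognormal (alpha, beta) for time
        a = rng.uniform(0.8, 2.0, n_items)           # discrimination
        b = rng.normal(0.0, 1.0, n_items)            # difficulty
        alpha = rng.uniform(1.5, 2.5, n_items)       # time discrimination
        beta = rng.normal(4.0, 0.3, n_items)         # time intensity (log seconds)

        # Response accuracy: Y ~ Bernoulli(logit^{-1}(a * (theta - b)))
        eta = a[None, :] * (theta[:, None] - b[None, :])
        y = rng.random((n_students, n_items)) < 1.0 / (1.0 + np.exp(-eta))

        # Response times: log T ~ Normal(beta - tau, 1/alpha^2)
        log_t = rng.normal(beta[None, :] - tau[:, None], 1.0 / alpha[None, :])

        # Crude person-level summaries recover the sign of rho: with rho < 0,
        # students with more correct answers tend to take *longer*, not shorter.
        prop_correct = y.mean(axis=1)                # proxy for ability theta
        mean_log_time = log_t.mean(axis=1)           # roughly mean(beta) - tau
        print(np.corrcoef(prop_correct, -mean_log_time)[0, 1])  # ~ rho < 0

    The printed correlation is attenuated relative to rho because the proportion-correct and mean-log-time summaries are noisy proxies for theta and tau; a full joint (e.g., Bayesian) fit of the hierarchical model would estimate rho directly.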

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Dylan Molenaar & Paul Boeck, 2018. "Response Mixture Modeling: Accounting for Heterogeneity in Item Characteristics across Response Times," Psychometrika, Springer;The Psychometric Society, vol. 83(2), pages 279-297, June.
    2. M. Marsman & H. Sigurdardóttir & M. Bolsinova & G. Maris, 2019. "Characterizing the Manifest Probability Distributions of Three Latent Trait Models for Accuracy and Response Time," Psychometrika, Springer;The Psychometric Society, vol. 84(3), pages 870-891, September.
    3. Inhan Kang & Paul Boeck & Roger Ratcliff, 2022. "Modeling Conditional Dependence of Response Accuracy and Response Time with the Diffusion Item Response Theory Model," Psychometrika, Springer;The Psychometric Society, vol. 87(2), pages 725-748, June.
    4. Shaw, Amy & Elizondo, Fabian & Wadlington, Patrick L., 2020. "Reasoning, fast and slow: How noncognitive factors may alter the ability-speed relationship," Intelligence, Elsevier, vol. 83(C).
    5. Minjeong Jeon & Paul Boeck & Xiangrui Li & Zhong-Lin Lu, 2020. "Trivariate Theory of Mind Data Analysis with a Conditional Joint Modeling Approach," Psychometrika, Springer;The Psychometric Society, vol. 85(2), pages 398-436, June.
    6. Sun-Joo Cho & Sarah Brown-Schmidt & Paul De Boeck & Matthew Naveiras & Si On Yoon & Aaron Benjamin, 2023. "Incorporating Functional Response Time Effects into a Signal Detection Theory Model," Psychometrika, Springer;The Psychometric Society, vol. 88(3), pages 1056-1086, September.
    7. Inhan Kang & Minjeong Jeon & Ivailo Partchev, 2023. "A Latent Space Diffusion Item Response Theory Model to Explore Conditional Dependence between Responses and Response Times," Psychometrika, Springer;The Psychometric Society, vol. 88(3), pages 830-864, September.
    8. Jinxin Guo & Xin Xu & Zhiliang Ying & Susu Zhang, 2022. "Modeling Not-Reached Items in Timed Tests: A Response Time Censoring Approach," Psychometrika, Springer;The Psychometric Society, vol. 87(3), pages 835-867, September.
    9. Minjeong Jeon & Paul Boeck & Jevan Luo & Xiangrui Li & Zhong-Lin Lu, 2021. "Modeling Within-Item Dependencies in Parallel Data on Test Responses and Brain Activation," Psychometrika, Springer;The Psychometric Society, vol. 86(1), pages 239-271, March.
    10. Wim J. van der Linden, 2009. "A Bivariate Lognormal Response-Time Model for the Detection of Collusion Between Test Takers," Journal of Educational and Behavioral Statistics, , vol. 34(3), pages 378-394, September.
    11. Kang, Inhan & De Boeck, Paul & Partchev, Ivailo, 2022. "A randomness perspective on intelligence processes," Intelligence, Elsevier, vol. 91(C).
    12. Inhan Kang & Dylan Molenaar & Roger Ratcliff, 2023. "A Modeling Framework to Examine Psychological Processes Underlying Ordinal Responses and Response Times of Psychometric Data," Psychometrika, Springer;The Psychometric Society, vol. 88(3), pages 940-974, September.
    13. Konrad Klotzke & Jean-Paul Fox, 2019. "Modeling Dependence Structures for Response Times in a Bayesian Framework," Psychometrika, Springer;The Psychometric Society, vol. 84(3), pages 649-672, September.
    14. Junhuan Wei & Liufen Luo & Yan Cai & Dongbo Tu, 2024. "A Multistrategy Cognitive Diagnosis Model Incorporating Item Response Times Based on Strategy Selection Theories," Journal of Educational and Behavioral Statistics, , vol. 49(4), pages 658-686, August.
    15. Maria Bolsinova & Jesper Tijmstra, 2019. "Modeling Differences Between Response Times of Correct and Incorrect Responses," Psychometrika, Springer;The Psychometric Society, vol. 84(4), pages 1018-1046, December.
    16. Hyeon-Ah Kang & Yi Zheng & Hua-Hua Chang, 2020. "Online Calibration of a Joint Model of Item Responses and Response Times in Computerized Adaptive Testing," Journal of Educational and Behavioral Statistics, , vol. 45(2), pages 175-208, April.
    17. Steven Andrew Culpepper & James Joseph Balamuta, 2017. "A Hierarchical Model for Accuracy and Choice on Standardized Tests," Psychometrika, Springer;The Psychometric Society, vol. 82(3), pages 820-845, September.
    18. Théo Durandard & Matteo Camboni, 2024. "Comparative Statics for Optimal Stopping Problems in Nonstationary Environments," Papers 2402.06999, arXiv.org, revised Jul 2024.
    19. Anaya, Lina & Iriberri, Nagore & Rey-Biel, Pedro & Zamarro, Gema, 2022. "Understanding performance in test taking: The role of question difficulty order," Economics of Education Review, Elsevier, vol. 90(C).
    20. Edison M. Choe & Jinming Zhang & Hua-Hua Chang, 2018. "Sequential Detection of Compromised Items Using Response Times in Computerized Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 83(3), pages 650-673, September.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:spr:stmapp:v:33:y:2024:i:5:d:10.1007_s10260-024-00762-0. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to register here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: Sonal Shukla or Springer Nature Abstracting and Indexing (email available below). General contact details of provider: http://www.springer.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.