
A New Multidimensional Computerized Testing Approach: On-the-Fly Assembled Multistage Adaptive Testing Based on Multidimensional Item Response Theory

Author

Listed:
  • Jingwen Li

    (College of Science, Beijing Forestry University, Beijing 100083, China)

  • Jianan Sun

    (College of Science, Beijing Forestry University, Beijing 100083, China)

  • Mingyu Shao

    (College of Science, Beijing Forestry University, Beijing 100083, China)

  • Yinghui Lai

    (School of Educational Sciences, Hunan Institute of Science and Technology, Yueyang 414006, China)

  • Chen Chen

    (College of Science, Beijing Forestry University, Beijing 100083, China)

Abstract

Unidimensional on-the-fly assembled multistage adaptive testing (OMST) is a flexible testing method that combines the adaptive test assembly of computerized adaptive testing (CAT) with the modular test administration of multistage adaptive testing (MST). Because many latent trait structures in practical applications are inherently multidimensional, extending OMST from the unidimensional to the multidimensional case is necessary. Multidimensional item response theory (MIRT), a branch of mathematical and statistical latent variable modeling, occupies an important position in international testing. Building on MIRT, this study proposes a multidimensional OMST approach (OMST-M), with on-the-fly automated test assembly algorithms based on point estimation and on the confidence ellipsoid, respectively. OMST-M can effectively and flexibly measure multidimensional latent traits through stage-by-stage adaptive testing. Simulation results indicate that, across different latent trait structures, module lengths, and module contents, OMST-M performs well in both ability estimation accuracy and item exposure control. An empirical study shows that OMST-M is comparable to multidimensional MST and CAT in ability estimation accuracy while offering remarkable flexibility in adjusting the length and content of its test stages. In summary, the proposed OMST-M combines relatively high measurement accuracy and efficiency with convenient implementation and practical feasibility.
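To illustrate the kind of assembly criterion the abstract describes, the sketch below shows D-optimal item selection for a multidimensional 2PL (M2PL) model: each candidate item is scored by how much it increases the determinant of the accumulated Fisher information at the provisional ability estimate, which is equivalent to shrinking the volume of the confidence ellipsoid around that estimate. This is a minimal illustration of the general technique, not the authors' exact OMST-M algorithm; the item bank, module length, and all variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical item bank: M2PL items with 2-dimensional
# discrimination vectors a_j and scalar intercepts d_j.
n_items, n_dims = 100, 2
A = rng.uniform(0.5, 2.0, size=(n_items, n_dims))  # discriminations
D = rng.normal(0.0, 1.0, size=n_items)             # intercepts

def item_information(theta, a, d):
    """Fisher information matrix of one M2PL item at ability theta:
    P(1-P) * a a^T, where P is the correct-response probability."""
    p = 1.0 / (1.0 + np.exp(-(a @ theta + d)))
    return p * (1.0 - p) * np.outer(a, a)

def select_d_optimal(theta_hat, info_acc, available):
    """Return the available item that maximizes det(I_acc + I_j),
    i.e. the D-optimality criterion (smallest confidence ellipsoid)."""
    best_j, best_det = None, -np.inf
    for j in available:
        det = np.linalg.det(info_acc + item_information(theta_hat, A[j], D[j]))
        if det > best_det:
            best_j, best_det = j, det
    return best_j

# Assemble one 5-item module on the fly at a provisional ability estimate.
theta_hat = np.zeros(n_dims)
info_acc = 1e-3 * np.eye(n_dims)  # small ridge so the determinant is defined
available = set(range(n_items))
module = []
for _ in range(5):
    j = select_d_optimal(theta_hat, info_acc, available)
    module.append(j)
    available.remove(j)
    info_acc += item_information(theta_hat, A[j], D[j])

print(module)
```

In a full OMST design, the examinee would answer this module, the ability estimate would be updated, and the next module would be assembled the same way at the new estimate; exposure control and content constraints would be layered on top of the bare information criterion.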

Suggested Citation

  • Jingwen Li & Jianan Sun & Mingyu Shao & Yinghui Lai & Chen Chen, 2025. "A New Multidimensional Computerized Testing Approach: On-the-Fly Assembled Multistage Adaptive Testing Based on Multidimensional Item Response Theory," Mathematics, MDPI, vol. 13(4), pages 1-27, February.
  • Handle: RePEc:gam:jmathe:v:13:y:2025:i:4:p:594-:d:1588711

    Download full text from publisher

    File URL: https://www.mdpi.com/2227-7390/13/4/594/pdf
    Download Restriction: no

    File URL: https://www.mdpi.com/2227-7390/13/4/594/
    Download Restriction: no

    References listed on IDEAS

    1. Hua-Hua Chang, 2015. "Psychometrics Behind Computerized Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 80(1), pages 1-20, March.
    2. Hua-Hua Chang & Zhiliang Ying, 2008. "To Weight or Not to Weight? Balancing Influence of Initial Items in Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 73(3), pages 441-450, September.
    3. Joris Mulder & Wim Linden, 2009. "Multidimensional Adaptive Testing with Optimal Design Criteria for Item Selection," Psychometrika, Springer;The Psychometric Society, vol. 74(2), pages 273-296, June.
    4. Bernard Veldkamp & Wim Linden, 2002. "Multidimensional adaptive testing with constraints on test content," Psychometrika, Springer;The Psychometric Society, vol. 67(4), pages 575-588, December.
    5. Chalmers, R. Philip, 2016. "Generating Adaptive and Non-Adaptive Test Interfaces for Multidimensional Item Response Theory Applications," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 71(i05).
    6. Chalmers, R. Philip, 2012. "mirt: A Multidimensional Item Response Theory Package for the R Environment," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 48(i06).
    7. Chengyu Cui & Chun Wang & Gongjun Xu, 2024. "Variational Estimation for Multidimensional Generalized Partial Credit Model," Psychometrika, Springer;The Psychometric Society, vol. 89(3), pages 929-957, September.
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Chun Wang & David J. Weiss & Zhuoran Shang, 2019. "Variable-Length Stopping Rules for Multidimensional Computerized Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 84(3), pages 749-771, September.
    2. Chun Wang & Hua-Hua Chang, 2011. "Item Selection in Multidimensional Computerized Adaptive Testing—Gaining Information from Different Angles," Psychometrika, Springer;The Psychometric Society, vol. 76(3), pages 363-384, July.
    3. Chun Wang, 2015. "On Latent Trait Estimation in Multidimensional Compensatory Item Response Models," Psychometrika, Springer;The Psychometric Society, vol. 80(2), pages 428-449, June.
    4. Chun Wang & Hua-Hua Chang & Keith Boughton, 2011. "Kullback–Leibler Information and Its Applications in Multi-Dimensional Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 76(1), pages 13-39, January.
    5. Hua-Hua Chang, 2015. "Psychometrics Behind Computerized Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 80(1), pages 1-20, March.
    6. Edison M. Choe & Jinming Zhang & Hua-Hua Chang, 2018. "Sequential Detection of Compromised Items Using Response Times in Computerized Adaptive Testing," Psychometrika, Springer;The Psychometric Society, vol. 83(3), pages 650-673, September.
    7. Lihua Yao, 2012. "Multidimensional CAT Item Selection Methods for Domain Scores and Composite Scores: Theory and Applications," Psychometrika, Springer;The Psychometric Society, vol. 77(3), pages 495-523, July.
    8. Sara Fernandes & Guillaume Fond & Xavier Zendjidjian & Pierre Michel & Karine Baumstarck & Christophe Lançon & Ludovic Samalin & Pierre-Michel Llorca & Magali Coldefy & Pascal Auquier & Laurent Boyer, 2022. "Development and Calibration of the PREMIUM Item Bank for Measuring Respect and Dignity for Patients with Severe Mental Illness," Post-Print hal-03649277, HAL.
    9. Mi Jung Lee & Daejin Kim & Sergio Romero & Ickpyo Hong & Nikolay Bliznyuk & Craig Velozo, 2022. "Examining Older Adults’ Home Functioning Using the American Housing Survey," IJERPH, MDPI, vol. 19(8), pages 1-13, April.
    10. Luo, Nanyu & Ji, Feng & Han, Yuting & He, Jinbo & Zhang, Xiaoya, 2024. "Fitting item response theory models using deep learning computational frameworks," OSF Preprints tjxab, Center for Open Science.
    11. Melissa Gladstone & Gillian Lancaster & Gareth McCray & Vanessa Cavallera & Claudia R. L. Alves & Limbika Maliwichi & Muneera A. Rasheed & Tarun Dua & Magdalena Janus & Patricia Kariger, 2021. "Validation of the Infant and Young Child Development (IYCD) Indicators in Three Countries: Brazil, Malawi and Pakistan," IJERPH, MDPI, vol. 18(11), pages 1-19, June.
    12. Björn Andersson & Tao Xin, 2021. "Estimation of Latent Regression Item Response Theory Models Using a Second-Order Laplace Approximation," Journal of Educational and Behavioral Statistics, , vol. 46(2), pages 244-265, April.
    13. Victoria T. Tanaka & George Engelhard & Matthew P. Rabbitt, 2020. "Using a Bifactor Model to Measure Food Insecurity in Households with Children," Journal of Family and Economic Issues, Springer, vol. 41(3), pages 492-504, September.
    14. Klaas Sijtsma & Jules L. Ellis & Denny Borsboom, 2024. "Recognize the Value of the Sum Score, Psychometrics’ Greatest Accomplishment," Psychometrika, Springer;The Psychometric Society, vol. 89(1), pages 84-117, March.
    15. Çetin Toraman & Güneş Korkmaz, 2023. "What is the “Meaning of School” to High School Students? A Scale Development and Implementation Study Based on IRT and CTT," SAGE Open, , vol. 13(3), pages 21582440231, September.
    16. Yikun Luo & Qipeng Chen & Jianyong Chen & Peida Zhan, 2024. "Development and validation of two shortened anxiety sensitive index-3 scales based on item response theory," Palgrave Communications, Palgrave Macmillan, vol. 11(1), pages 1-7, December.
    17. Cervantes, Víctor H., 2017. "DFIT: An R Package for Raju's Differential Functioning of Items and Tests Framework," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 76(i05).
    18. Elina Tsigeman & Sebastian Silas & Klaus Frieler & Maxim Likhanov & Rebecca Gelding & Yulia Kovas & Daniel Müllensiefen, 2022. "The Jack and Jill Adaptive Working Memory Task: Construction, Calibration and Validation," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-29, January.
    19. Joshua B. Gilbert & James S. Kim & Luke W. Miratrix, 2023. "Modeling Item-Level Heterogeneous Treatment Effects With the Explanatory Item Response Model: Leveraging Large-Scale Online Assessments to Pinpoint the Impact of Educational Interventions," Journal of Educational and Behavioral Statistics, , vol. 48(6), pages 889-913, December.
    20. Bing Li & Cody Ding & Huiying Shi & Fenghui Fan & Liya Guo, 2023. "Assessment of Psychological Zone of Optimal Performance among Professional Athletes: EGA and Item Response Theory Analysis," Sustainability, MDPI, vol. 15(10), pages 1-15, May.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:gam:jmathe:v:13:y:2025:i:4:p:594-:d:1588711. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: MDPI Indexing Manager (email available below). General contact details of provider: https://www.mdpi.com.

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.