
Calibrating Items Using an Unfolding Model of Item Response Theory: The Case of the Trait Personality Questionnaire 5 (TPQue5)

Author

Listed:
  • Eirini M. Mitropoulou
  • Leonidas A. Zampetakis
  • Ioannis Tsaousis

Abstract

Unfolding item response theory (IRT) models are important alternatives to dominance IRT models for describing the response processes underlying self-report tests. They are commonly used in personality measurement, since they can reveal differences in how test scores should be interpreted. This paper aims to gain better insight into the structure of trait personality by investigating whether a dominance or an unfolding IRT model better describes the response processes on a personality measure constructed under the dominance response framework. The Graded Response Model (GRM) was used to assess the dominance approach, and the Generalized Graded Unfolding Model (GGUM) to assess the unfolding approach. All analyses were conducted in the freely available R environment. A sample of 1340 Greek adults employed in private and public organizations completed the Trait Personality Questionnaire 5 short form (TPQue5). Findings contradict previous research on trait personality: consistent with the method used to construct the instrument, the TPQue5 items are best described by monotonically increasing item response functions (IRFs). Individuals responding to the TPQue5 become more likely to endorse its items as their trait level increases; this holds for all personality dimensions, although Openness to Experience exhibited a mixed pattern of item responses. Further research directions, implications and limitations are also discussed.
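The abstract does not name the R packages used. As an illustration only, the sketch below shows how the two competing models could be fitted and compared with the mirt package (Chalmers, 2012), which appears in the reference list; the object resp is a hypothetical respondents-by-items matrix of ordered TPQue5 responses for a single personality dimension.

    # Illustrative sketch only (not the authors' script): fitting a dominance (GRM)
    # and an unfolding (GGUM) model to one TPQue5 dimension with the mirt package.
    library(mirt)

    # 'resp' is assumed to be a respondents-by-items matrix of ordered responses
    # for one trait (e.g., the Openness to Experience items).
    grm  <- mirt(resp, model = 1, itemtype = "graded")  # dominance model: monotone IRFs
    ggum <- mirt(resp, model = 1, itemtype = "ggum")    # unfolding model: single-peaked IRFs

    anova(grm, ggum)           # compare relative fit (AIC, BIC, log-likelihood)
    plot(grm, type = "trace")  # inspect category response curves for monotonicity

Under this kind of comparison, lower information criteria for the GRM would be consistent with the paper's conclusion that the TPQue5 items follow monotonically increasing IRFs.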

Suggested Citation

  • Eirini M. Mitropoulou & Leonidas A. Zampetakis & Ioannis Tsaousis, 2024. "Calibrating Items Using an Unfolding Model of Item Response Theory: The Case of the Trait Personality Questionnaire 5 (TPQue5)," Evaluation Review, vol. 48(6), pages 1146-1159, December.
  • Handle: RePEc:sae:evarev:v:48:y:2024:i:6:p:1146-1159
    DOI: 10.1177/0193841X231223374

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/0193841X231223374
    Download Restriction: no

    File URL: https://libkey.io/10.1177/0193841X231223374?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. Paul Jansen & Edward Roskam, 1986. "Latent trait models and dichotomization of graded responses," Psychometrika, Springer;The Psychometric Society, vol. 51(1), pages 69-91, March.
    2. Chalmers, R. Philip, 2012. "mirt: A Multidimensional Item Response Theory Package for the R Environment," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 48(i06).
    Full references (including those not matched with items on IDEAS)

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Daphna Harel & Russell J. Steele, 2018. "An Information Matrix Test for the Collapsing of Categories Under the Partial Credit Model," Journal of Educational and Behavioral Statistics, vol. 43(6), pages 721-750, December.
    2. Luo, Nanyu & Ji, Feng & Han, Yuting & He, Jinbo & Zhang, Xiaoya, 2024. "Fitting item response theory models using deep learning computational frameworks," OSF Preprints tjxab, Center for Open Science.
    3. Mark Wilson & Geofferey Masters, 1993. "The partial credit model and null categories," Psychometrika, Springer;The Psychometric Society, vol. 58(1), pages 87-99, March.
    4. Melissa Gladstone & Gillian Lancaster & Gareth McCray & Vanessa Cavallera & Claudia R. L. Alves & Limbika Maliwichi & Muneera A. Rasheed & Tarun Dua & Magdalena Janus & Patricia Kariger, 2021. "Validation of the Infant and Young Child Development (IYCD) Indicators in Three Countries: Brazil, Malawi and Pakistan," IJERPH, MDPI, vol. 18(11), pages 1-19, June.
    5. Björn Andersson & Tao Xin, 2021. "Estimation of Latent Regression Item Response Theory Models Using a Second-Order Laplace Approximation," Journal of Educational and Behavioral Statistics, vol. 46(2), pages 244-265, April.
    6. Victoria T. Tanaka & George Engelhard & Matthew P. Rabbitt, 2020. "Using a Bifactor Model to Measure Food Insecurity in Households with Children," Journal of Family and Economic Issues, Springer, vol. 41(3), pages 492-504, September.
    7. Klaas Sijtsma & Jules L. Ellis & Denny Borsboom, 2024. "Recognize the Value of the Sum Score, Psychometrics’ Greatest Accomplishment," Psychometrika, Springer;The Psychometric Society, vol. 89(1), pages 84-117, March.
    8. Çetin Toraman & Güneş Korkmaz, 2023. "What is the “Meaning of School” to High School Students? A Scale Development and Implementation Study Based on IRT and CTT," SAGE Open, vol. 13(3), pages 21582440231, September.
    9. Yikun Luo & Qipeng Chen & Jianyong Chen & Peida Zhan, 2024. "Development and validation of two shortened anxiety sensitive index-3 scales based on item response theory," Palgrave Communications, Palgrave Macmillan, vol. 11(1), pages 1-7, December.
    10. Cervantes, Víctor H., 2017. "DFIT: An R Package for Raju's Differential Functioning of Items and Tests Framework," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 76(i05).
    11. Elina Tsigeman & Sebastian Silas & Klaus Frieler & Maxim Likhanov & Rebecca Gelding & Yulia Kovas & Daniel Müllensiefen, 2022. "The Jack and Jill Adaptive Working Memory Task: Construction, Calibration and Validation," PLOS ONE, Public Library of Science, vol. 17(1), pages 1-29, January.
    12. Joshua B. Gilbert & James S. Kim & Luke W. Miratrix, 2023. "Modeling Item-Level Heterogeneous Treatment Effects With the Explanatory Item Response Model: Leveraging Large-Scale Online Assessments to Pinpoint the Impact of Educational Interventions," Journal of Educational and Behavioral Statistics, vol. 48(6), pages 889-913, December.
    13. Bing Li & Cody Ding & Huiying Shi & Fenghui Fan & Liya Guo, 2023. "Assessment of Psychological Zone of Optimal Performance among Professional Athletes: EGA and Item Response Theory Analysis," Sustainability, MDPI, vol. 15(10), pages 1-15, May.
    14. Joakim Wallmark & James O. Ramsay & Juan Li & Marie Wiberg, 2024. "Analyzing Polytomous Test Data: A Comparison Between an Information-Based IRT Model and the Generalized Partial Credit Model," Journal of Educational and Behavioral Statistics, vol. 49(5), pages 753-779, October.
    15. Ick Hoon Jin & Minjeong Jeon, 2019. "A Doubly Latent Space Joint Model for Local Item and Person Dependence in the Analysis of Item Response Data," Psychometrika, Springer;The Psychometric Society, vol. 84(1), pages 236-260, March.
    16. Michela Gnaldi & Silvia Bacci & Thiemo Kunze & Samuel Greiff, 2020. "Students’ Complex Problem Solving Profiles," Psychometrika, Springer;The Psychometric Society, vol. 85(2), pages 469-501, June.
    17. Alessandro Chiarotto & Annette Bishop & Nadine E Foster & Kirsty Duncan & Ebenezer Afolabi & Raymond W Ostelo & Muirne C S Paap, 2018. "Item response theory evaluation of the biomedical scale of the Pain Attitudes and Beliefs Scale," PLOS ONE, Public Library of Science, vol. 13(9), pages 1-17, September.
    18. Alexander Robitzsch, 2021. "A Comprehensive Simulation Study of Estimation Methods for the Rasch Model," Stats, MDPI, vol. 4(4), pages 1-23, October.
    19. Harald Hruschka, 2021. "Comparing unsupervised probabilistic machine learning methods for market basket analysis," Review of Managerial Science, Springer, vol. 15(2), pages 497-527, February.
    20. David Andrich, 2010. "Sufficiency and Conditional Estimation of Person Parameters in the Polytomous Rasch Model," Psychometrika, Springer;The Psychometric Society, vol. 75(2), pages 292-308, June.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.