
Differential Performance on National Exams

Author

Listed:
  • Syed Latifi
  • Okan Bulut
  • Mark Gierl
  • Thomas Christie
  • Shehzad Jeeva

Abstract

The purpose of this study is to evaluate two methodological perspectives on test fairness using the national Secondary School Certificate (SSC) examination. The SSC is a suite of multi-subject national qualification tests at the Grade 10 level in South Asian countries such as Bangladesh, India, and Pakistan. Because it is a high-stakes test, the fairness of SSC tests is a major concern among the public and educational policy planners. This study is a first attempt to investigate the test fairness of the national SSC examination of Pakistan using two independent procedures: differential item functioning (DIF) and differential bundle functioning (DBF). The SSC was evaluated for possible gender bias using multiple-choice tests in three core subjects, namely, English, Mathematics, and Physics. The study was conducted in two phases using the explanatory item response model (EIRM) and the Simultaneous Item Bias Test (SIBTEST). In Phase 1, test items were studied for DIF, and items with severe DIF were flagged in each subject. In Phase 2, item bundles were analyzed for DBF. Three items, one in each subject, were detected with large DIF, and one item bundle was detected with negligible DBF. Taken together, the results demonstrate that there is no major threat to the validity of the interpretation of examinees’ test scores on the SSC examination. The outcomes of this study provide evidence of test fairness, which will enhance test development practices at the national examination authorities.
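To make the Phase 1 approach concrete, the R code below is a minimal sketch of a uniform-DIF check with an explanatory item response model, fit with the glmer function from lme4 in the spirit of the De Boeck et al. (2011) reference listed below. The simulated data set and the variable names (resp, item, gender, person) are illustrative assumptions, not the SSC data or the authors' exact model specification.

```r
# Illustrative sketch only: simulated responses, not the SSC item data.
library(lme4)

# Simulate long-format item response data (one row per person-item pair).
set.seed(1)
n_person <- 200; n_item <- 10
long_data <- expand.grid(person = factor(1:n_person), item = factor(1:n_item))
long_data$gender <- factor(ifelse(as.integer(long_data$person) <= n_person / 2, "F", "M"))
theta <- rnorm(n_person)                      # person abilities
beta  <- seq(-1, 1, length.out = n_item)      # item easiness parameters
eta   <- theta[as.integer(long_data$person)] + beta[as.integer(long_data$item)]
long_data$resp <- rbinom(nrow(long_data), 1, plogis(eta))

# Baseline EIRM: item fixed effects, a gender main effect (overall impact),
# and a random person ability.
m0 <- glmer(resp ~ -1 + item + gender + (1 | person),
            family = binomial, data = long_data)

# Uniform-DIF model: item-by-gender interactions (the first item serves as the
# implicit anchor). Large, significant interaction terms would flag items for DIF.
m1 <- glmer(resp ~ -1 + item + gender + item:gender + (1 | person),
            family = binomial, data = long_data)

anova(m0, m1)   # likelihood-ratio test for DIF overall
summary(m1)     # inspect per-item gender contrasts
```

In this setup, the gender main effect absorbs overall group differences in ability (impact), so the item-by-gender interactions isolate item-level differential functioning; the Phase 2 DBF analysis would instead use SIBTEST and is not sketched here.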

Suggested Citation

  • Syed Latifi & Okan Bulut & Mark Gierl & Thomas Christie & Shehzad Jeeva, 2016. "Differential Performance on National Exams," SAGE Open, vol. 6(2), pages 2158244016653791, June.
  • Handle: RePEc:sae:sagope:v:6:y:2016:i:2:p:2158244016653791
    DOI: 10.1177/2158244016653791

    Download full text from publisher

    File URL: https://journals.sagepub.com/doi/10.1177/2158244016653791
    Download Restriction: no

    File URL: https://libkey.io/10.1177/2158244016653791?utm_source=ideas
    LibKey link: if access is restricted and if your library uses this service, LibKey will redirect you to where you can use your library subscription to access this item

    References listed on IDEAS

    1. De Boeck, Paul & Bakker, Marjan & Zwitser, Robert & Nivard, Michel & Hofman, Abe & Tuerlinckx, Francis & Partchev, Ivailo, 2011. "The Estimation of Item Response Models with the lmer Function from the lme4 Package in R," Journal of Statistical Software, Foundation for Open Access Statistics, vol. 39(i12).

    Most related items

    These are the items that most often cite the same works as this one and are cited by the same works as this one.
    1. Andrés López-Sepulcre & Sebastiano De Bona & Janne K. Valkonen & Kate D.L. Umbers & Johanna Mappes, 2015. "Item Response Trees: a recommended method for analyzing categorical data in behavioral studies," Behavioral Ecology, International Society for Behavioral Ecology, vol. 26(5), pages 1268-1273.
    2. Joshua B. Gilbert & James S. Kim & Luke W. Miratrix, 2023. "Modeling Item-Level Heterogeneous Treatment Effects With the Explanatory Item Response Model: Leveraging Large-Scale Online Assessments to Pinpoint the Impact of Educational Interventions," Journal of Educational and Behavioral Statistics, , vol. 48(6), pages 889-913, December.
    3. Minjeong Jeon & Sophia Rabe-Hesketh, 2012. "Profile-Likelihood Approach for Estimating Generalized Linear Mixed Models With Factor Structures," Journal of Educational and Behavioral Statistics, , vol. 37(4), pages 518-542, August.
    4. repec:jss:jstsof:40:i05 is not listed on IDEAS
    5. Joshua B. Gilbert & Zachary Himmelsbach & James Soland & Mridul Joshi & Benjamin W. Domingue, 2024. "Estimating Heterogeneous Treatment Effects with Item-Level Outcome Data: Insights from Item Response Theory," Papers 2405.00161, arXiv.org, revised Aug 2024.
    6. Boris Forthmann & Philipp Doebler, 2021. "Reliability of researcher capacity estimates and count data dispersion: a comparison of Poisson, negative binomial, and Conway-Maxwell-Poisson models," Scientometrics, Springer;Akadémiai Kiadó, vol. 126(4), pages 3337-3354, April.
    7. Minjeong Jeon & Sophia Rabe-Hesketh, 2016. "An autoregressive growth model for longitudinal item analysis," Psychometrika, Springer;The Psychometric Society, vol. 81(3), pages 830-850, September.
    8. Cheng Gao & Ling Xu & Liliam Montoya & Mary Madera & Joy Hollingsworth & Liang Chen & Elizabeth Purdom & Vasanth Singan & John Vogel & Robert B. Hutmacher & Jeffery A. Dahlberg & Devin Coleman-Derr, et al., 2022. "Co-occurrence networks reveal more complexity than community composition in resistance and resilience of microbial communities," Nature Communications, Nature, vol. 13(1), pages 1-12, December.
    9. Antonio Caronni & Marina Ramella & Pietro Arcuri & Claudia Salatino & Lucia Pigini & Maurizio Saruggia & Chiara Folini & Stefano Scarano & Rosa Maria Converti, 2023. "The Rasch Analysis Shows Poor Construct Validity and Low Reliability of the Quebec User Evaluation of Satisfaction with Assistive Technology 2.0 (QUEST 2.0) Questionnaire," IJERPH, MDPI, vol. 20(2), pages 1-19, January.
    10. Li, Kai & Chen, Pei-Ying & Yan, Erjia, 2019. "Challenges of measuring software impact through citations: An examination of the lme4 R package," Journal of Informetrics, Elsevier, vol. 13(1), pages 449-461.
    11. Niccolò Cao & Antonio Calcagnì, 2022. "Jointly Modeling Rating Responses and Times with Fuzzy Numbers: An Application to Psychometric Data," Mathematics, MDPI, vol. 10(7), pages 1-11, March.
    12. Sun, Katherine Qianwen & Slepian, Michael L., 2020. "The conversations we seek to avoid," Organizational Behavior and Human Decision Processes, Elsevier, vol. 160(C), pages 87-105.
    13. Sun-Joo Cho & Jennifer Gilbert & Amanda Goodwin, 2013. "Explanatory Multidimensional Multilevel Random Item Response Model: An Application to Simultaneous Investigation of Word and Person Contributions to Multidimensional Lexical Representations," Psychometrika, Springer;The Psychometric Society, vol. 78(4), pages 830-855, October.
    14. Ting Wang & Benjamin Graves & Yves Rosseel & Edgar C. Merkle, 2022. "Computation and application of generalized linear mixed model derivatives using lme4," Psychometrika, Springer;The Psychometric Society, vol. 87(3), pages 1173-1193, September.
    15. Dellaert, Benedict G.C. & Arentze, Theo & Horeni, Oliver & Timmermans, Harry J.P., 2017. "Deriving attribute utilities from mental representations of complex decisions," Journal of choice modelling, Elsevier, vol. 22(C), pages 24-38.

    Corrections

    All material on this site has been provided by the respective publishers and authors. You can help correct errors and omissions. When requesting a correction, please mention this item's handle: RePEc:sae:sagope:v:6:y:2016:i:2:p:2158244016653791. See general information about how to correct material in RePEc.

    If you have authored this item and are not yet registered with RePEc, we encourage you to do it here. This allows you to link your profile to this item. It also allows you to accept potential citations to this item that we are uncertain about.

    If CitEc recognized a bibliographic reference but did not link an item in RePEc to it, you can help with this form.

    If you know of missing items citing this one, you can help us create those links by adding the relevant references in the same way as above, for each referring item. If you are a registered author of this item, you may also want to check the "citations" tab in your RePEc Author Service profile, as there may be some citations waiting for confirmation.

    For technical questions regarding this item, or to correct its authors, title, abstract, bibliographic or download information, contact: SAGE Publications (email available below).

    Please note that corrections may take a couple of weeks to filter through the various RePEc services.

    IDEAS is a RePEc service. RePEc uses bibliographic data supplied by the respective publishers.