Item response theory modeling for examinee-selected items with rater effect

Journal article


Liu, Chen-Wei, Qiu, Xue-Lan and Wang, Wen-Chung. (2019). Item response theory modeling for examinee-selected items with rater effect. Applied Psychological Measurement. 43(6), pp. 435-448. https://doi.org/10.1177/0146621618798667
Authors: Liu, Chen-Wei, Qiu, Xue-Lan and Wang, Wen-Chung
Abstract

Some large-scale testing programs require examinees to select and answer a fixed number of items from a given set (e.g., select one of three items). These are usually constructed-response items marked by human raters. In this examinee-selected item (ESI) design, some examinees may benefit more than others by choosing easier items to answer, so the missing data induced by the design are missing not at random (MNAR). Although item response theory (IRT) models have recently been developed to account for MNAR data in the ESI design, they do not consider the rater effect, which seriously restricts their utility. This study develops two methods: the first is a new IRT model that accounts for both MNAR data and rater severity simultaneously, and the second adapts conditional maximum likelihood estimation and pairwise estimation methods to the ESI design with a rater effect. A series of simulations compared their performance with that of conventional IRT models that ignore MNAR data or rater severity. The results indicated good parameter recovery for the new model. The conditional maximum likelihood and pairwise estimation methods were applicable when the Rasch model fit the data, whereas the conventional IRT models yielded biased parameter estimates. An empirical example illustrates these new methods.
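The rater-severity component the abstract refers to can be illustrated with a minimal facets-style Rasch sketch, in which a rater term shifts the usual ability-minus-difficulty logit. The notation below is generic (theta, b, c) and illustrative only, not the paper's own parameterization:

```python
import math

def p_correct(theta, b, c):
    """Probability of a correct response under a facets-style Rasch model
    with a rater severity term:
        logit P = theta - b - c
    where theta is examinee ability, b is item difficulty, and c is rater
    severity (a severe rater has c > 0, lowering the probability).
    Symbol names are illustrative, not taken from the paper."""
    return 1.0 / (1.0 + math.exp(-(theta - b - c)))

# The same examinee (theta = 0.0) on the same item (b = 0.0), scored by
# a lenient rater (c = -0.5) versus a severe rater (c = +0.5):
lenient = p_correct(0.0, 0.0, -0.5)
severe = p_correct(0.0, 0.0, 0.5)
```

Ignoring the rater term (fixing c = 0 for all raters) folds this severity difference into the ability and difficulty estimates, which is the kind of bias the abstract attributes to conventional IRT models.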

Keywords: rater severity; examinee-selected items; missing not at random
Year: 2019
Journal: Applied Psychological Measurement
Journal citation: 43(6), pp. 435-448
Publisher: SAGE Publications
ISSN: 0146-6216
Digital Object Identifier (DOI): https://doi.org/10.1177/0146621618798667
PubMed ID: 31452553
Scopus EID: 2-s2.0-85059681803
PubMed Central ID: PMC6696873
Page range: 435-448
Funder: Research Grants Council, Hong Kong SAR (Special Administrative Region)
Publisher's version
License: All rights reserved
File Access Level: Controlled
Output status: Published
Publication dates
Online: 08 Oct 2018
Publication process dates
Deposited: 15 Nov 2023
Grant ID: 18613716
Permalink: https://acuresearchbank.acu.edu.au/item/8zz0w/item-response-theory-modeling-for-examinee-selected-items-with-rater-effect

Restricted files

Publisher's version

  • 17 total views
  • 0 total downloads
  • 0 views this month
  • 0 downloads this month

These values are for the period from 19th October 2020, when this repository was created.

Related outputs

Computerized adaptive testing for ipsative tests with multidimensional pairwise-comparison items : Algorithm development and applications
Qiu, Xue-Lan, de la Torre, Jimmy, Ro, Sage and Wang, Wen-Chung. (2022). Computerized adaptive testing for ipsative tests with multidimensional pairwise-comparison items : Algorithm development and applications. Applied Psychological Measurement. 46(4), pp. 255-272. https://doi.org/10.1177/01466216221084209
An empirical Q-Matrix validation method for the polytomous G-DINA model
de la Torre, Jimmy, Qiu, Xue-Lan and Santos, Kevin Carl. (2022). An empirical Q-Matrix validation method for the polytomous G-DINA model. Psychometrika. 87(2), pp. 693-724. https://doi.org/10.1007/s11336-021-09821-x
Equity in mathematics education in Hong Kong : Evidence from TIMSS 2011 to 2019
Qiu, Xue-Lan and Leung, Frederick K. S.. (2022). Equity in mathematics education in Hong Kong : Evidence from TIMSS 2011 to 2019. Large-scale Assessments in Education. 10(1), p. Article 3. https://doi.org/10.1186/s40536-022-00121-z
A new item response theory model for rater centrality using a hierarchical rater model approach
Qiu, Xue-Lan, Chiu, Ming Ming, Wang, Wen-Chung and Chen, Po-Hsi. (2022). A new item response theory model for rater centrality using a hierarchical rater model approach. Behavior Research Methods. 54(4), pp. 1854-1868. https://doi.org/10.3758/s13428-021-01699-y
Assessment of differential statement functioning in ipsative tests with multidimensional forced-choice items
Qiu, Xue-Lan and Wang, Wen-Chung. (2021). Assessment of differential statement functioning in ipsative tests with multidimensional forced-choice items. Applied Psychological Measurement. 45(2), pp. 79-94. https://doi.org/10.1177/0146621620965739
Student self-assessment : Why do they do it?
Yan, Zi, Brown, Gavin T. L., Lee, John Chi-Kin and Qiu, Xue-Lan. (2020). Student self-assessment : Why do they do it? Educational Psychology. 40(4), pp. 509-532. https://doi.org/10.1080/01443410.2019.1672038
Multilevel modeling of cognitive diagnostic assessment : The multilevel DINA example
Wang, Wen-Chung and Qiu, Xue-Lan. (2019). Multilevel modeling of cognitive diagnostic assessment : The multilevel DINA example. Applied Psychological Measurement. 43(1), pp. 34-50. https://doi.org/10.1177/0146621618765713