Assessment of differential statement functioning in ipsative tests with multidimensional forced-choice items

Journal article


Qiu, Xue-Lan and Wang, Wen-Chung. (2021). Assessment of differential statement functioning in ipsative tests with multidimensional forced-choice items. Applied Psychological Measurement. 45(2), pp. 79-94. https://doi.org/10.1177/0146621620965739
Authors: Qiu, Xue-Lan and Wang, Wen-Chung
Abstract

Ipsative tests with multidimensional forced-choice (MFC) items have been widely used to assess career interest, values, and personality to prevent response biases. Recently, there has been a surge of interest in developing item response theory models for MFC items. In reality, a statement in an MFC item may have different utilities for different groups, which is referred to as differential statement functioning (DSF). However, few studies have investigated methods for detecting DSF, owing to the challenges posed by the features of ipsative tests. In this study, three methods were adapted for DSF assessment in MFC items: equal-mean-utility (EMU), all-other-statement (AOS), and constant-statement (CS). Simulation studies were conducted to evaluate the recovery of parameters and the performance of the proposed methods. Results showed that statement parameters and DSF parameters were well recovered for all three methods when the test did not contain any DSF statement. When the test contained one or more DSF statements, only the CS method yielded accurate estimates. With respect to DSF assessment, both the EMU method using the bootstrap standard error and the AOS method performed appropriately so long as the test did not contain any DSF statement. The CS method performed well in cases where one or more DSF-free statements were chosen as an anchor. The longer the anchor statements, the higher the power of DSF detection.

Keywords: differential item functioning; ipsative tests; multidimensional forced-choice; item response theory
Year: 2021
Journal: Applied Psychological Measurement
Journal citation: 45(2), pp. 79-94
Publisher: SAGE Publications
ISSN: 0146-6216
Digital Object Identifier (DOI): https://doi.org/10.1177/0146621620965739
PubMed ID: 33627915
Scopus EID: 2-s2.0-85093832836
PubMed Central ID: PMC7876635
Page range: 79-94
Funders: The University of Hong Kong; Research Grants Council of the Hong Kong Special Administrative Region, China
File: Publisher's version
License: All rights reserved
File Access Level: Controlled
Output status: Published
Publication dates
Online: 21 Oct 2020
Publication process dates
Deposited: 30 Nov 2023
Grant ID: 845013
Permalink:

https://acuresearchbank.acu.edu.au/item/90033/assessment-of-differential-statement-functioning-in-ipsative-tests-with-multidimensional-forced-choice-items

Restricted files

Publisher's version


Related outputs

Computerized adaptive testing for ipsative tests with multidimensional pairwise-comparison items : Algorithm development and applications
Qiu, Xue-Lan, de la Torre, Jimmy, Ro, Sage and Wang, Wen-Chung. (2022). Computerized adaptive testing for ipsative tests with multidimensional pairwise-comparison items : Algorithm development and applications. Applied Psychological Measurement. 46(4), pp. 255-272. https://doi.org/10.1177/01466216221084209
An empirical Q-Matrix validation method for the polytomous G-DINA model
de la Torre, Jimmy, Qiu, Xue-Lan and Santos, Kevin Carl. (2022). An empirical Q-Matrix validation method for the polytomous G-DINA model. Psychometrika. 87(2), pp. 693-724. https://doi.org/10.1007/s11336-021-09821-x
Equity in mathematics education in Hong Kong : Evidence from TIMSS 2011 to 2019
Qiu, Xue-Lan and Leung, Frederick K. S. (2022). Equity in mathematics education in Hong Kong : Evidence from TIMSS 2011 to 2019. Large-scale Assessments in Education. 10(1), Article 3. https://doi.org/10.1186/s40536-022-00121-z
A new item response theory model for rater centrality using a hierarchical rater model approach
Qiu, Xue-Lan, Chiu, Ming Ming, Wang, Wen-Chung and Chen, Po-Hsi. (2022). A new item response theory model for rater centrality using a hierarchical rater model approach. Behavior Research Methods. 54(4), pp. 1854-1868. https://doi.org/10.3758/s13428-021-01699-y
Student self-assessment : Why do they do it?
Yan, Zi, Brown, Gavin T. L., Lee, John Chi-Kin and Qiu, Xue-Lan. (2020). Student self-assessment : Why do they do it? Educational Psychology. 40(4), pp. 509-532. https://doi.org/10.1080/01443410.2019.1672038
Multilevel modeling of cognitive diagnostic assessment : The multilevel DINA example
Wang, Wen-Chung and Qiu, Xue-Lan. (2019). Multilevel modeling of cognitive diagnostic assessment : The multilevel DINA example. Applied Psychological Measurement. 43(1), pp. 34-50. https://doi.org/10.1177/0146621618765713
Item response theory modeling for examinee-selected items with rater effect
Liu, Chen-Wei, Qiu, Xue-Lan and Wang, Wen-Chung. (2019). Item response theory modeling for examinee-selected items with rater effect. Applied Psychological Measurement. 43(6), pp. 435-448. https://doi.org/10.1177/0146621618798667