A Comparison of Aggregation Rules for Selecting Anchor Items in Multigroup DIF Analysis

Authors: Thorben Huelmann, Carolin Strobl, Rudolf Debelak
Contributors: University of Zurich
Year of publication: 2019
Source: Journal of Educational Measurement. 57:185-215
ISSN: 1745-3984, 0022-0655
DOI: 10.1111/jedm.12246
Description: This study addresses how anchoring methods for differential item functioning (DIF) analysis can be used in multigroup scenarios. The direct approach would be to combine anchoring methods developed for two-group scenarios with multigroup DIF-detection methods. Alternatively, multiple two-group tests could be carried out, and their results aggregated to determine the anchor for the final DIF analysis. In this study, the direct approach and three aggregation rules are investigated. All approaches are combined with a variety of anchoring methods, such as the “all-other purified” and “mean p-value threshold” methods, in two simulation studies based on the Rasch model. Our results indicate that the direct approach generally does not lead to more accurate results than the aggregation rules and can even lead to inferior ones. Overall, the min rule shows the best trade-off between a low false alarm rate and a medium to high hit rate. However, it might be too sensitive when the number of groups is large; in this case, the all rule may be a good compromise. We also take a closer look at the anchor selection method “next candidate,” which performed rather poorly, and suggest possible improvements.
Database: OpenAIRE
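The aggregation idea described in the abstract can be illustrated with a minimal sketch. The rule definitions below are plausible readings of the “min” and “all” rules (flag an item as DIF if any, respectively every, pairwise two-group test is significant), not the paper’s exact formulations; the function name, signature, and threshold are illustrative assumptions.

```python
def aggregate(pairwise_pvalues, rule, alpha=0.05):
    """Decide whether one item is flagged as DIF (and thus excluded
    from the anchor) by aggregating its pairwise two-group test results.

    pairwise_pvalues: p-values for this item, one per group pair.
    rule: "min" or "all" (illustrative definitions, assumed here).
    """
    if rule == "min":
        # min rule: flag the item if ANY pairwise test is significant.
        # This is the most sensitive rule, consistent with the abstract's
        # note that it may over-flag when the number of groups is large.
        return min(pairwise_pvalues) < alpha
    if rule == "all":
        # all rule: flag the item only if EVERY pairwise test is significant.
        return all(p < alpha for p in pairwise_pvalues)
    raise ValueError(f"unknown rule: {rule!r}")

# Example: an item significant in one of three pairwise comparisons
pvals = [0.01, 0.30, 0.45]
print(aggregate(pvals, "min"))  # True: flagged, excluded from the anchor
print(aggregate(pvals, "all"))  # False: kept as an anchor candidate
```

With three or more groups the number of pairwise comparisons grows quadratically, which is why a rule requiring only one significant pair (min) becomes increasingly likely to flag items by chance alone.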