DIF Analyses in Multilevel Data: Identification and Effects on Ability Estimates


Type

dissertation

Grantor

University of Wisconsin-Milwaukee

Abstract

Fairness is an important issue in educational testing: different groups of examinees with the same ability should have equal probabilities of answering an item correctly. Differential item functioning (DIF) analyses were therefore developed to address the possibility of bias in cognitive or achievement tests. Data in educational testing have a multilevel structure, as students are nested within teachers, who are nested within schools, which may further be nested within districts. Although DIF analyses have been discussed for decades, they have rarely been investigated in multilevel data. In this study, DIF in multilevel data was investigated via a simulation study, with an emphasis on DIF at the teacher level only and at both the student and teacher levels, followed by an examination of the impact of DIF on ability estimation. Multilevel Rasch models were used to detect DIF at different locations in both exploratory and confirmatory manners. Type I error rates were all acceptable at the 0.05 level, and power was larger for confirmatory analyses. The magnitude of DIF at both levels and the proportion of manifest groups at both levels were the two most influential factors on the power to detect DIF. However, none of the factors examined had a notable impact on ability estimates. The interpretation of the results, possible explanations, limitations, and directions for further study are discussed.
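To illustrate the kind of DIF the abstract describes, the following is a minimal single-level sketch, not the dissertation's multilevel Rasch analysis: it simulates Rasch responses with uniform DIF on one item for a focal group, then screens for DIF by comparing score-matched proportions correct between groups (a crude Mantel-Haenszel-style contrast). The item difficulties, DIF magnitude, and sample sizes are hypothetical, chosen only for illustration.

```python
import math
import random

random.seed(0)

def rasch_p(theta, b):
    """Rasch model: probability of a correct response given ability theta and difficulty b."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Item difficulties; item 0 carries uniform DIF of +0.6 logits
# (harder) for the focal group. All values are hypothetical.
difficulties = [0.0, -0.5, 0.5, 1.0, -1.0]
DIF = 0.6

def simulate(group, n=2000):
    """Simulate n examinees; both groups share the same ability distribution."""
    data = []
    for _ in range(n):
        theta = random.gauss(0, 1)
        resp = []
        for j, b in enumerate(difficulties):
            b_eff = b + (DIF if (group == "focal" and j == 0) else 0.0)
            resp.append(1 if random.random() < rasch_p(theta, b_eff) else 0)
        data.append(resp)
    return data

ref = simulate("reference")
foc = simulate("focal")

def stratified_diff(item):
    """Average reference-minus-focal difference in proportion correct,
    within strata matched on total score (Mantel-Haenszel flavor)."""
    diffs = []
    for s in range(1, len(difficulties)):  # interior total-score strata
        r = [x[item] for x in ref if sum(x) == s]
        f = [x[item] for x in foc if sum(x) == s]
        if r and f:
            diffs.append(sum(r) / len(r) - sum(f) / len(f))
    return sum(diffs) / len(diffs)

print(f"item 0 (DIF present): {stratified_diff(0):+.3f}")
print(f"item 1 (no DIF):      {stratified_diff(1):+.3f}")
```

Because both groups were drawn from the same ability distribution, any score-matched gap reflects the item, not the examinees: the DIF item shows a clearly positive contrast while the clean item's contrast hovers near zero. The multilevel models in the dissertation extend this idea by placing DIF effects at the student and teacher levels simultaneously.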
