DIF investigations across groups of gender and academic background in a large-scale high-stakes language test
Xiamei Song, Georgia Southern University
Liying Cheng and Don Klinger, Queen’s University
https://doi.org/10.58379/RSHG8366 | Volume 4, Issue 1, 2015
Abstract: High-stakes pre-entry language testing is the predominant tool used to measure test takers’ proficiency for admission purposes in higher education in China. Given the important role of these tests, there has been heated discussion about how to ensure test fairness for different groups of test takers. This study examined the fairness of the Graduate School Entrance English Examination (GSEEE), which is used to decide whether over one million test takers can enter master’s programs in China. Using SIBTEST and content analysis, the study investigated differential item functioning (DIF) and the presence of potential bias on the GSEEE with respect to groups of gender and academic background. Results showed that a large percentage of the GSEEE items did not provide reliable results to distinguish good and poor performers. A number of items and text bundles functioned differentially (DIF and DBF), and three test reviewers identified a myriad of factors, such as motivation and learning styles, that potentially contributed to group performance differences. However, consistent evidence was not found to suggest that these flagged items and texts exhibited bias. While systematic bias may not have been detected, the results revealed poor test reliability, and the study highlighted an urgent need to improve test quality and clarify the purpose of the test. DIF issues may be revisited once test quality has been improved.
Keywords: Differential item functioning, test bias, language testing, content analysis, EAP/ESP