Examining test fairness across gender in a computerised reading test: A comparison between the Rasch-based DIF technique and MIMIC 
Xuelian Zhu, Sichuan International Studies University, China; National Institute of Education, Nanyang Technological University, Singapore
Vahid Aryadoust, National Institute of Education, Nanyang Technological University, Singapore
https://doi.org/10.0000/0000
Volume 8, Issue 2, 2019
Abstract: Test fairness has been recognised as a fundamental requirement of test validation. Two quantitative approaches to investigating test fairness, the Rasch-based differential item functioning (DIF) detection method and a measurement invariance technique called multiple indicators, multiple causes (MIMIC), were adopted and compared in a test fairness study of the Pearson Test of English (PTE) Academic Reading test (n = 783). The Rasch partial credit model (PCM) showed no statistically significant uniform DIF across gender and, similarly, the MIMIC analysis showed that measurement invariance was maintained in the test. However, six pairs of significant non-uniform DIF (p < 0.05) were found in the DIF analysis. The results and a post-hoc content analysis are presented, and the theoretical and practical implications of the study for test developers and language assessment are discussed.
Keywords: Pearson Test of English (PTE) Academic Reading test, Rasch-based partial credit model, MIMIC, differential item functioning (DIF), measurement invariance
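The abstract distinguishes uniform from non-uniform DIF. As a rough illustration of that distinction only (not the Rasch-based PCM or MIMIC analyses reported in the study), the Python sketch below applies the widely used logistic-regression DIF screening procedure to simulated data; the column names, simulated effect sizes, and sample details other than n = 783 are hypothetical assumptions rather than details from the paper.

# Illustrative sketch (not the authors' code): logistic-regression DIF screening
# for a single dichotomous item, contrasting uniform and non-uniform DIF.
# The simulated "total" variable stands in for the matching/ability score.
import numpy as np
import pandas as pd
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(0)
n = 783  # sample size reported in the abstract
ability = rng.normal(size=n)                 # hypothetical matching variable
gender = rng.integers(0, 2, size=n)          # hypothetical 0/1 group indicator
# Simulated item with an ability-by-group interaction (non-uniform DIF)
logit = 1.2 * ability + 0.4 * ability * gender
item = rng.binomial(1, 1 / (1 + np.exp(-logit)))
df = pd.DataFrame({"item": item, "total": ability, "gender": gender})

def fit(cols):
    # Logistic regression of the item score on the given predictors
    X = sm.add_constant(df[cols])
    return sm.Logit(df["item"], X).fit(disp=False)

m_base = fit(["total"])                      # matching variable only
m_uni = fit(["total", "gender"])             # + group main effect (uniform DIF)
df["total_x_gender"] = df["total"] * df["gender"]
m_nonuni = fit(["total", "gender", "total_x_gender"])  # + interaction (non-uniform DIF)

# Likelihood-ratio tests, each with 1 degree of freedom
lr_uni = 2 * (m_uni.llf - m_base.llf)
lr_nonuni = 2 * (m_nonuni.llf - m_uni.llf)
print(f"Uniform DIF:     LR = {lr_uni:.2f}, p = {stats.chi2.sf(lr_uni, 1):.3f}")
print(f"Non-uniform DIF: LR = {lr_nonuni:.2f}, p = {stats.chi2.sf(lr_nonuni, 1):.3f}")

In this framing, a significant group main effect over and above the matching variable signals uniform DIF, while a significant ability-by-group interaction signals non-uniform DIF, which is the kind of pattern the abstract reports finding in the DIF analysis.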