Experimenting with a Japanese automated essay scoring system in the L2 Japanese environment
Jun Imaki & Shunichi Ishihara, College of Asia and the Pacific, Australian National University, Australia 
Studies in Language Assessment (formerly PLTA), Volume 2, Issue 2, 2013
Abstract: This study provides an empirical analysis of the performance of an automated essay scoring system developed for L1 Japanese when applied to L2 Japanese compositions. In particular, it concerns teachers' use of such a system in formal L2 Japanese classes (rather than in standardised tests), and the experiments were designed accordingly. Jess, a Japanese essay scoring system, was trialled on L2 Japanese compositions (n = 50). Jess performed very well and was comparable with human raters: the correlation between Jess and the average of the nine human raters was at least as high as the correlations among the nine human raters themselves. However, we also found 1) that Jess does not perform as well as English automated essay scoring systems are reported to perform in the L2 environment, and 2) that very good compositions tend to be under-scored by Jess, indicating that there is still room for improvement.
Keywords: L2 writing, automated essay scoring system, Japanese, rater reliability, essay evaluation, rater training
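The evaluation described in the abstract rests on a simple comparison: correlate the machine's scores with the average of the human raters' scores per essay. A minimal sketch of that computation is below, using only the Python standard library; the scores and rater counts shown are invented for illustration (the study itself used nine raters and n = 50 compositions).

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length score lists."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical scores for 5 essays; each row holds one human rater's scores.
human_scores = [
    [6, 7, 5, 8, 4],
    [5, 7, 6, 8, 3],
    [6, 6, 5, 9, 4],
]
machine_scores = [5, 7, 5, 7, 4]

# Average the human raters per essay, then correlate with the machine scores.
avg_human = [statistics.mean(col) for col in zip(*human_scores)]
r = pearson(machine_scores, avg_human)
print(round(r, 3))  # → 0.923 for these illustrative numbers
```

Comparing this machine-vs-average-human correlation against the pairwise correlations among the human raters themselves is what grounds the claim that the system is "comparable with human raters".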