Developing a digital tool for L2 speaking assessment in low-resourced languages
Raili Hilden, University of Helsinki, Finland; Mikko Kuronen, University of Jyväskylä, Finland; Ekaterina Voskoboinik, Aalto University, Finland; Yaroslav Getman, Aalto University, Finland; Mikko Kurimo, Aalto University, Finland
https://doi.org/10.58379/ROXO3257
Volume 14, Issue 2, 2025
Abstract: Previous research on the training and assessment of oral skills has focused mainly on English as an L2, but since languages and learning contexts vary, it is imperative to study automatic speaking assessment (ASA) in other languages with local relevance as well. This paper summarizes a project that set out to develop a prototype tool to support the training and assessment of oral skills in two low-resourced languages, Finnish and Swedish. The project addressed the applicability of automated assessment for measuring multiple features of monologue speech, the accuracy of the human ratings used to train a system based on automatic speech recognition (ASR), and the technical conditions for providing individualized feedback to improve student learning. Encouraging results for both Finnish and Swedish were obtained by adapting a large pre-trained wav2vec2.0 speech model that was fine-tuned first with a larger L1 dataset and then with an L2 dataset collected in the project. The results suggested that the most suitable features for automatic analysis were quantifiable fluency measures and vocabulary range. Machine and human estimates were most consistent for assessing fluency, range and accuracy, while the results were more mixed for pronunciation features other than fluency. The prototype will be further developed, fine-tuned and adjusted to address the needs of adult learners preparing for the final test of integration training in L2 Finnish.
Keywords: L2 learning, oral proficiency, automatic speaking assessment, signal processing
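The abstract identifies quantifiable fluency measures and vocabulary range as the features most suitable for automatic analysis. As a purely illustrative sketch (not the project's actual pipeline), such measures can be derived from ASR output with word-level timestamps; the feature names, pause threshold, and sample transcript below are assumptions, not details taken from the paper:

```python
# Illustrative only: simple fluency and vocabulary-range measures computed
# from hypothetical ASR output given as (token, start_sec, end_sec) tuples.
# The 0.25 s silent-pause threshold is a common convention in fluency
# research, not a value reported by the project.

def fluency_features(words, pause_threshold=0.25):
    """Return basic fluency/range features for one monologue sample."""
    if not words:
        return {}
    total_time = words[-1][2] - words[0][1]
    # Speech rate: words per minute over the whole sample.
    speech_rate = len(words) / total_time * 60
    # Silent pauses: gaps between consecutive words above the threshold.
    pauses = [b[1] - a[2] for a, b in zip(words, words[1:])
              if b[1] - a[2] > pause_threshold]
    # Crude vocabulary-range proxy: type-token ratio.
    types = {w.lower() for w, _, _ in words}
    return {
        "speech_rate_wpm": round(speech_rate, 1),
        "n_pauses": len(pauses),
        "mean_pause_sec": round(sum(pauses) / len(pauses), 2) if pauses else 0.0,
        "type_token_ratio": round(len(types) / len(words), 2),
    }

# Hypothetical ASR output for a short L2 Finnish utterance.
sample = [("minä", 0.0, 0.4), ("olen", 0.5, 0.8), ("opiskellut", 1.4, 2.1),
          ("suomea", 2.2, 2.8), ("kaksi", 3.6, 4.0), ("vuotta", 4.1, 4.6)]
print(fluency_features(sample))
```

In a full system these hand-crafted features would complement, rather than replace, scores predicted directly by the fine-tuned acoustic model.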