Papers in Language Testing & Assessment Best Paper Award 2013-2021

The PLTA Best Paper Award was presented three times before it became the SiLA Best Paper Award in 2022. The first award covered 2013-2015, the second covered 2016-2018, and the third covered 2019-2021.

The winners of the three awards are listed below.

Papers in Language Testing & Assessment Best Paper Award 2013-2015
Winner:

Knoch, U., & Elder, C. (2013). A framework for validating post-entry language assessments (PELAs). Papers in Language Testing and Assessment, 2(2), 48-66.

Citation:

This paper marks a significant and substantive step in the development of PELAs (post-entry English language assessments) in Australia and New Zealand over the past 20 years. It is an extremely useful adaptation of the validity argument conceptualisation into a practical framework for validating PELAs, and its discussion of the distinction between validation and evaluation gives it broader applicability. The framework will no doubt prove influential over time for the many institutions developing PELAs.

Runner-up:

Hudson, C., & Angelo, D. (2014). Concepts underpinning innovations to second language proficiency scales inclusive of Aboriginal and Torres Strait Islander learners: a dynamic process in progress. Papers in Language Testing and Assessment, 3(1), 44-85.

Citation:

This paper not only makes an invaluable contribution to language assessment in the Australian context but also has practical implications for other, similar contexts. It documents the development of an instrument that arose out of social and pedagogical need, with considerable input from classroom teachers. The paper is an excellent example of how a rating scale can serve a professional development role and how assessment instruments might fit in the nexus of second language acquisition, descriptive linguistics, policy and education.

Other Finalists:

Clark, M. (2014). The use of semi-scripted speech in a listening placement test for university students. Papers in Language Testing and Assessment, 3(2), 1-26.

Ruegg, R. (2014). The effect of assessment of peer feedback on the quantity and quality of feedback given. Papers in Language Testing and Assessment, 3(1), 24-43.

Kokhan, K., & Lin, C.-K. (2014). Test of English as a Foreign Language (TOEFL): Interpretation of multiple score reports for ESL placement. Papers in Language Testing and Assessment, 3(1), 1-23.

Selection Committee:
Associate Professor Aek Phakiti (Chair)
Associate Professor Angela Scarino
Dr Rosemary Wette
Dr Susy Macqueen

Papers in Language Testing & Assessment Best Paper Award 2016-2018
Joint winners:
Macqueen, S., O’Hagan, S., & Hughes, B. (2016). Negotiating the boundary between achievement and proficiency: An evaluation of the exit standard of an academic English pathway program. Papers in Language Testing and Assessment, 5(1), 107-123.
 
Citation: This paper is an evaluation study addressing the tension between the university English entry standard at an Australian university, as represented by a proficiency test score, and a preparatory pathway course exit standard designed to be reflective of broader academic literacy achievements within the context of concern. The authors clearly outline and justify the procedures adopted to evaluate the claim that completion of the pathway course indicates readiness for university.
The originality of the paper lies in its framing of the course assessments in pathway programs as ‘boundary objects’ between different social worlds and its thoughtful discussion of the complex role of the evaluator in navigating between these worlds. It thus makes a valuable conceptual contribution to our field as well as highlighting important practical issues for those tasked with delivering and evaluating the effectiveness of academic pathway programs in the higher education context.
 
Fan, J. J. (2016). The construct and predictive validity of a self-assessment scale. Papers in Language Testing and Assessment, 5(2), 69-100.
 
Citation: The paper offers a meticulous account of the process of developing and validating a self-assessment scale designed for course placement purposes at a Chinese research university. By clearly outlining and justifying the steps in their methodological approach, and by linking the assumptions driving this approach to an existing validation framework, the authors have provided a strong defence for their conclusions about the validity of the scale.
The paper contributes not only to our knowledge of the characteristics of self-assessment, but also provides a rationale for the use of self-assessment in language assessment and learning. The methodological approach and findings of the study will serve as a model for development and validation of self-assessments in other contexts, which will ultimately benefit language teachers and learners.
 
Other Finalists:
Sato, T. (2018). The gap between communicative ability measurements: General-purpose English speaking tests and linguistic laypersons’ judgments. Papers in Language Testing and Assessment, 7(1), 1-31.
Xu, Y., & Brown, G. (2017). University English teacher assessment literacy: A survey-test report from China. Papers in Language Testing and Assessment, 6(1), 133-158.
 
Selection Committee:
Associate Professor Catherine Elder (Chair)
Dr Ana-Maria Ducasse
Dr Naoki Ikeda
Dr Rachael Ruegg


Papers in Language Testing & Assessment Best Paper Award 2019-2021
Winner:
Park, H., & Yan, X. (2019). An investigation into rater performance with a holistic scale and a binary, analytic scale on an ESL writing placement test. Papers in Language Testing and Assessment, 8(2) (Special Issue).
 
This paper provides valuable insights into how raters rate and how holistic and analytic scales influence their ratings. The study is well designed and well motivated, and it offers a cogent argument for the comparison of the two rating scales and for the efficacy of analytic scales in unpacking the descriptors of holistic scales. The quality of the procedures and the richness of the data are impressive, in contrast to studies that draw numerous inferences from a small amount of data. The analysis is competent, and the results and conclusions are persuasive, supported by ample convincing data. The study makes a much-needed contribution to rating scale development research.
 
Runner-up:
Burton, J. D. (2021). The face of communication breakdown: Multimodal repair in L2 oral proficiency interviews. Papers in Language Testing and Assessment, 10(2).
 
This paper presents a rigorous, fine-grained analysis of the non-verbal display of interactional competence in language testing settings, which, as the author rightly argues, is lacking in the current language assessment literature. The study explores a new perspective and an innovative methodology. It focuses on non-verbal communication in speaking tests, with an elicitation task involving communication breakdowns in Zoom-based video recordings, and it features an innovative multimodal conversation analysis. The paper suggests significant directions, methodologies and motivations for incorporating non-verbal conduct into the speaking constructs of language tests.
 
Finalist:
Fan, J., & Knoch, U. (2019). Fairness in language assessment: What can the Rasch model offer? Papers in Language Testing and Assessment, 8(2) (Special Issue).
 
Selection Committee:
Dr Cath Hudson (Chair)
Dr David Wei Dai
A/Prof Peter Gu
Dr Morena Magalhaes



 
Criteria for Selection
  1. The contribution of the work to the field of language testing and assessment, or to the interface between language testing and other areas of enquiry (and in particular the originality of the contribution).
  2. The persuasiveness of the argument (whether based on the interpretation of empirical data, theoretical rationales, or both).
  3. The thoroughness of the literature review.
  4. The clarity of presentation (written expression; the use of figures and tables, where appropriate).
  5. In the case of empirical studies, the quality of the procedures for data-gathering and analysis.