Automated Essay Scoring with the E-rater System


This paper provides an overview of e-rater®, a state-of-the-art automated essay scoring system developed at the Educational Testing Service (ETS). E-rater is used as part of the operational scoring of two high-stakes graduate admissions programs: the GRE® General Test and the TOEFL iBT® assessments. E-rater is also used to provide score reporting and diagnostic feedback in Criterion℠, ETS's writing instruction application. E-rater scoring is based on automatically extracting features of the essay text using Natural Language Processing (NLP) techniques. These features measure several underlying aspects of the writing construct: word choice, grammatical conventions (grammar, usage, and mechanics), development and organization, and topical vocabulary usage. The paper reviews e-rater's feature set, the framework for aggregating features into essay scores, processes for the evaluation of e-rater scores, and options for operational implementation, with an emphasis on the standards and procedures used at ETS to ensure scoring quality.

Keywords: automated scoring, writing assessment
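
The abstract describes the general pipeline of extracting text features and aggregating them into an essay score. The sketch below illustrates that idea only; the features, weights, and function names are hypothetical stand-ins and are not e-rater's actual feature set or scoring model, which the paper itself details.

```python
# Illustrative sketch of feature extraction plus linear aggregation for
# automated essay scoring. All feature names and weights here are toy
# examples, not e-rater's operational features or coefficients.
import re


def extract_features(essay: str) -> dict:
    """Compute a few simple text features from an essay."""
    words = re.findall(r"[A-Za-z']+", essay)
    sentences = [s for s in re.split(r"[.!?]+", essay) if s.strip()]
    n_words = len(words) or 1
    n_sentences = len(sentences) or 1
    return {
        # Development proxy: essay length in words.
        "length": n_words,
        # Word-choice proxy: average word length in characters.
        "avg_word_len": sum(len(w) for w in words) / n_words,
        # Organization proxy: average sentence length in words.
        "avg_sent_len": n_words / n_sentences,
        # Vocabulary-diversity proxy: type-token ratio.
        "type_token_ratio": len({w.lower() for w in words}) / n_words,
    }


# Hypothetical regression weights; an operational system would estimate
# these from essays scored by trained human raters.
WEIGHTS = {
    "length": 0.004,
    "avg_word_len": 0.3,
    "avg_sent_len": 0.05,
    "type_token_ratio": 1.5,
}
INTERCEPT = 0.5


def score_essay(essay: str) -> float:
    """Aggregate the extracted features into a raw score via a weighted sum."""
    feats = extract_features(essay)
    return INTERCEPT + sum(WEIGHTS[name] * value for name, value in feats.items())


if __name__ == "__main__":
    sample = (
        "Automated scoring systems extract linguistic features from an essay "
        "and combine them into a single score. This sketch shows the general "
        "idea with toy features and weights."
    )
    print(f"Raw score: {score_essay(sample):.2f}")
```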

Attached Files

paper_5bc1babe.pdf