Rubric reliability and annotation of content and argument in source-based argument essays

Yanjun Gao, Alex Driban, Brennan Xavier McManus, Elena Musi, Patricia M. Davies, Smaranda Muresan, Rebecca J. Passonneau

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

8 Scopus citations

Abstract

We present a unique dataset of student source-based argument essays to facilitate research on the relations between content, argumentation skills, and assessment. Two classroom writing assignments were given to college students in a STEM major, accompanied by a carefully designed rubric. The paper presents a reliability study of the rubric, showing it to be highly reliable, and an initial annotation of content and argumentation in the essays.
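The record does not state which agreement statistic the reliability study used or include any scores; the sketch below is a minimal, hypothetical illustration of how rubric reliability between two raters might be checked, assuming an ordinal rubric scale and quadratic-weighted Cohen's kappa as the metric.

```python
# Minimal sketch of an inter-rater reliability check for rubric scores.
# The rater data and the choice of metric (quadratic-weighted Cohen's kappa)
# are assumptions for illustration, not the paper's actual setup.
from sklearn.metrics import cohen_kappa_score

# Hypothetical rubric scores (e.g., a 1-4 scale) from two raters on the same essays.
rater_a = [3, 4, 2, 4, 3, 1, 2, 4, 3, 2]
rater_b = [3, 4, 2, 3, 3, 1, 2, 4, 4, 2]

# Quadratic weighting penalizes large disagreements more than adjacent-score
# disagreements, which suits ordinal rubric scales.
kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
print(f"Quadratic-weighted kappa: {kappa:.3f}")
```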

Original language: English (US)
Title of host publication: ACL 2019 - Innovative Use of NLP for Building Educational Applications, BEA 2019 - Proceedings of the 14th Workshop
Publisher: Association for Computational Linguistics (ACL)
Pages: 507-518
Number of pages: 12
ISBN (Electronic): 9781950737345
State: Published - 2019
Event: 14th Workshop on Innovative Use of NLP for Building Educational Applications, BEA 2019, collocated with ACL 2019 - Florence, Italy
Duration: Aug 2 2019 → …

Publication series

Name: ACL 2019 - Innovative Use of NLP for Building Educational Applications, BEA 2019 - Proceedings of the 14th Workshop

Conference

Conference: 14th Workshop on Innovative Use of NLP for Building Educational Applications, BEA 2019, collocated with ACL 2019
Country/Territory: Italy
City: Florence
Period: 8/2/19 → …

All Science Journal Classification (ASJC) codes

  • Language and Linguistics
  • Linguistics and Language
  • Software
