Findings of WASSA 2023 Shared Task: Multi-Label and Multi-Class Emotion Classification on Code-Mixed Text Messages

Necva Bölücü, Iqra Ameer, Ali Al Bataineh, Hua Xu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present the results of WASSA 2023 Shared Task 2: Emotion Classification on code-mixed text messages (Roman Urdu + English), which included two emotion classification tracks: multi-label and multi-class. For both tracks, participants were provided with a dataset of code-mixed SMS messages in English and Roman Urdu labeled with 12 emotions. A total of 5 teams (19 team members) participated in the shared task. We summarize the methods, resources, and tools used by the participating teams, and we have made the data freely available to support further work on the task.
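As a purely illustrative sketch, not drawn from the shared task or any participating system, the difference between the two tracks can be shown with a minimal scikit-learn baseline: the multi-label track lets a message carry several of the 12 emotions, while the multi-class track assigns exactly one. The texts, label names, and model choices below are hypothetical.

# Minimal illustrative sketch (hypothetical data and models, not the official baseline).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import MultiLabelBinarizer

# Toy code-mixed (Roman Urdu + English) messages; labels are illustrative only.
texts = ["yaar this is so amazing!", "mujhe bohat dukh hua today", "kya baat hai, great news"]
multi_labels = [["joy", "surprise"], ["sadness"], ["joy"]]   # multi-label track: sets of emotions
single_labels = ["joy", "sadness", "joy"]                    # multi-class track: one emotion each

vec = TfidfVectorizer(ngram_range=(1, 2))
X = vec.fit_transform(texts)

# Multi-label: binarize the label sets and train one binary classifier per emotion.
mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(multi_labels)
multi_label_clf = OneVsRestClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)

# Multi-class: a single classifier picks exactly one emotion per message.
multi_class_clf = LogisticRegression(max_iter=1000).fit(X, single_labels)

print(mlb.inverse_transform(multi_label_clf.predict(X[:1])))  # e.g. [('joy', 'surprise')]
print(multi_class_clf.predict(X[:1]))                         # e.g. ['joy']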

Original language: English (US)
Title of host publication: WASSA 2023 - 13th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, Proceedings of the Workshop
Editors: Jeremy Barnes, Orphee De Clercq, Roman Klinger
Publisher: Association for Computational Linguistics (ACL)
Pages: 587-595
Number of pages: 9
ISBN (Electronic): 9781959429876
State: Published - 2023
Event: 13th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, WASSA 2023 - Toronto, Canada
Duration: Jul 14 2023 → …

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
ISSN (Print): 0736-587X

Conference

Conference: 13th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, WASSA 2023
Country/Territory: Canada
City: Toronto
Period: 7/14/23 → …

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics
