NCUEE-NLP at WASSA 2023 Empathy, Emotion, and Personality Shared Task: Perceived Intensity Prediction Using Sentiment-Enhanced RoBERTa Transformers

Tzu-Mi Lin, Jung-Ying Chang, Lung-Hao Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Abstract

This paper describes our proposed system design for the WASSA 2023 shared task 1. We propose a unified ensemble neural network architecture that integrates the original RoBERTa transformer with two sentiment-enhanced models, RoBERTa-Twitter and EmoBERTa. For Track 1 at the speech-turn level, our best submission achieved an average Pearson correlation score of 0.7236, ranking fourth for empathy, emotion polarity and emotion intensity prediction. For Track 2 at the essay level, our best submission obtained an average Pearson correlation score of 0.4178 for predicting empathy and distress scores, ranking first among all nine submissions.
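
The abstract describes ensembling a general-purpose RoBERTa encoder with two sentiment/emotion-tuned variants. The following is a minimal sketch of that idea, not the authors' released code: each encoder feeds its own regression head and the predicted scores are averaged. The Hugging Face checkpoint names, the [CLS]-pooling choice, and the simple averaging are illustrative assumptions.

```python
# Sketch of a three-encoder ensemble regressor (illustrative, not the
# authors' implementation). Each RoBERTa-family encoder gets its own
# linear regression head; the ensemble prediction is the mean score.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

CHECKPOINTS = [
    "roberta-base",                               # original RoBERTa
    "cardiffnlp/twitter-roberta-base-sentiment",  # sentiment-tuned on Twitter
    "tae898/emoberta-base",                       # EmoBERTa (emotion-tuned); assumed checkpoint
]

class EnsembleRegressor(nn.Module):
    """Averages per-encoder regression scores (e.g., an empathy intensity)."""

    def __init__(self, checkpoints=CHECKPOINTS):
        super().__init__()
        self.encoders = nn.ModuleList(
            AutoModel.from_pretrained(c) for c in checkpoints
        )
        self.heads = nn.ModuleList(
            nn.Linear(enc.config.hidden_size, 1) for enc in self.encoders
        )

    def forward(self, batches):
        # `batches` holds one tokenized batch per encoder, since each
        # checkpoint may ship its own tokenizer.
        scores = []
        for enc, head, batch in zip(self.encoders, self.heads, batches):
            cls = enc(**batch).last_hidden_state[:, 0]  # [CLS] representation
            scores.append(head(cls))
        return torch.stack(scores).mean(dim=0)  # ensemble by averaging

texts = ["I understand how hard that must have been for you."]
tokenizers = [AutoTokenizer.from_pretrained(c) for c in CHECKPOINTS]
batches = [tok(texts, return_tensors="pt", padding=True) for tok in tokenizers]
model = EnsembleRegressor()
print(model(batches))  # one predicted intensity score per input text
```

In practice each head would be fine-tuned on the task's gold scores before averaging; the sketch only shows how the three encoders are wired together.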

Original language: English
Title of host publication: WASSA 2023 - 13th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, Proceedings of the Workshop
Editors: Jeremy Barnes, Orphée De Clercq, Roman Klinger
Publisher: Association for Computational Linguistics (ACL)
Pages: 548-552
Number of pages: 5
ISBN (Electronic): 9781959429876
State: Published - 2023
Event: 13th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, WASSA 2023 - Toronto, Canada
Duration: 14 Jul 2023 → …

Publication series

Name: Proceedings of the Annual Meeting of the Association for Computational Linguistics
ISSN (Print): 0736-587X

Conference

Conference: 13th Workshop on Computational Approaches to Subjectivity, Sentiment and Social Media Analysis, WASSA 2023
Country/Territory: Canada
City: Toronto
Period: 14/07/23 → …
