NCU-NLP at ROCLING-2021 Shared Task: Using MacBERT Transformers for Dimensional Sentiment Analysis

Man Chen Hung, Chao Yi Chen, Pin Jung Chen, Lung Hao Lee

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

3 Scopus citations

Abstract

We use MacBERT transformers and fine-tune them on the ROCLING-2021 shared task using the CVAT and CVAS data. We compare the performance of MacBERT with two other transformers, BERT and RoBERTa, in the valence and arousal dimensions, respectively. Mean absolute error (MAE) and the correlation coefficient (r) were used as evaluation metrics. On the ROCLING-2021 test set, our MacBERT model achieves an MAE of 0.611 and an r of 0.904 in the valence dimension, and an MAE of 0.938 and an r of 0.549 in the arousal dimension.
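The abstract reports two evaluation metrics, MAE and the correlation coefficient (r). As a hedged sketch (not code from the paper, and using purely illustrative scores), this is how those two metrics are typically computed for a single dimension such as valence:

```python
import math

def mae(y_true, y_pred):
    # Mean absolute error: average of |gold - prediction| over all items.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def pearson_r(y_true, y_pred):
    # Pearson correlation coefficient between gold and predicted scores.
    n = len(y_true)
    mt = sum(y_true) / n
    mp = sum(y_pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    sd_t = math.sqrt(sum((t - mt) ** 2 for t in y_true))
    sd_p = math.sqrt(sum((p - mp) ** 2 for p in y_pred))
    return cov / (sd_t * sd_p)

# Hypothetical valence scores on a 1-9 scale (illustrative only).
gold = [6.2, 3.1, 7.8, 4.5, 5.0]
pred = [5.9, 3.6, 7.2, 4.9, 5.4]
print(round(mae(gold, pred), 3))       # → 0.44
print(round(pearson_r(gold, pred), 3))
```

Lower MAE and higher r are better, which matches the reported valence results (MAE 0.611, r 0.904) being stronger than the arousal results (MAE 0.938, r 0.549).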

Original language: English
Title of host publication: ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing
Editors: Lung-Hao Lee, Chia-Hui Chang, Kuan-Yu Chen
Publisher: The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages: 380-384
Number of pages: 5
ISBN (Electronic): 9789869576949
State: Published - 2021
Event: 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021 - Taoyuan, Taiwan
Duration: 15 Oct 2021 - 16 Oct 2021

Publication series

Name: ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing

Conference

Conference: 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021
Country/Territory: Taiwan
City: Taoyuan
Period: 15/10/21 - 16/10/21

Keywords

  • Affective computing
  • Deep learning
  • Learning emotions
