Abstract
We fine-tune MacBERT transformers on the ROCLING-2021 shared task using the CVAT and CVAS data. We compare the performance of MacBERT with two other transformers, BERT and RoBERTa, in the valence and arousal dimensions, respectively. Mean absolute error (MAE) and the correlation coefficient (r) were used as evaluation metrics. On the ROCLING-2021 test set, our MacBERT model achieves an MAE of 0.611 and an r of 0.904 in the valence dimension, and an MAE of 0.938 and an r of 0.549 in the arousal dimension.
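As a minimal sketch of the two evaluation metrics named above (illustrative only, not code or data from the paper; the toy scores are hypothetical), MAE and Pearson's r over gold and predicted valence/arousal ratings can be computed as:

```python
import math

def mae(y_true, y_pred):
    # Mean absolute error: average |gold - prediction| over all examples.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def pearson_r(y_true, y_pred):
    # Pearson correlation coefficient between gold and predicted scores.
    n = len(y_true)
    mt = sum(y_true) / n
    mp = sum(y_pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    var_t = sum((t - mt) ** 2 for t in y_true)
    var_p = sum((p - mp) ** 2 for p in y_pred)
    return cov / math.sqrt(var_t * var_p)

# Hypothetical valence ratings on a 1-9 scale (not from CVAT/CVAS).
gold = [5.0, 3.2, 7.1, 6.0]
pred = [4.8, 3.5, 6.9, 6.4]
print(round(mae(gold, pred), 3))        # lower is better
print(round(pearson_r(gold, pred), 3))  # higher is better
```

Lower MAE and higher r indicate predictions closer to, and more strongly correlated with, the gold annotations.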
Original language | English
---|---
Title of host publication | ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing
Editors | Lung-Hao Lee, Chia-Hui Chang, Kuan-Yu Chen
Publisher | The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages | 380-384
Number of pages | 5
ISBN (Electronic) | 9789869576949
Publication status | Published - 2021
Event | 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021 - Taoyuan, Taiwan. Duration: 15 Oct 2021 → 16 Oct 2021
Publication series

Name | ROCLING 2021 - Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing
---|---
Conference | 33rd Conference on Computational Linguistics and Speech Processing, ROCLING 2021
---|---
Country/Territory | Taiwan
City | Taoyuan
Period | 15/10/21 → 16/10/21
Fingerprint

Dive into the research topics of 'NCU-NLP at ROCLING-2021 Shared Task: Using MacBERT Transformers for Dimensional Sentiment Analysis'. Together they form a unique fingerprint.