Classification of Tweets Self-reporting Adverse Pregnancy Outcomes and Potential COVID-19 Cases Using RoBERTa Transformers

Lung-Hao Lee, Man-Chen Hung, Chien-Huan Lu, Chang-Hao Chen, Po-Lei Lee, Kuo-Kai Shyu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

4 Scopus citations

Abstract

This study describes our proposed model design for the SMM4H 2021 shared tasks. We fine-tune RoBERTa transformer language models together with a connected classification layer to classify tweets self-reporting adverse pregnancy outcomes (Task 4) and potential COVID-19 cases (Task 5). The evaluation metric for both tasks is the F1-score of the positive class. For Task 4, our best score of 0.93 exceeded the median score of 0.925. For Task 5, our best score of 0.75 exceeded the median score of 0.745.
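The abstract describes the approach only at a high level. As a rough illustration of this kind of setup, the sketch below fine-tunes a RoBERTa sequence-classification model on a toy batch of tweets and reports the positive-class F1-score; the checkpoint name ("roberta-base"), hyperparameters, and example tweets are assumptions for illustration, not the configuration reported in the paper.

```python
# Minimal sketch of fine-tuning RoBERTa for binary tweet classification,
# assuming the Hugging Face Transformers and scikit-learn libraries.
# Checkpoint, hyperparameters, and tweets are illustrative assumptions.
import torch
from torch.optim import AdamW
from sklearn.metrics import f1_score
from transformers import RobertaTokenizer, RobertaForSequenceClassification

tokenizer = RobertaTokenizer.from_pretrained("roberta-base")
model = RobertaForSequenceClassification.from_pretrained("roberta-base", num_labels=2)

# Hypothetical training batch: label 1 = positive class (self-reported case), 0 = negative.
tweets = [
    "I tested positive for covid last week and still feel exhausted",
    "Watching the game with friends tonight",
]
labels = torch.tensor([1, 0])
inputs = tokenizer(tweets, padding=True, truncation=True, max_length=128, return_tensors="pt")

# Fine-tune the encoder and its classification head jointly.
optimizer = AdamW(model.parameters(), lr=2e-5)
model.train()
for _ in range(3):  # a few passes over the toy batch
    optimizer.zero_grad()
    loss = model(**inputs, labels=labels).loss
    loss.backward()
    optimizer.step()

# Evaluate with the shared-task metric: F1-score of the positive class.
model.eval()
with torch.no_grad():
    preds = model(**inputs).logits.argmax(dim=-1)
print("Positive-class F1:", f1_score(labels.numpy(), preds.numpy(), pos_label=1))
```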

Original language: English
Title of host publication: Social Media Mining for Health, SMM4H 2021 - Proceedings of the 6th Workshop and Shared Tasks
Editors: Arjun Magge, Ari Z. Klein, Antonio Miranda-Escalada, Mohammed Ali Al-garadi, Ilseyar Alimova, Zulfat Miftahutdinov, Eulalia Farre-Maduell, Salvador Lima Lopez, Ivan Flores, Karen O'Connor, Davy Weissenbacher, Elena Tutubalina, Abeed Sarker, Juan M Banda, Martin Krallinger, Graciela Gonzalez-Hernandez
Publisher: Association for Computational Linguistics (ACL)
Pages: 98-101
Number of pages: 4
ISBN (Electronic): 9781954085312
DOIs
State: Published - 2021
Event: 6th Workshop and Shared Tasks on Social Media Mining for Health, SMM4H 2021 - Mexico City, Mexico
Duration: 10 Jun 2021 → …

Publication series

Name: Social Media Mining for Health, SMM4H 2021 - Proceedings of the 6th Workshop and Shared Tasks

Conference

Conference: 6th Workshop and Shared Tasks on Social Media Mining for Health, SMM4H 2021
Country/Territory: Mexico
City: Mexico City
Period: 10/06/21 → …
