TY - JOUR
T1 - The overview of the BioRED (Biomedical Relation Extraction Dataset) track at BioCreative VIII
AU - Islamaj, Rezarta
AU - Lai, Po-Ting
AU - Wei, Chih-Hsuan
AU - Luo, Ling
AU - Almeida, Tiago
AU - Jonker, Richard A.A.
AU - Conceição, Sofia I.R.
AU - Sousa, Diana F.
AU - Phan, Cong-Phuoc
AU - Chiang, Jung-Hsien
AU - Li, Jiru
AU - Pan, Dinghao
AU - Meesawad, Wilailack
AU - Tsai, Richard Tzong-Han
AU - Sarol, M. Janina
AU - Hong, Gibong
AU - Valiev, Airat
AU - Tutubalina, Elena
AU - Lee, Shao-Man
AU - Hsu, Yi-Yu
AU - Li, Mingjie
AU - Verspoor, Karin
AU - Lu, Zhiyong
N1 - Publisher Copyright:
© 2024 Oxford University Press. All rights reserved.
PY - 2024
Y1 - 2024
N2 - The BioRED track at BioCreative VIII calls for a community effort to identify, semantically categorize, and highlight the novelty factor of the relationships between biomedical entities in unstructured text. Relation extraction is crucial for many biomedical natural language processing (NLP) applications, from drug discovery to custom medical solutions. The BioRED track simulates a real-world application of biomedical relationship extraction and, as such, considers multiple biomedical entity types, normalized to their corresponding database identifiers, and defines the relationships between them in the documents. The challenge consisted of two subtasks: (i) in Subtask 1, participants were given the article text and human expert-annotated entities and were asked to extract the relation pairs and identify their semantic type and novelty factor, and (ii) in Subtask 2, participants were given only the article text and were asked to build an end-to-end system that could identify and categorize the relationships and their novelty. We received a total of 94 submissions from 14 teams worldwide. The highest F-score performances achieved for Subtask 1 were: 77.17% for relation pair identification, 58.95% for relation type identification, 59.22% for novelty identification, and 44.55% when evaluating all of the above aspects of comprehensive relation extraction. The highest F-score performances achieved for Subtask 2 were: 55.84% for relation pair, 43.03% for relation type, 42.74% for novelty, and 32.75% for comprehensive relation extraction. The entire BioRED track dataset and other challenge materials are available at https://ftp.ncbi.nlm.nih.gov/pub/lu/BC8-BioRED-track/ and https://codalab.lisn.upsaclay.fr/competitions/13377 and https://codalab.lisn.upsaclay.fr/competitions/13378.
AB - The BioRED track at BioCreative VIII calls for a community effort to identify, semantically categorize, and highlight the novelty factor of the relationships between biomedical entities in unstructured text. Relation extraction is crucial for many biomedical natural language processing (NLP) applications, from drug discovery to custom medical solutions. The BioRED track simulates a real-world application of biomedical relationship extraction and, as such, considers multiple biomedical entity types, normalized to their corresponding database identifiers, and defines the relationships between them in the documents. The challenge consisted of two subtasks: (i) in Subtask 1, participants were given the article text and human expert-annotated entities and were asked to extract the relation pairs and identify their semantic type and novelty factor, and (ii) in Subtask 2, participants were given only the article text and were asked to build an end-to-end system that could identify and categorize the relationships and their novelty. We received a total of 94 submissions from 14 teams worldwide. The highest F-score performances achieved for Subtask 1 were: 77.17% for relation pair identification, 58.95% for relation type identification, 59.22% for novelty identification, and 44.55% when evaluating all of the above aspects of comprehensive relation extraction. The highest F-score performances achieved for Subtask 2 were: 55.84% for relation pair, 43.03% for relation type, 42.74% for novelty, and 32.75% for comprehensive relation extraction. The entire BioRED track dataset and other challenge materials are available at https://ftp.ncbi.nlm.nih.gov/pub/lu/BC8-BioRED-track/ and https://codalab.lisn.upsaclay.fr/competitions/13377 and https://codalab.lisn.upsaclay.fr/competitions/13378.
UR - http://www.scopus.com/inward/record.url?scp=85201052293&partnerID=8YFLogxK
U2 - 10.1093/database/baae069
DO - 10.1093/database/baae069
M3 - Journal article
AN - SCOPUS:85201052293
SN - 1758-0463
VL - 2024
JO - Database: the journal of biological databases and curation
JF - Database: the journal of biological databases and curation
M1 - baae069
ER -