Multi-Task Neural Sequence Labeling for Zero-Shot Cross-Language Boilerplate Removal

Yu Hao Wu, Chia Hui Chang

Research output: Chapter in Book/Report/Conference proceeding · Conference contribution · peer-review

2 Scopus citations

Abstract

Although web pages are rich in resources, they are usually intertwined with advertisements, banners, navigation bars, footer copyrights, and other templates, which are often not of interest to users. In this paper, we study the problem of extracting the main content and removing irrelevant information from web pages. The common solution is to classify each web component as boilerplate (noise) or main content. State-of-the-art approaches such as BoilerNet use neural sequence labeling to achieve impressive scores on the CleanEval EN dataset. However, the model uses only the top 50 HTML tags as input features, which does not fully exploit the tag information. In addition, the 1,000 most frequent words used for text content representation cannot effectively support a real-world environment in which web pages appear in multiple languages. In this paper, we propose a multi-task learning framework based on two auxiliary tasks: depth prediction and position prediction. We explore HTML tag embedding for tag path representation learning. Further, we employ multilingual Bidirectional Encoder Representations from Transformers (BERT) for text content representation to handle web pages without language limitations. The experiments show that HTML tag embedding and the multi-task learning framework achieve much higher scores than BoilerNet on the CleanEval EN dataset. Second, the pre-trained text block representation based on multilingual BERT degrades performance on the EN test set; however, zero-shot experiments on three languages (Chinese, Japanese, and Thai) achieve performance consistent with five-fold cross-validation on the respective languages, which indicates the possibility of providing cross-lingual support in a single model.
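To make the described architecture concrete, the following is a minimal sketch (not the authors' released code) of the multi-task sequence-labeling idea from the abstract: each text block on a page is represented by a learned embedding of its HTML tag path together with a multilingual-BERT sentence embedding, and a recurrent tagger labels blocks as boilerplate or main content while auxiliary heads predict DOM depth and block position. All layer sizes, the pooling over the tag path, the use of a BiLSTM encoder, and the loss weights are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class MultiTaskBoilerplateTagger(nn.Module):
    """Illustrative multi-task tagger: main content vs. boilerplate,
    with auxiliary depth and position prediction (hypothetical sizes)."""
    def __init__(self, num_tags, tag_emb_dim=32, text_dim=768,
                 hidden_dim=256, max_depth=50):
        super().__init__()
        # Embedding for HTML tags along each block's tag path
        self.tag_embedding = nn.Embedding(num_tags, tag_emb_dim)
        # Sequence encoder over the blocks of a page
        self.encoder = nn.LSTM(tag_emb_dim + text_dim, hidden_dim,
                               batch_first=True, bidirectional=True)
        # Main task: boilerplate (0) vs. main content (1) per block
        self.content_head = nn.Linear(2 * hidden_dim, 2)
        # Auxiliary task 1: predict the DOM depth of each block
        self.depth_head = nn.Linear(2 * hidden_dim, max_depth)
        # Auxiliary task 2: predict the relative position of each block
        self.position_head = nn.Linear(2 * hidden_dim, 1)

    def forward(self, tag_path_ids, text_emb):
        # tag_path_ids: (batch, blocks, path_len) tag ids along the DOM path
        # text_emb:     (batch, blocks, text_dim) precomputed multilingual-BERT embeddings
        tag_feat = self.tag_embedding(tag_path_ids).mean(dim=2)  # pool over the path
        h, _ = self.encoder(torch.cat([tag_feat, text_emb], dim=-1))
        return (self.content_head(h),    # main-content logits
                self.depth_head(h),      # depth logits
                self.position_head(h))   # position estimate

def multitask_loss(content_logits, depth_logits, pos_pred,
                   content_y, depth_y, pos_y, w_depth=0.3, w_pos=0.3):
    # Joint loss: weighted sum of main and auxiliary objectives (weights assumed)
    ce, mse = nn.CrossEntropyLoss(), nn.MSELoss()
    return (ce(content_logits.flatten(0, 1), content_y.flatten())
            + w_depth * ce(depth_logits.flatten(0, 1), depth_y.flatten())
            + w_pos * mse(pos_pred.squeeze(-1), pos_y))
```

In this sketch the multilingual-BERT embeddings are assumed to be precomputed per text block, so the same tagger can be applied to pages in any language the encoder covers, which is the basis of the zero-shot cross-language setting described above.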

Original language: English
Title of host publication: Proceedings - 2021 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, WI-IAT 2021
Publisher: Association for Computing Machinery
Pages: 326-334
Number of pages: 9
ISBN (Electronic): 9781450391153
DOIs
State: Published - 14 Dec 2021
Event: 2021 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, WI-IAT 2021 - Virtual, Online, Australia
Duration: 14 Dec 2021 - 17 Dec 2021

Publication series

Name: ACM International Conference Proceeding Series

Conference

Conference: 2021 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, WI-IAT 2021
Country/Territory: Australia
City: Virtual, Online
Period: 14/12/21 - 17/12/21

Keywords

  • boilerplate removal
  • cross-lingual model
  • multi-task learning
  • tag embedding
  • zero-shot learning
