Deep Residual and Deep Dense Attentions in English Chinese Translation

Yi Xing Lin, Kai Wen Liang, Chih Hsuan Yang, Jia Ching Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Neural Machine Translation (NMT) with attention mechanisms has achieved impressive improvements in automated translation. However, such models may lose information across multiple layers of attention representations. This paper focuses on addressing this over-attention problem. In our English-Chinese translation experiments, the proposed model reduces the information error rate of output sentences by about 0.5%.
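The abstract does not spell out the architecture, but the title suggests ResNet- and DenseNet-style skip connections applied across stacked attention layers, so that information from earlier representations is not lost as attention is applied repeatedly. The following NumPy sketch illustrates that general idea under stated assumptions; the function names, layer counts, and dimensions are illustrative and are not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    # Scaled dot-product self-attention over a (tokens, dim) matrix.
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

def residual_attention_stack(x, n_layers=3):
    # Residual (ResNet-style) stacking: each attention output is added
    # to its input, so earlier representations are carried forward.
    h = x
    for _ in range(n_layers):
        h = h + attention(h, h, h)
    return h

def dense_attention_stack(x, n_layers=3):
    # Dense (DenseNet-style) stacking: each layer attends over the sum
    # of ALL earlier outputs, giving every layer a direct path back to
    # every previous representation.
    outputs = [x]
    for _ in range(n_layers):
        h = sum(outputs)
        outputs.append(attention(h, h, h))
    return sum(outputs)

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 8))  # 4 tokens, 8-dim embeddings
print(residual_attention_stack(x).shape)  # (4, 8)
print(dense_attention_stack(x).shape)     # (4, 8)
```

Both variants preserve the input shape, so they can be dropped into a standard encoder-decoder NMT stack in place of plain sequential attention layers.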

Original language: English
Title of host publication: 2021 IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781665433280
DOIs
State: Published - 2021
Event: 8th IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021 - Penghu, Taiwan
Duration: 15 Sep 2021 → 17 Sep 2021

Publication series

Name: 2021 IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021

Conference

Conference: 8th IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021
Country/Territory: Taiwan
City: Penghu
Period: 15/09/21 → 17/09/21

