Deep Residual and Deep Dense Attentions in English Chinese Translation

Yi Xing Lin, Kai Wen Liang, Chih Hsuan Yang, Jia Ching Wang

Research output: Book contribution/Report › Conference paper chapter › Peer-reviewed

Abstract

Neural Machine Translation (NMT) with an attention mechanism has achieved impressive improvements in automated translation. However, such models may lose information when attention representations are computed repeatedly across stacked layers. This paper focuses on addressing this over-attention problem. In our English-Chinese translation experiments, the proposed model reduces the information error rate of output sentences by about 0.5%.
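The paper's exact architecture is not given in this abstract, but the core idea of a residual attention stack — letting each layer add to, rather than replace, the running representation so information is not lost across repeated attention — can be sketched minimally. The following NumPy sketch assumes plain scaled dot-product self-attention without learned projections; all function names are illustrative, not the authors' implementation:

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    # scaled dot-product self-attention (learned Q/K/V projections
    # omitted to keep the sketch minimal)
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)
    return softmax(scores) @ x

def stacked_attention(x, n_layers=4, residual=True):
    # with residual=True, each layer's output is added to the running
    # representation, so earlier-layer information survives the stack;
    # with residual=False, repeated attention can wash it out
    h = x
    for _ in range(n_layers):
        out = self_attention(h)
        h = h + out if residual else out
    return h

x = np.random.randn(5, 8)   # 5 tokens, hidden dimension 8
y = stacked_attention(x)
print(y.shape)              # (5, 8)
```

A dense variant, in the spirit of the title, would instead feed the concatenation of all previous layer outputs into each new attention layer.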

Original language: English
Title of host publication: 2021 IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (electronic): 9781665433280
DOIs
Publication status: Published - 2021
Event: 8th IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021 - Penghu, Taiwan
Duration: 15 Sep 2021 → 17 Sep 2021

Publication series

Name: 2021 IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021

Conference: 8th IEEE International Conference on Consumer Electronics-Taiwan, ICCE-TW 2021
Country/Territory: Taiwan
City: Penghu
Period: 15/09/21 → 17/09/21
