Low-Resource Speech Recognition Based on Transfer Learning

Wei Hong Tsai, Phuong Le Thi, Tzu Chiang Tai, Chien Lin Huang, Jia Ching Wang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Peer-reviewed

1 Citation (Scopus)

Abstract

A great deal of research aims to improve the accuracy of end-to-end speech recognition and achieves high accuracy on various well-known corpora. However, many languages in the world do not have enough data to build their own speech recognition systems, and such systems often cannot produce reliable results or be deployed in the real world. Therefore, how to build a robust low-resource speech recognition system is an important issue in speech recognition. In this paper, we use the ESPnet toolkit to implement an end-to-end speech recognition model based on a sequence-to-sequence architecture, and the Fairseq toolkit to implement an unsupervised pre-training model that assists speech recognition. In addition, we use unlabeled speech data to help extract speech features, and, through transfer learning, we transfer a speech recognition model trained on a sufficient corpus to Hakka speech recognition, where far less corpus is available. Experimental results show that we establish a more robust low-resource Hakka speech recognition system.
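The abstract outlines a step that lends itself to a short illustration: using the unsupervised pre-training model to extract features from unlabeled speech. Below is a minimal sketch of that step with Fairseq, assuming a wav2vec 2.0-style checkpoint; the checkpoint name and audio path are illustrative stand-ins, since the paper's exact configuration is not given here.

```python
# Hedged sketch: frame-level feature extraction from unlabeled speech with a
# Fairseq wav2vec 2.0-style model. "wav2vec_small.pt" and the audio path are
# assumptions, not the paper's actual checkpoint or data.
import torch
import torchaudio
from fairseq import checkpoint_utils

# Load the pre-trained model (trained without transcripts, i.e. unsupervised).
models, cfg, task = checkpoint_utils.load_model_ensemble_and_task(
    ["wav2vec_small.pt"]
)
model = models[0].eval()

# wav2vec 2.0 expects 16 kHz mono waveforms shaped (batch, samples).
waveform, sample_rate = torchaudio.load("hakka_utterance.wav")
if sample_rate != 16000:
    waveform = torchaudio.functional.resample(waveform, sample_rate, 16000)

with torch.no_grad():
    # extract_features() returns a dict whose "x" entry holds the learned
    # representations that can stand in for hand-crafted acoustic features.
    out = model.extract_features(source=waveform, padding_mask=None, mask=False)

features = out["x"]  # shape: (batch, frames, feature_dim)
```

Features like these would then feed the ESPnet sequence-to-sequence recognizer; the transfer-learning step itself amounts to initializing that recognizer from the high-resource model's weights before fine-tuning on the smaller Hakka corpus.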

Original language: English
Title of host publication: Proceedings - 2022 RIVF International Conference on Computing and Communication Technologies, RIVF 2022
Editors: Vo Nguyen Quoc Bao, Tran Manh Ha
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 145-149
Number of pages: 5
ISBN (Electronic): 9781665461665
DOIs
Publication status: Published - 2022
Event: 2022 RIVF International Conference on Computing and Communication Technologies, RIVF 2022 - Ho Chi Minh City, Viet Nam
Duration: 20 Dec 2022 → 22 Dec 2022

Publication series

Name: Proceedings - 2022 RIVF International Conference on Computing and Communication Technologies, RIVF 2022

Conference: 2022 RIVF International Conference on Computing and Communication Technologies, RIVF 2022
Country/Territory: Viet Nam
City: Ho Chi Minh City
Period: 20/12/22 → 22/12/22
