ASSOCIATED LEARNING: A METHODOLOGY TO DECOMPOSE END-TO-END BACKPROPAGATION ON CNN, RNN, AND TRANSFORMER

Dennis Y. Wu, Di Nan Lin, Vincent F. Chen, Hung Hsuan Chen

Research output: Conference contribution › Conference paper › Peer-reviewed

Abstract

We study associated learning (AL), an alternative methodology to end-to-end backpropagation (BP). We introduce the workflow to convert a neural network into an AL-form network such that AL can be used to learn parameters for various types of neural networks. We compare AL and BP on some of the most successful neural networks: convolutional neural networks, recurrent neural networks, and Transformers. Experimental results show that AL consistently outperforms BP on open datasets. We discuss possible reasons for AL's success, its limitations, and AL's newly discovered properties. Our implementation is available at https://github.com/Hibb-bb/AL.

Original language: English
Publication status: Published - 2022
Event: 10th International Conference on Learning Representations, ICLR 2022 - Virtual, Online
Duration: 25 Apr 2022 - 29 Apr 2022


Conference: 10th International Conference on Learning Representations, ICLR 2022
City: Virtual, Online
Period: 25/04/22 - 29/04/22
