Abstract
We study associated learning (AL), an alternative methodology to end-to-end backpropagation (BP). We introduce a workflow for converting a neural network into an AL-form network, so that AL can be used to learn the parameters of various types of neural networks. We compare AL and BP on some of the most successful neural network architectures: convolutional neural networks, recurrent neural networks, and Transformers. Experimental results show that AL consistently outperforms BP on open datasets. We discuss possible reasons for AL's success, its limitations, and newly discovered properties of AL. Our implementation is available at https://github.com/Hibb-bb/AL.
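The core idea described above, decomposing end-to-end backpropagation into local, per-block objectives, can be illustrated with a minimal sketch. This is not the authors' implementation (see their repository for that); it is a simplified toy in PyTorch in which each block pairs a feature transform with a "bridge" head trained on its own local loss, and gradients are detached between blocks so no end-to-end gradient flows. The names `f1`, `b1`, etc. and the squared-error local losses are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)
x = torch.randn(8, 16)   # toy input batch
y = torch.randn(8, 4)    # toy regression targets

# Block i = feature transform f_i + local "bridge" b_i toward the target.
f1, b1 = nn.Linear(16, 32), nn.Linear(32, 4)
f2, b2 = nn.Linear(32, 32), nn.Linear(32, 4)
opt = torch.optim.SGD([*f1.parameters(), *b1.parameters(),
                       *f2.parameters(), *b2.parameters()], lr=0.1)

first = None
for _ in range(50):
    opt.zero_grad()
    h1 = torch.relu(f1(x))
    loss1 = nn.functional.mse_loss(b1(h1), y)   # local loss for block 1
    h2 = torch.relu(f2(h1.detach()))            # detach: no end-to-end gradient
    loss2 = nn.functional.mse_loss(b2(h2), y)   # local loss for block 2
    total = loss1 + loss2
    if first is None:
        first = total.item()
    total.backward()      # each gradient stays within its own block
    opt.step()
final = total.item()
print(first, "->", final)
```

Because each block optimizes only its local objective, the blocks could in principle be updated independently (or in a pipeline), which is the practical appeal of decomposing BP this way.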
| Original language | English |
|---|---|
| State | Published - 2022 |
| Event | 10th International Conference on Learning Representations, ICLR 2022 - Virtual, Online |
| Duration | 25 Apr 2022 → 29 Apr 2022 |
Associated Learning: A Methodology to Decompose End-to-End Backpropagation on CNN, RNN, and Transformer