Learning with Sharing: An Edge-Optimized Incremental Learning Method for Deep Neural Networks

Muhammad Awais Hussain, Shih An Huang, Tsung Han Tsai

Research output: Contribution to journal › Journal article › peer-review

4 Citations (Scopus)

Abstract

Incremental learning techniques aim to extend a pre-trained Deep Neural Network (DNN) model with new classes. However, DNNs suffer from catastrophic forgetting during the incremental learning process. Existing incremental learning techniques try to reduce the effect of catastrophic forgetting either by reusing samples of previous data while adding new classes to the model or by designing complex model architectures. This leads to high design complexity and memory requirements, which make incremental learning impractical on edge devices with limited memory and computation resources. We therefore propose a new incremental learning technique, Learning with Sharing (LwS), based on the concept of transfer learning. The main aims of LwS are to reduce training complexity and storage memory requirements while achieving high accuracy during the incremental learning process. We perform cloning and sharing of fully connected (FC) layers to add new classes to the model incrementally. The proposed technique preserves the knowledge of existing classes and adds new classes without storing data from the previous classes. We show that it outperforms state-of-the-art techniques in accuracy on the Cifar-100, Caltech-101, and UCSD Birds datasets.
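The paper's code is not reproduced on this page; the PyTorch sketch below is only a rough illustration, under assumed details, of the cloning-and-sharing idea described in the abstract: a shared feature extractor is frozen, the existing FC head is cloned to initialise a head for the new classes, and only the new head is trained, so no samples from previous classes need to be stored. The class and method names (LwSSketch, add_classes) and the head-initialisation scheme are hypothetical, not the authors' exact LwS implementation.

```python
import torch
import torch.nn as nn

class LwSSketch(nn.Module):
    """Illustrative sketch: frozen shared feature extractor with per-increment FC heads."""

    def __init__(self, feature_extractor: nn.Module, base_head: nn.Linear):
        super().__init__()
        self.features = feature_extractor          # shared across all increments, kept frozen
        for p in self.features.parameters():
            p.requires_grad = False
        self.heads = nn.ModuleList([base_head])    # FC head for the original classes

    def add_classes(self, num_new_classes: int) -> nn.Linear:
        # Clone the most recent FC head so the new head starts from learned weights,
        # then size its output for the new classes; only this new head is trained.
        old_head = self.heads[-1]
        new_head = nn.Linear(old_head.in_features, num_new_classes)
        with torch.no_grad():
            n = min(num_new_classes, old_head.out_features)
            new_head.weight[:n].copy_(old_head.weight[:n])
            new_head.bias[:n].copy_(old_head.bias[:n])
        self.heads.append(new_head)
        return new_head

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.features(x)
        # Concatenate logits from all heads: earlier heads stay untouched, so the
        # knowledge of existing classes is preserved without replaying old data.
        return torch.cat([head(z) for head in self.heads], dim=1)
```

In this reading, each incremental step trains only the newly cloned FC head on the new classes, which keeps both the training cost and the extra storage small, in line with the edge-device focus stated in the abstract.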

Original language: English
Pages (from-to): 461-473
Number of pages: 13
Journal: IEEE Transactions on Emerging Topics in Computing
Volume: 11
Issue number: 2
DOIs
Publication status: Published - 1 Apr 2023
