Single-Head Lifelong Learning Based on Distilling Knowledge

Yen Hsiang Wang, Chih Yang Lin, Tipajin Thaipisutikul, Timothy K. Shih

Research output: Contribution to journal › Article › peer-review

3 Scopus citations

Abstract

In machine learning, the main purpose of lifelong learning, also known as continuous learning, is to enable neural networks to learn continuously, as humans do. Lifelong learning accumulates the knowledge learned from previous tasks and transfers it to support the neural network on future tasks. This technique not only avoids catastrophic forgetting of previous tasks when training on new tasks, but also makes the model more robust to temporal evolution. Motivated by recent advances in lifelong learning techniques, this paper presents a novel feature-based knowledge distillation method that differs from existing knowledge distillation methods in lifelong learning. Specifically, our proposed method takes features from intermediate layers and compresses them through a branch network consisting of global average pooling and fully connected layers. We then use the output of this branch network to deliver information from previous tasks to the model trained on future tasks. Extensive experiments show that our proposed model consistently outperforms state-of-the-art baselines, improving accuracy by at least two percent under different experimental settings.
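To make the branch-network idea in the abstract concrete, the following is a minimal sketch in PyTorch: intermediate features are compressed with global average pooling and a fully connected layer, and the branch output computed on the frozen previous-task model supervises the branch output of the current model. The layer sizes, the choice of an L2 distillation loss, and all names (DistillBranch, branch_distillation_loss) are illustrative assumptions, not the authors' exact implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class DistillBranch(nn.Module):
    """Compress an intermediate feature map: global average pooling + fully connected layer."""

    def __init__(self, in_channels: int, embed_dim: int = 128):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)    # global average pooling
        self.fc = nn.Linear(in_channels, embed_dim)

    def forward(self, feature_map: torch.Tensor) -> torch.Tensor:
        x = self.pool(feature_map).flatten(1)  # (B, C, H, W) -> (B, C)
        return self.fc(x)                      # (B, embed_dim)


def branch_distillation_loss(old_branch_out: torch.Tensor,
                             new_branch_out: torch.Tensor) -> torch.Tensor:
    """Match the current model's branch output to the frozen previous-task model's output (assumed L2 loss)."""
    return F.mse_loss(new_branch_out, old_branch_out.detach())


# Usage sketch: old_feat and new_feat stand in for intermediate feature maps taken
# from the same layer of the frozen previous-task model and the current model.
if __name__ == "__main__":
    branch = DistillBranch(in_channels=256)
    old_feat = torch.randn(4, 256, 8, 8)  # placeholder for frozen-model features
    new_feat = torch.randn(4, 256, 8, 8)  # placeholder for current-model features
    loss = branch_distillation_loss(branch(old_feat), branch(new_feat))
    print(loss.item())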

Original language: English
Pages (from-to): 35469-35478
Number of pages: 10
Journal: IEEE Access
Volume: 10
DOIs
State: Published - 2022

Keywords

  • Lifelong learning
  • continuous learning
  • incremental learning
  • knowledge distillation

