Learning without forgetting by maximizing consistency

Published on: 16 October 2023

Primary Category: Computer Vision and Pattern Recognition

Paper Authors: Tao Zhuo, Zhiyong Cheng, Hehe Fan, Mohan Kankanhalli

Key Details

Proposes continual learning method without task identity or old data

Uses unlabeled auxiliary dataset to enhance prediction consistency

Selects reliable auxiliary samples with high old/new prediction discrepancy for regularization (see the sketch after this list)

Significantly reduces forgetting in class-incremental scenarios

Achieves strong performance compared to rehearsal methods
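
To make the selection step concrete, the following is a minimal PyTorch sketch of discrepancy-based sample selection. The function name, the per-sample KL divergence used as the discrepancy score, and the keep_frac parameter are illustrative assumptions, not the paper's exact implementation.

    import torch
    import torch.nn.functional as F

    @torch.no_grad()
    def select_high_discrepancy(old_model, new_model, x_aux, keep_frac=0.25):
        # Score each unlabeled auxiliary sample by how strongly the frozen
        # old model and the current model disagree on it (per-sample KL).
        old_probs = F.softmax(old_model(x_aux), dim=1)
        new_log_probs = F.log_softmax(new_model(x_aux), dim=1)
        discrepancy = F.kl_div(new_log_probs, old_probs, reduction="none").sum(dim=1)
        # Keep the fraction of samples with the highest disagreement.
        k = max(1, int(keep_frac * x_aux.size(0)))
        idx = discrepancy.topk(k).indices
        return x_aux[idx]

Scoring by disagreement focuses the consistency regularizer on exactly the auxiliary samples where the new model has drifted furthest from the old one.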

AI generated summary

This paper proposes a continual learning method that lets a model learn new tasks without forgetting old ones by maximizing the consistency between the old model's predictions and the new model's predictions. It requires neither task-identity information nor access to samples from previous tasks. To enforce consistency over a broader data distribution, the method uses an auxiliary unlabeled dataset, and it selects reliable samples from that dataset, namely those with the highest discrepancy between the old and new models' predictions, to further strengthen the regularization.
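
As a rough picture of the objective, the new model is trained with a supervised loss on the current task plus a consistency term on the selected auxiliary samples. The training step below is a minimal PyTorch sketch under assumed choices: a KL-divergence consistency loss against the frozen old model and a weighting factor lam; the paper's exact formulation may differ.

    import torch
    import torch.nn.functional as F

    def train_step(new_model, old_model, optimizer, x_new, y_new, x_aux, lam=1.0):
        # Supervised loss on the current task's labeled data.
        ce = F.cross_entropy(new_model(x_new), y_new)
        # Consistency loss: keep the new model's predictions on the auxiliary
        # unlabeled samples close to the frozen old model's predictions.
        with torch.no_grad():
            old_probs = F.softmax(old_model(x_aux), dim=1)
        new_log_probs = F.log_softmax(new_model(x_aux), dim=1)
        consistency = F.kl_div(new_log_probs, old_probs, reduction="batchmean")
        loss = ce + lam * consistency
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        return loss.item()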
