
Dynamic task-aware network expansion for long-tail incremental learning

Published on:

8 February 2024

Primary Category:

Computer Vision and Pattern Recognition

Paper Authors:

Linjie Li,

S. Liu,

Zhenyu Wu,

Ji Yang


Key Details

Proposes a Task-aware Expandable (TaE) framework that mitigates memory explosion by expanding the network in a way suited to each task (see the sketch after this list)

Develops a Centroid-Enhanced method to strengthen class-specific features and improve tail-class discriminability

Achieves SOTA results on CIFAR-100 and ImageNet100 benchmarks under different long-tail settings

Significantly reduces the memory footprint of the expandable model while improving accuracy

The framework is robust, outperforming baselines across a range of incremental steps and imbalance ratios
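
The selective-update idea behind the first bullet can be illustrated with a short sketch. This is not the authors' code: the gradient-magnitude scoring, the `ratio` value, and the helper names (`select_task_params`, `masked_step`) are assumptions made for illustration of the general "train only a small, task-sensitive subset of parameters" pattern.

```python
# Minimal sketch (assumed, not the paper's implementation) of task-aware
# selective fine-tuning: score parameters on the new task's data, then
# update only a small fraction of them so the rest stay frozen.
import torch
import torch.nn as nn


def select_task_params(model: nn.Module, loader, criterion, ratio: float = 0.05):
    """Score parameters by accumulated gradient magnitude on the new task,
    then build binary masks keeping only the top `ratio` fraction trainable."""
    model.train()
    scores = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for x, y in loader:
        model.zero_grad()
        criterion(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                scores[n] += p.grad.abs()

    # Global threshold: entries above it belong to the top `ratio` fraction.
    flat = torch.cat([s.flatten() for s in scores.values()])
    k = max(1, int(ratio * flat.numel()))
    threshold = flat.topk(k).values.min()
    return {n: (s >= threshold).float() for n, s in scores.items()}


def masked_step(model: nn.Module, masks: dict, lr: float = 0.01):
    """Update only the selected task-aware entries; all other weights stay frozen."""
    with torch.no_grad():
        for n, p in model.named_parameters():
            if p.grad is not None:
                p -= lr * p.grad * masks[n]
```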

AI generated summary


This paper introduces a novel framework called Task-aware Expandable (TaE) to address the challenges of long-tail class-incremental learning. TaE dynamically allocates and updates a small subset of task-specific trainable parameters to learn representations tailored to each new incremental task, while resisting forgetting by freezing the majority of model parameters. A Centroid-Enhanced method is also proposed to encourage class-specific feature representations and address the low discriminability of tail-class features. Experiments on CIFAR-100 and ImageNet100 under different long-tail settings demonstrate state-of-the-art performance for TaE.
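
To make the Centroid-Enhanced idea concrete, here is a minimal sketch of one plausible auxiliary loss: features are pulled toward running per-class centroids so tail classes keep compact, class-specific representations. The class name `CentroidEnhancedLoss`, the EMA centroid update, and the cosine-distance pull term are assumptions; the paper's exact formulation may differ.

```python
# Hedged sketch of a centroid-enhanced auxiliary loss (an interpretation,
# not the paper's exact objective).
import torch
import torch.nn.functional as F


class CentroidEnhancedLoss(torch.nn.Module):
    def __init__(self, num_classes: int, feat_dim: int, momentum: float = 0.9):
        super().__init__()
        # Running class centroids, updated with an EMA of batch features.
        self.register_buffer("centroids", torch.zeros(num_classes, feat_dim))
        self.momentum = momentum

    @torch.no_grad()
    def update(self, feats: torch.Tensor, labels: torch.Tensor):
        for c in labels.unique():
            mean_c = feats[labels == c].mean(dim=0)
            self.centroids[c] = self.momentum * self.centroids[c] + (1 - self.momentum) * mean_c

    def forward(self, feats: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
        # Cosine-distance pull: each normalized feature toward its class centroid.
        f = F.normalize(feats, dim=1)
        c = F.normalize(self.centroids[labels], dim=1)
        return (1 - (f * c).sum(dim=1)).mean()
```

In training, such a term would typically be added to the cross-entropy loss with a weighting coefficient (a hyperparameter assumed here, not taken from the paper).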

