
Continual learning for named entity recognition

Published on: 23 October 2023

Primary Category: Computation and Language

Paper Authors: Duzhen Zhang, Wei Cong, Jiahua Dong, Yahan Yu, Xiuyi Chen, Yonggang Zhang, Zhen Fang


Key Details

Proposes techniques for continual learning in named entity recognition (NER)

Uses distillation of attention weights to retain linguistic knowledge (a minimal sketch follows this list)

Handles the semantic shift of the 'non-entity' class via pseudo-labeling (sketched after the summary below)

Significantly outperforms prior methods for continual NER

Achieves state-of-the-art on multiple benchmark datasets
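The paper's attention-weight distillation is only named here, not specified, but the general idea can be sketched: keep a frozen copy of the model trained on earlier entity types and penalize the new model when its transformer attention maps drift away from the old ones. The function name, tensor shapes, and the choice of an MSE penalty below are illustrative assumptions, not the authors' exact formulation.

```python
import torch.nn.functional as F

def attention_distillation_loss(old_attentions, new_attentions):
    """Penalize drift between the frozen old model's attention maps and the
    new model's, with one term per transformer layer.

    Each element is assumed to have shape (batch, heads, seq_len, seq_len).
    """
    loss = 0.0
    for a_old, a_new in zip(old_attentions, new_attentions):
        # Detach the old model's maps so gradients flow only into the new model.
        loss = loss + F.mse_loss(a_new, a_old.detach())
    return loss / max(len(old_attentions), 1)
```

In practice this term would be added to the standard NER cross-entropy loss with a weighting coefficient; that weighting is a tuning choice and is not taken from the paper.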

AI generated summary


This paper proposes methods that let neural network models continually learn to recognize new types of named entities in text without forgetting previously learned entity types. The key ideas are distilling the model's attention weights to retain linguistic knowledge, and pseudo-labeling to handle the semantic shift of the 'non-entity' class across learning steps.
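The pseudo-labeling idea can be sketched as follows: tokens annotated as 'O' (non-entity) in the new task's data may actually belong to previously learned entity types, so a frozen copy of the old model re-labels them when it is confident. The function name, label encoding, and confidence threshold below are illustrative assumptions rather than the authors' exact procedure.

```python
import torch

def pseudo_label_non_entity(new_labels, old_logits, non_entity_id=0, threshold=0.7):
    """Replace 'O' labels with the frozen old model's confident predictions.

    new_labels: (batch, seq_len) gold labels for the current task; tokens of
                previously learned entity types appear here as non_entity_id.
    old_logits: (batch, seq_len, num_old_classes) scores from the old model.
    """
    probs = torch.softmax(old_logits, dim=-1)
    confidence, old_pred = probs.max(dim=-1)
    labels = new_labels.clone()
    # Overwrite only positions annotated as non-entity where the old model
    # confidently predicts one of its previously learned entity types.
    mask = (labels == non_entity_id) & (old_pred != non_entity_id) & (confidence >= threshold)
    labels[mask] = old_pred[mask]
    return labels
```

The new model is then trained with cross-entropy on these merged labels, so supervision for old entity types is preserved even though the new task only annotates the new types.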
