
Simplifying large language models with adapters

Published on: 30 October 2023

Primary Category: Computation and Language

Paper Authors: Rishabh Bhardwaj, Tushar Vaidya, Soujanya Poria


Key Details

Adapter modules can efficiently adapt large pre-trained language models to new tasks

Tropical geometry helps identify important adapter parameters to preserve when pruning

An optimization method prunes adapters while maintaining hypersurface orientation

Experiments show that over 60% of adapter parameters can be pruned with little performance drop

A combined magnitude and tropical pruning method works best across tasks
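To make the first key detail concrete, here is a minimal sketch of a bottleneck adapter of the kind the paper builds on: a down-projection, a nonlinearity, an up-projection, and a residual connection, inserted between frozen transformer layers. All names (`make_adapter`, `adapter_forward`, `d_bottleneck`) are illustrative assumptions, not the paper's code.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_adapter(d_model=64, d_bottleneck=8):
    """Bottleneck adapter parameters: down- and up-projection matrices.

    Only 2 * d_model * d_bottleneck parameters are trained per adapter,
    far fewer than the d_model * d_model of a full layer.
    """
    return {
        "W_down": rng.normal(0.0, 0.02, (d_model, d_bottleneck)),
        "W_up": rng.normal(0.0, 0.02, (d_bottleneck, d_model)),
    }

def adapter_forward(params, h):
    """Residual adapter output: h + up(relu(down(h)))."""
    z = np.maximum(h @ params["W_down"], 0.0)  # down-projection + ReLU
    return h + z @ params["W_up"]              # up-projection + residual

adapter = make_adapter()
h = rng.normal(size=(4, 64))        # a batch of hidden states
out = adapter_forward(adapter, h)
print(out.shape)                    # same shape as the input: (4, 64)
```

Because only the small adapter matrices are trained, one frozen base model can serve many tasks, each with its own lightweight adapter.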

AI generated summary

This paper proposes inserting small adapter modules between the layers of a large pre-trained language model to adapt it to specific tasks, so that task-specific adaptation does not require retraining the full model. The authors then develop a pruning approach, grounded in tropical geometry, that removes adapter parameters without hurting performance by preserving the geometric characteristics of the network, in particular the orientation of its decision hypersurfaces.
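The paper's tropical criterion is not reproduced here, but the magnitude component of the combined pruning method can be sketched as follows: zero out the smallest-magnitude fraction of an adapter's weights. The function name and threshold logic are assumptions for illustration only.

```python
import numpy as np

def magnitude_prune(w, frac=0.6):
    """Zero out the smallest-magnitude fraction `frac` of entries in w."""
    k = int(frac * w.size)
    if k == 0:
        return w.copy()
    # k-th smallest absolute value becomes the pruning threshold
    thresh = np.partition(np.abs(w).ravel(), k - 1)[k - 1]
    return np.where(np.abs(w) <= thresh, 0.0, w)

rng = np.random.default_rng(1)
w = rng.normal(size=(8, 8))          # stand-in for an adapter weight matrix
pruned = magnitude_prune(w, frac=0.6)
sparsity = float(np.mean(pruned == 0.0))
print(f"sparsity: {sparsity:.2f}")
```

In the paper's combined method, a magnitude score like this is paired with a tropical-geometry score that favors parameters needed to keep the network's hypersurface orientation intact.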
