
LLaMA Pro: Progressive LLaMA with Block Expansion

Published on: 4 January 2024

Primary Category: Computation and Language

Paper Authors: Chengyue Wu, Yukang Gan, Yixiao Ge, Zeyu Lu, Jiahao Wang, Ye Feng, Ping Luo, Ying Shan


Key Details

Proposes a block expansion method that adds new knowledge to LLMs without catastrophic forgetting (see the sketch after this list)

Creates LLaMA Pro by adding 8 blocks to a LLaMA base model and tuning them on code and math data

LLaMA Pro matches or exceeds other open models on language and reasoning tasks

The instruction-tuned LLaMA Pro outperforms other LLaMA-family models across benchmarks

Shows a way to integrate language, code, and reasoning abilities in a single model

AI-generated summary


The paper proposes a method that expands LLaMA models with additional transformer blocks and tunes only those new blocks on new data, adding knowledge without losing existing capabilities. This yields LLaMA Pro, a model enhanced at code and math that retains its general skills. Experiments show that LLaMA Pro and its instruction-tuned version outperform other LLaMA-family models, demonstrating that language, code, and reasoning abilities can be combined in one model.

