Instruction tuning enables controllable text generation

Published on: 2 May 2024

Primary Category: Computation and Language

Paper Authors: Dhananjay Ashok, Barnabas Poczos

Key Details

Instruction tuning offers a new approach to controllable text generation

Algorithm introduced to create constraint datasets without human curation

New ConGenBench benchmark spans 17 datasets and 18 constraints

Prompt-based methods outperform specialized generation methods

Performance is competitive with humans on stylistic tasks, but gaps remain on structural constraints

AI-generated summary

This paper explores instruction tuning of large language models as an approach to controllable text generation. The authors introduce an algorithm that automatically generates constraint datasets from only a task dataset and a natural language description of the constraint. Benchmarking instruction-tuned models on ConGenBench, a new testbed, they find that prompting outperforms other controllable generation methods, although challenges remain with structural constraints.
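
The summary describes the dataset-creation step only at a high level. The sketch below illustrates one plausible reading of it, in which an instruction-tuned model labels task examples against a natural-language constraint description, yielding a constraint dataset without human curation. The function name, prompt wording, and yes/no labeling scheme are all assumptions for illustration, not the paper's actual implementation.

```python
from typing import Callable

def build_constraint_dataset(
    task_texts: list[str],
    constraint_description: str,
    generate: Callable[[str], str],  # any instruction-tuned LLM, wrapped as prompt -> text
) -> list[dict]:
    """Label each task example against a natural-language constraint,
    producing a constraint dataset with no human curation."""
    dataset = []
    for text in task_texts:
        # Hypothetical prompt template; the paper's actual template is not given here.
        prompt = (
            f"Constraint: {constraint_description}\n"
            f"Text: {text}\n"
            "Does the text satisfy the constraint? Answer yes or no."
        )
        answer = generate(prompt).strip().lower()
        dataset.append({"text": text, "satisfies": answer.startswith("yes")})
    return dataset

# Example usage with a hypothetical model wrapper:
# data = build_constraint_dataset(
#     ["The movie was wonderful.", "Service was slow and rude."],
#     "The text expresses positive sentiment",
#     my_llm_generate,
# )
```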
