126 - Optimizing Continuous Prompts for Generation, with Lisa Li

48 minutes

Description

1 year ago
We invited Lisa Li to talk about her recent work, Prefix-Tuning:
Optimizing Continuous Prompts for Generation. Prefix tuning is a
lightweight alternative to finetuning: rather than updating all of the
model's parameters, it tunes only a fixed-length, task-specific
continuous vector (the prefix) while keeping the pretrained transformer
parameters frozen. We discussed how prefix tuning compares with
finetuning and other parameter-efficient alternatives on two generation
tasks under various experimental settings, and in which scenarios
prefix tuning is preferable. Lisa is a PhD student at Stanford
University. Lisa's webpage: https://xiangli1999.github.io/ The
hosts for this episode are Pradeep Dasigi and Ana Marasović.
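
To make the mechanism concrete, here is a minimal, hypothetical sketch
of the prefix-tuning idea on top of Hugging Face's GPT-2: a single
trainable tensor holds fixed-length key/value activations for every
layer, and it is fed to the frozen language model through the
past_key_values interface, so gradients reach only the prefix. The
class name PrefixTuner and the prefix_len parameter are illustrative,
not from the paper, and the paper's MLP reparameterization of the
prefix (used during training for stability) is omitted.

import torch
import torch.nn as nn
from transformers import GPT2LMHeadModel, GPT2Tokenizer

class PrefixTuner(nn.Module):
    """Sketch: freeze GPT-2, learn per-layer key/value prefix activations."""

    def __init__(self, model_name="gpt2", prefix_len=10):
        super().__init__()
        self.lm = GPT2LMHeadModel.from_pretrained(model_name)
        for p in self.lm.parameters():  # keep all pretrained weights frozen
            p.requires_grad = False
        cfg = self.lm.config
        self.prefix_len = prefix_len
        head_dim = cfg.n_embd // cfg.n_head
        # Trainable prefix: keys and values for every layer and head,
        # shape (n_layer, 2, n_head, prefix_len, head_dim).
        self.prefix = nn.Parameter(
            0.02 * torch.randn(cfg.n_layer, 2, cfg.n_head, prefix_len, head_dim)
        )

    def forward(self, input_ids, labels=None):
        bsz, seq_len = input_ids.shape
        # Hand the learned activations to the frozen LM as past_key_values,
        # so self-attention in every layer also attends to the prefix.
        past = tuple(
            (layer[0].unsqueeze(0).expand(bsz, -1, -1, -1),
             layer[1].unsqueeze(0).expand(bsz, -1, -1, -1))
            for layer in self.prefix
        )
        mask = torch.ones(bsz, self.prefix_len + seq_len, dtype=torch.long,
                          device=input_ids.device)
        return self.lm(input_ids=input_ids, past_key_values=past,
                       attention_mask=mask, labels=labels)

tok = GPT2Tokenizer.from_pretrained("gpt2")
model = PrefixTuner()
batch = tok("name : Starbucks | type : coffee shop", return_tensors="pt")
out = model(batch["input_ids"], labels=batch["input_ids"])
out.loss.backward()  # gradients flow only into model.prefix

With these settings the prefix holds 12 x 2 x 12 x 10 x 64 = 184,320
trainable values, well under 1% of GPT-2 small's 124M parameters, which
is the efficiency argument discussed in the episode.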
