126 - Optimizing Continuous Prompts for Generation, with Lisa Li
48 minutes
4 years ago
We invited Lisa Li to talk about her recent work, Prefix-Tuning:
Optimizing Continuous Prompts for Generation. Prefix tuning is a
lightweight alternative to finetuning: instead of updating all model
weights, it tunes only a fixed-length, task-specific continuous
vector while keeping the pretrained transformer parameters frozen.
We discussed how prefix tuning compares with finetuning and other
parameter-efficient alternatives on two tasks in various
experimental settings, and in which scenarios prefix tuning is
preferable. Lisa is a PhD student at Stanford University. Lisa's
webpage: https://xiangli1999.github.io/ The hosts for this episode
are Pradeep Dasigi and Ana Marasović.
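
To make the core idea concrete, here is a minimal PyTorch sketch of
the mechanism on a single self-attention layer: the "pretrained"
projections are frozen, and trainable prefix key/value vectors are
prepended so every token can attend to them. This is an illustrative
sketch assuming PyTorch 2.x, not the authors' implementation; the
names (PrefixAttention, prefix_len, etc.) and sizes are invented for
the example, and the real method applies a prefix at every layer of
a full pretrained transformer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

# Illustrative sketch of prefix tuning (not the paper's code):
# freeze the base attention weights, train only a short continuous
# prefix that is prepended to the keys and values.
class PrefixAttention(nn.Module):
    def __init__(self, d_model=64, n_heads=4, prefix_len=8):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        # Stand-ins for pretrained projections: frozen during prefix tuning.
        self.q_proj = nn.Linear(d_model, d_model)
        self.k_proj = nn.Linear(d_model, d_model)
        self.v_proj = nn.Linear(d_model, d_model)
        for proj in (self.q_proj, self.k_proj, self.v_proj):
            for p in proj.parameters():
                p.requires_grad = False
        # Task-specific continuous prefix: the only trainable parameters.
        self.prefix_k = nn.Parameter(torch.randn(prefix_len, d_model) * 0.02)
        self.prefix_v = nn.Parameter(torch.randn(prefix_len, d_model) * 0.02)

    def _split_heads(self, x):  # (B, T, d_model) -> (B, heads, T, d_head)
        B, T, _ = x.shape
        return x.view(B, T, self.n_heads, self.d_head).transpose(1, 2)

    def forward(self, x):
        B = x.size(0)
        q = self._split_heads(self.q_proj(x))
        # Prepend the trainable prefix to the frozen keys/values, so
        # queries attend over prefix positions plus the input tokens.
        k = torch.cat([self.prefix_k.expand(B, -1, -1), self.k_proj(x)], dim=1)
        v = torch.cat([self.prefix_v.expand(B, -1, -1), self.v_proj(x)], dim=1)
        out = F.scaled_dot_product_attention(q, self._split_heads(k),
                                             self._split_heads(v))
        return out.transpose(1, 2).reshape(B, x.size(1), -1)

layer = PrefixAttention()
x = torch.randn(2, 10, 64)          # (batch, seq_len, d_model)
print(layer(x).shape)               # torch.Size([2, 10, 64])
# Only the prefix receives gradients; the base weights stay frozen.
print([n for n, p in layer.named_parameters() if p.requires_grad])
# ['prefix_k', 'prefix_v']
```

Because gradients flow only into the prefix vectors, each new task
needs just this small set of parameters on top of one shared frozen
model, which is the "lightweight alternative to finetuning" discussed
in the episode.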