
126 - Optimizing Continuous Prompts for Generation, with Lisa Li
48 minutes
Description
1 year ago
We invited Lisa Li to talk about her recent work, Prefix-Tuning:
Optimizing Continuous Prompts for Generation. Prefix tuning is a
lightweight alternative to finetuning: instead of updating all model
weights, it tunes only a fixed-length, task-specific continuous
vector (the prefix) while keeping the pretrained transformer
parameters frozen. We discussed how prefix tuning compares with
finetuning and other parameter-efficient alternatives on two
generation tasks (table-to-text generation and summarization) in
various experimental settings, and in what scenarios prefix tuning
is preferable. Lisa is a PhD student at Stanford University.
Lisa's webpage: https://xiangli1999.github.io/
The hosts for this episode are Pradeep Dasigi and Ana Marasović.
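
To make the idea concrete, here is a minimal sketch in PyTorch of what prefix tuning does, assuming a generic pretrained model that accepts input embeddings. This is a simplification: the method discussed in the episode prepends trainable prefix activations at every transformer layer (reparameterized through an MLP during training), whereas this sketch prepends a learned prefix only once, at the embedding layer. All names here (PrefixTuning, pretrained_lm, etc.) are illustrative, not from Lisa Li's released code.

import torch
import torch.nn as nn

class PrefixTuning(nn.Module):
    # Sketch only: the actual method injects prefix activations at
    # every layer; here the prefix is prepended at the embedding layer
    # to show the core idea of tuning a continuous prefix while the
    # pretrained model stays frozen.
    def __init__(self, pretrained_lm: nn.Module, prefix_len: int, d_model: int):
        super().__init__()
        self.lm = pretrained_lm
        for p in self.lm.parameters():
            p.requires_grad = False  # freeze all pretrained parameters
        # The only trainable parameters: a fixed-length, task-specific
        # continuous vector of shape (prefix_len, d_model).
        self.prefix = nn.Parameter(torch.randn(prefix_len, d_model) * 0.02)

    def forward(self, input_embeds: torch.Tensor) -> torch.Tensor:
        # Prepend the learned prefix to every sequence in the batch.
        batch_size = input_embeds.size(0)
        prefix = self.prefix.unsqueeze(0).expand(batch_size, -1, -1)
        return self.lm(torch.cat([prefix, input_embeds], dim=1))

# Toy usage with a stand-in model; only `prefix` receives gradients.
model = PrefixTuning(nn.Identity(), prefix_len=10, d_model=768)
out = model(torch.randn(2, 5, 768))   # output shape: (2, 15, 768)
print(sum(p.numel() for p in model.parameters() if p.requires_grad))  # 7680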