137 - Nearest Neighbor Language Modeling and Machine Translation, with Urvashi Khandelwal
36 minutes
Description
2 years ago
We invited Urvashi Khandelwal, a research scientist at Google Brain, to talk about nearest neighbor language and machine translation models. These models interpolate parametric (conditional) language models with non-parametric distributions over the closest values in a datastore built from relevant data. Not only do these models outperform the usual parametric language models, they also have important implications for memorization and generalization in language models.

Urvashi's webpage: https://urvashik.github.io

Papers discussed:
1) Generalization through Memorization: Nearest Neighbor Language Models (https://www.semanticscholar.org/paper/7be8c119dbe065c52125ee7716601751f3116844)
2) Nearest Neighbor Machine Translation (https://www.semanticscholar.org/paper/20d51f8e449b59c7e140f7a7eec9ab4d4d6f80ea)
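To make the interpolation idea concrete, here is a minimal sketch of the kNN-LM-style combination described above: a datastore maps hidden-state keys to the tokens that followed them, the k nearest keys to the current context vector yield a non-parametric distribution, and that distribution is mixed with the parametric model's probabilities. All function and parameter names here are illustrative assumptions, not taken from the papers' actual code, and the real systems use an approximate-nearest-neighbor index over millions of entries rather than brute-force search.

```python
import numpy as np

def knn_lm_probs(context_vec, lm_probs, keys, values, vocab_size,
                 k=2, lam=0.25, temp=1.0):
    """Interpolate a parametric LM distribution with a kNN distribution.

    keys:   (N, d) array of datastore key vectors (e.g. decoder hidden states)
    values: (N,)   array of the token id that followed each key's context
    lam:    interpolation weight on the kNN distribution
    (Illustrative sketch; names and defaults are assumptions.)
    """
    # Squared L2 distance from the query context to every datastore key
    # (brute force; a real system would use an ANN index such as FAISS).
    dists = np.sum((keys - context_vec) ** 2, axis=1)
    nn = np.argsort(dists)[:k]  # indices of the k nearest keys

    # Softmax over negative distances gives one weight per neighbor.
    w = np.exp(-dists[nn] / temp)
    w /= w.sum()

    # Aggregate neighbor weights onto the token each neighbor stored.
    knn_probs = np.zeros(vocab_size)
    for idx, weight in zip(nn, w):
        knn_probs[values[idx]] += weight

    # kNN-LM interpolation: p(y|x) = lam * p_kNN(y|x) + (1 - lam) * p_LM(y|x)
    return lam * knn_probs + (1 - lam) * lm_probs
```

In this sketch, tokens whose stored contexts lie close to the current context get a probability boost, which is the mechanism behind the memorization-vs-generalization discussion in the episode.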