Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn
Whistleblower William Saunders quit over systemic issues at OpenAI.
Now he’s put his name to an open letter that proposes four principles
to protect the right of industry insiders to warn the public about
AI risks.
This week, a group of current and former employees from OpenAI
and Google DeepMind penned an open letter accusing the industry’s
leading companies of prioritizing profits over safety. This comes
after a spate of high-profile departures from OpenAI, including
co-founder Ilya Sutskever and senior researcher Jan Leike, as
well as reports that OpenAI has gone to great lengths to silence
would-be whistleblowers.
The writers of the open letter argue that researchers have a
“right to warn” the public about AI risks, and they lay out a series
of principles that would protect that right. In this episode, we
sit down with one of those writers: William Saunders, who left
his job as a research engineer at OpenAI in February. William is
now breaking the silence on what he saw at OpenAI that compelled
him to leave the company and to put his name to this
letter.
RECOMMENDED MEDIA
The Right to Warn Open Letter
My Perspective On "A Right to Warn about Advanced Artificial
Intelligence": A follow-up from William about the letter
Leaked OpenAI documents reveal aggressive tactics toward former
employees: An investigation by Vox into OpenAI’s non-disparagement
agreements.
RECOMMENDED YUA EPISODES
A First Step Toward AI Regulation with Tom Wheeler
Spotlight on AI: What Would It Take For This to Go Well?
Big Food, Big Tech and Big AI with Michael Moss
Can We Govern AI? With Marietje Schaake
Your Undivided Attention is produced by
the Center for Humane Technology. Follow us on
Twitter: @HumaneTech_