TEOTWAWKI: The Singularity, Part II
46 minutes
8 years ago
Stephen Hawking, Elon Musk, Martin Rees... all have warned about
the risk from Artificial Intelligence. Centres like the Future of
Humanity Institute in Oxford and the Centre for the Study of
Existential Risk at Cambridge rate the risk from artificial
intelligence as way up there on the scale of potential human
apocalypses. But it won't look like killer Terminator robots. We
are standing on the
precipice of a future that's almost impossible to comprehend.
Will it be possible to survive?
In this episode, we'll talk about why people are afraid that this
explosion in technology could lead to the end of the world.
Pictures of killer Terminators need not apply; instead, we're
discussing the philosophical and ethical problems that come with
artificial intelligence that could exceed the capacity of humans.
How can we be sure that it has the same values as us? How can we
be sure that it even understands the value of human life at all?
Why are people so afraid of such a miraculous technology? And is
there anything we can do to ensure that it will be used as a tool
rather than taking over entirely?
Hit us up on Twitter @physicspod if your brain hasn't yet been
uploaded to a computer and you still need to use a pesky keyboard
interface in meatspace to interact. There, you can donate to the
show (in fiat currency, not bitcoin, although I'll take some if
you have any going). Like and review us on iTunes, which will
help their 'artificially intelligent' algorithms serve up this
show to more consumer human robo-droids. Until next time, stay
safe.