All that sounds impressive, for sure. But to Stephen Hawking, it also sounds ominous.
In a recent interview with BBC, Hawking said that the “development of full artificial intelligence could spell the end of the human race.”
"It would take off on its own, and re-design itself at an ever increasing rate," Hawking said. “Humans, who are limited by slow biological evolution, couldn't compete, and would be superseded."
The possibility that AI will eventually outstrip humans' ability to keep pace with it is known as the “technological singularity,” and it's a concern Hawking has raised before. As he wrote in a May op-ed for The Independent: “Success in creating AI would be the biggest event in human history. Unfortunately, it might also be the last.”
Hawking isn't alone in his concern. Elon Musk, CEO of SpaceX and Tesla Motors, said last month that “we should be very careful about artificial intelligence. With artificial intelligence we are summoning the demon. You know all those stories where there's the guy with the pentagram and the holy water, and it's like yeah, he's sure he can control the demon. Doesn't work out.”
But not everyone is sounding the alarm.
Noted AI expert Oren Etzioni, CEO of the Allen Institute for Artificial Intelligence in Seattle, told participants in a recent Reddit AMA that he wasn't afraid of AI and that “I don’t think you should be either.”
The singularity might arrive in a million years or so, Etzioni said, but apocalyptic visions of clever computers taking over the world are simply “silly.”
“The plausible scenario based on my working actively in this field for more than 25 years is that we will continue to make progress but that there's no runaway intelligence... We are building increasingly sophisticated programs to read and understand text, but they are in no danger of running anywhere,” Etzioni wrote.
Source: http://www.huffingtonpost.com/2014/12/02/stephen-hawking-ai-artificial-intelligence-dangers_n_6255338.html