After a number of prominent figures, including Elon Musk, Bill Gates, Steve Wozniak, and Professor Stephen Hawking, expressed fears that artificial intelligence could one day subjugate humanity, Google’s premier AI scientist has called those ideas “preposterous”.
“Whether it’s Terminator coming to blow us up or mad scientists looking to create quite perverted women robots, this narrative has somehow managed to dominate the entire landscape, which we find really quite remarkable,” Mustafa Suleyman, the head of applied AI at Google DeepMind, said.
Suleyman was speaking at Bloomberg’s AI2015 conference in London on Friday. He added: “The narrative has shifted from ‘Isn’t it terrible that AI has been such a failure?’ to ‘Isn’t it terrible that AI has been such a success?’”
DeepMind, the AI company co-founded by Suleyman, was bought by Google last year in a deal worth $400 million, and rose to prominence after writing a paper on an intelligent computer that was able to learn to play Atari games better than a human.
“On existential risk, our perspective is that it’s become a real distraction from the core ethics and safety issues, and it’s completely overshadowed the debate,” Suleyman said. “The way we think about AI is that it’s going to be a hugely powerful tool that we control and that we direct, whose capabilities we limit, just as you do with any other tool that we have in the world around us, whether they’re washing machines or tractors. We’re building them to empower humanity and not to destroy us.”
When asked why Google had been so secretive about the structure of its AI ethics board despite pleas for transparency, Suleyman agreed with the critics, saying, “That’s what I said to Larry [Page, Google’s co-founder]. I completely agree. Fundamentally we remain a corporation and I think that’s a question for everyone to think about. We’re very aware that this is extremely complex and we have no intention of doing this in a vacuum or alone.”
Thank you, Wall Street Journal, for providing us with this information.
Image courtesy of NYU.