Crush. Kill. Destroy. The singularity – the super-evolution of sentient machines – is coming, according to Professor Stephen Hawking, and we should all be afraid.
“The development of full artificial intelligence could spell the end of the human race,” Professor Hawking said to the BBC. “It would take off on its own, and re-design itself at an ever increasing rate. Humans, who are limited by slow biological evolution, couldn’t compete, and would be superseded.”
“We cannot quite know what will happen if a machine exceeds our own intelligence, so we can’t know if we’ll be infinitely helped by it, or ignored by it and sidelined, or conceivably destroyed by it,” Hawking warns. This echoes the opinion of Elon Musk, chief executive of SpaceX, who voiced his fears over the summer, calling advanced AI “more dangerous than nukes” and “our biggest existential threat”.