Stephen Hawking is worried that artificial intelligence could copy itself and replace humans as Earth's dominant life form. It sounds farfetched, but the 75-year-old physicist is not alone in his predictions, as tech entrepreneurs have also sounded the alarm over such systems.
"I fear that A.I. may replace humans altogether," Hawking said in an interview with Wired this week. "If people design computer viruses, someone will design A.I. that improves and replicates itself. This will be a new form of life that outperforms humans."
Hawking isn't the first big name to warn about A.I. SpaceX and Tesla founder Elon Musk has claimed the technology poses "a fundamental risk to the existence of human civilization," and that its exponential growth is advancing "like a tidal wave." Among other projects, Musk has criticized plans to develop a godlike A.I. capable of forming its own religion.
Musk and Hawking have worked together before to warn about the future of A.I. The pair joined hundreds of other researchers and experts earlier this year to endorse a list of 23 guiding principles for future A.I. development. The Asilomar A.I. Principles include basic ideas, like ensuring humans retain control, but also cover bigger concepts, like ensuring such systems provide a shared benefit and designing machines with values compatible with human values.
In the wider tech industry, responses to these warnings have varied. But as A.I. forms a bigger part of people's lives, the conversation is shifting.
"[Musk is] a deep thinker about problems, and I think it's right to be concerned about A.I.," Google CEO Sundar Pichai said in an interview, when told Musk had raised concerns about his company's A.I.-powered camera.
Like Musk, Hawking is worried people aren't concerned enough about the pace of change. At the launch of Cambridge University's artificial intelligence center last year, he said that while experiments with machines learning how to play Go were cool, they "will surely pale against what the coming decades will bring."
"I believe there is no deep difference between what can be achieved by a biological brain and what can be achieved by a computer," Hawking said. "It therefore follows that computers can, in theory, emulate human intelligence and exceed it."