I work at ValueFirst Digital Media Private Ltd. as a Product Marketer on the Surbo team. Surbo is a chatbot generator platform owned by ValueFirst.
James Cameron pessimistic about artificial intelligence
James Cameron, who is set to reboot the "Terminator" franchise, believes artificial intelligence will probably be bad for the human race. In an interview with The Hollywood Reporter, Cameron reflected on "Terminator" and the advances in artificial intelligence. "Terminator" revolves around a cyborg assassin who is sent back in time from 2029 to 1984 to kill the woman whose son will one day lead humanity against the machines in a post-apocalyptic future.
Cameron has collaborated with "Deadpool" director Tim Miller to carry forward the franchise. The director said he is often asked the question about man versus machines, a theme that is recurrent in his films. "Technology has always scared me, and it's always seduced me. People ask me, 'Will the machines ever win against humanity?' I say, 'Look at people on their phones. The machines have already won'. It's just (that) they've won in a different way," the director, best known for blockbusters such as "Titanic" and "Avatar", said.
The filmmaker said that while technology and humans are co-evolving, there may come a time when artificial intelligence becomes too advanced for the human race. "One of the scientists we met with recently, she said: 'I used to be really, really optimistic, but now I'm just scared.' Her position on it is probably that we can't control this. It has more to do with human nature. "... At the very least, they will reflect our best and worst nature because we make them and we programme them. But it's going to take a lot of money. So who's got the money to do it and the will to do it? It could be business, so the Googles and the other big tech companies," he said.
Cameron said that, like previous inventions, artificial intelligence could be weaponised. "And if you're doing it for business, you're doing it to improve your market share or whatever your business goals are. So you're essentially taking a machine smarter than a human and teaching it greed. Or it's for defense, in which case you're taking a machine smarter than a human and teaching it to kill. Neither one of those has a good outcome in my mind," he said.