Nand Kishor Contributor

Nand Kishor is the Product Manager of House of Bots. After finishing his studies in computer science, he ideated and relaunched a real-estate business intelligence tool, creating one of the leading business intelligence tools for property price analysis in 2012. He also writes, researches and shares knowledge about Artificial Intelligence (AI), Machine Learning (ML), Data Science, Big Data, the Python language and related topics.



Why We Must Not Build Automated Weapons of War

By Nand Kishor | Sep 26, 2017 | 4659 Views

Over 100 CEOs of artificial intelligence and robotics firms recently signed an open letter warning that their work could be repurposed to build lethal autonomous weapons - "killer robots." They argued that building such weapons would open a "Pandora's box" that could forever alter war.

Over 30 countries have armed drones or are developing them, and each successive generation of drones has more autonomy. Automation has long been used in weapons to help identify targets and maneuver missiles, but to date humans have remained in control of the decision to use lethal force. Militaries have used automated engagements only in limited settings, to defend against high-speed rockets and missiles. Advances in autonomous technology could change that. The same intelligence that allows self-driving cars to avoid pedestrians could allow future weapons to hunt and attack targets on their own.

For the past three years, countries have met through the United Nations to discuss lethal autonomous weapons. Over 60 non-governmental organizations have called for a treaty banning autonomous weapons. Yet most countries are hedging their bets. No major military powers have said they plan to build autonomous weapons, but few have taken them off the table.


There's a certain irony in the CEOs of robotics and AI companies warning of the dangers of the very same technologies they themselves are building. They implore countries to "double their efforts" in international negotiations and warn that "we do not have long to act." But if the situation is truly dire, couldn't these companies slow their research to buy diplomats more time?

In reality, even if all of these companies stopped research, the field of AI would continue marching forward. The intelligence behind autonomous robots isn't like stealth technology, which was created in secret defense labs and tightly controlled by the military. Autonomous technology is everywhere. Hobbyist drones that retail for a few hundred dollars can take off, land, follow moving objects and avoid obstacles all on their own. Elementary school students build robots in competitions. Even the Islamic State is getting in on the game, strapping bombs to small drones. There is no stopping AI. Robotics companies can't easily band together to stop progress, because it only takes one company to break the agreement and advance the technology. Besides, to ask companies to stop research would be to ask them to forgo innovations that could generate profits and save lives.

These same dynamics make constraining autonomous weapons internationally very difficult. Asking countries to sign a treaty banning a weapon that doesn't yet exist means asking them to forgo a potentially useful tool to defend against threats and save lives. Moreover, the same problem of cheaters applies in the international arena, but the stakes are higher. Instead of lost profits, a nation might lose a war. History suggests that even when the international community widely condemns a weapon as inhumane - like chemical weapons - some despots will use them anyway. Treaties alone won't prevent rogue regimes and terrorists from building autonomous weapons. If autonomous weapons led to a decisive advantage in war, a treaty that disarmed only those who care for the rule of law would be the worst of all possible worlds.

The letter's signers likely understand this, which may be why the letter doesn't call for a ban, a notable departure from a similar letter two years ago. Instead, the signatories ask countries at the United Nations to "find a way to protect us from all these dangers." Banning or regulating emerging weapons technologies is easier said than done, though. Nations have tried to ban crossbows, firearms, surprise attacks by submarines, aerial attacks on cities and, in World War I, poison gas. All have failed.

And yet: Nations held back from using poison gas on the battlefields of World War II. The Cold War saw treaties banning chemical and biological weapons, the use of the environment as a weapon, and the placement of nuclear weapons in space or on the seabed. The United States and Soviet Union pulled back from neutron bombs and anti-satellite weapons even without formal treaties. Nuclear weapons have proliferated, but not as widely as many predicted. In more recent years, nations have passed bans on blinding lasers, land mines and cluster munitions.

Weapons are easier to ban when few countries have access to them, when they are widely seen as horrifying and when they provide little military benefit. It is extremely difficult to ban weapons that are seen as giving a decisive advantage, as nuclear weapons are. A major factor in what will happen with autonomous weapons, therefore, is how nations come to see the benefits and risks they pose.

Autonomous weapons pose a classic security dilemma for countries. All countries may be better off without them, but mutual restraint requires cooperation. Last year, nations agreed to create a more formal Group of Governmental Experts to study the issue. The group will convene in November and, once again, nations will attempt to halt a potentially dangerous technology before it is used in war.

Source: Time