Countries around the world are concerned about the development of autonomous weapons systems. These new-age weapons are AI-driven machines capable of making battlefield decisions on their own, and they can cause an enormous amount of damage. The question that arises is: should we entrust lethal authority to a piece of equipment?
Are weapons programmed with AI a threat to human beings? Quite possibly. In this technology-dependent era, where driverless cars are already being developed and deployed, it is hard to imagine how far AI will take mankind. Autonomous weaponry may seem as benign an invention as other AI-powered devices, such as automated cars or healthcare machines, but it is nowhere near as simple. Until now we thought killer robots were merely fictional characters, yet that fiction may soon turn into reality. An even bigger threat, cyber weapons, is already hovering over us: cyber weapons can operate with great autonomy, crash financial networks, and render entire power grids inoperative on their own. The world is alarmed by the prospect of autonomous weapons launching attacks without any human input.
How are “LAWS” described?
Lethal Autonomous Weapons Systems are known as LAWS for short. These are devices that identify, track, and attack a target without human intervention. They can take many forms: a robot (which may or may not have a humanoid appearance), a gun, or even a drone. Once switched on, an autonomous weapon makes its own decision about whether or not to attack a target; this is the defining feature of these machines.
What’s the difference between an autonomous and an automatic weapon?
It is simple to distinguish between autonomous and automatic weapons, though weighing the pros and cons of autonomous weaponry has become difficult lately, owing to research indicating the unruly behaviour of machines. Any weapon that is programmed with artificial intelligence and can make decisions on its own, without human interference, is an autonomous weapon. Take a landmine as an example: a landmine can cause harm without any artificial intelligence. It can only react; it cannot choose to attack using its own intelligence. Now consider a drone that tracks a truck and sends the footage to a human. That drone might even carry a missile, but if it consults a human to identify the target and authorize the attack, it is not autonomous.
Weapons today involve various degrees of automation. A weapon can keep a human “in the loop”, “on the loop”, or “off the loop”. In the loop means a human decides whether an attack takes place; the decision rests entirely with the human, not the machine. On the loop means that when the weapon selects a target, a human still has the chance to call off the attack. Off the loop means the weapon acts entirely on its own; no human is involved while it conducts the attack.
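The three oversight levels above amount to a simple control-flow distinction: who gets the final say before an automated system acts. As a rough illustration only (the mode names and the `act` function are hypothetical, not taken from any real weapons system), the logic can be sketched like this:

```python
from enum import Enum

class OversightMode(Enum):
    IN_THE_LOOP = "in the loop"    # a human must approve each action
    ON_THE_LOOP = "on the loop"    # the machine acts unless a human vetoes
    OFF_THE_LOOP = "off the loop"  # the machine acts with no human involved

def act(mode, human_approves=None, human_vetoes=False):
    """Return True if a proposed action proceeds under the given mode."""
    if mode is OversightMode.IN_THE_LOOP:
        # Nothing happens without explicit human approval.
        return human_approves is True
    if mode is OversightMode.ON_THE_LOOP:
        # The machine proceeds by default; a supervising human can abort.
        return not human_vetoes
    # OFF_THE_LOOP: the machine decides entirely on its own.
    return True

# In the loop: no approval given, so no action.
assert act(OversightMode.IN_THE_LOOP) is False
# On the loop: proceeds unless a human intervenes.
assert act(OversightMode.ON_THE_LOOP) is True
assert act(OversightMode.ON_THE_LOOP, human_vetoes=True) is False
# Off the loop: proceeds regardless of any human input.
assert act(OversightMode.OFF_THE_LOOP) is True
```

The key point the sketch makes concrete is that only in the first two modes does a human input ever change the outcome; off the loop, the human variables are simply never consulted.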