Khabar Online: Judging from the updated directive issued by the US Department of Defense, the country's military is intensifying its efforts to develop and deploy autonomous weapons.
The latest version of the directive, published on January 25, 2023, places its main emphasis on the use of artificial intelligence in autonomous weapons. It follows a related implementation plan published by NATO on October 13, 2022.
Both documents reflect a lesson the world's militaries have drawn from recent operations in Karabakh and Ukraine: weapons driven by artificial intelligence are the future of warfare.
Richard Moyes, director of Article 36, a humanitarian organization focused on reducing the harm caused by weapons of war, said: "We know that generals see the military value of loitering munitions in Ukraine."
These weapons, which include bombs and drones, can loiter for long periods while waiting for a target. Moyes added, however, that such semi-autonomous munitions are usually used with humans controlling the key decisions.
As war casualties mount in Ukraine, pressure has grown to gain a decisive battlefield advantage with fully autonomous weapons: robots that can select and attack their targets without human supervision.
This month, a major Russian manufacturer announced plans to develop a new version of its reconnaissance combat robot, an uncrewed ground vehicle, to reinforce its forces fighting in Ukraine.
Meanwhile, fully autonomous drones are already being used to defend Ukraine's energy facilities.
Mykhailo Fedorov, Ukraine's Minister of Digital Transformation, has argued that the use of fully autonomous weapons in the war is an inevitable and logical next step, and recently said that soldiers may see such weapons on the battlefield within the next six months.
Proponents of fully autonomous weapons systems argue that the technology protects soldiers by keeping them off the battlefield, allows military decisions to be made at superhuman speed, and greatly improves defensive capabilities.
Current semi-autonomous weapons, such as loitering munitions that track targets and detonate themselves, still require human involvement: although they can carry out the attack, an operator must initiate it.
By contrast, fully autonomous drones, sometimes called "drone hunters," are now deployed in Ukraine. They can detect and disable incoming drones around the clock, without operator intervention and faster than human-controlled weapon systems.
Calls for a ban on these weapons
Meanwhile, some critics, including the Stop Killer Robots campaign, have been trying for more than a decade to ban the research and development of autonomous weapons. They foresee a future in which autonomous weapons systems are designed to target not just vehicles, infrastructure and other weapons, but humans as well.
They argue that wartime decisions over life and death must remain in human hands, and that handing them to algorithms would amount to digital dehumanization.
The campaign, along with Human Rights Watch, argues that autonomous weapons systems lack the human judgment needed to distinguish civilians from legitimate military targets, and that they remove meaningful human control over what happens on the battlefield.
These organizations contend that heavy military investment in autonomous weapons systems, including by the United States, Russia, China, South Korea and the European Union, will plunge the world into a costly and destructive new arms race. One consequence could be this dangerous new technology falling into the hands of terrorists and other actors outside the control of governments.
Moyes noted that current international law provides no adequate framework for understanding the concept of weapon autonomy. For example, the existing legal framework does not specify whether commanders are responsible for the targets selected by the systems they use, or whether they must set spatial and temporal limits on those systems' operation.
"The danger is that there is no clear line between where we are now and the point at which we have accepted the unacceptable," he added.
The International Committee of the Red Cross, the guardian of international humanitarian law, has stressed that the legal obligations of commanders and operators cannot be transferred to a machine, algorithm or weapon system. Today, humans are responsible for protecting civilians and minimizing the damage of war by ensuring that weapons are used appropriately for military purposes.
But if artificially intelligent weapons are deployed on battlefields, who will be responsible for the deaths of civilians? There is still no clear answer to this crucial question.