Drone technology in Ukraine has advanced to the point where fully autonomous fighting robots could soon be a reality, marking a new era in warfare.
As the war drags on, it is becoming increasingly likely that drones will be used to identify, select and attack targets without any human involvement, according to military analysts, combatants and artificial intelligence researchers.
This would be a major shift in military technology, on par with the introduction of the machine gun. Ukraine already has semi-autonomous attack drones and counter-drone weapons that use artificial intelligence (AI). Russia also claims to have AI weaponry, though these claims have not been verified. There are no confirmed instances of any nation fielding combat robots that have killed entirely on their own.
Some experts believe that Russia or Ukraine may deploy fully autonomous lethal drones in the near future.
"Many states are developing this technology," said Zachary Kallenborn, a George Mason University weapons innovation analyst. "Clearly, it's not all that difficult."
That sense of inevitability extends even to activists: after years of trying unsuccessfully to ban killer drones, they now believe they must settle for trying to restrict the weapons' offensive use.
Mykhailo Fedorov, Ukraine's digital transformation minister, believes fully autonomous killer drones are the next logical step in weapons development and said Ukraine has been doing a lot of research and development in this direction. In a recent interview with The Associated Press, he said he sees great potential for such weapons within the next six months.
In a recent interview near the front, Ukrainian Lt. Col. Yaroslav Honchar, co-founder of the combat drone innovation nonprofit Aerorozvidka, said that human war fighters simply cannot process information and make decisions as quickly as machines.
Ukrainian military leaders currently prohibit the use of fully independent lethal weapons, although that could change in the future, he said.
Honchar, whose group has spearheaded drone innovation in Ukraine, said that they have not yet crossed the line into using drones as weapons. "I say 'yet' because I don't know what will happen in the future," he said.
Russia could obtain autonomous AI technology from Iran or elsewhere. The long-range Shahed-136 exploding drones supplied by Iran have crippled Ukrainian power plants and terrorized civilians, but they are not especially smart. Iran has other drones in its evolving arsenal that it says feature AI technology.
According to Western manufacturers, Ukraine could make its semi-autonomous weaponized drones fully independent with relatively little effort, which would help them survive battlefield jamming.
AI-equipped drones are increasingly being used for military purposes. Two such semi-autonomous weapons are the U.S.-made Switchblade 600 and the Polish-made Warmate. These drones, known as "loitering munitions," can hover over a target for minutes, waiting for the right opportunity to strike.
"The technology to achieve a fully autonomous mission with Switchblade pretty much exists today," said Wahid Nawabi, CEO of AeroVironment, its maker. That will require a policy change - to remove the human from the decision-making loop - that he estimates is three years away.
There is disagreement over whether the technology behind drones is reliable enough to ensure that they don't err and take the lives of noncombatants. Some believe that drones can already recognize targets such as armored vehicles using cataloged images, but others are not convinced that the technology is advanced enough to be used safely.
The AP asked the defense ministries of Ukraine and Russia whether they have used autonomous weapons offensively, and whether they would agree not to use them if the other side also refrained. Neither ministry responded to either question.
If either side were to go on the attack with full AI, it might not be the first time.
A report from the United Nations last year suggested that killer robots may have been used for the first time in Libya's conflict in 2020. The report said that Turkish-made Kargu-2 drones in full-automatic mode killed an unspecified number of combatants.
A spokesman for STM, the Kargu-2's manufacturer, disputed the report, telling the AP that it was based on "speculative, unverified" information and should not be taken seriously. He stressed that the Kargu-2 cannot attack a target unless the operator tells it to do so.
Fortem Technologies, based in Utah, has supplied the Ukrainian military with drone-hunting systems that combine small radars and unmanned aerial vehicles, both powered by AI. The radars are designed to identify enemy drones, which the UAVs then disable by firing nets at them - all without human assistance. This fully autonomous AI is already helping to defend Ukraine.
The number of drones equipped with artificial intelligence is growing rapidly. Israel has been exporting them for decades, and its radar-killing Harpy drone can hover over anti-aircraft radar for up to nine hours, waiting for it to power up.
Other examples of AI-powered weapons include Beijing's Blowfish-3 unmanned weaponized helicopter and Russia's Poseidon nuclear-tipped underwater drone. The Dutch are currently testing a ground robot with a .50-caliber machine gun.
Honchar believes that if the Kremlin had killer autonomous drones, it would have used them by now, noting that Russia has shown little regard for international law in its attacks on Ukrainian civilians.
"I agree with Adam Bartosiewicz," said Warmate vice president WB Group. "I don't think they would have any scruples."
AI is a priority for Russia. In a 2017 speech, President Vladimir Putin said that whoever dominates that technology will rule the world. In a Dec. 21 speech, he expressed confidence in the Russian arms industry’s ability to embed AI in war machines, stressing that “the most effective weapons systems are those that operate quickly and practically in an automatic mode.” Russian officials already claim their Lancet drone can operate with full autonomy.
"It will be difficult to determine if and when Russia crosses that line," said Gregory C. Allen, former director of strategy and policy at the Pentagon's Joint Artificial Intelligence Center.
Switching a drone from remote piloting to full autonomy might not even be noticeable from the outside. To date, drones capable of working in both modes have performed better when piloted by a human, Allen said.
According to Stuart Russell, a top AI researcher at the University of California, Berkeley, the technology is not especially complicated. When he polled colleagues in the mid-2010s, they agreed that graduate students could, in a single term, produce an autonomous drone capable of finding and killing an individual inside a building, for example.
Nine years of informal United Nations talks on military drones have so far been fruitless, with major powers such as the United States and Russia opposing a ban. The last session, in December, ended with no new round scheduled.
Washington policymakers say they won't agree to a ban because rivals developing autonomous weapons cannot be trusted to use them ethically.
Toby Walsh, an Australian academic who, like Russell, is campaigning against killer robots, hopes to achieve a consensus on some limits. These limits include a ban on systems that use facial recognition and other data to identify or attack individuals or categories of people.
According to Walsh, author of "Machines Behaving Badly," if we are not careful, robots will proliferate much more easily than nuclear weapons. "If you can get a robot to kill one person, you can get it to kill a thousand," he said.
Scientists are also concerned about the possibility of AI weapons being repurposed by terrorists. In one feared scenario, the U.S. military spends hundreds of millions of dollars writing code to power killer drones. Then the code is stolen and copied, effectively giving terrorists the same weapon.
The global public is concerned about the use of lethal autonomous weapons systems. A 2019 survey by Ipsos for Human Rights Watch found that 61% of adults in 26 countries oppose their use.
As of yet, the Pentagon has neither clearly defined "autonomous weapon" nor authorized the use of any such weapon by U.S. troops, said Allen, the former Defense Department official. Any proposed system must be approved by the chairman of the Joint Chiefs of Staff and two undersecretaries.
That has not stopped development in the United States. The Defense Advanced Research Projects Agency, military labs, academic institutions, and private companies are all working on projects related to autonomous weapons.
The Pentagon has emphasized using AI to augment human warriors. The Air Force is studying ways to pair pilots with drone wingmen. A booster of the idea, former Deputy Defense Secretary Robert O. Work, said in a report last month that it “would be crazy not to go to an autonomous system” once AI-enabled systems outperform humans. Work said that this threshold was crossed in 2015, when computer vision eclipsed that of humans.
In some defensive systems, humans are already partly out of the loop. Israel's Iron Dome missile shield, for example, is authorized to open fire automatically, though it is said to be monitored by a person who can intervene if the system goes after the wrong target.
According to Kallenborn, a George Mason researcher, multiple countries and every branch of the U.S. military are developing drones that can attack in deadly synchronized swarms.
It's possible that future wars will be fought with drones, with each side trying to destroy the other's drones.
In a 2017 televised chat with engineering students, Putin predicted that if one party's drones are destroyed by another party's drones, the first party will have no choice but to surrender.
Frank Bajak reported from Boston. Associated Press journalists Tara Copp in Washington, Garance Burke in San Francisco and Suzan Fraser in Turkey all contributed to this report.