Ukraine’s Drone Developments Could Herald The Arrival Of Killer Robots

Credit: Jonathan Lampel/Unsplash

Drone developments in Ukraine have accelerated a long-anticipated technological trend that could soon bring the first fully autonomous fighting robots to the battlefield, ushering in a new era of warfare, as reported by AP News.

Military technology

The longer the war lasts, the more likely it becomes that drones will be used to identify, select and attack targets without help from humans, according to military analysts, combatants and artificial intelligence researchers.

That would mark a revolution in military technology as profound as the introduction of the machine gun.

Ukraine already has semi-autonomous attack drones and counter-drone weapons endowed with AI.

Experts say it may be only a matter of time before Russia or Ukraine, or both, deploy fully autonomous weapons.

The sense of inevitability extends to activists, who have tried for years to ban killer drones but now believe they must settle for trying to restrict the weapons’ offensive use.

Ukraine’s digital transformation minister, Mykhailo Fedorov, agrees that fully autonomous killer drones are “a logical and inevitable next step” in weapons development.

Lethal weapons?

In a recent interview near the front, Ukrainian Lt. Col. Yaroslav Honchar, the co-founder of the combat drone innovation organisation Aerorozvidka, claimed that human warfighters are unable to comprehend information and make decisions as quickly as machines.

According to him, Ukrainian military officials currently forbid the use of fully autonomous lethal weapons, though that could change.

“We have not crossed this line yet – and I say ‘yet’ because I don’t know what will happen in the future,” said Honchar, whose group has spearheaded drone innovation in Ukraine, converting cheap commercial drones into lethal weapons.

Russia could obtain autonomous AI from Iran or elsewhere. Iran’s long-range, exploding Shahed-136 drones, which have terrorised civilians and crippled Ukrainian power stations, are not particularly intelligent, but Iran claims to have more drones with AI in its expanding arsenal.

According to Western manufacturers, Ukraine could easily convert its semi-autonomous attack drones into fully autonomous ones to better withstand battlefield interference.

These include the Polish Warmate and the American Switchblade 600, both of which currently require a person to choose targets from a live video feed before AI completes the attack. The drones, technically known as “loitering munitions,” can hover over a target for minutes while waiting for a clear shot.

“The technology to achieve a fully autonomous mission with Switchblade pretty much exists today,” said Wahid Nawabi, CEO of AeroVironment, its maker. That will require a policy change — remove the human from the decision-making loop — that he estimates is three years away.

Using catalogued imagery, drones can already identify targets like armoured vehicles. However, there is debate over whether the technology is trustworthy enough to guarantee that the machines won’t malfunction and kill civilians.

Defending Ukraine

The AP asked the defence ministers of Russia and Ukraine whether they had ever deployed autonomous weapons offensively and whether they would agree to refrain from doing so if the other side did the same. Neither answered.

A full-AI assault by either side might not even be a first.

Killer robots allegedly made their debut in the internal strife in Libya in 2020 when Turkish-built Kargu-2 drones in full-automatic mode killed an unspecified number of combatants, according to an inconclusive U.N. report.

A spokesman for STM, the manufacturer, said the report was based on “speculative, unverified” information and “should not be taken seriously.” He told the AP the Kargu-2 cannot attack a target until the operator tells it to do so.

Fully autonomous AI already helps defend Ukraine. The Ukrainian military has received drone-hunting systems from Utah-based Fortem Technologies that combine small radars with unmanned aerial vehicles, both powered by AI. Without human intervention, the UAVs use nets to disable enemy drones that the radars have identified.

The number of drones with AI is growing. Israel has exported them for years. Its radar-killing Harpy can hover above anti-aircraft radar for up to nine hours, waiting for it to switch on.

Beijing’s Blowfish-3 unmanned armed helicopter is another example. Russia has been developing the Poseidon, a nuclear-tipped underwater AI drone. The Dutch are currently testing a ground robot equipped with a .50-calibre machine gun.

Honchar thinks that if the Kremlin had possessed killer autonomous drones by now, it would have used them in its attacks on Ukrainian civilians, which have shown little regard for international law.

“I don’t think they’d have any scruples,” agreed Adam Bartosiewicz, vice president of WB Group, which makes the Warmate.

AI

For Russia, AI is a top priority. According to President Vladimir Putin, whoever controls this technology will rule the world. In a speech on December 21, he expressed confidence in the Russian arms industry’s ability to integrate AI into war machines, emphasising that “the most effective weapons systems are those that work rapidly and practically in an autonomous mode.”

Russian officials have already asserted that their Lancet drone is fully autonomous.

“It’s not going to be easy to know if and when Russia crosses that line,” said Gregory C. Allen, former director of strategy and policy at the Pentagon’s Joint Artificial Intelligence Center.

The moment a drone shifts from remote-controlled flight to full autonomy might not even be noticeable. According to Allen, drones that can operate in both modes have so far performed better when a person controls them.

Stuart Russell, a renowned AI researcher and professor at the University of California-Berkeley, remarked that the technology is not especially complex. When he surveyed colleagues in the mid-2010s, they agreed that graduate students could, within a single term, develop an autonomous drone “capable of detecting and killing an individual, let’s say, within a building.”

Efforts to establish international rules for the use of military drones have so far failed. Nine years of informal UN talks in Geneva produced little progress, with key nations such as the US and Russia opposed to a ban, and no new round was scheduled when the previous session ended in December.

Nuclear weapons

Politicians in Washington have said they won’t support a ban because they can’t trust rivals developing drones to use them ethically.

Toby Walsh, an Australian scholar who supports Russell’s anti-killer robot campaign, wants to reach an agreement on some restrictions, such as a ban on technologies that utilise facial recognition and other data to identify or target specific persons or groups of people.

“If we are not careful, they are going to proliferate much more easily than nuclear weapons,” said Walsh, author of “Machines Behaving Badly.” “If you can get a robot to kill one person, you can get it to kill a thousand.”

Scientists also worry about terrorists repurposing AI weaponry. In one terrifying scenario, the U.S. military spends hundreds of millions of dollars writing code to power killer drones, only for it to be stolen and copied, handing terrorists essentially the same weapon.

According to Allen, the former Defense Department official, the Pentagon has neither formally defined “an AI-enabled autonomous weapon” nor approved the use of a single one by American forces. Any proposed system must be approved by the chairman of the Joint Chiefs of Staff and two undersecretaries.

That hasn’t stopped the development of such weaponry across the United States: the Defense Advanced Research Projects Agency, military research laboratories, universities, and the private sector are all working on such projects.

The Pentagon

The Pentagon has emphasised using AI to augment human warriors.

The Air Force is studying ways to pair pilots with drone wingmen.

Former Deputy Defense Secretary Robert O. Work, a booster of the idea, said in a report last month that it “would be crazy not to go to an autonomous system” once AI-enabled systems outperform humans, a threshold he said was crossed in 2015 when computer vision eclipsed that of humans.

In some defensive systems, humans have already been pushed out of the loop.

So will future wars become a fight to the last drone?

That’s what Putin predicted in a 2017 televised chat with engineering students: “When one party’s drones are destroyed by drones of another, it will have no other choice but to surrender.”


Source: AP News