Ukraine to make drone videos available for training AI models
The International Committee of the Red Cross, which monitors rules of warfare, has opposed automated targeting systems without human oversight.
Written by Andrew E. Kramer
The Ukrainian military will make available millions of drone videos and other battlefield data to Ukrainian companies and the firms of its allies to help train artificial intelligence models, Ukraine’s minister of defense, Mykhailo Fedorov, said in a statement Thursday.
Ukrainian drone videos have recorded attacks on soldiers and on equipment such as vehicles and tanks, along with surveillance missions. These videos can be used to train AI models for automated targeting, according to experts on AI and warfare.
Allowing the use of genuine battlefield videos showing drones targeting people has raised ethical concerns. The International Committee of the Red Cross, which monitors rules of warfare, has opposed automated targeting systems without human oversight.
Fedorov said the data would be made available because “we must outperform Russia in every technological cycle” and “artificial intelligence is one of the key arenas of this competition.”
Ukraine has already used the data internally to train AI functions within its primary battlefield computer system, called Delta, the statement said.
Under the new policy, companies can train AI models on the data but will not be allowed to take possession of the videos. The datasets will be managed by a center for innovation within the Ministry of Defense.
Ukrainian officials have said humans will decide on the use of lethal force in the Ukrainian army. Supporters of AI targeting systems say that their precision guidance can reduce civilian deaths in war. Releasing an autonomous drone to attack soldiers who are miles away, they add, is no different from firing artillery at a distant target.
AI could also process information unrelated to lethal strikes. It could aid the processing of video from surveillance drones by identifying objects based on patterns gleaned from thousands or millions of hours of video.
Drones have surpassed all other weapons, such as rifles, machine guns, tanks, artillery and aerial bombs, in inflicting casualties for both Ukrainian and Russian forces. The drones broadcast video back to a human pilot, who controls the drone remotely.
But about 9 out of 10 strikes with exploding drones fail because radio signals are blocked by jamming equipment mounted on cars or carried by soldiers, or because targets are beyond the range of the signals. Drones guided by fiber-optic cables are one solution.
Drone makers in both Russia and Ukraine have been experimenting with AI systems that can autonomously recognize targets such as cars, tanks or humans and eliminate the need for a radio link with a pilot.
The videos show the evasive maneuvers that targets take at the last moment as exploding drones swoop in, such as car drivers swerving or soldiers jumping out of the way, shooting guns or throwing objects at drones. To work effectively, AI targeting systems need to maneuver drones to react to such defensive actions.
“The future of warfare belongs to autonomous systems,” according to Fedorov’s statement. “Our objective is to increase the level of autonomy in drones and other combat platforms so they can detect targets faster, analyze battlefield conditions and support real-time decision-making.”
This article originally appeared in The New York Times.