James Stavridis: Ukraine War May Become a Proving Ground for AI

A resident walks among damaged cars while leaving a multi-story residential building, partially destroyed after night drone attacks, in Kyiv on Tuesday, May 30, 2023. Ukraine on May 30, 2023 said it had downed 29 out of 31 drones, mainly over Kyiv and the Kyiv region in the latest Russian barrage -- the third on the capital in 24 hours. (Sergei Supinsky/AFP/Getty Images/TNS)

The opinions expressed in this op-ed are those of the author and do not necessarily reflect the views of Military.com. If you would like to submit your own commentary, please send your article to opinions@military.com for consideration.

Artificial intelligence is, suddenly, everywhere. We are awash in ideas about how we can use AI productively — from agriculture to climate change to engineering to software construction. And, equally, there are plenty of cautionary notes being struck about using AI to control societies, manipulate economies, defeat commercial opponents, and generally fulfill Arthur C. Clarke’s visions of machines dominating man in 2001: A Space Odyssey.

Thus far, however, relatively little has been written about the implications of AI for warfare and geopolitics. For better and worse, those arenas also lend themselves to a variety of ways in which new technologies can suddenly break apart paradigms. Think of Agincourt in 1415, a medieval battle in which the flower of the French nobility — sporting the key technology of that age, plate armor — was slaughtered at long range by an emerging technology, the English longbow, wielded by archers under King Henry V. Military technology — submarines, radar, sonar, nuclear weapons — can change the global balance in an instant.

Are we at such a moment with AI? Perhaps. A good point of comparison might be the advent of nuclear weapons, when the most experienced warrior of his age, General Douglas MacArthur, saw the atomic bombs used on Japan and said simply that “warfare is changed forever.” Yet the hand-to-hand combat in Ukraine, the dug-in Russian forces in their extensive trenches awaiting the promised Ukrainian summer offensive, and the endless artillery duels between the two sides all seem oh-so-19th century, frankly.

How will artificial intelligence completely change warfare?

First and foremost, AI will be a powerful tool for decision makers on the battlefield at every level. I vividly remember when the Vincennes, a U.S. Navy Aegis cruiser, mistakenly shot down an Iranian commercial airliner in 1988. The tactical action officer in the combat information center incorrectly assessed the airliner as a hostile Iranian military jet. Nearly 300 civilians paid with their lives.

Had an artificial intelligence advisor been available, capable of synthesizing millions of data points and comparing the radar picture to a vast library of similar scenarios, it almost certainly would have identified a civilian aircraft. AI could dramatically reduce “collateral damage” killings.

AI could also instantly provide highly detailed strategic targeting information, giving a decision maker a road map to use precision weapons at the most vulnerable points of an enemy’s logistics chain. In the Libyan campaign of 2011, which I commanded, the North Atlantic Treaty Organization struggled both to avoid collateral damage and to select optimal targets — capabilities AI could easily have provided.

Another crucial capability of AI is the ability to control massive swarms of drones in synchronized attack formations, much as birds flock together to scare away predators. This kind of mechanical murmuration, executed by low-cost, disposable drones, can swamp air defenses with deadly results. Using AI to direct drones in Ukraine, for example, could allow the Kyiv government to further deplete Russia’s dwindling supply of armor and to force its defenders to waste critical air-defense missiles.

AI could also be a powerful tool in psychological and information warfare. Creating deepfakes — for example, videos purporting to show certain combat effects — could cause mistaken reactions by enemy forces. Consider the image of the Pentagon in flames that spooked markets last week as it went viral around the globe. Ukraine could further the Russians’ sense of a failing war through a flood of AI-generated fake images, false stories and shadow operations.

AI will be very helpful in defensive and back-office activities in war. Logistics, as we have seen in the Russian invasion of Ukraine, can be an Achilles heel of militaries. With AI analyzing maintenance patterns, suggesting preventive maintenance, untangling combat supply chains, and providing minute-to-minute logistical advice, commanders will have a deep advantage over opponents who have fallen behind in the race to develop and deploy these tools.

Finally, the ability to use AI to conduct cyberattacks may be its most dangerous attribute. As militaries continue to run combat operations, logistics, targeting, intelligence and all other aspects of modern warfare with the internet as the backbone, the ability to crack into an opponent’s cyber networks will be crucial. Particularly with advances in quantum computing, superior AI systems will allow overall mastery of the cyber battlefield.

Even as we consider the immense benefits of AI to our societies, we need to have a clear-eyed understanding of just how deep the impact will be on the conduct of war. All the more reason for the Pentagon to continue to refine its understanding and implementation of AI in the Ukrainian campaign, which will have benefits for decades to come.

____

James Stavridis is a Bloomberg Opinion columnist. A retired U.S. Navy admiral, former supreme allied commander of NATO, and dean emeritus of the Fletcher School of Law and Diplomacy at Tufts University, he is vice chairman of global affairs at the Carlyle Group. He is on the boards of American Water Works, Fortinet, PreVeil, NFP, Ankura Consulting Group, Titan Holdings, Michael Baker and Neuberger Berman, and has advised Shield Capital, a firm that invests in the cybersecurity sector.

___

©2023 Bloomberg L.P. Visit bloomberg.com/opinion. Distributed by Tribune Content Agency, LLC.
