
Two Deeply Troubling Trends From Ukraine’s Year of War

Russia’s invasion has demonstrated the grave human costs when military interests override humanitarian considerations and outpace ethical consensus.

Published on February 22, 2023

One year into Russia’s invasion of Ukraine, President Vladimir Putin’s brutal strategy of attrition has come into focus. Frustrated that his forces have been unable to attain victory, or even hold on to territorial gains amassed in the first months of the war, Putin has intensified his campaign of civilian bombardment and destruction. Neither Russia nor Ukraine is ready to accept an outcome short of a decisive military victory: Russia cannot tolerate a Ukraine that is increasingly tied to the West, and Ukraine has rightfully invoked the principle of sovereignty to reject calls to cede territory to Russia in exchange for peace. Both states therefore feel compelled to use extraordinary means to achieve victory.

Moscow has shown a brazen willingness to breach international humanitarian law (IHL) in its pursuit of victory. Over the weekend, U.S. Vice President Kamala Harris accused Russia of committing crimes against humanity—of pursuing “gruesome acts of murder, torture, rape, and deportation” as well as “execution-style killings, beatings, and electrocution.”

A core tenet of IHL is the principle of military necessity, which permits armed parties to take measures necessary to weaken the adversary’s military capacity but does not allow attacks that harm civilians or damage nonmilitary infrastructure. Since World War II, international norms have steadily shifted toward prioritizing humanitarian objectives over military necessity. Yet this development appears to hold little sway over Moscow’s behavior.

Take Russia’s attacks on Ukraine’s energy infrastructure. Since October, Moscow has used missiles and Iranian kamikaze drones to strike power plants and substations in multiple regions across Ukraine. These attacks compound the humanitarian toll on civilians, who rely on the power grid for heat, electricity, and hot water during the winter. As international legal scholar Michael N. Schmitt writes, “the attacks [against Ukraine’s power infrastructure] have gone on for so long, are so widespread, and are so intense, that it is difficult to attribute any purpose to them other than terrorizing the civilian population.”

Russia pretends to uphold international law, using disinformation to obscure its motives for targeting civilian structures. It repeatedly denies that it has struck civilian targets or falsely accuses the Ukrainian military of co-opting civilian structures for military operations.

Russia’s true intent is to continue carrying out civilian attacks, whether or not they are in violation of international law, because its military leaders believe that the terror resulting from its air strikes will undermine Ukrainian resolve and promote Russia’s strategic objectives. Russia’s military is notorious for prosecuting war in a brutal, maximalist manner—a strategy it most recently honed in Syria. It has showered Ukraine with missiles and artillery on a scale unseen in Europe for decades, striking not just military targets but also civilian structures ranging from apartment blocks and theaters to schools and hospitals.

Aggregate global conflict data from the last twenty years provides useful context for the scale of civilian casualties in Ukraine. The high-water mark for civilian deaths came in 2014, with more than 35,000 recorded civilian conflict fatalities, largely due to fighting involving Islamic State forces.

In comparison, the UN’s human rights agency estimates that at least 6,919 civilians were killed in Ukraine between February 2022 and January 2023. This figure is likely an undercount, as it only includes verified deaths. Estimates of civilian fatalities from Ukrainian government sources range between 33,000 and 41,000 civilians killed, with the battle over Mariupol alone leading to an estimated 25,000 civilian deaths.

The civilian casualties linked to hostilities between Russian and Ukrainian forces are not just a result of collateral damage from fighting. Many of these casualties represent deliberate actions taken by Russian soldiers to kill Ukrainian civilians as part of a campaign of terror. Mass graves discovered in towns such as Bucha and Izium illustrate Russia’s vicious logic. All told, partners funded by the U.S. Agency for International Development have documented more than 20,000 instances of alleged war crimes and human rights abuses committed by Putin’s forces.

The implications of Russia’s apparent war crimes are significant. They have not only caused tremendous humanitarian suffering in Ukraine but also call into question the international commitment to upholding norms of wartime conduct. Russia ratified the Geneva Conventions in 1954 and is obligated to preserve human life and dignity in armed conflict. Its blatant and continuing violations in Ukraine, and the lack of a strong response from many countries, set a terrible example. Other countries in conflict, whether battling internal adversaries or fighting interstate rivals, will take note of Russia’s behavior and the uneven and often absent international reaction, and may decide that flouting IHL norms carries no substantial cost.

Ukraine has shown far greater restraint in employing destructive means on the battlefield. But given the high stakes, Ukrainian leaders have actively experimented with innovative technology to gain a military edge, such as deploying lethal autonomous weapons or relying more heavily on AI tools. When the threat is framed in existential terms, the appeal of deploying emerging technologies on the battlefield can be compelling. But ethical problems may arise when untested technologies are used in war. Whereas there are consistent norms prohibiting the use of nuclear, chemical, or biological weapons, there is little agreement about the ethical bounds for these new options.

Hence, the war in Ukraine exemplifies a second worrying trend: rapid technological innovation on the battlefield leading to unintended and unknown consequences. The deployment of AI tools is a case in point. For example, the U.S. company Palantir is working with Ukrainian and NATO forces to implement an “electronic kill chain” that uses a data aggregator called MetaConstellation to give Ukrainian soldiers ready access to all available commercial data in a given battle space, including optical, thermal, and radar data as well as satellite imagery. As a result, Ukrainians have precise intelligence about Russian offensive and defensive movements, allowing them to withdraw troops, reposition units, and use long-range artillery to strike Russian forces with accuracy. As Washington Post columnist David Ignatius notes, “By applying artificial intelligence to analyze sensor data, NATO advisers outside Ukraine can quickly answer the essential questions of combat: Where are allied forces? Where is the enemy? Which weapons will be most effective against enemy positions?”

On the one hand, these capabilities allow armies to target more precisely, with possible reductions in civilian harm. Yet we cannot assume that technological innovation will stop with data fusion platforms. For example, there are ongoing efforts to incorporate greater autonomy into the decisionmaking process. Rather than simply presenting integrated streams of battlefield data, AI systems are being developed to make independent targeting decisions without human input. This is where major problems can arise. As Arthur Holland Michel writes for the United Nations Institute for Disarmament Research, battlefield environments are “harsh, dynamic, and adversarial.” Under these circumstances, autonomous systems are bound to face problems and could potentially “fail in a complex and unpredictable manner.”

Despite these risks, Ukraine’s digital transformation minister, Mykhailo Fedorov, has indicated that Ukraine has been conducting “a lot of R&D” to develop fully autonomous killer drones, declaring that these systems are “a logical and inevitable next step” in weapons development. Russian forces also may be close to deploying autonomous drones, with officials claiming that the Lancet drone “can operate with full autonomy.” As leading AI researcher Stuart Russell notes, “the technology is not especially complicated,” positing that within a semester, a team of graduate students could build an autonomous drone “capable of finding and killing an individual, let’s say, inside a building.”

The long-term spillover effects are troubling. Other actors, such as Iran and Turkey, already possess impressive technological military capabilities, and they are paying close attention to their drones’ military effectiveness on the Ukrainian battlefield. If more and more states become convinced that autonomous drones hold the key to the future of warfare, they will be motivated to pour resources into the development and deployment of such technologies.

Due to the “open technological revolution,” in which the provenance of innovation has shifted from government agencies to private commercial firms, a wide range of countries can acquire advanced tools for a variety of uses, whether enhancing their internal security or increasing their warfighting capabilities. One has to assume that AI battlefield tools will proliferate widely and that efforts to establish ethical and normative guardrails will fall behind the pace of innovation, increasing the risk that military applications of emerging technology will spiral out of control.

The Ukraine war demonstrates the grave human costs when military imperatives override humanitarian concerns and outpace ethical consensus. Alarmed by escalatory threats in that conflict, in January, the Bulletin of the Atomic Scientists moved the Doomsday Clock forward to ninety seconds to midnight—the closest to global catastrophe the clock has ever been. This should be a wake-up call to political leaders about the imperative to confront threats emanating from Russia’s rogue behavior—and to mitigate risks posed by lethal emerging technologies—before it becomes too late to reverse disaster.


Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.