
Understanding the Global Debate on Lethal Autonomous Weapons Systems: An Indian Perspective

This article explores the global debate on lethal autonomous weapons systems (LAWS), highlighting the convergences, complexities, and differences within and beyond the UN Group of Governmental Experts (GGE) on LAWS. It further examines India’s key positions at the GGE and the probable reasons behind them.

Published on August 30, 2024

Background

Militaries worldwide are increasingly developing advanced weapons systems powered by artificial intelligence (AI) while simultaneously revising their military strategies to accommodate AI’s integration. Of particular concern is the emergence of lethal autonomous weapons systems, or LAWS—a class of advanced weapons that can identify and engage targets without human intervention. Reports from conflicts in Ukraine, Israel and Palestine, and Libya suggest that weapons with some autonomous capabilities may already be in use. These include systems like Saker Scout, Gospel, and Kargu-II. Many countries, including China, Israel, Russia, South Korea, Türkiye, the United Kingdom, and the United States, are also reported to be investing in building autonomous weapons.

The autonomous nature of these weapons has transformed human-machine interaction in conflicts, complicating the application of international humanitarian law (IHL). Concerns have arisen over unsupervised use and potential errors in these systems that can cause unintended civilian casualties, escalate conflicts, and threaten peace. Many observers have also flagged the potential for these weapons to slip beyond human control, which could lead to rapid conflict escalation and so-called flash wars. Further, there are concerns about the proliferation of these weapons to nonstate actors such as terrorist and criminal groups. Despite these risks, some argue that LAWS could enhance IHL protections, while ethical debates question the morality of machines making life-and-death decisions.1 These humanitarian, legal, ethical, and security challenges spurred discussions within the UN Group of Governmental Experts on LAWS (GGE) under the Convention on Certain Conventional Weapons (CCW) in 2016. The GGE, composed of High Contracting Parties to the CCW, has produced numerous reports and documents offering valuable insights into each party’s stance on regulating LAWS.

India is a High Contracting Party to the CCW and has actively participated in the global discourse on LAWS at the GGE. As the chair of the GGE in 2017, India played an instrumental role in leading the group toward affirming the eleven guiding principles on the use and development of LAWS, despite the impasse over defining these systems and reaching consensus on their technical aspects.

In recent years, the GGE has faced criticism for its slow progress in establishing a binding norm to regulate LAWS—a shortcoming often attributed to its consensus-driven process where a single member’s dissent is enough to reject a proposal.

Within the GGE, too, there have been calls for a binding instrument to regulate LAWS, as evidenced by member statements since 2016. These calls have shaped the GGE’s current agenda, which focuses on formulating the elements of such an instrument between 2024 and 2025, with discussions already underway in 2024.

Owing to the slow pace of discussions at the GGE, concerns have also been raised regarding the rapidly narrowing window to establish an effective regulatory framework for such weapons. This has led to the emergence of processes outside the GGE that emphasize the risks of these weapons and call for the urgent negotiation of a binding treaty to regulate and prohibit LAWS. For example, the UN Secretary-General and the International Committee of the Red Cross have called for a treaty to regulate and prohibit autonomous weapons systems by 2026. There are also concerns among countries that the emergence of multiple parallel processes could fragment the normative and regulatory debates on LAWS, hindering consensus on common approaches to regulate them.

What is the current state of the global debate on LAWS? Why has the GGE made slow progress, and why does India continue to support the GGE process despite the emergence of parallel efforts? This article addresses these questions, highlighting the convergences, complexities, and differences within and outside the GGE. It further examines India’s key positions on LAWS at the GGE and the probable reasons behind them.

The State of the Global Debate on LAWS

Discussions Within the GGE

Since 2016, there have been several discussions at the GGE under the CCW on the ethical and normative issues surrounding LAWS. Adopted in 1980, the CCW framework is an integral part of IHL that aims to prohibit or restrict weapons with excessively injurious or indiscriminate effects. Sometimes referred to as the Inhumane Weapons Convention, the CCW framework comprises a convention and five protocols that regulate the use of weapons deemed “excessively injurious.” For example, under Protocol V of the convention, a High Contracting Party that has been party to a conflict must “mark and clear, remove, or destroy explosive remnants in affected territories under its control.” To allow a wide representation of expert opinion on the matter, GGE membership is open to all High Contracting Parties to the CCW, observer states, international organizations, and civil society, though voting rights remain reserved for High Contracting Parties.

The GGE’s mandate was finalized at the fifth CCW review conference in 2016. Among the subjects to be considered by the GGE, three were considered fundamental—identifying the characteristics of LAWS, developing a working definition, and applying IHL to these weapons. The GGE has met annually since 2016, including during the sixth CCW review conference in 2021. In 2019, the group released the eleven guiding principles for the use and development of LAWS. These principles, which relate concepts like accountability, risk mitigation, compliance with humanitarian law, and technological innovation to LAWS, have provided a foundation for guiding discussions on their regulation. The guiding principles notably affirm that IHL applies to all weapons systems, including LAWS. In 2023, the chair of the group also compiled a non-exhaustive list of definitions and characterizations of LAWS, to which High Contracting Parties contributed through their submissions. Characterizations have been submitted by thirty-six countries and member states of the Non-Aligned Movement. The 2023 GGE session also saw many countries converging around a two-tier approach to regulating LAWS.

Despite these outcomes and convergences, the GGE process has faced various complexities.

First, High Contracting Parties have varying understandings of what LAWS mean and entail. A study of the various characterizations of LAWS reveals differences in the understanding of both autonomous weapons and the concept of autonomy itself. These differences arise from the varying interpretations of key aspects of LAWS, such as autonomous capability, cognitive capability, the weapon’s intent, and the degree of human control, supervision, or intervention. For example, in their submission to the GGE in 2021, Argentina, Costa Rica, Ecuador, El Salvador, Guatemala, Kazakhstan, Panama, the Philippines, Sierra Leone, the State of Palestine, and Uruguay defined LAWS as weapons capable of performing the “critical functions” of identifying and engaging targets based on environmental triggers without human intervention. France’s submission went a step further and defined these weapons as those capable of modifying their programming and setting their own objectives. In its 2018 submission, China characterized LAWS as weapons with complete autonomy that cannot be terminated once deployed. Russia’s 2022 characterization viewed LAWS as prospective weapons that do not yet exist on the battlefield. Experts have warned that definitions that understand LAWS as prospective machines are detrimental to the process of their governance because they divert attention away from the pressing ethical and legal issues existing weapons pose. Such approaches also undermine regulations and declarations about banning these weapons, as they often refer to hypothetical systems with features like “understanding” and “intent” that current and foreseeable systems lack. Overall, the conceptual variety in the definitions of LAWS may lead to different ethical and normative considerations for their regulation.

Second, High Contracting Parties differ in their approaches to LAWS regulation itself. In the 2022 GGE session, Finland, France, Germany, the Netherlands, Norway, Spain, and Sweden submitted a working paper on a two-tier approach, stating that LAWS that cannot comply with IHL should be prohibited, while other kinds of LAWS need to comply with IHL.

In the 2023 session of the GGE, Argentina, Ecuador, El Salvador, Colombia, Costa Rica, Guatemala, Kazakhstan, Nigeria, Palestine, Panama, Peru, the Philippines, Sierra Leone, and Uruguay submitted a draft of a new protocol for regulating LAWS within the CCW framework. Pakistan too argued in favor of a protocol to restrict and ban LAWS. Countries such as Australia, Canada, Japan, Poland, South Korea, the United States, and the United Kingdom have supported an approach of prohibiting and regulating LAWS based on existing IHL. Russia, however, has noted that there are no convincing reasons to immediately limit or ban LAWS, considering such calls to be premature.

In the 2024 GGE session, China stated that in the absence of a clear definition and characterization, a tiered approach to regulating LAWS based on five technical characteristics of these weapons may be useful. In a 2022 working paper submitted to the GGE, China elaborated on the characteristics of “unacceptable and acceptable LAWS.” China has also expressed uncertainty about whether existing IHL adequately addresses the challenges of LAWS and has offered support for discussions on additional protocols. Meanwhile, Türkiye has backed the idea of a political guideline that would encourage the exchange of best practices as a confidence-building measure, recommending that the group take a step-by-step approach.

Emergence of Processes Outside the GGE

Despite the GGE’s ongoing efforts over the years, the movement to ban autonomous weapons has gained momentum. Civil society organizations like Stop Killer Robots, along with the International Committee of the Red Cross, have strongly backed this cause, arguing that the GGE has failed to establish a binding norm and has fallen short of its ambitions to effectively regulate autonomous weapons. Some international bodies have also expressed dissatisfaction with the GGE’s work around IHL. Another significant critique is that militarily significant states have resisted the formulation of a new binding instrument, arguing that existing IHL is sufficient to regulate these weapons. There have also been calls for greater inclusivity and for expanding the discussions on LAWS, currently centered at the GGE, to countries that are not parties to the group.

In October 2023, Austria tabled a resolution at the First Committee of the United Nations General Assembly (UNGA) on Disarmament and International Security. Co-sponsored by over forty states, the resolution was adopted by the UNGA in December 2023, setting a provisional agenda to discuss LAWS at its general assembly session in 2024. This marked the first time the UNGA addressed the issue, signaling the intent of member countries to take the discussion outside the GGE. The resolution also tasked the UN secretary-general with preparing a report seeking the views of UN members, industry experts, and civil society on the matter, to be presented at the general assembly session in 2024. The resolution was endorsed by 152 nations; five, including Russia and India, voted against it, and eleven abstained. In August 2024, pursuant to the resolution, the UN secretary-general released an advance copy of the report, which received ninety-one submissions, including India’s.

The Belen Communiqué, adopted at the 2023 Latin American and Caribbean Conference on the Social and Humanitarian Impacts of Autonomous Weapons, urged swift treaty negotiations to regulate LAWS. It was also promoted at the GGE and other multilateral forums. Although the communiqué was not endorsed by major military powers like France, Russia, and the United States, these countries participated as observers during the proceedings. Similar calls were made in the CARICOM Declaration by Caribbean states in 2023 and the Freetown Communiqué by the Economic Community of West African States in 2024.

Numerous conferences and side events have also been organized on LAWS outside the GGE.

Pakistan’s side event on the margins of the 2023 UNGA First Committee addressed security risks arising from military applications of AI and LAWS and mapped the normative guardrails to address them. In 2024, on the sidelines of the GGE, the Philippines hosted an event highlighting the maritime security and environmental risks posed by autonomous weapons in the Indo-Pacific. In April 2024, Austria convened the Vienna Conference on Autonomous Weapons Systems, exploring human dignity, control, accountability, and the future of human-technology interaction; a chair’s summary capturing the key points of discussion was released. Notably, most of these initiatives have been spearheaded by developing states from the Global South, many of which have limited military capabilities to build LAWS.

In the past two years, discussions about the security and safety of various uses of AI in the military—such as logistics, training, simulation, and command and control—have also emerged. While these discussions focus on the broader military applications of AI, they also touch on LAWS, affecting existing discussions about their regulation.

One such event was the global Summit on Responsible AI in the Military Domain (REAIM), co-hosted by the Netherlands and South Korea in February 2023. The REAIM summit sought to expand the discourse on the military applications of AI, which had largely been confined to political spaces, to other areas. Experts note that the summit aimed to forge a shared vision for the military application of AI through a collaborative, multistakeholder approach. The summit released a joint call to action focused on nine deliverables. These included facilitating an inclusive dialogue, launching capacity-building initiatives and best practices to promote the responsible deployment of AI in the military, and developing national frameworks and strategies, among others. Further, the Global Commission on REAIM was launched, initially for two years, to promote understanding among the various bodies working on issues related to the global governance of AI’s military applications. Such an inclusive dialogue has offered a platform to the nongovernmental and private entities that play important roles in innovating and developing these AI applications.

While the United States considers the GGE the appropriate forum for discussing LAWS, it has also emphasized that military AI goes beyond LAWS and necessitates the identification of best practices across its broad spectrum. Accordingly, the U.S.-led Political Declaration on Responsible Military Use of AI and Autonomy, unveiled at the 2023 REAIM summit, established a global framework for the responsible use of military AI, outlining ten foundational measures to guide nations in this endeavor. In March 2024, sixty countries, including the fifty-four that endorsed the declaration, convened at the first plenary meeting to discuss their national implementation strategies. Three working groups, led by Austria, Bahrain, Canada, Portugal, and the United States, were also formed to work on specific deliverables related to the declaration’s implementation. The United States emphasizes that the political declaration is complementary to, but independent of, the work of the GGE.

Both the REAIM summit and the U.S.-led political declaration operate under the shared assumption that AI will be used in the military domain. Rather than calling for a blanket ban on military applications of AI, these initiatives emphasize the need for mechanisms and practices to foster their responsible use. The two diverge in that the REAIM summit facilitates a bottom-up approach, emphasizing conversations around the technical aspects of AI’s military applications, whereas the declaration is a high-level political instrument that aims to implement best practices among its signatories through plenary sessions and thematic working groups. Regarding participation, the REAIM call to action was endorsed by China but not by Iran, Israel, or Russia, whereas the U.S.-led political declaration was endorsed by Israel but not by China, Iran, or Russia.

The different stances of High Contracting Parties within the GGE and initiatives outside the GGE reflect a fragmentation of the normative sphere of LAWS. While having regional and decentralized dialogues on regulating LAWS can promote inclusivity and bring diverse perspectives to the table, the existence of parallel processes with their own mandates could potentially hinder the development of a unified approach and delay achieving consensus for effective LAWS regulation.

India did not participate in the Belen Communiqué, the CARICOM Declaration, or the Freetown Communiqué. It neither joined the U.S.-led political declaration nor endorsed the joint call to action at the 2023 REAIM Summit. The following section examines India’s stance on the normative governance of LAWS, highlighting its support for a politically binding instrument within the GGE and its focus on balancing military necessity with humanitarian concerns. It also analyzes the probable reasons behind India’s ongoing support for the GGE and a politically binding instrument, reflecting an Indian perspective on shaping the norms for regulating LAWS.

India’s Normative Positions on LAWS

Prioritizing the GGE Process

India was one of the five countries that voted against the December 2023 UNGA resolution on LAWS. At various sessions of the GGE and the UNGA, India maintained that the GGE was the appropriate forum to discuss LAWS. It posited that the GGE’s existing body of work must be expanded upon to build a common understanding of LAWS and expressed concerns that simultaneous processes may lead to a duplication of efforts or parallel sets of rules.

The following factors could have driven India’s negative vote at the UNGA.

Firstly, while the UNGA, with its 193 member states, offers greater inclusivity than the GGE, the latter provides the technological, military, and legal expertise crucial for a comprehensive discussion on LAWS. The UNGA’s majority voting process may expedite resolutions, but, unlike the GGE’s consensus-based approach under the CCW, these resolutions often lack binding effect, thereby weakening the regulation of LAWS. The CCW and its protocols offer a proven framework for regulating excessively harmful weapons, one that could potentially extend to LAWS in the future.

Secondly, the GGE has already developed a significant body of work—the eleven guiding principles on LAWS, annual reports, and working papers—that can help expand common understanding among High Contracting Parties. Given the international demand to establish a legal instrument to govern LAWS by 2026, building on the GGE’s work could expedite the process rather than starting anew.

Thirdly, pursuing normative discussions on LAWS outside the GGE may prove ineffective, as major militaries developing autonomous weapons—such as Australia, China, Israel, Japan, Russia, and the United States—support the GGE as the appropriate forum for these conversations. Many have also opposed the creation of a parallel process outside the CCW. For example, although the United States voted for the UNGA resolution, it prioritizes the GGE for LAWS discussions. In its submission to the UN secretary-general’s report, it stated that efforts outside the GGE that do not include interested states or do not operate by consensus may cause fragmentation.

Supporting a Political Instrument

Global consensus takes time to build and even longer to translate and operationalize through codified rules in binding agreements. Reaching a consensus on the many aspects of LAWS is particularly challenging due to differing conceptual understandings of autonomous weapons and differing security realities that countries face. India believes that the discussions on LAWS should not contribute to the fragmentation of the normative sphere but rather should focus on finding common ground, taking into account the views and concerns of all. While it has expressed interest in developing legally binding instruments in arms control and disarmament as a matter of principle, India notes that much work needs to be done before meaningful negotiations can begin for a legally binding instrument on LAWS.

It is in this context that India has voiced its support for a politically binding instrument based on the eleven guiding principles of the GGE. Politically binding instruments embody high-level national commitments to certain goals. While legally binding instruments ensure compliance, the lack of clarity on key aspects—definitions and characterizations in the case of LAWS—may deter states from adopting them. As experts note, the positive effects of a legally binding instrument on compliance could be offset by reduced participation, thereby decreasing its effectiveness. In such circumstances, adopting a political instrument for LAWS at the GGE could be a useful first step before considering a legally binding instrument, especially given the reluctance of major militaries to restrict the development of autonomous weapons systems. This practice is not without precedent: Before the Outer Space Treaty was adopted in 1967, the UN Committee on the Peaceful Uses of Outer Space formulated the Declaration of Legal Principles for Outer Space that was adopted by the UNGA in 1963, eventually paving the way for the treaty’s adoption.

Adopting a politically binding instrument within the GGE may help counter prevailing narratives on the group’s inability to produce sufficient outcomes to govern LAWS and may create momentum within the GGE to catalyze further action, such as a new CCW protocol.

Balancing Military Necessities With Risks

Developing autonomous weapons systems is a crucial military priority for India. Its defense industry already fields some autonomous weapons systems, like the Counter Measure Dispensing Systems and the Adaptive Intelligent Front Towing Solution for Artillery Gun. Autonomous weapons systems can be useful along India’s borders, which are characterized by rough terrain and remote locations, for surveillance and reconnaissance and for averting cross-border terrorism. LAWS can continue to operate autonomously in information-denied environments, where operators could be subject to attacks by adversaries. In the Indian Ocean region, autonomous systems have proven particularly useful in anti-piracy operations, securing trade routes, and consolidating India’s place as a net security provider in the region. For example, India recently deployed an uncrewed MQ-9B SeaGuardian to assess and respond to a hijacked vessel in the Arabian Sea.

While recognizing the benefits these systems offer, India maintains that discussions on LAWS should not stigmatize the technology underlying autonomous weapons. This position was reflected in its stance at the GGE and reiterated in its submission to the UN secretary-general’s report. Moreover, India has maintained that the GGE under the CCW framework strikes the right balance between military necessities and humanitarian imperatives.

India’s continued focus on the GGE remains important amid these emerging processes because the group provides a balanced and value-neutral approach toward the governance of LAWS. In the past, the CCW framework has successfully balanced military necessities with humanitarian concerns and offers the potential to create new protocols for LAWS governance in the future. As discussed previously, the group also enjoys support from all relevant parties, including major militaries with capabilities to build autonomous weapons—key to ensuring effective regulation.

Conclusion

The global debate on LAWS is intensifying and expanding, with differing views on the fundamental understanding of these weapons and their regulation. Within the GGE, countries are moving toward a tiered form of LAWS regulation, despite divergences surrounding definitions, characterizations, and regulatory approaches. Outside the GGE, dialogues like the Belen Communiqué and the CARICOM Declaration emphasize the risks of these weapons, while the REAIM Summit and the U.S.-led political declaration advocate for the responsible military use of AI. At this stage, India’s key positions in the LAWS debate at the GGE offer insights into why the GGE may still be relevant to the regulatory and normative process on LAWS. Countries should build on the momentum from the 2024 GGE session to further assess progress and foster consensus on LAWS and their regulation.


1. According to Google’s Perspectives on Issues in AI Governance, machines exhibit limited capability to discern emotional subtleties. Despite advancements in their functionality, these systems may struggle to bring genuine humanity to their interactions.

Carnegie does not take institutional positions on public policy issues; the views represented herein are those of the author(s) and do not necessarily reflect the views of Carnegie, its staff, or its trustees.