Indo-Pacific Analyst
One is spoiled for choice when selecting a metaphor to describe the significance of Artificial Intelligence (A.I.) in warfare: a double-edged blade, a 'third revolution' in war, the 'Pandora's box' of weapons development, and so forth. Characterising such weapons as a 21st-century Prometheus likewise fits the general trend across the litany of media coverage these emerging armaments have garnered. Lethal Autonomous Weapons Systems (LAWS), as they are more formally known, carry immense consequences for the future landscape of any conflict, and although a Terminator-style A.I. Armageddon remains a fictional nightmare, the deployment of LAWS alongside human combatants is already a reality. Indeed, one can draw parallels between the dawn of the so-called 'Nuclear Age' in the early years of the Cold War and the emerging LAWS arms race of the current age; the international community faces the possibility of another division between the 'have' and 'have-not' powers. The question for the world's multilateral and national institutions, then, is how to regulate autonomous weapons so that their benefits can be reaped without running afoul of their consequences.
When discussing matters related to the international armaments industry, it is rarely the case that the simplest solution is the best—or even the most agreeable—one. When it comes to LAWS, some nations and civil society organisations have repeatedly called for a legally binding ban on their development and usage, akin to the Treaty on the Non-Proliferation of Nuclear Weapons (NPT). Unsurprisingly, the states that object most vocally to these calls are also the ones channelling immense capital and effort into expanding the autonomous capabilities of their military hardware—the Permanent Five members of the Security Council as well as Israel and Türkiye chief among them. Yet even some experts agree that an outright ban on LAWS, or 'killer robots' as many campaigners have labelled them, is an unachievable goal given one major problem: there is no 'official or internationally agreed definition' for autonomous weapons systems. In the absence of international agreement over what constitutes a LAWS, governments have published and circulated their own identifying criteria, essentially providing their militaries and armaments companies with a blank cheque to develop increasingly powerful and capable autonomous weapons platforms. In March 2021, the United Nations Security Council received a report detailing the deployment of one such weapon system in Libya: Türkiye's STM Kargu-2 drone, which was 'programmed to attack targets without requiring data connectivity between the operator and the munition'. Similar drone-based LAWS have since reportedly been deployed by militaries in Ukraine and the Middle East, though their exact nature and destructive potential remain unclear.
A multi-dimensional cost-benefit analysis lies at the core of the debate surrounding the appeal of LAWS and their validity in future conflicts. From a technical and military standpoint, LAWS represent a game-changing shift in the modus operandi of modern militaries. A.I.-driven weapons are capable of more precise targeting and decision-making in battlefield scenarios, thereby reducing human error and operator bias. More sophisticated versions of LAWS build on these benefits by eliminating the need for a human controller altogether, offering militaries a promising 'fire-and-forget' system that minimises the risk of unnecessary casualties or collateral damage. On the other hand, opposition groups have raised concerns about the drawbacks of LAWS, including the vulnerability of such systems to technical malfunctions and the ethical question of conducting war through unfeeling machines. At the crux of all these arguments, and arguably the most critical question around which the entire topic of autonomous weapons regulation revolves, is that of accountability.
When the international community gathers at forums and multilateral conferences to discuss accountability for autonomous weapons, the devil is almost always in the details. Whilst there is undeniably a near-universal consensus that some form of legislation is preferable to allowing each State to dictate its own boundaries regarding LAWS, the Global South is usually more inclined to unite on a hardline stance against developed nations that wish to protect their already considerable investment in the field. Yet even amongst the developed nations, divisions exist. The Russian Federation, for its part, has repeatedly submitted proposals to the Group of Governmental Experts (GGE) on LAWS, eschewing a binding legal instrument in favour of clear definitions and Confidence-Building Measures (CBMs) between Member States. On the other side of the negotiating table, the Anglo-American governments believe that a comprehensive treaty on such weapons would be 'premature', whilst China has supported calls from other States for a ban on the first use of LAWS, but not on their development as a whole. Despite these nuances, the Great Powers are aligned in the view that any new legislation should be rooted, entirely or largely, in the 1980 Convention on Certain Conventional Weapons (CCW), which they all believe provides an adequate foundation on which to build any specific restrictions on autonomous weapons systems.
One tier below the global level of dialogue, regional powers have banded together to issue several critical declarations on the need to negotiate a comprehensive and legally binding instrument to regulate, and potentially prohibit, certain elements of autonomous weaponry. In February 2023, under the leadership of the Costa Rican government, more than 30 Latin American and Caribbean states issued the Belén Communiqué, calling on other States to 'promote the urgent negotiation of an international legally binding instrument, with prohibitions and regulations with regard to autonomy in weapons systems, in order to ensure compliance with International Law…' In March of that same year, 22 Spanish- and Portuguese-speaking States added their support to a similar communication, whilst regulating LAWS has been debated at other intergovernmental organisations such as the European Union and its African counterpart.
Alongside these formal voices, several notable NGOs and campaign groups sit as observers in the formal GGE meetings, including the Campaign to Stop Killer Robots, the International Committee of the Red Cross, and Amnesty International. Adding to this already considerable body calling for greater regulation, U.N. Secretary-General António Guterres has explicitly urged States to take up the matter with greater energy, recommending in his 2023 New Agenda for Peace that a binding instrument on LAWS be concluded by 2026. Yet, as is often the case with multilateral initiatives at the U.N., consensus-based procedures mean that even one State dissenting from the majority can be enough to hold up any substantial progress on crafting a global rules-based environment for LAWS. Another challenge facing the international community is that not all States are party to the CCW, nor are all of the U.N.'s member nations represented in GGE meetings. The first step towards ensuring that any regulations on LAWS can become a reality is to call for greater dialogue and inclusivity in pre-existing discussion spaces. Only then can a concerted effort emerge in which compromise and collaboration can exist between the greatest number of international actors.
A corollary to the issue of greater participation is the need to ensure that such legislation is also receptive to future changes in A.I. development and its application in warfare. Will, for example, defensive armaments such as missile-interception systems and early-warning drones be subject to the same rules as offensive weapons? Even developing states have acknowledged the benefits LAWS can provide when utilised as an effective deterrent against attack by hostile State and non-state parties in wartime. It would, therefore, be prudent to ensure that the United Nations engages as many academic and technical experts as possible to weigh in on the specifics of defining and delineating between LAWS and other similar weapons systems, lest a nation exploit a loophole further down the line to the detriment of all involved. Similarly, it may be necessary to involve private-sector corporations with significant investment and expertise in A.I. systems, so that their responsibilities as the designers of LAWS can be clearly agreed upon. Thus, although the road to a comprehensive and legally binding instrument on autonomous weapons will require significant time and effort from as many parties as possible, the end result is preferable to the current modus vivendi of allowing an arms race to develop without serious checks and balances.
Another multilateral initiative that would boost the collective push for regulations on LAWS would be the signing of declarations between groups of states, signalling their intention to abstain from deploying specific LAWS—such as those armed with Weapons of Mass Destruction (WMD) or other armaments prohibited under International Law. Whilst not an end goal in itself, such public communications would at least symbolise a greater degree of coherence across governments and provide common ground on which further talks could build. Indeed, one such area of common ground is the view that any future regulations on LAWS must adhere to the existing body of International Humanitarian Law (IHL) whilst remaining cognizant of the fact that the absence of a human operator does not automatically absolve all parties involved in an instance where LAWS have breached international law. On the contrary, the necessity of including a human operator as an 'important limiting factor', to quote a GGE working paper, remains pertinent in forthcoming deliberations on how accountability can be built into every level of autonomous weapons systems.
Another significant sub-topic that must be addressed is ensuring that such weapons do not fall into the hands of non-state actors, seeing as such groups are inclined to avoid commitments to any formal legislation on their actions in conflict zones. A.I. weapons offer considerable potential to level an asymmetric playing field in wartime, or even to create one, and it would be in the interest of governments as well as civil society to ensure that the future development of such technologies includes measures to prevent them from being hacked, stolen, or voluntarily provided to potentially dangerous groups. On that same point, however, the inviolability of sovereign rights must still be respected within any arms control regime; many countries would be far less likely to support legislation if it involved surrendering their ability to develop national security apparatuses or subjecting such arsenals to inspection and potential espionage. Instead of insisting on these physical checks and disarmament, future legislation should focus on fostering a trust-based environment with CBMs to demonstrate the mutual benefits of avoiding an A.I. arms race.
In summation, the question of regulating autonomous weapons certainly has all the trappings of an ancient mythological quest for a seemingly unattainable end goal. Pandora's Box has already been opened through the development and deployment of such weapons systems in war-torn landscapes across the 21st century. Still, there remains hope that the consequences of unleashing such technologies upon human populations will persuade all those involved to take serious steps towards regulating how LAWS can be utilised in a responsible and accountable manner. Having let the A.I. genie out of the lamp, it now falls upon diplomats, civil society, and corporate executives to make a concerted effort to minimise the damage that autonomous weapons can deliver.