
CCW Report, Vol. 14, No. 2: The Final Stretch Before the Finishing Line


On 2–6 March 2026, delegations met in Geneva for another session of the Group of Governmental Experts (GGE) on Emerging Technologies in the Area of Lethal Autonomous Weapon Systems (LAWS). After four sessions held in 2024 and 2025, this session marked the beginning of the end of the Group’s three-year mandate. During the week, the GGE discussed the rolling draft text of elements for a possible instrument or other measures on LAWS, circulated by the Chair in December 2025. When the week started, over 40 states had expressed support for moving to negotiations on the basis of the “rolling text”; by the end of the week, that number had risen to over 70, after more states, including a group of African states, joined this call. It’s clear that most states are ready to take the next step. This matters not only because the upcoming Review Conference of the Convention on Certain Conventional Weapons (CCW) in November must decide upon next steps, but also because the increasing harm being caused by automated violence worldwide requires urgent action.

Preventing Setbacks

From Monday to Wednesday, delegations conducted a first reading of the whole draft text, which was divided into five boxes to help organise discussions. On Wednesday night, the Chair distributed a revised text with additional changes in Boxes I, II, and III. Modified Box I was discussed on Thursday morning, while Modified Box III was discussed on Thursday afternoon and Friday morning. Informal consultations, held without the presence of civil society, took place on Tuesday and Thursday evenings.

As discussions progressed, delegations offered views on the text and suggested changes, most of which were oriented toward finding possible convergence between positions. However, in this exercise of trying to bridge different positions, the GGE sometimes moved away from its objective. As the International Committee of the Red Cross (ICRC) stated on Friday morning, the purpose of this text should be “not only restating existing international humanitarian law (IHL) rules but also clarifying how those rules apply in the context of autonomous weapons and articulating additional specific measures that respond to the particular challenges raised by such systems.”

After more than two years of discussion within the current mandate to develop elements for a draft instrument, and over a decade of work on this topic in the CCW, some delegations still oppose certain concepts, particularly those related to human control over weapon systems, arguing that they are not part of existing IHL. However, as Pakistan argued, the metric should not be whether these words already appear in IHL, but whether the safeguards developed by the Group can respond to 21st century challenges. After all, as Stop Killer Robots reminded delegations, the central theme of discussions has been how to ensure that the quality and substance of human decision making and control over increasingly autonomous systems is not eroded. “States should not lose sight of the need for strong rules to ensure such judgment and control, to uphold basic ethical and legal standards, and for civilian protection,” emphasised Stop Killer Robots.

This is particularly important as delegations approach the final session of the GGE, scheduled for 31 August–4 September. In discussing what will be included in the report to be submitted to the CCW Review Conference, states must avoid getting caught up in deleting paragraphs through a “consensus spree,” as Belgium put it. “At some point, this bubble we have built needs to produce something that connects with the real world and that’s something with a real-world impact,” emphasised the Belgian delegation, while also noting that in its view, this means working towards an instrument on LAWS with real added value for the world.

The Finishing Line

Strong international law prohibiting autonomous weapons has never been more needed than it is now. A few weeks before the GGE met, autonomous weapons were at the centre of public attention when the United States’ so-called Department of War pressured the company Anthropic to adjust the terms of use of its artificial intelligence (AI) model, Claude. The company prohibits the model’s use for mass domestic surveillance and for fully autonomous weapons operating without human oversight. Anthropic’s CEO Dario Amodei stated that “frontier AI systems are simply not reliable enough to power fully autonomous weapons” and that the oversight mechanisms needed to protect civilian lives and military personnel “don’t exist today”. In retaliation for Anthropic’s unwillingness to change its rules, the US government designated Anthropic a “supply chain risk.” Just a few hours later, OpenAI announced that it had reached an agreement with the Department of War for the use of its AI models. The reported wording of OpenAI’s contract with the Department states that “the AI System will not be used to independently direct autonomous weapons in any case where law, regulation, or Department policy requires human control.”

Despite this dispute, the US military continued to use Claude in its war on Iran, and will reportedly continue to do so until the system is phased out. A few weeks earlier, the Wall Street Journal also reported that the same system had been used by the US military during its operation to kidnap Nicolás Maduro from Venezuela. Innocent people are already paying the price of AI-enabled digital dehumanisation in Iran and Gaza, as well as in border control and policing elsewhere.

This ongoing use of AI for weapons targeting, and the US government’s hostility toward AI companies placing limits on the use of their technologies, also provides context for the US delegation’s refusal to accept the term “human control” during last week’s GGE. During the debate on the Modified Box III text, the US proposed saying instead “good faith human judgement and care.” Many delegations pushed back against this new phrase as insufficient to protect civilians or uphold international law. The vast majority of states have reiterated over the past decade that human control over weapons is imperative, and that they will not accept anything less.

The dispute with Anthropic demonstrates, yet again, the importance of having strong, legally binding international rules prohibiting autonomous weapons and articulating exactly what human control means and how to achieve it. As highlighted in a recent article by Professors Jessica Dorsey, Elke Schwarz, Ingvild Bode, Zena Assaad, and Neil C. Renic, companies cannot be the last arbiters of reason against unreasonable government demands. “The safety, ethical and legal bar for AI use in contexts of conflicts should be set much higher,” emphasised the authors.

This highlights both the relevance and the urgency of the GGE’s work. As the Belgian delegation put it, “We should keep in mind that the CCW is made for the world, for us, and not the other way around.” In the final stretch of the race, it is important that the GGE concludes its mandate with strong elements for a possible legally binding instrument. More importantly, it is essential that delegations decide to launch negotiations on an instrument prohibiting and regulating autonomous weapon systems at the CCW Review Conference in November. The world cannot afford to continue living without such a treaty. “It is time for our governments to draw legal and ethical lines to divert humanity away from the dangerous slide towards automated killing we are currently on,” said Stop Killer Robots.

