Playing the long game on autonomous weapons

TRISHA RAY


THE term ‘killer robots’ is an oft-invoked shorthand used to describe Autonomous Weapons Systems (AWS), systems capable of detecting, selecting and engaging targets without human intervention.1 However, this term glosses over the fact that no bright line separates the intelligent technologies used by the military and civilian sectors.

AWS governance must balance the strong interest countries have in their potentially game-changing battlefield applications, the economic growth potential of related AI technologies for countries preparing for the Fourth Industrial Revolution, and questions regarding the ethics of AI decision making. As a result, AWS is a battleground where national security, economic and ethical imperatives collide. India’s stance on AWS at global fora is, accordingly, ambivalent by design, reflecting the complex interaction of economic considerations, national and regional security implications and concerns regarding democratic accountability.

This article will lay out the state of play on the global governance of AWS and highlight three defining characteristics of India’s position, which are driven by the need to, first, meet its national security requirements; second, balance regulation with AI-enabled growth; and third, meet its obligations as a democratically accountable government. The article will also explore, in the context of the economic growth imperative, the conflict amongst international stakeholders over the definition of AWS, and thereby over which technologies would come under stricter regulation.

 

Global governance of autonomous weapons falls under the purview of the UN Convention on Certain Conventional Weapons (CCW). From 2014 to 2016, discussions were held at the CCW’s Informal Meetings of Experts, which laid the groundwork on areas of consideration including the importance of humanitarian law, responsibility, accountability and proliferation risk. Since 2017, discussions have been held at the Group of Governmental Experts on Lethal Autonomous Weapon Systems (GGE on LAWS).

The GGE has agreed upon seven broad principles. First, human accountability cannot be transferred to machines; machines and human beings cannot be treated the same way under law. Second, humans are accountable at all stages of the development, deployment and use of LAWS. Third, international humanitarian law (IHL) is applicable to the development, deployment and use of all emerging weapons systems. Under Article 36 of Additional Protocol I to the Geneva Conventions, states are obliged to determine whether new weapons systems comply with their obligations under international law, including IHL. Fourth, states are responsible for the physical and non-physical safeguards for LAWS. States must take measures to secure weapons systems against theft, damage and cyber attacks by other state and non-state actors.

Fifth, policy measures under the aegis of the CCW should not hamper the peaceful use of emerging technologies. Sixth, human-machine interaction at various stages of the development, deployment and use of AWS should ensure adherence to IHL. Seventh, states are free to conduct independent legal reviews of AWS and allied technologies and are encouraged to share best practices.

Generally, country positions on AWS fall along a spectrum: from countries calling for a complete ban; to those who claim that any regulation beyond the existing limits set by international law is unnecessary or infeasible; and finally, to those who propose regulated development of AWS. For the sake of brevity, this article labels these groups the absolutists, the laissez faireists and the regulators respectively.

 

The absolutists demand a complete ban on the development and use of fully autonomous weapons systems. The African Group called for a moratorium on the development and manufacture of all AWS until a complete ban is in place. The absolutists have distinct priorities driving their positions. For some, like Pakistan, which in 2013 became the first country to call for a ban, the primary concern is that developing countries without access to such technologies will be disproportionately harmed. For others, like Zimbabwe and Chile, the delegation of life and death decisions to a machine is unacceptable and inhumane.

Mexico believes that fully autonomous systems cannot, by definition, meet the standards of accountability and responsibility set by the Geneva Conventions. Morocco voiced concerns about the start of a new high-tech global arms race that would undermine non-proliferation and disarmament. As of October 2019, the Campaign to Stop Killer Robots lists 30 countries that have called for a prohibition on fully autonomous weapons.

 

The laissez faire brigade, consisting of twelve states including the UK, Australia, Israel, South Korea, Russia and the United States, opposes an international treaty on AWS. Russia and the US have both asserted that AWS will be less prone to error than systems with human operators, thereby reducing collateral damage and harm to friendly forces. A report by the NGO Reaching Critical Will on the August 2018 GGE meeting noted that the US and Russia questioned whether IHL could apply to AWS at all, a reversal of declarations made by both countries at previous expert meetings.

Australia and the UK consider a sweeping prohibition of AWS, at this stage, premature. British officials state that, according to their own parameters, no existing weapons system qualifies as autonomous.

The regulators occupy the middle ground: many advocate for a new instrument, legally binding or otherwise, under the aegis of the CCW that establishes parameters such as meaningful human control and accountability. Austria, Brazil and Chile made a joint submission to the August 2018 session of the UN GGE proposing the negotiation of a legally binding instrument on ‘meaningful human control over critical functions in lethal autonomous weapons systems.’ China, too, appears to fall in the regulator category, although its declared position is equivocal, at best.

While the Campaign to Stop Killer Robots includes China in its list of countries calling for a ban, it comes with an asterisk: the Campaign notes that China opposes the use, but not the development, of AWS. China’s official position paper at the April 2018 GGE simply calls for a ‘uniform standard’ for national reviews of the development and deployment of AWS. Elsa Kania, a prominent China analyst, has noted that ‘China’s apparent diplomatic commitment to limit the use of "fully autonomous lethal weapons systems" is unlikely to stop Beijing from building its own.’

 

India has been an active participant in the UN CCW process on LAWS since its inception. India’s Ambassador to the Conference on Disarmament, Amandeep Singh Gill, chaired the first UN GGE until 2018. On AWS, India functions as a regulator, taking a measured approach to governance. India’s submissions to the Meeting of Experts and the GGE are sparse in detail but display two key attributes. First, India prefers light touch regulation. India’s statements at the CCW have consistently cautioned against ‘premature, unnecessary’ prohibitions and emphasized that related technologies should not be ‘stigmatized’. In other words, stringent prohibitions on AWS-related technologies could jeopardize innovation.

Second is a cognizance of geopolitical and technological inequities. India’s stance on the applicability of international law to AWS development and use is nuanced; it highlights the importance of accountability and transparency, as well as the principles of proportionality, necessity and distinction. At the same time, India holds that regulations should not exacerbate existing technological gaps between countries. This latter belief has been a mainstay of India’s stance on the international governance of technologies, as epitomized by its earlier positions on nuclear weapons: Indian foreign policy emphasizes ‘global, verifiable and non-discriminatory nuclear disarmament’ out of a wariness of international regimes that limit its own ability to secure itself.

India’s stance on AWS has two defining characteristics: light touch regulation and an acute awareness of existing inequities. Both are a product of the foreign policy imperative of retaining the latitude to develop technologies that may provide strategic advantages. This section posits that India’s position as a ‘regulator’ on AWS stems from, first, the exigencies of its security environment; and second, consideration of the effect a prohibition would have on the development of India’s fledgling AI industry.

 

India’s grand strategy consists of three ‘concentric circles’: the immediate neighbourhood, the extended neighbourhood and the global stage.2 India’s conception of its security environment is similarly an interplay: managing relations with neighbours, creating a stable regional security environment and maintaining internal stability, all of which aid its projection of itself as a military power. The 2018-19 Ministry of Defence (MoD) Annual Report lists the following security issues: terrorism, insurgency, maritime security in the Indian Ocean region and land border security.

There are, therefore, a number of applications for AI in the Indian defence context. In its June 2018 report, the MoD Task Force on AI made recommendations on applications of AI in aviation, naval and land systems, as well as in cyber, nuclear and biological warfare. While the report itself was not made public, the following are potential uses for autonomous systems.3

Force multiplier and contested borders: India’s Border Security Force (BSF) personnel endure extended exposure to extreme terrain and are under constant threat from unfriendly forces attempting to infiltrate the border. As a result, according to the Ministry of Home Affairs, 529 BSF jawans committed suicide and 491 died in combat between 2001 and 2016. AWS can supplement human capabilities and improve working conditions for soldiers on the ground, thereby improving the border force’s overall effectiveness.

 

Reducing human costs in urban theatres of conflict: Violence in Kashmir and Northeast India resulted in 525 civilian deaths between 2014 and 2019, as reported by the Ministry of Home Affairs.4 Fewer boots on the ground, paired with continued improvements in AI-enabled situational awareness, can greatly reduce civilian casualties and harm to friendly forces.

Defence in depth through persistent presence in the maritime domain: India faces three sets of maritime threats: the first is sea-borne terrorism, as exemplified by the 2008 Mumbai attacks; the second is piracy along important trade routes; and the third is naval incursions by hostile states. AWS can help maintain a persistent presence in areas that are difficult to monitor due to risks arising from climate, vast or difficult terrain, or unexploded ordnance.

The Defence Research and Development Organization (DRDO) and a handful of Indian public sector undertakings (PSUs) are pursuing a number of projects on autonomous systems, including the Ghatak, a ‘self-defending high-speed reconnaissance UAV’, and unmanned combat aerial vehicles (UCAVs) in partnership with Israel Aerospace Industries.

 

Defence procurement has been dominated by PSUs, partly because procurement systems were skewed in their favour.5 However, the armed forces have expressed their dissatisfaction with the glacial pace of DRDO’s projects. One army officer told The Print, ‘There needs to be a shorter incubation period. Many times, the forces have demanded a certain product, and by the time it comes out, it is more or less outdated, technology-wise.’6 Acknowledging these concerns, the MoD AI Task Force’s report emphasizes the role of the private sector, stating that innovative AI applications will emerge from private firms and start-ups. Accordingly, to foster partnerships with start-ups and MSMEs, the Government of India launched the Defence India Startup Challenge.

The Indian government, aware of the need for home-grown solutions to India’s security challenges, is fostering research and development of autonomous systems and other AI-enabled military systems within the public and private sectors. Heavy-handed regulation of AWS would inhibit India’s ability to foster innovative AI-enabled solutions for its chronic security challenges.

India’s emphasis on light touch regulation and technological inequities at the UN GGE has an economic dimension as well. As Ambassador Amandeep Singh Gill said during an interview with the author at CyFy Africa in June 2019, ‘[India] is conscious not just of the security aspects but also the development opportunities, the economic transformation opportunities that are coming out of these technologies.’7

 

By 2025, the AI industry is projected to be worth around $191 billion, a prospect that has driven many countries to draft national strategies to increase their slice of the pie. India has done the same: the 2018 National Strategy on Artificial Intelligence highlights the transformative potential of AI for society, from expanding access to healthcare and finance, to improving agricultural practices, to easing the pressure on transport infrastructure.

One flaw in the UN GGE process is the absence of a single, meaningful definition of AWS around which country positions could coalesce. Human Rights Watch defines AWS as systems able to select and engage targets, a definition that covers lower levels of autonomy and encapsulates functions present in many unmanned systems today. In contrast, the UK Ministry of Defence defines these systems as ‘capable of understanding higher level intent and direction’ and as fully autonomous, requiring no human oversight or control. Fully autonomous systems with ‘higher intent’ presently exist only in science fiction. In the absence of clearly defined parameters for autonomy, AWS governance may sweep in a broad range of constituent technologies, including voice recognition, natural language processing, computer vision and sensor fusion, all of which have economic value beyond their military applications.

 

At the March 2019 convening of the UN GGE, the Indian delegation outlined five characteristics of AWS: autonomous functioning (independent operation from activation, deployment or launch until the terminal phase); situational awareness and adaptability (the ability to navigate autonomously, track a target and adapt to changes in the environment); target identification and differentiation (the ability to distinguish between friendly and opposing forces); decision making (the ability of a fully autonomous system to take decisions on its own, rather than execute pre-programmed actions under a given set of conditions); and learning (complex self-learning and adaptive capabilities that allow the system to determine a course of action when it encounters an unfamiliar scenario).

This characterization places India’s endorsed definition closer to that of the UK: barring launch or deployment, human control is absent at all stages of the system’s functioning, and the system would be able to learn and adapt to its environment. Considered against the growth potential of AI technologies, this definition is a natural extension of the official Indian position that AWS regulations should not curb innovation.

Even as the economic growth potential and security applications of AI are acknowledged, domestic support for the development of autonomous weapons is by no means unanimous. A 2019 online poll by Ipsos recorded the highest support for fully autonomous weapons systems among the surveyed countries in India (50% of respondents), yet 37% of Indian respondents opposed AWS, with accountability as the primary concern.8

 

Accountability is central to a democracy like India, and autonomous weapons epitomize the sharpest anxieties regarding government use of new and emerging technologies for suppression and control. For instance, Home Minister Amit Shah stated in his speech in the Lok Sabha on 11 March 2020 that authorities had used facial recognition software, in conjunction with driving licence and voter ID databases, to identify protestors. India has also achieved the dubious distinction of leading the world in internet shutdowns, with more than 350 shutdowns between 2014 and 2019.9

Buried in India’s statement at the 2014 Meeting of Experts on LAWS is an important driver of its measured stance on AWS: ‘From India’s points of view, we would like […] increased systemic controls on international armed conflict embedded in international law in a manner that does not [...] encourage the use of lethal force to settle international disputes just because it affords the prospects of lesser casualties to one side or that its use can be insulated from the dictates of public conscience.’

 

For the Indian state, stability, whether defined in economic or hard security terms, is indelibly linked with the mandate of its people and the legitimacy gained on the global stage as a responsible actor. India’s active participation in the CCW process on AWS ensures that it will help shape global rules that best serve its interests as an aspiring global power while retaining the trust of its citizens.

 

Footnotes:

1. This definition parses various characteristics proposed by experts at the UN GGE on LAWS, though there is currently no international legal definition of AWS. Further characteristics, highlighted in the 2019 report, include: self-adaption; predictability; explainability; reliability; ability to be subject to intervention; ability to redefine or modify objectives or goals or otherwise adapt to the environment; and ability to self-initiate.

2. C. Raja Mohan, ‘India and the Balance of Power’, Foreign Affairs 85(4), 1 July 2006.

3. Trisha Ray, ‘Beyond the Lethal in Lethal Autonomous Weapons’, Observer Research Foundation, 14 December 2018.

4. For Kashmir, the reporting period stops on 31 March 2019. https://www.mha.gov.in/sites/default/files/AnnualReport_English_01102019.pdf. Unofficial estimates differ, with the Jammu Kashmir Coalition of Civil Society putting the death toll at 160 civilians in 2018 alone, as opposed to the MHA’s 37 deaths. http://jkccs.net/2018-deadliest-year-of-the-decade-jkccs-annual-human-rights-review/

5. ‘India’s Gross Defence Budget May Reach $112 bn by FY27 Clocking 11% CAGR: ASSOCHAM-KPMG Report’, Assocham, 27 May 2018. https://www.assocham.org/newsdetail.php?id=6838

6. Snehesh Alex Philip, ‘Army Wants DRDO to Take in More of its Personnel on Deputation, Give Them More Access’, The Print, 18 October 2019. https://theprint.in/defence/army-wants-drdo-take-more-personnel-on-deputation-give-them-more-access/307788/

7. ‘#CyFyAfrica | In Conversation on Encoded Lethality: The Effect of Autonomous Systems on National Security with Trisha Ray, Junior Fellow, ORF with Ambassador Amandeep Gill, UN Secretary General’s High Level Panel on Digital Cooperation’, Facebook video, Observer Research Foundation, 8 June 2019. https://www.facebook.com/ORFOnline/videos/332975220702468/

8. It is worth noting that opposition to AWS in India has risen six percentage points compared to a 2017 survey by the same organization.

9. Nikhil Rampal, ‘More Than 350 Internet Shutdowns in India Since 2014’, India Today, 18 December 2019. https://www.indiatoday.in/diu/story/more-than-350-internet-shutdowns-in-india-since-2014-1629203-2019-12-18

 

References:

Hayley Evans and Natalie Salmanowitz, ‘Lethal Autonomous Weapons Systems: Recent Developments’, Lawfare, 7 March 2019.

https://www.lawfareblog.com/lethal-autonomous-weapons-systems-recent-developments

Anja Dahlmann and Marcel Dickow, ‘Preventive Regulation of Autonomous Weapon Systems: Need for Action by Germany at Various Levels’, SWP Research Paper 2019/RP 03, January 2019.

Autonomous Weapon Systems: Technical, Military, Legal and Humanitarian Aspects. ICRC submission to the LAWS expert meeting, Geneva, Switzerland, 26-28 March 2014.

Campaign to Stop Killer Robots, stopkillerrobots.org.

‘Shifting Definitions – the UK and Autonomous Weapons Systems’, Article 36, July 2018.

http://www.article36.org/wp-content/uploads/2018/07/Shifting-definitions-UK-and-autonomous-weapons-July-2018.pdf

‘Annual Report 2018-19’, Ministry of Defence. https://mod.gov.in/sites/default/files/MoDAR2018.pdf

‘Statement by Ambassador D.B. Venkatesh Varma at the CCW Experts Meeting on Lethal Autonomous Weapons Systems’, 13 May 2014.

https://www.unog.ch/80256EDD006B8954/(httpAssets)/56839DAAD755FFC9C1257CD8003E65FD/$file/India+LAWS+2014.pdf

‘Statement by Ambassador D.B. Venkatesh Varma at the CCW Informal Meeting of Experts on Lethal Autonomous Weapons’, 17 April 2015.

https://www.unog.ch/80256EDD006B8954/(httpAssets)/FCCEC7D562B876E9C1257E2A0041E28D/$file/2015_LAWS_MX_IndiaConc.pdf

‘Statement by Ambassador D.B. Venkatesh Varma at the CCW Informal Meeting of Experts on Lethal Autonomous Weapons’, 11 April 2016.

https://www.unog.ch/80256EDD006B8954/(httpAssets)/2BE1A62650F95B8AC1257F920057AEED/$file/2016_LAWS+MX_General+Exchange_Statements_India.pdf

‘Statement by India – Characterization of the Systems under consideration in order to promote a common understanding on Concepts and Characteristics relevant to the objectives and purposes of the Convention’, 25 March 2019.

https://www.unog.ch/80256EDD006B8954/(httpAssets)/F8C1F0AEE961CA93C12583CC00353A09/$file/25+March+2019+-+5(d).pdf

‘Statement by India: An exploration of the potential challenges posed by Emerging Technologies in the area of Lethal Autonomous Weapons Systems to International Humanitarian Law’, 26 March 2019.

https://www.unog.ch/80256EDD006B8954/(httpAssets)/4C330E6B0BDD4C20C12583D2003C36AF/$file/5+a+26+Mar+2019+forenoon.pdf

‘Group of Governmental Experts (GGE) on Lethal Autonomous Weapons Systems (LAWS), 13-17 November 2017, Opening Statement’, submitted by the United States (13-17 November 2017).

https://www.unog.ch/80256EDD006B8954/(httpAssets)/6E9C8002759032A8C12582490031466C/$file/2017_GGE+LAWS_Statement_USA.pdf

‘Potential Opportunities and Limitations of Military Uses of Lethal Autonomous Weapons Systems’, submitted by the Russian Federation (9 March 2019).

https://www.unog.ch/80256EDD006B8954/(httpAssets)/489AAB0F44289865C12583BB0063B977/$file/GGE+LAWS+2019_Working+Paper+Russian+Federation_E.pdf

‘Australian Statement – General Exchange of Views’, 13-17 November 2017.

https://www.unog.ch/80256EDD006B8954/(httpAssets)/E4B1B901E6728457C125823B0041DB57/$file/2017_GGE+LAWS_Statement_Australia.pdf

Ray Acheson, ‘New Law Needed Now’, Reaching Critical Will, CCW Report, Vol. 6, No. 9 (30 August 2018).

http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/reports/CCWR6.9.pdf

Ray Acheson, ‘Mind the Downward Spiral’, Reaching Critical Will, CCW Report, Vol. 6, No. 11 (4 September 2018).

http://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2018/gge/reports/CCWR6.11.pdf
