
Introduction

Artificial Intelligence (AI) has made huge strides over the past two decades with the advent of Deep Learning and Reinforcement Learning, powered by Artificial Neural Networks. Given the tremendous capabilities shown by AI systems today, autonomous AI systems have pivoted from being a topic of the distant future to one of relevance in the near future. Although AI is relevant to many sectors, none is more heavily invested in the research and development of autonomous systems than the defence sector. Lethal Autonomous Weapons Systems (LAWS) may usher in the ‘third age’ of warfare, in which wars and battles are fought and retaliated against not by human soldiers but by killer robots, ranging from drones, missiles and rocket launchers to national border defence systems.

Proponents of AI make multiple claims about the proliferation of AI systems into weaponry and its advantages. In this article, I attempt to dissect two of their boldest and most compelling claims: ‘Deterrence’ and ‘Protection of Civilians’.

AI: A Tool for Deterrence?

The use of AI in weaponry and defence raises the question of whether AI will deter adversarial action or encourage it by levelling the playing field. We shall tackle this question in light of ‘Coercion Theory’. The umbrella concept of coercion covers both making threats to influence an opponent’s behaviour and compelling the opponent to pursue a behaviour it would not have pursued but for the threat.[1] A defender communicates the threat of punishment or retaliation (sanctions, censure, or kinetic and cyber attack), while the aggressor weighs the potential cost of those threats against the expected benefits of the adversarial action.[2] The deterrent effect of any action is assessed on this basis. Employing this theory, I will discuss some of the possible outcomes.[3]

First, relying on the augmented speed, precision and certainty that AI lends to weapons and defence systems, defensive actors can deter by effectively denying the aggressor its intended outcome. Conversely, AI may also expand the offensive capability of attacks, which would favour punishment over denial. The interplay of these two outcomes, where aggressor and defender both possess matching AI capabilities, points to a possible rearrangement of coercion towards the middle.

This discussion prompts the question of what happens if this rearrangement occurs in respect of non-state actors. To elaborate, I move on to the next section of this article.

The Third Drone Age and Non-State Actors

In response to the assassination of General Qasem Soleimani by the United States, Iranian-backed Houthi rebels fired drones and missiles at industrial sites and transport facilities in Abu Dhabi. The onslaught killed three people and damaged a new extension of Abu Dhabi International Airport.[4] On 3 January 2022, US troops near Baghdad International Airport shot down two armed drones.[5] Attacks have continued apace ever since[6] and serve to remind the world of the consequences of state-developed advanced weapons technology falling into the hands of hostile non-state actors.

The appeal of drone technology is attributable to the reduced risk to military personnel, owing to the remote control of weaponry and targeting.[7] A key concern, however, is the availability of drones to non-state organizations, even during a period when the supply of drones from the state of Iran to these hostile groups dried up due to peace talks and an embargo. The reason behind this apparent anomaly is that these groups made fibreglass castings of the shells of whatever military-grade systems they could get their hands on, enabling them to mass-reproduce the aerodynamic chassis essential to these drones. Then, by procuring advanced commercial drone components, which are easy to purchase and hard to track or place under arms-control measures, these actors managed to create a supply chain for killer drones.[8]

Drones have been regarded as the future of warfare.[9] Military R&D, too, is heavily invested in the amalgamation of AI technology with lethal drones. The automation of drone technology will further reduce the human footprint and amplify force. Devoid of the context provided herein, the lure of automating weaponry is logical. However, experts apprehend that, if R&D continues apace, such automated drones could be part of non-state arsenals by 2040,[10] given that the commercial availability of AI systems will be even harder to regulate than that of commercial drone parts. AI-powered automated drones may evade air-defence systems more efficiently and further enhance offensive potential, reducing the asymmetry between non-state actors and state militaries.

Considering the diffusion of AI and machine-learning algorithms into the development of ‘third-age drones’, the international community must take cognizance of the implications of failing to effectively monitor and regulate these weapons, and of the means to curb their proliferation to non-state groups. The question then arises whether military control alone would suffice, or whether broader deliberation (i.e. commercial industry regulation) is needed in framing international regulatory instruments to limit this proliferation.[11]

Civilian Protection: A False Promise?

Militaries have shown interest in AI-powered autonomous weapon systems (AWS) for reasons such as expedited response times and amplified situational awareness. The US has expressed support for the development of AWS, asserting that it would make the implementation of International Humanitarian Law (IHL) more effective by limiting collateral damage and the loss of civilian lives.[12] On this basis, a former US secretary of defence has argued that “it is a moral imperative to at least pursue this hypothesis”.[13]

Discussions at the United Nations Convention on Certain Conventional Weapons (CCW), spanning seven years, have led to a tentative agreement among nations that lethal weaponry must not transgress beyond human control. This agreement is rendered less effective, however, by the fact that states have little to no consensus on the threshold of awareness and control that a human operator must possess over such weapon systems.[14] Without a shared understanding of this threshold for functions such as target selection and engagement, it would be very difficult to establish legal accountability.

In situations where a substantial level of human involvement cannot be proved, who is to be held accountable when civilians are killed or essential civilian objects are destroyed? Some believe that software developers and their employers could be held responsible in such situations.[15] But a great number of individuals are involved in the coding and training of these systems, often with little knowledge of the parts undertaken by others or, at times, of the overall finished product. It is also imperative to note that, in jurisdictions such as the US, military weapons manufacturers are indemnified against accidents on the battlefield.[16]

These apprehensions are not unfounded. AI technologies have been shown to exhibit bias against disadvantaged communities that are disproportionately represented in the underlying data.[17] AI systems depend on mathematical probabilities and are devoid of political context; a plain reading of the numbers can paint a distorted picture even for human beings. Even the most optimistic AI experts speculate that ‘human-level’ cognition in AI is unlikely to be achieved before 2075.[18] Given the rapid proliferation of AI into weapons, it is far from unlikely that AWS will be deployed without regard to this timeline.

Besides, IHL has no measures to hold anyone accountable for unintended damage caused to civilians during wars.[19] Given the black-box problem, states would find it easy to deny actual intent, and law enforcement would find it equally difficult to establish intent on the part of any human being in cases of unlawful killings of civilians and damage to civilian objects. This calls into question not only the validity of any claims about the protection of civilians, but also whether the proliferation of AI would worsen the situation for civilians by serving as an effective tool for denial.

[1] https://www.cigionline.org/articles/ai-and-the-future-of-deterrence-promises-and-pitfalls/

[2] Ibid.

[3] www.science.org/doi/10.1126/science.349.6245.252.

[4] https://edition.cnn.com/2022/01/17/middleeast/uae-abu-dhabi-explosion-drone-houthi-intl/index.html.

[5] https://taskandpurpose.com/news/drone-shot-down-baghdad-soleimani-revenge/.

[6] https://www.cigionline.org/articles/the-third-drone-age-visions-out-to-2040/.

[7] http://www.historytoday.com/history-matters/origins-drone-warfare.

[8] Supra note 6.

[9] https://thebulletin.org/2019/10/the-dark-side-of-our-drone-future/.

[10] Supra note 6.

[11] https://thebulletin.org/2020/12/we-need-a-new-international-accord-to-control-drone-proliferation/#post-heading.

[12] https://reachingcriticalwill.org/images/documents/Disarmament-fora/ccw/2019/gge/Documents/2019GGE.2-WP5.pdf.

[13] www.theguardian.com/science/2021/jan/26/us-has-moral-imperative-to-develop-ai-weapons-says-panel.

[14] https://www.cigionline.org/articles/autonomous-weapons-the-false-promise-of-civilian-protection/.

[15] https://www.dhi.ac.uk/san/waysofbeing/data/governance-crone-sharkey-2012b.pdf.

[16] https://www.chathamhouse.org/sites/default/files/publications/research/2017-01-26-artificial-intelligence-future-warfare-cummings.pdf.

[17] Gebru, Timnit. 2020. “Race and Gender.” In The Oxford Handbook of Ethics of AI, edited by Markus D. Dubber, 252–69. New York, NY: Oxford University Press.

[18] https://doi.org/10.1007/978-3-319-26485-1_33.

[19] https://www.cigionline.org/articles/ai-and-the-actual-ihl-accountability-gap/.

*******

Co-Author: Arihant Shrivardhan, a 4th year student of Institute of Law, Nirma University
