{"id":615319,"date":"2023-03-07T08:49:20","date_gmt":"2023-03-07T14:49:20","guid":{"rendered":"https:\/\/news.sellorbuyhomefast.com\/index.php\/2023\/03\/07\/lords-committee-to-investigate-use-of-ai-powered-weapons-systems\/"},"modified":"2023-03-07T08:49:20","modified_gmt":"2023-03-07T14:49:20","slug":"lords-committee-to-investigate-use-of-ai-powered-weapons-systems","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2023\/03\/07\/lords-committee-to-investigate-use-of-ai-powered-weapons-systems\/","title":{"rendered":"Lords Committee to investigate use of AI-powered weapons systems"},"content":{"rendered":"<div id=\"content-header\">\n<h2>House of Lords to investigate the use of artificial intelligence in weapons systems, following UK government publication of AI defence strategy in June 2022<\/h2>\n<\/div>\n<div id=\"content-center\">\n<ul>\n<li><i data-icon=\"1\"><\/i><\/li>\n<li><i data-icon=\"2\"><\/i><\/li>\n<\/ul>\n<div id=\"contributors-block\">\n<p><img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/rms\/computerweekly\/Sebastian-Klovig-Skelton-CW-contributor.jpg\" alt=\"Sebastian Klovig Skelton\">\n\t\t\t\t\t<\/p>\n<p><span>By<\/span><\/p>\n<ul>\n<li>\n\t\t\t\t\t<a href=\"https:\/\/www.techtarget.com\/contributor\/Sebastian-Klovig-Skelton\">Sebastian Klovig Skelton,<\/a><br \/>\n\t\t\t\t\t\t<span>Senior reporter<\/span>\n\t\t\t\t\t\t<\/li>\n<\/ul>\n<p>\n\tPublished: <span>06 Mar 2023 14:30<\/span>\n<\/p>\n<\/div>\n<section id=\"content-body\">\n<p>The House of Lords Artificial Intelligence (AI) in Weapon Systems Committee has published a call for evidence as part of its <a href=\"https:\/\/committees.parliament.uk\/work\/7270\/\">inquiry<\/a> into the use of autonomous weapons.<\/p>\n<p><a href=\"https:\/\/www.computerweekly.com\/news\/252508928\/Minister-refuses-to-rule-out-fully-autonomous-lethal-weapons\">Autonomous weapons systems<\/a> (AWS), also known as lethal autonomous weapons systems (LAWS), are weapons systems 
which can select, detect and engage targets with little or no human intervention.<\/p>\n<p>Established 31 January 2023, the committee will explore the ethics of developing and deploying autonomous weapons, including how they can be used safely and reliably, their potential for conflict escalation, and their compliance with international laws.<\/p>\n<p>The committee will also specifically look at the technical, legal and ethical safeguards that are necessary to control the use of AWS, as well as the sufficiency of current UK policy and the state of international policymaking in this area generally.<\/p>\n<p>\u201cArtificial intelligence features in many areas of life, including armed conflict. One of the most controversial uses of AI in defence is the creation of autonomous weapon systems that can select and engage a target without the direct control or supervision of a human operator,\u201d said committee chair Lord Lisvane.<\/p>\n<p>\u201cWe plan to examine the concerns that have arisen about the ethics of these systems, what are the practicalities of their use, whether they risk escalating wars more quickly, and their compliance with international humanitarian law.<\/p>\n<p>\u201cOur work relies on the input of a wide range of individuals and is most effective when it is informed by as diverse a range of perspectives and experiences as possible. We are inviting all those with views on this pressing and critical issue, including both experts and non-experts, to respond to our call for evidence by 10 April 2023.\u201d<\/p>\n<p>Following evidence submissions, the committee will begin interviewing witnesses in public session between March and July, with the aim of concluding its overall investigation by November 2023. 
A UK government response is expected shortly afterwards, in January 2024.<\/p>\n<section data-menu-title=\"UK government approach\">\n<h3><i data-icon=\"1\"><\/i>UK government approach<\/h3>\n<p>In June 2022, the Ministry of Defence (MoD) unveiled its <em><a href=\"https:\/\/www.computerweekly.com\/news\/252521624\/MoD-sets-out-strategy-to-develop-military-AI-with-private-sector\">Defence artificial intelligence strategy<\/a><\/em> outlining how the UK will work closely with the private sector to prioritise research, development and experimentation in AI to \u201crevolutionise our Armed Forces capabilities\u201d.<\/p>\n<p>Regarding the use of LAWS, the strategy claimed the UK was \u201cdeeply committed to multilateralism\u201d and would therefore continue to engage with the United Nations (UN) Convention on Certain Conventional Weapons (CCW).<\/p>\n<p>Although details on its approach to autonomous weapons were light in the <a href=\"https:\/\/www.gov.uk\/government\/publications\/defence-artificial-intelligence-strategy\/defence-artificial-intelligence-strategy\">72-page strategy document<\/a>, an <a href=\"https:\/\/assets.publishing.service.gov.uk\/government\/uploads\/system\/uploads\/attachment_data\/file\/1082991\/20220614-Ambitious_Safe_and_Responsible.pdf\">annex on LAWS in an accompanying policy paper<\/a> said systems that can identify, select and attack targets without \u201ccontext-appropriate human involvement\u201d would be unacceptable.<\/p>\n<p>\u201cSharing the concerns of governments and AI experts around the world, we\u2026oppose the creation and use of systems that would operate without meaningful and context-appropriate human involvement throughout their lifecycle,\u201d it said.<\/p>\n<p>\u201cThe use of such weapons could not satisfy fundamental principles of International Humanitarian Law, nor our own values and standards as expressed in our AI Ethical Principles. 
Human responsibility and accountability cannot be removed \u2013 irrespective of the level of AI or autonomy in a system.\u201d<\/p>\n<p>It added that the UK government would continue working with international allies and partners to address the \u201copportunities and risks\u201d around LAWS.<\/p>\n<p>The <a href=\"https:\/\/article36.org\/updates\/new-uk-government-position-on-autonomous-weapons-recognises-that-lines-need-to-be-drawn-but-lacks-detail-or-signs-of-real-leadership\/\">UK Stop Killer Robots campaign<\/a> said at the time that while it was significant that the government recognised that lines needed to be drawn around the use of LAWS, it had given no indication of how \u201ccontext appropriate human involvement\u201d is to be assessed or understood.<\/p>\n<p>\u201cIn this new formulation, \u2018context appropriate human involvement\u2019 could mean almost anything and the lack of detail about where the UK draws the line amounts to the UK public being told by the military \u2018leave it to us\u2019 to determine what is appropriate,\u201d it said.<\/p>\n<p>\u201cThis is emblematic of our previous concerns that this policy position, as well as the wider <em>Defence AI strategy<\/em>, was formulated without public consultation or any attempt by ministers to have a national conversation on this issue.\u201d<\/p>\n<\/section>\n<section data-menu-title=\"Lack of international consensus\">\n<h3><i data-icon=\"1\"><\/i>Lack of international consensus<\/h3>\n<p>Commenting on the current state of international cooperation around LAWS in July 2022, however, the <a href=\"https:\/\/www.swp-berlin.org\/en\/publication\/autonomous-weapons-systems-un-expert-talks-facing-failuretime-to-consider-alternative-formats\">German Institute for International and Security Affairs<\/a> said that expert talks at the UN level were facing failure.<\/p>\n<p>\u201cThe Group of Governmental Experts [GGE] has been discussing\u2026 AWS in the UN arms control context since 2017,\u201d it said. 
\u201cRegulation of AWS is an increasingly remote prospect, and some representatives even admit privately that the talks may have failed.\u201d<\/p>\n<p>It added that because the GGE requires unanimity to make decisions, the lack of Russian involvement in talks since the February 2022 invasion of Ukraine means alternative forums will need to be found for the international debate around LAWS.<\/p>\n<p>However, it noted that even before Russia\u2019s actions in Ukraine, \u201cit was clear that differences of substance\u2026precluded rapid agreement\u201d, including disagreements over the exact definitions and terminology.<\/p>\n<p>\u201cAnother fault line is the arms race between the US, Russia and China, which is especially pronounced in the sphere of new technologies,\u201d it added.<\/p>\n<p>In a <a href=\"https:\/\/sgp.fas.org\/crs\/natsec\/R46458.pdf\">report<\/a> on \u201cemerging military technologies\u201d published in November 2022 by the Congressional Research Service, analysts noted that roughly 30 countries and 165 nongovernmental organisations (NGOs) have called for a pre-emptive ban on the use of LAWS due to the ethical concerns surrounding their use, including the potential lack of accountability and inability to comply with international laws around conflict.<\/p>\n<p>A <a href=\"https:\/\/onlinelibrary.wiley.com\/doi\/full\/10.1111\/1758-5899.12713\">September 2019 study<\/a> by Justin Haner and Denise Garcia said that oversight of autonomous weapons capabilities is \u201ccritically important\u201d given the technology is likely \u201cto proliferate rapidly, enhance terrorist tactics, empower authoritarian rulers, undermine democratic peace, and is vulnerable to bias, hacking, and malfunction\u201d.<\/p>\n<p>They also found the main players pushing the technology globally are the US, China, Russia, South Korea, and the European Union.<\/p>\n<p>\u201cThe US is the outright leader in autonomous hardware development and investment capacity. 
By 2010, the US had already invested $4bn into researching AWS with a further $18bn earmarked for autonomy development through 2020,\u201d they said.<\/p>\n<p>Despite the lack of explicit international rules and safeguards around LAWS, there are reports that autonomous weapons have already been deployed in combat situations.<\/p>\n<p>A <a href=\"https:\/\/documents-dds-ny.un.org\/doc\/UNDOC\/GEN\/N21\/037\/72\/PDF\/N2103772.pdf?OpenElement\">UN Security Council report<\/a> published March 2021, for example, describes an engagement between the Government of National Accord Affiliated Forces (GNA-AF, which are backed by the UN) and the Hafter Affiliated Forces (HAF) in Tripoli, Libya. \u00a0<\/p>\n<p>\u201cLogistics convoys and retreating HAF were subsequently hunted down and remotely engaged by the unmanned combat aerial vehicles or the lethal autonomous weapons systems such as the <a href=\"https:\/\/lieber.westpoint.edu\/kargu-2-autonomous-attack-drone-legal-ethical\/\">STM Kargu-2<\/a> and other loitering munitions,\u201d it said.<\/p>\n<p>\u201cThe lethal autonomous weapons systems were programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true \u2018fire, forget and find\u2019 capability.\u201d<\/p>\n<p>It added that HAF units \u201cwere neither trained nor motivated to defend against the effective use of this new technology and usually retreated in disarray. 
Once in retreat, they were subject to continual harassment from the unmanned combat aerial vehicles and lethal autonomous weapons systems.\u201d<\/p>\n<p>However, it is unclear from the wording what level of autonomy these weapons systems have, and it does not explicitly mention whether people were killed as a result of their use.<\/p>\n<p>The <a href=\"https:\/\/www.newscientist.com\/article\/2282656-israel-used-worlds-first-ai-guided-combat-drone-swarm-in-gaza-attacks\/\">New Scientist reported<\/a> in June 2021 that the Israeli Defence Force (IDF) had also used a swarm of AI-powered drones to locate, identify and attack targets in Gaza.<\/p>\n<\/section>\n<\/section>\n<section id=\"DigDeeperSplash\">\n<h4>\n\t\t\t<i data-icon=\"m\"><\/i>Read more on Artificial intelligence, automation and robotics<\/h4>\n<ul>\n<li><a id=\"DigDeeperItem-1\" href=\"https:\/\/www.computerweekly.com\/news\/252521624\/MoD-sets-out-strategy-to-develop-military-AI-with-private-sector\"><br \/>\n\t\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/visuals\/ComputerWeekly\/Hero Images\/Military-digital-defence-adobe_searchsitetablet_520X173.jpg\" srcset=\"https:\/\/cdn.ttgtmedia.com\/visuals\/ComputerWeekly\/Hero%20Images\/Military-digital-defence-adobe_searchsitetablet_520X173.jpg 960w,https:\/\/cdn.ttgtmedia.com\/visuals\/ComputerWeekly\/Hero%20Images\/Military-digital-defence-adobe.jpg 1280w\" alt ><\/p>\n<h5>MoD sets out strategy to develop military AI with private sector<\/h5>\n<div>\n<p><img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/rms\/computerweekly\/Sebastian-Klovig-Skelton-CW-contributor.jpg\" alt=\"SebastianKlovig Skelton\">\n\t\t\t\t\t\t\t\t\t<\/p>\n<p><span>By: <span>Sebastian\u00a0Klovig Skelton<\/span><\/span>\n\t\t\t\t\t\t\t<\/p>\n<\/div>\n<p>\t\t\t\t<\/a><\/li>\n<li><a id=\"DigDeeperItem-2\" href=\"https:\/\/www.computerweekly.com\/news\/252510176\/Over-100-civil-society-groups-call-for-changes-to-EU-AI-Act\"><br \/>\n\t\t\t\t\t<img 
decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/visuals\/German\/article\/artificial-intelligence-brain-2-adobe_searchsitetablet_520X173.jpg\" srcset=\"https:\/\/cdn.ttgtmedia.com\/visuals\/German\/article\/artificial-intelligence-brain-2-adobe_searchsitetablet_520X173.jpg 960w,https:\/\/cdn.ttgtmedia.com\/visuals\/German\/article\/artificial-intelligence-brain-2-adobe.jpg 1280w\" alt ><\/p>\n<h5>Over 100 civil society groups call for changes to EU AI Act<\/h5>\n<div>\n<p><img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/rms\/computerweekly\/Sebastian-Klovig-Skelton-CW-contributor.jpg\" alt=\"SebastianKlovig Skelton\">\n\t\t\t\t\t\t\t\t\t<\/p>\n<p><span>By: <span>Sebastian\u00a0Klovig Skelton<\/span><\/span>\n\t\t\t\t\t\t\t<\/p>\n<\/div>\n<p>\t\t\t\t<\/a><\/li>\n<li><a id=\"DigDeeperItem-3\" href=\"https:\/\/www.computerweekly.com\/news\/252510056\/British-Army-announces-transformation-programme\"><br \/>\n\t\t\t\t\t<img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/visuals\/ComputerWeekly\/Hero Images\/UK-british-army-union-flag-adobe_searchsitetablet_520X173.jpg\" srcset=\"https:\/\/cdn.ttgtmedia.com\/visuals\/ComputerWeekly\/Hero%20Images\/UK-british-army-union-flag-adobe_searchsitetablet_520X173.jpg 960w,https:\/\/cdn.ttgtmedia.com\/visuals\/ComputerWeekly\/Hero%20Images\/UK-british-army-union-flag-adobe.jpg 1280w\" alt ><\/p>\n<h5>British Army announces transformation programme<\/h5>\n<div>\n<p><img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/rms\/computerweekly\/Angelica-Mari-CW-contributor.jpg\" alt=\"AngelicaMari\">\n\t\t\t\t\t\t\t\t\t<\/p>\n<p><span>By: <span>Angelica\u00a0Mari<\/span><\/span>\n\t\t\t\t\t\t\t<\/p>\n<\/div>\n<p>\t\t\t\t<\/a><\/li>\n<li><a id=\"DigDeeperItem-4\" href=\"https:\/\/www.computerweekly.com\/news\/252508928\/Minister-refuses-to-rule-out-fully-autonomous-lethal-weapons\"><br \/>\n\t\t\t\t\t<img decoding=\"async\" 
src=\"https:\/\/cdn.ttgtmedia.com\/rms\/onlineimages\/ai_g1182183209_searchsitetablet_520X173.jpg\" srcset=\"https:\/\/cdn.ttgtmedia.com\/rms\/onlineimages\/ai_g1182183209_searchsitetablet_520X173.jpg 960w,https:\/\/cdn.ttgtmedia.com\/rms\/onlineimages\/ai_g1182183209.jpg 1280w\" alt ><\/p>\n<h5>Minister refuses to rule out fully autonomous lethal weapons<\/h5>\n<div>\n<p><img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/rms\/computerweekly\/Angelica-Mari-CW-contributor.jpg\" alt=\"AngelicaMari\">\n\t\t\t\t\t\t\t\t\t<\/p>\n<p><span>By: <span>Angelica\u00a0Mari<\/span><\/span>\n\t\t\t\t\t\t\t<\/p>\n<\/div>\n<p>\t\t\t\t<\/a><\/li>\n<\/ul>\n<\/section>\n<\/div>\n<p><a href=\"https:\/\/www.computerweekly.com\/news\/365532024\/Lords-Committee-to-investigate-use-of-AI-powered-weapons-systems\" class=\"button purchase\" rel=\"nofollow noopener\" target=\"_blank\">Read More<\/a><br \/>\n Luz Damron<\/p>\n","protected":false},"excerpt":{"rendered":"<p>House of Lords to investigate the use of artificial intelligence in weapons systems, following UK government publication of AI defence strategy in June 2022 By Sebastian Klovig Skelton, Senior reporter Published: 06 Mar 2023 14:30 The House of Lords Artificial Intelligence (AI) in Weapon Systems Committee has published a call for evidence as part 
of<\/p>\n","protected":false},"author":1,"featured_media":615320,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[24264,32078,46],"tags":[],"class_list":{"0":"post-615319","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-committee","8":"category-lords","9":"category-technology"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/615319","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/comments?post=615319"}],"version-history":[{"count":0,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/615319\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media\/615320"}],"wp:attachment":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media?parent=615319"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/categories?post=615319"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/tags?post=615319"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}