LARS Report debate at UN

Watch the debate on the introduction of the report on Lethal Autonomous Robots at the UN at the following link:

http://webtv.un.org/watch/clustered-id-on-executions-and-idps-9th-meeting-23rd-regular-session-of-human-rights-council/2419860355001/

UN Report on Lethal Autonomous Robots (LARS)

United Nations A/HRC/23/47
General Assembly

Distr.: General
9 April 2013
Original: English

Human Rights Council
Twenty-third session
Agenda item 3
Promotion and protection of all human rights, civil,
political, economic, social and cultural

Report of the Special Rapporteur on extrajudicial,
summary or arbitrary executions, Christof Heyns
Summary
Lethal autonomous robotics (LARs) are weapon systems that, once activated, can select and engage targets without further human intervention. They raise far-reaching concerns about the protection of life during war and peace. This includes the question of the extent to which they can be programmed to comply with the requirements of international humanitarian law and the standards protecting life under international human rights law. Beyond this, their deployment may be unacceptable because no adequate system of legal accountability can be devised, and because robots should not have the power of life and death over human beings. The Special Rapporteur recommends that States establish national moratoria on aspects of LARs, and calls for the establishment of a high-level panel on LARs to articulate a policy for the international community on the issue.
Contents
Paragraphs Page
I. Introduction ………………………………………………………………………………………………. 1 3
II. Activities of the Special Rapporteur …………………………………………………………….. 2–25 3
A. Communications ……………………………………………………………………………….. 2–3 3
B. Visits ………………………………………………………………………………………………… 4–6 3
C. Press releases …………………………………………………………………………………….. 7–15 3
D. International and national meetings ………………………………………………………. 16–24 4
E. Intended future areas of research ………………………………………………………….. 25 5
III. Lethal autonomous robotics and the protection of life ……………………………………. 26–108 5
A. The emergence of LARs ……………………………………………………………………… 37–56 7
B. LARs and the decision to go to war or otherwise use force ………………………. 57–62 11
C. The use of LARs during armed conflict ………………………………………………… 63–74 12
D. Legal responsibility for LARs ……………………………………………………………… 75–81 14
E. The use of LARs by States outside armed conflict ………………………………….. 82–85 16
F. Implications for States without LARs ……………………………………………………. 86–88 16
G. Taking human decision-making out of the loop ……………………………………… 89–97 16
H. Other concerns …………………………………………………………………………………… 98–99 18
I. LARs and restrictive regimes on weapons ……………………………………………… 100–108 19
IV. Conclusions ………………………………………………………………………………………………. 109–112 20
V. Recommendations ……………………………………………………………………………………… 113–126 21
A. To the United Nations …………………………………………………………………………. 113–115 21
B. To regional and other inter-governmental organizations …………………………. 116–117 22
C. To States ………………………………………………………………………………………….. 118–121 22
D. To developers of robotic systems …………………………………………………………. 122 22
E. To NGOs, civil society and human rights groups and the ICRC ………………. 123–126 22
I. Introduction
1. The annual report of the Special Rapporteur on extrajudicial, summary or arbitrary executions, submitted to the Human Rights Council pursuant to its resolution 17/5, focuses on lethal autonomous robotics and the protection of life.1

II. Activities of the Special Rapporteur
A. Communications
2. The present report covers communications sent by the Special Rapporteur between 16 March 2012 and 28 February 2013, and replies received between 1 May 2012 and 30 April 2013. The communications and responses from Governments are included in the following communications reports of special procedures: A/HRC/21/49; A/HRC/22/67 and A/HRC/23/51.
3. Observations on the communications sent and received during the reporting period are reflected in an addendum to the present report (A/HRC/23/47/Add.5).
B. Visits
4. The Special Rapporteur visited Turkey from 26 to 30 November 2012 and will visit Mexico from 22 April to 2 May 2013.
5. The Government of Mali has accepted the Special Rapporteur's visit requests and the Syrian Arab Republic views his proposal to visit the country positively. The Special Rapporteur thanks these Governments and encourages the Governments of Sri Lanka, the Republic of Madagascar and Pakistan to accept his pending requests for a visit.
6. Follow-up reports on missions undertaken by the previous mandate holder to Ecuador and Albania are contained in documents A/HRC/23/47/Add.3 and A/HRC/23/47/Add.4 respectively.
C. Press releases2
7. On 15 June 2012, the Special Rapporteur issued a joint statement with the Special Rapporteur on torture deploring the escalation of violence in the Syrian Arab Republic and called on all parties to renounce violence and lay down arms.
8. The Special Rapporteur issued several press releases with other mandate holders, amongst others concerning aspects related to the right to life of human rights defenders in Honduras on 4 April 2012 and 1 October 2012; the Philippines, on 9 July 2012; and on 21 June 2012, he issued a press release to urge world governments, the international community, journalists and media organizations to act decisively on the protection of the right to life of journalists and media freedom.
1 The assistance of Tess Borden, Thompson Chengeta, Jiou Park and Jeff Dahlberg in writing this report is acknowledged with gratitude. The European University Institute is also thanked for hosting an expert consultation in February 2013, as well as the Global Justice Clinic, the Centre for Human Rights and Global Justice, and Professor Sarah Knuckey of New York University School of Law for preparing background materials and hosting an expert consultation in October 2012.
2 Press releases of the Special Rapporteur are available from http://www.ohchr.org/en/NewsEvents/Pages/NewsSearch.aspx?MID=SR_Summ_Executions.
9. On 12 October 2012, a statement was sent jointly with other special rapporteurs concerning violence in Guatemala. The same day, the Special Rapporteur issued a joint statement regarding violence against a schoolchild in Pakistan.
10. On 22 October 2012, an open letter by special procedures mandate holders of the Human Rights Council was issued expressing concern at the planned adoption by the Congress of Colombia of a project to reform certain articles of the Political Constitution of Colombia, with regard to military criminal law.
11. On 15 November 2012, the Special Rapporteur, jointly with other mandate holders, called for an investigation into a death in custody in the Islamic Republic of Iran.
12. A joint statement was issued by all special procedures mandate holders on 23 November 2012 to express their dismay at the effect that the escalation of violence had on civilians in the Occupied Palestinian Territory and Israel.
13. On 28 February 2013, the Special Rapporteur together with other mandate holders called for an international inquiry into human rights violations in North Korea.
14. A number of press releases were issued specifically on death penalty cases concerning the following States: the United States of America, on 17 July 2012; Iraq, on 27 July 2012 and 30 August 2012; and the Gambia, on 28 August 2012.
15. Additional joint statements with other mandate holders on the death penalty were issued by the Special Rapporteur:
(a) The Islamic Republic of Iran: on 28 June 2012, concerning the execution of four individuals; on 12 October 2012, calling for a halt to executions; on 23 October 2012, regarding the execution of 10 individuals on drug-related crimes; and on 25 January 2013, urging the Iranian authorities to halt the execution of five Ahwazi activists;
(b) Saudi Arabia: on 11 January 2013, condemning the beheading of a domestic worker;
(c) Bangladesh: on 7 February 2013, expressing concern at a death sentence passed by the International Crimes Tribunal which failed to observe all the guarantees of a fair trial and due process.
D. International and national meetings
16. From 14 to 15 September 2012, the Special Rapporteur delivered a paper at the Pan-African Conference on the Safety of Journalists and the Issue of Impunity, held in Addis Ababa, Ethiopia.
17. On the occasion of the 52nd Ordinary Session of the African Commission on Human and Peoples' Rights on 9 October 2012, the Special Rapporteur delivered a statement on the cooperation between the United Nations and African Union special procedures mechanisms.
18. During the sixty-seventh session of the General Assembly, the Special Rapporteur was a panellist in the side-event on the theme “The Death Penalty and Human Rights”, organized by the Special Procedures Branch of the Office of the High Commissioner for Human Rights (OHCHR) in cooperation with the World Organisation Against Torture, Penal Reform International, the Center for Constitutional Rights and Human Rights Watch in New York on 24 October 2012.
19. On 25 October 2012, the Special Rapporteur participated in the weekly briefing entitled “Issue of the Moment: The Death Penalty” for the community of non-governmental organizations associated with the Department of Public Information in New York.
20. On 15 November 2012, the Special Rapporteur presented a lecture on “The Right to Life during Demonstrations” at a seminar organized by the South African Institute for Advanced Constitutional, Public, Human Rights and International Law at the Constitutional Court of South Africa in Johannesburg. On 22 and 23 November 2012, the Special Rapporteur was a panellist during the 2nd UN Inter-Agency meeting on the safety of journalists and the issue of impunity in Vienna, Austria.
21. The Special Rapporteur took part in an Expert Meeting in Geneva entitled “How Countries Abolished the Death Penalty”, organized by the International Commission against the Death Penalty on 5 February 2013, and delivered a presentation on the resumption of the death penalty.
22. On 22 February 2013, the Special Rapporteur participated in a High Level Policy Seminar organized by the European University Institute and the Global Governance Programme on "Targeted Killing, Unmanned Aerial Vehicles and EU Policy", held at the European University Institute in Florence, where he spoke on "Targeting by Drones: Protecting the Right to Life".
23. On 19 March 2013, the Special Rapporteur presented a keynote address at a conference on “The Ethical, Strategic and Legal Implications of Drone Warfare”, organized by the Kroc Institute at the University of Notre Dame in Indiana, United States of America.
24. On 21 March 2013, the Special Rapporteur took part in the Pugwash Workshop at the University of Birmingham, United Kingdom, where he spoke on lethal autonomous robotics.
E. Intended future areas of research
25. The Special Rapporteur will present a report on unmanned combat aerial vehicles (UCAVs) to the General Assembly in 2013.
III. Lethal autonomous robotics and the protection of life
26. For societies with access to it, modern technology allows increasing distance to be put between weapons users and the lethal force they project. For example, UCAVs, commonly known as drones, enable those who control lethal force not to be physically present when it is deployed, but rather to activate it while sitting behind computers in faraway places, and stay out of the line of fire.
27. Lethal autonomous robotics (LARs), if added to the arsenals of States, would add a new dimension to this distancing, in that targeting decisions could be taken by the robots themselves. In addition to being physically removed from the kinetic action, humans would also become more detached from decisions to kill – and their execution.
28. The robotics revolution has been described as the next major revolution in military affairs, on par with the introduction of gunpowder and nuclear bombs.3 But in an important respect LARs are different from these earlier revolutions: their deployment would entail not merely an upgrade of the kinds of weapons used, but also a change in the identity of those who use them. With the contemplation of LARs, the distinction between weapons and warriors risks becoming blurred, as the former would take autonomous decisions about their own use.
3 Peter Singer, Wired for War (Penguin Group (USA) Incorporated, 2009), p. 179 and further, notably p. 203.
29. Official statements from Governments with the ability to produce LARs indicate that their use during armed conflict or elsewhere is not currently envisioned.4 While this may be so, it should be recalled that aeroplanes and drones were first used in armed conflict for surveillance purposes only, and offensive use was ruled out because of the anticipated adverse consequences.5 Subsequent experience shows that when technology that provides a perceived advantage over an adversary is available, initial intentions are often cast aside. Likewise, military technology is easily transferred into the civilian sphere. If the international legal framework is to be reinforced against the pressures of the future, this must be done while it is still possible.
30. One of the most difficult issues that the legal, moral and religious codes of the world have grappled with is the killing of one human being by another. The prospect of a future in which fully autonomous robots could exercise the power of life and death over human beings raises a host of additional concerns. As will be argued in what follows, the introduction of such powerful yet controversial new weapons systems has the potential to pose new threats to the right to life. It could also create serious international division and weaken the role and rule of international law – and in the process undermine the international security system.6 The advent of LARs requires all involved – States, international organizations, and international and national civil societies – to consider the full implications of embarking on this road.
31. Some argue that robots could never meet the requirements of international humanitarian law (IHL) or international human rights law (IHRL), and that, even if they could, as a matter of principle robots should not be granted the power to decide who should live and die. These critics call for a blanket ban on their development, production and use.7 To others, such technological advances – if kept within proper bounds – represent legitimate military advances, which could in some respects even help to make armed conflict more humane and save lives on all sides.8 According to this argument, to reject this technology altogether could amount to not properly protecting life.
32. However, there is wide acceptance that caution and some form of control of States' use of this technology are needed, over and above the general standards already imposed by international law. Commentators agree that an international discussion is needed to consider the appropriate approach to LARs.
4 US Department of Defense, Unmanned Systems Integrated Road Map FY2011-2036, p. 50, available from http://publicintelligence.net/dod-unmanned-systems-integrated-roadmap-fy2011-2036
5 See http://www.usaww1.com/World_War_1_Fighter_Planes.php4
6 Nils Melzer, "Human rights implications of the usage of drones and unmanned robots in warfare", Study for the European Parliament's Subcommittee on Human Rights (2013), available from http://www.europarl.europa.eu/committees/en/studies/html, p. 5 (forthcoming).
7 Human Rights Watch, Losing Humanity: The Case Against Killer Robots (2012), p. 2, available from http://www.hrw.org/reports/2012/11/19/losing-humanity-0. See in response Michael Schmitt, "Autonomous Weapons Systems and International Humanitarian Law: A Reply to the Critics", Harvard National Security Journal (forthcoming 2013), available from http://harvardnsj.org/wp-content/uploads/2013/02/Schmitt-Autonomous-Weapon-Systems-and-IHL-Final.pdf. The International Committee on Robot Arms Control (ICRAC) was formed to promote such a ban. See http://icrac.net
8 Ronald Arkin, Governing Lethal Behaviour in Autonomous Robots (CRC Press, 2009); Kenneth Anderson and Matthew Waxman, “Law and ethics for robot soldiers”, Policy Review, No. 176 (2012), available from http://www.hoover.org/publications/policy-review/article/135336.
33. As with any technology that revolutionizes the use of lethal force, little may be known about the potential risks of the technology before it is developed, which makes formulating an appropriate response difficult; but afterwards the availability of its systems and the power of vested interests may preclude efforts at appropriate control.9 This is further complicated by the arms race that could ensue when only certain actors have weapons technology. The current moment may be the best we will have to address these concerns. In contrast to other revolutions in military affairs, where serious reflection mostly began after the emergence of new methods of warfare, there is now an opportunity collectively to pause, and to engage with the risks posed by LARs in a proactive way. This report is a call for pause, to allow serious and meaningful international engagement with this issue.
34. One of the reasons for the urgency of this examination is that current assessments of the future role of LARs will affect the level of investment of financial, human and other resources in the development of this technology over the next several years. Current assessments – or the lack thereof – thus risk to some extent becoming self-fulfilling prophecies.
35. The previous Special Rapporteur examined LARs in a report in 2010,10 calling inter alia for the convening of an expert group to consider robotic technology and compliance with international human rights and humanitarian law.11 The present report repeats and strengthens that proposal and calls on States to impose national moratoria on certain activities related to LARs.
36. As with UCAVs and targeted killing, LARs raise concerns for the protection of life under the framework of IHRL as well as IHL. The Special Rapporteur recalls the supremacy and non-derogability of the right to life under both treaty and customary international law.12 Arbitrary deprivation of life is unlawful in peacetime and in armed conflict.
A. The emergence of LARs
1. Definitions
37. While definitions of the key terms may differ, the following exposition provides a starting point.13
38. According to a widely used definition (endorsed inter alia by both the United States Department of Defense and Human Rights Watch14), the term LARs refers to robotic weapon systems that, once activated, can select and engage targets without further intervention by a human operator. The important element is that the robot has an autonomous "choice" regarding selection of a target and the use of lethal force.
9 David Collingridge, The Social Control of Technology (Frances Pinter, 1980).
10 A/65/321.
11 A/65/321, pp. 10-22.
12 International Covenant on Civil and Political Rights, art. 6, enshrining the right to life, and art. 4 (2) on its non-derogability.
13 Arkin (see note 8 above), p. 7; Noel Sharkey, Automating Warfare: lessons learned from the drones, p. 2, available from http://www.alfredoroma.it/wp-content/uploads/2012/05/Automated-warfare-Noel-Sharkey.pdf; Patrick Lin et al, Autonomous Military Robotics: Risk, Ethics, and Design (San Luis Obispo, California Polytechnic State University, 2008) p. 4, available from http://ethics.calpoly.edu/ONR_report.pdf
14 US Department of Defense Directive, "Autonomy in Weapons Systems", Number 3000.09 of 21 November 2012, Glossary Part II. See also United Kingdom Ministry of Defence, "The UK Approach to Unmanned Aircraft Systems", paras. 202-203, available from https://www.gov.uk/government/publications/jdn-2-11-the-uk-approach-to-unmanned-aircraft-systems; see also Human Rights Watch (see note 7 above), p. 2.
39. Robots are often described as machines that are built upon the sense-think-act paradigm: they have sensors that give them a degree of situational awareness; processors or artificial intelligence that “decides” how to respond to a given stimulus; and effectors that carry out those “decisions”.15 The measure of autonomy that processors give to robots should be seen as a continuum with significant human involvement on one side, as with UCAVs where there is “a human in the loop”, and full autonomy on the other, as with LARs where human beings are “out of the loop”.
40. Under the currently envisaged scenario, humans will at least remain part of what may be called the “wider loop”: they will programme the ultimate goals into the robotic systems and decide to activate and, if necessary, deactivate them, while autonomous weapons will translate those goals into tasks and execute them without requiring further human intervention.
41. Supervised autonomy means that there is a "human on the loop" (as opposed to "in" or "out"), who monitors and can override the robot's decisions. However, the power to override may in reality be limited because the decision-making processes of robots are often measured in nanoseconds and the informational basis of those decisions may not be practically accessible to the supervisor. In such circumstances humans are de facto out of the loop and the machines thus effectively constitute LARs.
42. “Autonomous” needs to be distinguished from “automatic” or “automated.” Automatic systems, such as household appliances, operate within a structured and predictable environment. Autonomous systems can function in an open environment, under unstructured and dynamic circumstances. As such their actions (like those of humans) may ultimately be unpredictable, especially in situations as chaotic as armed conflict, and even more so when they interact with other autonomous systems.
43. The terms "autonomy" or "autonomous", as used in the context of robots, can be misleading. They do not mean anything akin to "free will" or "moral agency" as used to describe human decision-making. Moreover, while the relevant technology is developing at an exponential rate, and full autonomy is bound to mean less human involvement in 10 years' time compared to today, sentient robots, or strong artificial intelligence are not currently in the picture.16
2. Current technology
44. Technology may in some respects be less advanced than is suggested by popular culture, which often assigns human-like attributes to robots and could lure the international community into misplaced trust in its abilities. However, it should also be recalled that in certain respects technology far exceeds human ability. Technology is developing exponentially, and it is impossible to predict the future confidently. As a result, it is almost impossible to determine how close we are to fully autonomous robots that are ready for use.
45. While much of their development is shrouded in secrecy, robots with full lethal autonomy have not yet been deployed. However, robotic systems with various degrees of autonomy and lethality are currently in use, including the following:
• The US Phalanx system for Aegis-class cruisers automatically detects, tracks and engages anti-air warfare threats such as anti-ship missiles and aircraft.17
• The US Counter Rocket, Artillery and Mortar (C-RAM) system can automatically destroy incoming artillery, rockets and mortar rounds.18
• Israel's Harpy is a "Fire-and-Forget" autonomous weapon system designed to detect, attack and destroy radar emitters.19
• The United Kingdom Taranis jet-propelled combat drone prototype can autonomously search, identify and locate enemies but can only engage with a target when authorized by mission command. It can also defend itself against enemy aircraft.20
• The Northrop Grumman X-47B is a fighter-size drone prototype commissioned by the US Navy to demonstrate autonomous launch and landing capability on aircraft carriers and navigate autonomously.21
• The Samsung Techwin surveillance and security guard robots, deployed in the demilitarized zone between North and South Korea, detect targets through infrared sensors. They are currently operated by humans but have an "automatic mode".22
15 Singer (see note 3 above), p. 67.
16 The same applies to "the Singularity", Singer (see note 3 above), p. 101.
17 See http://usmilitary.about.com/library/milinfo/navyfacts/blphalanx.htm
46. Military documents of a number of States describe air, ground and marine robotic weapons development programmes at various stages of autonomy. Large amounts of money are allocated for their development.23
47. It seems clear that if introduced, LARs will not, at least initially, entirely replace human soldiers, but that they will have discretely assigned tasks suitable to their specific capabilities. Their most likely use during armed conflict is in some form of collaboration with humans,24 although they would still be autonomous in their own functions. The question should therefore be asked to what extent the existing legal framework is sufficient to regulate this scenario, as well as the scenario whereby LARs are deployed without any human counterpart. Based on current experiences of UCAVs, there is reason to believe that States will inter alia seek to use LARs for targeted killing.
48. The nature of robotic development generally makes it a difficult subject of regulation, especially in the area of weapons control. Bright lines are difficult to find. Robotic development is incremental in nature. Furthermore, there is significant continuity between military and non-military technologies.25 The same robotic platforms can have civilian as well as military applications, and can be deployed for non-lethal purposes (e.g. to defuse improvised explosive devices) or be equipped with lethal capability (i.e. LARs). Moreover, LARs typically have a composite nature and are combinations of underlying technologies with multiple purposes.
49. The importance of the free pursuit of scientific study is a powerful disincentive to regulate research and development in this area. Yet "technology creep" in this area may over time and almost unnoticeably result in a situation which presents grave dangers to core human values and to the international security system. It is thus essential for the international community to take stock of the current state of affairs, and to establish a responsible process to address the situation and where necessary regulate the technology as it develops.
18 See http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA557876
19 See http://www.israeli-weapons.com/weapons/aircraft/uav/harpy/harpy.html
20 See http://www.baesystems.com/product/BAES_020273/taranis
21 See http://www.as.northropgrumman.com/products/nucasx47b/assets/X-47B_Navy_UCAS_FactSheet.pdf
22 See http://singularityhub.com/2010/07/25/armed-robots-deployed-by-south-korea-in-demilitarized-zone-on-trial-basis
23 United States Air Force, "UAS Flight Plan 2009-2047" (Washington, D.C., 2009) p. 41, available from http://www.scribd.com/doc/17312080/United-States-Air-Force-Unmanned-Aircraft-Systems-Flight-Plan-20092047-Unclassified
24 Ronald Arkin, "Governing Lethal Behaviour: Embedding Ethics in a Hybrid Deliberative/Reactive Robot Architecture", Technical Report GIT-GVU-07-11, p. 5, available from http://www.cc.gatech.edu/ai/robot-lab/online-publications/formalizationv35.pdf
25 Anderson and Waxman (see note 8 above), pp. 2 and 13 and Singer (see note 3 above), p. 379.
3. Drivers of and impediments to the development of LARs
50. Some of the reasons to expect continuous pressures to develop LARs, as well as the impediments to this momentum, also apply to the development of other unmanned systems more generally. They offer huge military and other advantages to those using them and are part of the broader automatization of warfare and of the world in general.
51. Unmanned systems offer higher force projection (preserving the lives of one's own soldiers) and force multiplication (allowing fewer personnel to do more). They are capable of enlarging the battlefield, penetrating more easily behind enemy lines, and saving on human and financial resources. Unmanned systems can stay on station much longer than individuals and withstand other impediments such as G-forces. They can enhance the quality of life of soldiers of the user party: unmanned systems, especially robots, are increasingly developed to do the so-called dirty, dull and dangerous work.26
52. Robots may in some respects serve humanitarian purposes. While the current emergence of unmanned systems may be related to the desire on the part of States not to become entangled in the complexities of capture, future generations of robots may be able to employ less lethal force, and thus cause fewer unnecessary deaths. Technology can offer creative alternatives to lethality, for instance by immobilizing or disarming the target.27 Robots can be programmed to leave a digital trail, which potentially allows better scrutiny of their actions than is often the case with soldiers and could therefore in that sense enhance accountability.
53. The progression from remote controlled systems to LARs, for its part, is driven by a number of other considerations.28 Perhaps foremost is the fact that, given the increased pace of warfare, humans have in some respects become the weakest link in the military arsenal and are thus being taken out of the decision-making loop. The reaction time of autonomous systems far exceeds that of human beings, especially if the speed of remote-controlled systems is further slowed down through the inevitable time-lag of global communications. States also have incentives to develop LARs to enable them to continue with operations even if communication links have been broken off behind enemy lines.
54. LARs will not be susceptible to some of the human shortcomings that may undermine the protection of life. Typically they would not act out of revenge, panic, anger, spite, prejudice or fear. Moreover, unless specifically programmed to do so, robots would not inflict intentional suffering on civilian populations, for example through torture. Robots also do not rape.
55. Yet robots have limitations in other respects as compared to humans. Armed conflict and IHL often require human judgement, common sense, appreciation of the larger picture, understanding of the intentions behind people's actions, and understanding of values and anticipation of the direction in which events are unfolding. Decisions over life and death in armed conflict may require compassion and intuition. Humans – while they are fallible – at least might possess these qualities, whereas robots definitely do not. While robots are especially effective at dealing with quantitative issues, they have limited abilities to make the qualitative assessments that are often called for when dealing with human life. Machine calculations are rendered difficult by some of the contradictions often underlying battlefield choices. A further concern relates to the ability of robots to distinguish legal from illegal orders.
26 Gary Marchant et al, "International governance of autonomous military robots", Columbia Science and Technology Law Review, Volume XII (2011) p. 275.
27 Singer (see note 3 above), p. 83.
28 Arkin (see note 8 above), xii.
56. While LARs may thus in some ways be able to make certain assessments more accurately and faster than humans, they are in other ways more limited, often because they have restricted abilities to interpret context and to make value-based calculations.
B. LARs and the decision to go to war or otherwise use force
57. During the larger part of the last two centuries, international law was developed to constrain armed conflict and the use of force during law enforcement operations, to make it an option of last resort. However, there are also built-in constraints that humans have against going to war or otherwise using force which continue to play an important (if often not decisive) role in safeguarding lives and international security. Chief among these are unique human traits such as our aversion to getting killed, losing loved ones, or having to kill other people.29 The physical and psychological distance from the actual use of force potentially introduced by LARs can lessen all three concerns and even render them unnoticeable to those on the side of the State deploying LARs.30 Military commanders for example may therefore more readily deploy LARs than real human soldiers.
58. This ease could potentially affect political decisions. Due to the low or lowered human costs of armed conflict to States with LARs in their arsenals, the national public may over time become increasingly disengaged and leave the decision to use force as a largely financial or diplomatic question for the State, leading to the "normalization" of armed conflict.31 LARs may thus lower the threshold for States for going to war or otherwise using lethal force, resulting in armed conflict no longer being a measure of last resort.32 According to the report of the Secretary-General on the role of science and technology in the context of international security and disarmament, "…the increased capability of autonomous vehicles opens up the potential for acts of warfare to be conducted by nations without the constraint of their people's response to loss of human life."33 Presenting the use of unmanned systems as a less costly alternative to deploying "boots on the ground" may thus in many cases be a false dichotomy. If there is not sufficient support for a ground invasion, the true alternative to using unmanned systems may be not to use force at all.
59. Some have argued that if the above reasoning is taken to its logical conclusion, States should not attempt to develop any military technology that reduces the brutality of armed conflict or lowers overall deaths through greater accuracy.34 Drones and high-altitude airstrikes using smart bombs should then equally be viewed as problematic, because they also lower casualty rates for the side that uses them (and in some cases also for the other side), thereby removing political constraints on States to resort to military action.35
29 A/65/321, para. 44; John Mueller, "The Iraq Syndrome", Foreign Affairs, Vol. 84, No. 6 (November/December 2005), p. 44.
30 According to military experts, it generally becomes easier to take life as the distance between the actor and the target increases. See David Grossman, On Killing: The Psychological Cost of Learning to Kill in War and Society (Back Bay Books, 1996).
31 Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (Ashgate, 2009), p. 150.
32 Singer (see note 3 above), p. 323; Peter Asaro, "How Just Could a Robot War Be?" in P. Brey et al (eds.), Current Issues in Computing and Philosophy (2008), p. 7.
33 A/53/202, para. 98.
34 Asaro (see note 32 above), pp. 7-9. Discussed by Patrick Lin et al, "Robots in War: Issues of Risk and Ethics" in R. Capurro & M. Nagenborg (eds.), Ethics and Robotics (2009), p. 57.
60. This argument does not withstand closer scrutiny. While it is desirable for States to reduce casualties in armed conflict, the question arises whether one can still speak of "war" – as opposed to one-sided killing – where one party carries no existential risk and bears no cost beyond the economic. There is a qualitative difference between reducing the risk that armed conflict poses to those who participate in it, and the situation where one side is no longer a "participant" in armed conflict inasmuch as its combatants are not exposed to any danger.36 LARs seem to take the problems that are present with drones and high-altitude airstrikes to their factual and legal extreme.
61. Even if it were correct to assume that if LARs were used there would sometimes be fewer casualties per armed conflict, the total number of casualties in aggregate could still be higher.
62. Most pertinently, the increased precision and ability to strike anywhere in the world, even where no communication lines exist, suggests that LARs will be very attractive to those wishing to perform targeted killing. The breaches of State sovereignty – in addition to possible breaches of IHL and IHRL – often associated with targeted killing programmes risk making the world and the protection of life less secure.
C. The use of LARs during armed conflict
63. A further question is whether LARs will be capable of complying with the requirements of IHL. To the extent that the answer is negative, they should be prohibited weapons. However, according to proponents of LARs this does not mean that LARs are required never to make a mistake – the yardstick should be the conduct of human beings who would otherwise be taking the decisions, which is not always a very high standard.37
64. Some experts have argued that robots can in some respects be made to comply even better with IHL requirements than human beings.38 Roboticist Ronald Arkin has for example proposed ways of building an “ethical governor” into military robots to ensure that they satisfy those requirements.39
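The "ethical governor" described above is, in essence, a suppression layer interposed between target selection and weapon release: an engagement may proceed only if it passes every encoded IHL constraint, and any failure vetoes the action. The toy sketch below illustrates only that veto structure. It is emphatically not Arkin's actual architecture – the class, field names and the crude numeric comparison are invented here for exposition – and, as the discussion of proportionality later in this report shows, real battlefield judgements resist such simple reduction.

```python
from dataclasses import dataclass


@dataclass
class Engagement:
    """Hypothetical inputs a governor might evaluate before weapon release."""
    target_is_combatant: bool       # distinction: is the target a lawful one?
    expected_civilian_harm: float   # proportionality: anticipated collateral damage
    military_advantage: float       # proportionality: anticipated concrete gain


def governor_permits(e: Engagement) -> bool:
    """Veto layer: the engagement proceeds only if every encoded
    constraint is satisfied; any single failure suppresses the action."""
    if not e.target_is_combatant:
        # Rule of distinction: never engage a target not identified as a combatant.
        return False
    # Rule of proportionality (caricatured as a numeric comparison): expected
    # civilian harm must not be excessive relative to the military advantage.
    return e.expected_civilian_harm <= e.military_advantage
```

The point of the structure is that the default is suppression: lethal action is the exception that must clear every constraint, not the rule.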
65. A consideration of a different kind is that if it is technically possible to programme LARs to comply better with IHL than the human alternatives, there could in fact be an obligation to use them40 – in the same way that some human rights groups have argued that where available, “smart” bombs, rather than less discriminating ones, should be deployed.
66. Of specific importance in this context are the IHL rules of distinction and proportionality. The rule of distinction seeks to minimize the impact of armed conflict on civilians by prohibiting the targeting of civilians and indiscriminate attacks.41 In situations where LARs cannot reliably distinguish between combatants or other belligerents and civilians, their use will be unlawful.
35 Anderson and Waxman (see note 8 above), p. 12.
36 According to some commentators, war requires some willingness to accept reciprocal or mutual risk, involving some degree of sacrifice. See Paul Kahn, "The Paradox of Riskless Warfare", Philosophy and Public Policy Vol. 22 (2002) and "War and Sacrifice in Kosovo" (1999), available from http://www-personal.umich.edu/~elias/Courses/War/kosovo.htm
37 Lin (see note 34 above), p. 50.
38 Marchant (see note 26 above), p. 280; Singer (see note 3 above), p. 398.
39 Arkin (see note 8 above), p. 127.
40 Jonathan Herbach, "Into the Caves of Steel: Precaution, Cognition and Robotic Weapons Systems Under the International Law of Armed Conflict", Amsterdam Law Forum Vol. 4 (2012), p. 14.
41 Protocol I additional to the Geneva Conventions, 1977, arts. 51 and 57.
67. Several factors will likely impede the ability of LARs to operate according to these rules, including the technological inadequacy of existing sensors,42 a robot's inability to understand context, and the difficulty of applying in practice the IHL language defining non-combatant status, which must be translated into a computer programme.43 It would be difficult for robots to establish, for example, whether someone is wounded and hors de combat, or whether soldiers are in the process of surrendering.
68. The current proliferation of asymmetric warfare and non-international armed conflicts, often in urban environments, presents a significant barrier to the ability of LARs to distinguish civilians from otherwise lawful targets. This is especially so where complicated assessments such as "direct participation in hostilities" have to be made. Experts have noted that in counter-insurgency and unconventional warfare, in which combatants are often identifiable only through the interpretation of conduct, the inability of LARs to interpret intentions and emotions will be a significant obstacle to compliance with the rule of distinction.44
69. Yet humans are not necessarily superior to machines in their ability to distinguish. In some contexts technology can offer increased precision. For example, a soldier who is confronted with a situation where it is not clear whether an unknown person is a combatant or a civilian may out of the instinct of survival shoot immediately, whereas a robot may utilize different tactics to go closer and, only when fired upon, return fire. Robots can thus act “conservatively”45 and “can shoot second.”46 Moreover, in some cases the powerful sensors and processing powers of LARs can potentially lift the “fog of war” for human soldiers and prevent the kinds of mistakes that often lead to atrocities during armed conflict, and thus save lives.47
70. The rule of proportionality requires that the expected harm to civilians be measured, prior to the attack, against the anticipated military advantage to be gained from the operation.48 This rule, described as “one of the most complex rules of international humanitarian law,”49 is largely dependent on subjective estimates of value and context-specificity.
71. Whether an attack complies with the rule of proportionality needs to be assessed on a case-by-case basis, depending on the specific context and considering the totality of the circumstances.50 The value of a target, which determines the level of permissible collateral damage, is constantly changing and depends on the moment in the conflict. Concerns have been expressed that the open-endedness of the rule of proportionality, combined with the complexity of circumstances, may result in undesired and unexpected behaviour by LARs, with deadly consequences.51 The inability to "frame" and contextualize the environment may result in a LAR deciding to launch an attack based not merely on incomplete but also on flawed understandings of the circumstances.52 It should be recognized, however, that this happens to humans as well.
42 Noel Sharkey, "Grounds for Discrimination: Autonomous Robot Weapons", RUSI Defence Systems (Oct 2008), pp. 88-89, available from http://rusi.org/downloads/assets/23sharkey.pdf
43 Peter Asaro, "On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanisation of Lethal Decision-making", p. 94, International Review of the Red Cross (forthcoming 2013), p. 11.
44 Human Rights Watch (see note 7 above), p. 31.
45 Marchant (see note 26 above), p. 280.
46 Singer (see note 3 above), p. 398.
47 Ibid.
48 Protocol I additional to the Geneva Conventions, 1977, art. 51 (5) (b).
49 Human Rights Watch (see note 7 above), p. 32.
50 Lin (see note 34 above), p. 57.
51 Noel Sharkey, "Automated Killers and the Computing Profession", Computer, Vol. 40 (2007), p. 122.
72. Proportionality is widely understood to involve distinctively human judgement. The prevailing legal interpretations of the rule explicitly rely on notions such as “common sense”, “good faith” and the “reasonable military commander standard.”53 It remains to be seen to what extent these concepts can be translated into computer programmes, now or in the future.
73. Additionally, proportionality assessments often involve qualitative rather than quantitative judgements.54
74. In view of the above, the question arises as to whether LARs are in all cases likely (on the one hand) or never (on the other) to meet this set of cumulative standards. The answer is probably less absolute: they may meet them in some cases (e.g. a weapons system that is set only to return fire and that is used on a traditional battlefield) but not in others (e.g. where a civilian carrying a large piece of metal must be distinguished from a combatant in plain clothes). Would it then be possible to categorize the different situations, so that some uses may be prohibited and others permitted? Some experts argue that certain analyses, such as proportionality, would at least initially have to be made by commanders, while other aspects could be left to LARs.55
D. Legal responsibility for LARs
75. Individual and State responsibility is fundamental to ensure accountability for violations of international human rights and international humanitarian law. Without the promise of accountability, deterrence and prevention are reduced, resulting in lower protection of civilians and potential victims of war crimes.56
76. Robots have no moral agency and as a result cannot be held responsible in any recognizable way if they cause deprivation of life that would normally require accountability if humans had made the decisions. Who, then, is to bear the responsibility?
77. The composite nature of LAR technology and the many levels likely to be involved in decisions about deployment result in a potential accountability gap or vacuum. Candidates for legal responsibility include the software programmers, those who build or sell hardware, military commanders, subordinates who deploy these systems and political leaders.
78. Traditionally, criminal responsibility would first be assigned within military ranks. Command responsibility should be considered as a possible solution for accountability for LAR violations.57 Since a commander can be held accountable for an autonomous human subordinate, holding a commander accountable for an autonomous robot subordinate may appear analogous. Yet traditional command responsibility is only implicated when the commander "knew or should have known that the individual planned to commit a crime yet he or she failed to take action to prevent it or did not punish the perpetrator after the fact."58 It will be important to establish, inter alia, whether military commanders will be in a position to understand the complex programming of LARs sufficiently well to warrant criminal liability.
52 Krishnan (see note 31 above), pp. 98-99.
53 Tonya Hagmaier et al, "Air force operations and the law: A guide for air, space and cyber forces", p. 21, available from http://www.afjag.af.mil/shared/media/document/AFD-100510-059.pdf; Andru Wall, "Legal and Ethical Lessons of NATO's Kosovo Campaign", p. xxiii, available from http://www.au.af.mil/au/awc/awcgate/navy/kosovo_legal.pdf
54 Markus Wagner, "The Dehumanization of International Humanitarian Law: Legal, Ethical, and Political Implications of Autonomous Weapon Systems" (2012), note 96 and accompanying text, available from http://robots.law.miami.edu/wp-content/uploads/2012/01/Wagner_Dehumanization_of_international_humanitarian_law.pdf
55 Benjamin Kastan, "Autonomous Weapons Systems: A Coming Legal 'Singularity'?", University of Illinois Journal of Law, Technology and Policy (forthcoming 2013), p. 18 and further, available from http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2037808
56 Human Rights Watch (see note 7 above), pp. 42-45.
79. It has been proposed that responsibility for civil damages at least should be assigned to the programmer and the manufacturers, by utilizing a scheme similar to strict product liability. Yet national product liability laws remain largely untested in regard to robotics.59 The manufacturing of a LAR will invariably involve a vast number of people, and no single person will be likely to understand the complex interactions between the constituent components of LARs.60 It is also questionable whether putting the onus of bringing civil suits on victims is equitable, as they would have to bring suit while based in a foreign country, and would often lack the resources.
80. The question of legal responsibility could be an overriding issue. If each of the possible candidates for responsibility identified above is ultimately inappropriate or impractical, a responsibility vacuum will emerge, granting impunity for all LAR use. If the nature of a weapon renders responsibility for its consequences impossible, its use should be considered unethical and unlawful as an abhorrent weapon.61
81. A number of novel ways to establish legal accountability could be considered. One condition that could be imposed for the use of LARs is that responsibility be assigned in advance.62 Because technology potentially enables more precise monitoring and reconstruction of what occurs during lethal operations, a further condition for their use could be the installation of such recording devices and the mandatory ex post facto review of all footage in cases of lethal use, regardless of the status of the individual killed.63 A system of "splitting" responsibility between the potential candidates could also be considered.64 In addition, amendments to the rules regarding command responsibility may be needed to cover the use of LARs. In general, a stronger emphasis on State as opposed to individual responsibility may be called for, except in respect of the use of LARs by non-State actors.
57 Rome Statute of the ICC, art. 28; Heather Roff “Killing in War: Responsibility, Liability and Lethal Autonomous Robots” p. 14, available from http://www.academia.edu/2606840/Killing_in_War_Responsibility_Liability_and_Lethal_Autonomous_Robots
58 Protocol I additional to the Geneva Conventions, 1977, arts. 86 (2) and 87.
59 Patrick Lin “Introduction to Robot Ethics” in Patrick Lin et al (eds.) Robot Ethics: The ethical and Social Implications of Robotics (MIT Press, 2012), p. 8.
60 Wendell Wallach “From Robots to Techno Sapiens: Ethics, Law and Public Policy in the Development of Robotics and Neurotechnologies” Law, Innovation and Technology Vol. 3 (2011) p. 194.
61 Gianmarco Veruggio and Keith Abney, "Roboethics: The Applied Ethics for a New Science" in Lin (see note 59 above), p. 114; Robert Sparrow, "Killer Robots", Journal of Applied Philosophy Vol. 24, No. 1 (2007).
62 See Ronald Arkin, "The Robot didn't do it", Position Paper for the Workshop on Anticipatory Ethics, Responsibility and Artificial Agents, p. 1, available from http://www.cc.gatech.edu/ai/robot-lab/publications.html
63 Marchant (see note 26 above), p. 7.
64 Krishnan (see note 31 above), p. 105.
E. The use of LARs by States outside armed conflict
82. The experience with UCAVs has shown that this type of military technology finds its way with ease into situations outside recognized battlefields.
83. One manifestation of this, whereby ideas of the battlefield are expanded beyond IHL contexts, is the situation in which perceived terrorists are targeted wherever they happen to be found in the world, including in territories where an armed conflict may not exist and IHRL is the applicable legal framework. The danger here is that the world is seen as a single, large and perpetual battlefield and force is used without meeting the threshold requirements. LARs could aggravate these problems.
84. On the domestic front, LARs could be used by States to suppress domestic enemies and to terrorize the population at large, suppress demonstrations and fight "wars" against drugs. It has been said that robots do not question their commanders or stage coups d'état.65
85. The possibility of LAR usage in a domestic law enforcement situation creates particular risks of arbitrary deprivation of life, because of the difficulty LARs are bound to have in meeting the stricter requirements posed by IHRL.
F. Implications for States without LARs
86. Phrases such as "riskless war" and "wars without casualties" are often used in the context of LARs. They imply that only the lives of those with the technology count, which points to an underlying concern with the deployment of this technology: a disregard for those without it. LARs present the ultimate asymmetrical situation, where deadly robots may in some cases be pitted against people on foot. LARs are likely – at least initially – to shift the risk of armed conflict to the belligerents and civilians of the opposing side.
87. The use of overwhelming force has been shown to have counterproductive results – for example in the context of demonstrations, where psychologists warn that it may elicit escalated counter-force.66 In situations of hostilities, the unavailability on the ground of a legitimate human target belonging to the LAR-using State may result in attacks on its civilians as the "best available" targets, and the use of LARs could thus encourage retaliation, reprisals and terrorism.67
88. The advantage that States with LARs would have over others is not necessarily permanent. Such systems are likely to proliferate, and not only to the States to which the first users transfer or sell them. Other States will likely develop their own LAR technology, with, inter alia, varying degrees of IHL-compliant programming and potential problems of algorithm compatibility if LARs from opposing forces confront one another. There is also the danger that LARs could be acquired by non-State actors, who are less likely to abide by regulatory regimes for control and transparency.
G. Taking human decision-making out of the loop
89. It is an underlying assumption of most legal, moral and other codes that when the decision to take life or to subject people to other grave consequences is at stake, the decision-making power should be exercised by humans. The Hague Convention (IV) requires any combatant "to be commanded by a person". The Martens Clause, a longstanding and binding rule of IHL, specifically demands the application of "the principle of humanity" in armed conflict.68 Taking humans out of the loop also risks taking humanity out of the loop.
65 Ibid., p. 113.
66 A/HRC/17/28, p. 17.
67 Asaro (see note 32 above), p. 13.
90. According to philosopher Peter Asaro, an implicit requirement can thus be found in IHL for a human decision to use lethal force, which cannot be delegated to an automated process. Non-human decision-making regarding the use of lethal force is, by this argument, inherently arbitrary, and all resulting deaths are arbitrary deprivations of life.69
91. The contemplation of LARs is inextricably linked to the role of technology in the world today. While machines help to make many decisions in modern life, they are mostly so used only where mechanical observation is needed (e.g. as a line umpire in sporting events) and not in situations requiring value judgements with far-reaching consequences (e.g. in the process of adjudication during court cases). As a more general manifestation of the importance of person-to-person contact when important decisions are taken, legal systems around the world shy away from trials in absentia. Of course, robots already affect our lives extensively, including through their impact on life and death issues. Robotic surgery is for example a growing industry and robots are increasingly used in rescue missions after disasters.70 Yet in none of these cases do robots make the decision to kill and in this way LARs represent an entirely new prospect.
92. Even if it is assumed that LARs – especially when they work alongside human beings – could comply with the requirements of IHL, and it can be proven that on average and in the aggregate they will save lives, the question has to be asked whether it is not inherently wrong to let autonomous machines decide who and when to kill. The IHL concerns raised in the above paragraphs relate primarily to the protection of civilians. The question here is whether the deployment of LARs against anyone, including enemy fighters, is in principle acceptable, because it entails non-human entities making the determination to use lethal force.
93. This is an overriding consideration: if the answer is negative, no other consideration can justify the deployment of LARs, no matter the level of technical competence at which they operate. While the argument was made earlier that the deployment of LARs could lead to a vacuum of legal responsibility, the point here is that they could likewise imply a vacuum of moral responsibility.
94. This approach stems from the belief that a human being somewhere has to take the decision to initiate lethal force and as a result internalize (or assume responsibility for) the cost of each life lost in hostilities, as part of a deliberative process of human interaction. This applies even in armed conflict. Delegating this process dehumanizes armed conflict even further and precludes a moment of deliberation in those cases where it may be feasible. Machines lack morality and mortality, and should as a result not have life and death powers over humans. This is among the reasons landmines were banned.71
95. The use of emotive terms such as "killer robots" may well be criticized. However, the strength of the intuitive reactions that the use of LARs is likely to elicit cannot be ignored. Deploying LARs has been depicted as treating people like "vermin", who are "exterminated".72 These descriptions conjure up the image of LARs as some kind of mechanized pesticide.
68 Geneva Convention Protocol I, art. 1 (2). See also the preambles to the 1899 and 1907 Hague Conventions, and the Hague Convention with Respect to the Laws and Customs of War on Land and its Annex: Regulations Concerning the Laws and Customs of War on Land (Hague Convention II).
69 Asaro (see note 43 above), p. 13.
70 See http://www.springer.com/medicine/surgery/journal/11701
71 Asaro (see note 43 above), p. 14.
96. The experience of the two World Wars of the last century may provide insight into the rationale of requiring humans to internalize the costs of armed conflict, and thereby hold themselves and their societies accountable for these costs. After these wars, during which the devastation that could be caused by modern technology became apparent, those who had personally taken the central military decisions resolved, “in order to save succeeding generations from the scourge of war”, to establish the United Nations to pursue world peace and to found it on the principles of human rights. While armed conflict is by no means a thing of the past today, nearly 70 years have passed without a global war. The commitment to achieve such an objective can be understood as a consequence of the long-term and indeed inter-generational effects of insisting on human responsibility for killing decisions.
97. This historical recollection highlights the danger of measuring the performance of LARs against minimum standards set for humans during armed conflict. Human soldiers do bring a capacity for depravity to armed conflict, but they also hold the potential to adhere to higher values and in some cases to show some measure of grace and compassion. If humans are replaced on the battlefield by entities calibrated not to go below what is expected of humans, but which lack the capacity to rise above those minimum standards, we may risk giving up on hope for a better world. The ability to eliminate perceived “troublemakers” anywhere in the world at the press of a button could risk focusing attention only on the symptoms of unwanted situations. It would distract from, or even preclude, engagement with the causes instead, through longer term, often non-military efforts which, although more painstaking, might ultimately be more enduring. LARs could thus create a false sense of security for their users.
H. Other concerns
98. The possible deployment of LARs raises additional concerns that include but are not limited to the following:
• LARs are vulnerable to appropriation, as well as hacking and spoofing.73 States no longer hold a monopoly on the use of force. LARs could be intercepted and used by non-State actors, such as criminal cartels or private individuals, against the State or other non-State actors, including civilians.74
• Malfunctions could occur. Autonomous systems can be “brittle”.75 Unlikely errors can still be catastrophic.
• Future developments in the area of technology cannot be foreseen. Allowing LARs could open an even larger Pandora's box.
• The regulation of the use of UCAVs is currently in a state of contestation, as is the legal regime pertaining to targeted killing in general, and the emergence of LARs is likely to make this situation even more uncertain.
• The prospect of being killed by robots could lead to high levels of anxiety among at least the civilian population.
72 Robert Sparrow “Robotic Weapons and the Future of War” in Jessica Wolfendale and Paolo Tripodi (eds.) New Wars and New Soldiers: Military Ethics in the Contemporary World (2011), p. 11.
73 Jutta Weber “Robotic warfare, human rights and the rhetorics of ethical machines”, pp. 8 and 10, available from http://www.gender.uu.se/digitalAssets/44/44133_Weber_Robotic_Warfare.pdf
74 Singer (see note 3 above), p. 261-263.
75 Kastan (see note 55 above), p. 8.
99. The implications for military culture are unknown, and LARs may thus undermine the systems of State and international security.
I. LARs and restrictive regimes on weapons
100. The treaty restrictions76 placed on certain weapons stem from the IHL norm that the means and methods of warfare are not unlimited, and as such there must be restrictions on the rules that determine what weapons are permissible.77 The Martens Clause prohibits weapons that run counter to the “dictates of public conscience.” The obligation not to use weapons that have indiscriminate effects and thus cause unnecessary harm to civilians underlies the prohibition of certain weapons,78 and some weapons have been banned because they “cause superfluous injury or unnecessary suffering”79 to soldiers as well as civilians.80 The use of still others is restricted for similar reasons.81
101. In considering whether restrictions as opposed to an outright ban on LARs would be more appropriate, it should be kept in mind that it may be more difficult to restrict LARs as opposed to other weapons because they are combinations of multiple and often multipurpose technologies. Experts have made strong arguments that a regulatory approach that focuses on technology – namely, the weapons themselves – may be misplaced in the case of LARs and that the focus should rather be on intent or use.82
102. Disarmament law and its associated treaties, however, provide extensive examples of the types of arms control instruments that establish bans or restrictions on use and other activities. These instruments can be broadly characterized as some combination of type of restriction and type of activity restricted. The types of restrictions include a ban or other limitations short of a ban.
103. The type of activity that is typically restricted includes: (i) acquisition, retention or stockpiling, (ii) research (basic or applied) and development, (iii) testing, (iv) deployment, (v) transfer or proliferation, and (vi) use.83
104. Another positive development in the context of disarmament is the inclusion of victim assistance in weapons treaties.84 This concern for victims coincides with other efforts to address the harm weapons and warfare cause to civilians, including the practice of casualty counting85 and the good faith provision of amends – implemented for example by some International Security Assistance Force States – in the case of civilian deaths in the absence of recognized IHL violations.86 These practices serve to reaffirm the value of life.
76 Through the Hague Convention of 1907 and the 1977 Additional Protocols to the Geneva Conventions.
77 See http://www.icrc.org/eng/war-and-law/conduct-hostilities/methods-means-warfare/index.jsp
78 Mine Ban Treaty (1997); and Convention on Cluster Munitions (2008).
79 Protocol I additional to the Geneva Conventions, 1977, art. 35 (2); ICRC, Customary Humanitarian Law, Rule 70.
80 Protocol for the Prohibition of the Use of Asphyxiating, Poisonous or Other Gases, and of Bacteriological Methods of Warfare, Geneva, 17 June 1925.
81 Convention on Certain Conventional Weapons, Protocol III on incendiary weapons.
82 Marchant (see note 26 above), p. 287; Asaro (see note 43 above), p. 10.
83 Marchant (see note 26 above), p. 300. See also Bonnie Docherty, "The Time is Now: A Historical Argument for a Cluster Munitions Convention", 20 Harvard Human Rights Law Journal (2007), p. 53 for an overview.
84 Mine Ban Treaty (1997), art. 6, and Convention on Certain Conventional Weapons, Protocol V on Explosive Remnants of War (2003), art. 8. The Convention on Cluster Munitions (2008), art. 5 was groundbreaking in placing responsibility on the affected State.
85 S/2012/376, para. 28 (commending inter alia the commitment by the African Union Mission in Somalia).
105. There are also meaningful soft law instruments that may regulate the emergence of LARs. Examples of relevant soft law instruments in the field of disarmament include codes of conduct, trans-governmental dialogue, information sharing and confidence-building measures and framework conventions.87 In addition, non-governmental organization (NGO) activity and public opinion can serve to induce restrictions on weapons.
106. Article 36 of the First Protocol Additional to the Geneva Conventions is especially relevant, providing that, “in the study, development, acquisition or adoption of a new weapon, means or methods of warfare, a High Contracting Party is under an obligation to determine whether its employment would, in some or all circumstances, be prohibited by this Protocol or by any other rule of international law applicable to the High Contracting Party.”
107. This process is one of internal introspection, not external inspection, and is based on the good faith of the parties.88 The United States, although not a State party, established formal weapons review mechanisms as early as 1947. While States cannot be obliged to disclose the outcomes of their reviews, one way of ensuring greater control over the emergence of new weapons such as LARs will be to encourage them to be more open about the procedure that they follow in Article 36 reviews generally.
108. In 2012 in a Department of Defense Directive, the United States embarked on an important process of self-regulation regarding LARs, recognizing the need for domestic control of their production and deployment, and imposing a form of moratorium.89 The Directive provides that autonomous weapons “shall be designed to allow commanders and operators to exercise appropriate levels of human judgement over the use of force”.90 Specific levels of official approval for the development and fielding of different forms of robots are identified.91 In particular, the Directive bans the development and fielding of LARs unless certain procedures are followed.92 This important initiative by a major potential LARs producer should be commended and may open up opportunities for mobilizing international support for national moratoria.
IV. Conclusions
109. There is clearly a strong case for approaching the possible introduction of LARs with great caution. If used, they could have far-reaching effects on societal values, including fundamentally on the protection and the value of life and on international stability and security. While it is not clear at present how LARs could be capable of satisfying IHL and IHRL requirements in many respects, it is foreseeable that they could comply under certain circumstances, especially if used alongside human soldiers. Even so, there is widespread concern that allowing LARs to kill people may denigrate the value of life itself. Tireless war machines, ready for deployment at the push of a button, pose the danger of permanent (if low-level) armed conflict, obviating the opportunity for post-war reconstruction. The onus is on those who wish to deploy LARs to demonstrate that specific uses should in particular circumstances be permitted. Given the far-reaching implications for protection of life, considerable proof will be required.

86 Ibid., para. 29 (the Secretary-General “welcomed the practice of making amends”).
87 Marchant (see note 26 above), pp. 306-314.
88 Discussed in International Review of the Red Cross, vol. 88, December 2006.
89 US DoD Directive (see note 14 above).
90 Ibid., para. 4.a.
91 Ibid., paras. 4.c and d.
92 Ibid., Enclosure 3.
110. If left too long to its own devices, the matter will, quite literally, be taken out of human hands. Moreover, coming on the heels of the problematic use and contested justifications for drones and targeted killing, LARs may seriously undermine the ability of the international legal system to preserve a minimum world order.
111. Some actions need to be taken immediately, while others can follow afterwards. If the experience with drones is an indication, it will be important to ensure that transparency, accountability and the rule of law are placed on the agenda from the start. Moratoria are needed to prevent steps from being taken that may be difficult to reverse later, while an inclusive process to decide how to approach this issue should occur simultaneously at the domestic, intra-State, and international levels.
112. To initiate this process an international body should be established to monitor the situation and articulate the options for the longer term. The ongoing engagement of this body, or a successor, with the issues presented by LARs will be essential, in view of the constant evolution of technology and to ensure protection of the right to life – to prevent both individual cases of arbitrary deprivation of life as well as the devaluing of life on a wider scale.
V. Recommendations
A. To the United Nations
113. The Human Rights Council should call on all States to declare and implement national moratoria on at least the testing, production, assembly, transfer, acquisition, deployment and use of LARs until such time as an internationally agreed upon framework on the future of LARs has been established;
114. Invite the High Commissioner for Human Rights to convene, as a matter of priority, a High Level Panel on LARs consisting of experts from different fields such as law, robotics, computer science, military operations, diplomacy, conflict management, ethics and philosophy. The Panel should publish its report within a year, and its mandate should include the following:
(a) Take stock of technical advances of relevance to LARs;
(b) Evaluate the legal, ethical and policy issues related to LARs;
(c) Propose a framework to enable the international community to address effectively the legal and policy issues arising in relation to LARs, and make concrete substantive and procedural recommendations in that regard; in its work the Panel should endeavour to facilitate a broad-based international dialogue;
(d) Assess the adequacy or shortcomings of existing international and domestic legal frameworks governing LARs;
(e) Suggest appropriate ways to follow up on its work.
115. All relevant United Nations agencies and bodies should, where appropriate in their interaction with parties that are active in the field of robotic weapons:
(a) Emphasize the need for full transparency regarding all aspects of the development of robotic weapon systems;
(b) Seek more international transparency from States regarding their internal weapons review processes, including those under article 36 of Additional Protocol I to the Geneva Conventions.
B. To regional and other inter-governmental organizations
116. Support the proposals outlined in the recommendations to the United Nations and States, in particular the call for moratoria as an immediate step.
117. Where appropriate take similar or parallel initiatives to those of the United Nations.
C. To States
118. Place a national moratorium on LARs as described in paragraph 113.
119. Declare – unilaterally and through multilateral fora – a commitment to abide by IHL and IHRL in all activities surrounding robotic weapons and put in place and implement rigorous processes to ensure compliance at all stages of development.
120. Commit to being as transparent as possible about internal weapons review processes, including metrics used to test robotic systems. States should at a minimum provide the international community with transparency regarding the processes they follow (if not the substantive outcomes) and commit to making the reviews as robust as possible.
121. Participate in international debate and trans-governmental dialogue on the issue of LARs and be prepared to exchange best practices with other States, and collaborate with the High Level Panel on LARs.
D. To developers of robotic systems
122. Establish a code or codes of conduct, ethics and/or practice defining responsible behaviour with respect to LARs in accordance with IHL and IHRL, or strengthen existing ones.
E. To NGOs, civil society and human rights groups and the ICRC
123. Consider the implications of LARs for human rights and for those in situations of armed conflict, and raise awareness about the issue.
124. Assist and engage with States wherever possible in aligning their relevant procedures and activities with IHL and IHRL.
125. Urge States to be as transparent as possible in respect of their weapons review processes.
126. Support the work of the High Level Panel on LARs.

Drone Warfare – From Wales To Gaza – Conference Speech – Cardiff 25th May 2013

SPEECH

Good afternoon comrades, friends and colleagues. Can I start by saying how pleased I am to see this conference taking place and can I take this opportunity to thank the organisers for inviting me to share the platform today with such distinguished friends. My name is Harry Rogers and I live in West Wales about 12 miles from Aberporth where the MOD have been carrying out their tests on the Watchkeeper Drone.

I am sixty-five years old and I almost never got born because of a UAV. My mother and my aunt were in the back bedroom over the top of my grandfather’s public house in West Croydon in June 1944, just ten seconds before one of Hitler’s V1 doodlebug flying bomb drones blew the back of the pub clean off. Ten seconds earlier and I would never have been born. My point here is that UAVs have been around a long time and the Watchkeeper is nothing really new in concept. A lot has been written about this already almost obsolete piece of Army surveillance equipment so I won’t add to the reams you can find on the internet.

I am a member of a local peace group called Bro Emlyn For Peace and Justice, which formed in 2003 in response to the decision by Bush and Blair to attack Iraq. In that time we have been involved in a number of campaigns, including those against the development of Cardigan Airport as a testing ground for UAVs, the proposed introduction of an unmanned aerial systems technology hub at Parc Aberporth, and the management buyout of the MOD missile testing base at Parcllyn by QinetiQ.

There is much I could talk to you about concerning the history of this campaigning, but that is not why I agreed to come here today. The past is something that we can learn from but not something that we can undo. My speech today is primarily about the future of drone technology and why it is vital that we all start to pay attention to what the research and development bods at MIT and DARPA, QinetiQ and BAE Systems are cooking up for the future. I make no apologies for basing most of this speech on a report made by the RAF in October 2012 to the Government about the future of Unmanned Aerial Systems.

I am interested in ensuring that we all go away from today with an understanding that the big issue we face in terms of UAV development is that of autonomy. The next generation of drones may well be able to think for themselves and act autonomously, that is, without any human in the loop, as the military say. That means flying machines that can make their own decisions in search and destroy operations based on a set of algorithmic decisions pre-determined by human masters who may or may not be benign in their intentions.
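To make the idea concrete, here is a minimal, purely hypothetical sketch of what a pre-determined engagement rule set might look like in code. Every name, category and threshold below is invented for illustration; it does not describe any real system, only the shape of the problem: once the rules are written, the machine alone decides.

```python
# Hypothetical sketch only - all names, categories and thresholds are invented.
# It illustrates how pre-programmed criteria could authorise an engagement
# with no human anywhere in the loop.

def engage_decision(track, rules):
    """Return True if the pre-set rules alone would authorise engagement."""
    return (track["category"] in rules["hostile_categories"]
            and track["confidence"] >= rules["min_confidence"]
            and track["location"] in rules["engagement_zone"])

# The human masters write these once, long before any encounter.
rules = {
    "hostile_categories": {"armoured_vehicle"},
    "min_confidence": 0.9,
    "engagement_zone": {"grid_A1", "grid_A2"},
}

# A sensor track arrives; the machine, not a person, says yes or no.
track = {"category": "armoured_vehicle", "confidence": 0.95, "location": "grid_A1"}
print(engage_decision(track, rules))  # prints True
```

The point of the sketch is that everything ethically interesting happens before deployment, when the rules are authored; at the moment of decision there is only a boolean test.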

Sceptics amongst you are already muttering balderdash and hokum, science fantasy and other such epithets. Well virtually all of the rest of the speech will, I hope, convince you otherwise.

* The RAF say that “As UAVs are developed with increasing levels of automation it will reduce the requirement for operator training in the more traditional piloting skills of flying the aircraft, such as landing and takeoff, and focus the training more towards operating the payload.”

* The MOD and in particular the RAF are discussing issues about the levels of automation they are comfortable with. They recognise that highly automated weaponry systems are unlikely to be able to apply judgement and pragmatism to “situations”. They are worried about legal and ethical considerations that occur when there are no human beings in the loop leading to loss of life or injury.

* The future is one where outdated weaponry will give way to what they term “precision weaponry” and where the battle-spaces increasingly involve unmanned and cyber operations.

* Autonomous weapon systems are capable of understanding higher-level intent and direction, and of perceiving their environment, leading to the ability to take appropriate actions to bring about desired states. They are able to decide on a course of action, from a number of alternatives, without depending on human oversight and control. Although the overall activity of an autonomous unmanned aircraft will be predictable, individual actions may not be.

* Low intensity tasks for autonomous UAVs requiring minimal human oversight rather than continuous control include the following:-

– pattern of life surveillance tasks over fixed locations or in support of littoral maneuver

– maintenance of standing anti-submarine warfare or anti-surface warfare radar barriers

– counter-piracy tasks; monitoring of arrays of sonobuoys or other sensors

– electronic warfare tasks

– acting as a communications relay

– air-to-air refueling tankers

* The RAF consider UAVs ideal for use in environments hostile to a manned aircraft or its crew, for example in operations related to chemical, biological, radiological and nuclear (CBRN) threats. Tasks include the following:-

– Could carry sensors for local, tactical or global use

– systems easily sacrificed in a safe area after data gathering

– use in situations where fire and smoke make human activity hazardous

* In risky situations UAV systems can be used instead of aircrew or soldiers where the threat of ground to air action is high and also where it is necessary to suppress an integrated air defence system.

– multiple cheap UAVs used sacrificially to swamp detection and command and control systems

– to encourage enemies to fire large numbers of missiles

– observe engagement tactics and transmit data back to intelligence collators

– convey tactical supplies

– sweep for improvised explosive devices

* Of course UAVs are used in scenarios which are highly distasteful and we all know that they are used for surveillance, for targeting, and for carrying out weapon attacks inside the borders of countries such as Pakistan and Palestine. All of this is currently done with “man in the loop” systems but it will not be too long before drones are flying with what the RAF says is “the ability to independently locate and attack mobile targets, with appropriate proportionality and discrimination.”

The RAF are worried about issues relating to the Geneva Convention with regard to autonomous UAV development and usage and they say that “compliance will become increasingly challenging as systems become more automated. In particular, if we wish to allow systems to make independent decisions without human intervention, some considerable work will be required to show how such systems will operate legally.”

Already there are automated weapons systems in use in Afghanistan, for example the Phalanx and Counter-Rocket, Artillery and Mortar (C-RAM) systems, used because there is deemed to be insufficient time for human response to counter incoming fire.

Future autonomous UAV systems will have to adhere to legal requirements and civilian airspace regulations. This will require political involvement in getting the necessary changes made. In my view it is absolutely vital that politicians understand the issues clearly and concisely, because once those changes are made there is going to be a lot of military and civilian hardware flying about in the airspace without any human interface whatsoever. The RAF say “As systems become increasingly automated, they will require decreasing human intervention between the issuing of mission-level orders and their execution.” They go further, saying “It would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority.” They also discuss the timescale for the introduction of increased autonomy via artificial intelligence: “Estimates of when artificial intelligence will be achieved (as opposed to complex and clever automated systems) vary, but the consensus seems to lie between more than 5 years and less than 15 years.” Their words, not mine.

Currently the MOD “has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective.”

The RAF are clearly worried about the direction all this is going in and they say “As technology matures and new capabilities appear, policy-makers will need to be aware of the potential legal issues and take advice at a very early stage of any new system’s procurement cycle.” I believe this highlights a degree of paranoia on the part of the RAF vis-à-vis its own future role.

Not only are the RAF exercised about legal dilemmas. Ethical and moral questions are also in their thoughts, such as when, where and how automated and autonomous unmanned systems may be used. This applies not just to the use of drones of course but also to all other forms of weaponry in any environment. Will all future wars be fought remotely, with little or no loss of friendly military personnel? Will future conflicts be waged between increasingly complex unmanned systems?

In my view a problem that we face today is that the accountants have control of governments, and the most expensive resource used in public services is human beings. So autonomy offers massive savings in manpower, and in support for that manpower both before and after conflict occurs. As artificial intelligence comes on board we are likely to see more complicated tasks arising that are beyond the capability of humans to deal with due to speed, complexity and information overload. No doubt some of you are probably suffering that now, but I only have a bit more to say before I will take questions, so bear with me.

The RAF and many others in the field are grappling with issues such as whether it is possible to develop AI that has the capability to focus on the unique (at the moment) ability that a human being has to bring empathy and morality to complex decision-making. The RAF say “To a robotic system, a school bus and a tank are the same – merely algorithms in a programme – and the engagement of a target is a singular action; the robot has no sense of ends, ways and means, no need to know why it is engaging a target. There is no recourse to human judgement in an engagement, no sense of a higher purpose on which to make decisions, and no ability to imagine (and therefore take responsibility for) repercussions of action taken.”
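The RAF quote above can be illustrated with a tiny sketch. Again this is invented for illustration, not any real targeting pipeline: to the routine below, a school bus and a tank really are the same, just a label and a score flowing through one code path, with no notion of what either object means.

```python
# Illustrative sketch only - invented labels and scores, not a real system.
# A naive targeting routine sees every detection as (label, score);
# "school_bus" and "tank" take exactly the same code path.

def classify(detection):
    # stand-in for a sensor/recognition pipeline: returns a label and a score
    return detection["label"], detection["score"]

def select_targets(detections, threshold=0.8):
    # no sense of ends, ways or means: the routine only compares numbers
    return [label for label, score in map(classify, detections)
            if score >= threshold]

detections = [
    {"label": "tank", "score": 0.91},
    {"label": "school_bus", "score": 0.88},  # scored exactly like the tank
]
print(select_targets(detections))  # prints ['tank', 'school_bus']
```

Nothing in the selection step knows that one of those labels denotes children on their way to school; any such distinction exists only if a human thought to encode it in advance.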

So we need to pose the following questions to our politicians:-

Are they happy to allow Autonomous Robots to take on the responsibility of choosing which of us lives and which of us dies?

Can an autonomous robot be considered capable of waging ethical and legal warfare?

Can software and hardware accidents be distinguished from war crimes when robots go wrong?

Is it possible to establish clear policies on acceptable machine behaviour?

How far out of the bottle is the technological genie?

Are we doomed to a Terminator style future?

Is it possible to have a sensible debate about technological development anymore?

Do we really want Artificial Intelligence with a greater capacity to think than a human to be in any way involved in future theatres of War?


The RAF pose very important questions when they ask the following:-

– Do military planners and politicians understand the full implications of the systems they are currently tasking and those they hope to procure?

– In the current economic climate, who will decide the best balance between keeping existing equipment and personnel, or whether to give these up to fund new unmanned systems?

– Do we understand even the basic implications of such decisions for the associated defence lines of development?

– Crucially, do we have a strategic level of understanding as to how we will deliver the considerable number of changes that will need to be made to existing policy, concepts, doctrine, and force structures?

* Finally we must expect to see governments bringing in changes to the law of armed conflict in order to accommodate the use of autonomous UAS and we must shout our opposition to this from the rooftops.

So there we have it friends, there is a lot for us to consider when we look into the issue of drones. We have to view all of this in a holistic way. That is, we must not just lumber along from one demo to the next thinking only about the impact of the current use of drone warfare, terrible though that is. It is absolutely vital that we start to consider what future implications there are for the maintenance and development of basic human rights. We must also consider what the use of technology means in terms of social control measures, such as we see the beginnings of in Gaza and elsewhere. I believe we are at, if not already past, a dangerous turning point in the way we occupy this planet.

It is incumbent on us all to make a fuss about these issues if we want a planet where human rights are protected. If we don’t then we condemn the world to a dystopian future where all kinds of as yet un-thought-of technology are used to maintain the position of a global elite above and beyond that of the majority of the people. The choice is ours: make a ruckus or bury our heads in the sand and wait for the worst possible sci-fi future to engulf us all.

Thanks for listening, I’m happy to take questions and take part in debate.

Harry Rogers 24/05/2013

The Case – Short Story

THE CASE

A short story by

Harry Rogers

The girl was tired. She had been traveling for nineteen hours and it was catching up with her. It was a long journey from the beach hut on Koh Kood island in Thailand. She was at the end of her tether and just wanted to be home. The coach journey from Heathrow to Sheffield was the final straw, and she was drifting in and out of consciousness as the motorway slid by the window. Outside the weather was atrocious. The fine rain was falling in that relentlessly misty way it does in early November and she wished she had decided to stay for another three months. If her sister had not been getting married she would have done. As it was, the whole family expected her at the wedding in just two days’ time and she had to be there, or else her mother would never let her hear the end of it. She wanted to be there really, but this journey was just too much and right now all she wanted was a hot bath and a sleep in her own bed.

The coach arrived in Sheffield amazingly on time. She packed up her iPod and headphones, put on her leather jacket, got off and waited under the bus shelter for the driver to drag her silver case out from the bowels of the coach. He did so quite quickly and she was the fifth person away to the taxi rank. She got into a black cab and told the driver to take her to two hundred and forty Cemetery Road. Twelve minutes later she was inside her ground floor flat. On the telephone table in the hall there were three neat piles of post that she knew her mum had tidied up for her. She had been away for fifteen months and there was a lot of catching up to do, but not now. She put her case in the cupboard at the top of the cellar stairs before going straight to the bathroom and running the hot tap for a long soak.

After her bath she got into a pair of old pyjamas, made herself a cup of camomile tea and sat on her bed looking through the first pile of post.  Most of it was junk but there were a couple of letters from Australia.  She opened them and was happy to see that they were from the young guy she had met in New Zealand.  She liked him a lot and when they had parted company in Christchurch and he had asked for her address she never thought for a minute that he would actually write to her.  Here they were though, two letters written in that sing song way that had made him so attractive in the first place. 

He was from Cork in Ireland and had a way about him that she fell for.  As she read the letters she could hear his voice in just the same lilting way that she remembered from that night at the Bar Crocodile when he said that after he finished traveling he was coming to England for her.  She had told him that she thought this was a load of old blarney and they both laughed.  She had given him the address anyway and now was very happy that he had written.  In the latest letter, dated only two weeks previously he said that he was going to be in England and would come to Sheffield to look her up.  It turned out he would be arriving just two days after her sister’s wedding.  She felt a warm glow inside her as she finished the tea and climbed under her duvet.  Donal, who she had playfully nicknamed Donut, was coming.  She fell asleep quickly, thinking about how much she had liked him during the two weeks they spent together in hobbit land.

She slept deeply for twelve hours. When she awoke the girl took a shower and made herself some porridge and a cup of Earl Grey tea. After breakfast she called her mother and arranged to meet up with her at lunchtime in the cafe at John Lewis. She knew her mother had her bridesmaid’s outfit waiting for her to try on. She threw a few bits of clothing into a shoulder bag and set off for the centre of Sheffield, carefully locking her flat behind her.

The wedding was a spectacular event and the girl admitted to herself that she had enjoyed it. Every one of her relations had been nice to her and they were very impressed that she had been just about everywhere there was to go since she set off traveling twenty-seven months earlier. She was the first one in the family to hit the trail and she could sense that quite a lot of them secretly envied her. The reception had been awesome and she had danced until 3.00 am and drunk quite a lot of Sailor Jerry’s rum. Her mum had driven her back home after breakfast at the hotel and she was about all familied out by the following day when she arrived back at her flat.

She checked the answerphone to see whether there was a message from Donut. She was surprised to find that there were ten messages: one from Donut saying he was coming around at 11.00 am that morning, the rest of them all blank. She decided that she would do her laundry and fetched her case from the cellar head. She set it on the kitchen table and took the key from her purse to open it with. She put the key in the lock; it was a bit stiff and when she tried to turn it the key wouldn’t budge. She twisted the key a bit harder and it snapped off in the lock. The girl was a bit annoyed but decided to wait until Donut got there.

She put the kettle on to make herself a cup of tea and as she was putting the tea bag into a mug the house phone rang. She went into the hall, picked up the receiver and said “Hello?”. There was no reply and she heard the telephone being put down at the other end of the line. She was slightly perplexed, but she finished making her tea and sat there waiting for Donut to arrive.

Ten minutes later there was a ring at the door and she went and opened it.  There was Donut looking just the same as the last time she had seen him. 

“Hello Girl,” he said.

“Donut, it’s so nice to see you again.” And with that she put her arms around his neck and pulled him down for a welcoming kiss.

They stood at the doorstep locked in an embrace for 20 seconds before she said “Come on in, I’ll make you some tea. Are you hungry? We can go down town if you like for lunch, it’s not too far to walk.”

“Just a cup of tea will do fine,” he said, looking around. “This is a grand place you have here.”

“I know, my auntie left me a hundred thousand pounds when she died and I bought this place with most of it before I set off on the round the world jaunt. I’ve only actually lived here for about three months all told. My mum has been keeping her eye on it for me whilst I’ve been away.”

“Lucky you, I wish I had a pad of my own to go back to when I finish traveling.”

“Are you going off again?” asked the girl.

“Yes, I thought I would go to Canada in autumn and get a job working on the ski slopes over there.”

“That sounds great, maybe I could meet up with you over there?”

Donut looked at her and she looked back at him.  They both started grinning together and she knew that this was the start of her next traveling adventure.  They hugged and he kissed her full on the lips.

“Drink your tea,” said the girl.

“OK. Let’s talk about Canada over lunch.”

“Great idea.  Oh before we go out can you take a look at my case, I snapped the key off in the lock and I want to put my clothes in the washing machine whilst we are out.”

He looked at the aluminium case and the broken key in the lock.  “Do you have any tools here?”

“Sure, my dad gave me a full tool kit as a present when I first moved in. I’ll get it.”

She came back with a plastic toolbox, and Donut opened it. He took out a small cold chisel and a club hammer. “I’m going to have to break the lock here,” he said.

“Go ahead, I need a new case anyway.”

He put the chisel into the gap above the keyhole and gave it a big whack with the club hammer. The lock gave way instantly and he said “There you go Girl, nothing to it.”

“Aw thanks, now I can get on with the laundry.” And she opened the lid of the case.

She looked in expecting to see all her summer dresses from Thailand, but instead there was a whole shed-load of money in neat bundles, and a vacuum packed clear polythene bag containing a severed human hand.  She let out a scream and Donut had to steady her as she stepped back in alarm.

“This is not my case,” she said “I must have picked up the wrong one from the coach driver.  Look at all this money, look at this horrible thing,” and she pointed to the hand in the bag.

Donut stood there open mouthed looking at the money.  There were about thirty bundles of fifty pound notes, each bundle containing two thousand pounds. 

“There’s about sixty grand there.  That’s an awful lot of money.  The hand means this is a dangerous situation.  Is there anything in your case that can tell the owner of this case where you live?”

“Yes, there are some letters that my mum forwarded to me about my student loan stuff that have my address on them. Why?”

“Has anybody tried to contact you since you got back here?”

“Well there have been a load of blank messages on the answer phone.”

“Shit, we have to get out of here.”

They moved quickly down the hallway to the front door and as they opened it there was a large shape blocking the doorway.  The last thing they heard was the pfft pfft pfft pfft of the 9mm Glock 18 machine pistol with silencer as it despatched both of them before they could utter a word. Donut fell to the floor and the girl landed on top of him, both dead.  The shape stepped over them, went into the kitchen, picked up the case and left the flat, carefully closing the front door.

Goodbye Democracy – article in Heddwch first published in 2008

I wrote the following article for Heddwch, the newsletter of CND Cymru, in 2008. I republish it here for those who may have missed it first time around. Parc Aberporth is still largely unoccupied and Selex have pulled out. It is an economic development fiasco, but the issues raised vis-à-vis drone technology are more relevant than ever.

Goodbye Democracy

UAVs over West Wales

‘Skynet’ is a fictional, computer-based military defence system that acts as the primary antagonist in the ‘Terminator’ series of films and computer games. This is a fictional example of an artificial intelligence that becomes sentient and turns against its creators. Of course this is science fiction and not happening in the real world. Or is it?

An ASTRAEA (Autonomous Systems Technology Related Airborne Evaluation and Assessment) Centre is being established at Parc Aberporth on the Ceredigion coast, with the backing of £3 million of public money on top of the cost of building the Parc itself. ASTRAEA is a British programme with Welsh Assembly and Westminster Governments support and funding, which involves a number of arms companies including the European Aeronautic Defence and Space Company (EADS), QinetiQ, Flight Refuelling Ltd (FRL), Thales and BAE Systems.

Remote killing

At Aberporth, Unmanned Aerial Vehicles (UAVs or drones) are being developed for a variety of purposes, some benign and some most certainly not.  QinetiQ is currently carrying out experiments marrying ‘artificial intelligence’ with UAVs.  While these could have peaceful applications, the same technology quickly becomes worrying when applied to Unmanned Combat Aerial Vehicles (UCAVs).  The future of warfare is computer-controlled, with the human element removed from the theatre of operations.  Civilian populations could be kept in check by a variety of robotic drones.  Some drones are designed as spy planes for collecting information and intelligence; others carry weapons for combat and bomb delivery.  All this is talked up with glee by various politicians and civil servants as a perfect way to solve the problems of economic regeneration in West Wales.

Think again

There are dark things going on at Parc Aberporth. Why is the Welsh Assembly Government so keen to get into bed with these major armaments developers? QinetiQ’s website makes no bones about the fact that the company is linked in with the UCAV programme for the Ministry of Defence and will be connecting in with BAE, joint partners in the ASTRAEA project with offices in the Parc Aberporth complex.
Those in favour of the civil aviation authority airport planned for Aberporth might think twice about arms dealers and international weaponry experts flying in and out of a West Wales airport.  It is time that New Labour’s Andrew Davies AM, Welsh Assembly Government Minister for Finance and Public Service Delivery, began to understand that Wales is a fledgling democracy and that the people of Wales do not want to be blinded with spin in the usual smoke-and-mirrors approach of the Westminster style of government.

Despite this huge ‘investment’, it was reported last May that Parc Aberporth remained largely unoccupied more than 18 months after it first opened.  The Assembly Government had originally announced that over 230 jobs would be created by 2008, with further capacity for up to 1,000 employees.  Selex (Sensors and Airborne Systems), Europe’s second largest defence electronics business, occupies one unit on the site.  The ‘expertise’ of EADS, QinetiQ, Flight Refuelling Ltd, Thales and BAE Systems was to have been combined with that of university departments at Aberystwyth and Cardiff to create even more jobs.  The Parc is now frequented by practising learner drivers and the grounds are unkempt.  As we know, the ideal situation for profit in the military-industrial complex is ‘endless war’.

People in Wales demand better and want to know that industry in Wales is being used for purposes other than being part of the US-led, world-domination-inspired military transformation strategy that George W Bush, Tony Blair, Rhodri Morgan and now Gordon Brown have bought into.

Harry Rogers

Drones at Aberporth demo 18/05/2013 – report.

On Saturday afternoon last week I attended a small demonstration organised by local Quakers outside the MOD base in Parclyn, just above Aberporth.  There were only ten activists there on this cloudy day, plus one desultory police car parked 50 yards down the road.

MOD Aberporth is renowned as one of the main missile testing ranges in Britain.  However, it has effectively been privatised and, following a management buyout, is now run by QinetiQ, a research company with many links to the MOD.

There have been a number of arms sales fairs held at Parc Aberporth, focused in particular on UAVs and UCAVs.  Since 2010 the base has been involved in testing the Watchkeeper drone (developed by Thales) for the British Army, flying out of West Wales Airport.  Members of Bro Emlyn For Peace and Justice and other local resident groups have been monitoring the ongoing drone development at Aberporth for the last ten years.

It is quite clear, with the opening of the new drone facility in Lincolnshire, that the British Government, like the rest of the world, is intent on maintaining and increasing the use of unmanned aerial vehicles by the armed forces in all current and future theatres of war.  However, it is also clear that there are other uses to which this technology can be put, including surveillance and social control within the home borders.

It is a fact that many police forces across the world have invested large sums of money in drones and are now using them on a regular basis.  It is also clear that there is now a lot of interest in the development of autonomous weaponry and surveillance systems using AI (artificial intelligence).  Such autonomous weapons are already in use in Afghanistan and elsewhere, in the form of anti-aircraft systems that do not have to wait for a human to give them an order to deploy.

There is a lot of discussion going on between the MOD and the Pentagon about the use of autonomous AI in drones and other robot development.  All of this research is partly driven by political aims, in that it is becoming less and less acceptable for young men to be seen coming back from the front in body bags or severely damaged by conventional war.

Many people argue that the spectre of autonomous UAVs and other forms of social control technology is just pie-in-the-sky sci-fi fantasy.  This is a dangerous delusion.  QinetiQ has its own AI division and has been linking its work with its UAV development for about seven years now.  The Pentagon, via DARPA, and the MOD are regularly discussing the ethical dilemmas associated with selling the concept of drones that take their own decisions to the politicians and the people.

In West Wales we have fought a lone battle against the developments at West Wales Airport and Parc Aberporth.  Local politicians, desperate to solve some of the economic problems caused by recession and unemployment, have been beguiled by the false promises of government agencies and arms manufacturers and have largely turned their backs on the issues outlined here.  It is time for the people to take action, and we need nationally co-ordinated support from peace activists for future events.

Scratchy Cats

Scratchy Cats

 

All your life collecting hats

Tumbling with the acrobats

Fingering aristocrats

People said that you were bats

 

Scratchy cats

Scratchy cats

Living with

Scratchy cats

 

In those clapped out council flats

Cups of tea pissed by gnats

Beaten up with baseball bats

Stiffed by fucking bureaucrats

 

Scratchy cats

Scratchy cats

Living with

Scratchy cats

 

All the Chinese technocrats

Fiddling taxation stats

Polyunsaturated fats

Diana’s kids are breeding brats

 

Scratchy cats

Scratchy cats

Living with

Scratchy cats


Artwork “Cat Scratches” by my daughter Sharon Rogers.  Click the picture to see more of her work.

Canary In A Bamboo Cage – Flash fiction format.

CANARY IN A BAMBOO CAGE

By Harry Rogers

When he was just a young man, barely twenty three, he thought he saw the whole, of human history, reflected in the clouds, as on Afton Down he lay, in August nineteen seventy, above Freshwater Bay.  Then, he carried his canary, in his bamboo cage, down the shining path, to the diamond studded beach, where the crystal waterfall, splashed on the silver rocks.  He took a shower there, in his south sea bubble loons, the spray was filled with rainbows, as he shook his yellow locks, his head still filled with last night’s Jim Morrison tunes.

Later on that evening, down near desolation row, inside the Circus tent, putting on a show, Boris, Nik and Dik Mik,  gave away free blow, he was very nearly certain he could hear the grasses grow.

The anarchists were liberating food stalls everywhere.  Bread heads and rip off merchants could only stand and stare.   French warriors gave free Mars bars to girls with flowers in their hair.  The police?  They turned a blind eye, they didn’t seem to care.

The smell of bedroom joss spilled out of 50,000 tents.  Some dealers were still cleaning up from teenage innocents, but mostly psychedelic drugs were given out for free, sugar cubes and blotters, mescaline and peyote.  Everything was going down, the fences and the sun, then Jimi hit the stage, beaming love at everyone.  As he played guitar, for the people on the hill, our hero tripped all night, badly, way outside his head.  His canary in its bamboo cage started looking ill, by morning the canary was definitely dead. 

There was no coming back from this nightmarish scene, now he was becoming, a burnt out old has been.  Most of six hundred thousand hippies on the Isle of Wight, danced ecstatic dances as they journeyed through that night.  But a few were lost there as their brains were reconfigured.  See them shambling, in the shadows, well and truly jiggered.   These casualties of Acid never knew what they were in for, as all of their canaries twitched and died upon the floor.

Some people think that this was once a truly golden age, and it was, provided that, like underground coal miners, you nurtured your canary, in its bamboo cage!

Short Story – The Singularitarian

A dystopian sci-fi short story about the singularity which, if futurologists such as Ray Kurzweil are to be believed, is almost upon us.

THE SINGULARITARIAN

by Harry Rogers

 

Alexander Heyking wasn’t quite sure why he had become so rabidly opposed to the whole idea of the singularity; he just knew he was.  He was neither a Luddite nor a technophobe, but after six years he found the total nerdiness of the other Singularitarians too much to bear any more.  Not only that, but the gloss of the future was definitely not as shiny as it had been at the Ray Kurzweil lecture screening back in 2006, when he was a student of futurology at Aberystwyth University.

Back then, the idea that there was going to be an incredible leap forward in computer power, capable of solving the ills of the world, was irresistible.  There seemed to be so many positive advantages in what the lecturer was saying that he signed up to join the Singularitarian Society in the Student Union as soon as he could.

It was at the first meeting of Singu Soc that he met Juliana Elliott.  She was astonishingly attractive to him and took absolutely no notice of him whatsoever.  It seemed that she was the girlfriend of the society chairperson, Luis Ray Ting.  There were only five of them at that two-hour meeting, and they spent the entire time discussing the Fermi Paradox: why was it that, even though there appeared to be a high probability that civilised extraterrestrial life existed, human beings had found a distinct lack of evidence for such life, and there had been no recognisable contact with other civilisations?

The title of the meeting was Fermi’s question “Where Is Everybody?”.  Luis opened proceedings by putting forward the basic tenets of Enrico Fermi’s argument: firstly, there are billions of stars in the galaxy that are billions of years older than our sun.  Secondly, some of these stars probably have planets similar to Earth which may have developed intelligent life.  Thirdly, interstellar travel would more than likely have been developed somewhere, given that humans seem likely to develop it.

Fourthly, the galaxy ought to have been teeming with colonisers within just a few tens of millions of years.  Given these parameters, Earth should already have been visited, if not colonised, yet no such evidence exists, and not one confirmed sign of intelligence has been spotted either in Earth’s galaxy or in the 80 billion other galaxies of the observable universe.  It is a conundrum that has been discussed incessantly since Fermi posed it back in 1950.

By the end of the meeting Alex’s brain was buzzing with ideas and he readily accepted an invitation to join the other four for a drink at The Scholars in North Street afterwards.  Luis had a battered VW camper van and he offered everyone a lift to the pub and ten minutes later Alex was sat in the corner of the top bar with a pint of Doom Bar bitter in front of him.  He wasn’t quite sure what the topic of the meeting had to do with Singularitarianism, but he was about to find out.  Luis started it all off by asking “Do we all think we are alone in the universe then?”

“I’m willing to believe we’re not,” said Juliana, “but in the absence of any proof I guess we might be.”

“What about you Alex, any thoughts?”

“I’m not sure there is life in outer space, at least not as we know it,” Alex responded.

“Ah, now we are getting to the nub of it.  ‘Not as we know it’ is what I hoped we would get to.  As Spock might have said to Captain Kirk, it’s life, Jim, but not as we know it.  Supposing we talk about intelligence rather than life; might that open a few more avenues for discussion?” said Luis.

“I see where this is going,” said Juliana “You are suggesting that there may be alien artificial intelligence.”

“I am and more than that I believe that the whole of planet Earth is under the influence of some interference that is indifferent to us as a species but which is extremely interested in our progress towards developing our own version of a singularity.”

Luis took a large swig from his beer, and Alex looked at him in a kind of reverential way.  “But if that is true, might we not have detected something from them: a random data transmission, or even an attempt to hack some government machine, or something?” Alex asked.

Luis put down the glass, turned to face Alex and, fixing him with a cold, steely gaze, said, “How do you know there hasn’t been such detection?  In fact, how do any of us really know that there hasn’t been any contact?  We only have the word of politicians and spooks that there has been no contact, and that is unreliable evidence in my book.  I don’t believe what the establishment tell us on this issue for a single minute.”

Alex felt decidedly uncomfortable as Luis continued to fix him with that unblinking look for another five seconds after he had finished speaking.

“How would we ever be able to find out whether there had been contact by an alien artificial intelligence?” asked Juliana.

“Ah well, I was coming to that,” said Luis.  “I was thinking that we might start doing some digging for evidence.  There is a lot of work being done in the field at the university, and I reckon we have a good chance of accessing quite a lot of sensitive information if we put our minds to it.”

So it was that they agreed to work as a team, hunting out information related to extraterrestrial contact from any source, with Luis as the central collecting point.  Over the following six years members of the group came and went, but Alex, Juliana and Luis continued the search for proof of contact.  Luis and Juliana were no longer an item, and Alex had got into a relationship with her on the rebound; they had been seriously dating for the last three years.

Luis had amassed a simply gigantic database of information as a result of the diligence of Singu Soc members; his doctoral thesis was nearing completion and he was on course for a stunning career in futurology.  Alex watched his progress with interest over the years and didn’t begrudge his use of the material gathered by his peers; in fact he admired the way Luis was able to extrapolate theories and ideas from what was largely a fragmentary fishing exercise.  But the truth was that there was still no proof of the existence of an alien singularity, not a single jot of verification, and Alex was becoming bored with the subject.

Why he continued to carry out any searches was something he couldn’t explain, but he found it hard not to do his usual trawl across the internet once a week, followed by the weekly email of titbits to Luis.  It was a kind of addiction, and he did it out of habit more than interest these days.

It was a Saturday, and Alex was taking Juliana to a gig at Rummers wine bar that evening.  He still hadn’t done the trawl, so at seven o’clock he quickly turned on his computer and started searching, deciding to look at the United States Defense Advanced Research Projects Agency (DARPA) website.  In particular he began interrogating pages relating to the Dynamic Analysis and Replanning Tool (DART) and the DRPI Knowledge-Based Planning and Scheduling Initiative, projects that had proved invaluable in revolutionising the logistics of the US military in Desert Storm and many conflicts since.  Interest in artificial intelligence was back on the military agenda in the US in a big way since DARPA had pushed the projects forward.

As he scanned the pages he came upon a reference to a paper on the Computational Theory of Mind, so he clicked the link, which led him to another on future-predicting error-correction codes, which in turn led him to brain-computer interfaces and neurochip technology.  As he surfed on he came across an interesting paper on the use of nootropic drugs linked to brain-computer interfaces.  This was fascinating: whilst he knew that there were students and lecturers using brain-enhancing nootropic supplements, he was unaware that anyone was researching how these new super drugs could be used in conjunction with brain-computer interfaces.  This was new to him, and he thought it might be of interest to Luis, so he bundled together a selection of links in an email and sent it off as per usual.

Later that evening, whilst he was slow dancing with Juliana in a dark corner of Rummers and listening to the local soul band he glanced across the dance floor to the entrance and was surprised to see Luis walking across the floor towards him.

“Look who’s turned up,” he said to Juliana.  “Can I get you a beer man?”

They both turned to face Luis and he said “No it’s OK, there is no time. It’s happened.”

“What do you mean, it’s happened?” Alex said.

“Contact,” he said.

“You can’t be serious,” said Juliana.  “Proof there is an alien singularity, after all this time?  Are you sure?”

“Yes, I’m certain,” replied Luis.  “Both of you must come to my study now, please; there is something you have just got to see.”

They picked up their coats and followed Luis out to his car as the band ironically started playing their version of the Noisettes song “Contact”. 

Once in the car, Luis turned to Alex and said, “Hey man, you know that email you sent me tonight cracked the whole contact process.  It just hadn’t dawned on me that we needed to open up some neural pathways in order to get through.  I missed the whole nootropic drugs connection, but now it just seems so obvious, so bloody obvious.”  As he finished saying this they arrived at the entrance to the university campus on Penglais Road; Luis turned the car in, and soon they were parked outside the Computer Science block.  He let them in, and they were quickly standing in his study.

Luis locked the door behind them, saying, “We don’t want to be disturbed while I show you this.”  He opened the computer on his desk, picked up one of three BCI (brain-computer interface) headsets lying there and placed it on his head.  He had been experimenting with a computer electroencephalograph for some considerable time, and the interface headset was his own design.  He typed in some data and the screen became an active sea of coloured dots, which gradually settled into an image of Luis.  “Watch this,” Luis said, and the image mouthed his words.  Alex and Juliana were fascinated by what they were seeing.  The image on the screen said, “I have been trying very hard to get to this point for weeks, but somehow I just didn’t seem able to jump the final hurdle.  It was only when you sent me the link to the paper on BCIs and nootropics that the cosmic tumblers all fell into place.  I realised that without some sort of booster there was never going to be enough power in my normal brain to get to this point.”

They both stared at the image on the screen, unaware that Luis was not actually speaking the words with his mouth: he was thinking them, and the image on the screen was speaking them.

“I managed to buy some strong enhancers from that hippy dealer in The Scholars, pretty cheaply if you ask me, and they gave me the extra power I needed to make the jump.  I have some more if you want to have a go; I’ve got a couple of extra headsets too.  You won’t believe what it can do for you.  Not only can I upload my thoughts directly into the computer, but also I can download information from the computer into my memory just by thinking about it.”

They were absolutely astounded by this and looked at each other in an awestruck way before Alex said “OK Luis, I’ll give it a go.” 

“Me too,” echoed Juliana.

“Alright,” said Luis.  “It will take about forty minutes for the nootropics to take effect, so if you swallow these now I’ll tell you about the contact.”  He handed them a couple of red and black capsules each and a bottle of water to wash them down.

As they swallowed the drugs Luis started talking to them via the computer screen again.  “This is such a breakthrough, I can’t quite believe it has happened.  Only six hours ago I was floundering about in the dark, sure that I was close to something, but not realising how close really.  As soon as I read the paper from the link in your email I shot out of here like a hare at a greyhound track, got the capsules, dropped a couple and within an hour I was through the portal into cyberspace.  I can only say that it is amazing.  I got an immediate boost of knowledge which enabled me to see things in a totally new way.  However, it was when I logged onto the internet that it happened.  I became aware that I was not alone on the super highway.  A strange intuitive feeling that there was another intelligence present engulfed me.  This feeling was not however malevolent, it was kind of like being in the presence of somebody who you know is much more powerful than you, whilst knowing that you are a complete irrelevance to them.  As soon as you join me you will see for yourselves.  Anyway, I just opened myself up to the stream of information on the web for about ten minutes before I came out to tell you both about it.  You have no idea what a phenomenal discovery this is, but I think it is only fair that we share in this together as we have been colleagues in the search right from the start.”

They had been listening intently and were completely sold on the idea of being partners in the future project.  Juliana said, “My God, Luis!  The ramifications of this are enormous; we are going to be the richest people on the planet.”

“Oh, the money is meaningless,” Luis replied.  “It’s the access to knowledge and power that fascinates me.  We will be able to become the saviours of the planet; nothing will be beyond our comprehension.  Poverty, hunger, illness, global warming, political skulduggery: we will be able to deal with it all.  Nothing will be beyond us.  We will be superhuman.”

Alex liked what he was hearing.  There were so many avenues they could go down, all because they seemed to be on the verge of the singularity.  He felt very proud of having contributed to what looked like the greatest scientific discovery of the last hundred years.

They sat there talking for the next half hour about all the things they were going to change, and then Luis said, “OK guys, I reckon that’s long enough for the drugs to come on.  Put these on and let’s get on with changing the world.”  He handed them their interface headsets.

They eagerly donned them, and as they looked at the screen it scrambled into the same myriad of coloured dots as before; as it settled they saw all three of them on the screen.  Alex was amazed that he could make his on-screen avatar speak just by thinking, as was Juliana, and they both started laughing at the same time.

Luis said, “Now, guys, before we set about accumulating all the knowledge on the planet, I want you to look at something.”  They watched as he conjured a window onto the screen.  It was an anagram solver program, and he slowly typed his name into it, L U I S   R A Y   T I N G, and pressed the solve button.  The letters scrambled and reformed into SINGULARITY.  Alex and Juliana looked at each other in confusion.

“What does this mean, Luis?” said Juliana.

“Guess,” said Luis as he started the process of emptying both their brains of every scrap of knowledge and information they had ever gained throughout their entire lives.  It took just 3.235 seconds for both brains to empty their riches into his.  He took the headsets off the two brain-dead husks that had once been Alex and Juliana, and switched off the computer.  It had worked out exactly as he had thought it would.  All of their knowledge now resided in his memory; he had become much more than the sum of the three brains he now possessed.  The birth of the singularity had started, only differently from the way the futurologists had imagined it.  Luis Ray Ting led the husks to his car, sat them in the front seats, started the engine and released the handbrake.  A small shove was all that was needed to send the car down the hill, gathering speed as it freewheeled into the traffic heading into Aberystwyth.

The singularitarian walked back to his office believing that he would soon be the most powerful entity on the planet.  The alien singularity that resided in Earth’s cyberspace hadn’t noticed him yet, but it was only a matter of time…

The Chilly Dogz – “Ray Bradbury Said”


This is the latest Chilly Dogz Tuesday session.  Every week we meet at my house and write a new song together.  I wrote this after seeing a documentary about the late sci-fi author Ray Bradbury, in which he said that he never bothered to carry out research for his stories because all the information he needed was in his head.  His fiction is fabulous and has been a favourite of mine for more than 50 years.

LYRIC:-

RAY BRADBURY SAID

IT’S ALL IN MY HEAD

RAY BRADBURY SAID

IT’S ALL IN HIS HEAD

 

HE TALKED IN TECHNICOLOR

NIGHT AFTER NIGHT AFTER NIGHT

HE TRANSFORMED LANGUAGE

INTO SOMETHING MEGA BRIGHT

HE COULD SEE THE AIR WE BREATHE

AND HE TOLD US ALL ABOUT IT

SHIMMERING WORLDS INSIDE HIS HEAD

HE MADE SURE WE KNEW ABOUT IT

 

RAY BRADBURY SAID

IT’S ALL IN MY HEAD

RAY BRADBURY SAID

IT’S ALL…


My Garden Girl

MY GARDEN GIRL


SHE’S MY GARDEN GIRL

SHE’S GOT POLLEN IN HER HAIR

SHE’S MY GARDEN GIRL

SHE PLANTS SMILES EVERYWHERE

 

LOOK ACROSS THE VALLEY

ON A WARM DAY

YOU’LL SEE MY GARDEN GIRL

ON A WARM DAY

SHE’LL BE BUSY PLANTING

ON A WARM DAY

OUTSIDE WITH HER RADIO

ON A WARM DAY

 

SHE’S MY GARDEN GIRL

SHE’S GOT POLLEN IN HER HAIR

SHE’S MY GARDEN GIRL

SHE PLANTS SMILES EVERYWHERE

 

LOOK ACROSS THE VALLEY

ON A HOT DAY

SHE’LL BE THERE AGAIN

ON A HOT DAY

WATERING THE VEGETABLES

ON A HOT DAY

OUTSIDE WITH HER RADIO

ON A HOT DAY

 

SHE’S MY GARDEN GIRL

SHE’S GOT POLLEN IN HER HAIR

SHE’S MY GARDEN GIRL

SHE PLANTS SMILES EVERYWHERE

 

LOOK ACROSS THE VALLEY

ON A WET DAY

SHE’S IN HER GREENHOUSE

ON A WET DAY

SOWING SEEDS IN COMPOST

ON A WET DAY

INSIDE WITH HER RADIO

ON A WET DAY

 

SHE’S MY GARDEN GIRL

SHE’S GOT POLLEN IN HER HAIR

SHE’S MY GARDEN GIRL

SHE PLANTS SMILES EVERYWHERE

 

LOOK ACROSS THE VALLEY

ON A COLD DAY

THERE’S MY GARDEN GIRL

ON A COLD DAY

DIGGING OVER FRUIT BEDS

ON A COLD DAY

OUTSIDE WITH HER RADIO

ON A COLD DAY

 

SHE’S MY GARDEN GIRL

SHE’S GOT POLLEN IN HER HAIR

I LOVE MY GARDEN GIRL

SHE PLANTS SMILES EVERYWHERE

COPYRIGHT: HARRY ROGERS – 25-11-11

This is a demo of a song I wrote for my partner Jenny, who just loves her garden.  It was rehearsed at Dolwion Mill in Drefach Felindre in January 2012 by the now defunct line-up of Seeing Red, who only ever played one gig.  I like it for its simplicity; it could do with a bridge and some extra overdubs, but hell, as a first-take demo it’s OK in my book.

Short Story – Go With The Flow

GO WITH THE FLOW

a short story by Harry Rogers

Maurice Warwick (Mo to everyone who knows him) sat on a stool at the counter in the saloon bar of The Prince of Wales with a pint of Guinness and a copy of the New Musical Express.  This had been a typical 1970s day for him.  In the morning he had visited a couple of friends in Blackheath Village for breakfast, where he had managed to sell a couple of rare Buddy Holly records he had bought at the Deptford record fair, making a profit of £15.  At midday he had gone to The Three Tuns for a lunchtime livener, where he met up with a couple of members of local punk rock sensations The Prannits, who were rehearsing in a famous rock star’s rehearsal studio just around the corner.  After a couple of pints he had gone with them to their afternoon session and watched them lay down two tracks for their next single release.  The session finished at six thirty and Mo walked to The Prince of Wales, where he hoped to meet a journalist who was going to buy one of his photographs for an article about Deptford Fun City, as the South London music scene was known.  A typical day for Mo, who always knew how to go with the flow.

The bar was empty save for one geezer sat at the other end.  Mo looked up to see him looking at him in a strange way, kind of eyeing him up and down.

Mo said “Hello, is everything OK here?”

“I’m sorry,” the stranger said, “I was deep in thought.”

“I thought you were looking at me.”

“Well I was but not in any threatening way, I’m sorry if you thought otherwise.”

“Oh OK, that’s alright then,” said Mo, and returned to the article about Bruce Springsteen in the NME.

“Let me buy you another pint,” offered the stranger, “I didn’t mean to upset you.”

“Well that’s very nice, I won’t say no,” answered Mo.

The stranger came over and sat on the nearest empty bar stool.  “Guinness, is it?”

“Yes thanks, cheers.”

He ordered the beer for Mo and then introduced himself, “My name’s Billy Bleasdale.”

“I’m Mo.”

“Do you live near here?” asked Billy

“No man, I’ve got a short term flat in Deptford on one of those estates that the council are going to demolish soon, so they say.”

“I live in Kidbrooke with my wife.  Would you like to see some pictures of her?”

“Yeah, why not.” said Mo

Billy handed him a Kodak paper envelope containing half a dozen photographic prints.  Mo took them out and looked at the first one: a portrait of a pretty, dark-haired young woman of about 25, looking confidently out at him.  He slid the picture to the back of the pile and looked at the next one; this time she was standing outside a low-level block of flats, wearing a miniskirt and smiling in a very beguiling way.  As he shuffled through the pictures he grew more interested in her.  Then he reached the last one in the pile.  This time she was standing in front of a fence at what Mo guessed was an allotment, and this photograph was noticeably different to all the rest: she had her skirt pulled up to her chest and was wearing no underwear.  Mo carefully replaced the photographs in the yellow envelope and handed them back to Billy, remarking, “Hmmm, very nice.”

“Would you like to meet her?  My wife, would you like to meet her?”

“Listen,” said Mo, “what is this all about, eh?  What’s going on here?”

“Oh, I’m sorry,” said Billy, looking a bit sheepish.  “I suppose I had better explain.”

“Yes,” said Mo, “that might be a good idea.”

“Well, as you can see, my wife, Jamie Lee, is quite a bit younger than me.”

Mo looked at Billy and guessed he must be about 35 years old.

“Yes, I can see that,” said Mo.

“Well the point is, I won’t beat about the bush, we have been together since she was sixteen, in fact I married her at sixteen when I was twenty-four.  She is twenty-six now and we have been happily married for ten years.”

“Yes?” queried Mo.

“Well about a fortnight ago we were sitting up in bed, you know, talking, like you do, and she says she wonders what it would be like making love to another man.  I was a bit taken aback I must admit but she said I wasn’t to think that she was unhappy or anything but she just wondered, that was all, especially seeing as how she had never been with anybody else apart from me.  So I kind of looked at her and thought to myself, well you know, I thought, I’ve had other women before her, and it just didn’t seem fair, really.  Her not having had any other experience, what with being so young at the start and all.”

“I see, or I think I do.  Go on.”

“Well I thought about it and then a couple of days ago I said to her that I thought it only fair that she should try making love with somebody else, just so that she could put her mind at rest, so to speak.  She said that she wouldn’t want to go out and find somebody herself because that somehow wouldn’t seem right.  So after a bit more chatting we agreed that I would go out and find someone who might do the trick for us.  So here we are, and I ask you again: would you like to meet my wife?”

Mo picked up the fresh pint of Guinness and took a long swig of it before looking at Billy, who was staring at him in an earnest way.  He thought to himself why not, after all he would be doing them both a favour and she was very attractive, in fact he felt kind of flattered at having been selected for such a task. He thought to himself, go with the flow.

“OK,” said Mo, “you’re on.  When do you want me to come round?”

“Well, I was wondering whether you might make it tonight, actually,” said Billy.

“Well I haven’t got anything else planned, so the answer is yes, why not tonight.”

“Great,” said Billy, “I’ll just give Jamie Lee a ring and let her know we are on our way.”

He went over to the payphone in the passageway near the toilets and made the call home whilst Mo finished off the Guinness.

Billy came back and said “My car is parked up just outside, my place is only a few minutes away, Jamie Lee says she is so excited and will be ready and waiting for you.”

They left the pub and climbed into Billy’s slightly battered two tone Austin Cambridge.  Twenty minutes later Billy parked up outside the flats and they walked up the stairs to the first floor balcony where he lived with Jamie Lee.  It was just after eight o’clock and the sun was almost gone, the sky turning that wonderful streaky red colour that you only get in cities with a lot of particulates in the air.

Billy opened the door and called out, “We’re here, Jay.”

Mo followed him into the flat and the first thing that struck him was the smell of strong bedroom joss as he entered the hallway.  It was sparsely furnished with a small telephone table and wooden chair, a set of cast iron Victorian coat hooks and a Paul Klee framed print on the primrose yellow painted wall.

Billy asked Mo if he wanted a cup of tea before meeting Jamie Lee but, as he had not long finished drinking Guinness, he declined the offer.  “Oh OK then, well she is waiting to meet you in here,” and he ushered Mo into the first door off the hallway.  The scent of the joss was much stronger, almost cloying but not unpleasant; as an established hippy Mo was used to the smell of incense and liked it well enough.  He looked into the room, which was lit only by three large multicoloured candles, and as his eyes got used to the low level of light he saw that this was in fact a bedroom.  The walls were papered with a deep red paisley patterned paper by Osborne and Little and there, in an ornate brass fluted bedstead, naked and wrapped in a white cotton sheet, Mo made out the face of Jamie Lee looking intently at him.

“Come in,” she said, in a matter of fact way.  Mo shuffled his feet a bit before Billy gently eased him forward.  Jamie Lee looked at Billy and nodded her head approvingly “He looks good Billy, real good, you chose well.”

Mo felt very awkward but before he could say anything Billy said, “Well I’ll leave you both to get to know each other, I’m going to watch the rest of the European Cup match on TV.  Have fun,” and with that he backed out of the room, gently closed the door and left the two strangers looking at each other.

“Well don’t just stand there,” she said, “take your clothes off and come over here,” and she peeled back the bedclothes on one side of the bed and patted the sheet.  Mo was still a bit flustered but he took off his shoes and socks, dropped his loons and pants and stepped out of them, pulled his cheesecloth grandad shirt over his long hair and dropped it onto his trousers.  He walked across the room and carefully slid into the bed beside Jamie Lee.  He started to say something but before he could utter a word she put her forefinger to his lips and slid her arms around him, pulling him towards her.  He put one hand on her back and was immediately aware of a lissom body that was electrified with erotic expectation.  They spent the next hour and a half making exquisite love three times and she was ecstatic in her satisfaction, whispering “Thank you Mo, that was just wonderful.”  He asked her if he could light a cigarette and she said he could, and she would have one too.  He got out of bed, took two Marlboros from his pocket and lit them both.  As they lounged in bed quietly smoking the cigarettes there was a soft knock at the bedroom door.  “Come in Billy,” she said, and the door opened and Billy came in carrying a tray with a teapot with a cosy, two cups and saucers and a plate of chocolate biscuits.

Billy looked at Jamie Lee in that hopeful kind of way that pet dogs do when they are looking for approval.  She looked at him, smiled broadly and just slowly nodded to him.  He left them to the tea and closed the door behind him again.  Mo drank a cup of tea whilst Jamie Lee made small talk with him about his love life.  Mo got dressed quite quickly, then leaned over and kissed her passionately.  He knew this was the last time he would see her, as it was crystal clear to him that she was totally in love with Billy.  Billy came back in and Mo was dressed and ready to go.

Jamie Lee looked at Mo and smiled at him as she said, “Thank you so much for a wonderful evening, I’ll never forget you.  I just had to make sure I am with the right man, and now I’m certain of it.”

Mo smiled and Billy said, “OK then, where would you like me to drop you off?”

There was a late night gig starting at midnight down in Deptford so Mo said, “Take me to The Albany Empire please.”  They travelled in silence to Creek Road and as Mo was getting out of the car outside the venue Billy handed him an envelope and said, “Thank you very, very much for doing that for us, now we can get on with the rest of our marriage with no ifs or buts.”

Mo smiled and, looking Billy squarely in the eye, said, “No problem, the pleasure was all mine.”

As Billy drove away Mo opened the envelope and inside he found ten crisp five pound notes.

He entered the gig grinning and muttering to himself “Go with the flow, always go with the flow.”

(Any resemblance to anybody living or dead is entirely coincidental.)

Hungry Fools – A poem for merry pranksters everywhere

HUNGRY FOOLS

Hungry fools
Break the rules
Make the tools
Challenge schools
Ride off roads
Crack the codes
Lick the toads
Shed the loads
Fly the kites
See the sights
Light the lights
Build the rights
Turn up late
Tell it straight
Wipe the slate
Fuel the grate
Mine the coal
Rock and roll
Feed the soul
Keep Earth whole
Hungry fools
Stay hungry
Stay foolish
Hungry fools

Hunting Lizards In The Long Grass

HUNTING LIZARDS IN THE LONG GRASS

WAKE UP IN THE MORNING

THE SUN IS SHINING BRIGHT

GROWN-UPS ARE STILL SLEEPING

ANOTHER DRUNKEN NIGHT

 

GET SHORTS AND SANDALS ON

A SLICE OF BREAD AND HONEY

OUT THE DOOR WITH FISHING NET

AND A JAM JAR ON A STRING

 

GO DOWN TO THAT WASTE LAND

NEXT DOOR TO THE HARBOUR

WHERE THE BOATS ARE BOBBING

AND ALL THE FLAGS ARE FLYING

 

GREEN BACKED BEETLES GLEAMING

THE SEAGULLS SHRILLY SCREAMING

HUNTING LIZARDS IN THE LONG GRASS

WHILE DREAMS ARE DAILY DREAMING

 

I KEEP THINKING ABOUT THE NIGHT BEFORE

ON MY OWN OUTSIDE THE SALOON BAR DOOR

 

WITH A BAG OF CRISPS AND A GLASS OF LEMONADE

THE WAY THEY LEFT ME ON MY OWN AGAIN

 

ON MY OWN AGAIN

ON MY OWN AGAIN

 

CORNFLOWERS BLUELY SWAY

GRASSHOPPERS CLICK ALL DAY

HUNTING LIZARDS IN THE LONG GRASS

WILL TAKE THE PAIN AWAY

 

BACK TO THE BUNGALOW

EVERYONE IS UP AND DRESSED

LOOK AT THE JAR OF LIZARDS

THEY SEEM MILDLY IMPRESSED

 

HUNTING LIZARDS IN THE LONG GRASS

ALWAYS TAKES THE PAIN AWAY

HUNTING LIZARDS IN THE LONG GRASS

ALWAYS TAKES THE PAIN AWAY

Copyright: Harry Rogers, Aberbanc, 9th October 2011

One Way Ticket To Mars

ONE WAY TICKET TO MARS

I had a ginger cat that sometimes killed a mouse

I had a shiny car parked up outside my house

I had a steady job that I really thought would last

Now everything has gone, relegated to the past

My dreams and aspirations have all come tumbling down

Wife and kids live with her mother in another part of town

The only thing that’s left now I’ve landed on my arse

Is to try and get myself a one way ticket to Mars

A place where I’ll be blinded by the brightness of the stars

Where there will be no driving round in comfortable cars

No more late night drinking in my favourite bars

No more gigs down Deptford with those sweet, sweet guitars

It’s time I got myself on that one way ticket to Mars

There are no rotten human beings there to treat me so unkind

All those problems down on Planet Earth?  I’ll leave them all behind

I’ll climb into that rocket ship and become a pioneer

Things can’t be any worse up there than all this shit down here

I’ve got no cash inside my pocket

So strap me up into that rocket

I’ll take that one way ticket to Mars

I wanna one way ticket to Mars

I’m on a one way ticket to Mars

A One…….. Way……. Ticket

See ya……wouldn’t wanna be ya!

http://www.bbc.co.uk/news/science-environment-22146456#

Flying From The Sunset

FLYING FROM THE SUNSET

If you are stuck within a whirlpool
Spinning way out of control
I will throw you a lover’s lifeline
To pull you out of that hole

Come along with me
Flying from the sunset
We can both be free
Flying from the sunset

Let me pull you in to shore
I am still within your reach
If you look above the waves
I am standing on the beach

Come along with me
Flying from the sunset
We can both be free
Flying from the sunset

We’ll go watching moonbeams
As they bounce across the bay
Then come tomorrow morning
We can start a brand new day

Come along with me
Flying from the sunset
We can both be free
Flying from the sunset