Watch the debate on the introduction of the report on Lethal Autonomous Robots at the UN by clicking on the following link:-
Good afternoon comrades, friends and colleagues. Can I start by saying how pleased I am to see this conference taking place and can I take this opportunity to thank the organisers for inviting me to share the platform today with such distinguished friends. My name is Harry Rogers and I live in West Wales about 12 miles from Aberporth where the MOD have been carrying out their tests on the Watchkeeper Drone.
I am sixty-five years old and I almost never got born because of a UAV. My mother and my aunt were in the back bedroom over the top of my grandfather's public house in West Croydon in June 1944, just ten seconds before one of Hitler's V1 doodlebug flying bomb drones blew the back of the pub clean off. Ten seconds earlier and I would never have been born. My point here is that UAVs have been around a long time and the Watchkeeper is nothing really new in concept. A lot has been written about this already almost obsolete piece of Army surveillance equipment so I won't add to the reams you can find on the internet.
I am a member of a local peace group called Bro Emlyn For Peace and Justice, which formed in 2003 as a response to the decision by Bush and Blair to attack Iraq. During that time we have been involved in a number of campaigns, including the campaign against the development of Cardigan Airport as a testing ground for UAVs, against the proposed introduction of an unmanned aerial systems technology hub at Parc Aberporth, and against the management buyout of the MOD missile testing base at Parcllyn by QinetiQ.
There is much I could talk to you about concerning the history of this campaigning but that is not why I agreed to come here today. The past is something that we can learn from but not something that we can undo. My speech today is primarily about the future of drone technology and why it is vital that we all start to pay attention to what the research and development bods at MIT and DARPA, QinetiQ and BAE Systems are cooking up for the future. I make no apologies for basing most of this speech on a report on the future of Unmanned Aerial Systems made by the RAF to the Government in October 2012.
I am interested in ensuring that we all go away from today with an understanding that the big issue we face in terms of UAV development is that of autonomy. The next generation of drones may well be able to think for themselves and act autonomously, that is, without any human in the loop, as the military say. That means flying machines that can make their own decisions in search and destroy operations based on a set of algorithmic decisions pre-determined by human masters who may or may not be benign in their intentions.
Sceptics amongst you are already muttering balderdash and hokum, science fantasy and other such epithets. Well, virtually all of the rest of this speech will, I hope, convince you otherwise.
* The RAF say that “As UAVs are developed with increasing levels of automation it will reduce the requirement for operator training in the more traditional piloting skills of flying the aircraft, such as landing and takeoff, and focus the training more towards operating the payload.”
* The MOD, and in particular the RAF, are discussing issues about the levels of automation they are comfortable with. They recognise that highly automated weaponry systems are unlikely to be able to apply judgement and pragmatism to "situations". They are worried about the legal and ethical considerations that arise when there is no human being in the loop and loss of life or injury results.
* The future is one where outdated weaponry will give way to what they term “precision weaponry” and where the battle-spaces increasingly involve unmanned and cyber operations.
* Autonomous weapon systems are capable of understanding higher-level intent and direction, and of perceiving their environment, leading to the ability to take appropriate actions to bring about desired states. They are able to decide on a course of action, from a number of alternatives, without depending on human oversight and control. Although the overall activity of an autonomous unmanned aircraft will be predictable, individual actions may not be.
* Low intensity tasks for autonomous UAVs requiring minimal human oversight rather than continuous control include the following:-
– pattern of life surveillance tasks over fixed locations or in support of littoral manoeuvre
– maintenance of standing anti-submarine warfare or anti-surface warfare radar barriers
– counter-piracy tasks; monitoring of arrays of sonobuoys or other sensors
– electronic warfare tasks
– acting as a communications relay
– air-to-air refuelling tankers
* The RAF consider UAVs ideal for use in environments that are hostile to a manned aircraft or its crew, such as operations involving Chemical, Biological, Radiological and Nuclear threats. Tasks include the following:-
– carrying sensors for local, tactical or global use
– systems easily sacrificed in a safe area after data gathering
– use in situations where fire and smoke make human activity hazardous
* In risky situations UAV systems can be used instead of aircrew or soldiers where the threat of ground to air action is high and also where it is necessary to suppress an integrated air defence system.
– multiple cheap UAVs used sacrificially to swamp detection and command and control systems
– to encourage enemies to fire large numbers of missiles
– observe engagement tactics and transmit data back to intelligence collators
– convey tactical supplies
– sweep for improvised explosive devices
* Of course UAVs are used in scenarios which are highly distasteful, and we all know that they are used for surveillance, for targeting, and for carrying out weapon attacks inside the borders of countries such as Pakistan and Palestine. All of this is currently done with "man in the loop" systems, but it will not be too long before drones are flying with what the RAF calls "the ability to independently locate and attack mobile targets, with appropriate proportionality and discrimination."
The RAF are worried about issues relating to the Geneva Convention with regard to autonomous UAV development and usage and they say that “compliance will become increasingly challenging as systems become more automated. In particular, if we wish to allow systems to make independent decisions without human intervention, some considerable work will be required to show how such systems will operate legally.”
Already there are automated weapons systems in use in Afghanistan, for example the Phalanx and Counter-Rocket, Artillery and Mortar (C-RAM) systems, used because there is deemed to be insufficient time for a human response to counter incoming fire.
Future autonomous UAV systems will have to adhere to legal requirements and civilian airspace regulations. This will require political involvement in getting the necessary changes made. In my view it is absolutely vital that politicians understand the issues clearly, because once those changes are made there is going to be a lot of military and civilian hardware flying about in the airspace without any human interface whatsoever. The RAF say "As systems become increasingly automated, they will require decreasing human intervention between the issuing of mission-level orders and their execution." They go further, saying "It would be only a small technical step to enable an unmanned aircraft to fire a weapon based solely on its own sensors, or shared information, and without recourse to higher, human authority." They also discuss the timescale for the introduction of increased autonomy via Artificial Intelligence: "Estimates of when artificial intelligence will be achieved (as opposed to complex and clever automated systems) vary, but the consensus seems to lie between more than 5 years and less than 15 years." Their words, not mine.
Currently the MOD “has no intention to develop systems that operate without human intervention in the weapon command and control chain, but it is looking to increase levels of automation where this will make systems more effective.”
The RAF are clearly worried about the direction all this is going in, and they say "As technology matures and new capabilities appear, policy-makers will need to be aware of the potential legal issues and take advice at a very early stage of any new system's procurement cycle." I believe this highlights a degree of paranoia on the part of the RAF vis-à-vis its own future role.
Not only are the RAF exercised about legal dilemmas. Questions of ethics and morals are also in their thoughts, such as when, where and how automated and autonomous unmanned systems may be used. This applies not just to the use of drones, of course, but also to all other forms of weaponry in any environment. Will all future wars be fought remotely, with little or no loss of friendly military personnel? Will future conflicts be waged between increasingly complex unmanned systems?
In my view a problem that we face today is that the accountants have control of governments, and the most expensive resource used in public services is human beings. So autonomy offers massive savings in manpower, and in support for that manpower both before and after conflict occurs. As artificial intelligence comes on board we are likely to see more complicated tasks arising that are beyond the capability of humans to deal with due to speed, complexity and information overload. No doubt some of you are probably suffering that now, but I only have a bit more to say before I take questions, so bear with me.
The RAF and many others in the field are grappling with issues such as whether it is possible to develop AI that has the capability to focus on the unique (at the moment) ability that a human being has to bring empathy and morality to complex decision-making. The RAF say “To a robotic system, a school bus and a tank are the same – merely algorithms in a programme – and the engagement of a target is a singular action; the robot has no sense of ends, ways and means, no need to know why it is engaging a target. There is no recourse to human judgement in an engagement, no sense of a higher purpose on which to make decisions, and no ability to imagine (and therefore take responsibility for) repercussions of action taken.”
So we need to pose the following questions to our politicians:-
Are they happy to allow Autonomous Robots to take on the responsibility of choosing which of us lives and which of us dies?
Can an autonomous robot be considered capable of waging ethical and legal warfare?
Can software and hardware accidents be distinguished from war crimes when robots go wrong?
Is it possible to establish clear policies on acceptable machine behaviour?
How far out of the bottle is the technological genie?
Are we doomed to a Terminator style future?
Is it possible to have a sensible debate about technological development anymore?
Do we really want Artificial Intelligence with a greater capacity to think than a human to be in any way involved in future theatres of War?
The RAF pose very important questions when they ask the following:-
– Do military planners and politicians understand the full implications of the systems they are currently tasking and those they hope to procure?
– In the current economic climate, who will decide the best balance between keeping existing equipment and personnel, or whether to give these up to fund new unmanned systems?
– Do we understand even the basic implications of such decisions for the associated defence lines of development?
– Crucially, do we have a strategic level of understanding as to how we will deliver the considerable number of changes that will need to be made to existing policy, concepts, doctrine, and force structures?
* Finally we must expect to see governments bringing in changes to the law of armed conflict in order to accommodate the use of autonomous UAS and we must shout our opposition to this from the rooftops.
So there we have it, friends: there is a lot for us to consider when we look into the issue of drones. We have to view all of this in a holistic way. That is, we must not just lumber along from one demo to the next thinking only about the impact of the current use of drone warfare, terrible though that is. It is absolutely vital that we start to consider what future implications there are for the maintenance and development of basic human rights. We must also consider what the use of technology means in terms of social control measures, such as we see the beginnings of in Gaza and elsewhere. I believe we are at, if not already past, a dangerous turning point in the way we occupy this planet.
It is incumbent on us all to make a fuss about these issues if we want a planet where human rights are protected. If we don't, then we condemn the world to a dystopian future where all kinds of as-yet-unthought-of technology are used to maintain the position of a global elite above and beyond that of the majority of the people. The choice is ours: make a ruckus, or bury our heads in the sand and wait for the worst possible sci-fi future to engulf us all.
Thanks for listening, I’m happy to take questions and take part in debate.
Harry Rogers 24/05/2013
On Saturday afternoon last week I attended a small demonstration organised by local Quakers outside the MOD base in Parcllyn, just above Aberporth. There were only ten activists there on this cloudy day, plus one desultory police car parked 50 yards down the road.
MOD Aberporth is renowned for having been one of the main missile testing ranges in Britain. However it has effectively been privatised and, following a management buyout, it is now run by QinetiQ. QinetiQ is a research company with many links to the MOD.
There have been a number of arms sales fairs held at Parc Aberporth, in particular focused on UAVs and UCAVs. Since 2010 the base has been involved in the testing of the Watchkeeper drone (developed by Thales) for the British Army, flying out of West Wales Airport. Members of Bro Emlyn For Peace and Justice and other local resident groups have been monitoring the ongoing drone development at Aberporth for the last ten years.
It is quite clear, with the opening of the new drone facility in Lincolnshire, that the British Government, like the rest of the world, is intent on maintaining and increasing the use of unmanned aerial vehicles by the armed forces in all current and future theatres of war. However, it is also clear that there are other uses to which this technology can be put, including surveillance and social control within the home borders.
It is a fact that many police forces across the world have invested large sums of money in drones and are now using them on a regular basis. It is also clear that there is now a lot of interest in the development of autonomous weaponry and surveillance systems using AI (artificial intelligence). Such autonomous weapons are already in use in Afghanistan and elsewhere, in the form of anti-aircraft weapons that don't have to wait for a human to give them an order to deploy.
There is a lot of discussion going on between the MOD and The Pentagon about the use of autonomous AI in drones and other robot development. All of this research is partly driven by political aims in that it is becoming less and less acceptable for young men to be seen coming back from the front in body bags or severely damaged by conventional war.
Many people argue that the spectre of autonomous UAVs and other forms of social control technology is just pie-in-the-sky sci-fi fantasy. This is a dangerous delusion. QinetiQ has its own AI division and has been linking its work with its UAV development for about seven years now. The Pentagon, via DARPA, and the MOD are regularly discussing the ethical dilemmas associated with selling the concept of drones that take their own decisions to the politicians and the people.
In West Wales we have fought a lone battle against the developments at West Wales Airport and Parc Aberporth. Local politicians, desperate to solve some of the economic development problems caused by recession and unemployment, have been beguiled by the false promises of government agencies and arms manufacturers and have largely turned their backs on the issues outlined here. It is time for the people to take action, and we need nationally co-ordinated support from peace activists for future events.
German television station ZDF2 came to West Wales and made this item on UAS development at Aberporth. It is in German. Click on the following link to see the Item.
BBC Radio Four aired this show today and it demonstrates that there are some serious voices with major concerns about the development of autonomous killer machines. This is well worth a listen and I urge all readers to share this link.
“Robots probably won’t take over the world, but they probably will be given ever greater responsibility. Already, robots care for the elderly in Japan, and drones have dropped bombs on Afghanistan. Professor Noel Sharkey fell in love with artificial intelligence in the 1980s, celebrated when he programmed his first robot to move in a straight line down the corridor and, for many years, judged robot wars on TV. Now, he thinks AI is a dangerous dream. Jim al-Khalili hears how Noel left school at 15 to become an electrician’s apprentice and amateur rock musician before graduating as a Doctor of Psychology and world authority on robots, studying both their strengths and their limitations.”
This video from Human Rights Watch is interesting but a bit behind the curve, in that there are already autonomous weapons deployed in Afghanistan and elsewhere (more on this in a future blog). However, they are right that the issue should exercise everybody who is concerned about the abuse of surveillance and weaponry in an increasingly weaponised and autonomous world.