Northumbria University
Law School

Word count: 16,463
This Project is submitted for the qualification of MLaw
Date of Submission: Monday 11th May 2020

Research Question: "The laws surrounding responsibility and accountability of autonomous weapons systems are insufficient: An analysis of legal and ethical implications of autonomous weapons systems."

Name: Sophie Quince
Supervisor: Professor Christopher Newman

Research Declaration
I confirm that I have already submitted my Project Synopsis and Ethical Approval Form, which has been signed by my supervisor. I further confirm that this project is entirely my own work and that the research undertaken for the completion of this project was based entirely on secondary material or data already in the public domain (case law, journal articles, published surveys etc.). It did not involve people in data collection through empirical research (e.g. interviews, questionnaires or observation).
Signed: Sophie Quince
Dated: 11/05/2020

Table of Contents:
Introduction
Chapter 1: Self-Awareness of Autonomous Weapons (Development of AWS)
1.1 Defining Autonomy in Weapons
1.2 Self-Awareness in AWS
Chapter 2: Responsibility and Accountability
2.1 Criminal Responsibility
2.2 Developer/Manufacturers' Liability
2.3 State Liability
2.4 Command Responsibility
2.5 The Accountability Gap
Chapter 3: Legal and Ethical Implications
3.1 International Humanitarian Laws
3.2 Distinction
3.3 Proportionality
3.4 Human Rights Agreements
3.5 Ethical and Moral Implications
Conclusion
Bibliography

Introduction:
This dissertation looks at the law surrounding responsibility and accountability for Autonomous Weapons Systems (AWS). The basic hypothesis for this work is that the laws are insufficient and that the legal frameworks at both national and international level are not equipped to deal with the challenges posed by AWS. This dissertation will examine the legal and ethical challenges of these weapons systems, with the aim of posing some solutions to these challenges. Throughout history there has been a race to make technological advancements, some of which are merely trivial, such as a light that is activated by someone clapping or a device that will recite your daily itinerary when you simply say "good morning, Google"; however, more serious advancements can be seen in computer technology and life-saving medical equipment. At some point during the development of all this technology, it is likely that developers had to contemplate the legal and ethical implications of mass-producing these products for consumer use. However, what if we were to consider developments in technology that have lethal application, limited human control and the potential to be fully autonomous in the future? The development of such technology is currently a reality, and it is already being used in combat.
Therefore, this dissertation aims to answer the question of whether or not the laws regarding responsibility and accountability of autonomous weapons systems are sufficient, and what the legal and ethical implications of this technology are. The majority of the materials available on autonomous weapons indicate that full autonomy is yet to be attributed to robots; however, contrary to this evidence, one of the most distinctive fears surrounding this topic is the implication of such autonomy occurring and how it would affect humanity. Fear of the unknown is not an unusual human trait, and the preservation of life is a global value; because of this, the concept of human dignity is seen as one of the most important human rights, with the rules of war being implemented in order to reduce civilian casualties as well as unlawful action against soldiers in times of war. The new generation of autonomous weapons threatens such concepts and opens up a gateway into unknown territory where AWS technology is not solely dependent on human operation. An interesting article was published in September 2019 in which the ex-Google engineer Laura Nolan disclosed how she had resigned from Google the previous year in protest at being sent to work on a military drone project. Nolan stated that killer robots had the potential to do "calamitous things that they were not originally programmed for". This article caught my attention for several reasons. First of all, it highlighted how fear of the unknown is shaping the way people view AWS, and secondly that so-called 'killer robots' merely have the potential to do calamitous things, not that they are doing calamitous things. The latter point is crucial to unravelling the questions that I am posing for my dissertation, as it indicates there are gaps in the understanding of both the technology itself and the legislation and definitions surrounding it.
The subject of autonomous weapons is fascinating to me and academics alike, particularly because, whilst there is a willingness to develop such technology, there is a level of ignorance when it comes to the legality and ethics of such progress. Whilst this is the case, it is encouraging to note that the issues identified have previously been discussed by academics such as Rebecca Crootof and Robert Sparrow, as well as legal figures such as Christof Heyns. I found that these figures provided me with some stimulating perspectives on the subject that I hope to incorporate throughout my work. Throughout this dissertation the intention is for there to be an in-depth discussion regarding the law of AWS, including a conversation regarding the definitional development and the ambiguities surrounding this. Moreover, great consideration will be given to the responsibility and accountability of such technology, as well as to the ethical and legal implications that have emerged and those that could develop in the future. It is hoped that problems will be identified and discussed for each element of this dissertation, together with a discussion of what the implications of such problems are.

Methodology:
The chosen methodology of my dissertation provides a framework that allows the question at hand to be appropriately broken down into subsections which, combined, allow the opportunity to answer the research question. The information that I will collect throughout my research will be desk-based and will be a combination of peer-reviewed journals, news sources and legislation. My initial starting point has been the website of the 'Campaign to Stop Killer Robots'; this provided me with an important insight into the main issues as well as an extensive list of reputable sources.
This is particularly important due to the rising tensions surrounding AWS, which inevitably mean there will be contradictory opinions and articles that are perhaps not fully grounded in fact; because of this, the validity and provenance of the sources will be imperative. This dissertation will, by and large, not incorporate the traditional doctrinal methods of research; instead it will look at the societal and cultural impact this technology has and the various writings around this. One of the main reasons for this is the fact that there is not much 'black letter' law surrounding this area, which is perhaps, in some ways, one of the problems. By integrating this research method into my dissertation, I hope it will allow for a discussion of how the law of AWS, or indeed the lack of laws for AWS, has implications for society, therefore linking to the ethical considerations embedded within my research. In terms of structure, I have broken the dissertation down into three chapters, each of which provides a different outlook on why the laws surrounding AWS are failing to address the issues of responsibility and liability, and how such failings have created ethical and legal problems in the process. The first chapter, 'Self-Awareness of Autonomous Weapons (the Development of AWS)', is designed to provide some background on the topic and discuss some potential definitions of AWS. By having this as the first chapter I am able to immediately address the underlying issue that runs throughout my research: the definitional ambiguity involved in AWS. Whilst definitions may seem a somewhat trivial area, my research led me to believe that this ambiguity is at the core of the issue. Chapter Two is 'Responsibility and Accountability of Autonomous Weapons Systems'; this will provide a large portion of the dissertation and will examine and compare the arguments for and against possible applications of responsibility and accountability.
I hope to discuss a number of parties who are exposed to potential liability and the practicalities of holding them liable. The comparative nature of this chapter allows me to expand my research to international law in order to identify possible ways in which legislation could be adapted to sufficiently address the accountability gap that is currently present. The final chapter will expand on the previous two chapters, discussing the legal and ethical implications of using AWS. There are some arguments that this technology creates a more precise and humane method of warfare; however, others suggest that it actually creates a human rights dilemma and that it should not be an acceptable method of combat. This chapter will analyse these arguments and will expand on the concepts of responsibility and accountability using a comparative approach. The structure of my research will allow each chapter to complement the others and the main research question to run throughout.

Ethical Considerations:
I have considered the ethical dimension of this project and, due to the fact that it is desk-based research, my supervisor and I feel there is no need for any additional ethical safeguards. However, this will be monitored throughout the life cycle of the project and, if any potential ethical issues arise, they will be brought to the attention of the appropriate committee.

Outcomes:
The intended outcome of this piece of research is to provide a discussion regarding the practicality of allocating responsibility and accountability for the actions of AWS and whether or not current law provides adequate provision for this. Following a review of the literature, I hope to be able to identify an area that has the potential to be adapted in order to amend any inconsistencies or failures in the law as it stands.
As previously mentioned, chapter one discusses the definitional ambiguity of AWS; this is an element that I expect will run throughout my research, as these failings are embedded in every element of the topic. I hope to prove that by creating a universal definition of AWS it is possible to create a set of comprehensive laws that can govern this technology. Furthermore, I hope to identify how the inability to allocate responsibility or accountability for the actions of AWS will create ethical issues as well as legal issues relating to international humanitarian law and human rights.

Chapter One: Self-Awareness of Autonomous Weapons (the Development of AWS)
Having outlined the initial areas of this enquiry, this chapter is going to address the first part of the research question and discuss the development of these weapons systems, as well as considering the definitional discussion surrounding this technology. Recent years have been host to increasingly rapid technological advancements that incorporate varying levels of autonomy into machinery and robots; such advancements are shaping the way in which the modern world functions. Technology is being pushed out of the realms of the impossible and into one of the most crucial debates of this century. The human ability to create biotechnology and to manipulate DNA provides examples of the human capability to engineer technology and genetics in order to enhance the human body, with the human brain being the epicentre of such human intelligence.
However, there have been extreme advancements in departments that are developing technology that can manifest artificial intelligence (AI) and can be programmed to act in a certain way independently of human controllers; such technology challenges the idea that the human brain is the unsurpassed form of cognitive intelligence. The singularity hypothesis suggests that eventually there will be no reason for the ordinary human to be 'in the loop'; the intelligence of AI technology will have exceeded that of the normally functioning human brain, and humans will no longer be able to keep up with the rate at which such technology makes decisions and reacts. When this concept is applied to the advancement of AWS, the possibility of humans no longer being in the loop becomes a much more frightening and potentially catastrophic prospect. A crucial discussion in relation to the research question is the differentiation between legal responsibility and moral responsibility; whilst these are distinct concepts, one could argue that, as we often view some legal actions to be immoral and some laws to be unjust, the two concepts conflate and complement each other. And whilst it is true that one can have law without morality and morality without law, the combination of the two can provide a more well-rounded body of legislation. If one considers the above in the context of responsibility and accountability for AWS, it is clear that whilst laws and morals should be considered in conjunction with each other, due to the discrepancies in how different people perceive morals, the legal arguments regarding AWS provide a better opportunity for a global agreement as to the responsibility and accountability of AWS. This research will provide an insight into the possibility of allocating legal responsibility and accountability for the actions of AWS, whilst delving into the legal and moral responsibilities of developing this technology.
In the process of doing this, consideration must be given to what exactly is meant by a weapons system acting 'autonomously' and to the extent of the self-awareness of AWS. The following section considers how defining autonomy in weapons, and what is meant by 'self-awareness', affects the ability to allocate responsibility and accountability.

1.1 Defining Autonomy in Weapons
Accurate and comprehensive definitions are key when it comes to any product, regardless of its function, as a definition provides certainty and the ability to legislate accurately for such products where necessary. Before we consider the legal or ethical arguments surrounding the responsibility and accountability of AWS, it is important that we establish the definitional parameters of this technology and identify any potential gaps in these definitions. The notion of armed conflict is one that is familiar worldwide, yet in recent years the nature of armed conflict has adapted, taking on a more technological form. Modern-day warfare is no longer solely dependent on human soldiers; in fact, it now has a heavy reliance on Artificial Intelligence (AI) and the development of autonomous weapons systems (AWS). Yet whilst we are advancing this technology at rapid rates, integrating it into armed conflict with seemingly little thought for the legal and ethical consequences of doing so, the lexicon surrounding this technology is failing to adapt as quickly as the technology itself. 'Autonomous' and 'automated' have been used with some degree of interchangeability, yet in reality they bear different meanings. The issue of definition is extremely prevalent in the field of autonomous weapons, with scholars and policy makers alike failing to agree on a definition of autonomy in weapons. One could argue that this is a trivial matter and that the need for a precise definition is redundant, yet it is actually an integral factor.
Without a working definition that all countries can implement, creating a worldwide consensus as to which military weapons are deemed to be 'autonomous' is near impossible. The lack of legislation and definitional clarity regarding the use of AWS leaves the issue of responsibility and accountability unaddressed. Currently, technologies are in use that possess a certain level of autonomy, including tracking, identifying and deciding when to fire a weapon or detonate a device, with approximately 30 countries deploying or developing defensive systems that can be placed somewhere on the spectrum of 'autonomous'. However, this autonomy is only actively in use in circumstances where the engagement time is too narrow for human response. Such technology has been in use for the best part of 70 years and has naturally developed from fairly basic uses of autonomy in weapons to much more advanced and precise equipment, but this is not the technology that is currently in the spotlight. Indeed, the debates regarding AWS and the concerns surrounding its integration into modern warfare are at present merely hypothetical, based on the predicted advancement of AWS technology and what these systems are expected to be capable of in the future. Crootof considered that "there is a nearly universal consensus, among both ban advocates and sceptics that autonomous weapons do not yet exist". Moreover, there is a further concern that AWS, whilst they are yet to exist on a fully autonomous scale, will fail to comply with the principles of the Law of Armed Conflict and will lack the ability to apply proportionality, distinction and military necessity, alongside the application of the Martens Clause. Such fears were set out in 'Losing Humanity: The Case Against Killer Robots', published by Harvard Law School's International Human Rights Clinic in 2012.
So why is it the case that countless NGOs, including the Campaign to Stop Killer Robots, are calling for a pre-emptive ban on this technology? This could be explained by the point made by Paul Scharre and Michael Horowitz: they consider that the rapid advances in information technology make the development of more advanced autonomous weapons something that may come to fruition sooner rather than later, and whilst awareness of this issue is extremely important, there is a trend of leaping to the conclusion that AWS should be legally banned without a fully comprehensive understanding of what is meant by 'autonomous weapons systems' and what their development would mean. One could argue that the apparent failure to develop a globally recognised definition of AWS allows the notion of the "autonomous robot" to be subject to imagery of science-fiction characters and anthropomorphic robots. This knowledge gap is contributing to what could be described as a fear-based understanding of the consequences of introducing a more advanced form of autonomous technology, especially if that comes in the form of weaponry. With there already being varying levels of autonomy in existing technology, the question would be whether one definition would be able to encapsulate the complexity of autonomy. As previously discussed, simple forms of autonomous weapons and technology have been in use for approximately 70 years, but even seemingly innocuous examples generate these kinds of concerns; for example, existing 'self-driving' vehicles have the capability to manoeuvre themselves around obstacles and produce countermeasures to assist in the completion of their tasks, but do so with little concern for human safety. This demonstrates that although the development of lethal autonomous weapons systems (LAWS) has produced the legal and ethical minefield we now find ourselves in, the legal grey area is by no means restricted to LAWS.
There is no doubt that clear terminology is vital when it comes to providing clarification and understanding; without it, the word "autonomy" will remain an umbrella term for a host of complex and varying AWS. Moreover, resolving the current debate on definition should be seen as a priority; once this has been established, issues regarding ethics and legality can be considered. The aforementioned definitional ambiguity has led to different ways of thinking about AWS. Whilst these definitions vary, a popular theory can be seen in the form of the 'loop theory', which is based on the amount of human input/supervision. Human Rights Watch applies this theory within its definition regarding degrees of control, defining the categories as follows:
- Human-in-the-loop weapons: robots that can select targets and deliver force only under human command;
- Human-on-the-loop weapons: robots that can select targets and deliver force under the oversight of a human operator who can override the robot's actions; and
- Human-out-of-the-loop weapons: robots that are capable of selecting targets and delivering force without any human input or interaction.
One could argue that this bears resemblance to many theories, including John Boyd's observe-orient-decide-act paradigm. In this paradigm, Boyd recognises that a military pilot is required to make decisions at a quicker rate than their opponent, and that this decision-making process can be condensed into his observe-orient-decide-act (OODA) model. This is not an entirely new concept; however, it does organise this way of thinking into a strategic system that essentially allows one to adapt to any given situation, thus coping with the uncertainties of war.
When considering this paradigm in the context of AWS, the level of autonomy attributed to AWS would in theory mean that they would be able to adapt to different strategic situations; however, the autonomy of a robot is vastly different from the autonomy of a human, and so the approach to this paradigm would be significantly different. A human is able to apply the OODA paradigm to situations where ambiguity is clouding judgement and there is the need to deliver a fast reaction. That said, Boyd comments that our inability to properly make sense of our changing surroundings is a bigger hindrance, and can mean that rather than shifting their perspective on a situation, humans will simply try to address their situational ambiguity by creating solutions according to their personal experiences. However, AWS are yet to be given the self-awareness that the human mind possesses and so do not have the same perspective on a situation that a human may have. Applying this paradigm to AWS could therefore introduce a technology that has a higher effectiveness and a drastically quicker response time than any human soldier or pilot. I am of the opinion that whilst the OODA paradigm does not necessarily provide a definition for AWS, it provides us with a mode of thinking that could allow us to understand the implications of AWS being developed and introduced into modern-day warfare. One might argue that without this deeper understanding, and without a comparison between the ways human soldiers and AWS would make decisions in an ambiguous situation, it is not possible to produce a definition that sufficiently engages with the capability of such technology. One of the most widely recognised definitions of AWS is from the US Department of Defense (DOD) Directive 3000.09 on AWS (2012), which defines an AWS as "a weapon system that, once activated, can select and engage targets without further intervention by a human operator".
This definition distinguishes autonomous weapons from semi-autonomous weapons by providing a further definition of the latter as "a weapon system that, once activated, is intended to only engage individual targets or specific target groups that have been selected by a human operator". The same directive defines AWS as being able to "select and engage targets without further intervention". Be that as it may, this definition is not without its ambiguities, the main concern being that the directive's attempt at distinguishing semi-autonomous weapons from autonomous weapons somewhat tarnishes the clear definition above. The main distinguishing element between autonomous and semi-autonomous is the level of human responsibility involved in the selection of targets, and whilst the directive makes an effort to distinguish between the two, one could be of the same opinion as Crootof in thinking that this element remains vague and unclarified. These definitions provided by the DOD show a similar way of thinking to that of the Human Rights Watch 'loop' theory. Moreover, the DOD's definition is not dissimilar to the directive set out by the British Ministry of Defence, in which it is suggested that an autonomous system is capable of understanding higher-level intent and direction, which will allow it to take the desired action in order to bring about the intended state. The MoD directive continues: 'autonomous systems will, in effect, be self-aware and their response to inputs indistinguishable from, or even superior to, that of a manned aircraft. As such, they must be capable of achieving the same level of situational understanding as a human…'. The concept of AWS being 'self-aware' is widely regarded as one of the main concerns in the development of this technology, as removing humans from the loop creates a whole host of legal and moral conundrums that will be discussed further in the following chapters.
The 2014 International Committee of the Red Cross (ICRC) expert meeting on "Autonomous Weapons Systems" maintained that there was yet to be an agreed-upon definition of AWS. Following the 2014 ICRC meeting, Rebecca Crootof made an attempt to further address the ambiguity of defining AWS: her new definition took into account the law relating to armed conflict rather than simply the practicality of having a working definition. Her definition stated the following: "an 'autonomous weapons system' is a weapons system that, based on conclusions derived from gathered information and pre-programmed constraints, is capable of independently selecting and engaging targets." The way in which this definition is phrased appears to address the individual concerns regarding defining AWS and takes into account the vital distinction between automated and autonomous, as well as accounting for the machine's ability to, in some sense, 'improvise' within the constraints of its programming. Whilst there is still an element of ambiguity, it is clear that the numerous definitions that have been created all share the common theme of AWS having the ability to independently select and attack a target without human intervention, suggesting the aforementioned 'loop' principle could be seen as one of the most agreed-upon definitions at present. When developing a globally recognised definition there should be a balance between recognition of the different domestic laws that may affect its implementation and the need for clarification regarding what is classified as an autonomous weapons system. A pre-emptive ban on the use of AWS in armed conflict is, as considered above, based on hypothetical developments of this technology. Yet it is seemingly impossible, and somewhat impractical, to ban a system that is yet to be given a recognisable legal definition.
1.2 Self-Awareness in AWS
In view of the above, it is clear that defining AWS is, at present, one of the predominant discussions amongst academics and authorities, yet one could argue that the real concern lies within the diminishing human control over AWS and the technology's increasing self-awareness. The extent to which humans control the decisions of AWS is diminishing so rapidly that it is not unreasonable to assume that at some point in the future AWS will possess what could be considered complete self-awareness. This is a quality that, Crootof explains, poses a threat to one of the law of armed conflict's most fundamental assumptions: that, ultimately, a human being decides whether another human being lives or dies. This is a point that Asaro considers when stating that "it is the delegation of the human decision-making responsibilities to the autonomous system designed to take human lives that is the central moral and legal issue". Whilst this point potentially has closer links to the liability and accountability aspects of decision making, the element of machine liability is what ties it to the attribution of self-awareness to AWS. One of the oldest questions in relation to the use of AWS is its capability to function like a human: to make 'human-like' decisions and to think and reflect on its actions in the way a human is capable of doing. By giving technology the ability to essentially make its own decisions, developers risk removing things such as the degree of empathy, common sense and creativity exhibited by a human soldier on the battlefield. Moreover, if AWS achieve a level of self-awareness that a human may possess, we are left with a number of questions that invite a somewhat philosophical approach, yet will also provide a unique perspective on the legal responsibility that could be given to AWS.
For example, if the level of self-awareness increases, would this mean that an AWS is capable of making moral judgements, and if so, could it be held responsible for its actions? Murray Shanahan discusses the idea of brain-inspired AI and what the consequences of this would mean for humans and AI alike. However, more relevant to AWS is AI that has been built from scratch: AI that has not been modelled with the intention of replicating the way in which the human brain functions, and is instead focused on 'artificial general intelligence'. AWS are not designed to mimic the appearance and behaviour of humans, but to resemble and act as weapons, as the name suggests. Whilst this means AWS do not emulate a traditional form of consciousness, they would be capable of showing basic cognitive qualities such as awareness, purpose and integration: all qualities that are inevitably going to be a product of artificial general intelligence. As with most technologies, there are varying levels of capability and advancement, and this is certainly the case for the autonomy and self-awareness of AWS. If we take the basic example of a landmine, it is an entirely independent device that is capable of performing the task it has been assigned without the need for further human intervention. It will be activated when someone walks over it or a vehicle drives over it, yet this is the extent of its autonomy. It cannot discriminate between those it has been placed there to kill and those who are innocent in this scenario. It simply knows two states: exploding according to the pressure applied, or not exploding. This device would not be placed in the same category as, for example, a targeting system. A targeting system functions at a much higher level of autonomy, as it is designed to distinguish between many states and possibilities in order to achieve its desired outcome; it therefore demonstrates a level of autonomy that links to cognitive skills.
From these simple examples a clear trend emerges: the greater the responsibility given to an AWS, the greater the degree of cognitive skill required, and thus the higher the level of autonomy. By assigning a device a higher level of autonomy we are essentially reducing how much it depends on humans when performing its tasks. The theory behind autonomous vehicles can also be closely linked to that of autonomous weapons, due to the similarity of their operational functions. The prospect of autonomous vehicles on our roads does not seem as deadly as an autonomous drone designed for warfare, yet, as alluded to earlier, the legal and moral questions remain the same. Moreover, the 'in the loop' system can also be applied here: with vehicles becoming increasingly autonomous, the purpose of the driver changes from operating the vehicle to essentially monitoring how the vehicle's systems are operating. Therefore, like the relationship between operators and autonomous weapons, the 'driver' of an autonomous car is essentially stripped of the purpose of driving; the driver has to be assumed to be completely out of the loop. Both autonomous weapons and vehicles must be able to act autonomously with a level of self-awareness in order to deal with critical situations and assess what the appropriate actions and responses are for any situations they may come across. As stated by the authors of 'Self-Awareness in Autonomous Automotive Systems', a general challenge for self-aware autonomous systems is the fact that they operate in an environment that allows only limited predictability; in most cases not all the effects that impact the system can be fully anticipated. Armed forces around the world already have some autonomous functions, such as navigation, communication and detection, and the level of autonomy these systems show ranges from remotely piloted to fully autonomous.
Whilst these systems are highly advanced, there will still be a certain level of unpredictability that will only become apparent when the system faces a degree of uncertainty in which it is challenged to adapt and apply the relevant countermeasure to combat any issues that arise. If we take the example of an autonomous missile device that is programmed to reach a certain location, it is impossible to predict all scenarios which could occur on its journey, and so a certain level of self-awareness is vital to allow such systems to operate effectively. A final point to make in relation to the self-awareness of AWS is the notion of the bar being set too high in regard to what is deemed autonomous. The legal view of fully autonomous weapons as merely hypothetical is not misplaced; it is grounded in the confusion surrounding the differentiation between 'automated', 'autonomous' and 'semi-autonomous' and the conflicting definitions of what qualifies as an AWS. An example of this can be seen in the UK MoD's definition of "autonomous systems", which states that "…as long as it can be shown that the system logically follows a set of rules or instructions and is not capable of human levels of situational understanding, then they should only be considered to be automated". One could argue that this definition fails to take into account that, as it stands, AI is unlikely to reach the same level as a human in terms of understanding of its situation and its surroundings; this remains a skill unique to the cognitive ability of humans. Having considered the definitional ambiguity surrounding AWS, it seems that this is one of the main contributing factors that has led to the laws in this area being insufficient. A clear lack of structure and understanding is preventing a universally recognised definition from being developed; in turn, one could argue that this has the potential to contribute to an accountability gap where responsibility and accountability are concerned. 
The following chapter will concentrate on the concepts of responsibility and accountability and how they are applicable to the discussion at hand.

Chapter 2: Responsibility and Accountability

The discussion in the previous chapter regarding the self-awareness of AWS posed interesting questions, both legal and ethical. However, once we emerge from the conversation regarding definitional ambiguity, we are left with the development of weapons systems that can act without direct human intervention. This raises the prospect of technology advancing to the level at which AWS could be fully autonomous, rendering human input surplus to requirements. We are then faced with yet another legal ambiguity, which opens up a unique conversation regarding the challenges of attributing individual responsibility. Put simply, who is responsible for the actions of AWS? By design, AWS and LAWS have a whole host of human agents who contribute to their functioning; this considered, we are presented with various individuals who could be candidates for legal responsibility for the actions of AWS. UN Special Rapporteur Christof Heyns has identified that those who could face individual responsibility include 'the software programmers, those who build or sell hardware, military commanders, subordinates who deploy these systems and political leaders who authorize them'. Whilst this is true, one must consider the practicalities as well as the legal boundaries involved in holding these parties responsible. The previous chapter touched on the issue of AWS becoming fully self-aware and capable of making their own decisions; however, this is not the reality at present. Because of this, the attribution of responsibility to an AWS itself will be difficult, as it is incapable of acting in a manner that could be subject to criminal liability. 
It is inevitable that the software running this technology will continue to be developed, increasing its complexity and reinforcing the fact that no single person is involved in its creation or its functioning. It is thought that as the number of people working on this software increases, the likelihood that any one individual will have a complete understanding of how the software functions as a whole decreases; it therefore follows that its functioning could be unpredictable as well. We must also consider the prospect that in the future, near or distant, fully autonomous weapons systems may become a reality, removing the element of human input and creating a further concern regarding the attribution of responsibility to technology. Considering the above, in the situation where the conduct of AWS violates the laws of war, or where a malfunction of the technology, faulty programming or misguided deployment results in international crimes being committed, who are we to hold responsible for such violations? More importantly, who should be held responsible?

2.1 Criminal Responsibility

When considering the practical and legal standing of applying responsibility, a logical starting point is to consider the elements of criminal responsibility, both domestic and international. The basic principle of criminal responsibility is that when a person commits an offence that is deemed to be criminal in nature, they will be held criminally responsible for the commission of this crime. The perpetrator will then receive a penalty in the form of a fine, a community order or imprisonment. 
The presumption that someone can be held responsible for their actions can be rebutted in certain circumstances: for example, if the perpetrator is a group, corporation or state; when the subject of the blame is an animal or non-human object; or where exemptions apply which mean that the individual responsible for committing the crime cannot be subject to blame, such as where the perpetrator is a minor or has diminished mental capacity. Current laws regarding responsibility and accountability have long been integrated into criminal and civil law, both domestically and internationally. However, such laws vary from country to country, with Germany having a more complex and versatile set of laws that recognise the concept of direct and indirect perpetrators. German law recognises that a person who commits an act through another is an indirect perpetrator, signifying that the indirect perpetrator (Hintermann) has control over the direct perpetrator (Vordermann). The Hintermann often exploits a certain deficit that the Vordermann possesses; this can be something as simple as lacking the intent for the offence. The German concept of someone being held criminally responsible for acting through another person is also acknowledged on an international level, with Article 25(3)(a) of the Rome Statute of the International Criminal Court providing for responsibility 'whether as an individual, jointly with another or through another person, regardless of whether that other person is criminally responsible'. In view of this, an analogous comparison can be made between AWS and the Vordermann: whilst AWS are programmed to adapt and to make decisions based on the algorithms they have been assigned, at present they lack the human ability to intend to commit an offence. Moreover, with intent comes the element of mens rea, the element designed to establish whether there was intent behind an action. 
Article 30 of the Rome Statute states that in order to act with intent, the perpetrator must mean to engage in the conduct and mean to cause the consequences, or be aware that they will occur in the ordinary course of events. AWS have been developed to operate in hostile environments and the functioning of the AWS itself is inherently unpredictable; this considered, it is unlikely that one would be able to argue that its actions fit the requirements of mens rea. Whilst there are inherent difficulties in applying laws that have been designed for human application only to AWS, the general principles embedded within them provide potential foundations on which to re-design the laws in the context of AWS. We are all aware that the development of AWS was facilitated in order to provide strategic advantages in armed conflict; with this in mind, it seems only natural that they could be used to commit crimes capable of being recognised as war crimes. With this come further questions regarding not only the legal implications of responsibility and accountability but also the ethical whirlwind surrounding AWS being in charge of kill decisions. This is an area that will be discussed in detail in a later chapter. Given that this is a very real prospect, attention should be given to criminal responsibility being applied on the international stage in relation to international crimes such as war crimes and crimes against humanity. Individual criminal responsibility for war crimes committed in international armed conflicts has been the basis for prosecutions under the Charters of the International Military Tribunals at Nuremberg and Tokyo, under the Statute of the International Criminal Tribunal for the former Yugoslavia and under the Statute of the International Criminal Court. There have been countless examples of war criminals having been tried on the basis of this principle. 
Of some significance is the 1995 case of Tadić, heard before the Appeals Chamber of the International Criminal Tribunal for the former Yugoslavia; the conclusion in this case was that there was in fact individual criminal responsibility for war crimes committed in non-international armed conflicts. One could argue that the significant developments in the 1990s regarding individual criminal responsibility in non-international armed conflict were something of a turning point in IHL (international humanitarian law), allowing those who had been at the forefront of such internal atrocities to be held responsible for their actions. Amongst these developments, the case of Tadić now provides us with a precedent for individual criminal responsibility for war crimes committed in non-international armed conflicts. However, as with the majority of the laws governing armed conflict (international or non-international), the issues arise when it comes to determining responsibility for war crimes committed by the likes of AWS and LAWS. Given that individuals who commit war crimes are individually responsible for them under IHL and ICL, one could argue that those who deploy an autonomous weapon system which has been programmed to carry out acts that amount to crimes under either domestic or international law should therefore be criminally liable. This issue will be discussed later on: whilst the solution seems somewhat simple, prosecuting such individuals would prove extremely difficult, owing to the level of understanding required and the need to prove that there was an intention for the crime in question to be committed. Elements of the ICTY decision can be seen to have been adopted from the Elements of Crimes document, which considered the substantive crimes within the Rome Statute. 
Within this, it was stated that for every war crime there is a requirement that the alleged conduct 'took place in the context of and was associated with an [international or non-international, depending upon the precise provision of the statute] armed conflict'. Moreover, criminal responsibility in the context of a war crime has also been applied to individuals who have attempted to commit war crimes, as well as to those individuals who have assisted in, facilitated, or aided or abetted the commission of a war crime. One could argue that this bears a close resemblance to the approach seen in German law.

One could consider this by approaching AWS as being akin to a Vordermann who possesses some defect or deficit, such as the lack of capacity to act intentionally. The fact that AWS lack the human qualities it takes to carry out an intentional act separate to their programming makes them innocent agents, as their actions can be controlled or caused by a human agent. It is no mystery that AWS and LAWS have been developed in order to be integrated into warfare to reduce human involvement and, in theory, soldier casualties and fatalities. Naturally this comes with the benefit that such technology has a precision that cannot be obtained by a human, and so a more accurate form of warfare can be engaged in. Considering the above, it is possible to argue that these elements could be adapted and applied in order to allocate responsibility and accountability, be it criminal or civil, to those involved in the commissioning and actions of AWS.

2.2 Developer/Manufacturers' Liability

The concept of criminal responsibility is evidently well established within both domestic and international policy; however, the dynamic is bound to shift when this concept is required to be applied to AWS. As stated previously, Christof Heyns identified a number of parties who are exposed to potential liability, including the developer/manufacturer of this technology. 
AWS have been developed with the capacity to manage their own operation, with various components working together to amplify their effectiveness. AWS technology was developed to provide specific advances, by creating algorithms that allow AWS to manage their own operation without explicit human operator input. Moreover, the speed and accuracy of modern computation mean that AWS are able to function more efficiently in situations where a human operator lacks the capability to make such rapid-fire decisions, especially in the heat of battle. The developers are programming AWS to be able to process information at a greater rate than a human would ever be capable of, in turn creating what could be considered a more efficient form of warfare. This demonstrates the operational advantages driving the development in this area. The process is undoubtedly complex, and it would not be possible for those involved to accurately estimate the possible consequences of deployment, making it extremely difficult to attribute responsibility to the manufacturer and developer. One could argue that if you were to hold the developer accountable for each death or war crime committed using one of the AWS they helped to develop, then you could also hold the developer of every gun, grenade, explosive and generic weapon accountable for the deaths, injuries and crimes their development has contributed to. However, it is worth noting that the Protection of Lawful Commerce in Arms Act (PLCAA) is implemented in US law and was created in order to protect firearms manufacturers and dealers from being held liable if crimes are committed using the products they have produced or sold. Whilst this is the case, these parties will not escape liability if damage is caused by defective products, breach of contract or criminal misconduct. 
Similarly, they may also have liability for negligent entrustment, which involves them having reason to believe the gun is intended to be used for criminal purposes. From the implementation of the PLCAA we can see that it is in fact a legal possibility to exempt manufacturers from being held accountable for the use of their technology; could it therefore be said that a similar set of laws should be created to the same effect for manufacturers of AWS? Furthermore, if this became a reality and we used the PLCAA as a precedent, and manufacturers sold AWS to various armed forces or leaders of countries involved in non-international armed conflict while fully aware that they could be used to engage in war crimes or strikes on civilians, would they be liable for negligent entrustment?

The difficulty and impracticability of holding manufacturers responsible for the actions of AWS is discussed by Sparrow, who considers that holding programmers or manufacturers responsible for the actions of their creation once it becomes autonomous "would be analogous to holding parents responsible for the actions of their children once they have left their care".

A Human Rights Watch report stated that it would not be possible to hold the manufacturer liable for any harm caused if: (i) the specification for LAWS was approved by the government, (ii) the weapons conformed to those specifications, and (iii) the manufacturer did not deliberately fail to inform the government of any expected or known danger from the weapon system.

If we are to search for individuals to whom responsibility and accountability can be directed, then one should consider that there would be overwhelming challenges in applying this to the developer. There is such uncertainty surrounding this field that to predict future developments and the way the technology will be used would be ill-advised and challenging. 
Further to this, without knowing the outcome of future developments, the task of saying how the activities of developers may constitute acts proscribed by the law of armed conflict is also extremely difficult.

2.3 State Liability

The development and deployment of AWS undoubtedly provides a new level of precision when it comes to armed conflict; the technology being developed allows AWS to react at greater speeds than any human soldier, providing strategic advantages and potentially reducing the mortality rate of human soldiers. However, this all comes at a cost: despite numerous states having commissioned the development and use of AWS technology, there has been seemingly little consideration given to the legal and ethical issues that come with them. As with any weapon commissioned by the state, there should be an obligation to ensure that the weapons are not being used in a way that violates International Humanitarian Law or the human rights of those who come into contact with them. However, the conversation regarding state responsibility and risk management for when things go wrong is highly controversial. Robert Geiss considered that whilst the deployment of AWS is not unlawful when used responsibly, it is a high-risk activity that is not fully understood, and so there is a level of predictable unpredictability. Considering this, Geiss states that a State that benefits from the strategic gains associated with AWS should be held responsible whenever such unpredictable risks are realised. It is clear from this that state liability is something that is being taken seriously and that developments are being made in this area. One could argue that the State should hold a high level of responsibility for issues brought about by AWS as, in theory, it is the party making the decision to integrate them into its own military and deploy them into combat. 
If States are capable of being held responsible for the actions of AWS in the event that an unpredictable risk occurs, then it would make sense to implement preventative measures that would reduce the harm such occurrences could cause. An essential component of this is for States to acknowledge their due diligence obligations aimed at risk prevention and harm reduction; state responsibility arises in the event that such obligations are violated. The Law of Armed Conflict clearly provides for the way in which states should conduct themselves in conflict and the repercussions of breaching such conduct, building on the element of 'due diligence'. Furthermore, Article 1 of Geneva Conventions I-IV requires States to 'ensure respect' for the laws of armed conflict in all circumstances. There is therefore no reason why interpreting this to include armed conflict that involves the use of AWS would be misplaced. This being said, there are still elements of ambiguity that call for clarification in order to diminish the opportunity for any State to claim that its actions do not amount to a violation of the Law of Armed Conflict. Geiss considers that the problem is not that there is a lack of legal basis per se, but that there is a lack of clarity surrounding the meaning of 'due diligence to ensure respect' in the context of autonomous weapons systems. If we examine this, it is apparent that the issue stems from the fact that this legal basis was created to address human error, not that of AWS, and that due diligence requires a given circumstance, such as combat, against which to assess what a reasonable party would do if placed in that circumstance. The problem therefore arises when this is applied to the conduct of AWS: it is difficult to know what could and should be considered reasonable when the technology in use is new and there are no precedents set to use as a point of reference when making this decision. 
An interesting point that Geiss raises is that whilst human beings are in charge of the deployment of AWS, accountability can be determined using the pre-established rules of attribution. This essentially means that if a member of the military of any given state decides to deploy AWS in the course of combat, then the activities the AWS carries out will be attributable to that State and not to the AWS itself - a weapons system possessing some autonomous capabilities does not alter this. It can be said that this accurately deals with the fact that whilst AWS exist, they are not fully autonomous (as previously discussed) and therefore humans still remain in the loop to a certain extent. A further obstacle to state liability is that of jurisdiction: only states are able to submit contentious cases for ICJ adjudication, and because of this the Court lacks jurisdiction to deal with applications from individuals, non-governmental organisations, corporations or private parties. Simply being a member of the UN does not automatically give the Court jurisdiction; it requires both states to have consented to the Court's authority. If we consider the consequences of a state submitting itself to the Court's jurisdiction, it is evident that there is little to no incentive for it to do so, as simply admitting liability would increase the risk of significant adjudication costs, costs that would increase substantially if the State in question were found guilty.

2.4 Command responsibility

Command responsibility, also known as indirect responsibility, is another branch of International Criminal Law that presents the possibility of holding a military commander or a civilian superior criminally liable. It can arise where a superior is found to have failed to take reasonable measures to prevent or punish a subordinate's commission of a criminal act. 
This is due to the fact that the superior effectively has control over the actions of the subordinate, and therefore as soon as they are aware that one of their subordinates has committed a criminal act, they have a duty to act on this. This was discussed in the case of Prosecutor v Halilović, which stated that command responsibility will hold the superior accountable for dereliction of duty. Superior or command responsibility is a concept that has historically been applied in a number of cases following the Second World War. Article 28 of the Rome Statute provides a provision for the 'Responsibility of Commanders and Other Superiors'. The Article states that: 'a military commander or person effectively acting as a military commander shall be criminally responsible for crimes within the jurisdiction of the court committed by forces under his or her effective command and control, or effective authority and control as the case may be, as a result of his or her failure to exercise control properly over such forces…' The accountability discussed here is triggered if the superior knew or should have known that the forces were committing or about to commit such crimes, or if he failed to take all necessary and reasonable measures to prevent or repress their commission. An important element to note when pursuing this form of responsibility is that it must be proven that a commander/superior and subordinate relationship existed between the parties. If this cannot be shown, the necessary elements will not have been satisfied and an attempt to establish the criminal responsibility of the superior will be unsuccessful. On one hand, this could be seen as analogous to AWS, due to the nature of them being weapons and so functioning in a similar context to soldiers (subordinates) - this considered, it could be said that there are potential grounds to trigger the doctrine of superior responsibility. 
However, AWS, if fully autonomous, have a certain ability to make their own independent decisions and judgments relating to target selection and engagement with a threat, with the advanced technology embedded in them specifically designed to allow them to operate independently of a superior. If they are not under the direct command of a superior, it would be unreasonable to expect superiors to have anywhere near the level of knowledge required to trigger the doctrine of command/superior responsibility. It could also be argued that the rapid rate at which AWS technology makes its decisions would make it near impossible for superiors to foresee that the forces were about to commit a crime that could then make them subject to command liability.

If we were to consider that the robot was able to communicate its decision to the commander prior to acting, then this would in theory mean the commander had sufficient knowledge of the impending criminal act and could therefore be held responsible. Be that as it may, this would only be the case in a 'human-in-the-loop' system; if, however, the system were fully autonomous, there would be a dramatic reduction in communication with a superior and decisions would be made without the need for approval. Moreover, it has been established that a commander must have sufficiently alarming information in order to investigate; without receiving such information they cannot be held liable for negligently failing to find out information. Furthermore, in Prosecutor v Strugar, it was noted that knowledge of past offences by a particular subordinate may constitute 'sufficiently alarming' information; the superior would therefore have sufficient knowledge that the subordinate in question could commit future criminal acts, and thus the mens rea of command responsibility would be satisfied. 
Taking this into account, we are then left with confusion surrounding what the threshold is for 'past unlawful acts' committed by AWS. AWS technology has been developed primarily as an aid to combat and, due to its adaptability, there are different scenarios in which this technology can be placed and different tasks that it can be assigned. If we therefore consider a situation where one AWS robot engages in conduct that violates international law, are we to assume that this conduct sets a precedent for a past unlawful act that can be applied to all robots of that variety and with that same programming, or is this only going to be applicable to that individual robot? We could then contemplate a scenario where one specific algorithm has been programmed into a model of AWS robots directing them to target combatants - however, one of them mistakes a civilian for a combatant, strikes and kills the civilian. This scenario is capable of becoming reality, so in the event that it does in fact happen, are we to consider the inaccurate judgment of proportionality to be unique to that individual robot, or should the entire model have this on its record as a 'past unlawful act', given that its unpredictability has already been demonstrated by one robot? Moreover, as previously discussed, AWS technology is rapidly advancing and is complex in nature; in theory, for a commander to be able to recognise that an AWS robot is about to commit a criminal act, they would have to have a somewhat in-depth understanding of the way the technology works, the nature of its programming and the level of autonomy it presents. It could be argued that this is highly unrealistic as well as unreasonable; it therefore seems likely that commander responsibility could be avoided. 
The issue of fast development and decision-making in AWS also links to the requirement that there must be effective control over a subordinate in order for commander responsibility to be applicable. According to the judgement in Prosecutor v Delalić et al, effective control comes specifically from the "material ability to prevent or punish criminal conduct"; punishment of AWS is not only pointless but also fairly impossible, and the preventative aspect has already been identified as difficult due to all the varying qualities of AWS. Further to this, the unpredictability of AWS means that there are any number of circumstances that cannot be foreseen or, alternatively, can be foreseen but about which nothing can be done; this all contributes to the application of effective control being questionable. Examples of the unpredictability mentioned include signal interference, errors in programming and the fast processing speed of AWS. If one of these situations occurred, the ability of a commander to intervene and call off the attack would be diminished, demonstrating that effective control is not in place. One could therefore reach the conclusion that the application of command responsibility would be confronted with numerous issues revolving around insufficient knowledge and lack of effective control, and consequently command responsibility would be an unrealistic form of responsibility to pursue.

2.5 The accountability gap

In the discussion surrounding responsibility and accountability, whether we are looking at state, commander or even criminal responsibility, a common feature is the gap in accountability. This gap has emerged because the laws surrounding legal responsibility are yet to be adapted to accommodate the advanced nature of AWS and the fact that they are not human entities, and so do not possess the human qualities referred to in countless laws regarding responsibility and accountability. 
Whilst one could analogously apply laws to the functioning of AWS, the accuracy of the conclusions would be heavily scrutinised, and it is unlikely that they would stand up in a court of law. If we consider that AWS are likely to become more advanced and will be deployed more regularly, a violation of international law becomes somewhat inevitable; when this violation occurs, those involved will search for someone to be held accountable. Accountability is one of the most essential elements in international law; it aims to deter and prevent violations, thereby protecting potential victims of human rights abuses and war crimes. The rules of International Humanitarian Law and Human Rights Law are in place to protect people against violations of their rights, including the right to life; yet the unpredictability of AWS technology means that they do not have the capability to comply with these rules, leaving people vulnerable to immense violations of their human rights. If civilians are wrongly targeted by AWS, the accountability gap means that there would be disputes regarding who is legally responsible for these violations. An accountability gap is dangerous for several reasons; however, one could argue that the most important is that if there are no consequences for human operators or commanders in circumstances where such violations occur, then there is no deterrent for other states regarding future criminal acts. Bonnie Docherty, senior arms division researcher at Human Rights Watch, suggested that "no accountability means no deterrence of future crimes, no retribution for victims, no social condemnation of the responsible party". If this is the case and there is a failure to produce a deterrent, then the same violations will be repeated by the AWS, as there has been no intervention with its programming that would direct it away from the decisions it was making. 
Moreover, the violations being referred to are those inflicted on civilians; the lack of meaningful human control possessed by these weapons would therefore make it near impossible to hold anyone criminally or civilly liable for such violations and unlawful acts. Academics and experts alike argue that the presence of an accountability gap gives weight to the argument that there should be a pre-emptive ban on AWS production and deployment until the law can accommodate all eventualities of AWS. Human Rights Watch has said that whilst military commanders could potentially be found liable if they intentionally deployed a fully autonomous weapon to commit a crime, such justice would likely be eluded, as the most likely scenario is a violation arising from unforeseen actions of AWS that human intervention was unable to stop. Whilst one could predict the potentially dramatic outcomes the accountability gap will create, the unknown remains one of the biggest enemies; until predictions become reality and violations have been committed, it is unlikely that a pro-active effort to fill the void will be attempted. In the meantime, any incidents caused by AWS will continue to evade justice and be marked down as an accident or even a glitch. In doing so, the accountability gap is being allowed to act as a scapegoat for potential violations, and by continuing to tolerate this we risk trivialising the serious harm that could be done. Chapter 3: Legal and Ethical Implications So far, this dissertation has provided an insight into how the ambiguity and lack of clarity surrounding the definition of AWS has further contributed to the development of problems in this area. 
The previous chapter drew on the points made in chapter 1 and explored how the lack of a clear universal definition has contributed to insufficient laws surrounding responsibility and accountability, as well as how elements of international law could be used to partially fill the void in this area. This chapter aims to consider the legal and ethical implications of AWS, contemplating the impacts of this technology on a global scale rather than limiting the discussion to domestic implications. The integration of AWS into armed conflict inevitably produces a cross-over between the legal and ethical implications of AWS; however, the legal aspects involved in the deployment and development of this technology seem to be overshadowing the ethical implications, which have increased and are likely to continue to increase as the technology advances. To say that legality is overshadowing ethics is not to say that legality is insignificant; rather, legality and ethics should be considered simultaneously in order for AWS to be subject to appropriate regulation. As discussed in the previous chapter, the accountability gap provides the perfect opportunity for ethical implications to slip under the radar. The unpredictability of AWS not only means that they lack the capability to adhere to the legal requirements of International Humanitarian Law (IHL); their inability to simulate human qualities such as empathy and morality means that they are also incompatible with the ethical considerations embedded in IHL and Human Rights Law. One could argue that 'human dignity' is one of the most important formulas embedded in international politics – once introduced in Article 1 of the United Nations Universal Declaration of Human Rights, the concept of human dignity became an umbrella term used to bridge ideological gulfs. 
In turn, this provided a unified terminology, leaving the concept of human dignity open to a certain level of interpretation that allows each member state to speak with one voice. Human dignity as a concept has become increasingly incorporated into international documents and treaties, underpinning the majority of IHL elements such as distinction and proportionality; because of this, I believe it is a principle that forms the foundation for the legal, political and ethical implications of AWS. 3.1 International Humanitarian Laws International Humanitarian Law (IHL) stems from international law put in place to manage and monitor the use of violence in armed conflict; its two-fold aim is to spare civilians from the consequences of armed conflict and to protect soldiers from unnecessary suffering. The rate at which this technology is now advancing therefore brings new challenges to the basic principles of IHL. Whilst there are internationally recognised agreements to ban or regulate a number of problematic weapons – such as expanding bullets, poisonous gases, anti-personnel landmines, biological and chemical weapons, blinding lasers, incendiaries and cluster munitions – these principles were created at a time when the technology involved in LAWS and AWS had not been developed, and so making provisions for them in IHL would have been a moot point. This has now led to a reality where such technology is advancing at such a rapid rate that it is exceeding the parameters of IHL. 3.2 Distinction A core element of IHL that is threatened by the advancement of AWS technology is the principle of distinction. This principle is made up of two components: combatants must be able to distinguish (i) between civilians and enemy combatants, and (ii) between civilian and military objects. 
Moreover, this principle has been codified in Article 48 of Additional Protocol I to the Geneva Conventions: 'in order to ensure respect for and protection of the civilian population and civilian objects, the Parties to the conflict shall at all times distinguish between the civilian population and combatants and between civilian objects and military objectives and accordingly shall direct their operations only against military objectives.' When we consider distinction in relation to AWS, the main problem is that in an area of conflict the people present are not restricted to soldiers; there are a whole host of people involved, including civilian workers, medics and injured combatants. The concern lies with the fact that AWS and LAWS technology is unable to discriminate between combatants and non-combatants who may be present during conflict. Furthermore, when we consider that exercising the principle of distinction requires an individual to have their own set of moral and ethical guidelines in order to make decisions, we are once again confronted with the fact that AWS technology does not have the ability to simulate the relevant human qualities which would allow it to construct its own set of moral and ethical principles. Human soldiers are able to take positive steps in deciding whether the subject they are engaging with is a combatant or a non-combatant, something that also requires a certain amount of common sense. Further to this, another important feature that relies on distinction is the ability to recognise when someone is surrendering; Sparrow identifies this as an important shortcoming of AWS, arguing that because this technology does not have the capacity to recognise someone surrendering, there are ethical implications in relation to the deployment of such weapons systems. 
Being able to differentiate between someone surrendering legitimately and someone using their body language to deceive the opposition is an extremely subjective decision and involves having the capacity to interpret the actions and intentions of the other person – this is not a quality that AWS technology currently possesses. A further issue is presented in the form of ambiguity surrounding the legal definitions contained in the Geneva Conventions of 1949 and AP I; such ambiguities mean that elements are left open to a certain level of subjectivity. If we are unable to implement a clear definition, then it follows that it would be virtually impossible to integrate the concept of discrimination into the programming of AWS. Moreover, in order for a robot to be able to discriminate, it must first have the ability of recognition. It would not necessarily be beyond the capability of designers to enable AWS to recognise basic signals (such as surrendering); however, the difficulty is that war is volatile and unpredictable and takes place in unpredictable terrain. If one considers all the environmental factors involved, then programming that gives AWS basic object recognition and classification skills is rendered useless. Whilst programming basic recognition/distinction software may not be entirely unrealistic, creating software that could be implanted in AWS programming to give them the same sophisticated capacity for interpreting human reactions is extremely complex and unlikely to become reality in the near future. One could argue that because of this, the ability of AWS to comply with IHL is unrealistic, and their inability to distinguish between combatants and non-combatants is the reason they could be deemed unsuitable for integration into armed conflict. 
3.3 Proportionality The prospect of civilian casualties is inevitable in armed conflict and cannot be eliminated from war altogether, despite the best efforts of IHL, which implements concepts such as proportionality in an effort to protect the civilian population in war zones. The rule of proportionality is defined in Article 51(5)(b) of Additional Protocol I. It states that a violation of proportionality will be "an attack which may be expected to cause incidental loss of civilian life, injury to civilians, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated". It is clear from this definition that the principle of proportionality seeks to mandate that where there is collateral damage to the civilian population, it must be proportional to the military advantage. The concept is applied in a military context where damage to civilian objects or civilian death or injury is anticipated prior to targeting a military object: an assessment is made in which the anticipated military advantage is weighed against the anticipated 'collateral' damage to protected civilians or civilian objects. In comparison to distinction, proportionality provides a more realistic opportunity to create programming that would enable AWS to make basic decisions presenting a certain level of compliance with the rules of proportionality; however, this would be very limited. Even if we were to assume that such programming could successfully embed a basic understanding of proportionality in AWS technology, we are once again faced with the same issue that underlies many of the legal and technological hurdles facing AWS. 
Noel Sharkey also identifies this as a crucial barrier to AWS proportionality: the technology simply does not possess the relevant human qualities to decide when the damage to civilians would exceed the anticipated military advantage provided by the attack. Considering this, one could say that Kastan is correct in arguing that, despite whatever technological advancements the future may hold, the analysis and assessment involved in the principle of proportionality would have to be left to humans. Additionally, one must also take into account the infinite number of scenarios with which AWS might be confronted; this alone makes it almost impossible to program the technology to replicate the decision-making process of a combatant. If there is no way in which this issue can be resolved, then it appears that AWS could violate IHL in such a serious way that it would be considered a war crime under the 1998 Rome Statute of the International Criminal Court. 3.4 Human Rights Agreements With the rapid growth of AWS development over the past few years, the debate surrounding their use has intensified, with the primary concern being their legality. Despite the fact that legality has seemingly taken centre stage, the implications arising from potential human rights violations deserve equal attention. The lack of importance allocated to human rights in this conversation seems misguided; civilians are the unwilling victims of armed conflict, and by deploying AWS into combat there is an automatic increase in risk and concern regarding human rights violations. The force currently applied by human soldiers will eventually be applied by AWS, and with the rapid development of the technology this could occur sooner rather than later. 
One could argue that human rights should not be treated as a separate issue and should in fact be considered alongside IHL, due to the fact that the two are complementary to each other when applied in situations of armed conflict. As discussed by Christof Heyns, the question we should be concerned with is: 'is the use of AWS to apply force permissible under human rights law, and if so, under what circumstances?' In his report Heyns breaks this question down into seven areas that all hold equal importance; however, it could be argued that the main point of focus should be which human rights are at risk of being infringed if AWS are to dispense force, either in a lethal or non-lethal way. The right to life and the right to human dignity present themselves as the most noticeable rights exposed to potential infringement; these two rights are widely recognised and are not only included in the main human rights treaties but can also be found in customary international law. If we first consider the right to life, according to Article 6(1) of the ICCPR, 'every human being has the inherent right to life. This right shall be protected by law. No one shall be arbitrarily deprived of his life.' The potential violation here occurs because one of the assumptions under international human rights law is that the 'kill decision' must be reasonable and carried out by a human. The concept of 'reasonableness' is inherently human and stems from a combination of strategy and emotion that can only be displayed by a human. Despite machines having a certain level of self-development in the form of machine learning, this does not extend to emotional development, and so it is not possible for machines to 'reason' in the same way a human does; it therefore follows that if a machine cannot 'reason', it also cannot take a 'reasonable' decision. 
Like Heyns, if we consider Article 1 of the Universal Declaration of Human Rights, it provides that 'all human beings are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.' Whilst the terminology is somewhat archaic, the meaning remains relevant and maintains a strong emphasis on the human being the party that exercises reasoning and interaction – not AWS or any other technology that may be developed to act autonomously in combat. This considered, if we allow AWS to make their own decisions when it comes to force, then there will almost certainly be an infringement of the right to life. With their inability to feel emotion and make reasonable decisions, it is extremely doubtful that AWS will be able to distinguish between a person surrendering and a person who has the intention to attack. Without this element of distinction, we are allowing the machine's programming to determine whether to act with lethal effect, thereby presenting a grave risk to the right to life. The second human rights issue that should be considered is the potential violation of the right to human dignity; this right is seen to be at the heart of all human rights and should be considered in conjunction with other rights, such as the right to life. Article 1 of the Universal Declaration of Human Rights provides the following: 'all human beings are born free and equal in dignity and rights. They are endowed with reason and conscience and should act towards one another in a spirit of brotherhood.' The concept of human dignity has distinct moral grounds, and in saying that there has been a violation of human dignity one is implying that the action committed is morally problematic. When considering the violation of human dignity at the hands of AWS, a common theme amongst academics is that the removal of human agents from the decision to kill is the ultimate indignity; Heyns uses the example of the 'Riobot' to demonstrate this. 
Whilst it is not currently autonomous in relation to the use of force, it is not far-fetched to assume that this could become a reality. The idea of miners being herded like cattle by an autonomous robot strips them of their dignity and de-humanises them. So-called 'death by algorithm' essentially means that AWS would treat people as interchangeable entities and would therefore have total disregard for their dignity. In his report to the United Nations General Assembly, Heyns voices his concern regarding this and writes: 'delegating this process [of deciding on targets] dehumanizes armed conflict even further and precludes a moment of deliberation in those cases where it may be feasible. Machines lack morality and mortality and should as a result not have life and death powers over humans'. One could say that 'machines lack morality and mortality' is the most significant aspect of this discussion. We are aware that AWS do not have the emotional capacity of a human, nor do they possess any quality resembling empathy or reasonableness; this is simply something that cannot be programmed and remains a distinctively human feature. Because of this, AWS cannot possibly comprehend the implications of killing or injuring a human being. There are, of course, various candidates who could be affected by the use of AWS in combat; if we split these into two categories – active players (combatants) and passive players (civilians) – we can begin to see whose dignity would be at stake. One could argue that civilians are at greater risk of being stripped of their dignity because they are passive players in combat: they have little regard for the rules of war because they are not actively engaging in it, and, unlike combatants, they are unable to anticipate what could happen, while taking refuge is often not an option. 
Furthermore, if we consider that both passive and active players in combat will inhabit the same geographical area, the issue of distinction (as discussed above) resurfaces. Without the capacity to discriminate between combatants and non-combatants, and to combine this distinction with the rules of proportionality, how are we to expect that AWS will not violate the human right to dignity by indiscriminately killing a civilian trying to surrender? The issue of AWS violating human dignity is complex; whilst there are unknown factors and limitations to AWS, this does not take away from the fact that they are programmed to have a high level of accuracy and in some senses provide more benefits than human combatants. There is a great deal of focus on the claim that human dignity is violated because a human commander is able to display mercy and compassion and robots do not possess such abilities; whilst this is true, it does not follow that human commanders will show mercy and compassion. A question raised by Dieter Birnbacher is whether an attack by a terror bomber is less cruel because the commander of the aeroplane might in principle be merciful, whereas an autonomous system would not. In answer to this, it could be argued that there is a higher level of cruelty involved if an individual possesses the ability to be merciful yet actively chooses not to exercise it. In the discussion regarding AWS violating human dignity, it is imperative to realise that the mere capability to display mercy and compassion does not equate to a 'safer' warzone or a reduced risk of human rights violations. Dismissing AWS because of their lack of human emotion is in some ways to idealise human warfare, by making the assumption that humans will always exercise correct and merciful judgement in the heat of battle. 
It could be argued that humans are capable of violating the dignity of other humans in more ways than AWS are; robots lack the emotional capacity of humans, are yet to have full autonomy, and so are limited by their programming. Humans, however, are capable of making their own decisions, and both historic and current war crimes demonstrate horrific violations of dignity and other human rights inflicted by humans without the aid of AWS. There is no doubt that AWS present us with dangers and uncertainties that should be dealt with, many of which include ethical considerations; however, we should be cautious in thinking that AWS introduce a wholly new quality of warfare. 3.5 Ethical and Moral implications The concepts of ethics and morals hold great significance in the discussion regarding AWS; as previously discussed, the human rights implications and potential violations create a worrying narrative when it comes to introducing this technology into modern warfare and integrating it into society. Some may argue that these concepts have been pushed to one side in a bid to advance this technology as quickly as possible and with as few hurdles as possible. Whilst the terms are often used interchangeably, 'ethics' and 'morals' are in fact two distinct concepts. When we consider what is meant by morals, we are referring to notions of right and wrong; such notions influence us individually in our daily lives and should be seen as subjective. Ethics, however, are norms that are shared by a group and are formed around mutual reciprocal recognition. 
Clearly the differences here are somewhat subtle, yet they are important nonetheless; this chapter will focus mainly on the ethical implications of AWS and will consider how such ethics exist in the broader international community. The legality of implementing AWS has been at the forefront of the discussion regarding their development and deployment into armed conflict; however, the technological developments involved in this weaponry raise some serious concerns in relation to IHL, human rights and disarmament agreements. As with any new weapons development, each country strives to be in possession of the most advanced technology in order to gain a strategic advantage. With the US government predicting the automation of armed conflict by the year 2032, it would be logical to assume that the 40 countries currently developing AWS for their militaries will be anxious to possess the most up-to-date technologies. This considered, the continued development of AWS and LAWS has the potential to trigger a global arms race, a concern raised by several experts, including Robert Sparrow. Human Rights Watch made the following statement regarding this point: “But the temptation will grow to acquire fully autonomous weapons, also known as ‘lethal autonomous robotics’ or ‘killer robots’. If one nation acquires these weapons, others may feel they have to follow suit to avoid falling behind in a robotic arms race.” If one considers what is meant by this, we are brought back to the reason why AWS were developed in the first place: to play a valuable role in armed conflict and reduce the mortality rate of soldiers. Autonomous systems as a whole provide countless advantages to the human race; their introduction into biology, neuroscience and cybernetics has made the impossible a possibility, and the advancements in these areas are not showing any signs of slowing down. 
However, this is not the area of robotics that has sparked calls for a pre-emptive ban; the robotic revolution in military weaponry is where the core fear stems from, and the developments in this area raise numerous ethical and moral questions. The kind of autonomous weapons that feature in science fiction films are of course not the technology we are dealing with, as it is yet to be developed; however, one could argue that the calls for a pre-emptive ban are in fact based on these sci-fi-orientated depictions of AWS becoming a reality and the ethical minefield that would be attached to this. Attempts have been made to create some kind of laws or set of rules that would be able to govern AI so that they can make ethical decisions about their own actions. One attempt comes in the form of the Three Laws of Robotics put forward by Isaac Asimov. These rules provide an elegant set of ethical principles in relation to robotics; the three laws are as follows: (i) “a robot may not injure a human being, or through inaction, allow a human being to come to harm”; (ii) “a robot must obey the orders given it by human beings, except where such orders would conflict with the first law”; (iii) “a robot must protect its own existence as long as such protection does not conflict with the first or second law”. Whilst these laws are successful in the realms of science fiction, they have very little practical application to the contemporary autonomous military reality we are facing, and academics such as Ronald Arkin generally regard them as an inadequate basis for machine ethics. Whilst Asimov’s Laws can essentially be disregarded in this discussion, the concept of applying some degree of ethical thinking and moral responsibility to AWS cannot be ignored, and one could argue that the ethics surrounding AWS hold a higher level of importance than their actual legality. 
This may seem a somewhat drastic view to take; however, the implementation of AWS, and indeed the rapid development of AWS technology, produces a legal anomaly and an accountability gap. The race to keep up with developments in this area has seemingly blinded the parties involved to their moral and legal responsibilities and to whether this technology would be able to comply with international humanitarian law, human rights law or the laws of armed conflict. Having said this, there has not been a complete failure in this department: the Martens Clause, which appears in the Geneva Conventions, creates a legal obligation for states to consider moral implications when assessing new technology. Its application becomes necessary where there is no specific existing law on the topic, and it would therefore be applicable to AWS and LAWS given the limited laws in place to deal with them. In particular, this clause highlights the requirement for new technology to comply with the principles of humanity and the dictates of public conscience. Many of the AWS in use are human-on-the-loop weapons, yet with the technological advancements in progress, the machines have a quicker reaction time than humans; this removes the ability of a human operator to override the robot's actions before impact. If we consider this fact in conjunction with the Martens Clause, we notice that the technology exposes a gap in international law that threatens the human right to dignity and presents an ethical conundrum. Fundamental ethical implications of AWS were highlighted by Lin in 2008 in the form of three core questions: (i) whether LAWS would be able to follow established guidelines of IHL and the rules of engagement, as specified in the Geneva Conventions; (ii) whether they would know the difference between military and civilian personnel; and (iii) whether they would recognise a wounded soldier and refrain from shooting. 
When considering these implications, it becomes apparent that they all relate back to the issue of responsibility and accountability for AWS; all of the questions posed by Lin's ethical implications can be answered if asked in relation to humans, because humans are able to process information in both a logical and an emotional way, something that AWS lack. If we discuss these matters in the context of AWS, we start asking the following question: if a robot acts for itself, or in an unpredictable way that its human controllers do not understand, who will be held responsible? This question was considered in chapter 2 in relation to the legality of assigning responsibility and accountability for AWS; however, if we look at it from an ethical standpoint, we are dealing with the lives of innocent civilians whom the technology could not identify as such. Take the example of an AWS in the form of a drone that has been programmed to protect a border: this drone then erroneously identifies a mother and child seeking asylum as a hostile threat and shoots them both dead. If this duty were given to a human soldier, they would be able to distinguish between a threat and a civilian, yet at present AWS do not have this capability. Separate from, yet not dissimilar to, the ethical concerns raised by AWS technology is the morality of AWS, with one of the main concerns being whether or not their algorithms can be considered sufficiently discriminating. This takes us back to the issue of distinction discussed above and provides us with one question that is integral to this discussion: should we relinquish the decision to kill a human to a non-human machine? This is a question that Aaron M. Johnson and Sidney Axinn discuss in their paper on the morality of autonomous robots, a question that one may argue is at the centre of all discussion, be that legal, ethical, political or moral. 
The engineering involved in the development of AWS allows this technology to advance in a way that in theory makes it more compliant with IHL, calculating and performing tasks with levels of accuracy and efficiency that no human could possibly achieve. Yet despite this, it still fails to account for the moral judgements involved in these decisions; regardless of the progress made in machine development, the concept of integrating empathy and real human emotions such as love, guilt and mercy into a machine is nothing short of science fiction. For example, consider a situation in armed combat where an enemy soldier finds themselves in the firing line of the opposition's AWS and attempts to surrender, or pleads to be taken as a prisoner: the robot simply does not have the emotional capacity to make a judgement based on morals. The enemy soldier will consequently end up being shot and killed, as this is the programming of the AWS. It is not unreasonable to assume that this scenario is capable of becoming a reality, nor is it unreasonable to assume that at some point in the future robotics will develop to the stage at which machines have an artificial sense of morality and are able to make decisions based on basic human morals embedded into their algorithms. Hellstrom takes the view that futuristic robots will be programmed with highly advanced cognitive abilities to perceive, plan and learn, as well as a multitude of complex behaviours. He argues that this increase in autonomy could open up the possibility of military commanders treating them as though they were human soldiers, directing orders at them and expecting tasks to be carried out in the same way a human would carry them out. CONCLUSION: The questions this dissertation has sought to address concern the way in which AWS sit within legal and ethical frameworks, and whether those frameworks are suitable. 
This conclusion will look at and evaluate these findings, then postulate some suggestions for the future that may be appropriate. Chapter one provided some definitional parameters that allowed the discussion regarding AWS to develop, and in doing so provided the foundations of the issues at hand. Autonomous weapons are not an entirely new form of technology, with basic forms such as anti-personnel landmines first used on a wide scale in World War II. Whilst these landmines work on the basis that no direct human intervention is needed to set them off, the autonomous weapons currently being developed are embedded with extremely sophisticated software that allows for a more effective and advanced weapon. At the forefront of this discussion was the fact that AWS technology is being developed at an increasingly rapid rate with little consideration of the legal or ethical implications of such developments. Furthermore, the lexicon surrounding AWS is failing to adapt at a sufficient rate, consequently paving the way to dramatic definitional ambiguity. An example of such ambiguity can be seen in the way 'autonomy' and 'automation' are used interchangeably, despite having different meanings and applications. Whilst the misuse of certain terminology could be seen as trivial, it actually presents a serious issue that is deeply embedded in the legislation governing the use of AWS. With approximately 30 countries either deploying or developing defensive systems that can be placed somewhere on the spectrum of 'autonomous', this is not simply a domestic issue, but one with international implications. One of the most central points in relation to definitional parameters is that, at present, fully autonomous weapons are essentially a thing of science fiction and do not currently exist in the way people believe they do. 
The prospect of a 'killer robot' and the potential dangers it poses has sparked fear and unease among civilians and academics alike, with widespread calls for a pre-emptive ban. However, the research conducted for this dissertation has led me to believe that, whilst there should be caution and perhaps a certain level of apprehension regarding the development of AWS, a pre-emptive ban could be seen as a fear-based decision rooted in the imagery of anthropomorphic robots depicted in science fiction. A global consensus on what is deemed to be an autonomous weapon would provide much-needed clarity, allowing legislation to be implemented effectively and thus governing the use of such weaponry. A further concern identified in chapter one was the self-awareness of AWS and the diminished human control that would follow. The delegation of human decision-making responsibilities to AWS is a concern that many academics share, including Crootof and Asaro. Whilst it may seem a modest solution, introducing clarity around what is classified as an autonomous weapon offers a way to reduce the uncertainty surrounding the development of self-aware AWS and whether they would be able to comply with the laws of war. Fully autonomous weapons are, again, merely hypothetical, and the confusion here is once again rooted in conflicting definitions of what qualifies as an AWS. Without definitional clarity it seems unlikely that any form of responsibility or accountability could be enforced, leaving the floodgates open to a whole host of detrimental ethical implications. The issues of responsibility and accountability for AWS formed one of the key discussions at the centre of this dissertation. The widespread definitional ambiguity surrounding AWS has made holding any human agent responsible or accountable extremely difficult.
Chapter two identified the parties that could be exposed to potential liability, and considered criminal responsibility and the accountability gap that has developed. Any violations committed by AWS technology are likely to contain criminal elements; however, the application of criminal responsibility to AWS faces various difficulties because this legislation was designed for human application, and elements of criminal responsibility such as mens rea are not compatible with the capabilities of AWS. It was interesting to note, however, that the general principles of criminal responsibility, specifically those relating to German law, together with individual criminal responsibility for war crimes, provided an opportunity to re-design these existing laws in the context of AWS. This was a somewhat unexpected discovery, given that the majority of current laws have no apparent leeway to integrate provisions dealing with AWS as they are implemented and developed. I found, however, that if one approaches an AWS as being akin to a Vordermann, the issue of its lack of capacity to act with intent falls away. The lack of human qualities AWS possess has been the root of an extensive range of problems covered within this research, but if we were to consider AWS as innocent agents, we then allow ourselves the possibility of attributing responsibility and accountability to the human agent. Further to the responsibility and accountability concerns addressed, I hoped to discuss the legal and ethical implications that AWS pose; chapter three provided an outlook on these issues. Considering the global impact of the deployment of AWS, a discussion of international humanitarian law and the human rights implications provided an appropriate angle from which to consider fully the legal effects that AWS threaten.
The right to human dignity emerged as one of the most prevalent issues in relation to both the legal and ethical implications of AWS; this may be because it is a right that every nation has in common and something humanity holds dear. It seems to me that the development of this technology will continue regardless of whether legislation keeps pace with these advancements. If we ask whether the laws regarding responsibility and accountability for the actions of AWS are sufficient, the simple answer is no. International and domestic laws provide the potential for appropriate adaptation, yet a large proportion of the relevant parties are preoccupied with either calling for a pre-emptive ban or preparing for a potential arms race.

Bibliography:

Cases:
Prosecutor v Delalić and others, International Criminal Tribunal for the Former Yugoslavia, 1997-1999, page 462
Prosecutor v Dusko Tadić a/k/a "DULE", 'Decision on the Defence Motion for Interlocutory Appeal on Jurisdiction' (International Criminal Tribunal for the former Yugoslavia)
Prosecutor v Strugar, ICTY, Case No. IT-01-42-A, Judgement (Appeals Chamber), 17 July 2008, paras 297-98
Prosecutor v Halilović, ICTY, Case No. IT-01-48-T, Judgment (Trial Chamber), 26 November 2005, para 54
Prosecutor v Delalić et al, ICTY, Judgement of 16 November 1998

Legislation:
'S.397 – 109th Congress: The Protection of Lawful Commerce in Arms Act', 2005
International Covenant on Civil and Political Rights, adopted and opened for signature, ratification and accession by General Assembly resolution 2200A (XXI) of 16 December 1966, entered into force 23 March 1976, in accordance with Article 49
International Criminal Court, Elements of Crimes, art 8 intro, U.N. Doc.
PCNICC/2000/1/Add.2 (30 June 2000)
UK Ministry of Defence, Joint Doctrine Note 2/11: The UK Approach to Unmanned Aircraft Systems, 2-3 (2011)
UK Ministry of Defence, Joint Doctrine Note 3/10: Unmanned Aircraft Systems: Terminology, Definitions and Classification (May 2010)

International Sources:
Alon M, 'The Duty to Investigate Civilian Casualties During Armed Conflict', Yearbook of International Humanitarian Law (Springer; Cambridge University Press 2012)
Asaro P, 'On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making', International Review of the Red Cross, Volume 94, Number 886 (Summer 2012)
Heyns C, 'Autonomous Weapons Systems and Human Rights Law', presentation made at the informal expert meeting organized by the state parties to the Convention on Certain Conventional Weapons (13-16 May 2014, Geneva, Switzerland)
Heyns C, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', UN Doc A/HRC/23/47 (9 April 2013) 17
Human Rights Watch, 'The "Killer Robots" Accountability Gap: Obstacles to Legal Responsibility Show Need for Ban' (2015)
Human Rights Watch, Losing Humanity: The Case Against Killer Robots (New York: Human Rights Watch and the International Human Rights Clinic, 19 November 2012)
Human Rights Watch, Mind the Gap: The Lack of Accountability for Killer Robots (9 April 2015)
ICRC, Customary IHL Database, ihl-databases.icrc.org/customary-ihl/eng/docs/citation (accessed 30 January 2020)
Statute of the International Court of Justice, 59 Stat 1055 (1945), arts 34-39; Practical Information: Frequently Asked Questions, The International Court of Justice

Books:
Bhuta N, Beck S, Geiß R, Liu H and Kreß C (eds), Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge: Cambridge University Press 2016)
doi:10.1017/CBO9781316597873.005
Shanahan M, The Technological Singularity (The MIT Press 2015)
Jha U C (Wg Cdr (Dr), Retd), Killer Robots: Lethal Autonomous Weapons Systems - Legal, Ethical and Moral Challenges (2016)

Journal Articles:
Arkin R, Governing Lethal Behaviour in Autonomous Robots (2009) 48; see also Saenz A, 'Myth of Three Laws'
Krishnan A, Killer Robots: Legality and Ethicality of Autonomous Weapons (USA: Ashgate 2009)
Leys N, 'Autonomous Weapons Systems and International Crises', Strategic Studies Quarterly, Vol 12, No 1 (Spring 2018)
Kastan B, 'Autonomous Weapons Systems: A Coming Legal "Singularity"?', 2013 Journal of Law, Technology and Policy 45 (2013)
Chu V S, 'The Protection of Lawful Commerce in Arms Act: An Overview of Limiting Tort Liability of Gun Manufacturers', Congressional Research Service (2012)
Crootof R, 'The Killer Robots Are Here: Legal and Policy Implications' [2015] Cardozo Law Review, Vol 36
Docherty B, 'The "Killer Robots" Accountability Gap: Obstacles to Legal Responsibility Show Need for Ban' (2015)
Foy J, 'Autonomous Weapons Systems: Taking the Human Out of International Humanitarian Law', Dalhousie Journal of Legal Studies 23 (2014)
Hammond D N, 'Autonomous Weapons and the Problem of State Accountability', Chicago Journal of International Law, Vol 15, No 2 (2015)
Hazard G C Jr, 'Law, Morals and Ethics' (1995)
Horowitz M C, 'Why Words Matter: The Real-World Consequences of Defining Autonomous Weapons Systems', 30 Temp Int'l & Comp LJ 85 (2016)
Jain N, Perpetrators and Accessories in International Criminal Law: Individual Modes of Responsibility for Collective Crimes (2014)
Johnson A M and Axinn S, 'The Morality of Autonomous Robots', Journal of Military Ethics 129-141 (2013)
Marra W and McNeil S, 'Understanding "The Loop": Regulating the Next Generation of War Machines', 36 Harvard Journal of Law and Public Policy 36
(2013)
McFarland T and McCormack T, 'Mind the Gap: Can Developers of Autonomous Weapons Systems Be Liable for War Crimes?', International Law Studies, US Naval War College, Volume 90 (2014)
Lin P, Bekey G and Abney K, 'Autonomous Military Robotics: Risk, Ethics and Design', report prepared for the US Department of the Navy, Office of Naval Research (2008)
Scharre P and Horowitz M C, 'An Introduction to Autonomy in Weapon Systems', Centre for a New American Security (2015)
Schlatow J et al, 'Self-Awareness in Autonomous Automotive Systems', Design, Automation & Test in Europe Conference & Exhibition (2017)
Tononi G, 'Consciousness as Integrated Information: A Provisional Manifesto', The Biological Bulletin, Vol 215, No 3 (2008)

Newspaper Article:
McDonald H, 'Ex-Google Workers Fear "Killer Robots" Could Cause Mass Atrocities', The Guardian (2019)

Online Sources:
Miller S, 'Autonomous Weapons: An Introduction' <http://counterterrorismethics.com/autonomous-weapons-an-introduction/>
'The Difference Between the Moral and the Legal' (Reason and Meaning 2020) <https://reasonandmeaning.com/2016/03/31/the-difference-between-the-moral-and-the-legal/> accessed 17 April 2020

Footnotes:
Henry McDonald, 'Ex-Google Workers Fear "Killer Robots" Could Cause Mass Atrocities', The Guardian (London, 15 September 2019).
Rebecca Crootof, 'The Killer Robots Are Here: Legal and Policy Implications' [2015] Cardozo Law Review.
Robert Sparrow, 'Killer Robots' (2007) Vol 24 Journal of Applied Philosophy.
Christof Heyns, 'Autonomous Weapons Systems and Human Rights Law', presentation made at the informal expert meeting organized by the state parties to the Convention on Certain Conventional Weapons (13-16 May 2014, Geneva, Switzerland); UN Doc A/HRC/26/36.
The advancements referred to here can be seen not only in the rapid development of automated vehicles and weapons; artificial intelligence has also had a vital impact on society, from medical innovations using AI to simple household gimmicks such as 'Alexa'.
Murray Shanahan, The Technological Singularity (The MIT Press 2015).
'The Difference Between the Moral and the Legal' (Reason and Meaning 2020) <https://reasonandmeaning.com/2016/03/31/the-difference-between-the-moral-and-the-legal/> accessed 17 April 2020.
Wg Cdr (Dr) U C Jha (Retd), Killer Robots, Lethal Autonomous Weapon Systems: Legal, Ethical and Moral Challenges (VIJ Books (India) Pty Ltd 2016).
ibid.
Rebecca Crootof, 'The Killer Robots Are Here: Legal and Policy Implications' [2015] Cardozo Law Review.
Human Rights Watch & International Human Rights Clinic, Harvard Law School, 'Losing Humanity: The Case Against Killer Robots' 1-2, 5 (2012).
Paul Scharre and Michael C Horowitz, 'An Introduction to Autonomy in Weapon Systems' (2015) Centre for a New American Security.
Whilst many academics, such as Kenneth Anderson, Daniel Reisner and Matthew Waxman ('Adapting the Law of Armed Conflict to Autonomous Weapon Systems', International Law Studies, Vol 90 (2014), p 406), have argued that limiting the specific situations in which the weapons could be used would reduce the risk to civilians, the opposition argues that a narrowly constructed hypothetical case cannot legitimise the use of these weapons, prompting calls for a pre-emptive ban.
Human Rights Watch, 'Losing Humanity: The Case Against Killer Robots' (19 November 2012) New York: Human Rights Watch and the International Human Rights Clinic.
William Marra and Sonia McNeil, 'Understanding "The Loop": Regulating the Next Generation of War Machines' (2013) 36 Harvard Journal of Law and Public Policy, pp 1139-85.
US Department of Defense Directive No.
3000.09, Autonomy in Weapon Systems (21 November 2012) <http://www.dtic.mil/whs/directives/corres/pdf/300009p.pdf>.
ibid.
ibid.
UK Ministry of Defence, Joint Doctrine Note 3/10, 'Unmanned Aircraft Systems: Terminology, Definitions and Classification' (May 2010).
ibid.
Rebecca Crootof, 'The Killer Robots Are Here: Legal and Policy Implications' [2015] Cardozo Law Review, Vol 36, p 1854.
ibid, p 1845.
Peter Asaro, 'On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making' (International Review of the Red Cross, Summer 2012) Volume 94, Number 886.
Giulio Tononi, 'Consciousness as Integrated Information: A Provisional Manifesto' (2008) The Biological Bulletin, Vol 215, No 3.
Johannes Schlatow et al, 'Self-Awareness in Autonomous Automotive Systems', Design, Automation & Test in Europe Conference & Exhibition (2017).
ibid.
UK Ministry of Defence, Joint Doctrine Note 2/11: 'The UK Approach to Unmanned Aircraft Systems' 2-3 (2011).
Nehal Bhuta, Susanne Beck, Robin Geiß, Hin-Yan Liu and Claus Kreß, Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge University Press 2016), pages 303-324.
United Nations General Assembly, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions' (United Nations General Assembly, Human Rights Council 2013) A/HRC/23/47.
James G Foy, 'Autonomous Weapons Systems: Taking the Human Out of International Humanitarian Law' (2014) Vol 23 Dalhousie Journal of Legal Studies, p 58.
German Criminal Code, in the version promulgated on 13 November 1998, Federal Law Gazette [Bundesgesetzblatt] I, p 3322, last amended by Article 1 of the Law of 24 September 2013, Federal Law Gazette I, p 3671, and the text of Article 6(18) of the Law of 10 October 2013, Federal Law Gazette I, p 3799.
Rome Statute of the International Criminal Court, UN Doc A/CONF.
183/9, 1 July 2002.
Article 30 of the Rome Statute.
IMT Charter (Nuremberg), Article 6; IMT Charter (Tokyo), Article 5; ICTY Statute, Articles 2-3; ICC Statute, Articles 5 and 25.
Prosecutor v Dusko Tadić a/k/a "DULE", 'Decision on the Defence Motion for Interlocutory Appeal on Jurisdiction' (International Criminal Tribunal for the former Yugoslavia).
International Criminal Court, Elements of Crimes, art 8 intro, U.N. Doc. PCNICC/2000/1/Add.2 (30 June 2000).
Tim McFarland and Tim McCormack, 'Mind the Gap: Can Developers of Autonomous Weapons Systems Be Liable for War Crimes?' (2014) International Law Studies, US Naval War College, Volume 90.
Robin Geiß, 'The International-Law Dimension of Autonomous Weapons Systems' (October 2015), International Policy Analysis, Germany, p 209.
'S.397 – 109th Congress: The Protection of Lawful Commerce in Arms Act', 2005.
Vivian S Chu, 'The Protection of Lawful Commerce in Arms Act: An Overview of Limiting Tort Liability of Gun Manufacturers' (2012) Congressional Research Service.
The PLCAA defines 'negligent entrustment' as 'the supplying of a qualified product by a seller for use by another person when the seller knows, or reasonably should know, the person to whom the product is supplied is likely to, and does, use the product in a manner involving unreasonable risk of physical injury to the person or others'; a plaintiff's claim of negligent entrustment will be asserted under state law.
Robert Sparrow, 'Killer Robots' (2007) Journal of Applied Philosophy, Vol 24, No 1.
Human Rights Watch, 'Making the Case: The Dangers of Killer Robots and the Need for a Pre-emptive Ban' (2016).
Tim McFarland and Tim McCormack, 'Mind the Gap: Can Developers of Autonomous Weapons Systems Be Liable for War Crimes?' (2014) International Law Studies, US Naval War College,
Volume 90.
Robin Geiß, 'Autonomous Weapons Systems: Risk Management and State Responsibility' (2016) Third CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS), Geneva, 11-15.
ibid.
Article 1 of the Geneva Conventions I-IV.
Robin Geiß, 'Autonomous Weapons Systems: Risk Management and State Responsibility' (2016) Third CCW Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS).
ibid.
Daniel N Hammond, 'Autonomous Weapons and the Problem of State Accountability' (2015) Chicago Journal of International Law, Vol 15, No 2.
Statute of the International Court of Justice, 59 Stat 1055 (1945), arts 34-39; Practical Information: Frequently Asked Questions, The International Court of Justice.
Prosecutor v Delalić and others, International Criminal Tribunal for the Former Yugoslavia, 1997-1999, page 462.
Prosecutor v Halilović, ICTY, Case No. IT-01-48-T, Judgment (Trial Chamber), 26 November 2005, para 54.
Wg Cdr (Dr) U C Jha (Retd), Killer Robots: Lethal Autonomous Weapons Systems - Legal, Ethical and Moral Challenges (VIJ Books (India) Pty Ltd 2016), page 80.
The Rome Statute of the International Criminal Court, Article 28 (Responsibility of Commanders and Other Superiors).
Prosecutor v Strugar, ICTY, Case No. IT-01-42-A, Judgement (Appeals Chamber), 17 July 2008, paras 297-98.
ibid.
Prosecutor v Delalić et al, ICTY, Judgement of 16 November 1998.
Robert Sparrow, 'Killer Robots' 24 Journal of Applied Philosophy (Sparrow can be seen to argue that being able to attribute legal or moral responsibility to someone forms one of the fundamental elements of a just war).
Bonnie Docherty, Senior Arms Division Researcher at Human Rights Watch and lead author of 'The "Killer Robots" Accountability Gap: Obstacles to Legal Responsibility Show Need for Ban' (2015).
Human Rights Watch, 'The "Killer Robots" Accountability Gap: Obstacles to Legal Responsibility Show Need for Ban' (2015).
Bonnie Docherty, Senior Arms Division Researcher at Human Rights Watch and lead author of 'The "Killer Robots" Accountability Gap: Obstacles to Legal Responsibility Show Need for Ban' (2015).
Universal Declaration of Human Rights, UN GA Res 217 (III) A, 10 December 1948.
Wg Cdr (Dr) U C Jha (Retd), Killer Robots: Lethal Autonomous Weapons Systems - Legal, Ethical and Moral Challenges (2016).
International Humanitarian Law, Volume II, Chapter 1, Section A, Rule 1, 'The Principle of Distinction between Civilians and Combatants'.
International Humanitarian Law, Volume II, Chapter 2, Section A, Rule 7, 'The Principle of Distinction between Civilian Objects and Military Objects'.
Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8 June 1977, 1125 UNTS 3, art 48 (entered into force 7 December 1979) [Additional Protocol I].
Robert Sparrow, 'Twenty Seconds to Comply: Autonomous Weapon Systems and the Recognition of Surrender', International Law Studies, Vol 91, 2015, pp 699-728.
ibid.
Armin Krishnan, Killer Robots: Legality and Ethicality of Autonomous Weapons (USA: Ashgate 2009), p 98.
Robert Sparrow, 'Twenty Seconds to Comply: Autonomous Weapon Systems and the Recognition of Surrender', International Law Studies, Vol 91, 2015, pp 699-728.
Protocol Additional to the Geneva Conventions of 12 August 1949, and Relating to the Protection of Victims of International Armed Conflicts (Protocol I), 8 June 1977, 1125 UNTS 3, art 48 (entered into force 7 December 1979) [Additional Protocol I].
Wg Cdr (Dr) U C Jha (Retd), Killer Robots: Lethal Autonomous Weapons Systems - Legal, Ethical and Moral Challenges (VIJ Books (India) Pty Ltd 2016), p 73.
Noel E Sharkey, 'The Evitability of Autonomous Robot Warfare', International Review of the Red Cross (2012), Vol 94, No 886, p 789.
Benjamin Kastan, 'Autonomous Weapons Systems: A Coming Legal “Singularity”?
’, 2013 Journal of Law, Technology and Policy 45 (2013) 45-82.
Article 85, para 3(b) of Additional Protocol I and Article 8(b)(iv) of the Rome Statute of the International Criminal Court. See also Alon Margalit, 'The Duty to Investigate Civilian Casualties During Armed Conflict', Yearbook of International Humanitarian Law, Vol 15, 2012 (Springer; Cambridge University Press), pp 155-186.
Christof Heyns, 'Autonomous Weapons Systems and Human Rights Law', presentation made at the informal expert meeting organized by the state parties to the Convention on Certain Conventional Weapons, 13-16 May 2014, Geneva, Switzerland.
International Covenant on Civil and Political Rights, adopted and opened for signature, ratification and accession by General Assembly resolution 2200A (XXI) of 16 December 1966, entered into force 23 March 1976, in accordance with Article 49.
Article 1 of the Universal Declaration of Human Rights.
Article 1 of the Universal Declaration of Human Rights, UN GA Res 217 (III) A, 10 December 1948.
The 'Riobot' is being developed in order to control unrest at the mines in Africa.
Christof Heyns, 'Autonomous Weapons Systems and Human Rights Law', presentation made at the informal expert meeting organized by the state parties to the Convention on Certain Conventional Weapons, 13-16 May 2014, Geneva, Switzerland.
Christof Heyns, 'Report of the Special Rapporteur on Extrajudicial, Summary or Arbitrary Executions', UN Doc A/HRC/23/47, 9 April 2013, 17.
Dieter Birnbacher, 'Are Autonomous Weapons Systems a Threat to Human Dignity?' in N Bhuta, S Beck, R Geiß, H Liu and C Kreß (eds), Autonomous Weapons Systems: Law, Ethics, Policy (Cambridge: Cambridge University Press 2016), pp 105-121.
doi:10.1017/CBO9781316597873.005.
ibid.
Nehal Bhuta, Susanne Beck, Robin Geiß, Hin-Yan Liu and Claus Kreß, Autonomous Weapons Systems: Law, Ethics, Policy (2016), page 121.
Geoffrey C Hazard Jr, 'Law, Morals and Ethics' (1995) 19 S Ill U LJ 447, page 451.
Geoffrey C Hazard Jr, 'Law, Morals and Ethics' (1995) 19 S Ill U LJ 447, page 453.
Wg Cdr (Dr) U C Jha (Retd), Killer Robots: Lethal Autonomous Weapons Systems - Legal, Ethical and Moral Challenges.
Robert Sparrow, 'Killer Robots', Journal of Applied Philosophy, Vol 24, No 1, 2007, pp 62-77.
A quote given by Human Rights Watch in the discussion regarding the advancement of AI in the development of AWS and how these advancements are likely to revolutionize the way wars are fought, 21 October 2003.
Isaac Asimov, The Three Laws of Robotics.
Ronald Arkin, Governing Lethal Behaviour in Autonomous Robots (2009) 48; see also Aaron Saenz, 'Myth of Three Laws' (10 May 2011) <https://singularityhub.com/2011/05/10/the-myth-of-the-three-laws-of-robotics-why-we-cant-control-intelligence/>.
The Martens Clause is part of the laws of armed conflict, appearing in the preamble to the 1899 Hague Convention (II) with Respect to the Laws and Customs of War on Land.
Robots that can select targets and deliver force under the oversight of a human operator who can override the robots' actions.
Whilst the three core ethical principles make reference to LAWS, due to the inherent similarities between LAWS and AWS, for the purpose of ethical consideration Lin's three fundamental ethical implications can be used in relation to the conversation on AWS.
Patrick Lin, George Bekey and Keith Abney, 'Autonomous Military Robotics: Risk, Ethics and Design', report prepared for the US Department of the Navy, Office of Naval Research, 2008, p 21.
Aaron M
Johnson & Sidney Axinn (2013), 'The Morality of Autonomous Robots', Journal of Military Ethics, 129-141, DOI: 10.1080/15027570.2013.818399.
In this situation the consideration of AWS as being akin to a Vordermann (the direct perpetrator of the criminal act) is based on the concept of a person committing a crime through another, being indirect perpetration. The Hintermann controls the Vordermann in such a way that he is manipulated and used as a tool, analogous to the way in which AWS are used as tools.
KHhH4i+KoP2gtX8VTzeHNYOqNHp+qW2oSQBjIfNAiDhDkbQSAOoriv2ev2Tf+CKOr/sFTeB/jB8e
fBln8U/FV5ofiaPxNd2k/wBr8MxKltJfaL5IlWOXLLLE8hGQXJHQV9qav8Af+Dbi7+NHjTx/pXxS
+Htn4c134aL4T8N+FlS6MWh+IwCP7ejm87dLISQfKb5eKAPzH/4Jc/8ABM/9kX9r34W+A/i98dvi
D4a8LWen/EbxJpXxL0TWNeXTdRv9BhsrQ6UdNRgfKcXMkpkkJwy4A6V4b/wU2/4J+/AT9ib4d6Xr
Hw08c+GPGeq+JPih4lj0xPDGsDUlsPB0NtbvpMN4m1dt2rmUSSDKsCAOlfbX/BP/APZs/wCCfnwq
+D/7Y/xt+JvhzRf2i/DfwOsvDN/4O1h5J9Oj1WC8jn+0tD5Un7vdJhDndzFx1r5x/wCC+X7P/wCz
V8Dfiz8FtY/Zf8FWHgPRvHvwQ0rx1f6Jp80syLealcSv8zysxZkj2x5BAO3OBmgD8HK9g/Z78M+C
/G3x98DeDfiRKbfw7q/jDRtL1+cS/ZzHp11exQ3TiUjEeIWY7z93r2rx+oLlilrIw7I2PyoA/rov
v+CD/wCxc3wa1bTrD43fDRfHbfFU3Wj38nioHTo/AHmg/Y5Ytm46j5WV8wDbnmvx7/ZWs/hJ8D/+
C3HgfTPC2pW1v4H8KftER2ul6vfXSSQpo1hqrxwXEtycKy+SisZDwevANf1I+Ef2Mv8Aglc3x18G
fsW6j+z54euNa179mCP4y3HjVr67BF4kCwvEbcSgFnkJk35wD2r8Lv8Agmx+zr/wSH1n9kzxbr/7
ZPxg8J6d478deGLzR9B0fV7eYXPgnUYLm5hhv0McircPJH5MoRhjGBnrQB+tf/BRbS/Avwu/4J+f
tvahd/FvwT4zm+MfxX0jxx4L0LRtbF9dWemnVrHNsYHb5GUKWKxfIACegr+fX/gmb+wV8Df22/Af
ip/iF448NeC9c8OeOPCTrL4o1YadBeeGbgXja1DbptYvckRw+U/RDnPUV+z3hH9mz/g3S0T4i/B/
xTrvxg+H+o6R4H8J3GifELRGguY18aarLb+VHqU7ibdbtHJ+8CR8E8dK+aP+CeP7En/BPPWf+Cjv
xR8ER6hoXx/+Gnhz4G658SNNaNJrK0h1G3vI3FkpSXfvggxHuLYIkzjIoA8f/wCCn/8AwTC/Y3/Z
L+GHxM+NHwE+IXhrxJp0/inwnp3wv0DS9fXUtVsrCdZF1ptRQKBLudVMTgnYuc1/Ow/3a/og/wCC
t/ww/Y31H/gn3+zD+2V+yt8KtN+Fs/xWm8TXOsaVY3U12/lWBhihieWZmDbWDMCFX72DnFfzvv8A
doA/1uf+CHP/ACiU+A3/AGIVr/6Nlr9Wa/Kb/ghz/wAolPgN/wBiFa/+jZa/VmgAooooA/ky/wCD
wD/kwn4df9ldtf8A0z6lX4ef8GlH/KTzXv8AskOtf+nLTK/cP/g8A/5MJ+HX/ZXbX/0z6lX4df8A
BpMR/wAPPNe/7JFrX/py0ygD+pr/AILy/wDBI3R/+CkX7Pv/AAnvw1tYofi34Gsp7jwrdDCtqtp/
rZ9JnbgESkEwMfuS9MB2r/PC/Yc/bF+Nn/BNr9rHS/jp4FjuLXUtCvJtH8VeG7zdCNQ09nCX2nXU
bYKt8uVJGY5VVh0wf9jWv4TP+DnT/gj82gajqH/BSb9nPSwLG5ZD8VNGso+IZ3YImtoi/wAMhKpd
YHDYkPViAD+K/wAaahb6xqWs6zZoyRXt1e3kUb43Kk7vIFPbIDYOO9f6Kv8Awce6z4m0v/ghn4Us
9CaRbXUNZ8CWetFASDZi3MoV8fwmeOLr3xX+chqBB064x/zwk/8AQTX+vn+0h+x74Z/b0/4JnSfs
t+JpltF8TfD/AElNN1EqWNjqdtbwXFjc46kRzxpuA5KZA60Af5/n/BtZ4M+F3jj/AIK4+BbX4oQ2
tx/Zui65rXhy3vMNG+uWduDbMFPBkijaaWMdnQMOVFf6lHzf+zV/jU/En4ZftV/8E2f2pItF8YW2
qeB/iH4H1hL7Sr+IYzLAx8q6tJcbJ7eVSSCMq6EqeCRX72n/AIO1v2+f+FVf8IaPCfgYeJfsn2X/
AISwLcY3bdvn/Ys7PMz82PM257Y4oA+QP+DlHwP8LfAv/BWzxza/C+O1t/7S0TQ9a8RW9kAI49bu
4G+0sVXhZJYxFLIOpZix6mv6f/8Ag3O1XX9R/wCCE/i211lpGt7DVvH1npPmZwtobYSsqZ/hE8kv
45r+FH4ZfC79qf8A4KSftTSaB4Qg1Pxt8QvHOsNfapqEwLDzJ2Blu7uXASC3hUZJOFVAFUcAV/qV
fs8/sd+GP2Cf+CYp/ZZ8LT/bB4a+HusrqepbdpvtTurWe4vbnHHDzyMUB5CbR2oA/wAjHwfa299r
GjWN2u+GfULGCVCSA0cs0aOvGDypI4r/AGwPhp4D8G/DHwHpHgH4daVYaJoul2ENnpul6bCsFtbw
xoAqRxoAAPU9SeSSTX+KP4FI/wCEl0D/ALCumf8ApRFX+3Xpv/IOt/8Arin/AKCKALo6c1/lf/8A
ByZ/ymU+Kn/Xh4X/APTJa1/qgV/mpf8AB1R+zp4y+GP/AAUmf48XtrMPD3xI8L6XPp+obT5Jv9Hg
FhdW5foJERIpNp5KyDjigD+pH/g1z/5Q+eDsf9DZ4v8A/Tzc1+hf/BXD/lGF8e/+yV+Iv/SKSv8A
P1/4Jl/8HAX7QX/BNP8AZ3v/ANm7wr4T0TxZpDatd6zoU+qXMttJp1xfYa4RhGr+ZE0o8wDIILNz
g8f2b/Fr4/eNf2qf+DeDxT+0f8Rks4db8afs9an4g1OLT0MdtHPdWEjssSsWIQdBkn3oA/znf+Cb
oU/8FA/gWHAYf8LZ8J5DDIP/ABNLftX+yCyB8h8EEYII4Ir/ABv/APgm4wH/AAUC+Bf/AGVjwn/6
dLev9kWgD/Es+O8ENp8V/HNrbII44vFOvxxIgwqIl9OqqB2AAwK/0of+CxP/ACrqa1/2TnwH/wCl
elV/mx/H/wD5K949/wCxt8Q/+l9xX+pl+2b+zn4p/au/4Ie6t8DPAsLXGuap8F9Eu9Ftk5e4vtMs
7S/ghUf3pXtxGPdhQB/AX/wQJZV/4LBfA8kj/kP6iP8Ayk3lf6nXx3/5If4z/wCxU1b/ANI5a/xp
vgj8Yvib+y98c/Dvxt+HMj6R4t8Fa5Hqlh9rjJMN3asUeG4hYA7WBaORDjIJGRjNf0IftIf8HRP7
e/7Sfwql+Cnw78OeHfB15rlk+k6pq2iCbUNQuFnQxyraRyKFhMgYjJDsM8c4NAH81vw//wCRt0D/
ALC1h/6UJX+j/wD8HYP/ACiksv8AsqHhr/0TeV/nD+CAB410NAMbdYsRj6XCV/o8f8HYP/KKSy/7
Kh4a/wDRN5QB/MD/AMGtvP8AwV48O5A48B+KsEjp+4h6V/ch/wAFyraC4/4JGftAR3CrIo+HV/Kq
sMgPGyOjfVWUMD2IzX8OH/BrWR/w958O/wDYieKv/REFf3Nf8Fxf+URn7QP/AGTbUv5LQB/mof8A
BJQA/wDBUf8AZ+6f8lZ0D/0pFf64HxWt4bn4XeJbW6RZIpNA1GOVHGVZGtnBUg9QRwa/yP8A/gkk
R/w9H/Z+/wCys6B/6Uiv9cr4nf8AJNfEX/YDv/8A0negD/FB8KDZ4j0zZ8u3VLYLjjG2dcY+lf6U
3/B0MzN/wR1vCep8YeESf/AkV/mseFyP+Ek03/sK2/8A6PWv9Kf/AIOhv+UOd5/2OHhD/wBKRQB/
KX/wa/f8piPB3/Ym+LP/AEhWv9O3WNO0zVtKutI1yGG5sbm2ltry3uFDRSwSqVkR1bIKspIYHgg8
1/jvf8E9P23vGX/BPD9qrRP2qPh/pWn67f6RZX+mtpWpSPFDPbalF5M/zoCVcAZU4Iz2xX7pfthf
8HV/7Uf7Q3wU1T4R/B7wXpnw6uNbsZNP1PxHDfvf30dtMuyVbMGOJYnZSQJDuKg5AyAQAfzKfG/S
vCWi/FXxrovgt1k0G08Ta3aaS6nKnT4ryZIMEdvKVa/vc/4Lvy+IJ/8Ag3S+F8nizedTcfC86gZP
vmc2Sby2f4i2c+9fyk/8Egf+CYfxN/4KT/tP6L4XtdOuo/h7oWpWupeP/EUqFbaLT4ZBI1lHIQA9
zdAeWqgkgMXOAK/tH/4OqNOs9I/4JGtpGmxrBb2vxD8J29vDGMKkUUsiqij0VQAKAP5OP+DbL4Z+
Afip/wAFZvBuifEfSLDW7LT/AA5r+uWdpqUKzQJf2MCtb3HluCrPEzFk3AgNg4yAR/qVhQB+GK/z
Dv8Ag1w/5S9+Gv8AsRvFP/pPHX+nnQB/D7/weU/8evwA/wCu3i7/ANB0yvOf+DNrj4o/Hv8A7APh
X/0o1GvRv+Dyn/j1+AH/AF28Xf8AoOmV5z/wZtH/AIuj8fAe2g+FP/SjUaAP0g/4LUf8HEf/AAwT
8Urn9lj9mLQ9O8R+PrGzguPEWs6uzNpmjPcL5kVsYozvnuPLYSMu5VRWXLEnFfhX8NP+C8H/AAcA
/GW2k8d/CbQo/EejxSsHfR/Bb3VipQ/MnnRktx0Pz5FfnN/wXj+DHxG+DX/BVn4wN8Qra4jj8VeJ
pPGGgXkwPlXmlaiqNC0bdCIirQsAflMeDiv1j/4J6f8ABzf4Y/Yk/Yl8Mfsv6x8J7nV9W8G6a+m6
bqelXkFrZX673dJroMBIkzFv3hVX3HLZ5xQB/KD8Udb1zxL4q8TeJPE8C2up6nqmp6hqdrHEYFgu
7qaWaeMRH/ViOVmUIfu4x1Ff7I37C2f+GKPhAf8AqmHhf/0129f44/xP8XT/ABS+IXiX4gX0C2sv
ibXNT16e1jYssTapcy3LIrH7wQy7QcDOM8dK/wBLP/g3P/4KS+Mv29v2Z9V+H/i/w5ZaHJ8HNO8M
+Cob2yuGlGqKLB0Fw0bKPKO2BcqGbknnAFAH9F9FFFABRRRQAUUUUAFFFFAH/9X+/iiiigAooooA
KKKKACv8uf8A4Obf+Uwvj3/sXfC//psjr/UYr/Lp/wCDm9XH/BYXx2GBG7w14WcZ9P7NjGf0oA/J
r9jDwvrfjT9sH4U+F/DkJuL27+I3hwQx5CjEWowSuxLYAVEVmYk4ABJ6V/p3/wDBV34ZfC7xX/wT
x/aMl+Dth4Yfxr4v+G+ope3OmG0TUdWks7ctGs0qEPMyIpEYYkjoOtf5sH/BNL4efAv4tft1/Df4
cftNXw0vwHqur3UHiTUHv/7NENulhcyITdAgxZnSMZyM5x3r+0PxP/wTB/4N37Lw1qN3pXxGtnuo
bC4ltoz8QmbMqxsyAr5vOSBx3oA+gvgl4t/4JNfs+f8ABHX4Yftn/HDwV8ONQ08fDnRreS4Oh2E+
paxrsVosE9nEHj3y3b3Mbq2TkEMzEAE15x/wR8/bV/4Jf/8ABTu41b4aeJvgj8MvBXxG06a5vLXw
1No2nzJqOk72aKe0kMCmSSGPatwgGVb5hlCDX8sX7THxM+Hutf8ABBv9m74WaRremXPiLRviz4xv
tT0OG4V720tp5L3yZZoQdyI4ddpIAO4YPPPnn/BBD4geB/hZ/wAFYvhV47+I+rafoWi2Da99r1XV
JktraHzdGvI08yRyFXc7BRkjJIHUigD+wP8A4JN+BfgBr37bf7dNrZaV4U/4VbqvxE8P+EdN0uWK
1j0a6l0Oxlj1C2itWCxNHFcP82F2ljnvX4gf8Hb3hmysf2q/hB4j8JRWf/COJ8L5PDunPprRtbQy
6dqErtbIsZwnlwzxbVwBt6DFeuf8Ezv2Pf8AgkT+014C+J3xR/bO8aWuk+J5vjl40g06NfFp0ZJt
GF0stpOsCuoZZPNciQD5hjk4r4p/4L0fsnf8E0/2bvAfw31H9gbxPF4hv9W1vVbfxKkfiQ695NtD
bxNbtsLt5WXZhkfexjtQB/NlUkNheatImlaZE01xdultbwx8tJLKdiKvqWYgD3NR17B+zzofg7xP
8f8AwN4Y+IswtvD2o+MNGsNeuGm+zCLT572KO5czf8s9sLMd/wDCRu7UAf62fgD4ffCbT/gJ4dvd
WtfCyeP9P+E1p4NbVGktDqUIXTljkshck7xF52cpu25596/C3/gip4N/4J4+FP8AgjlYfGf9qfwt
8O3u/h3qfijSviHrPiTSrO4vLO9stVuHFtcSSxtI0vkyRLEuSWBRVzxVj/h1v/wbnhv+Sk2+c/8A
RQ3/APjtfz5eNPFPwC+H/wDwSF/ax/Z6+H3iLTJCP2r7KTwfpb36XF7f6DYXVlHDdRZbfPGIotxl
GQcEnvQB+6H/AATW/wCCm/8AwSW/bx/aX1/9nPxP8Dfh54Iur7VZF+GN1qmh6dt12yRQBFN+5xBf
OQ0iRZIZCADuBFe4fAH4efArwz/wcYfFXQ/hRpHhbRvBnhr9nXTfDPiqx06C2sdNh1bVb6Of7NLE
oSN5ZrYBnABJUYbpX8E37DmuaP4Y/bW+D/iXxDdQ2Nhp/wAU/Cl/e3t04iht4INWtpJJJHY4VEQF
mYnCjJr+rP4Q/Ar/AIJcftp/8FNv2vvHf7YHjXTrXTI/H+iz+CNSsvFA0i2v7e40/bdPE8bhbhVe
NBkEhTx1NAHoP/B2h8PfBeg/s0/AmH4MWGjWfhXwx4o13THstAEEdpp7alawy26CGAgRrKYZSCAB
u9yK/hmf7pr+uH/gtL+xH/wSI+AH7E8/xA/Yn8Yxa542XxZo1klgni1tZP2GeRxcv9lZ2ztUDL/w
9a/kfYZU59KAP9bj/ghz/wAolPgN/wBiFa/+jZa/Vmvyq/4IfwSW3/BJb4CxyjB/4QCzf8GeRh+h
r9VaACiiigD5L/a9/Yj/AGbP26vAWnfDP9p3w4viTRdK1ZNcsbNrme2EV4kUkCyboHRjiOVxgkjn
OMgV4f8Asn/8Emf2Dv2IPibP8Xv2afBEfh3X7nSJtDmv0vru5LWU8scskWyeV1GXhQ5AzxweTX6S
UUAFc54p8L+HvG3hy+8I+LbO21HS9TtJtP1HT7xBLBcW86FJIpEbhldWIIPrXR0UAfh5N/wbp/8A
BIaUMjfCmAK4YEDVdRAw3YD7R05wPSv2n0HRNP8ADOiWfhzRo/Ks7C2is7SLJOyGBBHGuTknCqBk
nJrcooA+UP2oP2I/2Vf2z/Da+Fv2mPBOi+KYYUKWlzew7by1z1NvdRlZo8k5IV8E9Qa/Igf8GvH/
AASdGuf2x/YHizyd+/8As7+25Ps/+7/q/Mx/wPPvX9E1FAHyj+y/+xL+yx+xf4bbwv8As0eCNE8K
wSoqXlzYw7r26C4x9oupC00nIyAzkA9AK+j/ABFoOmeKdAvfDWtxGaz1G0msbyEkr5kFwjRyLkHI
yrEcHI7Vv0x5I4kMkjBVHVmOAPxoA/Eew/4N2/8AgkbptzBeWPwqhSS1ljngb+1dRJV4mDocfaOc
MoPPXvX7axRpCixJwFUKB7DgVIjpIoeMhlPIIOQadQAV89ftH/sufAH9rj4cz/Cb9ovwvpfirQp2
Mi2uox5aCXGBLBKpWSGQA4DRspx1yMivoWigD+ebSv8Ag2E/4JO6Z4kHiB/DPiW6hEgkXSrnW5Ws
+DnbhUWUr25k+ua/Zi+/Zi+B19+znN+yXHoFrbfD6bw03hA+G7RnigTSXiMJtkZW3quwkZDbvfPN
fQNFAH44/Dj/AIIK/wDBLX4TePtB+J3gH4aRWGueHNWtNb0a9Gp37mC8sZVmgk2vOVba6A4II7YI
4r9jQABgUtFAH4oeIf8Ag3r/AOCTHifV7/XdZ+FsU13qd5cX97KdV1AeZNdSNJK2BcAAs7k8DA7Y
FfsT4W8M6R4O8Mab4O8Owi307SbC30ywg3FvLtrWJYokyTk4RQMkk+pzXTUUAfjv+1r/AMEJv+Cb
f7ZPji5+JvxN8EtpniK9cy6jq/hW5bS5LuQnmSeNFaF3J5L+XuJ5JJ5ro/2Uf+CJf/BOT9je/l1/
4T+Are51qWCS2XXvEUz6lfwpKpR/IaQBIWIJ+eNFf0NfrLRQB+JVn/wbxf8ABJLTr2HULL4WxJPB
MtxC39q6gdskbB1IBuMcMB14r9FP2q/2QP2f/wBtj4Vp8Ff2ktDXxF4aTVLbWF09p5rcfa7NXWGT
fC6P8okbjODnkcV9Q0UAfmL+y7/wR9/4J9/sa/Fy3+Ov7OvgWPQPE9rYXWmQ6it9eXBW2vFVZ12T
Sup3hRzjPpivuH41/Bf4c/tEfCbX/gf8XNPGq+GfE+nSaVremtI8QuLWXG9C8ZVxnHVSD716zUSz
wvIYVdS68soIyPqKAPyG+E3/AAQn/wCCYPwQ+KHh/wCMvwz+G0WneIvC+rW+uaLfrqV9J9nvbVt8
UgSSYo2D2YEHuK/WrVNOtNZ0240e/XzLe6gkt505G6ORSjDjnkE8itSigD8RLX/g3c/4JHWlxHdW
3wrhWSKZZkYarqPDowYHm4I+8M4Nfo1+1L+x/wDAb9s/4Oj4DftC6Q+teFv7QsdTfTUnlt982nPv
g3PEyvhT1GRmvqGigD8evGH/AAQX/wCCUXjHwIvgC4+EOhWFvGQ0d9pU1zb6grYxn7SJS7Z64bK5
7V84eEP+DYn/AIJPeFvECa7deGPEesIkglWw1fWpXtSQcgMsKRMR2wW+ua/oUooA8p+EPwT+EnwB
8EW/w3+CvhzSPC+hWg/0fTNFtktoQ3ALMEGXc4GXclj3Jrz79qv9kj4CftrfClvgp+0hoa+IfDTa
la6sdOeeWAG6s2LQvvgZH+UseM4PevpeigD8w/2Xv+CPX/BPn9jf4vW3xz/Z28CR6B4ns7G60631
JL+8nK296oSdNk0siHcAOSM/Sv08oooA+Hf2xf8AgnZ+yN+3uugJ+1P4VTxN/wAIw142ibrq5tvs
5vhGJ/8Aj3kj3bvKT72enGOaz/2Ov+CbH7G/7BGqa9rH7K/hJPDM/iWC0ttalS8ubn7RHZNI8IIu
JHA2mV+VxnPPQV95UUAfG/7Xf7BX7J/7dfhSHwn+074O03xJFaq40+/kUw6hZbzk/ZruIrKgJ5K7
tpPJBr8vPBH/AAbJ/wDBJ7wb4mj8S3HhTxBrSxTCVNO1rWZpbTIOQCkKxMw7YLkY65r+guigD8i/
2jv+CGv/AATL/adsdKtPGvw10zR5dFsk03T7zwiTo862sQwkT+QNkoUdDIjNyea+hP2Fv+CcX7Kn
/BOnwrrXhL9l7RbrSofEV5Bfa3PfXkt5PdzWqNHCzs/C7EdgAqgc9zzX3hRQAUUUUAFFFFABRRRQ
AUUUUAf/1v7+KKKKACiiigAooooAK/zR/wDg648CS+Fv+Cp6eLWDCPxP8NdCvUJGAXspbqzfH4Iu
a/0uK/hG/wCDxz4TzW3jn4H/AB1giHlXWm6/4Ru5x/z0gkt723Q/VXmI+hoA/inKqQVYbgeoI4/I
1H9nt+nlR5/vbasRxyTTLFEGeSRgiIgJZmbgBVHJJ7AV+3X7D/8AwQV/bM/a+t7Hxb4xjsvhf4W1
Nlh0fWfGh+yz6pcSqTDFZWbFZZS5wBwOD0NAH4gKiAltqhmGCcc/TNTRQPeSizt42nkbpDGhdmPo
EAJP5V2fxN+H3jL4L/EbXfhf8RNPkstd8L6vc6RrGmXA2sl1ZSeXJG3qrFeCOqkEda/ro/br/bF+
C3/BLnwd8Gtb/YS+BPwwg0/4rfDGx8d6f411uzOpXEdxMFFzbKswZS8IeM5OD83PSgD+VG6/Za/a
Js/hjf8Axp1XwH4ltvCemeQL7xBe6dLBZwieVIYSZJFXh5HVRjjJFcr8F/g/44+PHxX0D4KfCqxS
/wDEfinU4tI0WxDCL7RdSglE3ngZweT6V/VN/wAE5/28v2nf+CuXhj9o/wDYR/aV1mHW5vGnwU1D
VPh5otnaxWdraazosoljjtoowMM7ywseePKBr5z/AOCU3/BG/wD4KO/CT9vP4N/Hz4v/AA5u/DPh
vwx42sdb1u81e6ghMFrEkm9wm4k4JAxQB8lSf8G9/wDwVUhcxS/D62VlJVlbVbUEEdQQTXzD+1h/
wSv/AG2v2JfhtZ/Fr9ozwpHougahrCaDa3sd5Dch72WKSZY9sZPVInOfauo/4KU/Hv4z6f8A8FEf
jrZaL408Tw2UHxZ8URWkNnq10kCRLqE2xY1SXaEAxtC8elfs38N/2cf2nP8AgoV/wbweCPh98Hlu
fGfi/T/j9rOt3dvqeohrv+z7eK7gyHuHLEBpkCrnuccUAfy2fCz4OfEX44eM4fh38IPD194i124t
7i7g0nSoPOuZIbWMyzOqDtGgJb2+tU/Gvw0+IPw01JtJ+Ivh/WNBu4iUeLV7KW0dcHGMyIB+Vf1P
/wDBLj9ir9qX/glV4e/aK/b9/ao8Hah4QvPh98GdQsfAs9+0bC41rWX8qN4CpYEo0cSf9tcV8J/D
7/g4P/bGbSo/Cv7THhnwB8ZrCTZDPF4u0aE30/batzGu/c547c0Afgx8rqcYZTx1yKRo4GGzy1Kr
0BAwPpX79/8ABwx8KP2cPgf+074B+HvwH8D6P4E1W4+GOm+KvH2k6KzGCHWdYd3FuFYnAhSM4PGQ
wNeKfsRf8EV/2jP26v2YNa/aS+Hmr6BoqQ6+fD3hDSPEcws28TXcELSXcdlM5C7osbFHO9gwH3aA
PxuWGFTuRVU+qgA/piknYJC8n91S35DNe5fHv9nD46/sveOJ/hx8f/C2seFtXgkZPs2qQNGsu043
RSY2SL6FWNcV8MfAV/8AFb4m+G/hbpaNJc+JvEOm+HoY06s2o3MVsMe/7w0Af69//BMfwJP8Mv8A
gnZ8EPA1yhjmsfhf4cWdCMbZZLCKWQY/3nNfdNc74T8PWHhHwvpvhPTFCW2l2Fvp1ug6CK3jWNAP
+AqK6KgAooooA/mQ/wCDpj9oH44fs6/sVeBPFfwI8V634R1O9+J9vp13f6DctbTS2raXqEhidl6o
XRWx6qK/D7/g2z/bc/a/+Pf/AAU2tPh/8a/iV4v8U6G3gPX706VrWoSXFsZ4Tb+XIUbjcu5tp7ZN
frH/AMHgH/JhPw6/7K7a/wDpn1Kv5+v+DVf/AJSzWP8A2TnxJ/O1oA/00q/Pn/gq1408XfDf/gmv
8cvH3gDUbvR9b0f4Za7qGlapYSGK5tbmC0do5YnHKujAFT2NfoNX5q/8Fkf+UVH7Qf8A2SfxF/6R
vQB/nhfsAf8ABR/9vnxt+3j8F/Bni34v+PNS0nVvij4b07U9Ou9UlkgubWe/iSWGRD95HUlWB6g1
/qz1/ji/8E2P+Ui3wH/7K94V/wDTlDX+ph/wUC/4KU/s3f8ABNfwl4c8a/tITarDZeKNUn0fS20q
0a7c3FvD57BlBG0bBwfwoA/QrIor8NfgT/wcL/8ABNL48WfivVrHxVd+H7DwjpEes6pe+JbU2UbR
SyiGOODJLSzM7cIoJxk9K+LPFX/B2h/wT10PxVLo+g6D461fT0mMS6vBZJHG6g43pG7b2B6jpQB/
U7X8RX/B2L+2N+1t8GfiD8NvgT8Kte1vwn4I1zw9d69qGpaFNLZS6nqkN0YTayXURVgkEISTylYb
jJk5AAr+pn9iT/goN+y1/wAFBfh9L8QP2aPEcOrR2TJFqumTKYNQ0+WQEqlxbt8y5wcNyp9a+GP+
CyH7Xv8AwTK+Buk+Ffg9/wAFHvDLeJbLxOl5qug2/wDZhvhE1g0aSusilWibMijg/MOvSgD8jf8A
g04/bC/aw+O0PxP+Dfxj1rWPFfg7wtaaZqWh63rksl3cWV/eSypLYLdSEs8bxIJQjElCODhgK/s1
yK/GP/gjx+1j/wAE5vj98O/E/wAPf+Cc3h0+G9C8HXNlJrNl/ZpsA8+orKYpCzFmlYrAwJYnHHrX
zf4y/wCDnP8A4Jr/AA/8V6t4I8WyeNLLVdE1K60nUrOXRpBJBdWcrQzRsN3VXUj3oA/osyKK8n+C
Xxh8CftB/CPw18cfhld/bvD3izRrXXdIucYL213GJE3L/CwB2svZgR2r5m/b0/4KKfs3f8E4vh1p
PxO/aSvry1s9c1gaHpdvpsBubmecRNO7LGpB2IiZdugyo7igD7xor8kf2EP+CzX7In/BRn4p6l8J
P2bh4lutR0fR21zUrjUNOa2toLbzVhTfISQGkd8IO+G9DX0N+2f/AMFG/wBkP9gbwwniD9pPxbZ6
PPcqTYaNF+/1O7wP+WNsnzkE/wARwPegD7oor+VFf+Dt7/gn+de/s5vC/j0WPm+WdQ+yx/czjf5O
7dj8a/dr9jb/AIKAfsq/t6+C5PGv7M/iuy1xLcL/AGhpufJ1GxLdBcWz/Oo7Z5HvQB9qUV8m/tn/
ALY/wg/YP+AmoftG/HaS/j8O6beWVhcyadAbibzb+YQRYjBBI3sMnsOa/PH9lj/g4E/4J6/tc/Ft
Pg58ONW1mz1A6NqWvzXuu2X2KxgstJh8+5llndtqBUGefpQB+4GRRX8+F7/wco/8E47/APaD0P8A
Z7+H174i8UXmveI7Hwxaa3pFgx037XqFylrEwlcgtH5kgywHTmv3I+KPxX+HPwS8Cah8TPi1rWn6
BoGkwG41DVdTlEMEKDuWbqT2AySeADQB6PRX8wvxa/4OtP8AgnL4B8STeH/BNn4v8XRQOUOpaXZi
G1cqcZjeYgsDjg4Ga6b4N/8AB07/AMEz/iXqsej+Mp/E/gxpWCC51yxLWwJ/vSxE4HvigD9lP2+v
in8VPgf+xV8Uvi98ELIah4t8N+B9W1jw/amIz7ry3t2eNvKHL+Xy4XvjFf5bf7NX/BTT/goh4Q/a
s8N/FnwT8RfGfiTxRqPia0SXRdQv7i+s9Xa7uER7SSx3eWUl3lQqKuzOV24r/Vv1j48fCr/hne+/
aVsr+HWPBMPhK68Y/wBo2A8+O70iG0a7eSJT/rA8KkqpHPSv5Xf2ff8Agpl/wbqaZ+07o3xE+C/w
6bTPH3iHXrWx0vVI/DzIIr/U50gSSMFzHETI4yyr8uSRigD+wq1kme2jluF8t2RWdM/dYjJH4dK/
Ef8AaH/4OG/+CX/7N/ibWvBHijxnd6rrXh++utM1TTdBsJbqSC7s3Mc0TN8o3I4IPWvq/wD4KBf8
FOf2Z/8AgmpoXhfxB+0lNq8Ft4uvbvT9JbSrRrtjLYxpLLvAI2jbIuD3r/Jx/ap8d6D8W/2jviR8
TfCRkbTPFHjTXtc0xp12SG21G8mnhLqfukqy5HY59KAP9lb4O/E7QPjd8JvDXxi8KCZdK8V6DYeI
tNW5XbKLbUYEuId69m2ONw/CvTq/k4/Y1/4OR/8AgnH8Kf2Zfhd8D/FF34qGt6B4M8PeFr9YdKd4
vt1raQWsgVw2CvmKcHuOa/Zv9vn/AIKl/s0/8E3bLwvqv7RqeIIrLxe11HpN7pVi11D51osbvDIy
kbJCsgZR3AY9jQB+llFflF+wP/wWP/Yw/wCCjnxC1r4Xfs86jqra1oWlx63cWWsWjWbS2jSiFnh3
E79jsoYDpuFfqnPcxWkD3Ny6xxRqZJJJDtVUUZJJPQAdSaALVGa/nH1v/g6I/wCCXmi69e6CupeL
Lx7O+msBNaaRJJHO8MrR7oSG+dXIyhH3gRX9CPhHxDB4u8K6Z4rgt7m0i1Kwt9Rjtb2PyriFbiNZ
AkqE/I6hsMp6HI7UAdPRRRQAUUUUAFFFFABRRRQAUUUUAFFFFABRRRQAUUUUAf/X/v4ooooAKKKK
ACiiigAr+cn/AIOiv2em+M//AAS21fx7psHm3/w18R6Z4yiZRuYWhdrC8x7CG6Lt7JntX9G1eQ/H
j4ReG/j98FPFvwO8ZRLNpfi3w7qHh6+VxuHlX8DwlsHuu7cPcCgD/O9/4N/4/gfq3w3+NNx4M8A+
HfF37Sfg7w1ceM/hcnisG5sr6yt0UTW8Fucr9phkBIwCWEi9ADX43fH7/goz+2n+0f8AGXTfjd8W
fG+tS674f1WDVdAtIpGtbLSLqylEkK29ohVI/KdQMEZwMHrX7m/CrwD+yH/wbz+O4Pi38c/EsnxR
/aS0uC5ttH8B+E7kw6VoKXkbRB9UuB95pIWyyH+98oyM1/NP8cvifL8b/jP4q+MtxpWnaHL4r169
1+XSNJQpZWkl7K0rxwKeQgZiR9aAP6Y/2nf2if8Aght+1h4j8L/8FA/2mp/F158RfE/hHTY/G/wm
8GxiG3k12x3wT3F1c4GzzQigf3owrHkmvy9/4KWf8FK/hz+2v8Nfhp8BPg58M7X4feDPhIl7Z+Eo
zePeXxsb1UVreVj8oTdGrjBJBHpX5FV0ng7wh4n+IPi3S/Angqyn1PWNZvoNK0rTrVd011dXLiOG
KMd2diAPrQB0Xwp+L3xP+BnjOL4ifB7XdR8N67Bbz2kOq6VKYLhIbldksYYdFdThvUfhXc+Jf2sP
2ovGEz3Pif4i+Nr1nb5vO1i7x+IEg/Kv241D/gjX+x3+zz4R0jwR+39+0RpXw4+LOq241i78J2Fu
uo2+kWLqNlveSLkrcsSTjgdQOlZdn+wt/wAENwo0yT9prxle3S4je907w00lmW7ncFPGaAP55b2+
u9SupdQ1KaW4uJ5GlnnndpJZHY5ZnZslmYnJJPJrtvCvxW+KXgS1Fl4I8TeINGhVzIIdL1C4towz
HJbZG6qCep4r967L/ghF8PPjkP8AhJ/2LP2jvhl4w8PWkgfX5NenGk3+j2igl57iFyNwUdQo696r
x/8ABOv/AII5fDMNovxg/ah1bxFqsBMV3J8P9Ae+sY5V4ZVlKnOD70Afjd4k/bF/at8afDq++Efj
H4h+LdW8M6n5X9oaLqOoS3NtP5Eiyxh1kJyFkRWHPUA15D8O/FkHgL4haD47nsLfVI9D1ux1k6Zd
kiG7+xTpOIJSvISQoFbHY1+9Np+wZ/wQ91TUI9ctP2r9Xs9JikUXmm6joflakRnBCqVwCOMk9K+F
f+Ci3/BNL4hfsHazo/jTTNW0/wAbfCzxozT+AfiHosiyWmpQkeYIJQpPl3CR8uvQ4JGOlAH6c/GT
9v7/AIJGf8FR/Gtz4/8A25/A3jH4VfETVYba0uvHvg26Oo2RW1iWCAy2z87UQAc4wBXy7/wVx/bb
+FviXVPhj+x3+whrV1H8IPgf4dtIfDus6bI9rLq2v3cSy3uqMybW8xS2zd13tKehr8NqDz04oA/r
7/4JRftdeOv22P2ePixoP/BUPTtD+IfwL+EfgiTVtQ8X+KYB/blleuuLKws70AO8soVyBncDsHJY
V+Xf/BBj4EaV+03/AMFf/h+mk6a1r4e8M6zqXxFlsJGMwtLHSA8thDJIfvFbp7dNx+8Qa6//AIJ8
/wDBRL9lHSP2Ptc/4JiftueFr2x+HHjHXv7cuvH/AIQkaPVbXUTIrw3F9GP+PiKApHtA4CqARX9V
/wDwb9f8EnPB/wCxV4r+IX7TXhbxtofxG8P+ObCw0r4deKNGI+fQ1d7i585f4JmnEcbqP+eWe9AH
9PdFFFABRRRQB/Jp/wAHfsbN+wP8PHA+Vfi7a7j6Z0jUsV/Pt/wasMB/wVnsff4c+JB+trX9UH/B
0j8GNZ+Kv/BKnV/E2g27XE3gXxdovi+4CDJSyRpLK6f6JHdFz7Cv4Zv+CNH7YHhf9hn/AIKNfD/4
8ePZTB4bFxd+HPElyASLfTtYhNu9wQOSsEhjkYddqtQB/rm5Ffml/wAFkmVf+CVP7QZY4H/Cp/EQ
597N8V+gHhLxj4V8f+HbXxb4H1Ky1bTL2Fbi0v8ATpkngmjkG5GV0JByCK/nm/4OWP25/hT8Cv8A
gnr4t/Zzj1a0n8b/ABOto/DWn6Nbyq9zDp7zo99czIpJSIQI0YJxlnAHegD+Av8A4Jrgv/wUX+A4
Xkn4veFcY/7CUNf2L/8AB49/ybj8Ev8AsoOq/wDpqev5eP8Aghf8HNX+Nv8AwVi+C2g6ZbtPFo3i
oeMNRYDiG00GGS9Mjeg81I0H+0wFf1D/APB48R/wzl8Ev+ygar/6anoA/kW/4Jvf8E5/jf8A8FNf
jrcfA34L3Onaf/ZmmDXdf1TVXKwWdgsyQeYI1+aV/McBVHNfqn/wVH/4Ny/iF/wTq/Zdm/al0Px3
beMNL0W8s7bxNZPZm0lto76ZLeOeAgnegmdFYHnDZ7V9Df8ABnv/AMnw/FT/ALJRF/6d7Wv6b/8A
g49Gf+CNXxjPBxa6GRn/ALDdjQB/FJ/wbSfGnxR8Kf8AgrR4H8LaRdTx6b460/WPC+t2aMfKuEFh
NfW7MvQtFNbKVJ6Bm9a/VP8A4PI/+Sp/Ab/sA+Jv/R9lX4hf8EAf+UwvwO/7D+qf+mS/r9vf+DyP
/kqfwG/7APib/wBH2VAHqP8AwZrf8gr4/wD/AF+eE/8A0XqNfll/wdAfsYn9mz/goRJ8cPDlt5Ph
z4v6d/wkUZiTbFHrlmFt9Tj4wN0n7q49zIxr9Tf+DNb/AJBXx/8A+vzwn/6L1Gv2M/4ONf2MW/a6
/wCCbPiPW/Ddn9r8VfDSYePdB8tN0zw2aldSgXqcSWjSNgdXjX0FAHx1/wAGnP7XX/C3/wBiXXP2
W/EV0JNX+FWusNOidsudB1lnubcqD/DFcC4j9ANor8Dv+Dp39sUfHz9vq2/Z68O3Pm6H8IdIXTJl
RsxvruqBLm+bA43RRCCH2KsK+FP+CIn/AAUM0n/gnL+2XJ8XfF7zt4X1zwbrOg65bw8h5Fh+2acx
Hc/a4Ejz2Ep96+RvgN8Mfij/AMFIP26ND+Hl9LPdeJPix48e41u+OXaFdQne61C5b/Zgg81vQBcU
Af2hf8EGfhx4b/4Jk/8ABHT4gf8ABSD4s2W3UPFOnXnjVUZcTvoelo0GkWoPUfapt8ijv5ymv4vv
GHj39oP/AIKa/tk22rfELWIrzxn8RvEsNhBdarcCHT9OW7kwkYeQhILS2jJ6Y+Vc9TX+jv8A8FyP
g7D4V/4If/E/4T/CW1Nrp3hbwdosVnZWq8RaTod/ZPIoA7JbQMT7A1/mKfA/4ct8YPjH4Y+FMetW
Hhx/Emt2uiw65qrtHZ2ct2/lxyzyLgpHvIBPYEE0Af2K6t/wa/8A7Fi/BySDQ/2h7J/Hq2Bkjup7
3TxpD3oTiMxCXzBCW43Z3Ac1/MV+yF+018Yv+CX/AO29p/xL0C8NvfeDPFMmh+MNPsp/MtNT062u
Db6hbOUJSWN0V2ibnB2sK/c+P/g00/4KBzIJIviF4MdWAZWW6vSCD0IO7kVzuv8A/Bph+3BY2Ml1
rfj/AOHlukxEHn3c06KZZfkUbn6szkADqScCgD+jv/g5h1/TPFX/AARj8ReKdFkEtnqWt+EdQtJR
0eG41CCSNv8AgSsDX+bt8F/AXxd+K/xGsfhR8DbDVdU8SeJ9+h2el6Pu8+8juP8AWwnbj90yqDJn
5QoO7iv9FD/g4U8E6n8Mf+CDbfDbWZkuLvw8PAWiXU8Z+WSawuLW3kdc84ZkJGe1fyi/8GzqBv8A
gsb8OtwBxofivGfUaRNigD9Av2PP+DXj9vnwR8WPh58c/iFq/g3SB4d8YaD4mvtC8+Se5W306+hu
pY96AJ5hSMgDpnivob/g8N+OXxFstf8Ag/8As36fc3Nr4W1DTNW8XalBEzLFf39vPFa26y44cW6O
zBT3kB6gV/c/X5H/APBWn/gmJ8Af+Cnvws0f4X/EfV4/DfjDS5rq88C69GyfaYpWRVuYfKYgzwP+
7MqDkEKwwRQB/nzf8Ej/ANif9hv9tLxb4n8Pftj/ABb/AOFazaatn/wjmnq0UDap5/mee4uJvkXy
iqDb1O7Nft18XP8Ag1D8FeN/Ck3iX9hr416b4mmSMtHp+tm3mhl9FFxZs2w+7jFfnf8AH/8A4Nef
+CmHwlvZpPAen6D4/wBPjLNDc6LdiG5ZR0JtpsMGPoDX5Lab4/8A23/+CcvxruPC1hq/jD4a+M9B
lia70w3EkLR+YBIhkgZmidHXkHBDCgD/AExdF+D3jf8AZ9/4IaX3wO+JEEdrr/hP9m7VdB1e3hkE
scd3ZaDPHKqOOGXcpwR1Ff5d37IH/J1Pwp/7KN4W/wDTpbV/pDfsq/t2+Jf+CiH/AAQh8f8Ax+8e
28UHiWH4a+OfDHicWy7IZdR0vTbiN5417CaNo5Nv8JYjtX+bz+yB/wAnU/Cn/so3hb/06W1AH+ml
/wAFqf8Agkbr3/BWHwh8PvC+ieMLbwifBWr6nqMs1zatdfaRqEEUIVQpG0r5eST1zX+X58dfhzN8
E/jH4y+EV1creyeEfEmq+HpbxFKC4bTLiS3MoU8qHKZAPTNf7Ytf4yX7eylv21fjUig7j8T/ABWA
PX/iZXFAH9MP7Mn/AAao+OfjH8I/h/8AtCWvxZ0yyi8SaFonjBNNk02V3hW8hhvRCXB5KhtuRX9T
H/BaP9hMft3f8E7/ABd8I9HgSfxVoVmvirwZJtG/+1tJQyLCp6gXUYeA47P3xX03/wAE3dQstU/4
J8fBC+06VJoZPhT4WKSRkMrY0y3B59iMfWvtbIZc9QRQB/j+/wDBLH9sLVP2DP29fAH7QU8kttpl
jrK6H4ugfK50XU2FrfLIvHMIIlAPR4h6V/oSf8HAv7d9p+yJ/wAE1tcv/AupIvib4mxL4M8IzW8n
zmLU4i15eREc4hs97Bh0Zk9RX8Uf/Bw9+win7Fn/AAUJ12/8L2X2bwZ8TUl8b+HREm2GGe5kI1Oz
THH7q5JkCjokq18N/tift/fFz9sf4XfBz4Z/EqeRrT4Q+Bm8KWskkhb7ZcGc7ryTP8f2WK2hyf8A
nmT3oA+wP+CAH7DR/bc/4KJeF7HXrUz+Efh75fjzxS0i7opE0+VPsFo5PB+0XWzIPVEf3r/VdVQo
AAwAMYHQY7V/OV/wbNfsKyfsnfsB2vxc8aWP2bxb8W7lPFl8JU2zW+jomzSrZs8j90WnI/vTH0r+
jmgAooooAKKKKACiiigAooooAKKKKACiiigAooooAKKKKAP/0P7+KKKKACiiigAooooAKKKKAP8A
O9/4Otv2CJvg7+05pP7cfgu1f/hH/iaseleJmQZS28R6dCEjc4A2i7tIgQST+8ib+8BX8mNf7Gv/
AAUM/Yu8Eft+/sj+L/2Y/G4SL+3LAyaLqTKC+navbfvLG7TvmKULuH8Sbl6Ma/yDfi38KfHnwI+K
fiL4L/FGyk03xF4W1e50TWrKQH93dWzlWIJxlHADIw4ZSG70AeeV+hv/AASQ0wax/wAFRP2f7BRn
PxW0OUgd/Jm80/ltzX55V+qn/BDvRn1z/grd8BLVQT5PjcXpHoLSyupc/wDjooA+xNT0zRfjH/wc
7XegeM7WHVtM1H9pufTb2yv1E8M1raXjx+U6PkNHtixtxjHtX+kVof7Nv7PnhixGmeH/AAN4StLd
RgRQ6TaKv/ouv82n9l6c+Jv+Dmuzu2JYzftPeJ7gHrxDd6jIPyCV/qCUAfyQ/wDBef8AYj/Zb0X4
yfsx694X8GaPot18Qfj3ovgXxo+iR/YBq+h33zT2lytuUV0YqOo7e9f0q+Af2S/2Y/hd4Zg8HfD/
AMAeEdL022QJFa2+lWpUKBgctGST7kk1+K3/AAX35+Jf7GBH/R1Xhr/0Fq/oyyKAP49/+Drr9nr4
K/D/APYU8M/ETwB4T8O6Lq7/ABP06zudR0uxgtbiWCayvcxs8SKSpZQcHuK/Af4yXNz4j/4NoPgz
ezM7/wBgftFa9o6biW2Ry2+oOqjPQYYYA4Ff0+/8Ha1mtx/wTB0u6xn7P8VdAbPpvhvE/rX8xjRD
W/8Ag18il6/2J+1U/HoLqxI/nL+tAH88tFFH9elAHonwj+Fnjf46fFLw78GPhpaSX+v+KtZtNB0i
1QZL3N5II1JHZEB3M3QIpJ6E1/sUfsYfsueDf2L/ANl3wT+zH4CBbT/COhwaa1ywAe6ueZLq5fAA
3TTu8h4/ixX8cP8Awai/8E2bjXfFepf8FHvivp5Fjpf2jw78M47heJrtwYtR1NA38MSE20LdCWkI
6Cv7wqACiiigAooooA8++KHw18H/ABk+HOu/Cj4iWMeo6F4k0m60XV7GYfJNaXkbRSofQlGOD2PN
f5nP/BT3/g3w/az/AGIvHWpeJfgjomr/ABF+GM1w0mkaro8Ju9V0+3bJWDULWIb2aMfL50alWAyQ
vQ/6g9V3iWRSkoDK3DKwyCPce/0oA/xi/B37SH7XXwFsn8E+DfGPj3wnboRG2kRXV5ZpGT1UQPjZ
9ABXTfDL9m39tn9uD4ixQ+AvC3jrx7rmoyiJtUuoLmaMZPBmvrnEMaDOTlwBjgHpX+wFrXwR+D3i
O8+36/4W0C8n6+bcWEEjZ/3mSuz0Twt4d8NW4tPDthZ2EQXaI7OFIlA+igD9KAP5+/8AghP/AMES
rL/gmd4TvvjH8Yrm31f4s+KtNSw1FrRt9hounlxL9htXKhnd2VWnlOAxUKAAOfh3/g8A8LeKvFP7
O3wWt/C2l6nqjw+PtUkmj0y1mumjQ6W4DOIVcqCeMmv7B6zNQ0uw1NVTUbeG4CncqzxrIAenG4HF
AH8Af/Bol4J8beF/22PijdeKNE1nTIZfhZFFHLqVjcWsbv8A2tbHYrTIoLYBOBzX9Lf/AAcUaPq+
vf8ABHr4v6XoNnd393Na6IIbSxhe4nk261ZMdscYZjgAk4HQZr9m7LQtI02QzadaW1u5G1mhjWMk
emVAyM4q3d2dvfRG2vIklibG6OVQ6tz3ByOv+eKAP8rj/ggp8NviPo3/AAV1+CWqa14c8Q2drDr2
ptNdXmmXUMMYOi3ygvJJGqjJIAyepr9q/wDg8D8G+MvFPxQ+Bj+FdH1fVFh0LxIszaZZT3YjLT2Z
Acwo20kA4z6V/cXb+HNCs7hbi1sbSKRWykkUKKy5HYgZHFWbzSNM1Mq2p2tvcsn3PPjWTaDycbhx
n2/pQB/F1/wZ6eEfF3hXS/j2vivSdV0sz3nhXyRqdnNaeZtj1Dds85E3YyM4zjNf2lahp9nq1hPp
epwpPbXMT29xBKNySRSAq6sCDkMpwQexwajsdI0vTNx022htt+N3kRrHux0JCgetatAH+RT/AMFQ
P+CfXxH/AGM/26viD8CfCvhzxBfeHbXWG1fwneWWn3N1FJo2pg3NogkhjKEwqxhYZyDGcjpX9FH/
AAaYfsD6zF4+8c/tyfFTR9Q0+TRI/wDhBvBkWqWsls5uLpFn1S6RJkV/lj8mBWA/ikFf3J3eg6Nf
zfaNQs7Wd8Bd8sKu2B2ywJx/nvVm00yxsIzBYwxQRk7tkKBF3HuVAAz/AIUAYfjjwV4a+Ivg7VPA
XjK0iv8ASdb0+40rU7KZd0c9rdRmKaNgf4WRiK/zJv8Agqd/wb/ftS/sM/EHU/F/wL0TWPH3wrmu
Wn0bVdHia71TS4mOVt7+2hHmExfdE8alWABYKev+oPVeWGOdDFMoZSMMrDcCD2wetAH+Up8Ev+C6
n/BVf9mHwbB8JtA8a6jNZadD9jtLTxXppvLu2RBgIGnCy/KOgbOK99/Zq+NX/BYD/gpv+2f8KdY8
cy/EbxX4c0b4ieHtb1MRWk2naBa2NjqEM880hKxQEJGhbaWYnoBmv9J3VPgX8Gtb1A6tq/hTw9c3
JO4zzafA75znqUz+POa9B0vQtF0O2FpotpbWkQGBHbRLGoA6cKAKAPwR/wCDm/Rdb8Q/8EkvF+l+
HrK81C6bxN4ZdbXT4JLmZlXU4ixEcSsxAAJPHFfyG/8ABth8PPiH4f8A+Cv/AMPdW1/w9r1hapof
ikPdX+nXNvChfSZlUGSSNUBJ4GTzX+nRd2Nrf2/2a+hjnjyGKSqHUkdMhgapW/h7RLKcXFjY2kMi
5AkihjRsHrggAjIz/wDqoA3a/lE/4Om/hP8AtXeMvhf8HfiT+yzpHi6/uvBvibW73Vr/AMGiZrzT
47m0gSGRltz5pRnRh8qsOOcV/V3UDxLImyRdw7g88f8A16AP8sT4ff8ABfH/AILAfs9Wa+ENY8Ya
ndC1TyFg8a6QZbhOOAWlSN8jtn9a+P7/AMJf8FB/+Ct37TN18Qo/DviPx5408SyW1rdapBYvbWEM
UCCKFZLhlWCCCJPVuBngng/62uvfBr4T+KZxc+JfDWg38g6PdWMErfmyV0+geD/C3hO1Fl4Y02x0
6EDCxWUCQqPbCBc0Afif8Bf2Cbj/AIJ5f8EQPHH7L8lz/bHiCP4X+NNX8RXVkjMlzrWq6bcyTpbo
BuZEJWGMYLMFHc1/m/8A7I/wp+K1r+1F8LLq68K+J4o4/iJ4Ykkll0m8REVNTtmZmYxAAAckngCv
9kuSBJkaKVQ6OCrKwyCOQQc9Ryax18LeHI2EsenWKsp3KywRggjnIIUHPp9O1AHR5B6V/l+f8F9P
+CXHx5/ZY/bV8cfHLw54f1fWPhz8QdduvGGma/pdtJcw2FzqEhmvLK88pWMLRzM5jZgFaNhzkEV/
qBgY4rK1LStP1m1aw1a3iuYHGHhnQSIw6EFWyCOe4oA/ySf2af8AgoR/wUw0rwd4d/ZI+BfjTx2/
hiLUraztPDugQTXEsNvJOvmQxtHG0iRfM3y5AUfSv9bvTkdNPt0lDbhCgbPXIUZzXEeG/hH8LfB9
22oeFfDmi6dO53NLZWUMLk5z95UB616PkUAfzF/8HWHwm+DHi7/gm5H8TviBdx2Hijwj4t04+B5Q
A0t5dai4t7qx552SW2+ZiPumFWOcYP8ADz/wSi/YlvP+CgP7dngj9neeKY6DJenXPGNxEP8AU6Dp
u2a6+YghTMdtupP8Uor9gP8Ag6g/bwH7QH7YOnfsk+B73zvDHwmicasImzHP4nv0But2OCbS32RD
+67y1+z3/Bpz+w4PhH+y7r37afjC1Ka18ULoaf4eMyYeHw7pUjIGUnkC7ut8nuscZ5oA/rG0jSdP
0LTLbQ9IgjtrOzgjtbW3iXakUUShI0UDoFUAAYwK1qKKACiiigAooooAKKKKACiiigAooooAKKKK
ACiiigAooooA/9H+/iiiigAooooAKKKKACiiigAwD1r+LP8A4Omf+CVz+MPDKf8ABSD4IabnVNCt
UsPidZWiEvdaamFttV2qPmktTiOc94SGP+rr+0ysHX/D+jeKNCvPDfiW1gvrDUbaWyvbO6QSQzwT
qUkjkQ5BVlJBB4I4oA/xDRz0+tfs3/wb16c2o/8ABYv4LRDkRahrNwR7RaLevn9K6v8A4Ll/8EmN
f/4JoftFPrPgO1ubn4UeM7qa88H6jtZ102csXm0ed843wjJgZjl4vVkavyE+D3xj+KX7P/xF0/4u
fBfXL/w54l0kynTdZ0yTyrmAzxtDLtf0eNip9QcUAfrfH8L/APgod+zv/wAFPNf/AGvfhH8HfG2q
avoHxZ8TeJtES98PX81jcrd3l7ErNsRdyPFcFlKsOcHNf0CWX/Bdr/guVpFiNR8TfslXE0CLulli
0fWEJX1CrKxH/fNfzKf8Pvf+Cr3kiA/G/wAY4/vCWPf/AN9bM5rNs/8AgtN/wVSsr9dRj+N/jdpA
wYiW78yM47GMgqQe/FAH6p/t4/8ABfTxD+1B4l+CUfx9+E2qeB/EPwd+Mmk/EnVNP86WL7dZWCMG
gjhvI0lhkYnhmLLj3r9ItM/4ORP+CmPxvMms/syfsrahqejSMzWNzLaarfb48/KxnjWGFiR3UV+X
/wCzv/wXX+FXxT8JX8P/AAVY+HPh74qeJPBMB8UfCrxEmmW8F3PrduCkWn6iYlVWgkMm8ylT9z5l
YhcfDHxV/wCC93/BUL4mazNfaR8RbjwZpzSE22h+C7aHS7K2iP3IlESb2CDgF2JOOTnmgD9EP+Cm
n7a3/BZf/gon8CU/Z7+M37OOteHNBTXbPxD5mi+H9UluTcWG/wAoGR2lAX5zuwOa88j+Dvxi+FP/
AAbV/E3wx8YvDGveGLqz/aM0LVrO01+ymspZba6isoDMiTKpZPMcrkDGQR1r86bH/gtb/wAFVtMX
bafHDxuR0xNdLL/6GrV5B8ff+CmH7eH7U3gGf4XftAfE3xL4n8OXM8F1caLqEw+yyTW0glhdkUAE
o6qynHBGaAPhivuj/gnJ+wp8Rf8Agor+1f4d/Zs8ArLBa3co1DxTrSrlNJ0S3Yfarlj03lT5cION
0jKOmcfHfhDwl4p+IHizTPAngbT7rV9a1m+h0zS9MsUMlxdXVwwSOKNByWZj+HJPAr/VE/4Inf8A
BKnw1/wTK/Znh0/xDFa3vxL8WJDqXjrWoxv2SYzFp1u55Fvag7f9t9zkcjAB+qPwR+DPw+/Z5+Ef
h34JfCrT4tM8PeF9JttH0myhGBHBbIEUnH3nbG52PLMSTya9XpAMDFLQAUUUUAFFFFABRWBr3ivw
v4WS3k8T6lYact5cpZ2jX9xHbia4k+5FGZGXe7dlGSfSt7cvqKAFopCcDNYekeKPDfiGW7h8P6hZ
Xz2Fy1nfLZzxzG3uE5aKUITskXIyrYI9KAN2iuM8U/Ef4e+BXgi8ca9o2jPchmt11W9gtDKEIDFB
M67gpYZxnGRnrXJf8NC/AL/oefB//g5sv/j1AHsFFcx4X8beDPHFnJqPgvV9M1i3il8iWfS7qK7j
SQAHYzRMwDYIOCc4I9am0zxd4U1vVr3QdG1PT7u+01lTUbK1uY5Z7VnGVE8aMWjLAHAYDNAHQ0UV
jnxF4fGuL4Ya+sxqbWpvl04zJ9pNuG2GYQ53+WG+XfjbnjOaANiikByM1h6j4p8MaPqlloer6jYW
t7qTOmnWdzcRxT3TIMsIY2YNIVHJCg4HWgDdozTQwNecfED4wfCf4T2q33xR8T+HvDcLjKS67qFt
YK3+6Z3TP4UAek5HSivGPhz+0T8Avi7P9l+Ffjbwn4klwT5Oh6taXsgA6/JBIzfpXs+RQAUU0sBW
FpXivwvruo32kaJqVheXemSrBqVra3Ecs1pI+SqTojFo2ODgMATg0Ab9FFMZ1UZY0APyKK+ctW/a
9/ZT0HxGfCGt/EvwFaaqJDG2nXOvWCXAccbSjTAqfYjNe+6Xq+la5p8WraLdW95azoJILm1kWWKR
T0ZXQlWHuDQBoUV5/wCJPiz8K/Bmpf2N4v8AE3h/Srzy1m+yalqNtazeW2dreXLIrbTg4OMGsD/h
oT4Bf9Dx4P8A/BzZf/HaAPX6K8gH7QnwDJwPHHg/PYf2zZf/AB6vVXvbWO2N5JIiwqhlaUsAgQDJ
Yt0xjnNAFqisbw/4j8P+K9Ji17wvfWepWM+TBeWEyXEEm0lTskjLK2CCDg9eK2aACsrVrfULrSbq
10mYWt3JbypbXLIJFilZSEcocbgrYJB64x3q3fX1lptpLf6hLHBBBG0080zBI440GWd2bAVQBkkn
AFeUj9oX4BEZHjnwf/4ObL/49QB/I/4r/wCDQ7QfH/jDUvHHjr47eJ9R1HW9UuNW1m7k0u2865uL
yVpbiQv1DOzNz2zX9fPwm+F/g/4LfDPQPhF8PrRLDQ/DOkWuiaVaRqAsVtZxLFGOB1KgFj3PJ5zV
bS/jd8GNd1KHRtD8XeGL28uZBFbWlpqtpNNLIeioiSFmY+gBNenjpzQAtFFFABRRRQAUUUUAFFFF
ABRRRQAUUUUAFFFFABRRRQAUUUUAf//S/v4ooooAKKKKACiiigAooooAKQjIxS0UAfL37Xv7JPwc
/bb+AWu/s6/HLTVv9D1u3KrKABcWN0mTBeWrnPlzQPhkYfQ5BIP+Ut/wUl/4JzfG3/gmj+0RefBP
4qxS3ul3JkvPCHiyOIpaa3poOFlTqEnjyFuIclkbkZRlJ/2Cq+I/29f2CPgN/wAFEvgHqHwF+PGn
+ZBNm50bWrZVXUNH1BVIju7OQj5XX+JTlZFJVwQcUAf45dFfoN/wUb/4JtftC/8ABND43z/Cn40W
clzpN5LK/hTxjaxMum63aKeGjY5EdwowZrdiWQ9CyENX589aACiij/8AVQAYNTQwy3U0drbRyTSz
OkcMUSl5HdztREVclmY8AAZJ6VNp+n3+r38GlaRBPd3d3PHa2lpao00080rBY4440BZ3diAqqCST
gV/fx/wQX/4N8ofgI2kfto/txaXFN462pqHg/wAD3irJF4cLcpd3g5WS/IIKJytv7ycqAeg/8G9P
/BDVf2T9CsP21P2rNNVviXq9n5nhjw7doGPhiwuF+/KDuH9oToxEmP8AUp+7HzF6/rLAxxUQQqNv
09hx2qWgAooooAKKKKACiiigD+aL/g5h+HGofGT4Mfs+/BzTtVudCm8W/tHeHfDUWs2hPnWUmoWd
7AlwoUrkxMwcDIzt7Gvqr/gk1+3N8S/iLdeJv2Af2zwmm/Hz4OMmna0ZCQnirQ02pZeILMsB5izx
lPO25IZgx5YhfKv+C93/AB+/skf9nc+CP/QLqvS/+CuX7DPxV+I//COft8fsTFdM/aA+D2/UdBeI
EJ4n0UbnvNAvFBAlSdd3lBs/OSuBvyAD9vDjvX4Ef8EQpZZPin+2UJHZtv7VfiVVBJbaPs9twMnp
kntX6J/8E+P25fhj/wAFCf2aNF/aD+HObO5n3ad4n8O3B/0zQ9btcLd2FyhAYNG4yhIG9CrcZwPz
p/4IfH/i6n7Zg/6ut8S/+k9tQB8+/wDBYf4IfDT9pT/grh+xl8CvjNpw1nwr4gg8fw6xpMsjxx3C
W+nR3EYYxlT8ssaN17V9yf8AEP3/AMEmv+iT6d/4F3f/AMdr4G/4LReNfjF8Ov8AgrB+xr40+AXh
GLx34vsbfx8+i+FJ79NMTUHfT4opVN3ICkXlwtJLyOdmByRX00P27/8Agtnn/kzrSP8AwvrL/wCN
0Afqf+yn+xj+zh+xP4L1H4d/s0eHIPDOkarqZ1i+s7eWSVZbxokh8zMjMQfLjUYHHFfjJ4xe5/Yq
/wCDiPw14mRzb+EP2qvhzP4fvFYlYU8WeFgJYWAzjfNAI04wS0h96/dT9nfxl8XfiF8GPD3jP47+
FovBPi6/sjNrnhWG9TUU06cSOvlLdIAsvyANkAD5q/Gv/g4s+GviCD9i/Qf2yPh5FI3iz9nn4g6H
8UdJmhBMn2K3uY4NRi46o0LhnGPux0Af0CZr+IHxF+1v8TE/4Lwr/wAFCorhz8GtK+LEH7Hs10ZG
+ziS401pJJ85KGMaq24nH3toJxX9QP7UH7a3gr4Q/wDBOTxV+3boV1E2l23w0k8ZaDMzArLNe2gk
06PIHJeeaJMAd6/D7S/+Ca3ivUf+DaO4+DF5FK3xJ1HwvL8dnuiNt2fF0lx/wkKNnr52wLb7s5zQ
B/V6Co4Ffzv6NLdftkf8HC2qanGzTeFP2VfhhHpke1i0L+L/ABopkkJHA3xWOVPoydc1+jH7Ef7Z
nhT48f8ABN/wP+2p4kvEWzufhzH4i8S3LMMQ3OmWxGqBvQpPBMCD6V8J/wDBvT4H13XP2SPE/wC2
x4/idfFH7RPxG134n3zzA7106W5e20uEZ/gSCLcnbD570Aew/wDBVP8Aby+MHwE1XwJ+x1+xlp1p
rnx2+MV5NZeF4b/LWmg6TbjF7rt6NrDyrf8AgVuGYNnOzB8s+DH/AAQK/ZfGPiD+29qOufHj4iX3
+kaz4l8b3c0tp5z/ADOlpYBzDDCp4VcE4xk9q4X4JW6fEH/g5Q+NmteK/wB7ceAPgD4V0fwpFLz9
nttZuBc3kkQPQmTKkj+8Rmv6IMj1oA/FD4q/8EB/+CdPjW1XUfhj4Xu/hn4jtWWXTPFHw+vZtJvb
SVCGV1WJhG4yBlXUgiv2gsLWSzsYLNneVoYUjMr/AHmKKAWPuetaFJkDkmgD5v8A2uf2j/Bv7In7