The Briefing

Comparing The Strategic And Legal Features Of Cyberwar, Drone Warfare, And Autonomous Weapon Systems

Friday, February 27, 2015

Cyberwar, targeted killing using remotely piloted drones, and autonomous weapon systems (AWS) are all emerging weapon technologies and operational capabilities that are, or gradually will become, available to the world’s advanced militaries — and in some instances, to non-state actors and individuals as well.  Different as these systems and capabilities are in fundamental ways, they also share certain common features.  Some of these common features highlight general issues of how the law of armed conflict might, and perhaps should, regulate their use in warfare.

The most obvious similarity among these three weapon technologies is that each is a “standoff” system, in which the attacker is at a remote distance from the place attacked, potentially far away from any conventional military theater of operations.  The most obvious difference is that cyber, unlike the other two, is not directly kinetic.  Cyberattacks against civilian infrastructure such as the electric grid might have far more drastic effects in the “real” world of people and things than nearly any kinetic weapon short of a nuclear blast or an EMP (an electromagnetic pulse that destroys electronic circuits, potentially over a wide geographical swathe).  For all the risks cyberweapons pose — including the risk of uncontrolled viral replication — they fall, broadly speaking, into the category of attacks on an enemy’s command, control, and communications systems.  In most cases (and this is important with respect to the laws of war), these so-called “C-cubed” functions involve networks that serve both civilian and military purposes — most obviously the Internet, but including many other infrastructure systems.  Yet a cyberattack requires no bomb or missile.

Remotely piloted drone aircraft — used both for long-duration surveillance and for targeted killings, either of which might be directed against a particular individual or a group of fighters — attack with a missile, a kinetic weapon.  Drone operations are thus different from cyberwarfare operations in that regard, but like them with respect to the remote nature of the attack.  Teasing out what distinguishes AWS from drone warfare involves a complex exercise in defining “automation” and “autonomy” — and it raises the question of whether any “true” AWS is currently deployed — but for this essay’s purposes, AWS can be treated as an “emerging” technology that is kinetic (like drones) and remote (like drones and cyber).

Remote, standoff weapon platforms disturb many people, largely out of fear that separating the immediate operation of the weapon from its human operator increases the risk of unintended or unaccountable action by the machine.  This is not a categorical legal objection to such systems.  Remoteness combined with technological complexity might make it more difficult to engineer a weapon system in one of these categories whose results and effects conform to the requirements of the law.  Yet remoteness and technology together might also improve the weapon system in important ways, including the ability to confine the effects of an attack more narrowly to intended, lawful targets, thereby reducing collateral damage.

Remoteness is closely related to another feature of all these emerging weapon systems: the potential for attacks with far smaller footprints — not just of the attack itself, as discussed above, but of the weapon system and its human operators, at least with regard to their presence and detectability at the situs of attack.  In other words, technologies that enhance precision (“discrimination,” in law-of-war terminology) are good from the legal standpoint of limiting the collateral damage of imprecise weaponry, but a weapon system whose precision comes in part from being difficult to detect and trace also makes possible essentially unattributable attacks.  An attacker using a small-footprint weapon system could invoke plausible deniability or simply decline to acknowledge the act; at the extreme, no one but the attacker would know for certain who conducted the operation.

Why care?  And why care especially given that this diminished ease of attribution (or at least of attribution quickly enough to matter, in hours rather than days, months, or years) is a function of legally and normatively good features of these emerging weapon technologies?  The risk, as international and national security law scholar Michael Glennon pointed out in the Hoover Institution’s former journal Policy Review, is that difficulty in attributing a cyberattack — identifying the responsible party quickly and definitively — is potentially highly destabilizing in the international arena.  The point certainly applies to remote kinetic weapon systems as well, though proposals to ban all these systems would not solve the attribution problem (legal restrictions are unlikely to persuade bad actors to refrain from attacks) and would deprive all parties, including civilians at risk on the battlefield, of advances in weapon technology that might limit the spillover effects of conflict.

Attribution may turn out to be less difficult in each of these realms than previously feared.  Technical experts suggest that attribution of cyberattacks is becoming more feasible, though speed of attribution — and therefore of meaningful response — remains an obstacle.  And although drones are a core tool against terrorists and non-state insurgent forces, current drones can do little to evade detection by even modestly robust air-defense systems.  Even so, air-defense systems do not guarantee that a state will detect, and be able to attribute, every drone attack carried out on its territory, particularly in places where the government does not anticipate such attacks.

For states conducting drone operations, preventing their drones from being detected will not be easy without significant technological efforts to protect against the air defenses of even moderately sophisticated militaries. So, in the tradition of historical arms races, drone-deploying militaries are likely to continue down the path of producing less detectable, less vulnerable drones as their targets implement ever-more-sophisticated detection capabilities and countermeasures. Exactly how this co-evolution plays out, however, depends on technological developments and many other considerations, including defense budgets.

There is a final question apart from issues of remoteness, of precision and discrimination, and of attribution.  How many states, let alone non-state actors, are in fact capable of developing and effectively deploying these remote systems?  On this question, the differences between the technologies are significant.  Cyber tools are difficult to develop, to be sure.  And, as with all these systems, they are even more difficult to develop if the aim is to make them precise and discriminating.  But creating and using cyber tools (which are, fundamentally, software programs) is less complicated than successfully operating — let alone developing and producing — large aircraft, sensors, kinetic missiles with their own guidance systems, and so on.

The physical attributes of drones and AWS require a sophisticated high-technology industrial base.  Even if these systems can be bought by a less sophisticated customer from a high-tech seller, a complex and expensive military infrastructure is required to operate and maintain them.  Recent management studies of drone warfare systems suggest, almost certainly correctly, that wide, deep, and rapid proliferation of these systems is quite unlikely, if for no other reason than that the remote precision-targeting drone systems described in this essay require an asset that is very difficult to come by: access to a sophisticated satellite-navigation system such as GPS.  We take the systems we use most for granted — the satellite signals that let a phone fix its own location to find a lost device or deliver turn-by-turn driving directions — but the US Air Force operates the most important constellations, even with respect to civilian GPS.  Russia has an alternative in GLONASS, as does China in BeiDou, but Iran and North Korea are another matter.  Anyone can buy a quite impressive small drone on Amazon (some are not so small), and the tactical ability to use such drones to see over the next hill in combat is priceless.  But the gap between that and a Predator or Reaper operated by someone half a globe away is so large that perhaps we should use different words for these very different machines.
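To give a rough sense of why precise satellite navigation is such a hard asset to replicate, the sketch below illustrates the positioning problem at its core: a receiver solves for its own location from measured ranges to satellites whose positions are known. This is a deliberately simplified, hypothetical illustration — two dimensions, invented beacon coordinates, and no receiver clock bias or signal timing, all of which real GPS must handle — but the least-squares structure of the fix is the same.

```python
import math

def trilaterate_2d(sats, ranges, guess=(0.0, 0.0), iters=20):
    """Gauss-Newton solve for receiver position (x, y), given ranges
    to beacons at known positions.  A toy 2-D stand-in for a GPS fix."""
    x, y = guess
    for _ in range(iters):
        # Residuals and Jacobian rows of the range equations r_i = ||p - s_i||
        J, resid = [], []
        for (sx, sy), ri in zip(sats, ranges):
            d = math.hypot(x - sx, y - sy)
            resid.append(d - ri)
            J.append(((x - sx) / d, (y - sy) / d))
        # Normal equations (J^T J) step = -J^T resid, solved directly (2x2)
        a = sum(jx * jx for jx, _ in J)
        b = sum(jx * jy for jx, jy in J)
        c = sum(jy * jy for _, jy in J)
        gx = sum(jx * r for (jx, _), r in zip(J, resid))
        gy = sum(jy * r for (_, jy), r in zip(J, resid))
        det = a * c - b * b
        x += (-c * gx + b * gy) / det
        y += (b * gx - a * gy) / det
    return x, y

# Hypothetical geometry: receiver truly at (3, 4), three known beacons.
sats = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
truth = (3.0, 4.0)
ranges = [math.hypot(truth[0] - s[0], truth[1] - s[1]) for s in sats]
x, y = trilaterate_2d(sats, ranges, guess=(1.0, 1.0))
```

A real GPS receiver solves the three-dimensional version of this problem, with a fourth unknown (its own clock error) and ranges inferred from nanosecond-scale signal timing — which is why the hard part is not the mathematics but operating the satellite constellation and atomic clocks behind it.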