10 February 2015

Keeping Humans in the Loop

By Captain George Galdorisi, U.S. Navy (Retired)

You say you want a revolution? Autonomous unmanned vehicles could bring on the biggest one yet.

In his best-selling book, War Made New, military historian Max Boot notes: “My view is that technology sets the parameters of the possible; it creates the potential for a military revolution.” 1 He supports his thesis with historical examples to show how technology-driven “Revolutions in Military Affairs” have transformed warfare and altered the course of history. The U.S. military has embraced a wave of technological change that has constituted a true revolution in the way war is waged.

One of the most rapidly growing areas of innovative technology adoption involves unmanned systems. In the past decade the military’s use of unmanned aerial vehicles (UAVs) has increased from only a handful to more than 5,000, while the use of unmanned ground vehicles (UGVs) has exploded from zero to more than 12,000. The use of unmanned surface vehicles (USVs) and unmanned underwater vehicles (UUVs) is also growing, as USVs and UUVs are proving to be increasingly useful for a variety of military applications. The skyrocketing use of unmanned systems (UxS) is already creating strategic, operational, and tactical possibilities that did not exist a decade ago.
Defining Warfare

Armed unmanned systems are not only changing the face of modern warfare, but they are also altering the process of decision-making in combat operations. Indeed, it has been argued that the rise in drone warfare is changing the way we conceive of and define “warfare” itself. These systems have been used extensively in the conflicts in Iraq and Afghanistan and will continue to be equally relevant—if not more so—as the U.S. strategic focus shifts toward the Asia-Pacific region and the high-end warfare that strategy requires.

However, while these unmanned systems are of enormous value today and are evolving to deliver better capabilities to the warfighter, it is their promise for the future that causes the most excitement. These systems have created a substantial buzz in policy, military, industry, and academic circles. But an increasing share of that buzz involves concerns—some of them legitimate—about how much autonomy these systems ought to have. Unless or until those concerns are addressed, the enormous potential of these technological marvels may never be realized.

The Plan for Military Autonomous Systems

At the highest levels of U.S. strategic and policy documents, unmanned systems are featured as an important part of the way the joint force will fight in the future. The 2014 Quadrennial Defense Review (QDR) notes, “Continuing a trend that began in the late 1990s, U.S. forces will increase the use and integration of unmanned systems.” Elsewhere, the document identifies unmanned systems as part of “Maintaining our ability to project power.” Equally important, the QDR highlights unmanned systems as a key part of the Department of Defense’s commitment to innovation and adaptation. 2

The DOD’s vision is to integrate unmanned systems into the joint force for a number of reasons: to reduce the risk to human life in high-threat areas, to deliver persistent surveillance over areas of interest, and to provide warfighters with options that derive from the inherent advantages of unmanned technologies—especially their ability to operate autonomously.

Because unmanned systems are used by all the military services, the DOD publishes a biennial roadmap to provide an overarching vision for the military’s use of them. The most recent such document, the FY 2013–2038 Unmanned Systems Integrated Roadmap, singled out the need for enhanced UxS autonomy, noting, “DOD envisions unmanned systems seamlessly operating with manned systems while gradually reducing the degree of human control and decision making required for the unmanned portion of the force structure.” 3 As Dyke Weatherington, the DOD Director for Unmanned Warfare and Intelligence, Surveillance, and Reconnaissance, noted, “The roadmap articulates a vision and strategy for the continued development, production, test, training, operation, and sustainment of unmanned systems technology across DOD. . . . This roadmap establishes a technological vision for the next 25 years.”

As the QDR and the roadmap both note, unmanned systems are especially important in areas where the U.S. military faces a strong anti-access/area-denial (A2/AD) threat. The Joint Operational Access Concept identifies “Unmanned systems, which could loiter to provide intelligence collection or fires in the objective area” as a key counter-area-denial capability. 4 Unmanned systems are also a key component in executing the Air-Sea Battle Concept in high-threat areas such as the Western Pacific, where adversary A2/AD systems pose an unacceptably high risk to manned aircraft.
‘The New Triad’

Outside observers have highlighted the importance of unmanned systems in achieving U.S. strategic goals. In his 2013 Foreign Policy article “The New Triad,” Admiral James Stavridis identified unmanned systems as one of its three pillars, noting, “The second capability in the New Triad is unmanned vehicles and sensors. This branch of the triad includes not only the airborne attack ‘drones’ . . . but unmanned surveillance vehicles in the air, on the ground, and on the ocean’s surface. . . . Such systems have the obvious advantage of not requiring the most costly component of all: people.” 5

The U.S. Navy has been at the forefront of UxS development. The 28th Chief of Naval Operations Strategic Studies Group (SSG) spent one year examining this issue, and its report spurred increased interest in, and emphasis on, unmanned systems Navy-wide. Building on the SSG’s work, the Navy has emphasized enhancing UxS command-and-control (C2) capabilities so that one sailor can control multiple systems, in an effort to lower the total ownership cost (TOC) of unmanned systems. This link between increased autonomy and decreased TOC has become an important theme in Navy UxS development.

Clearly, the Navy’s leadership is committed to UxS. CNO Admiral Jonathan Greenert’s Sailing Directions state, “Over the next 10 to 15 years . . . unmanned systems in the air and water will employ greater autonomy and be fully integrated with their manned counterparts.” 6 Admiral Greenert highlights the importance of unmanned systems in his Proceedings articles “Navy 2025: Forward Warfighters” and “Payloads Over Platforms: Charting a New Course,” where he argues that payloads, including unmanned systems, will become more important than the platforms themselves. 7
The Challenges for Autonomous Systems

Well over a decade ago, in its report Roles of Unmanned Vehicles, the Naval Research Advisory Committee (NRAC) highlighted the bright future and enormous potential for autonomous systems, noting, “The combat potential of UVs (unmanned vehicles) is virtually unlimited. . . . There is no question that the Fleet/Forces of the future will be heavily dependent upon UVs.” 8 In the years following the NRAC report, the U.S. military has been working with industry and academia to make unmanned vehicles more and more autonomous. The reasons for this effort are compelling.

As described in the most recent Unmanned Systems Roadmap, there are four levels of autonomy: Human Operated, Human Delegated, Human Supervised, and Fully Autonomous. However, the Roadmap notes that in contrast to automatic systems, which simply follow a set of preprogrammed directions to achieve a predetermined goal, autonomous systems “are self-directed towards a goal in that they do not require outside control, but rather are governed by laws and strategies that direct their behavior.” 9
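
To make the taxonomy concrete, here is a minimal sketch of how a control loop might dispatch decisions at each of the four levels. It is illustrative only: the level names come from the Roadmap, but the class, function, and parameter names, as well as the simple dispatch logic, are assumptions made for this example, not anything drawn from a DOD specification.

```python
from enum import Enum, auto

class AutonomyLevel(Enum):
    """The four levels named in the FY 2013-2038 Unmanned Systems Integrated Roadmap."""
    HUMAN_OPERATED = auto()    # the operator makes every control decision
    HUMAN_DELEGATED = auto()   # the system performs delegated functions only when told to
    HUMAN_SUPERVISED = auto()  # the system acts on its own; the operator monitors and can intervene
    FULLY_AUTONOMOUS = auto()  # the system pursues its goal without operator input

def next_action(level: AutonomyLevel, machine_plan, operator_command=None):
    """Illustrative dispatch: who decides the next action at each level.

    `machine_plan` stands in for whatever the vehicle's own guidance logic proposes;
    `operator_command` is the human's input, if any. Both are placeholders.
    """
    if level is AutonomyLevel.HUMAN_OPERATED:
        return operator_command                      # nothing happens without the human
    if level is AutonomyLevel.HUMAN_DELEGATED:
        return machine_plan if operator_command == "execute" else None
    if level is AutonomyLevel.HUMAN_SUPERVISED:
        return None if operator_command == "abort" else machine_plan
    return machine_plan                              # fully autonomous: no human in the loop
```

The point of the sketch is simply that each step up the scale removes one more place where a human decision is required.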

One of the most pressing challenges for the DOD is to reduce the burdensome manpower requirements currently needed to operate unmanned systems. Military manpower makes up the largest part of the TOC of systems across all the services. 10 But just how expensive is military manpower? To understand why reducing manning requirements matters so much, it helps to look at what personnel cost the U.S. military, writ large.

Military manpower costs are also the fastest-growing accounts, even as the total number of military men and women decreases. According to a 2012 Congressional Budget Office report, military personnel expenditures rose from $74 billion in 2001 to $159 billion in 2012, an increase of almost 115 percent. 11 Mackenzie Eaglen and Michael O’Hanlon have noted that between Fiscal Years 2001 and 2012, the compensation cost per active-duty service member increased by 56 percent after adjusting for inflation. 12
Increased Manning

Lessons learned throughout the development process of most unmanned systems—especially unmanned aerial systems—demonstrate that those systems can actually increase manning requirements. The Air Force has estimated that the MQ-1 Predator requires a crew of about 168 personnel, while the MQ-9 Reaper requires a crew of 180, and the RQ-4 Global Hawk relies on 300 people to operate it. As General Philip Breedlove, Vice Chief of Staff of the Air Force, has emphasized, “The number one manning problem in our Air Force is manning our unmanned platforms.” 13

Compounding the TOC issue, the data overload challenge generated by the proliferation of unmanned aircraft and their sensors has created its own set of manning issues. In fact, the situation has escalated so quickly that many doubt that hiring additional analysts will help ease the burden of sifting through thousands of hours of video. A former Vice Chairman of the Joint Chiefs of Staff was quoted as complaining that a single Air Force Predator can collect enough video in one day to occupy 19 analysts, noting, “Today an analyst sits there and stares at Death TV for hours on end, trying to find the single target or see something move. It’s just a waste of manpower.” 14 The data-overload challenge is so serious that it’s widely estimated the Navy will face a tipping point by 2016, after which the service will no longer be able to process the amount of data it’s compiling. 15

With the prospect of flat or declining military budgets, the rapidly rising cost of military manpower, and the increased DOD emphasis on total ownership costs, the mandate to move beyond the “many operators, one-joystick, one-vehicle” paradigm that has governed most unmanned systems for the past decades is clear and compelling. The DOD and the services are united in their efforts to increase the autonomy of unmanned systems as a primary means of reducing manning and achieving acceptable TOC. But this drive raises the question of how far the push for autonomy should go, and what downside, if any, comes with pushing UxS autonomy too far. Is there an unacceptable “dark side” to too much autonomy?
The Dark Side

An iconic film of the last century, Stanley Kubrick’s 2001: A Space Odyssey, features a key scene in which astronauts David Bowman and Frank Poole consider disconnecting the cognitive circuits of the ship’s computer, HAL (Heuristically programmed ALgorithmic computer), when it appears to be mistaken in reporting a fault in the spacecraft’s communications antenna. They attempt to conceal what they are saying, but are unaware that HAL can read their lips. Faced with the prospect of disconnection, HAL decides to kill the astronauts to protect and continue its programmed directives.

While few today worry that a 21st-century HAL will turn on its masters, the issues involved in fielding autonomous unmanned systems are complex, challenging, and increasingly contentious. Kubrick’s 1968 movie was prescient. Almost half a century later, while we readily accept advances in other UxS attributes such as propulsion, payload, stealth, speed, and endurance, we are still coming to grips with how much autonomy is enough and how much may be too much. This is arguably the most important issue we need to address with unmanned systems over the next decade.

Unmanned systems become more autonomous in direct proportion to their ability to sense the environment and adapt to it. This capability speeds decision-making and allows friendly forces to act within an adversary’s OODA (observe, orient, decide, and act) loop, the brainchild of Air Force Colonel John Boyd, originally applied to fighter tactics. As the environment or mission changes, the ability to sense and adapt allows an unmanned system to find the best way to accomplish its mission without constant human-operator oversight, input, and decision-making. But while we need unmanned systems to operate inside the enemy’s OODA loop, are we ready for them to operate without our decision-making—to operate inside our OODA loops?
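
As a rough illustration of what “sense and adapt” means inside an OODA cycle, the sketch below re-plans whenever the observed environment no longer matches the assumptions behind the current plan. The interfaces (`sensors`, `planner`, `vehicle`) and their method names are assumptions made for illustration, not a description of any fielded system.

```python
def autonomous_mission_loop(sensors, planner, vehicle, mission_goal):
    """Observe-orient-decide-act with on-board re-planning (no operator in the cycle)."""
    plan = planner.make_plan(mission_goal, sensors.read())
    while not plan.complete():
        observation = sensors.read()                  # Observe
        assessment = planner.orient(observation)      # Orient: fuse data, assess the situation
        if not plan.still_valid(assessment):          # environment or mission has changed
            plan = planner.make_plan(mission_goal, observation)  # Decide: adapt the plan
        vehicle.execute(plan.next_step())             # Act
```

Nothing in that loop asks a human anything; that is precisely what makes it fast, and precisely what raises the questions that follow.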

In an article titled “Morals and the Machine,” The Economist addressed the issue of autonomy and humans-in-the-loop this way:

As they become smarter and more widespread, autonomous machines are bound to end up making life-or-death decisions in unpredictable situations, thus assuming—or at least appearing to assume—moral agency. Weapons systems currently have human operators “in the loop,” but as they grow more sophisticated, it will be possible to shift to “on the loop” operation, with machines carrying out orders autonomously. As that happens, they will be presented with ethical dilemmas. . . . More collaboration is required between engineers, ethicists, lawyers, and policymakers, all of whom would draw up very different types of rules if they were left to their own devices. 16
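
The distinction The Economist draws between “in the loop” and “on the loop” can be reduced to a few lines of pseudo-logic. This is a sketch under stated assumptions: `system`, `operator`, the method names, and the ten-second veto window are all hypothetical, chosen only to show where the human sits in each arrangement.

```python
import time

def act_in_the_loop(system, operator, action):
    """Human IN the loop: nothing happens until the operator affirmatively approves."""
    if operator.approves(action):
        system.execute(action)

def act_on_the_loop(system, operator, action, veto_window_s=10.0):
    """Human ON the loop: the machine announces its intent and proceeds
    unless the operator vetoes within the window."""
    operator.notify(f"Executing {action} in {veto_window_s} s unless vetoed")
    deadline = time.monotonic() + veto_window_s
    while time.monotonic() < deadline:
        if operator.has_vetoed(action):
            return                       # the human stopped the action
        time.sleep(0.1)
    system.execute(action)               # silence is treated as consent
```

The second function is what “carrying out orders autonomously” looks like in practice: the human can still object, but the default has shifted from the operator to the machine.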

Bill Keller further explored the issue of autonomy for unmanned systems in The New York Times:

If you find the use of remotely piloted warrior drones troubling, imagine that the decision to kill a suspected enemy is not made by an operator in a distant control room, but by the machine itself. Imagine that an aerial robot, with no human in the loop, pulls the trigger. While Americans are debating the president’s power to order assassination by drone, powerful momentum is propelling us toward the day when we cede the same lethal authority to software. 17

The Department of Defense is addressing the issue of human control of unmanned systems as a first-order priority and is beginning to issue policy to ensure that humans do remain in the OODA loop. A November 2012 directive signed by then-Deputy Secretary of Defense Ashton Carter provides the following guidance:

Human input and ongoing verification are required for autonomous and semi-autonomous weapon systems to help prevent unintended engagements. These systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force. Humans who authorize the use of, or operate these systems, must do so with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules and applicable rules of engagement. An autonomous system is defined as a weapon system that, once activated, can select and engage targets without further intervention by a human operator. 18

These are the kinds of directives and discussions that are—and should be—part of the dialogue between and among policy makers, military leaders, industry, academia, and the science-and-technology community as the design and operation of tomorrow’s UxS are thoughtfully considered. But it is one thing to issue policy statements, and quite another to actually design UxS to carry out the desired policy. This is not a trivial undertaking and—in Albert Einstein’s words—will require a new way of “figuring out how to think about the problem.” 19 Most informed discussion begins with the premise that adversaries who intend to use UxS against our interests will not be inhibited by the kinds of legal, ethical, and moral strictures the United States adheres to. Designing the right degree of autonomy into our unmanned systems is the central issue that will determine their success or failure.
Designing in the Right Degree of Autonomy

Most of us are familiar with the children’s fable “Goldilocks and the Three Bears.” As Goldilocks tastes three bowls of porridge, she finds one too hot, one too cold, and one just right. As the DOD and the services look to achieve the right balance of autonomy and human interaction—to balance these two often-opposing forces and get them “just right”—designing this capability into tomorrow’s unmanned systems at the outset, rather than trying to bolt it on after the fact, may be the only sustainable road ahead. If we fail to do this, it is almost inevitable that fears of armed unmanned systems taking on “HAL-like” powers and slipping beyond our control will derail the promise of these technological marvels.

The capabilities required to find this “just right” balance must leverage many technologies that are still emerging. The military knows what it wants to achieve, but often not what technologies or even capabilities it needs to field UxS with the right balance of autonomy and human interaction. A key element of this quest is to worry less about what attributes—speed, service ceiling, endurance, and others—the machine itself possesses and instead focus on what is inside the machine. The Defense Science Board report The Role of Autonomy in DoD Systems put it this way:

Instead of viewing autonomy as an intrinsic property of unmanned systems in isolation, the design and operation of unmanned systems needs to be considered in terms of human-systems collaboration. . . . A key challenge for operators is maintaining the human-machine collaboration needed to execute their mission, which is frequently handicapped by poor design. . . . A key challenge facing unmanned systems developers is the move from a hardware-oriented, vehicle-centric development and acquisition process to one that emphasizes the primacy of software in creating autonomy. 20

One need only go to an industry conference where UxS are being displayed at multiple booths to understand that today the emphasis is almost completely on the machine itself. What is inside is not a primary consideration. But as the Defense Science Board notes, software is the primary driver of capabilities. For example, the manned F-35 Lightning II depends on millions of lines of computer code, and it still has a pilot providing human supervision. How many lines of code will need to be built into an unmanned system to get the balance of autonomy and human interaction just right?

For the relatively small numbers of UxS that will engage an enemy with a weapon, this balance is crucial. Prior to firing a weapon, the unmanned platform needs to provide the operator—and there must be an operator in the loop—with a “pros and cons” decision matrix regarding what that firing might entail. When we build that capability into unmanned systems we will, indeed, have gotten it just right and the future of military autonomous systems will be bright.
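
What such a “pros and cons” decision matrix might look like to the operator can be sketched in a few lines. Every field, threshold, and method name below is hypothetical, invented for illustration; the only point the sketch is meant to carry is that the platform recommends while the human authorizes.

```python
from dataclasses import dataclass, field

@dataclass
class EngagementBrief:
    """Hypothetical decision matrix the platform assembles for the operator."""
    target_id: str
    target_confidence: float          # 0.0-1.0, how sure the classifier is
    hostile_act_observed: bool
    collateral_damage_estimate: str   # e.g., "low", "medium", "high"
    pros: list = field(default_factory=list)
    cons: list = field(default_factory=list)

def request_engagement(brief: EngagementBrief, operator) -> bool:
    """The platform may recommend, but only the operator authorizes weapon release."""
    recommend = brief.target_confidence > 0.9 and brief.hostile_act_observed
    operator.display(brief, recommendation="engage" if recommend else "hold")
    return operator.authorizes(brief.target_id)    # the human decision is final
```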

1. Max Boot, War Made New: Technology, Warfare, and the Course of History, 1500 to Today (New York: Gotham Books, 2006), 318–51. See also Bruce Berkowitz, The New Face of War: How War Will Be Fought in the 21st Century (New York: The Free Press, 2003).

2. Quadrennial Defense Review (Washington, DC: Department of Defense, 2014).

3. FY 2013–2038 Unmanned Systems Integrated Roadmap (Washington, DC: Department of Defense, 2013).

4. Department of Defense, Joint Operational Access Concept (Washington, DC: 17 January 2012), 10.

5. ADM James Stavridis, USN, “The New Triad,” Foreign Policy, 20 June 2013.

6. ADM Jonathan Greenert, USN, Sailing Directions, www.navy.mil/cno/cno_sailing_direction_final-lowres.pdf.

7. ADM Jonathan Greenert, USN, “Navy 2025: Forward Warfighters,” U.S. Naval Institute Proceedings, vol. 137, no. 12 (December 2011), and “Payloads Over Platforms: Charting a New Course,” U.S. Naval Institute Proceedings, vol. 138, no. 7 (July 2012).

8. Naval Research Advisory Committee, Roles of Unmanned Vehicles (Washington, DC: Naval Research Advisory Committee, 2003).

9. FY 2013–2038 Unmanned Systems Integrated Roadmap.

10. Navy Actions Needed to Optimize Ship Crew Size and Reduce Total Ownership Costs (Government Accountability Office, GAO-03-520, 9 June 2003). See also Connie Bowling and Robert McPherson, Shaping the Navy’s Future (Washington, DC: Accenture White Paper, February 2009), www.accenture.com/us-en/Pages/service-public-service-shaping-navys-futur...

11. Congressional Budget Office, Costs of Military Pay and Benefits in the Defense Budget, 14 November 2012. See also Mackenzie Eaglen and Michael O’Hanlon, “Military Entitlements are Killing Readiness,” Wall Street Journal, 25 July 2013, and Todd Harrison, Rebalancing Military Compensation: An Evidence-Based Approach (Washington, DC: Center for Strategic and Budgetary Assessments, 12 July 2012).

12. Eaglen and O’Hanlon, “Military Entitlements are Killing Readiness.”

13. Quoted in Lolita Baldor, “Military Wants to Fly More Sophisticated Drones,” Associated Press, 4 November 2010.

14. Ellen Nakashima and Craig Whitlock, “Air Force’s New Tool: ‘We Can See Everything,’” The Washington Post, 2 January 2011.

15. The ISR “tipping point” has been noted in a TCPED study from the Office of the Chief of Naval Operations and PMW 120 (Battlespace Awareness and Information Operations), an independent Navy Cyber Forces study, and the NRAC study from summer 2010.

16. “Flight of the Drones: Why the Future of Air Power Belongs to Unmanned Systems,” The Economist, 8 October 2011.

17. Bill Keller, “Smart Drones,” The New York Times, 10 March 2013.

18. Deputy Secretary of Defense Ashton Carter Memorandum, “Autonomy in Weapon Systems,” 21 November 2012, www.defense.gov/. See also “Carter: Human Input Required for Autonomous Weapon Systems,” Inside the Pentagon, 29 November 2012, for a detailed analysis of the import of this memo.

19. Wilbur Schramm and William Porter, Men, Women, Messages and Media: Understanding Human Communication (New York: Harper and Row, 1982).

20. Defense Science Board, Task Force Report: The Role of Autonomy in DoD Systems, July 2012.

Captain Galdorisi writes frequently for Proceedings and is the coauthor of the Naval Institute Press book The Kissing Sailor (2012) with Lawrence Verria.
