Saturday, September 29, 2018

Adaptive Red Teaming: Protecting Across the Spectrum


Authors: John P. Sullivan and Adam Elkus
Publication: Red Team Journal
Date: July 2010

In a 2010 article in the Red Team Journal, John Sullivan and Adam Elkus outline their view on a style of red teaming they refer to as Adaptive and Analytic red teaming.

The authors first discuss the foundations of red teaming. They cite Schelling, who believes that red teaming is an approach to combat the “poverty of expectations” where “the danger is not that we shall read the signals and indicators with too little skill; the danger is in a poverty of expectations – a routine obsession with a few dangers that may be familiar rather than likely.”
Adaptive red teaming, the authors say, “involves an iterative range of analytical and physical approaches to understanding an adversary.” The authors suggest that these new methods for red teaming are valuable for conducting analysis on counterterrorism, counterinsurgency, and counterviolence. They furthermore state that analytic red teaming is an approach to thinking in an adversarial manner, in this case as a terrorist or opposing force (OPFOR). The idea behind analytic red teaming is to gain an “enhanced understanding of the group’s particular driving factors, strategic goals, leadership and decision-making dynamics and processes, operational capabilities and rationales, organizational dynamics and behaviors, adaptive capacities, etc., and their corollary and derivative operations.” Put simply, the goal is to understand the specified adversary’s “mindset” (ideology, strategic agenda, leadership) and the adversary’s operational behaviors (capabilities, modus operandi, targeting preferences).

The premise of the article rests on the idea that there is a larger spectrum of threats to which red teaming can be applied than just Islamic terrorism, which has drawn the majority of the focus of military, intelligence, and law enforcement organizations. Sullivan and Elkus believe there is a variety of threats that pose potential risks to public safety. While the authors mention some specific examples, they use the term “black blocs” as a catch-all for such anarchic or potentially violent groups. To protect against such groups, Sullivan and Elkus argue that the concepts of kill chain, order of battle, and Design can enhance analytic red teaming in practice.

The authors describe the kill chain model as “the process of assembling weapons and personnel in place, conducting reconnaissance and dry runs, and then carrying out the act itself.” Sullivan and Elkus argue that by tracing the adversary’s necessary courses of action and required tools through a series of decision trees, with branches of tasks and subtasks, the kill chain can produce data (on trends and potential) that can be used to test the adversary’s capabilities. The authors use the example of a black bloc to highlight the use of the kill chain, and suggest that the kill chain is an additional method that, by applying different network types, can highlight indicators of attack and vulnerabilities in an adversary’s command and control.

Order of battle (ORBAT) is a military method that “displays the enemy’s organization and disposition” and denotes “different types of units, equipment, and axes of advance…to predict the behavior of these units.” Sullivan and Elkus believe that “ORBAT analysis can be used to give teeth to analysis of the kill chain.” ORBATs feed into the kill chain, providing more information about the adversary’s cell-to-cell capabilities. One example of ORBAT in use is free-play tactical decision-making games, which theoretically test how a unit adapts to real-time tactical scenarios when there is “no right answer.” A real-world example of such games is the training Army units undergo at the National Training Center.

Design is a method developed by the Army School of Advanced Military Studies to “frame a problem creatively prior to solving it.” Design frames the operational environment, frames the problem, and provides an operational approach to push the problem toward an acceptable resolution. The process occurs concurrently with the planning of the operation at hand. The authors believe this process can be used to challenge assumptions on longer-term issues that are more strategic in nature, like risk management and risk analysis. They suggest that Design can aid in building better defensive measures against threats like Al-Qaeda and terrorism writ large, citing the German Red Army Faction and the Irish Republican Army as examples.

In conclusion, the authors argue that by extending red team analysis with new methods, we “can help diagnose threats, vulnerability and risk, and point the way toward a better means of providing security and addressing emerging threats.” These methods can be used to prevent terrorist attacks or be used to refine prevention and deterrence activities.

Critique: The authors have a firm knowledge of alternative methods that could enhance red team analysis. If I am not mistaken, the ideas behind the kill chain and ORBAT have, in a way, been incorporated into targeting analysis, a key intelligence activity in the war on terror, and arguably these methods have been effective there. Otherwise, it is hard to see how these forms of analysis have been used inside the intelligence, military, and law enforcement arenas. The kill chain is a logical, practical method with utility in all of these areas. ORBAT, and by extension IPB, is a specialized practice largely confined to the military and is unlikely to be used outside military settings; intelligence operations likely incorporate such thinking, but in an unstructured way. Based on the article, I see no utility to Design that is not already served by the previous two methodologies.

Link to article: 
https://redteamjournal.com/papers/RTJ_Occasional_Paper_01_July_2010.pdf

The Defense Science Board: The Role and Status of Red Teaming Activities


In 2003 the Defense Science Board prepared a report on red teaming for then-Secretary of Defense Donald Rumsfeld. The report is broken into seven sections:

1. Introduction
2. What Are Red Teams and Red Teaming?
3. What Makes an Effective Red Team?
4. Observations About Current Red Team Activities
5. Red Teams at the Strategic Level
6. Conclusions
7. Recommendations

The report indicates that the Defense Science Board takes a rather broad view in its definition of red teaming. The definition covers not only playing the adversary role but also devil’s advocacy and other similar roles, a breadth the board justifies by the shared goal of challenging the norms of an organization. The board states that red teaming can be used by the Department of Defense at the strategic, operational, and tactical levels in a variety of areas. Red teaming can be, and currently is, used by the DoD in training, concept development, and the security of complex networks, and for scenarios that allow little flexibility, such as nuclear weapon stockpile issues (DSB, 2003). Red teaming can also be used to hedge against bias, conflict of interest, and inexperience, which is common in the DoD and other government agencies due to short leadership tenures (DSB, 2003).

In detailing what makes an effective red team, the Defense Science Board identifies common causes of failure as well as attributes that create effective red teams. The top failure listed is a red team not taking its assignment seriously. The report’s authors include an additional note about this failure from task force members, who explain that they were often never provided a clear statement of purpose when assigned to a red team (DSB, 2003). Other failures listed include:

- Lack of independence due to bureaucracy
- Removal from the decision-making process
- Inadequate interaction with the blue team
- Destruction of the integrity of the findings due to information leaks

After addressing those failures, the Defense Science Board details the attributes of effective red teams, identifying the following:

- A culture that supports internal criticism
- Independence with accountability
- Serious consideration of the red team’s output
- Robust interaction between the red and blue teams
- Careful selection of staff
- Skillful timing and implementation (neither too late nor too soon)

The report ultimately concludes that red teaming is effective yet underutilized by the DoD and should be expanded. The Defense Science Board indicates that if the U.S. is to better understand its current adversaries in the war on terrorism and avoid complacency, red teaming must play a role.


Defense Science Board Task Force. (2003). The Role and Status of DoD Red Teaming Activities. Department of Defense. Retrieved from https://fas.org/irp/agency/dod/dsb/redteam.pdf

Friday, September 28, 2018

The 'Best Practices' of Red Teaming
Dr. Brad Gladman
Summary and Critique by Jillian J

Summary
Dr. Gladman explores the importance of red teaming, organizations and methodology, the composition of the red team, and the process of red teaming. He presents a description of red teaming, acknowledging that the concept of a red team manifests itself in many forms-- "In general, red team efforts, both devil's advocate and 'opposing force,' can help hedge against surprise, particularly catastrophic surprise, through their challenge function that provides a wider and deeper understandings of potential adversary options and behaviors, and can expose potential vulnerabilities in friendly strategies, force postures, plans, programs, and concepts. Red team efforts can also help organizations to avoid biases and the tendency to accept the typical assumptions and solutions to problems" (Gladman 2007). The traditional opposing force role can uncover how human adversaries may threaten the system, but Gladman asserts that considering the impact of an event like an earthquake necessitates the other type of red team role--devil's advocate. Modeling this type of adversarial event is an important function of a red team. He also distinguishes between the opposing force red team and risk assessment: the former assumes the adversary and situation are unknown, while in the latter the threat is already known. However, Gladman states that the two cannot be completely separated.

The Importance of Red Teaming
This section serves as a history lesson in red teaming, tracing its use back to the early nineteenth century and the rise of the Prussian General Staff with its Kriegsspiel, or war game. Gladman goes on to discuss how important it is for red teams to avoid groupthink (the whole red team settling onto one track and thus failing to present diverse, original scenarios) and mirror-imaging (the red team presenting scenarios as "this is how we'd act if we were them"). To fight these tendencies, he suggests, respectively, assigning the role of critical evaluator to each team member to foster an environment where objections can be voiced, and attempting to understand the specific nature of the operating environment's complexity and the capabilities of the adversary.

Organizations and Methodology
Gladman addresses the importance of culture and asserts that a direct reporting relationship to the commander or head of the lead agency is critical to the red team's success. Additionally, the red team must engage in continual sharing of relevant information and findings with teams at each command level. The red team must be seen as a critical and legitimate part of the planning process, have the freedom to challenge areas where it detects problems, and be logically organized and applied at critical times during the study. Gladman (2007) writes that ideally, "there is robust interaction between, initially, the red and planning teams, and later between red and blue teams during pre-event exercises. Both teams must view this interaction as a 'win-win' versus a 'win-lose' prospect." This results in sharper skills and greater application. He also highlights the importance of clearly understanding the mandate and its expected outcomes.

Composition of the Red Team 
Personality and subject matter expertise are both important to consider when selecting members of a red team. Someone may be professionally qualified but lack the temperament required for the task. He suggests some key team members, e.g., a policy advisor (POLAD) to advise on policy issues and maintain a range of contacts across government departments, a team facilitator to coordinate team discussions and identify assumptions and biases, and an Operational Research and Analysis (ORA) member, who helps decision-makers improve the effectiveness of operations or systems. Members must continue to learn and adapt, while being able to present their recommendations in a way that ensures appropriate attention. Gladman notes that experience shows red teams often fail because they don't take the assignment seriously or don't have adequate exposure to the planning staffs and documents.

Process of Red Teaming
Gladman includes an image highlighting the different functions of the red team (challenging planning assumptions as "devil's advocate," developing "vignettes" or scenarios prioritized by impact and degree of risk, carrying out exercises in the "opposing force" role, and finally assessing lessons learned) in the lead-up to the 2010 Vancouver Olympics.
Next he discusses useful tools for red teaming, e.g., software for capturing thoughts and ideas, brainstorming, and simulation, and the risk assessment matrix included below. The matrix assigns a level of risk (Low, Medium, High, Extreme) at the intersection of an event's severity, were it to occur, and the likelihood of its occurrence.
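The matrix lookup Gladman describes is simple enough to sketch in a few lines. The specific cell assignments and category labels below are illustrative assumptions, not values taken from his paper; only the structure (severity crossed with likelihood yielding a risk level) comes from the source.

```python
# Illustrative severity and likelihood scales (labels are assumptions).
SEVERITY = ["negligible", "minor", "major", "catastrophic"]
LIKELIHOOD = ["rare", "possible", "likely", "almost certain"]

# Rows = severity, columns = likelihood; each cell is the assigned risk level.
MATRIX = [
    ["Low",    "Low",    "Medium",  "Medium"],
    ["Low",    "Medium", "Medium",  "High"],
    ["Medium", "Medium", "High",    "Extreme"],
    ["Medium", "High",   "Extreme", "Extreme"],
]

def risk_level(severity: str, likelihood: str) -> str:
    """Look up the risk level at the intersection of severity and likelihood."""
    return MATRIX[SEVERITY.index(severity)][LIKELIHOOD.index(likelihood)]

print(risk_level("major", "likely"))  # -> High
```

The value of the structure is that it forces an explicit, repeatable judgment for every scenario a red team generates, rather than an ad hoc ranking.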
Gladman maintains that red teaming can create a mental framework much better prepared for the unexpected. 

Critique 
Gladman wrote that the paper's central argument was that, "military organizations which more easily welcome the challenging of assumptions and plans frequently fare better in the operations they undertake than those that do not." I didn't get a clear sense of that as much as I got a clear understanding of red teaming elements. After reading the paper I certainly think that's true, but I don't think it came through as the prevailing argument. Structure aside, I liked his idea of facilitating a win-win atmosphere. Instead of having the red team try to outwit the blue team or the blue team try to outplan the red team, he presented it as more collaborative. The red team's primary job is to provide the adversarial scenarios alongside the blue team, resulting in greater preparedness. It was also interesting to see him include devil's advocacy in the two-part role of a red team and also to read about the importance of direct communication with command. Additionally, I believe this was my first exposure to the Risk Assessment Matrix and I'd like to keep it in mind for future analyses. I like how it puts a structure behind the judgement. Overall, Gladman's paper was informative and extensive, providing useful explanations and direction.

Tuesday, September 25, 2018

Summary of Findings: Intelligence Preparation of the Battlefield (IPB) (4 out of 5 Stars)



Note: This post represents the synthesis of the thoughts, procedures, and experiences of others as represented in the articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst University in September 2018, regarding IPB as an analytic method specifically. The technique was evaluated on its overall validity, simplicity, flexibility, and its ability to make effective use of unstructured data.

Description:
Intelligence Preparation of the Battlefield (IPB) is an analytic method for understanding the threat and environment in a specific geographic area. The military applies IPB to analyze the mission variables of enemy, terrain, weather, and civil considerations in an area of interest and determine their effect on operations. As a result of this process, unit leaders are able to describe the unit's operating environment and the effects that environment has on the unit, and to determine the adversary's likely courses of action (COA).

Strengths:
  • Provides base to assess intelligence gaps in the environment
  • Helps prioritize requirements
  • Increases awareness of the battlespace
  • Structure allows analysts to identify extraneous information
  • Flexible based on terrain, operations, contingencies

Weaknesses:
  • Does not take 2nd and 3rd order effects of battle into consideration when planning
  • Assumes adversary is fighting “the same battle”; Conventional-on-Conventional warfare vs. Conventional-on-Guerrilla warfare (i.e. Vietnam)
  • Depends on analyst experience level
  • Requires understanding of tactics and maneuvers
  • Is dependent on leader guidance and direction in requirements
  • Requires knowledge of adversary commander and forces

How-To:
  1. Define the battlefield environment
  2. Describe the battlefield effects
  3. Evaluate the threat
  4. Determine threat courses of action (COA)
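The four steps above can be sketched as a toy pipeline. All data structures, field names, and values here are invented for illustration; a real IPB is a doctrinal staff process, not a program, so this only shows how each step's output feeds the next.

```python
# Step 1: Define the battlefield environment (area and its limits).
def define_environment(area: str) -> dict:
    return {"area": area, "limits": "area of interest around " + area}

# Step 2: Describe the battlefield effects (how terrain/weather shape operations).
def describe_effects(env: dict) -> dict:
    env["effects"] = ["terrain channels movement", "weather limits visibility"]
    return env

# Step 3: Evaluate the threat (doctrine and capabilities of the adversary).
def evaluate_threat(env: dict) -> dict:
    env["threat"] = {"doctrine": "conventional",
                     "capabilities": ["armor", "artillery"]}
    return env

# Step 4: Determine threat courses of action by combining effects with capabilities.
def determine_coas(env: dict) -> list[str]:
    return [f"{cap} advance where {eff}"
            for cap in env["threat"]["capabilities"]
            for eff in env["effects"]]

env = evaluate_threat(describe_effects(define_environment("Gettysburg")))
for coa in determine_coas(env):
    print(coa)
```

The point of the sketch is the dependency chain: courses of action (step 4) can only be judged against both the environmental effects (step 2) and the threat evaluation (step 3).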

Application of Technique:
Analysts constructed a terrain model from a topographical map of Gettysburg. Sand was used to create the terrain and depict changes in elevation, roads were marked with black yarn, and water with blue yarn. After completion of the terrain model, key terrain within the area of operations was identified.

Topographical Map of Gettysburg (1863)
Application of Technique using a Sand Table
to create a Terrain Model of the map shown above.

For Further Information:
  1. FM 34-130: Intelligence Preparation of the Battlefield
  2. Street Smart: Intelligence Preparation of the Battlefield for Urban Operations