Monday, April 5, 2010

Red Team "Two Sides to Every Story"

In his article "Red Team: Two Sides to Every Story," Lieutenant Colonel John Nelson writes about a particular experience after becoming one of the first graduates of the University of Foreign Military and Cultural Studies Red Team School. The courses are relatively new, but the concept is quite old, dating back to Napoleonic times. As a graduate, he was asked in December 2008 to head a Red Team examining the views of the people of Iraq, the Government of Iraq, and the enemy itself.

Strengths:

· Provides an independent capability to evaluate concepts, plans, and operations from multiple perspectives.

· Provides an understanding of the opposition through their cultural eyes.

· Provides independent thinkers who are able to travel throughout the Division’s area of responsibility.

Weaknesses:

· Only effective if there is a true understanding of the culture.

· Although it aids in decision making, it is still only a simulation and does not account for independent thinking of the opposition.

Conclusion:

The experience of Lieutenant Colonel Nelson provides insight into how red teaming was used in the war in Iraq; however, without follow-up information there is no way of knowing whether this process actually helped with decision making during the time in which it was used. I searched for follow-up articles but have yet to locate any.

Source:

http://www.military-writers.com/articles/red_team.html

Red-teaming: Improve Your Chances Of Winning The Business

In this article, Dr. Earl R. Smith II talks about using Red Teaming practices to increase a company's chances of winning bids / businesses.

Dr. Smith, a consultant and advisor to many corporate houses, argues for the use of red teaming techniques to evaluate and modify a proposal so that the chances of winning the business are higher. He does not, however, provide any examples in this article of where this has actually succeeded.

Red Teaming a Proposal the Dr Smith Way:

Putting together a Red Team:

Each red-team is tailor-made for the company, the client and the RFP.

According to Dr. Smith, the only reliable way to make sure the proposal is well focused and provides what the client (whose business is sought) requires is to have it reviewed by a truly objective panel that views it through the eyes of the client. This, he says, is the core of the process.

Scheduling the Red Team evaluation:

For a red team to effectively contribute and to allow for maximum benefit to the proposal team, Dr. Smith suggests that the red teaming should be scheduled far ahead of the due date of the proposal.

Evaluating the proposal:

The red team evaluates the proposal as if they are being asked to award the business, looking for weaknesses and strengths and checking to make sure threshold questions are addressed.

If the proposal has serious problems or if they think it is off base, they will question and respond in a way the client normally would.

Improving the proposal:

But unlike a client, they are on board to help improve the chances of winning. So after the evaluation, they also provide a critique that helps the team improve the proposal, value proposition, presentation, and therefore the chances of winning the business.

Some Guidelines for Forming and Running a Red Team Review:

  • Because of their experience, members of red-teams emulate the process and mindset of the clients that the company is going to present to.
  • At least three people serve on each team.
  • They are knowledgeable in the company's space.
  • Team members must have no significant prior connection with the company that is presenting.
  • They must be willing and able to commit the necessary time and attention to the process.
  • Red team members are given at least a week to read the materials to be used in the presentation and do a bit of personal research.
  • Team members must be committed to helping improve the chances of getting the business.

Comments:

The focus of this article seems to be more on providing information about red teaming techniques for prospective clients; hence it offers a rather broad overview of red teaming. Though the author does mention a couple of examples of proposals that failed because they did not go through a red team evaluation, he does not mention any examples where red teaming helped improve or win a bid. The article would have been more compelling if it had included such examples.

The Role and Status of DoD Red Teaming Activities

The Defense Science Board Task Force was charged to examine the use of red teams in the Department of Defense and recommend ways that such teams could be of greater value to the department.

What are Red Teams and Red Teaming?

The purpose of red teaming is to reduce an enterprise's risks and increase its opportunities. Red teaming can be used at multiple levels:
  • Strategic level to challenge assumptions and visions
  • Operational level to challenge force posture or a commander's war plan
  • Tactical level to challenge military units in training or programs in development

Red teaming provides enterprises with:

  • A deeper understanding of potential adversary options and behaviors
  • A hedge against the social comfort of "the accepted assumption or accepted solution"
  • A hedge against inexperience

Areas where red teams can play an important role within DoD:

  • Training
  • Concept development and experimentation
  • Security of complex networks and systems
  • Activities where there are few opportunities to try things out (e.g., nuclear weapons stockpile issues)

The Task Force identifies three types of red teams:

  1. Surrogate adversaries or competitors of the enterprise: The purpose of this red team is to sharpen the enterprise's skills and expose vulnerabilities that adversaries might exploit.
  2. Devil's advocates: This type provides critical analysis in order to anticipate problems and avoid surprises.
  3. Sources of judgment independent of the enterprise's "normal" processes: The objective is often to be a sounding board for the sponsor.

What Makes an Effective Red Team?

Typical causes of red team failure include the following.

The red team:

  • Does not take its assignment seriously
  • Loses its independence and is captured by bureaucracy
  • Is too removed from the decision-making process and becomes marginalized
  • Has inadequate interaction with blue (the program it is challenging)
  • Loses the confidence of the decision maker by leaking its findings to outsiders
  • Fails to capture the culture of the adversary

Attributes of Effective Red Teaming

  • The culture of the enterprise: Red teaming can thrive in an environment that not only tolerates, but values internal criticism and challenges.
  • Top Cover: A red team needs a scope, charter and a relationship that fit the management structure.
  • Robust interaction between the red and blue teams: It is not a win-or-lose game. The objective is to establish a win-win environment in which blue learns from the process and comes out with sharper skills.
  • Careful selection of staff: Many very talented individuals are not suited, temperamentally or motivationally, to be effective red team members.

Observation About Current Red Team Activities

US Navy's SSBN Security Program: Established in the early 1970s to identify potential vulnerabilities that the Soviet Union might exploit to put US SSBNs at risk. The program's focus shifted in the mid-1980s to evaluating and assessing findings from the intelligence community. Recent work has involved terrorist threats and security in ports.

Over decades the program's principles have remained unchanged:

  • Strong and widely acknowledged national purpose
  • Stable funding
  • Highly competent people
  • Access to the details of the target program
  • Independence to criticize
  • Direct accountability to a senior official
  • A strong but not subordinate relationship to the intelligence community

Missile Defense Agency Red Teaming Experience: For almost two decades, the purpose of this program has been to identify, characterize, and mitigate the risks associated with the development and deployment of the missile defense system.

Air Force Red Team Program: It provides assessments of concepts and technology.

  • Provides disciplined approach to guide decision making in technology development
  • Allows warning regarding vulnerability of fielded capabilities
  • Gives insight into defining what sensitive information to protect

The US Army Red Franchise Organization: Established in 1999, it is responsible for defining the operational environment over the next two decades. The operational environment is the intellectual foundation for transforming the Army from a threat-based force to the capabilities-based objective force.

USJFCOM Red Teams: This program has been using red teams for joint concept development and experimentation.

OSD's Defense Adaptive Red Team (DART) Activity: Established in 2001, its mission is to support the development of new joint operational concepts by providing red teaming for JFCOM, the combatant commands, Advanced Concept Technology Demonstrations (ACTD), and the Joint Staff.

Red Teams at the Strategic Level

Red teaming at the strategic level occurs when the entire enterprise is challenged. The role of the red team in such a situation is to:

  • Clarify the degree of urgency of the threat
  • Create alternatives backed by data, assessing feasibility, likely outcome, difficulty of implementation, resources required, likely resistance, and communication needs
  • Gather opposing views
  • Lead discussion toward choice of an acceptable solution
  • Plan implementation

Conclusions

Red teaming has been a valuable but underutilized tool for the Department of Defense. The Defense Science Board Task Force recommends that the red team role be expanded, for two main reasons:

  1. To deepen understanding of the adversaries the US now faces in the war on terrorism, in particular their capabilities and potential responses to US initiatives. Red teaming helps to identify the range of options available to potential adversaries.
  2. To guard against complacency. The US military is attempting to transform itself, and it is necessary to continue transforming our armed forces to deal with committed and adaptive adversaries.

Saturday, April 3, 2010

Modeling Behavior of the Cyber Terrorist

According to the article “Modeling Behavior of the Cyber Terrorist” by Gregg Schundel and Bradley Wood, it is not clear whether the Cyber-Terrorist is real or simply a theoretical class of adversary. However, this work is based on the assumption that the Cyber-Terrorist is a very real potential threat to modern information systems.

The Experiment
In order to red team an unknown, potential adversary, a set of parameters is set for the red team to follow, based on the Defense Advanced Research Projects Agency's (DARPA) understanding of terrorist behavior:
  • The cyber-terrorist is believed to have a level of sophistication somewhere between that of a sophisticated hacker and a foreign intelligence organization.
  • This adversary is assumed to be able to raise funds on the order of hundreds of thousands to a few million dollars, and he is willing to spend these funds to accomplish his mission.
  • This adversary is assumed to be able to acquire all design information on a system of interest.
  • This adversary is assumed to be very risk averse. Premature detection is a serious negative consequence for the cyber-terrorist.
  • This adversary has specific targets or goals in mind when they attack a given system.
  • The adversary will also expend only the minimum amount of resources needed to accomplish their mission.
  • The cyber-terrorist is assumed to be professional, creative, and very clever. They will seek unorthodox and original methods to accomplish their goals.
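The adversary parameters above can be captured as a small data structure that a red team exercise might use to decide which attacks are "in character." This is only an illustrative sketch: the class, field names, budget bounds, and risk threshold are my own, not part of the DARPA study.

```python
from dataclasses import dataclass

@dataclass
class AdversaryProfile:
    """Parameters bounding a simulated cyber-terrorist adversary."""
    sophistication: str       # between a sophisticated hacker and an intelligence org
    budget_usd: tuple         # (low, high) funds the adversary can raise and spend
    has_design_info: bool     # assumed access to full design information on the target
    risk_averse: bool         # premature detection is a serious negative consequence
    targeted: bool            # attacks specific goals, not targets of opportunity

CYBER_TERRORIST = AdversaryProfile(
    sophistication="between sophisticated hacker and foreign intelligence organization",
    budget_usd=(100_000, 5_000_000),
    has_design_info=True,
    risk_averse=True,
    targeted=True,
)

def attack_permitted(profile, cost_usd, detection_risk, risk_threshold=0.2):
    """A risk-averse adversary abandons attacks whose detection risk exceeds
    its threshold or whose cost exceeds its funds (mirroring the red team's
    decision to give up rather than mount an overly risky attack)."""
    if profile.risk_averse and detection_risk > risk_threshold:
        return False
    return cost_usd <= profile.budget_usd[1]
```

For example, `attack_permitted(CYBER_TERRORIST, 50_000, 0.5)` returns `False`: even an affordable attack is rejected when the detection risk is too high for a risk-averse adversary.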

Findings
The Information Design Assurance Red Team (IDART) spent most of its time gathering intelligence on the target system. Their results were considered successful only if the team met their objectives and preserved stealth. In this study the red team followed the same basic process repeatedly, and gave up rather than mount an attack whose risk was too high.

Conclusion
DARPA’s experience suggests some improvements to the process that they are using to model the cyber-terrorist adversary including the use of additional red teams, improving the scientific method used to record and test red team behavior, incorporating verified terrorist behavior, war-gaming cyber terrorist scenarios, and improving the library of possible approaches to difficult threats.

Red Teaming Experiments with Deception Technologies

In their study "Red Teaming Experiments with Deception Technologies," Fred Cohen et al. conducted a series of 30 experimental assessments of specific deceptive methods used against human attackers, in order to understand the role of deception in information processes.

The Experiment
In total, 5 experimental runs of 4 hours each were conducted on each of 6 exercises. This represents 30 runs, including deception "on" and deception "off" control groups (6 runs each) and random "on"/"off" mixes (18 runs).

Each run was preceded by a standard briefing and a run-specific briefing and followed by filling out of standard assessment forms, both individually by all team members and as a group. The exercises were of increasing intensity and difficulty so as to keep the participants challenged. Feedback was provided and varying amounts of information were disclosed to teams during the course of the experiment.
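One way to read the design above (5 runs on each of 6 exercises; 6 all-on controls, 6 all-off controls, 18 randomized) is one "on" control, one "off" control, and three randomized runs per exercise. The sketch below generates such a schedule; the per-exercise breakdown is my interpretation, not stated explicitly in the paper.

```python
import random

EXERCISES = 6
RUN_HOURS = 4

def build_schedule(seed=0):
    """Per exercise: one deception-on control, one deception-off control,
    and three randomized runs, giving 6 + 6 + 18 = 30 runs in total."""
    rng = random.Random(seed)
    schedule = []
    for exercise in range(1, EXERCISES + 1):
        conditions = ["on", "off"] + [rng.choice(["on", "off"]) for _ in range(3)]
        for run, deception in enumerate(conditions, start=1):
            schedule.append({"exercise": exercise, "run": run,
                             "deception": deception, "hours": RUN_HOURS})
    return schedule

runs = build_schedule()
```

This yields 30 four-hour runs (120 experiment-hours), with exactly one "on" and one "off" control per exercise.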

The participants were students ranging in age from 16 to 38, all in computer-related fields, all with excellent grade point averages, all US citizens, all interested in information protection, and all participating in an intensive program of study and research in this area.

Findings
The use of red teams in simulating the effectiveness of deception methods on human network attackers revealed several interesting results:
  • Teams that were not working in a deceptive environment, but did not know this, engaged in self-deception which hindered their progress. The study concluded that the mere threat of deception offers some protection against attackers.
  • Teams unknowingly operating under deceptive conditions who followed a deception to its logical end gave up on the problem before the allotted time expired because they believed they had finished correctly.
  • Teams operating in a deceptive environment, even after being educated on the deceptive techniques being employed, were rarely able to move more rapidly past the deceptions, and often followed the same deceptive route they had learned in previous experiments.
  • Teams continually subjected to deception became disheartened: only 3 of the original 15 participants working under deception finished the study, while 8 of 12 participants not working under deceptive conditions finished.
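The attrition figures in the last finding translate into starkly different completion rates, a quick check of how strongly sustained deception discouraged participants (the variable names are mine, not the study's):

```python
# Completion counts reported in the study's findings
deception_finished, deception_total = 3, 15   # participants under deception
control_finished, control_total = 8, 12       # participants not under deception

deception_rate = deception_finished / deception_total  # 0.20 (20%)
control_rate = control_finished / control_total        # about 0.67 (67%)
```

Roughly one in five deceived participants finished, versus two in three in the control condition, which supports the conclusion that sustained deception disheartens attackers.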

Conclusion
The net objective of combined deceptions is that attackers spend more time going down deception paths rather than real paths, that the deception paths become increasingly indistinguishable from real ones to the attackers, and that the defenders gain time, insight, data, and control over the attackers while reducing defensive costs and improving outcomes. Content-oriented deception can be an effective deterrent against network attackers, and deception capabilities should be improved to combat highly skilled, long-term network threats.

Friday, April 2, 2010

The "Red Team": Forging a Well-Conceived Contingency Plan

According to the article, "The Red Team: Forging a Well-Conceived Contingency Plan," a Red Team is described as "a group of subject-matter experts (SMEs), with various, appropriate air and space disciplinary backgrounds, that provides an independent peer review of products and processes, acts as a devil’s advocate, and knowledgeably role-plays the enemy and outside agencies, using an iterative, interactive process during operations planning." This article shows how US Air Forces in Europe (USAFE) worked to set up a Red Team to simulate contact with different enemies throughout the planning of Operation Allied Force in 1998–99.


Strengths:


· Yields a closely synchronized planning staff.

· Drives a more complete analysis at all phases.

· Reveals overlooked planning opportunities.

· Extrapolates unexplored strategic implications.


Weaknesses:


· Loses effectiveness without Blue planners’ acceptance of Red as a value-adding group.

· Trust of the Blue Team is key to the success of the exercise.

· Only a simulation, and not always an accurate representation of the enemy’s decision making.


How-To:


1. Create a Red Team composed of subject-matter experts external to the Blue Team’s sources.

2. Preparation by the Red Team. Team members should immerse themselves in learning everything they can about what has gone before in the crisis at hand and what the enemy and other adversaries are thinking. (Perhaps by creating a checklist of the information that the team needs to know.)

3. Meeting between the Red Team and Blue planners to explain critical points of the Red Team’s purpose, in order to alleviate friction.

4. Conduct a Mission Analysis Seminar.

5. Participate in a Course of Action (COA) War Game.

6. Select a COA and conduct a Plan War Game based on it, in order to create an actual Operations Order (OPORD).

7. Conduct a Mission Rehearsal.



Conclusion:


USAFE’s Red Teaming efforts continue to evolve, and the focus of the Red Team evolves with them. The team helps planners refine their plans each time contact with the enemy is simulated because, as the saying goes, “no plan survives contact with the enemy.”

Thursday, April 1, 2010

Red Teaming for Law Enforcement

According to Michael K. Meehan, a captain in the Seattle Police Department, law enforcement needs to borrow useful training techniques from the military and private industry as it improves its terrorism countermeasures. One of these techniques is known as red teaming. The military uses red teams in war-gaming scenarios as the opposing force in a simulated conflict to reveal weaknesses. In the business world, red teaming refers to an independent peer review of abilities, vulnerabilities, and limitations.

Scope of Red Teaming

Red teaming is an interactive process conducted during crisis action planning to assess planning decisions, assumptions, processes, and products from the perspective of friendly, enemy, and outside organizations. Red teaming has also been described as the "capability-based analytical or physical manifestation of an adversary, which serves as an opposing force." Red teams evaluate a target or tactic, but not the likelihood that a particular target will be attacked.

Analytical Red Teaming

Analytical red teams portray an adversary but are not involved in actual field play. Analytical red teaming adds value to discussion based exercises and can range from basic peer review to near-real-time force-on-force interaction, as in games or simulations. During analytical red teaming, participants analyze the attack plans and look for indicators and warnings, key decision points, and vulnerabilities in the plan.

Physical Red Teaming

Physical red teaming involves individuals portraying actual, realistic adversary moves and countermoves in an exercise.

Benefits of Red Teaming

Successful red teaming offers a hedge against surprise and inexperience and a guard against complacency. Effective red teaming can define thresholds of detection, suspicion, and action.

Impediments to Effective Red Teaming

Impediments to effective red teaming fall into two categories:

  • Situational: selection and training of members, and exercise conditions.
  • Organizational: organizationally imposed constraints, and the distribution and reception of lessons learned. Red team success requires an organizational culture that values constructive criticism.

Methodology for Using Red Teaming in Exercise

  • Determine the objective or desired result
  • Communicate with government or private partners
  • Determine the scale or type of exercise
  • Develop the scenario
  • Identify and train the appropriate participants
  • Conduct and evaluate the exercise
  • Prepare documentation
  • Evaluate the performance
  • Develop the improvement plan
  • Make required and desired improvements
  • Exercise again
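Because the final step, "Exercise again," loops back to the start, the methodology above is naturally a repeating cycle. The sketch below models it as an ordered list walked repeatedly; the function and parameter names are illustrative, not from Meehan's article.

```python
EXERCISE_CYCLE = [
    "Determine the objective or desired result",
    "Communicate with government or private partners",
    "Determine the scale or type of exercise",
    "Develop the scenario",
    "Identify and train the appropriate participants",
    "Conduct and evaluate the exercise",
    "Prepare documentation",
    "Evaluate the performance",
    "Develop the improvement plan",
    "Make required and desired improvements",
]

def run_exercise_program(execute_step, cycles=2):
    """Walk the methodology in order; the final step, 'Exercise again,'
    is modeled as repeating the whole cycle."""
    log = []
    for cycle in range(1, cycles + 1):
        for step in EXERCISE_CYCLE:
            execute_step(cycle, step)     # caller performs the actual work
            log.append((cycle, step))
    return log

log = run_exercise_program(lambda cycle, step: None)
```

Representing the cycle as data rather than code makes it easy to audit that no step, especially the improvement plan, is skipped between iterations.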

Limitations to Red Teaming

There is not enough information to predict all possible means of attack. Typically, red team exercises are based on prior events and are less likely to anticipate new, unplanned, or never-before-seen events. Red teaming plans and procedures need to be stressed and, once stressed, must evolve and improve.

Reflections From A Red Team Leader

Introduction

Ms. Susan Craig is a red team analyst at the Joint Intelligence Operations Center at U.S. Pacific Command. In her article, she identifies the purpose of red teaming as a technique that questions our assumptions and allows us to think like the enemy. She highlights the skills and attributes (based on her personal red teaming experiences) that make an effective red team and red team leader. These include, but are not limited to, critical and creative thinking skills, cultural awareness, effective communication skills, and group diversity.

Strengths
  • Develops a more accurate frame of reference of the operational environment
  • Causes consideration of the way dependent variables influence each other within that environment
  • Thinking within the construct of the enemy's culture helps mitigate underestimating the enemy
  • The mindset and skills needed to be a "red teamer" can be applied outside of red teaming for better conceptualization and analysis
  • Creates opportunity by recognizing cultural myths and the ways they can be exploited to gain operational/psychological advantage

Weaknesses
  • Dependent upon depth of knowledge in the following areas: physical environment, nature and stability of state, sociological demographics, regional and global relationships, military capabilities, information, technology, external organizations, national will, time, economics and culture. If these are not well understood, the exercise may be prone to mirror-imaging.
  • Because it attempts to conceptualize a complex environment, the full extent of actions may not be considered.

Conclusion

Craig argues that red teaming is different from intelligence analysis in the following ways:
  1. The red team is not bounded by the construct/plan developed by the staff or by the need for evidence and corroboration.
  2. The red teamer is more like a historian (whose job is to ask big, broad questions) than an intelligence analyst (whose job is often to answer very specific, narrow questions).
  3. The red team's job goes beyond understanding the environment to include understanding how we can shape it.


Red teaming is used in intelligence analysis, as well as in the operational environment, because bias can impede the quality and value of analytic products just as it can impede decision making. Utilizing this technique to inhibit bias and to improve critical thinking skills, cultural awareness, and communication is critical for understanding the operational environment and determining causality within it.

Source

This article came from Military Review, March-April 2007.

Accessed via the Red Team Journal

http://redteamjournal.com/resources/