Tuesday, April 13, 2010

Spatial Information Technologies in Critical Infrastructure Protection

Summary
This document, prepared by the National Consortia for Remote Sensing in Transportation (NCRST), addresses the role of remote sensing technologies in protecting critical transportation infrastructure. The document discusses the varying concepts used to define critical infrastructure, threats to that infrastructure, infrastructure protection, and disaster management needs. It then discusses the critical role of remote sensing technologies: while they have become a vital tool in critical infrastructure protection, the systems are limited by their users and by the current maturity of the technology.

Strengths
Remote sensing provides low-cost, multi-purpose, wide-area "at-a-distance" data that can assist in modeling and in identifying critical infrastructure. It can also provide powerful visualizations to support policy-making.

Limitations
While remote sensing data is useful, it can be rendered unusable by the time and expertise required to convert raw data into usable form. Some imagery is also sensitive in nature, which restricts its use. Additional concerns stem from the limited interoperability of data sets and systems.

Source: http://www.ncgia.ucsb.edu/ncrst/research/cip/CIPAgenda.pdf

High-Resolution Satellite Imagery and the Conflict in Sri Lanka

In May 2009, the Science and Human Rights Program of the American Association for the Advancement of Science (AAAS) acquired and analyzed commercial high-resolution satellite imagery of the Civilian Safety Zone (CSZ) and surrounding area in northeastern Sri Lanka. The project was done at the request of Human Rights Watch and Amnesty International, who expressed concern over the status and safety of civilians during the heavy fighting of 9-10 May 2009. Comparing the May 6 and May 10, 2009 images of the CSZ, AAAS found that a significant number of IDP shelters had been removed. In addition, the imagery showed evidence of shell craters, destroyed permanent structures, mortar positions, and 1,346 individual graves. AAAS’s analysis was based on images from various publicly accessible commercial satellites, US Army field manuals, and open-source information from public statements and media reports.

Strengths - Satellite imagery analysis is a useful way to assess the situation on the ground during conflicts in which no outside parties are allowed in the area.

Weaknesses – None noted.

Source: http://shr.aaas.org/geotech/srilanka/srilanka.shtml

Landsat Satellite Images Change Detection Methods

Minnesota's Department of Natural Resources (DNR) website depicts its use of Landsat Thematic Mapper (TM) images to map Minnesota land cover. Landsat TM images are digital but differ from traditional digital images in their ability to record measures of brightness beyond simple RGB: Landsat records four additional bands of brightness from the near-, middle-, and thermal-infrared portions of the electromagnetic spectrum. Landsat images are useful for showing changes between two images of the same area taken years apart. Image differencing, subtracting the original image from the new image to identify changes, is one of the simplest change detection techniques in remote sensing.

Preparatory Steps

Both images should show the same season and must be accurately registered (matched up) to the ground and to each other. Remove areas with clouds and distinguish between forest and nonforest areas. The images must be radiometrically calibrated to minimize effects of instrument variations and atmospheric haze.

Analysis

Values in the difference image rendered red or orange show loss of vegetation, while greens depict vegetation growth. The example assumes a simple bell curve in which minor changes in vegetation account for 80% of the total and significant changes fall on either side of that distribution.
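The differencing and thresholding described above can be sketched in a few lines. As an illustration (not the DNR's exact procedure), if the central 80% of a roughly normal difference distribution is treated as "no significant change," the cutoffs fall near the mean ± 1.28 standard deviations:

```python
import numpy as np

# Tiny synthetic stand-ins for two co-registered, calibrated images of the
# same area; real Landsat TM scenes would be large multi-band rasters.
old = np.array([[50, 60, 70], [80, 90, 100], [40, 55, 65]], dtype=float)
new = np.array([[48, 30, 72], [82, 95, 140], [41, 54, 66]], dtype=float)

diff = new - old  # image differencing: subtract the original from the new image

# Treat the central 80% of the distribution as "no significant change";
# for a normal distribution that is roughly mean +/- 1.28 standard deviations.
mean, std = diff.mean(), diff.std()
lo, hi = mean - 1.28 * std, mean + 1.28 * std

# Assumption for this sketch: a large drop in the band means vegetation loss.
loss = diff < lo   # pixels that would be rendered red/orange
gain = diff > hi   # pixels that would be rendered green
```

With these synthetic values, one pixel falls below the lower threshold and one above the upper one; everything else lands in the 80% no-change band.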

Strengths

  • Relatively easy to perform
  • Objective in nature

Weaknesses

  • Chance for error if preparatory steps not followed
  • Does not account for other factors of vegetation change between the images
  • Unlike manual image interpretation, it does not allow comparing analysis units to determine which has shown the most vegetation loss, since the statistical thresholds assign roughly the same amount of change and non-change regardless of what has actually happened in the area.

Source: http://www.ra.dnr.state.mn.us/changeview/change_tech.html

Monday, April 12, 2010

Tsunami Satellite Image Analysis Reveals Dramatic Water Quality Changes

Satellite Imagery Analysis

Summary:
An article in XPress Press, via Applied Analysis Inc., describes how satellite imagery analysis was used to assess water quality after the December 2004 tsunami that struck Sri Lanka and India. Applied Analysis Inc., an American company, used analysis processes originally designed for military use to determine the clarity of the water. The company applied its analysis to imagery from the IKONOS satellite and believes the technology it used will soon be able to identify other problems with water supplies.

Strengths:
  • Can be applied to multiple types of problems/issues.
  • Continually upgraded technology.
  • Produces relatively clear, visual data.
Weaknesses:
  • None noted.

Link

http://www.xpresspress.com/news/AppliedAnalysis_011305.html

Satellite Image Analysis Reveals South Ossetian Damage

Satellite Imagery Analysis

Summary:
An article from ScienceDaily on October 9, 2008 discusses the use of satellite imagery analysis to assess the damage done in South Ossetia during the Russian-Georgian conflict. The article draws on analysis done by the American Association for the Advancement of Science (AAAS) at the request of Amnesty International USA. The AAAS study examined damage to 24 villages near Tskhinvali, the main city in South Ossetia, comparing the damage to structures visible on August 10 with that visible on August 19. It determined that Tskhinvali sustained the greatest amount of damage (182 structures) between those dates. The details of how the buildings appeared to have been destroyed corroborated reports from the ground that fires were the main cause. The study combined eyewitness accounts of destruction with objectively interpreted satellite images purchased from three commercial vendors: GeoEye, DigitalGlobe and ImageSat International. The analysts also used the software packages ERDAS Imagine and ArcView to leverage the power of remote sensing technology and Geographic Information Systems (GIS), respectively.

Strengths:
  • Used to corroborate reports on the ground.
  • Can be very detailed.
  • Easy to compare images from different dates.

Weaknesses:
  • Requires trained professionals.
  • Software can be expensive.

Source:

http://www.sciencedaily.com/releases/2008/10/081009144105.htm

Sunday, April 11, 2010

A Comparison of Aerial Photography, Landsat TM and SPOT Satellite Imagery

A study was done on the tropical wetland environments of northern Australia, where population growth and environmental problems are encroaching on and threatening the wetlands. Because of the remoteness and fragility of the area, remote sensing was an appealing method for obtaining information about the land. The purpose of the study was to investigate the utility of several image data sets to see which type of image provided the best resolution for mapping the environment.

Method:
1.) In the study, Landsat TM satellite images, SPOT XS and PAN remote sensing satellite images, and large-scale, true-color aerial photography were evaluated for mapping the vegetation.

2.) Five sample points were placed at 1 km intervals. Each point was labeled and information about the location was recorded. To ensure the correct cover types were recorded, both photographic and written records were collected at each site. This procedure provided 12 observations per sample point and a total sample size of 240 observations for the entire swamp.

3.) Landsat TM, SPOT, and large-scale photography were used for each location.

4.) Images were evaluated.

Results:
The study concluded that aerial photography was superior to satellite imagery for detailed mapping of the vegetation in the environment studied.

The results suggest that either Landsat TM or SPOT XS imagery is adequate for mapping these generalized land cover classes. But the resolution needed for the classification of the vegetation was acquired through the aerial imagery. When comparing the two satellite imagery data sets, the broader spectral range of the Landsat TM data appeared to more than compensate for the superior spatial resolution of the SPOT imagery.

The researchers note that the most useful technique will depend on the application and is closely related to the physical characteristics of the features being mapped.

Study can be accessed through EBSCOhost:
Vegetation mapping of a tropical freshwater swamp in the Northern Territory, Australia: a comparison of aerial photography, Landsat TM and SPOT satellite imagery by K. R. HARVEY and G. J. E. HILL


Geographic Business Intelligence

The geographic-search company MetaCarta, which was originally funded by the Department of Defense and others in the government sector, is now directing its technology at businesses. MetaCarta introduced an enterprise feature for GIS that organizes data geographically and allows visualization of both structured and unstructured data. This lets the user search a database using a map as a filter. Claudine Bianchi, vice president of marketing at MetaCarta, states that 80% of corporations' unstructured data has a geographic reference, whether an exact location or some other type of point of reference.

MetaCarta is targeting specific industries where geographic information is a critical part of the business, such as critical infrastructure, education, and fraud detection.

MetaCarta cites this example of its use in the business sector: “Before an oil company commits to spending millions of dollars to drill a new well, geographic intelligence could be used to comb silos of information about that specific location, illustrating on a map whether the area has been drilled previously, when the lease expires, and other relevant trends about the site.”

MetaCarta's GIS technology comprises three components:

Geographic Text Search - combines text and geographic search capabilities for structured and unstructured data that can then be displayed on a map

GeoTagger - provides XML metatagging of geographic sites so they can be stored and indexed

Geographic Data Modules - a collection of places, names and coordinates with assigned significance values and natural-language processing based on business or topic
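MetaCarta's actual interfaces are not shown in the article, but the map-as-filter idea can be sketched in a few lines: documents carry extracted geotags (in the spirit of GeoTagger), and a bounding box drawn on a map combines with a keyword to select matching records. All names and data below are illustrative assumptions, not MetaCarta's API.

```python
# Hypothetical document store: each record has free text plus a geotag
# (latitude/longitude) that a geotagging step has already extracted.
docs = [
    {"text": "Well drilled near Houston",  "lat": 29.76, "lon": -95.37},
    {"text": "Lease expiring in Calgary",  "lat": 51.05, "lon": -114.07},
    {"text": "Pipeline survey, Houston",   "lat": 29.80, "lon": -95.40},
]

def geo_search(docs, keyword, south, north, west, east):
    """Combine a text search with a geographic bounding-box filter."""
    return [d for d in docs
            if keyword.lower() in d["text"].lower()
            and south <= d["lat"] <= north
            and west <= d["lon"] <= east]

# All documents mentioning "Houston" inside a box over southeast Texas:
hits = geo_search(docs, "houston", 29.0, 30.5, -96.0, -95.0)
```

A real system would replace the linear scan with a spatial index, but the map-as-filter behavior is the same: the box narrows the text search to records whose geographic reference falls inside it.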

Damage Detection From High Resolution Satellite Images For The 2003 Boumerdes, Algeria Earthquake

Application of Satellite Imagery Analysis:

The researchers used imagery from a high-resolution commercial satellite, QuickBird, to assess the damage to buildings that resulted from the magnitude 6.8 earthquake that struck the coast of Algeria on May 21, 2003. The study focused on the two cities of Boumerdes and Zemmouri. Using photographs from before and after the event, researchers coded buildings on a scale of 1 to 5 based on the level of destruction, with higher numbers indicating greater damage. They also noted the locations of tents constructed to house refugees. The study demonstrates the efficacy of remote damage assessment in the field of disaster management.

Strengths:
  • Satellite photos cover a large area at one time
  • More manageable than on-the-ground classification efforts
Weaknesses:
  • Hard to compare images taken from different angles and at different resolutions
  • Shadows can affect interpretation
Results:

The study revealed that destruction classifications based on two images (from before and after the event) were more stable and consistent across five raters than those based only on the post-earthquake photo. The precision level was 80%, meaning the raters agreed about the level of damage for 80% of the buildings. Similarly, the number of damaged buildings identified was greater using both sets of photos than using only those depicting the earthquake's aftermath. Using both sets of images was most important for buildings with low-grade damage.
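The 80% precision figure can be reproduced with a toy calculation: precision here is the fraction of buildings on which all five raters assigned the same 1-5 damage grade. The ratings below are invented for illustration, not taken from the study.

```python
# Hypothetical grades: building id -> damage grades from five raters.
ratings = {
    "b1": [3, 3, 3, 3, 3],
    "b2": [2, 2, 2, 2, 2],
    "b3": [4, 4, 3, 4, 4],  # one dissenting rater -> no full agreement
    "b4": [1, 1, 1, 1, 1],
    "b5": [5, 5, 5, 5, 5],
}

# A building counts as "agreed" when all raters gave the same grade.
agree = sum(1 for grades in ratings.values() if len(set(grades)) == 1)
precision = agree / len(ratings)  # 4 of 5 buildings -> 0.8
```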

The authors also compared their satellite-aided results to those from a team of Algerian engineers classifying building damage on the ground in Boumerdes. The comparison indicated that overall, people using satellite images to assess damage tended to underestimate the degree of structural damage. The authors did not quantify this effect, but noted that it warrants future study.

Source:
http://staff.aist.go.jp/m.matsuoka/others/13wcee_yamazaki_kouchi.pdf

Friday, April 9, 2010

Using A Time Series Of Satellite Imagery To Detect Land Use And Land Cover Changes In The Atlanta, Georgia Metropolitan Area

Application of Satellite Imagery Analysis:

This article details a study in which researchers used a series of Landsat MSS and Landsat TM images taken between 1973 and 1998 to assess the impact of land use/cover change on temperature and air quality in Atlanta. The study indicated disturbing trends of deforestation and urban sprawl, but it also yielded worthwhile information about the use of satellite image interpretation.

Strengths:

  • Provides an excellent overview of a region
  • Allows a retrospective view
  • Can provide highly accurate results

Weaknesses:

  • Relies on interpretation
  • Limited by resolution, image quality, atmospheric haze, and contrast

How to (Steps):

  1. Geometric Rectification - align landmarks (necessary for analysis over time)
  2. Relative Radiometric Normalization (RRN) - necessary when images were taken by different sensors, to standardize radiometry, or contrast, within an image
  3. Design of Classification Scheme - decide on classification categories, e.g., cultivated land or grassland
  4. Image Classification - divide the photograph into "clusters", spatial areas with the same characteristics, then decide into which category each should be placed
  5. Spatial Reclassification - an effort to reduce classification errors, involving techniques such as having a human check portions of computer-classified work
  6. Accuracy Assessment - compare classification results to what the ground areas truly are, as revealed by aerial photographs
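Step 6, accuracy assessment, is often summarized as per-class "user's accuracy": of the pixels assigned to a class, how many match the reference data. A minimal sketch with invented labels (not the study's data):

```python
# Classified pixels versus reference ("ground truth") labels for the
# same locations; both lists are invented for illustration.
classified = ["forest", "forest", "urban", "urban", "water", "forest"]
reference  = ["forest", "urban",  "urban", "urban", "water", "forest"]

def users_accuracy(classified, reference, category):
    """Fraction of pixels assigned to `category` that truly belong to it."""
    assigned = [(c, r) for c, r in zip(classified, reference) if c == category]
    if not assigned:
        return None
    correct = sum(1 for c, r in assigned if c == r)
    return correct / len(assigned)

acc = users_accuracy(classified, reference, "forest")  # 2 of 3 correct
```

The 80%+ figures in the study below are per-category values of exactly this kind of ratio, computed over far more pixels.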

Accuracy of Method:

The study reported user accuracy above 80% in five of six land-type classification categories; the accuracy for the sixth was in the high seventies. Landsat TM images, which have a higher resolution, yielded slightly more accurate results than the Landsat MSS images, especially for high-density urban, low-density urban, and forest areas, where user accuracy was 96%, 98%, and 98% respectively.

Source:

http://vk.cs.umn.edu/sboriah/LandCoverBib/YangYL2002.pdf

Thursday, April 8, 2010

Green Team Summary of Findings: Red Teaming (4 out of 5 stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 16 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 8 April 2010 regarding Red Teaming specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description

Red Teaming is an interactive process conducted during crisis action planning to assess planning decisions, assumptions, processes, and products from the perspective of friendly, enemy, and outside organizations. Red Teaming has also been described as the "capability-based analytical or physical manifestation of an adversary, which serves as an opposing force."

Red Teams evaluate a target or tactic, but not the likelihood that a particular target will be attacked. Successful red teaming offers a hedge against surprise and inexperience and a guard against complacency. Effective red teaming can define the threshold of detection, suspicion, and action.

Red Teaming is the only type of alternative analysis method that is mandated by Congress. SEC. 1017 of the Intelligence Reform and Terrorism Prevention Act of 2004 states that "Not later than 180 days after the effective date of this Act, the Director of National Intelligence shall establish a process and assign an individual or entity the responsibility for ensuring that, as appropriate, elements of the intelligence community conduct alternative analysis (commonly referred to as ‘‘red-team analysis’’) of the information and conclusions in intelligence products." (Source: http://www.nctc.gov/docs/pl108_458.pdf )


How to:

1) Determine the objective or desired result
2) Communicate with government, private partners, or other stakeholders in the process
3) Determine the scale or type of exercise and develop the scenario
4) Create a Red team composed of Subject Matter Experts, external to the Blue team’s sources.
5) Preparation by the Red Team. Team members should immerse themselves in learning everything they can about what has gone before in the crisis at hand and what the enemy and other adversaries are thinking. (Perhaps by creating a checklist of the information that the team needs to know.)
6) Meeting between the Red Team and Blue planners to explain critical points of the Red Team’s purpose, in order to alleviate friction.
7) Conduct and evaluate the exercise
8) Prepare documentation
9) Evaluate the performance
10) Develop the improvement plan
11) Make required and desired improvements
12) Exercise again


Tips for Success:
  • At least three people serve on each team
  • Team members must have no significant prior connection with the company that is presenting.
  • Devote the necessary time and attention to the process.
  • Red team members should be given at least a week to read the materials to be used in the presentation and do a bit of personal research.

Strengths and Weaknesses

Strengths
  • Lets players consider the system as a whole
  • Reduce Risk
  • Avoid Predictable Patterns
  • Preclude mirror-imaging
  • Perturb the organization
  • Overcome bias
  • Improve adaptability and flexibility
  • Yields a closely synchronized planning staff
  • Reveal overlooked planning opportunities
  • Provides confidence to the Blue team
  • Provides an independent capability to evaluate concepts, plans, and operations from multiple different perspectives
  • Provides an understanding of the opposition through their eyes

Weaknesses
  • If not everyone agrees on the value of the exercise, it can become ineffective
  • The process may lose its independence and be “captured” by the bureaucracy
  • Could be too removed from the decision-making process and become marginalized
  • The team may destroy the integrity of the process and lose the confidence of decision-makers by “leaking” its findings to outsiders
  • The exercise's success requires the trust of the Blue Team
  • It is only a simulation and is not always an accurate representation of the enemy's decision making
  • The process is only effective if there is a true understanding of the opponent
  • The process does not account for independent thinking of the opposition
  • It can take time and effort to step back and view the system like an outsider, or even an insider who intends to harm

Personal Application of Technique

The class was divided into a Red Team, a Blue Team, and two referees. The Red Team (offense) had 8 members and the Blue Team had 4. Red had 1 safety card and Blue had 2; a safety card protected a member who came within arm's length of an opponent, which was otherwise how a player was put "out." The simulation ended when either the Red Team captured the VIP or the Blue Team successfully defended until the end of the allotted time. There were rules governing movement: turns only at 90-degree angles, movement in straight lines, and a minimum of 5 and a maximum of 10 steps per move. The main portion of the application was spent planning each player's movements, because all movements had to be written down before the simulation began. At the end of ten minutes, both teams met and enacted their plans, moving simultaneously. After the simulation (the Red Team captured the VIP within four moves), all groups met to discuss what both sides could have done differently and what either team would be more likely to do.


Summary of Findings (White Team): Red Teaming (4 out of 5 Stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 16 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 8 April 2010 regarding Red Teaming specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Description:
Red Teaming is a broad range technique that covers both methods and modifiers depending on its utility and depth. Congress mandates the use of Red Teaming in National Security fields as referenced in the Intelligence Reform and Terrorism Prevention Act of 2004 SEC 1017, "Not later than 180 days after the effective date of this Act, the Director of National Intelligence shall establish a process and assign an individual or entity the responsibility for ensuring that, as appropriate, elements of the intelligence community conduct alternative analysis (commonly referred to as ‘‘red-team analysis’’) of the information and conclusions in intelligence products." (http://www.nctc.gov/docs/pl108_458.pdf) However, its use extends to law enforcement and competitive intelligence fields as well.

Based on current research and articles, there is no universal definition. Red Teaming can cover ideas as simple as playing devil's advocate and as complex as full-scale war simulations conducted for the United States military at the National Training Center at Fort Irwin.

Strengths:
  • Applicable to National Security, Law Enforcement, and Business sectors.
  • Reduces risk by a means of internal auditing
  • Precludes mirror-imaging
  • Mitigates surprise
  • Avoids predictable patterns
  • Helps overcome bias
  • Improves adaptability and flexibility
  • Helps players view the system as a whole, as well as its individual components
  • Identifies decision maker choices for strategic players
  • Helps prevent bad investments- time, effort, money, resources
  • Improves the quality of questions asked about particular situations
  • Provides "awareness training" and improves safeguards of a system, particularly in an IT or computer networking situation
  • Challenges taboos and assumptions
  • Reveals the consequences of different perspectives, in particular the perspectives of those with different goals and risk profiles
The technique's strength depends on the team compiled: its composition, goals, management support, relationship with the Blue Team, rules of engagement, and available information.

Weaknesses:
  • There is not one agreed upon definition
  • Full extent of an opponent's actions may not be considered
  • Red team may not take their responsibilities seriously
  • Could lose its independence and be “captured” by the bureaucracy
  • Red Teamers may not be allowed to act outside of Blue Team norms
  • Suggestions of Red Team may not be incorporated into the organizational structure without proper follow-up
  • Members of the Red Team may not be able to access the same knowledge as the real attackers
  • Red Team may not accurately represent real opponent's decision making process
How To:
  • Determine the objective or desired result.
  • Communicate with stakeholders involved in the exercise, including management and decision makers, on the scope, scale, and type of exercise.
  • Based on the exercise, create a Red team composed of Subject Matter Experts, external to the Blue team’s sources.
  • Preparation by the Red Team. Team members should learn everything they can about what has gone before in the crisis at hand, the blue team's plan and what the enemy and other adversaries may be thinking. (Perhaps by creating a checklist of the information that the team needs to know.)
  • Meeting between the Red Team and Blue planners to explain critical points of the Red Team’s purpose, in order to alleviate friction.
  • The Red Team creates a plan / Course of Action (CoA).
  • An exercise / simulation is conducted (e.g., a war game).
  • The exercise is evaluated and improvements are identified.
  • The required and desired improvements are incorporated.
  • Exercise and evaluate again until the desired objective is reached.

Application:
Our class played a game with 4 players comprising the Blue Team, 8 players comprising the Red Team, and 2 referees to enforce the rules. It was similar to "Capture the Flag": the Blue Team's goal was to defend an object and the Red Team's goal was to capture it. We conducted the exercise within the confines of our department's building. The rules were as follows: both teams had to remain within the building; before the game began, both teams were required to create a plan of attack and could not deviate from it once play began; the teams could divide their members and start from any of the four entrances to the building; once play began, each player had to move at least 5 steps but no more than 10 steps per move (a step defined as heel-to-toe); players had to move in straight lines or turn at 90-degree angles; and if two players came within arm's length of each other, both were eliminated unless one revealed a safety card, which protected its owner from elimination one time only.
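The movement rules lend themselves to a simple formal check. A minimal sketch, with the direction-vector representation as an assumption (the class exercise was played on foot, not in code): each move must be 5-10 steps, either straight ahead or at a 90-degree turn.

```python
# Directions are unit vectors on a grid, e.g. (1, 0) = east, (0, 1) = north.
def legal_move(prev_dir, new_dir, steps):
    """Check one move against the exercise's rules: 5-10 steps,
    continuing straight or turning at a right angle."""
    if not 5 <= steps <= 10:
        return False
    # Dot product of unit vectors: 1 = straight ahead, 0 = 90-degree turn,
    # -1 = reversal (neither straight nor a right angle, so illegal here).
    dot = prev_dir[0] * new_dir[0] + prev_dir[1] * new_dir[1]
    return dot in (0, 1)

legal_move((1, 0), (0, 1), 7)   # 90-degree turn, 7 steps -> legal
legal_move((1, 0), (-1, 0), 6)  # reversal -> illegal
legal_move((1, 0), (1, 0), 4)   # too few steps -> illegal
```

A referee (or a planning tool) could run every pre-written move through a check like this before play begins, since plans could not be changed once the game started.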

Here is how the game played out: the Blue Team, having only four members, took their position surrounding the object and worked their way outwards trying to cover each entrance to the building, but that strategy proved ineffective given the Red Team's strategy to overwhelm one entrance and use the other three as decoys. Therefore, the Blue Team only had one player to defend against five Red Team players. Obviously, once the Red Team eliminated the lone Blue Team defender, they easily won the game.

In our debrief, it was obvious that the Red Team would be victorious given their advantage in the amount of players they had. It was possible for the Blue Team to prolong the game, but eventually they would be overwhelmed by the Red Team. However, an interesting aspect came up: just because the Red Team won, does that mean that they do not need to alter their strategy for the future? It's a question that we believe should be asked when performing Red Teaming exercises.

Tuesday, April 6, 2010

Seeing Red: Creating a Red-Team Capability for the Blue Force

According to Colonel Gregory Fontenot, US Army retired, red-teaming is vital for understanding the ways enemies will fight. In his article, he explains that red-teaming has been used since the 19th century as a tool (combined with war-gaming) and has continued to improve decision making when confronting and responding to adaptive enemies. After the 2003 Defense Science Board (DSB) study on red-teaming commended the added value of such exercises, various initiatives were employed to conduct red-teaming; however, the DSB could not find a commonly agreed-on description of red-team capabilities and functions or a means to assure quality of effort. The solution, according to Fontenot, was the University of Foreign Military and Cultural Studies (UFMCS) at Fort Leavenworth, Kansas, curriculum for education, training, and practical experience for red-team leaders and members. He further highlights red-team capabilities, benefits, and reasons for failures, which are outlined below.

Red-Team Capabilities:
  • Expand problem definition
  • Challenge planning assumptions
  • Provide independent view of friendly and enemy vulnerabilities
  • Provide understanding of adversary through his cultural lens
  • Identify 2nd- and 3rd-order effects of plans
  • Reveal overlooked opportunities
  • Anticipate strategic implications
  • Provide alternate courses of action
  • Coordinate scientific and technical examinations
Red-Team Benefits:
  • Reduce risk
  • Preclude mirror-imaging
  • Mitigate surprise
  • Perturb the organization
  • Avoid predictable patterns
  • Overcome bias
  • Improve adaptability and flexibility
Red-Team Failure (reasons for):
  • Does not take its assignment seriously
  • Could lose its independence and be “captured” by the bureaucracy
  • Could be too removed from the decision-making process and become marginalized
  • Could have inadequate interaction with blue teams and be viewed as just another sideline critic
  • Could destroy the integrity of the process and lose the confidence of decision-makers by “leaking” its findings to outsiders
Conclusion

Red-teaming is an excellent tool for understanding and dealing with adaptive enemies. The value of red-teaming has been acknowledged by the U.S. military and has caused re-organization to incorporate specific curricula to educate and train red-team leaders. Red-teaming is often combined with devil's advocacy and war-gaming and is most effective when used with other tools/techniques.

Source
Colonel Gregory Fontenot, Military Review, Sep-Oct 2005
Accessed via http://www.au.af.mil/au/awc/awcgate/awc-sims.htm#redteams

Red Team, Blue Team: How To Run An Effective Simulation

The military does it. The Government Accountability Office does it. So does the NSA. And the concept is making its way into the corporate world, too: war gaming the security infrastructure. Originally, the exercises were used by the military to test force-readiness. They have also been used to test physical security of sensitive sites like nuclear facilities and the Department of Energy's National Laboratories and Technology Centers. In the '90s, experts began using red team-blue team exercises to test information security systems.

This is one of the easiest ways to identify security vulnerabilities, and it also helps with an issue key to any successful red team-blue team exercise: buy in. Yes, it's one of the most overused phrases in a consultant's vocabulary, but the approval of management and employees is essential when testing information security systems.

The goal of a red team-blue team exercise is not just to identify holes in security, but to train security personnel and management.

Weakness: If not everyone agrees on the value of the exercise, it can quickly devolve into defensive posturing and wasted time. After all, you may be asking higher-ups for the time and budget required to fix flaws the exercise discovers.

An attacker will disregard more than rules; he or she will disregard the company's norms. Consider who your attackers may be. Power plants may be targeted by terrorists. Banks by criminals. Anyone by a disgruntled ex-employee. It can take time and effort to step back and view the system like an outsider, or even an insider who intends to harm.

Strength: One of the values of a tabletop exercise is that it lets players consider the system as a whole. Most companies that don't house nuclear materials are unlikely to engage in full-scale physical exercises with armed forces storming their building, but it's important to consider physical security when developing whiteboard attacks. A tabletop exercise provides the opportunity to reflect and assess response options as well as attacks. And then think about what possible breaches might mean. "This gives the blue team, the defenders, confidence," says Assante. "It's also very useful to the red team. You see vulnerabilities in a whole new light. And they bring that training back" to their coworkers.

Once you've fixed the holes your whiteboard exercises identified, however, a live attack-and-defend exercise can provide a whole new level of insight, but it's not an activity to be taken on lightly. In some cases, vulnerabilities can be safely demonstrated on a live corporate network, but it's not wise to launch a real attack against your production systems.

Examples: Even at National Labs, employees are often the weakest link in a security plan. But even if you don't have to worry about employees copying classified material onto home computers, it's important to think about how an enemy could exploit weaknesses in your employees' behavior. Do they prop open automatic doors? Click on e-mail attachments from strangers? You can test for these problems and similar ones. Assuming you have a written security policy and employees are aware of it, you may not want to announce a red-team exercise, since your goal is to determine the risks of normal behavior. Other managers have left USB devices lying around office buildings to see who picked them up and plugged them into their computers. They've also sent phishing e-mails to employees to see who would take the bait.
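Employee-behavior tests like these only pay off if the results are measured. As a purely hypothetical sketch (the event format and action names below are invented for illustration, not taken from any of the articles), a red team might tally the results of a simulated phishing exercise like this:

```python
# Hypothetical sketch of tallying results from a simulated phishing
# test; the event records and field names are invented for illustration.
from collections import Counter

def summarize_phish_test(events):
    """events: list of dicts with 'user' and 'action', where action is
    one of 'ignored', 'reported', 'clicked', 'entered_credentials'."""
    counts = Counter(e["action"] for e in events)
    total = len(events)
    return {action: n / total for action, n in counts.items()}

events = [
    {"user": "alice", "action": "reported"},
    {"user": "bob",   "action": "clicked"},
    {"user": "carol", "action": "ignored"},
    {"user": "dave",  "action": "clicked"},
]
rates = summarize_phish_test(events)
print(rates["clicked"])  # -> 0.5
```

A roll-up like this keeps the exercise about rates of behavior rather than naming and shaming individuals, which helps preserve the buy-in the exercise depends on.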

Conclusions:
“Many people migrate from a wired network to a wireless one assuming it works exactly the same, because from their perspective it does work the same,” explains Sandia’s Parks. “They don’t realize that there are different characteristics that provide different attack surfaces.”

"Red-teaming is good at helping the customer understand interdependencies," says Clem, who advocates bringing a red-team mentality to design decisions. He wants his clients to think, How does that added functionality affect security? What could the bad guy do if we do that?

Hypergame Analysis, Part 1

In situations of competition and conflict, no single player can dictate the outcome. What occurs depends on the strategy each player pursues. In turn, the strategy each player pursues depends on the strategy each player believes his or her opponent will pursue, and so on. Analysts often use game theory to model such situations.

In 1977, Peter Bennett introduced hypergame analysis, an elegant and useful extension to game theory. Unlike standard game theoretic models, Bennett’s concept permits players to perceive different games. This feature better approximates real-world conditions and, in particular, allows analysts to model situations involving manipulation, stratagem, and deception more directly.

In hypergame terms, a situation in which both players correctly perceive the same game is designated a level-zero hypergame. A situation in which both players believe they are playing the same game while at least one player misperceives the game is designated a level-one hypergame. A situation in which at least one player perceives the other player’s (assumed) misperceptions is designated a level-two hypergame.

Example:
A scammer offers a great deal to a mark on the street: “My friend’s business has failed,” says the scammer, “and I’ve got a van full of DVD players I need to sell quickly at a great price.” The mark hesitates. Maybe they’re stolen, he thinks. He decides to take a look anyway. The scammer opens the back of a van containing stacks of boxes. He opens one to reveal an off-brand but slick-looking portable DVD player. “This is yours for $20,” he tells the mark, who weighs the opportunity. The stuff’s obviously boxed, the mark tells himself; maybe it’s not stolen after all. He ignores his initial misgivings, hands over a twenty, and walks away with a mint-in-the-box DVD player, or so he believes. When he gets to his car, he eagerly opens the box and discovers a brick. He drives back to the scene of the crime, but the scammer is gone.

This situation is easily described using the hypergame framework. The mark assumes the two are playing the same game. In this game, the scammer’s options are {(sell a stolen player) (sell a legitimate player)} while the mark’s options are {(buy a player) (walk away)}. The mark’s challenge, then, is to decide whether the players are stolen. If the mark doesn’t care either way, then the choice is easy: buy a player. The scammer is playing a different game. For the sake of this example, let’s assume the scammer keeps a couple of real DVD players handy in case he suspects the mark might blow the con. In the scammer’s game, then, the mark’s options are {(buy a player) (walk away) (blow the con)} and the scammer’s options are {(sell a broken player) (sell a working player)}. If all goes well for the scammer, the mark never suspects (1) the scammer is playing a different game and (2) the scammer is playing a higher-order game- that is, the scammer is not only playing a different game but is aware of the mark’s misperceptions. This yields an advantage to the scammer.
As long as the mark doesn’t suspect that most of the boxes contain bricks, he believes his choice is simply an ethical one: should I buy possibly stolen merchandise? The concept of higher perspectives is sometimes referred to as expectation. Expectation is arguably as critical to the hypergame approach as is the more basic concept of different games.
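The scam can be made concrete with a small model. The payoff numbers below are invented for illustration (they are not from Bennett's paper); the point is only that the mark's best response in his perceived game differs sharply from how he fares in the game actually being played:

```python
# Illustrative sketch of the DVD-scam hypergame; payoff numbers are
# assumptions, invented for this example. Payoffs are (mark, scammer).

# The mark's perceived game: the scammer sells stolen or legitimate players.
mark_perceived = {
    ("buy", "stolen"):  (1, 2),   # cheap player, some ethical risk
    ("buy", "legit"):   (2, 1),   # a fair bargain
    ("walk", "stolen"): (0, 0),
    ("walk", "legit"):  (0, 0),
}

# The scammer's actual game: he sells a brick ("broken") or a working
# player, and knows the mark might blow the con.
scammer_actual = {
    ("buy", "broken"):   (-2, 3),  # mark loses $20; scammer wins
    ("buy", "working"):  (1, 1),
    ("walk", "broken"):  (0, 0),
    ("walk", "working"): (0, 0),
    ("blow", "broken"):  (0, -3),
    ("blow", "working"): (0, -1),
}

def best_response(game, my_index, my_moves, opp_move):
    """Return my move maximizing my payoff against a fixed opponent move."""
    return max(my_moves, key=lambda m: game[(m, opp_move)][my_index])

# In his perceived game, the mark buys whether or not the goods are stolen:
print(best_response(mark_perceived, 0, ["buy", "walk"], "stolen"))  # -> buy
print(best_response(mark_perceived, 0, ["buy", "walk"], "legit"))   # -> buy

# In the game actually being played, the scammer sells a brick, and the
# mark's "rational" choice earns him the sucker's payoff:
print(scammer_actual[("buy", "broken")])  # -> (-2, 3)
```

The scammer's advantage comes entirely from the level-two structure: he is optimizing over the real game while the mark optimizes over a fiction.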

Weaknesses: Hypergames of level three and higher are possible but challenging because they require increasingly convoluted mental recursions (I think he thinks I think he thinks, and so on). Hypergame analysis as described in the existing literature can be fairly complex. It is not something an interested analyst or red teamer will typically pick up in a day. When it is used, it is usually delegated to a specialist, who must then translate the outcome back into terms a decision maker can absorb.

Strengths: As the example illustrates, the player who correctly perceives a level-two hypergame enjoys a clear decision advantage over a player who believes the two sides are playing the same game. This situation does not necessarily arise by chance, and a clever player will aim to create and exploit such conditions. As a result, the benefit of hypergame modeling to a red teamer or decision maker rests not strictly in describing a situation but also in modeling a situation explicitly in order to gain a position of advantage.
Awareness of the hypergame construct encourages a player to avoid granting his or her opponent a position of advantage.

Conclusions: In any game-like contest, a player should always remember to ask “what do I perceive, and what does my opponent perceive?” To be avoided, for example, are states in which you, a player, believe you and your opponent are playing the same game when your opponent is actually playing a level-two hypergame. To be sought are states in which these roles are reversed.

Red Teaming: A Means to Military Transformation

John Sandoz's article addresses the tasked objective of developing joint operational concepts and joint experimentation to assist the Department of Defense (DoD) in attaining the objectives of Joint Vision 2020.

The Role of Red Teaming in Defining Threats
Sandoz writes that in an uncertain security environment, the DoD needs to consider future threats from three perspectives:
1) The evidentiary threat - collected against, analyzed, and reported on exclusively by the intelligence community.
2) The technically feasible threat - increasingly difficult to assess, mainly because potential opponents can readily purchase many technologies and systems abroad.
3) The adaptive threat - difficult to define because U.S. military forces may not know the unexpected ways in which opponents could counter their war-fighting capabilities.

Sandoz makes the point that in all three cases red teaming should play a key role in developing and evaluating potential threats.

Advantages of red teaming:
1) If done at an interagency level, adaptive Red Teams can also inform the national policy process by examining alternative strategies and the roles that government agencies, allies, and non-governmental organizations (NGOs) can play in achieving national policy goals.
2) As part of a disciplined process of red-blue interaction, vigorous red teaming can inform the transformation process and lead to more robust and relevant future military capabilities.
3) Red teaming and joint experimentation can help prevent bad investments and, at the same time, provide a means for U.S. forces to become more agile.

Challenges of red teaming:
1) Prevailing cultures and processes that are intolerant of surprise.
2) Adaptive red teaming does introduce elements of uncertainty and can be disruptive to individual programs.

Three levels where interactive red teaming could support joint concept development and experimentation:
Level 1- Red teaming could challenge one's strategic context and visions of future military capabilities by inventing and exploring counter-strategies and challenging scenarios.
Level 2- Red teaming could challenge new operational concepts in ways that future adversaries might use to thwart U.S. military forces in accomplishing their assigned missions.
Level 3- Red teaming activity could be in direct support of experimentation, including the OPFOR for specific experiments. Red teaming at this level would develop context and concepts for opposing specific Blue operational concepts.


Conclusion:
Sandoz concludes that a broader approach to red teaming, featuring a disciplined process of Red-Blue interaction, could inform and help guide transformation at the three levels of the national security process stated above. He goes on to write that because war is a phenomenon between thinking opponents, a broad approach to interactive red teaming is important for informing one's thinking about future military challenges and exploring ideas for dealing with them.

The “Red Team” Forging a Well-Conceived Contingency Plan

Editorial Abstract: Independent peer review by recognized experts is crucial to the production of any quality product, whether a professional journal or war plan. Colonel Malone and Major Schaupp discuss evolving efforts to use “Red Teams” to incorporate this kind of review into the crisis-action planning process. Employing such teams at critical phases during both the planning itself and the mission rehearsal of completed plans will yield more robust and vetted war plans.



Throughout the lengthy planning effort for Operation Allied Force in 1998–99, allied leaders and planners widely adhered to a significant assumption. When the order arrived to execute the operation- on the very eve of hostilities- that assumption continued to prevail.

What if an enemy, “Red,” announced his intended reaction to a “Blue” campaign plan before Blue executed it? What if Red obligingly pointed out the flaws in Blue’s plan that he intended to exploit and revealed several hidden weaknesses of his own? Surely, once Blue optimized his strengths and protected his vulnerabilities, the operation would stand a much greater chance of success.

Furthermore, what if representatives of the press and the public confided to Blue planners the elements of the operation that concerned them most as well as those with which they might take issue? What if national leadership explained in advance some of the “wrenches” they might throw into the works during execution? What if senior war-fighting commanders and higher headquarters staffs worked alongside the planners to ensure correct understanding of every facet of their guidance and answered the planners’ key questions? If all these pieces of information were synthesized into the plan during the planning process, the plan would have a better chance of surviving any contingency.


For example, Gen Gregory S. Martin, commander of United States Air Forces in Europe (COMUSAFE), tasked his command’s first Red Team to assess an offensive air and space campaign. After analyzing requirements and considering the restrictions imposed by the “need to know,” the Red Team leader formed the team with SMEs from the following areas:

• air operations and strategy
• command and control (C2)
• joint operations
• logistics
• space operations and strategy
• intelligence, surveillance, and reconnaissance (ISR)
• combat search and rescue
• information operations and information warfare
• law
• politics

Weaknesses:
When possible, the commander should draw Red Team members from sources external to the Blue planning organization. Although this may seem intuitive, it is not always easy to accomplish. Most organizations that have the necessary experts are usually fully employed- indeed, the Blue planning organization itself is a perfect example. A commander may be tempted to dual-hat his or her own Blue planners as Red Team members; after all, what better people to assess a plan than the ones most intimately familiar with it? But this seemingly workable solution is fatally flawed: one of the prime benefits of Red Teaming is an independent review of Blue products and reasoning- a second set of eyes on the plan. Try as it might, even the most talented planning group cannot discern its own oversights- if it could, those oversights would not occur in the first place. As concerned as Blue planners must inevitably be with the details, it is sometimes difficult for them to stand back and see the big picture.

Strengths:
When one considers the overall mission of the Red Team- generating a more effective plan- it becomes clear that the team is not consistently “red.” At times, rather than challenging Blue reasoning, its members will provide assistance to the planners, offering another perspective or additional information. This is especially true of the senior mentor, a vital participant in the process although not technically a member of the Red Team. This periodic functional shift on the part of the Red Team- from devil’s advocate to planning partner- does not detract from the overall effort. On the contrary, it broadens the range of thinking and contributions of the entire group, enhancing the planning effort.


Red Team Rules of Engagement
As the Red Team prepares to integrate into the planning effort, it must acknowledge a simple fact: very few people perceive a review and assessment of their efforts as benign. Even assistance, which is ultimately what the Red Team provides, is often not welcome, especially when it comes from people unknown and external to the Blue planning team. To mitigate this friction, the Red Team should meet with the Blue planners as early as possible to explain a number of critical points about a Red Teaming effort. The following ROEs should apply to every Red Teaming event throughout the process:

• The commander’s perceived intent should not limit innovation (e.g., drive certain COAs).
• Red Teaming events are meant to be interactive, candid discussions reminiscent of the flight debrief after a mission.
• The principle of nonattribution is in effect.
• Participants should remain objective in their contributions to the effort; personal agendas or personality conflicts are not welcome.
• Participants should stay professional- no fighting in public.


The first item in this list addresses a problem that can be insidious and deadly to a well-developed plan: the natural tendency to favor a war-fighting commander’s perceived intent in developing COAs. Too often, a planning staff presents the commander with several COAs, knowing full well that all but the perceived favorite are throwaways. As a result, staffers sometimes spend little time seriously developing the COAs.

As the Red Team moves into action, its ability to gain the confidence and trust of the Blue planners is absolutely critical. Failure in this area will lead to Red Team failure. One cannot overstate the importance of avoiding an “us against them” relationship between Blue and Red. Again, the commander’s early buy-in and influence in this area, as well as adherence to the ROEs outlined above, will pay large dividends to the process. When this groundwork is laid successfully, the Blue team will understand why the OPFOR, for instance, is doing its utmost to simulate a realistic, hostile enemy.

Conclusions
USAFE’s early Red Teaming efforts will continue to evolve. Development of the commander’s Red Team becomes more focused with each effort. One thing is already clear- Red Teaming adds great value to contingency planning. It would likely do the same for deliberate planning. Air and space staffs should consider the doctrine already in place, as well as the ideas expounded here, with a view toward making Red Teaming a staple of the planning process.

Field Marshal Helmuth von Moltke’s adage “no plan survives contact with the enemy” is true. But through Red Teaming, a plan can be refined after each contact with a Red Team. This process is valuable because it brings a contingency plan, together with the reasoning and information behind it, under the scrutiny of a well-simulated enemy. Better still, the Red Team can imitate outside agencies, higher headquarters, and even “Murphy’s Law.” A plan that survives this kind of treatment should be healthy indeed. To modify Gen George S. Patton’s famous quotation, “A good plan, well rehearsed, is better than a perfect plan unrehearsed.”

Red Dawn: The Emergence of a Red Teaming Capability in the Canadian Forces

Matt Lauder's article, written in the summer of 2009 for the Canadian Army Journal, set out to briefly identify and explore examples of red teaming from across the private and public sectors. Drawing on these examples, he outlines the characteristics of red teaming and proposes an integrated, working definition of red teaming for possible use by the Canadian Forces (CF).

Lauder breaks red teaming down into three sectors:
1) Civilian applications
2) Military applications
3) Red teaming in the CF

1) Civilian applications

Examples of red teaming in civilian applications include Jack Davis's teaching at the Sherman Kent Centre, the national security laboratories within the U.S. Department of Energy, and the Forensic Audits and Special Investigations Team (FSI) of the U.S. Government Accountability Office (GAO), all of which have conducted penetration tests using red teaming techniques.

2) Military Applications
John F. Sandoz teaches red teaming at the Institute for Defense Analyses.

Red teaming is also being taught at the University of Foreign Military and Cultural Studies (UFMCS). For the UFMCS, the goal of red teaming is to enable planners and decision-makers to avoid group-think, mirror-imaging, and cultural miscalculations.

3) Red teaming in the CF
Lauder points out that in the CF, red teaming is done in a much more informal and irregular manner, and more often in a tactical-training setting. The CF did use red teaming as a technique to help prepare for the 2010 Winter Olympics.

Why is red teaming important?
Lauder outlines two main reasons why red teaming is important:

1) Red teaming mitigates complacency, group-think, and mirror-imaging (i.e. imposing blue force behaviours and tactics on the adversary; in other words, seeing the adversary as we see ourselves).
2) Red teaming is a process by which blue force may be able to deepen its understanding of, and therefore the ability to respond to, the adversary.

Conceptual Framework of Red Teaming:
Lauder breaks red teaming down into four broad and generic organizational processes:
1) Innovation
2) Planning and Analysis
3) Training and Professional Development
4) Operations

Lauder's six key characteristics of red teaming:
1) Trust
2) Positional Authority
3) Relative Independence
4) Expertise
5) Adaptability
6) Flexibility

The following are a number of areas that require further investigation:

1) What are the qualities and characteristics of good and effective red teamers, and how are red teamers selected?
2) What type of training is required for red teamers?
3) Is there a particular red team composition that is more effective than others?
4) What kind of learning environment is most effective?
5) Does the role of the red team differ in certain environments (i.e. does the role differ across settings and levels)?
6) What type of interaction is necessary (between red and blue) to encourage learning?

Conclusion:
In general, civilian applications tend to use red teaming on the tactical level (in particular, but not exclusively, to test physical or synthetic networks, systems, or operational programs), whereas military applications tend to be employed on the operational and strategic levels, and largely within a planning setting or in a decision-support role (although, in the CF, red teaming appears to be most often utilized in exercise or training environments). It is clear that, while application of the red teaming concept may differ across sectors, both the civilian and military communities utilize red teaming in an active, rather than a passive, fashion, and that red teamers must possess a deep understanding of the adversary (i.e. thinking and behaviour) for the purpose of role-playing the adversary (or advising as to what the adversary may think and do) in training, planning, or operational (i.e. live) settings. Moreover, it is apparent that red teamers must see themselves as

Effectively Using Red Teams

Dave Herndon's article details how companies should use Red Teams to help their proposals win contracts.

When to Review
Timing presents a dilemma for many firms: the later they review a proposal, the more robust the review may be; on the other hand, reviewing a proposal late may leave insufficient time to implement the recommended changes. Herndon argues that the best way to avoid this issue is to conduct more than one review during the process.

Herndon identifies three types of red teams:
1) Evaluating-and-Recommending Fixes Red Team - reviews the proposal for a broad range of factors, including:
  • Compliance
  • Completeness
  • Responsiveness
  • Presentation
  • Sell
This team makes recommendations on how deficiencies can be fixed. Normally, this team does not possess the customer expertise to formally score a proposal against the solicitation's evaluation factors, but it can provide an informal quality score (excellent, good, marginal, or unacceptable) for each section for general evaluation purposes.
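The informal scoring described above can be sketched as a weakest-link roll-up. The numeric grade values and the roll-up rule are assumptions for illustration; Herndon does not prescribe a formula:

```python
# Sketch of an informal quality roll-up; the numeric grade values and
# the weakest-link rule are assumptions, not Herndon's prescription.
GRADES = {"excellent": 3, "good": 2, "marginal": 1, "unacceptable": 0}

def overall_quality(section_scores):
    """section_scores: dict mapping section name -> grade string.
    Report the lowest grade found (a weakest-link rule), so a single
    'unacceptable' section sinks the whole proposal."""
    return min(section_scores.values(), key=GRADES.get)

scores = {"technical": "excellent", "management": "good", "cost": "marginal"}
print(overall_quality(scores))  # -> marginal
```

A weakest-link rule matches the spirit of proposal evaluation, where one non-compliant section can disqualify an otherwise strong bid.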

2) Customer-Evaluation-Simulation Red Team - attempts to simulate the customer's formal proposal evaluation process.

This team measures proposals by:
  • Evaluating each solicitation requirement
  • Listing proposer benefits and deficiencies
  • Identifying needed clarifications for each solicitation requirement
In order for this red team to score a proposal effectively, its members must have a comprehensive understanding of the customer's requirements, including the budget for the proposed work and political agendas.

3) Running Red Team - When a proposal is on an extremely tight schedule, a running red team is often an effective method of proposal evaluation. When a writer completes a section draft, he or she immediately gives it to the running red team for a quick-response evaluation.

Composition of Red Team
Avoid using senior company executives on the red team unless they agree to give full-time effort to the review. The most important member of a red team is the red team manager. The ideal individual is someone totally familiar with the proposal review process, the proposal preparation, and the customer's requirements.

Red team members normally include:

  • Outside proposal professionals
  • Customer specialists
  • Employees who thoroughly know the bidder's capabilities, products, services, and past performance history
  • Subject matter experts
Red Team Planning Procedures
The red team evaluation should be planned early. The capture manager, proposal manager, and red team manager should then determine the type of red team to be used, its exact function, and a list of desired red team members. Red team procedures should include a breakdown of tasks for each red team member and a schedule for red team activities.

Preparing the Proposal for Red Team Evaluation
The most important thing in preparing a proposal for review is having it complete. Herndon recommends that the proposal be given a hard edit prior to red team review and a detailed compliance matrix should be included. This matrix should be in a check-off-list format that follows the requested information of the solicitation proposal instructions, evaluation factors, and statement of work.
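A check-off-style compliance matrix can be as simple as a table mapping solicitation requirements to proposal sections. The requirement IDs and section names below are invented for illustration, not taken from Herndon's article:

```python
# Minimal sketch of a check-off compliance matrix; the requirement IDs
# and section names are hypothetical, invented for illustration.
matrix = [
    # (solicitation requirement, proposal section, addressed?)
    ("L.3.1 technical approach", "Section 2", True),
    ("L.3.2 management plan",    "Section 3", True),
    ("M.2 past performance",     "Section 5", False),
]

# Any unchecked row is a compliance gap the red team should flag.
gaps = [req for req, section, done in matrix if not done]
print(gaps)  # -> ['M.2 past performance']
```

Walking the matrix row by row is what lets reviewers verify completeness against the solicitation instructions, evaluation factors, and statement of work rather than against memory.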

Red Team Evaluation
After receiving the proposal, the red team evaluation procedures will include:

  • Final assignment review
  • Finalization of review schedule
  • Coordination with proposal team for debrief and follow-up actions
  • Review of total proposal against the solicitation requirements
  • In-depth review of assigned sections, noting deficiencies and strengths, identifying needed clarifications, providing recommendations, and completing evaluation forms
  • Compilation of comments into single book
  • Debrief of proposal team
The red team should meet and prepare a formal debriefing to the proposal team. The red team should concentrate their presentation on a realistic approach to improve the proposal.

Post Red Team Evaluation Actions
After the red team evaluation, the red team members should assist the proposal team in making the recommended fixes. When this responsibility is understood in advance of the review, the red team comments will invariably be more realistic.

Red Teaming: The Art of Ethical Hacking

In his article, Chris Peake discusses information security broadly, as well as how companies may use Red Teaming to perform security assessments on their networks and systems. He describes an "Infosec Process" containing five components:

1) Assess the current state of risk by evaluating the existing security methods, measures and policies.

2) Based on the Assessment, design a security posture by creating policies that effectively manage the risk to the system/network.

3) Identify and implement the technical tools and physical controls necessary to manage risk.

4) Provide awareness training to the company to protect sensitive information through the cooperation and involvement of the employees.

5) Audit the system/network to confirm that the controls and employees adhere to policy.

This is a revolving process that, according to Peake, companies should perform continually. Red Teaming falls under the assessment stage of the Infosec Process (#1). The Red Team uses tools to probe for vulnerabilities and can project possible threats based on the scope of the assessment requested by the customer. However, a Red Team attempting to circumvent security needs to find only a single vulnerability, while security professionals need to find all possible vulnerabilities for a given system in order to assess the associated risk. A thorough Red Team assessment should provide accurate situational awareness of the security posture of a given system or network. But identifying risk through Red Teaming and other methods cannot provide information security alone; the company or organization must continue through the Infosec Process in order to appropriately manage risk and provide security protection.
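The revolving nature of the process can be sketched as an endless cycle through the five steps. The short step names below paraphrase Peake's list; the code structure itself is illustrative, not from the article:

```python
# Sketch of the revolving five-step Infosec Process; the step names
# paraphrase Peake's list, and the cycle structure is illustrative.
from itertools import cycle

INFOSEC_PROCESS = [
    "assess",     # 1) evaluate existing methods, measures, and policies
    "design",     # 2) create policies that manage the identified risk
    "implement",  # 3) deploy technical tools and physical controls
    "train",      # 4) build employee awareness and cooperation
    "audit",      # 5) confirm controls and employees adhere to policy
]

steps = cycle(INFOSEC_PROCESS)  # the process never terminates
first_pass = [next(steps) for _ in range(5)]
print(first_pass)
print(next(steps))  # -> assess: the audit feeds the next assessment
```

Modeling it as a cycle rather than a pipeline captures Peake's point that auditing is not an endpoint but the input to the next assessment.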

A Red Team assessment evaluates various areas of security in a multi-layered approach. The Red Team tests policy compliance of the security controls at each layer (Operating System, Application, Host, LAN, Perimeter) and the control is tested in a manner specific to the area of security to which it applies. There are six areas of security where vulnerability assessment testing occurs:

-Internet Security
-Communications Security
-Information Security
-Social Engineering
-Wireless Security
-Physical Security

Red Teaming is “ethical hacking.” As such, it must be carried out with the utmost confidentiality, discretion, and clarity. Typically, Red Teams are third-party entities hired to make an impartial assessment of the network or system. The customer sets the scope of the project to specify the area of information to be assessed. The Red Team is responsible for supplying the customer with a detailed plan as well as a list of methods and tools that will be used during the evaluation. Any testing performed outside the scope stated by the customer can be considered an unwarranted attack by the Red Team.

The most important requirement for Red Teaming is customer consent. Because, by definition and purpose, the Red Team takes an attacker-like approach to testing security, to begin an assessment without explicit permission is legally perceived as an unwarranted attack on the system/network. This being said, many Red Team evaluations are purposefully kept from network and system administrators as a means of testing personnel response to security events. The scope of the Red Teaming assessment can be very general or very specific when defining what the assessment will include or address. The scope of the project depends on time or cost of the assessment and/or on the objective of the assessment as defined by the customer.

Red Teaming is commonly mistaken for mere penetration testing (pen-testing) when, in fact, pen-testing is one component of a Red Teaming assessment. Pen-testing cannot provide a complete security analysis alone. If a system or network is penetrated, the test proves that there is at least one vulnerability that can be used to gain access. And if the pen-test was unsuccessful, the test only proves that the person performing it was unable to find any exploits; it doesn't guarantee that there are no vulnerabilities present.

A good rule of thumb for companies planning Red Team assessments is to identify the weakest areas, or the "low-hanging fruit," and have these areas tested for vulnerabilities. Hackers will target a specific vulnerability (rather than numerous ones) to gain access while avoiding detection.

Ethical hacking must strictly follow pre-approved testing guidelines that are established with the customer. The team must also document all the steps/procedures in testing in order to retrace the team’s actions in case of an incident due to testing or for retesting/verification of results if necessary. Upon completion of the Red Teaming effort all results should be submitted to the customer in a final report detailing the vulnerabilities that were discovered and how each was discovered. The report should also make an assessment of the overall level of risk of the network/system in addition to the risk level of each vulnerability. The final report is as important as the testing itself because it will direct the customer to take additional security steps.
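The per-vulnerability and overall risk ratings in the final report might be organized as below. The finding fields and the highest-risk roll-up rule are assumptions for illustration; Peake does not specify a report format:

```python
# Hedged sketch of a final-report structure: each finding records the
# vulnerability, how it was discovered, and a risk level; the overall
# rating simply takes the highest per-finding risk (an assumed rule).
RISK = {"low": 1, "medium": 2, "high": 3}

findings = [
    {"vuln": "default admin password", "how": "credential guessing", "risk": "high"},
    {"vuln": "verbose error pages",    "how": "manual browsing",     "risk": "low"},
]

def overall_risk(findings):
    """Return the highest risk level among all findings."""
    return max((f["risk"] for f in findings), key=RISK.get)

print(overall_risk(findings))  # -> high
```

Recording "how" alongside each vulnerability supports the article's requirement that the team be able to retrace its steps for retesting or incident investigation.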

Finally, when assembling a Red Team it is important to have specialists in a wide variety of areas (Peake lists 21 separate specialties) in order to provide the most thorough security assessment.




Monday, April 5, 2010

Red Teams: Towards Radical Innovation

This article appeared in an Executive Technology Report by IBM corporation in July 2005 and was based on an essay by Peter Andrews, Consulting Faculty Member at the IBM Advanced Business Institute in Palisades, New York.

In this article, the author talks about red teaming as a tool that can help provide competitive advantage to businesses and the key factors that are essential for successful red teaming. The author also mentions some of the reasons red teaming can go wrong.

In the private sector, red teams are mainly used as review panels. The author argues that red teams can act as a driver for innovation by challenging assumptions, finding vulnerabilities and actively finding unconventional means to get a jump on mainstream (or Blue) planning teams.

Key Benefits of a Red Team:
  • Identifying significant vulnerabilities
  • Discovering new uses for innovations
  • Challenging taboos and assumptions
  • Providing a minority report on a new concept or idea
  • Revealing the consequences of different perspectives;
    • in particular, the perspectives of those with different goals and risk profiles.
The success of a Red Team depends primarily on the people who make up the team. The critical success factors are as follows:

Composition of team:

This is by far the most important factor in ensuring the success of red teaming. The team needs to include experts, but also people who ask naive questions. Members need to inhabit the roles of adversaries and risk delivering bad news. They need to understand the mindset and culture of their own organization as well as that of their adversaries. They need to be capable of detailed critical analysis, but also imaginative and iconoclastic. Most of all, they need to be adept at communicating surprising concepts in clear, compelling language.

Management support:

The red team needs the authority and standing to get a fair hearing for its ideas and concepts. It also needs material support, proper staffing and access to information and resources without which the red team may find itself blocked or ignored.

Relationship with blue team:

The red team has to have the trust of the blue team, without which the blue team may hide key data and be reluctant to incorporate the red team's views and insights. At the same time, the red team must maintain a level of independence and a willingness to make unpopular statements.

Goals:

The required deliverables of the red team must be defined, and there must be some measure of success. The red team must have a level of accountability: it should know what has been promised and be able to deliver on those promises.

Available information:

The author states that providing the red team with an "open book" on innovation and regularly meeting with the blue team can benefit both teams. This is opposed to the school of thought where the blue team does not disclose any of its plans or tactics to the red team.

Rules of the game:

The author advocates clearly defining the rules of engagement with regard to information, judgment of success, what constitutes proof, and when and how opinions and insights are offered. The consequences with respect to career advancement and rewards also need to be stated up front.

The author says that red teams are most successful when they go beyond questioning assumptions and dig into the role of adversaries deeply enough to approach good acting.

However, the author acknowledges that this is not easy, and quotes a U.S. Department of Defense review that identified the many ways a red team exercise can go wrong. Among the failure modes the author highlights are:
  • Red team members not taking their assignment seriously
  • Red team not getting enough inside information to be credible
  • Teams violating trust by leaking information
  • Lack of quality members in the red team
  • Inability to step into shoes of adversaries

The author concludes that in relatively stable environments, and in companies that lack the means to turn radical ideas into action, Red Teams are not a good use of resources. In highly competitive environments, however, Red Teams are increasingly seen as important tools for many businesses.