Tuesday, April 6, 2010
Red Teaming: A Means to Military Transformation
The Role of Red Teaming in Defining Threats
Sandoz writes that in an uncertain security environment, the DoD needs to consider future threats from three perspectives:
1) The evidentiary threat - is collected against, analyzed, and reported on exclusively by the intelligence community.
2) The technically feasible threat - is becoming more difficult to assess, mainly because potential opponents can readily purchase many technologies and systems abroad.
3) The adaptive threat - is difficult to define because U.S. military forces may not know the unexpected ways in which opponents could counter their war-fighting capabilities.
Sandoz makes the point that in all three cases red teaming should play a key role in developing and evaluating potential threats.
Advantages of red teaming:
1) If done at an inter-agency level, adaptive Red Teams can also inform the national policy process by examining alternative strategies and the roles that government agencies, allies, and non-governmental organizations (NGOs) can play in achieving national policy goals.
2) As part of a disciplined process of red-blue interaction, vigorous red teaming can inform the transformation process and lead to more robust and relevant future military capabilities.
3) Red teaming and joint experimentation can help prevent bad investments and, at the same time, provide a means for U.S. forces to become more agile.
Challenges of red teaming:
1) Prevailing cultures and processes that are intolerant of surprise.
2) Adaptive red teaming does introduce elements of uncertainty and can be disruptive to individual programs.
Three levels where interactive red teaming could support joint concept development and experimentation:
Level 1- Red teaming could challenge one's strategic context and visions of future military capabilities by inventing and exploring counter-strategies and challenging scenarios.
Level 2- Red teaming could challenge new operational concepts in ways that future adversaries might use to thwart U.S. military forces in accomplishing their assigned missions.
Level 3- Red teaming activity could be in direct support of experimentation, including the OPFOR for specific experiments. Red teaming at this level would develop context and concepts for opposing specific Blue operational concepts.
Conclusion:
Sandoz concludes that a broader approach to red teaming, featuring a disciplined process of Red-Blue interaction, could inform and help guide transformation at several levels of the national security process, in the three levels stated above. He goes on to write that because war is a phenomenon between thinking opponents, a broad approach to interactive red teaming is important to inform one's thinking about future military challenges and explore ideas for dealing with them.
The “Red Team” Forging a Well-Conceived Contingency Plan
Throughout the lengthy planning effort for Operation Allied Force in 1998–99, allied leaders and planners widely adhered to a significant assumption. When the order arrived to execute the operation- on the very eve of hostilities- that assumption continued to prevail.
What if an enemy, “Red,” announced his intended reaction to a “Blue” campaign plan before Blue executed it? What if Red obligingly pointed out the flaws in Blue’s plan that he intended to exploit and revealed several hidden weaknesses of his own? Surely, once Blue optimized his strengths and protected his vulnerabilities, the operation would stand a much greater chance of success.
Furthermore, what if representatives of the press and the public confided to Blue planners the elements of the operation that concerned them most as well as those with which they might take issue? What if national leadership explained in advance some of the “wrenches” they might throw into the works during execution? What if senior war-fighting commanders and higher headquarters staffs worked alongside the planners to ensure correct understanding of every facet of their guidance and answered the planners’ key questions? If all these pieces of information were synthesized into the plan during the planning process, the plan would have a better chance of surviving any contingency.
For example, Gen Gregory S. Martin, commander of United States Air Forces in Europe (COMUSAFE), tasked his command’s first Red Team to assess an offensive air and space campaign. After analyzing requirements and considering the restrictions imposed by the “need to know,” the Red Team leader formed the team with SMEs from the following areas:
• air operations and strategy
• command and control (C2)
• joint operations
• logistics
• space operations and strategy
• intelligence, surveillance, and reconnaissance (ISR)
• combat search and rescue
• information operations and information warfare
• law
• politics
Weaknesses:
When possible, the commander should draw Red Team members from sources external to the Blue planning organization. Although this may seem intuitive, it is not always easy to accomplish. Most organizations that have the necessary experts are usually fully employed- indeed, the Blue planning organization itself is a perfect example. A commander may be tempted to dual-hat his or her own Blue planners as Red Team members; after all, what better people to assess a plan than the ones most intimately familiar with it? But this seemingly workable solution is fatally flawed: one of the prime benefits of Red Teaming is an independent review of Blue products and reasoning- a second set of eyes on the plan. Try as it might, even the most talented planning group cannot discern its own oversights- if it could, those oversights would not occur in the first place. As concerned as Blue planners must inevitably be with the details, it is sometimes difficult for them to stand back and see the big picture.
Strengths:
When one considers the overall mission of the Red Team- generating a more effective plan- it becomes clear that the team is not consistently “red.” At times, rather than challenging Blue reasoning, its members will provide assistance to the planners, offering another perspective or additional information. This is especially true of the senior mentor, a vital participant in the process although not technically a member of the Red Team. This periodic functional shift on the part of the Red Team- from devil’s advocate to planning partner- does not detract from the overall effort. On the contrary, it broadens the range of thinking and contributions of the entire group, enhancing the planning effort.
Red Team Rules of Engagement
As the Red Team prepares to integrate into the planning effort, it must acknowledge a simple fact: very few people perceive a review and assessment of their efforts as benign. Even assistance, which is ultimately what the Red Team provides, is often not welcome, especially when it comes from people unknown and external to the Blue planning team. To mitigate this friction, the Red Team should meet with the Blue planners as early as possible to explain a number of critical points about a Red Teaming effort. The following ROEs should apply to every Red Teaming event throughout the process:
• The commander’s perceived intent should not limit innovation (e.g., drive certain COAs).
• Red Teaming events are meant to be interactive, candid discussions reminiscent of the flight debrief after a mission.
• The principle of nonattribution is in effect.
• Participants should remain objective in their contributions to the effort; personal agendas or personality conflicts are not welcome.
• Participants should stay professional- no fighting in public.
The first item in this list addresses a problem that can be insidious and deadly to a well-developed plan: the natural tendency to favor a war-fighting commander’s perceived intent in developing COAs. Too often, a planning staff presents the commander with several COAs, knowing full well that all but the perceived favorite are throwaways. As a result, staffers sometimes spend little time seriously developing the COAs.
As the Red Team moves into action, its ability to gain the confidence and trust of the Blue planners is absolutely critical. Failure in this area will lead to Red Team failure. One cannot overstate the importance of avoiding an “us against them” relationship between Blue and Red. Again, the commander’s early buy-in and influence in this area, as well as adherence to the ROEs outlined above, will pay large dividends to the process. When this groundwork is laid successfully, the Blue team will understand why the OPFOR, for instance, is doing its utmost to simulate a realistic, hostile enemy.
Conclusions
USAFE’s early Red Teaming efforts will continue to evolve. Development of the commander’s Red Team becomes more focused with each effort. One thing is already clear- Red Teaming adds great value to contingency planning. It would likely do the same for deliberate planning. Air and space staffs should consider the doctrine already in place, as well as the ideas expounded here, with a view toward making Red Teaming a staple of the planning process.
Field Marshal Helmuth von Moltke’s adage “no plan survives contact with the enemy” is true. But through Red Teaming, a plan can be refined after each contact with a Red Team. This process is valuable because it brings a contingency plan, together with the reasoning and information behind it, under the scrutiny of a well-simulated enemy. Better still, the Red Team can imitate outside agencies, higher headquarters, and even “Murphy’s Law.” A plan that survives this kind of treatment should be healthy indeed. To modify Gen George S. Patton’s famous quotation, “A good plan, well rehearsed, is better than a perfect plan unrehearsed.”
Red Dawn: The Emergence of a Red Teaming Capability in the Canadian Forces
Lauder breaks red teaming down into 3 sectors:
1) Civilian applications
2) Military applications
3) Red teaming in the CF
1) Civilian applications
Examples of red teaming in civilian applications include Jack Davis's teachings at the Sherman Kent Center, the national security laboratories within the U.S. Department of Energy, and the Forensic Audits and Special Investigations (FSI) team of the U.S. Government Accountability Office (GAO), all of which have conducted penetration tests using red teaming techniques.
2) Military Applications
John F. Sandoz teaches red teaming at the Institute for Defense Analyses.
Red teaming is also being taught at the University of Foreign Military and Cultural Studies (UFMCS). For the UFMCS, the goal of red teaming is to enable planners and decision-makers to avoid group-think, mirror-imaging, and cultural miscalculations.
3) Red Teaming in the CF
Lauder points out that red teaming in the CF is done in a much more informal and irregular manner, and more often in a tactical-training setting. The CF did use red teaming as a technique to help prepare for the 2010 Winter Olympics.
Why is Red Teaming Important?
Lauder outlines two main reasons as to why red teaming is important:
1) Red teaming mitigates complacency, group-think, and mirror-imaging (i.e. imposing blue force behaviours and tactics on the adversary; in other words, seeing the adversary as we see ourselves).
2) Red teaming is a process by which blue force may be able to deepen its understanding of, and therefore the ability to respond to, the adversary.
Conceptual Framework of Red Teaming:
Red Teaming broken down into four broad and generic organizational processes:
1) Innovation
2) Planning and Analysis
3) Training and Professional Development
4) Operations
Lauder's six key characteristics of red teaming:
1) Trust
2) Positional Authority
3) Relative Independence
4) Expertise
5) Adaptability
6) Flexibility
The following are a number of areas that require further investigation:
1) What are the qualities and characteristics of good and effective red teamers, and how are red teamers selected?
2) What type of training is required for red teamers?
3) Is there a particular red team composition that is more effective than others?
4) What kind of learning environment is most effective?
5) Does the role of the red team differ in certain environments (i.e. does the role differ across settings and levels)?
6) What type of interaction is necessary (between red and blue) to encourage learning?
Conclusion:
In general, civilian applications tend to use red teaming on the tactical level (in particular, but not exclusively, to test physical or synthetic networks, systems, or operational programs), whereas military applications tend to employ it on the operational and strategic levels, largely within a planning setting or in a decision-support role (although, in the CF, red teaming appears to be most often utilized in exercise or training environments). It is clear that, while application of the red teaming concept may differ across sectors, both the civilian and military communities utilize red teaming in an active, rather than a passive, fashion, and that red teamers must possess a deep understanding of the adversary (i.e. thinking and behaviour) for the purpose of role-playing the adversary (or advising as to what the adversary may think and do) in training, planning, or operational (i.e. live) settings. Moreover, it is apparent that red teamers must see themselves as
Effectively Using Red Teams
When to Review
This presents a dilemma to many firms because the later they review proposals, the more robust the review may be. On the other hand, reviewing a proposal late may leave insufficient time to implement the recommended changes. Herndon argues that the best way to avoid this issue is to conduct more than one review during the process.
Herndon identifies three types of red teams:
1) Evaluating-and-Recommending Fixes Red Team - reviews the proposal for a broad range of factors, including:
- Compliance
- Completeness
- Responsiveness
- Presentation
- Sell
2) Customer-Evaluation-Simulation Red Team - attempts to simulate the customer's formal proposal evaluation process.
This team measures proposals by:
- Evaluating each solicitation requirement
- Listing proposer benefits and deficiencies
- Identifying needed clarifications for each solicitation requirement
3) Running Red Team - When a proposal is on an extremely tight schedule, a running red team is often an effective method of proposal evaluation. When a writer completes a section draft, he or she immediately gives it to the running red team for a quick-response evaluation.
Composition of Red Team
Avoid using senior company executives on the red team unless they agree to give full-time effort to the review. The most important member of a red team is the red team manager. The ideal individual is someone totally familiar with the proposal review process, the proposal preparation, and the customer's requirements.
Red team members normally include:
- Outside proposal professionals
- Customer specialists
- Employees who thoroughly know the bidder's capabilities, products, services, and past performance history
- Subject matter experts
The red team evaluation should be planned early. The capture manager, proposal manager, and red team manager should then determine the type of red team to be used, its exact function, and a list of desired red team members. Red team procedures should include a breakdown of tasks for each red team member and a schedule for red team activities.
Preparing the Proposal for Red Team Evaluation
The most important thing in preparing a proposal for review is having it complete. Herndon recommends that the proposal be given a hard edit prior to red team review and a detailed compliance matrix should be included. This matrix should be in a check-off-list format that follows the requested information of the solicitation proposal instructions, evaluation factors, and statement of work.
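The check-off-list compliance matrix Herndon describes can be sketched in a few lines. This is a minimal illustration only; the requirement IDs and descriptions below are invented, and a real matrix would mirror the solicitation's actual instructions, evaluation factors, and statement of work.

```python
# Hypothetical compliance matrix entries (IDs and text are illustrative,
# not taken from any real solicitation).
REQUIREMENTS = [
    {"id": "L.4.2", "source": "instructions",        "text": "50-page limit",            "addressed": True},
    {"id": "M.1.3", "source": "evaluation factors",  "text": "Past performance examples", "addressed": True},
    {"id": "C.2.7", "source": "statement of work",   "text": "Staffing plan",             "addressed": False},
]

def unmet(requirements):
    """IDs of requirements not yet checked off before the red team review."""
    return [r["id"] for r in requirements if not r["addressed"]]

def coverage(requirements):
    """Fraction of requirements checked off, for a quick status line."""
    done = sum(1 for r in requirements if r["addressed"])
    return done / len(requirements)
```

Running `unmet` before handing the proposal to the red team flags the sections that would otherwise waste the reviewers' time.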
Red Team Evaluation
After receiving the proposal, the red team evaluation procedures will include:
- Final assignment review
- Finalization of review schedule
- Coordination with proposal team for debrief and follow-up actions
- Review of total proposal against the solicitation requirements
- In-depth review of assigned sections, noting deficiencies and strengths, identifying needed clarifications, providing recommendations, and completing evaluation forms
- Compilation of comments into single book
- Debrief of proposal team
Post Red Team Evaluation Actions
After the red team evaluation, the red team members should assist the proposal team in making the recommended fixes. When this responsibility is understood in advance of the review, the red team comments will invariably be more realistic.
Red Teaming: The Art of Ethical Hacking
1) Assess the current state of risk by evaluating the existing security methods, measures and policies.
2) Based on the assessment, design a security posture by creating policies that effectively manage the risk to the system/network.
3) Identify and implement the technical tools and physical controls necessary to manage risk.
4) Provide awareness training to the company to protect sensitive information through the cooperation and involvement of the employees.
5) Audit the system/network to confirm that the controls and employees adhere to policy.
This is a revolving process that, according to Peake, companies should perform continually. Red Teaming falls under the assessment stage of the Infosec Process (#1). The Red Team uses tools to probe for vulnerabilities and can project possible threats based on the scope of the assessment requested by the customer. However, a Red Team attempting to circumvent security needs to find only a single vulnerability, while security professionals need to find all possible vulnerabilities for a given system in order to assess the associated risk. A thorough Red Team assessment should provide an accurate situational awareness of the security posture of a given system/network. But identifying risk through Red Teaming and other methods cannot provide information security alone; the company/organization must continue through the Infosec process in order to appropriately manage risk and provide security protection.
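The attacker/assessor asymmetry above can be stated in a few lines. This is a toy illustration, not part of Peake's article; the vulnerability names are invented.

```python
# Each entry: hypothetical vulnerability name -> whether it is still exploitable.
system = {"default-password": False, "unpatched-ssh": True, "open-share": False}

def attacker_succeeds(vulns):
    # An attacker needs just one exploitable vulnerability to get in.
    return any(vulns.values())

def fully_assessed_secure(vulns):
    # The assessor can declare the system secure only if every
    # vulnerability has been found and closed.
    return not any(vulns.values())
```

One open hole is enough for the attacker, while the defender's claim of security quantifies over the whole set; that is why a single successful penetration disproves security, but a single failed one proves nothing.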
A Red Team assessment evaluates various areas of security in a multi-layered approach. The Red Team tests policy compliance of the security controls at each layer (Operating System, Application, Host, LAN, Perimeter) and the control is tested in a manner specific to the area of security to which it applies. There are six areas of security where vulnerability assessment testing occurs:
-Internet Security
-Communications Security
-Information Security
-Social Engineering
-Wireless Security
-Physical Security
Red Teaming is “ethical hacking.” As such, it must be carried out with the utmost confidentiality, discretion, and clarity. Typically, Red Teams are third-party entities hired to make an impartial assessment of the network or system. The customer sets the scope of the project to specify the area of information to be assessed. The Red Team is responsible for supplying the customer with a detailed plan as well as a list of methods and tools that will be used during the evaluation. Any testing performed outside the scope stated by the customer can be considered an unwarranted attack by the Red Team.
The most important requirement for Red Teaming is customer consent. Because, by definition and purpose, the Red Team takes an attacker-like approach to testing security, to begin an assessment without explicit permission is legally perceived as an unwarranted attack on the system/network. This being said, many Red Team evaluations are purposefully kept from network and system administrators as a means of testing personnel response to security events. The scope of the Red Teaming assessment can be very general or very specific when defining what the assessment will include or address. The scope of the project depends on time or cost of the assessment and/or on the objective of the assessment as defined by the customer.
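A practical consequence of the consent and scope requirements is that every target should be validated against the agreed scope before any testing begins. The sketch below shows one minimal way to do that; the network ranges are documentation-reserved examples, not a real engagement's scope.

```python
import ipaddress

# Hypothetical scope agreed with the customer; in practice this would come
# from the signed rules-of-engagement document.
SCOPE = [
    ipaddress.ip_network("192.0.2.0/24"),
    ipaddress.ip_network("198.51.100.0/25"),
]

def in_scope(host):
    """True only if the target address falls inside the agreed scope."""
    addr = ipaddress.ip_address(host)
    return any(addr in net for net in SCOPE)

def check_target(host):
    # Refuse to touch anything outside scope: testing it would be an
    # unwarranted attack, not an authorized assessment.
    if not in_scope(host):
        raise PermissionError(f"{host} is outside the agreed scope")
    return host
```

Hard-failing on out-of-scope targets keeps an automated toolchain from drifting past the customer's consent.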
Red Teaming is commonly mistaken as just penetration testing (pen-testing) when in fact, pen-testing is a component of the Red Teaming assessment. But pen-testing cannot provide a complete security analysis alone. If a system/network is penetrated, the test proves that there is at least one vulnerability that can be used to gain access to the system/network. And if the pen-test was unsuccessful, the test only proves that the person performing the pen-test was unable to find any exploits in the system, it doesn't guarantee that there are no vulnerabilities present.
A good rule of thumb for companies to follow when planning Red Team assessments is to identify the weakest areas, the "low-hanging fruit," and have these areas tested for vulnerabilities. Hackers will target a specific vulnerability (rather than numerous ones) to gain access while avoiding detection.
Ethical hacking must strictly follow pre-approved testing guidelines that are established with the customer. The team must also document all the steps/procedures in testing in order to retrace the team’s actions in case of an incident due to testing or for retesting/verification of results if necessary. Upon completion of the Red Teaming effort all results should be submitted to the customer in a final report detailing the vulnerabilities that were discovered and how each was discovered. The report should also make an assessment of the overall level of risk of the network/system in addition to the risk level of each vulnerability. The final report is as important as the testing itself because it will direct the customer to take additional security steps.
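The report structure described above, per-vulnerability risk levels plus an overall risk assessment, can be sketched as follows. The severity scale and the worst-case roll-up rule are assumptions for illustration; real reports often use CVSS or a customer-agreed scale, and the findings shown are invented.

```python
# Hypothetical four-level severity scale (a stand-in for CVSS or similar).
SEVERITY = {"low": 1, "medium": 2, "high": 3, "critical": 4}

# Invented findings, each recording how it was discovered, as the
# final report requires.
findings = [
    {"vuln": "weak admin password", "severity": "high", "how_found": "password spraying"},
    {"vuln": "verbose error pages", "severity": "low",  "how_found": "manual browsing"},
]

def overall_risk(findings):
    """Worst-case roll-up: overall risk equals the highest individual
    severity, since one serious hole is enough to compromise the system."""
    worst = max(findings, key=lambda f: SEVERITY[f["severity"]])
    return worst["severity"]

def report_lines(findings):
    lines = [f"Overall risk: {overall_risk(findings).upper()}"]
    for f in findings:
        lines.append(f"- [{f['severity']}] {f['vuln']} (found via {f['how_found']})")
    return lines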
Finally, when assembling a Red Team it is important to have specialists in a wide variety of areas (Peake lists 21 separate specialties) in order to provide the most thorough security assessment.
Monday, April 5, 2010
Red Teams: Towards Radical Innovation
- Identifying significant vulnerabilities
- Discovering new uses for innovations
- Challenging taboos and assumptions
- Providing a minority report on a new concept or idea
- Revealing the consequences of different perspectives, in particular the perspectives of those with different goals and risk profiles
- Red team members not taking their assignment seriously
- Red team not getting enough inside information to be credible
- Teams violating trust by leaking information
- Lack of quality members in the red team
- Inability to step into shoes of adversaries
Red Team "Two Sides to Every Story"
In "Red Team: 'Two Sides to Every Story,'" Lieutenant Colonel John Nelson writes about a particular experience after becoming one of the first graduates of the
Strengths:
· Provides an independent capability to evaluate concepts, plans, and operations from multiple different perspectives.
· Provides an understanding of the opposition through their cultural eyes.
· Provides independent thinkers who are able to travel throughout the Division’s area of responsibility.
Weaknesses:
· Only effective if there is a true understanding of the culture.
· Although it aids in decision making, it is still only a simulation and does not account for independent thinking of the opposition.
Conclusion:
The experience of Lieutenant Colonel Nelson provides insight into how red teaming was used in the war in Iraq; however, without follow-up information there is no way of knowing whether this process actually helped with decision making during the time in which it was used. I searched for follow-up articles but have yet to locate any.
Source:
http://www.military-writers.com/articles/red_team.html
Red-teaming: Improve Your Chances Of Winning The Business
If the proposal has serious problems or if they think it is off base, they will question and respond in a way the client normally would.
- Because of their experience, members of red-teams emulate the process and mindset of the clients that the company is going to present to.
- At least three people serve on each team.
- They are knowledgeable in the company's space.
- Team members must have no significant prior connection with the company that is presenting.
- They must be willing and able to commit the necessary time and attention to the process.
- Red team members are given at least a week to read the materials to be used in the presentation and do a bit of personal research.
- Team members must be committed to helping improve the chances of getting the business.
Comments:
The focus of this article seems to be more on providing information on red teaming techniques for prospective clients. Hence it provides a rather broad overview of red teaming. Though the author does mention a couple of examples of proposals that failed because they did not go through a red team evaluation, he does not mention any examples where red teaming has either helped improve or win a bid. The article could have been more compelling if it had had such examples.
The Role and Status of DoD Red Teaming Activities
What are Red Teams and Red Teaming?
The purpose of red teaming is to reduce an enterprise's risks and increase its opportunities. Red teaming can be used at multiple levels:
- Strategic level to challenge assumptions and visions
- Operational level to challenge force posture or a commander's war plan
- Tactical level to challenge military units in training or programs in development
Red teaming provides enterprises with:
- A deeper understanding of potential adversary options and behaviors
- A hedge against the social comfort of "the accepted assumption or accepted solution"
- A hedge against inexperience
Areas where red teams can play an important role within DoD:
- Training
- Concept development and experimentation
- Security of complex networks and systems
- Activities where there is little opportunity to try things out (e.g., nuclear weapons stockpile issues)
The Task Force identifies three types of red teams:
- Surrogate adversaries or competitors of the enterprise: The purpose of this red team is to sharpen enterprise's skills and expose vulnerabilities that adversaries might exploit.
- Devil's Advocates: This provides critical analysis in order to anticipate problems and avoid surprises.
- Sources of judgement independent of the enterprise's "normal" processes: The objective is often to be a sounding board for the sponsor.
What Makes an Effective Red Team?
Typical causes of red team failure include the following.
The red team:
- Does not take its assignment seriously
- Loses its independence and is captured by the bureaucracy
- Is too removed from the decision-making process and becomes marginalized
- Has inadequate interaction with blue (the program it is challenging)
- Loses the confidence of the decision maker by leaking its findings to outsiders
- Fails to capture the culture of the adversary
Attributes of Effective Red Teaming
- The culture of the enterprise: Red teaming can thrive in an environment that not only tolerates, but values internal criticism and challenges.
- Top Cover: A red team needs a scope, charter and a relationship that fit the management structure.
- Robust interaction between the red and blue teams: It is not a win-or-lose game. The objective is to establish a win-win environment in which blue learns from the process and comes out with sharper skills.
- Careful selection of staff: Many very talented individuals are not suited, temperamentally or motivationally, to be effective red team members.
Observation About Current Red Team Activities
The US Navy's SSBN Security Program: It was established in the early 1970s to identify the potential vulnerabilities that the Soviet Union might exploit to put US SSBNs at risk. The program's focus shifted in the mid-1980s to evaluating and assessing findings from the intelligence community. Recent work has involved terrorist threats and security in ports.
Over decades the program's principles have remained unchanged:
- Strong and widely acknowledged national purpose
- Stable funding
- Highly competent people
- Access to the details of the target program
- Independence to criticize
- Direct accountability to senior official
- A strong but not subordinate relationship to the intelligence community
Missile Defense Agency-Red Teaming Experience: for almost two decades the purpose of this program has been to identify, characterize, and mitigate the risk associated with the development and deployment of the missile defense system.
Air Force Red Team Program: It provides assessments of concepts and technology.
- Provides disciplined approach to guide decision making in technology development
- Allows warning regarding vulnerability of fielded capabilities
- Gives insight into defining what sensitive information to protect
The US Army Red Franchise Organization: Established in 1999, it is responsible for defining the operational environment over the next two decades. The operational environment is the intellectual foundation for transforming the Army from a threat-based force to the capabilities-based Objective Force.
USJFCOM Red Teams: This program has been using red team for joint concept development and experimentation.
OSD's Defense Adaptive Red Team (DART) Activity: Established in 2001, its mission is to support the development of new joint operational concepts by providing red teaming for JFCOM, the combatant commands, Advanced Concept Technology Demonstrations (ACTD), and the Joint Staff.
Red Teams at the Strategic Level
Red teaming at the strategic level occurs when the entire enterprise is challenged. The role of the red team in such a situation is to:
- Clarify the degree of urgency of the threat
- Create alternatives backed by data: feasibility, likely outcome, difficulty of implementation, resources required, likely resistance, and communication needs
- Gather opposing views
- Lead discussion toward choice of an acceptable solution
- Plan implementation
Conclusions
Red teaming has been a valuable but underutilized tool for the Department of Defense. The Defense Science Board Task Force recommends that the red team role be expanded, for two main reasons:
- To deepen understanding of the adversaries the US now faces in the war on terrorism, in particular their capabilities and potential responses to US initiatives. Red teaming helps to identify the range of options available to potential adversaries.
- To hedge against complacency. The US military is attempting to transform itself. It is necessary to continue transforming our armed forces to deal with committed and adaptive adversaries.
Saturday, April 3, 2010
Modeling Behavior of the Cyber Terrorist
The Experiment
In order to red team an unknown, potential adversary, a set of parameters is established for the red team to follow, based on the Defense Advanced Research Projects Agency's (DARPA) understanding of terrorist behavior:
- The cyber-terrorist is believed to have a level of sophistication somewhere between that of a sophisticated hacker and a foreign intelligence organization.
- This adversary is assumed to be able to raise funds on the order of hundreds of thousands to a few million dollars, and he is willing to spend these funds to accomplish his mission.
- This adversary is assumed to be able to acquire all design information on a system of interest.
- This adversary is assumed to be very risk averse. Premature detection is a serious negative consequence for the cyber-terrorist.
- This adversary has specific targets or goals in mind when they attack a given system.
- The adversary will also expend only the minimum amount of resources needed to accomplish their mission.
- The cyber-terrorist is assumed to be professional, creative, and very clever. They will seek unorthodox and original methods to accomplish their goals.
Findings
The Information Design Assurance Red Team (IDART) spent most of its time gathering intelligence on the target system. Its results were considered successful only if the team met its objectives and preserved stealth. In this study the red team followed the same basic process repeatedly, and gave up before mounting any attack whose risk exceeded its threshold.
Conclusion
DARPA’s experience suggests some improvements to the process that they are using to model the cyber-terrorist adversary including the use of additional red teams, improving the scientific method used to record and test red team behavior, incorporating verified terrorist behavior, war-gaming cyber terrorist scenarios, and improving the library of possible approaches to difficult threats.
Red Teaming Experiments with Deception Technologies
The Experiment
In total, 5 experimental runs of 4 hours each were conducted on each of 6 exercises, for 30 runs in all: deception-"on" and deception-"off" control groups (6 each) and random "on"/"off" mixes (18).
Each run was preceded by a standard briefing and a run-specific briefing and followed by filling out of standard assessment forms, both individually by all team members and as a group. The exercises were of increasing intensity and difficulty so as to keep the participants challenged. Feedback was provided and varying amounts of information were disclosed to teams during the course of the experiment.
The participants, in this case, were students ranging in age from 16 to 38, all in computer-related fields, all with excellent grade point averages, all US citizens, all interested in information protection, and all participating in an intensive program of study and research in this area.
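The run structure described above (5 four-hour runs on each of 6 exercises, with deception-"on" and "-off" controls plus random mixes) can be sketched as a schedule builder. How the controls were assigned to particular runs is an assumption here, not stated in the source:

```python
import random

def build_schedule(n_exercises=6, runs_per_exercise=5, seed=0):
    """Build the 30-run schedule: per exercise, one deception-'on'
    control, one 'off' control, and the remaining runs assigned at
    random (the 18 random 'on'/'off' mixes)."""
    rng = random.Random(seed)
    schedule = []
    for ex in range(1, n_exercises + 1):
        settings = ["on", "off"] + [rng.choice(["on", "off"])
                                    for _ in range(runs_per_exercise - 2)]
        rng.shuffle(settings)
        for run, deception in enumerate(settings, start=1):
            schedule.append({"exercise": ex, "run": run,
                             "deception": deception, "hours": 4})
    return schedule

runs = build_schedule()
print(len(runs))  # 30
```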
Findings
The use of red teams in simulating the effectiveness of deception methods on human network attackers revealed several interesting results:
- Teams that were unaware they were not working in a deceptive environment engaged in self-deception, which hindered their progress. The study concluded that the mere threat of deception offers some protection against attackers.
- Teams unknowingly operating under deceptive conditions who followed a deception to its logical end gave up on the problem earlier than the time allotted because they believed they had finished correctly.
- Teams operating in a deceptive environment, even after being educated on the deceptive techniques being employed, were rarely able to move more rapidly past the deceptions, and often followed the same deceptive route they had learned in previous experiments.
- Teams continually subjected to deception became disheartened and only 3 of the original 15 participants under deception finished the study, while 8 of 12 participants not working under deceptive conditions finished the study.
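The attrition figures in the last finding amount to a large gap in completion rates; a quick check of the arithmetic:

```python
# Participant counts as reported in the study.
deception_finishers, deception_total = 3, 15
control_finishers, control_total = 8, 12

deception_rate = deception_finishers / deception_total  # 0.20
control_rate = control_finishers / control_total        # ~0.667

print(f"under deception: {deception_rate:.0%} finished")    # 20%
print(f"without deception: {control_rate:.0%} finished")    # 67%
```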
Conclusion
The net objective of combined deceptions is that attackers spend more time going down deception paths rather than real paths, that the deception paths become increasingly indistinguishable from real ones to the attackers, and that the defenders gain time, insight, data, and control over the attackers while reducing defensive costs and improving outcomes. Content-oriented deception can be an effective deterrent against network attackers, and deception capabilities should be improved to combat highly skilled, long-term network threats.
Friday, April 2, 2010
The "Red Team": Forging a Well-Conceived Contingency Plan
According to the article, “The Red Team: Forging a Well Conceived Contingency Plan,” a Red Team is described as: “a group of subject-matter experts (SME), with various, appropriate air and space disciplinary backgrounds, that provides an independent peer review of products and processes, acts as a devil’s advocate, and knowledgeably role-plays the enemy and outside agencies, using an iterative, interactive process during operations planning.” The article describes how US Air Forces in Europe set up a Red Team to simulate contact with different enemies throughout the planning of Operation Allied Force in 1998–99.
Strengths:
· Yields a closely synchronized planning staff.
· Drives a more complete analysis at all phases.
· Reveals overlooked planning opportunities.
· Extrapolates unexplored strategic implications.
Weaknesses:
· Loses effectiveness without Blue planners’ acceptance of Red as a value-adding group.
· Trust of the Blue Team is key to the success of the exercise.
· Is only a simulation and not always an accurate representation of the enemy’s decision making.
How-To:
1. Create a Red team composed of Subject Matter Experts, external to the Blue team’s sources.
2. Preparation by the Red Team. Team members should immerse themselves in learning everything they can about what has gone before in the crisis at hand and what the enemy and other adversaries are thinking. (Perhaps by creating a checklist of the information that the team needs to know.)
3. Meeting between the Red Team and Blue planners to explain critical points of the Red Team’s purpose, in order to alleviate friction.
4. Conduct a
5. Participate in a Course of Action (COA) War Game.
6. Select a COA and conduct a Plan War Game based on it, in order to create an actual Operations Order (OPORD).
7. Conduct a Mission Rehearsal.
Conclusion:
Thursday, April 1, 2010
Red Teaming for Law Enforcement
Scope of Red Teaming
Red Teaming is an interactive process conducted during crisis action planning to assess planning decisions, assumptions, processes, and products from the perspective of friendly, enemy, and outside organizations. Red Teaming has also been described as the "capability-based analytical or physical manifestation of an adversary, which serves as an opposing force." Red Teams evaluate a target or tactic, but not the likelihood that a particular target will be attacked.
Analytical Red Teaming
Analytical red teams portray an adversary but are not involved in actual field play. Analytical red teaming adds value to discussion based exercises and can range from basic peer review to near-real-time force-on-force interaction, as in games or simulations. During analytical red teaming, participants analyze the attack plans and look for indicators and warnings, key decision points, and vulnerabilities in the plan.
Physical Red Teaming
Physical red teaming involves individuals portraying actual, realistic adversary moves and countermoves in an exercise.
Benefits of Red Teaming
Successful red teaming offers a hedge against surprise and inexperience and a guard against complacency. Effective red teaming can define a threshold of detection, suspicion, and action.
Impediments to Effective Red Teaming
Impediments to effective red teaming fall into two categories:
Situational: selection and training of members, and exercise conditions.
Organizational: organizationally imposed constraints, and the distribution and reception of lessons learned. Red team success requires an organizational culture that values constructive criticism.
Methodology for Using Red Teaming in Exercise
- Determine the objective or desired result
- Communicate with government or private partners
- Determine the scale or type of exercise
- Develop the scenario
- Identify and train the appropriate participants
- Conduct and evaluate the exercise
- Prepare documentation
- Evaluate the performance
- Develop the improvement plan
- Make required and desired improvements
- Exercise again
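Since the methodology ends with "Exercise again," the whole list reads as an iterative cycle. A minimal sketch of that loop, with each step reduced to its name and the cycle count chosen arbitrarily:

```python
# Steps taken verbatim from the methodology above.
EXERCISE_STEPS = [
    "Determine the objective or desired result",
    "Communicate with government or private partners",
    "Determine the scale or type of exercise",
    "Develop the scenario",
    "Identify and train the appropriate participants",
    "Conduct and evaluate the exercise",
    "Prepare documentation",
    "Evaluate the performance",
    "Develop the improvement plan",
    "Make required and desired improvements",
]

def run_exercise_cycle(cycles=2):
    """Walk the methodology end to end, then loop ('Exercise again')."""
    log = []
    for cycle in range(1, cycles + 1):
        for step in EXERCISE_STEPS:
            log.append((cycle, step))
    return log

log = run_exercise_cycle()
print(len(log))  # 2 cycles x 10 steps = 20 entries
```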
Limitations to Red Teaming
There is not enough information to predict all possible means of attack. Red team exercises are typically based on prior events and are less likely to anticipate new, unplanned, or never-before-seen events. Red teaming plans and procedures need to be stressed, and once stressed, must evolve and improve.