Wednesday, May 6, 2009

Summary Of Findings: Bayesian Analysis (4 out of 5 Stars)

Note: This post represents the synthesis of the thoughts, procedures, and experiences of others as represented in the 12 articles read in advance (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 6 MAY 2009 regarding Bayesian Analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility, and its ability to effectively use unstructured data.

Bayesian analysis is a method that uses Bayesian statistics to assess the likelihood of an event in light of new evidence. It generates an estimate, and its use in intelligence analysis allows the uncertainty inherent in a traditional intelligence data set to be handled in a scientifically valid manner.

Strengths:
*can limit analyst biases by reducing the weight given to evidence simply because it is new or vivid
*forces the analyst to reassess evidence and consider alternative possibilities
*adheres to rigid mathematical formulas
*provides a numerical likelihood
*provides an audit trail and the ability to reproduce results

Weaknesses:
*Probabilities are based largely on subjectivity
*Susceptible to biases
*Highly complex problems require heavy computation
*Can be mathematically complex
*Not always useful as a stand-alone method (works well in tandem with methods like Delphi); may require SMEs for determining probability distributions
*Some reliance on ambiguous validities
*"Negative evidence"--the absence of positive evidence may itself be indicative


This method loosely follows the guidance suggested by Gerd Gigerenzer's line of research into the use of natural frequencies in teaching and explaining Bayes to beginners.

1.) Create a 2x2 matrix. Label the quadrants so that they capture the true positive, false negative, false positive, and true negative cases.
2.) Combine the given base rate (for example, war in 100 out of 1,000 cases) with the new information (for example, a new document that is 90% credible saying that war is imminent). This means your true positive and false negative quadrants must sum to 100, and your false positive and true negative quadrants must sum to 900.
3.) To calculate the true positive quadrant, take 90% of the 100 base-rate cases (which equals 90).
4.) To calculate the false negative quadrant, take the numerator of the base rate (100) and subtract the true positive quadrant (90), leaving 10.
5.) To calculate the true negative quadrant, take 90% of the non-war cases (900), equalling 810.
6.) To calculate the false positive quadrant, subtract the sum of the three known quadrants from the total number of cases (1,000), which equals 90.
7.) To calculate the new probability, divide the true positive quadrant (90) by the new total of positive cases (90 + 90 = 180), which equals 50%.

This 50% means that, in light of the new document, there is an even chance that countries X and Y will go to war--a substantial revision of the 10% base rate.
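The seven steps above can be sketched as a short script (Python is used here purely for illustration; the function and variable names are our own):

```python
def two_by_two(base_positive, total, credibility):
    """Fill the natural-frequency 2x2 matrix and return its cells plus
    the revised probability P(war | document)."""
    negatives = total - base_positive     # step 2: non-war cases (900)
    tp = credibility * base_positive      # step 3: true positives (90)
    fn = base_positive - tp               # step 4: false negatives (10)
    tn = credibility * negatives          # step 5: true negatives (810)
    fp = total - (tp + fn + tn)           # step 6: false positives (90)
    posterior = tp / (tp + fp)            # step 7: 90 / 180 = 50%
    return tp, fn, fp, tn, posterior

tp, fn, fp, tn, posterior = two_by_two(100, 1000, 0.9)
```

Swapping in a different base rate or credibility quickly shows how sensitive the revised probability is to each input.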

To understand the basic mathematical principles behind Bayes, the class worked through some sample problems. One of the problems was based on a medical test with an 80% accuracy rate for a cancer with a 2% affliction rate in the general population. The class applied this to a sample population of 1,000 cases. We established a matrix and assessed the true positive, false negative, true negative, and false positive quantities (16, 4, 784, and 196 respectively) and plugged these numbers into the appropriate matrix fields. We then divided the true positives (16) by the total number of positive tests (16 + 196 = 212). The result: only about 7.5% of those who test positive actually have the cancer--strikingly lower than the test's 80% accuracy might suggest! This problem roughly mirrors actual breast cancer screening statistics from around two decades ago!
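The same natural-frequency arithmetic can be recomputed from the stated rates (2% prevalence and 80% accuracy, taken here to mean both 80% sensitivity and 80% specificity):

```python
population = 1000
prevalence, accuracy = 0.02, 0.80

sick = prevalence * population   # 20 actual cases of cancer
healthy = population - sick      # 980 cancer-free people
tp = accuracy * sick             # true positives: 16
fn = sick - tp                   # false negatives: 4
tn = accuracy * healthy          # true negatives: 784
fp = healthy - tn                # false positives: 196
ppv = tp / (tp + fp)             # P(cancer | positive test), roughly 7.5%
```

The large pool of healthy people generates so many false positives that a positive result remains far from conclusive--the base-rate effect the exercise is meant to expose.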
Note: see the matrix for a synopsis of another of the problems we worked through (a piece of evidence emerging suggesting a cause for war).

The class also used a Bayesian application to assess the likelihood we would contract swine flu. We started with two hypotheses--that we would contract swine flu or that we would not--and assigned an initial probability to each (the latter >5%). We then added weighted evidence, each item of which updated the running probability of the hypotheses. After all the evidence was entered, the class assessed the likelihood of contracting swine flu.
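The running update the class performed can be sketched as repeated application of Bayes' rule, with each posterior becoming the next prior. The prior and the evidence weights below are hypothetical stand-ins, not the values actually used in class:

```python
def update(prior, p_e_given_h, p_e_given_not_h):
    """One Bayesian update: return P(H | E) from the prior and the two likelihoods."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

p = 0.05  # hypothetical initial P(we contract swine flu)
# Hypothetical items of evidence: (P(E | flu), P(E | no flu)) pairs.
evidence = [(0.6, 0.3), (0.7, 0.4), (0.2, 0.5)]
for p_e_h, p_e_not_h in evidence:
    p = update(p, p_e_h, p_e_not_h)  # posterior becomes the new prior
```

An item whose two likelihoods are equal leaves the probability unchanged; only diagnostic evidence moves the estimate.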

Tuesday, May 5, 2009

Using Search Engine Optimization For Intelligence Analysis

SEO Analysis Now - A Site For Using SEO in Intelligence Analysis
Rated: 4 Stars out of 5

As a final requirement of this course, I had to research and report findings on an analytical technique of my choosing. In addition, I had to test out and apply the technique in order to reveal its true strengths and shortcomings, as well as produce a how-to guide for using it. For my project, I chose to conduct a Search Engine Optimization (SEO) Analysis on the Websites of two popular coffee shops, Starbucks (which is well established) and Caribou Coffee (which is slowly gaining popularity), and to reflect on how this process can be applied to Competitive Intelligence (CI), to Law Enforcement Intelligence (LEI), or to the National Security sector. I chose SEO not only because it is an emerging analytical process that can be useful to intelligence analysts, but also because I found it quite intriguing and fun.

SEO provides a way to gain insight into a Website’s audience.

If we know who an audience is (age, gender, ethnicity, education, affluence, etc.), how they behave (what other Websites, or types of Websites, they visit), where they log on from, when they visit, what other sites direct their traffic, and what their general interests are, then we can make assessments and predictions about how to either promote their behavior (steering them toward a particular site--useful for CI and marketing purposes) or counter that behavior (keeping them away from certain sites--useful for LEI and National Security purposes).

For more detailed information about how SEO can be used for Intel analysis, please check out my project:

SEO Analysis Now - A Site For Using SEO in Intelligence Analysis

(The image below is a geographical search index comparison of online users searching for Caribou Coffee [left] and Starbucks [right]. Images provided by Google Insights for Search)

Monday, May 4, 2009

Bayesian statistics: principles and benefits

This article is meant to summarize the basics of Bayesian statistics for beginners.

In Bayesian statistics, uncertainty about parameters is expressed as a probability distribution; graphically, the narrower the curve, the tighter the estimate of the parameters. The key difference between frequentist methods and Bayesian analysis is the use of prior information, which is principally subjective. It is important for the prior information to be defensible and reasonable. The author considers this subjectivity a strength of the approach because it allows posterior distributions from different informed observers to be examined and compared.

Until the 1990s, computational tools for conducting Bayesian analysis were nascent or non-existent. While there are tools for the specialist available today, the general practitioner of Bayesian analysis will find few user-friendly options.

The author enumerates a number of benefits for using Bayesian analysis. They are:
  • It provides meaningful and intuitive inferences.
  • It can answer complex questions cleanly and exactly.
  • It makes use of all available information.
  • It is well suited for decision-making.

Introduction to Bayesian Analysis

"Bayesian statistics is concerned with generating the posterior distribution of the unknown parameters given both the data and some prior density for these parameters. Bayesian statistics creates a much more complete picture of the uncertainty in the estimation of the unknown parameters."

The article's main usefulness lies in its suggestions for improving the Bayesian process. The authors' first suggestion is to remove the effect of non-critical (or at least uninteresting) parameters from the overall findings; they label these "nuisance" parameters.

Second, the authors note the lack of hypothesis testing in the scholarly work of their peers and implicitly advocate for such studies to be done.

The Logic of Intelligence Failure
By Bruce G. Blair, Ph.D.

The author of this article asserts that critics of the intelligence community concerning both of the major recent US intelligence failures--Iraq WMD and 9/11--ought to realize that threats of varying levels of uncertainty are typically assessed inaccurately. Only successive, repeated assessments of updated data can narrow the gap between perceptions and reality, since data users and intelligence analysts tend to process information subjectively at first (against their own beliefs, judgments, and opinions) and modify their assessments as new information or intelligence comes in.

Blair argues that decision making was the result of intelligence analysis that basically followed the laws of reason. He contends that "applying a rule of logic known as Bayes' law to these cases (9/11 and Iraq WMD) shows that the intelligence process produced conclusions that were not only plausible but reasonable."

Blair utilized Bayes' formula in his study. While all of the probabilities come from the minds of people and are inherently subjective, the analysis itself depends on the product of successive analyses of real (objective) data, and the resulting probabilities are likely to converge with reality so long as the individuals involved are thinking logically and rationally. Lower rates of error accelerate this convergence.
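The rule Blair invokes is standard Bayes' law. Writing H for a hypothesis (say, an attack will occur) and E for an observed warning (the symbols here are a common convention, not necessarily Blair's own notation), one pass of the update reads:

```latex
P(H \mid E) \;=\; \frac{P(E \mid H)\,P(H)}{P(E \mid H)\,P(H) + P(E \mid \neg H)\,P(\neg H)}
```

Each posterior then serves as the prior for the next item of evidence, which is the succession of analyses Blair describes.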

Here are two scenarios applied to Bayes' formula:


Terrorist Attack

The author then provides possible scenarios, based on Bayesian calculations and iterations, that determine the point at which warnings converge with the reality of the event.


Thus, concerning situations that involve preemption or preventive war, the author suggests that neither inductive reasoning nor even Bayesian analysis can truly clarify the validity of warnings, intelligence interpretations, or new information in certain situations--"two observers with different preexisting beliefs will often believe that the same bit of behavior confirms their beliefs - hawks seeing aggressive behavior and doves seeing evidence of conciliatory behavior".

What is Bayesian Analysis?

This article summarizes the International Society for Bayesian Analysis's (ISBA) definition of the basics of Bayes' theorem and Bayesian analysis. According to the ISBA website, the organization "was founded in 1992 to promote the development and application of Bayesian analysis useful in the solution of theoretical and applied problems in science, industry and government. By sponsoring and organizing meetings, publishing the electronic journal of Bayesian statistics Bayesian Analysis, and other activities ISBA provides a focal point for those interested in Bayesian analysis and its applications".

Bayesian analysis, a statistical tool for handling probability distributions, got its start in the mid-18th century. It was not until the 1980s, however, when modern computers could finally handle the complex computations involved, that Bayesian analysis gained more widespread acceptance. Since then its use has grown across many different applications--from healthcare, to weather, to criminal justice. Despite its many nuanced manifestations, Bayesian analysis serves a common purpose: to analyze the probability of unknown and uncertain occurrences.

How To:

The prior distribution expresses what is known about the model parameters before the new data are considered. 'y' represents the new data entering the calculation, and the "'likelihood' [is] proportional to the distribution of the observed data given the model parameters."

The resulting posterior distribution is read: "posterior is proportional to the prior times the likelihood".
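In the notation commonly used for this formulation (θ standing for the parameters and y for the data; the symbols are a convention assumed here, not quoted from the article), the relationship reads:

```latex
p(\theta \mid y) \;\propto\; p(\theta)\, p(y \mid \theta)
```

That is, the posterior p(θ|y) is the prior p(θ) times the likelihood p(y|θ), up to a normalizing constant.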


Strengths:
  • Many diverse applications
  • "Philosophical consistency"
  • Avoids problems associated with 'frequentist' methods
  • Produces clear answers and products
  • Reformulates for each variable

Weaknesses:
  • Subjective nature of prior probabilities--"your prior information is different from mine"
  • More complex problems require more powerful computational tools

Saturday, May 2, 2009

Cruise Missile Proliferation: An Application of Bayesian Analysis To Intelligence Forecasting

Michael William Gannon

The author applies Bayesian analysis to the problem of cruise missile proliferation. The author defines Bayesian analysis as, "a quantitative procedure in which alternative hypothetical outcomes are postulated and their prior probabilities estimated. As additional relevant events occur, the probabilities of their association with each hypothesis are used to calculate a revised probability for each alternative outcome." He notes that Bayesian analysis has been used by the CIA to provide Indicators & Warnings (I&W) as to the probability of outbreak of armed conflict. Any observed event has a probability associated with its actual occurrence, depending on initial causes. By observing and evaluating events that do occur, "posterior probabilities" can be assigned to each cause, creating a likelihood of an event that may occur in the future based on similar initial causes.

The principal strength of the method, according to the author: "The principal advantage of the method is the establishment of a formal analytical framework which accommodates weighted inputs of all observed events, makes differing interpretations of a given event more explicit, and provides a readily available chronological record of the analytical process." A major weakness, however, is that it "is limited to situations which can be expressed as a number of mutually exclusive outcomes. An ample flow of data which is logically related to the hypotheses to be tested must be available, and analysts must be qualified to assign realistic probabilities associating the observed events to their hypothetical causes."

In assessing the history of Bayesian analysis, the author notes that the CIA found Bayesian analysis and the Delphi method to be highly complementary. Furthermore, the CIA found that Bayesian analysis had distinct advantages over other methods. These advantages were:

(1) More information can be extracted from the available data...and probabilities are not at the mercy of the most recent or most visible item.
(2) The formal procedure has been shown to be less conservative than the analysts' informal opinions, and to drive the probabilities away from fifty-fifty faster and farther than the analysts' overall subjective judgments do...
(3) The procedure provides a reproducible sequence for arriving at the final figures...
(4) The formulation of the questions forces the analyst to consider alternative explanations of the evidence he sees...[and] to look at how well the evidence explains hypotheses other than the one he has already decided is the most likely.
(5) The use of quantified judgments allows the results of the analysis to be displayed on a numerical scale, rather than through the use of [subjective terms].

Limitations discovered by the CIA included:
(1) The question must lend itself to formulation in mutually exclusive categories.
(2) The question must be expressed as a specific set of hypothetical outcomes.
(3) There should be a fairly rich flow of data which is at least peripherally related to the question.
(4) The question must revolve around the type of activity that produces preliminary signs and is not largely a chance or random event.

Ultimately, Bayesian analysis is intended as a forecasting tool, but it has the added benefit of utilizing raw data (which can be "graded" for source reliability and assessed against explanatory hypotheses). This gives the analyst a "quick reference" source for conducting "snap shot evaluations" of current programs, such as the state of a nation's missile program.

Author's Comment:
This paper is a master's thesis that used Bayesian Analysis to assess future cruise missile proliferation. I did not include any findings of the thesis in this summary; instead I summarized the sections that dealt with the method of Bayesian analysis itself.

Bayes' Theorem and Intelligence

Net Wars

According to the author, "Beliefs are based on probabilistic information. Bayes Theorem says that our initial beliefs are updated to posterior beliefs after observing new conditions." As an analytic method, Bayesian analysis provides a formula that allows the analyst to update original assertions as new evidence is discovered and to assign likelihoods to events; the more we observe, the better we can predict the likelihood of a certain event. The formula used to update the analyst's initial beliefs to posterior beliefs is: p(C|O) = p(O|C)p(C) / [p(O|C)p(C) + p(O|¬C)p(¬C)]. According to Bayes, initial beliefs have a high margin of error; this is alleviated by incorporating new evidence through the formula, which "produces interesting results because it accounts for uncertainties created by False Positives and False Negatives."

The author provides the following example of using Bayesian analysis to update your beliefs:

"There is a case of this occurring. Europeans believed that swans were always White and there could be no Black Swans. They updated their probability of a Swan being White to 99% based on their limited experiences. As they explored the world, they found Black Swans in Australia. This reduced the probability of a swan being white and increased the probability of a swan being black. This process of inductive reasoning can be explained via Bayesian probability."

The author provides the following example of using Bayesian analysis as an intelligence methodology:

"There are 10,000 civilians. 1% of whom are insurgents pretending to be civilians. Police can investigate individuals and determine if they are an insurgent or civilian with 95% certainty.

Prior Probability is this: 0.01 (10,000) and 0.99(10,000). So
Group 1: 100 insurgents
Group 2: 9,900 Civilians

The Police investigate the entire population. This produces four groups:
Group 1: Insurgents - Positive test (0.95)
Group 2: Insurgents - False Negative test (0.05)
Group 3: Civilians - False Positive test (0.05)
Group 4: Civilians - Negative test (0.95)

How certain are the police that the men they captured are actually insurgents? The answer is 16%.
(0.95 x 0.01) / [(0.95 x 0.01) + (0.05 x 0.99)] =
0.0095/0.0590 = 0.161"

The 16% certainty rate stems from the uncertainty that always exists: some insurgents escape detection while some innocents test positive as insurgents. The author notes that this is an extremely oversimplified example; actual Bayesian analysis of such a situation would require serious computing power, account for many other factors, and involve multiple rounds of testing to ensure the most accurate results. Nonetheless, the example highlights the use of Bayesian analysis as a method of predictive analysis--though the method predicts the probability of a particular event happening, not whether that event will actually occur.
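The quoted arithmetic can be verified directly (variable names are our own):

```python
prior_insurgent = 0.01  # 1% of the 10,000 civilians are insurgents
test_accuracy = 0.95    # P(positive | insurgent) = P(negative | civilian)

true_pos = test_accuracy * prior_insurgent               # 0.0095
false_pos = (1 - test_accuracy) * (1 - prior_insurgent)  # 0.0495
posterior = true_pos / (true_pos + false_pos)            # roughly 0.161, i.e. 16%
```

As in the cancer example, the much larger pool of civilians produces enough false positives to swamp the true ones.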

The author returns to the "black swan" example, noting that these intelligence "black swans" are events that are possible but highly unlikely. Just because they haven't happened doesn't mean they won't, and Bayesian analysis provides a method for determining their likelihood. The author concludes by reiterating that intelligence analysis is not about predicting future events, but about predicting the likelihood of future events. "The inability to stop a Black Swan event, or a false prediction of a Black Swan event, does not always mean that the intelligence community 'failed'." Rather, the notion that intel failed comes from the distorted view of intel analysts as fortune tellers, rather than as the reducers of uncertainty they truly are.

Friday, May 1, 2009

Bayesian Statistics In The Real World Of Intelligence Analysis: Lessons Learned

Bayesian Statistics In The Real World Of Intelligence Analysis: Lessons Learned
By Kristan Wheaton, Jennifer Lee, & Hemangini Deshmukh
Journal of Strategic Studies, vol. 2 n.1
February 2009

In this article, Kris Wheaton, in collaboration with Jennifer Lee and Hemangini Deshmukh, argues that alternative, more structured methods should be applied to the intelligence process in order to improve intelligence analysis. Recognizing the potential that Bayesian statistics can bring to the field of intelligence, the authors question the ease with which entry-level intelligence analysts can apply and use this advanced statistical method.

Following the model of an experiment conducted by Gerd Gigerenzer to test the accuracy of a diagnosis by two groups of doctors (with one group using traditional statistical formulations [5% or .05] and the other using natural frequencies [5 times out of 100]), the authors tested 67 Senior Intelligence Studies students at Mercyhurst College. The findings of Wheaton’s experiment were extremely similar to Gigerenzer’s, showing that groups who receive natural frequencies have a much higher rate of being accurate when using Bayesian statistics (79% versus 18% accuracy). This accuracy is attributed to the power of ‘framing’ questions. The authors conclude, “natural frequencies are an effective method for encouraging Bayesian reasoning.”

In addition to the experiment, the article covers a brief how-to and overview of Bayesian Analysis. In short, Bayesian statistics is particularly useful for its ability to take into account the probability of one event affecting another, allowing the analyst to rationally update a prior assessment in light of new evidence. This process also helps reduce two very common cognitive biases, the vividness and recency biases. See the article for relevant examples of how the Bayes Theorem can be applied to intelligence-related issues.

The Bayes Theorem is illustrated below:

Bayesian Analysis For Intelligence: Some Focus on the Middle East

Bayesian Analysis For Intelligence: Some Focus on the Middle East
By Nicholas Schweitzer
Approved For Release 1994
CIA Historical Review Program
02 July 96

Nicholas Schweitzer suggests that advanced analytical methods, such as Bayesian analysis, should be used to aid analysts in an age of ever-rising information flows. To test Bayesian Analysis as a tool for intelligence analysts, he used the technique with a group of intelligence analysts to assess complex political-military problems. The Middle East was chosen as a discussion point because of its regional complexity.

Schweitzer defines Bayesian Analysis as “a tool of statistical inference used to deduce the probabilities of various hypothetical causes from the observation of a real event. It also provides a convenient method for recalculating those probabilities in the light of a continuing flow of new events…the ‘rule of Bayes’ states that the probability of an underlying cause (hypothesis) equals its previous probability multiplied by the probability that the observed event was caused by that hypothesis.”

How to:

Because of its limitations, the Bayesian technique can only be applied where certain criteria are met. First, the question to be answered must lend itself to formulation in mutually exclusive categories (i.e., war vs. no war); the insertion of overlapping possibilities reduces the accuracy of the Bayesian technique. Second, the question must be expressed as a specific set of hypothetical outcomes. Third, there should be a fairly rich flow of data that is at least peripherally related to the question. Lastly, the question must revolve around the type of activity that produces preliminary signs and is not largely a chance or random event. If these criteria are met, then:

1. Assign numeric probabilities to the hypotheses. The sum of the values must equal 1.0, or 100%. Because the examination of political/military affairs and events does not automatically yield quantified results, the possible outcomes (hypotheses) have to be quantified. Schweitzer asserts that implementing a Delphi method is the best way to quantify possible outcomes. He suggests the following procedure:
  • Use analysts who are experts on the subject matter (preferably ones who are working on the situation with you)
  • Establish a periodic routine for reporting
  • On the first day of the period, each of a number of participating analysts submits the items of evidence they have seen since the last round.
  • Submissions should be in the form of 1-2 sentences summarizing the item, along with the date, source, & classification.
  • The inclusion of relevant items and exclusion of irrelevant items is up to the discretion of the analyst.
  • A coordinator consolidates the items, resolving differences of wording, emphasis, and meaning, and returns the complete list of items to the participants.
  • On the following day, the analysts (working individually) evaluate the items and return their numerical assessments.
  • *The use of a group of analysts, as opposed to a single expert, is highly recommended*
2. Assess and quantify the evidence that supports/negates the hypotheses.
3. Calculate the new probabilities according to the rule of Bayes:

E is an event, an “item” of intelligence
H is a hypothesis, a hypothetical cause of events
Hi is one of a set of n mutually exclusive hypotheses
P(Hi) is the starting, or “prior” probability of a hypothesis
P(E/Hi) is the probability of event E occurring, given a particular underlying cause Hi
P(Hi/E) is the probability of a hypothesis given E, the “revised” probability of a hypothesis, given that a particular event has occurred.
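With these definitions, the rule of Bayes referenced in step 3 can be written (keeping the article's slash notation for conditional probability) as:

```latex
P(H_i/E) \;=\; \frac{P(E/H_i)\,P(H_i)}{\sum_{j=1}^{n} P(E/H_j)\,P(H_j)}
```

The denominator sums over all n mutually exclusive hypotheses, which is why the hypotheses must be exhaustive and their prior probabilities must total 1.0.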

Strengths (please see article for further explanation):
  • Allows for the weighting of evidence
  • Provides transparency in intelligence assessments
  • Forces the consideration of alternative possibilities
  • Quantifies analysis instead of using words of estimative probability
  • Displays the trend toward an outcome sooner than analysts would typically recognize it on their own
  • Incorporating the Delphi method adds credibility to the assessment when presented to managers and decision makers.
Weaknesses (please see article for further explanation):
  • Limited applicability
  • Data problems – can exist in deciding which information is relevant and should be included, as well as what weight values should be given to evidence.
  • Source reliability – what is the best practice to account for this
  • “Negative evidence”- the absence of any positive evidence may in itself be highly indicative of something
  • Problems over time – problems in using this method in a project continuing over many months
  • Problem with numbers – cannot use the probability of ‘zero’ (doesn’t work mathematically or analytically) therefore extremely low probabilities must be indicated by a very small number. Also, some people have difficulty thinking in, and assigning, probabilities.
  • Subject to bias and manipulation – this is one of the reasons for which the author suggests using a group of experts/analysts to assign probabilities.

Wednesday, April 29, 2009

A Bayesian Approach To Modeling Binary Data: The Case Of High-Intensity Crime Areas

Law, Jane & Haining, Robert
Geographical Analysis, Vol. 36. No 3 (July 2004) Ohio State University

The purpose of this paper is to apply Bayesian logistic models to binary data in order to explain how high-intensity crime areas (HIAs) are distributed in Sheffield, England.

The article defines an HIA as an urban area experiencing high levels of violent crime. Perpetrators of crimes in HIAs often reside in the neighborhoods where they commit crimes, resulting in high levels of witness intimidation that make it difficult to identify an HIA with full accuracy.

The article argues that Bayesian approaches are useful for spatial data analysis because of their emphasis on randomness and prior beliefs about parameter values, and because WinBUGS software enables analysts to test hypotheses with spatial elements.

Prior to the Bayesian analysis, Sheffield was divided into Basic Command Units (BCUs); within the BCUs were HIAs, and within both were Enumeration Districts (EDs). EDs were then classified according to whether or not they fell within an HIA. According to the article, several EDs were miscategorized because the original statistics used the standard logistic model, which did not analyze the spatial relationships between EDs. The article continues by discussing the use of logistic regression in WinBUGS to model random effects and spatial structure and so redesignate the EDs.

Figure 1 shows what is known as the logit function, including random effects; however, it does not include the spatial relationships between EDs.
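Figure 1 itself is not reproduced in this summary. Generically, a logistic model with a (non-spatial) random effect takes the form below, where p_i is the probability that ED i lies in an HIA, x_i holds its covariates, and u_i is an unstructured ED-level random effect (the notation is assumed for illustration, not taken from the article):

```latex
\operatorname{logit}(p_i) \;=\; \log\frac{p_i}{1 - p_i} \;=\; \beta_0 + \mathbf{x}_i^{\top}\boldsymbol{\beta} + u_i
```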

In Figure 3A, spatial relationships are shown through links to W, the contiguity matrix. The contiguity matrix captures the likelihood that EDs neighboring HIAs are at risk of spill-over crime from those HIAs, making them likely to become HIAs themselves.

The paper concludes that a three-stage Bayesian hierarchical model can give estimated probabilities of EDs being HIAs because it takes spatial relationships between EDs into account. The paper also argues for "map decomposition" as another relevant way of analyzing spatial models alongside Bayesian hierarchical models.

Why I Don't Like Bayesian Statistics

Gelman, Andrew, Professor of statistics and political science and director of the Applied Statistics Center at Columbia University


Professor Gelman refers to Bayesian inference as a "coherent mathematical theory," but does not trust it for scientific applications. Gelman believes it is too easy to apply subjective beliefs about a given situation to Bayesian theory, because people want to believe their own preconceived notions and to reject statistical results they disagree with. Bayesian methods, according to Gelman, encourage this kind of thinking.

Gelman takes special issue with political scientists like himself adopting Bayesian methods. Bayesian approaches tend to assume exchangeability of variables, but in political science the 50 states are not exchangeable; they cannot be treated as random draws or samples.

Gelman continues by saying that he is not hostile to the mathematics of Bayesianism, but to its "philosophical foundations, that the goal of statistics is to make an optimal decision." Gelman believes that statistics is for doing "estimation and hypothesis testing," not for "minimiz[ing] the average risk." He also faults the Bayesian philosophy of axiomatic reasoning because it implies that random sampling should not be done, which Gelman considers a "strike against the theory right there." He also accuses Bayesians of believing in "the irrelevance of stopping times"--the idea that stopping an experiment early will not change your inference. Gelman concludes by saying "the p-value does change when you alter the stopping rule, and no amount of philosophical reasoning will get you around that point."

Bayes' Formula

Author's Note: This is a great video for teaching Bayes' Theorem in its simplest form.

In order to illustrate the utility of Bayes’ Theorem, the author draws upon two simple scenarios. First, suppose someone faces the decision of needing to choose between three doors. If the person making the decision does not have any prior knowledge about the situation, the scenario creates an unconditional probability. But, once the person receives new information about the scenario, the rational person should reconsider his/her decision and subsequent probabilities.

Bayes’ Theorem is about using new information to adjust probabilities and create conditional probabilities. In the notation P(G|U), the expression is the probability that G will occur given that U has happened.

To illustrate the application of Bayes’ Theorem and conditional probabilities, the author offers a second scenario. Suppose there is a 70% probability that the economy will grow and a 30% probability that the economy will slow (unconditional probabilities). The author owns a stock that has an 80% chance of increasing if the economy grows. That same stock, however, only has a 30% chance of increasing if the economy slows. The 80% and the second 30% are conditional probabilities; they depend on the condition that the economy grows or slows.

The author can then determine the scenario’s four joint probabilities:
1) What is the probability that the economy will grow and the stock will increase?
2) What is the probability that the economy will grow and the stock will decrease?
3) What is the probability that the economy will slow and the stock will increase?
4) What is the probability that the economy will slow and the stock will decrease?

To answer these questions and determine their probabilities, the author uses the equation P(U∩G) = P(U|G)P(G). Notice that this equation is longer than the first because it incorporates two events: the economy will grow/slow and the stock will increase/decrease.
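The economy-and-stock scenario can be worked through directly. Below is a minimal sketch in Python using the probabilities given above; the final inversion step, asking how likely growth is once the stock is seen to rise, is an added illustration rather than a question the author poses:

```python
# Unconditional (prior) probabilities for the economy
p_grow = 0.70
p_slow = 0.30

# Conditional probabilities for the stock, given the economy's state
p_up_given_grow = 0.80
p_up_given_slow = 0.30

# Joint probabilities via P(U and G) = P(U|G) * P(G)
p_grow_up   = p_up_given_grow * p_grow          # 0.56
p_grow_down = (1 - p_up_given_grow) * p_grow    # 0.14
p_slow_up   = p_up_given_slow * p_slow          # 0.09
p_slow_down = (1 - p_up_given_slow) * p_slow    # 0.21

# The four joint probabilities cover every possible outcome, so they sum to 1
total = p_grow_up + p_grow_down + p_slow_up + p_slow_down
assert abs(total - 1.0) < 1e-9

# Bayes' Theorem then inverts the conditioning: if the stock is observed
# to increase, how likely is it that the economy grew?
p_up = p_grow_up + p_slow_up                    # 0.65
p_grow_given_up = p_grow_up / p_up
print(round(p_grow_given_up, 3))                # 0.862
```

Observing the stock rise raises the estimated probability of a growing economy from the prior 70% to roughly 86%, which is exactly the kind of revision-in-light-of-evidence the theorem formalizes.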

Bayes' Theorem for Intelligence Analysis

Jack Zlotnick
CIA Historical Review Program

Author’s Note: Released by the CIA’s Historical Review Program in the early 1990s, this piece was written by Jack Zlotnick in the 1970s. At the time, the CIA was still in the process of testing Bayes’ Theorem, so Zlotnick does not offer a position on the utility and validity of the Bayesian method with regard to intelligence. In fact, Zlotnick spends a considerable amount of time in the article discussing the ways the theorem should continue to be tested.

Due to the very nature of intelligence, analysts should be naturally interested in Bayes’ Theorem. Intelligence is probabilistic in nature: analysts usually conduct their analysis from incomplete evidence and must address probabilities (thus WOEPs).

For intelligence applications, Bayes’ Theorem is represented by the equation R = PL. “R” is the revised estimate of the odds favoring one hypothesis over a competing hypothesis (the odds after new evidence is entered into the equation). “P” is the prior estimate of those odds (the odds before considering the new evidence). The analyst must offer judgments about “L,” the likelihood ratio. This variable is the analyst’s evaluation of the “diagnosticity” of an item of evidence. For instance, if a foreign power mobilizes its troops, what are the chances that “X” will happen rather than “Y”?
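The R = PL bookkeeping is simple enough to mechanize. A small sketch in Python; the odds and likelihood ratio below are hypothetical numbers, not values from the article:

```python
def revise_odds(prior_odds, likelihood_ratio):
    """Zlotnick's R = PL: revised odds = prior odds x likelihood ratio."""
    return prior_odds * likelihood_ratio

# Hypothetical: the analyst starts at 2:1 odds favoring hypothesis H1
# over H2, then receives evidence judged three times as likely under
# H1 as under H2 (L = 3).
prior = 2.0
revised = revise_odds(prior, 3.0)   # 6.0, i.e. 6:1 in favor of H1

# Odds of R:1 convert to a probability as R / (R + 1)
prob_h1 = revised / (revised + 1)
print(round(prob_h1, 3))            # 0.857
```

Each new item of evidence simply multiplies in another likelihood ratio, which is what lets the mathematics, rather than the analyst, do the summing up.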

The principal features of Bayes' Theorem distinguish it from conventional intelligence analysis in three ways. First, it forces analysts to quantify judgments that are not ordinarily expressed in numeric terms. Second, the analyst does not take the available evidence as given and draw conclusions from it. And third, the analyst makes his/her own judgments about the individual bits and pieces of evidence, rather than summing up the evidence to judge its meaning for a final conclusion; the mathematics does the summing up.

The author is skeptical that the complex tasks analysts are forced to consider can be reduced to numeric values. Bayes’ Theorem, however, may be useful for examining strategic warning by uncovering patterns of activity by foreign powers.

Summary Of Findings: Gap Analysis (3 Out Of 5 Stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 12 articles read in advance of (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 29 APR 2009 regarding Gap Analysis specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.

Traditionally, "gap analysis" is a method used to conduct an internal operational analysis, in which the analysis identifies the "gap" between a current state and a desired end state within a company or agency. From an intelligence analysis perspective, "gap analysis" can be used as a tool to identify the likely pathway or pathways a target may take to arrive at a given end state from a known position. Thus, "gap analysis" does not necessarily provide an estimate, but rather provides the analyst with a list of possible actions a target is likely to take. Gap analysis as an analytic technique bears a striking resemblance to several other methods, such as Indicators & Warnings and Decision Trees.

1) Identify the target
2) Characterize the current status of the target as well as the target's goals
3) Identify what you want to know about the target
4) List the pieces of information that you have
5) Use a systematic approach to infer what the target is likely to do in order for the target to reach its goals

* No structured method to conduct the analysis
* May not leave you with a clear estimate
* Open to bias and other cognitive downfalls (satisficing, mirror imaging, etc.)
* Overlaps with the process of other methods and modifiers (e.g. decision trees, I&W, Brainstorming, SWOT, etc.)
* Susceptible to deception
* Danger of pitfalls


For the first application, the group tried to determine what thesis topic Mary, a fictional first-year graduate student, would write about. Professor Wheaton acted in place of Mary, and we role-played in a question-and-answer format. The group determined the gaps needing to be filled were which intelligence track she was most interested in (national security, law enforcement, or competitive), what area or topic in her previous classes interested her the most, the choice of her primary reader, and the reader's academic interests. Upon filling these gaps, the group ascertained a plausible topic for Mary's thesis.

For the second application, the group discussed Russia's long-held ambition for a warm water port, preferably on the Mediterranean. The first part of the discussion centered on hostilities between Georgia and Russia, and actions Russia could take against Georgia to maintain its sphere of influence in the Black Sea. Additionally, the group discussed what actions may be necessary in Russia's diplomatic relations with other nations bordering the sea. After discussing how Russia could potentially gain a Black Sea port by bringing Georgia into its sphere of influence, the group discussed how Russia could proceed toward gaining access to Mediterranean ports. The group determined that Turkey would be central to future Russian objectives for ports in the Mediterranean. The group put together lists of things the Russians could do, but are not doing now, that would indicate their goal of extending their influence in Georgia and Turkey.

These applications illustrated the following:
* Helped in thinking through the steps that lead to a decision
* Allowed for open discussion and debate, helping the critical thinking process
* At some points it felt like stabbing in the dark, but the estimates became clearer as the group discussed the options
* Eliminated peripheral influences not directly related to the topic, such as administrative processes.

Sunday, April 26, 2009

Forecasting for Success: The Power of Regulatory Gap Analysis

This paper offers a broad plan for conducting a regulatory gap analysis for biomedical products, a definition of gap analysis, when and how to use gap analysis, and the benefits of conducting a gap analysis.

"Fewer than 1% of all biomedical products conceived move beyond preclinical testing, and fewer than 10% of those products make it onto the market." Thus biomedical product developers need to harness their time wisely to improve these statistics. Conducting a gap analysis is one way to cut down on both preclinical and clinical testing times.

The paper defines gap analysis as "the process of reviewing all available information for a candidate product to assess current development status, identify potential gaps in information required for subsequent steps, and develop a strategy to fill those holes."

According to the authors, the ideal time for performing a gap analysis is in the preclinical stages. This allows for an understanding of the product's unique nature. A particular focus of the information collection step needs to be on regulatory thresholds, precedents, and milestones. Once the developers know where their gaps are located, they can formulate a strategy for developing the product according to the standards set forth by regulatory bodies. The authors suggest sharing the action plan with the regulatory board to understand its concerns over the future developmental stages.

Aside from forming the basis for a developmental plan, there are other benefits to conducting a gap analysis. First, it eliminates unnecessary research and development testing; in the biomedical industry in particular, time is a precious commodity. A gap analysis will also speed the rate at which regulatory agencies review and approve the product, since they already know the steps the developers are following. Troubleshooting these concerns early will save a lot of headache later.

Gap analysis methodology for identifying future ICT related eGovernment research topics – case of “ontology and semantic web” in the context of eGovernment

This study evaluates the gaps between the current state of government e-services and those proposed by numerous European conventions. The authors review how they defined and identified those gaps, how they were interrelated, and the methodology used to draw conclusions.

The use of information and communication technologies (ICT) currently employed by governments is poor. Meeting future constituent demands will require the streamlined implementation of ICT. The results of this study are part of the eGovRTD2020 project, for which this work was commissioned.

For the sake of the study, a gap was identified as either expressing a mismatch between issues of consideration, or an issue of research currently not under consideration. There are five main steps the authors undertook to identify gaps and actions toward future scenarios. Generally speaking, they are:
  • where are we at?
  • where can we go?
  • how do we get there from here?
  • this is how we get there.
  • this is what we need to do to get there
The researchers then grouped the gap areas together and drew conclusions on how they were interrelated (the latter via a SWOT-esque approach).

The authors highlight four key steps they considered while conducting their gap analysis. They are:
  • understand the current environment
  • understand the broader context of the environment (for a holistic view)
  • base the results on a clear assessment framework
  • support the analysis quantitatively

Saturday, April 25, 2009

Gap Analysis As A Tool For Community Economic Development

Suzette D. Barta and Mike D. Woods
Oklahoma State University

Author’s Note: This article is about “sales gap analysis.” The article focuses on a project in Oklahoma in which small cities are trying to determine the health of their local economies. Particularly, the cities are trying to determine whether local buyers are spending their money within the local community or outside the community. Although this article is slightly off topic, the discussion on gap analysis does contain a couple nuggets of valuable insight.

The purpose of this article is to help community leaders understand that there are some tools available that can help local merchants better understand the weaknesses of their local retail market. Once the leaders understand the local market’s weaknesses, a competitive response plan can be drafted.

The Oklahoma project aims to determine whether local markets have a retail surplus or a leakage of retail (Are local customers shopping within the local market? Are external customers shopping within the local market?). This determination of a retail surplus or leakage is called a sales gap analysis.

Gap analysis is a technique for identifying the strengths and weaknesses in a local retail market. In this situation, the analysis estimates how many shoppers are coming to a community to purchase retail items.

Local residents must decide for themselves whether a retail gap is acceptable, not acceptable, or even preferable. If it is deemed not acceptable, then community leaders should devise a competitive strategy to meet the needs of the community. A common misperception, however, is to assume that if a gap exists, it must be filled.

A gap analysis only indicates the possible areas of leakage. The analysis does not indicate why the leakage exists, whether or not the leakage is acceptable, or how to stop the leakage from occurring. Gap analysis is only a starting point. The information generated from a gap analysis is only valuable if it is used to stimulate further discussion and to devise an appropriate competitive strategy for action.
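As a rough sketch of what the sales gap computation looks like, the snippet below estimates expected sales from a per-capita spending benchmark times the local population; both that estimating approach and all figures are illustrative assumptions, not details from the Oklahoma project:

```python
def sales_gap(actual_sales, population, per_capita_benchmark):
    """Actual local retail sales minus expected sales, where expected
    sales are a per-capita spending benchmark (e.g. the statewide
    average) times the local population. A positive gap is a surplus
    (outside shoppers coming in); a negative gap is leakage (local
    dollars being spent elsewhere)."""
    expected = per_capita_benchmark * population
    return actual_sales - expected

# Hypothetical city: 10,000 residents, $55M in actual retail sales,
# statewide benchmark of $6,000 spent per person per year
gap = sales_gap(55_000_000, 10_000, 6_000)
print(gap)   # negative: a $5M leakage out of the local market
```

As the article stresses, a number like this only flags a possible leakage; it says nothing about why the leakage exists or whether it should be closed.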

Gap Analysis Strengthens Link Between Requirements And Verification

Brian Hooper and Bill St. Clair
COTS Journal

Author's Note: This article primarily discusses ways in which requirements traceability may be improved, specifically in companies working on military projects that require safety certifications. The authors highlight gap analysis as the means in which the weaknesses in requirements traceability are discovered.

Gap analysis is a technique routinely used in business to measure the development/maturity of working processes and to identify potential areas for improvement. Gap analysis provides an opportunity to examine operating processes and products typically by employing a third party to conduct the assessment. The valuable information offered by the gap analysis helps to improve a company’s processes so that when a formal assessment or certification of products is conducted, the assessment is much more likely to be passed on the first attempt.

Companies are looking outside their own market sector for best practices and approaches, techniques, and standards. Gap analysis provides a framework to isolate areas in which they need to improve. The results from a gap analysis also aid companies in efficiently refocusing their resources in order to achieve the desired improvement.

With the increased need for software control, a gap analysis of safety-critical projects (particularly military related projects) regularly flags the field of requirements traceability. Many development standards require a Requirements Traceability Matrix (RTM). Requirements traceability is a widely accepted best practice in the development industry to ensure that all requirements are implemented and that all products can be traced back to one or more requirements.

More often than not, however, traceability matrices are performed as a low-priority task. Constructing a RTM requires an enormous amount of time and money. Failure to construct an accurate RTM may result in a product failing its certification assessment.

Teaching the New Competencies Using the Gap Analysis Approach


Doctors Bell and Kozakowski recommend using gap analysis to aid students (in this case medical students at all levels) in evaluating their current competency levels and developing a plan for improvement.

Medical schools typically define the core competencies that their students must meet upon the completion of classes, and at the end of the program. At the residency level, physicians must demonstrate achievement in the general competency categories, as identified by the Accreditation Council for Graduate Medical Education:
  • Patient Care
  • Medical Knowledge
  • Practice-Based Learning and Improvement
  • Interpersonal/Communication Skills
  • Professionalism
  • Systems-based practice.
Additionally, the authors describe the process of faculty-conducted assessments of the learners' achieved competencies. They provide a questionnaire used by faculty at the Lake Erie College of Osteopathic Medicine as an example.

Using gap analysis.
The authors describe the process of incorporating gap analysis into this type of self-evaluation. They mention four main steps:
  1. "Articulate a desired future state"
  2. "Describe the current state"
  3. "Examine internal and external issues that must be addressed to progress from the current state to the desired future state"
  4. "Delineate strategies and tactics that will ensure that the 'gap' between current state and desired future state is narrowed"
Gap analysis has been used in this application at the Hunterdon Medical Center Family Medicine Residency Program. Meetings are held between faculty and residents in which the residents score where they think they fall, on a scale of 0-100, when performing a specific competency. The resident is then asked to name a practicing physician who he or she views as a 100 on the same scale. The faculty member asks the resident to identify the specific behaviors and characteristics that make that physician a 100. The last step involves the faculty member asking the resident to describe his or her own behaviors in relation to those of the 100-level physician, identifying specific steps and learning issues that can reduce the gap between the resident's own score and the 100-level score.

In the article's summary, the authors indicate that, at the time of publication, there had been no formal evaluation of this technique's effectiveness in this application.

Applied Strategic Planning

Goodstein, Leonard D., Timothy M. Nolan, and J. W. Pfeiffer. Applied Strategic Planning: how to develop a plan that really works. New York: McGraw-Hill, 1993. Ch.11.

Much of this chapter is dedicated to addressing the results of a gap analysis. This summary will primarily focus on the information relevant to the process of conducting a gap analysis, and any associated advantages or pitfalls.

Goodstein calls gap analysis the "moment of truth" for a strategic planning team. This analysis provides the team with the opportunity to identify specific gaps that exist between the organization's current status and the "performance required for the successful realization of its strategic business model." The organization's current status and level of performance is the product of an internal performance audit.

Probably the most important product of a gap analysis is the estimate of the size of the gap and whether or not the strategies and tools at hand are enough to reduce the size of the gap. Every effort must be taken to close each perceived gap, and the team's responsibility is to reevaluate the desired future, business model, and solutions until all gaps are closed. This may require the team to repeat the process several times, or even revisit the mission statement or business model periodically.

There are two basic approaches to reducing gaps:
  1. Transactional Solution: Requires a modification or reduction of goals
  2. Transformation Solution: Requires a reduction of the obstacles causing the gap
The authors offer some words of caution to the strategic planning team. They recommend a strong consultant in the gap analysis phase to reduce the chances of the team falling into a groupthink frame of mind. Additionally, it is important to ask whether it is realistic to assume that a particular gap even has the potential to be closed. Identifying a gap in one area can result in the realization of other gaps in different areas. The team must also assess the feasibility of change when assessing the scope and nature of existing gaps.

Human Performance Improvement For Tactical Teams

Hathaway, D.J. FBI Law Enforcement Bulletin. June 2008 Volume 77 Number 6

This article uses gap analysis, among other methods such as performance and business analysis, to reassess the effectiveness of the FBI's tactical teams. The article explains how, in today's world, any disparity between a tactical team's standard operating procedure (SOP) and real-world actions may have legal ramifications, as well as embarrass the agency and/or damage its credibility. By using methods such as gap analysis, these shortcomings can be identified and avoided.

According to the article, a performance analysis "explains the current state of the department's tactical team and defines the desired one...Performance analysis incorporates organizational, environmental, and gap analysis." Thus the gap analysis is only one component of an overall performance analysis. "As the name implies, gap analysis defines the area between actual performance of the team and the desired level and, as such, constitutes the second stage of performance analysis."

"Whereas performance analysis identifies the problem or area of concern, gap analysis brings the issue to the forefront, begins to frame it in terms of human behaviors and expected outcomes, and addresses the complex issue of consequences." Tactical teams may identify a gap in shooting accuracy, which would be a performance state, and seek to correct that problem to align with the desired performance state. Consequences and outcomes addressed with this particular performance may be unnecessary loss of life and/or inability to stop a criminal.

Once the gap is identified, it needs to be prioritized. Some questions to ask are, "How often does the gap occur? How costly will it be to fix? Or, how important is the gap? What if the team did nothing? Discovering a gap does not mean that it can, should, or will be addressed." Once gaps are prioritized, they can be dealt with on the basis of what's most critical to success, keeping the analysis systematic and appropriate. As a complete method, gap analysis is only identified as a "critical step" in the "human performance improvement model" (see image above). This critical step serves to identify the performance issues and facilitate the next step, which is the investigation and correction of those issues.

Author's Comment: The article further highlights each step within the "human performance improvement model"; however, they were not addressed in this summary due to its focus solely on gap analysis.

Gap Analysis

Encyclopedia of Management

According to the article, gap analysis is defined as, "studying the difference between standards and the delivery of those standards." As a method of analyzing a business model, it is important to conduct a "before-and-after" analysis prior to conducting the actual gap analysis. The before-stage is identified, such as customer expectation, and then the after-stage is identified, such as actual customer experience. The difference between the two is the "gap", and once identified the gap can be addressed.

According to the article, "Gap analysis involves internal and external analysis." In the business model, this means that a business must address the customer's needs and expectations, as well as the appropriate business response to those needs and expectations. To implement the external analysis, the article recommends focus group interviews, consisting of ten to twelve customers who are invited to share their experiences with a business. After recording the experiences of the focus group, the article recommends implementing a quantitative method, such as a 1-10 scale, to rank order the identified expectations and experiences. The gaps can then be easily identified according to the distances on the scale between expectations and experiences.
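The scoring step reduces to a per-attribute subtraction. A small sketch with made-up focus-group scores on the article's 1-10 scale (the attributes and numbers are illustrative, not from the encyclopedia entry):

```python
# Hypothetical mean focus-group scores on a 1-10 scale
expectations = {"wait time": 9, "staff courtesy": 8, "product range": 7}
experiences  = {"wait time": 5, "staff courtesy": 8, "product range": 6}

# The gap is expectation minus experience; bigger gaps get attention first
gaps = {k: expectations[k] - experiences[k] for k in expectations}
for attribute, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{attribute}: gap of {gap}")
# wait time: gap of 4
# product range: gap of 1
# staff courtesy: gap of 0
```

Rank ordering the gaps this way is what turns the focus-group anecdotes into a prioritized list of shortcomings to address.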

Gap analysis is a useful method for identifying shortcomings within a business model. The article applies the gap analysis method exclusively to the development of better customer relations; however, gap analysis can also be applied to internal analysis, such as what employees may expect from employers, and vice versa.

Thursday, April 23, 2009

Intelligence Requirements and Threat Assessment

Intelligence Requirements and Threat Assessment
Ch. 10 in
Law Enforcement Intelligence: A Guide For State, Local, and Tribal Law Enforcement Agencies
by, David L. Carter, Ph.D.
School of Criminal Justice
Michigan State University

Chapter 10 of the Law Enforcement Intelligence: A Guide For State, Local, and Tribal Law Enforcement Agencies defines an intelligence gap as an unanswered question during the analytical process where “critical information is missing that prevents a complete and accurate assessment of an issue.”

In the past, a “dragnet” approach was the traditional method for filling information gaps. This approach set out to collect mass amounts of data in the hopes that the desired data was collected. The requirements-based approach to filling gaps seeks to make collection more objective, more efficacious, and less problematic. Dr. Carter asserts that this approach is scientific in nature and that “the intelligence function can use a qualitative protocol to collect the information that is needed to fulfill requirements. This protocol is an overlay for the complete information collection processes of the intelligence cycle.” The diagram below compares the Tradition-based and the Requirements-based approaches to filling intelligence gaps:

Carter states that each organization (or even each intelligence need) may have to develop its own unique process for filling information gaps; however, the following acts as a good guide to follow:
  1. Understand your intelligence goal
  2. Build an analytic strategy. (What types of information are needed? How can the information be collected?)
  3. Define the social network. (Who is in the network? How does their business cycle function? Who has access to the information needed? What is the social behavior?)
  4. Define logical networks. (How does the organization operate? Funding sources. Communications sources. Logistics and supply.)
  5. Define physical networks.
  6. Task the collection process. (Determine the best methods of getting the information)
  7. Get the information.
  8. Analyze the information.

Gap Analysis: As Is or To Be, What is the Question?

Gap Analysis: As Is or To Be, What is the Question?
By Dorothy Ball,
Four Thought Group

This article describes how to use gap analysis in Business Process Improvement (BPI) strategies, relating the method to Health Care organizations. In this summary, I have attempted to relate the gap analysis process, as the author uses it, to the intelligence field - identifying and overcoming intelligence gaps.

Dorothy Ball, a senior policy and business consultant at Four Thought Group, Inc., states that gap analysis is an effective solution for businesses and organizations that have a service orientation (which includes for-profits, non-profits, and government agencies). To paraphrase, gap analysis is a method for an organization to improve performance or make gains by identifying the potential needs for that organization to move from where they are (or what intelligence you have) to where they want to be (or what intelligence is still needed). Ball describes gap analysis as a method to develop a roadmap that gets you from “where you are now (As Is) to where you want to be (To Be).”

How to:
  1. Identify what your organization looks like now (or what intelligence/information you currently have available) and what you want your organization to look like (or what intelligence/information you desire). Needs for improvement (more intelligence) is often indicated by changes such as policy, resources, environment (or events that spark inquiry). Changes may be event-driven, or ongoing.
  2. Understand the Business Process (or collection process). Ball describes the business process as “a collection of related, structured activities, or chain of business functions, activities and tasks, that produce each specific service or product… Each business process consists of inputs, method, and outputs. The inputs are required before the method can be put into practice to achieve the outcome. When the method is applied to the inputs then certain outputs will be created.”
  3. Use comparative analysis techniques to identify what is needed to get you from where you are (the Intel you have) to where you want to be (the Intel desired). Examination of the current process, and the modeling of the new process, may be necessary in order to discover the interrelationships/interconnectedness of how the current process can get you the desired results.
  4. Create a plan for implementation.

Wednesday, April 22, 2009

Gartley's Gap Theory Explained

Bobbit, Steven G. Futures, December 2008

Retrieved From: 27 Apr, 2009.

Bobbit's goal in the article is to explain to traders how to deal with price gaps in the stock market. Gaps in a stock's price happen when "the current bar opens above the high or below the low of the previous bar." The term "gap" is often used as a verb throughout the article. Human emotion manipulates markets and causes frequent "gapping."

Bobbit explains the "natural sequence of gaps" in a Gap Succession Chart. The sequence begins with the "breakaway gap," which indicates a change in the pricing trend of a given stock. Next is the "measuring gap"; according to Bobbit, who cites H.M. Gartley, measuring gaps are the most difficult for an analyst to spot but also the most important, because spotting them can lead to better predictions of where the price is going to go. Last is the "exhaustion gap," which according to the article is the easiest to spot because it happens during significant "up or down" moves of a stock's price. Bobbit concludes the article with several mathematical equations to help determine where gaps may occur while charting prices.
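Bobbit's definition of a gap (the current bar opening above the high or below the low of the previous bar) is mechanical enough to code. A sketch over hypothetical open/high/low/close bars:

```python
# Each bar is (open, high, low, close); the prices are hypothetical
bars = [
    (100, 104, 99, 103),
    (106, 110, 105, 109),   # opens above the prior high of 104: gap up
    (108, 109, 104, 105),   # opens inside the prior bar's range: no gap
    (101, 102, 98, 99),     # opens below the prior low of 104: gap down
]

def find_gaps(bars):
    """Flag bars whose open falls outside the previous bar's high-low range."""
    gaps = []
    for prev, curr in zip(bars, bars[1:]):
        _, prev_high, prev_low, _ = prev
        if curr[0] > prev_high:
            gaps.append("gap up")
        elif curr[0] < prev_low:
            gaps.append("gap down")
        else:
            gaps.append("no gap")
    return gaps

print(find_gaps(bars))   # ['gap up', 'no gap', 'gap down']
```

Detecting that a gap occurred is the easy part; classifying it as breakaway, measuring, or exhaustion requires the trend context Bobbit describes.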

Benchmark Your CI Capabilities: Using A Self Diagnosis Framework

By Singh, Arjan & Beurschgens, Andrew, Fuld & Company
Competitive Intelligence Magazine Volume 9, Number 1, January-February 2006

This article discusses the Self-Diagnostic Framework (SDF). SDF is a tool for analysts and proprietary Competitive Intelligence (CI) professionals to benchmark the current level of their CI capabilities compared to "world class CI capability." SDF incorporates gap analysis to provide recommendations for companies to improve their CI functions.

The authors explain how there are four development stages a CI department can go through. The first stage is known as "stick fetching." This is when CI is used by decision makers (DMs) after they are well into their decision-making process. DMs will request information from a CI department that has been at a distance from the decision-making process and therefore has little understanding as to why certain information is needed. The next stage is the "pilot stage," which happens when an organization expresses a commitment to further develop a CI function and gives it a "mandate" to help in the decision-making process. After the pilot stage is the "proficient stage," where an organization's CI team is proficient in most of the elements in the SDF. Once the team has all the elements in the SDF covered, it is considered to have achieved the "world-class" stage.

The SDF is broken down into eleven attributes. An analyst, or team of analysts, looks at each attribute and determines which stage they are in for that attribute.

The article describes how two European companies used the SDF to find gaps and inadequacies in their CI functions. Both companies were successful in further developing value-added CI functions in their organizations.

Summary Of Findings: Game Theory (4 out of 5 Stars)

Note: This post represents the synthesis of the thoughts, procedures and experiences of others as represented in the 12 articles read in advance of (see previous posts) and the discussion among the students and instructor during the Advanced Analytic Techniques class at Mercyhurst College on 22 APR 2009 regarding Game Theory specifically. This technique was evaluated based on its overall validity, simplicity, flexibility and its ability to effectively use unstructured data.


Game theory is a method based on applied mathematics and economic theory. It can be useful when attempting to analyze (and ultimately predict) the strategic interactions between two or more actors and the way in which their actions influence future decisions. Game theory assumes that all actors are rational and can be influenced by various individuals and factors. Games typically involve five common elements: players, strategies, rules, outcomes, and payoffs.
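The five common elements can be made concrete in code. As a minimal sketch, here is a one-shot two-player game expressed as plain data; the payoff numbers follow the standard Prisoner's Dilemma ordering and are illustrative, not taken from the articles summarized here.

```python
# Players and strategies are just labels; rules are implicit in the
# simultaneous, one-shot structure of the matrix.
players = ("Row", "Column")
strategies = ("Cooperate", "Defect")

# payoffs[(row_choice, column_choice)] = (row_payoff, column_payoff)
payoffs = {
    ("Cooperate", "Cooperate"): (3, 3),
    ("Cooperate", "Defect"):    (0, 5),
    ("Defect",    "Cooperate"): (5, 0),
    ("Defect",    "Defect"):    (1, 1),
}

# Each outcome is a strategy profile paired with its payoffs.
for profile, payoff in payoffs.items():
    print(profile, "->", payoff)
```

Representing a game this way makes the later steps (identifying outcomes, valuing them, eliminating strategies) straightforward table operations.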


Weaknesses:
-assumes rational actors
-assumes actors will adjust their actions based on the actions of other actors
-not clearly differentiated from role-playing, simulations, and/or decision trees
-very mathematically based (can be intimidating)
-difficult to quantify options, strategies, and motivations
-may not be a valid method for producing an accurate estimation (see Game Theory, Simulated Interaction, and Unaided Judgment For Forecasting Decisions in Conflict: Further Evidence)
-in real-world applications, identifying all of the key players and outcomes can be difficult


Strengths:
-Visual step-by-step trail to a conclusion/estimate
-Ability to quantify the variables in play
-Emphasis on mathematics and the scientific method
-Applicable to multiple fields (economics, conflict, etc.)
-90% rate of success according to Bruce Bueno de Mesquita (BDM)


Game theory varies in complexity and in application; however, each application has the following steps in common:

*Establish the players and the complexity of the game being played, so as to understand the rules that govern the players and the game.
*Identify the possible outcomes of the choices the players can make (although this is particularly difficult, as not all decisions can be predicted).
*Establish measurable values for the predicted outcomes.
*Eliminate dominated strategies and employ dominant strategies. Repeat this step until a clear, singular strategy emerges or equilibrium is reached between the players.
*Employ the selected strategy.
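The elimination step above can be sketched as iterated elimination of strictly dominated strategies for a two-player matrix game. This is a minimal illustration with an invented Prisoner's Dilemma payoff table, not code from the class or the readings.

```python
def dominated(payoff, my_strats, their_strats, mine_first):
    """Return one strictly dominated strategy for this player, or None.
    mine_first tells us whether this player's choice is the first key
    element (row player) or the second (column player)."""
    for s in my_strats:
        for t in my_strats:
            if s == t:
                continue
            # t strictly dominates s if t pays more against every
            # remaining opponent strategy
            if all(
                payoff[(t, o) if mine_first else (o, t)][0 if mine_first else 1]
                > payoff[(s, o) if mine_first else (o, s)][0 if mine_first else 1]
                for o in their_strats
            ):
                return s
    return None

def iterated_elimination(payoff, rows, cols):
    """Repeatedly delete strictly dominated strategies until none remain."""
    rows, cols = list(rows), list(cols)
    while True:
        r = dominated(payoff, rows, cols, True)
        if r:
            rows.remove(r)
            continue
        c = dominated(payoff, cols, rows, False)
        if c:
            cols.remove(c)
            continue
        return rows, cols

# Prisoner's Dilemma: Defect strictly dominates Cooperate for both players.
pd = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
      ("D", "C"): (5, 0), ("D", "D"): (1, 1)}
print(iterated_elimination(pd, ["C", "D"], ["C", "D"]))  # (['D'], ['D'])
```

When elimination does not shrink the game to a single profile, the remaining strategies must be compared by other means, such as finding a Nash equilibrium.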


As a class, we visited and played the repeatable version of the Prisoner's Dilemma under the "Interactive Materials" tab. Each student played the game at his or her personal computer. Our objective as we played against the five "personalities" was to identify the particular strategies employed by the computer (in addition to scoring the most utility points). Some of the strategies employed by the computer's personalities included "tit-for-tat" and "tit-for-two-tats."
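An iterated game like the one played in class can be simulated in a few lines. The sketch below pits "tit-for-tat" (cooperate first, then copy the opponent's last move) against unconditional defection; the payoff numbers follow the standard Prisoner's Dilemma ordering and are not the site's exact scores.

```python
def tit_for_tat(my_hist, opp_hist):
    # Cooperate on the first move, then mirror the opponent's last move.
    return opp_hist[-1] if opp_hist else "C"

def always_defect(my_hist, opp_hist):
    return "D"

# payoff[(my_move, their_move)] = (my_points, their_points)
PAYOFF = {("C", "C"): (3, 3), ("C", "D"): (0, 5),
          ("D", "C"): (5, 0), ("D", "D"): (1, 1)}

def play(strat_a, strat_b, rounds=5):
    hist_a, hist_b, score_a, score_b = [], [], 0, 0
    for _ in range(rounds):
        a = strat_a(hist_a, hist_b)
        b = strat_b(hist_b, hist_a)
        pa, pb = PAYOFF[(a, b)]
        score_a += pa
        score_b += pb
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

print(play(tit_for_tat, always_defect))  # (4, 9): one sucker payoff, then mutual defection
```

"Tit-for-two-tats," also seen in class, would be a small variant that defects only after two consecutive opponent defections.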

Monday, April 20, 2009

Game Theory, Political Economy, and the Evolving Study of War and Peace

By Bruce Bueno de Mesquita


Studies of war and peace increasingly center on domestic interests and institutions for clues on how to shape international affairs. This change in thinking coincides with advances made in non-cooperative game theory and political economy modeling.

De Mesquita refutes realism and state-centric theories as logical explanations for the causes of war. He simultaneously enhances the validity of the liberal peace theory by examining the influences which democrats need to consider when threatening or declaring war.

Realism's claim that a balance of power is needed to maintain international stability is refuted by the political economy approach. Simply put, the political economy approach holds that the causes of and solutions to international conflict can best be understood by looking within states. It treats leaders, rather than states (as realism does), as the object of study.

A game-theoretic focus concludes that war conducted by rationally acting states is always ex post inefficient. Leaders at times conduct wars to maintain their hold on power, since their domestic constituencies would otherwise likely vote them out of office (in democratic states). Autocracies have the advantage of not needing to concern themselves with the well-being of their citizenry, since their leaders do not face election.

Game theory also validates the liberal peace theory. It holds that leaders (as the object of the political-economy approach to international relations) will only wage war when the expected outcome is victory, as the 93% success rate for wars initiated by democracies indicates. Since both sides need to consider their reelection prospects, a negotiated settlement is the preferred method for resolving conflicts between democratic nations.