Thursday, October 1, 2015

Bayesian Intelligence Analysis

Davide Barbieri


The author shows how Bayes’ method works for intelligence analysis, first by defining other probability measures. The study’s objective is to give a general idea of Bayes’ method by showing where it overlaps with, and where it differs from, other mathematical measures of probability.

The author first presents the classical probability approach made famous by the French mathematician Laplace: the probability P of an event E is the number of favorable cases m divided by the total number of possible cases n: P(E) = m/n.
This classic definition can be applied when all the possible elementary outcomes of a random trial (or experiment) are known and each of them has the same chance of occurring. For example, the probability that the roll of a die will give an even number is 3/6 = 1/2.
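The classical definition can be checked by simple enumeration. A minimal sketch in Python (illustrative, not from the article):

```python
from fractions import Fraction

# Classical (Laplace) probability: favorable cases / possible cases,
# valid when all elementary outcomes are equally likely.
outcomes = [1, 2, 3, 4, 5, 6]                    # faces of a fair die
favorable = [n for n in outcomes if n % 2 == 0]  # even faces: {2, 4, 6}
p_even = Fraction(len(favorable), len(outcomes))
print(p_even)  # 1/2
```

Using `Fraction` keeps the result exact (3/6 is automatically reduced to 1/2) instead of a rounded decimal.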
If things are less obvious, conditional probability can be used. It is defined as P(A|B) = P(A and B) / P(B): the probability that an event A will occur given that another event B, the imposed condition, has occurred. P(A|B) is read as “the probability of A given B”.

For example, the probability of any given number on a fair die is 1/6. Consider the joint probability that the roll of a die gives an even number n greater than 3, P(AB), where A is the event “n is even”, corresponding to the outcomes {2, 4, 6}, and B is the event “n > 3”, corresponding to {4, 5, 6}. Their intersection is (AB) = {4, 6}, and the corresponding joint probability is 2 favorable cases out of 6 possible: P(AB) = 2/6 = 1/3. Since the probability of B is P(B) = 3/6 = 1/2, the conditional probability that n is even given B is P(A|B) = P(AB)/P(B) = (1/3)/(1/2) = 2/3. In fact, there are 3 numbers on a die greater than 3, {4, 5, 6}, 2 of which are even: {4, 6}.
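The die example can be verified with set operations. A small sketch (my own illustration, not from the article):

```python
from fractions import Fraction

outcomes = set(range(1, 7))                # fair die: 1..6
A = {n for n in outcomes if n % 2 == 0}    # "n is even" -> {2, 4, 6}
B = {n for n in outcomes if n > 3}         # "n > 3"     -> {4, 5, 6}

p_joint = Fraction(len(A & B), len(outcomes))  # P(AB): {4, 6} -> 1/3
p_B = Fraction(len(B), len(outcomes))          # P(B) = 1/2
p_A_given_B = p_joint / p_B                    # P(A|B) = P(AB)/P(B) = 2/3
print(p_joint, p_B, p_A_given_B)  # 1/3 1/2 2/3
```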
Figure: Joint probability
Finally, the author comes to Bayes’ rule, suggesting that “In science, and in medicine in particular, researchers want to know the probability of an event given some evidence.” This sentence is the foundation of what Bayes’ method seeks. From the intelligence perspective, a paper by Zlotnick (1970) states Bayes’ rule as R = PL, where R is the revised estimate of the conditional probability of hypothesis H after considering the latest evidence E; it is equal to P, the prior estimate (which is given), times L, the likelihood ratio of event E in case hypothesis H is true:
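One common reading of Zlotnick’s R = PL is the odds form of Bayes’ rule: the revised odds equal the prior odds times the likelihood ratio. A hedged sketch, reusing for illustration the priors and likelihoods from the Israel–Syria example discussed later in the article:

```python
# Odds form of Bayes' rule: revised odds R = prior odds P * likelihood ratio L.
p_h, p_not_h = 0.1, 0.9        # prior: war vs. no war
p_e_h, p_e_not_h = 0.99, 0.8   # likelihood of the evidence under each hypothesis

prior_odds = p_h / p_not_h                    # P
likelihood_ratio = p_e_h / p_e_not_h          # L
revised_odds = prior_odds * likelihood_ratio  # R = P * L
revised_prob = revised_odds / (1 + revised_odds)  # back to a probability
print(round(revised_prob, 2))  # 0.12
```

The odds form and the standard probability form of Bayes’ rule are algebraically equivalent, so this yields the same 0.12 posterior as the article’s worked example.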
The author mentions example studies in which analysts used the Bayesian method to produce an estimate. In the first, in August 1969, the CIA had to evaluate the hypothesis that the USSR would attack China within the following month in order to destroy its alarming nuclear capabilities. Analysts were asked to make an estimate, that is, to evaluate the probability of a war. A list of intelligence items (evidence) E1, E2, ..., En was collected. The analysts then evaluated, from their past experience, the likelihoods P(Ei|H) of E1, E2, ..., En in case of a war, and were asked to revise their estimates every week as new evidence became available. Eventually, the Bayesian probabilities consistently fell below the conventional probabilities, demonstrating better predictive accuracy.
Figure: Conventional and Bayesian probabilities (adapted from Fisk 1972)
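The weekly revision loop described above can be sketched as repeated Bayesian updating, where each week’s posterior becomes the next week’s prior. The likelihood numbers below are hypothetical, purely for illustration:

```python
def revise(prior, p_e_given_h, p_e_given_not_h):
    """One Bayesian update: return P(H|E) given the prior P(H)."""
    joint_h = prior * p_e_given_h
    joint_not_h = (1 - prior) * p_e_given_not_h
    return joint_h / (joint_h + joint_not_h)

# Hypothetical likelihoods (P(Ei|war), P(Ei|no war)) for three weekly items.
weekly_evidence = [(0.7, 0.5), (0.6, 0.4), (0.8, 0.5)]

p_war = 0.10  # initial prior estimate of war
for p_e_war, p_e_peace in weekly_evidence:
    p_war = revise(p_war, p_e_war, p_e_peace)  # posterior becomes new prior
print(round(p_war, 3))  # 0.272
```

With these made-up numbers, three mildly war-favoring evidence items raise the estimate from 0.10 to about 0.27; the real analysts would have supplied the likelihoods from experience each week.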
Another example involves two hypotheses: either Israel was not going to attack Syria (H0, the null hypothesis) or Israel was actually going to attack (H1, the alternative hypothesis). Being two complementary events, the sum of their probabilities had to be 1.
The prior probabilities of the two competing hypotheses were set to P(H0) = 0.9 and P(H1) = 0.1, war being considered unlikely. An additional piece of information was then revealed: “Israeli finance minister Rabinowitz stated that the nation’s economic situation is one of war and scarcity, not one of peace and prosperity”. After hearing the minister’s statement on the radio, the analysts estimated the two likelihoods of such an event as P(E|H0) = 0.8 (in case of no war) and P(E|H1) = 0.99 (in case of war). They then applied Bayes’ rule and revised the probability of an attack accordingly:
P(E) = P(H0)P(E|H0) + P(H1)P(E|H1) = (0.9 × 0.8) + (0.1 × 0.99) = 0.72 + 0.099 = 0.819
P(H0|E) = P(E|H0)P(H0) / P(E) = (0.8 × 0.9) / 0.819 ≈ 0.88
P(H1|E) = P(E|H1)P(H1) / P(E) = (0.99 × 0.1) / 0.819 ≈ 0.12
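The article’s calculation can be reproduced directly; a minimal sketch using its numbers:

```python
p_h0, p_h1 = 0.9, 0.1        # priors: no war / war
p_e_h0, p_e_h1 = 0.8, 0.99   # likelihoods of the minister's statement

# Total probability of the evidence (law of total probability).
p_e = p_h0 * p_e_h0 + p_h1 * p_e_h1   # 0.819
post_h0 = p_e_h0 * p_h0 / p_e         # P(H0|E), roughly 0.88
post_h1 = p_e_h1 * p_h1 / p_e         # P(H1|E), roughly 0.12
print(round(p_e, 3), round(post_h0, 2), round(post_h1, 2))  # 0.819 0.88 0.12
```

Note that the two posteriors still sum to 1, as they must for complementary hypotheses.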
According to the analysts’ revised view, the perceived risk of an attack therefore increased, from 0.10 to 0.12.
To sum up, applying the rule yields a revised version of the previously stated probability whenever additional information becomes available.
The study asserts that the Bayesian approach can give strategic warnings, can force analysts to quantify their estimates in numerical values, and may reduce cognitive bias. Since analysts are usually better at evaluating a single piece of evidence at a time than at drawing inferences from a large body of evidence, the method enables them to focus thoroughly on one piece of evidence at a time.
The technique is very useful for taking into account many different pieces of evidence that are seemingly unrelated to each other, whereas the classical probability approach deals mainly with related events (e.g., rolling a die: you always roll the same die, and every die has the same features). Moreover, analysts normally use WEPs when articulating their estimates, whereas Bayes’ approach enables them to give a precise number rather than a range. On the other hand, the method requires some talent in statistics, and learning and applying it is difficult. The technique still depends on the analysts’ perceptions and experience at the inception and while forming the formula (as assessed in the article, the prior probability is the analyst’s own evaluation of the topic, and applying the rule then yields a revised version of that same estimate). Therefore, to some extent, we can say that if confirmation bias is present while weighting the evidence, the results will not reflect a 100% (or nearly) accurate estimate.


  1. As I understand from the article, Bayesian analysis amounts to updating the probability according to new pieces of evidence in order to reach the most accurate estimate. On the other hand, analysts find and weight the pieces of evidence themselves, so it is a very subjective process. The results of the analysis are solid numbers, but the process is overly subjective. That strikes me as a big contradiction.

    1. Osman, I agree with your statement on subjectivity here, and it is an issue I had with Ertugrul's critique. While a numerical estimate may appear more accurate (and may well be more desired by DMs) than the range implied by a WEP, the resulting probability is just as much of an estimate as the WEP is, due primarily to the lack of definite probabilities assigned to real-world events such as the ones we study. The analyst's skill, biases, and subjectivity are heavily influential at this stage of the analysis.

    2. I agree with both of you regarding how the weighting incorporates the analyst's biases, knowledge, and shortcomings. But isn't that what we do in our own analyses too: weighing the evidence and information from our point of view? Therefore, I think the real issue here isn't the mindset of the analyst; rather, it is the level of savviness of the analyst who applies Bayes to a specific problem. More precisely, without an adequate background in advanced math and statistics, one won't be able to apply the method simply by placing numbers into the formula. Statistics, I think, is the art of interpreting a situation and converting it into a set of meaningful numbers. Therefore, to judge the accuracy of someone's weighting of the evidence, we need that person to be skilled enough at Bayes.

  2. I agree with the last statement. One should have advanced math knowledge and a computational modeling background to conduct this method successfully.

  3. I found this interesting to read. I was wondering whether the article discusses or accounts for predicting the probability of given events, but not which events themselves will happen. As you mention, this method enables analysts to focus on one single piece of evidence, so there seems to be a flaw in that it does not weigh the pieces of evidence or events against each other, which could lead to further insights for analysts.

    1. Yes, Katie, you are right: the author asserts that the method enables an analyst to focus on one piece of evidence, and he doesn't mention whether contrary evidence must be taken into account too. However, I believe we must assume that analysts consider other evidence and contrary information as well while focusing on one piece of evidence.

  4. In your critique, you said Bayes gives more accurate estimates rather than a range. In my article, the author advocated for having a range instead of a precise number when addressing intelligence questions, because he argued Bayes should be used as a starting point for analysis and not as a conclusion. What do you think of this?