Friday, October 3, 2014

Fusion of intelligence information: A Bayesian approach

The September 11th attacks should be viewed as one defensive failure in a series of otherwise foiled attempts.  In risk management, only defensive failures are broadcast, while successes often go unnoticed by the public.  The attacks succeeded in part because of a lack of information sharing.  As for the little information that was shared, intelligence professionals were unable to measure the value of particular pieces of it.  In other words, they did not effectively find signals in the noise.

According to Paté-Cornell, Bayesian models provide an efficient means of finding signals in voluminous amounts of noisy information.  Bayesian models calculate the probability of an event given a new signal and a baseline probability, the prior, that held before the new signal arrived.  A basic Bayesian model should contain a prior probability estimate, a probability estimate given a new piece of evidence, and an assessment of the relevance of that new piece of evidence.
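As a sketch (my notation, not necessarily the paper's), the basic update combines those three ingredients as:

```latex
% E: the event of interest; S: the new signal.
% p(E)       -- prior probability of the event
% p(S \mid E) -- probability of the signal given the event
% p(S)       -- overall probability of observing the signal
p(E \mid S) = \frac{p(S \mid E)\, p(E)}{p(S)}
```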

Paté-Cornell takes the Bayesian model for intelligence fusion one step further.  She introduces additional steps into the model to account for the possibility of a false positive or a false negative given a new piece of evidence (or missed evidence).  This approach is normally used in engineering risk analysis, but it is logically applicable to intelligence work.  The generalized formula is shown below.
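Assuming the standard engineering-risk form of the update (a reconstruction; the paper's exact notation may differ), the denominator is expanded so the false-positive term appears explicitly:

```latex
% E: the event; \bar{E}: no event; S: the observed signal.
% p(S \mid E)       = 1 - p(false negative)
% p(S \mid \bar{E}) = p(false positive)
p(E \mid S) = \frac{p(S \mid E)\, p(E)}
                   {p(S \mid E)\, p(E) + p(S \mid \bar{E})\, p(\bar{E})}
```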

Unfortunately for Paté-Cornell, not all intelligence analysts are math fans.  The extra requirements of estimating false-positive and false-negative rates, while logically sound, are difficult to meet.  Intelligence professionals operate in a world of unknown unknowns arguably more than any other profession.  Quadrant Crunching™ provides a qualitative, visual alternative for those who opt not to stress over numbers.  With that being said, incorporating uncertainty into the equation is highly commendable.

A simpler, more user-friendly Bayesian model is the one given by Nate Silver in his book, The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t.  Silver’s version of Bayes contains the three necessary known variables and only one unknown, the one dealing with the new signal.  The way it is structured, the prior probability, or baseline, is resilient in the face of new information.

Here, “x” is the prior probability, “y” is the probability of observing the signal if the event is occurring, and “z” is the probability of observing the same signal if the event is not occurring.  This formula is much simpler and, according to Silver, can lead to “vast predictive insights.”
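As a minimal sketch of Silver's three-variable formulation (the input numbers below are illustrative, not from the book):

```python
def posterior(x, y, z):
    """Silver's formulation of Bayes' theorem.

    x: prior probability of the event
    y: probability of the signal if the event is occurring
    z: probability of the signal if the event is not occurring
    """
    return (x * y) / (x * y + z * (1 - x))

# Illustrative numbers: 5% prior, signal 80% likely under the
# event, 10% likely otherwise.
print(round(posterior(0.05, 0.80, 0.10), 3))  # → 0.296
```

Note that with a small prior (x = 0.05), even a fairly diagnostic signal only moves the posterior to about 30%, which is the sense in which the baseline is "resilient" to new information.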

Paté-Cornell, E. (2002). Fusion of intelligence information: A Bayesian approach. Risk Analysis, 22(3), 445–454.


  1. Kyle,

    I noticed that several articles reviewed this week make references to using Bayesian methods in teams, instead of by individuals as in the article I reviewed.

    I can envision some mayhem in a group context when deciding how to account for false positives and false negatives as prescribed by Paté-Cornell.

    Do you think that the more user-friendly model by Nate Silver is more conducive to team forecasts due to its relative simplicity?

  2. A Bayesian model, whether it's Paté-Cornell's or Silver's, can use group averages in its estimates. Each individual produces their own estimates, and then all of the individual estimates are averaged into a separate Bayesian model. Aggregated estimates are offered all the time.

    Regardless of which model is used, I suspect that simultaneous group work on one Bayesian model would be difficult in too large a group.
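    For instance, a quick sketch of the averaging approach, assuming Silver's three-variable form and purely illustrative numbers for three analysts:

    ```python
    def posterior(x, y, z):
        # Silver's formulation: prior x, signal probability y if the
        # event is occurring, z if it is not.
        return (x * y) / (x * y + z * (1 - x))

    # Each analyst supplies their own (x, y, z) estimates.
    estimates = [(0.05, 0.80, 0.10), (0.10, 0.70, 0.20), (0.02, 0.90, 0.05)]

    # Average the individual posteriors into one group forecast,
    # rather than having the group work one model simultaneously.
    individual = [posterior(x, y, z) for x, y, z in estimates]
    group_forecast = sum(individual) / len(individual)
    ```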