The September 11th attacks should be viewed as one defensive failure in a series of otherwise foiled attempts. In risk management, only failures of defense are broadcast, while successes often go unnoticed by the public. The attacks succeeded largely because of a lack of information sharing. Even with the little information that was shared, intelligence professionals were unable to measure the value of particular pieces of it. In other words, they did not effectively find signals in noise.
According to Paté-Cornell, Bayesian models provide an efficient means of finding signals in voluminous amounts of noisy information. A Bayesian model calculates the probability of an event given a new signal and a baseline probability, that is, the probability prior to the new signal. A basic Bayesian model should contain a prior probability estimate, a probability estimate given a new piece of evidence, and an assessment of the relevance of that new piece of evidence.
Paté-Cornell takes the Bayesian model for intelligence fusion one step further. She introduces additional steps in the model to account for the possibility of a false positive or a false negative given a new piece of evidence (or a missed one). This approach is standard in engineering risk analysis, but it is logically applicable to intelligence work as well. The generalized formula is shown below.
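Paté-Cornell's full fusion model is richer than can be reproduced here, but the core move of conditioning on imperfect signals can be sketched in a few lines. The function names and the example rates below are illustrative assumptions, not notation taken from the paper:

```python
def posterior_given_signal(prior, false_negative_rate, false_positive_rate):
    """Updated probability of the event when the signal IS observed.

    P(signal | event)    = 1 - false_negative_rate
    P(signal | no event) = false_positive_rate
    """
    hit = (1 - false_negative_rate) * prior
    false_alarm = false_positive_rate * (1 - prior)
    return hit / (hit + false_alarm)


def posterior_given_no_signal(prior, false_negative_rate, false_positive_rate):
    """Updated probability of the event when the signal is NOT observed
    (the evidence may simply have been missed)."""
    miss = false_negative_rate * prior
    correct_rejection = (1 - false_positive_rate) * (1 - prior)
    return miss / (miss + correct_rejection)


# Illustrative numbers: with a 1% prior, a 10% false-negative rate, and a
# 5% false-positive rate, an observed signal raises the estimate to about
# 15%, while a missed signal lowers it only slightly below 1%.
```

The two branches make the analyst's burden concrete: every new piece of evidence requires an estimate of how often the source fires falsely and how often it stays silent when it should not.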
Unfortunately for Paté-Cornell, not all intelligence analysts are fans of math. The extra requirements of estimating false-positive and false-negative rates, while logically sound, are difficult to meet. Intelligence professionals arguably operate in a world of unknown unknowns more than those in any other profession. Quadrant Crunching™ provides a qualitative and visual alternative for those who opt not to stress over the numbers. That said, incorporating uncertainty into the equation is highly commendable.
A simpler, more user-friendly Bayesian model is the one given by Nate Silver in his book, The Signal and the Noise: Why So Many Predictions Fail – but Some Don’t. Silver’s version of Bayes contains the three necessary known variables but includes only one unknown, the term dealing with the new signal. The way it is structured, the prior probability, or baseline, is resilient in the face of new information.
Here “x” is the prior probability, “y” is the probability of observing the signal if the event is occurring, and “z” is the probability of observing the signal if it is unrelated to that same event. This formula is much simpler and, according to Silver, can lead to “vast predictive insights.”
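With those three variables, Silver's update is the standard Bayes formula, xy / (xy + z(1 − x)). A minimal sketch (the function name and example values are mine, not Silver's):

```python
def silver_bayes(x, y, z):
    """Posterior probability of the event after observing the signal.

    x: prior probability of the event (the baseline)
    y: probability of the signal if the event is occurring
    z: probability of the signal if the event is not occurring
    """
    return (x * y) / (x * y + z * (1 - x))


# The baseline anchors the estimate: with x = 0.04, y = 0.5, z = 0.05,
# the posterior is roughly 0.29 -- the signal moves the needle, but a
# small prior keeps the updated estimate well below certainty.
```

Note that only "z" asks the analyst to reason about the signal being spurious; there is no separate false-negative machinery, which is what makes this version easier to use than the fuller engineering treatment.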
Paté-Cornell, E. (2002). Fusion of intelligence information: A Bayesian approach. Risk Analysis, 22(3), 445–454.