“Do You Really Need More Information? The US Intelligence Community invests heavily in improved intelligence collection systems while managers of analysis lament the comparatively small sums devoted to enhancing analytical resources, improving analytical methods, or gaining better understanding of the cognitive processes involved in making analytical judgments.”
Psychology of Intelligence Analysis — Richards J. Heuer
“We all tend to overlook evidence that contradicts our views. When confronted with new data, our pre-existing ideas can cause us to see structure that isn’t there. This is a form of confirmation bias, whereby we look for and recall information that fits with what we already think. It can be adaptive: humans need to be able to separate out important information and act quickly to get out of danger. But this filtering can lead to scientific error.”
How scientists can stop fooling themselves over statistics — Dorothy Bishop (Nature August 3 2020)
August 21 2017 — Accurate intelligence judgments do not rest solely on the abundance and accuracy of the information collected. Indeed, it has long been known that rigorous analysis of the information is at least as important as the gathered material for reaching accurate intelligence estimates.
RELATED POST: Disinformation — Who Coined That Word Anyway?
UPDATE (August 16 2020) — In a piece published in Nature this month, Dorothy Bishop reminds us that we all need to build lifelong habits to avoid being led astray by confirmation bias.
Observations that are contrary to our expectations need special attention. In 1876, Charles Darwin said that he made it a habit “whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once: for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favourable ones”.
I myself have experienced this. When writing literature reviews, I have been shocked to realize that I had completely forgotten to mention papers that run counter to my own instincts, even though the papers had no particular flaws. I now make an effort to list them.
We all find it difficult to see the flaws in our own work — it’s a normal part of human cognition. But by understanding these blind spots, we can avoid them.
When one is willing to follow this advice, reaching the TRUTH is only a matter of time, as the investigations of the Ustica Massacre and the Buenos Aires bombings demonstrate.
On the other hand, those who fail to take these well-understood flaws of human cognition into account will never come anywhere near the TRUTH. They will die believing in their idiotic conspiracy theories.
“US spies blame Iran for Lockerbie bomb” is a textbook example. Today, we KNOW that there is not a shred of evidence supporting this nonsense and yet some people seem to believe that fairy tale. Why on earth?
END of UPDATE
In the 1960s, CIA psychologists investigated the correlations between the amount of information available to experts, the accuracy of their judgments, and the experts’ confidence in the accuracy of those judgments. The results of these experiments are far-reaching.
In one of these experiments, experienced horse race handicappers were shown a long list of variables that included data on the horses’ recent performances, the weight of the jockeys, the time since the last race, the weather conditions, etc. Each handicapper was asked to rank these variables according to their perceived importance in making his predictions.
Next, the handicappers were shown real data that had been renamed to ensure that they could not remember the events. Each handicapper was given the five variables he had listed as the most useful and was asked to make a prediction as well as an estimate of its probability of being correct (from 0% to 100%). The same exercise was repeated after the handicappers were given 10, 20 and 40 variables.
The result of this particular experiment is abundantly clear. On the one hand, the accuracy of the predictions did not improve with additional information. As a matter of fact, several handicappers tended to become less accurate as more information became available to them.
On the other hand, their confidence in the accuracy of their predictions increased significantly as they were provided with more information. (See figure above.)
Note that only when provided with the lowest amount of information were the handicappers realistic about the probability of their predictions being correct.
Key findings from these experiments:
- Once an experienced analyst has the minimum information necessary to make an informed judgment, obtaining additional information generally does not improve the accuracy of his or her estimates.
- Additional information does, however, lead the analyst to become more confident in the judgment, to the point of overconfidence.
- Experienced analysts have an imperfect understanding of what information they actually use in making judgments.
- They are unaware of the extent to which their judgments are determined by a few dominant factors, rather than by the systematic integration of all available information.
- Analysts actually use much less of the available information than they think they do.
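The pattern behind these findings can be pictured with a small calibration table. The numbers below are illustrative only, not Heuer’s data; they simply encode the qualitative result: accuracy stays roughly flat as variables are added, while self-reported confidence climbs, so the overconfidence gap widens.

```python
# Toy illustration of the calibration gap (hypothetical numbers, not the
# figures from the CIA handicapper experiment).
variables  = [5, 10, 20, 40]            # amount of information given
accuracy   = [0.17, 0.17, 0.16, 0.17]   # fraction of correct predictions
confidence = [0.19, 0.25, 0.30, 0.34]   # handicappers' own probability estimates

for n, acc, conf in zip(variables, accuracy, confidence):
    gap = conf - acc  # positive gap = overconfidence
    print(f"{n:>2} variables: accuracy {acc:.0%}, confidence {conf:.0%}, "
          f"overconfidence {gap:+.0%}")
```

Only at the lowest information level are confidence and accuracy close, which is exactly the calibration point noted above.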
About Richards “Dick” J. Heuer
Richards “Dick” J. Heuer, Jr. is a 45-year veteran of the CIA, best known for his work on the analysis of competing hypotheses and his book Psychology of Intelligence Analysis.
The former provides a methodology for overcoming intelligence biases, while the latter outlines how mental models and natural biases impede clear thinking and analysis.
Throughout his career, he has worked in collection operations, counterintelligence, intelligence analysis and personnel security. In 2010 he co-authored a book with Randolph (Randy) H. Pherson titled Structured Analytic Techniques for Intelligence Analysis.
Richards Heuer is well-known for his analysis of the extremely controversial and disruptive case of Soviet KGB defector Yuri Nosenko, who was first judged to be part of a “master plot” for penetration of CIA but was later officially accepted as a legitimate defector. [Wikipedia]
About the Analysis of Competing Hypotheses
“Analysis of competing hypotheses (ACH) is an analytic process that identifies a complete set of alternative hypotheses, systematically evaluates data that is consistent and inconsistent with each hypothesis, and rejects hypotheses that contain too much inconsistent data.”
ACH is an eight-step process to enhance analysis:
- Identify all possible hypotheses
- Make a list of significant evidence and arguments
- Prepare a matrix to analyze the “diagnosticity” of evidence
- Draw tentative conclusions
- Refine the matrix
- Compare your personal conclusions about the relative likelihood of each hypothesis with the inconsistency scores
- Report your conclusions
- Identify indicators
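The core of these steps — the matrix, diagnosticity, and inconsistency scoring — can be sketched in a few lines of code. The hypotheses, evidence items, and ratings below are invented for illustration, and this toy version only counts inconsistent ratings; real ACH practice also weighs the credibility and diagnosticity of each piece of evidence.

```python
# Minimal sketch of an ACH inconsistency matrix (hypothetical example).
# Ratings: "C" = consistent, "I" = inconsistent, "N" = neutral / not applicable.
evidence = ["E1", "E2", "E3", "E4"]
matrix = {
    "H1: deliberate deception": ["C", "I", "I", "N"],
    "H2: honest mistake":       ["C", "C", "I", "C"],
    "H3: no event occurred":    ["I", "I", "I", "I"],
}

# ACH emphasizes disconfirmation: score each hypothesis by how much
# evidence is inconsistent with it, not by how much supports it.
scores = {h: ratings.count("I") for h, ratings in matrix.items()}

# Tentatively reject the hypotheses with the most inconsistent evidence;
# the least-inconsistent hypothesis survives for further refinement.
for h, s in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{h}: inconsistency score {s}")
best = min(scores, key=scores.get)
```

Note that evidence consistent with every hypothesis (such as E1 above for H1 and H2) has no diagnosticity: it cannot help discriminate between them, which is why the matrix step focuses attention on the inconsistent cells.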
Heuer originally developed ACH in 1984, during the Reagan administration, as the core element of an inter-agency deception analysis course that concentrated on Soviet deception regarding arms deals.
The Palo Alto Research Center (PARC), in conjunction with Heuer, developed the PARC ACH 2.0.5 software for use within the intelligence community in 2005.
This experiment is described in a book published on the CIA website:
Psychology of Intelligence Analysis — Richards J. Heuer Jr
Chapter V – Do You Really Need More Information?