As intelligence analysts, our analysis of data and information is shaped by prior knowledge, including our thoughts, perceptions, and biases. Members of the intelligence community share a common knowledge, as well as a personal knowledge, which we use when organizing and categorizing data and information. The more closely our organization and categorization relate to that shared common knowledge, the easier retrieval becomes.
Analysis is not always easy. Cognitive flaws, or "brain bugs," affect our thinking and arise from, among other factors, differences in the biases and misperceptions of individual analysts (Buonomano, 2011). For example, hundreds of millions of dollars are spent annually to predict terrorist attacks, yet before the September 11 attacks, the vast majority of defense planning focused on protection against biological and chemical threats (Warrick & Stephens, 2001). The possibility of using an airplane to bomb New York and the Pentagon was suggested by a Pentagon panel, but the idea was never published: officials feared its release would inspire terrorists, and they also regarded such an attack as too radical (Warrick & Stephens, 2001). Wald (2001) noted several incidents during the 1990s involving terrorists' use of jetliners, but aviation and intelligence officials did not extrapolate from these events.
Critical thinking helps us identify and transcend cognitive biases by grounding our analysis in logic, reason, and empiricism. Two aspects of critical thinking are prediction and diagnosis, which can be regarded as two sides of the same coin. We teach these concepts to our students in the School of Security and Global Studies at American Military University so that they can develop their analyses without prior bias. Prediction requires thinking forward from cause to outcome, such as predicting that a person from a given country poses a threat to commit a terrorist act. Diagnosis requires thinking backward from the act to its cause, such as diagnosing the basis or reason for the act (Fernbach, Darlow, & Sloman, 2010). Analysis requires critical thinking, considering potential problems both defensively and offensively.
By Valerie Davis
Faculty Member, Intelligence Studies at American Military University
Buonomano, D. (2011). Brain bugs: How the brain’s flaws shape our lives. New York: W.W. Norton.
Fernbach, P. M., Darlow, A., & Sloman, S. A. (2010). Neglect of alternative causes in predictive but not diagnostic reasoning. Psychological Science, 21(3), 329-336. doi:10.1177/0956797610361430
Wald, M. (2001, October 3). Earlier hijackings offered signals that were missed. New York Times.
Warrick, J., & Stephens, J. (2001, October 2). Before attack, U.S. expected different hit; chemical, germ agents focus of preparations. Washington Post, p. A01.