By John Irvin, NOIR Team
What does a controversial psychology experiment conducted more than forty years ago on the causes of conflict between prisoners and prison guards have in common with Dr. David Charney’s proposal for a controversial new US government counterintelligence organization designed to reduce the threat of insider espionage? It isn’t just that both are controversial, or that insider spies face the very real prospect of learning firsthand what it’s like to be a prisoner.
Stanford University Professor Philip Zimbardo’s August 1971 experiment – in which he set up a mock prison in the basement of the Stanford Psychology Department building and filled it with 24 student volunteers, randomly assigned to play the role of prisoner or guard – is one of the most frequently cited studies in psychology. Praised, criticized, impossible to replicate exactly, and taught in virtually every college Psych 101 course (not to mention being the subject of an upcoming film, due for release on 17 July 2015), the Stanford Prison Experiment (SPE) suggested the dominance of institutional (i.e., environment, situation, “nurture”) factors over dispositional (i.e., personality, genetics, “nature”) factors in determining human behavior.
Part of the controversy that has accompanied the SPE from the beginning is the disconcerting conclusion Zimbardo draws that, under the right set of circumstances, men and women whom we would unquestionably consider decent, moral human beings can engage in shockingly immoral, harmful behaviors. In fact, his 2007 book describing in detail not only the SPE, but also his work with one of the defendants on trial for abuse of Iraqi prisoners at the Abu Ghraib prison, is subtitled “Understanding How Good People Turn Evil.”
A key justification for establishing a National Office for Intelligence Reconciliation (NOIR) to reduce the threat of insider espionage by applying Dr. Charney’s theory of the true psychology of the insider spy is the proposition that an otherwise trustworthy member of the US intelligence community can make the seemingly inexplicable transformation into an insider spy as a result of what Dr. Charney calls a psychological perfect storm. Not only can this happen; it has happened in the past and will certainly continue to happen, often resulting in catastrophic damage to US national security.
The so-called nature versus nurture debate – the question of whether an individual’s behavior is more a product of his or her biology or of the external circumstances that he or she is subject to – is one of the oldest in psychology.
Theories run the gamut from the “blank slate” position that human behavioral traits are almost exclusively a result of environmental influences, to the “nativist” position that the characteristics of the human species as a whole are a product of evolution and individual differences are due to each person’s unique genetic code. Both tend to be extreme positions and most researchers and practitioners operate with the understanding that both genetic and environmental factors influence individual human behavior, with any disagreement centering more on the degree to which one or the other dominates.
This debate, however, is more than just a philosophical or clinical one. The question of whether an individual’s behavior is a result of being innately good/bad, moral/immoral, trustworthy/untrustworthy has serious law enforcement and counterintelligence (CI) consequences. In the realm of CI, this dualistic, black and white view of human behavior would suggest that preventing espionage is mainly a matter of hiring “good” people and screening out “bad” ones. If morality is innate, then trustworthy employees will remain trustworthy almost regardless of the situation. It presupposes that on the rare occasion when a “good” employee goes “bad,” it was more likely the result of inadequate screening or deception rather than situational or institutional issues.
In this deterministic mindset, good people can be relied upon not to choose to do evil; it just isn’t in their nature to harm others, to behave deceitfully, to betray their friends, family, organization, or country. If they do cross over the line, the common sense conclusion is that they were never really that good in the first place. There must have been an ingrained personality flaw that was somehow overlooked during their initial screening or later on by their managers and co-workers. Good people just don’t turn bad.
Under this preconceived notion, security is a matter of ensuring bad apples are quite simply never allowed into the barrel. Obviously, this is an overly simplistic view of human psychology. Nevertheless, it remains a surprisingly pervasive belief, perhaps because it is so simple. It is the basis for the false perception that once an individual is deemed trustworthy, once they are granted security clearances, receive an appropriate salary, and become part of the team, the employer’s worries are over. This is the dispositional view of human behavior gone too far and it can lead to an insidious complacency.
A corollary to the dispositional view is the perhaps cynical opinion that no one is intrinsically “good” and every employee, regardless of vetting and performance, must be watched at all times. It is a zero-defect, trust-no-one attitude. This may sound like the situational view (that even good people can, under the right circumstances, turn bad), but differs in that situational factors are almost ignored. Regardless of situational or institutional factors, constant monitoring of all employees is seen as the only practical solution to the threat of bad behavior.
This view fails to acknowledge that most employees are trustworthy and only a small fraction of them will, under very specific and difficult to foresee circumstances, become untrustworthy. In practice, constant monitoring of all employees has a deleterious effect on workforce morale and individual worker loyalty to the employer. This “everyone is suspect” approach can actually contribute to the sort of unhealthy work environment that may encourage insider espionage. Unfortunately, it also seems to be the basis for current technological approaches toward dealing with the insider threat.
If, as Zimbardo and other researchers suggest, situational factors can have such a profound effect on an individual as to make a “good” person behave in a decidedly “bad” manner under precisely the right set of circumstances, then no effort to screen out the “bad apples” can ever be completely effective. Thorough screening remains absolutely necessary, since it is very effective at keeping out individuals who are clearly unsuited to serve in a position of trust. The problem is that, despite the best screening, some potential insider spies will always get through, obtain clearances, and serve in positions of trust with access to classified information. They will turn to espionage despite the best monitoring. The system fails because at the time of screening, and throughout months or years of service, they are demonstrably not “bad” people.
Instead, the potential insider spy is more likely to be quite justifiably viewed by his or her peers and employer as among the most trustworthy individuals, having been thoroughly screened and accepted into the fold as a member of a group that takes pride in considering themselves the “good guys.” The potential insider spy isn’t a “bad apple” and may never become one during the course of a lengthy career. It is only when circumstances arise in just the right manner, at just the right time, and are perceived by the employee in just the right way, that the trustworthy employee contemplates betrayal of that trust.
Dr. Charney describes the core psychology of the insider spy as an intolerable sense of personal failure, as privately defined by that person. How does an organization screen for a subjective mental state that may, in fact, never arise? How does an organization identify the situational factors that may cause a particular individual to go from “good” to “bad” when those factors are unique to the individual? How many people who (with real or imagined justification) consider themselves “good” would, if asked, even harbor the thought that under certain circumstances they might turn “bad,” much less identify those circumstances?
The obvious answer is that anticipating and preventing the unique circumstances that might cause that transformation is nearly impossible. Screening is necessary but inadequate. Monitoring of employee behavior is a double-edged sword – necessary, but if excessive it is likely to encourage just the sort of behavior it is intended to prevent. Efforts at prevention are absolutely necessary, but inevitably some individual will make it past the barriers, avoid detection, and turn traitor. While (fortunately) a statistically small number, some good apples will inevitably go bad.
Dr. Charney’s true psychology of the insider spy builds on the work of Zimbardo and others, takes the lessons of the SPE, and provides a practical solution to the reality that under certain circumstances good people can and will do very bad things. The result is the National Office for Intelligence Reconciliation (NOIR), a proposed organization that addresses the reality that efforts at prevention will never be completely reliable.
When the inevitable happens and a trusted employee turns insider spy, NOIR employs an understanding of the mindset of the insider spy in order to quickly and effectively end the espionage and mitigate the damage already done. Unhindered by outdated black-and-white concepts of “good guys” and “bad guys,” it stands ready for the day when a trusted IC employee, like one of Zimbardo’s Stanford Prison Experiment guards, finds him- or herself in the sort of subjective, often unforeseeable set of circumstances that can bring out the worst in the best of us.
Zimbardo, P.G. (2007). The Lucifer Effect: Understanding How Good People Turn Evil. New York: Random House.
Weckert, J., et al. (2005). Electronic Monitoring in the Workplace: Controversies and Solutions. Hershey, PA: Idea Group Publishing.
In The Lucifer Effect, Zimbardo cites two studies in particular: Yale University psychologist Stanley Milgram’s famous 1961 experiment, in which volunteers were asked to give increasingly painful electric shocks to total strangers (see Milgram, S. (1974). Obedience to Authority: An Experimental View. London: Tavistock Publications), and Palo Alto, California, high school history teacher Ron Jones’ 1967 classroom experiment, in which he created a social movement, “The Third Wave,” in order to explain how the German populace could accept the actions of the Nazi regime (see http://www.telegraph.co.uk/culture/film/3559727/The-Wave-the-experiment-that-turned-a-school-into-a-police-state.html).