Stephen Dufrechou | December 19, 2010
A recent cognitive study, as reported by the Boston Globe, concluded that:
Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
In light of these findings, researchers concluded that a defense mechanism, which they labeled “backfire”, was preventing individuals from producing pure rational thought. The result is a self-delusion that appears so regularly in normal thinking that we fail to detect it in ourselves, and often in others: When faced with facts that do not fit seamlessly into our individual belief systems, our minds automatically reject (or backfire) the presented facts. The result of backfire is that we become even more entrenched in our beliefs, even if those beliefs are totally or partially false.
“The general idea is that it’s absolutely threatening to admit you’re wrong,” said Brendan Nyhan, the lead researcher of the Michigan study. The occurrence of backfire, he noted, is “a natural defense mechanism to avoid that cognitive dissonance.”
The conclusion made here is this: facts often do not determine our beliefs, but rather our beliefs (usually non-rational beliefs) determine the facts that we accept. As the Boston Globe article notes:
In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.
Despite this finding, Nyhan claims that the underlying cause of backfire is unclear. “It’s very much up in the air,” he says. And on how our society is going to counter this phenomenon, Nyhan is even less certain.
These unanswered questions are to be expected in any field of research, since every field has its own limitations. Yet here the field of psychoanalysis can help complete the picture.
Disavowal and Backfire: One and the Same
In a recent article, psychoanalyst Rex Butler independently comes to the same conclusion as the Michigan study’s researchers. Regarding facts and their relationship to belief systems (or ideologies), Butler says that:
there is no necessary relationship between reality and its symbolization … Our descriptions do not naturally and immutably refer to things, but … things in retrospect begin to resemble their description. Thus, in the analysis of ideology, it is not simply a matter of seeing which account of reality best matches the ‘facts’, with the one that is closest being the least biased and therefore the best. As soon as the facts are determined, we have already – whether we know it or not – made our choice; we are already within one ideological system or another. The real dispute has already taken place over what is to count as the facts, which facts are relevant, and so on.
This places the field of psychoanalysis on the same footing as cognitive science in this matter. But where cognitive studies end, with Nyhan’s question about the cause of backfire, psychoanalysis picks up and provides a possible answer. In fact, psychoanalysts have been publishing work on backfire for decades; psychoanalysis simply refers to backfire by another name: “disavowal”. Indeed, these two terms refer to one and the same phenomenon.
The basic explanation for the underlying cause of disavowal/backfire goes as follows.
“Liberals” and “conservatives” espouse antithetical belief systems, both of which rest on different non-rational “moral values.” This is a point that cognitive linguist George Lakoff has often discussed, and one that incidentally brings in yet another field of study supporting the existence of the disavowal/backfire mechanism.
In accordance with these different non-rational belief systems, an individual’s ideology also tends to function as a ‘filtering system’, accepting facts that fit seamlessly into the framework of that ideology, while dismissing facts that do not fit.
When an individual (whether a “liberal”, a “conservative”, or an adherent of any other ideology) is challenged with facts that conflict with his or her ideology, the tendency is for that individual to experience feelings of anxiety, dread, and frustration. This is because our ideologies function like a linchpin, holding our psychologies together in order to avoid, as Nyhan puts it, “cognitive dissonance”. In other words, when our linchpins are disturbed, our psychologies are shaken.
Psychoanalysts explain that, when this cognitive dissonance does occur, the result is to ‘externalize’ the sudden negative feelings outward, in the form of anger or resentment, and then to ‘project’ this anger onto the person who initially presented the backfired facts. (Although sometimes this anger is ‘introjected’ inward, in the form of self-punishment or self-loathing.)
This non-rational eruption of anger or resentment is what psychoanalysts call “de-sublimation”. And it is at this point of de-sublimation that the disavowal/backfire mechanism is triggered as a defense against the cognitive dissonance.
Hence, here is what mentally occurs next, in a matter of seconds:
In order to regain psychological equilibrium, the mind disavows the toxic facts that initially clashed with the individual’s own ideology, non-rationally deeming the facts to be false, without ever assessing their validity.
The final step occurs when the person who offered the toxic facts is then non-rationally demonized. That person becomes tainted as a ‘phobic object’ in the mind of the de-sublimated individual, and is thus perceived to be as toxic as the disavowed facts themselves.
At this point, ad hominem attacks are often fired at the source of the toxic facts: ‘stupid liberal’ or ‘stupid conservative’ in a political context, for example; or ‘blasphemer’ or ‘heretic’ in a religious context. Once this occurs, according to psychoanalysis, psychological equilibrium is regained, and the status quo of the individual’s ideology is reinforced to guard against future experiences of de-sublimation.
Why Do Different Ideologies Exist?
This all raises an obvious question about the differing ideologies people hold. Why do they exist? And how are they constituted differently? George Lakoff has demonstrated in his studies (which are strongly supported by psychoanalysis) that human beings are not born already believing an ideology. Rather, people are socialized into an ideology during their formative childhood years. The main agents who prescribe the ideology are the parental authority figures surrounding the child, who rear him from infantile dependency into independent adulthood. The parental values concerning how the child should become an independent and responsible adult, in his relations between self and others, later inform that child’s ideology as an adult.
Lakoff shows that two dominant parenting models exist, and that they can determine the child’s adult ideology. Individuals reared under the “Strict Father” model tend to grow up as political conservatives, while those raised under a “Nurturant Parent” model tend to become political liberals. His most influential book on these matters, “Moral Politics: How Liberals and Conservatives Think”, was first published in 1996.
Of course, people’s minds can fundamentally change, along with their ideological values. But short of a concerted effort by an individual to change, through one form of therapy or another, that change is mostly fostered by traumatic or long-endured life experiences.
Yet many minds remain rock solid for life, beliefs included. As psychiatrist Scott Peck sees it, “Only a relative and fortunate few continue until the moment of death exploring the mystery of reality, ever enlarging and refining and redefining their understanding of the world and what is true.”
Thus, to answer Nyhan’s question of how society can counter the negative effects of backfire, it seems only one answer is viable: society will need to adopt the truths uncovered by cognitive science and psychoanalysis, and use those truths to inform its overall cultural practices and values. Short of that, Peck’s “fortunate few” will remain the only individuals among us who resist self-delusion.
~ Stephen Dufrechou is Editor of Opinion and Analysis for News Junkie Post.