Monday, June 30, 2008
The International Herald Tribune has a piece today entitled "Your brain lies to you", about how the brain processes information. Often this is incorrect information absorbed so long ago that one doesn't remember where it came from, so the brain retains it as bare information without context. That's what makes it easy for false information to slip through the cracks and become permanent 'knowledge', which then gives you a false view of the world. Here's the article's explanation of how it happens:
The brain does not simply gather and stockpile information as a computer's hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man's curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don't remember how you learned it.
This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true. With time, this misremembering gets worse. A false statement from a noncredible source that is at first not believed can gain credibility during the months it takes to reprocess memories from short-term hippocampal storage to longer-term cortical storage.
It then goes into greater detail on how this can apply to everyday life, and politics:
Journalists and campaign workers may think they are acting to counter misinformation by pointing out that it is not true. But by repeating a false rumor, they may inadvertently make it stronger. In its concerted effort to "stop the smears," the Obama campaign may want to keep this in mind. Rather than emphasize that Obama is not a Muslim, for instance, it may be more effective to stress that he embraced Christianity as a young man.
Consumers of news, for their part, are prone to selectively accept and remember statements that reinforce beliefs they already hold. In a replication of the study of students' impressions of evidence about the death penalty, researchers found that even when subjects were given a specific instruction to be objective, they were still inclined to reject evidence that disagreed with their beliefs.
It seems to me that in general this is a useful function of the brain, because when I read a newspaper I don't want to be weighed down with where I learned the letter l, then the letter e, then the letter t, then the first time I learned how to spell the whole word properly, and so on. The brain seems geared towards taking useful information, removing the context (trusting that it made the right decision when it first absorbed it), and putting it to better use, in the same way that learning letters leads to learning words, then how a sentence works, then reading overall, then appreciating literature, and whatever else comes after that if you take the time to train yourself in that way.
So what's a good way to counteract this? Interaction with others, I think, particularly in places like Wikipedia. When you contribute to a Wikipedia article and the information is false, there's a good chance that someone will come along and erase it, and/or ask for a source. Setting out with a self-confident "I heard it somewhere, I'll just find the source and restore the edit", you then discover that the information was actually false in the first place, and gain a greater understanding of the world through this. The Straight Dope is also a good column for challenging oneself on things one has heard before and come to believe, but that aren't based in fact.
Any other suggestions?