
Monday, July 21, 2014

Humans Already Use Way, Way More Than 10 Percent of Their Brains


From The Atlantic, Sam McDougle dispels the silly myth that we only use 10% of our brains. Oddly, the first time I saw this myth debunked was in Omni Magazine back in the 1980s.

The impetus for this latest debunking is the new Luc Besson film Lucy, starring Scarlett Johansson (trailer below), which begins with a neuroscientist (Morgan Freeman) delivering the following lines in a lecture:
“It is estimated most human beings only use 10 percent of the brain’s capacity. Imagine if we could access 100 percent. Interesting things begin to happen.”
We can and do access 100% of our brains (barring organic damage), and still, very few people are doing interesting things.

Humans Already Use Way, Way More Than 10 Percent of Their Brains

It’s a complex, constantly multi-tasking network of tissue—but the myth persists.

Sam McDougle | Jul 16 2014


Chris Helgren/Reuters

By now, perhaps you’ve seen the trailer for the new sci-fi thriller Lucy. It starts with a flurry of stylized special effects and Scarlett Johansson serving up a barrage of bad-guy beatings. Then comes Morgan Freeman, playing a professorial neuroscientist with the obligatory brown blazer, to deliver the film’s familiar premise to a full lecture hall: “It is estimated most human beings only use 10 percent of the brain’s capacity. Imagine if we could access 100 percent. Interesting things begin to happen.”

Johansson as Lucy, who has been kidnapped and implanted with mysterious drugs, becomes a test case for those interesting things, which seem to include even more impressive beatings and apparently some kind of Matrix-esque time-warping skills.

Of course, the idea that “you only use 10 percent of your brain” is, indeed, 100 percent bogus. Why has this myth persisted for so long, and when is it finally going to die?



Unfortunately, not any time soon. A survey last year by The Michael J. Fox Foundation for Parkinson's Research found that 65 percent of Americans believe the myth is true, 5 percent more than those who believe in evolution. Even Mythbusters, which declared the statistic a myth a few years ago, further muddied the waters: The show merely increased the erroneous 10 percent figure and implied, incorrectly, that people use 35 percent of their brains.

Like most legends, the origin of this fiction is unclear, though there are some clues. According to Sam Wang, a neuroscientist at Princeton and the author of Welcome to Your Brain, the catalyst may have been the self-help industry. In the early 1900s, William James, one of the most influential thinkers in modern psychology, famously said that humans have unused mental potential. This completely reasonable assertion was later revived, in mangled form, by the writer Lowell Thomas in his foreword to the 1936 self-help bible How To Win Friends And Influence People. “Professor William James of Harvard used to say that the average person develops only 10 percent of his latent mental ability,” Thomas wrote. It appears that he, or perhaps someone else in his day, simply plucked the golden number out of the sky.

The 10 percent claim is demonstrably false on a number of levels. First, the entire brain is active all the time. The brain is an organ. Its living neurons, and the cells that support them, are always doing something. (Where’s the “you only use 10 percent of your spleen” myth?) Joe LeDoux, a professor of neuroscience and psychology at NYU, thinks that people today may be thrown off by the “blobs”—the dispersed markers of high brain activity—seen in functional magnetic resonance imaging (fMRI) of the human brain. These blobs are often what people are talking about when they refer to the brain “lighting up.”

Say you’re watching a movie in an fMRI scanner. Certain areas of your brain—the auditory and visual cortices, for instance—will be significantly more active than others; and that activity will show up as colored splotches when the fMRI images are later analyzed. These blobs of significant activity usually cover small portions of the brain image, often less than 10 percent, which could make it seem, to the casual observer, that the rest of the brain is idling. But, as LeDoux put it to me in an email, “the brain could be one hundred percent active during a task with only a small percentage of brain activity unique to the task.” This kind of imaging highlights big differences in regional brain activity, not everything the brain is doing.

In fact, the entire premise of only “using” a certain proportion of your brain is misguided. When your brain works on a problem—turning light that hits your retina into an image, or preparing to reach for a pint of beer, or solving an algebra problem—its effectiveness is as much a question of “where” and “when” as it is of “how much.” Certain regions of the brain are more specialized than others to deal with certain tasks, and most behavior depends on tight temporal coordination between those regions. Your visual system helps you locate that pint of beer, and your motor system gets your hand around it. The idea that swaths of the brain are stagnant pudding while one section does all the work is silly. The brain is a complex, constantly multi-tasking network of tissue.

Still, the appeal of the myth is clear. If we only use 10 percent of our brains, imagine how totally great life would be if we could use more. You could dazzle Grandma and her nursing-home crew during Jeopardy. Or, like Lucy, you could learn Chinese calligraphy in an hour. The 10-percent myth presses the same buttons as any self-help scheme that promises to make us better, faster. As Wang told me, it’s “like the 4-hour workweek guy.”

And that’s why the 10-percent myth, compared with other fantasies, is especially pernicious. It has a distinct air of scientific plausibility—it’s a zippy one-liner with a nice round number, a virus with obvious vectors in pop-psychology books, easy to repeat at cocktail parties. The myth is also part of a larger way of thinking about the brain that is characterized by misleading simplifications—like the notion that the right side of the brain is creative and the left side rational. “Those kinds of ideas self-perpetuate,” LeDoux told me. “It's like saying dopamine is responsible for pleasure and the amygdala makes fear. Both are wrong.”

Neuroscience is still in its adolescence, but it is all too often served to the public as a more mature field. As psychologist Gary Marcus recently argued, biology—including neuroscience—is not like physics, where scientists build on a bedrock of established formal laws. As of yet, neuroscience has no such foundation, which may increase the likelihood that sweeping myths about the brain endure. Neuroscience and psychology are the rare scientific fields that, for many, have tangible personal value—there is no self-help industry based on the therapeutic effects of the Higgs boson—which means slow, careful progress in research often gets juiced up by opportunist, or at least over-eager, non-scientists.

I’m still going to see Lucy, if only because it’s directed by Luc Besson and The Fifth Element blew my 10-year-old mind when it came out. But here’s to hoping for Hollywood sci-fi movies that lean more on the weird and imaginative and less on the widely believed and dead wrong.
