Here are a few of my favorite people and their responses.
James Flynn has defined "shorthand abstractions" (or "SHA's") as concepts drawn from science that have become part of the language and make people smarter by providing widely applicable templates ("market," "placebo," "random sample," and "naturalistic fallacy" are a few of his examples). His idea is that the abstraction is available as a single cognitive chunk which can be used as an element in thinking and debate.

The Edge Question 2011
WHAT SCIENTIFIC CONCEPT WOULD IMPROVE EVERYBODY'S COGNITIVE TOOLKIT?
The term "scientific" is to be understood in a broad sense as the most reliable way of gaining knowledge about anything, whether it be the human spirit, the role of great people in history, or the structure of DNA. A "scientific concept" may come from philosophy, logic, economics, jurisprudence, or other analytic enterprises, as long as it is a rigorous conceptual tool that may be summed up succinctly (or "in a phrase") but has broad application to understanding the world.
[Thanks to Steven Pinker for suggesting this year's Edge Question and to Daniel Kahneman for advice on its presentation.]
164 CONTRIBUTORS (113,500 words): Daniel Kahneman, Richard Dawkins, V.S. Ramachandran, Richard Thaler, Brian Eno, J. Craig Venter, Martin Rees, Mahzarin Banaji, Stefano Boeri, Nigel Goldenfeld, Dimitar Sasselov, Gary Marcus, Eric Weinstein, Neri Oxman, David Pizarro, Andrew Revkin, Stuart Firestein, Beatrice Golomb, Diane Halpern, Kevin Hand, Barry Smith, Garrett Lisi, David Dalrymple, Xeni Jardin, Seth Lloyd, Brian Knutson, Carl Page, Victoria Stodden, David Rowan, Hazel Rose Markus & Alana Conner, Fiery Cushman, David Eagleman, Joan Chiao, Max Tegmark, Tecumseh Fitch, Joshua Greene, Stephon Alexander, Gregory Cochran, Tor Norretranders, Laurence Smith, Carl Zimmer, Roger Highfield, Marcelo Gleiser, Richard Saul Wurman, Anthony Aguirre, Sam Harris, P.Z. Myers, Sue Blackmore, Bart Kosko, David Buss, John Tooby, Eduardo Salcedo-Albaran, Paul Bloom, Evgeny Morozov, Mark Pagel, Kathryn Schulz, Ernst Pöppel, Tania Lombrozo, Paul Saffo, Jay Rosen, Timothy Taylor, Jonah Lehrer, Marco Iacoboni, Dave Winer, George Church, Kai Krause, Gloria Origgi, Tom Standage, Vinod Khosla, Dan Sperber, Geoffrey Miller, Satyajit Das, Alun Anderson, Eric Topol, Amanda Gefter, Scott D. Sampson, John McWhorter, Jon Kleinberg, Christine Finn, Nick Bostrom, Robert Sapolsky, Adam Alter, Ross Anderson, Paul Kedrosky, Mark Henderson, Thomas A. Bass, Gerald Smallberg, James Croak, Greg Paul, Susan Fiske, Marti Hearst, Keith Devlin, Gerd Gigerenzer, Matt Ridley, Andrian Kreye, Don Tapscott, David Gelernter, Linda Stone, Matthew Ritchie, Joel Gold, Helen Fisher, Giulio Boccaletti, Daniel Goleman, Donald Hoffman, Richard Foreman, Lee Smolin, Thomas Metzinger, Lawrence Krauss, William Calvin, Nicholas Christakis, Alison Gopnik, Kevin Kelly, Clay Shirky, Andy Clark, Neil Gershenfeld, Jonathan Haidt, Marcel Kinsbourne, Douglas Rushkoff, Lisa Randall, Frank Wilczek, Jaron Lanier, Jennifer Jacquet, Daniel Dennett, Stephen M. Kosslyn, Carlo Rovelli, Juan Enriquez, Terrence Sejnowski, Irene Pepperberg, Michael Shermer, Samuel Arbesman, Douglas Kenrick, James O'Donnell, David G. Myers, Rob Kurzban, Richard Nisbett, Samuel Barondes, Hans Ulrich Obrist, Nicholas Carr, Emanuel Derman, Aubrey De Grey, Nassim Taleb, Rebecca Goldstein, Clifford Pickover, Charles Seife, Rudy Rucker, Sean Carroll, Gino Segre, Jason Zweig, Dylan Evans, Steven Pinker, Martin Seligman, Gerald Holton, Robert Provine, Roger Schank, George Dyson, Milford Wolpoff, George Lakoff, Nicholas Humphrey, Christian Keysers, Haim Harari, W. Daniel Hillis, John Allen Paulos, Bruce Hood, Howard Gardner
SAM HARRIS
Neuroscientist; Chairman, Project Reason; Author, The Moral Landscape
We Are Lost in Thought
I invite you to pay attention to anything — the sight of this text, the sensation of breathing, the feeling of your body resting against your chair — for a mere sixty seconds without getting distracted by discursive thought. It sounds simple enough: Just pay attention. The truth, however, is that you will find the task impossible. If the lives of your children depended on it, you could not focus on anything — even the feeling of a knife at your throat — for more than a few seconds, before your awareness would be submerged again by the flow of thought. This forced plunge into unreality is a problem. In fact, it is the problem from which every other problem in human life appears to be made.
I am by no means denying the importance of thinking. Linguistic thought is indispensable to us. It is the basis for planning, explicit learning, moral reasoning, and many other capacities that make us human. Thinking is the substance of every social relationship and cultural institution we have. It is also the foundation of science. But our habitual identification with the flow of thought — that is, our failure to recognize thoughts as thoughts, as transient appearances in consciousness — is a primary source of human suffering and confusion.
Our relationship to our own thinking is strange to the point of paradox, in fact. When we see a person walking down the street talking to himself, we generally assume that he is mentally ill. But we all talk to ourselves continuously — we just have the good sense to keep our mouths shut. Our lives in the present can scarcely be glimpsed through the veil of our discursivity: We tell ourselves what just happened, what almost happened, what should have happened, and what might yet happen. We ceaselessly reiterate our hopes and fears about the future. Rather than simply exist as ourselves, we seem to presume a relationship with ourselves. It's as though we are having a conversation with an imaginary friend possessed of infinite patience. Who are we talking to?
While most of us go through life feeling that we are the thinker of our thoughts and the experiencer of our experience, from the perspective of science we know that this is a distorted view. There is no discrete self or ego lurking like a minotaur in the labyrinth of the brain. There is no region of cortex or pathway of neural processing that occupies a privileged position with respect to our personhood. There is no unchanging "center of narrative gravity" (to use Daniel Dennett's phrase). In subjective terms, however, there seems to be one — to most of us, most of the time.
Our contemplative traditions (Hindu, Buddhist, Christian, Muslim, Jewish, etc.) also suggest, to varying degrees and with greater or lesser precision, that we live in the grip of a cognitive illusion. But the alternative to our captivity is almost always viewed through the lens of religious dogma. A Christian will recite the Lord's Prayer continuously over a weekend, experience a profound sense of clarity and peace, and judge this mental state to be fully corroborative of the doctrine of Christianity; a Hindu will spend an evening singing devotional songs to Krishna, feel suddenly free of his conventional sense of self, and conclude that his chosen deity has showered him with grace; a Sufi will spend hours whirling in circles, pierce the veil of thought for a time, and believe that he has established a direct connection to Allah.
The universality of these phenomena refutes the sectarian claims of any one religion. And, given that contemplatives generally present their experiences of self-transcendence as inseparable from their associated theology, mythology, and metaphysics, it is no surprise that scientists and nonbelievers tend to view their reports as the product of disordered minds, or as exaggerated accounts of far more common mental states — like scientific awe, aesthetic enjoyment, artistic inspiration, etc.
Our religions are clearly false, even if certain classically religious experiences are worth having. If we want to actually understand the mind, and overcome some of the most dangerous and enduring sources of conflict in our world, we must begin thinking about the full spectrum of human experience in the context of science.
But we must first realize that we are lost in thought.
* * * * *
JONAH LEHRER
Contributing Editor, Wired; Author, How We Decide
Control Your Spotlight
In the late 1960s, the psychologist Walter Mischel began a simple experiment with four-year-old children. He invited the kids into a tiny room containing a desk and a chair, and asked them to pick a treat from a tray of marshmallows, cookies, and pretzel sticks. Mischel then made the four-year-olds an offer: they could either eat one treat right away or, if they were willing to wait while he stepped out for a few minutes, they could have two treats when he returned. Not surprisingly, nearly every kid chose to wait.
At the time, psychologists assumed that the ability to delay gratification — to get that second marshmallow or cookie — depended on willpower. Some people simply had more willpower than others, which allowed them to resist tempting sweets and save money for retirement.
However, after watching hundreds of kids participate in the marshmallow experiment, Mischel concluded that this standard model was wrong. He came to realize that willpower was inherently weak, and that children who tried to outlast the treat — gritting their teeth in the face of temptation — soon lost the battle, often within thirty seconds.
Instead, Mischel discovered something interesting when he studied the tiny percentage of kids who could successfully wait for the second treat. Without exception, these "high delayers" all relied on the same mental strategy: they found a way to keep themselves from thinking about the treat, directing their gaze away from the yummy marshmallow. Some covered their eyes or played hide-and-seek underneath the desk. Others sang songs from "Sesame Street," or repeatedly tied their shoelaces, or pretended to take a nap. Their desire wasn't defeated — it was merely forgotten.
Mischel refers to this skill as the "strategic allocation of attention," and he argues that it's the skill underlying self-control. Too often, we assume that willpower is about having strong moral fiber. But that's wrong — willpower is really about properly directing the spotlight of attention, learning how to control that short list of thoughts in working memory. It's about realizing that if we're thinking about the marshmallow we're going to eat it, which is why we need to look away.
What's interesting is that this cognitive skill isn't just useful for dieters. Instead, it seems to be a core part of success in the real world. For instance, when Mischel followed up with the initial subjects 13 years later — they were now high school seniors — he realized that performance on the marshmallow task was highly predictive across a vast range of metrics. Those kids who struggled to wait at the age of four were also more likely to have behavioral problems, both in school and at home. They struggled in stressful situations, often had trouble paying attention, and found it difficult to maintain friendships. Most impressive, perhaps, were the academic numbers: the kid who could wait fifteen minutes for a marshmallow had an S.A.T. score that was, on average, two hundred and ten points higher than that of the kid who could wait only thirty seconds.
These correlations demonstrate the importance of learning to strategically allocate our attention. When we properly control the spotlight, we can resist negative thoughts and dangerous temptations. We can walk away from fights and improve our odds against addiction. Our decisions are driven by the facts and feelings bouncing around the brain — the allocation of attention allows us to direct this haphazard process, as we consciously select the thoughts we want to think about.
Furthermore, this mental skill is only getting more valuable. We live, after all, in the age of information, which makes the ability to focus on the important information incredibly important. (Herbert Simon said it best: "A wealth of information creates a poverty of attention.") The brain is a bounded machine and the world is a confusing place, full of data and distractions — intelligence is the ability to parse the data so that it makes just a little bit more sense. Like willpower, this ability requires the strategic allocation of attention.
One final thought: In recent decades, psychology and neuroscience have severely eroded classical notions of free will. The unconscious mind, it turns out, is most of the mind. And yet, we can still control the spotlight of attention, focusing on those ideas that will help us succeed. In the end, this may be the only thing we can control. We don't have to look at the marshmallow.
* * * * *
ANDY CLARK
Philosopher and Cognitive Scientist, University of Edinburgh; Author, Supersizing the Mind: Embodiment, Action, and Cognitive Extension
Predictive Coding
The idea that the brain is basically an engine of prediction is one that will, I believe, turn out to be very valuable not just within its current home (computational cognitive neuroscience) but across the board: for the arts, for the humanities, and for our own personal understanding of what it is to be a human being in contact with the world.
The term 'predictive coding' is currently used in many ways, across a variety of disciplines. The usage I recommend for the Everyday Cognitive Toolkit is, however, more restricted in scope. It concerns the way the brain exploits prediction and anticipation in making sense of incoming signals and using them to guide perception, thought, and action. Used in this way, 'predictive coding' names a technically rich body of computational and neuroscientific research (key theorists include Dana Ballard, Tobias Egner, Paul Fletcher, Karl Friston, David Mumford, and Rajesh Rao). This corpus of research uses mathematical principles and models that explore in detail the ways that this form of coding might underlie perception, and inform belief, choice, and reasoning.
The basic idea is simple. It is that to perceive the world is to successfully predict our own sensory states. The brain uses stored knowledge about the structure of the world and the probabilities of one state or event following another to generate a prediction of what the current state is likely to be, given the previous one and this body of knowledge. Mismatches between the prediction and the received signal generate error signals that nuance the prediction or (in more extreme cases) drive learning and plasticity.
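To make that loop concrete, here is a minimal sketch in Python of a single-level version of the idea. It is only an illustration of the logic just described, not any of the hierarchical models in the literature; the signal, the learning rate, and the simple error-driven update rule are assumptions chosen for brevity.

```python
import numpy as np

# A minimal, hypothetical sketch of the prediction-error loop described above:
# keep a running prediction of a sensory signal, compare it with what actually
# arrives, and use the mismatch (the prediction error) to nuance the prediction.
# Persistent large errors would be the cue for deeper learning or a revised model.

rng = np.random.default_rng(0)

prediction = 0.0         # current best guess about the incoming signal
learning_rate = 0.1      # how strongly each error nuances the prediction
true_signal = 1.0        # the hidden regularity in the world

for step in range(50):
    sensed = true_signal + rng.normal(scale=0.2)  # noisy sensory input
    error = sensed - prediction                   # prediction error
    prediction += learning_rate * error           # error nuances the prediction

print(f"final prediction: {prediction:.2f}")      # settles near the hidden signal
```

Run over many steps, the prediction converges on the hidden regularity and the error signal shrinks, which is the sense in which successful perception here just is successful prediction.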
We may contrast this with older models in which perception is a 'bottom-up' process: incoming information is progressively built (via some kind of evidence-accumulation process, starting with simple features and working up) into a high-level model of the world. According to the predictive coding alternative, the reverse is the case. For the most part, we determine the low-level features by applying a cascade of predictions that begins at the very top, with our most general expectations about the nature and state of the world providing constraints on our successively more detailed (finer-grained) predictions.
This inversion has some quite profound implications.
First, the notion of good ('veridical') sensory contact with the world becomes a matter of applying the right expectations to the incoming signal. Subtract such expectations and the best we can hope for are prediction errors that elicit plasticity and learning. This means, in effect, that all perception is some form of 'expert perception', and that the idea of accessing some kind of unvarnished sensory truth is untenable (unless that merely names another kind of trained, expert perception!).
Second, the time course of perception becomes critical. Predictive coding models suggest that what emerges first is the general gist (including the general affective feel) of the scene, with the details becoming progressively filled in as the brain uses that larger context — time and task allowing — to generate finer and finer predictions of detail. There is a very real sense in which we properly perceive the forest before the trees.
Third, the line between perception and cognition becomes blurred. What we perceive (or think we perceive) is heavily determined by what we know, and what we know (or think we know) is constantly conditioned on what we perceive (or think we perceive). This turns out to offer a powerful window on various pathologies of thought and action, explaining the way hallucinations and false beliefs go hand-in-hand in schizophrenia, as well as other more familiar states such as 'confirmation bias' (our tendency to 'spot' confirming evidence more readily than disconfirming evidence).
Fourth, if we now consider that prediction errors can be suppressed not just by changing predictions but by changing the things predicted, we have a simple and powerful explanation for behavior and the way we manipulate and sample our environment. In this view, action is there to make our predictions come true, and this provides a nice account of phenomena ranging from homeostasis to the maintenance of our emotional and interpersonal status quo.
Understanding perception as prediction thus offers, it seems to me, a powerful tool for appreciating both the power and the potential pitfalls of our primary way of being in contact with the world. Our primary contact with the world, all this suggests, is via our expectations about what we are about to see or experience. The notion of predictive coding, by offering a concise and technically rich way of gesturing at this fact, provides a cognitive tool that will more than earn its keep in science, law, ethics, and the understanding of our own daily experience.
* * * * *
GEORGE LAKOFF
Cognitive Scientist and Linguist; Richard and Rhoda Goldman Distinguished Professor of Cognitive Science and Linguistics, UC Berkeley; Author, The Political Mind
Conceptual Metaphor
Conceptual Metaphor is at the center of a complex theory of how the brain gives rise to thought and language, and how cognition is embodied. All concepts are physical brain circuits deriving their meaning via neural cascades that terminate in linkage to the body. That is how embodied cognition arises.
Primary metaphors are brain mappings linking disparate brain regions, each tied to the body in a different way. For example, More Is Up (as in "prices rose") links a region coordinating quantity to another coordinating verticality. The neural mappings are directional, linking frame structures in each region. The directionality is determined by First-Spike-Dependent Plasticity. Primary metaphors are learned automatically and unconsciously by the hundreds prior to metaphoric language, just by living in the world and having disparate brain regions activated together when two experiences repeatedly co-occur.
Complex conceptual metaphors arise via neural bindings, both across metaphors and from a given metaphor to a conceptual frame circuit. Metaphorical reasoning arises when source domain inference structures are used for target domain reasoning via neural mappings. Linguistic metaphors occur when words for source domain concepts are used for target domain concepts via neural metaphoric mappings.
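One way to see what a cross-domain mapping buys you computationally is a toy sketch like the following, in Python. This is not Lakoff's neural model; the dictionary representation, the MORE_IS_UP entries, and the carry_over_inference helper are all hypothetical, meant only to show how language and inference framed in a source domain can be re-expressed in a target domain by substituting mapped roles.

```python
# Toy sketch: a conceptual metaphor represented as a mapping from source-frame
# vocabulary (verticality) to target-frame vocabulary (quantity). A metaphorical
# sentence is "decoded" by substituting mapped terms, which is one crude way to
# picture how source-domain wording carries target-domain meaning.
# All names here are illustrative assumptions, not an established API.

MORE_IS_UP = {
    "rises": "increases",   # source term -> target term
    "falls": "decreases",
    "high": "large",
    "low": "small",
}

def carry_over_inference(source_statement: str, mapping: dict[str, str]) -> str:
    """Rewrite a statement phrased in the source domain using target-domain terms."""
    result = source_statement
    for source_term, target_term in mapping.items():
        result = result.replace(source_term, target_term)
    return result

# "the price rises" (spatial, source-domain wording) is understood as
# "the price increases" (quantity, target-domain meaning).
print(carry_over_inference("the price rises", MORE_IS_UP))
```

The design point the sketch gestures at is directionality: the mapping runs from the bodily, spatial source domain to the abstract target domain, which is the asymmetry Lakoff attributes to the underlying neural circuitry.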
Because conceptual metaphors unconsciously structure the brain's conceptual system, much of normal everyday thought is metaphoric, with different conceptual metaphors used to think with on different occasions or by different people.
A central consequence is that the huge range of concepts that use metaphor cannot be defined relative to the outside world, but are instead embodied via interactions of the body and brain with the world.
There are consequences in virtually every area of life. Marriage, for example, is understood in many ways: as a journey, a partnership, a means for growth, a refuge, a bond, a joining together, and so on. What counts as a difficulty in the marriage is defined by the metaphor used. Since it is rare for spouses to have the same metaphors for their marriage, and since the metaphors are fixed in the brain but unconscious, it is not surprising that so many marriages encounter difficulties.
In politics, conservatives and progressives have ideologies defined by different metaphors. Various concepts of morality around the world are constituted by different metaphors. These results show the inadequacy of experimental approaches to morality in social psychology (e.g., Haidt's moral foundations theory) which ignore both how conceptual metaphor constitutes moral concepts and why those metaphors arise naturally in cultures around the world.
Even mathematical concepts are understood via metaphor, depending on the branch of mathematics. In set theory, numbers are sets of a certain structure. On the number line, numbers are points on a line. "Real" numbers are defined via the metaphor that infinity is a thing: an infinite decimal like pi goes on forever, yet it is a single entity — an infinite thing. Emotions, too, are conceptualized via metaphors that are tied to the physiology of emotion.
Though conceptual metaphors have been researched extensively in the fields of cognitive linguistics and neural computation for decades, experimental psychologists have been confirming their existence by showing that, as physical circuitry in the brain, they can influence behavior in the laboratory. The metaphors guide the experimenters, showing them what to look for. Confirming the conceptual metaphor The Future Is Ahead; The Past Is Behind, experimenters found that subjects thinking about the future lean slightly forward, while those thinking about the past lean slightly backward. Subjects asked to do immoral acts in experiments tended to wash or wipe their hands afterwards, confirming the conceptual metaphor Morality Is Purity. Subjects moving marbles upward tended to tell happy stories, while those moving marbles downward tended to tell sad stories, confirming Happy Is Up; Sad Is Down. Similar results are coming in by the dozens. The new experimental results on embodied cognition are mostly in the realm of conceptual metaphor.
Perhaps most remarkable, there appear to be brain structures that we are born with that provide pathways ready for metaphor circuitry. Edward Hubbard has observed that critical brain regions coordinating space and time measurement are adjacent in the brain, making it easy for the universal metaphors for understanding time in terms of space to develop (as in "Christmas is coming" or "We're coming up on Christmas"). Mirror neuron pathways linking brain regions coordinating vision and hand actions provide a natural pathway for the conceptual metaphor that Seeing Is Touching (as in "Their eyes met").
Conceptual metaphors are natural and inevitable. They begin to arise in childhood just by living in the everyday world. For example, a common conceptual metaphor is Events with Causal Effects Are Actions by a Person. That is why the wind blows, why storms can be vicious, and why there is religion, in which the person causing those effects is called a god, or God if there is only one. The most common metaphors for God in the Western traditions are that God is a father, or a person with father-like properties — a creator, lawgiver, judge, punisher, nurturer, shepherd, and so on — and that God is The Infinite: the all-knowing, all-powerful, all-good, and first cause. These metaphors are not going to go away. The question is whether they will continue to be taken literally.
Those who believe, and promote the idea, that reason is not metaphorical — that mathematics is literal and structures the world independently of human minds — are ignoring conceptual metaphor and encouraging false literalness, which can be harmful.
The science is clear. Metaphorical thought is normal. That should be widely recognized.
Every time you think of paying moral debts, or getting bogged down on a project, or losing time, or being at a crossroads in a relationship, you are unconsciously activating a conceptual metaphor circuit in your brain, reasoning using it, and quite possibly making decisions and living your life on the basis of your metaphors. And that's just normal. There's no way around it!
Metaphorical reason serves us well in everyday life. But it can do harm if you are unaware of it.