Saturday, June 05, 2010

Axel Cleeremans - The Grand Challenge for Psychology: Integrate and Fire!

Another interesting article from Frontiers in Psychology, an open-access, peer-reviewed journal. The article suggests a pseudo-integral approach to psychology, offering 27 subfields (see diagram above and explanation in the article) - but my sense is that not all of the categories are the right ones. It is a move in the right direction though.

Université Libre de Bruxelles, Belgium

Is psychology a scientific discipline of its own? Or is it the case, as Scott (1991) upheld, that “Psychology lacks a clear identity”? The latter is certainly the impression one gets reading Lundin (1979), who, in his opus titled “Theories and Systems of Psychology”, describes the emergence of psychology over the 20th century with retrospectively comical words that would perhaps best be used to describe how hunter-gatherers got together to form tribes:
“As psychology has evolved during the present century, different groups of people who called themselves psychologists have banded together to put forth communities of ideas and efforts designed to direct the way psychology should go. When a particular group shared similar ideas and opposed others, a “school” of psychology was formed”. (p. 1)
The balkanization of psychology into separate subfields that Scott laments is nothing new. Already in 1949, for instance, on the occasion of the creation of the Belgian Society for Psychology, its main instigator, the famous Michotte, wrote that
“it seems to me that it would be of interest to better know each other and to coordinate our efforts by creating opportunities for regular contact, which would make it possible for us to discuss either theoretical problems or professional questions, and, in a more general way, of anything that is relevant to professional psychologists” (Michotte, 1954, p. 1).
There is undoubtedly a bewildering diversity of approaches to the old and respectable problem of how the mind works. This comes as no surprise, given the complexity of exploring not only the intricacies of our mental life, but, more to the point, how the mind relates to the body, and in particular to the brain. The brain is itself so complex that a neuroscientist can spend his entire career working on a single type of neuron. But this is not all, for neither body nor mind ever stand still. The brain changes when we grow up, and as we age. Further, as agents, we are in continuous interaction with the world and with other agents, and to such an extent that one may reasonably claim that it is meaningless to study psychological mechanisms without considering how they are modulated by the environment and by social factors.

Psychology is thus, by nature, a “hub” discipline, for its object of study is quite literally spread over several levels of description that span the entire spectrum of reality – from molecules to ecstasy. As Freud (1949) presciently noted,
“We know two kinds of things about what we call our psyche (or mental life): firstly, its bodily organ and scene of action, the brain (or nervous system) and, on the other hand, our acts of consciousness, which are immediate data and cannot be further explained by any sort of description. Everything that lies in between is unknown to us, and the data do not include any direct relation between these two terminal points of our knowledge. If it existed, it would at most afford an exact localization of the processes of consciousness and would give us no help towards understanding them”.
By some accounts at least, we have come a long way since then, and indeed both consciousness (as a singular problem), and brain imaging (as the best technology to bridge the gap between neural activity and mental life) now stand tall in the study of mind. And thus we have integrated methods and ideas from a number of related disciplines, from the neurosciences to philosophy, from economics to psychiatry, from biology to artificial intelligence, so spawning interconnected fields that all have human beings as their core object of interest.

More often than not, unfortunately, this multiplicity of approaches has proven detrimental to the psychological sciences, not only because their object is constantly under threat of appropriation by other disciplines, but also because of a lack of cogent dialogue within our own community. “Things appear to be changing in Psychological Science, however”, as Cacioppo (2007), then president of the Association for Psychological Science, remarked in an Observer article. And indeed they are. For someone who has been active in our field for just about 25 years now, it is truly remarkable to witness the emergence of entirely new fields such as social neuroscience, experimental psychopathology, or neuropsychoanalysis. Some such specialties bear monikers that clearly reflect the ambition to be integrative, such as “developmental social cognitive neuroscience”. Almost all carry either a “neuro-” prefix or a “science” postfix, reflecting both enthusiasm in the face of the increased availability of entirely new tools to study the mind and perhaps also some preoccupation with granting the new subfields better status by describing them as “science” rather than as mere “psychology”. Make no mistake, however: The mind is a messy affair, and it is not in virtue of the fact that we now have considerably better methods available to probe it that it will suddenly unwrap itself for inspection in newfound simplicity…

Cacioppo (2007), in his Observer piece, showed an interesting little graph in which he distinguished between “levels of organization” on the one hand and “cross-cutting perspectives” on the other, so carving up our science into subfields characterized both by their object of interest and by the method through which it is approached.

Here is my own take on this attempt (Figure 1), not Frontiers’ tiered pyramid, but a Rubik’s cube of subfields defined by crossing three dimensions: Levels of organization (Biology, Individuals, Groups), Methods (Observation and Population Studies, Experimentation and Modeling, Intervention), and Perspectives (Normality, Change and Differences, Pathology). The first dimension – Levels of Organization – is self-explanatory and simply refers to whether the research attempts to understand the mind by focusing on its neural correlates, on the mental representations and behaviour of individuals, or on the processes that take place when such individuals interact with other people. The second dimension – Methods – is an attempt to capture the astounding diversity of approaches to psychological phenomena that characterizes the field. Observation is grouped with Population Studies to the extent that both involve some form of data mining and descriptive statistics. Experimentation and Modeling (in particular computational modeling) are the mainstay of psychological research and instantiate methods where one attempts to manipulate specific factors in such a way as to explore their effects on the variables of interest and hence develop an understanding of the mechanisms involved. Intervention methods refer to an altogether different approach, perhaps more typical of Clinical, Educational, and Neuro-Psychology, whereby researchers actively act upon participants so as to improve their condition or to promote the occurrence of specific behaviours. Finally, the third dimension – Perspectives – describes the overall focus of the research: normal or pathological functioning on the one hand, and change through evolution, development, maturation, and learning, as well as differences between individuals or species, on the other. Of course, one may always quibble with specific choices in carving up a domain as complex as Psychology in such a manner.
Thus, it should be clear that “Experimentation and Modeling” includes brain imaging methods, and that fields such as Animal Psychology are subsumed in the “Change and Differences” perspective whenever the main concern is to understand the human mind by comparing it with other minds. Finally, while this analysis depicts psychology as fragmented in neat little subdomains, it is obvious that the proposed dimensions define each other and are thus very much co-dependent, both at their junctures and over their own levels.

Figure 1. Carving up psychology at its joints results in a Rubik's Cube of 27 subfields.

But you get the idea… and the resulting cube, which contains 27 subfields, satisfyingly produces many of the expected specialties. For instance, crossing “Individuals” with “Experimentation and Modeling” and “Pathology” gives us “Frontiers in Psychopathology”. Crossing “Groups” with “Observation and Population Studies” and “Normality” gives us “Frontiers in quantitative psychology and measurement”. Not all specialties can be confined to one little cube. Rather, some can span several because of their own interdisciplinary nature. Thus for instance, “Frontiers in Evolutionary Psychology”, while squarely focused on Change and Differences as its central perspective, can clearly interest different levels of organization. Likewise, “Frontiers in Theoretical and Philosophical Psychology” is perhaps best seen as forming the background against which our cube is depicted inasmuch as the questions the specialty will address could potentially concern any domain, any perspective, or any method that has something to do with the discipline as a whole.
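For concreteness, the 27 subfields are simply the Cartesian product of the three dimensions described above. A minimal Python sketch (the dimension labels follow the article; the data structure itself is my own illustration, not anything proposed by the author):

```python
from itertools import product

# The three dimensions of the taxonomy in Figure 1
levels = ["Biology", "Individuals", "Groups"]
methods = ["Observation and Population Studies",
           "Experimentation and Modeling",
           "Intervention"]
perspectives = ["Normality", "Change and Differences", "Pathology"]

# Crossing the dimensions yields the 3 x 3 x 3 = 27 cells of the cube
subfields = list(product(levels, methods, perspectives))
print(len(subfields))  # 27

# e.g., the cell the article maps to "Frontiers in Psychopathology"
cell = ("Individuals", "Experimentation and Modeling", "Pathology")
print(cell in subfields)  # True
```

Specialties that span several cells, such as Evolutionary Psychology, would correspond to subsets of this product rather than single tuples.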

The singular “Grand Challenge” for the Psychological Sciences, as they blossom in the 21st century, is thus clearly integration – a stance that prominent colleagues such as Mischel (2004) have long been promoting. Not only is it high time that we start speaking to each other, but it may also be the case that doing so is absolutely essential for psychology to thrive as a discipline of its own in the future. Everybody else, indeed, feels like a psychologist today. Even physicists sometimes feel they have better answers than we do to particularly hard problems such as consciousness. But it takes considerable wits to design interesting psychology experiments, and this is not something that comes cheap. As one famous pioneer in neuroimaging methods once confessed to me, “The most interesting part of any neuroimaging study is the behavioral paradigm”. Designing such experiments is what psychologists do best, and I am convinced that excellence in experimental design and behavioural methods will remain the greatest strength of psychology in the future.

Thus, the Rubik’s cube of Psychology depicted in Figure 1 needs to be set right! But perhaps not in the way a real Rubik’s cube is set right, that is, by attempting to obtain uniformly colored surfaces, but instead by twisting and turning it in such a way that each face contains as many different colors as possible, so fostering fecund conjunctions where the colors blend into each other at the seams… For indeed this appears to be the only way forward in psychology, as much as it is true that it makes little sense to speak of pathological states in the face of disagreement about what is normal; as much as it is true that the inductive reasoning fostered by observation should be followed by the deductive reasoning made possible by experimentation; as much as it is true that cognition can only be understood in the social context in which information processing takes place.

Now is thus the time to expand our thinking and cast our conceptual nets in a way that is relevant to psychology as a whole. Instead of having each of us retreat to his or her own little space of our collective cube, we should instead strive to expand our reach so as to make psychology fully assume its role as a modern “hub” discipline, a discipline through which Man can be understood in his full complexity, from individual differences to social trends, from neurons to emotions. This is no simple task; it is indeed a “grand challenge”; one that I am confident Frontiers in Psychology will help address.


Axel Cleeremans is a Research Director with the National Fund for Scientific Research (Belgium).


Cacioppo, J. T. (2007). The structure of psychology. APS Obs. 20, 50–51.
Freud, S. (1949). An Outline of Psychoanalysis (J. Strachey, Trans.). London: Hogarth Press.
Lundin, R. W. (1979). Theories and Systems of Psychology, 2nd edn. Lexington, MA: D. C. Heath & Company.
Michotte, A. (1954). Untitled. Psychol. Belg. 1, 1–2.
Mischel, W. (2004). Toward an integrative science of the person. Annu. Rev. Psychol. 55, 1–22.
Scott, T. R. (1991). A personal view of the future of psychology departments. Am. Psychol. 46, 975–976.

Citation: Cleeremans A (2010) The grand challenge for psychology: integrate and fire! Front. Psychology 1:12. doi:10.3389/fpsyg.2010.00012

Copyright: © 2010 Cleeremans. This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.



Spending Time in Nature Makes People Feel More Alive

In my series at The Masculine Heart exploring ideas for a model of masculine identity development (most recent entry here, with links to all previous entries), I spent a whole long post on the role of nature in human development and health.

This article from Science Daily adds more support for that idea.

Spending Time in Nature Makes People Feel More Alive, Study Shows

ScienceDaily (June 4, 2010) — Feeling sluggish? The solution may require getting outside the box -- that big brick-and-mortar box called a building.

Being outside in nature makes people feel more alive, finds a series of studies published in the June 2010 issue of the Journal of Environmental Psychology. And that sense of increased vitality exists above and beyond the energizing effects of physical activity and social interaction that are often associated with our forays into the natural world, the studies show.

"Nature is fuel for the soul," says Richard Ryan, lead author and a professor of psychology at the University of Rochester. "Often when we feel depleted we reach for a cup of coffee, but research suggests a better way to get energized is to connect with nature," he says.

The findings, adds Ryan, are important for both mental and physical health. "Research has shown that people with a greater sense of vitality don't just have more energy for things they want to do, they are also more resilient to physical illnesses. One of the pathways to health may be to spend more time in natural settings," says Ryan.

In recent years, numerous experimental psychology studies have linked exposure to nature with increased energy and heightened sense of well-being. For example, research has shown that people on wilderness excursions report feeling more alive and that just recalling outdoor experiences increases feelings of happiness and health. Other studies suggest that the very presence of nature helps to ward off feelings of exhaustion and that 90 percent of people report increased energy when placed in outdoor activities.

What is novel about this research, write the authors, is that it carefully tests whether this increased vitality associated with the outdoors is simply the feel-good spillover from physical activity and people-mixing often present in these situations. To tease out the effects of nature alone, the authors conducted five separate experiments, involving 537 college students in actual and imagined contexts. In one experiment, participants were led on a 15-minute walk through indoor hallways or along a tree-lined river path. In another, the undergraduates viewed photographic scenes of buildings or landscapes. A third experiment required students to imagine themselves in a variety of situations both active and sedentary, inside and out, and with and without others.

Two final experiments tracked participants' moods and energy levels throughout the day using diary entries. Over either four days or two weeks, students recorded their exercise, social interactions, time spent outside, and exposure to natural environments, including plants and windows.

Across all methodologies, individuals consistently felt more energetic when they spent time in natural settings or imagined themselves in such situations. The findings were particularly robust, notes Ryan; being outside in nature for just 20 minutes in a day was enough to significantly boost vitality levels. Interestingly, in the last study, the presence of nature had an independent energizing effect above that of being outdoors. In other words, conclude the authors, being outdoors was vitalizing in large part because of the presence of nature.

The paper builds on earlier research by Ryan, Netta Weinstein, a psychologist at the University of Hamburg, Germany, and others showing that people are more caring and generous when exposed to nature. "We have a natural connection with living things," says Ryan. "Nature is something within which we flourish, so having it be more a part of our lives is critical, especially when we live and work in built environments." These studies, concludes Ryan, underscore the importance of having access to parks and natural surroundings and of incorporating natural elements into our buildings through windows and indoor plants.

The paper was coauthored by Weinstein; Jessey Bernstein, McGill University; Kirk Warren Brown, Virginia Commonwealth University; Louis Mistretta, University of Rochester; and Marylène Gagné, Concordia University.

Journal Reference:
  1. Richard M. Ryan, Netta Weinstein, Jessey Bernstein, Kirk Warren Brown, Louis Mistretta, Marylène Gagné. Vitalizing effects of being outdoors and in nature. Journal of Environmental Psychology, 2010; 30 (2): 159 DOI: 10.1016/j.jenvp.2009.10.009

Lama Surya Das - Buddha is as Buddha Does

A one hour book talk with Lama Surya Das on one of his recent books (the most recent one that I own), Buddha is as Buddha Does (2008) - the video comes courtesy of

Lama Surya Das talks about Buddha is as Buddha Does: The Ten Original Practices for Enlightened Living.

The Buddha realized that each person is inherently perfect with the capacity to overcome suffering and transform themselves into forces for good. In this book, a celebrated teacher presents the Buddha's core principles in an accessible style for modern readers - Book Passage

Surya Das is one of the foremost Western Buddhist meditation teachers and scholars. Born Jeffrey Miller, he was raised in Valley Stream on New York's Long Island, where he celebrated his bar mitzvah and earned letters in basketball, baseball, and soccer at Valley Stream Central High School (class of 1968). While a student at the State University of New York at Buffalo, he attended antiwar protests, marched on Washington, and attended Woodstock. After graduating with honors from college, he traveled throughout Europe and the East, and he has spent nearly thirty years studying Zen, vipassana, yoga, and Tibetan Buddhism with many of the great old masters of Asia.

The Dalai Lama - The Ultimate Nature of the Mind

by the Dalai Lama
translated and edited by
Jeffrey Hopkins
with Anne Klein


Dalai Lama Quote of the Week

...not only is the ultimate nature of the mind unpolluted by contaminations, but also the conventional nature of the mind, that is, its mere clear knowing, is unpolluted by contaminations as well. Therefore, the mind can become either better or worse, and it is suitable to be transformed. However, no matter how much one cultivates the bad consciousnesses that provide a support for the conception of inherent existence, they cannot be cultivated limitlessly.* Cultivation of the good consciousnesses, on the other hand, which are opposite to those and which have the support of valid cognition, can be increased limitlessly. On the basis of this reason, we can ascertain that the stains on the mind can be removed.

Thus, the final nature of a mind that has removed its stains so that they will never be generated again is liberation. Therefore, we can become certain that liberation is attainable. Not only that, but just as the contaminations of the afflictions are removable, so are their predispositions as well. Therefore, we can be certain that the final nature of the mind with all the contaminations of the afflictions and their predispositions removed is attainable. This is called a non-abiding nirvana or a Body of Truth. Thereby it is generally established that liberation and omniscience exist.

--from The Buddhism of Tibet by the Dalai Lama, translated and edited by Jeffrey Hopkins, with Anne Klein, published by Snow Lion Publications

The Buddhism of Tibet • 50% off • for this week only
(Good through June 11th).

* For a further explanation of this point, see the Archive of Quotes by the Dalai Lama. Either search for "based on ignorance" or scroll down to the November 4 (2005) quote.

Eddie Izzard - Dressed to Kill (full-length video)

The funniest man alive. Laughter. It does a body (and mind) good.

Friday, June 04, 2010

Buddhism: Max Roth talks with Genpo Roshi

I have no info on this two-part video interview, but it's interesting. Most of what I know of Genpo Roshi is his Big Mind product, so learning his history is enlightening, so to speak. Roth seems to be a Fox TV reporter in Utah.

The topic of the interview seems to be, "How do we get to heaven?" Hmmmm . . . wonder why you would ask a Buddhist that question, unless you know nothing about Buddhism.

Part One:

Part Two:

For a More Ethical Civilization - Prof. Rodrigue Tremblay

Intriguing article from Global Research - this post is based on a recent speech related to Rodrigue Tremblay's new book, The Code for Global Ethics: Ten Humanist Principles.

For a More Ethical Civilization

"When plunder becomes a way of life for a group of men living together in society, they create for themselves, in the course of time, a legal system that authorizes it and a moral code that glorifies it." ~ Frederic Bastiat (1801-1850), French economist
"The world today is as furiously religious as it ever was. ... Experiments with secularized religions have generally failed; religious movements with beliefs and practices dripping with reactionary supernaturalism have widely succeeded." ~ Peter Berger, Desecularization of the World, 1999

“I think that on balance the moral influence of religion has been awful. With or without religion, good people can behave well and bad people can do evil; but for good people to do evil—that takes religion.” ~ Steven Weinberg, 1979 Nobel Laureate in Physics
There has never been more talk about ethics than today, not only in private lives, but also in government circles, in business boardrooms and in the media. That is because most people realize we are living in a very corrupt period.

In 2009, the United States ranked 19th in a worldwide corruption index, way below New Zealand (1st) or Denmark (2nd).

Indeed, more than three quarters of Americans believe that we are living at a time of declining moral values. A recent Gallup poll found that 76 percent of Americans think moral values in their country are getting worse, while only 14 percent believe they’re getting better. This would seem to be paradoxical, since other indicators show that the United States is getting more religious and pious. More religion and less morality?

For instance, it has been observed that teen birth rates are the highest in the most religious states.

That may be because poor people tend to be more religious compared to the rich and tend to be less educated and less well informed. Consider also that it has been observed that religious people are more racist than average.

Morality is a complex issue, but that is no reason to sweep it under the rug of indifference.

In a new book, I attempt to tackle the issue of ethics and its sources. I have arrived at the conclusion that humanity needs a new worldview—a new moral code—a new objective standard of right and wrong, because our prevailing sources of morality are at best inadequate, and at worst, perverse.

This is because many of our problems today are not only technical in nature; they also have a moral underpinning, and are thus much more difficult to solve. It may also be because our scientific and technological progress seems to be advancing much faster than our moral progress, with the consequence that problems arise faster than our moral capacities can cope with them. Indeed, our problems are more and more global in nature, while our moral worldview is still essentially parochial.

We thought that wars of aggression (or pre-emptive wars) had been abolished with the adoption of the United Nations Charter on June 26, 1945 and the issuance of the Nuremberg Charter on August 8, 1945. But wars of aggression persist. We also thought that financial crises, and the severe economic recessions and sometimes depressions they provoked, were a thing of the past, thanks to a protecting net of financial regulations designed to control greed and prevent a repeat of the past. Well, twenty years of wholesale deregulation have brought us back to an era of anything goes and financial collapse. We also thought that the problem of poverty in the world could be alleviated, but abject poverty persists in many parts of the world.

There seems to be a pattern here, and it is that humanity seems unable to break out of a cycle of wars, economic crises and endemic poverty.

And, these throwbacks to an unpalatable past seem to coincide with other developments, such as the spread of nuclear weaponry, the persistence of ignorance, growing social and economic inequalities, disregard for basic democratic principles, the rise in global pollution, and an increasing religion-based willingness to kill and terrorize.

With the current globalization of our problems, we need to extend our circle of empathy and view humanity as a worldwide extended human family. As long as we refrain from facing that challenge, divisiveness and unsolvable conflicts will persist.

The contradiction between modern problems, new scientific knowledge and the inadequacy of our prevalent source of morality or of ethics, led me to ask what kind of values would be required to face the new challenges. What would our civilization look like if we were to adopt them?

In such a civilization:

• All human beings would be equal in dignity and in human rights.

• Life on this planet would not be devalued and seen as only a preparation for a better life after death, somewhere beyond the clouds.

• The virtues of tolerance and of human liberty would be proclaimed and applied, subject only to the requirements of public order.

• Human solidarity and sharing would be better accepted as a protection against poverty and deprivation.

• The manipulation and domination of others through lies, propaganda, and exploitation schemes of all kinds would be less prevalent.

• There would be less reliance on superstition and religion to understand the Universe and to solve life's problems and more on reason, logic and science.

• Better care of the Earth's natural environment—land, soil, water, air and space—would be taken in order to bequeath a brighter heritage to future generations.

• We would have ended the primitive practice of resorting to violence or to wars to resolve differences and conflicts.

• There would be more genuine democracy in the organization of public affairs, according to individual freedom and responsibility.

• Governments would see that their first and most important task is to help develop children's intelligence and talents through education.

Yes we can, if we try.

* Drawn from notes for a conference by Dr. Rodrigue Tremblay at the American Humanist Association's Annual Meeting, San Jose, California, Friday, June 4, 2010.

Rodrigue Tremblay is professor emeritus of economics at the University of Montreal. His book The Code for Global Ethics: Ten Humanist Principles, prefaced by Dr. Paul Kurtz, has just been released by Prometheus Books. Please visit the book site.

Rodrigue Tremblay is a frequent contributor to Global Research. Global Research Articles by Rodrigue Tremblay.

NPR - 'The Shallows': This Is Your Brain Online

Everyone seems to have an opinion on how the internet is shaping our brains. Most people see it as a bad thing, based on the coverage in the news, while a handful of people recognize, I think, its potential to act as an extension of our consciousness far beyond the confines of our bodies, our homes, our communities.

For some opposing arguments, see these articles, which are largely responses to Carr's article in the Atlantic (Is Google Making Us Stupid?):
The Brain: How Google Is Making Us Smarter
Humans are "natural-born cyborgs," and the Internet is our giant "extended mind."
by Carl Zimmer (Discover Magazine)
Edge's Reality Club offers a series of responses, including some who disagree:
On the Britannica Blog, there was a lively debate:

July 17, 2008

Britannica Forum:
This Is Your Brain; This is Your Brain on the Internet

Clay Shirky, Nicholas Carr, Larry Sanger, Matthew Battles

Why Abundance is Good: A Reply to Nick Carr
Clay Shirky

But the anxiety at the heart of "Is Google Making Us Stupid?" doesn't actually seem to be about thinking, or even reading, but culture. ...

... As Carr notes, "we may well be reading more today than we did in the 1970s or 1980s, when television was our medium of choice." Well, yes. But because the return of reading has not brought about the return of the cultural icons we'd been emptily praising all these years, the enormity of the historical shift away from literary culture is now becoming clear.

And this, I think, is the real anxiety behind the essay: having lost its actual centrality some time ago, the literary world is now losing its normative hold on culture as well. The threat isn't that people will stop reading War and Peace. That day is long since past. The threat is that people will stop genuflecting to the idea of reading War and Peace. [...MORE]

Why Skepticism is Good: My Reply to Clay Shirky
Nicholas Carr

It's telling that Shirky uses gauzily religious terms to describe the Internet—"our garden of ethereal delights"—as what he's expressing here is not reason but faith. I hope he's right, but I think that skepticism is always the proper response to techno-utopianism. [...MORE]

A Defense of Tolstoy & the Individual Thinker: A Reply to Clay Shirky
Larry Sanger

I've already responded in another forum to Nick Carr's essay, which I thought was very thought-provoking, if not entirely on target; I won't repeat here what I said there. But in it you can see that I would disagree almost perfectly with Clay Shirky, who I want to respond to separately here.

I've read War and Peace twice. It's one of my very favorite novels, and I love it—it's enormously interesting. In Clay's view, it seems, the new speed and deeply social nature of intellectual discourse means that, soon, the only relevant discourse will occur in blog- or Twitter-sized chunks. Is this the hip "upstart literature," proudly "diverse, contemporary, and vulgar," that is now "the new high culture"?

If so, God help us. [...MORE]

Yes, the Internet Will Change Us (But We Can Handle It)
Matthew Battles

Nick Carr's Atlantic essay has also prompted a discussion over at publisher John Brockman's blog The Edge. Brockman's authors include computer science visionaries, evolutionary biologists, and cognitive scientists, and Carr's concerns about the cognitive effects of the Internet are very much their cup of tea. [...MORE]

This piece from NPR is a discussion of Nicholas Carr's new book, The Shallows: What the Internet Is Doing to Our Brains, which is expanded from his Atlantic article.

June 2, 2010

Try reading a book while doing a crossword puzzle, and that, says author Nicholas Carr, is what you're doing every time you use the Internet.

Carr is the author of the Atlantic article Is Google Making Us Stupid? which he has expanded into a book, The Shallows: What the Internet Is Doing to Our Brains.

Carr believes that the Internet is a medium based on interruption — and it's changing the way people read and process information. We've come to associate the acquisition of wisdom with deep reading and solitary concentration, and he says there's not much of that to be found online.

Chronic Distraction

Carr started research for The Shallows after he noticed a change in his own ability to concentrate.

"I'd sit down with a book, or a long article," he tells NPR's Robert Siegel, "and after a couple of pages my brain wanted to do what it does when I'm online: check e-mail, click on links, do some Googling, hop from page to page."

The Shallows

The Shallows: What the Internet Is Doing to Our Brains
By Nicholas Carr
Hardcover, 276 pages
W.W. Norton & Co.
List price: $26.95

This chronic state of distraction "follows us," Carr argues, long after we shut down our computers.

"Neuroscientists and psychologists have discovered that, even as adults, our brains are very plastic," Carr explains. "They're very malleable, they adapt at the cellular level to whatever we happen to be doing. And so the more time we spend surfing, and skimming, and scanning ... the more adept we become at that mode of thinking."

Would You Process This Information Better On Paper?

The book cites many studies that indicate that online reading yields lower comprehension than reading from a printed page. Then again, reading online is a relatively recent phenomenon, and a generation of readers who grow up consuming everything on the screen may simply be more adept at online reading than people who were forced to switch from print.

Still, Carr argues that even if people get better at hopping from page to page, they will still be losing their abilities to employ a "slower, more contemplative mode of thought." He says research shows that as people get better at multitasking, they "become less creative in their thinking."

The idea that the brain is a kind of zero-sum game — that the ability to read incoming text messages is somehow diminishing our ability to read Moby Dick — is not altogether self-evident. Why can't the mind simply become better at a whole variety of intellectual tasks?

Carr says it really has to do with practice. The reality — especially for young people — is that online time is "crowding out" the time that might otherwise be spent in prolonged, focused concentration.

"We're seeing this medium, the medium of the Web, in effect replace the time that we used to spend in different modes of thinking," Carr says.


The Natural State Of Things?

Carr admits he's something of a fatalist when it comes to technology. He views the advent of the Internet as "not just technological progress but a form of human regress."

Human ancestors had to stay alert and shift their attention all the time; cavemen who got too wrapped up in their cave paintings just didn't survive. Carr acknowledges that prolonged, solitary thought is not the natural human state, but rather "an aberration in the great sweep of intellectual history that really just emerged with [the] technology of the printed page."

The Internet, Carr laments, simply returns us to our "natural state of distractedness."

~ Nicholas Carr is also the author of The Big Switch: Rewiring the World, from Edison to Google and Does IT Matter? He blogs at Rough Type.
The following is an excerpt from the book posted at the NPR page.

'The Very Image Of A Book'

From: 'The Shallows: What The Internet Is Doing To Our Brains'

Pundits have been trying to bury the book for a long time. In the early years of the nineteenth century, the burgeoning popularity of newspapers — well over a hundred were being published in London alone — led many observers to assume that books were on the verge of obsolescence. How could they compete with the immediacy of the daily broadsheet? "Before this century shall end, journalism will be the whole press — the whole human thought," declared the French poet and politician Alphonse de Lamartine in 1831. "Thought will spread across the world with the rapidity of light, instantly conceived, instantly written, instantly understood. It will blanket the earth from one pole to the other — sudden, instantaneous, burning with the fervor of the soul from which it burst forth. This will be the reign of the human word in all its plenitude. Thought will not have time to ripen, to accumulate into the form of a book — the book will arrive too late. The only book possible from today is a newspaper."

Lamartine was mistaken. At the century's end, books were still around, living happily beside newspapers. But a new threat to their existence had already emerged: Thomas Edison's phonograph. It seemed obvious, at least to the intelligentsia, that people would soon be listening to literature rather than reading it. In an 1889 essay in the Atlantic Monthly, Philip Hubert predicted that "many books and stories may not see the light of print at all; they will go into the hands of their readers, or hearers rather, as phonograms." The phonograph, which at the time could record sounds as well as play them, also "promises to far outstrip the typewriter" as a tool for composing prose, he wrote. That same year, the futurist Edward Bellamy suggested, in a Harper's article, that people would come to read "with the eyes shut." They would carry around a tiny audio player, called an "indispensable," which would contain all their books, newspapers, and magazines. Mothers, wrote Bellamy, would no longer have "to make themselves hoarse telling the children stories on rainy days to keep them out of mischief." The kids would all have their own indispensables.

Five years later, Scribner's Magazine delivered the seeming coup de grace to the codex, publishing an article titled "The End of Books" by Octave Uzanne, an eminent French author and publisher. "What is my view of the destiny of books, my dear friends?" he wrote. "I do not believe (and the progress of electricity and modern mechanism forbids me to believe) that Gutenberg's invention can do otherwise than sooner or later fall into desuetude as a means of current interpretation of our mental products." Printing, a "somewhat antiquated process" that for centuries "has reigned despotically over the mind of man," would be replaced by "phonography," and libraries would be turned into "phonographotecks." We would see a return of "the art of utterance," as narrators took the place of writers. "The ladies," Uzanne concluded, "will no longer say in speaking of a successful author, 'What a charming writer!' All shuddering with emotion, they will sigh, 'Ah, how this "Teller's" voice thrills you, charms you, moves you.'"

The book survived the phonograph as it had the newspaper. Listening didn't replace reading. Edison's invention came to be used mainly for playing music rather than declaiming poetry and prose. During the twentieth century, book reading would withstand a fresh onslaught of seemingly mortal threats: moviegoing, radio listening, TV viewing. Today, books remain as commonplace as ever, and there's every reason to believe that printed works will continue to be produced and read, in some sizable quantity, for years to come. While physical books may be on the road to obsolescence, the road will almost certainly be a long and winding one. Yet the continued existence of the codex, though it may provide some cheer to bibliophiles, doesn't change the fact that books and book reading, at least as we've defined those things in the past, are in their cultural twilight. As a society, we devote ever less time to reading printed words, and even when we do read them, we do so in the busy shadow of the Internet. "Already," the literary critic George Steiner wrote in 1997, "the silences, the arts of concentration and memorization, the luxuries of time on which 'high reading' depended are largely disposed." But "these erosions," he continued, "are nearly insignificant compared with the brave new world of the electronic." Fifty years ago, it would have been possible to make the case that we were still in the age of print. Today, it is not.

Some thinkers welcome the eclipse of the book and the literary mind it fostered. In a recent address to a group of teachers, Mark Federman, an education researcher at the University of Toronto, argued that literacy, as we've traditionally understood it, "is now nothing but a quaint notion, an aesthetic form that is as irrelevant to the real questions and issues of pedagogy today as is recited poetry — clearly not devoid of value, but equally no longer the structuring force of society." The time has come, he said, for teachers and students alike to abandon the "linear, hierarchical" world of the book and enter the Web's "world of ubiquitous connectivity and pervasive proximity" — a world in which "the greatest skill" involves "discovering emergent meaning among contexts that are continually in flux."

Clay Shirky, a digital-media scholar at New York University, suggested in a 2008 blog post that we shouldn't waste our time mourning the death of deep reading — it was overrated all along. "No one reads War and Peace," he wrote, singling out Tolstoy's epic as the quintessence of high literary achievement. "It's too long, and not so interesting." People have "increasingly decided that Tolstoy's sacred work isn't actually worth the time it takes to read it." The same goes for Proust's In Search of Lost Time and other novels that until recently were considered, in Shirky's cutting phrase, "Very Important in some vague way." Indeed, we've "been emptily praising" writers like Tolstoy and Proust "all these years." Our old literary habits "were just a side-effect of living in an environment of impoverished access." Now that the Net has granted us abundant "access," Shirky concluded, we can at last lay those tired habits aside.

Such proclamations seem a little too staged to take seriously. They come off as the latest manifestation of the outré posturing that has always characterized the anti-intellectual wing of academia. But, then again, there may be a more charitable explanation. Federman, Shirky, and others like them may be early exemplars of the post-literary mind, intellectuals for whom the screen rather than the page has always been the primary conduit of information. As Alberto Manguel has written, "There is an unbridgeable chasm between the book that tradition has declared a classic and the book (the same book) that we have made ours through instinct, emotion and understanding: suffered through it, rejoiced in it, translated it into our experience and (notwithstanding the layers of readings with which a book comes into our hands) essentially become its first readers." If you lack the time, the interest, or the facility to inhabit a literary work — to make it your own in the way Manguel describes — then of course you'd consider Tolstoy's masterpiece to be "too long, and not so interesting."

Although it may be tempting to ignore those who suggest the value of the literary mind has always been exaggerated, that would be a mistake. Their arguments are another important sign of the fundamental shift taking place in society's attitude toward intellectual achievement. Their words also make it a lot easier for people to justify that shift — to convince themselves that surfing the Web is a suitable, even superior, substitute for deep reading and other forms of calm and attentive thought. In arguing that books are archaic and dispensable, Federman and Shirky provide the intellectual cover that allows thoughtful people to slip comfortably into the permanent state of distractedness that defines the online life.

Our desire for fast-moving, kaleidoscopic diversions didn't originate with the invention of the World Wide Web. It has been present and growing for many decades, as the pace of our work and home lives has quickened and as broadcast media like radio and television have presented us with a welter of programs, messages, and advertisements. The Internet, though it marks a radical departure from traditional media in many ways, also represents a continuation of the intellectual and social trends that emerged from people's embrace of the electric media of the twentieth century and that have been shaping our lives and thoughts ever since. The distractions in our lives have been proliferating for a long time, but never has there been a medium that, like the Net, has been programmed to so widely scatter our attention and to do it so insistently.

David Levy, in Scrolling Forward, describes a meeting he attended at Xerox's famed Palo Alto Research Center in the mid-1970s, a time when the high-tech lab's engineers and programmers were devising many of the features we now take for granted in our personal computers. A group of prominent computer scientists had been invited to PARC to see a demonstration of a new operating system that made "multitasking" easy. Unlike traditional operating systems, which could display only one job at a time, the new system divided a screen into many "windows," each of which could run a different program or display a different document. To illustrate the flexibility of the system, the Xerox presenter clicked from a window in which he had been composing software code to another window that displayed a newly arrived e-mail message. He quickly read and replied to the message, then hopped back to the programming window and continued coding. Some in the audience applauded the new system. They saw that it would enable people to use their computers much more efficiently. Others recoiled from it. "Why in the world would you want to be interrupted — and distracted — by e-mail while programming?" one of the attending scientists angrily demanded.

The question seems quaint today. The windows interface has become the interface for all PCs and for most other computing devices as well. On the Net, there are windows within windows within windows, not to mention long ranks of tabs primed to trigger the opening of even more windows. Multitasking has become so routine that most of us would find it intolerable if we had to go back to computers that could run only one program or open only one file at a time. And yet, even though the question may have been rendered moot, it remains as vital today as it was thirty-five years ago. It points, as Levy says, to "a conflict between two different ways of working and two different understandings of how technology should be used to support that work." Whereas the Xerox researcher "was eager to juggle multiple threads of work simultaneously," the skeptical questioner viewed his own work "as an exercise in solitary, singleminded concentration." In the choices we have made, consciously or not, about how we use our computers, we have rejected the intellectual tradition of solitary, single-minded concentration, the ethic that the book bestowed on us. We have cast our lot with the juggler.

Excerpted from The Shallows: What the Internet Is Doing to Our Brains by Nicholas Carr. Copyright 2010 by Nicholas Carr. Excerpted by permission of W.W. Norton & Co.

Time Magazine - 10 Questions for the Dalai Lama

Interesting, though brief.

10 Questions for the Dalai Lama

Do you ever feel angry or outraged?
Kantesh Guttal, PUNE, INDIA

Oh, yes, of course. I'm a human being. Generally speaking, if a human being never shows anger, then I think something's wrong. He's not right in the brain. [Laughs.]

How do you stay so optimistic and faithful when there is so much hate in the world?
Joana Cotar, FRANKFURT
I always look at any event from a wider angle. There's always some problem, some killing, some murder or terrorist act or scandal everywhere, every day. But if you think the whole world is like that, you're wrong. Out of 6 billion humans, the troublemakers are just a handful.

How has the role set out for you changed since you first came to be the Dalai Lama?
Andy Thomas, CARMARTHEN, WALES
I became the Dalai Lama not on a volunteer basis. Whether I was willing or not, I [had to study] Buddhist philosophy like an ordinary monk student in these big monastic institutions. Eventually I realized I have a responsibility. Sometimes it is difficult, but where there is some challenge, that is also truly an opportunity to serve more.

Do you see any possibility of reconciliation with the Chinese government in your lifetime?
Joseph K.H. Cheng, MELBOURNE
Yes, there is a possibility. But I think past experience shows it is not easy. Many of these hard-liners, their outlook is very narrow and shortsighted. They are not looking at it in a holistic way. However, within the People's Republic of China, there is wider contact with the outside world. There are more and more voices of discontentment among the people, particularly among the intellectuals. Things will change — that's bound to happen.

How can we teach our children not to be angry?
Robyn Rice, GRAND JUNCTION, COLO.
Children always look to their parents. Parents should be more calm. You can teach children that you face a lot of problems but you must react to those problems with a calm mind and reason. I have always had this view about the modern education system: we pay attention to brain development, but the development of warmheartedness we take for granted.

Have you ever thought about being a normal person instead of being the Dalai Lama?
Grego Franco, MANILA
Yes, at a young age. Sometimes I felt, "Oh, this is a burden. I wish I was an unknown Tibetan. Then I'd have more freedom." But then later I realized that my position was something useful to others. Nowadays I feel happy that I'm Dalai Lama. At the same time, I never feel that I'm some special person. Same — we are all the same.

Do you miss Tibet?
Pamela Delgado Córdoba, AGUASCALIENTES, MEXICO
Yes. Tibetan culture is not only ancient but relevant to today's world. After seeing the problems of violence, we realize that Tibetan culture is one of compassion and nonviolence. There is also the climate. In India during monsoon season, it is too wet. Then, I very much miss [Tibet].

What do you say to people who use religion as a pretext to violence or killing?
Arnie Domingo, QUEZON CITY, PHILIPPINES
There are innocent, faithful people that are manipulated by some other people whose interest is different. Their interest is not religion but power or sometimes money. They manipulate religious faith. In such cases, we must make a distinction: these [bad things] are not caused by religion.

Have you ever tried on a pair of trousers?
Ju Huang, STAMFORD, CONN.
When it's very, very cold. And particularly in 1959, when I escaped, I wore trousers, like laypeople dressed. So I have experience.

Do you believe your time here on earth has been a success?
Les Lucas, KELOWNA, B.C.
Hmmm. That's relative. It's so difficult to say. All human life is some part failure and some part achievement.
