Saturday, December 25, 2010

Alva Noë - What Is Love?

The Love sculpture at the Indianapolis Museum of Art.

A somewhat different post this week from Alva Noë, in his column for NPR's 13.7: Culture and Cosmos blog.

Disclaimer: The author has written a great deal about perception, as a philosopher and as a cognitive scientist; whatever his authority, it is not that he is a better perceiver than others. And so it is for the topic of today's essay.

Children do not love their parents. They are connected to them. And they need them.

Love, in contrast, is an achievement. It doesn't come for free.

Scientists like to ask: How do we perceive so much, given the impoverished data actually available to the nervous system? But the better question would be: Why do we see so little, given all there is that shows itself to us?

To bring the world into focus for consciousness, we need to do a lot of work. Think of the work a child needs to do to learn to read. But once the child reads, texts have powerful, inescapable meanings.

Learning to read is, in this way, achieving an openness to the world, or at least to a world of ideas and possibilities.

Love, in a similar way, is an achieved openness. Love, I think, is a sustained openness to another person. It's hard to learn to read; in general, although we never notice this, it is very hard to learn to see. The purpose of education is to teach us how to see. It is very hard to be able to see another person. To see someone you must know him or her. Sometimes we just cannot see what is there before us!

They say the hot love of romance is fueled by hormonal events and chemical secretions in the brain. No doubt it is. It may also be fueled by music, wine, food, and tobacco, by fantasy and lust. The stars need to align just right for one to get oneself into a position to look, and so to see, someone else. The brain rush is good. Sex is good. Food is good. But in the end, these are the topography you must traverse to be able to perceive, and so maybe to find, love.

Love is an emotional condition. (So is seeing.) And love for another person, however wonderful, however pleasurable, is challenging.

If a love is very hot, it can take a long time for the heat to pass. But eventually it does; a transformation must take place. The less burning love of partners — the next condition — is more practical; it is no less an achievement. Openness to the other, over time, requires that one make arrangements. You’ve got to be with the one you love, and this means you’ve got to make adjustments, adapt, shift, accommodate. Mature love is an achievement of dedication.

The love of a parent for a child has something in common with both of these forms of love. It is passionate and sensual, like one’s first passion for one’s lover. But it is practical above all else; the parent erects a structure in which to understand and guide and know the child.

Love is not always love for a person.

You can love a work of art. Indeed, a work of art is an invitation to love. For it offers itself up to be recognized. Art is an opportunity for openness. Exactly the same is true of philosophy.

(To find out whether something is art, or philosophy, ask yourself: is it possible to love this?)

Can you love the whole world? I suppose this is like the question, Is there a God?

Photo: jmscottIMD/via Flickr

Robert Leahy - The Best Christmas Present? Try Gratitude

A little wisdom for this Christmas morning - gratitude, it does a heart good. In the days of yore (whenever yore was) people gave thanks for the rising sun and the rebirth of the light as the days began to get longer after the slow descent into winter. Now we can still practice that same gratitude for whatever gifts we have in our lives, no matter how simple or how small.

The Best Christmas Present? Try Gratitude
Robert Leahy, Ph.D. - Director of the American Institute for Cognitive Therapy
Posted: December 24, 2010

It's the holidays and you are wondering what to give those people in your life who are special. What's the latest gadget, the latest trend, the one thing you just can't live without? Or, you wonder what you might get. Is it the jewelry that you wanted? The iPad that seems a bit too expensive? Is it a gift certificate? Giving and getting. What wonderful Christmas joy this seems to be. "So, what did you get?"

I have a suggestion for a gift -- a gift that you can receive and give at the same time. It's called "gratitude." What you can do is think about the people that you love, the special people, and contemplate why they matter to you. What would life be like without your best friend, your partner, your mother or father, your kids? Imagine that they no longer existed and now you had a chance to get them back -- but only if you could prove that you really were grateful. What would you miss about your best friend? Think about the conversations, the memories, the laughter, and the tears you both shared. Now think about how grateful you are for having him or her in your life. Now, tell them.

I think back about my mother who died 24 years ago. I am forever grateful to her. She cared for me when I was a child, made me laugh, gave me confidence, kissed me and gave me the ability to love. I am grateful today. And always will be. I am grateful for people and things that are gone -- but stay with me forever because I keep them in my gratitude. No one can ever take away my appreciation.

I am grateful to friends, family members, to my colleagues, my patients who continue to teach me. I am grateful that I can open my eyes and see the snow from a window where I am sitting. I am grateful to all the authors whose work has inspired me, made me think and feel in new and deeper ways, authors like Shakespeare and John Donne and James Joyce and all the others -- all gone, perhaps, but all here forever in my heart.

A patient of mine told me about how he had been cheated out of money. He was bitter, dwelling on it and complaining. He had every right to feel this way. But I suggested that he set this aside for a few minutes and to imagine the following: "Everything has been taken away. All your senses, your family, money, job, memories -- you are nothing. And now you can get one thing back at a time -- but only if you can convince me that you truly appreciate it, truly have gratitude. What do you want back first -- and why do you appreciate it?" He was suddenly quiet, tears began forming in his eyes. He said, "I want my daughter back". And I asked him what he appreciated about her and he began describing the good and the bad -- the love, affection, fun and the obstacles they shared together. And he continued with wanting his wife and what she meant and why life would seem to be empty -- impossible -- without her.

And then I said, "Imagine you are blind. But you can open your eyes for ten minutes and see what is really important. What would that be?" And, of course, it was his family. "I noticed that in your list of things you didn't put the money or your job or your possessions. And it seems to me that you already have everything that is the most important. Except you haven't been noticing it, haven't paid attention. So over the next two weeks you can either focus on the money that you lost or you can make your family feel loved and appreciated. You choose."

He chose gratitude.

When I was a kid I read the short story by O. Henry -- "The Gift of the Magi." It's about a young couple, Della and Jim, who are poor but who love each other. Tomorrow is Christmas and neither one has enough money to buy the other the present they really want to give. Jim wants to get her a beautiful comb for her flowing hair; she wants to get him a chain for his heirloom watch. She sells her hair to buy the chain; he sells the watch to buy the comb. A comb -- but there will be no hair -- a watch-chain, but there is no longer a watch.

O. Henry ends the story with the following:

And here I have lamely related to you the uneventful chronicle of two foolish children in a flat who most unwisely sacrificed for each other the greatest treasures of their house. But in a last word to the wise of these days let it be said that of all who give gifts these two were the wisest. O all who give and receive gifts, such as they are wisest. Everywhere they are wisest. They are the magi.

Allen Frances - DSM 5 and Practical Consequences

Allen Frances offers some insights into the revision process for the DSM-5 in his Psychology Today blog - DSM5 in Distress.
Published on December 23, 2010

Last week, I had a brief, but heated debate with a friend who is on the DSM 5 Task Force. He is strongly supporting a proposed new diagnosis for DSM 5 that I oppose just as strongly. Surprisingly, I think we agree completely on the facts, but then disagree completely on how they should be interpreted and acted upon.

Here are the facts upon which we agree:
1) The available scientific literature, though quite limited, does confirm that potential patients do exist who would meet the suggested criteria for this disorder.
2) Existing studies suggest a rate of at least 5% of the proposed diagnosis in the general population.
3) The rate could conceivably double (or more) if the diagnosis becomes official, is widely used in primary care, and is targeted by drug company marketing.
4) There is no treatment with proven efficacy, but some people currently not diagnosed might benefit from existing treatment.
5) If included, the diagnosis will likely cause extensive false positive diagnosis of normals who will often receive unnecessary and potentially harmful and expensive treatment.

I consider these facts and conclude that:
1) It is premature to include this diagnosis until much more research is available on its rate in the general population, the rate of false positive diagnosis, whether treatment helps, and what its risks are.
2) New diagnoses must prove their safety and efficacy under the same strict standards of evidentiary support that we would require before the introduction of a new drug (since the risks and benefits can be equivalent).
3) Patients not covered by a specific label can always be diagnosed and treated within the "Not Otherwise Specified" categories.
4) Practical consequences are crucial in deciding whether a change should be made. The presence of a (skimpy) scientific literature indicating that patients with the proposed disorder can be found is insufficient to support its inclusion.
5) The default position is a "do no harm" conservative noninclusion. Any change in DSM 5 that can possibly be misused will very likely be misused; this is the clearest lesson of DSM IV.
6) Education on how to use DSM 5 will be dominated and twisted by drug company marketing.

My friend disagrees strongly, arguing that:
1) He knows from the literature and experience that such patients exist.
2) They need help.
3) It is irrelevant to his task to consider whether the inclusion of the proposed diagnosis in DSM 5 may lead to overdiagnosis and overtreatment. His job is simply to evaluate the available science.
4) Any potential misuse of DSM 5 is not his worry. It should be solved by education of the mental health clinicians.

You decide which approach makes more sense. It seems clear to me that pragmatic concerns for patient welfare always trump "science", especially since the "science" underpinning psychiatric diagnosis is so thin and subject to alternative interpretations.

A much fuller discussion of this tension between science vs pragmatics can be found in an extremely interesting issue of the Journal for the Advancement of Philosophy and Psychiatry (that is devoted in its entirety to the conceptual issues that face psychiatric diagnosis). See particularly the commentaries by Drs Porter, Kinghorn, and Ghaemi, and my replies to them.

The issue is available online at:

Friday, December 24, 2010

Dalai Lama Quote of the Week - Focusing on Breath Breaks Flow of Attachment


by H.H. the Dalai Lama
and Alexander Berzin


Dalai Lama Quote of the Week

When we focus our attention on the passage of breath, we break the usually continuous flow of thoughts of attachment, hostility and so forth, whatever they might be. This causes such thoughts to subside for the moment. Thus, by occupying the mind with our breath, we cleanse it of all positive and negative conceptual thoughts and thus remain in a neutral state of mind unspecified as either constructive or destructive. This is the meaning of the line in the root text, "Thoroughly clean out your state of awareness." This unspecified or neutral state of mind, cleaned out of all positive and negative conceptual thoughts, is the most conducive one to work with. Because an unspecified state of mind like this is unburdened and supple, it is relatively easy to generate it into a constructive state.

--from The Gelug/Kagyu Tradition of Mahamudra by H.H. the Dalai Lama and Alexander Berzin, published by Snow Lion Publications

The Gelug/Kagyu Tradition of Mahamudra • Now at 50% off
(Good through December 31st).

Geshe Jampa Tegchok - Choosing a Meditation Style that Fits

An Explanation of the Thirty-seven
Practices of Bodhisattvas

by Geshe Jampa Tegchok,
edited by Thubten Chodron


Dharma Quote of the Week

It is important to note that we should make sure that our meditation suits our mind. If we feel comfortable doing analytical meditation on the various topics in a progressive way, we should go ahead with it. If, on the other hand, we find it difficult and it is not compatible with our mind, we should meditate on whatever topic we like.

If we enjoy meditation on emptiness, we should go ahead with this. If it suits us and we derive pleasure from meditating principally on the altruistic intention, we can emphasize this. At some point if we find that we cannot really get into whatever analytical meditation we have been doing, but doing prostrations, chanting mantra, visualizing a meditation deity, or reciting aspirational prayers brings peace and pleasure to our mind, we should do that practice.

--from Transforming Adversity into Joy and Courage: An Explanation of the Thirty-seven Practices of Bodhisattvas by Geshe Jampa Tegchok, edited by Thubten Chodron, published by Snow Lion Publications

Transforming Adversity into Joy and Courage • Now at 50% off
(Good through December 31st).

Grindstone Redux - Underground Music in the 1980s

Cool - I remember buying albums by the Circle Jerks, Dead Kennedys, Butthole Surfers, and other non-commercial punk musicians in a little hole-in-the-wall record store in Grants Pass, Oregon - if you were friendly with the owner, you could also buy pipes and bongs (which were illegal then), but that's another documentary entirely.

Grindstone Redux (2010) 60 min

The Story of the 1980's Underground Music Network.

This is the story of how the music business was transformed in the 1980s by like-minded musicians who decided to self-publish their work. They formed a “network” before the internet or email made it commonplace.

Today it is common practice for musicians to self-publish their work. But this is a recent development in music history, one that began in the 1980s. It came in response to a music business that released and promoted only a narrow spectrum of music.

Before the internet or email, there was a network of “Underground” musicians and bands who networked with each other by producing recordings at home, exchanging them with each other, and communicating with hand-written letters. They created their own “record labels.” They produced independent magazines and the first generation of radio shows devoted to independently released music.

In the 1980s the music industry was also transformed by numerous technological developments: the first affordable recording equipment, the first affordable synthesizers and affordable photo-copying. For the first time, musicians were able to record and distribute their music without a “major” record company.

The network was supported by small, independent record stores (which have all but disappeared) where musicians could place their work for sale, as well as discover other like-minded artists and magazines. This created a sense of community in which artists were working locally with stores in their home towns. To a great extent, these stores helped build the network.

Authors@Google: Chris Chabris - The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us

Remember the viral video where the people are passing around the basketball and in the middle a man in a gorilla suit walks through? Most people missed the gorilla because they were told to focus on the passing of the ball.

Chris Chabris was the researcher behind that video (with Daniel Simons) - and here talks about the book he co-authored: The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us.
Google, Mountain View on October 5, 2010.

The original "invisible gorilla" video is here:

The video played at 6:08 is here:

Tom Vanderbilt Reviews The Invisible Gorilla:
"Do you remember when you first saw--or more likely, didn't see--the gorilla? For me it was one afternoon a number of years ago when I clicked open one of those noxious-but-irresistible forwarded emails ("You Won't Believe Your Eyes!"). The task was simple--count the number of passes in a tight cluster of basketball players--but the ensuing result was astonishing: As I dutifully (and correctly) tracked the number of passes made, a guy in a gorilla suit had strolled into the center, beat his chest, and sauntered off. But I never saw the gorilla. And I was hardly alone.

The video, which went on to become a global viral sensation, brought "inattentional blindness"--a once comparatively obscure interest of cognitive psychologists--into striking relief. Here was a dramatic reminder that looking is not necessarily seeing, that "paying" attention to one thing might come at the cost of missing another altogether. No one was more taken with the experience than the authors of the original study, Daniel Simons and Christopher Chabris, as they recount in their new--and, dare I say, eye-opening--book, The Invisible Gorilla. "The fact that people miss things is important," they write, "but what impressed us even more was the surprise people showed when they realized what they had missed."
The Invisible Gorilla uses that ersatz primate as a departure point (and overarching metaphor) for exploring the myriad of other illusions, perceptual or otherwise, that we encounter in everyday life--and our often complete lack of awareness as we do so. These "gorillas" are lurking everywhere--from the (often false) memories we think we have to the futures we think we can anticipate to the cause-and-effect chains we feel must exist. Writing with authority, clarity, and a healthy dose of skepticism, Simons and Chabris explore why these illusions persist--and, indeed, seem to multiply in the modern world--and how we might work to avoid them. Alas, there are no easy solutions--doing crosswords to stave off cognitive decline in one's dotage may simply make you better at doing crosswords. But looking for those "gorillas in our midst" is as rewarding as actually finding them."

The Invisible Gorilla website:

Thursday, December 23, 2010

Stephen Dufrechou - Are We Too Dumb for Democracy? The Logic Behind Self-Delusion

That's not a rhetorical question. But I am also not convinced "dumb" is the right word. Dufrechou is on solid footing in referencing George Lakoff, but much less so when referring to psychoanalysis.

To be honest, Dufrechou has it backwards in saying that psychoanalysis supports Lakoff's research - it's the other way around. There is little to no empirical support for psychoanalysis, while Lakoff, as a linguist, is rivaled only by Chomsky.

Anyway, this is an interesting look at why people hold on to false beliefs in the face of overwhelming evidence to the contrary.

A recent cognitive study, as reported by the Boston Globe, concluded that:

Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

In light of these findings, researchers concluded that a defense mechanism, which they labeled “backfire”, was preventing individuals from producing pure rational thought. The result is a self-delusion that appears so regularly in normal thinking that we fail to detect it in ourselves, and often in others: When faced with facts that do not fit seamlessly into our individual belief systems, our minds automatically reject (or backfire) the presented facts. The result of backfire is that we become even more entrenched in our beliefs, even if those beliefs are totally or partially false.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” said Brendan Nyhan, the lead researcher of the Michigan study. The occurrence of backfire, he noted, is “a natural defense mechanism to avoid that cognitive dissonance.”

The conclusion made here is this: facts often do not determine our beliefs, but rather our beliefs (usually non-rational beliefs) determine the facts that we accept. As the Boston Globe article notes:

In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we chose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we’re right, and even less likely to listen to any new information. And then we vote.

Despite this finding, Nyhan claims that the underlying cause of backfire is unclear. “It’s very much up in the air,” he says. And on how our society is going to counter this phenomenon, Nyhan is even less certain.

These latter unanswered questions are expected in any field of research, since every field has its own limitations. Yet here the field of psychoanalysis can complete the picture.

Disavowal and Backfire: One and the Same

In an article, psychoanalyst Rex Butler independently comes to the same conclusion as the Michigan study researchers. With regard to facts and their relationship to belief systems (or ideologies), Butler says that:

there is no necessary relationship between reality and its symbolization … Our descriptions do not naturally and immutably refer to things, but … things in retrospect begin to resemble their description. Thus, in the analysis of ideology, it is not simply a matter of seeing which account of reality best matches the ‘facts’, with the one that is closest being the least biased and therefore the best. As soon as the facts are determined, we have already – whether we know it or not – made our choice; we are already within one ideological system or another. The real dispute has already taken place over what is to count as the facts, which facts are relevant, and so on.

This places the field of psychoanalysis on the same footing as that of cognitive science in this regard. But where the cognitive studies end, with Nyhan’s question about the cause of backfire, psychoanalysis picks up and provides a possible answer. In fact, psychoanalysts have been publishing work on backfire for decades; psychoanalysis simply refers to backfire by another name: “disavowal”. Indeed, these two terms refer to one and the same phenomenon.

The basic explanation for the underlying cause of disavowal/backfire goes as follows.

“Liberals” and “conservatives” espouse antithetical belief systems, both of which are based on different non-rational “moral values.” This is a fact that cognitive linguist George Lakoff has often discussed, which incidentally brings in yet another field of study that supports the existence of the disavowal/backfire mechanism.

In accordance with these different non-rational belief systems, any individual’s ideology tends to function also as a ‘filtering system’, accepting facts that seamlessly fit into the framework of that ideology, while dismissing facts that do not fit.

When an individual—whether a “liberal”, a “conservative”, or an adherent of any other ideology—is challenged with facts that conflict with his or her ideology, the tendency is for that individual to experience feelings of anxiety, dread, and frustration. This is because our ideologies function, like a linchpin, to hold our psychologies together, in order to avoid, as Nyhan puts it, “cognitive dissonance”. In other words, when our linchpins are disturbed, our psychologies are shaken.

Psychoanalysts explain that, when this cognitive dissonance does occur, the result is to ‘externalize’ the sudden negative feelings outward, in the form of anger or resentment, and then to ‘project’ this anger onto the person that initially presented the set of backfired facts to begin with. (Although, sometimes this anger is ‘introjected’ inward, in the form of self-punishment or self-loathing.)

This non-rational eruption of anger or resentment is what psychoanalysts call “de-sublimation”. And it is at the point of de-sublimation that the disavowal/backfire mechanism is triggered as a defense against the cognitive dissonance.

Hence, here is what mentally occurs next, in a matter of seconds:

In order to regain psychological equilibrium, the mind disavows the toxic facts that initially clashed with the individual’s own ideology, non-rationally deeming the facts to be false—without assessing their validity.

The final step occurs when the person who offered the toxic facts is non-rationally demonized. That person becomes tainted as a ‘phobic object’ in the mind of the de-sublimated individual, and so comes to be perceived as being as toxic as the disavowed facts themselves.

At this point, ad hominem attacks are often fired at the source of the toxic facts. For example: ‘stupid liberal’ or ‘stupid conservative’, if in a political context. Or, ‘blasphemer’ or ‘heretic’, if in a religious context. At this point, according to psychoanalysis, psychological equilibrium is regained. The status quo of the individual’s ideology is reinforced to guard against future experiences of de-sublimation.

Why Do Different Ideologies Exist?

This all raises an obvious question about the differing ideologies people hold: why do they exist, and how are they constituted differently? George Lakoff has demonstrated in his studies (which are supported strongly by psychoanalysis) that human beings are not born already believing an ideology. Rather, people are socialized into an ideology during their childhood formative years. The main agents who prescribe the ideology are the parental authority figures surrounding the child, who rear him, from infantile dependency on the parent-figures, into an independent adult. The parental values concerning how the child should be an independent and responsible adult, in his relations with himself and others, later inform that child’s ideology as an adult.

Lakoff shows that two dominant parenting types exist, which can determine the child’s adult ideology. Individuals reared under the “Strict Parent” model tend to grow up as political conservatives, while those raised under a “Nurturing Parent” model tend to become political liberals. His most influential book on these matters, “Moral Politics: How Liberals and Conservatives Think”, was published in 1996.

Of course, peoples’ minds can fundamentally change, along with their ideological values. But short of a concerted effort by an individual to change, through one form of therapy or another, that change is mostly fostered by traumatic or long-endured life experiences.

Yet many minds remain rock solid for life, beliefs included. As psychiatrist Scott Peck sees it, “Only a relative and fortunate few continue until the moment of death exploring the mystery of reality, ever enlarging and refining and redefining their understanding of the world and what is true.”

Thus, to answer Nyhan’s question—how can society counter the negative effects of backfire?—it seems only one answer is viable. Society will need to adopt the truths uncovered by cognitive science and psychoanalysis, and use those truths to inform its overall cultural practices and values. Short of that, Peck’s “fortunate few” will remain the only individuals among us who resist self-delusion.

~ Stephen Dufrechou is Editor of Opinion and Analysis for News Junkie Post.

Alex Ross - Listen to This

Sounds like a cool book - via Google Tech Talks - although maybe a bit geeky. [This post is corrected from the earlier one where I had the wrong title.]
Alex Ross: Listen to This
October 15, 2010.

About The Rest Is Noise: Listening to the Twentieth Century
(Ross' last book)

"Alex Ross's award-winning international bestseller, The Rest Is Noise, has become a contemporary classic, establishing him as one of our most popular and acclaimed cultural historians.

Listen to This, which takes its title from a beloved 2004 essay in which Ross described his late-blooming discovery of pop music, showcases the best of Ross's writing from more than a decade at The New Yorker. These pieces, dedicated to classical and popular artists alike, are at once erudite and lively. In a previously unpublished essay, Ross brilliantly retells hundreds of years of music history--from Renaissance dances to Led Zeppelin--through a few iconic bass lines of celebration and lament. He vibrantly sketches canonical composers such as Schubert, Verdi, and Brahms; gives us in-depth interviews with modern pop masters such as Björk and Radiohead; and introduces us to music students at a Newark high school and to indie-rock hipsters in Beijing.

Whether his subject is Mozart or Bob Dylan, Ross shows how music expresses the full complexity of the human condition. Witty, passionate, and brimming with insight, Listen to This teaches us how to listen.

About the Author:

Alex Ross has been the music critic for The New Yorker since 1996. He is the author of the international bestseller The Rest Is Noise: Listening to the Twentieth Century, which was a finalist for the 2008 Pulitzer Prize and won the 2007 National Book Critics Circle Award."

Documentary - Food Matters

Good stuff - and important for everyone to see.

“Let thy Food be thy Medicine and thy Medicine be thy Food” – Hippocrates. That is the message from the founding father of modern medicine echoed in the controversial new documentary film Food Matters from Producer-Directors James Colquhoun and Laurentine ten Bosch.

With nutritionally-depleted foods, chemical additives and our tendency to rely upon pharmaceutical drugs to treat what’s wrong with our malnourished bodies, it’s no wonder that modern society is getting sicker. Food Matters sets about uncovering the trillion dollar worldwide ’sickness industry’ and gives people some scientifically verifiable solutions for overcoming illness naturally.

“With access to better information people invariably make better choices for their health…”

In what promises to be the most contentious idea put forward, the filmmakers have interviewed several leading experts in nutrition and natural healing who claim that not only are we harming our bodies with improper nutrition, but that the right kind of foods, supplements and detoxification can be used to treat chronic illnesses as fatal as terminally diagnosed cancer.

The focus of the film is on helping us rethink the belief systems fed to us by our modern medical and health care establishments. The interviewees point out that not every problem requires costly, major medical attention and reveal many alternative therapies that can be more effective, more economical, less harmful and less invasive than conventional medical treatments.

The ‘Food Matters’ duo have independently funded the film from start to finish in order to remain as unbiased as possible, delivering a clear and concise message to the world. Food Matters.

A "New" Approach to Health from Jamie Simko on Vimeo.

Maureen O'Hara - Relational Empathy: Beyond Modernist Egocentricism to Postmodern Holistic Contextualism

Nice article (book chapter) from Maureen O'Hara of the Center for Studies of the Person, La Jolla, California, on the nature of relational empathy in moving away from an individualist perspective in psychotherapy.

Maureen O'Hara
Center for Studies of the Person, La Jolla, California

Considering empathy as both construct and human activity, the chapter contributes to the fast-growing discussion of the limits of the indigenous psychology of the Western world in addressing the relational needs of its members. In particular it examines the limits of Modernist individualism as a paradigm for understanding human experience, and on ways Western psychological descriptions and understandings of empathy in particular — whether Rogerian, psychoanalytic, existential or more generic — have obscured some of the important ways empathy functions in human relationships. Because of its position as a modernist, objectivist discourse, Western psychology has been slow to recognize how its own modes of enquiry and expression have limited our understanding of relational realities. The chapter extends understanding of empathy beyond its present role as the "royal road to understanding" of individuals by approaching it from within somewhat different frames of reference from those traditionally characteristic of psychological discussion. Empathy is then discussed in a more multi-levelled or holistic way as a way of being in, belonging to and knowing the relational contexts in which human beings find ourselves situated. Although the main arguments expand understanding of empathy as a therapeutic process the chapter concludes with a discussion of the social conditions of late twentieth century psychology. As our world undergoes what some consider to be the birth pangs of its first truly "global civilization", in which national, ethnic, religious, gender, class, boundaries are being shifted and erased on unprecedented scales, all of us, whether in formerly tribal or collectivist societies or in Western individualist ones, will need new postmodernist psychologies with which to navigate this new world.

Wednesday, December 22, 2010

Michael Shermer - Stephen Hawking’s Radical Philosophy of Science

I did not know that Stephen Hawking had become a relativist in his perspective on science - or, as Shermer calls it, belief-dependent realism: "None of us can ever be completely sure that the world really is as it appears, or if our minds have unconsciously imposed a misleading pattern on the data." Shermer refers to studies (and there are a lot of them) that show our beliefs, assumptions, and biases shade the way we view the world, speak about it, and relate to it.

Hawking's new perspective is similar:
Hawking presents a philosophy of science he calls “model-dependent realism,” which is based on the assumption that our brains form models of the world from sensory input, that we use the model most successful at explaining events and assume that the models match reality (even if they do not), and that when more than one model makes accurate predictions “we are free to use whichever model is most convenient.”
Here is the whole article - very interesting reading from my perspective.

From Big Questions Online:

Stephen Hawking’s Radical Philosophy of Science

Is Hawking right to claim that reality is dependent on the model used to describe it?

Stephen Hawking in zero-G
photo: NASA
Tuesday, November 23, 2010

Do you think that there is a computer screen sitting in front of you right now?

It would certainly seem so if you are reading these words online, but in fact you are not actually “seeing” the computer screen in front of you. What you see are photons of light bouncing off the screen (and generated by the internal electronics of the screen itself), which pass through the hole in the iris of your eye, through the liquid medium inside your eye, wending their way through the bipolar and ganglion cells to strike the rods and cones at the back of your retina. These photons of light carry just enough energy to bend the molecules inside the rods and cones to change the electrochemical balance inside these cells, causing them to fire, or have what neuroscientists call an “action potential.”

From there the nerve impulse races along the neural pathway from the retina to the back of the brain, leaping from neuron to neuron across tiny gaps called synaptic clefts by means of neurotransmitter substances that flow across those gaps. Finally, they encounter the visual cortex, where other neurons record the signals that have been transduced from those photons of light, and reconstruct the image that is out there in the world.

Out of an incomprehensible number of data signals pouring in from the senses, the brain forms models of faces, tables, cars, trees, and every conceivable known (and even unknown — imagined) object and event. It does this through something called neural binding. A “red circle” would be an example of two neural network inputs (“red” and “circle”) bound into one percept of a red circle. Downstream neural inputs, such as those closer to muscles and sensory organs, converge as they move upstream through convergence zones, which are brain regions that integrate information coming from various neural inputs (eyes, ears, touch, etc.). You end up perceiving a whole object instead of countless fragments of an image. This is why you are seeing an entire computer screen with a meaningful block of text in front of you right now, and not just a jumble of data.
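The binding idea in the paragraph above can be caricatured in a few lines of code. This is a toy sketch of my own, not a brain model: it simply treats a percept as the conjunction of whatever features are reported at the same location (one classic toy account of binding); the feature names and locations here are made up.

```python
# Toy illustration of "neural binding" (an illustrative sketch, not a
# brain model): separate feature channels detected at the same location
# are merged into one percept, e.g. "red" + "circle" -> a red circle.

def bind_features(detections):
    """Group (feature, location) detections into one percept per location."""
    percepts = {}
    for feature, location in detections:
        percepts.setdefault(location, []).append(feature)
    # A percept is the conjunction of all features found at one location.
    return {loc: " ".join(sorted(feats)) for loc, feats in percepts.items()}

# Two independent channels report at the same spot, a third elsewhere.
detections = [("red", (3, 4)), ("circle", (3, 4)), ("blue", (0, 0))]
print(bind_features(detections))  # {(3, 4): 'circle red', (0, 0): 'blue'}
```

The real story involves synchronized neural activity rather than a lookup by coordinates, but the cartoon captures the gist: fragments become wholes.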

At any given moment there are, in fact, hundreds of percepts streaming into the brain from the various senses. All of them must be bound together for higher brain regions to make sense of it all. Large brain areas such as the cerebral cortex coordinate inputs from smaller brain areas such as the temporal lobes, which themselves collate neural events from still smaller brain modules such as the fusiform gyrus (for facial recognition). This reduction continues all the way down to the single neuron level, where highly selective neurons — sometimes described as “grandmother” neurons — fire only when subjects see someone familiar. Other neurons only fire when an object moves left to right across one’s visual field. Still other neurons only fire when an object moves right to left across the visual field. And so on, up the networks, goes the binding process. Caltech neuroscientists Christof Koch and Gabriel Kreiman, in conjunction with UCLA neurosurgeon Itzhak Fried, for example, have even found a single neuron that fires when the subject is shown a photograph of Bill Clinton (PDF) and no one else!

The models generated by biochemical processes in our brains constitute “reality.” None of us can ever be completely sure that the world really is as it appears, or if our minds have unconsciously imposed a misleading pattern on the data. I call this belief-dependent realism. In my forthcoming book, The Believing Brain, I demonstrate the myriad ways that our beliefs shape, influence, and even control everything we think, do, and say about the world. The power of belief is so strong that we typically form our beliefs first, then construct a rationale for holding those beliefs after the fact. I claim that the only escape from this epistemological trap is science. Flawed as it may be because it is conducted by scientists who have their own set of beliefs determining their reality, science itself has a set of methods to bypass the cognitive biases that so cripple our grasp of the reality that really does exist out there.

According to the University of Cambridge cosmologist Stephen Hawking, however, not even science can pull us out of such belief dependency. In his new book, The Grand Design, co-authored with the Caltech mathematician Leonard Mlodinow, Hawking presents a philosophy of science he calls “model-dependent realism,” which is based on the assumption that our brains form models of the world from sensory input, that we use the model most successful at explaining events and assume that the models match reality (even if they do not), and that when more than one model makes accurate predictions “we are free to use whichever model is most convenient.” Employing this method, Hawking and Mlodinow claim that “it is pointless to ask whether a model is real, only whether it agrees with observation.”

For example, in physics experiments sometimes light acts as a particle and sometimes it acts as a wave. Well, which is it, particle or wave? The answer depends on which model of light you use. In the famous double-slit experiment, light is passed through two slits and forms an interference pattern of waves on the back surface. When you send single photons of light one at a time through one slit, the light acts like individual particles. But when you shoot the single photons of light one at a time through two slits, they form an interference wave pattern as if they were interacting with other photons, even though they are not … at least not in this universe!
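The interference pattern described here follows the standard far-field two-slit formula, relative intensity I(theta) = cos^2(pi * d * sin(theta) / wavelength), with the single-slit envelope ignored. A quick sketch, using made-up values for slit spacing and wavelength:

```python
import math

# Relative two-slit intensity, far field, single-slit envelope ignored:
# I(theta) = cos^2(pi * d * sin(theta) / wavelength).
# d (slit spacing) and wavelength below are illustrative values only.

def intensity(theta, d=2e-6, wavelength=500e-9):
    return math.cos(math.pi * d * math.sin(theta) / wavelength) ** 2

print(intensity(0.0))  # 1.0 -- central bright fringe

# First dark fringe, where the path difference is half a wavelength:
theta_min = math.asin(500e-9 / (2 * 2e-6))
print(round(intensity(theta_min), 12))  # 0.0
```

The bands of light and dark fall exactly where this formula predicts, which is the puzzle: single photons fired one at a time still build up the same pattern.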

How is this possible? One solution to the mystery is that the photons are interacting with photons in other universes. Hawking and Mlodinow employ the model developed by Richard Feynman called “sum over histories,” in which every particle in the double-slit experiment takes every possible path that it can, and thus interacts with itself in its different histories.

So which model of light best matches reality? According to Hawking and Mlodinow, none of them do, or they all do. “There is no picture- or theory-independent concept of reality,” the scientists conclude. “If there are two models that both agree with observation, like the goldfish’s picture and ours, then one cannot say that one is more real than another. One can use whichever model is more convenient in the situation under consideration.”

Model-dependent realism argues that there is no privileged position in the universe — no Archimedean point outside of our brain that we can access to know what reality really is. There are just models. It is not possible to understand reality without having some model of reality, so we are really talking about models, not reality. Is there a way around this apparent epistemological trap?

There is. It’s called science.

The tools and methods of science were designed to test whether or not a particular model or belief about reality matches observations made not just by ourselves but by others as well. When one scientific lab corroborates the findings of another lab, and those findings support a tested model, then it strengthens our confidence that the model (or hypothesis, or theory) more closely corresponds to reality, even if we can never know with 100 percent certainty the true nature of that reality.

Even when two models appear to be equally supported by observations, over time we accumulate more precise observations that tell us which model more closely matches reality. Historians of science contend that in the 16th century, the newly introduced Copernican sun-centered model of the solar system was, in fact, no better at explaining the observations of the movement of the planets than was the Ptolemaic earth-centered model. As observations of the movement of planets increased in accuracy, the Copernican model won out.
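The point about accumulating observations can be illustrated with a toy calculation (my example, not Shermer's): two models that differ slightly in their predictions, scored by squared error against noisy data. With enough observations, the model closer to the truth reliably wins.

```python
import random

# Toy "model selection by accumulated error" (illustrative only).
# The world: y = 2x plus noise. Model A predicts 2x; model B predicts
# 2x + 0.5. Squared prediction error, summed over observations, picks A.

random.seed(0)  # fixed seed so the comparison is reproducible

def total_sq_error(predict, data):
    return sum((y - predict(x)) ** 2 for x, y in data)

data = [(x, 2 * x + random.gauss(0, 0.2)) for x in range(100)]

err_a = total_sq_error(lambda x: 2 * x, data)
err_b = total_sq_error(lambda x: 2 * x + 0.5, data)
print(err_a < err_b)  # model A tracks the observations more closely
```

With only a handful of noisy observations the two error totals can be close, just as Copernicus and Ptolemy once were; more and better data is what separates them.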

If model-dependent realism were taken to its nth degree, we could never actually say that the Copernican model is better than, or superior to, or more closely matches reality than the Ptolemaic model. Hawking and Mlodinow would surely agree, because they argue that a model is good if it meets four criteria:

  • Is elegant
  • Contains few arbitrary or adjustable elements
  • Agrees with and explains all existing observations
  • Makes detailed predictions about future observations that can disprove or falsify the model if they are not borne out

As a historian of science, I conclude that, in fact, nearly all scientific models — indeed, belief models of all sorts — can be parsed in such a manner and, in time, found to be better or worse than other models. In the long run, we discard some models and keep others based on their validity, reliability, predictability, and perceived match to reality. Yes, even though there is no Archimedean point outside of our brains, I believe there is a real reality, and that we can come close to knowing it through the lens of science — despite the indelible imperfection of our brains, our models, and our theories.

Such is the nature of science, which is what sets it apart from all other knowledge traditions.

Michael Shermer is the publisher of Skeptic magazine, a monthly columnist for Scientific American, and an adjunct professor at Claremont Graduate University. His books include The Science of Good and Evil, Why Darwin Matters, and The Mind of the Market. He can be reached at

Humanities in the Digital Age: Alison Byerly & Steven Pinker

Communications Forum

Nice discussion, courtesy of MIT World: Distributed Intelligence.

There are many conservatives (in name only - a true conservative would never advocate for this) who would love to see the humanities no longer taught in colleges and universities, or at least not in those that are publicly funded.

They would love to see university education focused on career paths, not on providing a well-rounded liberal arts education. Maybe it's that word, liberal. Or maybe it's that well-educated people tend to be more independent thinkers and less likely to be knee-jerk reactionaries (i.e., fundamentalist conservatives).

Humanities in the Digital Age

Reports of the demise of the humanities are exaggerated, suggest these panelists, but there may be reason to fear its loss of relevance. Three scholars whose work touches a variety of disciplines and with wide knowledge of the worlds of academia and publishing ponder the meaning and mission of the humanities in the digital age.

Getting a handle on the term itself proves somewhat elusive. Alison Byerly invokes those fields involved with “pondering the deep questions of humanity,” such as languages, the arts, literature, philosophy and religion. Steven Pinker boils it down to “the study of the products of the human mind.” Moderator David Thorburn wonders if the humanities are those endeavors that rely on interpretive rather than empirical research, but both panelists vigorously make the case that the liberal arts offer increasing opportunities for data-based analysis.

Technology is opening up new avenues for humanities scholars. In general, Byerly notes, humanities “tend to privilege individual texts or products of the human mind, rather than collective wisdom or data.” More recently, online collections or databases of text, art and music make possible wholly different frameworks for study. Pinker cites his own use of automated text analysis in Google books to research the history of violence, tracing the rise and fall of such words as “glorious” and “honorable” -- connected in times past with nations’ war-making. Humanities scholars could routinely deploy tools like this to strengthen argument and interpretation, says Pinker, allowing them “to say things are warranted, true, coherent.”

Humanists are adopting new tools and methods for teaching and publishing as well. Byerly describes the freedom afforded her as a professor of Victorian literature when she can direct students to specific interactive websites for historical and cultural background, allowing her to focus on a specific novel in class. As Middlebury provost, she has been broadening the concept of publication to include work in different media online. However, as Pinker notes, the process for publishing articles in scholarly journals remains painfully slow: in experimental psychology, a “six year lag from having an idea to seeing it in print.” He suggests a “second look” at the process of peer review, perhaps publishing everything online, “and stuff that’s crummy sinks to the bottom as no one links to or reads it.” Pinker looks forward to a future where he no longer has to spend a “lot of time leafing through thick books” looking for text strings, or flipping to and from footnotes. “We could love books as much as we always have, but not necessarily confine ourselves to their limitations, which are just historical artifacts,” he says.

Such changes in the humanities may not come a moment too soon. In spite of relatively stable numbers of graduates in the U.S., the liberal arts may be increasingly endangered. Byerly sees “an inherent aura of remoteness about humanities: It studies the past, and distant past. At a time when technology seems to be speeding things up, bringing information to us faster, humanities’ pace doesn’t seem in tune with the times.” Pinker’s “nightmare scenario” is the “disaggregation of practical aspects of undergraduate education of students, and humanities,” akin to the way newspapers lost classified advertising. Humanities faculty, who tend not to bring in grants the way science faculty do, may prove an irresistible target for budget cutters. To protect these fields, Pinker proposes, integrate them with social sciences: connect English literature to the sciences of human nature, for instance, or music theory to auditory perception. “Make humanities’ faculty indispensable,” he urges.


David Thorburn headshot

David Thorburn - MIT Professor of Literature
MacVicar Faculty Fellow Director, MIT Communications Forum

More on David Thorburn


Alison Byerly headshot

Alison Byerly - Professor of English & American Literatures, Middlebury College

Byerly's Middlebury website

Steven Pinker headshot

Steven Pinker - Harvard College Professor, and Johnstone Family Professor of Psychology

Pinker at Harvard | Department of Psychology Faculty Bio

Luxuriant Flowing Hair Club for Scientists

Jesse Prinz - Does Consciousness Outstrip Sensation?

associative visual agnosia

This is a very cool post from a very cool blog (On the Human) - Jesse Prinz attempts to narrow down the types of qualia of which we can be conscious - and how that process might look inside the brain. I don't really agree with Prinz, but I always enjoy reading what other people think about the "hard problem" of consciousness.

Does Consciousness Outstrip Sensation?

by Jesse Prinz
City University of New York, Graduate Center

1. Introduction: The Battle of the Bulge

In trying to develop a theory of how consciousness arises in the brain, it is important to begin with an account of which kinds of brain events can be conscious. Once we know which brain events are conscious, we can investigate what distinguishes them from those that are not. Pretty much everyone agrees that some activities in the brain are never conscious. For example, it would be hard to find researchers who think there can be conscious events in the cerebellum. Most researchers nowadays also deny that there can be conscious events in subcortical structures, though there is an occasional plea for the thalamus (Baars) or the reticular formation (Damasio). But what about the neocortex? Is any activity in that folded carapace a candidate for conscious experience? Does each cortical neuron vie for the conscious spotlight, like the contestants on a televised talent show? (Recall Dennett on “cerebral celebrity.”)

To make progress on this question, we can ascend from brain to mind, and ask which of our psychological states can be conscious. Answers to this question range from boney to bulgy. At one extreme, there are those who say consciousness is limited to sensations; in the case of vision, that would mean we consciously experience sensory features such as shapes, colors, and motion, but nothing else. This is called conservatism (Bayne), exclusivism (Siewert), or restrictivism (Prinz). On the other extreme, there are those who say that cognitive states, such as concepts and thoughts, can be consciously experienced, and that such experiences cannot be reduced to associated sensory qualities; there is “cognitive phenomenology.” This is called liberalism, inclusivism, or expansionism. If defenders of these bulgy theories are right, we might expect to find neural correlates of consciousness in the most advanced parts of our brain.

In this discussion, I will battle the bulge. I will sketch a restrictive theory of consciousness and then consider arguments for and against cognitive phenomenology.

2. The Locus of Consciousness

Not only do I think consciousness is restricted to the senses; I think it arises at a relatively early level of sensory processing. Consider vision. According to mainstream models in neuroscience, vision is hierarchically organized. Let’s consider where in that hierarchy consciousness arises.

Low-level vision, associated with processes in primary visual cortex (V1), registers very local features, such as small edges and bits of color. Intermediate-level vision is distributed across a range of brain areas with names like V2 through V7. Neural activations in these areas integrate local features into coherent wholes, but, at the same time, they preserve the componential structure of the stimulus. The intermediate level represents shapes as presented from particular vantage points, separated from a background, and located at some position in the visual field. High-level vision abstracts away from these features, and generates representations that are invariant across a range of different viewing positions. Such invariant representations facilitate object recognition; a rose seen from different angles in different light causes the same high-level response, allowing us to recognize it as the same. Following Ray Jackendoff, I think consciousness arises at the intermediate level. We experience the world as a collection of bounded objects from a particular point of view, not as disconnected, edged, or viewpoint invariant abstractions.

Consciousness at the intermediate level

To drive home this point, consider some facts about low- and high-level vision. The contents of low-level vision don’t seem to match the contents of experience. For example, when two distinct colors are rapidly flickered, we experience one fused color, but low-level vision treats the flickered colors as distinct; fusion occurs at the intermediate level (Jiang et al., 2007; see also Gur and Snodderly, 1997). Likewise, the low level seems oblivious to readily perceived contours, which are registered in intermediate areas (Schira et al. 2004). There is also evidence that the suppression of blink responses doesn’t arise until intermediate-level V3; V1 activity cannot explain why blinks go unperceived (Bristow et al., 2005). High-level vision does better than low-level vision on these features, but suffers from other problems when considered as a candidate for consciousness. Many high-level neurons are largely indifferent to size, position, and orientation. High-level neurons can also be indifferent to handedness, meaning they fire the same way when an object is facing to the right or the left. In addition, high-level neurons often represent complex features, so activity in a single neuron might correspond to a face, even though faces are highly structured objects, with clearly visible parts. It would seem that the neural correlates of visual consciousness cannot be so sparsely coded: just as we can focus in on different parts of a face, we should be able to selectively enhance activity in neurons corresponding to the parts of a face, rather than having a single neuron correspond to the whole.

Such considerations lead me to think that, in vision, the only cells that correspond to what we experience are those at the intermediate level. I think this is true in other senses as well. For example, when we listen to a sentence, the words and phrases bind together as coherent wholes (unlike low-level hearing), and we retain specific information such as accent, pitch, gender, and volume (unlike high-level hearing). Across the senses, the intermediate level is the only level at which perception is conscious.

I now want to argue that all conscious experience can be fully explained by appeal to perceptual states at the intermediate level, including conscious states that seem to be cognitive in nature. That makes me a restrictivist.

There is much more - go read the whole post.