Saturday, January 10, 2009

Seed - Extending Darwinism

Seed Magazine looks at the evolution of Darwinian theory to embrace ideas of heredity, natural selection, and evolution that go beyond genes and DNA. It's about time. Even E.O. Wilson now recognizes that evolution is social as well as biological (he seems to be one of the few hard scientists approaching an integrated (integral?) vision for human knowledge - too bad he doesn't have another 30 years left in him to flesh it out more).

Anyway, here is the article.

Is there more to heredity, natural selection, and evolution than genes and DNA?


Image courtesy of Bitforms Gallery, NYC (detail of "Path 25, 2001" by C.E.B. Reas).

Like Charles Darwin, Jean-Baptiste Lamarck suggested that living organisms are products of a long process of transformation. But instead of asserting, as Darwin did, that diversity emerges through the natural selection and accumulation of heritable variations over time, Lamarck proposed two mechanisms of evolutionary change: an inherent tendency in living matter to become increasingly complex and the inheritance of acquired characteristics — environmentally induced or learned individual adaptations that accrue over time and pass to offspring. Many biologists at the time, including Darwin himself, believed such "soft" inheritance was complementary to the theory of natural selection.

Soft inheritance was passionately debated for decades but fell from favor in the 20th century with the forging of the Modern Evolutionary Synthesis (MS), a version of Darwinism that unified the theory of natural selection with Mendelian genetics, and, later, the myriad discoveries from the midcentury molecular biology revolution of the 1950s, '60s, and '70s. For the past 60 years, it has provided the theoretical basis for evolutionary studies.

In the MS, Lamarck's soft inheritance is effectively impossible. It explains biological heredity only in terms of the blind variation of genes (which are DNA base sequences). New gene combinations arising through sexual reproduction, along with rare genetic mutations, account for inherited differences between individuals; any and all bodily changes acquired or induced during an individual's lifetime, such as muscles enlarged through exercise, cannot be passed to offspring. Macroevolutionary changes (such as new species) are simply the gradually accumulated effects of genetic variations; sudden evolutionary changes are rare and insignificant.

I and several other biologists believe the MS is in need of serious revision. Growing evidence indicates there is more to heredity than DNA, that heritable non-DNA variations can take place during development, sometimes in response to an organism's environment. The notion of soft inheritance is returning to reputable scientific inquiry. Moreover, there seem to be cellular mechanisms activated during periods of extreme stress that trigger bursts of genetic and non-genetic heritable variations, inducing rapid evolutionary change. These realizations promise to profoundly alter our view of evolutionary dynamics.

Collectively, the processes that we believe have been neglected in evolutionary studies are known as epigenetic mechanisms. Epigenetics is a term that includes all the processes underlying developmental flexibility and stability, and epigenetic inheritance is part of this. Epigenetic inheritance is the transmission of developmental variations that have nothing to do with changes in DNA base sequences. In its broad sense, it covers the transmission of any differences that do not depend on gene differences, so it encompasses the cultural inheritance of different religious beliefs in humans and song dialects in birds. It even includes the developmental legacies that a young mammal may receive from its mother through her placenta or milk — transmitted antibodies, for example, or chemical traces that tell the youngsters what the mother has been eating and, therefore, what they should eat. But epigenetic inheritance is commonly associated with cellular heredity, in which differences that arise among genetically identical cells are transmitted to daughter cells.

Biologists have long suspected that mechanisms for epigenetic cell heredity must exist. Take, for example, our own embryonic development, when cells assume different roles. Some become kidney cells, others liver cells, and so on. Although they have the same DNA, liver cells and kidney cells look different and have different functions. In biologist jargon, they have the same genotype but different phenotypes. Moreover, they "breed true": Kidney cells generate more kidney cells, and liver cells generate more liver cells, even though the stimuli that induced the different phenotypes in embryonic precursor cells are long gone. There must be some epigenetic mechanisms to ensure that a cell "remembers" what it was induced to be and transmits this "memory" of its altered state to daughter cells. This much is obvious. But surprisingly, we now know that cellular epigenetic variations are transmitted not only within organisms, but sometimes also between generations of organisms, via their sperm and eggs.

So if cells pass on information in epigenetic memories as well as in their DNA sequences, how are the two types of inheritance related? Marion Lamb and I have hit upon a helpful analogy. An organism's genotype (DNA) is like a musical score; phenotypes are particular interpretations and performances of that score. Like DNA's replication, a score can be copied and transmitted from generation to generation through high-fidelity duplication processes, and although small mistakes (mutations) crop up from time to time, the score remains essentially unchanged. But nowadays music is not transmitted solely through the score: Interpretations can be passed to future generations using the very different technology of recording and broadcasting (analogous to epigenetic inheritance mechanisms). Even with identical musical scores, performances differ, since they depend on the culture, conductor, musicians, and instruments. In the same way, DNA's performance depends on conditions within the cell. Because of recordings, the musical interpretations of one generation may influence the subsequent performances of later generations. Similarly, because of epigenetic inheritance, the characteristics acquired in one generation can affect what happens in the next. Interesting interactions can occur between the two routes of music transmission — changes in the score will obviously affect the performance, but some performances may actually modify the score that later musicians use. There are comparable interactions between genetic and epigenetic inheritance.

In the 1990s the evidence that epigenetic variants can be transmitted between generations of organisms was rather sparse, but this is no longer true. Epigenetic cell inheritance has become a major topic in molecular biology, and more and more examples of transgenerational inheritance are emerging. Gal Raz and I recently went through the scientific literature and found more than a hundred well-documented cases. They include inherited differences in the cortex, or surface structure, of the protozoan Paramecium; self-reproducing architectural variants of proteins (prions) in fungi; inherited variations in flower morphology and color; inherited diseases in rats, induced during pregnancy by administering one dose of a male hormone suppressor to their mothers; and inherited heart deformities in mice that were associated with transmissible, regulatory small RNA molecules. Our list isn't endless, but it shows that heritable epigenetic variations occur in all types of organisms and affect many different types of traits. Our findings may be the tip of a very large iceberg. The variations had certain stabilities — some lasting for many generations, some for only a few — and they involved several molecular mechanisms.

The mechanism about which we know the most is DNA methylation. Certain genes are "silenced," or rendered inactive, when small chemical groups (methyls) bond to some of the Cs of the four-letter alphabet (A, T, G, C) that encodes information in DNA. These are not mutations because the coding properties of these Cs do not change, and if methyls are removed, the genes can become active again. This is not only an epigenetic control mechanism but also an epigenetic inheritance system, since a gene's pattern of methylation, and hence its state of activity, can be replicated and passed on to daughter cells.
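
A quick aside from me, not the article: the logic of such an inheritance system is simple enough to sketch in code. Below is a toy Python model (every name and detail invented for illustration) in which a gene's methylation pattern silences it and is copied to daughter cells at division, so the silenced state is heritable even though the DNA sequence never changes.

```python
# Toy sketch of an epigenetic inheritance system, not real biochemistry.
# A gene's "methylation pattern" is just a set of positions of methylated
# Cs; any methylation silences the gene in this caricature.

from dataclasses import dataclass, field

@dataclass
class Cell:
    sequence: str                                  # DNA bases; never mutated below
    methylated: set = field(default_factory=set)   # indices of methylated Cs

    def is_silenced(self) -> bool:
        # Pretend the gene is off whenever any "promoter" C carries a methyl.
        return len(self.methylated) > 0

    def divide(self) -> "Cell":
        # Maintenance methylation: the daughter inherits the same pattern,
        # while the sequence is copied unchanged.
        return Cell(self.sequence, set(self.methylated))

gene = Cell("ATGCCGTACGGA", methylated={3, 4})     # stress adds methyls at two Cs
granddaughter = gene.divide().divide()

print(granddaughter.sequence == gene.sequence)     # True: no mutation occurred
print(granddaughter.is_silenced())                 # True: the state was inherited

granddaughter.methylated.clear()                   # demethylation...
print(granddaughter.is_silenced())                 # False: ...reactivates the gene
```

The point the sketch makes is the one in the paragraph above: what gets copied is an activity state, not a base sequence.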

One interesting and important discovery is that stresses — unusual conditions difficult for organisms to cope with, such as extreme heat, starvation, toxic chemicals, or drastic hormonal changes — are potent inducers of heritable epigenetic change. They can, for instance, alter patterns of methylation at many different DNA sites. Genomic stresses can have a similar effect, such as those from hybridization between plant species. Two recently formed natural hybrids between American and European species of the cordgrass Spartina had 30 percent of their parental DNA methylation patterns altered. This is an extreme example, but there are many others showing that stress conditions cause genome-wide effects and are sometimes associated with extensive changes in DNA sequences. It seems that stress can lead to a repatterning of the genome.

What does all of this mean for our view of heredity and evolution? The first implication is that when we see different heritable types in a population, we should not automatically assume they are genetically different; the differences may well be epigenetic. When they are, they might have been induced by environmental conditions. Unfortunately, at present we do not know how much epigenetic variation exists in natural populations, although botanists are beginning to study this in plants. If there is as much natural variation induced by environmental factors as lab studies suggest, then rapid evolutionary change could occur without any genetic change at all.

Further, induced and heritable epigenetic changes may guide genetic changes. Imagine that an environmental change repeatedly leads to a particular developmental adjustment — hot conditions induce thin fur in a population of mammals, for example. This epigenetic change could persist in the population until a genetic change occurred and rendered the thin-hair phenotype "inborn." In this way induced phenotypic changes, including epigenetically heritable ones, may precede and direct the selection of genetic changes. As Mary Jane West-Eberhard has aptly put it, "Genes are followers in evolution."
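
To make the "followers" idea concrete, here is a minimal simulation sketch of my own (not from the article; every parameter is invented). All animals in a hot environment end up with thin fur, most through epigenetic induction and a few through a rare allele that makes the trait inborn; if re-induction carries even a small fitness cost, the allele quietly spreads until the phenotype is genetically fixed.

```python
# Toy model of genetic assimilation; all numbers are arbitrary illustrations.
import random

random.seed(0)
POP, GENS, MUT = 2000, 300, 0.0005

# True = individual carries the hypothetical "inborn thin fur" allele.
population = [False] * POP

def fitness(has_allele: bool) -> float:
    # Everyone displays thin fur in the heat; re-inducing it each
    # generation is assumed to cost a little, hence the 5% penalty.
    return 1.00 if has_allele else 0.95

for _ in range(GENS):
    weights = [fitness(a) for a in population]
    parents = random.choices(population, weights=weights, k=POP)
    # Offspring inherit the allele; a rare mutation adds new copies.
    population = [p or (random.random() < MUT) for p in parents]

print("inborn-allele frequency:", sum(population) / POP)
# The frequency climbs toward 1.0 even though selection only ever "saw"
# thin fur: the induced phenotype came first, and the gene followed.
```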

In the light of epigenetics, old views of macroevolution must change. If wide-ranging epigenetic and genetic changes occur in stressful conditions, they are likely to have many effects on an organism's form and function, its phenotype. This implies that conditions requiring novel adaptations to cope with them are often the very same ones that spur the massive epigenetic and genetic alterations conducive to rapid evolutionary change. A firm linkage between the production of new variation and its subsequent selection, something forbidden within the MS, grows ever clearer.

My colleagues and I have argued that various types of epigenetic inheritance have played key roles in all the major evolutionary transitions. For example, the symbiotic relations with bacteria that gave rise to modern cells would have been impossible without epigenetic mechanisms allowing their cell membranes to reproduce; cellular epigenetic inheritance mechanisms were necessary for the transition from single-celled creatures to complex multicellular organisms with many cell types; a new non-genetic system of information transmission (symbolic language) was crucial for the transition to human culture.

There is no doubt that acknowledging epigenetic inheritance alters our perspective on heredity and evolution. It eliminates the "negatives" in the MS and produces a broader theoretical framework, which puts the development horse firmly in front of the genetic cart. The origin of phenotypic variations takes a central position, sudden generational evolutionary changes assume significance, and soft inheritance gains recognition as part of heredity and evolution. Once again, Darwinian evolutionary theory is extending its boundaries and inspiring new studies that will further enrich our understanding of life, its history, and its future.


Transformations: Identity Construction in Contemporary Culture

This interesting article appeared at the Times Higher Education blog back in December, but I am only just now getting around to posting it. This is really a review of Transformations: Identity Construction in Contemporary Culture, by Grant McCracken (Indiana University Press).

But the ideas discussed are intriguing even in the absence of having read the book. The author seems to be arguing that "entertainment culture" is dead (or at least dying) and being replaced by "transformation culture" (I'm not sure what this means, but the notion seems to be that as the contexts of culture change, so do the mechanisms of identity).

I don't think this assumes transformation as vertical change - how we think of it in integral theory - so much as horizontal change, which we generally term translation. But I will have to get and read the book to know for sure.

Transformations: Identity Construction in Contemporary Culture

4 December 2008

My history degree was rigorous and traditional. The pendulum of curriculum swung between militarism and diplomacy. The disconnection between popular culture and history was stark and obvious. Some "worthy" films by Eisenstein or Lang were watched as a history-lite introduction to revolution and war, but popular music and television, let alone fanzines and ephemera, were excluded. Ruthlessly.

At this time, I secretly augmented my reading of Paul Kennedy and A. J. P. Taylor with the early works of John Fiske. He agitated the relationship between producers and consumers. He probed texts not sources, audiences not authors. He was not worried by Wagner or Battleship Potemkin. Trained by Terence Hawkes in Cardiff, Fiske gained his greatest fame in Australia and the US, where generations of students owe him a wide and yet unpaid scholarly debt.

One of Fiske's students was Henry Jenkins, who is now head of comparative media studies at Massachusetts Institute of Technology: a thriving department whose members blog, podcast, vodcast, conference and write. A productive researcher in this hothouse of popular cultural research is Grant McCracken.

When reading McCracken's expansive Transformations, many early Fiskean inflections are present. Resistive readers work against "dominant culture", but digitisation has helped McCracken's argument in ways that Fiske could not have imagined. Blogs, Second Life and user-generated content have given McCracken the textual fodder to extend Fiske's ideas and provide the evidence that never seemed rigorous enough to confirm Fiske's theories, hypotheses and abstractions.

Transformations is an expansive book with an enormous project. Beginning with the maxim that "entertainment is dead", he asks what ideas and agendas will emerge in its place. While this question is not fully answered, there is much discussion of "agency" and "participation". On this journey, he argues that "entertainment culture" has been replaced with a "transformational" one. In recognising the changes to media industries and fans, he counters the "dumbing down" discourse by logging the increased sophistication of popular culture.

He confirms with a flourish: "Individuals who once submitted to the blandishments of entertainment are now interested in something more active."

The book's unifying argument is that "when consumers become producers, one of the objectives of their creative activity is the construction and multiplication of new selves". At its most basic, Transformations explores how, as contexts change, identities change.

The book is uneven, but provocative in its unevenness. Those interested in identity politics and popular culture will find much of relevance and importance. The textual examples offer useful case studies for undergraduates and the overarching argument will provide a productive node of dialogue and debate for postgraduates.

The problem in the book is - not surprisingly - the relationship between capitalism and the academy. It is also the crack at the base of much popular cultural studies: either we mouth the elitist biases of the arts establishment or become cheerleaders for The X Factor and the market economy.

Transformations is often sucked into the latter, but effectively critiques the former. Yet the validation of "expansionary individualism" can create justifications for hyper-consumerism, environmental damage and a disconnection from (perhaps old-fashioned) collectivised politics. Significantly, the "transformation" in the title refers to individual identity, not to wider social change. Perhaps that is the great difference between Fiske and McCracken. Fiske was never satisfied with the individual text or audience member. He wanted a meaning that arched beyond the self. It is a commitment and legacy that we need to remember.

Reviewer: Tara Brabazon is professor of media, University of Brighton. Her latest book is Thinking Popular Culture: War, Terrorism and Writing (2008).


V.S. Ramachandran - SELF AWARENESS: THE LAST FRONTIER

This is a great essay from one of the world's leading neuroscientists on the topic of self-awareness, one of the more disputed areas of consciousness studies. Really, there is not even any agreement that consciousness is the last, great problem for science, or that consciousness even exists.

This original essay first appeared at Edge.

SELF AWARENESS: THE LAST FRONTIER [1.1.09]
By V.S. Ramachandran

An Edge Original Essay


V.S. RAMACHANDRAN is a Neuroscientist, Director, Center for Brain and Cognition, University of California, San Diego; Author, Phantoms in the Brain.

V.S. Ramachandran's Edge Bio Page



One of the last remaining problems in science is the riddle of consciousness. The human brain—a mere lump of jelly inside your cranial vault—can contemplate the vastness of interstellar space and grapple with concepts such as zero and infinity. Even more remarkably it can ask disquieting questions about the meaning of its own existence. "Who am I" is arguably the most fundamental of all questions.

It really breaks down into two problems—the problem of qualia and the problem of the self. My colleagues, the late Francis Crick and Christof Koch have done a valuable service in pointing out that consciousness might be an empirical rather than philosophical problem, and have offered some ingenious suggestions. But I would disagree with their position that the qualia problem is simpler and should be addressed first before we tackle the "Self." I think the very opposite is true. I have every confidence that the problem of self will be solved within the lifetimes of most readers of this essay. But not qualia.

The qualia problem is well known. Assume I am an intellectually highly advanced, color-blind Martian. I study your brain and completely figure out, down to every last detail, what happens in your brain—all the physico-chemical events—when you see red light of wavelength 600 nanometers and say "red." You know that my scientific description, although complete from my point of view, leaves out something ineffable and essentially non-communicable, namely your actual experience of redness. There is no way you can communicate the ineffable quality of redness to me short of hooking up your brain directly to mine without air waves intervening. (Bill Hirstein and I call this the qualia-cable; it will work only if my color blindness is caused by missing receptor pigments in my eye, with brain circuitry for color being intact.) We can define qualia as that aspect of your experience that is left out by me—the color-blind Martian. I believe this problem will never be solved or will turn out (from an empirical standpoint) to be a pseudo-problem. Qualia and so-called "purely physical" events may be like two sides of a Moebius strip that look utterly different from our ant-like perspective but are in reality a single surface.

So to understand qualia, we may need to transcend our ant-like view, as Einstein did in a different context. But how to go about it is anybody's guess.

The problem of self, on the other hand, is an empirical one that can be solved—or at least explored to its very limit—by science. If and when we do it will be a turning point in the history of science. Neurological conditions have shown that the self is not the monolithic entity it believes itself to be. It seems to consist of many components each of which can be studied individually, and the notion of one unitary self may well be an illusion. (But if so we need to ask how the illusion arises; was it an adaptation acquired through natural selection?)

Consider the following disorders which illustrate different aspects of self.

• Out of body experiences: patients with right fronto-parietal strokes report floating out into space watching their body down below—undoubtedly contributing to the myth of disembodied souls. Left hemisphere strokes result in the feeling of a mysterious presence—a phantom twin—hovering behind the patient's left shoulder.

• Apotemnophilia: An otherwise completely normal person develops an intense desire to have his arm or leg amputated. The right parietal lobe (a part of it known as the SPL) normally contains a complete internal image of the body. We showed recently that in these patients the part of the map corresponding to the affected limb is congenitally missing, leading to alienation of the limb.

The patients are sometimes sexually attracted to amputees. We postulate that in "normal" individuals there is a genetically specified homunculus in S2 that serves as a template acting on limbic and visual areas to determine aesthetic preference for one's own body type. Hence pigs are attracted to pigs, not people. (Which is not to deny an additional role for olfactory and visual imprinting.) But if the image in S2 is missing a limb, this may translate into an aesthetic preference toward an amputee, mediated by reverse projections that are known to connect the ("emotional") amygdala to every stage in the visual hierarchy.

• Transsexuality: A woman claims that for as far back as she can remember she felt she was a man trapped in a woman's body—even experiencing phantom penises and erections. Our ordinary notion of every person having a single sexual identity (or self) is called into question. It turns out there are at least four distinct aspects of sexuality: your external anatomy, your internal brain-based body image, your sexual orientation, and your sexual identity—who you think others think of you as. Normally these are harmonized in fetal development, but if they get uncoupled you become a transsexual person. (It is important to note there is nothing "abnormal" about them, any more than you would regard being gay as abnormal.)

• A patient with a phantom arm simply watches a student volunteer's arm being touched. Astonishingly the patient feels the touch in his phantom. The barrier between him and others has been dissolved.

• Cotard's syndrome: the patient claims he is dead and rejects all evidence to the contrary.

• Capgras delusion: the patient claims that his mother looks like his mother but is in fact an imposter. Other patients claim that they inhabit a house that's a duplicate of their real house. Bill Hirstein and I (and Haydn Ellis and Andrew Young) have shown that this highly specific delusion arises because the visual area in the brain is disconnected from emotional areas. So when our patient David sees his mother he recognizes her—along with the penumbra of memories linked to her. But no emotion, no jolt of familiarity, is evoked, so he rationalizes away his curious predicament, saying she is an imposter. It is important to note that these patients are usually intelligent and mentally stable in most other respects. It is the selective nature of the delusion that makes it surprising and worth studying.

David also had difficulty abstracting across successive encounters with a new person seen in different contexts to create an enduring identity for that person. Without the flash of recognition he ought to have experienced in the second, third or nth exposure, he couldn't bind the experiences together into a single person. Even more remarkably, David sometimes duplicated his own self! He would often refer to "the other David, who is on vacation." It was as if even successive episodes of his own self were not bound together the way they are in you and me.

This is not to be confused with MPD ("multiple personality disorder") seen in psychiatric contexts. MPD is often a dubious diagnosis made for medico-legal and insurance purposes and tends to fluctuate from moment to moment. (I have often been tempted to send two bills to an MPD patient to see if he pays both.) Patients like David, on the other hand, may give us genuine insight into the neural basis of selfhood.

• In another disorder, the patient, with damage to the anterior cingulate, develops "akinetic mutism." He lies in bed fully awake and alert but cannot talk or walk—indeed, doesn't interact in any way with people or things around him. Sometimes such patients wake up (when given certain drugs) and will say, "I knew what was going on around me but I simply had no desire to do anything." It was as if he had a selective loss of one major attribute of the self—free will.

• Even odder is a phenomenon called "The telephone syndrome". The patient (I'll call him John) will display akinetic mutism—no visual consciousness—when seeing his (say) father in person. But if he receives a phone call from his father he suddenly becomes conscious and starts conversing with him normally. (S. Sriram and Orrin Devinsky, personal communication.) It's as if there are two Johns—the visual John who is only partially conscious and the auditory John (with his own self) who talks over the phone. This implies a degree of segregation of selves—all the way from sensory areas to motor output—that no one would have suspected.

We will now consider two aspects of self that are considered almost axiomatic. First, its essentially private nature. You can empathise with someone but never to the point of experiencing her sensations or dissolving into her (except in pathological states like folie à deux and romantic love). Second, it is aware of its own existence. A self that negates itself is an oxymoron. Yet both these axioms can fall apart in disease without affecting other aspects of self. An amputee can literally feel his phantom limb being touched when he merely watches a normal person being touched. A person with Cotard's syndrome will deny that he exists, claiming that his body is a mere empty shell. Explaining these disorders in neural terms can help illuminate how the normal self is constructed.

To account for some of these syndromes we need to invoke mirror neurons, discovered by Giacomo Rizzolatti, Vittorio Gallese and Marco Iacoboni. Neurons in the prefrontal cortex send out sophisticated signals down the spinal cord that orchestrate skilled and semi-skilled movements such as putting food in your mouth, pulling a lever, pushing a button, etc. These are "ordinary" motor command neurons, but some of them, known as mirror neurons, also fire when you merely watch another person perform a similar act. It's as if the neuron (more strictly, the network of which the neuron is part) was using the visual input to do a sort of "virtual reality simulation" of the other person's actions—allowing you to empathize with her and view the world from her point of view.

In a previous Edge essay I also speculated that these neurons can not only help simulate other people's behavior but can be turned "inward"—as it were—to create second-order representations or metarepresentations of your own earlier brain processes. This could be the neural basis of introspection, and of the reciprocity of self awareness and other awareness. There is obviously a chicken-or-egg question here as to which evolved first, but that is tangential to my main argument. (See also Nick Humphrey's contributions to Edge.) The main point is that the two co-evolved, mutually enriching each other to create the mature representation of self that characterizes modern humans. Our ordinary language illustrates this, as when we say "I feel a bit self-conscious," when we really mean that we are conscious of others being conscious of us. Or when we speak of being self-critical or experiencing "self-pity." (A chimp could—arguably—feel pity for a begging chimp, but I doubt whether it would ever experience self-pity.)

I also suggest that although these neurons initially emerged in our ancestors to adopt another's allocentric visual point of view, they evolved further in humans to enable the adoption of another's metaphorical point of view. ("I see it from his point of view" etc.) This, too, might have been a turning point in evolution although how it might have occurred is deeply puzzling.

There are also "touch mirror neurons" that fire not only when your skin is touched but when you watch someone else being touched. This raises an interesting question: How does the neuron know what the stimulus is? Why doesn't the activity of these neurons lead you to literally experience the touch delivered to another person? There are two answers. First, the tactile receptors in your skin tell the other touch neurons in the cortex (the non-mirror neurons) that they are not being touched, and this null signal selectively vetoes some of the outputs of mirror neurons. This would explain why our amputee experienced touch sensations when he watched our student being touched; the amputation had removed the vetoing. It is a sobering thought that the only barrier between you and others is your skin receptors!

A second reason why your mirror neurons don't lead you to mime everyone you watch, or to literally experience their tactile sensations, might be that your frontal lobes send feedback signals that partially inhibit the mirror neurons' output. (They can't completely inhibit them; otherwise there would be no point in having mirror neurons in the first place.) As expected, if the frontal lobes are damaged, you do start miming people ("echopraxia").
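
Purely to illustrate the logic of these two vetoes (my caricature, not Ramachandran's model), the circuitry can be sketched as a pair of boolean gates: an intact skin's null signal cancels felt touch, and intact frontal feedback cancels overt mimicry. Knock out either veto and the corresponding phenomenon appears.

```python
# A boolean caricature of the two inhibitory mechanisms described above.

def experienced_touch(watching_touch: bool,
                      skin_intact: bool,
                      skin_reports_touch: bool) -> bool:
    """Do you consciously feel a touch you are merely watching?"""
    mirror_activity = watching_touch
    # Intact skin reporting "no touch" vetoes the mirror-neuron signal.
    veto = skin_intact and not skin_reports_touch
    return mirror_activity and not veto

def overt_imitation(watching_action: bool, frontal_intact: bool) -> bool:
    """Do you overtly mime an action you are watching?"""
    # Frontal feedback damps mirror output; modeled here as all-or-none.
    return watching_action and not frontal_intact

# Normal observer: mirror neurons fire, but no felt touch and no miming.
print(experienced_touch(True, skin_intact=True, skin_reports_touch=False))   # False
print(overt_imitation(True, frontal_intact=True))                            # False

# Amputee watching someone touched: no skin to send the null signal.
print(experienced_touch(True, skin_intact=False, skin_reports_touch=False))  # True

# Frontal damage: echopraxia, compulsive miming.
print(overt_imitation(True, frontal_intact=False))                           # True
```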

Recent evidence suggests that there may also be mirror neurons for pain, disgust, facial expression—perhaps for all outwardly visible expressions of emotion. (We call these "empathy" neurons or Gandhi neurons.) Some of these are in the anterior cingulate—others in the insula.

I mention these to emphasize that despite all the pride that your self takes in its individuality and privacy, the only thing that separates you from me is a small subset of neural circuits in your frontal lobes interacting with mirror neurons. Damage these and you "lose your identity"—your sensory system starts blending with those of others. Like the proverbial Mary of philosophers' thought experiments, you experience their qualia.

We suggest that many otherwise inexplicable neuro-psychiatric symptoms may arise from flaws in these circuits leading to "you-me" confusion and impoverished ego-differentiation. Lindsay Oberman, Eric Altschuler and I have seen strong preliminary hints that autistic children have a paucity of mirror neurons, which would not only explain their poor imitation, empathy and "pretend play" (which requires role-playing) but also why they sometimes confuse the pronouns I and You, and have difficulty with introspection. Even Freudian phenomena like "projection", seen in all of us, may have similar origins; "I love you" turns to "You love me" to make me feel safer.

Let us return to Cotard's syndrome—the ultimate paradox of the self negating its own existence (sometimes claiming "I am dead", "I can smell my body rotting", etc.). We postulate that this arises from a combination of two lesions. First, a lesion that is analogous to the one in Capgras but far more pervasive: instead of emotion being disconnected from just the visual centers, it is disconnected from all sensations and even memories of sensations. So the entire world becomes an imposter—unreal, not just the mother. Second, there may be a dysfunctional interaction between the mirror neurons and frontal inhibitory structures, leading to a dissolution of the sense of self as being distinct from others (or indeed from the world). Lose the world and lose yourself—and it's as close to death as you can get. This is not a fully developed explanation by any means; I mention it only to indicate the style of thinking that we may need to explain these enigmatic syndromes.

Now imagine these same circuits become hyperactive as sometimes happens when you have seizures originating in the temporal lobes (TLE or temporal lobe epilepsy). The result would be an intense heightening of the patient's sensory appreciation of the world and intense empathy for all beings to the extent of seeing no barriers between himself and the cosmos—the basis of religious and mystical experiences. (You lose all selfishness and become one with God.) Indeed many of history's great religious leaders have had TLE. My colleague, the late Francis Crick, has suggested that TLE patients as well as priests may have certain abnormal transmitters in their brains that he calls "theotoxins". (He once told philosopher Pat Churchland that he had nothing against religion per se, so long as it was a private arrangement between consenting adults.)

I hasten to add that the involvement of the temporal lobes in mystical experiences does not in itself negate the existence of an abstract God, who, in Hindu philosophy, represents the supreme dissolution of all barriers. Perhaps the TLE patient has seen the truth and most of us haven't. I don't have TLE myself but have personally had epiphanies when listening to a haunting strain of music, watching the aurora borealis, or looking at Jupiter's moons through a telescope. During such epiphanies I have seen eternity in a moment and divinity in all things. And, indeed, felt one with the Cosmos. There is nothing "true" or "false" about such experiences—they are what they are, simply another way of looking at reality.

Let us turn now to out-of-body experiences. Even a normal person—you, the reader—can at times adopt a "detached" allocentric stance toward yourself (employing something like mirror neurons), but this doesn't become a full-blown delusion because other neural systems (e.g., inhibition from frontal structures and skin receptors) keep you anchored. But damage to the right fronto-parietal regions or ketamine anesthesia (which may influence the same circuits) removes the inhibition, and you start leaving your body, even to the extent of not feeling your own pain. You see your pain "objectively," as if someone else were experiencing it. Some such opossum-like detachment also occurs in dire emergencies, when you momentarily leave yourself and watch your body being raped or mauled by a lion. This reflex is normally protective (lying still to fool predators), but a vestige of it in humans may manifest as "dissociative" states under conditions of extreme stress.

The purported "unity" or internal consistency of self is also a myth. Most patients with left arm paralysis caused by right hemisphere stroke complain about it as, indeed, they should. But a subset of patients who have additional damage to the "body image" representation in the right SPL (and possibly insula) claim that their paralyzed left arm doesn't belong to them. The patient may assert that it belongs to his father or spouse. (As if he had a selective "Capgras" for his arm). Such syndromes challenge even basic assumptions such as "I am anchored in this body" or "This is my arm". They suggest that "belongingness" is a primal brain function hardwired through natural selection because of its obvious selective advantage to our hominoid ancestors. It makes one wonder if someone with this disorder would deny ownership of (or damage to) the left fender of his car and ascribe it to his mother's car.

There appears to be almost no limit to this. An intelligent and lucid patient I saw recently claimed that her own left arm was not paralyzed and that the lifeless left arm on her lap belonged to her father who was "hiding under the table". Yet when I asked her to touch her nose with her left hand she used her intact right hand to grab and raise the paralyzed hand—using the latter as a "tool" to touch her nose! Clearly somebody in there knew that her left arm was paralyzed and that the arm on her lap was her own, but "she"—the person I was talking to—didn't know. I then lifted her "father's hand" up toward her, drawing attention to the fact that it was attached to her shoulder. She agreed and yet continued to assert it belonged to her father. The contradiction didn't bother her.

Her ability to hold mutually inconsistent beliefs seems bizarre to us, but in fact we all do this from time to time. I have known many an eminent theoretical physicist who prays to a personal God: an old guy watching him from somewhere up there in the sky. I might mention that I have long known that prayer was a placebo; but upon learning recently of a study that showed that a drug works even when you know it is a placebo, I immediately started praying. There are two Ramachandrans—one an arch skeptic and the other a devout believer. Fortunately I enjoy this ambiguous state of mind, unlike Darwin, who was tormented by it. It is not unlike my enjoyment of an Escher engraving.

In the last decade there has been a tremendous resurgence of interest among neuroscientists in the nature of consciousness and self. The problem has been approached from many angles—ranging from single-neuron electrophysiology to macroscopic brain anatomy (including hundreds of brain-imaging studies). What has been missing, though, is what might be called "psycho-anatomy," whose goal is to explain specific details of certain complex mental capacities in terms of equally specific activity of specialized neural structures. As an analogy, consider the discovery of the genetic code. Crick and Watson unraveled the double helix and saw in a flash that the complementarity of the two strands of the helix is a metaphor of the complementarity of parent and offspring in heredity. (Pigs give birth to pigs—not to donkeys.) In other words, the structural logic of DNA dictates the functional logic of heredity. No such radical insight has emerged in neuroscience that would allow us to precisely map function onto structure.

One way of achieving this goal, as we have seen in this essay, might be to explore syndromes that lie at the interface between neurology and psychiatry. Given the inherent complexity of the human brain, it is unlikely that there will be a single climactic solution like DNA (although I don't rule it out). But there may well be many instances where such a synthesis is possible on a smaller scale, and these may lead to testable predictions and novel therapies. They may even pave the way for a grand unified theory of mind of the kind physicists have been seeking in trying to unify gravitation, relativity and quantum mechanics.

When such a theory finally emerges we can either accept it with equanimity or ask "Now that we have solved the problem of self, what else is there?"

For some serious responses to this essay, be sure to check out THE REALITY CLUB: Marc D. Hauser, V.S. Ramachandran, Timothy D. Wilson, Arnold Trehub, Robert Provine. The exchange between the various people is entertaining and enlightening.


Friday, January 09, 2009

Ask Men - 5 Things You Didn't Know about Shakespeare

Cool list - I didn't know a couple of these, especially the thing about his wealth. But the point about the sonnets is debatable (certainly a couple of them fall into this category, but few scholars would lump all of them into this statement).

5 Things You Didn't Know: Shakespeare

By Andrew Moore Entertainment Correspondent

William Shakespeare - Credit: Wikimedia Commons

William Shakespeare is arguably the greatest writer in the history of the English language. Nearly 400 years after the bard's death we still stage his works in our theaters and study them in our classrooms. If he'd only written Hamlet, Romeo & Juliet or King Lear, we'd probably still consider him one of the greatest writers in the English canon -- but he wrote them all, along with numerous other classics.

We continually return to Shakespeare because of his ability to tell us something about ourselves. He was a perceptive student of human nature with an unrivaled capacity for drama and description. Shakespeare had a way with words, unmatched by anyone before or since. He's the writer we study perhaps more than any other.

You may be an admirer of the bard, but we're betting there are at least five things you didn't know about Shakespeare.

1- Shakespeare's sonnets were written for men

Published in 1609, Shakespeare's sonnet sequence is 154 poems long. The sequence contains some of the most beautiful and enduring love poetry in the English language: "Shall I compare thee to a summer's day? / Thou art more lovely and more temperate: / Rough winds do shake the darling buds of May, / And summer's lease hath all too short a date." Pretty smooth, right? Well, one of the five things you didn't know about Shakespeare is that that poem is about two dudes. In fact, 126 of Shakespeare's sonnets are about a deep love between two male friends.

Now, it was impossible to imagine someone as a "homosexual" in the Renaissance, which is not to say that men didn't have sex with each other. It just means that people weren't considered "gay" or "straight" in early modern England. Instead, homoeroticism was understood to be a normal extension of male friendship, and that desire is on full display in Shakespeare's sonnets.

2- Danielle Steel has been translated more than Shakespeare

It's true. American author Danielle Steel has written nearly 80 novels and she's sold more than 550 million copies of her works worldwide. Her novels have been published in 47 countries and translated into 28 different languages. While Shakespeare's plays have been translated into some 80 languages over the past four centuries, he wrote only 37 plays (along with his poetry). The point is: There are more Danielle Steel novels circulating around the world today than there are Shakespearean plays.

3- Shakespeare invented "torture"

Shakespeare didn't just invent "torture," but also "excitement," "addiction" and "savagery." Another of the five things you might not have known about Shakespeare is just how much he's influenced the English language. Our man Will invented about 1,700 words in the English language. A remarkable number of the phrases and words we use every day first appeared in Shakespeare's work. Shakespeare converted verbs into adjectives or nouns into verbs whenever it suited him. Amazingly, his linguistic inventions stuck, and we still use them today.

There was more to William Shakespeare than sonnets and new words. His words caught on because Shakespeare had such a knack for describing the indescribable. The list of words created by Shakespeare includes "arouse," "accused," "amazement," "bedroom," "champion," "compromise," "fashionable," "flawed," "gossip," "hurried," "lonely," "majestic," "negotiate," "swagger," and, yeah, "torture."

4- Shakespeare's grave is cursed

Shakespeare's final poem may be the ominous couple of lines etched on his tomb in Holy Trinity Church in Stratford-upon-Avon: "Blessed be the man that spares these stones / And cursed be he who moves my bones." It's an invocation to future generations to leave the bard in peace.

Shakespeare's curse has been taken seriously enough that his bones have remained in place for four centuries. Even recently, as the church undergoes some much needed renovations, nervous historians and clergymen are organizing their efforts around the bard's bones, to ensure they don't disturb the dead playwright.

5- Shakespeare was rich

Sure, you knew Shakespeare was a great writer, but we're betting that Shakespeare's business acumen is one of the five things you didn't know about Shakespeare. In Renaissance England, playwrights often worked with a set theater and a set group of actors called "companies." This arrangement allowed dramatists to write plays with parts tailored to the particular skills and talents of specific actors in the company.

Shakespeare was the primary dramatist for a company known as the Lord Chamberlain's Men (later renamed the King's Men). However, unlike his fellow playwrights, Shakespeare was actually a shareholder in the company; so he wasn't just paid a fee for each play he wrote, but he also took in a share of the company's overall profits. Consequently, Shakespeare was wealthier than many of his fellow playwrights. He was rich enough, while living in London, to purchase a second home in Stratford-upon-Avon. He may have warned us that "[a]ll that glitters is not gold," but William was no fool. He took care of business.


Daily Dharma - Changing Like the Weather


Today's Daily Dharma from Tricycle features Pema Chodron.

Changing Like the Weather

The first noble truth says simply that it's part of being human to feel discomfort. We don't even have to call it suffering anymore; we don't even have to call it discomfort. It's simply coming to know the fieriness of fire, the wildness of wind, the turbulence of water, the upheaval of earth, as well as the warmth of fire, the coolness and smoothness of water, the gentleness of the breezes, and the goodness, solidness, and dependability of the earth. Nothing in its essence is one way or the other. The four elements take on different qualities; they're like magicians. Sometimes they manifest in one form and sometimes in another.... The first noble truth recognizes that we also change like the weather, we ebb and flow like the tides, we wax and wane like the moon.

~ Pema Chodron, Awakening Loving-Kindness; from Everyday Mind, edited by Jean Smith, a Tricycle book

Treating Brain Diseases with Ultrasound


An interesting article from The Economist looks at the future of treating brain dysfunction with ultrasound.

Sound and no fury

Jan 8th 2009
From The Economist print edition

It may, in the future, be possible to treat brain diseases with ultrasound

The idea of treating maladies of the mind by blasting the brain with noise sounds, to the layman, like kicking a television set in order to repair it. It is, however, on the cards.

The noise in question is ultrasound. This has been used for decades to scan human interiors—particularly wombs containing developing fetuses. The ultrasound is reflected from surfaces within the body (such as the skin of a fetus) in the way that audible sound echoes from a cliff face. William Tyler and his colleagues at Arizona State University, however, want to take things a stage further. They think that ultrasound might be used therapeutically as well.

The team knew from experiments done by other groups of researchers that ultrasound can have a physical effect on tissue. Unfortunately, that effect is generally a harmful one. When nerve cells were exposed to it at close range, for example, they heated up and died. Dr Tyler, however, realised that all of the studies he had examined used high-intensity ultrasound. He guessed that lowering the intensity might allow nerve cells to be manipulated without damage.

To test this idea, he and his colleagues placed slices of living mouse brain into an artificial version of cerebrospinal fluid, the liquid that cushions the brain. They then beamed different frequencies of low-intensity ultrasound at the slices and monitored the results using dye molecules that give off light in response to the activity of proteins called ion channels. (An ion channel is a molecule that allows the passage of electrically charged atoms of sodium, potassium, calcium and so on through the outer membrane of a cell.)

The purpose of all this was to coax the cells to release neurotransmitters. These are molecules that carry information from one nerve cell to another. When they arrive, they cause ion channels to open and thus trigger the electrical impulses that pass messages along nerve fibres. When those pulses arrive at the other end of a fibre they, in turn, trigger the release of more neurotransmitters.

Disruption of this system of communication is characteristic of several medical conditions, including Alzheimer’s disease, Parkinson’s disease, depression and epilepsy. Ways of boosting the release of neurotransmitters may thus have therapeutic value. And the ultrasound did indeed boost their release.

How that came about is not absolutely certain, but Dr Tyler thinks the shaking that his ultrasound gave to the cells in question opened up some of their ion channels. The cells were thus fooled into acting as though an impulse had arrived, and released neurotransmitters as a consequence.

Any medical application of the idea is a long way away. But ultrasound does now offer at least the possibility of manipulating the brains of people suffering from mental illnesses without resorting to drugs or electrodes. And that is certainly a path worth investigating.

Check Out Brain Science Podcast #51

Another great episode of The Brain Science Podcast - this week she talks to Seth Grant about the evolution of synapses, the space between brain cells where all the magic happens.

Brain Science Podcast #51: Seth Grant on Synapse Evolution


Episode 51 of the Brain Science Podcast is an interview with Dr. Seth Grant from Cambridge University, UK. Dr. Grant’s work focuses on the proteins that make up the receptors within synapses. (Synapses are the key structures by which neurons send and receive signals.) By comparing the proteins that are present in the synapses in different species Dr. Grant has come to some surprising conclusions about the evolution of the synapse and the evolution of the brain.

In this interview Dr. Grant explains how his research team has uncovered the identity of synapse proteins in a variety of species including yeast, fruit flies, and mice. Our discussion is centered on the paper he published in Nature Neuroscience in June 2008. Dr. Grant’s team has made several surprising discoveries. First, he has discovered that some proteins associated with neuron signaling are actually found in primitive unicellular organisms like yeast. He has also discovered that the protein structure of the synapse becomes more complex as one moves from invertebrates like fruit flies to vertebrates like mice, but that most of the complexity seems to have arisen early on in vertebrates.

According to Dr. Grant:

The origins of the brain appear to be in a protosynapse or ancient set of proteins found in unicellular animals, and when unicellular animals evolved into metazoans or multicellular animals their protosynaptic architecture was coopted and embellished by the addition of new proteins onto that ancient protosynaptic set, and that set of new molecules was inserted into the junctions of the first neurons or the synapse between the first neurons in simple invertebrate animals. When invertebrates evolved into vertebrates, around a billion years ago, there was a further addition or enhancement of the number of these synaptic molecules and that has been conserved throughout vertebrate evolution where they have much larger numbers of synaptic molecules. The large complex synapses evolved before large anatomically complex brains.

The discovery that there are significant differences between the synapses in vertebrates and non-vertebrates is significant because it has long been assumed that synapses were essentially identical between species and that brain and behavioral complexity was based on having more neurons and bigger brains. Instead, Dr. Grant proposes an alternative hypothesis:

The first part of the brain to ever evolve was the protosynapse. In other words, synapses came first.

When this big synapse evolved, what the vertebrate brain then did as it grew bigger and evolved afterwards was exploit the new proteins that had evolved, making new types of neurons in new types of regions of the brain. In other words, we would like to put forward the view that synapse evolution has allowed brain specialization, regionalization, to occur.




Psychology Today - Unleashing the Power of Emotional Connection


A new blog has been added over at Psychology Today's fine collection of offerings, Emotional Connection.

Here is an excerpt from the Q&A format of the introductory post.

Q: What are the greatest stumbling blocks people encounter when trying to release their emotional resistance and begin feeling successfully?

· Analyzing - an attempt to figure our way out of an emotion
"What's going on? Why am I feeling so anxious?"

· Judging - a decision that something's wrong with the emotion, or with us for having it
"This guilt is too much. I shouldn't let him get to me."

· Assessing - excessive focus on how well or poorly we're connecting
"I'm not feeling much of anything. Am I doing this right?"

· Bargaining - conditions placed on how long or how deeply we're willing to feel
"I'll feel this grief fully today, but it better not show up again tomorrow."

Whenever these stumbling blocks occur, the solution is simply to notice them with equanimity and resume surfing as soon as possible.

Q: When people are falling short of their dreams and goals and can't tell which emotions they're resisting, what are they supposed to do?

RC: A big portion of the book is devoted to answering this question. The basic steps are:

1) Find the Flinch - Identify the aspect of moving toward your vision that causes you to pull up short.

2) Cut to the Chase - Examine the "worst-case scenario" in going forward and determine how that outcome would make you feel.

3) Weather the Storm - Imagine that outcome as a reality, and then connect with the entire range of emotions that arise.

4) Repeat As Necessary - Apply the same course of action if and when you get stuck again in pursuit of your goal, regarding the same emotions from before or any new ones that may arise.

Let's see this in action. A client of mine always wanted to write but never got around to it. His flinch occurred every time he walked past his waiting computer. His worst-case scenario was writing something that his most loved and respected friends thought was pure drek. He realized this would make him feel like an abject failure.

Together, we imagined that he wrote a whole novel, was super excited about it, and gave it to his friends who were promptly horrified. They hated the book vehemently and ridiculed him for writing it.

His emotional response to this imaginary situation was a daunting wave of shame. I guided him to stay on the wave through many challenges and distractions, and after a few minutes it abated.

"Well," he told me, "that really wasn't so bad. I kind of feel like, "Oh, well, at least I tried. That's better than never writing anything."

This process revealed to my client that the one thing holding him back had been his resistance to shame. Repeated a few more times, it released his resistance almost completely. Now, with nothing holding him back, he writes at least thirty minutes a day.

Q: You maintain that emotional resistance is also a health hazard. In what way?

RC: Our emotions want and need to be felt. The harder and longer we keep them locked within, the more they struggle to get out. One result of this battle is stress, a well-documented contributor to serious illness. Another result is the depletion of our life energy, which can quickly turn into depression.

Q: You also tout emotional connection as an effective way to end addictions and compulsions. Can you describe how that works?

RC: All addictions and compulsions, as I mentioned earlier, are really about resisting emotions. Once we connect with those emotions, addictions and compulsions lose both their purpose and power.

If you're unwilling to feel disappointment, for instance, you might flop on the couch every week, eat popcorn, watch American Idol and snicker at all the contestants. But once you become willing to experience disappointment, both old and new, you might actually sign up for the Open Mic Night at your local pub.

Or if you're unwilling to feel distrust, you might check your spouse's email over and over. You might even be convinced that you're doing this precisely because you distrust. But once you become willing to feel your distrust directly, your need for hypervigilance would cease. Instead, you could then choose to talk openly with your spouse about the feeling. Or, if your spouse truly is untrustworthy, you might finally be able to move on to a healthier relationship.

Go read the whole post - this looks to be a good and useful blog.


Thursday, January 08, 2009

Scientific American Mind - Is Hypnosis a Distinct Form of Consciousness?


This cool article at Scientific American Mind - Is Hypnosis a Distinct Form of Consciousness? - asks whether the hypnotic state is a unique form of consciousness, a "state" separate from but similar to waking, dreaming, and deep sleep, and concludes that it probably is not. Either way, I'm not sure science is ready to admit Advaita or Buddhist theories of consciousness into its hallowed halls.

Is Hypnosis a Distinct Form of Consciousness?

Studies confirm that during hypnosis subjects are not in a sleeplike state but are awake

By Scott O. Lilienfeld and Hal Arkowitz

The hypnotist, dangling a swinging pocket watch before the subject’s eyes, slowly intones: “You’re getting sleepy … You’re getting sleepy …” The subject’s head abruptly slumps downward. He is in a deep, sleeplike trance, oblivious to everything but the hypnotist’s soft voice. Powerless to resist the hypnotist’s influence, the subject obeys every command, including an instruction to act out an upsetting childhood scene. On “awakening” from the trance half an hour later, he has no memory of what happened.

In fact, this familiar description, captured in countless movies, embodies a host of misconceptions. Few if any modern hypnotists use the celebrated swinging watch introduced by Scottish eye surgeon James Braid in the mid-19th century. Although most hypnotists attempt to calm subjects during the “induction,” such relaxation is not necessary; people have even been hypnotized while pedaling vigorously on a stationary bicycle. Electroencephalographic (EEG) studies confirm that during hypnosis subjects are not in a sleeplike state but are awake—though sometimes a bit drowsy. Moreover, they can freely resist the hypnotist’s suggestions and are far from mindless automatons. Finally, research by psychologist Nicholas Spanos of Carleton University in Ontario shows that a failure to remember what transpired during the hypnosis session, or so-called posthypnotic amnesia, is not an intrinsic element of hypnosis and typically occurs only when subjects are told to expect it to occur.

The Consciousness Question
The iconic scene we described at the article’s outset also raises a deeper question: Is hypnosis a distinct state of consciousness? Most people seem to think so; in a recent unpublished survey, psychologist Joseph Green of Ohio State University at Lima and his colleagues found that 77 percent of college students agreed that hypnosis is a distinctly altered state of consciousness. This issue is of more than academic importance. If hypnosis differs in kind rather than in degree from ordinary consciousness, it could imply that hypnotized people can take actions that are impossible to perform in the waking state. It could also lend credibility to claims that hypnosis is a unique means of reducing pain or of effecting dramatic psychological and medical cures.

Despite the ubiquitous Hollywood depiction of hypnosis as a trance, investigators have had an extremely difficult time pinpointing any specific “markers”—indicators—of hypnosis that distinguish it from other states. The legendary American psychiatrist Milton Erickson claimed that hypnosis is marked by several unique features, including posthypnotic amnesia and “literalism”—a tendency to take questions literally, such as responding “Yes” to the question “Can you tell me what time it is?” We have already seen that posthypnotic amnesia is not an inherent accompaniment of hypnosis, so Erickson was wrong on that score. Moreover, research by Green, Binghamton University psychologist Steven Jay Lynn and their colleagues shows that most highly hypnotizable subjects do not display literalism while hypnotized; moreover, participants asked to simulate hypnosis demonstrate even higher rates of literalism than highly hypnotizable subjects do.

Other experts, such as the late University of Pennsylvania psychiatrist Martin Orne, have argued that only hypnotized participants experience “trance logic”—the ability to entertain two mutually inconsistent ideas at the same time. For example, a hypnotist might suggest to a subject that he is deaf and then ask him, “Can you hear me now?” He may respond “No,” thereby manifesting trance logic. Nevertheless, research by the late Theodore X. Barber, then at the Medfield Foundation, and his colleagues showed that participants asked to simulate hypnosis displayed trance logic just as often as hypnotized people did, suggesting that trance logic is largely a function of people’s expectations rather than an intrinsic component of the hypnotic state itself.

Brain Changes
Still other investigators have sought to uncover distinct physiological markers of hypnosis. Under hypnosis, EEGs, especially those of highly suggestible participants, sometimes display a shift toward heightened activity in the theta band (four to seven cycles per second). In addition, hypnotized participants frequently exhibit increased activity in their brain’s anterior cingulate cortex (ACC).

Yet neither finding is surprising. Theta activity is typically associated with states of quiet concentration, which frequently accompany hypnosis. The ACC is linked to the perception of contradictions, which many hypnotized participants experience as they imagine things—such as childhood experiences in the present—that seem to conflict with reality. Further, psychologists have reported similar brain changes among awake subjects. For example, the ACC becomes activated during the famous Stroop task, which requires subjects to name the colors of ink (such as “green”) in which competing color words (such as “blue”) are printed. Thus, these brain changes are not unique to hypnosis.
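As an aside for the technically inclined: "heightened activity in the theta band" is usually quantified by estimating a signal's power spectrum and integrating it over four to seven cycles per second. Here is a minimal sketch of that calculation in Python, using a synthetic signal rather than real EEG data; the sampling rate, signal length, and noise level are arbitrary assumptions for illustration, not parameters from the studies mentioned above.

import numpy as np
from scipy.signal import welch

fs = 256                      # assumed sampling rate, in Hz
t = np.arange(0, 10, 1 / fs)  # ten seconds of signal

# Synthetic "EEG": a 5 Hz theta rhythm buried in noise (illustration only)
eeg = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)

# Estimate the power spectral density with Welch's method
freqs, psd = welch(eeg, fs=fs, nperseg=2 * fs)

# Integrate the spectrum over the theta band (four to seven cycles per second)
theta = (freqs >= 4) & (freqs <= 7)
theta_power = np.trapz(psd[theta], freqs[theta])
total_power = np.trapz(psd, freqs)
print(f"Relative theta power: {theta_power / total_power:.2%}")

A hypnosis study would compare a measure like this relative theta power across conditions or groups; the point of the sketch is only to make the "four to seven cycles per second" claim concrete.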

Fueling the perception of hypnosis as a distinct trancelike state is the widespread assumption that it leads to marked increases in suggestibility, even complete compliance to the therapist’s suggestions. Nowhere is this zombielike stereotype portrayed more vividly than in stage hypnosis shows, in which people are seemingly induced to bark like dogs, sing karaoke and engage in other comical behaviors in full view of hundreds of amused audience members.

Yet research shows that hypnosis exerts only a minor impact on suggestibility. On standardized scales of hypnotic suggestibility, which ask participants to comply with a dozen suggestions (that one’s arm is rising on its own, for example), the increase in suggestibility following a hypnotic induction is typically on the order of 10 percent or less. Moreover, research demonstrates that a formal hypnotic induction is not needed to produce many of the seemingly spectacular effects of hypnosis, such as the reduction of extreme pain or the physical feats popular in stage hypnosis acts, such as suspending a participant horizontally between the backs of two chairs. One can generate most, if not all, of these effects merely by providing highly suggestible people with sufficient incentives to perform them. Stage hypnotists are well aware of this little secret. Before beginning their shtick, they prescreen audience members for high suggestibility by giving them a string of suggestions. They then handpick their participants from among the minority who comply.

We agree with Lynn and psychologist Irving Kirsch of the University of Hull in England, who wrote in 1995 that “having failed to find reliable markers of trance after 50 years of careful research, most researchers have concluded that this hypothesis [that hypnosis is a unique state of consciousness] has outlived its usefulness.” Increasingly, evidence is suggesting that the effects of hypnosis result largely from people’s expectations about what hypnosis entails rather than from the hypnotic state itself. Still, it is always possible that future studies could overturn or at least qualify this conclusion. In particular, research on potential physiological markers of hypnosis may elucidate how hypnosis differs from other states of consciousness. Although hypnosis poses fascinating mysteries that will keep scientists busy for decades, it seems clear that it has far more in common with everyday wakefulness than with the watch-induced trance of Hollywood crime thrillers.

Note: This article was originally printed with the title, "Altered States".

Flu Drug Fails


If you get this season's dominant flu strain, you may be screwed: Tamiflu, the leading antiviral, no longer works against it, and scientists don't know why.

Flu Found Resistant to Main Antiviral Drug

Published: January 8, 2009

Virtually all the flu in the United States this season is resistant to the leading antiviral drug Tamiflu, and scientists and health officials are trying to figure out why.

The problem is not yet a public health crisis because this has been a below-average flu season so far and the chief strain circulating is still susceptible to other drugs — but infectious disease specialists are worried nonetheless.

Last winter, about 11 percent of the throat swabs from patients with the most common type of flu that were sent to the Centers for Disease Control and Prevention for genetic typing showed a Tamiflu-resistant strain. This season, 99 percent do.

“It’s quite shocking,” said Dr. Kent A. Sepkowitz, director of infection control at Memorial Sloan-Kettering Cancer Center in New York. “We’ve never lost an antibiotic this fast. It blew me away.”

The single mutation that creates Tamiflu resistance appears to be spontaneous, and not a reaction to overuse of the drug. It may have occurred in Asia, and it was widespread in Europe last year.

In response, the C.D.C. issued new guidelines two weeks ago. They urged doctors to test suspected flu cases as quickly as possible to see if they are influenza A or influenza B, and if they are A, whether they are H1 or H3 viruses.

The only Tamiflu-resistant strain is an H1N1. Its resistance mutation could fade out, a C.D.C. scientist said, or a different flu strain could overtake H1N1 in importance, but right now it causes almost all flu cases in the country, except in a few mountain states, where H3N2 is prevalent.

Complicating the problem, antiviral drugs work only if they are taken within the first 48 hours. A patient with severe flu could be given the wrong drug and die of pneumonia before test results come in. So the new guidelines suggest that doctors check with their state health departments to see which strains are most common locally and treat for them.

“We’re a fancy hospital, and we can’t even do the A versus B test in a timely fashion,” Dr. Sepkowitz said. “I have no idea what a doctor in an unfancy office without that lab backup can do.”

If a Tamiflu-resistant strain is suspected, the disease control agency suggests using a similar drug, Relenza. But Relenza is harder to take — it is a powder that must be inhaled and can cause lung spasms, and it is not recommended for children under 7.

Relenza, made by GlaxoSmithKline, is known generically as zanamivir. Tamiflu, made by Roche, is known generically as oseltamivir.

Alternatively, patients who have trouble inhaling Relenza can take a mixture of Tamiflu and rimantadine, an older generic drug that the agency stopped recommending two years ago because so many flu strains were resistant to it. By chance, the new Tamiflu-resistant H1N1 strain is not.
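To pull the article's scattered guidance into one place, here is a minimal sketch, in Python, of the testing-and-treatment flow as described: type the virus, subtype influenza A, and pick a drug accordingly. The function name and inputs are hypothetical illustrations of the logic, not clinical software or anything published by the C.D.C.

def choose_antiviral(flu_type, subtype=None, under_seven=False, can_inhale=True):
    """Follow the decision flow described in the article (illustrative only)."""
    # Influenza B, and influenza A/H3, remain susceptible to Tamiflu.
    if flu_type == "B" or subtype == "H3":
        return "oseltamivir (Tamiflu)"

    if flu_type == "A" and subtype == "H1":
        # This season's H1N1 is presumed Tamiflu-resistant. Relenza is the
        # suggested alternative, but it is an inhaled powder, can cause
        # lung spasms, and is not recommended for children under 7.
        if can_inhale and not under_seven:
            return "zanamivir (Relenza)"
        # Fallback named in the article: Tamiflu plus rimantadine, since
        # the resistant H1N1 strain happens to remain susceptible to the
        # older drug.
        return "oseltamivir (Tamiflu) + rimantadine"

    # If subtyping cannot be done in time, the guidelines suggest treating
    # for whichever strain the state health department reports as locally
    # prevalent.
    return "treat per locally prevalent strain"

# Example: an adult with confirmed A/H1N1 who can manage an inhaler
print(choose_antiviral("A", "H1"))  # -> zanamivir (Relenza)

The awkward branch is the last one: as Dr. Sepkowitz notes below, many offices cannot even run the A-versus-B test in time, which is exactly why the guidelines fall back on local surveillance data.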

“The bottom line is that we should have more antiviral drugs,” said Dr. Arnold S. Monto, a flu expert at the University of Michigan’s School of Public Health. “And we should be looking into multidrug combinations.”

New York City had tested only two flu samples as of Jan. 6, and both were Tamiflu-resistant, said Dr. Annie Fine, an epidemiologist at the city’s health department. Flu cases in the city are only “here and there,” she said, and there have been no outbreaks in nursing homes. Elderly patients, those with the AIDS virus, and those on cancer therapy are most at risk.

But, she added, because of the resistance problem, the city is speeding up its laboratory procedures so it can do both crucial tests in one day.

“And we strongly suggest that people get a flu shot,” she said. “There’s plenty of time and plenty of vaccine.”

Exactly how the Tamiflu-resistant strain emerged is a mystery, several experts said.

Resistance appeared several years ago in Japan, which uses more Tamiflu than any other country, and experts feared it would spread.

But the Japanese strains were found only in patients already treated with Tamiflu, and they were “weak” — that is, they did not transmit to other people.

“This looks like a spontaneous development of resistance in the most unlikely places — possibly in Norway, which doesn’t use antivirals at all,” Dr. Monto said.

Dr. Henry L. Niman, a biochemist in Pittsburgh who runs recombinomics.com, a Web site that tracks the genetics of flu cases around the world, has been warning for months that Tamiflu resistance in H1N1 was spreading.

He argues that it started in China, where Tamiflu use is rare, was seen last year in Norway, France and Russia, then moved to South Africa (where winter is June to September), and back to the northern hemisphere in November.

The mutation conferring resistance to Tamiflu, known in the shorthand of genetics as H274Y on the N gene, was actually, he said, “just a passenger, totally unrelated to Tamiflu usage, but hitchhiking on another change.”

The other mutation, he said, known as A193T on the H gene, made the virus better at infecting people.

Furthermore, he blamed mismatched flu vaccines for helping the A193T mutation spread. Flu vaccines typically protect against three flu strains, and none has included protection against viruses carrying the A193T change.

Dr. Joseph S. Bresee, the C.D.C.’s chief of flu prevention, said he thought Dr. Niman was “probably right” about the resistance having innocently piggy-backed on a mutation on the H gene — which creates the spike on the outside of the virus that lets it break into human cells. But he doubted that last year’s flu vaccine was to blame, since the H1 strain in it protected “not perfectly, but relatively well” against H1N1 infection, he said.

Dr. Niman said he was worried about two aspects of the new resistance to Tamiflu. Preliminary data out of Norway, he said, suggested that the new strain was more likely to cause pneumonia.

The flu typically kills about 36,000 Americans a year, the C.D.C. estimates, most of them the elderly or the very young, or people with problems like asthma or heart disease; pneumonia is usually the fatal complication.

And while seasonal flu is relatively mild, the Tamiflu resistance could transfer onto the H5N1 bird flu circulating in Asia and Egypt, which has killed millions of birds and about 250 people since 2003. Although H5N1 has not turned into a pandemic strain, as many experts recently feared it would, it still could — and Tamiflu resistance in that case would be a disaster.