
Saturday, May 05, 2012

2012 Bioethics Conference: The Moral Brain - More Videos


Here are a few more videos from the 2012 Bioethics Conference - The Moral Brain, from NYU. The Paul Bloom lecture on the moral life of babies is especially good - we are born with a moral sense, which should have some impact on how we view ourselves as moral beings.

When we act against those seemingly innate morals, the question should be what went wrong to cause that behavior, rather than the current model of thinking in terms of punishment and deterrence. What if, instead, we asked why people are in so much pain or fear that they do these things?


Part I: The Significance of Neuroscience for Morality: Lessons from a Decade of Research
It has been a decade since the first brain imaging studies of moral judgments by Joshua Greene, Jorge Moll and their colleagues were reported. During this time, there have been rich philosophical and scientific discussions regarding a) whether brain imaging data can tell us anything about moral judgments, and b) what they do tell us if they can tell us something about moral judgments.  In this workshop, we aim to bring leading philosophers, neuroscientists, and psychologists in this area together to examine these issues and to explore the future directions of this research.

SATURDAY, MARCH 31, 2012

Session VI: The Moral Life of Babies and Why It Matters

Session Chair: Joshua Knobe, Associate Professor, Program in Cognitive Science & Department of Philosophy, Yale University
Speaker: Paul Bloom, Brooks & Suzanne Ragen Professor of Psychology & Cognitive Science, Yale University

Abstract: This talk will explore three case-studies of moral psychology: (1) Physical contact, such as helping, hindering, and hitting; (2) Fair and unfair distribution of resources; and (3) Violations of purity, with special focus on sexual behavior. I will review some ongoing experimental work with babies and young children that bears on the emergence of moral intuitions and motivations in these domains, and I will argue that these domains show strikingly different patterns of development. I end with an argument that developmental moral psychology is relevant to problems of normative ethics, though in a rather indirect way.




Session VII: Feeling Good About Feeling Bad: Moral Aliefs and Moral Dilemmas

Session Chair: Jesse Prinz, Distinguished Professor of Philosophy, City University of New York
Speaker: Tamar Gendler, Professor of Philosophy, Yale University

Abstract: In some cases, moral behavior seems to be fully commendable (only) when the subject performs it wholeheartedly, without conflict between or among counter- or pro-moral beliefs and counter- or pro-moral aliefs. But in others (perhaps those where moral demands at different levels pull in different directions), moral behavior seems to be fully commendable (only) when the subject experiences a conflict between pro-moral beliefs and pro-moral aliefs where the latter -- generally pro-social -- response is morally overridden in this (exceptional) circumstance. In still others, moral behavior seems to be fully commendable when it occurs as a result of the agent's overcoming certain counter-moral aliefs or beliefs. What sorts of systematic patterns do these cases exhibit, and how do they connect to Tetlock's work on tragic and taboo tradeoffs, Williams' work on "one thought too many" and "residues", Kant and Arpaly on enkratia and (reverse) akrasia, and recent work in neuroscience?





Session VIII: Morphing Morals: Neurochemical Modulation of Moral Judgment and Behavior

Session Chair: Andre Fenton, Professor of Neural Science, Center for Neural Science, New York University
Speaker: Molly Crockett, Sir Henry Wellcome Postdoctoral Fellow, Laboratory of Social & Neural Systems Research, Department of Economics, University of Zurich

Abstract: Neuroscientists are now discovering how hormones and brain chemicals shape social behavior, opening potential avenues for pharmacological manipulation of ethical values. In this talk, I will present an overview of recent studies showing how altering brain chemistry can change moral judgment and behavior. These findings raise new questions about the anatomy of the moral mind, and suggest directions for future research in both neurobiology and practical ethics.





Session IX: Are Intuitions Heuristics?

Session Chair: Laura Franklin-Hall, Assistant Professor of Philosophy, New York University
Speaker: S. Matthew Liao, Director of Graduate Studies, Center for Bioethics; Clinical Associate Professor of Bioethics; Affiliated Professor of Philosophy, New York University

Abstract: Many psychologists and philosophers are attracted to the idea that intuitions are heuristics, a kind of mental short-cut or rule of thumb. As a result, many think that the issue of whether intuitions are reliable is just the issue of whether heuristics are reliable. In this paper, I argue that there are reasons to doubt that intuitions are heuristics. I consider the implications of this point for the debate concerning the reliability of intuitions and for those who hold some kind of dual-process model of moral judgment.





Documentary - Drugs, Inc.: Hallucinogens


Drugs, Inc. is a series by National Geographic that looks at various drugs (among other things) and the crime associated with them. Among the episodes are this one, on hallucinogens, and others on crack, hash, ecstasy, ketamine, prescription drugs, and designer drugs. Unfortunately, the series takes an alarmist position in some ways (especially with LSD), despite its effort to show the healing uses of some of these drugs.

Here is a general summary of the show/series (provided by a drug treatment facility):
Drugs Inc. Series: Hallucinogens

In today’s society the influence and popularity of television programming is at an all-time high. People watch more TV than ever before, and some of the programs offered are extremely educational and beneficial. National Geographic Channel offers all kinds of unique and educational programming, but one show in particular has gotten an extreme response due to the nature of the show—Drugs Inc. Drugs Inc. profiles a multibillion-dollar industry that fuels crime and violence like no other substance on the planet. Each drug has its own unique origin and abuse pattern, but the outcome of addiction to any drug typically leads to the same rock-bottom lifestyle. While some users sacrifice their lives to an addiction they can’t escape, others find drugs to be their only saving grace from physical or emotional pain almost impossible to overcome. The series goes in depth by following actual drug abusers, dealers, and families involved with addiction to drugs.

Each episode focuses on real people and real-world situations involving a certain drug. It’s very shocking, educational, and inspirational to see how these drugs affect our society and ruin individuals’ lives. The hallucinogens episode goes into detail about the history of the drugs and their cultural and religious ties. Hallucinogenic substances are among the oldest drugs used by humankind, as they occur naturally in mushrooms, cacti and a variety of other plants. The most commonly abused hallucinogen is LSD, which stands for lysergic acid diethylamide and is also known as acid. Other hallucinogens are mescaline, mushrooms, psilocybin and ibogaine. The danger with hallucinogens is how a user simply loses control of their senses and perception of reality. There have been several instances of people doing absurd and damaging things to their bodies, or even dying, from doing something inspired by a hallucinogen trip. This episode can be very inspiring for someone who abuses hallucinogens to stop their drug involvement, because they can see what could happen to them.


Will Wilkinson - Politics vs. Empathy (at Big Think)


Last month, Will Wilkinson posted this brief article looking at some research on the ways in which we over-estimate the extent to which others feel what we feel, unless those others hold opposing political views (and, one could assume, any other quality that makes them different from ourselves).

What interests me most about this research is that it focused on embodied empathy (the studies used cold and thirst as the empathic feeling states). This suggests, to me, that we are willing to allow cognitive biases (here, triggered by differing political views) to override the basic humanity of shared physiological discomfort - in essence, we dehumanize those to whom we attribute opposing political views.

This is interesting - and disheartening - research, but there needs to be more of it, and as Wilkinson suggests, we need to know more about "the limits of out-group empathetic reception."
It appears to be extremely difficult to keep in our tiny tribal monkey minds, but do try: they are not really so different. Better: there is no they, only us. Why are we so prone to violence? Why do we cross borders illegally? Why do we hate us?
Indeed.

By the way, I use the image of Sam Harris at the top of this post because he is the perfect example of someone who has lost all ability to empathize with others who are different from him, in this case Muslims, and likely anyone who appears to be from Muslim parts of the world. He is advocating that we profile all people who appear Muslim for TSA screenings in airports. And yet he was the one person in his own article who posed a serious threat:
I once accidentally used a bag for carry-on in which I had once stored a handgun—and passed through three airport checkpoints with nearly 75 rounds of 9 mm ammunition.
Seriously - I have friends who are Indian by birth, with brown skin, who are Muslim, and very affluent. They have been "profiled" a dozen or more times, while I have never been singled out for extra scrutiny. The TSA already profiles Muslim men, and they are no more dangerous than the old couple or the three-year-old Harris describes in his rant.

Anyway, here is the article from Big Think.

Politics vs. Empathy


Politics makes us stupid. This is one of my recurring themes. This is the principal reason I refuse to be a partisan or ideological team player. People call me libertarian, but I don't call myself one, in part because I'm not one, but mostly because I suspect that accepting any such label dings my IQ about 15 points. It turns out politics not only makes us stupid. It also makes us callous. Here's the abstract of "More Than Skin Deep: Visceral States Are Not Projected Onto Dissimilar Others" by Ed O'Brien and Phoebe C. Ellsworth of the University of Michigan [via healthcanal.com]:
What people feel shapes their perceptions of others. In the studies reported here, we examined the assimilative influence of visceral states on social judgment. Replicating prior research, we found that participants who were outside during winter overestimated the extent to which other people were bothered by cold (Study 1), and participants who ate salty snacks without water thought other people were overly bothered by thirst (Study 2). However, in both studies, this effect evaporated when participants believed that the other people under consideration held opposing political views from their own. Participants who judged these dissimilar others were unaffected by their own strong visceral-drive states, a finding that highlights the power of dissimilarity in social judgment. Dissimilarity may thus represent a boundary condition for embodied cognition and inhibit an empathic understanding of shared out-group pain. Our findings reveal the need for a better understanding of how people’s internal experiences influence their perceptions of the feelings and experiences of those who may hold different values from their own.
Got that? We overestimate the extent to which others feel what we're feeling, unless they're on another team.

The authors call the tendency to generalize our own feelings "egocentric projection." What's the point of it?
[T]he social projection of visceral feelings may derive from the tendency to imagine another person’s situation by first imagining oneself in the same situation (Van Boven & Loewenstein, 2003); in other words, social projection of visceral feelings may reflect a more general projection of similarity.
If this rationale is correct, it suggests that people may not project visceral states onto others who are clearly different from themselves.
Makes sense, right? If we want to know how others are feeling, one quick and dirty trick is just to imagine ourselves in their shoes, see how we feel, and then attribute those feelings to others. But we're not perfect at abstracting away from the atypical particularities of our own present internal states. So, if we happen to be a little cold or thirsty, we'll project our chill or thirstiness into our little internal simulation of others. But not if others have, as in this study, different politics. What does politics have to do with thirst or chill? Nothing at all. Given that we are so quick to find our own feelings irrelevant to the understanding of people with different politics, just think how intuitively alien people who eat strange food and speak other languages must seem.

On one hand, the lack of egocentric projection onto out-group members eliminates errors of overprojection. Thirsty liberals will overestimate the thirstiness of other liberals, but not of conservatives. So a sense of difference can eliminate a certain common bias. On the other hand, this trivial gain in objectivity seems to be due to a sense that out-groupers are so dissimilar that it's not worth putting ourselves into their shoes, which is a harrowing thought.

Now, O'Brien and Ellsworth's study was designed to pick up the absence of projection, which does suggest a certain failure of empathy. But there is nothing in the study to suggest that this necessarily leads us to make other errors about what outgroupers feel. We'd need to better understand the positive value of egocentric projection of visceral states in order to fully grasp the implications of our tendency not to project our feelings onto outgroupers. O'Brien and Ellsworth do take a stab at some practical implications:
Our research ... suggests that people may be uninfluenced by their own pain when gauging pain felt by dissimilar others. Thus, if lawmakers first test interrogation practices (as suggested by Nordgren et al., 2011), they may not project the experience onto those for whom it is designed (e.g., suspected terrorists), and this could lead to an unintended acceptance of torture. Similarly, homeless populations often struggle with poor nutrition and intemperate weather; personally feeling hungry and cold may be insufficient to sensitize people who have no long-term worries about food and shelter to the plight of this highly stigmatized out-group (Harris & Fiske, 2006). These consequences suggest a surprising limitation in people’s capacity to empathize with others with whom they disagree or differ from. Perceptions of dissimilar others are apparently uninformed by visceral feelings.
Maybe strapping men into pregnancy bellies doesn't help?!

The limits of empathetic projection are interesting and suggestive, but I'd like to know more about the limits of out-group empathetic reception. No doubt there is work on this, and that it is even more depressing. If, say, white people were capable of fully empathizing with young black men, the American gulag system could not exist. It appears to be extremely difficult to keep in our tiny tribal monkey minds, but do try: they are not really so different. Better: there is no they, only us. Why are we so prone to violence? Why do we cross borders illegally? Why do we hate us?

Photo credit: WBEN-TV on Flickr

Friday, May 04, 2012

Aura Readers May Experience "Emotional Synesthesia"


For centuries, or longer, there have been healers who read auras as a way to diagnose or treat illness. Science has not been able to explain this skill by identifying or examining the aura itself. Since looking for the aura produced no results, researchers decided to look for clues in the healers themselves. In new research, they have found evidence that some healers experience "emotional synesthesia." In essence, they are getting visual information about the person's emotional state.

Scientific evidence proves why healers see the 'aura' of people

Posted On: May 4, 2012
Researchers in Spain have found that many of the individuals claiming to see the aura of people –traditionally called "healers" or "quacks"– actually present the neuropsychological phenomenon known as "synesthesia" (specifically, "emotional synesthesia"). This might be a scientific explanation of their alleged "virtue". In synesthetes, the brain regions responsible for the processing of each type of sensory stimuli are intensely interconnected. This way, synesthetes can see or taste a sound, feel a taste, or associate people with a particular color.

The study was conducted by University of Granada Department of Experimental Psychology researchers Óscar Iborra, Luis Pastor and Emilio Gómez Milán, and has been published in the prestigious journal Consciousness and Cognition. This is the first time a scientific explanation has been offered for the esoteric phenomenon of the aura, a supposed energy field of luminous radiation surrounding a person like a halo, which is imperceptible to most human beings.

In neurological terms, synesthesia is due to cross-wiring in the brain of some people (synesthetes); in other words, synesthetes present more synaptic connections than "normal" people. "These extra connections cause them to automatically establish associations between brain areas that are not normally interconnected", professor Gómez Milán explains. Many healers claiming to see the aura of people might have this condition.


The case of the "Santón de Baza"

The University of Granada researchers remark that "not all healers are synesthetes, but there is a higher prevalence of this phenomenon among them. The same occurs among painters and artists, for example". To carry out this study, the researchers interviewed synesthetes such as the healer from Granada Esteban Sánchez Casas, known as "El Santón de Baza".

Many people attribute "paranormal powers" to El Santón, such as his ability to see the aura of people "but, in fact, it is a clear case of synesthesia", the researchers explain. El Santón presents face-color synesthesia (the brain region responsible for face recognition is associated with the color-processing region); touch-mirror synesthesia (when the synesthete observes a person who is being touched or is experiencing pain, s/he experiences the same); high empathy (the ability to feel what another person is feeling); and schizotypy (certain personality traits in healthy people involving slight paranoia and delusions). "These capacities give synesthetes the ability to make people feel understood, and provide them with special emotion and pain reading skills", the researchers explain.

In light of the results obtained, the researchers remark on the significant "placebo effect" that healers have on people, "though some healers really have the ability to see people's auras and feel the pain in others due to synesthesia". Some healers "have abilities and attitudes that make them believe in their ability to heal other people, but it is actually a case of self-deception, as synesthesia is not an extrasensory power, but a subjective and 'adorned' perception of reality", the researchers state.

An Introduction to the Films of Peter Greenaway: Three Early Shorts

From Open Culture, three early short films from the avant-garde film director Peter Greenaway, the warped and brilliant mind behind Drowning By Numbers (1988), The Cook, The Thief, His Wife and Her Lover (1989), Prospero's Books (1991), The Pillow Book (1996), and 8½ Women (1999), among other films.

An Introduction to the Films of Peter Greenaway: Three Early Shorts



This week brings a Peter Greenaway double-bill to one of Los Angeles’ choicest revival cinemas, and what better way to get myself into the appropriate headspace than by first watching a few Greenaway shorts on the internet? When not making films, staging art happenings, or giving lectures, Greenaway teaches at the European Graduate School, and so he has a faculty page featuring a selection of videos of and pertaining to his work. These go all the way back to 1969, when he made a black-and-white short called Intervals. Shot around Venice during the Biennale, the film showcases the tendencies around which Greenaway has gone on to build his entire body of work: an attention to architecture; an attraction to historic centers of European art; an inclination toward schematic structures based upon numbers, alphabets, and music; and a desperate desire to escape the confines of narrative.


In 1973’s H is for House, you can more clearly sense Greenaway’s droll, calculating humor that would emerge in full force in the eighties, the decade which saw the release of his well-known pictures The Draughtsman’s Contract, A Zed and Two Noughts, and The Cook, the Thief, His Wife and Her Lover. It begins with the tale of a naturalist who, accustomed to following the sun every day as it moves from one side of the house to the other, simply cannot adapt when the Earth one day begins turning in the other direction. Things then take an architectural and alphabetical turn as the camera examines various features of a country house — the one from the story, perhaps? — and several voices announce all the things certain letters, mostly H, can stand for. (“Home movie,” “heliolithic,” “haberdashery,” and so on.) A lady and a toddler appear, and eventually we’re hearing another story, this time of a woman who always stares north, suspicious that sneaky city builders will encroach on her territory from that direction. As the sun sets around the house, one of the voices then tells us of a man who, convinced that sunsets “recharge” his eyesight, inadvertently sparks a rivalry between those who prefer to watch the sun fall and those who prefer to watch it rise.


In 1975, Greenaway came as close as he ever has to fantasy by making Water Wrackets (part one, part two).  Narrating over images of swamps, Greenaway crafts the absurdly elaborate far-future mythology of the title creatures. While these relatively simple pieces lack the visual abundance and long-form scope of the feature films that would come later, they nevertheless provide an illuminating glimpse into the way Greenaway sees and organizes the world. Though it enriches the viewing experience to understand these qualities in any filmmaker, with Greenaway it’s almost a necessity. Watch these shorts, and you’ll come away that much better equipped to enjoy The Falls, The Belly of an Architect, The Pillow Book, and all of Greenaway’s other movies, whether in a favorite theater or on the comfort of your own couch.

Related content:
Peter Greenaway Looks at the Day Cinema Died — and What Comes Next
Darwin: A 1993 Film by Peter Greenaway

Colin Marshall hosts and produces Notebook on Cities and Culture. Follow him on Twitter at @colinmarshall.

The Significance of Neuroscience for Morality: Lessons from a Decade of Research


NYU hosted a 3-day conference - 2012 Bioethics Conference: The Moral Brain - organized by the NYU Center for Bioethics in collaboration with the Duke Kenan Institute for Ethics, with generous support from the NYU Graduate School of Arts & Science and the NYU Humanities Initiative. The conference was held Friday, March 30, 2012 through Sunday, April 1, 2012 - they began posting the videos on May 1.

Here are some brief opening remarks followed by the first day's lectures - I will post more as they become available (Session II has not been posted yet).

Welcoming: The Significance of Neuroscience for Morality: Lessons from a Decade of Research
Thomas Carew, Dean of the Faculty of Arts & Science, New York University
It has been a decade since the first brain imaging studies of moral judgments by Joshua Greene, Jorge Moll and their colleagues were reported. During this time, there have been rich philosophical and scientific discussions regarding a) whether brain imaging data can tell us anything about moral judgments, and b) what they do tell us if they can tell us something about moral judgments. In this workshop, we aim to bring leading philosophers, neuroscientists, and psychologists in this area together to examine these issues and to explore the future directions of this research.




Session I: Beyond Point-And-Shoot Morality: Why Cognitive (Neuro)Science Matters for Ethics
Session Chair: Joseph LeDoux, University Professor; Henry and Lucy Moses Professor of Science; Professor of Neural Science and Psychology, Center for Neural Science and Psychology, New York University

Speaker: Joshua Greene: John & Ruth Hazel Associate Professor of Social Sciences, Department of Psychology, Harvard University
Abstract: Does the "is" is of empirical moral psychology have implications for the "ought" of normative ethics? I'll argue that it does. One cannot deduce moral truths form scientific truths, but  cognitive science, including cognitive neuroscience, may nevertheless influence moral thinking in profound ways. First, I'll review evidence for the dual-process theory of moral judgment, according to which characteristically deontological judgments tend to be driven by automatic emotional responses while characteristically consequentialist judgments tend to be driven by controlled cognitive processes. I'll then consider the respective functions of automatic and controlled processes. Automatic processes are like the point-and-shoot settings on a camera, efficient but inflexible. Controlled processes are like a camera's manual mode, inefficient but flexible. Putting these theses together, I'll argue that deontological philosophy is essentially a rationalization of automatic responses that are too inflexible to handle our peculiarly modern moral problems. I'll recommend consequentialist thinking as a better alternative for modern moral problem-solving.




Session III: When the Mind Matters for Morality
Session Chair: William Ruddick, Professor of Philosophy, New York University; Former Director of the Center for Bioethics.
Speaker: Liane Young, Assistant Professor of Psychology, Boston College

Abstract: Mental state reasoning is critical for moral cognition, allowing us to distinguish, for example, murder from manslaughter. I will present neural evidence for distinct cognitive components of mental state reasoning for moral judgment, and investigate differences in mental state reasoning for distinct moral domains, i.e. harm versus purity, for self versus other, and for groups versus individuals. I will discuss these findings in the context of the broader question of why the mind matters for morality.




Session IV: The Representation of Reinforcement Values in Care-Based Morality & the Implications of Dysfunction for the Development of Psychopathic Traits

Session Chair: Lila Davachi, Associate Professor of Psychology, New York University

Speaker: James Blair: Chief of the Unit on Affective Cognitive Neuroscience, National Institute of Mental Health, NIH

Abstract: This talk will concentrate on two brain areas critical for the development of care-based morality (social rules covering harm to others). The role of the amygdala in stimulus-reinforcement learning will be considered, particularly when the reinforcement is social (the fear, sadness and pain of others). The role of orbital frontal cortex in the representation of value, critical for decision making (both care-based moral and non-moral) will also be considered. Data showing dysfunction in both of these systems and these functions and their interaction in individuals with psychopathic traits will be presented and the implications of these data for care-based (and other forms of) morality will be considered.



Session V: Is There One Moral Brain?
Session Chair: Don Garrett, Chair of Department and Professor of Philosophy, New York University

Speaker: Walter Sinnott-Armstrong: Chauncey Stillman Professor in Practical Ethics, Department of Philosophy & Kenan Institute for Ethics, Duke University

Abstract: Different kinds of moral dilemmas produce activity in different parts of the brain, so there is no single neural network behind all moral judgments. This paper will survey the growing evidence for this claim and its implications for philosophy and for method in moral neuroscience.



Thursday, May 03, 2012

TEDxStudioCityED - Daniel Siegel, MD - Mindfulness and Neural Integration

It's great to see Dr. Dan Siegel doing a TEDx and not just the big TED events.

His most recent book is the Pocket Guide to Interpersonal Neurobiology: An Integrative Handbook of the Mind (Norton Series on Interpersonal Neurobiology) - there is also an updated reissue of The Developing Mind, Second Edition: How Relationships and the Brain Interact to Shape Who We Are.





Exploring Relationships and Reflection in the Cultivation of Well-Being.
Daniel Siegel, MD, is Clinical Professor of Psychiatry at UCLA, Co-Director of the Mindful Awareness Research Center, Executive Director of the Mindsight Institute, an author, and a recipient of numerous awards and honorary fellowships.

This talk examines how relationships and reflection support the development of resilience in children and serve as the basic "3 R's" of a new internal education of the mind.

Harvard Magazine - Sociobiologist E.O. Wilson on the Evolution of Culture


I just recently received my copy of E.O. Wilson's new book, The Social Conquest of Earth. Whether we are talking about religion, sports teams, or nations, biologist E.O. Wilson argues that our need to be a part of groups - and to defend them or fight for them - is part of what has allowed us to inhabit the earth, and part of what makes us human.

This excerpt from the book was posted by Harvard Magazine.

On the Origins of the Arts

Sociobiologist E.O. Wilson on the evolution of culture 

Rich and seemingly boundless as the creative arts seem to be, each is filtered through the narrow biological channels of human cognition. Our sensory world, what we can learn unaided about reality external to our bodies, is pitifully small. Our vision is limited to a tiny segment of the electromagnetic spectrum, where wave frequencies in their fullness range from gamma radiation at the upper end, downward to the ultralow frequency used in some specialized forms of communication. We see only a tiny bit in the middle of the whole, which we refer to as the “visual spectrum.” Our optical apparatus divides this accessible piece into the fuzzy divisions we call colors. Just beyond blue in frequency is ultraviolet, which insects can see but we cannot. Of the sound frequencies all around us we hear only a few. Bats orient with the echoes of ultrasound, at a frequency too high for our ears, and elephants communicate with grumbling at frequencies too low.

Tropical mormyrid fishes use electric pulses to orient and communicate in opaque murky water, having evolved to high efficiency a sensory modality entirely lacking in humans. Also unfelt by us is Earth’s magnetic field, which is used by some kinds of migratory birds for orientation. Nor can we see the polarization of sunlight from patches of the sky that honeybees employ on cloudy days to guide them from their hives to flower beds and back.

Our greatest weakness, however, is our pitifully small sense of taste and smell. Over 99 percent of all living species, from microorganisms to animals, rely on chemical senses to find their way through the environment. They have also perfected the capacity to communicate with one another with special chemicals called pheromones. In contrast, human beings, along with monkeys, apes, and birds, are among the rare life forms that are primarily audiovisual, and correspondingly weak in taste and smell. We are idiots compared with rattlesnakes and bloodhounds. Our poor ability to smell and taste is reflected in the small size of our chemosensory vocabularies, forcing us for the most part to fall back on similes and other forms of metaphor. A wine has a delicate bouquet, we say, its taste is full and somewhat fruity. A scent is like that of a rose, or pine, or rain newly fallen on the earth.

We are forced to stumble through our chemically challenged lives in a chemosensory biosphere, relying on sound and vision that evolved primarily for life in the trees. Only through science and technology has humanity penetrated the immense sensory worlds in the rest of the biosphere. With instrumentation, we are able to translate the sensory worlds of the rest of life into our own. And in the process, we have learned to see almost to the end of the universe, and estimated the time of its beginning. We will never orient by feeling Earth’s magnetic field, or sing in pheromone, but we can bring all such existing information into our own little sensory realm.

By using this power to examine human history as well, we can gain insights into the origin and nature of aesthetic judgment. For example, neurobiological monitoring, in particular measurements of the damping of alpha waves during perceptions of abstract designs, has shown that the brain is most aroused by patterns in which there is about a 20 percent redundancy of elements or, put roughly, the amount of complexity found in a simple maze, or two turns of a logarithmic spiral, or an asymmetric cross. It may be coincidence (although I think not) that about the same degree of complexity is shared by a great deal of the art in friezes, grillwork, colophons, logographs, and flag designs. It crops up again in the glyphs of the ancient Middle East and Mesoamerica, as well as in the pictographs and letters of modern Asian languages. The same level of complexity characterizes part of what is considered attractive in primitive art and modern abstract art and design. The source of the principle may be that this amount of complexity is the most that the brain can process in a single glance, in the same way that seven is the highest number of objects that can be counted at a single glance. When a picture is more complex, the eye grasps its content by the eye’s saccade or consciously reflective travel from one sector to the next. A quality of great art is its ability to guide attention from one of its parts to another in a manner that pleases, informs, and provokes.

In another sphere of the visual arts there is biophilia, the innate affiliation people seek with other organisms, and especially with the living natural world. Studies have shown that given freedom to choose the setting of their homes or offices, people across cultures gravitate toward an environment that combines three features, intuitively understood by landscape architects and real estate entrepreneurs. They want to be on a height looking down, they prefer open savanna-like terrain with scattered trees and copses, and they want to be close to a body of water, such as a river, lake, or ocean. Even if all these elements are purely aesthetic and not functional, home buyers will pay any affordable price to have such a view.

People, in other words, prefer to live in those environments in which our species evolved over millions of years in Africa. Instinctively, they gravitate toward savanna forest (parkland) and transitional forest, looking out safely over a distance toward reliable sources of food and water. This is by no means an odd connection, if considered as a biological phenomenon. All mobile animal species are guided by instincts that lead them to habitats in which they have a maximum chance for survival and reproduction. It should come as no surprise that during the relatively short span since the beginning of the Neolithic, humanity still feels a residue of that ancient need.

If ever there was a reason for bringing the humanities and science closer together, it is the need to understand the true nature of the human sensory world, as contrasted with that seen by the rest of life. But there is another, even more important reason to move toward consilience among the great branches of learning. Substantial evidence now exists that human social behavior arose genetically by multilevel evolution. If this interpretation is correct, and a growing number of evolutionary biologists and anthropologists believe it is, we can expect a continuing conflict between components of behavior favored by individual selection and those favored by group selection. Selection at the individual level tends to create competitiveness and selfish behavior among group members—in status, mating, and the securing of resources. In opposition, selection between groups tends to create selfless behavior, expressed in greater generosity and altruism, which in turn promote stronger cohesion and strength of the group as a whole.

An inevitable result of the mutually offsetting forces of multilevel selection is permanent ambiguity in the individual human mind, leading to countless scenarios among people in the way they bond, love, affiliate, betray, share, sacrifice, steal, deceive, redeem, punish, appeal, and adjudicate. The struggle endemic to each person’s brain, mirrored in the vast superstructure of cultural evolution, is the fountainhead of the humanities. A Shakespeare in the world of ants, untroubled by any such war between honor and treachery, and chained by the rigid commands of instinct to a tiny repertory of feeling, would be able to write only one drama of triumph and one of tragedy. Ordinary people, on the other hand, can invent an endless variety of such stories, and compose an infinite symphony of ambience and mood.

What exactly, then, are the humanities? An earnest effort to define them is to be found in the U.S. congressional statute of 1965, which established the National Endowment for the Humanities and the National Endowment for the Arts:
The term “humanities” includes, but is not limited to, the study of the following: language, both modern and classical; linguistics; literature; history; jurisprudence; philosophy; archaeology; comparative religion; ethics; the history, criticism, and theory of the arts; those aspects of social sciences which have humanistic content and employ humanistic methods; and the study and application of the humanities to the human environment with particular attention to reflecting our diverse heritage, traditions, and history and to the relevance of the humanities to the current conditions of national life.
Such may be the scope of the humanities, but it makes no allusion to the understanding of the cognitive processes that bind them all together, nor their relation to hereditary human nature, nor their origin in prehistory. Surely we will never see a full maturing of the humanities until these dimensions are added.

Since the fading of the original Enlightenment during the late eighteenth and early nineteenth centuries, a stubborn impasse has existed in the consilience of the humanities and natural sciences. One way to break it is to collate the creative process and writing styles of literature and scientific research. This might not prove so difficult as it first seems. Innovators in both domains are basically dreamers and storytellers. In the early stages of creation of both art and science, everything in the mind is a story. There is an imagined denouement, and perhaps a start, and a selection of bits and pieces that might fit in between. In works of literature and science alike, any part can be changed, causing a ripple among the other parts, some of which are discarded and new ones added. The surviving fragments are variously joined and separated, and moved about as the story forms. One scenario emerges, then another. The scenarios, whether literary or scientific in nature, compete. Words and sentences (or equations or experiments) are tried. Early on an end to all the imagining is conceived. It seems a wondrous denouement (or scientific breakthrough). But is it the best, is it true? To bring the end safely home is the goal of the creative mind. Whatever that might be, wherever located, however expressed, it begins as a phantom that might up until the last moment fade and be replaced. Inexpressible thoughts flit along the edges. As the best fragments solidify, they are put in place and moved about, and the story grows and reaches its inspired end. Flannery O’Connor asked, correctly, for all of us, literary authors and scientists, “How can I know what I mean until I see what I say?” The novelist says, “Does that work?,” and the scientist says, “Could that possibly be true?”

The successful scientist thinks like a poet but works like a bookkeeper. He writes for peer review in hopes that “statured” scientists, those with achievements and reputations of their own, will accept his discoveries. Science grows in a manner not well appreciated by nonscientists: it is guided as much by peer approval as by the truth of its technical claims. Reputation is the silver and gold of scientific careers. Scientists could say, as did James Cagney upon receiving an Academy Award for lifetime achievement, “In this business you’re only as good as the other fellow thinks you are.”

But in the long term, a scientific reputation will endure or fall upon credit for authentic discoveries. The conclusions will be tested repeatedly, and they must hold true. Data must not be questionable, or theories crumble. Mistakes uncovered by others can cause a reputation to wither. The punishment for fraud is nothing less than death—to the reputation, and to the possibility of further career advancement. The equivalent capital crime in literature is plagiarism. But not fraud! In fiction, as in the other creative arts, a free play of imagination is expected. And to the extent it proves aesthetically pleasing, or otherwise evocative, it is celebrated.

The essential difference between literary and scientific style is the use of metaphor. In scientific reports, metaphor is permissible—provided it is chaste, perhaps with just a touch of irony and self-deprecation. For example, the following would be permitted in the introduction or discussion of a technical report: “This result if confirmed will, we believe, open the door to a range of further fruitful investigations.” Not permitted is: “We envision this result, which we found extraordinarily hard to obtain, to be a potential watershed from which many streams of new research will surely flow.”

What counts in science is the importance of the discovery. What matters in literature is the originality and power of the metaphor. Scientific reports add a tested fragment to our knowledge of the material world. Lyrical expression in literature, on the other hand, is a device to communicate emotional feeling directly from the mind of the writer to the mind of the reader. There is no such goal in scientific reporting, where the purpose of the author is to persuade the reader by evidence and reasoning of the validity and importance of the discovery. In fiction the stronger the desire to share emotion, the more lyrical the language must be. At the extreme, the statement may be obviously false, because author and reader want it that way. To the poet the sun rises in the east and sets in the west, tracking our diel cycles of activity, symbolizing birth, the high noon of life, death, and rebirth—even though the sun makes no such movement. It is just the way our distant ancestors visualized the celestial sphere and the starry sky. They linked its mysteries, which were many, to those in their own lives, and wrote them down in sacred script and poetry across the ages. It will be a long time before a similar venerability in literature is acquired by the real solar system, in which Earth is a spinning planet encircling a minor star.

On behalf of this other truth, that special truth sought in literature, E. L. Doctorow asks,
Who would give up the Iliad for the “real” historical record? Of course the writer has a responsibility, whether as solemn interpreter or satirist, to make a composition that serves a revealed truth. But we demand that of all creative artists, of whatever medium. Besides which a reader of fiction who finds, in a novel, a familiar public figure saying and doing things not reported elsewhere knows he is reading fiction. He knows the novelist hopes to lie his way to a greater truth than is possible with factual reportage. The novel is an aesthetic rendering that would portray a public figure interpretively no less than the portrait on an easel. The novel is not read as a newspaper is read; it is read as it is written, in the spirit of freedom.
Picasso expressed the same idea summarily: “Art is the lie that helps us to see the truth.”

The creative arts became possible as an evolutionary advance when humans developed the capacity for abstract thought. The human mind could then form a template of a shape, or a kind of object, or an action, and pass a concrete representation of the conception to another mind. Thus was first born true, productive language, constructed from arbitrary words and symbols. Language was followed by visual art, music, dance, and the ceremonies and rituals of religion.

The exact date at which the process leading to authentic creative arts began is unknown. As early as 1.7 million years ago, ancestors of modern humans, most likely Homo erectus, were shaping crude teardrop-shaped stone tools. Held in the hand, they were probably used to chop up vegetables and meat. Whether they were also held in the mind as a mental abstraction, rather than merely created by imitation among group members, is unknown.

By 500,000 years ago, in the time of the much brainier Homo heidelbergensis, a species intermediate in age and anatomy between Homo erectus and Homo sapiens, the hand axes had become more sophisticated, and they were joined by carefully crafted stone blades and projectile points. Within another 100,000 years, people were using wooden spears, which must have taken several days and multiple steps to construct. In this period, the Middle Stone Age, the human ancestors began to evolve a technology based on a true, abstraction-based culture.

Next came pierced snail shells thought to be used as necklaces, along with still more sophisticated tools, including well-designed bone points. Most intriguing are engraved pieces of ocher. One design, 77,000 years old, consists of three scratched lines that connect a row of nine X-shaped marks. The meaning, if any, is unknown, but the abstract nature of the pattern seems clear.

Burials began at least 95,000 years ago, as evidenced by thirty individuals excavated at Qafzeh Cave in Israel. One of the dead, a nine-year-old child, was positioned with its legs bent and a deer antler in its arms. That arrangement alone suggests not just an abstract awareness of death but also some form of existential anxiety. Among today’s hunter-gatherers, death is an event managed by ceremony and art.

The beginnings of the creative arts as they are practiced today may stay forever hidden. Yet they were sufficiently established by genetic and cultural evolution for the “creative explosion” that began approximately 35,000 years ago in Europe. From this time on until the Late Paleolithic period over 20,000 years later, cave art flourished. Thousands of figures, mostly of large game animals, have been found in more than two hundred caves distributed through southwestern France and northeastern Spain, on both sides of the Pyrenees. Along with cliffside drawings in other parts of the world, they present a stunning snapshot of life just before the dawn of civilization.

The Louvre of the Paleolithic galleries is at the Grotte Chauvet in the Ardèche region of southern France. The masterpiece among its productions, created by a single artist with red ocher, charcoal, and engraving, is a herd of four horses (a native wild species in Europe at that time) running together. Each of the animals is represented by only its head, but each is individual in character. The herd is tight and oriented obliquely, as though seen from slightly above and to the left. The edges of the muzzles were chiseled into bas relief to bring them into greater prominence. Exact analyses of the figures have found that multiple artists first painted a pair of rhinoceros males in head-to-head combat, then two aurochs (wild cattle) facing away. The two groups were placed to leave a space in the middle. Into the space the single artist stepped to create his little herd of horses.

The rhinos and cattle have been dated to 32,000–30,000 years before the present, and the assumption has been that the horses are that old as well. But the elegance and technology evident in the horses have led some experts to reckon their provenance as dating to the Magdalenian period, which extended from 17,000 to 12,000 years ago. That would align the origin with the great works on the cave walls of Lascaux in France and Altamira in Spain.

Apart from the exact date of the Chauvet herd’s antiquity, the important function of the cave art remains uncertain. There is no reason to suppose the caves served as proto-churches, in which bands gathered to pray to the gods. The floors are covered with the remains of hearths, bones of animals, and other evidences of long-term domestic occupation. The first Homo sapiens entered central and eastern Europe around 45,000 years ago. Caves in that period obviously served as shelters that allowed people to endure harsh winters on the Mammoth Steppe, the great expanse of grassland that extended below the continental ice sheet across the whole of Eurasia and into the New World.

Perhaps, some writers have argued, the cave paintings were made to conjure sympathetic magic and increase the success of hunters in the field. This supposition is supported by the fact that a great majority of the subjects are large animals. Furthermore, 15 percent of these animal paintings depict animals that have been wounded by spears or arrows.

Additional evidence of a ritualistic content in the European cave art has been provided by the discovery of a painting of what is most likely a shaman with a deer headdress, or possibly a real deer’s head. Also preserved are sculptures of three “lion-men,” with human bodies and the heads of lions—precursors of the chimeric half-animal-half-gods later to show up in the early history of the Middle East. Admittedly, we have no testable idea of what the shaman did or the lion-men represented.

A contrary view of the role of cave art has been advanced by the wildlife biologist R. Dale Guthrie, whose masterwork The Nature of Paleolithic Art is the most thorough on the subject ever published. Almost all of the art, Guthrie argues, can be explained as the representations of everyday Aurignacian and Magdalenian life. The animals depicted belong to the species the cave dwellers regularly hunted (with a few, like lions, that may have hunted people), so naturally that would be a regular subject for talk and visual communication. There were also more figures of humans or at least parts of the human anatomy that are usually not mentioned in accounts of cave art. These tend to be pedestrian. The inhabitants often made prints by holding their hands on the wall and spewing ocher powder from their mouths, leaving an outline of spread thumb and fingers behind. The size of the hands indicates that it was mostly children who engaged in this activity. A good many graffiti are present as well, with meaningless squiggles and crude representations of male and female genitalia common among them. Sculptures of grotesque obese women are also present and may have been offerings to the spirits or gods to increase fertility—the little bands needed all the members they could generate. On the other hand, the sculptures might as easily have been an exaggerated representation of the plumpness in women desired during the frequent hard times of winter on the Mammoth Steppe.

The utilitarian theory of cave art, that the paintings and scratchings depict ordinary life, is almost certainly partly correct, but not entirely so. Few experts have taken into account that there also occurred, in another wholly different domain, the origin and use of music. This event provides independent evidence that at least some of the paintings and sculptures did have a magical content in the lives of the cave dwellers. A few writers have argued that music had no Darwinian significance, that it sprang from language as a pleasant “auditory cheesecake,” as one author once put it. It is true that scant evidence exists of the content of the music itself—just as, remarkably, we have no score and therefore no record of Greek and Roman music, only the instruments. But musical instruments also existed from an early period of the creative explosion. “Flutes,” technically better classified as pipes, fashioned from bird bones, have been found that date to 30,000 years or more before the present. At Isturitz in France and other localities some 225 reputed pipes have been so classified, some of which are of certain authenticity. The best among them have finger holes set in an oblique alignment and rotated clockwise to a degree seemingly meant to line up with the fingers of a human hand. The holes are also beveled in a way that allows the tips of the fingers to be sealed against them. A modern flutist, Graeme Lawson, has played a replica made from one of them, albeit of course without a Paleolithic score in hand.

Other artifacts have been found that can plausibly be interpreted as musical instruments. They include thin flint blades that, when hung together and struck, produce pleasant sounds like those from wind chimes. Further, although perhaps just a coincidence, the sections of walls on which cave paintings were made tend to emit arresting echoes of sound in their vicinity.

Was music Darwinian? Did it have survival value for the Paleolithic tribes that practiced it? Examining the customs of contemporary hunter-gatherer cultures from around the world, one can hardly come to any other conclusion. Songs, usually accompanied by dances, are all but universal. And because Australian aboriginals have been isolated since the arrival of their forebears about 45,000 years ago, and their songs and dances are similar in genre to those of other hunter-gatherer cultures, it is reasonable to suppose that they resemble the ones practiced by their Paleolithic ancestors.

Anthropologists have paid relatively little attention to contemporary hunter-gatherer music, relegating its study to specialists on music, as they are also prone to do for linguistics and ethnobotany (the study of plants used by the tribes). Nonetheless, songs and dances are major elements of all hunter-gatherer societies. Furthermore, they are typically communal, and they address an impressive array of life issues. The songs of the well-studied Inuit, Gabon pygmies, and Arnhem Land aboriginals approach a level of detail and sophistication comparable to those of advanced modern civilizations. The musical compositions of modern hunter-gatherers generally serve basically as tools that invigorate their lives. The subjects within the repertoires include histories and mythologies of the tribe as well as practical knowledge about land, plants, and animals.

Of special importance to the meaning of game animals in the Paleolithic cave art of Europe, the songs and dances of the modern tribes are mostly about hunting. They speak of the various prey; they empower the hunting weapons, including the dogs; they appease the animals they have killed or are about to kill; and they offer homage to the land on which they hunt. They recall and celebrate successful hunts of the past. They honor the dead and ask the favor of the spirits who rule their fates.

It is self-evident that the songs and dances of contemporary hunter-gatherer peoples serve them at both the individual and the group levels. They draw the tribal members together, creating a common knowledge and purpose. They excite passion for action. They are mnemonic, stirring and adding to the memory of information that serves the tribal purpose. Not least, knowledge of the songs and dances gives power to those within the tribe who know them best.

To create and perform music is a human instinct. It is one of the true universals of our species. To take an extreme example, the neuroscientist Aniruddh D. Patel points to the Pirahã, a small tribe in the Brazilian Amazon: “Members of this culture speak a language without numbers or a concept of counting. Their language has no fixed terms for colors. They have no creation myths, and they do not draw, aside from simple stick figures. Yet they have music in abundance, in the form of songs.”

Patel has referred to music as a “transformative technology.” To the same degree as literacy and language itself, it has changed the way people see the world. Learning to play a musical instrument even alters the structure of the brain, from the subcortical circuits that encode sound patterns to the neural fibers that connect the two cerebral hemispheres and the patterns of gray-matter density in certain regions of the cerebral cortex. Music is powerful in its impact on human feeling and on the interpretation of events. It is extraordinarily complex in the neural circuits it employs, appearing to elicit emotion through at least six different brain mechanisms.

Music is closely linked to language in mental development, and in some ways it appears to be derived from language. The patterns of discrimination for melodic ups and downs are similar in both. But whereas language acquisition in children is fast and largely autonomous, music is acquired more slowly and depends on substantial teaching and practice. There is, moreover, a distinct critical period for learning language, during which skills are picked up swiftly and with ease, whereas no such sensitive period is yet known for music. Still, both language and music are syntactical, being arranged as discrete elements—words, notes, and chords. Among persons with congenital defects in the perception of music (comprising 2 to 4 percent of the population), some 30 percent also have difficulty perceiving pitch contour, a property music shares with speech.

Altogether, there is reason to believe that music is a newcomer in human evolution. It might well have arisen as a spin-off of speech. Yet to assume that much is not also to conclude that music is merely a cultural elaboration of speech. It has at least one feature not shared with speech: beat, which, moreover, can be synchronized between song and dance.

It is tempting to think that the neural processing of language served as a preadaptation to music, and that once music originated it proved sufficiently advantageous to acquire its own genetic predisposition. This is a subject that will richly reward further research, including the synthesis of elements from anthropology, psychology, neuroscience, and evolutionary biology.

Wednesday, May 02, 2012

Crazy Wisdom - The Life and Times of Chogyam Trungpa Rinpoche - Full Movie (2011)


I don't know how long this will be available on YouTube, but here is the whole film of Crazy Wisdom - The Life and Times of Chogyam Trungpa Rinpoche. I very much enjoyed this when I saw it last year.
Crazy Wisdom is the long-awaited feature documentary to explore the life, teachings, and "crazy wisdom" of Chogyam Trungpa Rinpoche, a pivotal figure in bringing Tibetan Buddhism to the West. Called a genius, a rascal, and a social visionary, "one of the greatest spiritual teachers of the 20th century," and "the bad boy of Buddhism," Trungpa defied categorization. Raised and trained in the rigorous Tibetan monastic tradition, Trungpa came to the West and shattered our preconceived notions about how an enlightened teacher should behave - he openly smoked, drank, and had intimate relations with students - yet his teachings are recognized as authentic, vast, and influential. Twenty years after his death, with unprecedented access and exclusive archival material, Crazy Wisdom looks at the man and the myths about him, and attempts to set the record straight. Written by Lisa Leeman, Producer.


2007 Roundtable Discussion - Mystery of the Mind

A nice collection of smart people - Gianfranco Basti, Ned Block, Richard Haier, Joseph LeDoux, Patrick McGrath, and Craig Piers - discuss the mystery of the mind in this roundtable from 2007.




  • Monsignor Gianfranco Basti is dean of the philosophy department at the Pontifical Lateran University
  • Ned Block (Ph.D., Harvard) is Silver Professor of Philosophy, Psychology, and Neural Science at NYU
  • Richard Haier is a neuroscience consultant and Professor-in-Residence, Emeritus, at the University of California, Irvine School of Medicine
  • Joseph LeDoux is the Henry and Lucy Moses Professor of Science at NYU
  • Patrick McGrath is a professor of psychology at Dalhousie University in Halifax, Nova Scotia, and at the IWK Health Centre, where he co-directs the Centre for Pediatric Pain Research and directs the Centre for Family Health Research
  • Craig Piers is a psychotherapist and clinical supervisor at the Williams College health center

Beyond Human Nature: How Culture and Experience Shape our Lives by Jesse Prinz

Sounds like an interesting new book from philosopher Jesse Prinz - reviewed by Simon Blackburn at The New Statesman. It looks like Amazon has it for the Kindle, but the hardcover has to be ordered from other vendors. The U.S. version is not out until November - but it can be pre-ordered.



Beyond Human Nature: How Culture and Experience Shape our Lives
Jesse Prinz
Allen Lane, 416pp, £22

It is astonishing how quickly nature has gone into retreat. Until five or ten years ago, the dominant story was that our genes were our fate. Our fixed endowments in the shape of unlearned capacities, innate modules, biologically hard-wired dispositions and evolutionary inheritances from the savannah dominated the scene, with culture and history relegated to mere bit players.

The first cracks in the consensus appeared with the realisation that genes work differently in different environments. For example, it was discovered that if rats were separated into two groups, one of which received maternal care and love while the other did not, parts of the brain grew better in the former group and they were less likely to flood themselves with stress hormones such as cortisol. So, if you want a laid-back rat, mother it properly. Epigenetic factors began to muscle in on the DNA monopoly.

Of course, in human beings we already knew - didn't we? - that such environmental factors affected children's characters. And it didn't take much guessing to suppose that they did so by making some difference to their brains. But somehow it took the addition of brain scans and neurophysiological and endocrinal data from rats to make such beliefs respectable again, so that anthropology could begin to claw back ground from biology.

Jesse Prinz, a philosopher at the City University of New York, has written an excellent guide to the current state of play. Prinz is admirably cautious about the nature-nurture dispute, which always has to come down to matters of detail and degree. His interest is in human flexibility, although he freely admits that "we need very sophisticated biological resources to be as flexible as we are". Nevertheless, it is clear where his sympathies lie. Early in the book he tells us that only "a tiny fraction of articles in psychology journals take culture into consideration". So it is time to redress the balance, and Prinz does it with insight, learning and above all a wonderful eye for the weaknesses in biological reductionist arguments.

Prinz lays out his case by first considering the difference between colour vision, which is a natural capacity with a well-understood biological underpinning, and the capacity to play baseball, which requires putting together a number of general capacities in a way that takes a great deal of nurture to develop. The question, then, is the size of the innate inventory of capacities, rules, dispositions and tendencies, shaped over time by evolution, and themselves constituting adaptations to older environments. Are they large, computationally fixed and relatively inflexible, like colour vision? Or is it more a matter of general-purpose abilities (running, balancing, throwing, remembering) exquisitely tuned by culture and learning into one form or another, like baseball?

In the former camp we have evolutionary psychologists, nativists and those who like a picture of the mind as a kind of Swiss Army knife: an aggregation of dedicated modules rigidly shaped by evolution. In the latter camp, we have those who stress general purpose learning capacities, which in one environment might enable you to become a cricketer, but in another a baseball player. At the dawn of the scientific revolution, the philosophical ancestors of the first group were rationalists such as Descartes and Leibniz, who saw the mind as ready-furnished by God with a nice array of innate capacities. The ancestors of the second group were the empiricists, who thought that we needed no such interior designer. Experience could do the furnishing all by itself.

Ever since the work of Noam Chomsky in the middle of the 20th century, our capacities with language have been one of the major battlegrounds. The trump card of Chomsky and his followers is the "poverty of stimulus" argument. This alleges that empiricism cannot account for language learning. We learn too much, too quickly, making too few mistakes, extrapolating what we learn too accurately, for this to be the result of any general empirical learning process. Out of all the myriad possible patterns linguistic systems might implement, the infant almost miraculously picks up the right one, with far too little experience or correction to explain the unfolding capacities. Only a few theorists have dared to challenge this Chomskyan consensus. And Chomskyans are certainly right that the infant cannot be doing it by consciously formulating rules, since even expert linguists often cannot do as much.

Prinz makes a strong, detailed case that statistical learning, the poster child of empiricism, can account for everything we know about language learning. Children do not just imitate, they extrapolate. They try things out. They take patterns they hear and extend them experimentally. They are unconsciously nudged into shape by the regularities in the data sets to which they are exposed. And this makes sense: the brain is designed to pick up on patterns in the environment, whether they indicate edibility in food, change in the weather, the passage of a predator, the way to cook a squirrel, or the acceptability of a new sentence. Instead of arriving packed with innate universal grammars, we come ready to pick up whatever the world is going to throw at us. The quicker we learn its ins and outs, the better.
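The statistical-learning story can be made concrete with a toy computation. The Python sketch below illustrates one well-studied mechanism from the infant word-segmentation literature, the transitional-probability idea associated with Saffran and colleagues; it is not code from Prinz's book or from this review, and the syllable stream and the 0.75 threshold are invented for the example. The learner simply tracks how often each syllable follows another and posits a word boundary wherever that probability dips.

    # Toy sketch of statistical word segmentation (hypothetical example).
    # Within made-up "words" each syllable predicts the next with
    # probability 1.0; across word boundaries the prediction is weaker,
    # so a dip in transitional probability marks a likely boundary.
    from collections import Counter

    def transitional_probabilities(syllables):
        """P(next syllable | current syllable) for each adjacent pair."""
        pair_counts = Counter(zip(syllables, syllables[1:]))
        first_counts = Counter(syllables[:-1])
        return {(a, b): n / first_counts[a]
                for (a, b), n in pair_counts.items()}

    def segment(syllables, threshold=0.75):
        """Insert a word boundary wherever transitional probability dips."""
        tp = transitional_probabilities(syllables)
        words, current = [], [syllables[0]]
        for a, b in zip(syllables, syllables[1:]):
            if tp[(a, b)] < threshold:
                words.append("".join(current))
                current = []
            current.append(b)
        words.append("".join(current))
        return words

    # A continuous stream built from three made-up words
    # (bidaku, golabu, padoti) with no pauses between them.
    stream = ("bi da ku go la bu pa do ti bi da ku pa do ti "
              "go la bu bi da ku go la bu pa do ti").split()
    print(segment(stream))
    # -> ['bidaku', 'golabu', 'padoti', 'bidaku', 'padoti',
    #     'golabu', 'bidaku', 'golabu', 'padoti']

Nothing in the sketch knows the words in advance; the boundaries fall out of the statistics of the stream, which is the general-purpose, experience-driven kind of learning the empiricist camp appeals to.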

The example may sound dry, but there is a vital humanistic lesson in the book. It has been all too easy to cite "innate" differences as justifications of the social status quo, when too often it is the social status quo that generates the illusion of innate differences. For example, the belief that girls are naturally girlish and boys naturally boyish ignores the ubiquitous pressures to conform to the acceptable pattern, starting well before birth and reinforced throughout life. Prinz writes well about this, too. Similar remarks apply, of course, to those who, themselves belonging to the supposedly superior group, put differences in IQ scores or arithmetical or musical ability down to differences of race, before reflecting on the social and cultural environments of those being compared.

Prinz's final chapter is about sex, but I shall not spoil the plot. Suffice it to say that we are not naturally polygamous, or monogamous, or anything else, except perhaps naturally inclined to bend the truth on questionnaires. "Those who want to understand our preferences will learn more from history books than from chimpanzee troops in the Gombe," Prinz tells us. "[B]iology can help explain why we are more likely to flirt with a person than a potato, but that's just where the story begins."

From start to finish this book is a fine, balanced, enormously learned and informative blast on the trumpet of common sense and humane understanding. The story is largely optimistic but also reminds us that when things go wrong around us, we are all capable of going wrong with them.

Simon Blackburn is Bertrand Russell Professor of Philosophy at Cambridge. His most recent book is "Practical Tortoise Raising and Other Philosophical Essays" (Oxford University Press, £25).

Tuesday, May 01, 2012

io9 - This Is Your Brain on Marijuana

Take a few hits, sit back and melt into the couch; time gets a little wonky (slow), memory gets unreliable, and suddenly you're hungry. Sounds vaguely familiar, right? Well, if you have ever smoked marijuana, it does.

io9 posted an article (or reposted one, actually) on what goes on in your brain after a few puffs on a joint, or a bong hit if you are so inclined.

Interesting stuff.

What cannabis actually does to your brain

Archaeologists recently found a 2,700-year-old pot stash, so we know humans have been smoking weed for thousands of years. But it was only about 20 years ago that neuroscientists began to understand how it affects our brains.

Scientists have known for a while that the active ingredient in cannabis is a chemical called delta-9-tetrahydrocannabinol, or THC for short. Ingesting or smoking THC has a wide range of effects, from the psychoactive "getting high" to the physiological relief of pain and swelling. It also acts as both a stimulant and a depressant. How could one substance do all that?

Meet the cannabinoid receptor

In the 1980s and 90s, researchers identified cannabinoid receptors, long, ropy proteins that weave themselves into the surfaces of our cells and process THC. They also process other chemicals, many of them naturally occurring in our bodies. Once we'd discovered these receptors, we knew exactly where THC was being processed in our bodies and brains, as well as what physical systems it was affecting. Scattered throughout the body, cannabinoid receptors come in two varieties, called CB1 and CB2 - most of your CB1 receptors are in your brain, and are responsible for that "high" feeling when you smoke pot. CB2 receptors, often associated with the immune system, are found all over the body. THC interacts with both, which is why the drug gives you the giggles and also (when interacting with the immune system) reduces swelling and pain.

Cannabinoid receptors evolved in sea squirts about 500 million years ago; humans and many other creatures inherited ours from a distant ancestor we share with these simple sea creatures. THC binds to receptors in animals as well as humans, with similar effects.

Tasty, tasty, tasty
Cannabis notoriously makes people hungry - even cancer patients who have lost all desire to eat. One study showed that cancer patients who thought food smelled and tasted awful suddenly regained the ability to appreciate food odors after ingesting a THC compound. There are CB1 receptors in your hypothalamus, a part of the brain known to regulate appetite, and your body's own cannabinoids usually send the "I'm hungry" message to them. But when you ingest THC, you artificially boost the number of cannabinoids sending that message to your hypothalamus, which is why you get the munchies.

Understanding this process has actually led to a new body of research into safe diet drugs that would block those cannabinoid receptors. That way, your hypothalamus wouldn't receive signals from your body telling it to eat, which would reduce hunger cravings in dieters.

What you're forgetting
What's happening in your brain when smoking pot makes you forget what you're saying in the middle of saying it? According to the book Marijuana and Medicine (National Academies Press):
One of the primary effects of marijuana in humans is disruption of short-term memory. That is consistent with the abundance of CB1 receptors in the hippocampus, the brain region most closely associated with memory. The effects of THC resemble a temporary hippocampal lesion.
That's right - smoking a joint creates the effect of temporary brain damage.

What happens is that THC shuts down a lot of the normal neuroprocessing that goes on in your hippocampus, slowing down the memory process. So memories while stoned are often jumpy, as if parts are missing. That's because parts literally are missing: Basically you are saving a lot less information to your memory. It's not that you've quickly forgotten what's happened. You never remembered it at all.

A bit of the old timey wimey

Cannabis also distorts your sense of time. THC affects your brain's dopamine system, creating a stimulant effect. People who are stoned often report feeling excited, anxious, or energetic as a result. Like other stimulants, this affects people's sense of time: things seem to pass quickly because the brain's clock is sped up. At the same time, as we discussed earlier (if you can remember), the drug slows down your ability to remember things. That's because it interferes with the brain's acetylcholine system, which is part of what helps you store those memories in your hippocampus.

In an article io9 published last year about the neuroscience of time, we noted:
The interesting thing about smoking pot is that marijuana is one of those rare drugs that seems to interact with both the dopamine and the acetylcholine system, speeding up the former and slowing down the latter. That's why when you get stoned, your heart races but your memory sucks.
It's almost as if time is speeding up and slowing down at the same time.

Addiction and medicine
Some experts call cannabis a public health menace that's addictive and destroys lives by robbing people of ambition. Other experts call it a cure for everything from insomnia to glaucoma, and advocate its use as a medicine. The former want it to be illegal; the latter want it prescribed by doctors. Still other groups think it should be treated like other intoxicants such as alcohol and coffee - bad if you become dependent on it, but useful and just plain fun in other situations.

What's the truth? Scientists have shown that cannabis does have medical usefulness, and the more we learn, the more intriguing these discoveries become. Since the early 1980s, medical researchers have published studies showing that cannabis relieves pressure in the eye, easing the symptoms of glaucoma, a disease that causes blindness. THC is also "neuroprotective," meaning in essence that it helps prevent brain damage. Some studies have suggested that cannabis could mitigate the effects of Alzheimer's for this reason.

At the same time, we know that THC interferes with memory, and it's still uncertain what kinds of long-term effects the drug could have on memory functioning. No one has been able to prove definitively that it does or does not erode memory strength over time. Obviously, smoking it could cause lung damage. And, like the legal intoxicant alcohol, cannabis can become addictive.

Should cannabis be illegal, while alcohol flows? Unfortunately that's not the kind of question that science can answer. Let's leave the moral questions to courts, policymakers and shamans. I'll be off to the side, smoking a joint, thinking about my acetylcholine system and the many uses of the hippocampus.

No, you aren't having a drug flashback. This post originally appeared on io9 on April 20, 2011.