
Saturday, April 14, 2012

Bookforum Omnivore - How We View Human Evolution

Bookforum's Omnivore posted a nice collection of links on human evolution and how we assemble the family tree of our ancestors. Enjoy!



Emotional Maturation, Resilience, and the Aging Brain

All four of these videos deal in some way with how we age, mature, and thrive through resilience. Barring unfortunate circumstances, we all age - so the question is How Will We Age? Will we age well, with grace and maturity, or will we lose function, become less resilient, and suffer cognitive decline?

Professor Lorraine Tyler - The Resilient Brain: Cognition and Aging

"The Resilient Brain: Cognition and Aging", this year's British Academy/British Psychological Society Lecture, was given by Professor Lorraine Tyler FBA at the Royal Society, London, on Thursday 22 September 2011 at the Royal Society.

Professor Tyler discussed some research that takes a positive view of changes across the lifespan, and in doing so is starting to overturn existing stereotypes of ageing.





Exploring the Crossroads of Attention and Memory in the Aging Brain: Views from the Inside

Dr. Adam Gazzaley studies the neural mechanisms of memory and attention, how these processes change with normal aging and dementia, and how we might intervene therapeutically to alleviate memory and attention deficits. Series: "UCSF Osher Mini Medical School for the Public" [4/2012]





The Aging but Resilient Brain: Keeping Neurons Happy

Joel Kramer, UCSF Professor of Neuropsychology and Director of the Memory and Aging Center Neuropsychology program, explores the underlying biological mechanisms of cognitive aging and interventions that may optimize cognitive functioning as we age. Series: "UCSF Osher Mini Medical School for the Public" [4/2012]





How Our Emotional Lives Mature: Changes and New Strengths

Robert Levenson, UC Berkeley Department of Psychology, explores the changes in emotion that occur with age. Much of his research focuses on the nature of human emotion, in terms of its physiological manifestations, variations in emotion associated with age, gender, culture, and pathology, and the role emotion plays in interpersonal interactions. Series: "UCSF Osher Mini Medical School for the Public" [4/2012]


Open Culture - Celebrate Samuel Beckett’s Birthday with Waiting For Godot (the Film)


Yesterday was the 106th anniversary of the birth of Samuel Beckett, one of my favorite novelists and certainly my favorite playwright. During my first year after grad school (the first time) I read everything by Beckett that I could get my hands on - for a while I tried to write plays, but I realized I was just trying to write Beckett for the late 20th century.




Celebrate Samuel Beckett’s Birthday with Waiting For Godot (the Film)

Samuel Beckett's pared-down prose and plays are among the greatest achievements of late modernism.

As a young man Beckett moved to Paris, where he befriended another Irish exile, James Joyce. As a writer, Beckett realized early on that he would never match Joyce’s “epic, heroic” achievement. Where Joyce was a synthesizer, Beckett once said, he was an analyzer. “I realized that my own way was impoverishment,” he said, “in lack of knowledge and in taking away, subtracting rather than adding.”

To celebrate Beckett’s birthday we bring you a pair of videos, including an excellent 2001 film version (above) of the most famous of his enigmatic creations, Waiting for Godot. It’s the centerpiece of Beckett on Film, a series of adaptations of all 19 of Beckett’s plays, organized by Michael Colgan, artistic director of the Gate Theatre in Dublin. The film features Barry McGovern as Vladimir, Johnny Murphy as Estragon, Alan Stanford as Pozzo and Stephen Brennan as Lucky. It was directed by Michael Lindsay-Hogg, who describes Waiting for Godot as being “like Mozart–too easy for children, too difficult for adults.” He goes on:
The play is what it is about. Samuel Beckett would have said it’s about two men waiting on the side of the road for someone to turn up. But you can invest in the importance of who is going to turn up. Is it a local farmer? Is it God? Or is it simply someone who doesn’t show up? The important thing is the ambiguity–the fact that it doesn’t really state what it is. That’s why it’s so great for the audience to be part of–they fill in a lot of the blanks. It works in their imaginations.
You can order the 19-film boxed set of Beckett on Film here, and read the full text of Waiting for Godot while listening to a CBC audio recording of the play, read by the Stratford Festival Players, starting here.

For fans of Harold Pinter, there is also a film clip of him talking about his first meeting with Beckett, his mentor and friend.

Friday, April 13, 2012

Marilynne Robinson - "The locus of the human mystery is perception of this world"

Marilynne Robinson intrigues me. She is a religious person (by that I mean she believes in God) who is very fond of science and the sciences. She is a philosopher who is best known as a novelist (her second novel, Gilead, was awarded the Pulitzer Prize, and she teaches at the prestigious University of Iowa MFA program). Her 2011 book, Absence of Mind: The Dispelling of Inwardness from the Modern Myth of the Self (The Terry Lectures Series), was one of the better philosophy books of the year and an eloquent defense of human subjectivity.

This essay, which appeared in February in The Chronicle Review (from The Chronicle of Higher Education), is an excerpt from her book When I Was a Child I Read Books, in which it appears as "Freedom of Thought." The book was published by Farrar, Straus & Giroux in March 2012. It's a long essay, but it's also very thought-provoking . . . . She defends the religious sensibility, not so much religion itself, that sees sacredness in the world around us.

Reclaiming a Sense of the Sacred

A writer contemplates religion, science, art, and the miraculous

Myoung Ho Lee, courtesy Yossi Milo Gallery, New York
Over the years of writing and teaching, I have tried to free myself of constraints I felt, limits to the range of exploration I could make, to the kind of intuition I could credit. I realized gradually that my own religion, and religion in general, could and should disrupt these constraints, which amount to a small and narrow definition of what human beings are and how human life is to be understood. And I have often wished my students would find religious standards present in the culture that would express a real love for human life and encourage them also to break out of these same constraints.

For the educated among us, moldy theories we learned as sophomores, memorized for the test and never consciously thought of again, exert an authority that would embarrass us if we stopped to consider them. I was educated at a center of behaviorist psychology and spent a certain amount of time pestering rats. There was some sort of maze-learning experiment involved in my final grade, and since I remember the rat who was my colleague as uncooperative, or perhaps merely incompetent at being a rat, or tired of the whole thing, I don't remember how I passed. I'm sure coercion was not involved, since this rodent and I avoided contact. Bribery was, of course, central to the experiment and no black mark against either of us, though I must say, mine was an Eliot Ness among rats for its resistance to the lure of, say, Cheerios.

I should probably have tried raising the stakes. The idea was, in any case, that behavior was conditioned by reward or its absence, and that one could extrapolate meaningfully from the straightforward demonstration of rattish self-interest promised in the literature, to the admittedly more complex question of human motivation. I have read subsequently that a female rat is so gratified at having an infant rat come down the reward chute that she will do whatever is demanded of her until she has filled her cage with them. This seems to me to complicate the definition of self-interest considerably, but complexity was not a concern of the behaviorism of my youth, which was reductionist in every sense of the word.

It wasn't all behaviorism. We also pondered Freud's argument that primordial persons, male, internalized the father as superego by actually eating the poor fellow. Since then we have all felt bad—well, the male among us, at least. Whence human complexity, whence civilization. I did better on that exam. The plot was catchy.

The situation of the undergraduate rarely encourages systematic doubt. What Freud thought was important because it was Freud who thought it, and so with B.F. Skinner and whomever else the curriculum held up for our admiration. There must be something to all this, even if it has only opened the door a degree or two on a fuller understanding. So I thought at the time. And I also thought it was a very bleak light that shone through that door, and I shouldered my share of the supposedly inevitable gloom that came with being a modern.

In English class we studied a poem by Robert Frost, "The Oven Bird." The poem asks "what to make of a diminished thing." That diminished thing, said the teacher, was human experience in the modern world. Oh dear. Modern aesthetics. We must learn from this poem "in singing not to sing." To my undergraduate self I thought, "But what if I like to sing?" And then my philosophy professor assigned us Jonathan Edwards's Doctrine of Original Sin Defended, in which Edwards argues for "the arbitrary constitution of the universe," illustrating his point with a gorgeous footnote about moonlight that even then began to dispel the dreary determinisms I was learning elsewhere. Improbable as that may sound to those who have not read the footnote.

At a certain point I decided that everything I took from studying and reading anthropology, psychology, economics, cultural history, and so on did not square at all with my sense of things, and that the tendency of much of it was to posit or assume a human simplicity within a simple reality and to marginalize the sense of the sacred, the beautiful, everything in any way lofty. I do not mean to suggest, and I underline this, that there was any sort of plot against religion, since religion in many instances abetted these tendencies and does still, not least by retreating from the cultivation and celebration of learning and of beauty, by dumbing down, as if people were less than God made them and in need of nothing so much as condescension. Who among us wishes the songs we sing, the sermons we hear, were just a little dumber? People today—television—video games—diminished things. This is always the pretext.

Simultaneously, and in a time of supposed religious revival, and among those especially inclined to feel religiously revived, we have a society increasingly defined by economics, and an economics increasingly reminiscent of my experience with that rat, so-called rational-choice economics, which assumes that we will all find the shortest way to the reward, and that this is basically what we should ask of ourselves and—this is at the center of it all—of one another. After all these years of rational choice, brother rat might like to take a look at the packaging just to see if there might be a little melamine in the inducements he was being offered, hoping, of course, that the vendor considered it rational to provide that kind of information. We do not deal with one another as soul to soul, and the churches are as answerable for this as anyone.

If we think we have done this voiding of content for the sake of other people, those to whom we suspect God may have given a somewhat lesser brilliance than our own, we are presumptuous and also irreverent. William Tyndale, who was burned at the stake for his translation of the Bible, who provided much of the most beautiful language in what is called by us the King James Bible, wrote, he said, in the language a plowboy could understand. He wrote to the comprehension of the profoundly poor, those who would be, and would have lived among, the utterly unlettered. And he created one of the undoubted masterpieces of the English language. Now we seem to feel beauty is an affectation of some sort. And this notion is as influential in the churches as it is anywhere. The Bible, Christianity, should have inoculated us against this kind of disrespect for ourselves and one another. Clearly it has not.

For me, at least, writing consists very largely of exploring intuition. A character is really the sense of a character, embodied, attired, and given voice as he or she seems to require. Where does this creature come from? From watching, I suppose. From reading emotional significance in gestures and inflections, as we all do all the time. These moments of intuitive recognition float free from their particular occasions and recombine themselves into nonexistent people the writer and, if all goes well, the reader feel they know.

There is a great difference, in fiction and in life, between knowing someone and knowing about someone. When a writer knows about his character, he is writing for plot. When he knows his character, he is writing to explore, to feel reality on a set of nerves somehow not quite his own. Words like "sympathy," "empathy," and "compassion" are overworked and overcharged—there is no word for the experience of seeing an embrace at a subway stop or hearing an argument at the next table in a restaurant. Every such instant has its own emotional coloration, which memory retains or heightens, and so the most sidelong, unintended moment becomes a part of what we have seen of the world. Then, I suppose, these moments, as they have seemed to us, constellate themselves into something a little like a spirit, a little like a human presence in its mystery and distinctiveness.

Two questions I can't really answer about fiction are (1) where it comes from, and (2) why we need it. But that we do create it and also crave it is beyond dispute. There is a tendency, considered highly rational, to reason from a narrow set of interests, say survival and procreation, which are supposed to govern our lives, and then to treat everything that does not fit this model as anomalous clutter, extraneous to what we are and probably best done without. But all we really know about what we are is what we do. There is a tendency to fit a tight and awkward carapace of definition over humankind, and to try to trim the living creature to fit the dead shell.

The advice I give my students is the same advice I give myself—forget definition, forget assumption, watch. We inhabit, we are part of, a reality for which explanation is much too poor and small. No physicist would dispute this, though he or she might be less ready than I am to have recourse to the old language and call reality miraculous. By my lights, fiction that does not acknowledge this at least tacitly is not true. Why is it possible to speak of fiction as true or false? I have no idea. But if a time comes when I seem not to be making the distinction with some degree of reliability in my own work, I hope someone will be kind enough to let me know.

When I write fiction, I suppose my attempt is to simulate the integrative work of a mind perceiving and reflecting, drawing upon culture, memory, conscience, belief or assumption, circumstance, fear, and desire—a mind shaping the moment of experience and response and then reshaping them both as narrative, holding one thought against another for the effect of affinity or contrast, evaluating and rationalizing, feeling compassion, taking offense. These things do happen simultaneously, after all. None of them is active by itself, and none of them is determinative, because there is that mysterious thing the cognitive scientists call self-awareness, the human ability to consider and appraise one's own thoughts. I suspect this self-awareness is what people used to call the soul.

Modern discourse is not really comfortable with the word "soul," and in my opinion the loss of the word has been disabling, not only to religion but to literature and political thought and to every humane pursuit. In contemporary religious circles, souls, if they are mentioned at all, tend to be spoken of as saved or lost, having answered some set of divine expectations or failed to answer them, having arrived at some crucial realization or failed to arrive at it. So the soul, the masterpiece of creation, is more or less reduced to a token signifying cosmic acceptance or rejection, having little or nothing to do with that miraculous thing, the felt experience of life, except insofar as life offers distractions or temptations.

Having read recently that there are more neurons in the human brain than there are stars in the Milky Way, and having read any number of times that the human brain is the most complex object known to exist in the universe, and that the mind is not identical with the brain but is more mysterious still, it seems to me this astonishing nexus of the self, so uniquely elegant and capable, merits a name that would indicate a difference in kind from the ontological run of things, and for my purposes "soul" would do nicely.

Perhaps I should pause here to clarify my meaning, since there are those who feel that the spiritual is diminished or denied when it is associated with the physical. I am not among them. In his Letter to the Romans, Paul says, "Ever since the creation of the world [God's] invisible nature, namely, his eternal power and deity, has been clearly perceived in the things that have been made." If we are to consider the heavens, how much more are we to consider the magnificent energies of consciousness that make whomever we pass on the street a far grander marvel than our galaxy? At this point of dynamic convergence, call it self or call it soul, questions of right and wrong are weighed, love is felt, guilt and loss are suffered. And, over time, formation occurs, for weal or woe, governed in large part by that unaccountable capacity for self-awareness.

The locus of the human mystery is perception of this world. From it proceeds every thought, every art. I like Calvin's metaphor—nature is a shining garment in which God is revealed and concealed. As we perceive we interpret, and we make hypotheses. Something is happening, it has a certain character or meaning which we usually feel we understand at least tentatively, though experience is almost always available to reinterpretations based on subsequent experience or reflection. Here occurs the weighing of moral and ethical choice. Behavior proceeds from all this, and is interesting, to my mind, in the degree that it can be understood to proceed from it.

We are much afflicted now by tedious, fruitless controversy. Very often, perhaps typically, the most important aspect of a controversy is not the area of disagreement but the hardening of agreement, the tacit granting on all sides of assumptions that ought not to be granted on any side. The treatment of the physical as a distinct category antithetical to the spiritual is one example. There is a deeply rooted notion that the material exists in opposition to the spiritual, precludes or repels or trumps the sacred as an idea. This dichotomy goes back at least to the dualism of the Manichees, who believed the physical world was the creation of an evil god in perpetual conflict with a good god, and to related teachings within Christianity that encouraged mortification of the flesh, renunciation of the world, and so on.

For almost as long as there has been science in the West, there has been a significant strain in scientific thought which assumed that the physical and material preclude the spiritual. The assumption persists among us still, vigorous as ever, that if a thing can be "explained," associated with a physical process, it has been excluded from the category of the spiritual. But the "physical" in this sense is only a disappearingly thin slice of being, selected, for our purposes, out of the totality of being by the fact that we perceive it as solid, substantial. We all know that if we were the size of atoms, chairs and tables would appear to us as loose clouds of energy. It seems to me very amazing that the arbitrarily selected "physical" world we inhabit is coherent and lawful. An older vocabulary would offer the word "miraculous." Knowing what we know now, an earlier generation might see divine providence in the fact of a world coherent enough to be experienced by us as complete in itself, and as a basis upon which all claims to reality can be tested. A truly theological age would see in this divine providence intent on making a human habitation within the wild roar of the cosmos.

But almost everyone, for generations now, has insisted on a sharp distinction between the physical and the spiritual. So we have had theologies that really proposed a "God of the gaps," as if God were not manifest in the creation, as the Bible is so inclined to insist, but instead survives in those dark places, those black boxes, where the light of science has not yet shone. And we have atheisms and agnosticisms that make precisely the same argument, only assuming that at some time the light of science will indeed dispel the last shadow in which the holy might have been thought to linger.

Religious experience is said to be associated with activity in a particular part of the brain. For some reason this is supposed to imply that it is delusional. But all thought and experience can be located in some part of the brain, that brain more replete than the starry heaven God showed to Abraham, and we are not in the habit of assuming that it is all delusional on these grounds. Nothing could justify this reasoning, which many religious people take as seriously as any atheist could do, except the idea that the physical and the spiritual cannot abide together, that they cannot be one dispensation.

We live in a time when many religious people feel fiercely threatened by science. O ye of little faith. Let them subscribe to Scientific American for a year and then tell me if their sense of the grandeur of God is not greatly enlarged by what they have learned from it. Of course many of the articles reflect the assumption at the root of many problems, that an account, however tentative, of some structure of the cosmos or some transaction of the nervous system successfully claims that part of reality for secularism. Those who encourage a fear of science are actually saying the same thing. If the old, untenable dualism is put aside, we are instructed in the endless brilliance of creation. Surely to do this is a privilege of modern life for which we should all be grateful.

For years I have been interested in ancient literature and religion. If they are not one and the same, certainly neither is imaginable without the other. Indeed, literature and religion seem to have come into being together, if by literature I can be understood to include pre-literature, narrative whose purpose is to put human life, causality, and meaning in relation, to make each of them in some degree intelligible in terms of the other two. I was taught, more or less, that we moderns had discovered other religions with narratives resembling our own, and that this discovery had brought all religion down to the level of anthropology. Sky gods and earth gods presiding over survival and procreation. Humankind pushing a lever in the hope of a periodic reward in the form of rain or victory in the next tribal skirmish. From a very simple understanding of what religion has been, we can extrapolate to what religion is now and is intrinsically, so the theory goes. This pattern, of proceeding from presumed simplicity to a degree of elaboration that never loses the primary character of simplicity, is strongly recurrent in modern thought.

I think much religious thought has also been intimidated by this supposed discovery, which is odd, since it certainly was not news to Paul, or Augustine, or Thomas Aquinas, or Calvin. All of them quote the pagans with admiration. Perhaps only in Europe was one form of religion ever so dominant that the fact of other forms could constitute any sort of problem. There has been an influential modern tendency to make a sort of slurry of religious narratives, asserting the discovery of universals that don't actually exist among them. Mircea Eliade is a prominent example. And there is Joseph Campbell. My primary criticism of this kind of scholarship is that it does not bear scrutiny. A secondary criticism I would offer is that it erases all evidence that religion has, anywhere and in any form, expressed or stimulated thought. In any case, the anthropological bias among these writers, which may make it seem free of all parochialism, is in fact absolutely Western, since it regards all religion as human beings acting out their nature and no more than that, though I admit there is a gauziness about this worldview to which I will not attempt to do justice here.

This is the anthropologists' answer to the question, why are people almost always, almost everywhere, religious. Another answer, favored by those who claim to be defenders of science, is that religion formed around the desire to explain what prescientific humankind could not account for. Again, this notion does not bear scrutiny. The literatures of antiquity are clearly about other business.

Some of these narratives are so ancient that they clearly existed before writing, though no doubt in the forms we have them they were modified in being written down. Their importance in the development of human culture cannot be overstated. In antiquity people lived in complex city-states, carried out the work and planning required by primitive agriculture, built ships and navigated at great distances, traded, made law, waged war, and kept the records of their dynasties. But the one thing that seems to have predominated, to have laid out their cities and filled them with temples and monuments, to have established their identities and their cultural boundaries, to have governed their calendars and enthroned their kings, were the vivid, atemporal stories they told themselves about the gods, the gods in relation to humankind, to their city, to themselves.

I suppose it was in the 18th century of our era that the notion became solidly fixed in the Western mind that all this narrative was an attempt at explaining what science would one day explain truly and finally. Phoebus drives his chariot across the sky, and so the sun rises and sets. Marduk slays the sea monster Tiamat, who weeps, whence the Tigris and the Euphrates. It is true that in some cases physical reality is accounted for, or at least described, in the terms of these myths. But the beauty of the myths is not accounted for by this theory, nor is the fact that, in literary forms, they had a hold on the imaginations of the populations that embraced them which expressed itself again as beauty. Over time these narratives had at least as profound an effect on architecture and the visual arts as they did on literature. Anecdotes from them were painted and sculpted everywhere, even on household goods, vases, and drinking cups.

This kind of imaginative engagement bears no resemblance whatever to an assimilation of explanatory models by these civilizations. Perhaps the tendency to think of classical religion as an effort at explaining a world otherwise incomprehensible to them encourages us to forget how sophisticated ancient people really were. They were inevitably as immersed in the realm of the practical as we are. It is strangely easy to forget that they were capable of complex engineering, though so many of their monuments still stand. The Babylonians used quadratic equations.

Yet in many instances ancient people seem to have obscured highly available real-world accounts of things. A sculptor would take an oath that the gods had made an idol, after he himself had made it. The gods were credited with walls and ziggurats, when cities themselves built them. Structures of enormous shaped stones went up in broad daylight in ancient cities, the walls built around the Temple by Herod in Roman-occupied Jerusalem being one example. The ancients knew, though we don't know, how this was done, obviously. But they left no account of it. This very remarkable evasion of the law of gravity was seemingly not of great interest to them. It was the gods themselves who walled in Troy.

In Virgil's Aeneid, in which the poet in effect interprets the ancient Greek epic tradition by attempting to renew it in the Latin language and for Roman purposes, there is one especially famous moment. The hero, Aeneas, a Trojan who has escaped the destruction of his city, sees a painting in Carthage of the war at Troy and is deeply moved by it and by what it evokes, the lacrimae rerum, the tears in things. This moment certainly refers to the place in classical civilization of art that pondered and interpreted the Homeric narratives, which were the basis of Greek and Roman religion. My point here is simply that pagan myth, which the Bible in various ways acknowledges as analogous to biblical narrative despite grave defects, is not a naïve attempt at science.

It is true that almost a millennium separated Homer and Virgil. It is also true that through those centuries the classical civilizations had explored and interpreted their myths continuously. Aeschylus, Sophocles, and Euripides would surely have agreed with Virgil's Aeneas that the epics and the stories that surround them and flow from them are indeed about lacrimae rerum, about a great sadness that pervades human life. The Babylonian Epic of Gilgamesh is about the inevitability of death and loss. This is not the kind of language, nor is it the kind of preoccupation, one would find in a tradition of narrative that had any significant interest in explaining how the leopard got his spots.

The notion that religion is intrinsically a crude explanatory strategy that should be dispelled and supplanted by science is based on a highly selective or tendentious reading of the literatures of religion. In some cases it is certainly fair to conclude that it is based on no reading of them at all. Be that as it may, the effect of this idea, which is very broadly assumed to be true, is again to reinforce the notion that science and religion are struggling for possession of a single piece of turf, and science holds the high ground and gets to choose the weapons.

In fact there is no moment in which, no perspective from which, science as science can regard human life and say that there is a beautiful, terrible mystery in it all, a great pathos. Art, music, and religion tell us that. And what they tell us is true, not after the fashion of a magisterium that is legitimate only so long as it does not overlap the autonomous republic of science. It is true because it takes account of the universal variable, human nature, which shapes everything it touches, science as surely and profoundly as anything else. And it is true in the tentative, suggestive, ambivalent, self-contradictory style of the testimony of a hundred thousand witnesses, who might, taken all together, agree on no more than the shared sense that something of great moment has happened, is happening, will happen, here and among us.

I hasten to add that science is a great contributor to what is beautiful and also terrible in human existence. For example, I am deeply grateful to have lived in the era of cosmic exploration. I am thrilled by those photographs of deep space, as many of us are. Still, if it is true, as they are saying now, that bacteria return from space a great deal more virulent than they were when they entered it, it is not difficult to imagine that some regrettable consequence might follow our sending people to tinker around up there. One article noted that a human being is full of bacteria, and there is nothing to be done about it.

Science might note with great care and precision how a new pathology emerged through this wholly unforeseen impact of space on our biosphere, but it could not, scientifically, absorb the fact of it and the origin of it into any larger frame of meaning. Scientists might mention the law of unintended consequences—mention it softly, because that would sound a little flippant in the circumstances. But religion would recognize in it what religion has always known, that there is a mystery in human nature and in human assertions of brilliance and intention, a recoil the Greeks would have called irony and attributed to some angry whim of the gods, to be interpreted as a rebuke of human pride if it could be interpreted at all. Christian theology has spoken of human limitation, fallen-ness, an individually and collectively disastrous bias toward error. I think we all know that the earth might be reaching the end of its tolerance for our presumptions. We all know we might at any time feel the force of unintended consequences, many times compounded. Science has no language to account for the fact that it may well overwhelm itself, and more and more stand helpless before its own effects.

Of course science must not be judged by the claims certain of its proponents have made for it. It is not in fact a standard of reasonableness or truth or objectivity. It is human, and has always been one strategy among others in the more general project of human self-awareness and self-assertion. Our problem with ourselves, which is much larger and vastly older than science, has by no means gone into abeyance since we learned to make penicillin or to split the atom. If antibiotics have been used without sufficient care and have pushed the evolution of bacteria beyond the reach of their own effectiveness, if nuclear fission has become a threat to us all in the insidious form of a disgruntled stranger with a suitcase, a rebuke to every illusion of safety we entertained under fine names like Strategic Defense Initiative, old Homer might say, "the will of Zeus was moving toward its end." Shakespeare might say, "There is a divinity that shapes our ends, rough-hew them how we will."

The tendency of the schools of thought that have claimed to be most impressed by science has been to deny the legitimacy of the kind of statement it cannot make, the kind of exploration it cannot make. And yet science itself has been profoundly shaped by that larger bias toward irony, toward error, which has been the subject of religious thought since the emergence of the stories in Genesis that tell us we were given a lavishly beautiful world and are somehow, by our nature, complicit in its decline, its ruin. Science cannot think analogically, though this kind of thinking is very useful for making sense and meaning out of the tumult of human affairs.

We have given ourselves many lessons in the perils of being half right, yet I doubt we have learned a thing. Sophocles could tell us about this, or the book of Job. We all know about hubris. We know that pride goeth before a fall. The problem is that we don't recognize pride or hubris in ourselves, any more than Oedipus did, any more than Job's so-called comforters. It can be so innocuous-seeming a thing as confidence that one is right, is competent, is clear-sighted, or confidence that one is pious or pure in one's motives.

As the disciples said, "Who then can be saved?" Jesus replied, "With men this is impossible, but with God all things are possible," in this case speaking of the salvation of the pious rich. It is his consistent teaching that the comfortable, the confident, the pious stand in special need of the intervention of grace. Perhaps this is true because they are most vulnerable to error—like the young rich man who makes the astonishing decision to turn his back on Jesus's invitation to follow him, therefore on the salvation he sought—although there is another turn in the story, and we learn that Jesus will not condemn him. I suspect Jesus should be thought of as smiling at the irony of the young man's self-defeat—from which, since he is Jesus, he is also ready to rescue him ultimately.

The Christian narrative tells us that we individually and we as a world turn our backs on what is true, essential, wholly to be desired. And it tells us that we can both know this about ourselves and forgive it in ourselves and one another, within the limits of our mortal capacities. To recognize our bias toward error should teach us modesty and reflection, and to forgive it should help us avoid the inhumanity of thinking we ourselves are not as fallible as those who, in any instance, seem most at fault. Science can give us knowledge, but it cannot give us wisdom. Nor can religion, until it puts aside nonsense and distraction and becomes itself again.

~ Marilynne Robinson is a professor of creative writing at the University of Iowa. This essay is an excerpt from her book When I Was a Child I Read Books, in which it appears as "Freedom of Thought." The book will be published by Farrar, Straus & Giroux in March 2012. © 2012 Marilynne Robinson.

TED Blog - Reason vs. Compassion: Rebecca Newberger Goldstein and Steven Pinker at TED2012

Interesting discussion between Steven Pinker (linguist/psychologist) and Rebecca Newberger Goldstein (philosopher/novelist), who are husband and wife, that also includes Stewart Brand, Seth Godin, Chris Anderson, and Ken Robinson. Other than a TED event, there are not many places where a conversation like this might happen (Edge being the obvious exception).

Enjoy.

Reason vs. Compassion: Rebecca Newberger Goldstein and Steven Pinker at TED2012

Photo: James Duncan Davidson

Steven Pinker is a linguist and psychologist. Rebecca Newberger Goldstein is a philosopher and novelist. The pair are also married, and they have taken to the TED stage, in front of a dinner table with several luminaries, to have a very public argument, or, in their rather more academic terms, a Socratic dialogue. An edited version of the conversation follows.

Rebecca Newberger Goldstein: Reason appears to have fallen on hard times. Popular culture plumbs new depths of dumb; political discourse has become a race to the bottom. We live in an era of scientific creationism, 9/11 conspiracy theories, psychic hotlines and an insurgence of religious fundamentalism. People who think too well are often accused of elitism. Even in the academy there are attacks on logocentrism, the crime of letting logic dominate our thinking.

Steven Pinker: Is this necessarily a bad thing? Maybe reason is overrated? Maybe it’s dominated by overeducated policy wonks, like the best and brightest who dragged us into the quagmire of Vietnam. They threatened our way of living with weapons of mass destruction. Perhaps compassion and conscience, not cold-hearted calculation, will save us. My fellow psychologists have shown we live by our bodies and emotions; we use a teeny power of reason to rationalize our feelings after the fact. Wasn’t it no less a thinker than your fellow philosopher, David Hume, who famously wrote that reason is, and ought only to be, the slave of the passions? Perhaps, if irrationality is inevitable, we should lie back and enjoy it?

RNG: Alas, poor Hume. He implied no such thing. How could a reasoned argument logically entail the ineffectiveness of reasoned arguments? You’re trying to persuade us of reason’s impotence. You’re not threatening us or bribing us, or asking us to resolve the issue with a beauty contest or a show of hands. By the act of trying to reason us into your position, you’re conceding reason’s potency. Reason isn’t up for grabs here. It can’t be.

SP: But can reason lead us in directions that are good or decent or moral? You pointed out that reason is a means to an end, and the end depends on the reasoner’s passions. Can reason lead to peace and harmony if the reasoner wants peace and harmony, just as it can lay out a roadmap to conflict and strife? Can reason force a reasoner to want less cruelty and waste?

RNG: On its own, the answer is no. But it doesn’t take much to switch it to yes. You need two conditions: firstly, that reasoners all care about their own well-being, and secondly, that we are members of a community of reasoners who can exchange messages and comprehend one another’s reasoning. That is certainly true of our gregarious and loquacious species. Combine self-interest and sociality with reason and you arrive at a morality that requires you to take others’ interests into account. Suppose I say “please get off my foot” or “don’t stab me with a steak knife because you are curious to see how I’ll react.” My appeal to you can’t privilege my well-being over yours if I want you to take me seriously. I can’t say my well-being matters because I’m me and you’re not, any more than I can persuade you that the spot I stand on is always somehow special because wherever I stand I get to say “I’m here and everyone else is there.” You’d be very quick to point out the inconsistency, not to speak of the lunacy, of my position. You could counter with the same argument, only substituting yourself for me. There is complete parity: logical and moral.

SP: That sounds good in theory, but it hasn’t worked that way in practice. In particular, consider a momentous historical development: we seem to be getting more humane. Centuries ago, our ancestors burnt cats alive, and knights waged war on each other by trying to kill as many peasants as possible. Governments killed people for frivolous reasons, like stealing a cabbage. Executions were designed to be as prolonged and painful as possible: crucifixion, disembowelment, breaking on the wheel. Respectable people kept slaves. For all of our flaws, we have abandoned these practices.

RNG: So human nature has changed?

SP: Not exactly. We still harbor instincts that can erupt in violence, like greed, tribalism, and sadism, but we also have instincts that steer us away from it: empathy, fairness, what Abraham Lincoln referred to as the “better angels of our nature.” Our circle of empathy has expanded. Long ago, we empathized only with blood relations and a small circle of allies. With the expansion of literacy and travel, that circle has grown to include race, nation, perhaps eventually all humanity.

RNG: Can hard-headed scientists really give so much credit to soft-hearted empathy?

SP: They can and they do. Empathy emerges early in life, perhaps before the age of one. And books on empathy have become best sellers, like The Age of Empathy.

RNG: I’m all for empathy. Who isn’t? But on its own, it’s a feeble instrument for making moral progress. For one thing, it’s innately biased toward blood relations, toward babies and warm, fuzzy animals. Outsiders can go to hell. Even our best efforts to remain connected with others fall miserably short, a sad truth of human nature. Take Adam Smith, who wrote: “if he was to lose his little finger tomorrow, he would not sleep tonight; but provided he never saw them, he would snore with the most profound security over the ruin of a hundred million of his brethren.”

SP: So if empathy wasn’t enough to make us more humane, what else was there?

RNG: One of our most effective better angels might be reason. Reason has muscle. Reason provides the push to widen that circle of empathy. Every one of the humanitarian developments you mention originated with thinkers who gave reasons for why some practice was indefensible. They demonstrated that the way people treated some particular group of others was logically inconsistent with the way they insisted on being treated themselves.

SP: Are you saying reason can actually change people’s minds? Don’t people stick with the convictions that serve their interests or conform to the cultures they grew up in?

RNG: Here’s a fascinating fact: contradictions bother us, at least when we are forced to confront them, which is just another way of saying we are susceptible to reason. Look at the history of moral progress, and you can trace a direct pathway from reasoned arguments to changes in the way we actually feel. Time and again, someone would lay out an argument as to why some practice was indefensible, irrational, inconsistent with values already held. The essay would go viral, be translated into many languages, get debated at pubs, coffee houses, salons and dinner parties, and influence leaders, legislators, and popular opinion. Eventually the conclusions would get absorbed into the common sense of decency. Few of us today feel the need to put forth a rigorous philosophical argument about why slavery is wrong, or public hangings, or beating children. By now these things simply feel wrong. But precisely those arguments had to be made, and they were, in centuries past.

SP: Are you saying that people needed a step by step argument to know why there was something a wee bit wrong with burning heretics at the stake?

RNG: Oh, they did. The Frenchman Sebastian Castellio wrote precisely on this topic.

SP: And of cruel and unusual punishment like breaking people on the wheel?

RNG: Look at a pamphlet circulated in 1764 by the Italian jurist, Cesare Beccaria: “That a punishment may produce the effect required, it is sufficient that the evil it occasions should exceed the good expected from the crime.”

SP: But surely anti-war movements depended on demonstrations and catchy tunes by folk artists and wrenching photographs of the human costs of war?

RNG: No doubt, but modern anti-war movements reach back to a long chain of thinkers who had argued as to why we ought to mobilize our emotions against war.

SP: But everyone knows the abolition of slavery depended on faith and emotion, and was driven by Quakers. It only became popular when Harriet Beecher Stowe’s novel Uncle Tom’s Cabin became a best-seller.

RNG: Yes, but the ball got rolling a century before. John Locke bucked the tide of millennia that had regarded slavery as perfectly natural. He argued that it was inconsistent with principles of rational government.

SP: Sounds familiar. Where have I heard this before? Oh yes, Mary Astell extended this argument to the plight of women and the family.

RNG: The logic is the same. Once that’s hammered home, it becomes increasingly uncomfortable to ignore the inconsistency. Look at the 1960s, with civil rights, gay rights, children’s rights, even animal rights. But fully two centuries before, the Enlightenment thinker Jeremy Bentham had exposed the indefensibility of customary practices such as cruelty to animals and the persecution of homosexuals.

SP: Still, in every case it took at least a century for the arguments of these great thinkers to trickle down to the population as a whole. Could there be practices we take for granted where the arguments against them are there for all to see, yet we nonetheless persist in them?

RNG: You mean, will our great-grandchildren be as appalled by some of our practices as we are by our slave-owning, heretic-burning, gay-bashing ancestors?

SP: I’m sure everyone here could think of an example. The imprisonment of nonviolent drug offenders, or the tolerance of rape in prisons? The possession of nuclear weapons?

RNG: The appeal to religion to justify the unjustifiable such as the ban on contraception?

SP: What about religious faith in general? So, I have been convinced that reason is the better angel that deserves the greatest credit for the moral progress our species has enjoyed, and that holds out the greatest hope for the future.

RNG: And if there’s a flaw in our argument, you’ll be depending on reason to point it out.

That ended the talk, but the discussion then moved back to the dinner table. Chris Anderson invited the participants to ask questions.

Stewart Brand: Now what? This removes the tragic theory of history from the stage of history. That is an astounding thing to do.

SP: To the extent that history is driven by ideas, the ideas are changing. The debate over gay marriage is raging, but the debate used to be over whether homosexuality should be legal at all. We often forget the progress that’s been made.

Stewart Brand: Is that progress irreversible?

SP: No, there can be unexpected surprises, but that is the drift. Even though some parts of the world are behind the curve, they catch up. My favorite example is the abolition of slavery.

Seth Godin: Mass movements involve lots and lots of people. But reason isn’t interpreted in the same way by all people at all times. A lot of what goes into a mass movement doesn’t have much to do with the idea at the center, but with the perception of what our neighbors think, and things like that. Is reason really the driving force?

RNG: I have a great deal of confidence in people’s reason. When those arguments are out there, they penetrate. It takes a long time. It’s fascinating what’s happening with animals. You can see the ideas moving; people are becoming more accepting of vegetarians and vegans.

Seth Godin: But none of that has to do with reason.

Chris Anderson: My impression was that the argument is that reason is the slow burn that wins the day. It’s not the only part of the story. Is there another question?

Ken Robinson: Slavery is illegal now, but there are more people in slavery than any time in history. (The audience applauds, in recognition that it’s an important topic to discuss.)

SP: Percentagewise it’s at an all-time low.

Ken Robinson: But if you’re one of them. Now, there wasn’t much talk about religion, which is often seen as a place where emotion and compassion come together. There was a debate recently between Richard Dawkins and the Archbishop of Canterbury (so not very enlightening), and the Archbishop at the end admitted that some other animals may have souls. The headline read: “Monkeys may have souls, says primate.”

Chris Anderson: And that’s why you invite Sir Ken to dinner.

Wellcome Trust - Metacognition - I know (or don't know) that I know

This is a great article from the Wellcome Trust on Steve Fleming's 2010 paper, "Relating Introspective Accuracy to Individual Differences in Brain Structure," or how differences in brain structure affect how we think about thinking. The paper was originally published in Science (17 September 2010), Vol. 329, No. 5998, pp. 1541-1543. DOI: 10.1126/science.1191883

Dr. Fleming has made the paper available as a PDF through his website, along with a wealth of other articles.

External links

Feature: Metacognition - I know (or don't know) that I know

27 February 2012. By Penny Bailey
Cortical surface of the brain

At New York University, Sir Henry Wellcome Postdoctoral Fellow Dr Steve Fleming is exploring the neural basis of metacognition: how we think about thinking, and how we assess the accuracy of our decisions, judgements and other aspects of our mental performance.

Metacognition is an important-sounding word for a very everyday process. We 'metacognise' whenever we reflect upon our thinking process and knowledge.

It's something we do on a moment-to-moment basis, according to Dr Steve Fleming at New York University. "We reflect on our thoughts, feelings, judgements and decisions, assessing their accuracy and validity all day long," he says.

This kind of introspection is crucial for making good decisions. Do I really want that bar of chocolate? Do I want to go out tonight? Will I enjoy myself? Am I aiming at the right target? Is my aim accurate? Will I hit it? How sure am I that I'm right? Is that really the correct answer?

If we don't ask ourselves these questions as a kind of faint, ongoing, almost intuitive commentary in the back of our minds, we're not going to progress very smoothly through life.

As it turns out, although we all do it, we're not all equally good at it. An example Steve likes to use is the gameshow 'Who Wants to be a Millionaire?' When asked the killer question, 'Is that your final answer?', contestants with good metacognitive skills will assess how confident they are in their knowledge.

If sure (I know that I know), they'll answer 'yes'. If unsure (I don't know for sure that I know), they'll phone a friend or ask the audience. Contestants who are less metacognitively gifted may have too much confidence in their knowledge and give the wrong answer - or have too little confidence and waste their lifelines.

Metacognition is also fundamental to our sense of self: to knowing who we are. Perhaps we only really know anyone when we understand how, as well as what, they think - and the same applies to knowing ourselves. How reliable are our thought processes? Are they an accurate reflection of reality? How accurate is our knowledge of a particular subject?

Last year, Steve won a prestigious Sir Henry Wellcome Postdoctoral Fellowship to explore the neural basis of metacognitive behaviour: what happens in the brain when we think about our thoughts and decisions or assess how well we know something?

Killer questions

One of the challenges for neuroscientists interested in metacognition has been the fact that - unlike in learning or decision making, where we can measure how much a person improves at a task or how accurate their decision is - there are no outward indicators of introspective thought, so it's hard to quantify.

As part of his PhD at University College London, Steve joined a research team led by Wellcome Trust Senior Fellow Professor Geraint Rees and helped devise an experiment that could provide an objective measure of both a person's performance on a task and how accurately they judged their own performance.

Thirty-two volunteers were asked to look at a series of pairs of very similar black and grey pictures on a screen and say which one contained a brighter patch.

"We adjusted the brightness or contrast of the patches so that everyone was performing at a similar level," says Steve. "And we made it difficult to see which patch was brighter, so no one was entirely sure about whether their answer was correct; they were all in a similar zone of uncertainty."

They then asked the 'killer' metacognitive question: How sure are you of your answer, on a scale from one to six?

Comparing people's answers to their actual performance revealed that although all the volunteers performed equally well on the primary task of identifying the brighter patches, there was a lot of variation between individuals in terms of how accurately they assessed their own performance - or how well they knew their own minds.
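The comparison described here - how well confidence ratings track actual correctness - is easy to make concrete. The sketch below is a minimal illustration with made-up numbers, not the study's actual analysis: it scores metacognitive accuracy as a type-2 AUROC, the probability that a randomly chosen correct trial received a higher confidence rating (on the 1-6 scale) than a randomly chosen incorrect trial.

```python
# Minimal sketch (hypothetical data, not the study's analysis): estimate
# metacognitive accuracy as a type-2 AUROC - how well 1-6 confidence ratings
# discriminate a participant's own correct from incorrect trials.
import numpy as np

def type2_auroc(correct, confidence):
    """Probability that a randomly chosen correct trial got a higher
    confidence rating than a randomly chosen incorrect trial (ties count 0.5)."""
    correct = np.asarray(correct, dtype=bool)
    confidence = np.asarray(confidence, dtype=float)
    hits = confidence[correct]      # confidence on correct trials
    misses = confidence[~correct]   # confidence on incorrect trials
    if len(hits) == 0 or len(misses) == 0:
        return np.nan               # undefined without both outcomes
    greater = (hits[:, None] > misses[None, :]).mean()
    ties = (hits[:, None] == misses[None, :]).mean()
    return greater + 0.5 * ties

# Hypothetical participant: 200 trials at roughly 70% correct, ratings 1-6
# loosely tracking correctness.
rng = np.random.default_rng(0)
correct = rng.random(200) < 0.7
confidence = np.clip(np.round(3.5 + 1.0 * correct + rng.normal(0, 1.2, 200)), 1, 6)
print(f"type-2 AUROC: {type2_auroc(correct, confidence):.2f}")
```

In a measure like this, a score near 0.5 means confidence carries no information about one's own accuracy, while scores approaching 1.0 indicate sharp metacognitive insight, even when everyone's first-order performance has been equated.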

Magnetic resonance imaging (MRI) scans of the volunteers' brains further revealed that those who most accurately assessed their own performance had more grey matter (the tissue containing the cell bodies of our neurons) in a part of the brain located at the very front, called the anterior prefrontal cortex. In addition, a white-matter tract (a pathway enabling brain regions to communicate) connected to the prefrontal cortex showed greater integrity in individuals with better metacognitive accuracy.

The findings, published in 'Science' in September 2010, linked the complex high-level process of metacognition to a small part of the brain. The study was the first to show that physical brain differences between people are linked to their level of self-awareness or metacognition.

Intriguingly, the anterior prefrontal cortex is also one of the few parts of the brain with anatomical properties that are unique to humans and fundamentally different from our closest relatives, the great apes. It seems introspection might be unique to humans.

"At this stage, we don't know whether this area develops as we get better at reflecting on our thoughts, or whether people are better at introspection if their prefrontal cortex is more developed in the first place," says Steve.

I believe I do

Although this research and research from other labs points to candidate brain regions or networks for metacognition located in the prefrontal cortex, it doesn't explain why they are involved. Steve plans to use his fellowship to address that question by investigating the neural mechanisms that generate metacognitive reports.

He's approaching the question by attempting to separate out the different kinds of information (or variables) people use to monitor their mental and physical performance.

He cites playing a tennis shot as an example. "If I ask you whether you just played a good tennis shot, you can introspect both about whether you aimed correctly and about how well you carried out your shot. These two variables might go together to make up your overall confidence in the shot."

To evaluate how confident we are in each variable (aim and shot) we need to weigh up different sets of perceptual information. To assess our aim, we would consider the speed and direction of the ball and the position of our opponent across the net. To judge how well we carried out the actual shot, we would think about the position of our feet and hips, how we pivoted, and how we swung and followed through.

There may well have been some discrepancy between the shot we wanted to achieve and the shot we actually made. This is a crucial distinction for scientists exploring decision making. "Psychologists tend to think of beliefs, 'what I should do', as being separate from actions," explains Steve.

"When you're choosing between two chocolate bars, you might decide on a Mars bar - that's what you believe you should have, what you want and value. But when you actually carry out the action of reaching for a bar, you might end up reaching for a Twix instead. There's sometimes a difference there between what you should do and what you actually end up doing, and that's perhaps a crucial distinction for metacognition. My initial experiments are going to try to tease apart these variables."

Research into decision making has identified specific brain regions where beliefs about one choice option (one chocolate bar, or one tennis shot) being preferable to another are encoded. However, says Steve, "what we don't know is how this type of information [about values and beliefs] relates to metacognition about your decision making. How does the brain give humans the ability to reflect on its computations?"

He aims to connect the finely detailed picture of decision making given to us by neuroscience to the very vague picture we have of self-reflection or metacognition.

New York, New York

Steve is working with researchers at New York University who are leaders in the field of task design and building models of decision making, "trying to implement in a laboratory setting exactly the kind of question we might ask the tennis player".

They are designing a perceptual task, in which people will have to choose a target to hit based on whether a patch of dots is moving to the left or right. In other words, people need to decide which target they should hit (based on their belief about its direction of motion), and then they have to hit it accurately (action).

"We can use a variety of techniques to manipulate the difficulty of the task. If we make the target very small, people are obviously going to be more uncertain about whether they're going to be able to hit it. So we can separately manipulate the difficulty of deciding what you should do, and the difficulty of actually doing it."

Once the task is up and running, they will then ask the volunteers to make confidence judgements - or even bets - about various aspects of their performance: how likely they thought it was that they chose the right target, or hit it correctly. Comparing their answers with their actual performance will give an objective measure of the accuracy of their beliefs (metacognition) about their performance.
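
A back-of-the-envelope sketch of how such a design separates the two sources of uncertainty (all numbers and names are my own illustrative assumptions, not the actual NYU task): motion coherence controls how hard the decision is, target size controls how hard the action is, and crossing the two gives four conditions whose choice accuracy and hit rates can later be compared with the volunteers' confidence reports.

import random

def run_trial(coherence, target_size):
    """One trial: decide which way the dots move (belief), then hit the chosen target (action)."""
    # Decision stage: higher motion coherence makes the left/right judgement easier.
    chose_correct_target = random.random() < 0.5 + coherence / 2
    # Action stage: a bigger target is easier to hit, regardless of the decision.
    hit_target = random.random() < min(1.0, 0.4 + target_size)
    return chose_correct_target, hit_target

# Cross decision difficulty with action difficulty: four conditions.
for coherence in (0.1, 0.8):                  # hard vs easy decision
    for target_size in (0.1, 0.5):            # hard vs easy action
        results = [run_trial(coherence, target_size) for _ in range(1000)]
        choice_acc = sum(c for c, _ in results) / len(results)
        hit_rate = sum(h for _, h in results) / len(results)
        print(f"coherence={coherence}, target_size={target_size}: "
              f"choice accuracy={choice_acc:.2f}, hit rate={hit_rate:.2f}")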

Drilling down

Such a task will mean Steve and his colleagues can start to decouple the perceptual information that gives people information about what they should do (which target to hit) from the perceptual information that enables them to assess the difficulty of actually carrying out the action (hitting the target).

And that in turn will make it possible to start uncoupling various aspects of metacognition - about beliefs, and about actions or responses - from one another. "I want to drill down into the basics, the variables that come together to make up metacognition, and ask the question: how fine-grained is introspection?"

He'll then use a variety of neuroscience techniques, including brain scanning and intervention techniques such as transcranial magnetic stimulation (to briefly switch off metacognitive activity in the brain), to understand how different brain regions encode information relevant for metacognition. "Armed with our new task, we can ask questions such as: is belief- and action-related information encoded separately in the brain? Is the prefrontal cortex integrating metacognitive information? How does this integration occur? Answers to these questions will allow us to start understanding how the system works."

Since metacognition is so fundamental to making successful decisions - and to knowing ourselves - it's clearly important to understand more about it. Steve's research may also have practical uses in the clinic. Metacognition is linked to the concept of 'insight', which in psychiatry refers to whether someone is aware of having a particular disorder. As many as 50 per cent of patients with schizophrenia have profoundly impaired insight and, unsurprisingly, this is a good indicator of whether they will fail to take their medication.

"If we have a nice task to study metacognition in healthy individuals that can quantify the different components of awareness of beliefs, and awareness of responses and actions, we hope to translate that task into patient populations to understand the deficits of metacognition they might have." With that in mind, Steve plans to collaborate with researchers at the University of Oxford and the Institute of Psychiatry in London when he returns to finish his fellowship in the UK.

A science of metacognition also has implications for concepts of responsibility and self-control. Our society currently places great weight on self-awareness: think of a time when you excused your behaviour with 'I just wasn't thinking'. Understanding the boundaries of self-reflection, therefore, is central to how we ascribe blame and punishment, how we approach psychiatric disorders, and how we view human nature.

Image: An inflated cortical surface of the human brain reconstructed from MRI scans and viewed from the front. Areas of the prefrontal cortex where increased grey matter volume correlated with greater metacognitive ability are shown in hot colours. Credit: Dr Steve Fleming.

Reference

Fleming SM, Weil RS, Nagy Z, Dolan RJ, Rees G. Relating introspective accuracy to individual differences in brain structure. Science, September 2010.

Thursday, April 12, 2012

Jack White Is the Coolest, Weirdest, Savviest Rock Star of Our Time

From the New York Times Magazine last weekend, an excellent article on the genius that is Jack White - on the eve of his first solo album, “Blunderbuss,” due out April 24 (preorder at Amazon for $9.99). I am definitely looking forward to the new album.

Jack Outside the Box: Jack White Is the Coolest, Weirdest, Savviest Rock Star of Our Time



By JOSH EELLS
Published: April 5, 2012

In an industrial section of south-central Nashville, stuck between a homeless shelter and some railroad tracks, sits a little primary-colored Lego-block of a building with a Tesla tower on top. The inside holds all manner of curiosities and wonders — secret passageways, trompe l’oeil floors, the mounted heads of various exotic ungulates (a bison, a giraffe, a Himalayan tahr) as well as a sign on the wall that says photography is prohibited. This is the home of Third Man Records: the headquarters of Jack White’s various musical enterprises, and the center of his carefully curated world.
Images: White rehearsing for a Raconteurs show in Nashville in September 2011 (Jessica Dimmock/VII, for The New York Times); Meg and Jack in their heyday, 2001 (Michael Lavine); Jack White (right) and his tour manager, Lalo Medina, in the musicians' apartments at the United Record Pressing plant in Nashville (Jessica Dimmock/VII, for The New York Times).

“When I found this place,” White said one day last April, “I was just looking for a place to store my gear. But then I started designing the whole building from scratch.” Now it holds a record store, his label offices, a concert venue, a recording booth, a lounge for parties and even a darkroom. “The whole shebang,” White said. It’s a one-stop creativity shop as designed by an imaginative kindergartner — a cross between Warhol’s Factory and the Batcave. 

White, looking like a dandyish undertaker in a black suit and matching bowler, was in the record store, which doubles as a tiny Jack White museum. He is most famous as the singer for the White Stripes, the red-and-white-clad Detroit duo that played a stripped-down, punked-up take on Delta blues; their gold and platinum records adorned the walls. Albums from Third Man artists, including White’s other bands, the Raconteurs and the Dead Weather, filled the racks. The décor reflected his quirky junk-art aesthetic: African masks and shrunken heads from New Guinea; antique phone booths and vintage Victrolas. 

White is obsessive about color and meticulous in his attention to detail. Inside, the walls that face west are all painted red, and the ones that face east are all painted blue. The exterior, meanwhile, is yellow and black (with a touch of red). Before he made his living as a musician, White had an upholstery shop in Detroit, and everything related to it was yellow and black — power tools, sewing table, uniform, van. He also had yellow-and-black business cards bearing the slogan “Your Furniture’s Not Dead” as well as his company name, Third Man Upholstery. When he started the record label, he simply carried everything over. “Those colors sort of just mean work to me now.” 

Roaming the hallways were several young employees, all color-coordinated, like comic-book henchmen. The boys wore black ties and yellow shirts; the girls wore black tights and yellow Anna Sui dresses. (There were also a statistically improbable number of redheads.) White stopped in front of one cute girl in bluejeans and Vans. “Can you guess which Third Man employee is getting fined $50 today?” he asked, smiling. 

Some have called Third Man a vanity project, like the Beatles’ Apple Records or Prince’s Paisley Park. But White’s tastes are far more whimsical. He has produced records for the ’50s rockabilly singer Wanda Jackson; the Detroit shock rappers Insane Clown Posse; a band called Transit, made up of employees of the Nashville Metropolitan Transit Authority. (Their first single was called “C’mon and Ride.”) And gimmicks like Third Man’s Rolling Record Store, basically an ice-cream truck for records, show he’s as much a huckster as an artist. 

“I’m trying to get somewhere,” White, who is 36, said, reclining in his tin-ceilinged office. He’s an imposing presence, over six feet tall, with intense dark eyes and a concerningly pale complexion. On his desk sat a cowbell, a pocketknife, a George Orwell reader and an antique ice-cream scoop. There was also a stack of business cards that read: “John A. White III, D.D.S. — Accidentist and Occidental Archaeologist.” “The label is a McGuffin. It’s just a tool to propel us into the next zone. There aren’t that many things left that haven’t already been done, especially with music. I’m interested in ideas that can shake us all up.” 

White walked back to a room called the Vault, which is maintained at a constant 64 degrees. He pressed his thumb to a biometric scanner. The lock clicked, and he swung the door open to reveal floor-to-ceiling shelves containing the master recordings of nearly every song he’s ever been involved with. Unusually for a musician, White has maintained control of his own masters, granting him extraordinary artistic freedom as well as truckloads of money. “It’s good to finally have them in a nice sealed environment,” White said. I asked where they’d been before, and he laughed. “In a closet in my house. Ready to be set on fire.” 

White said the building used to be a candy factory, but I had my doubts. He’s notoriously bendy with the truth — most famously his claim that his White Stripes bandmate, Meg White, was his sister, when in fact she was his wife. Considering the White Stripes named themselves for peppermint candies, the whole thing seemed a little neat. “That’s what they told me,” he insisted, not quite convincingly. I asked if I needed to worry about him embellishing details like that, and he cackled in delight. “Yes,” he said. “Yes.” 

TED Talks - Frans de Waal: Moral Behavior in Animals


Frans de Waal is a leader in the field of primate studies - his most recent book is The Age of Empathy: Nature's Lessons for a Kinder Society. This talk is from TEDxPeachtree, filmed in November of 2011, but just posted this month.

A fellow researcher into the emotional lives of animals, Robert Sapolsky of Stanford University (who has spent considerable time studying primates and wrote A Primate's Memoir: A Neuroscientist's Unconventional Life Among the Baboons), says this about The Age of Empathy:
“It’s hard to feel the pain of the next guy. First, you have to notice that he exists…then realize that he has different thoughts than you…and different emotions…and that he needs help…and that you should help because you’d like the same done for you…and, wait, did I remember to lock the car?…and…  Empathy is often viewed as requiring cognitive capacities for things like theory of mind, perspective taking and the golden rule, implying that empathy is pretty much limited to humans, and is a fairly fragile phenomenon in us.  For decades, Frans de Waal has generated elegant data and thinking that show that this is wrong.   In this superb book, he shows how we are not the only species with elements of those cognitive capacities, empathy is as much about affect as cognition, and our empathic humanity has roots far deeper than our human-ness.”
—Robert Sapolsky, author of Why Zebras Don’t Get Ulcers 
Our view of non-human animals is changing rapidly. Some of the top neuroscientists, such as Antonio Damasio in his recent book Self Comes to Mind: Constructing the Conscious Brain, suggest that we need to rethink our understanding of consciousness in the animal world.

Frans de Waal: Moral behavior in animals

Empathy, cooperation, fairness and reciprocity -- caring about the well-being of others seems like a very human trait. But Frans de Waal shares some surprising videos of behavioral tests, on primates and other mammals, that show how many of these moral traits all of us share.





Dr. Frans B. M. de Waal is a biologist and primatologist known for his work on the behavior and social intelligence of primates. His first book, Chimpanzee Politics (1982), compared the schmoozing and scheming of chimpanzees involved in power struggles with that of human politicians. Ever since, de Waal has drawn parallels between primate and human behavior, from peacemaking and morality to culture. His scientific work has been published in hundreds of technical articles in journals such as Science, Nature, Scientific American, and outlets specialized in animal behavior. His popular books – translated into fifteen languages – have made him one of the world’s most visible primatologists. His latest books are Our Inner Ape (2005, Riverhead) and The Age of Empathy (2009, Harmony).

De Waal is C. H. Candler Professor in the Psychology Department of Emory University and Director of the Living Links Center at the Yerkes National Primate Research Center, in Atlanta. He has been elected to the National Academy of Sciences (US), the American Academy of Arts and Sciences, and the Royal Dutch Academy of Sciences. In 2007, he was selected by Time as one of The World’s 100 Most Influential People Today, and in 2011 by Discover as among 47 (all time) Great Minds of Science.

Read TEDxPeachtree's Q&A with Frans de Waal >>

Open Culture - Classic 1959 Performance with Miles Davis and John Coltrane

Nearly 30 minutes of serious jazz awesomeness featuring Miles Davis and John Coltrane - "The Sound of Miles Davis" from a 1959 TV appearance. As is usually the case with excellent video finds, this comes from Open Culture.

‘The Sound of Miles Davis’: Classic 1959 Performance with John Coltrane