Saturday, October 13, 2012

Raymond Tallis - Did Time Begin With A Bang?


Raymond Tallis is writing a new book to "rescue our thinking about time (and a good deal of metaphysics) from the domination of physics." In this column from Philosophy Now, he explains a little bit of where he is heading in the new book.

Did Time Begin With A Bang?

Raymond Tallis doesn’t know, at present.

I am half way through writing Of Time and Lamentation, an attempt to rescue our thinking about time (and a good deal of metaphysics) from the domination of physics. I justify this shameless self-promotion by presenting it as a warning: you must expect your columnist, over the next year or so, occasionally to share with you some of his puzzles about the nature of time. The one that is preoccupying me at present is the question of whether time does or does not have a beginning. It’s an issue that has wandered through Western thought on the border between philosophy and theology for millennia, and no end seems to be in sight.

Some of you will be familiar with Kant’s cunning argument in The Critique of Pure Reason (1781), in which he demonstrates to his own satisfaction that time cannot be something in the world out there, a property of things in themselves: on the contrary, he says, time belongs to the perceiving subject. (For those who don’t feel up to reading the original, Robin Le Poidevin’s discussion in his brilliant Travels in Four Dimensions: The Enigmas of Space and Time is an ideal starting point.) Kant’s argument revolves around the question of whether or not the world has a beginning in time. He shows that we can prove both that the world must have and that it can’t have a beginning in time, so there must be something wrong with the idea. This is the first of his famous four ‘antinomies’ – philosophical problems with contradictory but apparently necessary solutions – the others relating to atoms, freedom, and God.

The world, Kant says, must have a beginning in time, otherwise an infinite amount of time – an ‘eternity’, as Kant called it – would have already passed in this world – but no infinite series can be completed. On the other hand, the world can’t have had a beginning in time, because this would imply a period of empty time before the world came into being, and nothing (least of all a whole world) can come into being in empty time, as there isn’t anything to distinguish one moment in empty time from another. To put this another way: since successive moments of empty time are identical, there would not be a sufficient reason for one moment to give birth to the world while its predecessors were sterile.

One current standard response to Kant’s Antinomy of Time is to say that the world did have a beginning – at the Big Bang, 13.75 billion years ago – but that it was not a beginning in already-existing empty time, since the beginning of the world also started time itself. This solution echoes St Augustine’s assertion that “the world was made, not in time, but simultaneously with time”: God brought the world and time into being together, so that the question of (say) what God was doing before the Creation does not arise. Kant’s First Antinomy is therefore based on a false premise. Job done. Tick in box. Next question please.

Not so fast. Let us look at the claim that time and the world began with the Big Bang 13.75 billion years ago. This is claiming two rather remarkable things: that time began at a particular time; and that time and the world began at the same time. Let us look at these assumptions, starting with the assertion that time began at a particular time.

Time Zero

Since the Big Bang can be assigned a date, Something must have come out of Nothing at a particular moment. What was special about a specific moment 13.75 billion years ago? The cosmologists say that there was nothing special about it: the universe is a random event that happened for no particular reason. It grew out of a quantum field – the ‘inflaton’ – which found itself in a false vacuum – temporarily stable, but not in the lowest energy state. Random fluctuation (uncaused, as things are in the quantum world) sent the inflaton tumbling into a true vacuum, which generated an equal amount of positive energy (matter) and negative energy (gravity). Thus the Big Bang didn’t need causes to bring it about because no net stuff is created. Far from solving the problem of creation, this has multiplied the problems beyond those the physicists were struggling with: an energetic quantum vacuum – a fidgety Nothing – looks a little dodgy, for a start.
Never mind, that’s quantum physics for you. What seems more vulnerable is the idea that we can finesse Something out of Nothing by the generation of equal amounts of positive and negative energy, so that the universe has zero total energy. This seems to be somewhat literal-minded, taking the pluses and minuses in an equation for reality. Worse, it looks as if in pursuit of an explanation we have doubled the number of unexplainables: we have to explain two lots of energy. In short, Kant’s problem of explaining why one moment of empty time should be privileged to deliver a universe is not solved by appealing to random fluctuations, because fluctuations in Nothing – even if they generate pairs of virtual particles (virtual particle plus virtual antiparticle) – don’t seem likely to help us to explain Something.
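To spell out the bookkeeping Tallis is questioning, the zero-total-energy claim is usually put schematically like this (a rough illustration of the standard cosmological argument, not a formula from Tallis's column):

$$ E_{\text{total}} = \underbrace{E_{\text{matter}}}_{\text{positive}} + \underbrace{E_{\text{gravity}}}_{\text{negative}} = 0 $$

On this accounting the universe's appearance adds no net energy, which is why it is said to need no cause; Tallis's objection is that reading the cancelling plus and minus as a licence for creation from Nothing takes the ledger for reality itself.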

Some scientists have given up on the idea that the Big Bang is at the beginning of time (and space); rather, it is but a recent event in a much longer history. (See for example, ‘Bang Goes The Theory’, New Scientist, 30th June 2012.) Instead of one Big Bang, there is a series of big bangs livening up a cosmos that has been around forever. This, of course, only displaces the problem of the emergence of Something out of Nothing, and Kant’s problem of an infinite time having already passed is back.
Meanwhile, there are variants of the Big Bang theory in which it is suggested that time doesn’t arrive on the scene at once; or rather, it behaves at first like another dimension of space – in the earliest universe there is a period in which it is too early for time. If so, we have to ask what ‘too early’ could possibly mean in this context. Are we referring to a time before time exists – before it makes sense to speak of ‘before’?

There is something dubious about dating the beginning of time in any case. Allocating a time to the beginning of time is like allocating a time to any moment in time. Saying that a time t1 took place at a particular time t1 seems like a harmless tautology, but it is actively misleading, because it treats a moment in time as if it were itself a kind of occurrence in time, as opposed to part of the framework within which things can occur. To say that time began at t_beginning is thus to treat the beginning of time more like a kind of occurrence. This impression is confirmed when it is asserted that t_beginning occurred 13.75 billion years ago.

Those who want to defend the notion of a moment in time as being like an occurrence might be tempted to say we often think this way when we talk about stretches of time. For example, there is nothing wrong with saying that Wednesday – not in itself an event, but simply a holder for events – began, came into being, at twelve midnight on Tuesday. The analogy is not sound, however. Tuesday and Wednesday are not time itself, but divisions placed upon time. The Big Bang, by contrast, is supposed to be both timeless and at a particular time: at the beginning of, and yet not part of, the series that it begins.

The problem with saying that time began at a particular time is highlighted by this obvious question: If it is perfectly valid to speak of 13.2, 13.3 and 13.75 billion years ago, why is it not valid to talk about 13.8, 13.76 or even 13.755 billion years ago? Stephen Hawking addresses this question by arguing that to talk of time before the Big Bang is like talking about points north of the North Pole. Once you have got to the North Pole, it makes no sense to imagine you can go any further north.
This analogy does not work. As mathematical philosopher J.R. Lucas has pointed out, there is a deeper, astronomical sense of north which allows a line pointing in that direction to be extended indefinitely. North is a direction that has no terminus: you can be north of the North Pole. So the question still stands.

What about the second assumption: the supposed simultaneity of the beginning of the universe and the origin of time? How can we think of the start of time itself (as opposed to the time of something) being at the same time as an event (the creation of the universe)? There is no ‘at the same time as’ until the universe has differentiated to the point where one event can be temporally related to other events via an observer. What’s more, once the universe has come into being, and more than one event has occurred, the notion of simultaneity as absolute and observer-independent is invalidated by the Special Theory of Relativity. Finally, there is a mismatch between a universe, whose coming-into-being is extended over time, and time itself, whose coming into being is presumably instantaneous, or at least not extended through time.

Haunted By Kant

So Kant’s First Antinomy still haunts us. But we have good reason not to jump from its problems to Kant’s conclusion that time is somehow internal to the human mind. If time were only a property of human minds, we would not be able to make sense of what in After Finitude (2008) the French philosopher Quentin Meillassoux called ‘ancestrals’. Ancestrals are those realities that pre-date human consciousness and yet have a clear temporal order. For example, the Earth came into being 4.56 billion years ago, before life on Earth originated (3.5 billion years ago) and before conscious humans began emerging (several million years ago). So if we believe what evolutionary science tells us, we cannot reduce time to one of Kant’s two ‘forms of sensible intuition’ (roughly, modes of human perception, the other form being space).

However, this conclusion, too, can be challenged, by arguing that the allocation of events to past dates is itself internal to the calendar time that humans have invented – that we project the framework we have established to structure our time beyond the situation within which it arose: that the very idea of ‘ago’ is established with respect to a system that has been built up by humans and extrapolated from ways of seeing that serve us well but do not necessarily reveal truths about how the universe is in itself, beyond our manner of perceiving it. On this matter, the jury remains out.

Many physicists despise the kinds of arguments I have presented. Their feeling about philosophers (or even physicists) who want to make other than mathematical sense of what physics tells us about the fundamental nature of things could be summarised in David Mermin’s “Shut up and calculate!” And Stephen Hawking has dismissed philosophers as poor sods who haven’t kept up with physics. It is important not to lose one’s nerve and to note that physicists who dismiss philosophy are often doing philosophy themselves, but very badly.

The question of whether time has a beginning is far from resolution. Equally vexing is the opposite question, as to whether it has an end. Another time, perhaps.

© Prof. Raymond Tallis 2012

Raymond Tallis is a physician, philosopher, poet, broadcaster and novelist. His latest book, In Defence of Wonder, is out now from Acumen.

Steve Fleming - Neuroscience and Criminality

This excellent article from Steve Fleming appeared at AEON, a very cool web magazine for those who are not familiar with it. Fleming, who blogs at The Elusive Self, takes an interesting look at how our increasing understanding of the neuroscientific foundations of human behavior is changing, or will change, our notions of guilt and criminality.

Was it really me?

Neuroscience is changing the meaning of criminal guilt. That might make us more, not less, responsible for our actions

26 September 2012

Illustration by Matt Murphy

Steve Fleming is a cognitive neuroscientist. He is a postdoctoral fellow at New York University and a blogger at The Elusive Self.

In the summer of 2008, police arrived at a caravan in the seaside town of Aberporth, west Wales, to arrest Brian Thomas for the murder of his wife. The night before, in a vivid nightmare, Thomas believed he was fighting off an intruder in the caravan – perhaps one of the kids who had been disturbing his sleep by revving motorbikes outside. Instead, he was gradually strangling his wife to death. When he awoke, he made a 999 call, telling the operator he was stunned and horrified by what had happened, and unaware of having committed murder.

Crimes committed by sleeping individuals are mercifully rare. Yet they provide striking examples of the unnerving potential of the human unconscious. In turn, they illuminate how an emerging science of consciousness is poised to have a deep impact upon concepts of responsibility that are central to today’s legal system.

After a short trial, the prosecution withdrew the case against Thomas. Expert witnesses agreed that he suffered from a sleep disorder known as pavor nocturnus, or night terrors, which affects around one per cent of adults and six per cent of children. His nightmares led him to do the unthinkable. We feel a natural sympathy towards Thomas, and jurors at his trial wept at his tragic situation. There is a clear sense in which this action was not the fault of an awake, thinking, sentient individual. But why do we feel this? What is it exactly that makes us think of Thomas not as a murderer but as an innocent man who has lost his wife in terrible circumstances?

Our sympathy can be understood with reference to laws that demarcate a separation between mind and body. A central tenet of the Western legal system is the concept of mens rea, or guilty mind. A necessary element to criminal responsibility is the guilty act — the actus reus. However, it is not enough simply to act: one must also be mentally responsible for acting in a particular way. The common law allows for those who are unable to conform to its requirements due to mental illness: the defence of insanity. It also allows for ‘diminished capacity’ in situations where the individual is deemed unable to form the required intent, or mens rea. Those people are understood to have control of their actions, without intending the criminal outcome. In these cases, the defendant may be found guilty of a lesser crime than murder, such as manslaughter.

In the case of Brian Thomas, the court was persuaded that his sleep disorder amounted to ‘automatism’, a comprehensive defence that denies there was even a guilty act. Automatism is the ultimate negation of both mens rea and actus reus. A successful defence of automatism implies that the accused person had neither awareness of what he was doing, nor any control over his actions. That he was so far removed from conscious awareness that he acted like a runaway machine.

The problem is how to establish if someone lacks a crucial aspect of consciousness when he commits a crime. In Thomas’s case, sleep experts provided evidence that his nightmares were responsible for his wife’s death. But in other cases, establishing lack of awareness has proved more elusive.
It is commonplace to drive a car for long periods without paying much attention to steering or changing gear. According to Jonathan Schooler, professor of psychology at the University of California, Santa Barbara, ‘we are often startled by the discovery that our minds have wandered away from the situation at hand’. But if I am unconscious of my actions when I zone out, to what degree is it really ‘me’ doing the driving?

This question takes on a more urgent note when the lives of others are at stake. In April 1990, a heavy-goods driver was steering his lorry towards Liverpool in the early evening. Having driven all day without mishap, he began to veer on to the hard shoulder of the motorway. He continued along the verge for around half a mile before he crashed into a roadside assistance van and killed two men. The driver appeared in Worcester Crown Court on charges of causing death by reckless driving. For the defence, a psychologist described to the court that ‘driving without awareness’ might occur following long, monotonous periods at the wheel. The jury was sufficiently convinced of his lack of conscious control to acquit on the basis of automatism.

The argument for a lack of consciousness here is much less straightforward than for someone who is asleep. In fact, the Court of Appeal said that the defence of automatism should not have been on the table in the first place, because a driver without ‘awareness’ still retains some control of the car. None the less, the grey area between being in control and aware on the one hand, and in control and unaware on the other, is clearly crucial for a legal notion of voluntary action.

If we accept automatism then we reduce the conscious individual to an unconscious machine. However, we should remember that all acts, whether consciously thought-out or reflexive and automatic, are the product of neural mechanisms. For centuries, scientists and inventors have been captivated by this notion of the mind as a machine. In the 18th century, Henri Maillardet, a Swiss clockmaker, built a remarkable apparatus that he christened the Automaton. An intricate array of brass cams connected to a clockwork motor made a doll produce beautiful drawings of ships and pastoral scenes on sheets of paper, as if by magic. This spookily humanoid machine, now on display at the Franklin Institute in Philadelphia, reflects the Enlightenment’s fascination with taming and understanding the mechanisms of life.

Modern neuroscience takes up where Maillardet left off. From the pattern of chemical and electrical signaling between around 85 billion brain cells, each of us experiences the world, makes decisions, daydreams, and forms friendships. The mental and the physical are two sides of the same coin. The unsettling implication is that, by revealing a physical correlate of a conscious state, we begin to treat the individual not as a person but as a machine. Perhaps we are all ‘automata’, and our notions of free choice and taking responsibility for our actions are simply illusions. There is no ghost in the machine.

In his book Incognito (2011), David Eagleman argues that society is poised to slide down the following slippery slope. Measurable brain defects already buy leniency for the defendant. As the science improves, more and more criminals will be let off the hook thanks to a fine-grained analysis of their neurobiology. ‘Currently,’ Eagleman writes, ‘we can detect only large brain tumours, but in 100 years we will be able to detect patterns at unimaginably small levels of the microcircuitry that correlate with behavioral problems.’ On this view, responsibility has no place in the courtroom. It is no longer meaningful to lock people up on the basis of their actions, because their actions can always be tied to brain function.

It is inevitable that defence teams will look towards neuroscientific evidence to shift the balance in favour of a mechanistic, rather than a personal, interpretation of criminal acts. But we should be wary of attempts to do so. If every behaviour and mental state has a neural correlate (as surely it must), then everything we do is an artifact of our brains. A link between brain and behaviour is not enough to push responsibility out of the courtroom. Instead we need new ways of thinking about responsibility, and new ways to conceptualise a decision-making self.

Responsibility does not entail a rational, choosing self that floats free from physical processes. That is a fiction. Even so, demonstrating a link between criminal behaviour and conscious (or unconscious) states of the brain changes the legal landscape. Consciousness is, after all, central to the legal definition of intent.

In the early ’70s, the psychologist Lawrence Weiskrantz and the neuropsychologist Elizabeth Warrington discovered a remarkable patient at the National Hospital for Neurology and Neurosurgery in London. This patient, known as DB, had sustained damage to the occipital lobes (towards the rear of the brain), resulting in blindness in half of his visual field. Remarkably, DB was able to guess the position and orientation of lines in his ‘blind’ hemifield. Subsequent studies on similar patients with ‘blindsight’ confirmed that these responses relied on a neural pathway quite separate from the one that usually passes through the occipital lobes. So it appears that visual consciousness is selectively deleted in blindsight. At some level, the person can ‘see’ but is not aware of doing so.

Awareness and control, then, are curious things, and we cannot understand them without grappling with consciousness itself. What do we know about how normal, waking consciousness works? Hints are emerging. Studies by Stanislas Dehaene, professor of experimental cognitive psychology at the Collège de France in Paris, have revealed that a key difference between conscious and unconscious vision is activity in the prefrontal cortex (the front of the brain, particularly well-developed in humans). Other research implies that consciousness emerges when there is the right balance of connectivity between brain regions, known as the ‘information integration’ theory. It has been suggested that anesthesia can induce unconsciousness by disrupting the communication between brain regions.

Just as there are different levels of intent in law, there are different levels of awareness that can be identified in the lab. Despite being awake and functioning, one’s mind might be elsewhere, such as when a driver zones out or when a reader becomes engrossed. A series of innovative experiments have begun to systematically investigate mind-wandering. When participants zone out during a repetitive task, activity increases in the ‘default network’, a set of brain regions previously linked to a focus on internal thoughts rather than the external environment. Under the influence of alcohol, people become more likely to daydream and less likely to catch themselves doing so. These studies are beginning to catalogue the influences and mechanisms involved in zoning out from the external world. With their help we can refine the current legal taxonomy of mens rea and put legal ideas such as recklessness, negligence, knowledge and intent on a more scientific footing.

An increased scientific understanding of consciousness might one day help us to determine the level of intent behind particular crimes and to navigate the blurred boundary between conscious decisions and unconscious actions. At present, however, we face serious obstacles. Most studies in cognitive neuroscience rely on averaging together many individuals. A group of individuals allows us to understand the average, or typical, brain. But it does not follow that each individual in the group is typical. And even if this problem were to be overcome, it would not help us to adjudicate cases in which normal waking consciousness was intact, but happened to be impaired at the time of the crime.

Nonetheless, the brain mechanisms underpinning different levels of consciousness are central to a judgment of automatism. Without consciousness, we are justified in concluding that automatism is in play, not because consciousness itself is not also dependent on the brain, but because consciousness is associated with actions worth holding to a higher moral standard. This perspective helps to arrest the slide down Eagleman’s slippery slope. Instead of negating responsibility, neuroscience has the potential to place conscious awareness on an empirical footing, allowing greater certainty about whether a particular individual had the capacity for rational, conscious action at the time of the crime.

Some worry that an increased understanding of consciousness and voluntary action will dissolve our sense of personal responsibility and free will. In fact, neurological self-knowledge could have the opposite effect. Suppose we discover that the brain mechanisms underpinning consciousness are primed to malfunction at a particular time of day, say 7am. Up until this discovery, occasional slips and errors made around this time might have been put down to chance. But now, armed with our greater understanding of the fragility of consciousness, we would be able to put in place counter-measures to make major errors less likely. For Brian Thomas, a greater understanding of his sleep disorder might have allowed him to control it. He had stopped taking his anti-depressant medication when he was on holiday, because he believed it made him impotent. This might have contributed to the night terrors that caused him to strangle his wife.

Crucially, increased self-knowledge often percolates through to laws governing responsible behaviour. A diabetic who slips into a coma while driving is held responsible if the coma was the result of poor management of a known diabetic condition. Someone committing crimes while drunk is held to account, so long as they are responsible for becoming drunk in the first place. A science of consciousness illuminates the factors that lead to unconsciousness. In reconsidering the boundary between consciousness and automatism we will need to take into account the many levels of conscious and unconscious functioning of the brain.

Our legal system is built on a dualist view of the mind-body relationship that has served it well for centuries. Science has done little to disrupt that until now. But neuroscience is different. By directly addressing the mechanisms of the human mind, it has the potential to adjudicate on issues of capacity and intent. With a greater understanding of impairments to consciousness, we might be able to take greater control over our actions, bootstrapping ourselves up from the irrational, haphazard behaviour traditionally associated with automata. Far from eroding a sense of free will, neuroscience may allow us to inject more responsibility than ever before into our waking lives.

~ For references to the scientific research discussed in this essay, see Steve Fleming's blog The Elusive Self.

The RSA Debates the Economic Future of the Millennial Generation


This is a short clip from the RSA debate, but you can go listen to the podcast of the full event including audience Q&A (or use the link below).

Listen to the audio
(full recording including audience Q&A)


The Millennials Debate

The Millennial generation have grown up in a radically changing consumer landscape, have witnessed a dramatic shift in power from producer to consumer, are connected and empowered by technological change and the social web, and yet, crucially, face an uncertain and, for many, insecure economic future.

Watch our panel as they ask: Will the entrepreneurial drive of the Millennial generation shape the zeitgeist for the first half of the 21st century, harnessing transformations in the way we live and work to drive up productivity and living standards?

Speakers: Martha Lane Fox, the UK's Digital Champion, founder of Go ON UK, founder of Luckyvoice and co-founder of lastminute.com; Madsen Pirie, founder and president of the Adam Smith Institute; Ed Howker, journalist, broadcaster and co-author of "Jilted Generation: how Britain bankrupted its youth"; and Adam Lent, director of projects, RSA.

Friday, October 12, 2012

Camille Paglia On Seminal Images in Art

She's baaaaack . . . . It's been a few years since Camille Paglia released a book, and in her newest work she returns to the subject that brought her fame in the first place - art, viewed within its cultural and historical contexts. Her new book is Glittering Images: A Journey Through Art from Egypt to Star Wars, and she is interviewed here by Tom Ashbrook for On Point (NPR).

There is a link to one excerpt in the post below; here is another from the Wall Street Journal.

Camille Paglia On Seminal Images

Critic, provocateur Camille Paglia on the vanishing visual arts and American souls at risk

October 12, 2012 at 11:00 AM


Untitled (Green Silver), ca. 1949. (Jackson Pollock)

Critic, provocateur Camille Paglia brings a tough, earthy, brilliant edge to everything she touches.  Her politics can be a street fight.  Her intellect a razor blade.  Her insight, a joy.  Now Camille Paglia is looking at what we see.  What we look at these days.  A flood of pixels.  Facebook photo albums.

A jittery dollhouse of YouTube fancy.  And precious little real visual art.  In a sea of images we are losing our connection, she says, to the great messages of art.  Its wisdom.  Its insight.  Its glory.

This hour, On Point:  Camille Paglia on our missing art, and – she says – our souls at risk.

-Tom Ashbrook

Guests

Camille Paglia, author, teacher, and social critic. Her new book is Glittering Images: A Journey Through Art from Egypt to Star Wars. You can read an excerpt here.

From Tom’s Reading List

  • Salon “A feminist critic of feminism, a Democrat frequently infuriated by Democrats (now is one of those times), she thrives at being a bomb-thrower from the inside. And she’s at it again! During an interview last week about her new book, she held forth on subjects as varied as the state of the arts, Bravo’s addictive “Real Housewives” franchise, her old nemesis Naomi Wolf and, yes, politics — where she gave us her surprise pick for president.”
  • Bloomberg “The painting was executed over three months in 1907 in Picasso’s jammed, squalid one-room studio apartment in bohemian Montmartre in Paris. Its fleshy pinks are a survival from the artist’s Rose Period but with a stunning change of tone. There is no longer any humor or pleasure.”   
  • New York Times “The impetus for her visit was the Oct. 16 publication of “Glittering Images: A Journey Through Art From Egypt to Star Wars” (Pantheon), Ms. Paglia’s sixth book and her first to focus squarely on visual art. Asked to meet at the museum, she had responded with an enthusiastic e-mail, calling the choice “ideal” because “it made such a huge impression on me as a small child.” Now 65, she first visited in the early 1950s, on a trip from Endicott, N.Y., the upstate working-class enclave where she had been born and raised, and the foray had been so formative that “I’ll never forget it!!!,” she wrote.”
Photos:

Here’s a gallery of some of Paglia’s “glittering images.”

Xenia Goodloe (John Wesley Hardrick)

The Death of Marat (Jacques-Louis David)

Venus with a Mirror (Titian)

The Acropolis in Athens, Greece.

Cycladic figurine 2800-2300 BC

Untitled (Green Silver), ca. 1949. (Jackson Pollock)

Thelonious Monk, Live in Oslo and Copenhagen (1966)

A little Friday Jazz courtesy of Open Culture and Thelonious Monk. Enjoy.

Thelonious Monk, Live in Oslo and Copenhagen (1966)

A little present for what would be Thelonious Monk’s 95th birthday today — 100 grand minutes of Monk performing live in Oslo and Copenhagen in 1966. In the spring of that year, Monk brought his legendary quartet (tenor saxophonist Charlie Rouse, bassist Larry Gales, and drummer Ben Riley) to Scandinavia to perform two televised shows. The recording, saved for posterity thanks to YouTube, features some Monk classics: “Blue Monk,” “Epistrophy,” “’Round Midnight” and others. Sit back and enjoy.
Related Content: 
Advice From the Master: Thelonious Monk Scribbles a List of Tips for Playing a Gig
A Child’s Introduction to Jazz by Cannonball Adderley (with Louis Armstrong & Thelonious Monk)
The Universal Mind of Bill Evans: Advice on Learning to Play Jazz & The Creative Process

Maurice Merleau-Ponty's Ontology of the Flesh


Maurice Merleau-Ponty (14 March 1908 – 3 May 1961) was a French phenomenological philosopher, strongly influenced by Karl Marx, Edmund Husserl, and Martin Heidegger. He was a central figure in French philosophical circles, being close friends with Jean-Paul Sartre and Simone de Beauvoir.

Merleau-Ponty was also one of the central 20th century philosophers to bring the body back into the equation of experience and consciousness, rejecting the dualism that had reigned since Descartes. His work sought to demonstrate "a corporeity of consciousness as much as an intentionality of the body," offering a counter to Descartes.

This is a section from the entry in the Stanford Encyclopedia of Philosophy on Merleau-Ponty and his philosophy, written by Bernard Flynn.

Citation:
Flynn, Bernard, "Maurice Merleau-Ponty", The Stanford Encyclopedia of Philosophy (Fall 2011 Edition), Edward N. Zalta (ed.).

Ontology Of The Flesh

Let us now return to our discussion of Merleau-Ponty's ontology. The guiding thread that we had been following was his critique of transcendental philosophy, particularly the notion of subjectivity that is implied in this philosophical project. In The Visible and the Invisible this critique is deepened and further developed. One could argue that, historically, the project of transcendental philosophy begins as a refutation of skepticism. By taking this stance, we begin by putting ourselves on the side of the negative. Like Stanley Cavell, Merleau-Ponty sees that the skeptic radically transforms the ordinary meaning of the question, “Do I know that?” Extending this question to everything changes its meaning. Merleau-Ponty claims that philosophy elects certain beings--“sensations, representations, thoughts, consciousness, or even a deceiving being”--in order to separate itself from all being (VI, 107). He argues that the radical skeptic borrows something from our experience, absolutizes it, and then, in his quest for complete certainty, uses it to terrorize our experience of ‘inherence in the world’, an experience that Merleau-Ponty, in The Visible and the Invisible, calls “perceptual faith.” Those who would begin philosophy by attempting to refute skepticism must also agree with the skeptic's rejection of our inherence in Being; they do so in the name of establishing absolute evidence which would deliver us from our contingent insertion into Being. Merleau-Ponty sees this ‘desire to be delivered from contingency’ operative in critical philosophy's effort to “undo our natural pact with the world in order to remake it.” It attempts to follow backward the path taken by the ‘subject who has constituted the object’ in order to arrive at the unity of subjectivity, as if one could walk indifferently in either direction from Notre Dame to the Etoile, or from the Etoile to Notre Dame.

It is Merleau-Ponty's contention that it is not possible to achieve this return to subjectivity. Reflection is always secondary, that is, it must recognize itself as founded on a pre-reflective experience of Being that cannot be assimilated, employing the felicitous phrase of Adorno, “without remainder.” This reflection which must always be mindful of its own situated character is what Merleau-Ponty names “hyper-reflection.” This sort of reflection is expressed in an excellent manner by a line of Kafka, cited by Lefort in his Preface to The Visible and the Invisible, “that things present themselves to him not by their roots, but by some point or another situated towards the middle of them” (VI, xvvi). Merleau-Ponty evokes our ineluctable inherence in Being as evidence that Husserl's project of free variation, while being useful, was not able to accomplish what Husserl desired of it. Free variation was Husserl's way to move from the register of ‘fact’ to that of ‘essence’. One begins with a real factual experience, then by means of free variation one transforms it in imagination up to the point where it is no longer an object of the same type. At this point, Husserl claims that we intuit its essential structure.

Merleau-Ponty agrees that we can vary our experience in imagination, that we can move from the real to the virtual, that is, we can give ourselves leeway. However, we cannot “complete” the circuit by which the real would become simply a variant of the possible. He writes, “On the contrary, it is the possible worlds and possible things that are variants and doubles of the actual world and of actual beings” (VI, 112). It is the ineluctability of our inherence in the world that forecloses both the attempt to move from the fact to the essential structure and the project of completing the phenomenological reduction.

In the last chapter of the never completed The Visible and the Invisible, entitled “The Intertwining–the Chiasm,” Merleau-Ponty begins to give a positive elaboration of the ontological position to which he has been led. In a number of respects, his last work distances itself from certain central notions in the phenomenological tradition. Nonetheless, in one respect it is mindful of Husserl's injunction, “Return to the things themselves.” Merleau-Ponty wishes to begin in a dimension of experience which has not been “worked over, that offers us, all at once, pell-mell, both subject and object--both existence and essence--and, hence, gives philosophy resources to redefine them” (VI, 130). When Merleau-Ponty speaks of “perceptual faith” his notion of faith is perhaps the very opposite of the agonized Kierkegaardian “leap of faith.” It is a faith the commitment of which has ‘always already’ been made, a faith which subtends the avowal of responsibility by which personal identity is formed. Perceptual faith is a faith that I am in no danger of losing, except in the philosophical interpretation of it which portrays it as knowledge. This chapter on what Merleau-Ponty calls the Chiasm is a continuation of his study of perception, though at first viewing it may not appear as such. In the Phenomenology of Perception, he insisted upon making a distinction between operative intentionality and act intentionality, but in The Visible and the Invisible this distinction is deepened in such a way that the concept of intentionality itself is thrown into question. In his critical reflections on Sartre, which due to spatial constraints we have not been able to develop here, Merleau-Ponty said that for a subject defined as For-itself, as consciousness of itself, passivity could have no meaning. He argues that, defined as such, consciousness could not but be sovereign.

In his late thought, Merleau-Ponty poses the question whether a consciousness, defined as intentional, is adequate to think a notion of perception viewed as the self-revelation of the sense of a world in and through a being which is itself a part of the world, flesh of its flesh, a world which “... is much more than the correlative of my vision, such that it imposes my vision upon me as a continuation of its own sovereign existence” (VI, 131). For him, to see is not to pose a thing as the object pole, much less a noema (Husserl), of my act of seeing. Rather seeing is being drawn into a dimension of Being, a tissue of sensible being to which the perceiving body is not foreign. Merleau-Ponty speaks of the perception of the color ‘red’ as not merely the awareness of a quality belonging to an object. He claims that for an experience ‘prior to being worked over’, it is an encounter with “a punctuation in the field of red things, which includes the tiles of rooftops; the flags of gatekeepers and of the revolution; of certain terrains near Aix or Madagascar. It is also a punctuation in the field of red garments, which includes, along with the dresses of women, the robes of professors, bishops and advocates general...and its red is literally not the same if it appears in one constellation or in another … . A certain red is also a fossil, drawn up from the depths of imaginary worlds” (VI, 132). When seeing, I do not hold an object at the terminus of my gaze, rather I am delivered over to a field of the sensible which is structured in terms of the “difference between things and colors, a momentary crystallization of colored being or visibility” (VI, 132).

When we turn in the direction of the seer, we do not discover a transcendental ego but a being who is itself of the sensible, a being which “knows it before knowing it” (VI, 133). The sensate body possesses “an art of interrogating the sensible according to its own wishes, an inspired exegesis” (VI, 135). If I wish to feel the cloth of a coat that I am about to purchase, it will not suffice if I pound it with my fists or quickly whisk my hand over it. Rather it must be touched as it wishes to be touched, and for this my body needs no instruction. Like the cloth, my hand is a part of the tangible world; between it and the rest of the tangible world there exists a “relationship by principle” (VI, 133). My hand which touches the things is itself subject to being touched. “Through this crisscrossing within it of the touching and the tangible, its own movements incorporate themselves in the universe that they interrogate, are recorded on the same map as it” (VI, 133).

Merleau-Ponty writes, “it is not different for vision.” He argues that it is essential that the seer itself must be visible, that is, seeable; he refers to the body as an “exemplar sensible” being both sensate and sensible. As in the case with touching, there is a pre-possession of the vision by the visible, and vice versa. The body, being itself visible, uses its being to participate in the being of the visible world. Rather than speaking of the act of seeing, one ought to speak of a “visibility, sometimes wandering, sometimes reassembled” (VI, 138). What Merleau-Ponty calls “flesh” is the generality of the sensible, “an anonymity innate to myself” (VI, 133). We see that there is a progression of Merleau-Ponty's ontology which moves from the notion of Gestalt in The Structure of Behavior, to the notion of ‘the one’ (the on) that is the ‘subject’ of perception in the Phenomenology of Perception, then to the notion of the Flesh in The Visible and the Invisible. The flesh is neither some sort of ethereal matter nor is it a life force that runs through everything. Rather it is a notion which is formed in order to express the intertwining of the sensate and the sensible, and their reversibility. It is this notion of reversibility that most directly problematizes the concept of intentionality, since rather than having the model of act and object, one has the image of a fold, and of the body as the place of this fold by which the sensible reveals itself.

We see that this notion of intertwining does not only concern the relationship between the sensible and the sensate, between the body and the world. It also orchestrates the relationship between the visible and the invisible. As Merleau-Ponty undercuts, or if one prefers deconstructs, the opposition between subject and object, he also wishes to do the same for the opposition between the visible and the invisible, the sensible and the ideal. We have seen this project already operating in The Structure of Behavior, where he viewed the human order, which is to say the order of symbolic behavior, as a sublation of both physical and vital structures. Also in The Visible and the Invisible he searches for an “infrastructure” of thought in the body. This infrastructure is located in the body's non-coincidence with itself. In his reflection on the touching-touched, he has shown that my hand, my eye, my voice are both touching, seeing and speaking, and at the same time tangible, visible and audible. However, between these two dimensions there is a non-coincidence; I never, at the same instance, experience my hand as touching and as touched. He writes, “Either my right hand really passes over to the rank of the touched, but then its hold on the world is interrupted, or it retains its hold on the world, but then I do not really touch it” (VI, 148).

There is a divergence (écart) which short-circuits the body's immanence with itself and creates an internal fissure in the visible, thereby generating differentiation rather than identity. There is, Merleau-Ponty says, a sort of reflection that the body effects on itself. Six pages before this uncompleted text breaks off, he tells us that we have reached the “most difficult point,” that is, “the bond between flesh and idea, and the internal armature which [it] manifests and which it conceals” (VI, 149). The invisible, the idea, is not the contrary of the visible; it is the invisible of the visible. Merleau-Ponty evokes Proust's notion of the “little phrase” in the musical piece, which in Remembrance of Things Past signifies Swann's love for Odette, as an instance of a meaning that cannot be extracted from its sensible incarnation but which, nonetheless, is itself not strictly speaking sensible. Permit me to use another example. Harry Mathews, in my opinion a very important American writer, went to live in Paris when he finished his undergraduate studies at Harvard University, and he has lived there for the past 40 years or so. He is obviously completely fluent in French, but he writes in English. At a public lecture, someone asked if he ever thought of writing in French. His answer was a definite “no,” because he said that to write in French as he does in English, it would have been necessary to have attended high school in France. In the context, it was clear that he did not mean that there were certain expressions that French school children use that he does not know. Rather he meant that he did not have a sense of the sensible infrastructure which underlies forms of popular speech, like Proust's ‘little phrases’ which cannot be abstracted from their context. As Merleau-Ponty claims, there is “an ideality that is not alien to the flesh, that gives it its axis, its depth, its dimensions.”

There are meanings that can be abstracted from the sensible body but not from “another, less heavy, more transparent, body, as though it were to change flesh, abandoning the flesh of the body for that of language, and thereby [they] would be emancipated, but not freed, from every condition” (VI, 153). Language is a more diaphanous body, but a body nonetheless, which is capable of sedimentation, of forming a world which, in Hannah Arendt's phrase, houses the speaker. The notion of “the invisible of the visible” continues the theme of a logos of the perceived world that we discovered in the Phenomenology of Perception, along with the theme of silent significance (pre-linguistic meaning), a silence which is not the contrary of language. In 1961 Merleau-Ponty's own voice fell silent. But insofar as it provokes speech, it was a silence which was not the contrary of language.

Thursday, October 11, 2012

Samuel Beckett Directs His Masterpiece, "Waiting for Godot" (1985)


Via Open Culture: Samuel Beckett directed this 1985 version of his classic absurdist play, Waiting for Godot, which is probably my favorite play of all time. It's great to see how the author envisioned the production of his own play, especially in light of his frequent sense that other directors were misunderstanding the text.

Samuel Beckett Directs His Absurdist Play Waiting for Godot (1985)

October 10th, 2012


Samuel Beckett’s absurdist play, Waiting for Godot, premiered in Paris in 1953, at the Théâtre de Babylone, under the direction of the French actor Roger Blin. Many other directors staged the play in the years to come, each time interpreting it in their own way. All the while, Beckett complained that the play was being subjected to “endless misunderstanding.” However, when an actor, Peter Woodthorpe, once asked him to explain what Godot is all about, Beckett answered quixotically: “It’s all symbiosis, Peter; it’s symbiosis.” Thanks for the clarification, Sam.

Beckett never gave a clear explanation. But perhaps he offered up something better. In 1985, Beckett directed three of his plays — Waiting for Godot, Krapp’s Last Tape and Endgame — as part of a production called “Beckett Directs Beckett.” The plays, performed by the San Quentin Drama Workshop, toured Europe and Asia with much fanfare, and with Beckett exerting directorial control. Act 1 of Waiting for Godot appears above; Act 2 below. And do keep this in mind. Beckett paces things slowly. So you won’t hear your first sound until the 2:00 mark.

Find the text of Waiting for Godot in our collection of Free eBooks.


Gray Matters: Brain Science in the 21st Century (Stanford Roundtable)


A panel of thought leaders and scientists explores the developing world of neuroscience and what it means for people on a personal basis. Topics include neuroplasticity and different therapies used to treat stroke victims.

The Roundtable at Stanford University is an annual event that is part of Stanford's Reunion Homecoming Weekend put on for Stanford alumni - October 6, 2012.

Moderator: Juju Chang (ABC News Nightline)
Panelists: John Hennessy (President, Stanford University); Dr. Frank Longo (Chair, Department of Neurology, Stanford University); Jill Bolte Taylor (Neuroanatomist, author, My Stroke of Insight); Carla Shatz (Professor of biology and neurobiology, Stanford University); Bob Woodruff (ABC News, Founder of The Bob Woodruff Foundation).

Gray Matters: Brain Science in the 21st Century
What if you could use sadness to make you more creative, erase bad memories and wipe out stress, keep your brain fit into your 90s, and drastically reduce your risk of Alzheimer's and memory loss?

The plasticity and capability of the brain have never been better understood. New research is revealing compelling findings that will change the way we think, interact and plan throughout our lives. As longevity increases and, at the same time, mental health issues are on the rise, our ability to impact the brain is also increasing.

Yet these are the very early days, as some put it, of understanding "those three pounds of meat inside our heads." How can we apply the new brain science to our own lives, and how is neuroscience in the 21st century going to impact us all?

Join ABC news correspondent Juju Chang and a panel of distinguished thought leaders and scientists to explore the brave new world of neuroscience and what it means for you and your family.

Secular Buddhist Podcast, Episode 137- Stephen Schettini: Secular Practice One-On-One


Here is another interesting episode of the Secular Buddhist Podcast. This week Ted Meissner spoke with "The Naked Monk," Stephen Schettini, about his personal evolution from religious Buddhism to secular practice. He was ordained as a Buddhist monk in the Tibetan Gelug tradition and became a founding member of Tharpa Choeling, Centre de Hautes Études Tibétaines, in Mont Pèlerin, Switzerland. For eight years he trained there and in Asia as a translator and instructor of Buddhism, and began teaching in 1980. After eight years he returned to secular life and pursued education in the Theravada tradition before striking out on his own and eventually creating his Quiet Mind Seminars.

Secular Buddhist Podcast, Episode 137 :: Stephen Schettini :: Secular Practice One-On-One

| October 6, 2012


Stephen Schettini

The Naked Monk Stephen Schettini joins us to speak about personal evolution from religious Buddhism to secular practice.
It’s interesting, isn’t it, how things change and yet so much remains the same? We have in our world, for example, not only ongoing lineages of religious Buddhism, but these traditions are growing alongside new non-traditional forms. The development of secular Buddhism doesn’t take away from or ruin tradition, it simply opens up new fields of exploration.
And what an opportunity this is for us to learn! Our practice doesn’t have to remain in one place, bound by the constraints of convention. We can and should investigate our options, and understand that in different times and circumstances of our lives, we’re going to incline to different approaches to engagement with moment by moment existence. In other words, what we do evolves.
Stephen is the founder and director of Quiet Mind Seminars. He’s led hundreds of meditation workshops in the Montreal area since 2003 through www.thequietmind.org, and has contributed columns regularly to local newspapers and to The Suburban, Quebec’s largest English-language weekly. He also freelanced for the Montreal Gazette. Stephen made a living in print communications and over the next 20 years authored, co-authored, illustrated, and designed dozens of books on information technology and health science.
So, sit back, relax, and have a nice light roast coffee, with a dash of hazelnut creamer.
 


Music for This Episode Courtesy of Rodrigo Rodriguez

The music heard in the middle of the podcast is from Rodrigo Rodriguez. The track used in this episode is “Shika no Tone” from his CD, Traditional and Modern Pieces: Shakuhachi.

Wednesday, October 10, 2012

Katie J.M. Baker - Poor Kids Getting Prescribed ADHD Meds They Don't Need, Against Their Will

This article by Katie J.M. Baker originally appeared in Jezebel and was re-posted at AlterNet. I am not at all surprised by any of this, but it is still horrible what we are doing to these children.

A doctor who prescribes the meds said, "We've decided as a society that it's too expensive to modify the kid's environment. So we have to modify the kid."
 

Doctors are prescribing pills like Adderall to low-income kids even if they don't "need" drugs to function, because it's often the only realistic way to help them do well in school.

"I don't have a whole lot of choice," one doctor who treats poor families outside of Atlanta, Georgia, told the  New York Times . "We've decided as a society that it's too expensive to modify the kid's environment. So we have to modify the kid."

It's easy for those of us without kids struggling to succeed in inadequate schools to act horrified about the way A.D.H.D. diagnosis rates are rising as school funding drops — because it is horrifying to imagine a bunch of elementary schoolers hopped up on speed that's doing god knows what to their little brains (well, we know that some reported side effects include growth suppression, increased blood pressure and psychotic episodes; we'll get to that in a second) — but it all depends on how you measure success. Is the end goal a perfectly clear bloodstream or good grades against the odds? Some parents (and doctors) would choose the latter.

"We as a society have been unwilling to invest in very effective nonpharmaceutical interventions for these children and their families," Dr. Ramesh Raghavan, a child mental-health services researcher at Washington University in St. Louis and an expert in prescription drug use among low-income children, told the  Times. "We are effectively forcing local community psychiatrists to use the only tool at their disposal, which is psychotropic medications."

The negative effects on the kids in this story, both emotionally and physically, are heartbreaking. "My kids don't want to take it, but I told them, ‘These are your grades when you're taking it, this is when you don't,' and they understood," said one parent who added that Medicaid covers almost all of her prescription costs. (Too bad they don't cover tutors or therapy instead...) And then there's this terrible anecdote about 11-year-old Quintn, one of five children who take more types of pills (Adderall, Risperdal, Clonidine) than the women in Valley of the Dolls:
When puberty's chemical maelstrom began at about 10, though, Quintn got into fights at school because, he said, other children were insulting his mother. The problem was, they were not; Quintn was seeing people and hearing voices that were not there, a rare but recognized side effect of Adderall. After Quintn admitted to being suicidal, Dr. Anderson prescribed a week in a local psychiatric hospital, and a switch to Risperdal.
After that, Quintn's parents flushed all of their pharmaceuticals down the toilet and vowed never to give their kids prescription speed ever again. Just kidding! They actually kept giving their 12-year-old daughter and 9-year-old son Adderall, to help their grades and because their daughter was "a little blah." Her dad acknowledged that this was a "cosmetic" fix (I'll say; I've heard better justifications from cokeheads), but said, "If they're feeling positive, happy, socializing more, and it's helping them, why wouldn't you? Why not?"

That's exactly how I felt about taking Adderall in college. I'd pop one every few months or so, usually during finals if I had a long paper to write, because Adderall made the process so much easier and so much more enjoyable. Every time I took it, I'd eventually get swept up in an inner debate about performance-enhancing drugs (made much more intense by said drugs, of course): if Adderall was so helpful, why didn't I get my own prescription and take it on a more regular basis? Who was to say I didn't need it? What did "need" really mean, anyway?

I never got one, because I hated the way I always eventually crashed after an Adderall-fueled writing session — as productive as those were — and I didn't want to become dependent on something I knew was bad for me and that I could do without. But at least I was a 20-year-old adult at the time, able to make my own decisions, not a little kid with a developing brain. That's exactly what Dr. William Graf, a pediatrician and child neurologist who works with poor families, said he was concerned about: the "authenticity of development."

"These children are still in the developmental phase, and we still don't know how these drugs biologically affect the developing brain," he told the Times. "There's an obligation for parents, doctors and teachers to respect the authenticity issue, and I'm not sure that's always happening."

But, again, how can we expect parents whose children are flailing in deficient schools to prioritize the intangible concept of "authentic development" over the quick fix offered by drugs like Adderall? Realistically, we can't.

Documentary - A World of Art: The Metropolitan Museum of Art


 Very cool, via Top Documentary Films.

A World of Art: The Metropolitan Museum of Art


What makes a masterpiece? In this visually stunning high definition production, A World of Art, the magnificence of America’s premier art museum lights up the screen.

One of the architectural glories of New York, the Met stretches 1000 feet along Fifth Avenue. Inside is a dazzling three dimensional encyclopedia of world art, radiating 5,000 years of artistic history.

Founded in 1870, the Metropolitan Museum of Art was built on the shoulders of capitalism: J.P. Morgan, Havemeyer, Lehman, Rockefeller, and Annenberg are just a few of the names behind the Met’s collections.

The Met is the largest art museum in the United States and holds one of the world’s most significant art collections. Its permanent collection contains more than two million works, divided among nineteen curatorial departments. The main building, located on the eastern edge of Central Park along Manhattan’s Museum Mile, is by area one of the world’s largest art galleries.

Represented in the permanent collection are works of art from classical antiquity and Ancient Egypt, paintings and sculptures from nearly all the European masters, and an extensive collection of American and modern art. The Met also maintains extensive holdings of African, Asian, Oceanic, Byzantine, and Islamic art.

The museum is also home to encyclopedic collections of musical instruments, costumes and accessories, and antique weapons and armor from around the world. Several notable interiors, ranging from 1st-century Rome through modern American design, are permanently installed in the Met’s galleries.

Watch the full documentary now

The Six Paramitas, by Ringu Tulku


This nice collection of six videos on the Six Paramitas (perfections) was posted at Buddhism Now - enjoy. A little background, via Wikipedia:
In Mahāyāna Buddhism, the Prajñāpāramitā Sūtras, the Lotus Sutra (Skt., Saddharma Puṇḍarīka Sūtra), and a large number of other texts list the six perfections as (original terms in Sanskrit):
  1. Dāna pāramitā: generosity, giving of oneself (in Chinese, Korean, and Japanese, 布施波羅蜜; in Wylie Tibetan, sbyin-pa)
  2. Śīla pāramitā : virtue, morality, discipline, proper conduct (持戒波羅蜜; tshul-khrims)
  3. Kṣānti (kshanti) pāramitā : patience, tolerance, forbearance, acceptance, endurance (忍辱波羅蜜, bzod-pa)
  4. Vīrya pāramitā : energy, diligence, vigor, effort (精進波羅蜜, brtson-’grus)
  5. Dhyāna pāramitā : one-pointed concentration, contemplation (禪定波羅蜜, bsam-gtan)
  6. Prajñā pāramitā : wisdom, insight (智慧波羅蜜, shes-rab)
Note that this list is also mentioned by the Theravāda commentator Dhammapāla, who says it is equivalent to the Theravāda list of ten perfections given below.
In the Ten Stages (Daśabhūmika) Sutra, four more pāramitās are listed:
7. Upāya pāramitā: skillful means
8. Praṇidhāna pāramitā: vow, resolution, aspiration, determination
9. Bala pāramitā: spiritual power
10. Jñāna pāramitā: knowledge
__

In the Pāli canon's Buddhavaṃsa, the Ten Perfections (dasa pāramiyo) are (original terms in Pāli):
  1. Dāna pāramī : generosity, giving of oneself
  2. Sīla pāramī : virtue, morality, proper conduct
  3. Nekkhamma pāramī : renunciation
  4. Paññā pāramī : transcendental wisdom, insight
  5. Viriya (also spelled vīriya) pāramī : energy, diligence, vigour, effort
  6. Khanti pāramī : patience, tolerance, forbearance, acceptance, endurance
  7. Sacca pāramī : truthfulness, honesty
  8. Adhiṭṭhāna (adhitthana) pāramī : determination, resolution
  9. Mettā pāramī : loving-kindness
  10. Upekkhā (also spelled upekhā) pāramī : equanimity, serenity
Two of the above virtues, mettā and upekkhā, are also two of the four immeasurables (brahmavihāra).
So here is the teaching from Ringu Tulku Rinpoche.

The Six Paramitas, by Ringu Tulku



Six short films on the Six Paramitas; Giving, Conduct, Patience, Diligence, Meditation, and Wisdom. Ringu Tulku speaks most clearly and eloquently, laying out the basis of Buddhism and the path to take for those who wish to practise.

Ringu Tulku Rinpoche is a Tibetan Buddhist monk of the Kagyu Order. He was trained in all schools of Tibetan Buddhism and has served as Tibetan Textbook Writer and Professor of Tibetan Studies in Sikkim for twenty-five years.

The Perfection of Giving



The Perfection of Conduct


The Perfection of Patience


The Perfection of Diligence


The Perfection of Meditation


The Perfection of Wisdom



Tuesday, October 09, 2012

Simon J Makin - The Story of a Lonely Brain (Scientific American)


I have been reading quite a bit of infant development research and I have become pretty well convinced now that we have totally and completely underestimated the sentience of infants. They are not born completely lacking a self and in some form of primordial fusion state. They are actually able to manipulate their environments (primary narcissism - Freud / archaic narcissism - Kohut) in ways that would make an adult narcissist (secondary narcissism - Freud / narcissistic personality disorder - Kohut) envious.

More importantly, the last 30 years of research demonstrates the incredible power of nurture in the formation of self and identity. Some traits may be character-based, but much of what we become, from affect control to self-image, is the result of our primary relationships with caregivers during the first three years of life, and especially the first year (when all learning and identity formation is pre-verbal).

Yes, this provides added weight to the role of the primary caregiver - but as Winnicott showed, we do not need perfect parents, just "good enough" parents.

The Story of a Lonely Brain

October 1, 2012

Prologue:
Humans are born to a longer period of total dependence than any other animal we know of, and we also know that mistreatment or neglect during this time often leads to social, emotional, cognitive and mental health problems in later life. It’s not hard to imagine how a lack of proper stimulation in our earliest years – everything from rich sensory experiences and language exposure to love and care – might adversely affect our development, but scientists have only recently started to pull back the curtain on the genetic, molecular and cellular mechanisms that might explain how these effects arise in the brain.

Chapter 1: In which we encounter a menagerie
You’ll often hear it said that human beings are “social animals”. What biologists tend to mean by that phrase is behaviour like long-lasting relationships or some kind of society, whether that’s the social hierarchy of gorillas or the extreme organisation of bees and ants. But, to an extent, most animals are social. A mother usually bonds with her offspring in any species of bird or mammal you care to mention, and almost all animals indulge in some kind of social behaviour when they mate.

But there is another sense in which most animals seem to be fundamentally social. There is an emerging scientific understanding of the way social experience moulds the biochemistry of the brain and it looks like most species don’t just prefer the company of others – they need it to develop properly. Take that staple of genetics research, drosophila – aka the fruit fly. While they are not as social as primates or bees, they are more social than you might think, and there have been studies showing that social isolation can disrupt their mating behaviour or even reduce their lifespan.

Related effects have been seen in a range of animals, from rats through pigs, right up to our close relative, the rhesus monkey. Harry Harlow’s controversial experiments on rhesus monkeys in the 60s showed that prolonged isolation in the first year of life caused severe psychological disturbance, and a more recent study found that rhesus monkeys raised alone performed worse on memory and learning tasks and had smaller corpus callosums than monkeys raised in a social environment. The corpus callosum is a bundle of neural fibres that connects the two hemispheres of the brain, and those of the isolated monkeys were smaller because they contained significantly fewer “cross-hemispheric projections”.

Chapter 2: A matter of two types of matter

Oligodendrocyte, axon and myelin sheath

The corpus callosum is also the largest white matter structure in the brain. The brain’s grey matter is neurons, the brain cells that do the heavy lifting of neural processing. The main function of white matter, composed of glial cells and myelinated axons, is to pass messages between different grey matter areas. Axons are the output cables of neurons, carrying electrical impulses to synapses on other neurons, where they influence whether those neurons fire impulses along their own axons, and so on, in the unimaginably complex web of interconnections we call a brain.

Glial cells play a largely supportive role, protecting and nourishing neurons, but a type of glial cell known as an oligodendrocyte is also responsible for producing myelin. This fatty white substance insulates axons with a myelin sheath, speeding up the transmission of electrical impulses by a factor of 50 or more compared to unmyelinated axons. The speed of transmission is likely to be crucial: although neurons have only one axon, they receive input at multiple synapses, probably from axons of very different lengths. So if the chance of a neuron firing depends on a number of impulses arriving simultaneously, transmission along those different-length axons needs to be precisely timed.
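
To get a rough sense of why that timing matters, here is a back-of-the-envelope illustration (the figures are my own assumptions, not from the article): suppose an unmyelinated axon conducts impulses at roughly 1 metre per second, and that myelination speeds this up fifty-fold, to about 50 metres per second. Over a 10 cm axon:

travel time, unmyelinated: 0.1 m ÷ 1 m/s = 100 ms
travel time, myelinated: 0.1 m ÷ 50 m/s = 2 ms

A tenth of a second versus a couple of milliseconds is the difference between inputs arriving at a neuron together and arriving hopelessly out of step.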

At birth the brain is full of grey matter, but white matter is common in only a few places. As we grow, the volume of grey matter tends to increase until puberty, and then starts to decrease. White matter, on the other hand, continually increases, and the process of myelination continues into our late 20s. This suggests a growth spurt of neurons in childhood, followed by a period of pruning and consolidation in adolescence that continues into adulthood. A lot of this is probably biologically set in stone, but the fine details are no doubt shaped by our life experiences. White matter development has to follow grey to an extent, of course, since you can’t insulate an axon that isn’t there, and myelination presumably continues into adulthood as axons continue to sprout new branches and prune others in response to experience.

Chapter 3: The orphans
White matter is also a suspect in the effects of neglect on children. A study of seven children adopted into US families from Romanian institutions used diffusion tensor imaging (a type of MRI) to examine white matter bundles connecting various brain regions. Compared to a group of normally raised children, the adopted kids had less of this white matter, particularly in something called the uncinate fasciculus, which connects parts of the brain’s temporal lobe, including the amygdala, with parts of the prefrontal cortex. The paper makes no specific claims beyond “a structural change” in this white matter “tract”, but it’s worth noting that the amygdala plays a role in memory and emotional reactions and the prefrontal cortex is important for decision making and social behaviour.
The tragedy of Romanian orphans provided psychologists with a “natural experiment” to study the effects of early social deprivation, and the Bucharest Early Intervention Project was the first randomised controlled trial to study the benefits of transfer to foster care for very young institutionalised children. A total of 136 children were assessed with a comprehensive battery of tests before being randomly assigned to either remain in an institution or be placed in foster care, at an average age of just under two.[1] The project has generated many publications showing benefits in everything from cognitive development to psychiatric outcomes for the children placed in foster care, with the most benefit seen for the youngest children – especially if they were less than two when they were removed from an institution.

One study published just last month found differences in the volume of both grey and white matter between children who remained in an institution, those who were placed in foster care and those who had never been institutionalised. Children who had been in an institution had less grey matter, regardless of whether or not they were later placed in foster care. The foster care group, however, did not have less white matter than the group who had never been institutionalised, whereas those who remained in an institution did. This suggests that placement in foster care at an early enough age might allow white matter development to “catch up” in children moved into better environments. So, since the earlier studies demonstrated a range of beneficial effects of foster care, it doesn’t seem like too much of a leap to suppose that those effects are due to changes in white matter. That’s a long way from scientific proof, but it’s a reasonable working hypothesis.

It’s also worth mentioning that although the orphans adopted by US families mentioned above had differences in white matter which hadn’t been reversed by foster care, the structure involved, the uncinate fasciculus, is the last major white matter tract to mature and the only one that is still developing beyond the age of 30.

Chapter 4: The experiment
But none of this tells us anything about how social deprivation influences white matter, which brings us to the protagonist of this story. Observations such as these led a team at Boston Children’s Hospital, led by neurobiologist Gabriel Corfas, to investigate the effects of social isolation on mice bioengineered to develop fluorescent oligodendrocytes – the cells that produce myelin.

Beginning at an age of three weeks, just after weaning, the mice were split into groups: one group was isolated and another was housed normally, with four mice per cage. After four weeks they were tested, and the isolated mice performed significantly worse on measures of social interaction and working memory – two behaviours believed to depend on the medial prefrontal cortex (mPFC). Two weeks later, the researchers inspected the oligodendrocytes in the mPFCs of the mice. Sure enough, although the number of cells was the same in both groups, the isolated mice had oligodendrocytes that were stunted, with simpler shapes and fewer branches, and two genes that produce proteins important for myelination were turned on, or expressed, less in the isolated mice’s mPFCs. Electron microscopy also revealed significantly thinner myelin sheaths.

Examining normally housed mice showed that the first two weeks of this six-week period were when a lot of the oligodendrocyte development in the mPFC went on, and analysing mice after just two weeks of isolation revealed defects similar to those in mice isolated for the full term.

So the team ran another experiment. They isolated some three-week-old mice for just the first two weeks before returning them to normal housing, and took another group from normal housing at five weeks old and isolated them for only the last four weeks. The mice isolated for the last four weeks were indistinguishable from mice housed normally throughout, whereas the mice isolated for the first two weeks showed the same retarded myelination and reduced performance as mice isolated for the full six weeks – even after being returned to normal housing for the last four weeks.

In other words, these first two weeks (the fourth and fifth of a mouse’s life) appear to be what is known as a “critical period” for oligodendrocyte development and myelination in the mPFC, and lack of social interaction during this time retards that development, which, in turn, causes problems with memory and social behaviour.

Chapter 5: In which we follow a pathway

Oligodendrocyte transfected with green fluorescent protein

But how does this happen? Well, to dig any deeper we need to know about cell signalling pathways. Certain genes code for proteins, which means they produce that protein if expressed. Proteins have a huge number of different functions, but one of them is to transmit a signal from one cell to another. The protein binds to a receptor on the destination cell, and – well, something happens. Again, this can take many forms, but in the context of brain development one likely result is cell differentiation, where a simpler cell changes into a more complex, specialised one, though a variety of other things can happen too.

The NRG1 gene codes for a protein essential for brain development called neuregulin-1, which happens to bind to a class of receptors on oligodendrocytes called ErbB. Neuroscientists have known for some time that this signalling pathway is important for oligodendrocyte development, but Klaus-Armin Nave of the Max Planck Institute for Experimental Medicine in Göttingen published a paper in 2006 showing that it also controls the amount of myelin produced by Schwann cells – the peripheral nervous system’s version of oligodendrocytes. Shortly after, another study from Boston Children’s Hospital showed that if specific ErbB receptors in mice are blocked, oligodendrocytes are stunted, myelin is thinner and impulses travel more slowly down axons.

To investigate the influence of this cellular control mechanism on myelination in the mPFC, the researchers genetically engineered mice in which they could eliminate certain ErbB receptors with a drug. They found that if they did this before the two week critical period, mice housed normally had the same reduction in myelination and lower performance as ordinary mice that had been isolated. If they did it after the critical period it had no such effect. So it seems the neuregulin-to-ErbB binding mechanism needs to be working for mice to benefit from normal social interaction during the critical period, but after that it doesn’t matter. Finally, the team compared levels of all the components of this mechanism in mice isolated during this period with levels in regularly housed mice. The isolated mice had less of a specific kind of neuregulin-1, known as type III.

To check that all these effects of isolation weren’t just part of a decline across the board, the researchers also looked at general physical activity and at changes in gene expression in the motor cortex, which controls movement and sits right next to the mPFC in the brain. They found no changes in either measure, showing that the effects of isolation were specific rather than global.

So let’s recap. Mice deprived of company during a sensitive time in their young lives express less of a certain protein in their prefrontal cortex. This interferes with a signalling mechanism involved in oligodendrocyte development and myelination. Reduced myelination alters the speed at which neural impulses travel down axons, which messes with the delicate timing of neural processing. This leads to problems with working memory and social behaviour, which aren’t reversed by reintroducing the mice back into society. And there you have it.

Chapter 6: The lonely brain
Well, actually no. This is the brain we’re talking about, and nothing about the brain is ever simple. For a start, there are other interpretations of how myelin deficiencies could cause behavioural changes. Corfas himself has shown that blocking ErbB receptors causes changes in dopamine signalling, which could offer an alternative explanation for the effects of isolation on behaviour.

Secondly, this isn’t the only mechanism that’s been proposed to account for the effects of experience on myelination. R Douglas Fields and colleagues published a paper last year showing that a neurotransmitter called glutamate, released along axons in response to electrical activity, increases myelination by stimulating production of myelin proteins in oligodendrocytes. This was only done in a culture dish, not in a live animal, but it’s an appealing account of how experience might shape brain development: electrical activity in axons is neural activity, and neural activity is experience, so it offers an intuitive mechanism for the effects of experience on white matter. This is one of the thorny questions in this area: does experience shape white matter, which then shapes neural activity, or does neural activity shape white matter? The answer is probably both.

Also, nobody is suggesting the effects of social isolation are confined to white matter and myelination. The brain is a staggeringly complex thing, and something as broad as social experience, or the lack of it, is likely to have a wide range of consequences. Remember the orphans in chapter 3 who had less white matter than children raised in families? They also had less grey matter. And while grey matter does go through a period of decline as we develop, it’s unlikely their reduced grey matter was a sign of accelerated development: they were only around nine at the time, and so not yet at the age when that decline usually begins. It also wouldn’t really tally with the mental health problems they invariably suffered from.

As a specific example, there was a paper published this July looking at the effects of social isolation on behavioural performance in rats. It focussed on the barrel cortex, the part of a rat’s brain that receives input from its whiskers, which are an important social communication channel for rats. The researchers found that early isolation disrupted a signalling pathway involved in forming healthy neural circuitry in the barrel cortex, leading to decreased whisker sensitivity and deficiencies in “whisker-related” behaviour. This had nothing to do with white matter and everything to do with grey matter.

Epilogue:
Schizophrenia and mood disorders usually crop up in adolescence and have been linked to disturbances in white matter and myelination, and to disruptions of the neuregulin-1-to-ErbB signalling pathway. Corfas has also shown that disruption of oligodendrocyte genes involving our old friend neuregulin causes schizophrenia-like behaviour in mice. All of which suggests that this latest study may shed light on new ways to understand and possibly even treat these conditions.

Corfas’ team is also currently looking at ways to target neuregulin-1 and ErbB signalling pathways with drugs that might stimulate myelination. It’s conceivable that this could eventually lead to ways to treat the debilitating effects of early social deprivation, but we will have to proceed very cautiously down this particular path as too much myelination is likely to be as bad as too little.

Footnote:
[1] (Before you start worrying about the morality of randomly assigning orphans to foster care, be assured this study went to great lengths to ensure it was a force for good. The study recruited its own foster parents, as “government-sponsored foster care was limited to about one family” and none of the children received less care than they would have if the study hadn’t been conducted. Also, crucially, following the ethical guideline that “no subject should be randomised to an intervention known to be inferior to the standard of care”, the researchers point out: “at the start of our study there was uncertainty about the relative merits of institutional and foster care in the Romanian child welfare community, with a historical bias in favour of institutional care.” Government officials and child protection professionals were kept informed of the results as the study progressed, with the result that the Romanian government eventually passed laws prohibiting placing infants younger than two in institutions. If you’re still not convinced, read this.)

Images: Neuron with oligodendrocyte and myelin sheath by Andrew c; Oligodendrocyte by Jurjen Broeke.

Simon J Makin 
About the Author: Simon J Makin is an auditory perception researcher turned science writer and journalist. Originally from Liverpool in the north of England, he has a bachelor's in engineering, a master's in Speech and Hearing Sciences, and a PhD in computational auditory modelling from The University of Sheffield. He spent several years working as a research fellow in the psychology department at The University of Reading before recently branching out and retraining in journalism. Follow on Twitter @SimonMakin.