Saturday, May 07, 2011

The Dalai Lama - Having Strong Consideration for Others' Happiness and Welfare

25th Anniversary Edition

by the Fourteenth Dalai Lama
His Holiness Tenzin Gyatso,
edited and translated by Jeffrey Hopkins,
co-edited by Elizabeth Napper

Dalai Lama Quote of the Week

In order to have strong consideration for others' happiness and welfare, it is necessary to have a special altruistic attitude in which you take upon yourself the burden of helping others. In order to generate such an unusual attitude, it is necessary to have great compassion, caring about the suffering of others and wanting to do something about it. In order to have such a strong force of compassion, first you must have a strong sense of love which, upon observing suffering sentient beings, wishes that they have happiness--finding a pleasantness in everyone and wishing happiness for everyone just as a mother does for her sole sweet child.

In order to have a sense of closeness and dearness for others, you first train in acknowledging their kindness through using as a model a person in this lifetime who was very kind to yourself and then extending this sense of gratitude to all beings. Since, in general, in this life your mother was the closest and offered the most help, the process of meditation begins with recognizing all other sentient beings as like your mother. (p.44)

--from Kindness, Clarity, and Insight 25th Anniversary Edition by The Fourteenth Dalai Lama, His Holiness Tenzin Gyatso, edited and translated by Jeffrey Hopkins, co-edited by Elizabeth Napper, published by Snow Lion Publications

Kindness, Clarity, and Insight • Now at 50% off
(Good until May 13th).

Furtherfield - An interview with Michel Bauwens founder of Foundation for P2P Alternatives

Cool -

An interview with Michel Bauwens founder of Foundation for P2P Alternatives

Michel Bauwens is one of the foremost thinkers on the peer-to-peer phenomenon. Belgian-born and currently resident in Chiang Mai, Thailand, he is the founder of the Foundation for P2P Alternatives.

It's a commonplace now that the peer-to-peer movement opens up new ways of creating and relating to others. But you've explored the implications of P2P in depth, in particular its social and political dimensions. If I understand right, for you the phenomenon represents a new condition of capitalism, and I'm interested in how that new condition impacts on the development of culture - in art and also architecture and urban form.

As a bit of a background, I'd like to look at what you've identified as the simultaneous "immanence" and "transcendence" of P2P: it's interdependent with capital, but also opposed to it through the basic notion of the Commons. Could you elaborate on this?

By immanence, I mean that peer production currently co-exists within capitalism and is used by, and beneficial to, capital. Contemporary capitalism could not exist without the input of free social cooperation, which creates a surplus of value that capital can monetize and use in its accumulation processes. This is very similar to the coloni, early serfs, being used by the slave-based Roman Empire and its elite, and to capitalism being used by feudal forces to strengthen their own system.

BUT, equally important is that peer production also has within itself elements that are anti-, non- and post-capitalist. Peer production is based on the abundance logic of digital reproduction, and what is abundant lies outside the market mechanism. It is based on free contributions that lie outside of the labour-capital relationship. It creates a commons that is outside commodification and is based on sharing practices that contradict the neoliberal and neoclassical view of human anthropology. Peer production creates use value directly, which can only be partially monetized in its periphery, contradicting the basic mechanism of capitalism, which is production for exchange value.

So, just like serfdom and capitalism before it, peer production is a new hyperproductive modality of value creation that has the potential to break through the limits of capitalism, and can be the seed form of a new civilisational order.

In fact, it is my thesis that it is precisely because this new modality is necessary for the survival of capitalism that it will be strengthened, giving it the opportunity to move from emergence to parity level, and eventually lead to a phase transition. So, the Commons can be part of a capitalist world order, but it can also be the core of a new political economy, under which market processes are subsumed.

And how do you see this condition - the relationship to capital - coming to a head?

I have a certain idea about the timing of the potential transition. Today, we are clearly at the point of emergence, one that coincides with a systemic crisis of capitalism and the end of a Kondratieff wave.

There are two possible scenarios in my mind. The first is that capital successfully integrates the main innovations of peer production on its own terms, and makes them the basis of a new wave of growth - say, a green capitalist wave. This would require a successful transition away from neoliberalism, the existence of a strong social movement which can push a new social contract, and an enlightened leadership which can reconfigure capitalism on this new basis. This is what I call the high road. However, given the serious ecological and resource crises, this can last at most 2-3 decades. At that stage, we will have both a new crisis of capitalism and a much stronger social structure oriented around peer production, which will have reached what I call parity level, and can hence be the basis of a potential phase transition.
Read the whole interview.

NPR - Robert Johnson At 100, Still Dispelling Myths

So much of contemporary rock-n-roll owes its soul to Robert Johnson and other early blues guitar pioneers - there would have been no Elvis, Rolling Stones, Cream, Led Zeppelin, or even Black Sabbath without those early blues musicians.

Of all those early musicians, Robert Johnson, with his rumored deal with the devil (to explain how he mastered the guitar in six months), is one of the most enigmatic - he died so young, at 27, and under such curious circumstances that his life has become legend (think of the 1986 movie Crossroads).

The sidebar for this story contains other NPR segments on Johnson.

Featured Artist

Robert Johnson

The audio will be up in a couple of hours (9 am Pacific, 12 pm Eastern).


One of the two known photos of Robert Johnson. This portrait was taken by the Hooks Bros. Photography Company in Memphis, Tenn., circa 1935. (Courtesy of the Delta Haze Corporation)

Sunday marks the 100th anniversary of the birth of Robert Johnson. Although he recorded just 29 songs, the bluesman had a huge influence on guitarists such as Eric Clapton and Keith Richards. Johnson is one of the most studied of all country blues musicians, and he's been the subject of many books, films and essays. But the mythology surrounding his life just won't go away.

If you know anything about Johnson, chances are it's the story that he sold his soul to the devil at the crossroads in exchange for his musical talent. That legend reached a mainstream audience with the 1986 movie Crossroads, starring Joe Seneca and Ralph Macchio.

But according to folklorist Barry Lee Pearson, it didn't happen.

"The popular mythology has him as a total loner," Pearson says, "and kind of lived this life in regret as a repayment for his alleged sin of making a contract with Old Scratch."

Pearson, a professor at the University of Maryland and the co-author of the book Robert Johnson: Lost and Found, says none of it is true. In the absence of any real biographical information, Pearson says early blues writers got a little carried away.

"Everybody was so anxious to make this devil story true that they've been working on finding little details that can corroborate it," he says.

Here is what we do know about Robert Johnson. He said he was born in Mississippi on May 8, 1911, and grew up on a plantation in the Delta. As a young man, he was more interested in music than farming: He'd hound the older blues musicians for a chance to play. In an interview included in the 1997 documentary Can't You Hear the Wind Howl, Son House recalls that the young Johnson would annoy audiences with his lousy guitar playing.

"Folks they come and say, 'Why don't you go out and make that boy put that thing down? He running us crazy,' " House said. "Finally he left. He run off from his mother and father, and went over in Arkansas some place or other."

When Johnson came back from Arkansas six months later, he'd mastered the guitar. That's where the rumors about his deal with the devil came from, but Johnson acknowledged studying with a human teacher while he was gone. After that, Johnson worked as a traveling musician, playing on street corners and in juke joints, mostly in Mississippi. And in 1936, he got a chance to record in Texas.

"Terraplane Blues" was a minor hit, and he was invited back for a second recording session. Johnson died a year later at age 27, under mysterious circumstances. Some think he was poisoned, although a note on the back of his death certificate says the cause was syphilis.

In any case, the timing was tragic. Legendary Columbia Records talent scout John Hammond wanted to book Johnson at Carnegie Hall for the landmark "Spirituals to Swing" concert in 1938. Hammond was also the driving force behind the first LP reissue of Johnson's music in 1961. At the time, Johnson was so obscure that Columbia didn't even have a picture of him to put on the cover. The LP was produced by Frank Driggs, who also wrote the liner notes.

"If you read the liner notes," Driggs says, "you see next to nothing. 'Cause I just created a thing out of whole cloth when I wrote the notes. Because there really was very little known about the guy."

Up Against The Wall

That LP, King of the Delta Blues Singers, introduced Johnson's music to a new generation of young, mainly white blues fans, including Eric Clapton, as the rock legend told NPR in 2004.

"It was on Columbia and it had, like, some pretty interesting sleeve notes on it about the fact that these were the only sides he had cut, and that they'd done it in a hotel room, and when he was auditioning for the sessions that he was so shy, he had to play facing into the corner of the room," Clapton says. "I mean, I immediately identified with that, because I was paralyzed with shyness as a kid."

But there may be another reason why Johnson recorded facing the wall. Elijah Wald is a musician and the author of the book Escaping the Delta: Robert Johnson and the Invention of the Blues. He says there were pre-war blues musicians who played guitar better than Johnson, as well as musicians who sang better. But Wald says that, unlike most of them, Johnson learned to play from listening to radio and records.

"Robert Johnson certainly was very conscious of what a hit record sounded like," Wald says. "If you listen to something like 'Come on in My Kitchen,' he's singing very quietly, and he actually has a moment when he says, 'Can't you hear the wind blowin'.' He whispers it and then plays this very quiet riff. That never would have worked on a street corner or a Mississippi juke joint, but it sounds great on records."

Sound is one of the main things that distinguishes Johnson's sides from other records of the time. Wald says that, by facing the wall, Johnson might have made his vocals sound better to a later generation accustomed to high fidelity. It doesn't hurt that the original masters of his recordings survived, too. But what really set Johnson apart from his peers was all of the mythology that grew up around him, especially the part about the devil. Many of Johnson's friends, including Johnny Shines in Can't You Hear the Wind Howl, dismissed it as false.

"No," Shimes says, "he never told me that lie. If he would've, I would've called him a liar right to his face. You have no control over your soul. How you gonna do anything with your soul?"

But the myth about Johnson persists, in part because it helps sell records. Steve Berkowitz is a producer at Sony Legacy, which is reissuing Johnson's music again, this time in a new centennial edition.

"That was always the heart and soul of the marketing plan," Berkowitz says. "We always knew the music was great. But a guy sells his soul to the devil at midnight down at the crossroads, comes back and plays the hell out of the guitar, and then he dies. I mean, it's a spectacular story."

And there wouldn't be any harm in that, Wald says, except that the legend tends to overshadow the real Robert Johnson.

"To just say that he went to the crossroads in the dead of night, first of all means we're not getting what happened. And second of all, it's kind of insulting," Wald says. "It's kind of implying that, unlike us who do this serious work to understand music, these old black blues guys just went and sold their soul to the devil."

If it were really that easy, Wald says, the devil would own the souls of every teenage boy and girl in America.

"Me and the Devil Blues"

"Sweet Home Chicago"

Friday, May 06, 2011

Slavoj Zizek - Communism is in its very notion anti-statist

This quote was posted by Ed Berge in the Integral Postmetaphysical Spirituality forum. It comes from "The Un-Shock Doctrine" by Slavoj Zizek, in the current issue of Guernica - in this piece he advocates for the demise of capitalism and the emergence of a truly stateless communism (which to me looks a lot like the commons and P2P movements).

There is a lot to like here, but I seriously question his notion of the "eternal Idea of communism" - there is nothing eternal about it. He needs to go back and read some more Habermas.

Here is the excerpt that was posted (which I find intriguing, and which I think is longer than what was posted at IPS):

The Left today faces the difficult task of emphasizing that we are dealing with political economy—that there is nothing “natural” in the present crisis, that the existing global economic system relies on a series of political decisions—while simultaneously acknowledging that, insofar as we remain within the capitalist system, violating its rules will indeed cause economic breakdown, since the system obeys a pseudo-natural logic of its own. So, although we are clearly entering a new phase of enhanced exploitation, facilitated by global market conditions (outsourcing, etc.), we should also bear in mind that this is not the result of an evil plot by capitalists, but an urgency imposed by the functioning of the system itself, always on the brink of financial collapse. For this reason, what is now required is not a moralizing critique of capitalism, but the full re-affirmation of the Idea of communism.

The idea of communism, as elaborated by Alain Badiou, remains a Kantian regulative idea lacking any mediation with historical reality. Badiou emphatically rejects any such mediation as a regression to an historicist evolutionism which betrays the purity of the Idea, reducing it to a positive order of Being (the Revolution conceived as a moment of the positive historical process). This Kantian mode of reference effectively allows us to characterize Badiou’s deployment of the “communist hypothesis” as a Kritik der reinen Kommunismus. As such, it invites us to repeat the passage from Kant to Hegel—to re-conceive the Idea of communism as an Idea in the Hegelian sense, that is, as an Idea which is in the process of its own actualization. The Idea that “makes itself what it is” is thus no longer a concept opposed to reality as its lifeless shadow, but one which gives reality and existence to itself. Recall Hegel’s infamous “idealist” formula according to which Spirit is its own result, the product of itself. Such statements usually provoke sarcastic “materialist” comments (“so it is not actual people who think and realize ideas, but Spirit itself, which, like Baron Munchausen, pulls itself up by its own hair”). But consider, for example, a religious Idea which catches the spirit of the masses and becomes a major historical force? In a way, is this not a case of an Idea actualizing itself, becoming a “product of itself”? Does it not, in a kind of closed loop, motivate people to fight for it and to realize it? What the notion of the Idea as a product of itself makes visible is thus not a process of idealist self-engendering, but the materialist fact that an Idea exists only in and through the activity of the individuals engaged with it and motivated by it. 
What we have here is emphatically not the kind of historicist/evolutionist position that Badiou rejects, but something much more radical: an insight into how historical reality itself is not a positive order, but a “not-all” which points towards its own future. It is this inclusion of the future as the gap in the present order that renders the latter “not-all,” ontologically incomplete, and thus explodes the self-enclosure of the historicist/ evolutionary process. In short, it is this gap which enables us to distinguish historicity proper from historicism.

Why, then, the Idea of communism? For three reasons, which echo the Lacanian triad of the I-S-R: at the Imaginary level, because it is necessary to maintain continuity with the long tradition of radical millenarian and egalitarian rebellions; at the Symbolic level, because we need to determine the precise conditions under which, in each historical epoch, the space for communism may be opened up; finally, at the level of the Real, because we must assume the harshness of what Badiou calls the eternal communist invariants (egalitarian justice, voluntarism, terror, “trust in the people”). Such an Idea of communism is clearly opposed to socialism, which is precisely not an Idea, but a vague communitarian notion applicable to all kinds of organic social bonds, from spiritualized ideas of solidarity (“we are all part of the same body”) right up to fascist corporatism. The Really Existing Socialist states were precisely that: positively existing states, whereas communism is in its very notion anti-statist.


Where does this eternal communist Idea come from? Is it part of human nature, or, as Habermasians propose, an ethical premise (of equality or reciprocal recognition) inscribed into the universal symbolic order? Its eternal character cannot, after all, be accounted for by specific historical conditions. The key to resolving this problem is to focus on that against which the communist Idea rebels: namely, the hierarchical social body whose ideology was first formulated in great sacred texts such as The Book of Manu. As was demonstrated by Louis Dumont in his Homo hierarchicus, social hierarchy is always inconsistent; that is, its very structure relies on a paradoxical reversal (the higher sphere is, of course, higher than the lower, but, within the lower order, the lower is higher than the higher) on account of which the social hierarchy can never fully encompass all its elements. It is this constitutive inconsistency that gives birth to what Rancière calls “the part of no-part,” that singular element which remains out of place in the hierarchical order, and, as such, functions as a singular universal, giving body to the universality of the society in question. The communist Idea, then, is the eternal demand co-substantial with this element that lacks its proper place in the social hierarchy (“we are nothing, and we want to be all”).

Our task is thus to remain faithful to this eternal Idea of communism: to the egalitarian spirit kept alive over thousands of years in revolts and utopian dreams, in radical movements from Spartacus to Thomas Müntzer, including within the great religions (Buddhism versus Hinduism, Daoism or Legalism versus Confucianism, etc.). The problem is how to avoid the choice between radical social uprisings which end in defeat, unable to stabilize themselves in a new order, and the retreat into an ideal displaced to a domain outside social reality (for Buddhism we are all equal—in nirvana). It is here that the originality of Western thought becomes clear, particularly in its three great historical ruptures: Greek philosophy’s break with the mythical universe; Christianity’s break with the pagan universe; and modern democracy’s break with traditional authority. In each case, the egalitarian spirit is transposed into a new positive order (limited, but nonetheless actual).

The democratic axiom is that the place of power is empty, that there is no one directly qualified for the vacancy, either by tradition, charisma, or leadership qualities.

In short, the wager of Western thought is that radical negativity (whose first and immediate expression is egalitarian terror) is not condemned to being expressed in short ecstatic outbursts after which things are returned to normal. On the contrary, radical negativity, as the undermining of every traditional hierarchy, has the potential to articulate itself in a positive order within which it acquires the stability of a new form of life. Such is the meaning of the Holy Spirit in Christianity: faith can not only be expressed in, but also exists as, the collective of believers. And this faith is itself based on “terror,” as indicated by Christ’s insistence that he brings a sword, not peace, that whoever does not hate his father and mother is not a true follower, and so on. The content of this terror thus involves the rejection of all traditional hierarchical and community ties, with the wager that a different collective link is possible—an egalitarian bond between believers connected by agape as political love.

Here is one more quote from later in the article:
Perhaps the most succinct characterization of the epoch which began with the First World War is the well-known phrase attributed to Antonio Gramsci: “The old world is dying away, and the new world struggles to come forth: now is the time of monsters.” Were Fascism and Stalinism not the twin monsters of the twentieth century, the one emerging out of the old world’s desperate attempts to survive, the other out of a misbegotten endeavor to build a new one? And what about the monsters we are engendering now, propelled by techno-gnostic dreams of a biogenetically controlled society? All the consequences should be drawn from this paradox: perhaps there is no direct passage to the New, at least not in the way we imagined it, and monsters necessarily emerge in any attempt to force that passage.
Zizek is quite the idealist - I like that about him.

On Qualia - Richard Brown and Keith Frankish

Cool discussion from PhilosophyTV. Before launching into their discussion, it might be helpful to have a brief definition of qualia, from the Stanford Encyclopedia of Philosophy:
Philosophers often use the term ‘qualia’ (singular ‘quale’) to refer to the introspectively accessible, phenomenal aspects of our mental lives. In this standard, broad sense of the term, it is difficult to deny that there are qualia. Disagreement typically centers on which mental states have qualia, whether qualia are intrinsic qualities of their bearers, and how qualia relate to the physical world both inside and outside the head. The status of qualia is hotly debated in philosophy largely because it is central to a proper understanding of the nature of consciousness. Qualia are at the very heart of the mind-body problem.
The whole entry is quite educational.

Richard Brown (left) and Keith Frankish (right) on qualia.

Suppose you’re a physicalist and you want to include qualia in your ontology. Unfortunately, “classic qualia” (intrinsic, ineffable, private properties of experience) seem incompatible with physicalism, while “zero qualia” (mere dispositions to judge that we have classic qualia) don’t seem like genuine qualia at all. After all, even zombies have zero qualia! Perhaps you can be satisfied with “diet qualia” (subjective feels of experience). But are there meaningful distinctions between diet qualia and the other two conceptions? Is the notion of diet qualia even coherent? Frankish and Brown discuss the issue.

Related works

by Brown:
“Deprioritizing the A Priori Arguments Against Physicalism” (2010)
“The Higher-Order Approach to Consciousness” (draft)

by Frankish:
“Quining Diet Qualia” (forthcoming)
“The Anti-Zombie Argument” (2007)
Consciousness (2005)

See also:
Guest posts by Frankish at the Splintered Mind

Eric Schwitzgebel - Embodied Introspection

Cool stuff from Eric Schwitzgebel at The Splintered Mind on Embodied Introspection. Andy Clark and Alva Noe are two of my favorite authors. Rob Wilson is new to me.

Here is some of Wilson's bio from the page linked to in the post below.
My research falls chiefly in the philosophy of mind and cognitive science, the philosophy of biology, and general philosophy of science, but I also have ongoing interests in ethics, metaphysics and epistemology, and 17th and 18th century philosophy. Over the last three years I have been reading and thinking about eugenics, the contemporary uses of biotechnology, and the philosophy of psychiatry, amongst other things. I am also involved in several large-scale research projects that fall under the question What Sorts of People Should There Be?, which now has a mighty fine blog that's worth checking out.
You can find some of his recent papers here.

Embodied Introspection

Embodiment is hot these days in philosophy of psychology. Andy Clark, Alva Noe, Rob Wilson, and others have argued that cognition and perception are not processes confined within the brain, but rather transpire in extended brain-body-environment systems. Tactile perception, the argument goes, is not a brain process in response to stimulation of the skin; rather, it is a looping process that includes as a part one's active bodily movement and environmental exploration. Thinking about Scrabble moves can happen perhaps entirely in the brain if one visually imagines shuffling the tiles; but if shuffling the wooden tiles with one's fingers serves a similar functional role, that fingered shuffling is as much a part of Scrabble cognition as is imaginary shuffling.

Introspection, it might seem, is not embodied in the same way. After all, you can just close your eyes and introspect, no body involved, right? Introspection seems to be entirely interior -- a kind of attunement to one's internal stream of experiences, or the activity of the brain's self-monitoring systems.

Yet I think proper appreciation of the tangle of processes that drive introspective judgments points toward treating introspection as embodied. Consider two examples . . . .
Read the whole post.

Simon Baron-Cohen on empathy and evil

Interesting discussion of Simon Baron-Cohen's new book, The Science of Evil: On Empathy and the Origins of Cruelty (the U.S. title; it was published under a different title in England).

Science Weekly Extra: Simon Baron-Cohen on empathy and evil

Simon Baron-Cohen talks to Ian Sample about his proposal that we should redefine 'evil' as an absence of empathy, outlined in his book Zero Degrees of Empathy: A New Theory of Human Cruelty

This is the extended version of the interview with Simon Baron-Cohen featured in this week's Science Weekly podcast.

Subscribe for free via iTunes to ensure every episode gets delivered. (Here is the non-iTunes URL feed).

Thursday, May 05, 2011

Thubten Chodron - A Kind Heart

by Thubten Chodron
foreword by His Holiness the Dalai Lama


Dharma Quote of the Week

A kind heart is the essential cause of happiness. Being kind to others is the nicest thing we can do for ourselves. When we respect others and are considerate of their needs, opinions and wishes, hostility evaporates. It takes two people to fight, and if we refuse to be one of them, there is no quarrel.

...A kind heart is the root of harmony and mutual respect. It prevents us from feeling estranged or fearful of others. It also protects us from becoming angry, attached, closed-minded, proud or jealous. When opportunities arise to help others we won't lack courage or compassion. If political leaders had impartial minds and kind hearts, how different our world would be!

As all problems arise from the self-cherishing attitude, it would be wise for each of us, as individuals, to exert ourselves to subdue it. World peace doesn't come from winning a war, nor can it be legislated. Peace comes through each person eliminating his or her own selfishness and developing a kind heart...we can each do our part beginning today. The beneficial result in our own lives will immediately be evident. (p.76)

--from Open Heart, Clear Mind by Thubten Chodron, foreword by His Holiness the Dalai Lama, published by Snow Lion Publications

Open Heart, Clear Mind • Now at 50% off
(Good until May 13th).


Secret Files of the Inquisition

This is cool - I didn't know the Church had released these documents. I found this at Top Documentary Films. A little context for those who may not be aware:
The Inquisition was the "fight against heretics," with the use of torture, by several institutions within the justice system of the Roman Catholic Church. It began in the 12th century, with the introduction of torture in the persecution of heresy. Inquisition practices were also used for offences against canon law other than heresy.
Ah, good times. Damned Cathars.

Secret Files of the Inquisition

Based on previously unreleased secret documents from European archives, including the Vatican, Secret Files of the Inquisition unveils the incredible true story of the Catholic Church’s 500-year struggle to remain the world’s only true Christian religion.

Filmed in High Definition, this 4-hour series spans medieval France in Episode 1, 15th century Spain in Episode 2, Renaissance Italy in Episode 3 and mid-nineteenth century Europe in Episode 4. Historians, experts and Church authorities advise on the handling of this controversial subject matter.

At the dawn of the second millennium Europe was slowly emerging from the blackness and ignorance of the Dark Ages. There were no nations and the people were loyal only to their immediate community and to God. The keeper of God’s word was the Catholic Church, the only religion in all of Christendom.

The supreme religious leader, the Pope in Rome, crowned the Kings who became rulers of the Holy Roman Empire stretching from Sicily north to Poland. The Emperor was ruler of the temporal world while the Pope and his Bishops reigned supreme over the Spiritual world.

By the 12th and 13th century, cracks began appearing in this ordered world. Emperors no longer submitted to being crowned by the Pope and across Europe Kings demanded the right to select their own Bishops. But for the Pope the most terrifying threat came from upstart Christian sects who challenged church doctrine and the absolute power of the Roman Pope.

To preserve the purity of the faith and the unquestioned authority of the Pope, the Church began to crack down on all dissent with a new weapon: the Inquisition. For over half a millennium, a system of mass terror reigned. Thousands were subjected to secret courts, torture and punishment.

Watch the full documentary now (playlist – 4 hours)
Bonus documentary in the playlist: History’s Mysteries – The Inquisition

Thomas de Zengotita - On the Politics of Pastiche and Depthless Intensities: The Case of Barack Obama

This article was published at The Hedgehog Review before the recent Osama bin Laden assassination, but I doubt that event would change anything de Zengotita has to say here. This is more about the phenomenon of Barack Obama as a cultural event and refutation of social categories.

On the Politics of Pastiche and Depthless Intensities: The Case of Barack Obama

Thomas de Zengotita

Reprinted from The Hedgehog Review 13.1 (Spring 2011). This essay may not be resold, reprinted, or redistributed for compensation of any kind without prior written permission. Please contact The Hedgehog Review for further details.

The exposition will take up in turn the following constitutive features of the postmodern: a new depthlessness…both in contemporary “theory” and in a whole new culture of the image or the simulacrum; a consequent weakening of historicity, both in our relationship to public History and in the new forms of our private temporality, whose “schizophrenic” structure (following Lacan) will determine new types of syntax…; a whole new type of emotional ground tone—what I will call “intensities”…

The disappearance of the individual subject, along with its formal consequence, the increasing unavailability of the personal style, engender the well-nigh universal practice today of what may be called pastiche.

And the stupendous proliferation of social codes today…is also a political phenomenon…advanced capitalist countries today are now a field of stylistic and discursive heterogeneity without a norm.

—Fredric Jameson 1

Tiger Woods caused a bit of a furor back in 1997 when he told Oprah Winfrey he didn’t think of himself as black or African American. It seems he had created his own ethnic category; “I’m a ‘Cablinasian,’” he declared—as in Caucasian-black-Indian-Asian, his actual ancestry, as he saw it. Tiger’s point was representative of an entitlement many young people feel nowadays—as any educator with an ear for student discourse can attest. He felt that he could choose not to identify with the categories imposed on him by “society” (a term which in this context stands in for, and evacuates, Jameson’s historicity). “I’m just who I am,” he told Oprah, “just whoever you see in front of you.”

That’s how “free to be what you want to be” young Tiger felt back then (before a reality of another kind caught up with him). But a lot of older black Americans (representative, in their way, of Jameson’s generation) were quick to chide this callow youth for trying to shrug off the iron claims of history. Even Colin Powell weighed in, admonishing Tiger that “in America…when you look like me, you’re black.”2

That little episode is not just a convenient way to show how rooted in contemporary life Fredric Jameson’s classic analysis was—and remains. It also serves as an allegory for a much more significant manifestation of postmodern culture to be considered in this essay—the case of Barack Obama. But first, a few supplements to Jameson’s analysis, some concepts that will be brought to bear on the stunning event of Obama’s election (it is easy to forget how stunning; a mark in itself of “intensity”) and the dreary undoing that followed.

The Flattery of Representation

What all media, all representations—from street signs to photographs to emoticons—have in common is this: they pay attention to you, they address you. Sometimes generically, as with street signs, sometimes precisely, as with person-specific ring tones. And all that attention is flattering—indeed, it is a form of flattery so pervasive, and so essential to the nature of representation, that it has escaped notice as such, though it ultimately accounts for the oft-remarked narcissism of our time.3 The very process by which reality and representation become fused in the age of the simulacrum is delivered to our psyches by the flattery of representation. We have been consigned by it to a new plane of being, a new kind of life-world, an environment of representations of fabulous quality and inescapable ubiquity, a place where everything is addressed to us, everything is for us, and nothing is beyond us anymore.

Virtual Revolution

During the mass-media age (roughly co-extensive with the modernist period), the hidden blandishments of representational flattery were already at work. Broadcast representations were implanting in anonymous spectators a desire for public significance commensurate with their unconscious sense of centrality—for it was, after all, to them that all performances were addressed. But celebrities were monopolizing public attention, gorging on it. The most basic of specifically human needs—the need for acknowledgment, for significance, for a place in the world—was left unsatisfied. Spectators were craving, however inchoately, their fair share of that attention. All that was lacking were the means. Until recently.

This is a piquant historical irony that would reward more extensive examination than can be given here: With the rise of narrow-cast digital media, a revolution something like the one Marx envisioned actually took place—but in the space of representation, not in the land of bricks and mortar and machinery. In that virtual space, the “means of production” simply fell into the hands of the masses. And they proceeded to produce at a furious rate. But the revolution they accomplished wasn’t about workers displacing capitalists; it was about spectators displacing celebrities—it was about “you” being named Time Magazine’s “Person of the Year.” It was a virtual revolution in the Age of Facebook.4

The Moreness of Everything and the Rise of the Mix: Surfing the Options5

In a mediated world, the opposite of real isn’t phony or artificial—it’s optional. Idiomatically, we recognize this when we say, “the reality is…,” meaning something that has to be dealt with, something that isn’t an option. Jameson’s qualities of “surface” and “intensity” are rooted in the “optionality” that constitutes our existence, in the Heideggerian sense, as Being-in-the-World—but now a mediated world. The expression “whatever” arose and caught on because it captured so precisely the bivalent attitude we must adopt in order to negotiate the environment of options that are soliciting our attention so incessantly. On the one hand, it is a feast, a world that offers an unprecedented array of possible experiences, “whatever” you want—“no limits” as the SUV and technology ads all say. On the other hand, it is a world of effects. Each of us is at the center of it, but there is a thinness to things, an insulational quality—as if the deities of DreamWorks were laboring invisibly around us, touching up the canvas of reality with digital airbrushes.

Haunting the moment of “I can experience whatever I want” is the moment of the shrug, of “what difference does it make?”—of “whatever” in that register. That moment is essential to our mobility among the options. And we need mobility among the options because they are representations—even food and shelter partake of the representational, for how we live and what we eat says so much about us.6 But just insofar as entities are representational, they are no more than they appear to be. And so they are never enough. And so we move on—choosing among options, and creating more options, in an open-ended project of perpetual self-construction. Jameson’s surfaces and intensities necessarily attend that kind of existence—and pastiche is what becomes of originality when just about everything’s already out there.

Complex processes of commodification and technological innovations under late capitalism have been driving these developments, of course. But at the level of content, the dominating effect is relatively easy to describe, though impossible to comprehend. Genres in general have collapsed under this pressure. Categories as fundamental as fact and fiction, news and entertainment, gender and sexuality, have eroded away. In literature and architecture, in cuisine, in music, in fashion and furnishings, everywhere, everything—it’s fusion and mix.

Barack Obama emerged as a literal embodiment of this age. To educated people, especially younger people with generally progressive views, other candidates suddenly looked parochial by comparison—or simply outdated. In his ethnicity and biography and in his personality and politics, Obama, the conciliator, was above all a combiner. Because he was from virtually everywhere—Kenya, Indonesia, Honolulu, Harvard, Chicago’s South Side—he was also from nowhere. The pastiche of his persona made him “his own man” in a new sense of the term.

The Fusion Candidate

Obama the candidate would never have made Tiger Woods’s silly mistake in so many words, of course. But on the screen of public culture, where significance is substance, he overlapped in many ways with the Tiger Woods who felt entitled to make a mix of his own when it came to his “own” identity. If Obama “came from nowhere”—as so many remarked at the time—it wasn’t just because he came to public attention so quickly; that has happened before. It was the way his every gesture reinforced the impression that he felt entitled to be who he wanted to be, that he took his instantaneous emergence pretty much in stride. That was why, when he rocketed to the top of the hype heap in a single year, a lot of veteran black leaders weren’t happy about it at all. He hadn’t paid his dues, they muttered. But it was clear that other identity issues were in play—his style, his way of speaking, his upbringing. The not-so-hidden reservations boiled down to a sense that maybe he wasn’t “black enough”—an ironic inversion of Colin Powell’s admonishment to Tiger. But hypocrisy was not behind that inversion. It is simply an effect of the fact that, as principles detach themselves from history, they invite increasingly improvisational application; they become more flexible, more pragmatic, adapting to the pastiche of circumstances at any given moment.

Their reservations were confirmed when Obama wasn’t immediately offended by Joe Biden’s description of him as “articulate” and “clean.” It wasn’t until the likes of Jesse Jackson and Al Sharpton expressed their consternation—barely suppressed outrage, actually—that Obama grasped the historically resonant implications of those words. A very telling moment. For one thing, it showed that he really did represent generational change, and the fact that he hadn’t lived through the 60s was as important as he claimed.

So the Obama campaign stole a page from John F. Kennedy’s playbook and made his youth and inexperience a virtue and generational change a theme. “Time to get over the 60s” became central to his message, and it resonated with a lot of people who were tired of ideological Boomers framing the national debate and hogging the limelight. On that level, Obama simply represented a fresh start—which might describe any instance of generational change in modern history, going back at least to the seventeenth century. But something more comprehensive, and more radical, was also suggested by his candidacy—an escape from history itself,7 from the historicity whose loss Jameson has mourned so persuasively. The underlying question was whether, for Obama himself, history had lost the depth and weight it once had for educated and politically engaged people. Did it also show that Obama, as a product of his own design, was himself a creature of intensities and surfaces? Most significantly, might that actually be a virtue—if it reflected the order of the day, the ethos of his time?

On the plane of representation, the 60s was to the last four decades what the French Revolution was to the first half of the nineteenth century—a looming determinant of the way events were framed and understood. The expression “culture wars” captured the nature of the ensuing conflicts perfectly. These wars were initially conducted by conscious partisans and foes of the 60s’ legacy growing into adulthood and moving into positions of responsibility, especially in cultural institutions—in schools at all levels, most importantly, but also museums, libraries, publishing, and entertainment. Throughout the 70s and into the 80s, historically conscious progressives—more or less radical, more or less liberal, more or less compromised—lined up against historically conscious conservatives. The culmination of this phase might be marked by the publication of Allan Bloom’s The Closing of the American Mind in 1987. Bloom’s book was a relentless attack on the superficiality of American culture’s prevailing assumptions and practices, especially its embrace of various forms of relativism—and the 60s were held responsible at every turn. Its astonishing success suggested that many people who were not historically conscious partisans in the battle over the 60s’ legacy were now reacting against it for more diffuse and immediate reasons.

Successor generations were necessarily less and less aware of the historical roots of the debates that raged around them—multiculturalism vs. the canon, moral relativism, third-wave feminism, gay rights, and all the rest. The efficacy of the “political correctness” trope in that context was an especially vivid illustration of the decline of historicity. In educational venues especially, responsible adults who were consciously advancing a progressive agenda found themselves in the ironic position of enforcing standards of speech and behavior on resistant young people who were more and more likely to experience these standards as, at best, over-fussy sensitivity about hurt feelings and giving offense or, at worst, as the arbitrary—and even inequitable8—imposition of adult authority.

Of course, some students continued to support an activist progressive agenda, but their ranks were dwindling, and they fell more and more exclusively into disparate groups, each with its own concerns. And so it was for society as a whole. In spite of the achievements of identity politics, there was eventually no denying that it also represented a (“schizophrenic”) dispersion of discourses, laden with intensities that rose and fell with the occurrence and passing of incidents, with the pastiche of events floating free of a past that had lost its depth, momentum, and focus. Anita Hill at the Thomas hearings, white progressives vs. African-Americans on the O. J. Simpson trial, the Clinton impeachment—these dramas riveted the nation and were charged with social and political implications. But those implications were tangled, cross-cutting. The intensities, the swirl of compelling images—the glove that didn’t fit, the “high tech lynching,” the stained dress—offered very little traction for a larger historical narrative. No clear direction forward was suggested or denied, no past struggle clearly vindicated or betrayed. These were reality shows indeed—and the social and emotional fallout that attended them was more akin to the impact of an event movie like Avatar than the assassination of Martin Luther King.

Progressives with a commitment to traditional narratives continued the struggle, of course; Jameson had company.9 But efforts to awaken a largely apathetic population to the history of social and economic injustice and to present consequences of that history had to reckon with an increasingly mediated environment. Readily dramatizable efforts to awaken the apathetic masses to the dangers of pollution and global warming were also underway. So were efforts to awaken the apathetic masses to desperate situations in Sub-Saharan Africa and Haiti. Likewise, the dangers of nuclear proliferation—what could be more urgent in an age of terror? Sex trafficking, often involving minor girls sold or kidnapped into slavery, was proliferating. Various diseases—not just identity-resonant AIDS and breast cancer—became causes that people worked for, marched for. What about the conditions under which farm animals were being raised and slaughtered? The rise of obesity, especially in children, has recently been demanding attention. Let us pass over in silence the 9/11 truthers, for it is clear that this list could go on indefinitely.

A traditional left-wing politics, informed by historical depth, was inevitably diminished by the dynamic of optionality. The attendant emphasis on surfaces and intensities was a function of the centrality of virtual revolutionaries, presiding over the flow of their experience in custom-made me-worlds. Their attention was the scarce resource that mattered most and, because they were disposed to surf, surfaces powerful enough to arrest their attention were naturally selected. As the flattery of representation took hold, people felt motivated to coherent action only by something they could consistently “identify with” or, more occasionally, by something so compelling it could not be resisted, at least for as long as the excitement lasted.

Into this situation stepped Barack Obama. He managed to mobilize support along both those lines. First of all, through the pastiche of his persona, he embodied the possibility and actuality of self-construction—a sense of entitlement broadly shared by the young supporters who flocked to his candidacy. Simply by being himself, he represented a sort of bridge across those niches of identity politics. The holy grail of postmodern progressivism, the dream of unity in diversity going back to Jesse Jackson’s impossible “Rainbow Coalition,” now seemed to be standing in front of you, as it were, an accomplished fact, a post-historical instance.

His position as unifier through partialities ultimately accounts for Obama’s detached demeanor, the laid-back reflexivity that so many have found so frustrating. But readers of his extraordinary autobiography know how essential to his personal history, how essential to his ability to navigate through the contexts that shaped him, that detachment has always been.10 So Obama could, for example, borrow from the rhetorical style of Martin Luther King, but he could not, would not, fully commit to it. He retained his authenticity by restricting himself to citing, to letting those cadences slip into his performances once in a while and then letting them fade away. His most enthusiastic supporters—the ones who identified with him—understood that, in some space of his own devising, Obama hovered over it all, at home with his own disengagement, at home in a world in which more people are more aware of their own identities and more engaged in self-representation than at any time in human history.

The fact that Obama could represent any one niche only partially was at first an asset because it meant he could also offer the redemption of an experience of commitment to something greater than self—however simulational, as the phrasing implies. And Obama’s young supporters welcomed that. They knew of their reputation for apathy and irony and self-involvement—they had been enduring lectures from their elders about it since grade school. Here was a chance to prove their self-righteous seniors wrong and accomplish at a stroke what those political veterans could only conceive as a distant goal. A black American President. To achieve something so dramatically unlikely would be a transcendence of historicity in itself, a validation of the postmodern ethos—of themselves and their lives and the world they were making. Yes, we can!

The stark fact of Obama’s very being as a black man poised on the brink of election to the Presidency made that virtual revolutionary gesture prima facie plausible. That was the essential surface (an interesting phrase), the essential intensity. Once that plausibility was realized, the momentum behind him seemed to gather force almost overnight and that astonishing campaign got underway. It was a convergence, in a singular burst of purposeful action on a massive scale, of the coolest media technologies and processes ever and of young people—almost all of them socially effective, intelligent, and quick to learn—willing to put their lives on hold and devote themselves to the cause of “yes, we can.”

The Howard Dean campaign of 2003–2004 provided the template for the virtual revolutionaries who came out for Obama. The participants knew—they said it over and over again—that this wasn’t really about Dean; it was about the movement.11 It was about the Deaniacs themselves, those techie communards and the multiple niches they coordinated, online and off. If you were part of that campaign, you were being the phenomenon as you were seeing it represented, in “real time,” unfolding before you. You could see the impact of your role on the national stage in essentially the same way you can see the impact of your keyboarding on the screen of your computer. MoveOn understood this. That’s why they held contests for political ads, judged by celebrity professionals—and then featured the winning ads on their website. And Dean himself understood it. He acknowledged it at every turn, saying over and over again, ferociously, joyfully, pointing his finger—“You have the power! You have the power!”

But the Deaniacs went overboard. As brash as the dot-commers of the mid-90s, they poured into Iowa from out of state in their orange sweater-hats and flooded the Iowa countryside with—themselves. And the press was agog; the Deaniacs became the story. A lot of Iowa voters, often older folks who were serious enough to participate in the time-consuming caucus system, resented the intrusion and decided to refocus the limelight on themselves. All of a sudden, Dean was through. One final scream and it was over. Obama’s team learned the lesson well, however, and when the Iowa caucuses rolled around in January of 2008, the campaign deployed a carefully cultivated cadre of local Obama supporters to get out the caucus vote. After that victory—in a state that was 97 percent white—African American voters, in South Carolina and elsewhere, began to realize that the impossible could actually happen, and their support for the Clintons began to erode. And so began the long march to victory and that stunning appearance in Grant Park, Chicago, on election night, 2008. Yet another stunning moment, it is so easy to forget.

That was the end of the movie. Produced, directed by, and starring the we who could. And, yes, we did. And then—having experienced a commitment to something bigger than self—we went home.


Compared with 2008, voting dropped off this year particularly among pro-Democratic groups:
—Young voters were down by 55 percent.
—African-Americans were down by 43 percent.

—From the McClatchy Newspapers (22 November 2010)

Of course, Obama’s record of compromise helped to drive those numbers down. And the dynamics of the traditional mid-term slump in voter turnout were no doubt at work as well. But the whole idea of Obama’s election had been that the traditional dynamics had been confounded. And anyone who expected Obama to be more radical than he turned out to be just wasn’t paying attention to what he was saying during the campaign. With the sole exception of his categorical stand against the war in Iraq, it was obvious all along that he stood to the right of Clinton and Edwards on domestic issues, across the board. And it was also clear that part of getting over the 60s was getting over what little remained (after Clinton, the great triangulator) of the traditional liberal/left willingness to take on the interests of corporate America in any serious way. It was never in Obama to go to the mat for the public option or Wall Street accountability, and if that could somehow be overlooked in the excitement of the campaign, it became glaringly obvious the moment he began to assemble his cabinet and top advisors.

Faced with the prospect of actually governing, Obama inevitably went about it the way he had governed his own career. He surrounded himself with the best, the brightest, and the most powerful. Inspired, perhaps, by the ongoing mash-up of himself and Abraham Lincoln (which he encouraged by announcing his candidacy in Springfield, Illinois), Obama positioned himself amidst his Team of Rivals and proceeded to do what he had always done best—deliberate, adjudicate, conciliate, compromise. He did what he had done so successfully as editor of the Harvard Law Review, where he was principally known for his balanced approach and a willingness to give more credence to conservative views than was customary at the time. And, of course, being who he was—it worked. Did he also believe that he possessed a larger transformational potential and, if so, to what extent, to what degree? Certainly, there were hints. Certainly, he was serious about being a postpartisan president, about bringing the red states and the blue states together, about reaching out to Republicans who made no secret of the fact that their top priority was to ruin his presidency. And he has persisted in that effort in the teeth of all the evidence against him. Does that suggest that, at some level of his thinking, he had conceived of a grander goal—of contributing to a postracial, postnational, posthistorical age? Was he that ambitious?

However that may be, and whatever the future holds for Obama, it is clear that the postmodern dynamic has shifted to new terrain. Obama’s Tea Party opponents and their affiliates have become the protagonists in a new political story of their own devising, starring themselves. A new pastiche of surfaces and intensities has arisen as if to mirror and invert the one that Howard Dean pioneered and Obama brought to fruition. In many ways, they are proving even more suited to this kind of public existence. For example, the seamless way the Right deploys its imagery of half-truths and lies (“government takeover,” “where is his birth certificate?”, “death panels”) testifies to a completely ahistorical sensibility in which the distinction between fact and convenient fiction is not merely blurred but obliterated. We are accustomed to thinking of progressive movements of one kind or another when we think of identity politics, but that is dangerously misleading. Ronald Reagan was the most successful postmodern practitioner we have ever had. His was the vision of the simulacrum that became the “real America,” the guiding light on the Right ever since.

For the environment of optionality also conditions—perhaps even most acutely—the existence of people who try to refuse that environment, people who cling to some tradition. They have to decide to do that, and deciding to be who you are is precisely what an authentically traditional person does not do. Religious fundamentalism as we know it is very much a postmodern phenomenon—and so is the Tea Party, for all its insistence on the old days and ways. Fanaticisms flourish in an atmosphere of unlimited choice.

Consider all the outlandish costumes, the signs, the atmosphere of carnival—sometimes it’s an intensity of rage, sometimes of weepy nostalgia. But, either way, these aging representatives of Nixon’s silent majority are silent no more. They have joined the postmodern age at last—in the very act of refusing to be silent, in the very act of putting on their show. It is no accident that, from a sufficiently disinterested distance, they look like something Abbie Hoffman might have thrown together for a Yippie happening back in 1969. As striking an instance of political specular doubling as you could ask for. And despite the relentless emphasis on history, the references to it are a breathtakingly shallow exercise in depthless surfing that might have been designed to illustrate the basic Jamesonian concepts. Could there be a trope more completely bereft of historicity, a more shameless exercise in pastiche than Congressman Todd Akin’s widely circulated account of the first Thanksgiving as a moment when the pilgrim fathers repudiated socialism?


  1. Fredric Jameson, Postmodernism, or, The Cultural Logic of Late Capitalism (Durham: Duke University Press, 1991) 6, 16, 17.
  2. Gary Kamiya, “Cablinasian Like Me,” Salon (30 April 1997): <>. “Light-skinned Colin Powell, responding to Woods’s comments, said: ‘In America, which I love from the depths of my heart and soul, when you look like me, you’re black.’”
  3. As mammals, we are wired to respond to attention. Puppies respond to attention. But only human beings need recognition. A dog won’t get insulted if you forget its name. And people feel more and more recognized the more customized the representations that constitute their life-world become.
  4. Robert Murphy, who introduced me to anthropology at Columbia, used to sum up a basic difference between modern societies and hunter-gatherer communities this way: “everyone is famous in a tribe.” By which he meant that everyone in a face-to-face society is recognized by everyone else, so that everyone means something in their world, whatever their status. Dynamics of recognition of an analogous sort have divided the mediated world into niches—not a “global village” after all.
  5. Compare the cereal section of today’s MegaSuperMarkets with its counterpart, say, forty years ago. It used to be Wheaties, Corn Flakes, Cheerios (oats), Rice Krispies—the idea was one cereal for each grain. You could take in the display at a glance. Now you have to walk a couple of blocks to cover it all. Likewise with fruit juices, makes of cars, sneaker species, and pasta possibilities.
  6. But haven’t everyday objects always been more than functional—filtered through culture, sending a message? Yes, but being aware of that is new. Awareness of “culture” used to be confined to a few reflective individuals; now it is common sense. What cultures traditionally provided was entrenched custom, a kind of necessity. Options are very different, as are the people who exist among them—people with “life-styles.”
  7. For this, too, there are modern precedents—most notably, in the attitudes of Enlightenment philosophes toward tradition, and in the modernist turn away from “evolutionist” history at the end of the nineteenth century. But each such moment has its distinctive characteristics.
  8. In the late 90s, I was asked by the head of an independent school to meet with some boys who wanted to start a “Caucasian Boys Club.” There was an African-American club, a Latino club, a Gender Issues club—so why not? They were making mischief, of course, but they were not racist or sexist. They saw themselves as rebels against an adult regime. Talk about injustices of the past and continued discrimination in contemporary society made no impression; synchronic and local “fairness” was all that mattered. I managed finally to dissuade them only by naming specific classmates and teammates who would be terribly hurt if such a club were to be started.
  9. See, for example, Todd Gitlin’s The Twilight of Common Dreams: Why America Is Wracked by Culture Wars (New York: Holt, 1996).
  10. With the personal historical aspect of Jameson’s notion of depth in mind, one might say that Obama is deeply detached.
  11. Samantha M. Shapiro, “The Dean Connection,” The New York Times Magazine (7 December 2003).