David Foster Wallace, Writer, Is Found Dead
David Foster Wallace, whose darkly ironic novels, essays and short stories garnered him a large following and made him one of the most influential writers of his generation, was found dead in his California home on Friday, after apparently committing suicide, the authorities said.
Mr. Wallace, 46, best known for his sprawling 1,079-page novel “Infinite Jest,” was discovered by his wife, Karen Green, who returned home to find that he had hanged himself, a spokesman for the Claremont, Calif., police said Saturday evening.
Mr. Wallace was a professor in the English department at Pomona College in Claremont.
“I know a great novelist has left the scene, but we knew him as a great teacher who cared deeply about his students, who treasured him. That’s what we’re going to miss,” said Gary Kates, the dean of Pomona College.
Mr. Wallace had taught at the small, liberal arts college since 2002 and was the school’s Roy Edward Disney Chair in Creative Writing. He taught one or two classes each semester of about 12 students each, Mr. Kates said.
Mr. Wallace burst onto the literary scene in the 1990s with a style variously described as “pyrotechnic” and incomprehensible and was compared to writers including Jorge Luis Borges, Thomas Pynchon and Don DeLillo.
His opus, “Infinite Jest,” published by Little, Brown & Company in 1996, is set in the near future called the Year of the Depend Adult Undergarment and is, roughly, about addiction and how the need for pleasure and entertainment can interfere with human connection.
In a New York Times review of the book, Jay McInerney wrote that the novel’s “skeleton of satire is fleshed out with several domestically scaled narratives and masses of hyperrealistic quotidian detail.”
“The overall effect,” Mr. McInerney continued, “is something like a sleek Vonnegut chassis wrapped in layers of post-millennial Zola.”
The novel was filled with references to high and low culture alike, and at the end had more than 100 pages of footnotes, which were both trademarks of Mr. Wallace’s work.
The book’s blurbs are by contemporary novelists like Jonathan Franzen and Rick Moody, each of whom was a friend of Mr. Wallace.
Michael Pietsch, who edited “Infinite Jest,” said Saturday night that the literary world had lost one of its great talents.
“He had a mind that was constantly working on more cylinders than most people, but he was amazingly gentle and kind,” Mr. Pietsch said. “He was a writer who other writers looked to with awe.”
Mr. Wallace was born in Ithaca, N.Y. His father, James Donald Wallace, was a philosophy professor at the University of Illinois, and his mother taught English at a community college in Champaign, Ill.
Mr. Wallace majored in philosophy at Amherst College and had planned on embarking on a career in mathematics or philosophy. But after graduating in 1985, he enrolled in the creative writing program at the University of Arizona, where he wrote his first novel, “The Broom of the System,” which was praised by critics.
He followed a year later with a collection of short stories, “Girl with Curious Hair,” which cemented his reputation as a master of the postmodern. Eight years later, he returned with “Infinite Jest,” which became a literary sensation.
“It was ironic, but at the same time it was attempting to take emotional risk,” said Kathleen Fitzpatrick, chair of the media studies department at Pomona College, who knew Mr. Wallace. “A lot of contemporary literature uses irony as a self-protective gesture, but he never did that. He was like a lot of postmodern novelists, but braver.”
Mr. Pietsch said although Mr. Wallace’s work was complex and layered, it was his sense of humor that kept people reading.
“He wrote showstoppers,” Mr. Pietsch said. “He was brilliantly funny. People stayed with these long, complicated novels because they made them laugh.”
Among his other works are “Brief Interviews with Hideous Men,” a short story collection, and “A Supposedly Fun Thing I’ll Never Do Again,” a collection of essays.
Saturday, September 13, 2008
Deep Thoughts - Sunnata Vagga
If you keep thinking "That man has abused me," holding it as a much-cherished grievance, your anger will never be allayed. If you can put down that fury-inducing thought, your anger will lessen. Fury will never end fury, it will just ricochet on and on. Only putting it down will end such an abysmal state.
-Sunnata Vagga
Search Magazine - What Happens to Religion When It Is Biologized?
What Happens to Religion When It Is Biologized?
by Nathan Schneider
Scientific approaches to describing spiritual life are changing the face of faith.
“Zoloft works better than God,” a Catholic priest once told me during a conversation about depression. This is not the kind of man to give up on faith; our talks always finish with his reminders to pray. But in matters of body, and in matters of mind more and more, he and many others sense the right thing to do—religiously—is to consult a scientist.
In recent years, such common sense has given rise to a new paradigm in the study of human religiosity, one founded in a laboratory understanding of biology and the human mind. Researchers in this paradigm are attempting to “biologize” religion, recalling E.O. Wilson’s notorious call for inquiry into human values “to be removed temporarily from the hands of philosophers and biologized.” And, like Wilson’s, their efforts have been fraught with controversy.
Driven by a growing confidence in cognitive science, neuroscience, evolutionary psychology, and genetics, the wider public has begun to embrace truly scientific (as well as pseudo-scientific) explanations of religious beliefs and practices. This idea takes root in an era that led the first President Bush to declare the 1990s “the decade of the brain,” one in which mental illnesses are treated increasingly through psychiatric drugs rather than analysis or spiritual practice alone.
Popular magazines, from Newsweek and Time to the New York Times Magazine, have paraded the research around with catchy labels like “the God spot,” “the God gene,” and “Darwin’s God.” People with a variety of religious perspectives, from Western Buddhists to evangelical Christians to the anti-religious, have explored and discussed biological accounts of belief, each bringing to it their sometimes contradictory assumptions and interpretations. Biologizing is a research project, but it is also a manifestation of a public desire to understand religiosity in the terms of science.
So what happens to religion when it is biologized? Many would intuitively believe philosopher and “New Atheist” Daniel Dennett, whose best-selling Breaking the Spell framed biologizing religiosity and overcoming it as two sides of the same coin; one leads naturally to the other. Confident in the possibility of this research, Dennett contends that “we” should “gently, firmly educate the people of the world, so that they can make truly informed choices about their lives,” choices that he believes will involve dispelling religion.
Less optimistically, but along similar lines, cognitive anthropologist Scott Atran suspects that “religious belief in the supernatural will be here to stay” despite those who come to understand it scientifically. He and other biologizers prefer to maintain a more agnostic stance than Dennett, purporting to pursue a scientific study of religion apart from biases and agendas. Scientific methods, they suggest, liberate the study of religion from ideological and theological debates.
Yet the lines between religion and the scientific study of it are not so clear. Biologizers depend on traditional ways of conceptualizing religiosity that have particular ideological connotations. In turn, believers of various stripes are eager to respond creatively to scientific research, and in some cases they head to the laboratory themselves to shed new light on their own beliefs and practices.
This should not be surprising. What we know of as science and religiosity have not been static absolutes but are constructed variously, often in terms of one another. For all that scientific methods offer the study of religion, they are not necessarily an escape from the influence of the theologies and ideologies that accompany it. Neither are they exempt from the divisions of approach and focus that have fueled religious conflicts for centuries.
Biologizers refer to themselves by a handful of names and are affiliated with a number of different disciplines. They can be arranged into three groups—cognitivists, neurotheologians, and evolutionary biologists—based on the alliances they have formed and the methods they rely on.
Whereas some biologizers do their work on religion in formal scientific contexts, others are professional researchers whose ideas on religion are more of a hobby—published in the popular press rather than peer-reviewed publications, often lacking scholarly sophistication. Therefore, evaluating the biologizers means looking not just at their findings and interpretations but also at the peer communities they either create or neglect.
Cognitivists
Many of the most sophisticated biologizers today rely on the methods of cognitive science and evolutionary psychology. These anthropologists, psychologists, and religious studies scholars are working to build a comprehensive and experimental explanation of religiosity—including beliefs and practices—in terms of universal mechanisms in the human mind.
The general field of cognitive science emerged in the context of artificial intelligence research with computers after World War II. “The last thirty years of cognitive science,” researcher Edwin Hutchins has noted, “can be seen as attempts to remake the person in the image of the computer.” Through that metaphor—a machine that people can build from the bottom up and understand—human intelligence begins to appear explainable in terms of its biological “hardware” or “wiring” running the learned “software” of experience. Biologizing then becomes a matter of computerizing.
Coupled with the success of Noam Chomsky’s linguistics in the 1950s, cognitive science represented a departure from the then-dominant behaviorist mode of psychology, which refused to explicate the internal logic of mental processes and looked only at how various influences condition behavior. Since then, researchers from a number of disciplines, including linguistics, philosophy, and psychology, have allied themselves with cognitive science, rejecting the limits behaviorist thinking imposed.
Over the course of the 1980s and ’90s, a theoretical and experimental framework for a cognitive study of religious belief began to emerge. In 1993, anthropologist Stewart Guthrie’s Faces in the Clouds explained beliefs about supernatural beings in terms of an evolved human tendency to anthropomorphize what we perceive. The same mental systems that helped our ancestors spot a concealed predator, Guthrie argued, compel people to see spiritual beings behind the forces of nature and events in our lives.
With experimental studies and publications, anthropologist Pascal Boyer and psychologist Justin Barrett further probed the counterintuitive logic of supernatural representations, rendering Guthrie’s anthropomorphism as a “hyper-active agency detection device.” By evaluating how test subjects in a variety of cultural contexts intuitively described narratives about supernatural action, for instance, Barrett claims to elucidate the cognitive processes beneath, and sometimes opposed to, the formal theological formulations of religious beliefs. Supernatural beliefs that follow certain patterns, Boyer and Barrett argue, are more easily grasped by human minds, facilitating their transmission and retention in culture.
By the early 2000s, the cognitivists had a theoretical model grounded in enough evidence to produce several comprehensive synthetic works. Boyer’s Religion Explained, written for a non-expert audience, is a readable and triumphant cognitivist manifesto. Anthropologist Scott Atran’s In Gods We Trust is an impressive scholarly synthesis. Meanwhile, other scholars have used this approach to develop an account of ritual behavior based on cognitive underpinnings. A frequent subject of major media coverage, cognitivists represent a growing trend in the study of religion, complete with peer-reviewed literature, a growing public following, and eager graduate students.
Neurotheologians
Whereas cognitive science treats the brain as a computer to be studied through inputs and outputs, a group of neuroscientists, who call themselves “neurotheologians,” examine the inner workings of the brain using sophisticated imaging technology.
In recent decades, brain science has had a growing impact on ordinary life in industrialized societies, and this impact has generated considerable explanatory currency. Matthew Alper, whose personal quest in The “God” Part of the Brain has become a cult classic, tells of his conversion to science after being prescribed psychiatric drugs: “Whereas in the past, however, in which I had admired the sciences, I now revered them. Science had saved my life. I was indebted to it. God didn’t save me. I didn’t save me. ... And so, the same faith that many placed in a god or religion, I now placed in science.”
In addition to drugs, neurofeedback therapies that combine EEG scans with behaviorist-like conditioning are making the workings of our brains more accessible for clinical adjustment. Popular books offer “mind hacks”—neuroscientific tricks that one can try at home to improve cognitive performance. With the advent of such therapies that transform our whole sensation of personhood, laboratory researchers renew the salvific promise of religious authorities.
Some of the earliest and most-cited attempts to use brain science on religion were cognitive neuroscientist Michael Persinger’s “God helmet” experiments at Laurentian University in Ontario in the 1980s. In the early experiments, he modified a snowmobile helmet to direct electromagnetic fields at the brain’s temporal lobes, which he and others surmised might be associated with religious experience.
The results of these tests were astonishing but controversial. Reportedly, 80 percent of the volunteers who donned the helmet had some kind of extraordinary experience, and of those, most sensed the presence of a person in the room. In the years since, these experiments have attracted public attention. Persinger’s helmet has been featured in Wired, Newsweek, and on the BBC. Richard Dawkins tried the helmet, as did psychologist Susan Blackmore. Although Dawkins reported little effect from the trial, Blackmore found the experience “extraordinary.”
More recently, a team of Swedish researchers attempted to replicate the experiments using double-blind methods, which some of Persinger’s trials lacked, and the new helmet had no effect. Although some use these findings as evidence against Persinger’s approach, he insists that the Swedish team did not expose subjects to magnetic fields for long enough.
The idea of a “God spot” that Persinger pioneered caught the attention of a number of neuroscientists. V.S. Ramachandran, professor at the University of California, San Diego, and bestselling author, has done experiments that reveal the religious proclivities of temporal lobe epileptics. Neuroscientist Rhawn Joseph has argued in a series of flamboyant, self-published books for the centrality of the limbic system as the “transmitter to god.” Joseph even suggests that these areas “contain neurons that fire selectively in response to visual images of faces, hands, eyes, and complex geometric shapes, including crosses.” Together, Persinger, Ramachandran, and Joseph all tend to assume that experiences that look “religious” should have their origin in a single brain center, giving the concept of religion its own neural correlate.
Arguably the most influential neurotheologian today is Andrew Newberg at the University of Pennsylvania. In the early 1990s, he and elder neuroscientist Eugene d’Aquili began to devise experiments that used single photon emission computed tomography (SPECT) equipment to examine meditating Tibetan Buddhist monks and praying Franciscan nuns. This equipment detected clear differences between normal brain states and peak spiritual experiences, and major similarities were found between the different groups. On the one hand, regions associated with thinking and planning showed a noticeable increase in blood flow. On the other, the images revealed decreased activity in the posterior superior parietal lobes, which Newberg calls the “orientation association area.” These, in terms he uses for popular audiences, manage the distinction between “me and not-me.” Unlike Persinger and others, however, he is not eager to localize religiosity in any one specific brain region.
The neurotheologians’ findings demonstrate significant public appeal, having been featured in numerous major magazine articles and radio broadcasts. Such reports readily entertain metaphysical reflections on what these findings might mean, just as the scientists themselves do in their popular books. Work on religion remains on the fringes of conventional neuroscience. Yet as more established neuroscientists begin to turn their attention toward it, the neurology of religion is poised to enter the mainstream.
Evolutionary Biologists
When Dean Hamer’s The God Gene was published in 2004, it was the subject of a Time cover story. Hamer, a geneticist at the National Institutes of Health, made a name for himself ten years earlier by controversially arguing for the existence of a “gay gene,” and predictably, his new book garnered a lot of attention. It claims that a particular gene, VMAT2, triggers spiritual tendencies. Hamer describes making this discovery alone and in his spare time, apart from his funded research at the NIH. In an addiction study conducted for other purposes, he noticed that VMAT2 seemed to account for some participants’ survey scores on a “self-transcendence” scale.
Although much of Hamer’s book tries to qualify the brazenness of his title (other genes are involved in religiosity, “spirituality” is more the dependent variable than “god,” etc.), skepticism remains about whether anything has actually been demonstrated. Carl Zimmer’s October 2004 review in Scientific American suggested an alternative title: A Gene That Accounts for Less Than One Percent of the Variance Found in Scores of Psychological Questionnaires Designed to Measure a Factor Called Self-Transcendence, Which Can Signify Everything from Belonging to the Green Party to Believing in ESP, According to One Unpublished, Unreplicated Study. Zimmer maintains, however, that shortcomings of the “god gene” theory stem mainly from Hamer’s desire to publish controversial books, and that future work in genetics may shed useful light on religiosity.
David Sloan Wilson’s Darwin’s Cathedral proposes a more sophisticated evolutionary approach. Wilson, a biologist, combines the idea of group selection with rational choice theory. He attempts to demonstrate that, by coordinating group activity, religiosity has a “secular utility” that caused god genes to succeed in the course of human evolution. Although the logic of group selection is controversial among biologists, Wilson’s application of it to religion is compelling.
Anthropologist Barbara J. King, who has spent her career studying primates, has turned to religion in her book Evolving God: A Provocative View on the Origins of Religion. Pointing to evidence of ape cognition, empathy, social rules, and meaning making, she suggests that “the fundamental building blocks of the religious imagination” can be found among animals. For her, the systems of emotional “belongingness” at work in ape societies are a “necessary condition” for religiosity in modern humans. But her argument is more of an invitation for further research than a conclusive theory. [Editor’s note: For more of Barbara J. King’s work on animals and religion, see “Hard Times in Big Sky,” p. 40]
Evolutionary biologists generally avoid polemic by stressing that their research does not question whether divine beings or mystical states are real. Still, a more subtle polemic about what constitutes real religiosity lurks beneath. Hamer, who argues that his gene affects a person’s “spirituality” rather than organized religiosity, asserts the priority of individual experience and dismisses social forms outright. For him, whereas personal spirituality is a natural instinct with a genetic basis that enabled our ancestors to survive, religion is the product of misleading memes.
David Sloan Wilson and Barbara J. King, on the other hand, emphasize social forms and pay little attention to individual experience or supernatural beliefs. Wilson’s “secular utility” and King’s “belongingness” view personal spirituality as extraneous to the critical function and value of religiosity, which is even more evident when they interpret their findings. The admiration they have for religiosity (while perhaps disagreeing with actual religious beliefs) rests on its power to facilitate group behavior. Together with Hamer’s, their works read like secularized efforts to interpret the function of religious legacy in the human race.
***
For many, the suggestion that religiosity has its basis in biological mechanisms implies its falsity. Daniel Dennett would certainly agree. Most of the prominent cognitivists, including Pascal Boyer, Scott Atran, and Stewart Guthrie, avoid this argument, but readers generally take them to be hostile toward religious belief. Jesuit theologian John Haught, whose own work champions a science-friendly Christianity, concludes that “if Boyer and others are giving us the ultimate and adequate explanation of religion, then of course we should acknowledge that our piety is pure fiction.”
Perhaps this is not necessarily true. Cognitivist Justin Barrett identifies as an evangelical Christian and has been an organizer for the youth ministry Young Life. “Why wouldn’t God,” he speculates in an interview, “design us in such a way as to find belief in divinity quite natural?” His book Why Would Anyone Believe in God?, a summary of cognitivist research, spends its concluding chapters suggesting that these theories make a naturalist case against atheism: “Belief in God comes naturally. Disbelief requires human intervention.” When the research is presented this way, believers receive it much more eagerly than either Dennett or Haught might expect. A review of Barrett’s book in Meridian, a Mormon magazine, expressed enthusiasm for his rhetorical openness to theism: “Neither coercion nor brainwashing nor special persuasive techniques need be invoked in order to account for widespread human belief in God or gods.”
This ambivalence only begins to reveal the range of the religious interpretations biologizing has already inspired. In his books, Andrew Newberg speaks of an “Absolute Unitary Being” for which his research allegedly gives evidence. Whereas Michael Persinger thinks of his “God helmet” as an antidote to religiosity, another neuroscientist associated with the research, Todd Murphy, advocates the helmet’s potential for spiritual enlightenment. Most outlandish of all, Rhawn Joseph claims his neurotheology as evidence that “we are in fact spiritual beings” and that our ancestors were planted on Earth by an advanced extraterrestrial civilization.
Such divergent interpretations reveal the power that scientific explanations of religion can have. The consequences of biologizing are much more complex and difficult to predict than most scientists have been willing to admit. The assumption that a scientific description of religiosity will inevitably counteract it rests on a model of static identity that has been amply refuted by modern experience. Instead, biologizing is a new move in the ongoing transformations that people have called religion or science, subject to such biases, imagination, and missteps that have always accompanied these undertakings.
Biologizing is, on the one hand, a series of new directions for serious research, and on the other, a movement with religious vitality of its own.
PsyBlog - How the Mind Reveals Itself in Everyday Activities
How the Mind Reveals Itself in Everyday Activities
Many fascinating insights into the human mind are hidden in the most routine activities.
What is the most depressing day of the week? How do you deal with queue-jumpers? Do you have paranoid thoughts while travelling on an underground train?
The answers to these simple questions can speak volumes about complex psychological processes. Because the queue is a small social system, our reaction to its disruption hints at what we will tolerate elsewhere; clues to how our memory and emotions work come from whether we're right about the most depressing day of the week; and paranoid thoughts on a train show how differently we can each interpret exactly the same environment.
Collected below are links to recent articles on the psychology of the everyday. Future articles in this series will be added below, so you may like to bookmark this page at del.icio.us.
» Hell is other people » People's intuition is that learning more about a new acquaintance will lead to greater liking. In fact, on average, we like other people less the more we know about them. Read more on why familiarity breeds contempt.
» Dealing with line-jumpers » Queues are mini social systems, so when they are disrupted by queue-jumpers, how do people react? Quite meekly according to a study on the psychology of queuing.
» Superstitious? » Do you avoid opening umbrellas indoors? Do you hate to tempt fate? Despite knowing better, even the most rational of us seem to have some superstitious beliefs. That's because there are automatic psychological processes supporting superstition.
» The crowd myth » Crowds tend to get a bad rap - spontaneous, emotional and irrational - but has this image been exaggerated? Read more on the 7 myths of crowd psychology.
» Just ask for help » Everybody needs a little help from time to time, but it takes courage to ask for fear of rejection. However, research suggests others are actually more likely to help than we might imagine.
» Dropping litter » When others drop litter or fail to clean up after their dogs, it can really get the blood boiling. Psychologists have begun to examine how these public incivilities can be reduced.
» Accidental friends » Selecting our friends is often a matter of pure chance: where we work, who lives next door or which club we happen to join. But the lottery of friendship could even depend on who we happen to meet first.
» Depressing Mondays » Are Mondays really the most depressing day of the week? It turns out that this common perception is out of line with reality (it's actually Wednesday). Find out why Monday is not as depressing as we think.
» Pet psychology » Are people's relationships with their pets beneficial? Studies of both cat psychology and dog psychology suggest they can be, but some of the research is barking mad (sorry).
» Paranoia on the train » Extremes of paranoid thinking are associated with mental illness, but we all have paranoid thoughts from time to time. This study of a virtual train journey reveals just how common paranoid thoughts really are.
Time/Volume Training For Mass
Go read the whole article to see how this program works.
By Nick Nilsson
This is a great training concept I came up with recently as a way to get mass-building effects out of bodyweight exercises that I could do a lot of reps with.
It's a type of training you could easily build a more comprehensive program on simply by extending the concept.
Basically, it's kind of a cross between my Compound Exercise Overload training (where you take a weight you can do 6 reps with and do 3 rep sets until you can't get 3 reps anymore, then you drop the weight and keep going) and Escalating Density Training (by Charles Staley - you might recognize the name :) ...it's basically where you take a 15 minute timeframe and do as many reps as you can within that timeframe).
Time/Volume Training is relatively simple. I'll use back training for my example (chin-ups, specifically).
For working back, I use a 15 minute block of time (this will vary according to bodypart - less time for smaller parts - e.g. 10 minutes for biceps).
First, start by doing a set of 3 reps. Then stop and rest 10 seconds. Now do another set of 3 reps. Stop and rest 10 seconds.
Keep going using 3 rep sets and 10 seconds rest until you can't get 3 reps anymore. When you hit this point, start taking 20 SECONDS rest in between 3 rep sets.
Keep going using 3 rep sets and 20 seconds rest until you again can't get 3 reps anymore. Then take 30 SECONDS rest in between your 3 rep sets. If you have to increase again, go to 40 seconds, and so on.
Keep going in this fashion until your 15 minutes are up.
It's just that simple! Basically, the idea here is not to go to failure on any of your reps but to manage your fatigue so that you can maximize your training volume (i.e. more reps and sets).
And, because I originally worked up this technique to go with bodyweight training (where you can't change resistance), instead of decreasing the weight (like in Compound Exercise Overload), you will instead just increase the rest periods, which gives your body a bit more time to recover in between sets, allowing you to keep doing sets with the exact same resistance.
But just because it's originally designed for bodyweight training doesn't mean you can't use it with free weights and machines as well - it'll work like a charm for that, too!
You'll find when using this technique with different exercises (especially bodyweight exercises, where some tend to be a bit easier than others), you'll be able to go longer before having to increase rest. For example, when doing chins, you'll probably have to increase rest sooner than you will with push-ups.
But rest assured, even if you can do 50 push-ups, you'll STILL get to a point where you're not able to do 3 rep sets on 10 seconds rest and you'll have to bump up the rest periods.
It's a great way to work bodyweight exercises without resorting to high-rep endurance training. With the 3 rep sets, you're still hitting the power-oriented muscle fibers, which is what allows you to make this type of training work for mass building.
You can take a few minutes in between bodyparts for a bit of recovery as well.
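The rest-progression rule above is mechanical enough to sketch in code. Here's a small, hypothetical Python model of one timed block. To be clear about assumptions: the function name, the 15-second estimate for performing a 3-rep set, and the idea that fatigue forces a longer rest after a fixed number of sets at each rest length are all illustrative stand-ins, not part of Nilsson's actual program (in real training, fatigue arrives whenever you can't complete 3 reps).

```python
def time_volume_session(block_minutes=15, set_seconds=15,
                        sets_before_fatigue=8, start_rest=10, rest_step=10):
    """Simulate one Time/Volume Training block; return (total_sets, final_rest).

    block_minutes       -- length of the block (e.g. 15 min for back, 10 for biceps)
    set_seconds         -- assumed time to perform one 3-rep set
    sets_before_fatigue -- assumed sets completed at each rest length before
                           failing to get 3 reps (a stand-in for real fatigue)
    start_rest          -- initial rest between sets, in seconds
    rest_step           -- rest added each time fatigue hits (10s -> 20s -> 30s...)
    """
    remaining = block_minutes * 60
    rest = start_rest
    total_sets = 0
    sets_at_this_rest = 0

    while remaining >= set_seconds:
        remaining -= set_seconds      # perform a 3-rep set
        total_sets += 1
        sets_at_this_rest += 1
        if sets_at_this_rest >= sets_before_fatigue:
            rest += rest_step         # couldn't get 3 reps: bump the rest period
            sets_at_this_rest = 0
        remaining -= rest             # rest before the next set

    return total_sets, rest
```

With these assumed defaults, a 15-minute block works out to 25 three-rep sets (75 reps), finishing at 40 seconds of rest. The exact numbers don't matter; the point the model illustrates is the fatigue-management trade: rather than grinding to failure, you buy continued sets at the same resistance by paying with progressively longer rests.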
Here are the time intervals I've been using for this type of training:
Friday, September 12, 2008
Obama Blasts McCain on Lipstickgate: Enough of the Lies and Distractions!
Here, Obama bites back.
FactCheck - It's a Mess Out There
Big Think - How Do You Contribute?
Author and spiritualist Deepak Chopra on the ways he gives to the world.
Thomas Friedman - Hot, Flat & Crowded: Why We Need a Green Revolution
I'm pushing the new Friedman book wherever I can -- it's a market-based approach to Green, not a regulation-based approach, which is much more likely to succeed in the long term.
Each part is about ten minutes or less.
Part One:
Part Two:
Part Three:
Daniel Dennett's Darwinian Mind: An Interview with a 'Dangerous' Man
Go read the rest of the interview.
by Chris Floyd
The outspoken philosopher of science distills his rigorous conceptions of consciousness, and aims withering fire at the dialogue between science and religion.
In matters of the mind—the exploration of consciousness, its correlation with the body, its evolutionary foundations, and the possibilities of its creation through computer technology—few voices today speak as boldly as that of philosopher Daniel Dennett. His best-selling works—among them Consciousness Explained and Darwin’s Dangerous Idea—have provoked fierce debates with their rigorous arguments, eloquent polemic and witty, no-holds-barred approach to intellectual combat. He is often ranked alongside Richard Dawkins as one of the most powerful—and, in some circles, feared—proponents of thorough-going Darwinism.
Dennett has famously called Darwinism a "universal acid," cutting through every aspect of science, culture, religion, art and human thought. "The question is," he writes in Darwin’s Dangerous Idea, "what does it leave behind? I have tried to show that once it passes through everything, we are left with stronger, sounder versions of our most important ideas. Some of the traditional details perish, and some of these are losses to be regretted, but...what remains is more than enough to build on."
Consciousness has arisen from the unwilled, unordained algorithmic processes of natural selection, says Dennett, whose work delivers a strong, extensive attack on the "argument from design" or the "anthropic principle." But a world without a Creator or an "Ultimate Meaning" is not a world without creation or meaning, he insists. When viewed through the solvent of Darwinism, he writes, "the ‘miracles’ of life and consciousness turn out to be even better than we imagined back when we were sure they were inexplicable."
Dennett’s prominence does not rest solely on his high public profile in the scientific controversies of our day; it is also based on a large body of academic work dealing with various aspects of the mind, stretching back almost 40 years. Dennett has long been associated with Tufts University, where he is now Distinguished Arts and Sciences Professor and director of the Center for Cognitive Studies. Boston-born, Oxford-educated, he now divides his time between North Andover, Massachusetts, and his farm in Maine, where he grows hay and blueberries, and makes cider wine.
In this exclusive interview with Science & Spirit, Dennett talks about his ideas on consciousness, evolution, free will, and the "slowly eroding domain" of religion.
Science & Spirit: Can you give us an overview of your ideas on consciousness? What is it? Where does it come from? Where might it be going?
Dennett: The problem I have answering your question is that my views on consciousness are initially very counterintuitive, and hence all too easy to misinterpret, so any short summary is bound to be misleading. Those whose curiosity is piqued by what I say here are beseeched to consult the long version carefully. Aside from my books, there are dozens of articles available free on my website, at www.ase.tufts.edu/cogstud.
With that caveat behind us (and convinced that in spite of it, some people will leap on what I say here and confidently ride off with a caricature), I claim that consciousness is not some extra glow or aura or "quale" caused by the activities made possible by the functional organization of the mature cortex; consciousness is those various activities. One is conscious of those contents whose representations briefly monopolize certain cortical resources, in competition with many other representations. The losers—lacking "political clout" in this competition—quickly fade leaving few if any traces, and that’s the only difference between being a conscious content and being an unconscious content.
There is no separate medium in the brain, where a content can "appear" and thus be guaranteed a shot at consciousness. Consciousness is not like television—it is like fame. One’s "access" to these representations is not a matter of perceiving them with some further inner sensory apparatus; one’s access is simply a matter of their being influential when they are. So consciousness is fame in the brain, or cerebral celebrity. That entails, of course, that those who claim they can imagine a being that has all these competitive activities, all the functional benefits and incidental features of such activities, in the cortex but is not conscious are simply mistaken. They can no more imagine this coherently than they can imagine a being that has all the metabolic, reproductive, and self-regulatory powers of a living thing but is not alive.
There is no privileged center, no soul, no place where it all comes together—aside from the brain itself. Actually, Aristotle’s concept of a soul is not bad—the "vegetative soul" of a plant is not a thing somewhere in the plant; it is simply its homeostatic organization, the proper functioning of its various systems, maintaining the plant’s life. A conscious human soul is the same sort of phenomenon, not a thing, but a way of being organized and maintaining that organization. Parts of that organization are more persistent, and play more salient (and hence reportable) roles than others, but the boundaries between them—like the threshold of human fame—are far from sharp.
New York Times - How Obama Reconciles Dueling Views on Economy
By DAVID LEONHARDT
I just caught up with this piece on Barack Obama's economic policy from the August 20, 2008, issue of The New York Times Magazine. Interesting that both sides think he has sold them out -- liberals think he is too favorable to Big Business, conservatives think he wants to create a bigger government.
With Obama, there is vast disagreement about just how liberal he is, especially on the economy. My favorite example came in mid-June, shortly after Obama named Jason Furman, a protégé of Robert Rubin, the centrist former Treasury secretary, as his lead economic adviser. Labor leaders recoiled, and John Sweeney, the head of the A.F.L.-C.I.O., worried aloud about “corporate influence on the Democratic Party.” Then, the following week, Kimberley Strassel, a member of The Wall Street Journal editorial board, wrote a column titled, “Farewell, New Democrats,” concluding that Obama’s economic policies amounted to the end of Clintonian centrism and a reversion to old liberal ways.

Both are right, but not in the ways they think. Obama says he is simply a pragmatist: whatever works.
First things first - the economic situation right now:
Ever since Wall Street bankers were called back from their vacations last summer to deal with the convulsions in the mortgage market, the economy has been lurching from one crisis to the next. The International Monetary Fund has described the situation as “the largest financial shock since the Great Depression.” The details are too technical for most of us to understand. (They’re too technical for many bankers to understand, which is part of the problem.) But the root cause is simple enough. In some fundamental ways, the American economy has stopped working.

The fact that the economy grows — that it produces more goods and services one year than it did in the previous one — no longer ensures that most families will benefit from its growth. For the first time on record, an economic expansion seems to have ended without family income having risen substantially. Most families are still making less, after accounting for inflation, than they were in 2000. For these workers, roughly the bottom 60 percent of the income ladder, economic growth has become a theoretical concept rather than the wellspring of better medical care, a new car, a nicer house — a better life than their parents had.
Americans have still been buying such things, but they have been doing so with debt. A big chunk of that debt will never be repaid, which is the most basic explanation for the financial crisis. Even after the crisis has passed, the larger problem of income stagnation will remain. It’s hardly the economy’s only serious problem either. There is also the slow unraveling of the employer-based health-insurance system and the fact that, come 2011, the baby boomers will start to turn 65, setting off an enormous rise in the government’s Medicare and Social Security obligations.
It ain't pretty.
There are a lot of other issues coming our way, and those are the things Obama seems to be focused on:
Among the policy experts and economists who make up the Democratic government-in-waiting, there is now something of a consensus. They agree that deficit reduction did an enormous amount of good. It helped usher in the 1990s boom and the only period of strong, broad-based income growth in a generation. But that boom also depended on a technology bubble and historically low oil prices. In the current decade, the economy has continued to grow at a decent pace, yet most families have seen little benefit. Instead, the benefits have flowed mostly to a small slice of workers at the very top of the income distribution. As Rubin told me, comparing the current moment with 1993, “The distributional issues are obviously more serious now.” From today’s vantage point, inequality looks like a bigger problem than economic growth; fiscal discipline seems necessary but not sufficient.

This is a very enlightening article, so go read the rest of it.

In practical terms, the new consensus means that the policies of an Obama administration would differ from those of the Clinton administration, but not primarily because of differences between the two men. “The economy has changed in the last 15 years, and our understanding of economic policy has changed as well,” Furman says. “And that means that what was appropriate in 1993 is no longer appropriate.” Obama’s agenda starts not with raising taxes to reduce the deficit, as Clinton’s ended up doing, but with changing the tax code so that families making more than $250,000 a year pay more taxes and nearly everyone else pays less. That would begin to address inequality. Then there would be Reich-like investments in alternative energy, physical infrastructure and such, meant both to create middle-class jobs and to address long-term problems like global warming.
All of this raises the question of what will happen to the deficit. Obama’s aides optimistically insist he will reduce it, thanks to his tax increases on the affluent and his plan to wind down the Iraq war. Relative to McCain, whose promised spending cuts are extremely vague, Obama does indeed look like a fiscal conservative. But the larger point is that the immediate deficit isn’t as big as it was in 1992. Then, it was equal to 4.7 percent of gross domestic product. Right now it’s about 2.5 percent.
During our conversation, Obama made it clear that he considered the deficit to be only one of the long-term problems requiring immediate attention, and he sounded more worried about the others, like global warming, health care and the economic hangover that could follow the housing bust.
Thursday, September 11, 2008
Satire - Female Fans Out For Season With Tom Brady's Knee Injury
Female Fans Out For Season With Tom Brady's Knee Injury
FOXBOROUGH, MA—More than 90 percent of female football fans were lost for the season on Sunday when New England Patriots quarterback Tom Brady suffered a left knee injury that will require extensive treatment. The Patriots announced Monday that Brady, the 2007 NFL Most Valuable Player and arguably the NFL's most handsome man, will be placed on injured reserve, where despite being no less attractive than before his injury, he will only be partially visible for the rest of the 2008-2009 season.
Bill Belichick held a press conference Tuesday confirming that Brady will have surgery, ending his 128-game combined starting-and-high-visibility streak, the third longest for a quarterback and the longest ever for a quarterback heartthrob.
Brady left Sunday's game against Kansas City after suffering an ugly anterior cruciate ligament tear in his incredibly handsome left knee after being hit by merely average-looking Chiefs safety Bernard Pollard.
"We feel badly for the nation's women about the injury," Belichick said. "And for Tom, of course. You hate to see anyone with that kind of masculine yet boyish appeal go down. No one has worked harder or done more for this team's female fan base than Tom has, and we expect him to set his rugged, chiseled jaw, keep his twinkling blue eyes on the prize, and be ready to get back on the field and in front of the cameras by next year."
Matt Cassel, who analysts say looked "consistent and confident" while guiding New England to its 20th straight regular-season win after Brady was hurt and "okay but not remarkable" in jeans and a polo shirt after the game, will start Sunday at the New York Jets, although there are doubts Cassel can win as many games and women as Brady.
"Well, as far as my role on this team goes, I'm not trying to be Tom Brady. I'm just trying to be Matt," Cassel said when subbing for Brady on his regular weekly radio show. "I mean, I have to just be myself, or else the ladies will sense I'm faking it, and in the end, that'll make it worse. I just hope there's one special fan group out there for me."
Cassel has been a second fiddle his entire football career, even in college at Southern California, where he was backup and wingman to lovable tousle-headed manchild Matt Leinart.
But football and demographics analysts agree that Brady's injury surely changes the rugged, weatherbeaten complexion of the entire NFL, where the Patriots, winners of three Super Bowls since 2001 with Brady as their quarterback and spokesmodel, were the strong female-fan favorite. However, Belichick denied the team reached out to any other more experienced or handsome quarterbacks.
Although losing Brady's strong arm and sculpted face will not be easy for the Patriots, the impact of his loss is expected to be felt around the NFL, where Brady has been the leading performer both on and off the field for the last several seasons. League commissioner Roger Goodell called an emergency owners' meeting Monday in which attendees discussed measures designed to compensate for Brady's loss, such as giving poise and diction lessons to Peyton or Eli Manning, getting Brett Favre a new wardrobe and a decent haircut, or teaching Ben Affleck how to play football.
Unfortunately for the NFL, Brady's loss seems to have affected more than just the Patriots and women. Many Boston-area fans of both genders, claiming that the team isn't worth watching without Brady, have concentrated their attention on the waning and somewhat disheveled Red Sox season or the attractive upcoming Celtics' NBA title defense. The sports media has likewise gone into shock, with columnist Bill Simmons saying he will no longer watch football this season, Sports Illustrated canceling large Brady-themed sections of this year's upcoming swimsuit issue, and NBC Football Night In America analyst Cris Collinsworth bursting into tears and collapsing into Peter King's arms upon receiving the news.
"No one else in football has Brady's unique talents—the physical gifts of build, height, arms, cheekbones, piercingly sultry field vision, the combination of arm strength and accuracy with a sense of tenderness, the combination of smirk and pout—along with the intangibles and the ability to look good in everything," said Tom Chiarella, who scouted and evaluated Brady for the September issue of Esquire. "It's impossible to estimate the impact of his loss, but it will almost certainly mean the loss of most female fans, many Boston-area fans, fair-weather fans, and the majority of mainstream media fans. The NFL is really looking at a worst-case scenario here, one that it never wanted to happen: A football season that's only watched by actual football fans."
McCain's Ads Are Lies and Misrepresentations
If you don't like Obama's policies, fine, pick a third party candidate, but don't elect another sleaze-ball to be leader of this nation.
Why I Hate the 9/11 Anniversary
My girlfriend turned on the TV this morning and all the news channels were commemorating the seventh 9/11 anniversary. All well and good, except that many of them were replaying footage from the event with the original images and commentary.
I cannot convey how fucking stupid that is.
Many Americans, not just those in New York City, were deeply traumatized by the planes crashing into the towers and their subsequent collapse, not to mention the images and reports of people jumping to their deaths. Most of the people who suffered trauma from that event never received therapy to deal with the post-traumatic stress disorder that resulted.
Each year, on 9/11, those people are re-traumatized by the images and reports on their televisions. One of the downsides of PTSD is that those suffering from it will actually willingly watch the reports, not knowing that they are recreating the original trauma in their brains. In essence, they are keeping those memories intact and hard-wired in the neural circuits.
There is no good reason to televise those images each year. Doing so simply re-injures those who were hardest hit emotionally by the original event.
It's callous and manipulative for the media to keep doing this.
John Cleese on the God Gene
There are a total of 14 videos in this embed, so enjoy!
Charlie Rose - A Conversation with Thomas L. Friedman
Popularity of a Hallucinogen May Thwart Its Medical Uses
This drug works differently than other hallucinogens, which may make it very useful for treating a variety of mental illnesses, but as street use rockets upward, the drug may soon be illegal, even for research.
In 2002, Dr. Bryan L. Roth, now of the University of North Carolina at Chapel Hill, discovered that Salvinorin A, perhaps uniquely, stimulates a single receptor in the brain, the kappa opioid receptor. LSD, by comparison, stimulates about 50 receptors. Dr. Roth said Salvinorin A was the strongest hallucinogen gram for gram found in nature.

Unfortunately, dumbass kids are messing with this stuff and doing dumbass things like trying to drive a car while being essentially incapacitated. Then they post the videos on YouTube.

Though Salvinorin A, because of its debilitating effects, is unlikely to become a pharmaceutical agent itself, its chemistry may enable the discovery of valuable derivatives. “If we can find a drug that blocks salvia’s effects, there’s good evidence it could treat brain disorders including depression, schizophrenia, Alzheimer’s, maybe even H.I.V.,” Dr. Roth said.
Many scientists believe salvia should be regulated like alcohol or tobacco, but worry that criminalization would encumber their research before it bears fruit.
“We have this incredible new compound, the first in its class; it absolutely has potential medical use, and here we’re talking about throttling it because some people get intoxicated on it,” said Dr. John Mendelson, a pharmacologist at the California Pacific Medical Center Research Institute who, with federal financing, is studying salvia’s impact on humans. “It couldn’t be more foolish from a business point of view.”
Here's a video of one dude doing some salvia:
This guy had a good trip; not everyone has so much fun. Salvia can be incredibly intense, and many people never do it again after a first trip.
Here is some good info on the effects of salvia use:
Psychedelic experiences are necessarily somewhat subjective and variations in reported effects are to be expected. Aside from individual reported experiences there has been a limited amount of published work summarising the effects. D.M. Turner’s book “Salvinorin—The Psychedelic Essence of Salvia Divinorum” quotes Daniel Siebert’s summarisation, mentioning that the effects may include:[37]
- Uncontrollable laughter
- Past memories, such as revisiting places from childhood memory
- Sensations of motion, or being pulled or twisted by forces
- Visions of membranes, films and various two-dimensional surfaces
- Merging with or becoming objects
- Overlapping realities, such as the perception of being in several locations at once
A survey of salvia users found that 38% described the effects as unique, while 23% said the effects were like yoga, meditation or trance.[38]
Media reporters rarely venture to take salvia themselves, but one firsthand journalistic account has been published in the UK science magazine New Scientist:
“ the salvia took me on a consciousness-expanding journey unlike any other I have ever experienced. My body felt disconnected from ‘me’ and objects and people appeared cartoonish, surreal and marvellous. Then, as suddenly as it had begun, it was over. The visions vanished and I was back in my bedroom. I spoke to my ‘sitter’—the friend who was watching over me, as recommended on the packaging—but my mouth was awkward and clumsy. When I attempted to stand my coordination was off. Within a couple of minutes, however, I was fine and clear-headed, though dripping with sweat. The whole experience had lasted less than 5 minutes.”
There have been few books published on the subject. One notable example is Dale Pendell’s work “Pharmako/Poeia—Plant Powers, Poisons, and Herbcraft”, which won the 1996 Firecracker Alternative Book Award[39] and has a chapter dedicated to Salvia divinorum. It includes some experience accounts:
“ It’s very intense, I call it a reality stutter, or a reality strobing. I think that having been a test pilot, and flying in that unforgiving environment with only two feet between our wingtips, helped to prepare me for this kind of exploration.”
Some have written extensive prose and/or poetry about their experiences.[40][41] Some describe their visions pictorially, and there exist examples of visionary art which claim to be salvia-inspired. Others claim musical inspiration from the plant: examples are the songs “Salvia divinorum” by 1200 Micrograms, "Salvia" by Deepwater Sunshine, and "Flight 77" by Paul Dereas.[41]
Cautionary notes
Dale Pendell expresses some concerns about the use of highly concentrated forms of salvia. In its natural form salvia is more balanced and benevolent, and quite strong enough, he argues. High strength extracts on the other hand can show “a more precipitous, and more terrifying, face” and many who try it this way may never wish to repeat the experience.[42]
The “Salvia divinorum User’s Guide” hosted on Daniel Siebert’s website recommends having a sitter present if you are new to salvia, are experimenting with a stronger form, or are using a more effective method of ingestion than you have before.
“ An experienced salvia user who is chewing a quid may often choose to do it alone, and may be quite safe in doing so. But having a pleasant, sensible, sober sitter is an absolute must if you are trying vaporization, smoking high doses of extract-enhanced leaves, or using pure salvinorin.”
The guide points out that the effects of salvia are generally quite different from those of alcohol; but, like alcohol, it impairs coordination. One should never attempt to drive under its influence.
It also emphasizes that salvia is not a party drug.
Hopefully, any legislation involving this drug will be regulatory rather than criminalizing, though that may be wishful thinking in this political climate. In the meantime, some good research may yet get done into how this drug works and how to harness its power for healing.
Sacred Grain and Gluten-Free Superfood
Good stuff!

Sacred Grain and Gluten-Free Superfood
Written by Derek Markham
Published on September 9th, 2008

Which ancient grain:
- is 15 to 18% protein?
- has five times as much iron as wheat?
- contains three times as much fiber as wheat?
- delivers twice as much calcium as milk?
- is a great source of magnesium and manganese?
- is high in linoleic acid, an omega-6 fatty acid?
- contains both lysine and methionine, two essential amino acids usually missing from other grains?
This nutritional powerhouse is easy to prepare and can be substituted for other grains in just about any recipe. It is a gluten-free food, making it an easy choice for sufferers of celiac disease or wheat allergies to replace wheat, barley, and rye in grain-based recipes.
What is this amazing superfood?
Amaranth seeds!
Amaranth is actually an annual herb related to lamb’s quarters, not a grain. It grows like a weed (and is considered as such by many farmers and homeowners) and produces copious amounts of seeds and edible greens even in arid climates. The greens are prepared just like spinach, with the youngest leaves being the most desirable. Some varieties are grown in annual gardens just for their showy flower-heads.
The Incas prized amaranth as one of their staple foods, and it held a special significance in Aztec and Incan rituals. Today, the grain is grown throughout the world because of ease of harvest, high yields, and tolerance to adverse growing conditions.
Directions for preparing amaranth grain:
- 1 cup amaranth
- 3 cups of water
- Pinch of salt (optional)
- Bouillon or herbs (optional)
- Bring to a boil
- Simmer for 20 to 30 minutes or until all of the water is absorbed, stirring occasionally
- For most savory dishes, cook only until the water is gone yet the grains are still separate
- For a breakfast porridge, add extra 1/2 cup water and simmer until it reaches the desired thickness
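The 1:3 grain-to-water ratio above scales linearly, which a minimal sketch can show. Note that the recipe only gives the extra porridge water for a single cup of grain; scaling that extra half cup per cup of grain is my own assumption:

```python
def amaranth_water(cups_grain, porridge=False):
    """Cups of water for the recipe's 1 cup amaranth : 3 cups water ratio.
    For porridge the recipe adds an extra 1/2 cup of water; applying that
    per cup of grain is an assumption, not from the original directions."""
    water = 3.0 * cups_grain
    if porridge:
        water += 0.5 * cups_grain
    return water

amaranth_water(2)                 # water for a double batch of savory grain
amaranth_water(1, porridge=True)  # water for one cup cooked as porridge
```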
Try using half quinoa and half amaranth, cooked together. Replace the rice, wheat, or millet in recipes with amaranth, stir some into soup, or use as a base for a casserole. We like to make it as a sweet pudding for dessert (throw in some chocolate chips), and a hot porridge with cinnamon and raisins for the morning.
Amaranth is also available as flour, pasta, a puffed cereal, and as an ingredient in some energy bars. Ask for amaranth on your next trip to the co-op or natural foods store!
Deep Thoughts - John Welwood
~ John Welwood, Toward a Psychology of Awakening
Wednesday, September 10, 2008
Peter Gray - Why We Should Stop Segregating Children by Age: Part I--The Value of Play in the Zone of Proximal Development
Why We Should Stop Segregating Children by Age: Part I--The Value of Play in the Zone of Proximal Development
By Peter Gray on September 09, 2008 in Freedom to Learn

One of the oddest, and in my view most harmful, aspects of our treatment of children today is our penchant for segregating them into separate groups by age. We do that not only in schools, but increasingly in out-of-school settings as well. In doing so, we deprive children of a valuable component of their natural means of self-education.
The age-segregated mode of schooling became dominant at about the same time in history when the assembly-line approach to manufacturing became dominant. The implicit analogy is pretty obvious. The graded school system treats children as if they are items on an assembly line, moving from stop to stop (grade to grade) along a conveyor belt, all at the same speed. At each stop a factory worker (teacher) adds some new component (unit of knowledge) to the product. At the end of the line, the factory spits out complete, new, adult human beings, all built to the specifications of the manufacturers (the professional educators).
Of course everyone who has ever had or known a child, including everyone who works in our age-graded schools, knows that this assembly-line view of child development is completely false. Children are not passive products, to which we can add components. Children are not incomplete adults that need to be built bit by bit in some ordered sequence. Children are complete human beings in their own right, who constantly demand to control their own lives and who, despite what we put them through, insist on learning what they want to learn and practicing the skills they want to practice. We can't stop them. We would all be much better off if we went with them on this rather than fought them.
In previous postings I have described settings where children educate themselves, without adult direction or prodding. In particular, I have discussed self-education as it once occurred in hunter-gatherer bands (August 2, 2008, posting) and as it occurs today in schools designed for self-education, particularly the Sudbury Valley School (August 13, 2008, and September 3, 2008, postings). A prominent feature of such settings is that children regularly interact with others across the whole spectrum of ages. Anthropologists have claimed that free age mixing is the key to the self-education of hunter-gatherer children; and Daniel Greenberg has long claimed that free age mixing is the key to self-education at the Sudbury Valley School, which he helped to found [1].
Several years ago, Jay Feldman (who then was a graduate student working with me) and I conducted some studies of age-mixed interactions at the Sudbury Valley School, aimed at (a) determining how much age mixing occurred at the school, (b) identifying the contexts in which age-mixing occurred, and (c) identifying ways by which age mixing seemed to contribute to students' self-education.