Saturday, July 05, 2008
A nice thought to begin the day:
Imagine that you are having difficulties with a loved one, such as your mother or father, husband or wife, lover or friend. How helpful and revealing it can be to consider the other person not in his or her “role” of mother or father or husband, but simply as another “you,” another human being, with the same feelings as you, the same desire for happiness, the same fear of suffering. Thinking of the other one as a real person, exactly the same as you, will open your heart to him or her and give you more insight into how to help.
~ Sogyal Rinpoche
Tags: Dharma, Quote, Compassion, Buddhism, Sogyal Rinpoche
To me, the holder of a master's degree in the humanities, this is the key statement:
The humanities, rightly pursued and rightly ordered, can do things, and teach things, and preserve things, and illuminate things, which can be accomplished in no other way.
This is a good defense of the humanities, so please give it a read.
The Burden of the Humanities
by Wilfred M. McClay
Lamentations about the sad state of the humanities in modern America have a familiar, indeed almost ritualistic, quality about them. The humanities are among those unquestionably nice endeavors, like animal shelters and tree-planting projects, about which nice people invariably say nice things. But there gets to be something vaguely annoying about all this cloying uplift. One longs for the moral clarity of a swift kick in the rear.
Enter the eminent literary scholar Stanley Fish, author of a regular blog for The New York Times, who addressed the subject with a kicky piece entitled “Will the Humanities Save Us?” (Jan. 6, 2008). Where there is Fish there will always be bait, for nothing pleases this contrarian professor more than double-crossing his readers’ expectations and enticing them into a heated debate, and he did not disappoint.
He took as his starting point Anthony Kronman’s passionate and high-minded book Education’s End: Why Our Colleges and Universities Have Given Up on the Meaning of Life (2007), in which Kronman argues that higher education has lost its soul, and can only recover it by re-emphasizing the building of character through the study of great literary and philosophical texts. Fish was having none of such “pretty ideas.” There is “no evidence,” he sniffed, that such study has the effect of “ennobling” us or spurring us on to noble actions. If it did, then the finest people on earth would be humanities professors, a contention for which the evidence is, alas, mostly on the other side.
Teachers of literature and philosophy possess specialized knowledge, Fish asserted, but they do not have a ministry. The humanities can’t save us, and in fact they don’t really “do” anything, other than give pleasure to “those who enjoy them.” Those of us involved with the humanities should reconcile ourselves to the futility of it all, and embrace our uselessness as a badge of honor. At least that way we can claim that we are engaged in “an activity that refuses to regard itself as instrumental to some larger good.”
This sustained shrug elicited a blast of energetic and mostly negative response from the Times’ online readers. To read through the hundreds of comments is to be reminded that Americans do seem to have a strong and abiding respect for the humanities. For many of these readers, Fish’s remarks failed the test of moral seriousness, and failed to come to terms with exactly what it is that makes the humanities special, and places upon them a particular task, a particular burden, in the life of our civilization. That one of the humanities’ most famous, influential, and well-paid elder statesmen would damn his own livelihood with such faint praise seems in itself a perfect indicator of where we now find ourselves.
What does it mean to speak of the “burden” of the humanities? The phrase can be taken several ways. First, it can refer to the weight the humanities themselves have to bear, the things that they are supposed to accomplish on behalf of us, our nation, or our civilization. But it can also refer to the near opposite: the ways in which the humanities are a source of responsibility for us, and their recovery and cultivation and preservation our job, even our duty.
Both of these senses of burden—the humanities as preceptor, and the humanities as task—need to be included in our sense of the problem. The humanities, rightly pursued and rightly ordered, can do things, and teach things, and preserve things, and illuminate things, which can be accomplished in no other way. It is the humanities that instruct us in the range and depth of human possibility, including our immense capacity for both goodness and depravity. It is the humanities that nourish and sustain our shared memories, and connect us with our civilization’s past and with those who have come before us. It is the humanities that teach us how to ask what the good life is for us humans, and guide us in the search for civic ideals and institutions that will make the good life possible.
The humanities are imprecise by their very nature. But that does not mean they are a form of intellectual finger-painting. The knowledge they convey is not a rough, preliminary substitute for what psychology, chemistry, molecular biology, and physics will eventually resolve with greater finality. They are an accurate reflection of the subject they treat, the most accurate possible. In the long run, we cannot do without them.
But they are not indestructible, and will not be sustainable without active attention from us. The recovery and repair of the humanities—and the restoration of the kind of insight they provide—is an enormous task. Its urgency is only increasing as we move closer to the technologies of a posthuman future, a strange, half-lit frontier in which bioengineering and pharmacology may combine to make all the fearsome transgressions of the past into the iron cages of the future, and leave the human image permanently altered.
The mere fact that there are so many people whose livelihood depends on the humanities, and that the humanities have a certain lingering cultural capital associated with them, and a resultant snob appeal, does not mean that they are necessarily capable of exercising any real cultural authority. This is where the second sense of burden comes in—the humanities as reclamation task. The humanities cannot be saved by massive increases in funding. But they can be saved by men and women who believe in them.
Read the rest of this article.
OK, one more quote:
The chief point to make here is that the humanities do have a use, an important use—an essential use—in our lives. Not that we can’t get along without them. Certainly not in the same sense that we can’t get along without a steady supply of air, water, and nutrients to sustain organic life, and someone to make candles and books for the world’s poets. But we need the humanities in order to understand more fully what it means to be human, and to permit that knowledge to shape and nourish the way we live.
'Nuf said.
Friday, July 04, 2008
Parade asked Barack Obama and John McCain to explain their views on patriotism. Personally, I'd rather live under Obama's version.
Obama: Faith In One Another As Americans
McCain: Putting The Country First
If we want to get all integral on these guys, we can generalize their views on this topic (and this topic alone) in terms of stages of development.
McCain is clearly authoritarian (Wilber's Amber stage) in his definition:
Amber (ethnocentric—able to take a 2nd-person perspective): Amber Altitude indicates a worldview that is mythic, and mythic worldviews are always held as absolute (this stage of development is often called absolutistic). Instead of "might makes right," amber ethics are more oriented to the group, but one that extends only to "my" group. Grade school and high school kids usually exhibit amber motivations to "fit in." Amber ethics help to control the impulsiveness and narcissism of red. Culturally, amber worldviews can be seen in fundamentalism (my God is right no matter what); extreme patriotism (my country is right no matter what); and ethnocentrism (my people are right no matter what).
Obama is clearly more pluralistic (Wilber's Green stage):
It's tempting, of course, to generalize for both men -- to see McCain as straddling the amber/orange divide in his thinking and policies, while Obama would appear to be pretty centered in green, with teal intellectual skills (see this link). None of this means anything to anyone not versed in integralese, so let's break it down a little more clearly.
Green (worldcentric—able to take a 4th-person perspective): Green worldviews are marked by pluralism, or the ability to see that there are multiple ways of seeing reality. If orange sees universal truths ("All men are created equal"), green sees multiple universal truths—different ones for different cultures. Green ethics continue, and radically broaden, the movement to embrace all people. A green statement might read, "We hold these truths to be self-evident, that all people are created equal, regardless of race, gender, class...." Green ethics have given birth to the civil rights, feminist, and gay rights movements, as well as environmentalism.
The green worldview's multiple perspectives give it room for greater compassion, idealism, and involvement, in its healthy form. Such qualities are seen in organizations such as the Sierra Club, Amnesty International, Union of Concerned Scientists, and Doctors Without Borders. In its unhealthy form, green worldviews can lead to extreme relativism, where all beliefs are seen as relative and equally true, which can in turn lead to the nihilism, narcissism, irony, and meaninglessness exhibited by many of today's intellectuals, academics, and trend-setters... not to mention another "lost" generation of students.
McCain, as near as I can tell, is highly nationalistic, somewhat ethnocentric, and largely neo-conservative in his fiscal and social views. On the other hand, Obama has a strong faith in a mythic religion, is very comfortable with relativistic social views, seems to desire a more compassionate government, and is apparently capable of thinking in terms of systems within a complex and chaotic world (beyond black, white, or gray).
So, assuming these broad generalizations to be true, their respective definitions of patriotism are very much in line with who they are as people.
Jonah Goldberg's recent article blasting Obama's "patriotism problem" has generated at least one good critique. It seems I also felt a need to respond, since, by extension, I am not a good patriot (which I will readily admit -- nationalism is a load of crap as far as I am concerned).
His definition of patriotism in America fails to recognize that this country is a mess:
Definitions of patriotism proliferate, but in the American context patriotism must involve not only devotion to American texts (something that distinguishes our patriotism from European nationalism) but also an abiding belief in the inherent and enduring goodness of the American nation. We might need to change this or that policy or law, fix this or that problem, but at the end of the day the patriotic American believes that America is fundamentally good as it is.
It's the "good as it is" part that has vexed many on the left since at least the Progressive era. Marxists and other revolutionaries obviously don't believe entrepreneurial and religious America is good as it is. But even more mainstream figures have a problem distinguishing patriotic reform from reformation. Many progressives in the 1920s considered the American hinterlands a vast sea of yokels and boobs, incapable of grasping how much they needed what the activists were selling.
I'm going to ignore that bit about "devotion to American texts," which smells an awful lot like patriotism as religion. And I'm going to resist the urge to suggest that "abiding belief" really means "blind devotion." Oh wait, maybe I won't.
He thinks liberals want "to 'remake' America from scratch", which is what it might look like to someone who believes this nation is "good as it is." America is good, but it can be great -- and that's what liberals, and a lot of conservatives, want for this country.
Goldberg seems to think we are already a great nation. But has he ever lived in poverty, barely getting by with enough to eat while making minimum wage? Has he ever been a minority struggling with intolerance? Has he ever been a woman whose right to determine the fate of her own body is continually threatened and reduced? Has he ever been arrested for "driving while black"? Has he ever had to join the military to have any hope of paying for college, and then been sent to fight a war entered into through lies and manipulations of the American public? Has he ever had to worry there won't be any Social Security when he retires? I could go on, but I'm sure you get the idea.
This is not how a "great" nation treats its citizens. We can do better. We can be a great nation. But not if we think we are fine the way we are. Complacency does not solve real problems.
This sense that America is in need of fixing in order to be a great country points to Obama's real patriotism problem. And it's not Obama's alone.
We do need to fix this country. We have more problems than any president can reasonably address.
Education? Needs to be fixed. Health Care? Needs to be fixed. Immigration? Needs to be fixed. The two-party system? Needs to be fixed. Foreign policy? Needs to be fixed. Social Security? Needs to be fixed. Medicare and Medicaid? Needs to be fixed. Infrastructure? Needs to be fixed. Oil dependency? Needs to be fixed. Pollution and environmental protection? Needs to be fixed.
Only someone living with his head in the sand would suggest we are now a nation that does NOT need to be fixed. I wonder how Goldberg breathes down there?
Bruce E. Levine, writing over at AlterNet, has a provocative article up titled, The Science of Happiness: Is It All Bullshit?
He is essentially criticizing the positive psychology movement, those purveyors of happiness as a pursuit of psychology (rather than always focusing on pathology).
I think Levine makes some good points, but as far as I can tell the "science of happiness," while seeming like little more than common sense, is much needed in this culture. We have forgotten how to live lives that engender happiness.
And yes, things like "meaningfulness" and "happiness" are subjective qualities that can't be tested objectively in a laboratory, but that does not mean they do not exist and can't be evaluated in their own subjective terms.
Anyway, here is a key quote from the article.
The reality is that the "science of happiness" is a shaky science. For one thing, the independent and dependent variables (such as meaningfulness and happiness) are subjective and not truly quantifiable in the manner that legitimate scientists would take too seriously. I respect the findings of real science, but shaky science provides far less authority than time-honored wisdom.
Read the rest of the article.
The current positive psychology craze is by no means the first time that academic psychology has taken basic common sense and elevated it with scientific-sounding jargon to create the illusion that psychologists have something special to offer. When I attended graduate school in clinical psychology, hot topics were "cognitive psychology" and "cognitive therapy," which were considered radical shifts from "behaviorism" (which dogmatically focused only on "observable events"). Cognitive psychology's great contribution? Just because you cannot see people's thoughts, people actually do think, and thoughts affect our emotions. However, 2,500 years earlier, the Buddha taught about the thought-emotion connection in a far more profound way than any academic cognitive psychologist.
Bruce E. Levine, Ph.D., is a clinical psychologist and author of Surviving America's Depression Epidemic: How to Find Morale, Energy, and Community in a World Gone Crazy (Chelsea Green, 2007). www.brucelevine.net.
First up, it appears that severe shyness and anxiety are hard-wired into some people's brains.
Severe Shyness? New Study Shows That Anxiety Is Likely A Long-lasting Trait
ScienceDaily (July 3, 2008) — We all know people who are tense and nervous and can't relax. They may have been wired differently since childhood.
New research by the HealthEmotions Research Institute and Department of Psychiatry at the University of Wisconsin School of Medicine and Public Health (SMPH) indicates that the brains of those suffering from anxiety and severe shyness in social situations consistently respond more strongly to stress, and show signs of being anxious even in situations that others find safe.
Dr. Ned Kalin, chairman of the UW Department of Psychiatry and HealthEmotions Research Institute, in collaboration with graduate student Andrew Fox and others, has published a new study on anxious brains in the online journal PLoS One.
The study looked at brain activity, anxious behaviour, and stress hormones in adolescent rhesus monkeys, which have long been used as a model to understand anxious temperament in human children. Anxious temperament is important because it is an early predictor of the later risk to develop anxiety, depression, and drug abuse related to self medicating.
* * * * *
The next article looks at how we define who we are by the ideal self we envision, and why we have trouble seeing that same ability to transform in other people.
This article is quite fascinating, so be sure to check out the whole thing. Here is one key finding, which suggests we see other people as closer to their ideal selves than we see ourselves in our own lives:
Understanding ourselves is partly about understanding who it is we want to become.
Because each of us is a perpetual work in progress, we live our lives with one eye on the future. In that future we see ourselves transformed into our true, ideal self - just as we would like to be.
While we take this for granted in ourselves, research finds we are much less likely to see other people's good intentions and hopes for the future as part of their selves. Instead we are likely to judge them just as they appear to us - defined by their past and present, stuck in the moment, unlikely to change and ultimately knowable.
When thinking about themselves, students thought they were about 30% of the way to where they wanted to be, while they thought the average student was about 50% of the way towards becoming who they wanted to be. This confirmed their earlier studies, which suggested we really do think other people are further towards fulfilling their potential than we are.
A lot of trauma work in therapy assumes that painful or troubling memories must be processed in order to be overcome. But new research suggests that we can overcome some feelings with practice.
In Sight, Out of Mind
That's the whole article. They're careful to say this doesn't apply to trauma, but I'm guessing a lot of people will try to extrapolate in that direction.
Research shows that practice can help you forget painful memories and control your memory.
By: Nicole Bode
Can't go to a favorite restaurant without thinking about your ex? Better plan to eat there daily. According to research, it may be possible to push unwanted memories out of your mind—but only with repeated practice.
Motivated by studies that found that children are more likely to forget abuse by family members than by strangers, study author Michael Anderson, Ph.D., professor of psychology at the University of Oregon, designed an experiment to find out if memory can be consciously controlled.
In the first 15 minutes of the experiment, participants memorized a series of 40 word pairs. Anderson then presented participants with only one word from the pair, sometimes asking them to remember the associated word and sometimes asking them to forget it. Published in the journal Science, his findings show that participants who repeatedly tried to suppress words eventually became more likely to forget them, even when offered clues and money to remember. "People forced to encounter reminders of unwanted memory find ways to adapt accordingly," Anderson says. However, Anderson is careful to clarify that "the findings don't have direct implications for forgetting trauma."
The study sheds light on the flip side of memory research, which often focuses on how to improve recall. "There are many positive aspects of forgetting," Anderson argues. "To be unable to push out of mind outdated or painful information risks impairing your concentration and your well-being."
Finally, Sam Wang, at Welcome To Your Brain, takes a look at The neuroscience behind Swift-boating. How timely.
Your brain lies to you
Read the whole article.
False beliefs are everywhere. Eighteen percent of Americans think the sun revolves around the earth, one poll has found. Thus it seems slightly less egregious that, according to another poll, 10 percent of us think that Senator Barack Obama, a Christian, is instead a Muslim. The Obama campaign has created a Web site to dispel misinformation. But this effort may be more difficult than it seems, thanks to the quirky way in which our brains store memories — and mislead us along the way.
The brain does not simply gather and stockpile information as a computer’s hard drive does. Facts are stored first in the hippocampus, a structure deep in the brain about the size and shape of a fat man’s curled pinkie finger. But the information does not rest there. Every time we recall it, our brain writes it down again, and during this re-storage, it is also reprocessed. In time, the fact is gradually transferred to the cerebral cortex and is separated from the context in which it was originally learned. For example, you know that the capital of California is Sacramento, but you probably don’t remember how you learned it.
This phenomenon, known as source amnesia, can also lead people to forget whether a statement is true. Even when a lie is presented with a disclaimer, people often later remember it as true.
This is another fascinating article that first ran in the New York Times. The original story didn't carry any references, so they embedded links in this version of the article. Very cool stuff.
Of all the July 4th posts I have seen, this one resonates with me the most. That we can disagree on what patriotism means, and on the ways in which we express it, is part of the nature of what America is about.
The War Over Patriotism
When critics challenge Barack Obama's patriotism, his supporters have a ready reply: True patriotism has nothing to do with little flags on politicians' lapels. It's not about symbols; it's about actions. It's not about odes to American greatness; it's about taking on your government when it goes astray.
But there Obama is, in his first TV advertisement of the general-election campaign, talking about his "deep and abiding faith in the country I love." And there, perched below his left shoulder, is a subtle, but not too subtle reminder: a tiny American flag.
Obama's no fool. He may not believe that things like flag pins should matter politically, but he knows the difference between should and does. Since Vietnam, the ability to associate oneself with patriotic symbols has often been the difference between Democrats who win and Democrats who lose. Why couldn't George McGovern buy a white working-class vote in 1972? Partly, as the great campaign chronicler Theodore White noted, because virtually every member of Richard Nixon's Cabinet wore a flag lapel button, and no one in McGovern's entourage did. Michael Dukakis lost in 1988 because as governor of Massachusetts, he vetoed a bill requiring teachers to lead students in the Pledge of Allegiance, a veto the Republicans never let him forget.
Obama is trying to follow a different path, blazed by Robert F. Kennedy, who in 1967--just as he was coming out against the Vietnam War--co-sponsored legislation raising penalties for protesters who desecrate the flag. For his part, John McCain is a walking American flag, his heroic biography at the root of his entire campaign. What both campaigns understand is that American patriotism wears two faces: a patriotism of affirmation, which appeals more to conservatives, and a patriotism of dissent, particularly cherished by liberals. Both brands are precious, and both are dangerous. And in this campaign, the candidate who embodies the best of both will probably win.
Preserving the Past
On the surface, defining patriotism is simple. It is love and devotion to country. The questions are why we love it and how we express our devotion. That's where the arguments begin.
The conservative answer is implicit in the title of John McCain's 1999 book, Faith of My Fathers. Why should we love America? In part, at least, because our forefathers did. Think about the lyrics to America ("My Country, 'Tis of Thee"): "Land where my fathers died,/ Land of the Pilgrims' pride." Most liberals don't consider those the best lines of the song. What about the Americans whose fathers died somewhere else? What about all the nasty stuff the Pilgrims did? But conservatives generally want to conserve, and that requires a reverence for the past. What McCain's title implies is that patriotism isn't a choice; it's an inheritance. Being born into a nation is like being born into a religion or a family. You may be called on to reaffirm the commitment as you reach adulthood--as McCain did by joining the military--but it is impressed upon you early on, by those who have come before.
That's why conservatives tend to believe that loving America today requires loving its past. Conservatives often fret about "politically correct" education, which forces America's students to dwell on its past sins. They're forever writing books like America: The Last Best Hope (by William J. Bennett) and America: A Patriotic Primer (by Lynne Cheney), which teach children that historically the U.S. was a pretty nifty place. These books are based on the belief that our national forefathers are a bit like our actual mothers and fathers: if we dishonor them, we dishonor ourselves. That's why conservatives got so upset when Michelle Obama said that "for the first time in my adult lifetime, I am really proud of my country" (a comment she says was misinterpreted). In the eyes of conservatives, those comments suggested a lack of gratitude toward the nation that--as they saw it--has given her and the rest of us so much.
Conservatives know America isn't perfect, of course. But they grade on a curve. Partly that's because they generally take a dimmer view of human nature than do their counterparts on the left. When evaluating America, they're more likely to remember that for most of human history, tyranny has been the norm. By that standard, America looks pretty good. Conservatives worry that if Americans don't appreciate--and celebrate--their nation's past accomplishments, they'll assume the country can be easily and dramatically improved. And they'll end up making things worse. But if conservatives believe that America is, comparatively, a great country, they also believe that comparing America with other countries is beside the point. It's like your family: it doesn't matter whether it's objectively better than someone else's. You love it because it is yours.
The President who best summoned this brand of patriotism was Ronald Reagan. After the humiliation of Vietnam, stagflation and the Iran hostage crisis, Reagan--the nation's oldest President--served as a living link to a stronger, prouder, earlier America. "I would like to be President because I would like to see this country become once again a country where a little 6-year-old girl can grow up knowing the same freedom that I knew when I was 6 years old, growing up in America," he once declared. As a matter of historical fact, that statement was downright bizarre. When Reagan was 6, in 1917, women and most blacks couldn't vote, and America's entry into World War I was whipping up an anti-German frenzy so vicious that some towns in Reagan's native Midwest banned the playing of Beethoven and Brahms. But for Reagan, who sometimes confused movies with real life, history usually meant myth. In his mind, American history was the saga of brave, good-hearted men and women battling daunting odds but forever trying to do the right thing. His favorite TV show was Little House on the Prairie.
As President, Reagan convinced many Americans that they were living in that mythic land once again. He was a master at associating himself with America's cherished symbols. The images in his 1984 "Morning in America" ad--the fresh-faced lad on his paper route, the proud mother in the simple church watching her daughter walk down the aisle, the burly man gently hoisting an American flag--moistened even many liberal eyes. In fact, Reagan practically became one of those symbols himself: the cowboy President, sitting astride his horse, framed by a rugged Western terrain.
Read the rest of this article.
I want to be clear here that I think this is a great move on the part of the Wilber realm of integral. Visser raised some points that many of us agree with, and he also made some unfair generalizations, so it's great to see Mark and Sean engage the criticism rather than ignore it.
The more dialogue that can be generated between the various integral "movements," the better the whole integral enterprise will become.
The Academic Emergence of Integral Theory
Reflections on and Clarifications of the 1st Biennial Integral Theory Conference
Mark D. Forman, Ph.D.
and Sean Esbjörn-Hargens, Ph.D.
As founders and organizers of the 1st Integral Theory Conference we feel moved to respond to Frank Visser's latest posting (“Assessing Integral Theory”). We do this in the spirit of dialogue and out of a sense that his characterization of our event was misleading and inaccurate in important ways. To be fair, Visser's article is less about the conference and more about what constitutes theory building and the checking of its validity. His main focus is on how Wilber has failed to build theory and have it validated in a scientific or academic fashion. We would like to raise several points relevant to this.
First, however, we would like to underscore that we agree with elements of Visser's article.
- We agree that Integral Theory (broadly defined or defined simply as Wilber's work) has not yet made strong enough inroads into the academic world. We recognize that the burden is on those of us engaged in Integral Theory to help it conform more strongly to the norms of academic discourse, research, and critical analysis.
- Likewise, we agree, as academics, that Ken's writing has generally fallen in a place in between traditional academic discourse and more general popular philosophical discourse. While both of us feel greatly indebted to Ken for his obvious contribution to the Integral movement and honor the extent to which he has indeed sought empirical support (narrow and broad) for many of his ideas, we think that Visser rightly points out the challenges his writing style presents from an academic point of view.
- Finally, we also agree that much value could have come out of Wilber engaging with patience and curiosity the various critiques the Integral World website has housed. We would be the first to read and study such dialogues closely were they to take place. And yet we recognize these squandered opportunities while simultaneously not being convinced that Wilber's silence has only been a matter of him being a self-referential jerk. Nor are we sure that the general lack of response by Wilber's students and proponents of his work is simply the result of them sleeping with the devil. We will address this issue further at the end of this essay.
All this said—and we believe these are some substantial areas of agreement—we also feel that Visser's overall characterization of the relationship between Integral Theory and academia misses the mark in several places. More centrally, we believe that his characterization of our conference as a place where “a kind of Wilber celebration is staged”—which we take to mean it will be an unthinking, uncritical, or idol-worshipping look at his work—is both inaccurate and demonstrates the same kind of dismissive tone that he has so vehemently charged Wilber with using with his critics. This accusation appears especially cynical given that much of the conference has been explicitly and obviously designed to address the kind of critiques that Visser's website has showcased over the years. To us it was notable that he primarily highlighted the potential downsides of the conference (e.g., to be more of the same “self-referential discourse” he attributes to Wilber) without presenting much, if any, of its possible opportunities (e.g., the first open academic space that is beginning a much needed and arguably far overdue process of critical reflection and application of Integral Theory).
We would therefore like to outline a few points that give a more accurate view of the conference and to provide what we feel is a more balanced impression—in the context of discussing the conference—of the relationship between Integral Theory and academia. We will end this response with some of our concerns with the quality and nature of the work found on the Integral World website itself.
Read their whole response.
We have a long way to go to be a great nation.
Courtesy of Yoism.org, the world's first open source religion.
Thursday, July 03, 2008
CALLS FOR SUBMISSIONS
Integral Review: A Transdisciplinary and Transcultural Journal for New Thought, Research, and Praxis
Please circulate this notice to your networks and help spread the word!
Submissions deadlines are approaching for publication in Integral Review’s regular December issue!
§ July 31, 2008 for academic works
§ September 30, 2008 for non-academic works
See http://integral-review.org for the types of works IR welcomes, and for submission guidelines.
# # #
The deadline for submissions to the 2009 Special Issue “Toward Development of Politics and the Political” is December 31, 2008.
See http://integral-review.org/submissions/special-issue.asp for the full Call for Papers.
Salon ran a nice interview with Karl Giberson the other day -- he is the author of Saving Darwin: How to Be a Christian and Believe in Evolution. His book is the latest in the new literary genre of Evolution vs. Religion (OK, I just made that up, but long-time readers of this blog will know what I am talking about).
Can't Darwin and God get along?
Of course they can, argues physicist and theologian Karl Giberson, if only many believers were more sophisticated and atheists less dogmatic.
By Vincent Rossmeier
July 1, 2008 | With biologist Richard Dawkins leading the way, many scientists today are locked in an unending match of whack-a-mole with Christian creationists, who insist that God created heaven, earth and humanity in its present form, and with disciples of intelligent design who want to expel evolution from its scientific prominence in public schools. If you've been following the battle, you might be inclined to believe that Americans are faced with a choice between believing in God and scientific fact.
In his new book, "Saving Darwin: How to Be a Christian and Believe in Evolution," Karl Giberson calls this a false choice. A professor of physics at Eastern Nazarene College, and director of the Forum on Faith and Science at Gordon College, Giberson believes in evolutionary theory as adamantly as he does in God. For Giberson, evolution and Christianity are not in competition but complement one another. Holding equal disdain for creationists who read the Bible literally and scientists who disregard God altogether, Giberson seeks a middle way, and attempts to resuscitate Darwin's reputation as both a religious man and a scientist. In conversation, Giberson possesses a boundless inquisitiveness typical of many scientists, but also displays the wry wit of a seasoned polemicist. He seems to know how to counteract your best arguments before you have even made them.
Why does Darwin need to be saved?
He has been vilified in American evangelical culture and even more broadly than that. Yet his important contribution to science reaches into theology and religion, and so it's important to rehabilitate him so that you can't simply call something Darwinist and have people say, "Oooo, that smells bad."
Why do misconceptions about Darwin persist?
Because in the latter part of the 20th century, evolution became identified with negative social agendas, and some very effective polemicists like Henry Morris and Ken Ham convinced people that evolution was responsible for the breakdown of the family and drug abuse and all manner of evil. Christians who tend to see satanic or sinister influences behind those things were only too ready to demonize Darwin and say he had an agenda to destroy their faith. In their eyes, Darwinism destroyed belief in God the creator.
Darwinism became associated with repugnant beliefs like Nazism and eugenics. But as you point out, evolution doesn't make judgments, it merely describes.
Right. There's an important distinction between a theory that tells us the way the world is and a theory that tells us the way it ought to be. In practice, however, we think we should behave by the way we think the world is. That's why there's such an intense debate about homosexuality. Conservatives don't want homosexuality to be perceived as something natural because that would force them to reevaluate the way we treat it from a moral perspective. While it's true that you can't justify eugenics on the basis of Darwin's observations, as soon as genetics was recognized to be as important as it is, people began to realize that genetics could be used to improve the species. This was kind of a natural extension but it's certainly not implied in Darwin's work.
Aren't some people threatened by evolution because they can't reconcile biblical literalism, or "young-earth" creationism, with the fact that the earth is not 10,000 years old but billions?
Yes, but young-earth theory is an interpretation of Genesis that requires that you bring a certain set of suspect assumptions to the text. The early chapters of Genesis do not read like history. They have a different sort of character to them. People who read Hebrew and understand the ancient Near Eastern worldview, and the cosmology that informed it, have given us ample reasons why you would not read Genesis that way, even if you weren't worried about reconciling it with a billion-year-old planet.
One of the interesting things he rails against (and that term may be a little strong) is Biblical literalism and the commensurate distrust of academic scholarship. In the fundamentalist community there is HUGE distrust of academics -- they have been so strongly defined as the enemy in the "culture wars" that all things academic are rejected out of hand. This in part explains the rejection of science in general.
Anyway, here is a good quote:
Biblical literalism is very simple. You read the Bible in English and you say to yourself that these are the things God wrote down through a secretary a long time ago, and all I need to do is read this in English and that's all the work I have to do to understand it. Who wouldn't want that to be the case? If you try to tell these people that they need some egghead scholar from Harvard, who can read Hebrew, to come in and help them with it, that seems offensive and alienating, and people aren't attracted to that. So I think the ability of American religion to invent itself and to appeal to common denominators, sometimes the lowest denominator, has allowed these evangelical movements to flourish with their own agendas.
Here is one more quote -- I quite like this guy. If more Christians held this worldview, the world might be a little less violent.
I'm not at all uncomfortable saying that religious experiences can be genuine. A lot of them are fraudulent and some of them are epileptic seizures or whatever. But I believe in God, I believe God is personal and that God exists and cares about the created order. I think it's a very reasonable belief that God interacts with creation and that experiences people have of interacting with God are profound and deeply meaningful.
But you reject the idea that God tinkers and has his hand in day-to-day processes, so how do nature and God interact?
That's the tough question. You should rewind the tape and erase the question because I don't really have a good answer. What I would say, however, is when you know a lot about how something works, it's reasonable to rule out certain things and say, well, I don't think it could be this or that. When you know almost nothing about how something works, you need to be more humble. We don't know how we interact with the world. Somehow you got it into your head that you were going to call and talk to me about this book. Some kind of vague intention, purposeful agenda emerged in your mind, and it got translated into a whole set of actions, and now we're talking on the phone. We don't understand that.
Consciousness is a very deep mystery. All of our models say consciousness shouldn't be possible, that it should just be atoms and molecules in your brain randomly doing things. Nothing that we've developed for a model of how human intentionality works makes sense of our own experience of the world. But here we are, doing things in the world. Somehow a conscious-like starting point for human actions emerges and we are able to execute things in the world and change physical reality. Now, we know this happens, this isn't a mystical theory, you can see this happening every day.
How do we know there isn't some similar mechanism by which God interacts with the world, that God can be understood as a spirit, that God is more like consciousness than a material object? If we have an all-encompassing, pervasive personal being that has created the entire universe, and is coupled to that universe in some way, it just seems to me that the notion of God acting through the world without violating its laws is no more mysterious than us acting through that same world. So I'd say to Dawkins, until you explain to me how human beings interact with the world, don't tell me that God couldn't interact with the world in the same way we do.
Be sure to read the whole interview.
Tags: Salon, Can't Darwin and God Get Along?, Karl Giberson, Saving Darwin, How to Be a Christian and Believe in Evolution, Vincent Rossmeier, Religion, evolution, culture, Biblical Literalism, God
Liberal Bloggers Accuse Obama of Trying to Win Election
Posted July 2, 2008 | 07:43 AM (EST)
The liberal blogosphere was aflame today with new accusations that Sen. Barack Obama (D-Ill) is trying to win the 2008 presidential election.
Suspicions about Sen. Obama's true motives have been building over the past few weeks, but not until today have the bloggers called him out for betraying the Democratic Party's losing tradition.
"Barack Obama seems to be making a very calculated attempt to win over 270 electoral votes," wrote liberal blogger Carol Foyler at LibDemWatch.com, a blog read by a half-dozen other liberal bloggers. "He must be stopped."
But those comments were not nearly as strident as those of Tracy Klugian, whose blog LoseOn.org has backed unsuccessful Democratic candidates since 2000.
"Increasingly, Barack Obama's message is becoming more accessible, appealing, and yes, potentially successful," he wrote. "Any Democrat who voted for Dukakis, Mondale or Kerry should regard this as a betrayal."
Liberal bloggers said that they would be watching Sen. Obama's vice-presidential selection process "very closely" for signs that he is plotting to win the election.
"Barack Obama still has a chance to pick someone disastrous as a sign that he wants to lose this thing," Ms. Foyler wrote. "If not, he should brace himself for some really mean blog posts."
Biological anthropologist Helen Fisher writes widely on the biological basis of love, sex and relationships. She is a consultant for the computer dating firm Chemistry.com. We talk about the potential therapeutic uses of the neurochemistry of love.
Lost in the hubbub over Governor Bobby Jindal's decision this past Friday to sign a bill that allows teachers in his state to "supplement" classes on evolution with talk of creationism is one simple, basic fact: the human species isn't intelligently designed.
When you get right down to it, from an engineering perspective, the design of the human mind (and for that matter the human body) is a bit of a mess.
Take, for instance, human memory, and the trouble we often have in remembering even the most basic facts -- where did we put our keys? where did we park our car? -- because our brains so often blur our memories together. Human eyewitness testimony is often no match for even a low-rent surveillance camera, and memory can fail even in life-or-death circumstances. (Six percent of all skydiving fatalities, for instance, involve divers who forgot to pull their ripcords.)
Our troubles with memory in turn lead to an unending litany of problems that the psychologist Timothy Wilson collectively refers to as "mental contamination," in which irrelevant information, ranging from the physical attractiveness of political candidates to random numbers on a roulette wheel, subconsciously clouds human judgments. If an ugly child throws an ice-filled snowball, for instance, we judge that child to be delinquent, but when an especially attractive child does the same thing, we excuse him, saying he's just "having a bad day." A study published earlier this month showed that people's moral judgments are more severe when made in a disgusting office strewn with soiled pizza boxes than in an office that is neat as a pin; another, which appeared just last week in the Proceedings of the National Academy of Sciences, shows that voters are more likely to favor school policies if the balloting takes place in a school than if it takes place in an apartment building. We may aspire, as Aristotle thought, to be "the rational animal," but in reality the flotsam and jetsam of barely conscious memory frequently intercede.
At this point, 30 years after the Nobel Laureate Daniel Kahneman and his late collaborator Amos Tversky started documenting a rash of fallacies in human reasoning, the idea that the human mind would be "perfect in His image" is as outdated (and narcissistic) as the idea that the solar system would revolve around the planet earth.
Imperfections riddle the body as well; the human spine supports 70% of our body weight with a single column, where four might have distributed the load better (greatly reducing the incidence of debilitating back pain), and the human retina is effectively installed backwards, with its array of outgoing neural fibers coming out of the front rather than the back, saddling us with an entirely needless blind spot.
The only theory that can really make sense of these needless imperfections is Darwin's theory of natural selection, which holds that humans (and all other life forms) evolve through a blind process known as descent-with-modification, in which new life forms represent random modifications of earlier life forms -- with no central overseer to guide the process. Such a random process can, over time, lead populations of creatures to become more adapted to their environment, but it is also vulnerable to getting stuck, in the sort of good-enough-but-not-perfect solutions that mathematicians call local maxima.
A local maximum is like a moderately high peak in a rugged mountain range filled with other peaks, some of them considerably higher: a peak at the top of the treeline, say, while snow-capped summits loom well above it. The process of natural selection is vulnerable to such limits for two reasons: it is blind, and it generally takes only small steps. As such, it can easily get stuck on low-lying peaks that are impressive but well short of the highest possible mountaintop, designs that are "good enough for government work" but far from perfect.
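A toy simulation makes the trap concrete. In the sketch below (the landscape function, parameters, and names are all invented for illustration), a blind, small-step search only ever accepts improvements, and so it settles on the lower of two peaks and can never cross the valley to the higher one:

```python
import math
import random

def fitness(x):
    """Toy rugged landscape: a modest peak near x = 1, a much higher one near x = 4."""
    return 3.0 * math.exp(-(x - 1) ** 2) + 10.0 * math.exp(-(x - 4) ** 2)

def hill_climb(x, step=0.05, tries=10_000, seed=0):
    """Blind, small-step search: accept a random nearby point only if it
    improves fitness. Like selection, it has no foresight and never takes
    a temporarily worse step, so it cannot cross the valley between peaks."""
    rng = random.Random(seed)
    for _ in range(tries):
        candidate = x + rng.uniform(-step, step)
        if fitness(candidate) > fitness(x):
            x = candidate
    return x

peak = hill_climb(0.5)   # start in the basin of the lower peak
# The search settles near x = 1 (fitness about 3) and never finds
# the far higher peak near x = 4 (fitness about 10).
print(round(peak, 2), round(fitness(peak), 2))
```

Allowing occasional downhill steps (as simulated annealing does) is exactly what this kind of search lacks, which is why "good enough" designs persist.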
Darwin's theory gives a natural explanation for why poorly designed features should be common in biology. The theory of intelligent design, in contrast, has a serious problem explaining such phenomena: an intelligent designer who could perceive the whole landscape could just pick us up and move us to higher ground. That this has never happened is clear testament both to the wisdom of the theory of natural selection and to the implausibility of intelligent design.
The problem with the Louisiana law is not just that it seeks to mix church and state, a situation that the Constitution's framers rightly sought to avoid, but that it is predicated on the assumption that creationists have a reasonable theory with which to counter evolution, when in truth they simply don't.
This fascinating, brilliant 20-minute video narrates the history of the "Amen Break," a six-second drum sample from the b-side of a chart-topping single from 1969. This sample was used extensively in early hiphop and sample-based music, and became the basis for drum-and-bass and jungle music -- a six-second clip that spawned several entire subcultures. Nate Harrison's 2004 video is a meditation on the ownership of culture, the nature of art and creativity, and the history of a remarkable music clip.
John "I Didn't Say What I Said" McCain
by Todd Beeton, Wed Jul 02, 2008 at 08:15:15 PM EST
As I've written before, John McCain has this amusing little habit of denying his own words. It's as if he's not aware that we have these devices nowadays that record sound and images and still other devices that replay them. Whether it be his "100 years in Iraq" comment or his "it's not too important when the troops come home" line, when called on it, he simply chooses to pretend he never said anything of the kind, crying "Out of context!" like a child trying to avoid punishment for something everyone knows he did.
McCain's latest "I didn't say what I said" moment -- this one on the subject of his self-professed lack of economic expertise -- came this morning on Good Morning America:
"Good Morning America's" Robin Roberts...asked McCain why he went abroad when the No. 1 issue for voters was the U.S. economy.
"You have admitted that you're not exactly an expert when it comes to the economy," Roberts began.
"I have not. I have not. I actually have not," McCain interrupted. "I said that I am stronger on national security issues because of all the time I spent in the military. Very strong on the economy. I understand it. I have a lot more experience than my opponent."
In response, The DNC has released this excellent compilation video of all the times McCain has let us know just how NOT an expert on the economy he is.
Is this the "straight talk" guy all the media seems to be in love with? It's about time someone calls him on this kind of stuff.
A Really Long Strange Trip
How some dedicated scientists and former flower children managed to bring hallucinogenic drug research back to mainstream labs after more than 30 years.
By Jeneen Interlandi | Newsweek Web Exclusive
Jul 2, 2008 | Updated: 4:47 p.m. ET Jul 2, 2008
It's been more than a year since John Hayes, a professor of pastoral counseling at Loyola College, ingested psilocybin, the active ingredient in magic mushrooms. He claims that the series of three eight-hour highs, administered in a laboratory-turned-living room at Johns Hopkins Medical School in Baltimore, has made him a calmer, less fearful person. "It gave me this sense that space and time are human constructions that can collapse," says Hayes, 59. "The ultimate reality is something beyond those constructions, and more importantly, everything in the world is connected."
These are familiar sentiments to Roland Griffiths, the scientist who led a study of 36 volunteers, most of whom detailed similar experiences after taking the hallucinogenic compound. In a report published on July 1st in the Journal of Psychopharmacology, more than 60 percent of those intrepid volunteers reported substantial increases in life satisfaction a year after the experiment. "We have people saying these eight hours in the lab are among the most meaningful in their lives," says Griffiths. "Some rank it alongside births and deaths of loved ones." (Eleven volunteers experienced side effects such as fear or anxiety, only eight of them for a significant portion of the session.) Despite the long-held promise that such substances might reveal the secrets of the conscious mind, the study of hallucinogenic compounds has always been controversial. Once a thriving area of research, projects like these ground to a halt in the late 1960s when a media frenzy over rampant recreational use led the federal government to criminalize both psilocybin and LSD. There were reports of college students diving out of windows, staring at the sun until they went blind or developing schizophrenia after taking the drugs. While Griffiths insists many of these reports were pure myth, they scared scientists and administrators away.
For a time, it seemed that convincing America's premier research institutions to fund or sponsor research like this was nigh on impossible. In fact, the Journal of Psychopharmacology study represents one of the first yields of a 30-year effort to rebuild legitimate psychedelic research programs from the ashes of the 1960s.
So how did Griffiths and his colleagues get the funding and approval to bring magic mushrooms and their pharmacological siblings back into mainstream labs? It's been a long strange trip. In fact, the story of how a small group of scientists worked for decades to revive scientific interest in psychedelic drugs and attract private donors to fill the funding gap left by a skeptical establishment is almost as fascinating as the research itself. Griffiths and Purdue pharmacologist Dave Nichols were just beginning their careers when the excesses of their forebears effectively shut down the field of psychedelic research in the early 1970s. "There's just a handful of us driving this, and we're sort of all in the time frame where we just caught the tail-end of the whole Haight-Ashbury period," says Nichols. "But we saw some amazing effects, and the interest never went away, even if the research did." Some of the most striking of those effects had been seen in the terminally ill, who often lost their fear of death and found comfort and peace from drugs such as psilocybin. "The hospice movement had yet to begin," says Griffiths. "At the time we were just leaving terminal patients in a sterile corner of the hospital."
But with federal agencies reluctant to fund research into illegal substances and major universities unwilling to chance a 1960s-style meltdown (should the chemicals make their way from labs to dorm rooms), those early threads could not be pursued. So Nichols focused on the biochemistry of psychedelics, relying exclusively on animal models. And Griffiths went on to study the influence of other substances on behavior. Still, the questions that first sparked their curiosity—namely how a particular molecule could so profoundly influence one's perception of the world—lingered on. Until, that is, Nichols and his colleagues rose to a level of prominence that they could leverage to probe their still-controversial interest in these substances.
"I had been saying for decades that you could still do the research if you had private funding," says Nichols. "Finally I realized if I waited any longer, I'd be retired and I'd really regret not having done anything with it."
Read the rest of the article.
Wednesday, July 02, 2008
Bush Tours America To Survey Damage Caused By His Disastrous Presidency
Here's a taste of the article for those who may have somehow missed this:
So what does this mean for science?
Scientists are trained to recognize that correlation is not causation, that no conclusions should be drawn simply on the basis of correlation between X and Y (it could just be a coincidence). Instead, you must understand the underlying mechanisms that connect the two. Once you have a model, you can connect the data sets with confidence. Data without a model is just noise.
But faced with massive data, this approach to science — hypothesize, model, test — is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the "beautiful story" phase of a discipline starved of data) is that we don't know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on.
Now biology is heading in the same direction. The models we were taught in school about "dominant" and "recessive" genes steering a strictly Mendelian process have turned out to be an even greater simplification of reality than Newton's laws. The discovery of gene-protein interactions and other aspects of epigenetics has challenged the view of DNA as destiny and even introduced evidence that environment can influence inheritable traits, something once considered a genetic impossibility.
In short, the more we learn about biology, the further we find ourselves from a model that can explain it.
There is now a better way. Petabytes allow us to say: "Correlation is enough." We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.
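For concreteness, here is a minimal sketch of that kind of model-free pattern hunt (the variable names and data are synthetic, invented for this example): compute pairwise correlations across all variables and keep whatever clears a threshold, with no hypothesis about why any pair should co-vary.

```python
import math
import random

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Synthetic data: four independent noise variables, plus one ("e")
# secretly tied to "a". The scan knows nothing about that link.
rng = random.Random(42)
data = {name: [rng.gauss(0, 1) for _ in range(500)] for name in "abcd"}
data["e"] = [x + rng.gauss(0, 0.1) for x in data["a"]]

names = sorted(data)
hits = [(p, q) for i, p in enumerate(names) for q in names[i + 1:]
        if abs(pearson(data[p], data[q])) > 0.9]
print(hits)  # -> [('a', 'e')]: the scan surfaces the link, but not *why* they co-vary
```

This is Anderson's point in miniature: the pattern falls out of brute computation. The critics' point survives it, too; the scan cannot distinguish a causal link from a coincidence or a shared hidden cause.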
The best practical example of this is the shotgun gene sequencing by J. Craig Venter. Enabled by high-speed sequencers and supercomputers that statistically analyze the data they produce, Venter went from sequencing individual organisms to sequencing entire ecosystems. In 2003, he started sequencing much of the ocean, retracing the voyage of Captain Cook. And in 2005 he started sequencing the air. In the process, he discovered thousands of previously unknown species of bacteria and other life-forms.
If the words "discover a new species" call to mind Darwin and drawings of finches, you may be stuck in the old way of doing science. Venter can tell you almost nothing about the species he found. He doesn't know what they look like, how they live, or much of anything else about their morphology. He doesn't even have their entire genome. All he has is a statistical blip — a unique sequence that, being unlike any other sequence in the database, must represent a new species.
This sequence may correlate with other sequences that resemble those of species we do know more about. In that case, Venter can make some guesses about the animals — that they convert sunlight into energy in a particular way, or that they descended from a common ancestor. But besides that, he has no better model of this species than Google has of your MySpace page. It's just data. By analyzing it with Google-quality computing resources, though, Venter has advanced biology more than anyone else of his generation.

This kind of thinking is poised to go mainstream.
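The "statistical blip" idea can be illustrated with a toy comparison. In this sketch the sequences are made up, and Jaccard similarity over k-mers is only a crude stand-in for the statistical matching a real pipeline performs; the point is just that novelty is declared by dissimilarity to everything on file:

```python
def kmers(seq, k=4):
    """All overlapping substrings of length k in a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def best_match(read, database, k=4):
    """Best Jaccard similarity between the read's k-mer set and the
    k-mer set of any sequence in the database."""
    rk = kmers(read, k)
    return max(len(rk & kmers(s, k)) / len(rk | kmers(s, k)) for s in database)

# Invented example data: a tiny "reference database" and two reads.
database = ["ATGGCGTACGTTAGCATCGA", "TTGACCGGTAACGTTACGGA"]
known = "ATGGCGTACGTTAGCATCGT"   # one base off a database entry
novel = "CCCCAAAATTTTGGGGCCAA"   # resembles nothing on file

print(best_match(known, database) > 0.5)   # True: maps to a known sequence
print(best_match(novel, database) > 0.5)   # False: a "blip" -- possibly new
```

Notice what the verdict on `novel` does and does not say: the read is unlike anything recorded, and that is all. Everything else about the organism remains unknown, which is exactly Anderson's description of Venter's discoveries.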
~C4Chaos offered one good interpretation of this article:
This reminded me of the book, The Black Swan (see my review). Theoretical models are useful as starting points and for framing, but in the long run our human tendency to categorize (Platonicity) and explain the causes of everything with theories (narrative fallacy) backed up with partial evidence (confirmation bias; fallacy of silent evidence) while concocting models of reality (ludic fallacy) makes us blind to Black Swans (i.e., high-impact, hard-to-predict, and rare events beyond the realm of normal expectations).

Many science bloggers disagreed with Anderson's conclusions. Deepak Singh stated bluntly, Chris Anderson, you are wrong.
How can I (or others who actually still do science) take the new paradigms of computing (by the way, bioinformaticians have been using methods typically used in “collective intelligence” for years), and take biology, which is now very much a digital science, and combine them with our scientific reasoning, our ability to take phenomena and develop models that explain those phenomena and do something meaningful with them. I have seen many computer scientists develop some very elegant theoretical models for biological information, but often without any biological context. Yes scientists need to adopt new techniques, develop new theoretical approaches, even rethink the very basic tenets that they know, but to say the scientific method is dead or approaching the end is sensationalist in the least, and completely uneducated in the extreme.

Andrew at the Social Statistics blog also adds his perspective in The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.
1. Anderson has a point--there is definitely a tradeoff between modeling and data. Statistical modeling is what you do to fill in the spaces between data, and as data become denser, modeling becomes less important.
2. That said, if you look at the end result of an analysis, it is often a simple comparison of the "treatment A is more effective than treatment B" variety. In that case, no matter how large your sample size, you'll still have to worry about issues of balance between treatment groups, generalizability, and all the other reasons why people say things like, "correlation is not causation" and "the future is different from the past."
3. Faster computing gives the potential for more modeling along with more data processing. Consider the story of "no pooling" and "complete pooling," leading to "partial pooling" and multilevel modeling. Ideally our algorithms should become better at balancing different sources of information. I suspect this will always be needed.
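Andrew's third point can be sketched numerically. In this toy example (the group names, data, and fixed variance parameters sigma2 and tau2 are all invented; a real multilevel model would estimate them from the data), the partial-pooling estimate is a precision-weighted compromise between a group's own mean ("no pooling") and the grand mean ("complete pooling"):

```python
# Toy grouped data illustrating no pooling vs. complete vs. partial pooling.
groups = {
    "A": [2.1, 1.9, 2.0, 2.2],
    "B": [3.5],                      # tiny group: its raw mean is unreliable
    "C": [1.0, 1.2, 0.9, 1.1, 1.0],
}

def mean(xs):
    return sum(xs) / len(xs)

grand = mean([x for xs in groups.values() for x in xs])   # complete pooling

def partial_pool(xs, grand_mean, sigma2=1.0, tau2=0.5):
    """Precision-weighted compromise between the group's own mean
    (no pooling) and the grand mean (complete pooling). sigma2
    (within-group variance) and tau2 (between-group variance) are
    assumed known here; a real multilevel model estimates them."""
    w_group = len(xs) / sigma2      # precision of the group's own mean
    w_grand = 1.0 / tau2            # precision of the shared grand mean
    return (w_group * mean(xs) + w_grand * grand_mean) / (w_group + w_grand)

for name, xs in sorted(groups.items()):
    print(name, round(mean(xs), 2), "->", round(partial_pool(xs, grand), 2))
# The one-observation group B is pulled two-thirds of the way toward the
# grand mean; the larger groups A and C move much less.
```

More data shifts the weight toward each group's own mean, which is Andrew's point: denser data makes the model matter less, but the balancing act itself never goes away.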
More dissent comes from Drew Conway at Zero Intelligence Agents in The Hubris of ‘The End of Theory’.
Go read the rest of his entry -- it's one of the better ones.

Where is the Life we have lost in living?
Where is the wisdom we have lost in knowledge?
Where is the knowledge we have lost in information?
Mr. Anderson is well advised to consider these questions posed by Eliot, as his assertion that the “scientific method is becoming obsolete” by the deluge of data stored in the “clouds” of Google is not only misinformed; it presents a dangerous framework for undermining our ability to truly understand individuals as they exist, and have existed, in all aspects of life.
Now back to Edge issue 248, where there are many responses to Anderson's article. Edge asked George Dyson, Kevin Kelly, Stewart Brand, W. Daniel Hillis, Sean Carroll, Jaron Lanier, Joseph Traub, John Horgan, Bruce Sterling, and Douglas Rushkoff to contribute their insights into The End of Theory -- the results are intriguing.
Here are teaser bits from some of the respondents:
GEORGE DYSON: Just as we may eventually take the brain apart, neuron by neuron, and never find the model, we may discover that true AI came into existence without anyone ever developing a coherent model of reality or an unambiguous theory of intelligence. Reality, with all its ambiguities, does the job just fine. It may be that our true destiny as a species is to build an intelligence that proves highly successful, whether we understand how it works or not.

And here is a full response from Douglas Rushkoff, someone I have admired for years, since his first couple of books.
KEVIN KELLY: My guess is that this emerging method will be one additional tool in the evolution of the scientific method. It will not replace any current methods (sorry, no end of science!) but will complement established theory-driven science. Let's call this data intensive approach to problem solving Correlative Analytics. I think Chris squanders a unique opportunity by titling his thesis "The End of Theory" because this is a negation, the absence of something. Rather it is the beginning of something, and this is when you have a chance to accelerate that birth by giving it a positive name.
STEWART BRAND: Digital humanity apparently crossed from one watershed to another over the last few years. Now we are noticing. Noticing usually helps. We'll converge on one or two names for the new watershed and watch what induction tells us about how it works and what it's good for.
W. DANIEL HILLIS: Chris Anderson says that "this approach to science — hypothesize, model, test — is becoming obsolete". No doubt the statement is intended to be provocative, but I do not see even a little bit of truth in it. I share his enthusiasm for the possibilities created by petabyte datasets and parallel computing, but I do not see why large amounts of data will undermine the scientific method. We will begin, as always, by looking for simple patterns in what we have observed and use that to hypothesize what is true elsewhere. Where our extrapolations work, we will believe in them, and when they do not, we will make new models and test their consequences. We will extrapolate from the data first and then establish a context later. This is the way science has worked for hundreds of years.
I'm suspicious on a few levels.
First off, I don't think Google has been proven "right." Just effective for the moment. Once advertising itself is revealed to be a temporary business model, then Google's ability to correctly exploit the trajectory of a declining industry will itself be called into question. Without greater context, Google's success is really just a tactic. It's not an extension of human agency (or even corporate agency) but a strategic stab based on the logic of a moment. It is not a guided effort, but a passive response. Does it work? For the moment. Does it lead? Not at all.
Likewise, to determine human choice or make policy or derive science from the cloud denies all of these fields the presumption of meaning.
I watched during the 2004 election as market research firms crunched data in this way for the Kerry and Bush campaigns. They would use information unrelated to politics to identify households more likely to contain "swing" voters. The predictive modeling would employ data points such as whether the voters owned a dog or cat, a two-door or four-door car, how far they traveled to work, and how much they owed on their mortgage, to determine what kind of voters were inside. These techniques had no logic to them. Logic was seen as a distraction. All that mattered were the correlations, as determined by computers poring over data.
If it turned out that cat-owners with two-door cars were more likely to vote a certain way or favor a certain issue, then pollsters could instruct their canvassers which telephone call to make to whom. Kids with DVD players containing ads customized for certain households would show up on the doorsteps of homes, play the computer-assembled piece, leave a flyer, and head to the next one.
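The kind of theory-free modeling Rushkoff describes can be sketched in a few lines. This is a minimal, purely illustrative example using invented household data: it ranks consumer traits by how strongly they correlate with being a swing voter, with no causal reasoning anywhere in the process. The feature names and numbers are hypothetical, not drawn from any real campaign dataset.

```python
# Illustrative "correlative analytics": rank traits purely by correlation
# with the outcome, no theory of why any trait should matter.
# All data below is made up for the sketch.

# Each record: (owns_cat, two_door_car, long_commute) -> is_swing_voter
households = [
    ((1, 0, 1), 1), ((1, 0, 0), 1), ((1, 1, 1), 1),
    ((0, 1, 0), 0), ((0, 1, 1), 0), ((0, 0, 0), 0),
    ((1, 1, 0), 1), ((0, 0, 1), 0),
]
FEATURES = ["owns_cat", "two_door_car", "long_commute"]

def phi_correlation(xs, ys):
    """Phi coefficient between two binary variables (Pearson r on 0/1 data)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy) if sx and sy else 0.0

votes = [v for _, v in households]
ranked = sorted(
    ((name, phi_correlation([feats[i] for feats, _ in households], votes))
     for i, name in enumerate(FEATURES)),
    key=lambda t: abs(t[1]), reverse=True,
)
for name, r in ranked:
    print(f"{name}: r = {r:+.2f}")
```

On this toy data the cat-ownership column correlates perfectly with the vote while the other traits show no correlation at all, so a canvassing list built this way would target cat owners exclusively. That is exactly the point Rushkoff is making: the procedure finds and exploits the pattern without ever asking whether it means anything.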
Something about that process made me cynical about the whole emerging field of bottom-up, anti-taxonomy.
I'm all for a good "folksonomy," such as when kids tag their favorite videos or blog posts. It's how we know which YouTube clip to watch; we do a search and then look for the hit with the most views. But the numbers most certainly do not speak for themselves. By forgetting taxonomy, ontology, and psychology, we forget why we're there in the first place. Maybe the video consumer can forget those disciplines, but what about the video maker?
When I read Anderson's extremely astute arguments about the direction of science, I find myself concerned that science could very well take the same course as politics or business. The techniques of mindless petabyte churn favor industry over consideration, consumption over creation, and—dare I say it—mindless fascism over thoughtful self-government. They are compatible with the ethics-agnostic agendas of corporations much more than they are the more intentional applied science of a community or civilization.
For while agnostic themselves, these techniques are not without bias. While their bias may be less obvious than that of human scientists trained at elite institutions, their bias is nonetheless implicit in the apparently but falsely post-mechanistic and absolutely open approach to data and its implications. It is no more truly open than open markets, and ultimately biased in their favor. Just because we remove the limits and biases of human narrativity from science does not mean other biases don't rush in to fill the vacuum.
My thoughts:
I doubt that theory will ever go away -- it's an integral part of human consciousness. In an earlier post of a discussion from Seed Magazine, Michael Gazzaniga discusses his discovery of "the Interpreter," a "module" in the left hemisphere of the brain that seeks to explain our actions (even in the absence of any knowledge as to why we did something).
My sense is that this module is also responsible for all kinds of other theoretical activities. Human beings are defined, more than by any other trait, by their desire to make some sense of their world. Information overload will not change this inclination, but it will give us more data to work with.