
Wednesday, January 02, 2008

Edge World Question: "What Have You Changed Your Mind About?"

Edge's 2008 question is, "What Have You Changed Your Mind About?" Major thinkers from a wide variety of fields weigh in with their answers, which is always interesting.

Here are a few I liked (though I don't necessarily agree with them); I haven't gotten through all of the responses yet. One question does arise: why aren't there more women represented? In the first three or four pages, I only noticed one or two.

DOUGLAS RUSHKOFF
Media Analyst; Documentary Writer; Author, Get Back in the Box: Innovation from the Inside Out

The Internet

I thought that it would change people. I thought it would allow us to build a new world through which we could model new behaviors, values, and relationships. In the 90's, I thought the experience of going online for the first time would change a person's consciousness as much as if they had dropped acid in the 60's.

I thought Amazon.com was a ridiculous idea, and that the Internet would shrug off business as easily as it did its original Defense Department minders.

For now, at least, it's turned out to be different.

Virtual worlds like Second Life have been reduced to market opportunities: advertisers from banks to soft drinks purchase space and create fake characters, while kids (and Chinese digital sweatshop laborers) earn "play money" in the game only to sell it to lazier players on eBay for real cash.

The businesspeople running Facebook and MySpace are rivaled only by the members of these online "communities" in their willingness to surrender their identities and ideals for a buck, a click-through, or a better market valuation.

The open source ethos has been reinterpreted through the lens of corporatism as "crowd sourcing" — meaning just another way to get people to do work for no compensation. And even "file-sharing" has been reduced to a frenzy of acquisition that has less to do with music than with the ever-expanding hard drives of successive iPods.

Sadly, cyberspace has become just another place to do business. The question is no longer how browsing the Internet changes the way we look at the world; it's which browser we'll be using to buy and sell stuff in the same old world.

*****

HOWARD GARDNER
Psychologist, Harvard University; Author, Changing Minds

Wrestling with Jean Piaget, my Paragon

Like many other college students, I turned to the study of psychology for personal reasons: I wanted to understand myself better. And so I read the works of Freud, and I was privileged to have as my undergraduate tutor the psychoanalyst Erik Erikson, himself a sometime pupil of Freud. But once I learned about new trends in psychology, through contacts with another mentor, Jerome Bruner, I turned my attention to the operation of the mind in a cognitive sense — and I've remained at that post ever since.

The giant at the time — the middle 1960s — was Jean Piaget. Though I met and interviewed him a few times, Piaget really functioned for me as a paragon. In Dean Keith Simonton's term, a paragon is someone whom one does not know personally but who serves as a virtual teacher and point of reference. I thought that Piaget had identified the most important question in cognitive psychology (how does the mind develop?); developed brilliant methods of observation and experimentation; and put forth a convincing picture of development — a set of general cognitive operations that unfold in the course of essentially lockstep, universally occurring stages. I wrote my first books about Piaget; saw myself as carrying on the Piagetian tradition in my own studies of artistic and symbolic development (two areas that he had not focused on); and even defended Piaget vigorously in print against those who would critique his approach and claims.

Yet, now forty years later, I have come to realize that the bulk of my scholarly career has been a critique of the principal claims that Piaget put forth. As to the specifics of how I changed my mind:

Piaget believed in general stages of development that cut across contents (space, time, number); I now believe that each area of content has its own rules and operations, and I am dubious about the existence of general stages and structures.

Piaget believed that intelligence was a single general capacity that developed pretty much in the same way across individuals; I now believe that humans possess a number of relatively independent intelligences, and that these can function and interact in idiosyncratic ways.

Piaget was not interested in individual differences; he studied the 'epistemic subject.' Most of my work has focused on individual differences, with particular attention to those with special talents or deficits, and unusual profiles of abilities and disabilities.

Piaget assumed that the newborn had a few basic biological capacities — like sucking and looking — and two major processes of acquiring knowledge, which he called assimilation and accommodation. Nowadays, with many others, I assume that human beings possess considerable innate or easily elicited cognitive capacities, and that Piaget way underestimated the power of this inborn cognitive architecture.

Piaget downplayed the importance of historical and cultural factors — cognitive development consisted of the growing child experimenting largely on his own with the physical (and, minimally, the social) world. I see development as permeated from the first by contingent forces pervading the time and place of origin.

Finally, Piaget saw language and other symbol systems (graphic, musical, bodily, etc.) as manifestations, almost epiphenomena, of a single cognitive motor; I see each of these systems as having its own origins and as being heavily colored by the particular uses to which a system is put in one's own culture and one's own time.

Why I changed my mind is principally a matter of biography: some of the change has to do with my own choices (I worked for 20 years with brain-damaged patients), and some with the Zeitgeist (I was strongly influenced by the ideas of Noam Chomsky and Jerry Fodor, on the one hand, and by empirical discoveries in psychology and biology on the other).

Still, I consider Piaget to be the giant of the field. He raised the right questions; he developed exquisite methods; and his observations of phenomena have turned out to be robust. It's a tribute to Piaget that we continue to ponder these questions, even as many of us are now far more critical than we once were. Any serious scientist or scholar will change his or her mind; put differently, we will come to agree with those with whom we used to disagree, and vice versa. We differ in whether we are open or secretive about such "changes of mind," and in whether we choose to attack, ignore, or continue to celebrate those with whose views we are no longer in agreement.

*****

NICK BOSTROM
Philosopher, University of Oxford; Author,

Everything

For me, belief is not an all-or-nothing thing — believe or disbelieve, accept or reject. Instead, I have degrees of belief: a subjective probability distribution over different possible ways the world could be. This means that I am constantly changing my mind about all sorts of things as I reflect or gain more evidence. While I don't always think explicitly in terms of probabilities, I often do so when I give careful consideration to some matter. And when I reflect on my own cognitive processes, I must acknowledge the graduated nature of my beliefs.

The commonest way in which I change my mind is by concentrating my credence function on a narrower set of possibilities than before. This occurs every time I learn a new piece of information. Since I started my life knowing virtually nothing, I have changed my mind about virtually everything. For example, not knowing a friend's birthday, I assign a 1/365 chance (approximately) of it being the 11th of August. After she tells me that the 11th of August is her birthday, I assign that date a probability of close to 100%. (Never exactly 100%, for there is always a non-zero probability of miscommunication, deception, or other error.)
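(An aside of my own, not part of Bostrom's essay: the birthday case is ordinary Bayesian conditioning, and a short Python sketch makes the arithmetic explicit. The 99.9% reliability figure for the friend's report is an assumption chosen purely for illustration, standing in for the small chance of miscommunication or deception he mentions.)

# Illustrative sketch of the birthday update (my numbers, not Bostrom's).
# Assumption: the friend's statement is accurate with probability 0.999; if it
# is wrong, the true birthday is uniform over the remaining 364 days.

DAYS = 365
RELIABILITY = 0.999

prior = 1.0 / DAYS                           # P(birthday is Aug 11) before she tells me
like_true = RELIABILITY                      # P(she says Aug 11 | it really is Aug 11)
like_false = (1 - RELIABILITY) / (DAYS - 1)  # P(she says Aug 11 | it is some other day)

# Bayes' rule: posterior = prior * likelihood / total probability of the report
evidence = prior * like_true + (1 - prior) * like_false
posterior = prior * like_true / evidence

print(f"prior:     {prior:.5f}")      # ~0.00274, roughly 1/365
print(f"posterior: {posterior:.5f}")  # ~0.999, close to but never exactly 1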

It can also happen that I change my mind by smearing out my credence function over a wider set of possibilities. I might forget the exact date of my friend's birthday but remember that it is sometime in the summer. The forgetting changes my credence function from being almost entirely concentrated on the 11th of August to being spread out more or less evenly over all the summer months. After this change of mind, I might assign a 1% probability to my friend's birthday being on the 11th of August.

My credence function can become more smeared out not only by forgetting but also by learning — learning that what I previously took to be strong evidence for some hypothesis is in fact weak or misleading evidence. (This type of belief change can often be mathematically modeled as a narrowing rather than a broadening of the credence function, but the technicalities are not relevant here.)

For example, over the years I have become moderately more uncertain about the benefits of medicine, nutritional supplements, and much conventional health wisdom. This belief change has come about as a result of several factors. One factor is that I have read papers casting doubt on the reliability of the standard methodological protocols used in medical studies and in their reporting. Another is my own experience of following up, on MEDLINE, some of the exciting medical findings reported in the media: almost always, a search of the source literature reveals a much more complicated picture, with many studies showing a positive effect, many showing a negative effect, and many showing no effect. A third factor is the arguments of a health economist friend of mine, who holds a dim view of the marginal benefits of medical care.

Typically, my beliefs about big issues change in small steps. Ideally, these steps should approximate a random walk, like the stock market. It should be impossible for me to predict how my beliefs on some topic will change in the future. If I believed that a year hence I will assign a higher probability to some hypothesis than I do today — why, in that case, I could raise the probability right away. Given knowledge of what I will believe in the future, I would defer to the beliefs of my future self, provided that I think my future self will be better informed than I am now and at least as rational.
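(Another aside of my own: the "random walk" point is the martingale property of rational credences, sometimes called conservation of expected evidence. Today's credence already equals the probability-weighted average of the credences you expect to hold after seeing the evidence, so the direction of the next change cannot be predicted. The numbers below are made up purely to check that identity.)

# Made-up numbers illustrating why rational belief change is unpredictable:
# the expected posterior, averaged over possible observations, equals the prior.

prior_h = 0.3          # current credence in some hypothesis H (arbitrary)
p_e_given_h = 0.8      # P(observing evidence E | H)      (arbitrary)
p_e_given_not_h = 0.2  # P(observing evidence E | not H)  (arbitrary)

p_e = prior_h * p_e_given_h + (1 - prior_h) * p_e_given_not_h

post_if_e = prior_h * p_e_given_h / p_e                  # credence if E is observed
post_if_not_e = prior_h * (1 - p_e_given_h) / (1 - p_e)  # credence if E is not observed

expected_posterior = p_e * post_if_e + (1 - p_e) * post_if_not_e
print(expected_posterior)  # ~0.3: the prior itself, so no predictable drift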

I have no crystal ball to show me what my future self will believe. But I do have access to many other selves, who are better informed than I am on many topics. I can defer to experts. Provided they are unbiased and are giving me their honest opinion, I should perhaps always defer to people who have more information than I do — or to some weighted average of expert opinion if there is no consensus. Of course, the proviso is a very big one: often I have reason to disbelieve that other people are unbiased or that they are giving me their honest opinion. However, it is also possible that I am biased and self-deceiving. An important unresolved question is how much epistemic weight a wannabe Bayesian thinker should give to the opinions of others. I'm looking forward to changing my mind on that issue, hopefully by my credence function becoming concentrated on the correct answer.

*****

TODD E. FEINBERG, M.D.
Professor of Psychiatry and Neurology, Albert Einstein College of Medicine; Author, Altered Egos


Soul Searching

For most of my life I viewed any notion of the "soul" as a fanciful religious invention. I agreed with the view of the late Nobel laureate Francis Crick, who in his book The Astonishing Hypothesis claimed, "A modern neurobiologist sees no need for the religious concept of a soul to explain the behavior of humans and other animals." But is the idea of a soul really so crazy and beyond the limits of scientific reason?

From the standpoint of neuroscience, it is easy to make the claim that Descartes is simply wrong about the separateness of brain and mind. The plain fact is that there is no scientific evidence that a self, an individual mind, or a soul could exist without a physical brain. However, there are persisting reasons why the self and the mind do not appear to be identical with, or entirely reducible to, the brain.

For example, unlike the brain, the mind cannot be objectively observed, but only subjectively experienced, in spite of the claims of the Massachusetts physician Dr. Duncan MacDougall, who estimated through his experiments on dying humans that approximately 21 grams of matter — the presumed weight of the human soul — was lost upon death (The New York Times, "Soul Has Weight, Physician Thinks," March 11, 1907). The subject that represents the "I" in the statement "I think, therefore I am" cannot be directly observed, weighed, or measured. And the experiences of that self, its pains and pleasures, sights and sounds, possess an objective reality only to the one who experiences them. In other words, as the philosopher John Searle puts it, the mind is "irreducibly first-person."

On the other hand, although there are many perplexing properties about the brain, mind, and the self that remain to be scientifically explained — subjectivity among them — this does not mean that there must be an immaterial entity at work that explains these mysterious features. Nonetheless, I have come to believe that an individual consciousness represents an entity that is so personal and ontologically unique that it qualifies as something that we might as well call "a soul."

I am not suggesting that anything like a soul survives the death of the brain. Indeed, the link between the life of the brain and the life of the mind is irreducible, the one completely dependent upon the other. The danger of capturing the beauty and mystery of a personal consciousness and identity with the somewhat metaphorical designation "soul" is the tendency for the grandiose metaphor to obscure the actual accomplishments of the brain. The soul is not a "thing" independent of the living brain; it is part and parcel of it, its most remarkable feature, but nonetheless inextricably bound to its life and death.


There are many ideas presented in this collection of observations that deserve more attention. In the coming weeks I hope to take a closer look at some of what these great minds have to say.

