Here are a few not so random answers to this year's question.

JOHN BROCKMAN
Publisher & Editor, Edge; Author, By The Late John Brockman, The Third Culture
THE INFINITE OSCILLATION OF OUR COLLECTIVE CONSCIOUS INTERACTING WITH ITSELF
"Love Intermedia Kinetic Environments." John Brockman speaking — partly kidding, but conveying the notion that Intermedia Kinetic Environments are In in the places where the action is — an Experience, an Event, an Environment, a humming electric world.
— The New York Times
On a Sunday in September 1966, I was sitting on a park bench reading about myself on the front page of the New York Times Arts & Leisure section. I was wondering whether the article would get me fired from my job at the New York Film Festival at Lincoln Center, where I was producing "expanded cinema" and "intermedia" events. I was twenty-five years old.
New and exciting ideas and forms of expression were in the air. They came out of happenings, the dance world, underground movies, avant-garde theater. They came from artists engaged in experiment. Intermedia consisted more often than not of unscripted, sometimes spontaneous theatrical events in which the audience was also a participant. I was lucky enough to have some small part in this upheaval, having been hired a year earlier by the underground filmmaker and critic Jonas Mekas to manage the Filmmakers' Cinémathèque and organize and run the Expanded Cinema Festival.
During that wildly interesting period, many of the leading artists were reading science and bringing scientific ideas to their work. John Cage gave me a copy of Norbert Wiener's Cybernetics; Bob Rauschenberg turned me on to James Jeans' The Mysterious Universe. Claes Oldenburg suggested I read George Gamow's 1,2,3...Infinity. USCO, a group of artists, engineers, and poets who created intermedia environments; La Monte Young's Theatre of Eternal Music; Andy Warhol's Factory; Nam June Paik's video performances; Terry Riley's minimalist music — these were master classes in the radical epistemology of a set of ideas involving feedback and information.
Another stroke of good luck was my inclusion in a small group of young artists invited by Fluxus artist Dick Higgins to attend a series of dinners with John Cage — an ongoing seminar about media, communications, art, music, and philosophy that focused on the ideas of Norbert Wiener, Claude Shannon, and Marshall McLuhan. Cage was aware of research conducted in the late 1930s and 1940s by Wiener, Shannon, Vannevar Bush, Warren McCulloch, and John von Neumann, who were all present at the creation of cybernetic theory. And he had picked up on McLuhan's idea that by inventing electric technology we had externalized our central nervous systems — that is, our minds — and that we now had to presume that "There's only one mind, the one we all share." We had to go beyond personal mind-sets: "Mind" had become socialized. "We can't change our minds without changing the world," Cage said. Mind as a man-made extension had become our environment, which he characterized as a "collective consciousness" that we could tap into by creating "a global utilities network."
Back then, of course, the Internet didn't exist, but the idea was alive. In 1962, J.C.R. Licklider, who had published "Man-Computer Symbiosis" in 1960 and described the idea of an "Intergalactic Computer Network" in 1961, was hired as the first director of the new Information Processing Techniques Office (IPTO) at the Pentagon's Advanced Research Projects Agency, an agency created as a response to Sputnik. Licklider designed the foundation for a global computer network. He and his successors at IPTO, including Robert Taylor and Larry Roberts, provided the ideas that led to the development of the ARPAnet, the forerunner of the Internet, which itself emerged as an ARPA-funded research project in the mid-1980s.
Inspired also by architect-designer Buckminster Fuller, futurist John McHale, and cultural anthropologists Edward T. ("Ned") Hall and Edmund Carpenter, I began to read avidly in the field of information theory, cybernetics, and systems theory. McLuhan himself introduced me to The Mathematical Theory of Communication by Shannon and Weaver, which began: "The word communication will be used here in a very broad sense to include all of the procedures by which one mind may affect another. This, of course, involves not only written and oral speech, but also music, the pictorial arts, the theater, the ballet, and in fact all human behavior."
Inherent in these ideas is a radical new epistemology. It tears apart the fabric of our habitual thinking. Subject and object fuse. The individual self decreates. I wrote a synthesis of these ideas in my first book, By the Late John Brockman (1969), taking information theory — the mathematical theory of communications — as a model for regarding all human experience. I began to develop a theme that has informed my endeavors ever since: New technologies beget new perceptions. Reality is a man-made process. Our images of our world and of ourselves are, in part, models resulting from our perceptions of the technologies we generate.
We create tools and then we mold ourselves in their image. Seventeenth-century clockworks inspired mechanistic metaphors ("The heart is a pump"), just as the self-regulating engineering devices of the mid-twentieth century inspired the cybernetic image ("The brain is a computer"). The anthropologist Gregory Bateson has characterized the post-Newtonian worldview as one of pattern, of order, of resonances in which the individual mind is a subsystem of a larger order. Mind is intrinsic to the messages carried by the pathways within the larger system and intrinsic also in the pathways themselves.
Ned Hall once pointed out to me that the most critical inventions are not those that resemble inventions but those that appear innate and natural. Once you become aware of this kind of invention, it is as though you had always known about it. ("The medium is the message." Of course, I always knew that).
Hall's candidate for the most important invention was not the capture of fire, the printing press, the discovery of electricity, or the discovery of the structure of DNA. The most important invention was ... talking. To illustrate the point, he told a story about a group of prehistoric cavemen having a conversation.
"Guess what?" the first man said. "We're talking." Silence. The others looked at him with suspicion.
"What's 'talking'?" a second man asked.
"It's what we're all doing, right now. We're talking!"
"You're crazy," the third man said. "I never heard of such a thing!"
"I'm not crazy," the first man said. "You're crazy. We're talking."
Talking, undoubtedly, was considered innate and natural until the first man rendered it visible by exclaiming, "We're talking."
A new invention has emerged, a code for the collective conscious, which requires a new way of thinking. The collective externalized mind is the mind we all share. The Internet is the infinite oscillation of our collective conscious interacting with itself. It's not about computers. It's not about what it means to be human — in fact it challenges, renders trite, our cherished assumptions on that score. It's about thinking. "We're talking."
ERIC FISCHL & APRIL GORNIK
REPLACING EXPERIENCE WITH FACSIMILE
As visual artists, we might rephrase the question as something like: How has the Internet changed the way we see?
For the visual artist, seeing is essential to thought. It organizes information and shapes how we develop thoughts and feelings. It's how we connect.
So how has the Internet changed us visually? The changes are subtle yet profound. They did not start with the computer. The changes began with the camera and other film-based media, and the Internet has had an exponential effect on that change.
The result is a leveling of visual information, whereby it all assumes the same characteristics. One loss is a sense of scale. Another is the loss of differentiation between materials and the process of making. All visual information "looks" the same, with film/photography being the common denominator.
Art objects contain a dynamism based on scale and physicality that produces a somatic response in the viewer. The powerful visual experience of art locates the viewer very precisely as an integrated self within the artist's vision. With the flattening of visual information and the randomness of size inherent in reproduction, the significance of scale is eroded. Visual information becomes based on image alone. Experience is replaced with facsimile.
As admittedly useful as the Internet is, easy access to images of everything and anything creates an illusion of knowledge and experience. The world pictured as pictures does not deliver the experience of art seen and experienced physically. It is possible for an art-experienced person to "translate" what is seen online, but the experience is necessarily remote.
As John Berger pointed out, photography is by nature a memory device that allows us to forget. Perhaps something similar can be said about the Internet. In terms of art, the Internet expands the network of reproduction that replaces the way we "know" something. It replaces experience with facsimile.
* * *
KEVIN KELLY
Editor-At-Large, Wired; Author, New Rules for the New Economy
AN INTERMEDIA WITH 2 BILLION SCREENS PEERING INTO IT
We already know that our use of technology changes how our brains work. Reading and writing are cognitive tools that, once acquired, change the way in which the brain processes information. When psychologists use neuroimaging technology, like MRI, to compare the brains of literates and illiterates working on a task, they find many differences, and not just when the subjects are reading.
Researcher Alexandre Castro-Caldas discovered that processing between the hemispheres of the brain differed between those who could read and those who could not. A key part of the corpus callosum was thicker in literates, and "the occipital lobe processed information more slowly in individuals who learned to read as adults compared to those who learned at the usual age." Psychologists Ostrosky-Solis, Garcia, and Perez tested literates and illiterates with a battery of cognitive tests while measuring their brain waves and concluded that "the acquisition of reading and writing skills has changed the brain organization of cognitive activity in general, not only in language but also in visual perception, logical reasoning, remembering strategies, and formal operational thinking."
If alphabetic literacy can change how we think, imagine how Internet literacy and 10 hours per day in front of one kind of screen or another are changing our brains. The first generation to grow up screen literate is just reaching adulthood, so we don't have any scientific studies of the full consequences of ubiquitous connectivity, but I have a few hunches based on my own behavior.
When I do long division or even multiplication I don't try to remember the intermediate numbers. Long ago I learned to write them down. Because of paper and pencil I am "smarter" in arithmetic. In a similar manner I now no longer try to remember facts, or even where I found the facts. I have learned to summon them on the Internet. Because the Internet is my new pencil and paper, I am "smarter" in factuality.
But my knowledge is now more fragile. For every accepted piece of knowledge I find, there is within easy reach someone who challenges the fact. Every fact has its anti-fact. The Internet's extreme hyperlinking highlights those anti-facts as brightly as the facts. Some anti-facts are silly, some borderline, and some valid. You can't rely on experts to sort them out because for every expert there is an equal and countervailing anti-expert. Thus anything I learn is subject to erosion by these ubiquitous anti-facts.
My certainty about anything has decreased. Rather than importing authority, I am reduced to creating my own certainty — not just about things I care about but about anything I touch, including areas about which I can't possibly have any direct knowledge. That means that in general I assume more and more that what I know is wrong. We might consider this state perfect for science, but it also means that I am more likely to have my mind changed for incorrect reasons. Nonetheless, the embrace of uncertainty is one way my thinking has changed.

Uncertainty is a kind of liquidity. I think my thinking has become more liquid. It is less fixed, as text in a book might be, and more fluid, as say text in Wikipedia might be. My opinions shift more. My interests rise and fall more quickly. I am less interested in Truth, with a capital T, and more interested in truths, plural. I feel the subjective has an important role in assembling the objective from many data points. The incremental plodding progress of imperfect science seems the only way to know anything.
While hooked into the network of networks I feel like I am a network myself, trying to achieve reliability from unreliable parts. And in my quest to assemble truths from half-truths, non-truths, and some other truths scattered in the flux (this creation of the known is now our job and not the job of authorities), I find my mind attracted to fluid ways of thinking (scenarios, provisional belief) and fluid media like mashups, twitter, and search. But as I flow through this slippery Web of ideas, it often feels like a waking dream.
We don't really know what dreams are for, only that they satisfy some fundamental need. Someone watching me surf the Web, as I jump from one suggested link to another, would see a daydream. Today, I was in a crowd of people who watched a barefoot man eat dirt, then the face of a boy who was singing began to melt, then Santa burned a Christmas tree, then I was floating inside a mud house on the very tippy top of the world, then Celtic knots untied themselves, then a guy told me the formula for making clear glass, then I was watching myself, back in high school, riding a bicycle. And that was just the first few minutes of my day on the Web this morning. The trance-like state we fall into while following the undirected path of links may be a terrible waste of time, or, like dreams, it might be a productive waste of time. Perhaps we are tapping into our collective unconscious in a way that watching the directed stream of TV, radio, and newspapers could not. Maybe click-dreaming is a way for all of us to have the same dream, independent of what we click on.
This waking dream we call the Internet also blurs the difference between my serious thoughts and my playful thoughts, or to put it more simply: I no longer can tell when I am working and when I am playing online. For some people, the disintegration between these two realms marks all that is wrong with the Internet: It is the high-priced waster of time. It breeds trifles. On the contrary, I cherish a good wasting of time as a necessary precondition for creativity, but more importantly I believe the conflation of play and work, of thinking hard and thinking playfully, is one of the greatest things the Internet has done.
In fact, the propensity of the Internet to diminish our attention is overrated. I do find that smaller and smaller bits of information can command the full attention of my over-educated mind. And not just me; everyone reports succumbing to the lure of fast, tiny interruptions of information. In response to this incessant barrage of bits, the culture of the Internet has been busy unbundling larger works into minor snippets for sale. Music albums are chopped up and sold as songs; movies become trailers, or even smaller video snips. (I find that many trailers really are better than their movies.) Newspapers become twitter posts. Scientific papers are served up in snippets on Google. I happily swim in this rising ocean of fragments.
While I rush into the Net to hunt for these tidbits, or to surf on its lucid dream, I've noticed a different approach to my thinking. My thinking is more active, less contemplative. Rather than begin a question or hunch by ruminating aimlessly in my mind, nourished only by my ignorance, I start doing things. I immediately, instantly go.
I go looking, searching, asking, questioning, reacting to data, leaping in, constructing notes, bookmarks, a trail, a start of making something mine. I don't wait. Don't have to wait. I act on ideas first now instead of thinking on them. For some folks, this is the worst of the Net — the loss of contemplation. Others feel that all this frothy activity is simply stupid busy work, or spinning of wheels, or illusionary action. I think to myself, compared to what?
Compared to the passive consumption of TV, or sucking up newspapers, or merely sitting at home going in circles musing about stuff in my head without any new inputs, I find myself much more productive by acting first. The emergence of blogs and Wikipedia is an expression of this same impulse: to act (write) first and think (filter) later. I have a picture of the hundreds of millions of people online at this very minute. To my eye they are not wasting time with silly associative links, but are engaged in a more productive way of thinking than the equivalent hundreds of millions of people were 50 years ago.
This approach does encourage tiny bits, but surprisingly, at the very same time, it also allows us to give more attention to works that are far more complex, bigger, and more complicated than ever before. These new creations contain more data, require more attention over longer periods, and these works are more successful as the Internet expands. This parallel trend is less visible at first because of a common shortsightedness that equates the Internet with text.

To a first approximation the Internet is words on a screen — Google, papers, blogs. But this first glance ignores the vastly larger underbelly of the Internet — moving images on a screen. People (and not just young kids) no longer go to books and text first. If people have a question, they (myself included) head first for YouTube. For fun we go to massive online games, or catch streaming movies, including factual videos (documentaries are in a renaissance). New visual media are stampeding onto the Net. This is where the Internet's center of attention lies, not in text alone. Because of online fans, and streaming on demand, and rewinding at will, and all the other liquid abilities of the Internet, directors started creating movies that were more than 100 hours long.
These vast epics like Lost and The Wire had multiple interweaving plot lines, multiple protagonists, an incredible depth of characters, and demanded sustained attention that was not only beyond previous TV and 90-minute movies, but would have shocked Dickens and other novelists of yore. They would marvel: "You mean they could follow all that, and then want more? Over how many years?" I would never have believed myself capable of enjoying such complicated stories, or caring about them enough to put in the time. My attention has grown. In a similar way the depth, complexity, and demands of games can equal these marathon movies, or any great book.
But the most important way the Internet has changed the direction of my attention, and thus my thinking, is that it has become one thing. It may look like I am spending endless nanoseconds on a series of tweets, and endless microseconds surfing between Web pages, or wandering between channels, and hovering only mere minutes on one book snippet after another; but in reality I am spending 10 hours a day paying attention to the Internet. I return to it after a few minutes, day after day, with essentially my full-time attention. As do you.
We are developing an intense, sustained conversation with this large thing. The fact that it is made up of a million loosely connected pieces is distracting us. The producers of Websites, and the hordes of commenters online, and the movie moguls reluctantly letting us stream their movies, don't believe they are mere pixels in a big global show, but they are. It is one thing now, an intermedia with 2 billion screens peering into it. The whole ball of connections — including all its books, all its pages, all its tweets, all its movies, all its games, all its posts, all its streams — is like one vast global book (or movie, etc.), and we are only beginning to learn how to read it. Knowing that this large thing is there, and that I am in constant communication with it, has changed how I think.
* * *
MIHALY CSIKSZENTMIHALYI
Psychologist; Director, Quality of Life Research Center, Claremont Graduate University; Author, Flow
I MUST CONFESS TO BEING PERPLEXED
Answering this question should be a slam-dunk, right? After all, thinking about thinking is my racket. Yet I must confess to being perplexed. I am not even sure we have good evidence that the way humans think has been changed by the advent of the printing press... Of course the speed of accessing information and the amount of information at one's fingertips have increased enormously, but has that actually affected the way thinking unfolds?
If I am to rely on my personal experience, I would probably suggest the following hypotheses:
1. I am less likely to pursue new lines of thought before turning to the Internet either to check existing databases or to ask a colleague directly (result: less sustained thought?)
2. Information from the Internet is often decontextualized, but being quick it satisfies immediate needs at the expense of deeper understanding (result: more superficial thought?)
3. At the same time, connections between ideas, facts, etc., can be more easily established on the Web — if one takes the time to do so (result: more intra-personally integrated thought?)
4. The development of cooperative sites ranging from Wikipedia to open-source software (and including Edge?) makes the thought process more public, more interactive, more transpersonal, resulting in something similar to what Teilhard de Chardin anticipated over half a century ago as the "Noosphere," or a global consciousness that he saw as the next step in human evolution.

Like all technologies, this one has both positive and negative consequences. I am not sure whether I would bet on the first two (negative) hypotheses being closer to the truth, or on the next two, which are more positive. And of course, both sets could be true at the same time.
* * *
ALISON GOPNIK
Psychologist, UC Berkeley; Author, The Philosophical Baby
THE STRANGERS IN THE CRIB
My thinking has certainly been transformed in alarming ways by a relatively recent information technology, but it's not the Internet. I often sit for hours in the grip of this compelling medium, motionless and oblivious, instead of interacting with the people around me. As I walk through the streets I compulsively check out even trivial messages — movie ads, street signs — and I pay more attention to descriptions of the world — museum captions, menus — than to the world itself. I've become incapable of using attention and memory in ways that previous generations took for granted. Yes, I know reading has given me a powerful new source of information. But is it worth the isolation, the damage to dialog and memorization that Socrates foresaw? Studies show, in fact, that I've become involuntarily compelled to read; I literally can't keep myself from decoding letters. Reading has even reshaped my brain: cortical areas that once were devoted to vision and speech have been hijacked by print. Instead of learning through practice and apprenticeship, I've become dependent on lectures and textbooks. And look at the toll of dyslexia and attention disorders and learning disabilities, all signs that our brains were just not designed to deal with such a profoundly unnatural technology.
Like many others I feel that the Internet has made my experience more fragmented, splintered, and discontinuous. But I'd argue that's not because of the Internet itself but because I have mastered the Internet as an adult. Why don't we feel the same way about reading and schooling as we feel about the Web? Those changes in the way we get information had an equally pervasive and transformative effect on human cognition and thought, and universal literacy and education have only been around for a hundred years or so.
It's because human change takes place across generations, rather than within a single life. This is built into the very nature of the developing mind and brain. All the authors of these essays have learned how to use the Web with brains that were fully developed long before we sent our first e-mail. All of us learned to read with the open and flexible brains we had when we were children. As a result, no one living now will experience the digital world in the spontaneous and unselfconscious way that the children of 2010 will experience it, or in the spontaneous and unselfconscious way we experience print.
There is a profound difference between the way children and adults learn. Young brains are capable of much more extensive change — more rewiring — than the brains of adults. This difference between old brains and young ones is the engine of technological and cultural innovation. Human adults, more than any other animal, reshape the world around them. But adults innovate slowly, intentionally, and consciously. The changes that take place within an adult life, like the development of the Internet, are disruptive, attention-getting, disturbing or exciting. But those changes become second nature to the next generation of children. Those young brains painlessly absorb the world their parents created, and that world takes on a glow of timelessness and eternity, even if it was only created the day before you were born.
My experience of the Web feels fragmented, discontinuous, effortful (and interesting!) because, for adults, learning a new technology depends on conscious, attentive, intentional processing. In adults, this kind of conscious attention is a very limited resource. This is even true at the neural level. When we pay attention to something, the prefrontal cortex, the part of our brain responsible for conscious goal-directed planning, controls the release of cholinergic transmitters, chemicals that help us learn, to certain very specific parts of the brain. So as we wrestle with a new technology we adults can only change our minds a little bit at a time.
Attention and learning work very differently in young brains. Young animals have much more widespread cholinergic transmitters than adults, and their ability to learn doesn't depend on planned, deliberate attention. Young brains are designed to learn from everything new, surprising, or information-rich, even when it isn't particularly relevant or useful.
So children who grow up with the Web will master it in a way that will feel as whole and natural as reading feels to us. But that doesn't mean that their experience and attention won't be changed by the Internet, any more than my print-soaked twentieth-century life was the same as the life of a barely literate nineteenth-century farmer.
The special attentional strategies that we require for literacy and schooling may feel natural since they are so pervasive, and since we learned them at such an early age. But at different times and places, different ways of deploying attention have been equally valuable and felt equally natural. Children in Mayan Indian cultures, for example, are taught to distribute their attention to several events simultaneously, rather than focusing on just one thing at a time as print and school teach us to do. I'll never be able to deploy the broad yet vigilant attention of a hunter-gatherer, though, luckily, a childhood full of practice caregiving let me master the equally ancient art of attending to work and babies at the same time.
Perhaps our digital grandchildren will view a master reader with the same nostalgic awe that we now accord to a master hunter or an even more masterly mother of six. The skills of the hyper-literate 20th century may well disappear, or at least become highly specialized enthusiasms, like the once universal skills of hunting, poetry, and dance. It is sad that after the intimacy of infancy our children inevitably end up being somewhat weird and incomprehensible visitors from the technological future. But the hopeful thought is that my grandchildren will not have the fragmented, distracted, alienated digital experience that I do. For them the Internet will feel as fundamental, as rooted, as timeless, as a battered Penguin paperback, that apex of the literate civilization of the last century, feels for me.
I'll post more of these in the coming days - there are a lot of interesting responses.