I tend to feel the latter is more true, at least in my experience.
I've collected a handful of articles that explore these ideas - most of them riffing on a recent research article: Google Effects on Memory: Cognitive Consequences of Having Information at Our Fingertips, by Betsy Sparrow, Jenny Liu, and Daniel M. Wegner (the PDF is available at the link).
The advent of the Internet, with sophisticated algorithmic search engines, has made accessing information as easy as lifting a finger. No longer do we have to make costly efforts to find the things we want. We can "Google" the old classmate, find articles online, or look up the actor who was on the tip of our tongue. The results of four studies suggest that when faced with difficult questions, people are primed to think about computers and that when people expect to have future access to information, they have lower rates of recall of the information itself and enhanced recall instead for where to access it. The Internet has become a primary form of external or transactive memory, where information is stored collectively outside ourselves.

I'll be posting the first bit or so of each one, so follow the links to read the whole article.
At the end, there is an NPR story about how children who have grown up with the internet have brains that function differently - they do not so much multitask as move back and forth between tasks very rapidly.
From Scientific American:
Researchers study whether the use of the Internet as a memory aid leads to a lazy mind, or whether memorization is overrated | July 14, 2011
TOTAL RECALL?: The advent of the Internet and near-ubiquitous information at our fingertips makes it less critical for us to commit items to memory. Using the Internet as a mental crutch is not necessarily a bad thing, according to researchers.
Has the Internet dumbed down society or simply become an external storage unit that enhances the human brain's memory capacity? With Google, Internet Movie Database and Wikipedia at our beck and call via smart phones, tablets and laptops, the once essential function of committing facts to memory has become little more than a flashback to flash cards. This shift is not necessarily a bad thing, nor is it irreversible, according to a team of researchers whose study on search engines and learning appears in the July 15 issue of Science.
Led by Columbia University psychologist Betsy Sparrow, the researchers conducted a series of experiments whose results suggest that when people are faced with difficult questions, they are likely to think that the Internet will help them find the answers. In fact, those who expect to be able to search for answers to difficult questions online are less likely to commit the information to memory. People tend to memorize answers if they believe that it is the only way they will have access to that information in the future. Regardless of whether they remember the facts, however, people tend to recall the Web sites that hold the answers they seek.
In this way, the Internet has become a primary form of external or "transactive" memory (a term coined by Sparrow's one-time academic advisor, social psychologist Daniel Wegner), where information is stored collectively outside the brain. This is not so different from the pre-Internet past, when people relied on books, libraries and one another—such as using a "lifeline" on the game show Who Wants to be a Millionaire?—for information. Now, however, besides oral and printed sources of information, a lion's share of our collective and institutional knowledge bases reside online and in data storage.
From BPS Research Digest, by the British Psychological Society.
Written by Christian Jarrett
Last year's annual question posed by Edge was "How is the Internet changing the way you think?" Several psychologists answered that it was becoming an extension of their minds. "The Internet is a kind of collective memory," wrote Stephen Kosslyn (Harvard University). "When I write with a browser open in the background, it feels like the browser is an extension of myself."
A research team led by Betsy Sparrow has now tested the idea that the Internet really has become a kind of memory prosthesis. First they showed that difficult questions prompted dozens of undergrad participants to think automatically of computers and search engines. Participants tackled either easy or difficult trivia questions and then completed a version of the classic Stroop task: they had to look at a series of words and say what colour ink they were written in. After difficult questions, participants were extra slow at naming the colour of words like "Google". This is a sign that the search engine concept was salient in their minds and therefore interfered more with the process of colour naming.
Next, a group of dozens more undergrad participants read 40 trivia statements and then typed them into a computer. Half the participants were told that the computer would save their entry, the others were told the entries would be deleted. Participants in the "saved" condition performed worse at a subsequent recall test of the statements, as if they'd relied on the computer as an external memory store. Half the participants in both conditions had been instructed explicitly to try to remember the statements, but this made no difference to their memory performance. "Participants were more impacted by the cue that information would or would not be available to them, regardless of whether they thought they would be tested on it," the researchers said.
From Jonah Lehrer's The Frontal Cortex (via Wired):
- By Jonah Lehrer
- July 15, 2011
By now, you’ve probably heard about this smart study showing that Google is making you stupid, led by Betsy Sparrow at Columbia. The scientists demonstrated that the availability of the internet is changing the nature of what we remember, making us more likely to recall where the facts are rather than the facts themselves. Patricia Cohen of the Times summarizes the results:

Dr. Sparrow and her collaborators, Daniel M. Wegner of Harvard and Jenny Liu of the University of Wisconsin, Madison, staged four different memory experiments. In one, participants typed 40 bits of trivia — for example, “an ostrich’s eye is bigger than its brain” — into a computer. Half of the subjects believed the information would be saved in the computer; the other half believed the items they typed would be erased.

The subjects were significantly more likely to remember information if they thought they would not be able to find it later. “Participants did not make the effort to remember when they thought they could later look up the trivia statement they had read,” the authors write.

A second experiment was aimed at determining whether computer accessibility affects precisely what we remember. “If asked the question whether there are any countries with only one color in their flag, for example,” the researchers wrote, “do we think about flags — or immediately think to go online to find out?”

In this case, participants were asked to remember both the trivia statement itself and which of five computer folders it was saved in. The researchers were surprised to find that people seemed better able to recall the folder.

The headlines are already emphasizing the amnesiac effects of the internet, as if Google were a pox on the hippocampus. The scientists themselves are mostly sanguine about the data, noting that humans have been relying on “transactive memory” ever since the invention of language. It’s just that, for most of human history, the only other reliable sources of information were other people. What these experiments reveal is that we treat the search engine like a particularly clever friend, a buddy with a gift for factoids and trivia.
Here is an interview with Betsy Sparrow, one of the researchers from the study being discussed above. The PBS video can't be embedded, so follow the link - it's short.
Program: PBS NewsHour
In this NPR story from Talk of the Nation, the topic switches to children who have never known anything but an internet-infused world. Their brains seem to function differently, and that fact may require that we change the way we do education.
Few will dispute that America's colleges and universities are critical to our economic and intellectual future. And by many measures, that future looks promising: Competition for places in the country's top schools is fiercer than ever, more families are willing to pay higher tuition, and employers are putting a greater premium on a college degree.
But Don Tapscott, co-author of Macrowikinomics: Rebooting Business And The World, argues that universities are woefully behind the times.
Tapscott — who has studied the digital revolution — tells NPR's Neal Conan that the traditional lecture model in American universities is no longer appropriate for a generation that has grown up making, changing and learning from digital communities.
"My generation — the boomers — grew up watching 24 hours a week of TV per kid," Tapscott says. But, he adds, today's young people have had a very different experience.

"This new generation comes home and they turn on their computer and they're in three different windows and they've got three magazines open and they're listening to iTunes and they're texting with their friends," he says, "and they're doing their homework."
With such a networked approach to work and leisure time, Tapscott says the traditional university classroom is starting to feel less appropriate.
A Harvard student studying the corporate management expert Peter Drucker once summed up his disillusionment with what Tapscott calls the "broadcast model" of learning: "Why would I sit there and listen to a [teaching assistant] talking to 300 of us," Tapscott recalls him saying, "when I can go online and interact with a real-time Peter Drucker?"
"The big thing is to get an 'A' without having ever gone to the lecture," Tapscott says. "All these kids that have grown up collaborating and thinking differently walk into a university and they're asked to sit there and passively listen to someone talking."
He says that if someone from 100 years ago miraculously came back and found a modern engineer designing a bridge, it would be clear how much technology had changed things. But if that same person walked into a university lecture hall today, it would be entirely familiar.
"We need to move toward a collaborative model of learning that's student focused, [that's] highly customized and that is a model appropriate for a new generation that learns differently," says Tapscott. He warns that universities are ignoring the changing needs and desires of young people — and they're doing so at their own peril.
"When you have the cream of the crop of an entire generation thinking that the model of pedagogy is deeply flawed," he says, "well, the writing's on the wall."