
Wednesday, June 11, 2014

Writing In The 21st Century: A Conversation with Steven Pinker


Steven Pinker, the beloved and sometimes infuriating evolutionary psychologist, has a new book out, The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century (2014). Interesting choice. I have never thought of Pinker as an especially clear writer. But I guess after so many big serious books, he may have wanted to do something fun.

Writing In The 21st Century

A Conversation with Steven Pinker [6.9.14]

Topic: Conversations
Introduction By: John Brockman

What are the arts but products of the human mind which resonate with our aesthetic and emotional faculties? What are social issues, but ways in which humans try to coordinate their behavior and come to working arrangements that benefit everyone? There's no aspect of life that cannot be illuminated by a better understanding of the mind from scientific psychology. And for me the most recent example is the process of writing itself.



Introduction


Psychologist Steven Pinker's 1994 book The Language Instinct discussed all aspects of language in a unified, Darwinian framework, and in his next book, How The Mind Works, he did the same for the rest of the mind, explaining "what the mind is, how it evolved, and how it allows us to see, think, feel, laugh, interact, enjoy the arts, and ponder the mysteries of life".

He has written four more consequential books: Words and Rules (1999), The Blank Slate (2002), The Stuff of Thought (2007), and The Better Angels of Our Nature (2011). The evolution in his thinking, the expansion of his range, and the depth of his vision are evident in his contributions on many important issues on these pages over the years: "A Biological Understanding of Human Nature", "The Science of Gender and Science", "A Preface to Dangerous Ideas", "Language and Human Nature", "A History of Violence", "The False Allure of Group Selection", "Napoleon Chagnon: Blood Is Their Argument", and "Science Is Not Your Enemy". In addition to his many honors, he is the Edge Question Laureate, having suggested three of Edge's Annual Questions: "What Is Your Dangerous Idea?"; "What Is Your Favorite Deep, Elegant, Or Beautiful Explanation?"; and "What Scientific Concept Would Improve Everybody's Cognitive Toolkit?". He is a consummate third culture intellectual.

In the conversation below, Pinker begins by stating his belief that "science can inform all aspects of life, particularly psychology, my own favorite science. Psychology looks in one direction to biology, to neuroscience, to genetics, to evolution. And it looks in another direction to the rest of intellectual and cultural life—because what are the arts but products of the human mind which resonate with our aesthetic and emotional faculties? What are social issues but ways in which humans try to coordinate their behavior and come to working arrangements that benefit everyone? There's no aspect of life that cannot be illuminated by a better understanding of the mind from scientific psychology. And for me the most recent example is the process of writing itself."...

John Brockman

STEVEN PINKER is the Johnstone Family Professor in the Department of Psychology at Harvard University. He is the author of ten books, including The Language Instinct, How the Mind Works, The Better Angels of Our Nature, and The Sense of Style (September). Steven Pinker's Edge Bio page

WRITING IN THE 21ST CENTURY


I believe that science can inform all aspects of life, particularly psychology, my own favorite science. Psychology looks in one direction to biology, to neuroscience, to genetics, to evolution. And it looks in another direction to the rest of intellectual and cultural life—because what are the arts but products of the human mind which resonate with our aesthetic and emotional faculties? What are social issues but ways in which humans try to coordinate their behavior and come to working arrangements that benefit everyone? There's no aspect of life that cannot be illuminated by a better understanding of the mind from scientific psychology. And for me the most recent example is the process of writing itself.

I'm a psychologist who studies language—a psycholinguist—and I'm also someone who uses language in my books and articles to convey ideas about, among other things, the science of language itself. But also, ideas about war and peace and emotion and cognition and human nature. The question I'm currently asking myself is how our scientific understanding of language can be put into practice to improve the way that we communicate anything, including science.

In particular, can you use linguistics, cognitive science, and psycholinguistics to come up with a better style manual—a 21st century alternative to the classic guides like Strunk and White's The Elements of Style?

Writing is inherently a topic in psychology. It's a way that one mind can cause ideas to happen in another mind. The medium by which we share complex ideas, namely language, has been studied intensively for more than half a century. And so if all that work is of any use it ought to be of use in crafting more stylish and transparent prose.


From a scientific perspective, the starting point must be different from that of traditional manuals, which are lists of dos and don'ts that are presented mechanically and often followed robotically. Many writers have been the victims of inept copy editors who follow guidelines from style manuals unthinkingly, never understanding their rationale.

For example, everyone knows that scientists overuse the passive voice. It's one of the signatures of academese: "the experiment was performed" instead of "I performed the experiment." But if you follow the guideline, "Change every passive sentence into an active sentence," you don't improve the prose, because there's no way the passive construction could have survived in the English language for millennia if it hadn't served some purpose.

The problem with any given construction, like the passive voice, isn't that people use it, but that they use it too much or in the wrong circumstances. Active and passive sentences express the same underlying content (who did what to whom) while varying the topic, focus, and linear order of the participants, all of which have cognitive ramifications. The passive is a better construction than the active when the affected entity (the thing that has moved or changed) is the topic of the preceding discourse, and should therefore come early in the sentence to connect with what came before; when the affected entity is shorter or grammatically simpler than the agent of the action, so expressing it early relieves the reader's memory load; and when the agent is irrelevant to the story, and is best omitted altogether (which the passive, but not the active, allows you to do). To give good advice on how to write, you have to understand what the passive can accomplish, and therefore you should not blue-pencil every passive sentence into an active one (as one of my copy editors once did).

Ironically, the aspect of writing that gets the most attention is the one that is least important to good style, and that is the rules of correct usage. Can you split an infinitive, that is, say, "to boldly go where no man has gone before," or must you say "to go boldly"? Can you use the so-called fused participle—"I approve of Sheila taking the job"—as opposed to "I approve of Sheila's taking the job" (with an apostrophe "s")? There are literally (yes, "literally") hundreds of traditional usage issues like these, and many are worth following. But many are not, and in general they are not the first things to concentrate on when we think about how to improve writing.

The first thing you should think about is the stance that you as a writer take when putting pen to paper or fingers to keyboard. Writing is cognitively unnatural. In ordinary conversation, we've got another person across from us. We can monitor the other person's facial expressions: Do they furrow their brow, or widen their eyes? We can respond when they break in and interrupt us. And unless you're addressing a stranger you know the hearer's background: whether they're an adult or child, whether they're an expert in your field or not. When you're writing you have none of those advantages. You're casting your bread onto the waters, hoping that this invisible and unknowable audience will catch your drift.

The first thing to do in writing well—before worrying about split infinitives—is to decide what kind of situation you imagine yourself to be in. What are you simulating when you write, given that you're only pretending to use language in the ordinary way? That stance is the main thing that distinguishes clear, vigorous writing from the mush we see in academese and medicalese and bureaucratese and corporatese.

The literary scholars Mark Turner and Francis-Noël Thomas have identified the stance that our best essayists and writers implicitly adopt, and that is a combination of vision and conversation. When you write you should pretend that you, the writer, see something in the world that's interesting, that you are directing the attention of your reader to that thing in the world, and that you are doing so by means of conversation.

That may sound obvious. But it's amazing how many of the bad habits of academese and legalese and so on come from flouting that model. Bad writers don't point to something in the world but are self-conscious about not seeming naïve about the pitfalls of their own enterprise. Their goal is not to show something to the reader but to prove that they are not a bad lawyer or a bad scientist or a bad academic. And so bad writing is cluttered with apologies and hedges and "somewhats" and reviews of the past activity of people in the same line of work as the writer, as opposed to concentrating on something in the world that the writer is trying to get someone else to see with their own eyes.

That's a starting point to becoming a good writer. Another key is to be an attentive reader. One of the things you appreciate when you do linguistics is that a language is a combination of two very different mechanisms: powerful rules, which can be applied algorithmically, and lexical irregularities, which must be memorized by brute force: in sum, words and rules.

All languages contain elegant, powerful, logical rules for combining words in such a way that the meaning of the combination can be deduced from the meanings of the words and the way they're arranged. If I say "the dog bit the man" or "the man bit the dog," you have two different images, because of the way those words are ordered by the rules of English grammar.

On the other hand, language has a massive amount of irregularity: idiosyncrasies, idioms, figures of speech, and other historical accidents that you couldn't possibly deduce from rules, because often they are fundamentally illogical. The past tense of "bring" is "brought," but the past tense of "ring" is "rang," and the past tense of "blink" is "blinked." No rule allows you to predict that; you need raw exposure to the language. That's also true for many rules of punctuation. If I talk about "Pat's leg," it's "Pat-apostrophe-s." But if I talk about "its leg," I can't use apostrophe-s; that would be illiterate. Why? Who knows? That's just the way English works. People who spell possessive "its" with an apostrophe are not being illogical; they're being too logical, while betraying the fact that they haven't paid close attention to details of the printed page.

So being a good writer depends not just on having mastered the logical rules of combination but on having absorbed tens or hundreds of thousands of constructions and idioms and irregularities from the printed page. The first step to being a good writer is to be a good reader: to read a lot, and to savor and reverse-engineer good prose wherever you find it. That is, to read a passage of writing and think to yourself, "How did the writer achieve that effect? What was their trick?" And to read a good sentence with a consciousness of what makes it so much fun to glide through.

Any handbook on writing today is going to be compared to Strunk and White's The Elements of Style, a lovely little book, filled with insight and charm, which I have read many times. But William Strunk, its original author, was born in 1869. This is a man who was born before the invention of the telephone, let alone the computer and the Internet and the smartphone. His sense of style was honed in the later decades of the 19th century!

We know that language changes. You and I don't speak the way people did in Shakespeare's era, or in Chaucer's. As valuable as The Elements of Style is (and it's tremendously valuable), it's got a lot of cockamamie advice, dated by the fact that its authors were born more than a hundred years ago. For example, they sternly warn, "Never use 'contact' as a verb. Don't say 'I'm going to contact him.' It's pretentious jargon, pompous and self-important. Indicate that you intend to 'telephone' someone or 'write them' or 'knock on their door.'" To a writer in the 21st century, this advice is bizarre. Not only is "to contact" thoroughly entrenched and unpretentious, but it's indispensable. Often it's extremely useful to be able to talk about getting in touch with someone when you don't care by what medium you're going to do it, and in those cases, "to contact" is the perfect verb. It may have been a neologism in Strunk and White's day, but all words start out as neologisms in their day. If you read The Elements of Style today, you have no way of appreciating that what grated on the ears of someone born in 1869 might be completely unexceptionable today.

The other problem is that The Elements of Style was composed before there existed a science of language and cognition. A lot of Strunk and White's advice depended completely on their gut reactions from a lifetime of practice as an English professor and critic, respectively. Today we can offer deeper advice, such as the syntactic and discourse functions of the passive voice—a construction which, by the way, Strunk and White couldn't even consistently identify, not having been trained in grammar.

Another advantage of modern linguistics and psycholinguistics is that they provide a way to think your way through a pseudo-controversy that was ginned up about 50 years ago between so-called prescriptivists and descriptivists. According to this fairy tale there are prescriptivists who prescribe how language ought to be used and there are descriptivists, mainly academic linguists, who describe how language in fact is used. In this story there is a war between them, with prescriptivist dictionaries competing with descriptivist dictionaries.

Inevitably my own writing manual is going to be called "descriptivist," because it questions a number of dumb rules that are routinely flouted by all the best writers and had no business being in style books in the first place. These pseudo-rules violate the logic of English but get passed down as folklore from one style sheet to the next. But debunking stupid rules is not the same thing as denying the existence of rules, to say nothing of advice on writing. The Sense of Style is clearly prescriptive: it consists of 300 pages in which I boss the reader around.

This pseudo-controversy was created when Webster's Third New International Dictionary was published in the early 1960s. Like all dictionaries, it paid attention to the way that language changes. If a dictionary didn't do that it would be useless: writers who consulted it would be guaranteed to be misunderstood. For example, there is an old prescriptive rule that says that "nauseous," which most people use to mean nauseated, cannot mean that. It must mean creating nausea, namely, "nauseating." You must write that a roller coaster ride was nauseous, or a violent movie was nauseous, not that you got nauseous riding on the roller coaster or watching the movie. Nowadays, no one obeys this rule. If a dictionary were to stick by its guns and say it's an error to say that the movie made me nauseous, it would be a useless dictionary: it wouldn't be doing what a dictionary has to do. This has always been true of dictionaries.

But there's a myth that dictionaries work like the rulebook of Major League Baseball; they legislate what is correct. I can speak with some authority in saying that this is false. I am the Chair of the Usage Panel of The American Heritage Dictionary, which is allegedly the prescriptivist alternative to the descriptivist Webster's. But when I asked the editors how they decide what goes into the dictionary, they replied, "By paying attention to the way people use language."

Of course dictionary editors can't pay attention to the way everyone uses language, because people use language in different ways. When you write, you're writing for a virtual audience of well-read, literate fellow readers. And those are the people that we consult in deciding what goes into the dictionary, particularly in the usage notes that comment on controversies of usage, so that readers will know what to anticipate when they opt to obey or flout an alleged rule.

This entire approach is sometimes criticized by literary critics who are ignorant of the way that language works, and fantasize about a golden age in which dictionaries legislated usage. But language has always been a grassroots, bottom-up phenomenon. The controversy between "prescriptivists" and "descriptivists" is like the choice in "America: Love it or leave it" or "Nature versus Nurture"—a euphonious dichotomy that prevents you from thinking.

Many people get incensed about so-called errors of grammar which are perfectly unexceptionable. There was a controversy in the 1960s over the advertising slogan "Winston tastes good, like a cigarette should." The critics said it should be "as a cigarette should" and moaned about the decline of standards. A more recent example was an SAT question that asked students whether there was an error in "Toni Morrison's genius allows her to write novels that capture the African American condition." Supposedly the sentence is ungrammatical: you can't have "Toni Morrison's" as an antecedent to the pronoun "her." Now that is a complete myth: there is nothing wrong with the sentence.

Once a rumor about a grammatical error gets legs, it can proliferate like an urban legend about alligators in the sewers. Critics and self-appointed guardians of the language will claim that language is deteriorating because people violate the rule—which was never a rule in the first place. It's so much fun to be in high dudgeon over the decline of language and civilization that these critics don't stop to check the rulebooks and dictionaries to discover how great writers write or to learn the logic of the English language.

Poets and novelists often have a better feel for the language than the self-appointed guardians and the pop grammarians because for them language is a medium. It's a way of conveying ideas and moods with sounds. The most gifted writers—the Virginia Woolfs and H.G. Wellses and George Bernard Shaws and Herman Melvilles—routinely used words and constructions that the guardians insist are incorrect. And of course avant-garde writers such as Burroughs and Kerouac, and poets pushing the envelope or expanding the expressive possibilities of the language, will deliberately flout even the genuine rules that most people obey. But even non-avant garde writers, writers in the traditional canon, write in ways that would be condemned as grammatical errors by many of the purists, sticklers and mavens.

Another bit of psychology that can make anyone a better writer is to be aware of a phenomenon sometimes called The Curse of Knowledge. It goes by many names, and many psychologists have rediscovered versions of it, including defective Theory of Mind, egocentrism, hindsight bias, and false consensus. They're all versions of an infirmity afflicting every member of our species, namely that it's hard to imagine what it's like not to know something that you do know.

It's easiest to see it in children. In one famous experiment, a kid comes into a room, opens a box of candy, finds pencils inside, and is surprised. Then you say to him, "Now Jason's going to come into the room. What does he think is in the box?" And the child will say "pencils." Of course, Jason has no way of knowing that the box had pencils, but the first child is projecting his own state of knowledge onto Jason, forgetting that other people may not know what he knows.

Now we laugh at the kids, but it's true of all of us. We as writers often use technical terms, abbreviations, assumptions about typical experimental methods, assumptions about what questions we ask in our research, that our readers have no way of knowing because they haven't been through the same training that we have. Overcoming the curse of knowledge may be the single most important requirement in becoming a clear writer.

Contrary to the common accusation that academic writing is bad because professors are trying to bamboozle their audience with highfalutin gobbledygook, I don't think that most bad prose is deliberate. I think it is inept. It is a failure to get inside the head of your reader. We also know from psychology that simply trying harder to get inside the head of your reader is not the ideal way to do it. No matter how hard we try, we're at best okay, but not great, at anticipating another person's state of knowledge.

Instead, you have to ask. You've got to show people a draft. Even if you're writing for laypeople, your reviewers don't all have to be laypeople; a colleague is better than no one. I'm often astonished at things that I think are obvious that turn out to be not so obvious to other people.

Another implication of the curse of knowledge is that having an editor is a really good thing. Supposedly there are writers who can dash off a perfectly comprehensible, clear, and coherent essay without getting feedback from a typical reader, but most of us don't have that clairvoyance. We need someone to say "I don't understand this" or "What the hell are you talking about?" To say nothing of attention to the fine points of punctuation, grammar, sentence structure, and other ways in which a sophisticated copy editor can add value to your written work.

How much of this advice comes from my experience as a writer and how much from my knowledge as a psycholinguist? Some of each. I often reflect on the psychology behind the thousands of decisions I make as a writer in the lifelong effort to improve my prose, and I often think about how to apply experiments on sentence comprehension and the history of words and the logic (and illogic) of grammar to the task of writing. I might think, "Aha, the reason I rewrote this sentence that way is the memory demands of subject versus object relative clauses."

This combination of science and letters is emblematic of what I hope will be the larger trend we spoke of earlier, namely the application of science, particularly psychology and cognitive science, to the traditional domains of the humanities. There's no aspect of human communication and cultural creation that can't benefit from a greater application of psychology and the other sciences of mind. We would have an exciting addition to literary studies, for example, if literary critics knew more about linguistics. Poetry analysts could apply phonology (the study of sound structure) and the cognitive psychology of metaphor. An analysis of plot in fiction could benefit from a greater understanding of the conflicts and confluences of ultimate interests in human social relationships. The genre of biography would be deepened by an understanding of the nature of human memory, particularly autobiographical memory. How much of the memory of our childhood is confabulated? Memory scientists have a lot to say about that. How much do we polish our image of ourselves in describing ourselves to others, and more importantly, in recollecting our own histories? Do we edit our memories in an Orwellian manner to make ourselves more coherent in retrospect? Syntax and semantics are relevant as well. How does a writer use the tense system of English to convey a sense of immediacy or historical distance?

In music the sciences of auditory and speech perception have much to contribute to understanding how musicians accomplish their effects. The visual arts could revive an old method of analysis going back to Ernst Gombrich and Rudolf Arnheim in collaboration with the psychologist Richard Gregory. Indeed, even the art itself in the 1920s was influenced by psychology, thanks in part to Gertrude Stein, who as an undergraduate student of William James did a wonderful thesis on divided attention, and then went to Paris and brought the psychology of perception to the attention of artists like Picasso and Braque. Gestalt psychology may have influenced Paul Klee and the expressionists. Since then we have lost that wonderful synergy between the science of visual perception and the creation of visual art.

Going beyond the arts, the social sciences, such as political science, could benefit from a greater understanding of human moral and social instincts, such as the psychology of dominance, the psychology of revenge and forgiveness, and the psychology of gratitude and social competition. All of them are relevant, for example, to international negotiations. We talk about one country being friendly to another or allying or competing, but countries themselves don't have feelings. It's the elites and leaders who do, and a lot of international politics is driven by the psychology of its leaders.

Even beyond applying the findings of psychology and cognitive science and social and affective neuroscience, it's the mindset of science that ought to be exported to cultural and intellectual life as a whole. That consists in increased skepticism and scrutiny about factual conventional wisdom: How much of what you think is true really is true if you go to the numbers? For me this has been a salient issue in analyzing violence, because the conventional wisdom is that we're living in extraordinarily violent times.

But if you take into account the psychology of risk perception, as pioneered by Daniel Kahneman, Amos Tversky, Paul Slovic, Gerd Gigerenzer, and others, you realize that the conventional wisdom is systematically distorted by the source of our information about the world, namely the news. News is about the stuff that happens; it's not about the stuff that doesn't happen. Human risk perception is affected by memorable examples, according to Tversky and Kahneman's availability heuristic. No matter what the rate of violence is objectively, there are always enough examples to fill the news. And since our perception of risk is influenced by memorable examples, we'll always think we're living in violent times. It's only when you apply the scientific mindset to world events, to political science and history, and try to count how many people are killed now as opposed to ten years ago, a hundred years ago, or a thousand years ago that you get an accurate picture about the state of the world and the direction that it's going, which is largely downward. That conclusion only came from applying an empirical mindset to the traditional subject matter of history and political science.

The other aspect of the scientific mindset that ought to be exported to the rest of intellectual life is the search for explanations. That is, not to just say that history is one damn thing after another, that stuff happens, and there's nothing we can do to explain why, but to relate phenomena to more basic or general phenomena … and to try to explain those phenomena with still more basic phenomena. We've repeatedly seen that happen in the sciences, where, for example, biological phenomena were explained in part at the level of molecules, which were explained by chemistry, which was explained by physics.

There's no reason that this process of explanation can't continue. Biology gives us a grasp of the brain, and human nature is a product of the organization of the brain, and societies unfold as they do because they consist of brains interacting with other brains and negotiating arrangements to coordinate their behavior, and so on.

Now I know that there is tremendous resistance to this idea, because it's confused with a boogeyman called "reductionism"—the fear that we must explain World War I in terms of genes or even elementary particles.

But explanation does not imply reduction. You reduce the building blocks of an explanation to more basic phenomena one level down, but you don't discard the explanation of the phenomenon itself. So World War I obviously is not going to be explained in terms of neuroscience. On the other hand, World War I could be explained in terms of the emotions of fear and dominance and prestige among leaders, which fell into a deadly combination at that moment in history. And instead of just saying, "Well, that's the way things are, and there's nothing more we can say about it," we can ask, "Why do people compete for prestige? Why do people have the kinds of fears that they do?"

The answer doesn't have to be, "Because I said so" or "Because that's the way it is." You can ask, "How does the psychology of fear work? How does the psychology of dominance work? How does the psychology of coalitions work?" Having done that, you get a deeper understanding of some of the causes of World War I. That doesn't mean you throw out the conventional history of World War I; it just means that you enrich it, you diversify it, you deepen it. A program of unifying the arts and humanities with the psychological sciences and ultimately the biological sciences promises tremendous increases of depth of understanding for all the fields.

I'm often asked, "Who are the leaders of this movement? Whose writings should we be reading and discussing?" But that misses the point. It's not about individual people. It's more revolutionary than just reading this, that or the other person. There has to be a change in mindset coming from both directions. It's not just a question of getting traditional scholars from the humanities and social sciences to start incorporating more science, to start thinking more like scientists. It's got to work the other direction as well. A lot of scientists really are philistines when it comes to history and political theory and philosophy. We need to break down the idea that there are these separate disciplines and modes of study.

In trying to figure out what would give us the deepest, most insightful, most informative understanding of the world and ourselves, we have to be aware of the turf battles: who gets the franchise for talking about what matters. That is one reason that there is a cadre of traditional intellectuals who have been hostile to science. I'm not talking about the climate deniers or the vaccine kooks but those who resent the idea that the discussion of what matters, of morality, of politics, of meaning, of purpose should be taken on by these philistines called scientists or social scientists. They act as if the franchise for these heavyweight topics has been given to critics and literary scholars and commentators on religion.

But we need not give credence to people who are simply protecting their turf. It's becoming increasingly clear over the decades and centuries that an understanding of science is central to our understanding of the deepest questions of who we are, where we came from, what matters. If you aren't aware of what science has to say about who we are and what we're like as a species, then you're going to be missing a lot of insight about human life. The fact that this upsets certain traditional bastions of commentary shouldn't matter. People always protect their turf.

That's why I'm reluctant to answer when I'm asked who are the people we should be reading, what names can we associate with this approach. It's not about people. It's about the ideas, and the ideas inevitably come piecemeal from many thinkers. The ideas are refined, exchanged, accumulated, and improved by a community of thinkers, each of whom will have a few good ideas and a lot of bad ideas. What we've been talking about is a direction that I hope the entire intellectual culture goes in. It's not about anointing some guru.

Another intellectual error we must be suspicious of is the ever-present tendency to demonize the younger generation and the direction in which culture and society are going. In every era there are commentators who say that the kids today are dumbing down the culture and taking human values with them. Today the accusations are often directed at anything having to do with the Web and other electronic technologies—as if the difference between being printed on dead trees and displayed as pixels on a screen is going to determine the content of ideas. We're always being told that young people suck: that they are illiterate and unreflective and un-thoughtful, all of which ignores the fact that every generation has had the same things said about it by the older generation. Yet somehow civilization persists.

An appreciation of psychology can remind us that we as a species are prone to these bad habits. When we comment on the direction that intellectual life is going, we should learn to discount our own prejudices, our own natural inclination to say "I and my tribe are entitled to weigh in on profound issues, but members of some other guild or tribe or clique are not." And "My generation is the embodiment of wisdom and experience, and the younger generation is uncouth, illiterate, unwashed and uncivilized."

There is no "conflict between the sciences and humanities," or at least there shouldn't be. There should be no turf battle as to who gets to speak about what matters. What matters are ideas. We should seek the ideas that give us the deepest, richest, best-informed understanding of the human condition, regardless of which people or what discipline originates them. That has to include the sciences, but it can't come only from the sciences. The focus should be on ideas, not on people, disciplines, or academic traditions.

Wednesday, May 14, 2014

Steven Pinker - ‘What Could Be More Interesting than How the Mind Works?’

A long and interesting interview with Steven Pinker from the Harvard Gazette. Pinker is the author of a lot of really, really thick books, including The Better Angels of Our Nature: Why Violence Has Declined (2011), The Stuff of Thought: Language as a Window into Human Nature (2007), The Blank Slate: The Modern Denial of Human Nature (2002), and How the Mind Works (1997).

‘What could be more interesting than how the mind works?’

Steven Pinker’s history of thought

May 6, 2014 


By Colleen Walsh, Harvard Staff Writer

Steven Pinker follows Sara Lawrence-Lightfoot, Martha Minow, and E.O. Wilson in the Experience series, interviews with Harvard faculty members covering the reasons they became teachers and scholars, and the personal journeys, missteps included, behind their professional success. Interviews with Melissa Franklin, Stephen Greenblatt, Laurel Thatcher Ulrich, Helen Vendler, and Walter Willett will appear in coming weeks.

The brain is Steven Pinker’s playground. A cognitive scientist and experimental psychologist, Pinker is fascinated by language, behavior, and the development of human nature. His work has ranged from a detailed analysis of how the mind works to a best-seller about the decline in violence from biblical times to today.

Raised in Montreal, Pinker was drawn early to the mysteries of thought that would drive his career, and shaped in part by coming of age in the ’60s and early ’70s, when “society was up for grabs,” it seemed, and nature vs. nurture debates were becoming more complex and more heated.

His earliest work involved research in both visual imagery and language, but eventually he devoted himself to the study of language development, particularly in children. His groundbreaking 1994 book “The Language Instinct” put him firmly in the sphere of evolutionary psychology, the study of human impulses as genetically programmed and language as an instinct “wired into our brains by evolution.” Pinker, 59, has spent most of his career in Cambridge, and much of that time at Harvard — first for his graduate studies, later as an assistant professor. He is the Johnstone Family Professor of Psychology.

Q: Can you tell me about your early life? Where did you grow up and what did your parents do?

A: I grew up in Montreal, as part of the Jewish minority within the English-speaking minority within the French-speaking minority in Canada. This is the community that gave the world Leonard Cohen, who my mother knew, and Mordecai Richler, who my father knew, together with William Shatner, Saul Bellow, and Burt Bacharach. I was born in 1954, the peak year of the baby boom. My grandparents came to Canada from Eastern Europe in the 1920s, I surmise, because in 1924 the United States passed a restrictive immigration law. I can visualize them looking at a map and saying “Damn, what’s the closest that we can get to New York? Oh, there’s this cold place called Canada, let’s try that.” Three were from Poland, one from what is now Moldova. My parents both earned college degrees. My father had a law degree, but for much of his career did not practice law. He worked as a sales representative and a landlord and owned an apartment-motel in Florida. But he reopened his law practice in his 50s, and retired at 75. Like many women of her generation, my mother was a homemaker through the ’50s and ’60s. In the 1970s she got a master’s degree in counseling, then got a job and later became vice principal of a high school in Montreal.

I went to public schools in the suburbs of Montreal, and then to McGill University, which is also where my parents went. I came to Harvard in 1976 for graduate school, got my Ph.D. from this [psychology] department in 1979, went to MIT to do a postdoc, and came back here as an assistant professor in 1980. It was what they called a folding chair, since in those years Harvard did not have a genuine tenure track. I was advised to take the first real tenure-track job that came my way, and that happened within a few months, so I decamped for Stanford after just one year here. Something in me wanted to come back to Boston, so I left Stanford after a year and I was at MIT for 21 years before returning to Harvard ten and a half years ago. This is my third stint at Harvard.

Q: Were your parents instrumental in your choice of a career?

A: Not directly, other than encouraging my intellectual growth and expecting that I would do something that would make use of my strengths.

Q: What were those strengths?

A: My parents wanted me to become a psychiatrist, given my interest in the human mind, and given the assumption that any smart, responsible young person would go into medicine. They figured it was the obvious career for me. The 1970s was a decade in which the academic job market had collapsed. There were stories in The New York Times of Ph.D.s driving taxis and working in sheriff’s offices, and so they thought that a Ph.D. would be a ticket to unemployment — some things don’t change. They tried to reason with me: “If you become a psychiatrist, you get to indulge your interest in the human mind, but you also always have a job. You can always treat patients.” But I had no interest in pursuing a medical degree, nor in treating patients. Psychopathology was not my primary interest within psychology. So I gambled, figuring that if the worst happened and I couldn’t get an academic job I would be 25 years old and could do something else. Also, I chose a field — cognitive psychology — that I knew was expanding. I expected that psychology departments would be converting slots in the experimental analysis of behavior, that is, rats and pigeons being conditioned, to cognitive psychology. And that’s exactly what happened. Fortunately, I got three job offers in three years at three decent places. My parents were relieved, not to mention filled with naches.

Q: I read that an early experience with anarchy got you intrigued about the workings of the mind. Can you tell me more about that?

A: I was too young for ’60s campus activism; I was in high school when all of the excitement happened. But it was very much the world I lived in. The older siblings of my friends were college students, and you couldn’t avoid the controversies of the ’60s if you read the newspaper and watched TV. In the ’60s everyone had to have a political ideology. You couldn’t get a date unless you were a Marxist or an anarchist. Anarchism seemed appealing. I had a friend who had read Kropotkin and Bakunin and he persuaded me that human beings are naturally generous and cooperative and peaceful. That’s just the rational way to be if you didn’t have a state forcing you to delineate your property and separate it from someone else’s. No state, no property, nothing to fight over . . . I’d have arguments over the dinner table with my parents, and they said that if the police ever disappeared, all hell would break loose. Being 14 years old, of course I knew better, until an empirical test presented itself.

Quebec is politically and economically very Gallic: Sooner or later, every public sector goes on strike. One week it’s the garbage collectors, another week the letter carriers. Then one day the police went on strike. They simply did not show up for work one morning. So what happened? Well, within a couple of hours there was widespread looting, rioting, and arson — not one but two people were shot to death, until the government called in the Mounties to restore order. This was particularly shocking in Montreal, which had a far lower rate of violent crime than American cities. Canadians felt morally superior to Americans because we didn’t have the riots and the civil unrest of the 1960s. So to see how quickly violent anarchy could break out in the absence of police enforcement was certainly, well, informative. As so often happens, long-suffering mom and dad were right, and their smart-ass teenage son was wrong. That episode also gave me a taste of what it’s like to be a scientist, namely that cherished beliefs can be cruelly falsified by empirical tests.

I wouldn’t say it’s that incident in particular that gave me an interest in human nature. But I do credit growing up in the ’60s, when these ideas trickled down, and the early ’70s, which were an extension of the ’60s. Debates on human nature and its political implications were in the air. Society was up for grabs. There was talk of revolution and rationally reconstructing society, and those discussions naturally boiled down to rival conceptions of human nature. Is the human psyche socially constructed by culture and parenting, or is there even such a thing as human nature? And if there is, what materials do we have to work with in organizing a society? In college I took a number of courses that looked at human nature from different vantage points: anthropology, sociology, psychology, literature, philosophy. But psychology appealed to me because it seemed to ask profound questions about our kind, but it also offered the hope that the questions could be answered in the lab. So it had just the right mixture of depth and tractability.

Q: You started your career interested in the visual realm as well as in language, but eventually you chose to focus your energies on your work with language. Why?

A: Starting from graduate school I pursued both. My Ph.D. thesis was done under the supervision of Stephen Kosslyn, who later became chair of this department, then dean of social science until he left a couple of years ago to become provost of Minerva University. My thesis was on visual imagery, the ability to visualize objects in the mind’s eye. At the same time, I took a course with Roger Brown, the beloved social psychologist who was in this department for many years. In yet another course I wrote a theoretical paper on language acquisition, which took on the question “How could any intelligent agent make the leap from a bunch of words and sentences in its input to the ability to understand and produce an infinite number of sentences in the language from which they were drawn?” That was the problem that Noam Chomsky set out as the core issue in linguistics.

So I came out of graduate school with an interest in both vision and language. When I was hired back at Harvard a year after leaving, I was given responsibility for three courses in language acquisition. In the course of developing the lectures and lab assignments I started my own empirical research program on language acquisition. And I pursued both projects for about 10 years until the world told me that it found my work on language more interesting than my work on visual cognition. I got more speaking invitations, more grants, more commentary. And seeing that other people in visual cognition like Ken Nakayama, my colleague here, were doing dazzling work that I couldn’t match, whereas my work on language seemed to be more distinctive within its field — that is, there weren’t other people doing what I was doing — I decided to concentrate more and more on language, and eventually closed down my lab in visual cognition.

Q: Did you have any doubts when you were starting out in your career?

A: Oh, absolutely. I was terrified of ending up unemployed. When I got to Harvard, the Psychology Department, at least the experimental program in the Psychology Department, was extremely mathematical. It specialized in a sub-sub-discipline called psychophysics, which was the oldest part of psychology, coming out of Germany in the late 19th century. William James, the namesake of this building, said “the study of psychophysics proves that it is impossible to bore a German.” Now, I’m interested in pretty much every part of psychology, including psychophysics. But this was simply not the most exciting frontier in psychology, and even though I was good in math, I didn’t have nearly as much math background as a hardcore psychophysicist, and I wondered whether I had what it took to do the kind of psychology being done here. But it was starting to become clear — even at Harvard — that mathematical psychophysics was becoming increasingly marginalized, and if it wanted to keep up, Harvard had to start hiring in cognitive psychology. They hired Steve Kosslyn, we immediately hit it off, and I felt much more at home.

Q: If you were trying to get someone interested in this field today, what would you say?

A: What could be more interesting than how the mind works? Also, I believe that psychology sits at the center of intellectual life. In one direction, it looks to the biological sciences, to neuroscience, to genetics, to evolution. But in the other, it looks to the social sciences and the humanities. Societies are formed and take their shape from our social instincts, our ability to communicate and cooperate. And the humanities are the study of the products of our human mind, of our works of literature and music and art. So psychology is relevant to pretty much every subject taught at a university.

Psychology is blossoming today, but for much of its history it was dull, dull, dull. Perception was basically psychophysics, the study of the relationship between the physical magnitude of a stimulus and its perceived magnitude — that is, as you make a light brighter and brighter, does its subjective brightness increase at the same rate or not? It also studied illusions, like the ones on the back of the cereal box, but without much in the way of theory. Learning was the study of the rate at which rats press levers when they are rewarded with food pellets. Social psychology was a bunch of laboratory demonstrations showing that people could behave foolishly and be mindless conformists, but also without a trace of theory explaining why. It's only recently, in dialogue with other disciplines, that psychology has begun to answer the "why" questions. Cognitive science, for example, which connects psychology to linguistics, theoretical computer science, and philosophy of mind, has helped explain intelligence in terms of information, computation, and feedback. Evolutionary thinking is necessary to ask the "why" questions: "Why does the mind work the way it does instead of some other way in which it could have worked?" This crosstalk has made psychology more intellectually satisfying. It's no longer just one damn phenomenon after another.
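An editorial aside on the brightness question Pinker poses: psychophysics gave it a classic quantitative answer in Stevens' power law, which states that perceived magnitude grows as a power function of physical intensity. The sketch below (not from the interview; the function name, the constant k, and the commonly cited brightness exponent of roughly 0.33 are illustrative assumptions) shows why subjective brightness does not increase at the same rate as physical intensity.

```python
# Illustrative sketch of Stevens' power law: S = k * I**beta,
# where S is perceived magnitude, I is physical intensity,
# k is a scaling constant, and beta is the sense-specific exponent.
# For brightness, beta is classically reported near 0.33, so a light
# must be made far more than twice as intense to look twice as bright.

def perceived_magnitude(intensity: float, beta: float = 0.33, k: float = 1.0) -> float:
    """Subjective magnitude of a stimulus under Stevens' power law."""
    return k * intensity ** beta

# Doubling physical intensity from 1.0 to 2.0:
ratio = perceived_magnitude(2.0) / perceived_magnitude(1.0)
# ratio = 2 ** 0.33, about 1.26 -- brightness grows much more slowly
# than the physical stimulus, answering "same rate or not?" with "not."
```

With a compressive exponent like 0.33 the answer to Pinker's question is "no": equal multiplicative steps in intensity yield ever-smaller steps in sensation, which is exactly the kind of lawful regularity the early psychophysicists measured.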

Q: Is there a single work that you are most proud of?

A: I am proud of "How the Mind Works" for its sheer audacity in trying to explain exactly that, how the mind works, between one pair of covers. At the other extreme of generality, I'm proud of a research program I did for about 15 years that culminated in "Words and Rules," a book about, of all things, irregular verbs, which I use as a window onto the workings of cognition. I'm also fulfilled by having written my most recent book, "The Better Angels of Our Nature," which is about something completely different: the historical decline of violence and its causes, a phenomenon that most people are not even aware of, let alone have an explanation for. In that book, I first had to convince readers that violence has declined, knowing that the very idea strikes people as preposterous, even outrageous. So I told the story in 100 graphs, each showing a different category of violence: tribal warfare, slavery, homicide, war, civil war, domestic violence, corporal punishment, rape, terrorism. All have been in decline. Having made this case, I returned to being a psychologist, and set myself the task of explaining how that could have happened. And that explanation requires answering two psychological questions: "Why was there so much violence in the past?" and "What drove the violence down?" For me, the pair of phenomena stood as a corroboration of an idea I have long believed: namely, that human nature is complex. There is no single formula that explains what makes people tick, no wonder tissue, no magical all-purpose learning algorithm. The mind is a system of mental organs, if you will, and some of its components can lead us to violence, while others can inhibit us from violence. What changed over the centuries and decades is which parts of human nature are most engaged. I took the title, "The Better Angels of Our Nature," from Abraham Lincoln's first inaugural address. It's a poetic allusion to the idea that there are many components to human nature, some of which can lead to cooperation and amity.

Q: I read a newspaper article in which you talked about the worst thing you have ever done. Can you tell me about that?

A: It was as an undergraduate working in a behaviorist lab. I carried out a procedure that turned out to be tantamount to torturing a rat to death. I was asked to do it, and against my better judgment, did it. I knew it had little scientific purpose. It was done in an era in which there was no oversight over the treatment of animals in research, and just a few years later it would have been inconceivable. But this painful episode resonated with me for two reasons. One is that it was a historical change in a particular kind of violence that I lived through, namely the increased concern for the welfare of laboratory animals. This was one of the many developments I talk about in "The Better Angels of Our Nature." Also, as any psychology student knows, humans sometimes do things against their own conscience under the direction of a responsible authority, even if the authority has no power to enforce the command. This is the famous Milgram experiment, in which people were delivering what they thought were fatal shocks to confederates pretending to be fellow volunteers. I show the film of the Milgram experiment to my class every year. It's harrowing to watch, but I've seen it now 17 times and found it just as gripping the 17th time as the first. There was a lot of skepticism that people could possibly behave that way. Prior to the experiment, a number of experts were polled for their prediction as to what percentage of subjects would administer the most severe shock. The average of the predictions was on the order of one-tenth of one percent. The actual result was 70 percent. Many people think there must be some trick or artifact, but having behaved like Milgram's 70 percent myself, despite thinking of myself as conscientious and morally concerned, I believe that the Milgram study reveals a profound and disturbing feature of human psychology.


Q: What would you say is your biggest flaw as a scholar? What about your greatest strength?

A: That’s for other people to judge! I am enough of a psychologist to know that any answer I give would be self-serving. La Rochefoucauld said, “Our enemies’ opinions of us come closer to the truth than our own.”

Q: As an expert in language, what do you think of Twitter?

A: I was pressured into becoming a Twitterer when I wrote an op-ed for The New York Times saying that Google is not making us stupid, that electronic media are not ruining the language. And my literary agent said, “OK, you’ve gone on record saying that these are not bad things. You better start tweeting yourself.” And so I set up a Twitter feed, which turns out to suit me because it doesn’t require taking out hours of the day to write a blog. The majority of my tweets are links to interesting articles, which takes advantage of the breadth of articles that come my way — everything from controversies over correct grammar to trends in genocide.

Having once been a young person myself, I remember the vilification that was hurled at us baby boomers by the older generation. This reminds me that it is a failing of human nature to detest anything that young people do just because older people are not used to it or have trouble learning it. So I am wary of the “young people suck” school of social criticism. I have no patience for the idea that because texting and tweeting force one to be brief, we’re going to lose the ability to express ourselves in full sentences and paragraphs. This simply misunderstands the way that human language works. All of us command a variety of registers and speech styles, which we narrowcast to different forums. We speak differently to our loved ones than we do when we are lecturing, and still differently when we are approaching a stranger. And so, too, we have a style that is appropriate for texting and instant messaging that does not necessarily infect the way we communicate in other forums. In the heyday of telegraphy, when people paid by the word, they left out the prepositions and articles. It didn’t mean that the English language lost its prepositions and articles; it just meant that people used them in some media and not in others. And likewise, the prevalence of texting and tweeting does not mean that people magically lose the ability to communicate in every other conceivable way.

Q: Early in your career you wrote a number of important technical works. Do you find it more fun to write the more broadly appealing books?

A: Both are appealing for different reasons. In trade books I have the length to pursue objections, digressions, and subtleties, something that is hard to do in the confines of a journal article. I also like the freedom to avoid academese and to write in an accessible style — which happens to be the very topic of my forthcoming book, “The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century.” I also like bringing to bear ideas and sources of evidence that don’t come from a single discipline. In the case of my books on language, for example, I used not just laboratory studies of kids learning to talk, or studies of language in patients with brain damage, but also cartoons and jokes where the humor depends on some linguistic subtlety. Telling examples of linguistic phenomena can be found in both high and low culture: song lyrics, punch lines from stand-up comedy, couplets from Shakespeare. In “Better Angels,” I supplemented the main narrative, told with graphs and data, with vignettes of culture at various times in history, which I presented as a sanity check, as a way of answering the question, “Could your numbers be misleading you into a preposterous conclusion because you didn’t try to get some echo from the world as to whether life as it was lived reflects the story told by the numbers?” If, as I claim, genocide is not a modern phenomenon, we should see signs of it being treated as commonplace or acceptable in popular narratives. One example is the Old Testament, which narrates one genocide after another, commanded by God. This doesn’t mean that those genocides actually took place; probably most of them did not. But it shows the attitude at the time, which is, genocide is an excellent thing as long as it doesn’t happen to you.

I also find that there is little distinction between popular writing and cross-disciplinary writing. Academia has become so hyperspecialized that as soon as you write for scholars who are not in your immediate field, the material is as alien to them as it is to a lawyer or a doctor or a high school teacher or a reader of The New York Times.

Q: Were you a big reader as a teen? Can you think of one or two works you read early, fiction or nonfiction, where you came away impressed, even inspired, by the ideas, the craft, or both?

A: I was a voracious reader, and then as now, struggled to balance breadth and depth, so my diet was eclectic: newspapers, encyclopedias, a Time-Life book-of-the-month collection on science, magazines (including Esquire in its quality-essay days and Commentary in its pre-neocon era), and teen-friendly fiction by Orwell, Vonnegut, Roth, and Salinger (the intriguing Glasses, not the tedious Caulfield). Only as a 17-year-old in junior college did I encounter a literary style I consciously wanted to emulate — the wit and clarity of British analytical philosophers like Gilbert Ryle and A.J. Ayer, and the elegant prose of the Harvard psycholinguists George Miller and Roger Brown.

Q: Might we one day see a Steven Pinker book about horse racing or piano playing — or a Pinker novel? Is there a genre or off-work-hours interest you’ve thought seriously about putting book-length work into?

A: Whatever thoughts I might have had of writing a novel were squelched by marrying a real novelist [Rebecca Goldstein] and seeing firsthand the degree of artistry and brainpower that goes into literary fiction. But I have pondered other crossover projects. I’m an avid photographer, and would love to write a book someday that applied my practical experience, combined with vision science and evolutionary aesthetics, to explaining why we enjoy photographs. And I’ve thought of collaborating with Rebecca on a book on the psychology, philosophy, and linguistics of fiction — which would give me an excuse to read the great novels I’ve never found time for.

Q: You have won several teaching awards during your career. What makes a great teacher?

A: Foremost is passion for the subject matter. Studies of teaching effectiveness all show that enthusiasm is a major contributor. Also important is an ability to overcome professional narcissism, namely a focus on the methods, buzzwords, and cliques of your academic specialty, rather than a focus on the subject matter, the actual content. I don’t think of what I’m teaching my students as “psychology.” I think of it as teaching them “how the mind works.” They’re not the same thing. Psychology is an academic guild, and I could certainly spend a lot of time talking about schools of psychology, the history of psychology, methods in psychology, theories in psychology, and so on. But that would be about my clique, how my buddies and I spend our days, how I earn my paycheck, what peer group I want to impress. What students are interested in is not an academic field but a set of phenomena in the world — in this case the workings of the human mind. Sometimes academics seem not to appreciate the difference.

A third ingredient of good teaching is overcoming "the curse of knowledge": the inability to know what it's like not to know something that you do know. That is a lifelong challenge. It's a challenge in writing, and it's a challenge in teaching, which is why I see a lot of synergy between the two. Often an idea in one of my books will have originated from the classroom, or vice versa, because the audience is the same: smart people who are intellectually curious enough to have bought the book or signed up for the course but who are just not as knowledgeable about a particular topic as I am. The obvious solution is to "imagine the reader over your shoulder" or "to put yourself in your students' shoes." That's a good start, but it's not enough, because the curse of knowledge prevents us from fully appreciating what it's like to be a student or a reader. That's why writers need editors: The editors force them to realize that what's obvious to them isn't obvious to everyone else. And it's why teachers need feedback, either from seeing the version of your content that comes back at you in exams, or in conversations with students during office hours, or in discussion sessions.

Another important solution is being prepared to revise. Most of the work of writing is in the revising. During the first pass of the writing process, it's hard enough to come up with ideas that are worth sharing. To simultaneously concentrate on the form, on the felicity of expression, is too much for our thimble-sized minds to handle. You have to break it into two distinct stages: Come up with the ideas, and polish the prose. This may sound banal, but I find that it comes as a revelation to people who ask about my writing process. It's why in my SLS 20 class, the assignment for the second term paper is to revise the first term paper. That's my way to impress on students that the quality comes in the revision.

Q: How do students differ today from when you were a student?

A: What a dangerous question! The most tempting and common answer is the thoughtless one: “The kids today are worse.” It’s tempting because people often confuse changes in themselves with changes in the times, and changes in the times with moral and intellectual decline. This is a well-documented psychological phenomenon. Every generation thinks that the younger generation is dissolute, lazy, ignorant, and illiterate. There is a paper trail of professors complaining about the declining quality of their students that goes back at least 100 years. All this means that your question is one that people should think twice before answering. I know a lot more now than I did when I was a student, and thanks to the curse of knowledge, I may not realize that I have acquired most of it during the decades that have elapsed since I was a student. So it’s tempting to look at students and think, “What a bunch of inarticulate ignoramuses! It was better when I was at that age, a time when I and other teenagers spoke in fluent paragraphs, and we effortlessly held forth on the foundations of Western civilization.” Yeah, right.

Here is a famous experiment. A 3-year-old comes into the lab. You give him a box of M&Ms. He opens up the box and instead of finding candy he finds a tangle of ribbons. He is surprised, and now you say to him, “OK, now your friend Jason is going to come into the room. What will Jason think is in the box?” The child says, “ribbons,” even though Jason could have no way of knowing that. And, if you ask the child, “Before you opened the box, what did you think was in it?” They say, “ribbons.” That is, they backdate their own knowledge. Now we laugh at the 3-year-old, but we do the same thing. We backdate our own knowledge and sophistication, so we always think that the kids today are more slovenly than we were at that age.

Q: What are some of the greatest things your students have taught you?

A: Many things. The most obvious is the changes in technology for which we adults are late adopters. I had never heard of Reddit, let alone known that it was a major social phenomenon, until two of my students asked if I would do a Reddit AMA [Ask Me Anything]. I did the session in my office with two of my students guiding me, kind of the way I taught my grandmother how to use this newfangled thing called an answering machine. That evening I got an email from my editor in New York saying: “The sales of your book just mysteriously spiked. Any explanation?” It was all thanks to Reddit, which I barely knew existed. Another is a kind of innocence — though that’s a condescending way to put it. It’s a curiosity about the world untainted by familiarity with an academic field. It works as an effective challenge to my own curse of knowledge. So if you want to know what it’s like not to know something that you know, the answer is not to try harder, because that doesn’t work very well. The answer is to interact with someone who doesn’t know what you know, but who is intelligent, curious, and open.

Q: If you weren’t in this field, what would you be doing?

A: Am I allowed to be an academic?

Q: You can be anything you want.

A: I could have been in some other field that deals with ideas, like philosophy or constitutional law. I have enough of an inner geek to imagine being a programmer, and for a time as an undergraduate that appealed to me. But as much as I like gadgets and code, I like ideas more, so I suspect that the identical twin separated from me at birth would also have done something in the world of ideas.

Q: No Steven Pinker interview would be complete without a question about your hair. I recently saw a picture of you from the 1970s, and your style appears unchanged. Why haven’t you gone for a shorter look?

A: First, there’s immaturity. Any boy growing up in the ’60s fought a constant battle with his father about getting a haircut. Now no one can force me to get my hair cut, and I’m still reveling in the freedom. Also, I had a colleague at MIT, the computer scientist Pat Winston, who gave a famous annual speech on how to lecture, and one of his tips was that every professor should have an affectation, something to amuse students with, not to mention journalists, comedians, and wise guys. I am a charter member of an organization called The Luxuriant Flowing Hair Club for Scientists. The MIT newspaper once ran a feature on all the famous big-haired people I had been compared to, including Simon Rattle, Robert Plant, Spinoza, and Bruno, the guy who played the piano on the TV show “Fame.” When I was on The Colbert Report, talking about fear and security, Stephen Colbert pulled out an electromagnetic wand and scanned my hair for concealed weapons. So it does have its purposes.

Interview was edited for length and clarity.

Thursday, May 01, 2014

The Better Angels of Our Nature: Why Violence Has Declined


Steven Pinker's The Better Angels of Our Nature: Why Violence Has Declined (2011) received a lot of praise, and it was also dismissed as "hallucinatory" by Robert Epstein in Scientific American.
People pay more attention to facts that match their beliefs than those that undermine them. Pinker wants peace, and he also believes in his hypothesis; it is no surprise that he focuses more on facts that support his views than on those that do not.
So watch the talk below, given at UC Berkeley in February, but just posted on the UCTV Channel, and see if you agree with Epstein or with Pinker.

The Better Angels of Our Nature: Why Violence Has Declined

Published on Apr 29, 2014


Believe it or not, violence has been in decline for long stretches of time, and we may be living in the most peaceful era in our species' existence. Harvard psychology professor Steven Pinker presents the data supporting this surprising conclusion, and explains the trends by showing how changing historical circumstances have engaged different components of human nature. Recorded on 02/04/2014. Series: "UC Berkeley Graduate Council Lectures" [5/2014] - (Visit: http://www.uctv.tv/)

Thursday, April 03, 2014

World Thinkers 2013 - Prospect Magazine

Here is the list of top "world thinkers" from Prospect Magazine, 2013 edition. It's an interesting list of people - although I am not sure I would put Richard Dawkins in the #1 slot, or maybe even in the top ten.

World Thinkers 2013

by Prospect / April 24, 2013 / 109 Comments

The results of Prospect’s world thinkers poll


Left to right: Ashraf Ghani, Richard Dawkins, Steven Pinker © US Embassy, Kabul © Rex Features

After more than 10,000 votes from over 100 countries, the results of Prospect’s world thinkers 2013 poll are in. Online polls often throw up curious results, but this top 10 offers a snapshot of the intellectual trends that dominate our age.

THE WINNERS

1. Richard Dawkins
When Richard Dawkins, the Oxford evolutionary biologist, coined the term “meme” in The Selfish Gene 37 years ago, he can’t have anticipated its current popularity as a word to describe internet fads. But this is only one of the ways in which he thrives as an intellectual in the internet age. He is also prolific on Twitter, with more than half a million followers—and his success in this poll attests to his popularity online. He uses this platform to attack his old foe, religion, and to promote science and rationalism. Uncompromising as his message may be, he’s not averse to poking fun at himself: in March he made a guest appearance on The Simpsons, lending his voice to a demon version of himself.

2. Ashraf Ghani
Few academics get the chance to put their ideas into practice. But after decades of research into building states at Columbia, Berkeley and Johns Hopkins, followed by a stint at the World Bank, Ashraf Ghani returned to his native Afghanistan to do just that. He served as the country’s finance minister and advised the UN on the transfer of power to the Afghans. He is now in charge of the Afghan Transition Coordination Commission and the Institute for State Effectiveness, applying his experience in Afghanistan elsewhere. He is already looking beyond the current crisis in Syria, raising important questions about what kind of state it will eventually become.

3. Steven Pinker
Long admired for his work on language and cognition, the Harvard professor Steven Pinker took a panoramic sweep through history in his latest book, The Better Angels of Our Nature. Marshalling a huge range of evidence, Pinker argued that humanity has become less violent over time. As with Pinker’s previous books, it sparked fierce debate. Whether writing about evolutionary psychology, linguistics or history, what unites Pinker’s work is a fascination with human nature and an enthusiasm for sharing new discoveries in accessible, elegant prose.

4. Ali Allawi
Ali Allawi began his career in 1971 at the World Bank before moving into academia and finally politics, as Iraq’s minister of trade, finance and defence after the fall of Saddam Hussein. Since then he has written a pair of acclaimed books, most recently The Crisis of Islamic Civilisation, and he is currently a senior visiting fellow at Princeton. “His scholarly work on post-Saddam Iraq went further than anyone else has yet done in helping us understand the complex reality of that country,” says Clare Lockhart, co-author (with Ashraf Ghani) of Fixing Failed States. “His continuing work on the Iraqi economy—and that of the broader region—is meanwhile helping to illuminate its potential, as well as pathways to a more stable and productive future.”

5. Paul Krugman
As a fierce critic of the economic policies of the right, Paul Krugman has become something like the global opposition to fiscal austerity. A tireless advocate of Keynesian economics, he has been repeatedly attacked for his insistence that government spending is critical to ending the recession. But as he told Prospect last year, “we’ve just conducted what amounts to a massive experiment on pretty much the entire OECD [the industrialised world]. It’s been as slam-dunk a victory for a more or less Keynesian view as one can possibly imagine.” His New York Times columns are so widely discussed that it is easy to overlook his academic work, which has won him a Nobel prize and made him one of the world’s most cited economists.

6. Slavoj Žižek
Slavoj Žižek’s critics seem unsure whether to dismiss him as a buffoon or a villain. The New Republic has called him “the most despicable philosopher in the west,” but the Slovenian’s legion of fans continues to grow. He has been giving them plenty to chew on—in the past year alone he has produced a 1,200-page study of Hegel, a book, The Year of Dreaming Dangerously, analysing the Arab Spring and other recent events, and a documentary called The Pervert’s Guide to Ideology. And he has done all this while occupying academic posts at universities in Slovenia, Switzerland and London. His trademark pop culture references (“If you ask me for really dangerous ideological films, I’d say Kung Fu Panda,” he told one interviewer in 2008) may have lost their novelty, but they remain a gentle entry point to his studies of Lacanian psychoanalysis and left-wing ideology.

7. Amartya Sen
Amartya Sen will turn 80 in November—making him the fourth oldest thinker on our list—but he remains one of the world’s most active public intellectuals. He rose to prominence in the early 1980s with his studies of famine. Since then he has gone on to make major contributions to developmental economics, social choice theory and political philosophy. Receiving the Nobel prize for economics in 1998, he was praised for having “restored an ethical dimension to the discussion of vital economic problems.” The author of Prospect’s first cover story in 1995, Sen continues to write influential essays and columns, in the past year arguing against European austerity. And he shows no sign of slowing down or narrowing his focus—his latest book (with Jean Drèze), An Uncertain Glory: India and its Contradictions, will be published in July.

8. Peter Higgs
The English physicist Peter Higgs lent his name to the Higgs boson, the subatomic particle discovered last year at Cern that gives mass to other elementary particles. Although Higgs is always quick to point out that others were involved in early work on the existence of the particle, he was central to the first descriptions of the boson in 1964. “Of the various people who contributed to that piece of theory,” Higgs told Prospect in 2011, “I was the only one who pointed to this particle as something that would be… of interest for experimentalists.” Higgs is expected to receive a Nobel prize this year for his achievements.

9. Mohamed ElBaradei
The former director general of the UN’s international atomic energy agency and winner of the 2005 Nobel peace prize, Mohamed ElBaradei has become one of the most prominent advocates of democracy in Egyptian politics over the past two years. Since December, ElBaradei has been the coordinator of the National Salvation Front, a coalition of political parties dedicated to opposing what they see as President Mohamed Morsi’s attempts to secure power for himself and impose a new constitution favouring Islamist parties. Reflecting widespread concern about Morsi’s actions, ElBaradei has accused the president of appointing himself “Egypt’s new pharaoh.”

10. Daniel Kahneman
Since the publication of Thinking, Fast and Slow in 2011, Daniel Kahneman has become an unlikely resident at the top of the bestseller lists. His face has even appeared on posters on the London Underground, with only two words of explanation: “Thinking Kahneman.” Although he is a psychologist by training, his work on our capacity for making irrational decisions helped create the field of behavioural economics, and he was awarded the Nobel prize for economics in 2002. His book has now brought these insights to a wider audience, making them more influential than ever.

Biographies by Daniel Cohen, Jay Elwes and David Wolf. Additional research by Luke Neima and Lucy Webster

RANKINGS 11 TO 65

11. Steven Weinberg, physicist
12. Jared Diamond, biologist
13. Oliver Sacks, neurologist and author
14. Ai Weiwei, artist
15. Arundhati Roy, writer
16. Nate Silver, statistician
17. Asghar Farhadi, filmmaker
18. Ha-Joon Chang, economist
19. Martha Nussbaum, philosopher
20. Elon Musk, businessman
21. Michael Sandel, philosopher
22. Niall Ferguson, historian
23. Hans Rosling, statistician
24 = Anne Applebaum, journalist
24 = Craig Venter, biologist
26. Shinya Yamanaka, biologist
27. Jonathan Haidt, psychologist
28. George Soros, philanthropist
29. Francis Fukuyama, political scientist
30. James Robinson and Daron Acemoglu, political scientist and economist
31. Mario Draghi, economist
32. Ramachandra Guha, historian
33. Hilary Mantel, novelist
34. Sebastian Thrun, computer scientist
35. Zadie Smith, novelist
36 = Hernando de Soto, economist
36 = Raghuram Rajan, economist
38. James Hansen, climate scientist
39. Christine Lagarde, economist
40. Roberto Unger, philosopher
41. Moisés Naím, political scientist
42. David Grossman, novelist
43. Andrew Solomon, writer
44. Esther Duflo, economist
45. Eric Schmidt, businessman
46. Wang Hui, political scientist
47. Fernando Savater, philosopher
48. Alexei Navalny, activist
49. Katherine Boo, journalist
50. Anne-Marie Slaughter, political scientist
51. Paul Collier, development economist
52. Margaret Chan, health policy expert
53. Sheryl Sandberg, businesswoman
54. Chen Guangcheng, activist
55. Robert Shiller, economist
56 = Ivan Krastev, political scientist
56 = Nicholas Stern, economist
58. Theda Skocpol, sociologist
59 = Carmen Reinhart, economist
59 = Ngozi Okonjo-Iweala, economist
61. Jeremy Grantham, investment strategist
62. Thomas Piketty and Emmanuel Saez, economists
63. Jessica Tuchman Mathews, political scientist
64. Robert Silvers, editor
65. Jean Pisani-Ferry, economist

ANALYSIS

Only three thinkers from our 2005 top 10, Richard Dawkins, Paul Krugman and Amartya Sen, appear in this year’s top spots. The panelists who drew up the longlist of 65 gave credit for the currency of candidates’ work—their influence over the past 12 months and their continuing significance for this year’s biggest questions.

Among the new entries at the top are Peter Higgs—whose inclusion is a sign of public excitement about the discoveries emerging from the world’s largest particle physics laboratory, Cern—and Slavoj Žižek, whose critique of global capitalism has gained more urgency in the wake of the financial crisis. The appearance of Steven Pinker and Daniel Kahneman, authors of two of the most successful recent “ideas books,” further demonstrates the public appetite for serious, in-depth thinking in the age of the TED talk. The inclusion of Ashraf Ghani, Ali Allawi and Mohamed ElBaradei—from Afghanistan, Iraq and Egypt, respectively—reflects the importance of their work on fostering democracies across the Muslim world in the wake of foreign interventions and the Arab Spring.

One new development was the influence of social media, with just over half of voters coming to the world thinkers homepage via Twitter or Facebook. Twitter also gave readers a chance to respond to the list and highlight notable omissions—Stephen Hawking and Noam Chomsky were popular choices.

As always, the absences are as revealing as the familiar names at the top. The failure of environmental thinkers to win many votes may be a sign of the faltering energy of the green movement. Despite the presence of climate scientists lower down the list, the movement seems to lack successors to influential public intellectuals such as Rachel Carson and James Lovelock. Serious thinkers about the internet and technology are also conspicuous by their absence. The highest-placed representative of Silicon Valley is the entrepreneur Elon Musk, but beyond journalist-critics such as Evgeny Morozov and Nicholas Carr, technology still awaits its heavyweight public intellectuals (see Thomas Meaney).

Most striking of all is the lack of women at the top of this year’s list. The highest-placed woman in this year’s poll, at number 15, is Arundhati Roy, who has become a prominent left-wing critic of inequalities and injustice in modern India since the publication of her novel The God of Small Things over a decade ago.

Many thanks to all those who voted. Do let us know what you make of the results.

~ David Wolf

MORE ON THE WORLD THINKERS OF 2013:

Do public intellectuals matter? asks AC Grayling

The XX factor: Jessica Abrahams looks at the women on the list

Follow Prospect on Facebook and Twitter