
Thursday, August 07, 2014

Kirk Schneider - Why Are Humans Violent? The Psychological Reason We Hurt Each Other


Kirk Schneider is the author of The Polarized Mind: Why It’s Killing Us and What We Can Do About It (2013), as well as The Paradoxical Self: Toward an Understanding of Our Contradictory Nature (1990), Rediscovery of Awe: Splendor, Mystery and the Fluid Center of Life (2004), and several textbooks on humanistic psychology, including Existential-Integrative Psychotherapy: Guideposts to the Core of Practice (2007).

Here is some biographical background from his Amazon page:
KIRK J. SCHNEIDER, Ph.D., is a leading spokesperson for contemporary existential-humanistic psychology. Dr. Schneider is the recent past editor of the Journal of Humanistic Psychology (2005-2012), vice-president of the Existential-Humanistic Institute (EHI), and adjunct faculty at Saybrook University, Teachers College, Columbia University, and the California Institute of Integral Studies. A Fellow of the American Psychological Association (APA), Dr. Schneider has published over 100 articles and chapters and has authored or edited nine books (now 10). Dr. Schneider is the recipient of the Rollo May Award from Division 32 of the APA for "Outstanding and independent pursuit of new frontiers in humanistic psychology," the "Cultural Innovator" award from the Living Institute, Toronto, Canada, a psychotherapy training center which bases its diploma on Dr. Schneider's Existential-Integrative model of therapy, and an Honorary Diploma from the East European Association of Existential Therapy. Dr. Schneider is also a founding member of the Existential-Humanistic Institute in San Francisco, which in August, 2012 launched one of the first certificate programs in the "foundations" of Existential-Humanistic practice ever to be offered in the U.S.A.
This is an interesting article, but I suspect his most recent book is something I really need to read. 

Why Are Humans Violent? The Psychological Reason We Hurt Each Other

Terror management theorists explain how when we feel small and humiliated, we'll do anything to feel big.

July 30, 2014 | By Kirk Schneider


From the crises in the Middle East to mass shootings in U.S. schools to the reckless striving for wealth and world domination, there is one overarching theme that almost never gets media coverage: the sense of insignificance that drives destructive acts. As a depth psychologist with many years of experience, I can say emphatically that feelings of being crushed, humiliated and existentially unimportant are the main factors behind much of what we call psychopathology.

Why would it not follow that the same factors are at play in social and cultural upheavals? The emerging science of “terror management theory” shows convincingly that when people feel unimportant they equate those feelings with dying, and they will do everything they can, including becoming extreme and destructive themselves, to avoid that feeling.

The sense of insignificance and death anxiety have been shown to play a key role in everything from terrorism to mass shootings to extremist religious and political ideologies to obsessions with materialism and wealth. Just about everything violent and corrupt in our world seems connected to them.

So before we rush to judgment about the basis of violence in our world, we would do well to heed the terror management theorists and consider missing pieces of the puzzle. Economic, ideological and biological explanations take us only so far in unpacking the bewildering phenomenon of slaughtering people in cold blood, or playing recklessly with their health, safety or livelihoods. Granted, some violence is defensive and perhaps necessary to protect the lives of sovereign individuals and states. But too often violence is provocative, and when it becomes so, it betrays a common thread of psychological destitution: the sense of insignificance, of not counting, of helplessness, and of emotional devaluation. We have daily stories about both lone gunmen and soldiers who seek vengeance and “prestige” to cover over their cultural and emotional wounds. Such stories parallel the psychopathy of some in the corporate sector who speculate, pollute and militarize at will.

How do we prevent such terrorizing cycles from continuing to arise? How do we transform people who feel so utterly estranged and stripped of value that they are willing to do virtually anything to redress perceived injustices? That transformation is not likely to occur through political or military coercion (as is now being contemplated in Iraq), through pills or anger-management programs (as was the case with several mass shooters), or through the usual hand-wringing over stricter gun laws and increased diplomacy, measures that are imperative but do not go far enough.

What is needed is no less than a “moral equivalent of war,” to echo the philosopher William James, but at a fraction of the cost. The rampant sense of insignificance needs to be addressed at its root, and not with simplistic bromides. This means that alongside providing affordable short-term public mental health services, we also need to provide affordable long-term, in-depth mental health services. Such services would emphasize the transformative power of deeply supportive, subtly attuned relationships over short-term palliatives and would likely be life-changing (as well as life-saving) in their impact.

We could (and should) also provide cadres of group facilitators to optimize encounters between people in power. These encounters could include heads of state, members of diplomatic corps and legislators. Such facilitators could be schooled in well-established approaches to mediation, such as Nonviolent Communication, and would likely be pivotal in the settlement of intractable disputes. While the most hardened extremists may be unreachable, there are many others who might surprise us and engage the opportunity.

There is no theoretical reason why such practices would not work with the appropriate adaptations; we see these practices work every day in our clinics and consulting rooms, and often with the most challenging personalities.

The range of violent upheaval in the world is alarming. Quick-fix, militarist solutions to the problem are faltering; in many cases, as we have seen with recent military operations, they are making situations worse. The time for a change in societal consciousness is at hand. By focusing our resources on the root of the problem, the many people who feel they don’t count, we not only bolster individual and collective lives but also provide a model that others will find difficult to ignore.

~ Kirk Schneider is president-elect of the Society for Humanistic Psychology of the American Psychological Association, and author of “The Polarized Mind: Why It’s Killing Us and What We Can Do About It.”

Friday, July 04, 2014

Paul Bloom - Can Prejudice Ever Be a Good Thing?


In this TED Talk from early 2014, Paul Bloom talks about prejudice, drawing on his then-new book, Just Babies: The Origins of Good and Evil (Nov. 2013).

Can prejudice ever be a good thing?

TEDSalon NY2014 · 16:23 · Filmed Jan 2014


We often think of bias and prejudice as rooted in ignorance. But as psychologist Paul Bloom seeks to show, prejudice is often natural, rational ... even moral. The key, says Bloom, is to understand how our own biases work — so we can take control when they go wrong.
* * * * *

Paul Bloom explores some of the most puzzling aspects of human nature, including pleasure, religion, and morality.

Why you should listen

In Paul Bloom’s last book, How Pleasure Works, he explores the often-mysterious enjoyment that people get out of experiences such as sex, food, art, and stories. His latest book, Just Babies, examines the nature and origins of good and evil. How do we decide what's fair and unfair? What is the relationship between emotion and rationality in our judgments of right and wrong? And how much of morality is present at birth? To answer these questions, he and his colleagues at Yale study how babies make moral decisions. (How do you present a moral quandary to a 6-month-old? Through simple, gamelike experiments that yield surprisingly adult-like results.)   

Paul Bloom is a passionate teacher of undergraduates, and his popular Introduction to Psychology 110 class has been released to the world through the Open Yale Courses program. He has recently completed a second MOOC, “Moralities of Everyday Life”, which introduced moral psychology to tens of thousands of students. And he also presents his research to a popular audience through articles in The Atlantic, The New Yorker, and The New York Times. Many of the projects he works on are student-initiated, and all of them, he notes, are "strongly interdisciplinary, bringing in theory and research from areas such as cognitive, social, and developmental psychology, evolutionary theory, linguistics, theology and philosophy."

He says: "A growing body of evidence suggests that humans do have a rudimentary moral sense from the very start of life."

What others say

"Bloom is after something deeper than the mere stuff of feeling good. He analyzes how our minds have evolved certain cognitive tricks that help us negotiate the physical and social world." — New York Times

Wednesday, May 14, 2014

Steven Pinker - ‘What Could Be More Interesting than How the Mind Works?’

A long and interesting interview with Steven Pinker from the Harvard Gazette. Pinker is the author of a lot of really, really thick books, including The Better Angels of Our Nature: Why Violence Has Declined (2011), The Stuff of Thought: Language as a Window into Human Nature (2007), The Blank Slate: The Modern Denial of Human Nature (2002), and How the Mind Works (1997).

‘What could be more interesting than how the mind works?’

Steven Pinker’s history of thought

May 6, 2014 


By Colleen Walsh, Harvard Staff Writer

Steven Pinker follows Sara Lawrence-Lightfoot, Martha Minow, and E.O. Wilson in the Experience series, interviews with Harvard faculty members covering the reasons they became teachers and scholars, and the personal journeys, missteps included, behind their professional success. Interviews with Melissa Franklin, Stephen Greenblatt, Laurel Thatcher Ulrich, Helen Vendler, and Walter Willett will appear in coming weeks.

The brain is Steven Pinker’s playground. A cognitive scientist and experimental psychologist, Pinker is fascinated by language, behavior, and the development of human nature. His work has ranged from a detailed analysis of how the mind works to a best-seller about the decline in violence from biblical times to today.

Raised in Montreal, Pinker was drawn early to the mysteries of thought that would drive his career, and shaped in part by coming of age in the ’60s and early ’70s, when “society was up for grabs,” it seemed, and nature vs. nurture debates were becoming more complex and more heated.

His earliest work involved research in both visual imagery and language, but eventually he devoted himself to the study of language development, particularly in children. His groundbreaking 1994 book “The Language Instinct” put him firmly in the sphere of evolutionary psychology, the study of human impulses as genetically programmed and language as an instinct “wired into our brains by evolution.” Pinker, 59, has spent most of his career in Cambridge, and much of that time at Harvard — first for his graduate studies, later as an assistant professor. He is the Johnstone Family Professor of Psychology.

Q: Can you tell me about your early life? Where did you grow up and what did your parents do?

A: I grew up in Montreal, as part of the Jewish minority within the English-speaking minority within the French-speaking minority in Canada. This is the community that gave the world Leonard Cohen, who my mother knew, and Mordecai Richler, who my father knew, together with William Shatner, Saul Bellow, and Burt Bacharach. I was born in 1954, the peak year of the baby boom. My grandparents came to Canada from Eastern Europe in the 1920s, I surmise, because in 1924 the United States passed a restrictive immigration law. I can visualize them looking at a map and saying “Damn, what’s the closest that we can get to New York? Oh, there’s this cold place called Canada, let’s try that.” Three were from Poland, one from what is now Moldova. My parents both earned college degrees. My father had a law degree, but for much of his career did not practice law. He worked as a sales representative and a landlord and owned an apartment-motel in Florida. But he reopened his law practice in his 50s, and retired at 75. Like many women of her generation, my mother was a homemaker through the ’50s and ’60s. In the 1970s she got a master’s degree in counseling, then got a job and later became vice principal of a high school in Montreal.

I went to public schools in the suburbs of Montreal, and then to McGill University, which is also where my parents went. I came to Harvard in 1976 for graduate school, got my Ph.D. from this [psychology] department in 1979, went to MIT to do a postdoc, and came back here as an assistant professor in 1980. It was what they called a folding chair, since in those years Harvard did not have a genuine tenure track. I was advised to take the first real tenure-track job that came my way, and that happened within a few months, so I decamped for Stanford after just one year here. Something in me wanted to come back to Boston, so I left Stanford after a year and I was at MIT for 21 years before returning to Harvard ten and a half years ago. This is my third stint at Harvard.

Q: Were your parents instrumental in your choice of a career?

A: Not directly, other than encouraging my intellectual growth and expecting that I would do something that would make use of my strengths.

Q: What were those strengths?

A: My parents wanted me to become a psychiatrist, given my interest in the human mind, and given the assumption that any smart, responsible young person would go into medicine. They figured it was the obvious career for me. The 1970s was a decade in which the academic job market had collapsed. There were stories in The New York Times of Ph.D.s driving taxis and working in sheriff’s offices, and so they thought that a Ph.D. would be a ticket to unemployment — some things don’t change. They tried to reason with me: “If you become a psychiatrist, you get to indulge your interest in the human mind, but you also always have a job. You can always treat patients.” But I had no interest in pursuing a medical degree, nor in treating patients. Psychopathology was not my primary interest within psychology. So I gambled, figuring that if the worst happened and I couldn’t get an academic job I would be 25 years old and could do something else. Also, I chose a field — cognitive psychology — that I knew was expanding. I expected that psychology departments would be converting slots in the experimental analysis of behavior, that is, rats and pigeons being conditioned, to cognitive psychology. And that’s exactly what happened. Fortunately, I got three job offers in three years at three decent places. My parents were relieved, not to mention filled with naches.

Q: I read that an early experience with anarchy got you intrigued about the workings of the mind. Can you tell me more about that?

A: I was too young for ’60s campus activism; I was in high school when all of the excitement happened. But it was very much the world I lived in. The older siblings of my friends were college students, and you couldn’t avoid the controversies of the ’60s if you read the newspaper and watched TV. In the ’60s everyone had to have a political ideology. You couldn’t get a date unless you were a Marxist or an anarchist. Anarchism seemed appealing. I had a friend who had read Kropotkin and Bakunin and he persuaded me that human beings are naturally generous and cooperative and peaceful. That’s just the rational way to be if you didn’t have a state forcing you to delineate your property and separate it from someone else’s. No state, no property, nothing to fight over . . . I’d have arguments over the dinner table with my parents, and they said that if the police ever disappeared, all hell would break loose. Being 14 years old, of course I knew better, until an empirical test presented itself.

Quebec is politically and economically very Gallic: Sooner or later, every public sector goes on strike. One week it’s the garbage collectors, another week the letter carriers. Then one day the police went on strike. They simply did not show up for work one morning. So what happened? Well, within a couple of hours there was widespread looting, rioting, and arson — not one but two people were shot to death, until the government called in the Mounties to restore order. This was particularly shocking in Montreal, which had a far lower rate of violent crime than American cities. Canadians felt morally superior to Americans because we didn’t have the riots and the civil unrest of the 1960s. So to see how quickly violent anarchy could break out in the absence of police enforcement was certainly, well, informative. As so often happens, long-suffering mom and dad were right, and their smart-ass teenage son was wrong. That episode also gave me a taste of what it’s like to be a scientist, namely that cherished beliefs can be cruelly falsified by empirical tests.

I wouldn’t say it’s that incident in particular that gave me an interest in human nature. But I do credit growing up in the ’60s, when these ideas trickled down, and the early ’70s, which were an extension of the ’60s. Debates on human nature and its political implications were in the air. Society was up for grabs. There was talk of revolution and rationally reconstructing society, and those discussions naturally boiled down to rival conceptions of human nature. Is the human psyche socially constructed by culture and parenting, or is there even such a thing as human nature? And if there is, what materials do we have to work with in organizing a society? In college I took a number of courses that looked at human nature from different vantage points: anthropology, sociology, psychology, literature, philosophy. Psychology appealed to me because it seemed to ask profound questions about our kind while also offering the hope that the questions could be answered in the lab. So it had just the right mixture of depth and tractability.

Q: You started your career interested in the visual realm as well as in language, but eventually you chose to focus your energies on your work with language. Why?

A: Starting from graduate school I pursued both. My Ph.D. thesis was done under the supervision of Stephen Kosslyn, who later became chair of this department, then dean of social science until he left a couple of years ago to become provost of Minerva University. My thesis was on visual imagery, the ability to visualize objects in the mind’s eye. At the same time, I took a course with Roger Brown, the beloved social psychologist who was in this department for many years. In yet another course I wrote a theoretical paper on language acquisition, which took on the question “How could any intelligent agent make the leap from a bunch of words and sentences in its input to the ability to understand and produce an infinite number of sentences in the language from which they were drawn?” That was the problem that Noam Chomsky set out as the core issue in linguistics.

So I came out of graduate school with an interest in both vision and language. When I was hired back at Harvard a year after leaving, I was given responsibility for three courses in language acquisition. In the course of developing the lectures and lab assignments I started my own empirical research program on language acquisition. And I pursued both projects for about 10 years until the world told me that it found my work on language more interesting than my work on visual cognition. I got more speaking invitations, more grants, more commentary. And seeing that other people in visual cognition like Ken Nakayama, my colleague here, were doing dazzling work that I couldn’t match, whereas my work on language seemed to be more distinctive within its field — that is, there weren’t other people doing what I was doing — I decided to concentrate more and more on language, and eventually closed down my lab in visual cognition.

Q: Did you have any doubts when you were starting out in your career?

A: Oh, absolutely. I was terrified of ending up unemployed. When I got to Harvard, the Psychology Department, at least the experimental program in the Psychology Department, was extremely mathematical. It specialized in a sub-sub-discipline called psychophysics, which was the oldest part of psychology, coming out of Germany in the late 19th century. William James, the namesake of this building, said “the study of psychophysics proves that it is impossible to bore a German.” Now, I’m interested in pretty much every part of psychology, including psychophysics. But this was simply not the most exciting frontier in psychology, and even though I was good in math, I didn’t have nearly as much math background as a hardcore psychophysicist, and I wondered whether I had what it took to do the kind of psychology being done here. But it was starting to become clear — even at Harvard — that mathematical psychophysics was becoming increasingly marginalized, and if it wanted to keep up, Harvard had to start hiring in cognitive psychology. They hired Steve Kosslyn, we immediately hit it off, and I felt much more at home.

Q: If you were trying to get someone interested in this field today, what would you say?

A: What could be more interesting than how the mind works? Also, I believe that psychology sits at the center of intellectual life. In one direction, it looks to the biological sciences, to neuroscience, to genetics, to evolution. But in the other, it looks to the social sciences and the humanities. Societies are formed and take their shape from our social instincts, our ability to communicate and cooperate. And the humanities are the study of the products of our human mind, of our works of literature and music and art. So psychology is relevant to pretty much every subject taught at a university.

Psychology is blossoming today, but for much of its history it was dull, dull, dull. Perception was basically psychophysics, the study of the relationship between the physical magnitude of a stimulus and its perceived magnitude — that is, as you make a light brighter and brighter, does its subjective brightness increase at the same rate or not? It also studied illusions, like the ones on the back of the cereal box, but without much in the way of theory. Learning was the study of the rate at which rats press levers when they are rewarded with food pellets. Social psychology was a bunch of laboratory demonstrations showing that people could behave foolishly and be mindless conformists, but also without a trace of theory explaining why. It’s only recently, in dialogue with other disciplines, that psychology has begun to answer the “why” questions. Cognitive science, for example, which connects psychology to linguistics, theoretical computer science, and philosophy of mind, has helped explain intelligence in terms of information, computation, and feedback. Evolutionary thinking is necessary to ask the “why” questions: “Why does the mind work the way it does instead of some other way in which it could have worked?” This crosstalk has made psychology more intellectually satisfying. It’s no longer just one damn phenomenon after another.
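(A quick illustration of what psychophysics measures, mine rather than Pinker's: Stevens' power law models perceived magnitude as a power function of physical intensity, with an exponent of roughly 0.33 commonly cited for the brightness of a light. A minimal sketch:)

```python
# Stevens' power law (illustrative sketch): perceived magnitude grows as
# k * intensity ** a. The exponent a ~= 0.33 is a commonly cited value
# for the brightness of a spot of light; k is an arbitrary scale factor.
def perceived_brightness(intensity: float, k: float = 1.0, a: float = 0.33) -> float:
    return k * intensity ** a

# Doubling the physical intensity raises apparent brightness by only
# about 26 percent -- the "same rate or not?" question Pinker mentions.
print(perceived_brightness(2.0) / perceived_brightness(1.0))  # ~1.26
```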

Q: Is there a single work that you are most proud of?

A: I am proud of “How the Mind Works” for its sheer audacity in trying to explain exactly that, how the mind works, between one pair of covers. At the other extreme of generality, I’m proud of a research program I did for about 15 years that culminated in “Words and Rules,” a book about, of all things, irregular verbs, which I use as a window onto the workings of cognition. I’m also fulfilled by having written my most recent book, “The Better Angels of Our Nature,” which is about something completely different: the historical decline of violence and its causes, a phenomenon that most people are not even aware of, let alone have an explanation for. In that book, I first had to convince readers that violence has declined, knowing that the very idea strikes people as preposterous, even outrageous. So I told the story in 100 graphs, each showing a different category of violence: tribal warfare, slavery, homicide, war, civil war, domestic violence, corporal punishment, rape, terrorism. All have been in decline. Having made this case, I returned to being a psychologist, and set myself the task of explaining how that could have happened. And that explanation requires answering two psychological questions: “Why was there so much violence in the past?” and “What drove the violence down?” For me, the pair of phenomena stood as a corroboration of an idea I have long believed, namely that human nature is complex. There is no single formula that explains what makes people tick, no wonder tissue, no magical all-purpose learning algorithm. The mind is a system of mental organs, if you will, and some of its components can lead us to violence, while others can inhibit us from violence. What changed over the centuries and decades is which parts of human nature are most engaged. I took the title, “The Better Angels of Our Nature,” from Abraham Lincoln’s first inaugural address. It’s a poetic allusion to the idea that there are many components to human nature, some of which can lead to cooperation and amity.

Q: I read a newspaper article in which you talked about the worst thing you have ever done. Can you tell me about that?

A: It was as an undergraduate working in a behaviorist lab. I carried out a procedure that turned out to be tantamount to torturing a rat to death. I was asked to do it, and against my better judgment, did it. I knew it had little scientific purpose. It was done in an era in which there was no oversight over the treatment of animals in research, and just a few years later it would have been inconceivable. But this painful episode resonated with me for two reasons. One is that it was a historical change in a particular kind of violence that I lived through, namely the increased concern for the welfare of laboratory animals. This was one of the many developments I talk about in “The Better Angels of Our Nature.” Also, as any psychology student knows, humans sometimes do things against their own conscience under the direction of a responsible authority, even if the authority has no power to enforce the command. This is the famous Milgram experiment, in which people were delivering what they thought were fatal shocks to actors pretending to be volunteer subjects. I show the film of the Milgram experiment to my class every year. It’s harrowing to watch, but I’ve seen it now 17 times and found it just as gripping the 17th time as the first. There was a lot of skepticism that people could possibly behave that way. Prior to the experiment, a number of experts were polled for their prediction as to what percentage of subjects would administer the most severe shock. The average of the predictions was on the order of one-tenth of one percent. The actual result was 70 percent. Many people think there must be some trick or artifact, but having behaved like Milgram’s 70 percent myself, despite thinking of myself as conscientious and morally concerned, I believe that the Milgram study reveals a profound and disturbing feature of human psychology.

Pinker, at his Boston home, might someday add photography to his list of book topics.

Q: What would you say is your biggest flaw as a scholar? What about your greatest strength?

A: That’s for other people to judge! I am enough of a psychologist to know that any answer I give would be self-serving. La Rochefoucauld said, “Our enemies’ opinions of us come closer to the truth than our own.”

Q: As an expert in language, what do you think of Twitter?

A: I was pressured into becoming a Twitterer when I wrote an op-ed for The New York Times saying that Google is not making us stupid, that electronic media are not ruining the language. And my literary agent said, “OK, you’ve gone on record saying that these are not bad things. You better start tweeting yourself.” And so I set up a Twitter feed, which turns out to suit me because it doesn’t require taking out hours of the day to write a blog. The majority of my tweets are links to interesting articles, which takes advantage of the breadth of articles that come my way — everything from controversies over correct grammar to trends in genocide.

Having once been a young person myself, I remember the vilification that was hurled at us baby boomers by the older generation. This reminds me that it is a failing of human nature to detest anything that young people do just because older people are not used to it or have trouble learning it. So I am wary of the “young people suck” school of social criticism. I have no patience for the idea that because texting and tweeting force one to be brief, we’re going to lose the ability to express ourselves in full sentences and paragraphs. This simply misunderstands the way that human language works. All of us command a variety of registers and speech styles, which we narrowcast to different forums. We speak differently to our loved ones than we do when we are lecturing, and still differently when we are approaching a stranger. And so, too, we have a style that is appropriate for texting and instant messaging that does not necessarily infect the way we communicate in other forums. In the heyday of telegraphy, when people paid by the word, they left out the prepositions and articles. It didn’t mean that the English language lost its prepositions and articles; it just meant that people used them in some media and not in others. And likewise, the prevalence of texting and tweeting does not mean that people magically lose the ability to communicate in every other conceivable way.

Q: Early in your career you wrote a number of important technical works. Do you find it more fun to write the broadly appealing books?

A: Both are appealing for different reasons. In trade books I have the length to pursue objections, digressions, and subtleties, something that is hard to do in the confines of a journal article. I also like the freedom to avoid academese and to write in an accessible style — which happens to be the very topic of my forthcoming book, “The Sense of Style: The Thinking Person’s Guide to Writing in the 21st Century.” I also like bringing to bear ideas and sources of evidence that don’t come from a single discipline. In the case of my books on language, for example, I used not just laboratory studies of kids learning to talk, or studies of language in patients with brain damage, but also cartoons and jokes where the humor depends on some linguistic subtlety. Telling examples of linguistic phenomena can be found in both high and low culture: song lyrics, punch lines from stand-up comedy, couplets from Shakespeare. In “Better Angels,” I supplemented the main narrative, told with graphs and data, with vignettes of culture at various times in history, which I presented as a sanity check, as a way of answering the question, “Could your numbers be misleading you into a preposterous conclusion because you didn’t try to get some echo from the world as to whether life as it was lived reflects the story told by the numbers?” If, as I claim, genocide is not a modern phenomenon, we should see signs of it being treated as commonplace or acceptable in popular narratives. One example is the Old Testament, which narrates one genocide after another, commanded by God. This doesn’t mean that those genocides actually took place; probably most of them did not. But it shows the attitude at the time, which is, genocide is an excellent thing as long as it doesn’t happen to you.

I also find that there is little distinction between popular writing and cross-disciplinary writing. Academia has become so hyperspecialized that as soon as you write for scholars who are not in your immediate field, the material is as alien to them as it is to a lawyer or a doctor or a high school teacher or a reader of The New York Times.

Q: Were you a big reader as a teen? Can you think of one or two works you read early, fiction or nonfiction, where you came away impressed, even inspired, by the ideas, the craft, or both?

A: I was a voracious reader, and then as now, struggled to balance breadth and depth, so my diet was eclectic: newspapers, encyclopedias, a Time-Life book-of-the-month collection on science, magazines (including Esquire in its quality-essay days and Commentary in its pre-neocon era), and teen-friendly fiction by Orwell, Vonnegut, Roth, and Salinger (the intriguing Glasses, not the tedious Caulfield). Only as a 17-year-old in junior college did I encounter a literary style I consciously wanted to emulate — the wit and clarity of British analytical philosophers like Gilbert Ryle and A.J. Ayer, and the elegant prose of the Harvard psycholinguists George Miller and Roger Brown.

Q: Might we one day see a Steven Pinker book about horse racing or piano playing — or a Pinker novel? Is there a genre or off-work-hours interest you’ve thought seriously about putting book-length work into?

A: Whatever thoughts I might have had of writing a novel were squelched by marrying a real novelist [Rebecca Goldstein] and seeing firsthand the degree of artistry and brainpower that goes into literary fiction. But I have pondered other crossover projects. I’m an avid photographer, and would love to write a book someday that applied my practical experience, combined with vision science and evolutionary aesthetics, to explaining why we enjoy photographs. And I’ve thought of collaborating with Rebecca on a book on the psychology, philosophy, and linguistics of fiction — which would give me an excuse to read the great novels I’ve never found time for.

Q: You have won several teaching awards during your career. What makes a great teacher?

A: Foremost is passion for the subject matter. Studies of teaching effectiveness all show that enthusiasm is a major contributor. Also important is an ability to overcome professional narcissism, namely a focus on the methods, buzzwords, and cliques of your academic specialty, rather than a focus on the subject matter, the actual content. I don’t think of what I’m teaching my students as “psychology.” I think of it as teaching them “how the mind works.” They’re not the same thing. Psychology is an academic guild, and I could certainly spend a lot of time talking about schools of psychology, the history of psychology, methods in psychology, theories in psychology, and so on. But that would be about my clique, how my buddies and I spend our days, how I earn my paycheck, what peer group I want to impress. What students are interested in is not an academic field but a set of phenomena in the world — in this case the workings of the human mind. Sometimes academics seem not to appreciate the difference.

A third ingredient of good teaching is overcoming “the curse of knowledge”: the inability to know what it’s like not to know something that you do know. That is a lifelong challenge. It’s a challenge in writing, and it’s a challenge in teaching, which is why I see a lot of synergy between the two. Often an idea in one of my books will have originated from the classroom, or vice versa, because the audience is the same: smart people who are intellectually curious enough to have bought the book or signed up for the course but who are just not as knowledgeable about a particular topic as I am. The obvious solution is to “imagine the reader over your shoulder” or “to put yourself in your students’ shoes.” That’s a good start, but it’s not enough, because the curse of knowledge prevents us from fully appreciating what it’s like to be a student or a reader. That’s why writers need editors: The editors force them to realize that what’s obvious to them isn’t obvious to everyone else. And it’s why teachers need feedback, either from seeing the version of your content that comes back at you in exams, or in conversations with students during office hours, or in discussion sessions. Another important solution is being prepared to revise. Most of the work of writing is in the revising. During the first pass of the writing process, it’s hard enough to come up with ideas that are worth sharing. To simultaneously concentrate on the form, on the felicity of expression, is too much for our thimble-sized minds to handle. You have to break it into two distinct stages: Come up with the ideas, and polish the prose. This may sound banal, but I find that it comes as a revelation to people who ask about my writing process. It’s why in my SLS 20 class, the assignment for the second term paper is to revise the first term paper. That’s my way to impress on students that the quality comes in the revision.

Q: How do students differ today from when you were a student?

A: What a dangerous question! The most tempting and common answer is the thoughtless one: “The kids today are worse.” It’s tempting because people often confuse changes in themselves with changes in the times, and changes in the times with moral and intellectual decline. This is a well-documented psychological phenomenon. Every generation thinks that the younger generation is dissolute, lazy, ignorant, and illiterate. There is a paper trail of professors complaining about the declining quality of their students that goes back at least 100 years. All this means that your question is one that people should think twice before answering. I know a lot more now than I did when I was a student, and thanks to the curse of knowledge, I may not realize that I have acquired most of it during the decades that have elapsed since I was a student. So it’s tempting to look at students and think, “What a bunch of inarticulate ignoramuses! It was better when I was at that age, a time when I and other teenagers spoke in fluent paragraphs, and we effortlessly held forth on the foundations of Western civilization.” Yeah, right.

Here is a famous experiment. A 3-year-old comes into the lab. You give him a box of M&Ms. He opens up the box and instead of finding candy he finds a tangle of ribbons. He is surprised, and now you say to him, “OK, now your friend Jason is going to come into the room. What will Jason think is in the box?” The child says, “ribbons,” even though Jason could have no way of knowing that. And, if you ask the child, “Before you opened the box, what did you think was in it?” he says, “ribbons.” That is, he backdates his own knowledge. Now we laugh at the 3-year-old, but we do the same thing. We backdate our own knowledge and sophistication, so we always think that the kids today are more slovenly than we were at that age.

Q: What are some of the greatest things your students have taught you?

A: Many things. The most obvious is the changes in technology for which we adults are late adopters. I had never heard of Reddit, let alone known that it was a major social phenomenon, until two of my students asked if I would do a Reddit AMA [Ask Me Anything]. I did the session in my office with two of my students guiding me, kind of the way I taught my grandmother how to use this newfangled thing called an answering machine. That evening I got an email from my editor in New York saying: “The sales of your book just mysteriously spiked. Any explanation?” It was all thanks to Reddit, which I barely knew existed. Another is a kind of innocence — though that’s a condescending way to put it. It’s a curiosity about the world untainted by familiarity with an academic field. It works as an effective challenge to my own curse of knowledge. So if you want to know what it’s like not to know something that you know, the answer is not to try harder, because that doesn’t work very well. The answer is to interact with someone who doesn’t know what you know, but who is intelligent, curious, and open.

Q: If you weren’t in this field, what would you be doing?

A: Am I allowed to be an academic?

Q: You can be anything you want.

A: I could have been in some other field that deals with ideas, like philosophy or constitutional law. I have enough of an inner geek to imagine being a programmer, and for a time as an undergraduate that appealed to me. But as much as I like gadgets and code, I like ideas more, so I suspect that the identical twin separated from me at birth would also have done something in the world of ideas.

Q: No Steven Pinker interview would be complete without a question about your hair. I recently saw a picture of you from the 1970s, and your style appears unchanged. Why haven’t you gone for a shorter look?

A: First, there’s immaturity. Any boy growing up in the ’60s fought a constant battle with his father about getting a haircut. Now no one can force me to get my hair cut, and I’m still reveling in the freedom. Also, I had a colleague at MIT, the computer scientist Pat Winston, who had a famous annual speech on how to lecture, and one of his tips was that every professor should have an affectation, something to amuse students with, or journalists, comedians, and wise guys. I am a charter member of an organization called The Luxuriant Flowing Hair Club for Scientists. The MIT newspaper once ran a feature on all the famous big-haired people I had been compared to, including Simon Rattle, Robert Plant, Spinoza, and Bruno, the guy who played the piano on the TV show “Fame.” When I was on The Colbert Report, talking about fear and security, Stephen Colbert pulled out an electromagnetic wand and scanned my hair for concealed weapons. So it does have its purposes.

Interview was edited for length and clarity.

Thursday, May 01, 2014

The Better Angels of Our Nature: Why Violence Has Declined


Steven Pinker's The Better Angels of Our Nature: Why Violence Has Declined (2011) received a lot of praise, and it was also dismissed as "hallucinatory" by Robert Epstein in Scientific American.
People pay more attention to facts that match their beliefs than those that undermine them. Pinker wants peace, and he also believes in his hypothesis; it is no surprise that he focuses more on facts that support his views than on those that do not.
So watch the talk below, given at UC Berkeley in February but only just posted to the UCTV channel, and see whether you agree with Epstein or with Pinker.

The Better Angels of Our Nature: Why Violence Has Declined

Published on Apr 29, 2014


Believe it or not, violence has been in decline for long stretches of time, and we may be living in the most peaceful era in our species’ existence. Harvard psychology professor Steven Pinker presents the data supporting this surprising conclusion, and explains the trends by showing how changing historical circumstances have engaged different components of human nature. Recorded on 02/04/2014. Series: "UC Berkeley Graduate Council Lectures" [5/2014] - (Visit: http://www.uctv.tv/)

Wednesday, April 02, 2014

Understanding Human Nature with Steven Pinker - Conversations with History


Harvard professor of psychology Steven Pinker visited UC Berkeley back in February as part of the Conversations with History series. In this talk he focused on the development of his understanding of human nature, including some discussion of his most recent book, The Better Angels of Our Nature: Why Violence Has Declined.

Understanding Human Nature with Steven Pinker - Conversations with History

Published on Apr 1, 2014 
(Visit: http://www.uctv.tv/)


Conversations host Harry Kreisler welcomes Harvard's Steven Pinker, Johnstone Family Professor of Psychology, for a discussion of his intellectual journey. Pinker discusses the origins and evolution of his thinking on human nature. Topics include: growing up in Montreal in a Jewish family, the impact of the 1960's, his education, and the trajectory of his research interests. He explains his early work in linguistics and how he came to write his recent work, The Better Angels of Our Nature: Why Violence Has Declined. In the conversation, Pinker describes the importance of interdisciplinary research and analyzes creativity. He concludes with a discussion of how science can contribute to the humanities and offers advice to students on how to prepare for the future.

Recorded on 02/04/2014. Series: "Conversations with History" [4/2014]

Sunday, March 09, 2014

In Conversation with… Steven Pinker (via Mosaic)

Mosaic is a new open access online science magazine produced by the Wellcome Trust. In the first collection of stories they have posted, one is an interview with the always interesting (and sometimes infuriating) Steven Pinker of Harvard University, author of How the Mind Works (1997) and The Blank Slate: The Modern Denial of Human Nature (2002) [both nominated for the Pulitzer Prize], and of The Better Angels of Our Nature: Why Violence Has Declined (2011).

Those who follow Pinker's writing will find little new here, but for those who do not this article/interview provides an excellent overview of the man and his career.

In Conversation with… Steven Pinker

Oliver Burkeman explores human nature, violence, feminism and religion with one of the world’s most controversial cognitive scientists. Can he dent Steven Pinker’s optimism?

March 4, 2014

Steven Pinker holding a piece of the Berlin Wall

In the week that I interview the cognitive psychologist and bestselling author Steven Pinker in his office at Harvard, police release the agonising recordings of emergency calls made during the Sandy Hook school shootings. In Yemen, a suicide attack on the defence ministry kills more than 50 people. An American teacher is shot dead as he goes jogging in Libya. Several people are killed in riots between political factions in Thailand, and peacekeepers have to be dispatched to the Central African Republic.

In short, it’s not hard to find anecdotes that seem to contradict a guiding principle behind much of Pinker’s work – which is that science and human reason are, slowly but unmistakably, making the world a better place.

Repeatedly during our conversation, I seek to puncture the silver-haired professor’s quietly relentless optimism. If the ongoing tolls of war and violence can’t do it, what about the prevalence in America of unscientific beliefs about the origins of life? Or the devastating potential impacts of climate change, paired with the news – also released in the week we meet – that 23 per cent of Americans don’t believe it’s happening, up seven percentage points in just eight months?

I try. But it proves far from easy.

At first glance Pinker’s implacable optimism, though in keeping with his sunny demeanour and stereotypically Canadian friendliness, presents a puzzle. His stellar career – which includes two Pulitzer Prize nominations for his books How the Mind Works (1997) and The Blank Slate: The modern denial of human nature (2002) – has been defined, above all, by support for the fraught notion of human nature: the contention that genetic predispositions account in hugely significant ways for how we think, feel and act, why we behave towards others as we do, and why we excel in certain areas rather than others.

This has frequently drawn Pinker into controversy – as in 2005, when he offered a defence of Larry Summers, then Harvard’s President, who had suggested that the under-representation of women in science and maths careers might be down to innate sex differences.

“The possibility that men and women might differ for reasons other than socialisation, expectations, hidden biases and barriers is very close to an absolute taboo,” Pinker tells me. He faults books such as Lean In, by Facebook’s chief operating officer, Sheryl Sandberg, for not entertaining the notion that men and women might not have “identical life desires”. But he also insists that taking the possibility of such differences seriously need not lend any justification to policies or prejudices that exclude women from positions of expertise or power.

“Even if there are sex differences, they’re differences in the means of two overlapping populations, so for any [stereotypically female] trait you care to name, there’ll be many men who are more extreme than most women, and vice versa. So as a matter of both efficiency and of fairness, you should treat every individual as an individual, and not prejudge them.”

It is generally assumed that anyone who takes human nature seriously will be a fatalist, and probably politically conservative. If we’re pre-wired to be how we are, the reasoning goes, we might as well accept it and give up on hopes of any change. One way of interpreting Pinker’s most recent book, The Better Angels of Our Nature, is as an 800-page doorstopper of a riposte to this idea. Not only can we change, but when it comes to arguably the most important measure of improvement – the violence we inflict on each other – we actually have changed, to an almost incredible degree.


“I had very often come across the objection that if human nature exists – including some ugly motives like revenge, dominance, greed and lust – then that would imply it’s pointless to try to improve the human condition, because humans are innately depraved,” says the 59-year-old, whose distinctive appearance – today he is sporting black cowboy boots – frequently gets him stopped in the street. “Or there’s an alternative objection: that we ought to improve our lot, and therefore, it cannot be the case that human nature exists.”

Pinker puts all this down to “a fear that acknowledging human nature would subvert any attempt to improve the human condition”. Better Angels argues that this is a misunderstanding of what human nature means. It shouldn’t be identified with a certain set of behaviours; rather, we have a complex variety of predispositions, violent and peaceful, that can be activated in different ways by different environments. The book’s title, drawn from Abraham Lincoln’s first inaugural address, is “a poetic allusion to the parts of human nature that can overcome the nastier parts,” he explains.

But Better Angels is notable above all for the sheer weight of evidence it amasses, culled from forensic archaeology, government statistics, town records, and studies by ‘atrocitologists’ of historical genocides and other mass killings. The book demonstrates that homicides, calculated as a proportion of the world’s population at any given point, have plummeted; when you look at the numbers this way, World War II wasn’t the worst single atrocity in history, but more like the tenth.
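(To make the per-capita arithmetic concrete, here is my own rough illustration, using round, commonly cited estimates rather than figures checked against the book. The relative toll is simply deaths divided by the world population of the era.)

```python
# Rough sketch of the per-capita comparison described above. The death
# tolls and historical world populations are round, commonly cited
# estimates -- illustrative only, not figures taken from Better Angels.
atrocities = {
    "An Lushan Revolt (8th century)":  (13e6, 210e6),
    "Mongol conquests (13th century)": (40e6, 400e6),
    "World War II (20th century)":     (55e6, 2.3e9),
}

for name, (deaths, world_pop) in atrocities.items():
    print(f"{name}: {deaths / world_pop:.1%} of the world's population")

# Despite far fewer absolute deaths, the earlier atrocities dwarf WWII
# once each toll is scaled by the population of its own era.
```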

Pinker dwells, in sometimes unnerving detail, on horrifying methods of torture once considered routine. “The Heretic’s Fork had a pair of sharp spikes at each end,” he writes, in what is definitely not the most appalling passage. “One end was propped under the victim’s jaw and the other at the base of his neck, so that as his muscles became exhausted he would impale himself in both places.”

“Human nature or no human nature,” Pinker says, “it’s just a brute fact that we don’t throw virgins into volcanoes any more. We don’t execute people for shoplifting a cabbage. And we used to.”

He offers a multi-pronged explanation for this decline, from the rise of the state and of cities, to literacy, trade and democracy. Whether this constitutes an across-the-board endorsement of scientific rationality may be debated. (“Like other latter-day partisans of ‘Enlightenment values’,” the critic John Gray wrote, “Pinker prefers to ignore the fact that many Enlightenment thinkers have been doctrinally anti-liberal, while quite a few have favoured the large-scale use of political violence”.) But it’s hard to question the basic finding that your chances of meeting a sticky end, all else being equal, are vastly lower in 2014 than they were in 1014.

If Pinker’s message has proved hard for some to swallow, that may be because our standards are improving even faster than our actual behaviour, giving the misleading impression that things are getting worse. “Hate attacks on Muslims are deplorable, and they ought to be combated, and it reflects well that we’re concerned when they do occur,” Pinker says. “But by the standards of past pogroms and ethnic cleansings, they’re in the noise: this is not a phenomenon of the same magnitude as the ethnic expulsions of decades past.”

We’ve even witnessed the emergence of whole new categories of condemnable acts. Take bullying, says Pinker: “The President of the United States gave a speech denouncing bullying! When I was a child, this would have been worthy of satire.” As we continue to construct a social environment that activates more and more of our peaceable dispositions, and fewer and fewer of our aggressive ones, the remaining instances of bad behaviour stick out like ever-sorer thumbs.

What’s more, evolutionary psychology, one of Pinker’s several specialisms, can explain why. For reasons that long ago made excellent sense, our brains are adapted to focus on bad news over good, vivid threats over vague ones, and recent horrors over historically distant atrocities. Our elevated levels of anxiety about the future might actually be a sign of reason’s triumph.

“It could be interpreted as a sign of our growing up,” Pinker says. “We worry about more things, because we know that there are more things to worry about. Every time we go to a restaurant, we worry we might be ingesting saturated fats, or carcinogens. For my parents’ generation, the main concern about food was: ‘Does it taste delicious?’”

§

Many of Pinker’s most ambitious ideas about science and human morality have their origins in a seemingly trivial observation about irregular verbs. Building on the groundbreaking linguistic ideas of Noam Chomsky, Pinker proposed that certain simple language errors committed by young children “capture the essence of language” itself.

When a three-year-old says “I eated the ice cream” or “we holded the kittens”, she is, Pinker observes, following a grammar rule correctly, and making a mistake only because we happen to suspend the rule for those verbs in English. Since she couldn’t have learned “eated” or “holded” by simply imitating adult speakers, this points to the presence of innate cognitive machinery – a “language instinct”, to quote the title of Pinker’s 1994 book – that enables a young child to construct novel linguistic forms by following rules.
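(A toy sketch of the resulting “words and rules” picture, mine rather than Pinker’s actual model: retrieve an irregular form from memory when you can, and fall back on the productive rule when you can’t.)

```python
# A toy version of the dual-route ("words and rules") idea: irregular
# past tenses live in a memorized lexicon; the regular "-ed" rule
# applies whenever memory retrieval fails. The tiny lexicon and the
# retrieval flag are illustrative simplifications.
IRREGULARS = {"eat": "ate", "hold": "held", "go": "went"}

def past_tense(verb: str, retrieved_from_memory: bool = True) -> str:
    if retrieved_from_memory and verb in IRREGULARS:
        return IRREGULARS[verb]  # route 1: memorized exception
    return verb + "ed"           # route 2: productive rule

print(past_tense("walk"))                              # walked (rule)
print(past_tense("eat"))                               # ate (memory)
print(past_tense("eat", retrieved_from_memory=False))  # eated (a child's over-regularization)
```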

(Irregular verbs have an even more intimate role in Pinker’s life: he met his wife, the philosopher Rebecca Goldstein, through an email exchange after he mentioned her correct use of the past participle ‘stridden’ in his book Words and Rules.)

Years later, in 2007’s The Stuff of Thought, he extended this reasoning to the structures of “mentalese”, the wordless “language of thought” that he argues we use to think in. When, for example, we use spatial language to talk about time – as in “a long day”, or bringing a meeting “forward” – might we be relying on an in-built, pre-linguistic tendency to think about the abstract notion of time by analogy to space, something far more concretely graspable to an early human concerned with food, shelter and survival?

This view of the mind – as a set of modules evolved to confront specific cognitive challenges on the Pleistocene savannah – is most ambitiously on display in How the Mind Works, a dazzling effort to “reverse engineer” all of our mental capacities, asking for what purposes each might have been selected. Love, humour, war, jealousy, the disgust we feel at the idea of eating certain animals but not others, religious food taboos, compulsive lying: none of them escape the blade of Pinker’s rationalist scalpel.

Assuming you buy the book’s general approach, it is almost impossible after reading it to cling to the romantic notion that there might be more to our inner lives than the brute facts of biology and natural selection. The notable exception is how the brain causes sentience, or conscious awareness: after a long discussion on the topic, Pinker finally concludes: “Beats me!” There’s reason to believe, he argues, that humans may simply lack the mental capacity ever to solve the mind–body problem.

But the broader philosophical question – how far science can, or should, reach into the life of the mind – got a disputatious airing last year, when Pinker wrote an essay for the New Republic entitled ‘Science is not your enemy’. It was motivated in part by reports on both sides of the Atlantic about declining student enrolments in humanities subjects, and was Pinker’s intervention in the long-running debate over ‘scientism’: are science and scientists guilty of attempting to colonise areas of intellectual life where they don’t belong?

Rather than denying that this was a real phenomenon, as numerous scientists have, Pinker audaciously claimed it was a good thing – providing you defined ‘scientism’ correctly. Humanities scholars had themselves to blame, he implied, for the decline of their fields: by insisting on remaining inside departmental silos, resistant to new approaches, they’d helped guarantee their growing irrelevance. Science was not engaged upon “an imperialistic drive to occupy the humanities,” he wrote. “The promise of science is to enrich and diversify the intellectual tools of humanistic scholarship, not to obliterate them.”

In a furious response, entitled ‘Crimes against humanities’, the New Republic’s literary editor, Leon Wieseltier, accused Pinker of denying the very possibility of valid yet non-scientific knowledge. How absurd, he argued, to imagine that a scientific analysis of a painting – a chemical breakdown of its pigments and textures, and so on – could be the only thing worth saying about it. Pinker calls this a “paranoid” interpretation of his argument. “How could an understanding of perception of colour, of form, of lighting, of shading, of content such as faces and landscapes not enrich our understanding of art?”

Yet if Wieseltier’s retort was overheated, he may still have a point. Pinker wasn’t – and isn’t – merely calling on scholars from different disciplines to talk to each other more. His argument is that any scholar committed to the idea that “the world is intelligible” is doing science. “The great thinkers of the Age of Reason and the Enlightenment were scientists,” he wrote, naming various philosophers.

It seems to follow from this that non-scientific scholarship doesn’t help make the world intelligible, but Pinker will have none of it. “I’m married to a humanities scholar. I collaborate with humanities scholars. I’m in fields like linguistics, where deans don’t know whether it’s the humanities or not,” he says. “Many humanities scholars – particularly here at Harvard and MIT, but elsewhere too – find it very exciting that there might be new ways of approaching old problems, and an influx of new ideas. I mean, who in their right mind could defend insularity as a principle for excellence in anything?”

Of course there are ways of studying a painting, he concedes, that result in worthwhile insights that can’t be described as scientific. But, he tells me, “I think the humanities would do themselves a favour by not insisting on staying in a silo. If they are wanting to attract the smartest minds from the next generation, it would be wise to hold out the promise that there will be new ways of understanding things – the same expansive mindset that attracts smart, ambitious people to the sciences could also attract them to the humanities.

“That it isn’t just a question of reinterpreting the same works of art, with the same methods, over and over again,” he concludes. “I don’t see why humanistic scholarship can’t make progress. Wieseltier seemed to insist that it can’t, but I don’t think most people in the humanities would agree with this. He claims to speak for the humanities. But I can imagine a lot of people in the humanities saying: ‘Speak for yourself!’”

§

Pinker was born in 1954 in Montréal, and raised in that bilingual city’s English-speaking Jewish community (his sister, Susan, is also a psychologist, of the clinical rather than research variety). It’s tempting to try to attribute his subsequent intellectual interests to the milieu of his upbringing. Did his focus on language emerge from having grown up in a linguistic battleground? Did his conception of the mind as a complex assemblage of modules, each designed for specific purposes, arise from inspecting the machines his grandfather used as a garment manufacturer? Was it growing up in the 1960s, when many progressives embraced a ‘blank slate’ model of humans as a precondition for radical change, that prompted him to rebel against that notion in his work?

Such speculation can be hazardous when it concerns an evolutionary psychologist who believes that genetic heritage is more important than parenting or peer-group influence. But how far, really, does Pinker believe his career trajectory was influenced by his genes, and how far by his upbringing?

“There are parallel universes to this one [in which] I wouldn’t have written The Better Angels of Our Nature or The Language Instinct,” he says. “But I’d probably be in the sciences of something human. I probably wouldn’t have been a physicist: I’m too much of a yenta, too interested in humans.” On the other hand, “I probably wouldn’t have been a literary critic.”

In this universe, Pinker studied experimental psychology at McGill University in Montréal, then did his PhD in the same field at Harvard; he has spent the rest of his career there and at MIT, just down the street.

Wherever he got them from, Pinker’s dispositions include a prodigious appetite for work. As well as being a self-confessed micromanager in his teaching work as Harvard’s Johnstone Family Professor of Psychology, he’s usually either pursuing a full schedule of research, speaking and article-writing, or plunging into months-long marathons of book-writing.

“When I write a book, it’s almost all-consuming,” he says, recalling the year he spent in his house on Cape Cod writing The Better Angels, seven days a week, and sometimes until three in the morning. (He’d spent the previous year doing little but reading in preparation for it.) “I do try to exercise. I try to spend some time being a human being with my wife” – as recreation, he and Goldstein ride a tandem bicycle and paddle a tandem kayak. “Fortunately, she’s also a very intense writer, so she sympathises.”

The couple do not have children, a fact Pinker sometimes uses to illustrate the non-determinative nature of genetic predispositions. (He might be predisposed, thanks to natural selection, to reproduce, but he’s used his frontal lobe, a crucial part of his evolutionary inheritance, to decide not to.) “Some things have to give,” Pinker says. “I’m not on Facebook, I don’t see a whole lot of movies, I don’t watch much TV – not because I consider myself above TV, I just don’t have time. And I don’t have a whole lot of face-to-face meetings.” The Pinker–Goldstein house is sometimes almost silent, except for keyboard-tapping, for days and weeks on end.

Both partners are self-described, out-and-proud atheists. Yet while Pinker has received awards from atheist organisations for his support for their cause, it’s notable that he opts not to focus on religion, or its opponents, in his work. A Pinker book on the topic would surely have sold impressively, anointing him the fifth horseman of New Atheism – but “there’s just not enough intellectual content in there, at least on mind, for me to explore,” he says. “I think [Richard] Dawkins has done a fine job; I don’t think I have anything to add to that.”

Pinker’s relative lack of engagement in the modern wars over belief shouldn’t be taken as any endorsement of Stephen Jay Gould’s argument that religion and science are “non-overlapping magisteria”, each a legitimate domain of authority that should keep out of the other’s business. “As a matter of fact,” Pinker says, “religions have concerned themselves with the subject-matter of science… All the world’s major religions have origin myths, they have theories of psychology, of what animates a body that allows it to make decisions. And I think science has competed on that territory successfully: it has shown that those explanations are factually incorrect.”

Nor, in his eyes, should religion have any franchise on morality: “That’s not to say that morality is going to be determined by biology – it could be – but rather that it’s the subject-matter of secular moral philosophy.”

Does any kind of spirituality, however non-religiously defined, play a role in his life?

“I’m afraid of using the word ‘spiritual’,” he says. “I mean, I have a sense of awe and wonder – a sense of intellectual vertigo in pondering certain questions. I hesitate to use the word ‘spiritual’ just because it comes with so much baggage about the supernatural.”

Pinker’s next book, The Sense of Style, will be a style manual for writers incorporating insights from cognitive psychology and linguistics. For example, it will offer advice on how to get around “the curse of knowledge” – the difficulty writers face in being unable to place themselves in the mind of a reader who doesn’t already know as much as the writer knows. Or the question of how to relate to one’s imagined reader: insights from psychology, Pinker will argue, show that the appropriate metaphor to keep in mind is one of vision – that “the stance you take as a writer ought to be to pretend that you’re pointing out something in the world that your reader could see with his own eyes if only he were given an unobstructed view”.

To the extent that these, or any other findings, rely on explanations from evolutionary psychology, they’re vulnerable to a recurrent criticism: aren’t evolutionary psychologists guilty of simply constructing retrospectively satisfying ‘just-so stories’, with no way of showing whether or not they’re true?

In one memorable passage in How the Mind Works, Pinker suggests that our cultural tendency to reward successful executives (and Harvard academics) with high-floor offices might result from an adaptive preference for good views of the surrounding territory, the better to defend against attackers. But in an alternative world where we rewarded executives with offices in the basement, couldn’t you construct a mirror-image explanation, about the benefits of being able to hide out of sight?

For Pinker, the crucial question is whether a hypothesis can be tested. First, he says, you’d have to establish – by means of psychology experiments, or surveys of property prices – that there was indeed a culturally widespread, present-day preference for high floors with good views. Then you’d have to scour the historical evidence: for example, data from studies of “tribal warfare, on whether there’s been a historically continued preference for high vantage points over bunkers and burrows”. Sufficient data showing a preference through history and across cultures, and in contexts of life and death, might amount to good reason to accept your hypothesis.
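
The first of those steps is ordinary statistics. As a hedged sketch, assuming an invented survey in which respondents choose between a high-floor and a basement office, one could test whether any present-day preference exists at all before an evolutionary story is entertained:

```python
# A minimal sketch of the first step Pinker describes: establish that a
# present-day preference for high floors exists. The survey numbers are
# invented; an exact one-sided binomial test is computed from scratch.
from math import comb

def binomial_p_value(k, n, p=0.5):
    """Probability of observing k or more high-floor choices out of n
    respondents if people were in fact indifferent (chance = p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Hypothetical result: 72 of 100 respondents prefer the high-floor office.
print(f"p-value under indifference: {binomial_p_value(72, 100):.6f}")
```

A small p-value would only establish the preference, not its origin; the historical and cross-cultural evidence Pinker mentions would still be needed to tell an adaptation apart from a local convention.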

Once more, Pinker navigates his way through my critique with ease. All attempts to puncture his unique brand of rational optimism – his confidence that careful scientific thinking, consistently applied, will carry humanity towards a future of reason, peace and flourishing – end in failure.

Even climate change, that archetypal case of humanity remaining inert in the face of scientific knowledge, doesn’t do it. “I think it would be foolhardy to say we’ll solve it, but I don’t think it’s foolhardy to say we can solve it,” Pinker says. “History tells us there have been cases in which the global community has adopted agreements to better collective welfare: the ban on atmospheric nuclear testing would be an example. The ban on commercial whaling. The end of piracy and privateering as a legitimate form of international competition. The banning of chlorofluorocarbons.”

In this domain as elsewhere, in Pinker’s judgement, science plus judicious optimism may yet win the day. Or, as he puts it: “We’re not on a trolley-track to oblivion.”