Thursday, May 02, 2013

Gary Marcus - THE CRISIS IN SOCIAL PSYCHOLOGY THAT ISN’T

This article appeared in The New Yorker, written by neuroscientist Gary Marcus, author of Kluge: The Haphazard Evolution of the Human Mind (2008), one of the best books in recent years for an explanation of why our brains function - and malfunction - as they do. He is also the author of The Birth of the Mind: How a Tiny Number of Genes Creates the Complexities of Human Thought (2004) and The Algebraic Mind: Integrating Connectionism and Cognitive Science (Learning, Development, and Conceptual Change) (2003).

Social psychology has had a rough week or two in the media, with articles in Nature and The New York Times Magazine profiling either fraud or statistically weak findings taken as fact. Marcus wants to make sure we know that the field is much stronger than these articles might have us believe.

THE CRISIS IN SOCIAL PSYCHOLOGY THAT ISN’T

POSTED BY GARY MARCUS
MAY 1, 2013


According to the headlines, social psychology has had a terrible year—and, at any rate, a bad week. The New York Times Magazine devoted nearly seven thousand words to Diederik Stapel, the Dutch researcher who committed fraud in at least fifty-four scientific papers, while Nature just published a report about another controversy, questioning whether some well-known “social-priming” results from the social psychologist Ap Dijksterhuis are replicable. Dijksterhuis famously found that thinking about a professor before taking an exam improves your performance, while thinking about a soccer hooligan makes you do worse. Although nobody doubts that Dijksterhuis ran the experiment that he says he did, it may be that his finding is either weak or simply wrong—perhaps the peril of a field that relies too heavily on the notion that if something is statistically likely, it can be counted on.

Things aren’t quite as bad as they seem, though. Although Nature’s report was headlined “Disputed results a fresh blow for social psychology,” it scarcely noted that there have been some replications of experiments modelled on Dijksterhuis’s phenomenon. His finding could still turn out to be right, if weaker than first thought. More broadly, social priming is just one thread in the very rich fabric of social psychology. The field will survive, even if social priming turns out to have been overrated or an unfortunate detour.

Even if this one particular line of work is under a shroud, it is important not to lose sight of the fact that many of the old standbys from social psychology have been endlessly replicated, like the Milgram effect—the old study of obedience in which subjects turned up electrical shocks (or what they thought were electrical shocks) all the way to four hundred and fifty volts, apparently causing great pain to their fellow subjects, simply because they’d been asked to do it. Milgram himself replicated the experiment numerous times, in many different populations, with groups of differing backgrounds. It is still robust (in the hands of other researchers) nearly fifty years later. And even today, people are still extending that result; just last week I read about a study in which intrepid experimenters asked whether people might administer electric shocks to robots, under similar circumstances. (Answer: yes.)

More importantly, there is something positive that has come out of the crisis of replicability—something vitally important for all experimental sciences. For years, it was extremely difficult to publish a direct replication, or a failure to replicate an experiment, in a good journal. Throughout my career, and long before it, journals insisted that new papers report original results; I completely failed to replicate a particular study a few years ago, but at the time didn’t bother to submit it to a journal because I knew few people would be interested. Now, happily, the scientific culture has changed. Since I first mentioned these issues in late December, several leading researchers in psychology have announced major efforts to replicate previous work, and to change the incentives so that scientists can do the right thing without feeling that they are spending time on something that might not be valued by tenure committees.

The Reproducibility Project, from the Center for Open Science, is now underway, with its first white paper on the psychology and sociology of replication itself. Thanks to Daniel Simons and Bobbie Spellman, the journal Perspectives on Psychological Science is now accepting submissions for a new section of each issue devoted to replicability. The journal Social Psychology is planning a special issue on replications of important results in social psychology, and has already received forty proposals. Other journals in neuroscience and medicine are engaged in similar efforts. My N.Y.U. colleague Todd Gureckis just used Amazon’s Mechanical Turk to replicate a wide range of basic results in cognitive psychology. And just last week, Uri Simonsohn released a paper on coping with the famous file-drawer problem, in which failed studies have historically been underreported.

It’s worth remembering, too, that social psychology isn’t the only field affected by these problems—medicine, for example, has been coping with the same concerns. But as Brian Nosek, of the Center for Open Science, wrote to me in an e-mail, psychologists are especially well equipped to deal with these issues. “Psychology is at the forefront of wrestling with reproducibility by turning its research expertise on itself,” he said.

Whether or not social priming does prove real, and whether or not it turns out to be a large contributor to our mental lives—or only a trivial one that is easily overrun by other factors—all of psychology, and all of science, will be better off for the increased effort on replication. If, in the worst case, a decade’s studies turn out to be less important than we initially believed, it won’t be the end of the world. And if it incidentally leads to a culture of more careful science, we will all, on balance, have come out ahead.
