Showing posts with label feelings.

Wednesday, June 25, 2014

The Forces that Shape Us - All in the Mind



This week's episode of All in the Mind (ABC Radio National) looks at the ways objects, colors, and other elements of our environment can shape how we feel and how we behave. Cool topic.

The Forces that Shape Us


The hidden forces that shape our thoughts, feelings, and behaviour. What if your name could predict your future career, if your culture could influence your judgement of distance, or if the physical appearance of your chess opponent could determine whether you win or lose? Every day, as we move about the world, there are many subconscious forces that massively influence our decisions - and how people relate to us.

Guests 
  • Dr Adam Alter - Associate professor of marketing and psychology, New York University


Thursday, April 10, 2014

Omnivore - Psychologists Search

From Bookforum's Omnivore blog, this is a new collection of links related to psychology, research, and the state of the field.

In another piece of research not included below, Researchers Identify 15 More Facial Emotions:
The traditional six basic human emotions are happy, sad, fearful, angry, surprised, and disgusted. For years, researchers have focused on these six categories, which are often depicted via specific facial muscles, when they assess people's moods. Now a new study headed by associate professor Aleix Martinez of Ohio State University has identified 15 more facial expressions, which the researchers call "compound emotions."
Interesting.

Psychologists search

Apr 8 2014 | 9:00AM

Friday, February 07, 2014

Shauna Shapiro - How Mindfulness Cultivates Compassion

 

From the Greater Good blog at UC Berkeley's Greater Good Science Center - a little Friday wisdom on how we can generate greater compassion through mindfulness practice. Shauna Shapiro is coauthor, with Linda E. Carlson, of The Art and Science of Mindfulness: Integrating Mindfulness into Psychology and the Helping Professions.

How Mindfulness Cultivates Compassion

January 2014 | TRT 16:18


The author and researcher explores how moment-to-moment awareness of our thoughts, feelings, and surroundings helps us to see and alleviate suffering in others.

Shauna Shapiro explains why mindfulness-based therapies work to develop compassion.
Shauna Shapiro, Ph.D., is associate professor of counseling psychology at Santa Clara University. She has conducted extensive clinical research investigating the effects of mindfulness-based therapies across a wide range of populations, and published over 70 book chapters and peer-reviewed journal articles. She currently lectures and leads mindfulness training programs nationally and internationally for health professionals on the growing applications of mindfulness in psychology and health care. She is coauthor, with Linda E. Carlson, of The Art and Science of Mindfulness: Integrating Mindfulness into Psychology and the Helping Professions.


Tuesday, November 26, 2013

How Long Until a Robot Cries? (from Nautilus)


From Nautilus Magazine, this article takes a look at efforts to make emotionally intelligent robots. Researchers currently limit any efforts to teach robots to "feel" emotions to six: "anger, sadness, disgust, happiness, fear, and 'neutral.'"

One of the issues I see with this EVER happening is that human emotions are body-based (unless we create robots with biological bodies). Antonio Damasio is the researcher who makes this most clear in his several books. In our daily usage, feelings and emotions are often used interchangeably.
But for neuroscience, emotions are more or less the complex reactions the body has to certain stimuli. When we are afraid of something, our hearts begin to race, our mouths become dry, our skin turns pale and our muscles contract. This emotional reaction occurs automatically and unconsciously. Feelings occur after we become aware in our brain of such physical changes; only then do we experience the feeling of fear. [Damasio, Scientific American interview, Feeling Our Emotions; March, 2005]
Emotions are what we experience in the body, which our brain then interprets into feelings. Robots do not have a peripheral nervous system (PNS) as we do; they cannot have an enteric nervous system (ENS) as we do; and they cannot have an autonomic nervous system (ANS) as we do.

While robots might be understood to have "unconscious" processes much like our ANS, those processes are not "embodied" as ours are and, therefore, cannot be a part of the emotion system in the same way a rapid heartbeat can generate feelings of anxiety (the emotion is the combination of rapid heartbeat, shallow breathing, and stomach butterflies, which our brain then interprets based on context as either anxiety or excitement).

Anyway, all of this to suggest I have serious doubts about robots and emotions. For an interesting film take on this, see Robot and Frank (image at the top is from the film).

Artificial Emotions

How long until a robot cries?

By Neil Savage
Illustrations by John Hendrix

When Angelica Lim bakes macaroons, she has her own kitchen helper, Naoki. Her assistant is only good at the repetitive tasks, like sifting flour, but he makes the job more fun. Naoki is very cute, just under two feet tall. He’s white, mostly, with blue highlights, and has speakers where his ears should be. The little round circle of a mouth that gives him a surprised expression is actually a camera, and his eyes are infrared receivers and transmitters.

“I just love robots,” says Lim, a Ph.D. student in the Department of Intelligent Science and Technology at Kyoto University in Japan. She uses the robot from Aldebaran Robotics in Paris to explore how robots might express emotions and interact with people. When Lim plays the flute, Naoki (the Japanese characters of his name translate roughly to “more than a machine”) accompanies her on the theremin or the egg shaker. She believes it won’t be too many years before robotic companions share our homes and our lives.

Of course Naoki doesn’t get the jokes, or enjoy the music, or feel his mouth watering over the cookies. Though we might refer to a person-shaped robot as “him,” we know it’s just a collection of metal parts and circuit boards. When we yell at Siri or swear at our desktop, we don’t really believe they’re being deliberately obtuse. And they’re certainly not going to react to our frustration; machines don’t understand what we feel.



At least that’s what we’d like to believe. Having feelings, we usually assume, and the ability to read emotions in others, are human traits. We don’t expect machines to know what we’re thinking or react to our moods. And we feel superior to them because we emote and they don’t. No matter how quick and logical they are, sensitive humans win and prevail over machines: emotional David Bowman beats calculative HAL 9000 in 2001: A Space Odyssey, and desperate Sarah Connor triumphs over the ultimate killing machine in The Terminator. From Dr. McCoy condemning the unemotional Spock as a “green-blooded inhuman” in Star Trek to moral reasoning that revolves around the unemotionality of criminals, we hold our emotions at the core of our identity.

Special and indecipherable, except by us—our whims and fancies are what makes us human. But we may be wrong in our thinking. Far from being some inexplicable, ethereal quality of humanity, emotions may be nothing more than an autonomic response to changes in our environment, software programmed into our biological hardware by evolution as a survival response.

Joseph LeDoux, a neuroscientist at New York University’s Center for Neural Science, describes emotion in terms of “survival circuits” that exist in all living things. An organism, as simple as an amoeba or as complex as a person, reacts to an environmental stimulus in a way that makes it more likely to survive and reproduce. The stimulus flips switches on survival circuits, which prompt behaviors that enhance survival. Neurons firing in a particular pattern might trigger the brain to order the release of adrenaline, which makes the heart beat faster, priming an animal to fight or flee from danger. That physical state, LeDoux says, is an emotion.




Melissa Sturge-Apple, an assistant professor of psychology at the University of Rochester, agrees that emotions have something to do with our survival. “They’re kind of a response to environmental cues, and that organizes your actions,” she says. “If you’re fearful, you might run away. If you get pleasure from eating something, you might eat more of it. You do things that facilitate your survival.” And key among the human’s survival tool kit is communication—something emotions help facilitate, through the use of empathy.

By this reasoning, every living thing interested in survival emotes in some form, though perhaps not in quite the same way as humans. Certainly any pet owner will tell you that dogs experience emotions. The things we call feelings are our conscious interpretation and description of those emotional states, LeDoux argues. Other types of feelings, such as guilt, envy, or pride, are what he calls “higher order or social emotions.”

We are also beginning to understand that the mechanics of how we express emotion are deeply tied into the emotion itself. Oftentimes, they determine what we are feeling. Smiling makes you happier, even if it’s because Botox has frozen your face into an unholy imitation, author Eric Finzi says in his recent book The Face of Emotion. Conversely, people whose facial muscles are immobilized by Botox injections can’t mirror other people’s expressions, and have less empathy. No mechanics, no emotion, it seems.

But if our emotional states are indeed mechanical, they can be detected and measured, which is what scientists in the field of affective computing are working on. They’re hoping to enable machines to read a person’s affect the same way we display and detect our feelings—by capturing clues from our voices, our faces, even the way we walk. Computer scientists and psychologists are training machines to recognize and respond to human emotion. They’re trying to break down feelings into quantifiable properties, with mechanisms that can be described, and quantities that can be measured and analyzed. They’re working on algorithms that will alert therapists when a patient is trying to hide his real feelings and computers that can sense and respond to our moods. Some are breaking down emotion into mathematical formalism that can be programmed into robots, because machines motivated by fear or joy or desire might make better decisions and accomplish their goals more efficiently.

Wendi Heinzelman, a professor of electrical and computer engineering at the University of Rochester and a collaborator of Sturge-Apple, is developing an algorithm to detect emotion based on the vocal qualities of a speaker. Heinzelman feeds a computer speech samples recorded by actors attempting to convey particular feelings, and tells the computer which clips sound happy, sad, angry, and so on. The computer measures the pitch, energy, and loudness of the recordings, as well as the fluctuations in energy and pitch from one moment to the next. More fluctuations can suggest a more active emotional state, such as happiness or fear. The computer also tracks what are known as formants, bands of resonant frequencies shaped by the vocal tract. If your throat tightens because you’re angry, it alters your voice—and the computer can measure that. With these data, it can run a statistical analysis to figure out what distinguishes one emotion from another.
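A minimal sketch of that kind of pipeline, assuming the librosa and scikit-learn libraries and a set of labeled actor clips; this is my own illustration of the approach described above, not Heinzelman's system, and the feature set is deliberately reduced (formants are omitted for brevity):

import numpy as np
import librosa                      # assumed dependency for audio features
from sklearn.svm import SVC

def vocal_features(path):
    # Extract the cues named in the article: pitch, energy/loudness,
    # and their moment-to-moment fluctuations.
    y, sr = librosa.load(path, sr=16000)
    f0, _, _ = librosa.pyin(y, fmin=65, fmax=400, sr=sr)  # pitch track
    f0 = f0[~np.isnan(f0)]                                # keep voiced frames
    rms = librosa.feature.rms(y=y)[0]                     # short-term energy
    return np.array([
        f0.mean(), f0.std(), np.abs(np.diff(f0)).mean(),     # pitch + fluctuation
        rms.mean(), rms.std(), np.abs(np.diff(rms)).mean(),  # energy + fluctuation
    ])

# X = np.vstack([vocal_features(p) for p in clip_paths])  # clip_paths: labeled clips
# clf = SVC(probability=True).fit(X, labels)              # labels: "happy", "sad", ...
# clf.predict_proba(...) then yields a probability per emotion, the kind of
# statistical verdict the EmotionSense app described next reports.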

Neal Lathia, a post-doctoral research associate in the computer laboratory at the University of Cambridge, in England, is working on EmotionSense, an app for Android phones which listens to human speech and ferrets out its emotional content in a similar way. For instance, it may decide that there’s a 90 percent chance the speaker is happy and report that, “from a purely statistical perspective, you sound most like this actor who had claimed he was expressing happiness,” Lathia explains.

Like Lathia and Heinzelman, Lim thinks there are certain identifiable qualities to emotional expression, and that when we detect those qualities in the behavior of an animal or the sound of a song, we ascribe the associated emotion to it. “I’m more interested in how we detect emotions in other things, like music or a little puppy jumping around,” she says. Why, for instance, should we ascribe sadness to a particular piece of music? “There’s nothing intrinsically sad about this music, so how do we extract sadness from that?” She uses four parameters: speed, intensity, regularity, and extent—whether something is small or large, soft or loud. Angry speech might be rapid, loud, rough and broken. So might an angry piece of music. Someone who’s walking at a moderate pace using regular strides and not stomping around might be seen as content, whereas a person slowly shuffling, with small steps and an irregular stride, might be displaying that they’re sad. Lim’s hypothesis, as yet untested, is that mothers convey emotion to their babies through those qualities of speed, intensity, regularity, and extent in their speech and facial expressions—so humans learn to think of them as markers of emotion.
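As a toy illustration of those four parameters, the sketch below scores a one-dimensional envelope of speech, music, or motion; it is my own construction following Lim's description, not her model, and every cutoff in it is invented:

import numpy as np

def sire(envelope, rate):
    # envelope: 1-D amplitude or velocity envelope; rate: samples per second.
    diffs = np.abs(np.diff(envelope))
    speed = diffs.mean() * rate                  # how quickly it changes
    intensity = np.abs(envelope).mean()          # soft vs. loud
    regularity = 1.0 / (1e-9 + diffs.std())      # even vs. broken
    extent = envelope.max() - envelope.min()     # small vs. large
    return speed, intensity, regularity, extent

def label(speed, intensity, regularity, extent):
    # Crude heuristics echoing the article: rapid, loud, and broken reads
    # as angry; slow and small as sad; moderate and regular as content.
    if speed > 1.0 and intensity > 0.5 and regularity < 1.0:
        return "angry"
    if speed < 0.2 and extent < 0.3:
        return "sad"
    return "content"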



Currently, researchers work with a limited set of emotions in order to make it easier for the computer to distinguish one from another, and because the difference between joy and glee or anger and contempt is subtle and complex. “The more emotions you get, the harder it is to do this because they’re so similar,” says Heinzelman, who focuses on six emotions: anger, sadness, disgust, happiness, fear, and “neutral.” And for therapists looking for a way to measure patients’ general state of mind, grouping them into these general categories may be all that’s necessary, she says.

Voice, of course, is not the only way people convey their emotional states. Maja Pantic, professor of affective and behavioral computing and leader of Imperial College London’s Intelligent Behavior and Understanding Group, uses computer vision to capture facial expressions and analyze what they reveal about a person’s feelings. Her system tracks various facial movements such as the lifting or lowering of an eyebrow and movements in the muscles around the mouth or the eyes. It can tell the difference between a genuine and a polite smile based on how quickly the smile forms and how long it lasts. Pantic has identified 45 different facial actions, of which her computer can recognize 30 about 80 percent of the time. The rest are obscured by the limitations of the computer’s two-dimensional vision and other obstacles. Actions such as movements in a different direction, jaw clenching and teeth grinding—which may indicate feeling—are hard for it to recognize. Most emotion identification systems work pretty well in a lab. In the real world, with imperfect conditions, their accuracy is still low, but it’s getting better. “I believe in a couple of years, probably five years, we will have systems that can do analysis in the wild and also learn new patterns in an unsupervised way,” Pantic says.
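The genuine-versus-polite smile cue lends itself to a short sketch. The version below is a simplification of my own, assuming a tracker has already produced a per-frame intensity for the lip-corner pull (AU12 in the Facial Action Coding System); the thresholds are hypothetical:

import numpy as np

def smile_kind(au12, fps):
    # au12: per-frame lip-corner-pull intensity in [0, 1]; fps: frame rate.
    active = au12 > 0.5
    if not active.any():
        return "no smile"
    onset = np.argmax(active)               # first frame above threshold
    peak = int(np.argmax(au12))             # frame of maximum intensity
    rise_s = max(peak - onset, 1) / fps     # how quickly the smile forms
    dur_s = active.sum() / fps              # how long it lasts
    # Spontaneous smiles tend to form slowly and linger; posed ("polite")
    # smiles snap on and off. The 0.5 s / 1.0 s cutoffs are invented.
    return "genuine" if rise_s > 0.5 and dur_s > 1.0 else "polite"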

With emotions reduced to their components, recorded, and analyzed, it becomes possible to input them into machines. The value of this project might seem simple: the resulting robots will have richer, more interesting and more fun interactions with humans. Lim hopes that, in the future, how Naoki moves and how it plays the theremin will allow it to express its emotional states.

But there are also deeper reasons why engineers are interested in emotional robots. If emotions help living things survive, will they do the same for robots? An intelligent agent—a robot or a piece of software—that could experience emotions in response to its environment could make quick decisions, like a human dropping everything and fleeing when he sees his house is on fire. “Emotions focus your attention,” says Mehdi Dastani, a professor of computer science at the University of Utrecht, in the Netherlands. “Your focus gets changed from what you’re working on to a much more important goal, like saving your life.”

Dastani is providing intelligent agents with what he calls a “logic of emotion,” a formalized description of 22 different emotional states such as pity, gloating, resentment, pride, admiration, gratitude, and others. A robot can use them, he explains, to evaluate progress it’s making toward a goal. An unemotional robot, directed to go from Point A to Point B, might hit an obstacle in its path and simply keep banging into it. An intelligent agent equipped with emotion might feel sad at its lack of progress, and eventually give up and go do something else. If the robot feels happy, that means it’s getting closer to its goal, and it should stay the course. But if it’s frustrated, it may have to try another tack. The robot’s emotions offer a kind of problem-solving strategy computer scientists call a heuristic, which is the ability to discover and learn things for themselves—like humans do. “Emotion is a kind of evolutionarily established heuristic mechanism that intervenes in rational decision-making, to make decision-making more efficient and effective,” Dastani says.
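A toy agent loop makes the heuristic concrete. This is my illustration of the idea rather than Dastani's formal logic, and the agent interface (distance_to, step_toward, replan) is hypothetical:

def pursue(agent, goal, max_steps=1000):
    # Appraisal of progress toward the goal plays the role of emotion.
    frustration = 0.0
    for _ in range(max_steps):
        before = agent.distance_to(goal)
        agent.step_toward(goal)
        after = agent.distance_to(goal)
        if after == 0:
            return True                                 # goal reached
        if after < before:
            frustration = max(0.0, frustration - 0.1)   # progress: "happy", stay the course
        else:
            frustration += 0.2                          # blocked: "sadness" builds
        if frustration > 1.0:
            agent.replan()                              # frustrated: try another tack
            frustration = 0.5
    return False                                        # gave up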

But could a machine actually have emotions? Arvid Kappas, a professor of psychology who runs the Emotion, Cognition, and Social Context group at Jacobs University in Bremen, Germany, believes that it comes back to the definition of emotion. By some definitions, even a human baby, which operates mostly on instinct and doesn’t have the cognitive capacity to understand or describe its feelings, might be said to have no emotions. By other definitions, the trait exists in all sorts of animals, with most people willing to ascribe feelings to creatures that closely resemble humans. So does he believe a computer could be emotional? “As emotional as a crocodile, sure. As emotional as a fish, yes. As emotional as a dog, I can see that.”



But would robots that felt feel the same way we do? “They would probably be machine emotions and not human emotions, because they have machine bodies,” says Kappas. Emotions are tied into our sense of ourselves as physical beings. A robot might have such a sense, but it would be of a very different self, with no heart and a battery meter instead of a stomach. An android in power-saving mode may, in fact, dream of electric sheep. And that starts to raise ethical questions. What responsibility does a human have when the Roomba begs you not to let its battery die? What do you say to Robot Charlie when the Charlie S6 comes out, and you want to send the old model to the recycling plant?

“It really is important, if humans are going to be interacting with robots, to think about whether robots could be feeling and under what conditions,” says Bruce MacLennan, an associate professor of computer science at the University of Tennessee, Knoxville, who will be presenting a paper on the ethical treatment of future robots at the International Association for Computing and Philosophy this summer. MacLennan feels that this isn’t just a philosophical question, but one that can be tackled scientifically. He proposes trying to break emotions down into what he calls “protophenomena,” the tiniest units of the physical effects that lead to emotion. “Protophenomena are so small that they’re not normally something a person would be aware of as part of their conscious experience,” he says. There should be some basic physical quantities that science can measure and, therefore, reproduce—in machines.

“I think anything that’s going to be able to make the kinds of decisions we want a human-scale android to make, they’re going to inevitably have consciousness,” MacLennan says. And, LeDoux argues, since human consciousness drives our experience of emotion, that could give rise to robots actually experiencing feelings.

It will probably be many decades before we’re forced to confront questions of whether robots can have emotions comparable to humans, says MacLennan. “I don’t think they’re immediate questions that need to be answered, but they do illuminate our understanding of ourselves, so they’re good to address.” Co-existing with emotional robots, he argues, could have as profound an effect as one civilization meeting another, or as humanity making contact with extraterrestrial intelligence. We would be forced to face the question of whether there’s anything so special about our feelings, and if not, whether there’s anything special about us at all. “It would maybe focus us more on what makes humans human,” he says, “to be confronted by something that is so like us in some ways, but in other ways is totally alien.”

Neil Savage is a freelance science and technology writer in Massachusetts. His story for Nature about artificial tongues won an award from the American Society of Journalists and Authors. He has also written about companion robots for the elderly for Nature, electronic spider silk for IEEE Spectrum, and bionic limbs for Discover. To see his work, visit www.neilsavage.com.

Thursday, October 03, 2013

Being Human 2013 - Human Emotions with Richard Davidson, Paul Ekman, and Esther Sternberg




Human Emotions from Being Human on FORA.tv

Human Emotions

In this session we look at emotions as evolved behavioral responses, how well-being can be cultivated, and how our emotions can influence health. We further investigate the nature of compassion and its compatibility with evolutionary theory.

Session led by: Richie Davidson, Ph.D., Professor of Psychology and Psychiatry, University of Wisconsin-Madison; Founder and Chair of the Center for Investigating Healthy Minds
Paul Ekman, Ph.D., Psychologist, Paul Ekman Group, LLC
Esther Sternberg, MD, Physician


Richard Davidson

Neuroscientist Richard Davidson (author of The Emotional Life of Your Brain: How Its Unique Patterns Affect the Way You Think, Feel, and Live--and How You Can Change Them) was named one of the 100 most influential people in the world by Time magazine in 2006. His research focuses on correlating emotional states with the brain activity underlying them. Davidson has reached the conclusion that our brain circuitry isn't set in stone: though our emotions are evolved responses, they are remarkably plastic and can be shaped over time. As he says, "I think that what modern neuroscience is teaching us is that, in fact, there is a lot of plasticity, that change is indeed possible, and the evidence is more and more strongly in favor of the importance of environmental influences in shaping brain function and structure and even shaping the expression of our genes." At the Center for Investigating Healthy Minds, Davidson and other researchers investigate qualities of mind such as compassion and mindfulness in order to understand how healthy minds might be cultivated. He is perhaps most famous for his investigations into the neurological effects of meditation, showing how this practice can functionally rewire the brain. In 2012, he spoke at the Being Human conference in San Francisco. 

Paul Ekman

Paul Ekman (author of Emotions Revealed, Second Edition: Recognizing Faces and Feelings to Improve Communication and Emotional Life) is a pioneering psychologist in the study of emotions and facial expressions, and was named one of the most influential psychologists of the 20th century by the American Psychological Association. Ekman is most famous for his research establishing that nonverbal communication of emotions is not a cultural phenomenon but a universal one. Through his study of facial expressions, Ekman has substantiated Darwin's theory that human emotions are an evolved, biological response shared throughout cultures worldwide. On their importance in our lives, Ekman states, "Emotions can override…the more powerful fundamental motives that drive our lives: hunger, sex, and the will to survive." Ekman has also contributed to the study of microexpressions, involuntary facial expressions that occur when someone is attempting to conceal their true feelings. Microexpressions offer further evidence that emotional responses are indeed hardwired and universal. His system of reading these emotions gave rise to the crime drama television series Lie to Me, starring a character based on Ekman. In 2012, he spoke at the Being Human conference in San Francisco.

Esther Sternberg

Internationally recognized for her discoveries of the science of the mind-body interaction in illness and healing, Dr. Esther M. Sternberg (author of The Balance Within: The Science Connecting Health and Emotions) is a major force in collaborative initiatives on mind-body-stress-wellness and environment inter-relationships. Dr. Sternberg's many honors include recognition by the National Library of Medicine as one of 300 women physicians who have changed the face of medicine, the Anita Roberts National Institutes of Health Distinguished Woman Scientist Lectureship, and an honorary doctorate in medicine from Trinity College, Dublin. Currently Research Director for the Arizona Center for Integrative Medicine, University of Arizona at Tucson, Dr. Sternberg was previously Section Chief of Neuroendocrine Immunology and Behavior at the National Institute of Mental Health; Director of the Integrative Neural Immune Program, NIMH/NIH; and Co-Chair of the NIH Intramural Program on Research on Women's Health. She has been featured on numerous radio and television programs, including PBS's The New Medicine and Life Part II and NPR's Speaking of Faith, and, in 2009, with the Emmy Award-winning Resolution Pictures, she created and hosted a PBS special based on her books, The Science of Healing. Well known for her ability to translate complex scientific subjects for lay audiences, Sternberg has testified before Congress, advised the World Health Organization, and is a regular contributor to Science Magazine's "Books et al." column and a regular columnist for Arthritis Today. A dynamic speaker, recognized by her peers as a spokesperson for the field, she translates complex scientific subjects in a highly accessible manner, with a combination of academic credibility, passion for science, and compassion as a physician. Dr. Sternberg lectures nationally and internationally to both lay and scientific audiences and is frequently interviewed on radio, television, and film and in print media on subjects including the mind-body connection, stress and illness, spirituality, love and health, and place and well-being.


Wednesday, September 04, 2013

TEDxPortsmouth - Dr. Alan Watkins - Being Brilliant Every Single Day


This rare two-part TEDx Talk by Alan Watkins, from TEDxPortsmouth, is better than the title might suggest. He speaks on performance, coherence, and controlling our organism. If you think deep breathing is the way to calm the body, he suggests you are wrong (thanks to Mark Walsh for the heads up on this).

Watkins advocates a slower, more rhythmic breathing cycle that seems to generate more brain coherence (according to his machine). His model seeks to control all of the factors below the water-line in his diagram in order to generate better results for the two factors above the water-line.

TEDxPortsmouth - Dr. Alan Watkins - Being Brilliant Every Single Day


Published on Mar 13, 2012 (two parts)

Alan is the founder and CEO of Complete Coherence Ltd. He is recognised as an international expert on leadership and human performance. He has researched and published widely on both subjects for over 18 years. He is currently an Honorary Senior Lecturer in Neuroscience and Psychological Medicine at Imperial College, London as well as an Affiliate Professor of Leadership at the European School of Management, London. He originally qualified as a physician, has a first class degree in psychology and a PhD in immunology.

Website: http://www.complete-coherence.com

Part One:


Part Two:


This is the "About" statement from the Complete Coherence website. Of interest to integral folks, perhaps, because Diane Hamilton is one of their practitioners.

Complete Coherence is powered by compassion.


Our purpose is to develop more enlightened leaders.

Compassion is what gets us all out of bed every single day. We have a strong desire to reduce the suffering that comes from the poor decision making of leaders across the globe. We believe that there is an urgent need to develop more enlightened leadership in organisations. We are also very optimistic about the potential of human beings and what is possible. We delight in helping leaders, executive teams and multi-national organisations develop themselves and deliver much better results even in tough conditions.

Our approach is very bespoke. It is driven by our ability to precisely diagnose the critical issues which, when resolved, will cause a significant improvement in performance. Using a range of high definition diagnostic processes, we ensure we understand your issues deeply before intervening. Our interventions integrate the most recent advances from multiple scientific fields, including complexity theory, human performance, neuroscience, evolutionary biology, team dynamics, organisational development, medical technology and many others.

We distinguish “horizontal” development, which is effectively the acquisition of knowledge, skills and experience, from “vertical” development. Vertical development enables individuals, teams and organisations to move to a more sophisticated level of performance. Such a distinction is, in our view, critical to delivering sustainable change. We also believe in scientifically measuring the improvements created and sharing the results with you.

Founder and CEO Dr Alan Watkins BSc MBBS PhD is one of a team of outstanding consultants, supported by a superb back office who are incredibly friendly and keep us on track to ensure we all deliver Brilliance Every Day!

Tuesday, August 20, 2013

The Contagion of Being: Derrick Hull at TEDxTeachersCollege


Thomas Derrick Hull was formerly Director of Neuroscience and Learning Design at Candeo (a program for overcoming unwanted sexual behaviors), studied Mind, Brain, and Education at Harvard University, and studied Clinical Psychology at Columbia University.

This is a short but interesting TEDx talk.


The Contagion of Being: Derrick Hull at TEDxTeachersCollege 
Published on Aug 18, 2013 
In this captivating talk, Derrick Hull discusses the shared experience of being alive. Explaining that thoughts, moods, and feelings are essentially contagious experiences, Hull engagingly cites research from psychology and social science to show the power of becoming a mindfully contagious individual. Derrick Hull is a researcher, writer, and entrepreneur. Derrick is pursuing graduate studies in Clinical Psychology at Teachers College, Columbia University.

Monday, June 03, 2013

On the Distinction of Empathic and Vicarious Emotions


This new article from Frontiers in Human Neuroscience (Open Access journal) looks at the difference between vicarious emotions and empathy - two very different processes in my opinion. However, the authors suggest that both vicarious and empathic emotions "originate from the simulation processes mirroring and mentalizing that depend on anchoring and adjustment." Further, they claim that the term empathic emotion should be reserved only for incidents "where perceivers and social targets have shared affective experience, whereas vicarious emotion offers a wider scope and also includes non-shared affective experiences. Both are supposed to be highly functional in social interactions."

Interesting stuff. [Note: The image at the top has nothing to do with the article - I just liked it.]

Full Citation: 
Paulus FM, Müller-Pinzler L, Westermann S and Krach S (2013). On the distinction of empathic and vicarious emotions. Frontiers in Human Neuroscience; 7:196. doi: 10.3389/fnhum.2013.00196

On the distinction of empathic and vicarious emotions


Frieder M. Paulus, Laura Müller-Pinzler, Stefan Westermann, Sören Krach


Abstract


In the introduction to the special issue “The Neural Underpinnings of Vicarious Experience” the editors state that one “may feel embarrassed when witnessing another making a social faux pas.” In our commentary we address this statement and ask whether this example introduces a vicarious or an empathic form of embarrassment. We elaborate commonalities and differences between these two forms of emotional experiences and discuss their underlying mechanisms. We suggest that both vicarious and empathic emotions originate from the simulation processes mirroring and mentalizing that depend on anchoring and adjustment. We claim that the term “empathic emotion” should be reserved exclusively for incidents where perceivers and social targets have shared affective experience, whereas “vicarious emotion” offers a wider scope and also includes non-shared affective experiences. Both are supposed to be highly functional in social interactions.


Introduction


The human ability to infer others' emotions, thoughts or intentions is a central mechanism in creating meaningful social interactions. Accordingly, the question of how we develop a representation of our interaction partners' minds and emotions has been the focus of various disciplines such as social psychology, philosophy, anthropology, and biology. In the last decade the social neurosciences, specifically, have put tremendous efforts into disentangling the neural networks involved in this ability. Most of this research has concentrated on the phenomenon of “empathy.” Empathy has been defined as the state where people (i.e., perceivers^1) represent the same emotion they are observing or imagining in another person (i.e., social targets) with full awareness that the source of their own experience is the other's emotion (De Vignemont and Singer, 2006). However, empathy refers to only a small portion of the vicarious emotions people may experience while interacting with their social environment in everyday life (Singer and Lamm, 2009). With this commentary, we aim to broaden this perspective by proposing a clear-cut distinction between vicarious and empathic emotions, with the latter being a specific case of the former and both being mediated by two streams of simulation processes.


Two Processes of Understanding Others' Emotions: Mirroring and Mentalizing


Mainly, two interacting processes have been proposed that allow perceivers to empathize (Keysers and Gazzola, 2007; Waytz and Mitchell, 2011). First, mirroring processes have been described as a direct mapping of another's observed actions and bodily states onto one's own (i.e., the perceiver's) neural system, allowing the perceiver to share the target's feelings in an embodied manner. Second, mentalizing processes have been proposed to lead to comparable internal representations in perceivers, however via a projection of oneself into the target's position (Keysers and Gazzola, 2007; Hein and Singer, 2008). Mentalizing thus involves imagining oneself in the same situation as the social target and helps to “intuitively” (Keysers and Gazzola, 2007) grasp the target's emotions as if they were one's own bodily states. These processes, mirroring and mentalizing, can be understood as forms of internal simulation that allow perceivers to experience another person's state on their own body (see Waytz and Mitchell, 2011).

In order to shed light onto the neural mechanisms of these two processes to simulate the target's emotional state, the fundamental idea of these approaches is to compare the neuronal networks involved in first-hand experiences of emotions or sensations (e.g., provoking pain or disgust through administration of electro-shocks or unpleasant odors, respectively) with the neuronal networks engaged while observing emotions or sensations in interaction partners (Wicker et al., 2003; Singer et al., 2004; Jabbi et al., 2007). Overlap in cortical activation between these experimental conditions is then interpreted as evidence for shared, “isomorphic”^2 affective states between interaction partners and thus as a neuronal manifestation of empathy (Wicker et al., 2003; Gallese et al., 2004; Singer et al., 2004; Jackson et al., 2006). Irrespective of the underlying processes, neuroscience research has shown that the anterior insula and the anterior cingulate cortex are most robustly involved in common mapping of one's own and another's affective states during empathic experiences (Craig, 2009; Lamm and Singer, 2010).

Depending on the available input, perceivers rely on sensory information [i.e., mirroring of gestures, facial expressions, bodily postures, sounds, etc. in a near-simultaneous isomorphic fashion (Waytz and Mitchell, 2011)] and/or contextual information (i.e., mentalizing using semantic information, prior knowledge, past experiences in similar situations, etc.) in order to represent another person's state (Waytz and Mitchell, 2011; Zaki and Ochsner, 2011). Among others, the premotor cortex and primary as well as higher-order somatosensory cortices are thought to mediate the mirroring process (Avenanti et al., 2005). Mentalizing is typically associated with medial prefrontal cortex (mPFC), temporal pole, and superior temporal sulcus activation (Hein and Singer, 2008). Within the mentalizing network, the mPFC has been specifically linked to reflective processes about oneself and another (Mitchell et al., 2005) or imagining oneself in past and future events (Buckner and Carroll, 2007; Schacter et al., 2007). This supports the conceptualization of mentalizing as a process where perceivers project themselves into the position of the social target.


Dissociating Vicarious and Empathic Emotions


The processes to infer the “physically invisible but psychologically real, internal state” (Zaki and Ochsner, 2011; p. 159) can also result in “vicarious emotions” that are simulated in the absence of this specific emotional state in the social target. Although the terms “empathic emotions” (Batson, 1981; Lamm et al., 2007a; Hein and Singer, 2008; Pfeifer et al., 2008; Engen and Singer, 2012; Zaki and Ochsner, 2012) and “vicarious emotions” (Batson et al., 1987; Decety and Lamm, 2006; Keysers and Gazzola, 2009; Meyer et al., 2012; Niedenthal and Brauer, 2012) have been used with near identical meaning, we consider both concepts to have distinctive characteristics and consequences. This distinction is easily illustrated on the basis of vicarious embarrassment: in many social encounters perceivers feel vicariously embarrassed in the absence of embarrassment or any other emotion in the social target^3 (Hawk et al., 2011; Krach et al., 2011; Müller-Pinzler et al., 2012; Paulus et al., 2013). Thus, the social target is unaware of the ongoing threats to her social integrity in this situation (Krach et al., 2011). Consequently, in contrast to empathic manifestations, vicarious embarrassment reflects an emotional state in the perceiver that does not match the internal, psychologically real state of the social target. Nonetheless, recent studies provided first evidence that similar processes of mentalizing and mirroring contribute to the perceiver's vicarious embarrassment (Hawk et al., 2011; Krach et al., 2011).

We have previously discussed how mentalizing can result in vicarious emotions that do not match the emotional state of the social target (Krach et al., 2011). This has been explained through self-projections of perceivers who transpose themselves into the position of others, thereby integrating their own perspective within the mental simulation (Hawk et al., 2011). However, for several reasons, the mapping of the social target's state in the perceiver's neural network through mirroring processes is also not independent of the perceiver's perspective. First, similar to the processing of sensory information of one's own body (Gazzola et al., 2012), mirroring the target's state in a near-simultaneous isomorphic fashion is modulated by other processes such as mentalizing. This is particularly important in social contexts that constrain the desirability of displayed emotions (e.g., at work). In these situations the enacted and thus mirrored expressions could deviate from the corresponding internal psychological state. Second, mirror neuron functioning is deeply integrated in a neural network that is tailored and tuned to process information of the perceiver's body. In the most extreme example this is illustrated by mirror neuron activity in response to observing robotic arms grasping objects (Gazzola et al., 2007; Keysers et al., 2010). Those robots do not have any human sensations or form intentions about their actions; nonetheless, the perceiver's neural system mirrors the action as if it were human. Consequently, depending on idiosyncratic learning experiences, the mirrored representation should deviate across different perceivers even if the inputs entering the system were exactly the same. These arguments illustrate how mirroring is indeed anchored in the characteristics of the perceiver's neural system and might be modulated by additional information accessible exclusively from the observer's perspective. The resulting simulation of the social target's state through mirroring processes could represent a genuine vicarious emotion. Previous research has already demonstrated such automated vicarious responses while, e.g., observing numbed limbs that undergo biopsy (Lamm et al., 2007b).

These thoughts raise the question of whether vicarious, in comparison to empathic, emotional experiences may serve a useful function in social interactions or have to be considered the result of immature and maladapted processes for representing another person's internal psychological state. With the help of some examples we argue that these vicarious emotions may indeed provide useful information for perceivers, enable helping behavior, and foster social interactions. First, vicarious emotions contribute to the social regulation of the perceiver's behavior. For instance, many forms of psychological punishment are used as an "example" to discourage deviations from norms; even if the social target does not respond to the situation, perceivers will nonetheless vicariously experience the suffering in that situation. Second, imagine observing the non-embarrassed presenter described above, who is currently unaware of the ongoing threat to her social integrity. For perceivers, their vicarious emotional response provides insights about the severity of the threat to the image of the social target. This internal vicarious representation of the unfavorable condition may help to motivate interventions from the perceivers' side in order to re-establish the social integrity of the target. In contrast, perceivers who are tied to an empathically accurate response that matches the internal psychological state of the target may be less prone to develop such motivations. Similarly, with regard to observing physical injuries to another's body, vicarious pain experiences, even in the absence of a psychological state of pain in the target, might provide vital information for initiating costly helping behaviors (Hein et al., 2010).

This line of argumentation supports the notion that human beings are not only passive perceivers in the context of social interactions but also active creators of shared emotional experiences. In a natural setting, they need to be aware of their own presence and of the simulated vicarious emotions in response to another person's condition, be it in the presence or absence of an emotional state in the social target. The perceiver's construal of a social target's condition as the representation of an internal, "psychologically real" state might thus provide an unnecessarily narrow scope for examining vicarious emotions. Rather, vicarious emotions should be considered the result of ongoing simulation processes that, depending on the social context as well as personal or task-induced motivations, are flexibly tuned to match another's internal psychological state.

The question remains whether perceivers, even if they intend to, always have correct assumptions about the emotion of the social target. Accordingly, social neuroscience has to consider the match or mismatch of the emotional experiences between social targets and perceivers from two perspectives: first, the de facto match or mismatch of the emotions between the perceiver and the social target, and second, the subjective stance of the perceiver about the match or mismatch with the social target's emotions. In social interactions both perspectives may occur independently of each other, resulting in four different states (for examples see Figure 1). The neural responses should not differ between de facto and subjective empathic and vicarious emotions, respectively. The transition from one of the states to another might nonetheless offer great potential for unraveling yet neglected neural processes in social interactions. This is especially important considering upcoming second-person neuroscience paradigms that allow the investigation of true social interactions (Krach et al., in press; Schilbach et al., in press).

FIGURE 1


Figure 1. Integrating the perceiver's perspective in vicarious and empathic emotions. The figure illustrates how the perceiver's assumption about the match of her emotions with the social target's emotion may dissociate from the de facto state. Notably, the neural response pattern within each state is determined by the subjective appraisals of perceivers. The arrows indicate the adjustment of a subjectively “incorrect” stance during the course of social interactions (e.g., feedback of the social target) in order to match the demands of the social context. These transitions might specifically help to dissociate neural processes that are involved in the adjustment and anchoring of one's own perspective.


A Process Oriented Perspective on Vicarious Emotions


Ideas about how to conceptually explain vicarious emotions can be derived from recent efforts in social psychology. Several behavioral studies have examined the process of understanding others' minds. Those studies indicate that people adopt others' perspectives by initially anchoring on their own perspective and then serially adjusting their internal representation to account for differences between themselves and others (Epley et al., 2004). This understanding has mostly been applied in the context of cognitive inferences about another person's knowledge or attitudes but might be easily applicable for examining the neural underpinnings of vicarious emotions as well. In a shared social environment, perceivers have access to different inputs (i.e., internal and external, see Figure 2) that allow them to simulate the social target's state. We have outlined above how both streams of simulation, mirroring and mentalizing, are anchored in the egocentric perspective of the perceiver. The social context then defines how the initial simulation needs to be adjusted in order to provide the foundation for successful social interactions. Depending on the appropriateness of the initial simulation (anchoring), the readjustment process might be more or less demanding and may finish after a "plausible" assessment is reached (Epley and Gilovich, 2001). Notably, the plausibility refers to both vicarious and/or empathic emotional experiences (Figure 2).

FIGURE 2


Figure 2. Conceptualization of vicarious and empathic emotions in a unified framework. The figure illustrates how perceiver and social target may interact in a shared social environment and how the perceiver represents vicarious and empathic emotions based on simulation processes. On the most abstract level, the input for the simulation stems from external (e.g., gesture, facial expression, prosody) or internal sources (e.g., prior knowledge, past experiences with the interaction partner). Simulation of internal states is realized through two different streams, mirroring and mentalizing, which depend on the available input. Both streams of simulation are anchored in the perceiver's perspective and get adjusted to obtain the adequate outcome in the shared social environment. The result can be an empathic and/or a vicarious emotional experience.

So far, social neuroscience has predominantly investigated the two streams of simulation processes and their interactions (Zaki and Ochsner, 2012). We believe that focusing on the sub-processes of anchoring and adjusting in both streams of simulation has the potential to explain vicarious and empathic emotions in a parsimonious framework. A first fMRI study has indicated the potential of this approach in the social neurosciences (Tamir and Mitchell, 2010). While focusing on cognitive inferences, this study showed the mPFC to be specifically involved in the readjustment process during mentalizing. We would predict similar mPFC functioning in case of readjustment of vicarious emotions, during both mirroring and mentalizing processes. Extending these findings, one can formulate more refined hypotheses on the involvement of neural networks in simulation processes and the specific functions of subunits within the system. These may allow differentiating vicarious and empathic emotions on the neural systems level, and the processes involved in the transitions from subjective to de facto vicarious or empathic states (see Figure 1). Here, we would predict the mPFC to play a pivotal role in remodeling the "incorrect" subjective state. Future studies on vicarious or empathic emotions, however, need to address the complexity of social situations and manipulate it to the extremes in order to elucidate the specific neural processes involved in the different stances.

Further, the modulatory role of contextual demands on brain and behavior can be tested. Among others, one could model the effects of time constraints or increased cognitive load on the perceiver's side, or alter the perceiver's simulation by task-induced manipulations. This understanding of simulation processes is also of clinical relevance. Instead of characterizing the impairments in both streams of simulation, research has to consider causes of clinical phenotypes on the level of anchoring and adjustment. The source of, e.g., autistic symptomatology might originate in disturbed anchoring and adjustment and the inflexibility to modulate the simulation process according to social contextual demands (Paulus et al., 2013). Although there is evidence for both simulation processes being affected in individuals with autism (see Zaki and Ochsner, 2012), theoretical work on autism-spectrum disorders also suggests that affected individuals show strong egocentric anchoring that cannot be readjusted to the social target's perspective (De Vignemont and Frith, 2007), which might contribute to the observed alterations in simulation processes.

In conclusion, we provide an argument for how to distinguish the terms "vicarious emotions" and "empathic emotions." Both originate from the simulation processes mirroring and mentalizing; however, the term "empathic emotions" should be reserved only for incidents where perceivers and social targets have a shared, "isomorphic" affective experience (Engen and Singer, 2012). Vicarious emotions offer a wider scope and also include non-shared affective experiences, which are nonetheless highly functional in social interactions. With several examples we have briefly illustrated how the two streams of simulation, mirroring and mentalizing, are imbued with the perceiver's perspective, which might result in both vicarious and/or empathic emotions. In order to explain these emotional experiences in a parsimonious framework, we think that anchoring and adjustment are the yet-neglected concepts that need to be integrated into research on the neural underpinnings of vicarious experience.

Acknowledgments


We are very grateful for the reviewers' important and valuable comments on this manuscript, which greatly helped to improve the quality of this theoretical paper. Research leading to this article has been funded by the German Research Foundation (DFG; KR3803/2-1, KR3803/7-1), the Research Foundation of the Philipps-University Marburg, and the von Behring-Röntgen-Stiftung (KR 60-0023).

Footnotes


^1 Zaki and Ochsner (2011) described individuals focusing on someone else as "perceivers" and individuals being in the focus of the perceivers' attention as "social targets." For the present article we adopt this labeling and will refer to perceivers and social targets in the following.

^2 In the neurosciences the term isomorphism might be understood with at least two different meanings: on the one hand, "isomorphic" patterns of information refer to the similar firing of mirror neurons during self-initiated actions and the observation of corresponding actions of others, thus allowing computational predictions. On the other hand, in the context of empathy research the term "isomorphism" has been used to describe similar affective states between targets and perceivers (De Vignemont and Singer, 2006). Whereas the earlier usage refers to the micro-level of information processing in the brain, the latter describes the subjective level of affective experiences. In the present manuscript we use the term "isomorphism" with the latter meaning.
^3 For example, vicarious embarrassment is experienced by attendees of a scientific conference when they observe the presenter of a talk returning from the rest room, not realizing that toilet paper is sticking out of the back of her pants.

References


Avenanti, A., Bueti, D., Galati, G., and Aglioti, S. M. (2005). Transcranial magnetic stimulation highlights the sensorimotor side of empathy for pain. Nat. Neurosci. 8, 955–960.

Batson, C. D. (1981). Is empathic emotion a source of altruistic motivation? J. Pers. Soc. Psychol. 40, 290–302.

Batson, C. D., Fultz, J., and Schoenrade, P. A. (1987). Distress and empathy: two qualitatively distinct vicarious emotions with different motivational consequences. J. Pers. 55, 19–39.

Buckner, R. L., and Carroll, D. C. (2007). Self-projection and the brain. Trends Cogn. Sci. 11, 49–57.

Craig, A. D. B. (2009). How do you feel–now? The anterior insula and human awareness. Nat. Rev. Neurosci. 10, 59–70.

Decety, J., and Lamm, C. (2006). Human empathy through the lens of social neuroscience. ScientificWorldJournal 6, 1146–1163.

De Vignemont, F., and Frith, U. (2007). "Autism, morality and empathy," in Moral Psychology, Vol. 3, ed W. Sinnott-Armstrong (Cambridge, MA: MIT Press), 273–280.

De Vignemont, F., and Singer, T. (2006). The empathic brain: how, when and why? Trends Cogn. Sci. 10, 435–441.

Engen, H. G., and Singer, T. (2012). Empathy circuits. Curr. Opin. Neurobiol. 23, 275–282.

Epley, N., and Gilovich, T. (2001). Putting adjustment back in the anchoring and adjustment heuristic: differential processing of self-generated and experimenter-provided anchors. Psychol. Sci. 12, 391–396.

Epley, N., Keysar, B., Van Boven, L., and Gilovich, T. (2004). Perspective taking as egocentric anchoring and adjustment. J. Pers. Soc. Psychol. 87, 327–339.

Gallese, V., Keysers, C., and Rizzolatti, G. (2004). A unifying view of the basis of social cognition. Trends Cogn. Sci. 8, 396–403.

Gazzola, V., Rizzolatti, G., Wicker, B., and Keysers, C. (2007). The anthropomorphic brain: the mirror neuron system responds to human and robotic actions. Neuroimage 35, 1674–1684.

Gazzola, V., Spezio, M. L., Etzel, J. A., Castelli, F., Adolphs, R., and Keysers, C. (2012). Primary somatosensory cortex discriminates affective significance in social touch. Proc. Natl. Acad. Sci. U.S.A. 109, E1657–E1666.

Hawk, S. T., Fischer, A. H., and Van Kleef, G. A. (2011). Taking your place or matching your face: two paths to empathic embarrassment. Emotion 11, 502–513.

Hein, G., Silani, G., Preuschoff, K., Batson, C. D., and Singer, T. (2010). Neural responses to ingroup and outgroup members' suffering predict individual differences in costly helping. Neuron 68, 149–160.

Hein, G., and Singer, T. (2008). I feel how you feel but not always: the empathic brain and its modulation. Curr. Opin. Neurobiol. 18, 153–158.

Jabbi, M., Swart, M., and Keysers, C. (2007). Empathy for positive and negative emotions in the gustatory cortex. Neuroimage 34, 1744–1753.

Jackson, P. L., Brunet, E., Meltzoff, A. N., and Decety, J. (2006). Empathy examined through the neural mechanisms involved in imagining how I feel versus how you feel pain. Neuropsychologia 44, 752–761.

Keysers, C., and Gazzola, V. (2007). Integrating simulation and theory of mind: from self to social cognition. Trends Cogn. Sci. 11, 194–196.

Keysers, C., and Gazzola, V. (2009). Expanding the mirror: vicarious activity for actions, emotions, and sensations. Curr. Opin. Neurobiol. 19, 666–671.

Keysers, C., Kaas, J. H., and Gazzola, V. (2010). Somatosensation in social perception. Nat. Rev. Neurosci. 11, 417–428.

Krach, S., Cohrs, J. C., De Echeverría Loebell, N. C., Kircher, T., Sommer, J., Jansen, A., et al. (2011). Your flaws are my pain: linking empathy to vicarious embarrassment. PLoS ONE 6:e18675. doi: 10.1371/journal.pone.0018675

Krach, S., Müller-Pinzler, L., Westermann, S., and Paulus, F. M. (in press). Advancing the neuroscience of social emotions with social immersion. Behav. Brain Sci.

Lamm, C., Batson, C. D., and Decety, J. (2007a). The neural substrate of human empathy: effects of perspective-taking and cognitive appraisal. J. Cogn. Neurosci. 19, 42–58.

Lamm, C., Nusbaum, H. C., Meltzoff, A. N., and Decety, J. (2007b). What are you feeling? Using functional magnetic resonance imaging to assess the modulation of sensory and affective responses during empathy for pain. PLoS ONE 2:e1292. doi: 10.1371/journal.pone.0001292

Lamm, C., and Singer, T. (2010). The role of anterior insular cortex in social emotions. Brain Struct. Funct. 214, 579–591.

Meyer, M. L., Masten, C. L., Ma, Y., Wang, C., Shi, Z., Eisenberger, N. I., et al. (2012). Empathy for the social suffering of friends and strangers recruits distinct patterns of brain activation. Soc. Cogn. Affect. Neurosci. 8, 446–454.

Mitchell, J. P., Banaji, M. R., and Macrae, C. N. (2005). The link between social cognition and self-referential thought in the medial prefrontal cortex. J. Cogn. Neurosci. 17, 1306–1315.

Müller-Pinzler, L., Paulus, F. M., Stemmler, G., and Krach, S. (2012). Increased autonomic activation in vicarious embarrassment. Int. J. Psychophysiol. 86, 74–82.

Niedenthal, P. M., and Brauer, M. (2012). Social functionality of human emotion. Annu. Rev. Psychol. 63, 259–285.

Paulus, F. M., Kamp-Becker, I., and Krach, S. (2013). Demands in reflecting about another's motives and intentions modulate vicarious embarrassment in autism spectrum disorders. Res. Dev. Disabil. 34, 1312–1321.

Pfeifer, J. H., Iacoboni, M., Mazziotta, J. C., and Dapretto, M. (2008). Mirroring others' emotions relates to empathy and interpersonal competence in children. Neuroimage 39, 2076–2085.

Schacter, D. L., Addis, D. R., and Buckner, R. L. (2007). Remembering the past to imagine the future: the prospective brain. Nat. Rev. Neurosci. 8, 657–661.

Schilbach, L., Timmermans, B., Reddy, V., Costall, A., Bente, G., Schlicht, T., et al. (in press). Toward a second-person neuroscience. Behav. Brain Sci.

Singer, T., and Lamm, C. (2009). The social neuroscience of empathy. Ann. N.Y. Acad. Sci. 1156, 81–96.

Singer, T., Seymour, B., O'Doherty, J., Kaube, H., Dolan, R. J., and Frith, C. D. (2004). Empathy for pain involves the affective but not sensory components of pain. Science 303, 1157–1162.

Tamir, D. I., and Mitchell, J. P. (2010). Neural correlates of anchoring-and-adjustment during mentalizing. Proc. Natl. Acad. Sci. U.S.A. 107, 10827–10832.

Waytz, A., and Mitchell, J. P. (2011). Two mechanisms for simulating other minds: dissociations between mirroring and self-projection. Curr. Dir. Psychol. Sci. 20, 197–200.

Wicker, B., Keysers, C., Plailly, J., Royet, J. P., Gallese, V., and Rizzolatti, G. (2003). Both of us disgusted in My insula: the common neural basis of seeing and feeling disgust. Neuron 40, 655–664.

Zaki, J., and Ochsner, K. (2011). Reintegrating the study of accuracy into social cognition research. Psychol. Inq. 22, 159–182.

Zaki, J., and Ochsner, K. (2012). The neuroscience of empathy: progress, pitfalls and promise. Nat. Neurosci. 15, 675–680.