Tuesday, July 02, 2013

Identifying Emotions on the Basis of Neural Activation - Kassam, et al


As of the publication of the new research detailed here, at least 200 articles had been published on the neural correlates of emotion using fMRI and PET alone (as the authors note, citing [1]). That literature suggests that some regions of the brain are more active than others when people experience specific emotions, but no region is consistently and specifically activated by a single emotion type.

Rather than rely on these standard approaches, the authors of this new study, led by Karim Kassam, took a more sophisticated tack:
Rather than search for contiguous neural structures associated with specific emotions, we applied multi-voxel pattern analysis techniques to identify distributed patterns of activity associated with specific emotions [21] [22]. Such techniques allow for the possibility that neural responses to emotional stimulation occur in many brain areas simultaneously. These algorithms frequently result in increased predictive power, and recent research suggests that they hold promise for classifying emotion using neurological and physiological data [23].
Their approach seems promising, but I am still apprehensive about trying to pinpoint the location of emotions in the brain, especially because they likely are distributed across a variety of modules. Still, it's interesting stuff.
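
To make the multi-voxel idea concrete, here is a minimal sketch of what MVPA-style classification looks like in practice. Everything in it is a placeholder (random data, arbitrary trial and voxel counts, an off-the-shelf GaussianNB classifier); the paper's actual classifier is a Gaussian Naive Bayes pooled-variance variant, as noted in the abstract reproduced at the end of this post. The point is simply that every voxel contributes to the prediction at once, rather than one contiguous region being tested at a time.

```python
# Minimal MVPA sketch: classify emotion labels from whole-brain activation patterns.
# All data and shapes below are hypothetical placeholders, not the study's data.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_voxels, n_emotions = 180, 500, 9
X = rng.normal(size=(n_trials, n_voxels))        # one row per trial, one column per voxel
y = rng.integers(0, n_emotions, size=n_trials)   # emotion label (0-8) for each trial

clf = GaussianNB()                               # stand-in for the paper's pooled-variance GNB
scores = cross_val_score(clf, X, y, cv=5)        # cross-validated classification accuracy
print("mean cross-validated accuracy:", scores.mean())
```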
[1] Kober H, Barrett LF, Joseph J, Bliss-Moreau E, Lindquist K, et al. (2008). Functional grouping and cortical–subcortical interactions in emotion: A meta-analysis of neuroimaging studies. Neuroimage 42: 998. doi: 10.1016/j.neuroimage.2008.03.059.

Full Citation:
Kassam KS, Markey AR, Cherkassky VL, Loewenstein G, Just MA. (2013). Identifying Emotions on the Basis of Neural Activation. PLoS ONE 8(6): e66032. doi:10.1371/journal.pone.0066032



Emotions Identified Based On Brain Activity

For the first time, scientists at Carnegie Mellon University have identified which emotion a person is experiencing based on brain activity.


The study, published in the journal PLOS ONE, combines functional magnetic resonance imaging (fMRI) and machine learning to measure brain signals and accurately read emotions in individuals. Led by researchers in CMU's Dietrich College of Humanities and Social Sciences, the findings illustrate how the brain categorizes feelings, giving researchers the first reliable process to analyze emotions. Until now, research on emotions has long been stymied by the lack of reliable methods to evaluate them, mostly because people are often reluctant to honestly report their feelings. Further complicating matters, many emotional responses may not be consciously experienced.

Identifying emotions based on neural activity builds on previous discoveries by CMU's Marcel Just and Tom M. Mitchell, which used similar techniques to create a computational model that identifies individuals' thoughts of concrete objects, often dubbed "mind reading."

"This research introduces a new method with potential to identify emotions without relying on people's ability to self-report," said Karim Kassam, assistant professor of social and decision sciences and lead author of the study. "It could be used to assess an individual's emotional response to almost any kind of stimulus, for example, a flag, a brand name or a political candidate."

One challenge for the research team was to find a way to repeatedly and reliably evoke different emotional states in the participants. Traditional approaches, such as showing subjects emotion-inducing film clips, would likely have been unsuccessful because the impact of a film clip diminishes with repeated viewing. The researchers solved the problem by recruiting actors from CMU's School of Drama.

"Our big breakthrough was my colleague Karim Kassam's idea of testing actors, who are experienced at cycling through emotional states. We were fortunate, in that respect, that CMU has a superb drama school," said George Loewenstein, the Herbert A. Simon University Professor of Economics and Psychology.

For the study, 10 actors were scanned at CMU's Scientific Imaging & Brain Research Center while viewing the words of nine emotions: anger, disgust, envy, fear, happiness, lust, pride, sadness and shame. While inside the fMRI scanner, the actors were instructed to enter each of these emotional states multiple times, in random order.

Another challenge was to ensure that the technique was measuring emotions per se, and not the act of trying to induce an emotion in oneself. To meet this challenge, a second phase of the study presented participants with neutral and disgusting photos that they had not seen before. The computer model, built by statistically analyzing the fMRI activation patterns gathered for 18 emotion words, had learned the emotion patterns only from self-induced emotions. It was nonetheless able to correctly identify the emotional content of the photos from the viewers' brain activity.
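
In classifier terms, this second phase is a transfer test: fit on the word-cued, self-induced trials and decode trials from a different kind of induction. Here is a hypothetical sketch of that step, with placeholder data and a stand-in GaussianNB classifier rather than the authors' code:

```python
# Train only on self-induced (word-cued) trials, then decode photo-viewing trials.
# X_words, y_words, and X_photos are hypothetical placeholder arrays.
import numpy as np
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(1)
X_words = rng.normal(size=(180, 200))            # word-cued activation patterns
y_words = rng.integers(0, 9, size=180)           # emotion labels (0-8)
X_photos = rng.normal(size=(30, 200))            # photo-viewing activation patterns

clf = GaussianNB().fit(X_words, y_words)         # learned only from self-induced emotions
photo_guesses = clf.predict(X_photos)            # decoded emotion for each photo trial
print(photo_guesses)
```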

To identify emotions within the brain, the researchers first used the participants' neural activation patterns from early scans to identify the emotions experienced by the same participants in later scans. The computer model achieved a rank accuracy of 0.84. Rank accuracy refers to the percentile rank of the correct emotion in an ordered list of the computer model's guesses; random guessing would result in a rank accuracy of 0.50.
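
For concreteness, rank accuracy can be computed from a classifier's per-class scores roughly as follows. This is an illustrative sketch consistent with the definition above, not the authors' code:

```python
import numpy as np

def rank_accuracy(scores, true_labels):
    """Mean normalized rank of the correct class.

    scores: (n_trials, n_classes) array of classifier scores or probabilities.
    true_labels: (n_trials,) array of integer class labels matching the score columns.
    Returns 1.0 if the correct class is always ranked first, 0.5 for random guessing.
    """
    n_trials, n_classes = scores.shape
    order = np.argsort(-scores, axis=1)          # column indices, best guess first
    ranks = np.array([np.where(order[i] == true_labels[i])[0][0] for i in range(n_trials)])
    return float(np.mean(1.0 - ranks / (n_classes - 1)))
```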

Next, the team applied the machine learning analysis of the self-induced emotions to guess which emotion the subjects were experiencing when they were exposed to the disgusting photographs. The computer model achieved a rank accuracy of 0.91. With nine emotions to choose from, the model listed disgust as the most likely emotion 60 percent of the time and as one of its top two guesses 80 percent of the time.
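
The 60 percent and 80 percent figures are top-one and top-two rates for a single target class. A small hypothetical helper makes this explicit (the scores array and the disgust index are placeholders):

```python
import numpy as np

def top_k_rate(scores, target, k):
    """Fraction of trials in which class `target` is among the k highest-scoring guesses."""
    top_k = np.argsort(-scores, axis=1)[:, :k]   # best k column indices per trial
    return float(np.mean([target in row for row in top_k]))

# Hypothetical usage: with photo_scores as the model's per-trial class scores and
# DISGUST as the index of disgust, top_k_rate(photo_scores, DISGUST, 1) and
# top_k_rate(photo_scores, DISGUST, 2) correspond to the 60% and 80% figures above.
```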

Finally, they applied a machine learning analysis of the neural activation patterns from all but one of the participants to predict the emotions experienced by the held-out participant. This answers an important question: if we took a new individual, put them in the scanner, and exposed them to an emotional stimulus, how accurately could we identify their emotional reaction? Here, the model achieved a rank accuracy of 0.71, once again well above the chance level of 0.50.
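
This cross-participant test is a leave-one-subject-out cross-validation. Here is a hypothetical sketch using placeholder data, a stand-in GaussianNB classifier, and integer emotion labels 0-8 assumed present in every training fold:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneGroupOut

def loso_rank_accuracy(X, y, groups):
    """Train on all but one participant, test on the held-out participant, average rank accuracy."""
    results = []
    for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups):
        clf = GaussianNB().fit(X[train_idx], y[train_idx])
        log_scores = clf.predict_log_proba(X[test_idx])
        order = np.argsort(-log_scores, axis=1)                  # best guess first
        ranks = np.array([np.where(order[i] == y[test_idx][i])[0][0]
                          for i in range(len(test_idx))])
        results.append(np.mean(1.0 - ranks / (log_scores.shape[1] - 1)))
    return float(np.mean(results))

# Hypothetical usage: X is trial-by-voxel data, y emotion labels, groups participant IDs.
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 200))
y = rng.integers(0, 9, size=300)
groups = np.repeat(np.arange(10), 30)                            # 10 participants, 30 trials each
print(loso_rank_accuracy(X, y, groups))
```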

"Despite manifest differences between people's psychology, different people tend to neurally encode emotions in remarkably similar ways," noted Amanda Markey, a graduate student in the Department of Social and Decision Sciences.

A surprising finding was that nearly equivalent accuracy could be achieved even when the computer model used activation patterns from only one of a number of different subsections of the brain.

"This suggests that emotion signatures aren't limited to specific brain regions, such as the amygdala, but produce characteristic patterns throughout a number of brain regions," said Vladimir Cherkassky, senior research programmer in the Psychology Department.

The research team also found that while on average the model ranked the correct emotion highest among its guesses, it was best at identifying happiness and least accurate in identifying envy. It rarely confused positive and negative emotions, suggesting that these have distinct neural signatures. And, it was least likely to misidentify lust as any other emotion, suggesting that lust produces a pattern of neural activity that is distinct from all other emotional experiences.
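
Observations like these come from a confusion analysis: tabulating which emotion the model guessed against which emotion was actually induced. A hypothetical sketch with placeholder labels:

```python
import numpy as np
from sklearn.metrics import confusion_matrix

emotions = ["anger", "disgust", "envy", "fear", "happiness",
            "lust", "pride", "sadness", "shame"]
rng = np.random.default_rng(3)
y_true = rng.integers(0, 9, size=200)            # placeholder ground-truth emotion labels
y_pred = rng.integers(0, 9, size=200)            # placeholder model guesses

# Row i, column j counts trials of emotion i that the model labeled as emotion j.
# Off-diagonal mass between positive and negative emotions measures how often they mix.
cm = confusion_matrix(y_true, y_pred, labels=range(9))
print(cm)
```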

Just, the D.O. Hebb University Professor of Psychology, director of the university's Center for Cognitive Brain Imaging and a leading neuroscientist, explained, "We found that three main organizing factors underpinned the emotion neural signatures, namely the positive or negative valence of the emotion, its intensity - mild or strong, and its sociality - involvement or non-involvement of another person. This is how emotions are organized in the brain."
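
A rough sketch of what a dimensional analysis of this kind might look like, using scikit-learn's FactorAnalysis on a placeholder emotions-by-voxels matrix of mean activation patterns. This is not the authors' pipeline; it only illustrates the idea of recovering a small number of factors on which each emotion loads:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

emotions = ["anger", "disgust", "envy", "fear", "happiness",
            "lust", "pride", "sadness", "shame"]
rng = np.random.default_rng(4)
emotion_patterns = rng.normal(size=(9, 60))      # placeholder mean activation per emotion

fa = FactorAnalysis(n_components=3, random_state=0)
factor_scores = fa.fit_transform(emotion_patterns)   # each emotion's position on 3 factors
for emo, s in zip(emotions, factor_scores):
    print(f"{emo:>9}: {np.round(s, 2)}")
```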


In the future, the researchers plan to apply this new identification method to a number of challenging problems in emotion research, including identifying emotions that individuals are actively attempting to suppress and multiple emotions experienced simultaneously, such as the combination of joy and envy one might experience upon hearing about a friend's good fortune.

Groundbreaking discoveries such as identifying emotions based on neural activation patterns have helped to establish Carnegie Mellon as a world leader in brain and behavioral sciences. To build on its foundation of research excellence in psychology, neuroscience and computational science, CMU recently launched a Brain, Mind and Learning initiative to enhance the university's ability to innovate in the laboratory and continue to solve real-world problems.

Here is the abstract and link to the article discussed in this post.

Identifying Emotions on the Basis of Neural Activation

Karim S. Kassam, Amanda R. Markey, Vladimir L. Cherkassky, George Loewenstein, Marcel Adam Just

Abstract


We attempt to determine the discriminability and organization of neural activation corresponding to the experience of specific emotions. Method actors were asked to self-induce nine emotional states (anger, disgust, envy, fear, happiness, lust, pride, sadness, and shame) while in an fMRI scanner. Using a Gaussian Naïve Bayes pooled variance classifier, we demonstrate the ability to identify specific emotions experienced by an individual at well over chance accuracy on the basis of: 1) neural activation of the same individual in other trials, 2) neural activation of other individuals who experienced similar trials, and 3) neural activation of the same individual to a qualitatively different type of emotion induction. Factor analysis identified valence, arousal, sociality, and lust as dimensions underlying the activation patterns. These results suggest a structure for neural representations of emotion and inform theories of emotional processing.
Read the full article.
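
The abstract names a Gaussian Naive Bayes pooled-variance classifier. For readers curious how that differs from a standard Gaussian Naive Bayes, here is a minimal illustrative re-implementation (not the authors' code); it assumes integer class labels and pools the within-class variance across all classes for each voxel instead of estimating a separate variance per class:

```python
import numpy as np

class PooledVarianceGNB:
    """Gaussian Naive Bayes with one pooled within-class variance per feature (voxel)."""

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.means_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        # Residuals of each trial from its own class mean; their variance is the
        # pooled within-class variance, shared by all classes.
        residuals = X - self.means_[np.searchsorted(self.classes_, y)]
        self.var_ = residuals.var(axis=0) + 1e-9
        self.log_prior_ = np.log(np.array([(y == c).mean() for c in self.classes_]))
        return self

    def joint_log_likelihood(self, X):
        # log prior + sum over voxels of log N(x | class mean, pooled variance)
        diff = X[:, None, :] - self.means_[None, :, :]
        ll = -0.5 * (diff ** 2 / self.var_ + np.log(2 * np.pi * self.var_)).sum(axis=2)
        return ll + self.log_prior_

    def predict(self, X):
        return self.classes_[np.argmax(self.joint_log_likelihood(X), axis=1)]

# Hypothetical usage on placeholder data.
rng = np.random.default_rng(5)
X = rng.normal(size=(90, 120))                   # 90 trials x 120 voxels
y = rng.integers(0, 9, size=90)                  # emotion labels 0-8
print(PooledVarianceGNB().fit(X, y).predict(X)[:10])
```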
