
Thursday, July 10, 2014

Research Offers New Insight into How the Brain Processes Emotions

[Figure: Parametric modulation analysis (univariate) for independent ratings of positive and negative valence.]

This new study sheds some light on how the brain processes emotions, although it certainly does not explain everything. According to Cornell University neuroscientist Adam Anderson:
“It appears that the human brain generates a special code for the entire valence spectrum of pleasant-to-unpleasant, good-to-bad feelings, which can be read like a ‘neural valence meter’ in which the leaning of a population of neurons in one direction equals positive feeling and the leaning in the other direction equals negative feeling.”
Interesting stuff - too bad the full article is hidden from readers behind a paywall.

Study cracks how brain processes emotions

Date: July 9, 2014
Source: Cornell University

Summary:
Although feelings are personal and subjective, the human brain turns them into a standard code that objectively represents emotions across different senses, situations and even people, reports a new study. “Despite how personal our feelings feel, the evidence suggests our brains use a standard code to speak the same emotional language,” one researcher concludes.

[Image credit: Cornell University]
Although feelings are personal and subjective, the human brain turns them into a standard code that objectively represents emotions across different senses, situations and even people, reports a new study by Cornell University neuroscientist Adam Anderson.

“We discovered that fine-grained patterns of neural activity within the orbitofrontal cortex, an area of the brain associated with emotional processing, act as a neural code which captures an individual’s subjective feeling,” says Anderson, associate professor of human development in Cornell’s College of Human Ecology and senior author of the study, “Population coding of affect across stimuli, modalities and individuals,” published online in Nature Neuroscience.

Their findings provide insight into how the brain represents our innermost feelings – what Anderson calls the last frontier of neuroscience – and upend the long-held view that emotion is represented in the brain simply by activation in specialized regions for positive or negative feelings, he says.

“If you and I derive similar pleasure from sipping a fine wine or watching the sun set, our results suggest it is because we share similar fine-grained patterns of activity in the orbitofrontal cortex,” Anderson says.

“It appears that the human brain generates a special code for the entire valence spectrum of pleasant-to-unpleasant, good-to-bad feelings, which can be read like a ‘neural valence meter’ in which the leaning of a population of neurons in one direction equals positive feeling and the leaning in the other direction equals negative feeling,” Anderson explains.
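To make that “neural valence meter” metaphor concrete, here is a toy sketch of my own (not anything from the paper): read a population’s “leaning” by projecting its activity onto a valence axis, so the sign and size of the projection play the role of the meter. All numbers are simulated.

```python
# Toy illustration of a population "valence meter" readout (my assumption-laden
# sketch, not the study's analysis): project simulated population activity onto
# a valence axis; positive readings = pleasant, negative = unpleasant.
import numpy as np

rng = np.random.default_rng(4)
n_neurons = 100
valence_axis = rng.normal(size=n_neurons)            # weights defining the "pleasant" direction
valence_axis /= np.linalg.norm(valence_axis)

pleasant_response = 0.8 * valence_axis + 0.2 * rng.normal(size=n_neurons)
unpleasant_response = -0.8 * valence_axis + 0.2 * rng.normal(size=n_neurons)

for name, resp in [("pleasant", pleasant_response), ("unpleasant", unpleasant_response)]:
    meter = float(resp @ valence_axis)                # leaning of the population along the axis
    print(f"{name}: meter reading = {meter:+.2f}")
```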

For the study, the researchers presented participants with a series of pictures and tastes during functional neuroimaging, then analyzed participants’ ratings of their subjective experiences along with their brain activation patterns.
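In other words, the core move is linking trial-by-trial subjective ratings to multi-voxel activation patterns. A minimal sketch of that idea, using simulated data and a generic cross-validated linear model rather than the authors’ actual pipeline, might look like this:

```python
# Minimal sketch (not the authors' pipeline): predict each trial's subjective
# valence rating from its multi-voxel activation pattern with a cross-validated
# linear model. All data below are simulated; shapes are made-up placeholders.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n_trials, n_voxels = 120, 500                      # e.g. picture/taste trials x voxels
ratings = rng.uniform(-1, 1, size=n_trials)        # subjective valence, unpleasant..pleasant
valence_axis = rng.normal(size=n_voxels)           # planted "valence code" for the demo
patterns = np.outer(ratings, valence_axis) + rng.normal(size=(n_trials, n_voxels))

# Hold trials out in folds and predict their ratings from the patterns alone.
predicted = cross_val_predict(Ridge(alpha=1.0), patterns, ratings, cv=5)
r, p = pearsonr(ratings, predicted)
print(f"pattern-based valence prediction: r = {r:.2f}, p = {p:.3g}")
```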

Anderson’s team found that valence was represented as sensory-specific patterns, or codes, in areas of the brain associated with vision and taste, as well as sensory-independent codes in the orbitofrontal cortices (OFC). This suggests, the authors say, that the representation of our internal subjective experience is not confined to specialized emotional centers but may be central to the perception of sensory experience.
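The “sensory-independent” claim is essentially a transfer test: a model trained to read out valence from picture trials should still predict valence on taste trials if the OFC code does not care where the feeling came from. Here is a hedged, simulated illustration of that logic (not the paper’s code; the array names and noise model are my assumptions):

```python
# Simulated cross-modal transfer test: fit a valence readout on "picture" trials
# and evaluate it on "taste" trials. A shared valence axis is planted so the
# transfer succeeds; this is an illustration of the logic, not the paper's code.
import numpy as np
from scipy.stats import pearsonr
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
n_per_modality, n_voxels = 60, 500

valence_axis = rng.normal(size=n_voxels)                   # shared code (assumed for the demo)
ratings_pic = rng.uniform(-1, 1, n_per_modality)
ratings_taste = rng.uniform(-1, 1, n_per_modality)
ofc_pic = np.outer(ratings_pic, valence_axis) + rng.normal(size=(n_per_modality, n_voxels))
ofc_taste = np.outer(ratings_taste, valence_axis) + rng.normal(size=(n_per_modality, n_voxels))

# Train on one modality, test on the other.
model = Ridge(alpha=1.0).fit(ofc_pic, ratings_pic)
pred_taste = model.predict(ofc_taste)
r, _ = pearsonr(ratings_taste, pred_taste)
print(f"vision -> taste valence transfer: r = {r:.2f}")
```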

They also discovered that similar subjective feelings – whether evoked from the eye or tongue – resulted in a similar pattern of activity in the OFC, suggesting the brain contains an emotion code common across distinct experiences of pleasure (or displeasure), they say. Furthermore, these OFC activity patterns of positive and negative experiences were partly shared across people.
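And the “partly shared across people” result corresponds to decoding across participants, for example leave-one-subject-out classification of positive versus negative trials. A rough sketch on simulated data (subject counts, trial counts and signal strength are all made up):

```python
# Sketch of across-participant decoding: train a classifier on positive vs.
# negative trials from all-but-one simulated participant, test on the held-out
# participant. Illustrative only; shapes and signal are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(2)
n_subjects, trials_per_subject, n_voxels = 16, 40, 300

labels = np.tile(np.repeat([0, 1], trials_per_subject // 2), n_subjects)  # 0 = negative, 1 = positive
shared_axis = rng.normal(size=n_voxels)                                   # valence code shared across people
X = rng.normal(size=(n_subjects * trials_per_subject, n_voxels)) + 0.5 * np.outer(labels, shared_axis)
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

# Leave-one-subject-out: every fold tests generalization to a new person.
scores = cross_val_score(LogisticRegression(max_iter=1000), X, labels,
                         groups=groups, cv=LeaveOneGroupOut())
print(f"across-participant accuracy: {scores.mean():.2f} (chance = 0.50)")
```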

“Despite how personal our feelings feel, the evidence suggests our brains use a standard code to speak the same emotional language,” Anderson concludes.


Story Source:
The above story is based on materials provided by Cornell University. The original article was written by Melissa Osgood. Note: Materials may be edited for content and length.

Journal Reference:
Junichi Chikazoe, Daniel H. Lee, Nikolaus Kriegeskorte & Adam K. Anderson (2014). Population coding of affect across stimuli, modalities and individuals. Nature Neuroscience, published online 22 June 2014. DOI: 10.1038/nn.3749
* * * * *

Here is the abstract from the full article, which is sequestered safely behind a paywall, although the article can be yours for the low, low rate of $32.

Population coding of affect across stimuli, modalities and individuals

Junichi Chikazoe, Daniel H Lee, Nikolaus Kriegeskorte & Adam K Anderson

Nature Neuroscience (2014). doi:10.1038/nn.3749 
Received 19 January 2014, Accepted 23 May 2014, Published online 22 June 2014
Abstract

It remains unclear how the brain represents external objective sensory events alongside our internal subjective impressions of them—affect. Representational mapping of population activity evoked by complex scenes and basic tastes in humans revealed a neural code supporting a continuous axis of pleasant-to-unpleasant valence. This valence code was distinct from low-level physical and high-level object properties. Although ventral temporal and anterior insular cortices supported valence codes specific to vision and taste, both the medial and lateral orbitofrontal cortices (OFC) maintained a valence code independent of sensory origin. Furthermore, only the OFC code could classify experienced affect across participants. The entire valence spectrum was represented as a collective pattern in regional neural activity as sensory-specific and abstract codes, whereby the subjective quality of affect can be objectively quantified across stimuli, modalities and people.
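The “representational mapping” in the abstract reads like the representational-similarity style of analysis associated with co-author Nikolaus Kriegeskorte: compare a neural representational dissimilarity matrix (RDM), built from a region’s response patterns, with a model RDM built purely from valence ratings. Here is a simulated sketch of that comparison (my illustration, not the paper’s actual analysis):

```python
# Representational-similarity-style sketch on simulated data: correlate a neural
# RDM (pairwise pattern dissimilarities) with a valence-model RDM (pairwise
# differences in rated valence). Not the paper's code; a planted signal is used.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n_stimuli, n_voxels = 40, 200
valence = rng.uniform(-1, 1, n_stimuli)                        # one rating per stimulus
patterns = np.outer(valence, rng.normal(size=n_voxels)) \
           + rng.normal(size=(n_stimuli, n_voxels))            # region's response patterns

neural_rdm = pdist(patterns, metric="correlation")             # pairwise pattern dissimilarity
valence_rdm = pdist(valence[:, None], metric="euclidean")      # |valence_i - valence_j|

rho, p = spearmanr(neural_rdm, valence_rdm)
print(f"neural RDM vs. valence-model RDM: rho = {rho:.2f}, p = {p:.3g}")
```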
