Thursday, January 30, 2014

Transhumanism to the Forefront and Digital Technology Can't Save the World (Omnivore)

From Bookforum's Omnivore blog, two collections of technology-related links - one on transhumanism and one on digital technology.

Transhumanism to the forefront

Jan 29 2014
9:00AM

* * * * *

Digital technology can't save the world

Jan 24 2014
3:00PM
  • A. T. Kingsmith (York): Virtual Roadblocks: The Securitisation of the Information Superhighway.
  • Archon Fung and Hollie Russon-Gilman (Harvard) and Jennifer Shkabatur (IDC): Six Models for the Internet and Politics.
  • Five key questions — and answers — about how our social horizons may shrink as we use more technology: Henry Farrell interviews Ethan Zuckerman, author of Rewire: Digital Cosmopolitans in the Age of Connection
  • Have scientists found a way to pop the filter bubble? They say the key to exposing us to opposing views is to get them from people with whom we share other interests. 
  • Scroll down your Facebook feed and see if you don’t find one ditto after another: So many people with “good” or “bad politics”, delivered with conviction to rage or applause; so little doubt, error, falsifiability — surely the criteria by which anything true, or democratic, could ever be found. 
  • The NSA disclosures have destroyed the utopia of the internet as a medium of freedom and democracy; instead, it is becoming increasingly apparent that the internet is ruled by big companies and secret services — according to Evgeny Morozov, a reevaluation of the medium is necessary. 
  • Michael Meyer on Evgeny vs. the internet: Evgeny Morozov wants to convince us that digital technology can’t save the world, and he’s willing to burn every bridge from Cambridge to Silicon Valley to do it (and more by Joshua Cohen).

Wednesday, January 29, 2014

The Current State of Evolutionary Theory (Omnivore)


From Bookforum's Omnivore blog, this collection of links looks at current thinking in the world of evolutionary theory.


One addition from today's news: 300,000-Year-Old Hearth Uncovered in Israel, via Live Science
It's not entirely clear who was cooking at Qesem Cave. A study published about three years ago in the American Journal of Physical Anthropology described teeth found in the cave dating to between 400,000 and 200,000 years ago. The authors speculated the teeth might have belonged to modern humans (Homo sapiens), Neanderthals or perhaps a different species, though they noted they couldn't draw a solid conclusion from their evidence.
Cool . . . .


The current state of evolutionary theory

Jan 27 2014
9:00AM

Fish Oil Increases Brain Size


We have known for years that fish oil can ameliorate depression (fish oil higher in EPA than DHA seems to be more effective) and ADHD, and that it offers preventive benefits against neurodegenerative disorders.

We even have evidence that high-dose fish oil can help heal severe brain trauma.

There is now new evidence that higher levels of fish oil in the blood are related to larger brain size over a period of several years.

Can Fish Oil Help Preserve Brain Cells?
Posted on January 22, 2014

MINNEAPOLIS – People with higher levels of the omega-3 fatty acids found in fish oil may also have larger brain volumes in old age, equivalent to preserving one to two years of brain health, according to a study published in the January 22, 2014, online issue of Neurology®, the medical journal of the American Academy of Neurology. Shrinking brain volume is a sign of Alzheimer's disease as well as normal aging.

For the study, the levels of omega-3 fatty acids EPA+DHA in red blood cells were tested in 1,111 women who were part of the Women's Health Initiative Memory Study. Eight years later, when the women were an average age of 78, MRI scans were taken to measure their brain volume.

Those with higher levels of omega-3s had larger total brain volumes eight years later. Those with twice as high levels of fatty acids (7.5 vs. 3.4 percent) had a 0.7 percent larger brain volume.

"These higher levels of fatty acids can be achieved through diet and the use of supplements, and the results suggest that the effect on brain volume is the equivalent of delaying the normal loss of brain cells that comes with aging by one to two years," said study author James V. Pottala, PhD, of the University of South Dakota in Sioux Falls and Health Diagnostic Laboratory, Inc., in Richmond, Va.

Those with higher levels of omega-3s also had a 2.7 percent larger volume in the hippocampus area of the brain, which plays an important role in memory. In Alzheimer's disease, the hippocampus begins to atrophy even before symptoms appear.

Source: American Academy of Neurology
 

Speech Uses Both Sides of Brain


Folk wisdom has it that language is a left-brain function; however, new research using brain imaging shows that speech is a bilateral function.

Full Citation:
Gregory B. Cogan, Thomas Thesen, Chad Carlson, Werner Doyle, Orrin Devinsky, and Bijan Pesaran. (2014, Jan 15). Sensory-motor transformations for speech occur bilaterally. Nature, online first. DOI:10.1038/nature12935

Speech uses both sides of brain

Friday 17 January 2014

Many scientists believe we only use one side of our brain for speech and language. Now, a new study from the US shows that as far as speech is concerned, we use both sides.

The study poses a significant challenge to current thinking about brain activity and could have important implications for developing treatment and rehabilitation to help people recover speech after stroke or injury, say the researchers from New York University (NYU) and NYU Langone Medical Center.

Senior author Bijan Pesaran, an associate professor in the Center for Neural Science at NYU, says:
"Our findings upend what has been universally accepted in the scientific community - that we use only one side of our brains for speech. In addition, now that we have a firmer understanding of how speech is generated, our work toward finding remedies for speech afflictions is much better informed."
Speech rarely studied separately from language

As well as listening and speaking, speech involves language for constructing and understanding sentences. Thus most studies conclude that speech, like language, happens on one side of the brain. Moreover, these studies rely on indirect measurements of brain activity, the researchers explain.

For their study, they took a different approach and examined the link between speech and brain processes directly.

The data for the study came from a group of patients being treated for epilepsy. They had electrodes implanted directly inside and on the surface of their brains as they performed sensory and thinking tasks.

Co-author Thomas Thesen, director of the NYU ECog Center where the data was collected, and an assistant professor at NYU Langone, says:
"Recordings directly from the human brain are a rare opportunity. As such, they offer unparalleled spatial and temporal resolution over other imaging technologies to help us achieve a better understanding of complex and uniquely human brain functions, such as language."
To record what happens in the brain during speech alone that is separate from language, the patients were asked to repeat "non-words," like "kig" and "pob."

Study shows speech is a bilateral brain process

The recordings showed that both sides of the brain were involved during speech - suggesting that speech is a "bilateral" brain process, as the researchers conclude in their study report:

"Using a non-word transformation task, we show that bilateral sensory-motor responses can perform transformations between speech-perception- and speech-production-based representations. These results establish a bilateral sublexical speech sensory-motor system."

Prof. Pesaran adds:
"Now that we have greater insights into the connection between the brain and speech, we can begin to develop new ways to aid those trying to regain the ability to speak after a stroke or injuries resulting in brain damage. With this greater understanding of the speech process, we can retool rehabilitation methods in ways that isolate speech recovery and that don't involve language."
Meanwhile, US researchers - who discovered that how accurately we respond to a beat is tied to how effectively our brains respond to speech - suggest in The Journal of Neuroscience that musical training could improve our brains’ response to language.

Written by Catharine Paddock PhD
Here is the abstract from Nature:

Sensory–motor transformations for speech occur bilaterally

Gregory B. Cogan, Thomas Thesen, Chad Carlson, Werner Doyle, Orrin Devinsky & Bijan Pesaran 
Affiliations
Contributions

Historically, the study of speech processing has emphasized a strong link between auditory perceptual input and motor production output. A kind of ‘parity’ is essential, as both perception- and production-based representations must form a unified interface to facilitate access to higher-order language processes such as syntax and semantics, believed to be computed in the dominant, typically left hemisphere. Although various theories have been proposed to unite perception and production, the underlying neural mechanisms are unclear. Early models of speech and language processing proposed that perceptual processing occurred in the left posterior superior temporal gyrus (Wernicke’s area) and motor production processes occurred in the left inferior frontal gyrus (Broca’s area). Sensory activity was proposed to link to production activity through connecting fibre tracts, forming the left lateralized speech sensory–motor system. Although recent evidence indicates that speech perception occurs bilaterally, prevailing models maintain that the speech sensory–motor system is left lateralized and facilitates the transformation from sensory-based auditory representations to motor-based production representations. However, evidence for the lateralized computation of sensory–motor speech transformations is indirect and primarily comes from stroke patients that have speech repetition deficits (conduction aphasia) and studies using covert speech and haemodynamic functional imaging. Whether the speech sensory–motor system is lateralized, like higher-order language processes, or bilateral, like speech perception, is controversial. Here we use direct neural recordings in subjects performing sensory–motor tasks involving overt speech production to show that sensory–motor transformations occur bilaterally. 
We demonstrate that electrodes over bilateral inferior frontal, inferior parietal, superior temporal, premotor and somatosensory cortices exhibit robust sensory–motor neural responses during both perception and production in an overt word-repetition task. Using a non-word transformation task, we show that bilateral sensory–motor responses can perform transformations between speech-perception- and speech-production-based representations. These results establish a bilateral sublexical speech sensory–motor system.

Tuesday, January 28, 2014

Facts You Should Know about Global Economic Inequality (Omnivore)

From Bookforum's Omnivore blog, this collection of links looks at global economic inequality - it ain't pretty: for example, seven dozen rich people have as much combined wealth as 3.5 billion poor people. And it's only getting worse each year.


Facts you should know about global economic inequality

Jan 27 2014
3:00PM

http://static5.businessinsider.com/image/5255265ceab8eab55d00e45c/this-pyramid-shows-how-all-the-worlds-wealth-is-distributed-and-the-gigantic-gap-between-rich-and-poor.jpg

Dr. Temple Grandin: Authors At Google


Dr. Temple Grandin comes to Google to talk about her book: The Autistic Brain: Thinking Across the Spectrum.

Dr. Temple Grandin: Authors At Google

Published on Jan 18, 2014


When Temple Grandin was born in 1947, autism had only just been named. Today it is more prevalent than ever, with one in 88 children diagnosed on the spectrum. And our thinking about it has undergone a transformation in her lifetime: Autism studies have moved from the realm of psychology to neurology and genetics, and there is far more hope today than ever before thanks to groundbreaking new research into causes and treatments. Now Temple Grandin reports from the forefront of autism science, bringing her singular perspective to a thrilling journey into the heart of the autism revolution.

Weaving her own experience with remarkable new discoveries, Grandin introduces the neuroimaging advances and genetic research that link brain science to behavior, even sharing her own brain scan to show us which anomalies might explain common symptoms. We meet the scientists and self-advocates who are exploring innovative theories of what causes autism and how we can diagnose and best treat it. Grandin also highlights long-ignored sensory problems and the transformative effects we can have by treating autism symptom by symptom, rather than with an umbrella diagnosis. Most exciting, she argues that raising and educating kids on the spectrum isn't just a matter of focusing on their weaknesses; in the science that reveals their long-overlooked strengths she shows us new ways to foster their unique contributions.

From the "aspies" in Silicon Valley to the five-year-old without language, Grandin understands the true meaning of the word spectrum. The Autistic Brain is essential reading from the most respected and beloved voices in the field.

Melanie Mitchell - How Can the Study of Complexity Transform Our Understanding of the World?

From Big Questions Online, Melanie Mitchell (Professor of Computer Science at Portland State University, and External Professor and Member of the Science Board at the Santa Fe Institute) offers a nice and very accessible overview of how the study of complex systems can help us better make sense of our world.

How Can the Study of Complexity Transform Our Understanding of the World?


By Melanie Mitchell
January 20, 2014


image: Stephen Hopkins

In 1894, the physicist and Nobel laureate Albert Michelson declared that science was almost finished; the human race was within a hair’s breadth of understanding everything:

It seems probable that most of the grand underlying principles have now been firmly established and that further advances are to be sought chiefly in the rigorous application of these principles to all the phenomena which come under our notice.

Bold and heady predictions like this often seem destined to topple, and, to be sure, the world of physics was soon shaken by the revolutions of relativity and quantum mechanics.

But as the 20th century unfolded, it turned out to be the phenomena closest to our own human scale— biology, social science, economics, politics, among others—that have most notably eluded explanation by any grand principles. The deeper we dig into the workings of ourselves and our society, the more unexpected complexity we find. Fittingly, it was in the 20th century that science began to bridge disciplinary boundaries in order to search for principles of complexity itself.

What is Complexity?

The “study of complexity” refers to the attempt to find common principles underlying the behavior of complex systems—systems in which large collections of components interact in nonlinear ways. Here, the term nonlinear implies that the system can’t be understood simply by understanding its individual components; nonlinear interactions cause the whole to be “more than the sum of its parts.”

Complex systems scientists try to understand how such collective sophistication can come about, whether it be in ant colonies, cells, brains, immune systems, social groups, or economic markets. People who study complexity are intrigued by the suggestive similarities among these disparate systems. All these systems exhibit self-organization: the system’s components organize themselves to act as a coherent whole without the benefit of any central or outside “controller”. Complex systems are able to encode and process information with a sophistication that is not available to the individual components. Complex systems evolve—they are continually changing in an open-ended way, and they learn and adapt over time. Such systems defy precise prediction, and resist the kind of equilibrium that would make them easier for scientists to understand.

Transforming Our Understanding

Of course all important scientific discoveries transform our understanding of nature, but I think that the study of complexity goes a step further: it not only helps us understand important phenomena, but changes our perspective on how to think about nature, and about science itself.

Here are a few examples of the surprising, perspective-changing discoveries of Complex Systems science. (If these don’t seem so surprising to you, it is because your perspective has already been changed by the sciences of complexity!)

Simple rules can yield complex, unpredictable behavior


Why can’t we seem to forecast the weather farther out than a week or so? Why is it so hard to project yearly variation in fishery populations? Why can’t we foresee stock market bubbles and crashes? In the past it was widely assumed that such phenomena are hard to predict because the underlying processes are highly complex, and that random factors must play a key role. However, Complex Systems science—especially the study of dynamics and chaos—has shown that complex behavior and unpredictability can arise in a system even if the underlying rules are extremely simple and completely deterministic. Often, the key to complexity is the iteration over time of simple, though nonlinear, interaction rules among the system’s components. It’s still not clear if unpredictability in the weather, stock market, and animal populations is caused by such iteration alone, but the study of chaos has shown that it’s possible.
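Mitchell's point about simple deterministic rules is easy to see in miniature. The sketch below (my illustration, not from the essay; parameters are arbitrary) iterates the logistic map, a one-line deterministic rule, from two starting points that differ by one part in ten million:

```python
# The logistic map x -> r * x * (1 - x) is a textbook example of chaos:
# a single deterministic arithmetic rule, yet for r = 4 its long-run
# behavior is effectively unpredictable.

def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.2)
b = logistic_trajectory(0.2000001)  # shifted by one part in ten million

# At first the two runs agree to several decimal places, but the tiny
# gap roughly doubles every step, so within a few dozen iterations the
# trajectories share nothing.
divergence = max(abs(x - y) for x, y in zip(a, b))
print(divergence)
```

This sensitive dependence on initial conditions, not randomness in the rule itself, is what limits weather forecasts to a week or so.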

More is Different

Above I reiterated the old saw, “the whole is more than the sum of its parts”. The physicist Phil Anderson coined a better aphorism: he noted that a key lesson of complexity is that “more is different”.

Ant colonies are a great example of this. As the ecologist Nigel Franks puts it, “The solitary army ant is behaviorally one of the least sophisticated animals imaginable...If 100 army ants are placed on a flat surface, they will walk around and around in never decreasing circles until they die of exhaustion.” Yet put half a million of them together and the group as a whole behaves as a hard-to-predict “superorganism” with sophisticated, and sometimes frightening, “collective intelligence”. More is different.

Similar stories can be told for neurons in the brain, cells in the immune system, creativity and social movements in cities, and agents in market economies. The study of complexity has shown that when a system’s components have the right kind of interactions, its global behavior—the system’s capacity to process information, to make decisions, to evolve and learn—can be powerfully different from that of its individual components.

Network Thinking

In the early 2000s, the complete human genome was sequenced. While the benefits to science were enormous, some of the predictions made by prominent scientists and others had a Michelsonian flavor (see first paragraph). President Clinton echoed the widely held view that the Human Genome Project would “revolutionize the diagnosis, prevention and treatment of most, if not all, human diseases.” Indeed, many scientists believed that a complete mapping of human genes would provide a nearly complete understanding of how genetics worked, which genes were responsible for which traits, and this would guide the way for revolutionary medical discoveries and targeted gene therapies.

Now, more than a decade later, these predicted medical revolutions have not yet materialized. But the Human Genome Project, and the huge progress in genetics research that followed, did uncover some unexpected results. First, human genes (DNA sequences that code for proteins) number around 21,000—much fewer than anyone thought, and about the same number as in mice, worms, and mustard plants. Second, these protein-coding genes make up only about 2% of our DNA. Two mysteries emerge: If we humans have comparatively so few genes, where does our complexity come from? And as for that 98% of non-gene DNA, which in the past was dismissively called "junk DNA", what is its function?

What geneticists have learned is that genetic elements in a cell, like ants in a colony, interact nonlinearly so as to create intricate information-processing networks. It is the networks, rather than the individual genes, that shape the organism. Moreover, and most surprising: the so-called “junk” DNA is key to forming these networks. As biologist John Mattick puts it, “The irony...is that what was dismissed as junk because it wasn’t understood will turn out to hold the secret of human complexity.”

Information-processing networks are emerging as a core organizing principle of biology. What used to be called “cellular signaling pathways” are now “cellular information processing networks.” New research on cancer treatments is focused not on individual genes but on disrupting the cellular information processing networks that many cancers exploit. Some types of bacteria are now known to communicate via “quorum sensing” networks in order to collectively attack a host; this discovery is also driving research into network-specific treatment of infections.

Over the last two decades an interdisciplinary science of networks has emerged, and has developed insights and research methods that apply to networks ranging from genetics to economics. Network thinking is the area of complex systems that has perhaps done the most to transform our understanding of the world.
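One way to get a feel for network thinking is to grow a network in code. The toy model below (my illustration, not from the essay; a bare-bones version of preferential attachment in the style of Barabási and Albert) adds nodes one at a time, each linking to an existing node with probability proportional to that node's degree. Hubs emerge on their own, with no central controller:

```python
import random

# Minimal preferential-attachment sketch: each new node links to one
# existing node chosen in proportion to its degree ("the rich get
# richer"). The `targets` list holds node i exactly degrees[i] times,
# so a uniform draw from it is a degree-proportional draw.

def preferential_attachment(n, seed=42):
    rng = random.Random(seed)
    degrees = [1, 1]          # start with two nodes joined by one edge
    targets = [0, 1]
    for new in range(2, n):
        old = rng.choice(targets)   # degree-proportional choice
        degrees.append(1)           # the new node arrives with one link
        degrees[old] += 1
        targets.extend([new, old])
    return degrees

deg = preferential_attachment(5000)
print(max(deg))              # the biggest hub is far above average...
print(sum(deg) / len(deg))   # ...while the mean stays near 2 links/node
```

The degree distribution this produces is heavy-tailed: a few hubs accumulate a large share of the links while most nodes keep only one or two, which is the signature seen in many real genetic, social, and technological networks.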

Non-Normal is the New Normal

In 2009, Nobel Prize-winning economist Paul Krugman said, “Few economists saw our current crisis coming, but this predictive failure was the least of the field’s problems. More important was the profession’s blindness to the very possibility of catastrophic failures in a market economy.” At least part of this “blindness” was due to the reliance on risk models based on so-called normal distributions.

Figure 1: (a) A hypothetical normal distribution of the probability of financial gain or loss under trading. (b) A hypothetical long-tailed distribution, showing only the loss side. The “tail” of the distribution is the far right-hand side. The long-tailed distribution predicts a considerably higher probability of catastrophic loss than the normal distribution.
The term normal distribution refers to the familiar bell curve. Economists and finance professionals often use such distributions to model the probability of gains and risk of losses from investments. Figure 1(a) shows a hypothetical normal distribution of risk. I’ve marked a hypothetical “catastrophic loss” on the graph. You can see that, given this distribution of risk, the probability of such a loss would be very near zero. Less probable, maybe, than a lightning strike right where you’re standing. Something you don’t have to worry about. Unless the model is wrong.

The study of complexity has shown that in nonlinear, highly networked systems, a more accurate estimation of risk would be a so-called “long-tailed” distribution. Figure 1(b) shows a hypothetical long-tailed distribution of risk (here, only the “loss” side is shown). The longer non-zero “tail” (far right-hand side) of this distribution shows that the probability of a catastrophic loss is significantly higher than for a system obeying a normal distribution. If risk models in 2008 had employed long-tailed rather than normal distributions, the possibility of an “extreme event”—here, “catastrophic loss”—would have been judged more likely.

Because long-tailed distributions are now known to be signatures of complex networks, our growing understanding of such networks implies that risk models need to be rethought in many areas, ranging from disease epidemics to power grid failures; from financial crises to ecosystem collapses. The technologist Andreas Antonopoulos puts it succinctly: “The threat is complexity itself”.
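The gap between the two risk models can be made concrete with a small numeric sketch (mine, with illustrative parameters, not taken from the essay). It compares the probability of a loss more than five standard-deviation-equivalents out under a normal model versus a simple power-law (Pareto) model:

```python
import math

def normal_tail(z):
    """P(X > z) for a standard normal variable."""
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def pareto_tail(x, alpha=2.0, xmin=1.0):
    """P(X > x) for a Pareto (power-law) variable with exponent alpha."""
    return (xmin / x) ** alpha if x > xmin else 1.0

z = 5.0
# Under the normal model a 5-sigma loss is a "never happens" event
# (probability on the order of 1e-7); under this long-tailed model the
# same-sized loss is a routine 1-in-25 event.
print(normal_tail(z))
print(pareto_tail(z))   # (1/5)**2 = 0.04
```

The tail exponent and scale here are made up for illustration, but the qualitative lesson is robust: a model with a long tail assigns the "catastrophic loss" region orders of magnitude more probability than the bell curve does.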

Is Complexity a New Science?

“The new science of complexity” has become a catchphrase in some circles. Google reports nearly 87,000 hits on this phrase. But how “new” is the study of complexity? And to what extent is it actually a “science”?

The current scientific efforts centered around complexity have several antecedents. The Cybernetics movement of the 1940s and 50s, the General System Theory movement of the 1960s, and the more recent advent of Systems Biology, Systems Engineering, Systems Science, etc., all share goals with Complex Systems science: finding general principles that explain how system-level behavior emerges from interactions among lower-level components. The different movements capture different (though sometimes overlapping) communities and different foci of attention.

To my mind, Complexity refers not to a single science but rather to a community of scientists in different disciplines who share interdisciplinary interests, methodologies, and a mindset about how to address scientific problems. Just what this mindset consists of is hard to pin down. I would say it includes, first, the assumption that understanding complexity will require integrating concepts from dynamics, information, statistical physics, and evolution. And second, that computer modeling is an essential addition to traditional scientific theory and experimentation. As yet, Complexity is not a single unified science; rather, to paraphrase William James, it is still “the hope of a science”. I believe that this hope has great promise.

In our era of Big Data, what Complexity potentially offers is “Big Theory”—a scientific understanding of the complex processes that produce the data we are drowning in. If the field’s past contributions are any indication, Complexity’s sought-after big theory will even more profoundly transform our understanding of the world.

It’s something to look forward to. In the words of playwright Tom Stoppard: “It’s the best possible time to be alive, when almost everything you thought you knew is wrong.”

Discussion Questions


1. Can you identify any ways in which your own way of thinking has been changed by Complex Systems science?

2. The discussion above stated that when systems get too intricately networked, “the threat is complexity itself”. The network scientist Duncan Watts suggested that the notion “too big to fail” should be rethought as “too complex to exist.” Should we worry about the world becoming too complex? If so, what should we do about it?

3. To what extent do you think the ideas of complex systems are new? What would it take to create a unified science of complexity?

Resources and Further Reading:


http://complexityexplorer.org
  • Anderson, P. W. More is different. Science, 177 (4047), 1972, 393-396.
  • Bettencourt, L. M., Lobo, J., Helbing, D., Kühnert, C., & West, G. B. (2007). Growth, innovation, scaling, and the pace of life in cities. Proceedings of the National Academy of Sciences, 104(17), 7301-7306.
  • Franks, N. R. Army ants: A collective intelligence. American Scientist, 77(2), 1989, 138-145.
  • Hayden, E. C. Human genome at ten: Life is complicated. Nature, 464, 2010, 664-667.
  • Krugman, P. How did economists get it so wrong? New York Times, September 2, 2009.
  • Miller, J. H. and Page, S. E. Complex Adaptive Systems. Princeton University Press, 2007.
  • Mitchell, M. Complexity: A Guided Tour. Oxford University Press, 2009.
  • Newman, M. E. J. Networks: An Introduction. Oxford University Press, 2009.
  • Watts, D. Too complex to exist. Boston Globe, June 14, 2009.
  • West, G. Big data needs a big theory to go with it. Scientific American, May 15, 2013.

Monday, January 27, 2014

Carl Zimmer - Secrets of the Brain

The human brain is a three-pound wad of flesh able to explore the universe, imagine a better world, and ponder its own nature. Armed with far more sophisticated imaging techniques, scientists today are reaching toward an ultimate understanding of what makes us us.

Here is a very cool article from Carl Zimmer at National Geographic - the developments in brain imaging that are helping us gain better understanding of how the brain functions.


Secrets of the Brain


New technologies are shedding light on biology’s greatest unsolved mystery: how the brain really works.


Text by Carl Zimmer | February 2014
Photographs by Robert Clark

Brain Terrain: The human brain is a three-pound wad of flesh able to explore the universe, imagine a better world, and ponder its own nature. Armed with far more sophisticated imaging techniques, scientists today are reaching toward an ultimate understanding of what makes us us. (U.S. National Library of Medicine, Visible Human Project)

Van Wedeen strokes his half-gray beard and leans toward his computer screen, scrolling through a cascade of files. We’re sitting in a windowless library, surrounded by speckled boxes of old letters, curling issues of scientific journals, and an old slide projector that no one has gotten around to throwing out.

“It’ll take me a moment to locate your brain,” he says.

On a hard drive Wedeen has stored hundreds of brains—exquisitely detailed 3-D images from monkeys, rats, and humans, including me. Wedeen has offered to take me on a journey through my own head.

“We’ll hit all the tourist spots,” he promises, smiling.

This is my second trip to the Martinos Center for Biomedical Imaging, located in a former ship-rope factory on Boston Harbor. The first time, a few weeks ago, I offered myself as a neuroscientific guinea pig to Wedeen and his colleagues. In a scanning room I lay down on a slab, the back of my head resting in an open plastic box. A radiologist lowered a white plastic helmet over my face. I looked up at him through two eyeholes as he screwed the helmet tight, so that the 96 miniature antennas it contained would be close enough to my brain to pick up the radio waves it was about to emit. As the slab glided into the cylindrical maw of the scanner, I thought of The Man in the Iron Mask.

Mind Machine: An engineer wears a helmet of sensors at the Martinos Center for Biomedical Imaging—part of a brain scanner requiring almost as much power as a nuclear submarine. Antennas pick up signals produced when the scanner’s magnetic field excites water molecules in the brain. Computers convert this data into brain maps like the one below.

The magnets that now surrounded me began to rumble and beep. For an hour I lay still, eyes closed, and tried to keep myself calm with my own thoughts. It wasn’t easy. To squeeze as much resolution as possible out of the scanner, Wedeen and his colleagues had designed the device with barely enough room for a person of my build to fit inside. To tamp down the panic, I breathed smoothly and transported myself to places in my memory, at one point recalling how I had once walked my nine-year-old daughter to school through piles of blizzard snow.

As I lay there, I reflected on the fact that all of these thoughts and emotions were the creation of the three-pound loaf of flesh that was under scrutiny: my fear, carried by electrical impulses converging in an almond-shaped chunk of tissue in my brain called the amygdala, and the calming response to it, marshaled in regions of my frontal cortex. My memory of my walk with my daughter was coordinated by a seahorse-shaped fold of neurons called the hippocampus, which reactivated a vast web of links throughout my brain that had first fired when I had clambered over the snowbanks and formed those memories.

I was submitting to this procedure as part of my cross-country reporting to chronicle one of the great scientific revolutions of our times: the stunning advances in understanding the workings of the human brain. Some neuroscientists are zooming in on the fine structure of individual nerve cells, or neurons. Others are charting the biochemistry of the brain, surveying how our billions of neurons produce and employ thousands of different kinds of proteins. Still others, Wedeen among them, are creating in unprecedented detail representations of the brain’s wiring: the network of some 100,000 miles of nerve fibers, called white matter, that connects the various components of the mind, giving rise to everything we think, feel, and perceive. The U.S. government is throwing its weight behind this research through the Brain Research through Advancing Innovative Neurotechnologies (BRAIN) Initiative. In an announcement last spring President Barack Obama said that the large-scale project aimed to speed up the mapping of our neural circuitry, “giving scientists the tools they need to get a dynamic picture of the brain in action.”

 
Centuries of study have provided increasingly detailed understanding of human brain anatomy. Now scientists are turning their attention to the complex circuits that connect the brain’s many regions—some 100,000 miles of fibers called white matter, enough to circle the Earth four times. In this image taken at the Martinos Center, pink and orange bundles transmit signals critical for language. VAN WEDEEN AND L. L. WALD, MARTINOS CENTER FOR BIOMEDICAL IMAGING, HUMAN CONNECTOME PROJECT; BRAIN PREPARATION PERFORMED AT ALLEN INSTITUTE FOR BRAIN SCIENCE

As they see the brain in action, neuroscientists can also see its flaws. They are starting to identify differences between the structure of ordinary brains and the brains of people with disorders such as schizophrenia, autism, and Alzheimer’s disease. As they map the brain in greater detail, they may learn how to diagnose disorders by their effect on anatomy, and perhaps even understand how those disorders arise.

On my return trip to his lab, Wedeen finally locates the image from my session in the scanner. My brain appears on his screen. His technique, called diffusion spectrum imaging, translates radio signals given off by the white matter into a high-resolution atlas of that neurological Internet. His scanner maps bundles of nerve fibers that form hundreds of thousands of pathways carrying information from one part of my brain to another. Wedeen paints each path a rainbow of colors, so that my brain appears as an explosion of colorful fur, like a psychedelic Persian cat.

Wedeen focuses in on particular pathways, showing me some of the circuitry important to language and other kinds of thought. Then he pares away most of the pathways in my brain, so that I can more easily see how they’re organized. As he increases the magnification, something astonishing takes shape before me. In spite of the dizzying complexity of the circuits, they all intersect at right angles, like the lines on a sheet of graph paper.

“It’s all grids,” says Wedeen.

 
Anatomy of a Mystery: New technologies let scientists peer deep into the hidden structure of the brain. A high-resolution view of the image above reveals white matter fibers arranged in a mysterious grid structure, like longitude and latitude lines on a map. Van Wedeen and L. L. Wald, Martinos Center for Biomedical Imaging, Human Connectome Project

When Wedeen first unveiled the grid structure of the brain, in 2012, some scientists were skeptical, wondering if he’d uncovered only part of a much more tangled anatomy. But Wedeen is more convinced than ever that the pattern is meaningful. Wherever he looks—in the brains of humans, monkeys, rats—he finds the grid. He notes that the earliest nervous systems in Cambrian worms were simple grids—just a pair of nerve cords running from head to tail, with runglike links between them. In our own lineage the nerves at the head end exploded into billions but still retained that gridlike structure. It’s possible that our thoughts run like streetcars along these white matter tracks as signals travel from one region of the brain to another.

“There’s zero chance that there are not principles lurking in this,” says Wedeen, peering intently at the image of my brain. “We’re just not yet in a position to see the simplicity.”

Scientists are learning so much about the brain now that it’s easy to forget that for much of history we had no idea at all how it worked or even what it was. In the ancient world physicians believed that the brain was made of phlegm. Aristotle looked on it as a refrigerator, cooling off the fiery heart. From his time through the Renaissance, anatomists declared with great authority that our perceptions, emotions, reasoning, and actions were all the result of “animal spirits”—mysterious, unknowable vapors that swirled through cavities in our head and traveled through our bodies.

The scientific revolution in the 17th century began to change that. The British physician Thomas Willis recognized that the custardlike tissue of the brain was where our mental world existed. To understand how it worked, he dissected brains of sheep, dogs, and expired patients, producing the first accurate maps of the organ.

It would take another century for researchers to grasp that the brain is an electric organ. Instead of animal spirits, voltage spikes travel through it and out into the body’s nervous system. Still, even in the 19th century scientists knew little about the paths those spikes followed. The Italian physician Camillo Golgi argued that the brain was a seamless connected web. Building on Golgi’s research, the Spanish scientist Santiago Ramón y Cajal tested new ways of staining individual neurons to trace their tangled branches. Cajal recognized what Golgi did not: that each neuron is a distinct cell, separate from every other one. A neuron sends signals down tendrils known as axons. A tiny gap separates the ends of axons from the receiving ends of neurons, called dendrites. Scientists would later discover that axons dump a cocktail of chemicals into the gap to trigger a signal in the neighboring neuron.

 
Intimate View: Two hundred sections of a piece of mouse brain, each less than 1/1,000 the thickness of a human hair, are readied to be imaged by an electron microscope. Arranged in stacks, 10,000 such photomicrographs form a 3-D model no larger than a grain of salt (in tweezers). A human brain visualized at this level of detail would require an amount of data equal to all the written material in all the libraries of the world.

Jeff Lichtman, a neuroscientist, is the current Ramón y Cajal Professor of Arts and Sciences at Harvard, carrying Cajal’s project into the 21st century. Instead of making pen-and-ink drawings of neurons stained by hand, he and his colleagues are creating extremely detailed three-dimensional images of neurons, revealing every bump and stalk branching from them. By burrowing down to the fine structure of individual nerve cells, they may finally get answers to some of the most basic questions about the nature of the brain. Each neuron has on average 10,000 synapses. Is there some order to their connections to other neurons, or are they random? Do they prefer linking to one type of neuron over others?

To produce the images, Lichtman and his colleagues load pieces of preserved mouse brain into a neuroanatomical version of a deli meat slicer, which pares off layers of tissue, each less than a thousandth the thickness of a strand of human hair. The scientists use an electron microscope to take a picture of each cross section, then use a computer to order them into a stack. Slowly a three-dimensional image takes shape—one that the scientists can explore as if they were in a submarine traveling through an underwater kelp forest.

“Everything is revealed,” says Lichtman.

The only problem is the sheer enormity of “everything.” So far the largest volume of a mouse’s brain that Lichtman and his colleagues have managed to re-create is about the size of a grain of salt. Its data alone total a hundred terabytes, the amount of data in about 25,000 high-definition movies.

A Voyage Into the Brain: Thought, feeling, sense, action—all derive from unimaginably complex interactions among billions of nerve cells. A section of mouse brain no larger than a grain of salt serves as a window into this hidden world.

Go read the rest of the article.

Polly Young-Eisendrath - Embracing Our Imperfect Life

Polly Young-Eisendrath is a Buddhist and a Jungian analyst, as well as the author of many excellent books, including The Resilient Spirit: Transforming Suffering Into Insight And Renewal (1997), The Psychology of Mature Spirituality: Integrity, Wisdom, Transcendence (2000), Subject to Change: Jung, Gender and Subjectivity in Psychoanalysis (2004), and The Self-Esteem Trap: Raising Confident and Compassionate Kids in an Age of Self-Importance (2009).

This is from a group blog she is part of at Psychology Today - it was one of the "Best of the Blogs" selections for this past week.


Embracing Our Imperfect Life

Listening to Leonard Cohen and learning to relish defeat.

Published on January 19, 2014 by The Contemporary Psychoanalysis Group in Contemporary Psychoanalysis in Action
By Polly Young-Eisendrath, Ph.D.


In my car on a sunny Vermont winter morning, I am listening to Leonard Cohen sing his view of reality: a Zen teaching on how to relish defeat. He says you can’t be a hero in your own life and, more important, you can’t be happy until you know how thoroughly broken life itself is, everyone’s life, not just yours. He sounds sexy and ironic and wise. There really is no one else quite like Leonard.

I am also thinking about what a therapy patient said to me yesterday: that he’s spent hundreds of thousands of dollars on four different psychotherapies, which have extended over most of his adult years. He is a minister and he’s in psychoanalytic psychotherapy with me. ALL this therapy, he said, was “a f***ing hopeless cause.” He’s never had the relationship he’s really wanted or a job that expresses his creativity or the self-confidence he should have. I work hard to understand his feelings and I delight in this man as a human being (flaws and all) and admire his dedication to his work. I praise his strengths and am interested in his weaknesses. I don’t have a “fix it” attitude towards him or his life.

As I think about my patient, I recall a teaching I heard from a Tibetan Buddhist monk. He noted that North Americans rarely appreciate the enormous privileges they enjoy every day: we are free to talk and write about our ideas, feelings, and opinions. We are educated and encouraged to have our own points of view. By contrast, he said, 80 percent of Tibetans are illiterate. Those who live in China are not free to speak their native language or to express their feelings and opinions about many things, including their own Tibetan culture and religion. I found Rinpoche’s reflections on American culture fresh and interesting. He did not criticize American materialism or our “fast” style, but he noticed how much we take our individual rights for granted.

Just as I am having warm thoughts about Rinpoche, Leonard is singing:

I fought against the bottle,
But I had to do it drunk –
Took my diamond to the pawnshop –
But that don’t make it junk.

I know that I’m forgiven,
But I don’t know how I know
I don’t trust my inner feelings –
Inner feelings come and go.

Of course, as a Zen practitioner myself, I know about Leonard’s distrust of inner feelings and about the Buddha’s cautionary teachings on the importance of watching our feelings arise and pass away instead of becoming invested in a narrative about them. This skill of “simply experiencing” our feelings – instead of discharging or suppressing them – is something I have incorporated into my psychotherapeutic work. I sit with people as they watch their shame, envy, joy, sadness or anger arise and pass away. This leads to a new kind of freedom. But there is something more in what Leonard is singing. He’s not ashamed that he fought against the bottle while he was drunk.

Admittedly, Leonard is a scoundrel and a hard man to pin down. After I read his biography, I stopped romanticizing him as my perfect soul mate! More than a handful of beautiful, kind, talented women have tried and failed to have a long-term love relationship with him. He has fought against commitment. His struggle with depression is legendary. And yet, his conviction that there is no ideal way to live conveys a remarkable grace, strength and freedom. This perspective, that there is no ideal way to live, is often absent in my patients and in my psychoanalytic colleagues as well.

We psychoanalysts can still sound as though there is an ideal way to live. Look closely into the lives of many great masters of art, literature, and spiritual practice and you will find abundant tribulation. Human beings who are challenged, even at a young age, to make sense of a world that is deeply disappointing and frustrating can become resilient and insightful in a way that supports a lifetime of transformation. The importance of anxiety, suffering, and difficulty cannot be overestimated in helping us appreciate how little is under our control.

My therapy patient is relentlessly and bitterly disappointed, even though he has meaningful creative work, a grown daughter who is doing well, and lots of friends. He often explains his bitterness in terms of his parents’ self-absorption, a narrative he developed as a result of years of psychotherapy. Instead of observing his inner feelings just coming and going, he relies on a particular story about his life: he has low self-esteem and lacks confidence due to his parents’ shortcomings. He ignores the richness he has developed in his inner life as a result of his freedom to examine his thoughts, feelings, and life story – freedom he has engaged over years of psychotherapy.

What I adore in Leonard Cohen is that he is NOT advising us to grieve our losses and our flaws. He wants us to celebrate them. He wants us to develop a sense of humor about them and to see how they link us to one another. We are not gods. When you have to pawn your diamond because you run out of money, your diamond is not “a f***ing hopeless cause.”

What is most appealing about Leonard is not his accomplishments or perfections, but his poetic account of his peculiar failures and weaknesses. He teaches us, too, that our peculiarities can become our cherished particularities when we embrace our selves with lightness and friendship.

Polly Young-Eisendrath, Ph.D., is a psychologist, speaker, writer, mindfulness teacher and Jungian analyst who maintains a clinical and consulting practice in central Vermont. A practicing Buddhist since 1971, she is also chairperson of the non-profit "Enlightening Conversations: Buddhism and Psychoanalysis Meeting in Person" that hosts conferences in cities around the USA.

Dr. Young-Eisendrath will speak at The William Alanson White Institute in New York City on Wednesday, February 5, 8 PM. To register, click here.

Sunday, January 26, 2014

The Changing Face of Psychology

Interesting post from The Guardian (UK) on a few changes occurring in the world of academic/research psychology, most importantly a move toward more open access and more replication studies.

I do a lot of research in the worlds of cell biology, nutrition, and psychology/neuroscience. Of these three fields, psych/neuroscience is by far the least open access and open data. The publishers gouge researchers and universities for publication fees and then gouge libraries and individuals for access to the publication (with electronic versions costing as much as print versions, even though they often receive the material "camera ready" and have to do NO processing of the manuscripts). It's the only business I know of that charges producers and consumers for intellectual property it has not created. What a scam.

The changing face of psychology

After 50 years of stagnation in research practices, psychology is leading reforms that will benefit all life sciences


Posted by Chris Chambers
Friday 24 January 2014


Psychology is championing important changes to culture and practice, including a greater emphasis on transparency, reliability, and adherence to the scientific method. 
Photograph: Sebastian Kaulitzki/Alamy

In 1959, an American researcher named Ted Sterling reported something disturbing. Of 294 articles published across four major psychology journals, 286 had reported positive results – that is, a staggering 97% of published papers were underpinned by statistically significant effects. Where, he wondered, were all the negative results – the less exciting or less conclusive findings? Sterling labelled this publication bias a form of malpractice. After all, getting published in science should never depend on getting the “right results”.
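Sterling’s finding is easy to reproduce in miniature. The sketch below is my own illustration, not from the article: it simulates a literature in which only statistically significant results get published, and shows a side effect of that filter — the published studies systematically overstate the true effect.

```python
import random, math

random.seed(1)

def run_study(true_effect, n=20):
    """Simulate one study: n observations, two-tailed test at alpha = .05.

    For simplicity the population sd is treated as known (= 1), so the
    test statistic is a plain z score.
    """
    xs = [random.gauss(true_effect, 1.0) for _ in range(n)]
    mean = sum(xs) / n
    se = 1.0 / math.sqrt(n)
    return mean, abs(mean / se) > 1.96

# A literature of 1,000 studies of a genuinely weak effect (d = 0.2).
results = [run_study(0.2) for _ in range(1000)]
# Only "positive" (significant) results survive the publication filter.
published = [m for m, sig in results if sig]

all_mean = sum(m for m, _ in results) / len(results)
pub_mean = sum(published) / len(published)
print(f"true effect: 0.20, mean estimate over all studies: {all_mean:.2f}")
print(f"mean estimate over published studies only:         {pub_mean:.2f}")
```

With small samples, only the studies that happened to overshoot the true effect clear the significance bar, so the published average lands far above 0.2 even though the full set of studies is unbiased.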

You might think that Sterling’s discovery would have led the psychologists of 1959 to sit up and take notice. Groups would be assembled to combat the problem, ensuring that the scientific record reflected a balanced sum of the evidence. Journal policies would be changed, incentives realigned.

Sadly, that never happened. Thirty-six years later, in 1995, Sterling took another look at the literature and found exactly the same problem – negative results were still being censored. Fifteen years after that, Daniele Fanelli from the University of Edinburgh confirmed it yet again. Publication bias had turned out to be the ultimate bad car smell, a prime example of how irrational research practices can linger on and on.

Now, finally, the tide is turning. A growing number of psychologists – particularly the younger generation – are fed up with results that don’t replicate, journals that value story-telling over truth, and an academic culture in which researchers treat data as their personal property. Psychologists are realising that major scientific advances will require us to stamp out malpractice, face our own weaknesses, and overcome the ego-driven ideals that maintain the status quo.

Here are five key developments to watch in 2014.

1. Replication


The problem: The best evidence for a genuine discovery is showing that independent scientists can replicate it using the same method. If it replicates repeatedly then we can use it to build better theories. If it doesn't then it belongs in the trash bin of history. This simple logic underpins all science – without replication we’d still believe in phlogiston and faster-than-light neutrinos.

In psychology, attempts to closely reproduce previous methods are rarely made. Psychologists tend to see such work as boring, lacking in intellectual prowess, and a waste of limited resources. Some of the most prominent psychology journals even have explicit policies against publishing replications, instead offering readers a diet of fast food: results that are novel, eye-catching, and even counter-intuitive. Exciting results are fine provided they replicate. The problem is that nobody bothers to try, which litters the field with results of unknown (and likely low) value.

How it’s changing: The new generation of psychologists understands that independent replication is crucial for real advancement and to earn wider credibility in science. A beautiful example of this drive is the Many Labs project led by Brian Nosek from the University of Virginia. Nosek and a team of 50 colleagues located in 36 labs worldwide sought to replicate 13 key findings in psychology, across a sample of 6,344 participants. Ten of the effects replicated successfully.

Journals are also beginning to respect the importance of replication. The prominent outlet Perspectives on Psychological Science recently launched an initiative that specifically publishes direct replications of previous studies. Meanwhile, journals such as BMC Psychology and PLOS ONE officially disown the requirement for researchers to report novel, positive findings.

2. Open access


The problem: Strictly speaking, most psychology research isn’t really “published” – it is printed within journals that expressly deny access to the public (unless you are willing to pay for a personal subscription or spend £30+ on a single article). Some might say this is no different to traditional book publishing, so what's the problem? But remember that the public being denied access to science is the very same public that already funds most psychology research, including the subscription fees for universities. So why, you might ask, is taxpayer-funded research invisible to the taxpayers that funded it? The answer is complicated enough to fill a 140-page government report, but the short version is that the government places the business interests of corporate publishers ahead of the public interest in accessing science.

How it’s changing: The open access movement is growing in size and influence. Since April 2013, all research funded by UK research councils, including psychology, must now be fully open access – freely viewable to the public. Charities such as the Wellcome Trust have similar policies. These moves help alleviate the symptoms of closed access but don’t address the root cause, which is market dominance by traditional subscription publishers. Rather than requiring journals to make articles publicly available, the research councils and charities are merely subsidising those publishers, in some cases paying them extra for open access on top of their existing subscription fees. What other business in society is paid twice for a product that it didn’t produce in the first place? It remains a mystery who, other than the publishers themselves, would call this bizarre set of circumstances a “solution”.

3. Open science


The problem: Data sharing is crucial for science but rare in psychology. Even though ethical guidelines require authors to share data when requested, such requests are usually ignored or denied, even when they come from other psychologists. Failing to share data publicly makes it harder to do meta-analysis and easier for unscrupulous researchers to get away with fraud. The most serious fraud cases, such as that of Diederik Stapel, would have been caught years earlier if journals had required the raw data to be published alongside research articles.

How it’s changing: Data sharing isn’t yet mandatory, but it is gradually becoming unacceptable for psychologists not to share. Evidence shows that studies which share data tend to be more accurate and less likely to make statistical errors. Public repositories such as Figshare and the Open Science Framework now make the act of sharing easy, and new journals including the Journal of Open Psychology Data have been launched specifically to provide authors with a way of publicising data sharing.

Some existing journals are also introducing rewards to encourage data sharing. Beginning in 2014, authors who share data at the journal Psychological Science earn an Open Data badge, printed at the top of the article. Coordinated data sharing carries all kinds of other benefits too – for instance, it allows future researchers to run meta-analyses on huge volumes of existing data, answering questions that simply can’t be tackled with smaller datasets.

4. Bigger data


The problem: We’ve known for decades that psychology research is statistically underpowered. What this means is that even when genuine phenomena exist, most experiments don’t have sufficiently large samples to detect them. The curse of low power cuts both ways: not only is an underpowered experiment likely to miss finding water in the desert, it’s also more likely to lead us to a mirage.
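As a rough illustration of what “underpowered” means (my own sketch, not from the article), one can simulate how often a study of a given size detects an effect that is genuinely there:

```python
import random, math

random.seed(2)

def significant(true_effect, n):
    """One simulated study; two-tailed test at alpha = .05, sd known = 1."""
    mean = sum(random.gauss(true_effect, 1.0) for _ in range(n)) / n
    return abs(mean) / (1.0 / math.sqrt(n)) > 1.96

def power(true_effect, n, trials=2000):
    """Estimated power: the fraction of simulated studies that detect the effect."""
    return sum(significant(true_effect, n) for _ in range(trials)) / trials

# A medium-sized effect (d = 0.5) with a typical small sample vs a large one:
for n in (20, 200):
    print(f"n = {n:3d}: power is roughly {power(0.5, n):.2f}")
```

At n = 20 the simulated study misses a real medium-sized effect a large share of the time; at n = 200 it almost never does — which is exactly why bigger samples matter.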

How it’s changing: Psychologists are beginning to develop innovative ways to acquire larger samples. An exciting approach is Internet testing, which enables easy data collection from thousands of participants. One recent study managed to replicate 10 major effects in psychology using Amazon’s Mechanical Turk. Psychologists are also starting to work alongside organisations that already collect large amounts of useful data (and no, I don’t mean GCHQ). A great example is collaborative research with online gaming companies. Tom Stafford from the University of Sheffield recently published an extraordinary study of learning patterns in over 850,000 people by working with a game developer.

5. Limiting researcher “degrees of freedom”


The problem: In psychology, discoveries tend to be statistical. This means that to test a particular hypothesis, say, about motor actions, we might measure the difference in reaction times or response accuracy between two experimental conditions. Because the measurements contain noise (or “unexplained variability”), we rely on statistical tests to provide us with a level of certainty in the outcome. This is different to other sciences where discoveries are more black and white, like finding a new rock layer or observing a supernova.

Whenever experiments rely on inferences from statistics, researchers can exploit “degrees of freedom” in the analyses to produce desirable outcomes. This might involve trying different ways of removing statistical outliers or the effect of different statistical models, and then only reporting the approach that “worked” best in producing attractive results. Just as buying all the tickets in a raffle guarantees a win, exploiting researcher degrees of freedom can guarantee a false discovery.

The reason we fall into this trap is because of incentives and human nature. As Sterling showed in 1959, psychology journals select which studies to publish not based on the methods but on the results: getting published in the most prominent, career-making journals requires researchers to obtain novel, positive, statistically significant effects. And because statistical significance is an arbitrary threshold (p<.05), researchers have every incentive to tweak their analyses until the results cross the line. These behaviours are common in psychology – a recent survey led by Leslie John from Harvard University estimated that at least 60% of psychologists selectively report analyses that “work”. In many cases such behaviour may even be unconscious.
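The raffle analogy can be made concrete. In this sketch (my own, not from the survey), every dataset is pure noise, yet letting the analyst pick the best of four superficially reasonable outlier rules noticeably inflates the false positive rate compared with one pre-specified test:

```python
import random, math

random.seed(3)

def significant(xs):
    """Two-tailed test of mean = 0 at roughly alpha = .05 (z approximation)."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / (n - 1)
    return abs(m) / math.sqrt(var / n) > 1.96

def best_of_four(xs):
    """A flexible analyst's workflow: try four outlier rules, report if ANY works."""
    variants = [
        xs,                               # no trimming
        [x for x in xs if abs(x) < 2.0],  # drop "outliers" beyond 2 sd
        [x for x in xs if abs(x) < 1.5],  # ...or beyond 1.5 sd
        sorted(xs)[2:-2],                 # ...or trim two points from each tail
    ]
    return any(significant(v) for v in variants)

trials = 4000
datasets = [[random.gauss(0, 1) for _ in range(30)] for _ in range(trials)]
honest = sum(significant(xs) for xs in datasets) / trials
hacked = sum(best_of_four(xs) for xs in datasets) / trials
print(f"false positives, one pre-specified analysis: {honest:.3f}")
print(f"false positives, best of four analyses:      {hacked:.3f}")
```

Committing to one of the four rules before seeing the data removes the free choice that drives the inflation, which is the logic behind pre-registration.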

How it’s changing: The best cure for researcher degrees of freedom is to pre-register the predictions and planned analyses of experiments before looking at the data. This approach is standard practice in medicine because it helps prevent the desires of the researcher from influencing the outcome. Among the basic life sciences, psychology is now leading the way in advancing pre-registration. The journals Cortex, Attention Perception & Psychophysics, AIMS Neuroscience and Experimental Psychology offer pre-registered articles in which peer review happens before experiments are conducted. Not only does pre-registration put the reins on researcher degrees of freedom, it also prevents journals from selecting which papers to publish based on the results.

Journals aren’t the only organisations embracing pre-registration. The Open Science Framework invites psychologists to publish their protocols, and the 2013 Declaration of Helsinki now requires public pre-registration of all human research “before recruitment of the first subject”.

We’ll continue to cover these developments at HQ as they progress throughout 2014.