Friday, January 17, 2014

A First-Person Account of Schizophrenia and Recovery

From Salon, this is an interesting first-person account of one man's descent into and recovery from schizophrenia.

I thought I was a prophet

After my schizophrenic break, I couldn't even trust my own mind -- and it would be a long road back from the abyss

Friday, Aug 2, 2013 | Michael Hedrick


A photo of the author.

On the day I realized I was a prophet, I left my home in Colorado and began to hitch-hike to the U.N. I needed to save the world from its various evils. And I needed to go — now.

I spent the next several days wandering around the northeast, trying to decipher messages. I found codes in places where codes didn’t exist. I finally found my way back home, thanks to a quiet and generous woman who lived somewhere in rural Massachusetts, and when I got back, I explained to my parents that I was on a mission and this was God’s will. I’m sure there was some stuff about aliens and conspiracies in there, too.

And so, a week after my adventure began, I woke up in the psych ward of Boulder Community Hospital, and I spent the next seven days condemned to a hospital bed with waterproof sheets in a ward with eight other people who either didn’t talk or rambled so incoherently that it was impossible to understand them.

My parents came every day at 2 o’clock, when visiting hours began, and they brought me pillows and a down comforter and my favorite hoodie. Anything to make me more comfortable. Still, their visits were marred by hour-long screaming and crying matches in which I accused them of throwing me in a mental hospital to rot. Why was it so hard for them to understand? I was a prophet, and this was my magnum opus.

My parents didn’t know what had happened to their son. For that matter, I didn’t know what had happened to me either. And all of us wondered: Would I ever make it back?

Four and a half percent of all adults in the United States suffer from a serious mental illness. That equates to 14,125,500 people who struggle, day in and day out, not sure if they can trust their own thoughts. Many find support and do recover, but many more don’t. They become homeless; they languish on the fringes of society, where they aren’t given the slightest thought, let alone assistance. I could have been one of them.

When I got out of the hospital, nobody knew what to do with me. (And I certainly didn’t know what to do with myself.) I moved back into my parents’ house and was treated with nervous caution. Mental illness was a foreign concept to them. Before I broke, they had blamed my strange behavior on marijuana. Now, they regarded me with a silence reserved for things they feared. And they were probably right to fear me. I was still sick and dangerously delusional.

It’s hard to explain the logic of a cracked mind, but I found personal threat in every tiny action or event. Once, I was seated at the piano, and my mother reached over me to lower the register, her hand brushing my lower stomach, and I was convinced she was trying to molest me. Another time, my dad and I were in the garage, trying to put new brake pads on my car, when I decided he was going to sabotage me so that I would lose control of my car, crash into a tree and die.

My mistrust of people was intense. Sometimes it felt like they existed only to harm me.

For instance, I was certain my psychiatrist was a quack. This is the psychiatrist who, over the course of the next six years, would guide me into stability, but when I first walked into his office I knew it was a ruse designed by my parents to convince me I was crazy. He wasn’t a real doctor. This was a set with props and actors. I remember being in the waiting room, seeing those magazines that stretched back two years, listening to the calm music and the sound of ocean waves on the speakers, sitting in the couches that seemed so real, looking at the diagnostic surveys and medicine prescriptions, and thinking: “Wow, they really pulled out all the stops with this one.”

Slowly it dawned on me: My thoughts were not real.

My initial diagnosis was bipolar. (Later it would be changed to schizoaffective disorder, and then schizophrenia with periods of depression.) And we began the long and frustrating battle to pinpoint the right medication. The first medication I tried gave me akathisia, a side effect in which you have an intense and extremely uncomfortable urge to move at all times. For weeks, I stayed in this particular hell, and I would spend hours on the treadmill or walking around the neighborhood in an attempt to shake the feeling that I could rip off my skin at any point.

We switched that, and switched it again, a process that took years. Eventually, we found a medication that worked on my mind, though sadly, not with my body. I gained about 100 pounds, took up smoking and was too tired most days to get out of bed.

My parents became a source of strength during this time. They began attending support groups and integrating themselves into the local mental illness community. They fought bureaucracy and headaches to secure me government assistance.

I moved into my own place, and for years I kept my diagnosis a secret. I still longed to live a normal life, even as I struggled with my paranoia. So I spent my time alone. I was afraid to leave my house, afraid to go into stores, because I was so quick to interpret random comments as criticism and ostracism. I just knew everyone was saying nasty things about me. I couldn’t even order a pizza without worrying that the delivery guy would get back into his car, think about that one expression I made, and tell all his friends what a freak I was.

To be unsure of your own mind is a prison, one that can break some of us. It simply becomes easier to slip into homelessness or drug abuse, but I fought against that. I began studying normal behavior. I read books on psychology, body language, dating technique — anything that would help me build a repertoire of healthy social interactions and somehow guide me into acting like a functioning human being.

I got better. Sometimes, I’m so much better that I question if I really have any illness. I’ll have a good month — no paranoia, no depression, lots of joking around — and I think, have I nipped this in the bud? Am I the first person in the history of medicine to cure myself of schizophrenia? The seductive thought arises: Maybe I could skip my meds tonight and see how I feel in the morning. The meds make me so miserable. Maybe I could actually be free of this stuff.

Those are the nights when I feel empowered enough to go to some bar where hipsters are trying to have sex with one another, but in every conversation I start to worry that the other person has found something wrong with the way I look, or the way I move, or the way I talk. I get so overwhelmed that I have to go outside and smoke, and I think: God, I just want to go home. And then when I get home I take my meds and go to bed — the habit that will keep me sane — and when I wake up in the morning I feel fine, and that’s pretty much all I can ask. The truth is, the meds make me feel OK, and OK is always better than bat-shit insane.

I’m pretty stable these days. I have good health care and a good family who watch out for my red flags. I have the semblance of a career. I do wish my prospects for love were better. Every piece of romantic advice says you should just be yourself, but it’s hard to do that when your self is a 400-pound schizophrenic. I’ve been on dates before, but it’s hard to snuff out the fear that everyone in the cafĂ© is laughing at me. I don’t know if I’ll ever get over that hurdle, but I hurt for companionship. I want someone besides my mom to rub my shoulders when I’m tense, to talk me down from my scary thoughts, to give me a hug when I feel low.

My parents tell me, “It’ll happen if it’s meant to be.” So maybe it isn’t meant to be. I can live with that, I guess, though it still stings when I see my friends getting married on Facebook or when I catch a rom-com on TV. But I’ve had to adjust a lot of my expectations for having a normal life. I realize that my challenge, ultimately, is to simply be all right by myself.

The only cure for paranoia is self-assurance, which comes from an intense and radical self-acceptance. You have to learn to accept yourself and everything you fear. You should try it sometime. You don’t have to be schizophrenic to find it useful.

It isn’t easy to be crazy. But my seven years on this path have taught me that people with mental illness are some of the most resilient and courageous people there are. They don’t pretend to be someone else. They have a raw authenticity that can scare people, and I see why. They show you their warts — demonic voices and all.

The media hasn’t helped, portraying mental illness as something to fear. After any massacre or inexplicable tragedy, you can count on a slew of pieces that explain how the killer was a quiet man who kept to himself and exhibited some strange behavior in the past. But endless studies have shown that people with major mental illness are much more likely to be victims of senseless crimes than to be perpetrators. Still, we struggle with the idea that we are monsters.

Maybe it’s up to me, and articles like this, to change minds.


~ Michael Hedrick is a writer and photographer based in Boulder, CO. He is currently a regular contributor for Thought Catalog but dreams of being paid to write a regular column for some big publication. His book "Schizophrenic Connections" is available here.

More Michael Hedrick.

Omnivore - Philosophers Gonna Philosophize

From Bookforum's Omnivore blog, here is another quirky collection of philosophy links from around the interwebs.

Philosophers gonna philosophize

Jan 15 2014, 9:00AM

Thursday, January 16, 2014

John Brockman - The World Mind That Came In From the Counterculture

From Germany's Frankfurter Allgemeine, this is an interesting profile of Edge founder John Brockman on the occasion of this year's Edge Annual Question.

John Brockman: A Portrait

The World Mind That Came In From the Counterculture

Be imaginative, exciting, compelling, inspiring: That’s what John Brockman expects of himself and others. Arguably the planet’s most important literary agent, Brockman brings its cyber elite together in his Internet salon "Edge." We paid a visit to the man from the Third Culture.


01.10.2014 · By Jordan Mejias, New York

 
At the age of three John Brockman announced: "I want to go to New York!" For decades he has been a leading light behind the scenes in the city’s intellectual life.

THE INTERNET had yet to be born but the talk still revolved around it. In New York, that was, half a century ago. "Cage," as John Brockman recalls, "always spoke about the mind we all share. That wasn’t some kind of holistic nonsense. He was talking about profound cybernetic ideas." He got to hear about them on one of the occasions when John Cage, the music revolutionary, Zen master and mushroom collector, cooked mushroom dishes for him and a few friends. At some point Cage packed him off home with a book. "That’s for you," were his parting words. After which he never exchanged another word with Brockman. Something that he couldn’t understand for a long time. "John, that’s Zen," a friend finally explained to him. "You no longer need him."

Norbert Wiener was the name of the author, Cybernetics: Or Control and Communication in the Animal and the Machine the name of the book. Page by page Brockman battled his way through the academic text, together with Stewart Brand, his friend, who was about to publish the Whole Earth Catalog, the shopping primer and bible of the environmentally-driven counterculture. For both readers, physics and mathematics expanded into an infinite space that no longer distinguished between the natural and human sciences, mind and matter, searching and finding.

Like the idea of the Internet—which was slowly acquiring contours during these rambling 1960s discussions—the idea of Edge, the Internet salon around which Brockman’s life now revolves, was also taking shape. Edge is the meeting place for the cyber elite, the most illustrious minds who are shaping the emergence of the latest developments in the natural and social sciences, whether they be digital, genetic, psychological, cosmological or neurological. Digerati from the computer universe of Silicon Valley aren’t alone in giving voice to their ideas in Brockman’s salon. They are joined in equal measure by other eminent experts, including the evolutionary biologist Richard Dawkins, the psychologist Steven Pinker, the philosopher Daniel Dennett, the cosmologist Martin Rees, the biological anthropologist Helen Fisher, the economist, psychologist and Nobel Prize winner Daniel Kahneman, the quantum physicist David Deutsch, the computer scientist Marvin Minsky, and the social theorist Anthony Giddens. Ranging from the co-founder of Apple Steve Wozniak to the decoder of genomes Craig Venter, his guest list is almost unparalleled even in the boundless realm of the Internet. Even the actor Alan Alda and writer Ian McEwan can be found in his forum.

The bridge of the third culture


A question is sent out to all salon members at the start of every year. This year it is: "What scientific idea is ready for retirement?" The "editorial marching orders," written by Brockman, reveal the heart of Edge: "Go deeper than the news. Tell me something I don’t know. You are writing for your fellow Edgies, a sophisticated bunch, and not the general public. Stick to ideas, theories, systems of thought, disciplines, not people. Come up with something new, be exciting, inspiring, compelling. Tell us a great story. Amaze, delight, surprise us!"

Does he really need to spell all that out so clearly? After all, quite a few of the authors number among his clients. He markets them and their works globally, and they know exactly what he expects of them and what they can expect of him. As their literary agent, he never misses a business opportunity. Indeed, he has built a reputation for negotiating mind-boggling prices for individual works that, in contrast to Edge, adopt a more populist approach to the sciences. But above all, it’s his concept of The Third Culture that glitters, the miraculous formula that Brockman evokes to secure the supremacy of the so-called hard sciences, even in the instances when the world and our place in it are surveyed in quasi-philosophical mode. As the physicist, politician, and novelist C. P. Snow lamented, there is a chasm separating the twin cultures of the natural and human sciences; the enterprising Brockman fills this divide with bestsellers from his Third Culture.

Business isn’t just blossoming, he says; it has never been better. Anyone harboring any doubts should pay him a visit on Fifth Avenue, where Brockman, Inc. has been spreading its wings of late in premises that are awash with light and where gravity seems to have been suspended. The two glass corner offices are a testament to transparency. The one for the company’s founder allows the Empire State Building to peek over his shoulder as he works at his paper-free desk; the other is for his son Max, the company’s brand new CEO, who can admire the perpetually breathtaking silhouette of the Flatiron Building through the gigantic windows. Between them, Katinka Matson (co-founder of Edge, President of Brockman Inc., mother of Max, and business and life partner of John) has stylishly set up shop. The daughter of a literary agent, she has the profession in her DNA. In her spare time she now brightens up the office with multi-colored, larger-than-life scans of floral images.

Brockman, who was born in 1941, could comfortably retire and devote himself completely to Edge, his intellectual hobby. But Edge is no mere hobby for him, no pastime pursued at times when the demands of work abate. "I have never thought of money. I have only ever done what interested me, and that always brought in enough to get me by."

Before opening his Internet salon, he had published a newsletter with the same title and philosophical outlook. This evolved out of the Reality Club. "Trippy stuff" topped the agenda when a group of people started meeting in New York during the 1980s, a group whose fluctuating composition included the physicist Freeman Dyson, the feminist Betty Friedan, the social revolutionary Abbie Hoffman and the film stars Ellen Burstyn and Dennis Hopper. They were charged with asking each other the questions that they asked themselves. No instant answers were expected. The focus was on asking the questions.

In literary New York Brockman had never glimpsed the prospect of this type of exchange of ideas, the adventure that he wanted for himself and to share with others. He preferred the empirical study of our cosmos, on both micro and macro scales, to the imagined world. Not that this forced him to relinquish story-telling. With the frequently spectacular experiences they describe, the books and authors he represents offer him more suspense and excitement than he can find in any novel. And his own life? As he describes it, that too emerges as a collection of gripping stories that veer off in numerous different directions while always following a clear, very personal line. From Day One he was curious and hungry for knowledge, and had an appetite for excitement and new experiences.

A blueprint for the Internet


Brockman’s life-story begins with the proclamation: "I want to go to New York." He was three years old at the time, lying in a Boston hospital, seriously ill with cerebrospinal meningitis, and these are said to have been the first words he spoke when he woke up from a six-week coma. He finally made it to New York at the age of 20—enrolling as a graduate student at Columbia University, where he completed a degree in business. After this he worked within the financial services industry, not that his life revolved exclusively around money and transactions at the time. The crazy 1960s burst into life and Brockman felt compelled to immerse himself in the vibrant cultural mix. He experienced the New York underground for himself on the stage of the Living Theater. It was culture shock, a call to action, an invitation to engage. But Brockman participated in the avant-garde experiments not with his banjo and guitar, but with his gift for organization. Today we would probably call him a cultural impresario.

New York gave him confidence, telling him "You can be free." He didn’t need to be told twice. With Sam Shepard, who was still working as a waiter, he discussed ideas for "intermedia" stage performances. In no time he had become an indispensable part of the multimedia theater and film scene. He was entrusted by Jonas Mekas, the great father of experimental film in the U.S., with commissioning films from Nam June Paik and Robert Rauschenberg for an "expanded" film festival. His organizational skills even got him into the Lincoln Center Film Festival where he presented the work of newcomers like Martin Scorsese when he wasn’t escorting European guests—with names like Federico Fellini and Jean-Luc Godard—out to dinners. Even Jackie Kennedy, still not an Onassis, makes an appearance in the background during this period.

While the stars of Allen Ginsberg, William Burroughs and the Beatniks were slowly fading, and the folk scene around Bob Dylan dawning, Brockman was spending time working with Andy Warhol. But the drug-sodden collective in the Factory wasn’t for him. He needed to be his own master. For the same reason, things didn’t work out with the countercultural Yippies, after his friend Abbie Hoffman recruited him for the founding meetings of the movement. Brockman had no interest in revolution. However: "The ideas behind it interested me." Cage taught him how to perceive the non-linear structure of reality using cybernetics. With hindsight he came to feel this was "like a construction diagram for the Internet." He wrote a book with the title By the Late John Brockman, an aphoristic volume of his various insights and experiences.

In the circle of elites


And then, at MIT in 1965, he finally came face to face with a computer. There was precisely one example of this type of computer, a humongous contraption, surrounded by busy men in white lab coats and secured behind a glass screen, against which he pressed his nose. "I fell in love on the spot. It was pure magic." Brockman had no more doubts whatsoever that everything was interconnected: the arts and the sciences and the psychedelic shows with their flashing strobes, through whose cacophony of sound Marshall McLuhan trumpeted his theory of communications.

At the Esalen Institute, the personal growth laboratory on California’s Pacific coast, he listened to talks by scientists and madcap geniuses whose names hardly anyone on the East Coast knew. A treasure trove just waiting to be opened. An awakening. In 1973 this gave rise to his literary agency, albeit circuitously. Once again he found himself promoting something that interested him. Slowly but surely he realized that he had struck gold. Or, as he prefers to say, he discovered an oil well that has never stopped bubbling. Since then Brockman has been keyed to the Third Culture from head to foot. Famous scientists, researchers, entrepreneurs and sponsors are drawn to him like moths to a light bulb. At his desk in New York he clicks on the invitation to a party he is flying to in San Francisco the following day. The hosts include the co-founder of Google Sergey Brin, the Russian billionaire Yuri Milner, the co-founder of Facebook Mark Zuckerberg, and Art Levinson, Chairman of the Board of Apple Inc. and the former CEO of the biotech company Genentech. It is safe to assume that Brockman also enjoys get-togethers with such distinguished names.

But even more he evidently enjoys the gatherings at his picturesque farm in Connecticut with its numerous nooks and crannies. For one day or weekend every summer, he affords himself the intellectual pleasure of transforming his New England idyll into a swap meet for the latest scientific research and ideas. From Princeton and Yale, Harvard and MIT, Silicon Valley and New York’s executive suites, he invites thinkers, movers, shakers and clients—all of them friends—to discuss the hottest topics in their various fields. The most recent edition of these bucolic conferences held beneath ancient maple trees began with an up-to-date tour d’horizon by the economist Sendhil Mullainathan, who mused that the excessive volumes of data might threaten the qualitative character of science. The social scientist Fiery Cushman reported on the failure of algorithms in complex calculations, the experimental philosopher Joshua Knobe on the elusively ephemeral nature of the self, the psychologist June Gruber on the problem of positive emotion and the initial solutions.

In total 10 scientists gave talks on this perfect summer’s day, which now, thanks to Edge, no longer has to end. Since November Brockman has been posting the videos of the contributions on the Web. By February the day’s entire program should be accessible. Those online, however, can only guess at the pleasure John Brockman feels as he observes the mind games he has staged. "Edge," says its creator, "for me that means ideas, for me that means culture."

A 2014 "Scientific Idea to Be Retired" - Mental Illness is Nothing But Brain Illness


When Thomas Insel, director of the National Institute of Mental Health (NIMH), announced last year that the NIMH would only fund research consistent with its new research agenda, some of us recoiled at the assumed "truth" that mental illness is a brain disease.
NIMH has launched the Research Domain Criteria (RDoC) project to transform diagnosis by incorporating genetics, imaging, cognitive science, and other levels of information to lay the foundation for a new classification system.
In a TED Talk given just before his April 2013 announcement that the DSM-5 would no longer be used at the NIMH (because it is based on identification of symptoms rather than on biomarkers or imaging of brain network dysfunction, to name only two examples), Insel was more direct in saying that mental illness is a brain disease:
Insel believes part of the problem is that mental illness is referred to either as a mental or behavioral disorder. “We need to think of these as brain disorders,” he said, adding that for these brain disorders, behavior is the last thing to change.
Cough . . . bullshit . . . cough.

Allan Schore alone has compiled enough evidence to show that the environmental surround (including nurturing, interpersonal/intersubjective experience, nutrition, abuse/neglect, and so on) has an incredibly powerful impact on brain development.

If someone shows up in my office, my first thought is not, "What's wrong with you?" Rather, the first thought and the first question is, "What happened to you?" This is the foundation of trauma-based therapy. 

With this as my belief and conviction, I was pleased to see in this year's Edge question responses (to the question, What Scientific Idea Is Ready for Retirement?) one that argues that we need to retire the belief that mental illness is nothing but brain illness.

Can I get a Hallelujah?!

Mental Illness is Nothing But Brain Illness



Joel Gold  - Psychiatrist; Clinical Associate Professor of Psychiatry, NYU School of Medicine


Ian Gold  - Neuroscientist; Canada Research Chair in Philosophy & Psychiatry, McGill University

In 1845, Wilhelm Griesinger, author of the most important textbook of psychiatry of the day, wrote: “what organ must necessarily and invariably be diseased where there is madness? … Physiological and pathological facts show us that this organ can only be the brain…” Griesinger’s truism is regularly reiterated in our own time because it expresses the basic commitment of contemporary biological psychiatry.

The logic of Griesinger’s argument seems unassailable: severe mental illness has to originate in a physiological abnormality of some part of the body, and the only plausible candidate location is the brain. Since the mind is nothing over and above the activity of the brain, the disordered mind is nothing more than a disordered brain. True enough. But that is not to say that mental disorders can, or will, be described by genetics and neurobiology. Here’s an analogy. Earthquakes are nothing over and above the movements of a vast number of atoms in space, but the theory of earthquakes says nothing at all about atoms, only about tectonic plates. The best scientific explanation of a phenomenon depends on where real human beings find comprehensible patterns in the universe, not on how the universe is constituted. God may understand earthquakes and mental illness in terms of atoms, but we may not have the time or the intelligence to do so.

It’s not a radical idea that understanding and treating brain disorders sometimes has to move outside the skull. A man's heart hurls an embolus into his brain. He might now be unable to produce or understand speech, move one half of his body, or see half of the world in front of him. He has had a stroke and his brain is now damaged. The cause of his brain illness did not originate there, but in his heart. His physicians will do what they can to limit further damage to his brain tissue and perhaps even restore some of the function lost due to the embolism. But they will also try to diagnose and treat his cardiovascular disease. Is he in atrial fibrillation? Is his mitral valve prolapsed? Does he require blood thinner? And they won't stop there. They will want to know about the patient's diet, exercise regimen, cholesterol level and any family history of heart disease.

Severe mental illness is also an assault on the brain. But like the embolus it may sometimes originate outside the brain. Indeed, psychiatric research has already given us clues suggesting that a good theory of mental illness will need concepts that make reference to things outside the skull. Psychosis provides a good example. A family of disorders, psychosis is marked by hallucinations and delusions. The central form of psychosis, schizophrenia, is the psychiatric brain disease par excellence. But schizophrenia interacts with the outside world, in particular, the social world. Decades of research have given us robust evidence that the risk of developing schizophrenia goes up with experience of childhood adversity, like abuse and bullying. Immigrants are at about twice the risk, as are their children. And the risk of illness increases in a near-linear fashion with the population of your city and varies with the social features of neighborhoods. Stable, socially coherent neighborhoods have a lower incidence than neighborhoods that are more transient and less cohesive. We don’t yet understand what it is about these social phenomena that interacts with schizophrenia, but there is good reason to think they are genuinely social.

Unfortunately, these environmental determinants of psychosis go largely ignored, but they provide opportunities for useful interventions. We don’t yet have a genetic therapy for schizophrenia, and antipsychotic drugs can only be used after the fact and are not nearly as good as we’d like them to be. The Decade of the Brain produced a great deal of important research into brain function, and the new BRAIN initiative will do so as well. But almost none of it has yet helped (or is likely to help) the patients who suffer from mental illness or those who treat them. Reducing child abuse and improving the quality of the urban environment, on the other hand, might very well prevent some people from ever developing a psychotic illness at all.

Of course, whatever it is about the social determinants of psychosis that makes them risk factors, they must have some downstream effect on the brain; otherwise they would not raise the risk of schizophrenia. But they themselves are not neural phenomena, any more than smoking is a biological phenomenon because it is a cause of lung cancer. The theory of schizophrenia will therefore have to be more expansive than the theory of the brain and its disorders.

That a theory of mental illness should make reference to the world outside the brain is no more surprising than that the theory of cancer has to make reference to cigarette smoke. And yet what is commonplace in cancer research is radical in psychiatry. The time has come to expand the biological model of psychiatric disorder to include the context in which the brain functions. In understanding, preventing and treating mental illness, we will rightly continue to look into the neurons and DNA of the afflicted and unafflicted. To ignore the world around them would be not only bad medicine but bad science.

Dr. Maulik Shah - Neurologic Mysteries



This is an interesting talk from Dr. Maulik Shah (UC San Francisco) on the diagnostic mysteries of neurology - presented by UCTV.

Dr. Maulik Shah - Neurologic Mysteries


Published on Jan 13, 2014
(Visit: http://www.uctv.tv/) Dr. Maulik Shah, UCSF Department of Neurology, explores how neurologists solve diagnostic dilemmas, including the evaluation of patients who are referred to them from the community for their expert opinion. Hear about the pitfalls as well as the eureka moments. Learn the importance of thinking broadly about cases and deciphering which pieces of data are the most likely to lead to a diagnosis -- versus those that might be a red herring or irrelevant to the current problem.
Series: "UCSF Osher Center for Integrative Medicine presents Mini Medical School for the Public" [1/2014]

Alex Filippenko: "Dark Energy and the Runaway Universe" - Talks at Google


This is a fascinating talk from Alex Filippenko (Richard & Rhoda Goldman Distinguished Professor in the Physical Sciences at UC Berkeley), a record 9-time winner of the "Best Professor" award at Berkeley.

The above graphic is useful, but we can disregard the whole "big bang" thing - it's more likely that our universe is one of many and that the universe recycles itself - see Roger Penrose's conformal cyclic cosmology. The dark energy paradox works in Penrose's model.

Alex Filippenko: "Dark Energy and the Runaway Universe", Talks at Google

Published on Jan 13, 2014


We expected the attractive force of gravity to slow down the rate at which the Universe is expanding. But observations of very distant exploding stars (supernovae) show that the expansion rate is actually speeding up, a remarkable discovery that was honored with the 2011 Nobel Prize in Physics to the teams' leaders. Over the largest distances, the Universe seems to be dominated by a repulsive "dark energy" -- an idea Albert Einstein had suggested in 1917 but renounced in 1929 as his "biggest blunder." It stretches space itself faster and faster with time. But the physical origin and nature of dark energy, which makes up about 70% of the contents of the Universe, is probably the most important unsolved problem in all of physics; it may provide clues to a unified quantum theory of gravity.

About the Speaker: Alex Filippenko is the Richard & Rhoda Goldman Distinguished Professor in the Physical Sciences. His accomplishments, documented in about 700 research papers, have been recognized by several major prizes, and he is one of the world's most highly cited astronomers. In 2009 he was elected to the National Academy of Sciences, and he shared part of the Gruber Cosmology Prize in 2007. He has won the top teaching awards at UC Berkeley and has been voted the "Best Professor" on campus a record 9 times. In 2006 he was selected as the Carnegie/CASE National Professor of the Year among doctoral institutions, and in 2010 he won the ASP's Emmons Award for undergraduate teaching. He has produced five astronomy video courses with "The Great Courses," coauthored an award-winning textbook, and appears in numerous TV documentaries including about 40 episodes of "The Universe" series. An avid tennis player, hiker, and skier, he enjoys world travel and is addicted to observing total solar eclipses (11 so far).

Wednesday, January 15, 2014

Anti-Inflammatory Luteolin Concentrated in Nanocapsules in the Blood Inhibits Lung-Cancer Growth


This is a cool piece of research. Nanocapsules (here, shells made of a water-soluble polymer) and other forms of micro-encapsulation are hot properties in the world of drug delivery systems. The essential advantage of this approach is that it shields the target material from degradation during digestion. Among the substances being researched for this type of delivery system are curcumin, resveratrol, probiotics, and (as this article discusses) luteolin.

Via Wikipedia:
Luteolin has been studied in several preliminary in vitro scientific investigations. Proposed activities include antioxidant activity (i.e., scavenging of free radicals), promotion of carbohydrate metabolism, and immune system modulation.[citation needed] Other in vitro studies suggest luteolin has anti-inflammatory activity,[7][8] and that it acts as a monoamine transporter activator,[9] a phosphodiesterase inhibitor,[10] and an interleukin 6 inhibitor.[7] In vivo studies show luteolin affects xylazine/ketamine-induced anesthesia in mice.[11] In vitro and in vivo experiments also suggest luteolin may inhibit the development of skin cancer.[12][13] Importantly, the therapeutic value of the above findings is unclear, and will remain so until further detailed in vivo, toxicity, and clinical studies are performed.
Luteolin has long been thought to have great potential as a chemopreventive substance, but it (like curcumin) suffers considerable breakdown in the gut.

Still, consuming a diet rich in luteolin has measurable anti-inflammatory effects. Luteolin is found in many herbs and plant foods, including carrots, peppers, olive oil, peppermint, rosemary, celery, broccoli, parsley, thyme, dandelion, chamomile tea, navel oranges, and oregano, among others.

These new delivery systems can transform therapeutically marginal substances (mostly herbal and food extracts) into highly effective pharmaceutical quality interventions. The research with micro-encapsulated curcumin has shown great promise in prostate cancer.

Anti-inflammatory luteolin concentrated in nanocapsules in the blood inhibits lung-cancer growth

January 9, 2014


Cellular uptake of nanoparticles encapsulating both dye and luteolin (credit: Winship Cancer Institute of Emory University)

Researchers at the Winship Cancer Institute of Emory University have discovered a more effective drug delivery system using nanoparticles that could one day significantly affect cancer prevention.

The study, published in Cancer Prevention Research, involved the use of microscopic amounts of the naturally occurring antioxidant luteolin encapsulated in a water-soluble polymer. When injected into mice, the nano-luteolin inhibited growth of lung cancer and head and neck cancer cells.

“By using a high concentration of luteolin in the blood, we were better able to inhibit the growth of cancer cells,” says senior study author Dong Moon Shin, MD, professor of hematology and medical oncology at Emory University School of Medicine and associate director of academic development at Winship Cancer Institute.

Luteolin is known for its anti-inflammatory and anti-cancer effects. It is naturally found in green vegetables such as broccoli, celery and artichokes; however, Shin says large quantities would need to be consumed to be effective. By concentrating the compound into a nanoparticle and making it easy to dissolve in water, the researchers conclude nano-luteolin has immense potential for future human studies of chemoprevention (to help stop the recurrence of cancer in patients and reduce the risk of cancer in others).

Shin told KurzweilAI several chemotherapeutic agents using nanotechnology have been developed for drug delivery over the last decade. “As a chemopreventive agent, our nanotechnology-driven delivery would be one of those pioneering ones. The new luteolin chemopreventive agent has to still go through many steps, including GMP production, IND filing, conducting phase I, II, and III trials, etc., so it is hard to predict at this moment how long it will take to be commercialized.”

Abstract of Cancer Prevention Research paper


Cancer prevention (chemoprevention) by using naturally occurring dietary agents has gained immense interest because of the broad safety window of these compounds. However, many of these compounds are hydrophobic and poorly soluble in water. They frequently display low bioavailability, poor systemic delivery, and low efficacy. To circumvent this problem, we explored a novel approach toward chemoprevention using nanotechnology to deliver luteolin, a natural compound present in green vegetables. We formulated water-soluble polymer-encapsulated Nano-Luteolin from hydrophobic luteolin, and studied its anticancer activity against lung cancer and head and neck cancer. In vitro studies demonstrated that, like luteolin, Nano-Luteolin inhibited the growth of lung cancer cells (H292 cell line) and squamous cell carcinoma of head and neck (SCCHN) cells (Tu212 cell line). In Tu212 cells, the IC50 value of Nano-Luteolin was 4.13 μmol/L, and that of luteolin was 6.96 μmol/L. In H292 cells, the IC50 of luteolin was 15.56 μmol/L, and that of Nano-Luteolin was 14.96 μmol/L. In vivo studies using a tumor xenograft mouse model demonstrated that Nano-Luteolin has a significant inhibitory effect on the tumor growth of SCCHN in comparison to luteolin. Our results suggest that nanoparticle delivery of naturally occurring dietary agents like luteolin has many advantages and may have potential application in chemoprevention in clinical settings.

Cancer Prev Res; 7(1); 65–73. ©2013 AACR.
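To make the abstract's numbers concrete, here is a quick back-of-the-envelope comparison of the reported IC50 values (a lower IC50 means less compound is needed to inhibit growth by 50%, i.e. higher potency). This is just arithmetic on the figures quoted above, not part of the study itself:

```python
# IC50 values (micromol/L) quoted in the Cancer Prevention Research abstract.
ic50 = {
    "Tu212 (head & neck)": {"luteolin": 6.96, "nano-luteolin": 4.13},
    "H292 (lung)":         {"luteolin": 15.56, "nano-luteolin": 14.96},
}

for cell_line, v in ic50.items():
    # Fold-improvement in potency: how many times less nano-luteolin
    # is needed than free luteolin for the same 50% inhibition.
    fold = v["luteolin"] / v["nano-luteolin"]
    print(f"{cell_line}: nano-luteolin ~{fold:.2f}x as potent in vitro")
```

The in vitro edge is substantial for the head-and-neck line (about 1.7x) but marginal for the lung line (about 1.04x), which is consistent with the authors finding the significant in vivo effect in the SCCHN xenograft model.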


The 15-Year-Old Who Invented a New Way to Detect Early Stage Pancreatic Cancer (Morgan Spurlock)


What an amazing kid - This is a brief little video worth the watch. If his invention is as accurate as it sounds, there could be a future where pancreatic cancer is not a guaranteed death sentence.

Watch Morgan Spurlock’s Documentary on the 15-Year-Old Who Invented a New Way to Detect Early Stage Pancreatic Cancer

January 14, 2014

You Don't Know Jack | Morgan Spurlock from Focus Forward Films on Vimeo.

If you believe, as Whitney Houston once did, that children are our future, you’ll be gratified by the work of Jack Andraka, age 15.

Describing him as a kid with a passion for science is an understatement on par with calling Mr. Peabody a cartoon dog.

Not that I’ve got a crystal ball or anything, but let’s just say if you or your loved one come down with pancreatic cancer a decade from now, you’ll be very glad this young man—the 2012 grand prize winner of the Intel International Science and Engineering Fair, as well as the Smithsonian American Ingenuity Award—didn’t squander his freshman year’s extracurricular hours on sports and glee club.

Instead, he became the “cancer paper boy.” His mentor, Johns Hopkins pathologist and researcher Anirban Maitra, floats comparisons to Edison. As Morgan Spurlock points out in his short documentary on Andraka — You Don’t Know Jack (above) — many of Einstein’s discoveries were made before he stuck his tongue out beneath that white mane.

Spurred on in part by the death of a family friend, Jack, then 14, developed an inexpensive procedure that can diagnose the presence of the notoriously stealthy cancer of the pancreas while treatment is still an option. Through trial and error, he developed an absorbent filter paper dipstick that helps measure the electrical signal of a nanotube network laced with antibodies specific to the protein mesothelin, after a sixth of a drop of blood has been introduced.

As a theater major, I fear I may not be summarizing the science with sufficient accuracy. The Smithsonian published an article describing Jack’s process in detail. While I don’t know much about pancreatic function, cancerous or otherwise, I do know enough to have deep respect for Jack’s supportive parents, and Johns Hopkins University, the only institution (of 200 contacted) to respond in the affirmative when the then-14-year-old got in touch, seeking lab space. (Hosting the Center for Talented Youth may have primed them for such queries.) If this science thing doesn’t work out, Jack could totally make a go of it as a publicist. He’s got the tenacity.

Again, it’ll take another ten years or so before the fruits of Jack’s labors can be part of mainstream medical practice, but it does give one hope for the future. Some paper boy!


~ Ayun Halliday is an author and Chief Primatologist of the East Village Inky, an award-winning, handwritten zine. Follow her @AyunHalliday

The Worst Thing You Can Eat Is Sugar

Excess sugar is poison in the body - and of course we love our poisons - alcohol, tobacco, drugs, gluten, and now sugar is finally being added to the list. Recently, a group of leading medical and nutrition experts released a call for a 20-30% reduction in sugar added to packaged and processed foods over the next 3-5 years. That would be a good start.

The worst thing you can eat is sugar

By Lindsay Kobayashi
Posted: January 13, 2014
Removing sugar from the food industry could reverse the obesity epidemic
A couple of days ago, a group of leading medical and nutrition experts released a call for a 20-30% reduction in sugar added to packaged and processed foods over the next 3-5 years (1). The expert group, ‘Action on Sugar’, estimates that this change would result in a reduction of roughly 100 calories each person eats per day, and will eventually reverse the obesity epidemic (1). Wow. The media has picked up on this statement in a huge way, with headlines like ‘Sugar is the new tobacco’ (2) and ‘Sugar is now enemy number one in the western diet’ (3). While these headlines sound sensationalist, they are right.

A sickening amount of sugar is added to many processed foods (1). Some culprits are obvious: there are 9 teaspoons of sugar in a can of regular Coke or Pepsi. Others are surprising: Heinz tomato soup has 4 teaspoons of sugar per serving. Add two slices of white bread to that soup (nearly a teaspoon of sugar), another teaspoon or two in your coffee or tea, and that’s your entire daily sugar allowance. Sugar should comprise no more than 5% of daily energy intake, which is about 6 teaspoons per day for women and 8 teaspoons per day for men (3).
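As a sanity check, those teaspoon figures fall out of the 5%-of-energy guideline using standard conversion factors. The ~4 kcal per gram of sugar, ~4 g per teaspoon, and the reference intakes of ~2,000 and ~2,500 kcal/day are common approximations I'm assuming here, not figures from the article:

```python
# ~4 kcal per gram of sugar, ~4 g per teaspoon -> ~16 kcal per teaspoon.
KCAL_PER_TSP = 4 * 4

def sugar_allowance_tsp(daily_kcal, fraction=0.05):
    """Teaspoons of sugar at `fraction` of daily energy intake."""
    return daily_kcal * fraction / KCAL_PER_TSP

print(round(sugar_allowance_tsp(2000)))  # women, ~2000 kcal/day -> 6
print(round(sugar_allowance_tsp(2500)))  # men,   ~2500 kcal/day -> 8
```

Note how little headroom that leaves: the single can of Coke above, at 9 teaspoons, already exceeds either allowance on its own.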

And what is the big deal about sugar? A calorie is a calorie – right? Well, not so much. The calories provided by sugar are void of nutrition. ‘Action on Sugar’ (1) states it best:
Added sugar is a very recent phenomenon (c150 years) and only occurred when sugar, obtained from sugar cane, beet and corn became very cheap to produce. No other mammal eats added sugar and there is no requirement for added sugar in the human diet. This sugar is a totally unnecessary source of calories, gives no feeling of fullness and is acknowledged to be a major factor in causing obesity and diabetes both in the UK and worldwide.
Humans have no dietary requirement for added sugar. Dr Aseem Malhotra, the science director of ‘Action on Sugar’, emphasizes that the body does not require carbohydrates from sugar added to foods (3). Furthermore, high sugar intake may reduce the ability to regulate caloric intake (4), with consumption of sugar leading to eating more sugar, overeating, and ultimately to weight gain (5). Added sugar therefore presents a ‘double jeopardy’ of empty caloric intake that triggers further unnecessary consumption.

Dr Malhotra states that sugar is in fact ‘essential to food industry profits and lining the pockets of its co-opted partners’ (3). The sugar/food industry has tremendous power, sponsoring high-profile sporting events, gaining celebrity endorsements, and employing psychological techniques in their ubiquitous advertising. Maliciously, they target children, who are vulnerable to advertising and to giving in to a sweet tooth (6). The politics of the sugar industry have been covered by this blog in another post. Essential to their tactics is heavy resistance against the scientific links between sugar and obesity. The American Sugar Association website states that ‘sugar is a healthy part of a diet’ (7), and Sugar Nutrition UK states that ‘the balance of available evidence does not implicate sugar in any of the “lifestyle diseases”’ (8). On top of that, the food industry sponsors scientific research that is biased towards showing no link between sugar and adverse health problems. Last month, a large evidence review found that research on sugar-sweetened beverages and obesity is more likely to find no association between the two when funded by the food industry (9).

Clearly, we have a long way to go in fighting against the paradigm of today’s food environment, which is largely dictated by the industry. ‘Action on Sugar’ has some important aims to this end: in addition to reducing sugar in processed foods by 20-30%, they aim to reach a consensus with the food industry that sugar is linked to obesity and other negative health effects, to improve nutritional labelling of added sugar content using a traffic light system, and to ensure that scientific evidence is translated into government policy to reduce sugar. Their full list of aims can be found here (10). These aims are likely to be successful, as they are modelled on sodium reduction efforts that have led to an estimated reduction of sodium in packaged foods ‘between 20 and 40%, with a minimum reduction of 6,000 strokes and heart attack deaths per year, and a healthcare saving cost of £1.5 billion [approx. $2.5 billion USD]’ (1).

So what can we do, as individuals? The first step is educating oneself, so if you’ve read this far then you’re one step ahead. Always read the nutritional labelling on packaged foods carefully to determine how much sugar is in what you’re eating. Katharine Jenner, nutritionist and campaign director of ‘Action on Sugar’, states that you can ‘wean yourself off the white stuff’ by cutting down on using it at home, but the main source of sugar in our diets remains that added during the processing of manufactured food (1). The best thing is to heavily cut down on packaged, processed foods in favour of whole, unprocessed foods. Do this, if not for your individual health, then to stop supporting an industry that compromises the well-being of the world’s population for financial profit. The worst thing you can do is eat sugar.

References

  1. Action on Sugar. Worldwide experts unite to reverse obesity epidemic by forming ‘Action on Sugar’. http://www.actiononsalt.org.uk/actiononsugar/Press%20Release%20/118440.html (accessed 12 January 2014).
  2. Poulter S. Sugar is the ‘new tobacco’: health chiefs tell food giants to slash levels by a third. Daily Mail. 09 January 2014. http://www.dailymail.co.uk/health/article-2536180/Sugar-new-tobacco-Health-chiefs-tell-food-giants-slash-levels-third.html (accessed 12 January 2014).
  3. Malhotra A. Sugar is now enemy number one in the western diet. The Guardian. 11 January 2014. http://www.theguardian.com/commentisfree/2014/jan/11/sugar-is-enemy-number-one-now (accessed 12 January 2014).
  4. Davidson TL, Swithers SE. A Pavlovian approach to the problem of obesity. Int J Obes Relat Metab Disord 2004;28(7):933-5.
  5. Bray GA, Nielsen SJ, Popkin BM. Consumption of high-fructose corn syrup in beverages may play a role in the epidemic of obesity. Am J Clin Nutr 2004;79(4):537-43.
  6. Calvert SL. Children as consumers: advertising and marketing. The Future of Children 2008;18(1):205-34.
  7. The Sugar Association. Balanced Diet. http://www.sugar.org/sugar-your-diet/balanced-diet/ (accessed 12 January 2014).
  8. Sugar Nutrition UK: Researching the Science of Sugar. Sugar & Health. http://www.sugarnutrition.org.uk/Sugar-and-Health.aspx (accessed 12 January 2014).
  9. Bes-Rastrollo M, Schulze MB, Ruiz-Canela M, Martinez-Gonzalez. Financial conflicts of interest and reporting bias regarding the association between sugar-sweetened beverages and weight gain: a systematic review of systematic reviews. PLOS Med 2013; doi: 10.1371/journal.pmed.1001578
  10. Action on Sugar. Aims. http://www.actiononsalt.org.uk/actiononsugar/Aims%20/118439.html (accessed 12 January 2014).


EDGE Question 2014 - What Scientific Idea Is Ready for Retirement?


The 2014 EDGE Question is out and all 176 contributors (174 responses) can be reviewed and pondered at the EDGE site. This year's question is a good one (they are always interesting) in that it provided many respondents an opportunity to question some of the basic tenets of scientific belief.
Science advances by discovering new things and developing new ideas. Few truly new ideas are developed without abandoning old ones first. As theoretical physicist Max Planck (1858-1947) noted, "A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it." In other words, science advances by a series of funerals. Why wait that long?

WHAT SCIENTIFIC IDEA IS READY FOR RETIREMENT?

Ideas change, and the times we live in change. Perhaps the biggest change today is the rate of change. What established scientific idea is ready to be moved aside so that science can advance?
Among this year's respondents are the usual who's who of science, as well as a lot of people I have never heard of but who contribute excellent responses.

Here are a few of the recognizable names:
Jay Rosen, Robert Sapolsky, David Berreby, Dean Ornish, Laurie Santos & Tamar Gendler, Ian Bogost, Simon Baron-Cohen, Catherine Bateson, Bruce Hood, David Buss, Paul Bloom, Daniel Goleman, Susan Blackmore, Marcelo Gleiser, David Deutsch, Nicholas Carr, Thomas Metzinger, Nicholas Christakis, Mihaly Csikszentmihalyi, Sherry Turkle, Patricia Churchland, Nassim Nicholas Taleb, Andy Clark, Jerry Coyne, Alan Alda, Paul Davies, Richard Dawkins, Lawrence Krauss, Matt Ridley, Lee Smolin, Sam Harris, A.C. Grayling, Ian McEwan, Alison Gopnik, John McWhorter, Freeman Dyson, Jared Diamond, Jonathan Haidt, Steven Pinker, Howard Gardner, Douglas Rushkoff, Michael Shermer, Daniel Everett, Daniel C. Dennett, Nicholas Humphrey, George Dyson, Kevin Kelly
Here is one of the MANY excellent responses, this one focusing on one of my favorite issues with the traditional Darwinian evolutionary model. [I'll be posting more of these over the coming days.]

Natural Selection is the Only Engine of Evolution 



Athena Vouloumanos  - Associate Professor of Psychology, Director, NYU Infant Cognition and Communication Lab, New York University
 

In evolution classes, Lamarckism–the notion promoted by Lamarck that an organism could acquire a trait during its lifetime and pass that trait to its offspring–is usually briefly discussed and often ridiculed. Darwin's theory of natural selection is presented as the one true mechanism of evolutionary change.

In Lamarck's famous example, giraffes that ate leaves from higher branches could potentially grow longer necks than giraffes that ate from lower branches, and pass on their longer necks to their offspring. The inheritance of acquired characteristics was originally considered a legitimate theory of evolutionary change, with even Darwin proposing his own version of how organisms might inherit acquired characteristics.

Experimental hints of intergenerational transfer of acquired traits came in 1923 when Pavlov reported that while his first generation of white mice needed 300 trials to learn where he hid food, their offspring needed only 100, and their grandchildren only 30. But Pavlov's description didn't make clear whether the mice were all housed together allowing for some communication between mice or other kinds of learning. Still other early studies of potential intergenerational trait transfer in plants, insects, and fish also suffered from alternative interpretations or poorly controlled experiments. Lamarckism was dismissed.

But more recent studies–using modern reproduction techniques like in vitro fertilization and proper controls–can physically isolate generations from each other and rule out any kind of social transmission or learning. For example, mice that were fear-conditioned to an otherwise neutral odor produced baby mice that also feared that odor. Their grandbaby mice feared it too. But unlike in Pavlov's studies, communication couldn’t be the explanation. Because the mice never fraternized, and cross-fostering experiments further ruled out social transmission, the newly acquired specific fear had to be encoded in their biological material. (Biochemical analysis showed that the relevant change was likely in the methylation of olfactory reception genes in the sperm of the parents and offspring. Methylation is one example of an epigenetic mechanism.) Natural selection is still the primary shaper of evolutionary change, but the inheritance of acquired traits might play an important role too.

These findings fit into a relatively new field of study called epigenetics. Epigenetic control of gene expression contributes to cells in a single organism (which share the same DNA sequence) developing differently into, e.g., heart cells or neurons. But the last decade has shown actual evidence–and possible mechanisms–for how the environment and the organism's behavior in it might cause heritable changes in gene expression (with no change in the DNA sequence) that are passed on to offspring. In recent years, we have seen evidence of epigenetic inheritance across a wide range of morphological, metabolic, and even behavioral traits. The intergenerational transmission of acquired traits is making a comeback as a potential mechanism of evolution. It also opens up the interesting possibility that better diet, exercise, and education–which we thought couldn't affect the next generation, except with luck through good example–actually could.

Tuesday, January 14, 2014

Iain McGilchrist - Anyone with Half a Brain Can See That! (at TEDxGhent)

 

Iain McGilchrist is the author of The Master and His Emissary: The Divided Brain and the Making of the Western World (2012), a somewhat controversial book (see here, and here for McGilchrist's response, and a reply from the original critic), and the briefer version (31 pages), The Divided Brain and the Search for Meaning (2012, Kindle only, $0.99).

Here is a review from Publishers Weekly (from the book's Amazon page):
A U.K. mental health consultant and clinical director with a background in literature, McGilchrist attempts to synthesize his two areas of expertise, arguing that the "divided and asymmetrical nature" of the human brain is reflected in the history of Western culture. Part I, The Divided Brain, lays the groundwork for his thesis, examining the two hemispheres' significantly different features (structure, sensitivity to hormones, etc.) and separate functions (the left hemisphere is concerned with "what," the right with "how"). He suggests that music, "ultimately... the communication of emotion," is the "ancestor of language," arising largely in the right hemisphere, while "the culture of the written word tends inevitably toward the predominantly left hemisphere." More controversially, McGilchrist argues that "there is no such thing as the brain" as such, only the brain as we perceive it; this leads him to conclude that in different periods of Western civilization (from the Homeric epoch to the present), one or the other hemisphere has predominated, defining "consistent ways of being that persist" through time. This densely argued book, aimed at an academic crowd, is notable for its sweep but a stretch in terms of a uniting thesis.
For those who want a little more about this book and its central thesis, I am also including the RSA Animates video excerpted from McGilchrist's talk and workshop at the RSA.

Iain McGilchrist - Anyone with Half a Brain Can See That! (at TEDxGhent)


Published on Jan 11, 2014


Iain McGilchrist is a psychiatrist and writer, committed to the idea that the mind and brain can be understood only by seeing them in the broadest possible context, that of the whole of our physical and spiritual existence, and of the wider human culture in which they arise -- the culture which helps to mould, and in turn is moulded by, our minds and brains. His talk 'The Divided Brain' was a Best of the Web pick by TED!

Here is the RSA Animate video - enjoy!


Uploaded on Oct 21, 2011

In this new RSA Animate, renowned psychiatrist and writer Iain McGilchrist explains how our 'divided brain' has profoundly altered human behaviour, culture and society.

Taken from a lecture given by Iain McGilchrist as part of the RSA's free public events programme. The RSA is a 258-year-old charity devoted to creating social progress and spreading world-changing ideas, supporting research, RSA Animates, a free events programme, and a 27,000-strong fellowship.

The Roadmap to Nobility: Cindy Wigglesworth at TEDxLowerEastSide


Nice talk by Cindy Wigglesworth at TEDxLowerEastSide.

The Roadmap to Nobility: Cindy Wigglesworth at TEDxLowerEastSide

Published on Jan 12, 2014


Cindy Wigglesworth is the author of SQ21: The Twenty-One Skills of Spiritual Intelligence and the President of Deep Change, Inc. Cindy worked at Exxon in human resources management for 20 years. Seeing both the strengths and limitations of traditional leadership approaches, she left ExxonMobil to start her own business dedicated to developing the multiple intelligences of leaders and organizations.

Cindy is ambassador for Conscious Capitalism, and is quoted by Patricia Aburdene in her book Megatrends 2010: The Rise of Conscious Capitalism. Her passionate commitment is to help birth a new way of talking about spirituality that gets us beyond the barrier of "religion versus science" and lands us squarely in the "this stuff works!" category of applied wisdom.

Her corporate clients have included The Methodist Healthcare System (now a Fortune 100 best employer), BHP Billiton Petroleum, and MD Anderson Cancer Center.

The Top 10 Insights from the “Science of a Meaningful Life” in 2013 (Greater Good Science Center)

From UC Berkeley's Greater Good Science Center, here is a collection of 10 research summaries on topics related to living a meaningful life: for example, the idea that a meaningful, healthy life is not the same as a happy life; that mindfulness meditation can make people more altruistic (even when situational barriers discourage helping); and that the emotional benefits of altruism are likely to be human universals.

There is some nice research summarized here - and for a nice change of pace, the news is good.

The Top 10 Insights from the “Science of a Meaningful Life” in 2013


Below are some of the most surprising, provocative, and inspiring findings published this past year.
By Jason Marsh, Devan Davison, Bianca Lorenz, Lauren Klein, Jeremy Adam Smith, Emiliana R. Simon-Thomas

January 2, 2014


The past few years have been marked by two major trends in the science of a meaningful life.

One is that researchers continued to add sophistication and depth to our understanding of positive feelings and behaviors. Happiness is good for you, but not all the time; empathy ties us together, but can also overwhelm us; humans are born with an innate sense of fairness and morality that changes in response to context. This has been especially true of the study of mindfulness and attention, which is producing more and more potentially life-changing discoveries.

The other factor involves intellectual diversity. The turn from the study of human dysfunction to human strengths and virtues may have started in psychology, with the positive psychology movement, but that perspective spread to adjacent disciplines like neuroscience and criminology, and from there to fields like sociology, economics, and medicine. Across all these fields, we’re seeing more and more support for the idea that empathy, compassion, and happiness are not fixed, you-have-it-or-not capacities but skills that can be cultivated by individuals and by groups of people through deliberate decisions.

As of 2013, the UC Berkeley Greater Good Science Center is part of a mature, multidisciplinary movement. Here are 10 scientific insights, published in peer-reviewed journals over the past year, that we anticipate will be cited in scientific studies, help shift public debate, and change individual behavior in the year to come.


A meaningful life is different from—and healthier than—a happy one.



The research we cover here at the Greater Good Science Center is often referred to as “the science of happiness,” yet our tagline is “The Science of a Meaningful Life.” Meaning, happiness—is there a difference?

New research suggests that there is. When a study in the Journal of Positive Psychology tried to disentangle the concepts of “meaning” and “happiness” by surveying roughly 400 Americans, it found considerable overlap between the two—but also some key distinctions.

Based on those surveys, for instance, feeling good and having one’s needs met seem integral to happiness but unrelated to meaning. Happy people seem to dwell in the present moment, not the past or future, whereas meaning seems to involve linking past, present, and future. People derive meaningfulness (but not necessarily happiness) from helping others—being a “giver”—whereas people derive happiness (but not necessarily meaningfulness) from being a “taker.” And while social connections are important to meaning and happiness, the type of connection matters: Spending time with friends is important to happiness but not meaning, whereas the opposite is true for spending time with loved ones.

And other research published in the Proceedings of the National Academy of Sciences suggests that these differences might have important implications for our health. When Barbara Fredrickson and Steve Cole compared the immune cells of people who reported being “happy” with those of people who reported “a sense of direction and meaning,” the people leading meaningful lives seemed to have stronger immune systems.


The emotional benefits of altruism might be a human universal.



One of the most significant findings to have emerged from the sciences of happiness and altruism has been this: Altruism boosts happiness. Spending on others makes us happier than spending on ourselves—at least among the relatively affluent North Americans who have participated in this research.

But a paper published in the Journal of Personality and Social Psychology suggested that this finding holds up around the world, even in countries where sharing with others might threaten someone’s own subsistence.

In one study, the researchers examined data from more than 200,000 people in 136 countries; they determined that donating to charity in the past month boosts happiness “in most individual countries and all major regions of the world,” cutting across cultures and levels of economic well-being. This held true even for people who said they’d had trouble securing food for their family in the past year.

When the researchers zeroed in on three countries with vastly different levels of wealth—Canada, Uganda, and India—they found that people reported greater happiness recalling a time when they’d spent money on others than when they’d spent on themselves. And in a study comparing Canada and South Africa, people reported feeling happier after donating to charity than after buying themselves a treat, even though they would never meet the beneficiary of their largess. This suggests to the researchers that their happiness didn’t result from feeling like they were strengthening social connections or improving their reputation but from a deeply ingrained human instinct.

In fact, they argue, the nearly universal emotional benefits of altruism suggest it is a product of evolution, perpetuating behavior that “may have carried short-term costs but long-term benefits for survival over human evolutionary history.”


Mindfulness meditation makes people more altruistic—even when confronted with barriers to compassionate action.



In March, the GGSC hosted a conference called “Practicing Mindfulness & Compassion,” where speakers made the case that the practice of mindfulness—the moment-by-moment awareness of our thoughts, feelings, and surroundings—doesn’t just improve our individual health but also makes us more compassionate toward others. Coincidentally, just weeks after the conference, two new studies bolstered this claim.

The first study, published in Psychological Science, found that people who took an eight-week mindfulness meditation course were significantly more likely than a control group to give up their waiting-room seat for a person on crutches. This was true despite the fact that other people in the waiting room (who were secretly working with the researchers) didn’t acknowledge the person in need or make any gesture to give up their own seats; prior research suggests that this kind of inaction strongly deters bystanders from helping out, but that wasn’t the case when the bystanders had received training in mindfulness.

A few weeks later, another study published in Psychological Science echoed that finding. In this second study, which was unrelated to the first, people who had practiced a mindfulness-based “compassion meditation” for a total of just seven hours over two weeks were significantly more likely than people who hadn’t received the training to give money to a stranger in need. What’s more, after completing their training, the meditation group showed noticeable changes in brain activity, including in networks linked to understanding the suffering of others.

“Our findings,” write the authors of the second study, “support the possibility that compassion and altruism can be viewed as trainable skills rather than as stable traits.”




Meditation changes gene expression.


Are genes destiny? They certainly influence our behavior and health outcomes—for example, one study published in 2013 found that genes make some people more inclined to focus on the negative. But more and more research is revealing how it’s a two-way street: Our choices can also influence how our genes behave.

In 2013, a collaborative project between researchers in Spain and France and at the University of Wisconsin found that when experienced meditators meditate, they quiet down the genes that express bodily inflammation in response to stress.

How did they figure this out? Before and after two different retreat days, the researchers drew blood samples from 19 long-term meditators (averaging more than 6000 lifetime hours) and 21 inexperienced people. During the retreat, the meditators meditated and discussed the benefits and advantages of meditation; the non-meditators read, played games, and walked around.

After this experience, the meditators’ inflammation genes—measured by blood concentrations of enzymes that catalyze or are a byproduct of gene expression—were less active. Blood samples from the people in the leisure-day condition did not show these changes.

Why does this matter? The researchers also looked at their study participants’ ability to recover from a stressful event. Long-term meditators’ ability to turn down inflammatory genes, it turns out, predicted how quickly stress hormones in their saliva diminished after a stressful experience—a sign of healthy coping and resilience that can potentially lead to a longer life.

This is good news for people who come from stress-prone families and are stress-prone themselves: There are steps you can take to mitigate the impact of stressful events. Hard as it may be to find time or get excited about meditating, mounting evidence suggests that it can offer more concrete advantages for a healthy life than the leisurely activities we more readily seek.




Mindfulness training improves teachers’ performance in the classroom.


For educators grappling with students’ behavioral problems and other sources of stress, new research suggested an effective response: mindfulness.

Although mindfulness-based programs are not uncommon in schools these days, they’ve mainly been deployed to enhance students’ social, emotional, and cognitive skills; only a handful of programs and studies have examined the benefits of mindfulness for teachers, and in those cases, the research has focused largely on the general benefits for teachers’ mental health.

But in 2013, researchers at the University of Wisconsin’s Center for Investigating Healthy Minds broke new ground when they studied the impact of an eight-week mindfulness course developed specifically for teachers, looking not only at its effects on the teachers’ emotional well-being and levels of stress but also on their performance in the classroom.

They found that teachers randomly assigned to take the course felt less anxious, depressed, and burned out afterward, and felt more compassionate toward themselves. What’s more, according to experts who watched the teachers in action, these teachers ran more productive classrooms after completing the course and improved at managing their students’ behavior as well. The results, published in Mind, Brain, and Education, show that stress and burnout levels actually increased among teachers who didn’t take the course.

The researchers speculate that mindfulness may carry these benefits for teachers because it helps them cope with classroom stress and stay focused on their work. “Mindfulness-based practices offer promise as a tool for enhancing teaching quality,” write the researchers, “which may, in turn, promote positive student outcomes and school success.”




There’s nothing simple about happiness.


Who doesn’t want to be happy? Happy is always good, right?

Sure. Just don’t be too happy, OK? June Gruber and her colleagues analyzed health data and found that it’s much better to be a little bit happy over a long period of time than to experience wild spikes in happiness. Another study, published in the journal Emotion, showed that seeking happiness at the right time may be more important than seeking happiness all the time: Allowing yourself to feel emotions appropriate to a situation—whether or not they are pleasant in the moment—is a key to long-lasting happiness.

In a study published earlier in the year in the journal Psychological Science, Sonja Lyubomirsky and Kristin Layous found that not all research-approved happiness practices work for everyone all the time. “Let’s say you publish a study that shows being grateful makes you happy—which it does,” Lyubomirsky recently told us. “But, actually, it’s much harder than that. It’s actually very hard to be grateful, and to be grateful on a regular basis, and at the right time, and for the right things.” She continued:
So, for example, some people have a lot of social support, some people have little social support, some people are extroverted, some people are introverted—you have to take into account the happiness seeker before you give them advice about what should make them happy. And then there are factors relevant to the activity that you do. How is it that you’re trying to become happier? How is it that you’re trying to stave off adaptation? Are you trying to appreciate more? Are you trying to do more acts of kindness? Are you trying to savor the moment? The kind of person you are, the different kinds of activities, and how often you do them, and where you do them—these are all going to matter.
The bottom line might be that if happiness were really that simple, we’d all be happy all the time. But we’re not, and that appears to be because there is no rigid formula for happiness. It’s a state that comes and goes in response to how we’re changing and how our world is changing.




Gratitude can save your life.


Or at least help lessen suicidal thoughts, says a study published in the Journal of Research in Personality.

Across a four-week period, 209 college students answered questions designed to measure depression, suicidal thoughts, grit, gratitude, and meaning in life. The idea was to see whether the positive traits—grit and gratitude—mitigated the negative ones. Since depression is a major contributing factor to suicide, the researchers controlled for that variable throughout the study.

Grit, said the authors, is “characterized by the long-term interests and passions, and willingness to persevere through obstacles and setbacks to make progress toward goals aligned or separate from these passionate pursuits.” It stands to reason that someone with lots of grit wouldn’t waste much time on suicidal thoughts.

But what about gratitude? That entails noticing the benefits and gifts received from others, and it gives an individual a sense of belonging. That should make life feel worth living—and, indeed, the researchers found that gratitude and grit worked synergistically to make life more meaningful and to reduce suicidal thoughts, independent of depression symptoms.

As the authors note, their study has huge clinical implications: If therapists can specifically foster gratitude in suicidal people, they should be able to increase their sense that life is worth living. This new finding adds to a pile of new research on the benefits of gratitude. Saying “thanks” can make you happier, sustain your marriage through tough times, reduce envy, and even improve physical health.




Employees are motivated by giving as well as getting.


Over the past two decades, work satisfaction has declined, while time spent at work has significantly increased. Not a good combination!

Would paying people more money help? Some studies have shown that rewarding employees for their hard work and late nights at the office with a bonus will make things a little better and quiet dissatisfaction. But in September, through the collaborative research of Lalin Anik, Lara B. Aknin, Michael I. Norton, Elizabeth W. Dunn, and Jordi Quoidbach, we learned that employee bonuses might have the most positive effects when they’re spent on others. The researchers suggested an alternative bonus offer that has the potential to provide some of the same benefits as team-based compensation—increased social support, cohesion, and performance—while carrying fewer drawbacks.

Their first experiment focused on broad, self-reported measures of how prosocial bonuses affect an employee’s job satisfaction: Participants were either given a bonus to spend on charity or given no bonus at all, and those who gave to charity reported increased happiness and job satisfaction. The second experiment, conducted in two parts, focused on “sports team orientation”—the difference between donating to a fellow team member and spending on oneself—and attempted to see whether prosocial bonuses improved actual performance. In the first part, participants were given $20 and told to spend it on a teammate or on themselves over the course of the week; in the second part, they were given $22 with the same instructions. Both parts found more positive effects for those who spent the money on a teammate than for those who spent it on themselves.

This collaborative research indicates that prosocial bonuses can benefit both individuals and teams, on both psychological and “bottom line” indicators, in both the short and the long term. So when you receive your bonus this year, you might want to think twice before buying that pair of shoes you’ve been dying for, and instead consider spending it on someone else—because, according to this research, you’ll probably be much happier and more satisfied with your job.


Subtle contextual factors influence our sense of right and wrong.



An out-of-control train will kill five people. You can switch the train onto another track and save them—but doing so will kill one person. What should you do?

A series of experiments published in the journal Psychological Science suggests that on one day you’ll divert the train and save those five lives—but on another you might not. It all depends on how the dilemma is framed and how we’ve been thinking about ourselves.

Through the train dilemma and other experiments, the study revealed two factors that can influence our moral decisions. The first involves how morality has been framed for you, in this case around consequences or around rules. For example, when researchers prompted participants to think in terms of consequences, some readily diverted the train, saving the five lives at the cost of one. Those who were prompted to think in terms of rules (e.g., “thou shalt not kill”) let the five die. But that factor was influenced by another that depends on memory: whether your past ethical or unethical behavior is on your mind. A memory of a good deed might make you more likely to cheat, for example, if you’re urged to think in terms of consequences. It’s the complex interaction between those two factors that shapes your decision.

That wasn’t the only study published during the past year that revealed how susceptible we are to context. One study found that people are more moral in the morning than in the afternoon. Another study, cleverly titled “Hunger Games,” found that when people are hungry, they express more support for charitable giving. Yet another experiment discovered that thinking about money makes you more inclined to cheat at a game—but thinking about time keeps you honest.

The bottom line is that our sense of right and wrong is heavily influenced by seemingly trivial variables in memory, in our bodies, and in changes within our environment. This doesn’t necessarily lead us to pessimistic conclusions about humanity—in fact, knowing how our minds work might help us to make better moral decisions.


Anyone can cultivate empathic skills—even psychopaths.



In daily life, calling someone a “psychopath” or a “sociopath” is a way of saying that the person is beyond redemption. Are they?

When neuroscientist James Fallon accidentally discovered that his brain resembled that of a psychopath—showing less activity in areas of the frontal lobe linked to empathy—he was confused. After all, Fallon was a happily married man, with a career and good relationships with colleagues. How could he be beyond redemption?

Additional genetic tests revealed “high-risk alleles for aggression, violence and low empathy.” What was going on? Fallon decided he was a “pro-social psychopath,” someone whose genetic and neurological inheritance makes it hard for him to feel empathy, but who was gifted with a good upbringing and environment—good enough to overcome latent psychopathic tendencies.

This self-description found support in a study published this year by Swiss and German researchers, which showed that education levels and “social desirability” seemed to improve empathy in diagnosed psychopaths. Another new study found that empathy deficits don’t necessarily lead to aggression.

It seems that psychopaths can be taught to feel empathy and compassion, though they have a disability that makes developing those skills difficult. When a team of researchers looked at the brain activity of psychopathic criminals in the Netherlands, for example, they discovered the predictable empathic deficits. But they also found that it made a difference in their brains to simply ask the criminals to empathize with others—hinting that empathy may be repressed rather than missing entirely in people classified as psychopaths. For some, at least, it may help a great deal to lift that repression.

Psychopathy remains an intractable mental illness and social problem—this year’s studies of treatment did not reveal a magic bullet that would turn psychopaths into angels. But we can take heart in the fact that if they can develop empathic skills, anyone can.