
Saturday, September 15, 2012

One Day - A Short Animated Film on Impermanence


Via io9, this is a cool little animated film about a man whose house jumps around the world, spending only a day in each location. So what happens when he wants to stay someplace?

Film blurb: “One man always on the move will have an encounter that puts into question everything he knows.”


A Sweet Animated Short about a Man Whose House Teleports All Over the World


The short animated film One Day features a man with an incredible gift: his home jumps around the world, spending just one day in each location. But what happens when the house's occupant finds a place where he wants to stay?

This short comes out of Paris' Gobelins School of the Image. There are shades of Doctor Who in the house that jumps around, unbidden by its owner, but it has a very different, very travel-weary protagonist who might be swayed to stay put after a simple, human encounter.

For non-French-speakers, "rien à faire" means "nothing to do."

[via Geek Art Gallery]

Matthew Kálmán Mezey on "Clumsy Leadership," Cultural Theory, and Development



RSA Chief Executive Matthew Taylor recently gave his annual lecture (on 12 September), with an accompanying article in the RSA Journal. Matthew Kálmán Mezey, the RSA's Online Community Manager, wrote a fairly long and detailed response to the lecture, essentially (to give the nutshell summary) asking Taylor whether he is serious about creating the conditions for people to increase their development (by which Mezey generally means Robert Kegan's model of cognitive development, which is only one line among many) and, if so, how he envisions doing so.

[Let's ignore, for now, the whole thorny issue of "development," such as what it is, does it actually exist, is it an inherent predisposition, as often suggested in integral theory, or is it an evolutionary adaptation to life conditions, and is it morally and ethically right to "develop" people without a full disclosure of the what, why, and how?]

Taylor spoke about Cultural Theory (he calls it the "theory of plural rationality") in his lecture (using the recent example of the Olympics) and the idea that we need "clumsy" leadership to get us through some of the incredibly difficult problems we face, what Keith Grint calls "wicked problems":
A Wicked Problem is more complex, rather than just complicated – that is, it cannot be removed from its environment, solved, and returned without affecting the environment. Moreover, there is no clear relationship between cause and effect. Such problems are often intractable....
Cultural Theory advocates for "clumsy solutions" to these types of problems, since normal solution-focused thinking is no longer useful at this level. A "clumsy" solution involves, by Taylor's account, "the three active rationalities of hierarchy, egalitarianism, and individualism (as well as recognising the fourth, passive, rationality of fatalism)." Furthermore,
Leaders who can be part of developing clumsy solutions to wicked problems are likely to have reached an advanced level of awareness not just of the tasks, context and stakeholders but of themselves and – crucially – of the perspectives of others using different frames of rationality.

But lest this seem merely like an elitist’s call for a new generation of enlightened leaders it is important to recognise that leadership is also about followership.

Mezey suggests that "clumsy" leadership is roughly equivalent to Robert Kegan's self-transforming stage (the highest stage he has identified, which exists in less than 1% of the population). Using Jonathan Haidt as a (flawed, in my opinion) example, he ponders whether the 16 years it took Haidt, by his own account, to move from an intellectual awareness of multiple perspectives to actually living a post-partisan perspective (again, by his own account; he appears to have a conservative bias when I hear him speak without a script) is a possible norm for "growing" people.

Haidt is a poor example for another reason - his life is that of an academic and author. Such a life offers much greater opportunity for growth, and time for speculation, than that of a government official or a corporate CEO. Nor do the rest of us who work for a living have the freedom Haidt likely enjoys.

Anyway. There are links to a lot of good articles and books in Mezey's piece, and I highly recommend it, although it assumes some working knowledge of leadership, integral theory (especially Robert Kegan, but also some Spiral Dynamics), and Cultural Theory (the links to Matthew Taylor's article provide a good foundation).

Finally, Mezey offers a variety of program ideas that may stimulate growth in people - and solutions are what we need. Among his suggestions are the following:
  • Launch a research project to uncover how commonplace the Self-transforming/Clumsy mind actually is. 
E.g. across the UK, in organisations, in different professions, in Government depts, in No. 10 Downing Street, in the RSA...
  • Undertake a literature review/meta-analysis of all interventions which have fostered positive adult growth (towards 'Clumsiness').
How many have successfully fostered development to the Socialised (traditional) stage, how many to the Self-authoring (modern) stage, how many to the Self-transforming - Clumsy - stage? Which ones appear valid and replicable? Could we help any to become widespread?
  • Produce a template to help active citizens create ‘Clumsy’ solutions
A toolkit to help active citizens to uncover how each of the three Cultural Theory rationalities would view any issue, followed by guidance on how to mesh them together to build a powerful and sustainable ‘Clumsy’ solution. (NB Cultural Theory sometimes adds in Fatalism and the Hermit as additional rationalities).
  • Start testing out the collaborative/organisational approaches - such as ‘Future Search’ - that Cultural Theory has assessed as being most ‘Clumsy’
An excellent - though, of course, challengeable - Cultural Theory paper by Steven Ney and Marco Verweij is titled Messy Institutions for Wicked Problems: How to Generate Clumsy Solutions. It looks at a number of approaches to collaboration, strategising and decision-making in organisations to see which ones are most ‘Clumsy’/’Messy’ (i.e., does each honour the 4 or 5 Cultural Theory lenses?).

Candidate approaches researched include Open Space, Soft Systems Methodology, Citizens’ Juries, Bohm Dialogues, Future Search, Wisdom Circles and the Learning Organisation. (It assesses 19 in all).

The most Clumsy/Messy - and potentially Self-transforming - is Future Search, with others like Design Thinking and 21st Century Town Meetings coming close too. Bohm Dialogue, Learning Organisation and Open Space are some of the approaches that only honour one of the rationalities, and so are far from integrative and ‘Clumsy’. (I’m not the only one who might quibble with some of this - but it’s a great step forward to scan the current tools being used in organisations, from NGOs to corporations, and to analyse which are most integrative/Clumsy).
  • Begin to assess government policy proposals to see whether they are integrative and ‘clumsy’, or not.
Use Ney and Verweij’s assessment approach - above - but apply it to proposed Government policies.

Another angle on this might be to undertake a before and after adult developmental stage assessment, to see if policies are fostering the growth of ‘Self-authoring’ minds (as the OECD said was so important in the 21st century - see Beyond the Big Society) or Self-transforming minds.
  • A booklet featuring the key proponents of the different models of plural rationalities outlining how their approach has successfully dealt with - or could deal with - a particular ‘wicked’ issue.
An edited collection of short articles could showcase approaches such as:
- Cultural Theory’s plural rationalities
- Jonathan Haidt’s moral matrices
- Prof Robert Kegan’s ways of knowing
- Prof Clare Graves/Spiral Dynamics’ ‘Value memes’
- Mark Williams’ ‘10 Lenses’ (actually there are really 11, as he has a clumsy/integrative one too).
- Pat Dade’s 12 ‘Values Modes’ (with its three Maslowian top-level categories of ‘Sustenance Driven’, ‘Outer Directed’ and ‘Inner Directed’)
- Torbert/Loevinger/Cook-Greuter ‘Action Logics’/Ego stages
- Lawrence Kohlberg’s moral stages
- Ken Wilber’s integral approach (though this is an integration, rather than a single model)

(And potentially many others too: Michael Commons, William Perry, Belenky et al, Hall-Tonna, Richard Barrett, King and Kitchener, Marcia Baxter Magolda etc).

I don’t think any publication like this has been attempted before - and it could be very helpful indeed to policy-makers - offering some really fresh thinking on ‘Wicked’ issues.

  • Support ‘Clumsy’ leaders, help them to stay - and grow - inside their organisations
A new ‘Clumsy’ leaders network might help, and I’m sure Jennifer Garvey Berger isn’t the only person who’s thought about how to support Self-transforming/clumsy individuals so that they don’t leave their organisations - where their input could be unique and valuable. (This might make a good topic for a research project). The idea that organisational culture often acts as a sorting mechanism to drive away the very people who might be able to succeed with ‘Wicked’ issues is pretty troubling.

Another place to look for ways to turn organisations into supportive and deliberately transformational institutions might be the work on schools by Eleanor Drago-Severson. Her wonderful 2009 book Leading Adult Learning: Supporting Adult Development in Our Schools is a pioneering look at how school leaders can foster the adult development of their staff by understanding developmental diversity, so that their schools can be the most effective for their students.

Ellie describes 4 practices that schools - and any other organisation - can use to support adult transformation and growth:

- Teaming
- Providing Leadership Roles
- Collegial inquiry
- Mentoring

Of course, it may even turn out that the late Elliot Jaques was right all along with his rather prescriptive and hierarchical vision of a ‘Requisite Organisation’ that is designed to reflect, engage and support the different stages of cognitive complexity of staff. (Part of the problem might be that Jaquesians have never properly reworked his model for the world of knowledge-based work?)

  • From Alpha Course… to Genesis Course - spreading transformation across the UK and beyond
The RSA’s Beyond the Big Society report called for ‘transformational learning hubs which run training exercises for community leaders’.
 
One way I envisage to do this would be to create a secularised, transformational equivalent of the Alpha Course, complete with shared meals - and an active citizenship focus.

The Alpha Course has proven hugely popular with its 'opportunity to explore the meaning of life': it has attracted 3 million participants in the UK, and 15 million worldwide. I think some don’t - initially - even fully realise its Christian basis. It now runs in churches, homes, workplaces, prisons, universities and elsewhere.


Could there be a way to fuse citizens’ skills, self-development, effectiveness and community engagement (and ‘Social Brain’ reflexivity) into a deliberately transformational 10-week course that could spread across the 100 countries that have RSA Fellows?

I particularly like the idea of "wicked problems" pamphlets with solutions offered by many or most of the folks Mezey identifies above. These white papers could be very influential if given enough publicity and put in the hands of sympathetic leaders in various fields.


I also like the idea of creating a template for local leaders or organizers to generate clumsy solutions for wicked problems at the local level, which might then be scaled up to the next bigger level, and so on. Bottom-up approaches, at least in the U.S., tend to work very well (witness the highly organized GOP takeover of school boards, local government, state government, and then, for much of the last 45 years, the White House).



One idea I would like to see is some way to identify those people with the most power to create change and cultural shifts, however many that may be - from 10 or 15 to possibly 500. Then create an assessment that could easily identify their stage according to Kegan's model, with the goal of finding those who may already be in position, developmentally, to implement these "clumsy solutions," so that they can be given information and a meta-framework for understanding these ideas.


My sense is that there are thousands of people in powerful positions who, if they could be shown the importance of greater depth and wider span in their leadership methods, would already be in position to generate considerable change very quickly.


But I am, strangely, an optimist.



ENCODE - The ENCyclopedia Of DNA Elements

Early last week, the science world was buzzing with the release of more than 30 papers highlighting the results from the second phase of ENCODE - "a consortium-driven project tasked with building the ‘ENCyclopedia Of DNA Elements’, a manual of sorts that defines and describes all the functional bits of the genome."

In the following article, Nature offered a comprehensive overview of the ENCODE project - following the article, there are three links that look at the results so far and some of what they suggest.

ENCODE: The human encyclopaedia

First they sequenced it. Now they have surveyed its hinterlands. But no one knows how much more information the human genome holds, or when to stop looking for it.


By Brendan Maher

05 September 2012


Ewan Birney would like to create a printout of all the genomic data that he and his collaborators have been collecting for the past five years as part of ENCODE, the Encyclopedia of DNA Elements. Finding a place to put it would be a challenge, however. Even if it contained 1,000 base pairs per square centimetre, the printout would stretch 16 metres high and at least 30 kilometres long.
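As a quick sanity check on those figures - a back-of-the-envelope sketch in Python, taking the quoted print density and sheet dimensions at face value - the printout implies roughly five trillion base pairs of data, far more than the ~3 billion letters of the genome itself, which makes sense given that ENCODE spans many experiments across many cell types:

# Rough check of the printout figures quoted above (assumed values).
bp_per_cm2 = 1_000            # quoted density: 1,000 base pairs per cm^2
height_cm = 16 * 100          # 16 metres, in centimetres
length_cm = 30 * 1000 * 100   # 30 kilometres, in centimetres

total_bp = bp_per_cm2 * height_cm * length_cm
print(f"Implied data volume: {total_bp:.1e} base pairs")   # ~4.8e12

# The human genome itself is ~3e9 bp, so this is roughly 1,600
# genome-equivalents of raw data.
print(f"Genome equivalents: {total_bp / 3e9:,.0f}")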

ENCODE was designed to pick up where the Human Genome Project left off. Although that massive effort revealed the blueprint of human biology, it quickly became clear that the instruction manual for reading the blueprint was sketchy at best. Researchers could identify in its 3 billion letters many of the regions that code for proteins, but those make up little more than 1% of the genome, contained in around 20,000 genes — a few familiar objects in an otherwise stark and unrecognizable landscape. Many biologists suspected that the information responsible for the wondrous complexity of humans lay somewhere in the ‘deserts’ between the genes. ENCODE, which started in 2003, is a massive data-collection effort designed to populate this terrain. The aim is to catalogue the ‘functional’ DNA sequences that lurk there, learn when and in which cells they are active and trace their effects on how the genome is packaged, regulated and read.


After an initial pilot phase, ENCODE scientists started applying their methods to the entire genome in 2007. Now that phase has come to a close, signalled by the publication of 30 papers, in Nature, Genome Research and Genome Biology. The consortium has assigned some sort of function to roughly 80% of the genome, including more than 70,000 ‘promoter’ regions — the sites, just upstream of genes, where proteins bind to control gene expression — and nearly 400,000 ‘enhancer’ regions that regulate expression of distant genes (see page 57) [1]. But the job is far from done, says Birney, a computational biologist at the European Molecular Biology Laboratory’s European Bioinformatics Institute in Hinxton, UK, who coordinated the data analysis for ENCODE. He says that some of the mapping efforts are about halfway to completion, and that deeper characterization of everything the genome is doing is probably only 10% finished. A third phase, now getting under way, will fill out the human instruction manual and provide much more detail.

Many who have dipped a cup into the vast stream of data are excited by the prospect. ENCODE has already illuminated some of the genome’s dark corners, creating opportunities to understand how genetic variations affect human traits and diseases. Exploring the myriad regulatory elements revealed by the project and comparing their sequences with those from other mammals promises to reshape scientists’ understanding of how humans evolved.

Yet some researchers wonder at what point enough will be enough. “I don’t see the runaway train stopping soon,” says Chris Ponting, a computational biologist at the University of Oxford, UK. Although Ponting is supportive of the project’s goals, he does question whether some aspects of ENCODE will provide a return on the investment, which is estimated to have exceeded US$185 million. But Job Dekker, an ENCODE group leader at the University of Massachusetts Medical School in Worcester, says that realizing ENCODE’s potential will require some patience. “It sometimes takes you a long time to know how much can you learn from any given data set,” he says.

Even before the human genome sequence was finished [2], the National Human Genome Research Institute (NHGRI), the main US funder of genomic science, was arguing for a systematic approach to identify functional pieces of DNA. In 2003, it invited biologists to propose pilot projects that would accrue such information on just 1% of the genome, and help to determine which experimental techniques were likely to work best on the whole thing.

The pilot projects transformed biologists’ view of the genome. Even though only a small amount of DNA manufactures protein-coding messenger RNA, for example, the researchers found that much of the genome is ‘transcribed’ into non-coding RNA molecules, some of which are now known to be important regulators of gene expression. And although many geneticists had thought that the functional elements would be those that are most conserved across species, they actually found that many important regulatory sequences have evolved rapidly. The consortium published its results [3] in 2007, shortly after the NHGRI had issued a second round of requests, this time asking would-be participants to extend their work to the entire genome. This ‘scale-up’ phase started just as next-generation sequencing machines were taking off, making data acquisition much faster and cheaper. “We produced, I think, five times the data we said we were going to produce without any change in cost,” says John Stamatoyannopoulos, an ENCODE group leader at the University of Washington in Seattle.

The 32 groups, including more than 440 scientists, focused on 24 standard types of experiment (see ‘Making a genome manual’). They isolated and sequenced the RNA transcribed from the genome, and identified the DNA binding sites for about 120 transcription factors. They mapped the regions of the genome that were carpeted by methyl chemical groups, which generally indicate areas in which genes are silent. They examined patterns of chemical modifications made to histone proteins, which help to package DNA into chromosomes and can signal regions where gene expression is boosted or suppressed. And even though the genome is the same in most human cells, how it is used is not. So the teams did these experiments on multiple cell types — at least 147 — resulting in the 1,648 experiments that ENCODE reports on this week [1, 4–8].



Stamatoyannopoulos and his collaborators [4], for example, mapped the regulatory regions in 125 cell types using an enzyme called DNaseI (see page 75). The enzyme has little effect on the DNA that hugs histones, but it chops up DNA that is bound to other regulatory proteins, such as transcription factors. Sequencing the chopped-up DNA suggests where these proteins bind in the different cell types. The team discovered around 2.9 million of these sites altogether. Roughly one-third were found in only one cell type and just 3,700 showed up in all cell types, suggesting major differences in how the genome is regulated from cell to cell.
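To make those specificity figures concrete, here is a minimal sketch of the kind of bookkeeping involved, with invented site IDs standing in for real genomic coordinates (actual ENCODE analyses work on interval files such as BED, not toy dictionaries):

# Count how many regulatory sites are cell-type-specific vs. ubiquitous.
N_CELL_TYPES = 125  # cell types mapped in the DNaseI study quoted above

# site -> set of cell types in which it was detected (toy data)
sites = {
    "site_a": {"K562"},                                     # one cell type
    "site_b": {"K562", "GM12878", "HeLa"},                  # a few
    "site_c": {f"cell_{i}" for i in range(N_CELL_TYPES)},   # all of them
}

specific = sum(1 for cells in sites.values() if len(cells) == 1)
ubiquitous = sum(1 for cells in sites.values() if len(cells) == N_CELL_TYPES)

print(f"{specific}/{len(sites)} sites found in only one cell type")
print(f"{ubiquitous}/{len(sites)} sites found in all {N_CELL_TYPES} cell types")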

The real fun starts when the various data sets are layered together. Experiments looking at histone modifications, for example, reveal patterns that correspond with the borders of the DNaseI-sensitive sites. Then researchers can add data showing exactly which transcription factors bind where, and when. The vast desert regions have now been populated with hundreds of thousands of features that contribute to gene regulation. And every cell type uses different combinations and permutations of these features to generate its unique biology. This richness helps to explain how relatively few protein-coding genes can provide the biological complexity necessary to grow and run a human being. ENCODE “is much more than the sum of the parts”, says Manolis Kellis, a computational genomicist at the Massachusetts Institute of Technology in Cambridge, who led some of the data-analysis efforts.

The data, which have been released throughout the project, are already helping researchers to make sense of disease genetics. Since 2005, genome-wide association studies (GWAS) have spat out thousands of points on the genome in which a single-letter difference, or variant, seems to be associated with disease risk. But almost 90% of these variants fall outside protein-coding genes, so researchers have little clue as to how they might cause or influence disease.

The map created by ENCODE reveals that many of the disease-linked regions include enhancers or other functional sequences. And cell type is important. Kellis's group looked at some of the variants that are strongly associated with systemic lupus erythematosus, a disease in which the immune system attacks the body's own tissues. The team noticed that the variants identified in GWAS tended to be in regulatory regions of the genome that were active in an immune-cell line, but not necessarily in other types of cell. Kellis's postdoc Lucas Ward has created a web portal called HaploReg, which allows researchers to screen variants identified in GWAS against ENCODE data in a systematic way. "We are now, thanks to ENCODE, able to attack much more complex diseases," Kellis says.
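The core operation behind a portal like HaploReg is conceptually simple: intersect variant coordinates with annotated regulatory intervals. Here is a hedged sketch of that idea - the intervals and variant positions below are invented for illustration; the real tool works genome-wide against the full ENCODE annotation:

from bisect import bisect_right

# Sorted, non-overlapping enhancer annotations for one chromosome:
# (start, end, label) - toy data for illustration only.
enhancers = [
    (10_000, 10_600, "immune-cell enhancer"),
    (52_300, 53_100, "liver enhancer"),
    (90_450, 91_000, "immune-cell enhancer"),
]
starts = [e[0] for e in enhancers]

def annotate(variant_pos):
    """Return the label of the enhancer overlapping the variant, or None."""
    i = bisect_right(starts, variant_pos) - 1
    if i >= 0 and enhancers[i][0] <= variant_pos < enhancers[i][1]:
        return enhancers[i][2]
    return None

for pos in (10_250, 75_000, 90_500):  # toy GWAS hits
    print(f"variant @ {pos}: {annotate(pos) or 'no annotated element'}")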

Are we there yet?

Researchers could spend years just working with ENCODE’s existing data — but there is still much more to come. On its website, the University of California, Santa Cruz, has a telling visual representation of ENCODE’s progress: a grid showing which of the 24 experiment types have been done and which of the nearly 180 cell types ENCODE has now examined. It is sparsely populated. A handful of cell lines, including the lab workhorses called HeLa and GM12878, are fairly well filled out. Many, however, have seen just one experiment.

Scientists will fill in many of the blanks as part of the third phase, which Birney refers to as the ‘build out’. But they also plan to add more experiments and cell types. One way to do that is to expand the use of a technique known as chromatin immunoprecipitation (ChIP), which looks for all sequences bound to a specific protein, including transcription factors and modified histones. Through a painstaking process, researchers develop antibodies for these DNA binding proteins one by one, use those antibodies to pull the protein and any attached DNA out of cell extracts, and then sequence that DNA.

But at least that is a bounded problem, says Birney, because there are thought to be only about 2,000 such proteins to explore. (ENCODE has already sampled about one-tenth of these.) More difficult is figuring out how many cell lines to interrogate. Most of the experiments so far have been performed on lines that grow readily in culture but have unnatural properties. The cell line GM12878, for example, was created from blood cells using a virus that drives the cells to reproduce, and histones or other factors may bind abnormally to its amped-up genome. HeLa was established from a cervical-cancer biopsy more than 50 years ago and is riddled with genomic rearrangements. Birney recently quipped at a talk that it qualifies as a new species.

ENCODE researchers now want to look at cells taken directly from a person. But because many of these cells do not divide in culture, experiments have to be performed on only a small amount of DNA, and some tissues, such as those in the brain, are difficult to sample. ENCODE collaborators are also starting to talk about delving deeper into how variation between people affects the activity of regulatory elements in the genome. “At some places there’s going to be some sequence variation that means a transcription factor is not going to bind here the same way it binds over here,” says Mark Gerstein, a computational biologist at Yale University in New Haven, Connecticut, who helped to design the data architecture for ENCODE. Eventually, researchers could end up looking at samples from dozens to hundreds of people.

The range of experiments is expanding, too. One quickly developing area of study involves looking at interactions between parts of the genome in three-dimensional space. If the intervening DNA loops out of the way, enhancer elements can regulate genes hundreds of thousands of base pairs away, so proteins bound to the enhancer can end up interacting with those attached near the gene. Dekker and his collaborators have been developing a technique to map these interactions. First, they use chemicals that fuse DNA-binding proteins together. Then they cut out the intervening loops and sequence the bound DNA, revealing the distant relationships between regulatory elements. They are now scaling up these efforts to explore the interactions across the genome. “This is beyond the simple annotation of the genome. It’s the next phase,” Dekker says.

The question is, where to stop? Kellis says that some experimental approaches could hit saturation points: if the rate of discoveries falls below a certain threshold, the return on each experiment could become too low to pursue. And, says Kellis, scientists could eventually accumulate enough data to predict the function of unexplored sequences. This process, called imputation, has long been a goal for genome annotation. “I think there’s going to be a phase transition where sometimes imputation is going to be more powerful and more accurate than actually doing the experiments,” Kellis says.
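The stopping rule Kellis describes can be stated very simply in code. This sketch, with invented per-experiment discovery counts and an assumed cost threshold, shows the saturation logic: keep running experiments while each one still yields enough new elements to justify itself:

# Illustrative saturation check (all numbers are invented).
new_elements_per_experiment = [5000, 3200, 2100, 1300, 800, 450, 240]
THRESHOLD = 500  # assumed minimum yield per experiment worth the cost

total = 0
for n, found in enumerate(new_elements_per_experiment, start=1):
    total += found
    if found < THRESHOLD:
        print(f"Stop after experiment {n}: {found} new elements < {THRESHOLD}")
        break

print(f"Total elements catalogued: {total}")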

Yet with thousands of cell types to test and a growing set of tools with which to test them, the project could unfold endlessly. “We’re far from finished,” says geneticist Rick Myers of the HudsonAlpha Institute for Biotechnology in Huntsville, Alabama. “You might argue that this could go on forever.” And that worries some people. The pilot ENCODE project cost an estimated $55 million; the scale-up was about $130 million; and the NHGRI could award up to $123 million in the next phase.

Some researchers argue that they have yet to see a solid return on that investment. For one thing, it has been difficult to collect detailed information on how the ENCODE data are being used. Mike Pazin, a programme director at the NHGRI, has scoured the literature for papers in which ENCODE data played a significant part. He has counted about 300, of which 110 come from labs without ENCODE funding. The exercise was complicated, however, because the word ‘encode’ shows up in genetics and genomics papers all the time. “Note to self,” says Pazin wryly, “make up a unique project name next time around.”

A few scientists contacted for this story complain that this isn’t much to show from nearly a decade of work, and that the choices of cell lines and transcription factors have been somewhat arbitrary. Some also think that the money eaten up by the project would be better spent on investigator-initiated, hypothesis-driven projects — a complaint that also arose during the Human Genome Project. But unlike the genome project, which had a clear endpoint, critics say that ENCODE could continue to expand and is essentially unfinishable. (None of the scientists would comment on the record, however, for fear that it would affect their funding or that of their postdocs and graduate students.)

Birney sympathizes with the concern that hypothesis-led research needs more funding, but says that “it’s the wrong approach to put these things up as direct competition”. The NHGRI devotes a lot of its research dollars to big, consortium-led projects such as ENCODE, but it gets just 2% of the total US National Institutes of Health budget, leaving plenty for hypothesis-led work. And Birney argues that the project’s systematic approach will pay dividends. “As mundane as these cataloguing efforts are, you’ve got to put all the parts down on the table before putting it together,” he says.

After all, says Gerstein, it took more than half a century to get from the realization that DNA is the hereditary material of life to the sequence of the human genome. “You could almost imagine that the scientific programme for the next century is really understanding that sequence.”


Nature 489, 46–48 (06 September 2012) | doi:10.1038/489046a

References

  1. The ENCODE Project Consortium. Nature 489, 57–74 (2012).
  2. International Human Genome Sequencing Consortium. Nature 431, 931–945 (2004).
  3. The ENCODE Project Consortium. Nature 447, 799–816 (2007).
  4. Thurman, R. E. et al. Nature 489, 75–82 (2012).
  5. Neph, S. et al. Nature 489, 83–90 (2012).
  6. Gerstein, M. B. et al. Nature 489, 91–100 (2012).
  7. Djebali, S. et al. Nature 489, 101–108 (2012).
  8. Sanyal, A., Lajoie, B. R., Jain, G. & Dekker, J. Nature 489, 109–113 (2012).


Friday, September 14, 2012

RSA Animate: Dan Ariely - The Truth About Dishonesty


A new RSA Animate, this one built on Dan Ariely's RSA talk on "the truth about dishonesty."


RSA Animate: Dan Ariely - The Truth About Dishonesty

Are you more honest than a banker? Under what circumstances would you lie, or cheat, and what effect does your deception have on society at large? Dan Ariely, one of the world's leading voices on human motivation and behaviour, is the latest big thinker to get the RSA Animate treatment.

Taken from a lecture given at the RSA in July 2012. Watch the longer talk here.

Conversations on Compassion: Jon Kabat-Zinn


Conversations on Compassion with Dr. James Doty and Jon Kabat-Zinn, hosted by CCARE (The Center for Compassion and Altruism Research and Education) at Stanford University on December 14, 2011.

Hal Arkowitz and Scott O. Lilienfeld - Are All Psychotherapies Created Equal?

That is the big question in the world of psychotherapies - and the answer is yes and no. They are equal insofar as all good therapy is about the relationship between the therapist and the client. They are not equal in that some models do not spend much effort on building that crucial relationship.

The other part of it is that some therapies are better suited to some issues than others. There is no model that works for all issues equally.

Are All Psychotherapies Created Equal?

Certain core benefits cut across methods, but some differences in effectiveness remain



As a prospective client searches for a psychotherapist, numerous questions may spring to mind. How experienced is the therapist? Has he helped people with problems like mine? Is she someone I can relate to? Yet it may not occur to clients to ask another one: What type of therapy does the clinician deliver? People often assume that the brand of therapy offered is irrelevant to the effectiveness of treatment. Is this assumption correct?

Psychologists do not agree on whether the “school” of therapy predicts its effectiveness. In a survey in 2006 by psychologists Charles Boisvert of Rhode Island College and David Faust of the University of Rhode Island, psychotherapy researchers responded to the statement that “in general, therapies achieve similar outcomes” with an average score of 6 on a 7-point scale, indicating strong agreement. In contrast, psychologists in practice averaged a rating of 4.5, signifying that they agreed only moderately with that position.

As we will discover, both camps can justify their point of view. Although a number of commonly used psychotherapies are broadly comparable in their effects, some options are less well suited to certain conditions, and a few may even be harmful. In addition, the differences among therapies in their effectiveness may depend partly on the kinds of psychological problems that clients are experiencing.
 
Tale of the Dodo Bird

At least 500 different types of psychotherapy exist, according to one estimate by University of Scranton psychologist John Norcross. Given that researchers cannot investigate all of them, they have generally concentrated on the most frequently used approaches. These include behavior therapy (altering unhealthy behaviors), cognitive-behavior therapy (altering maladaptive ways of thinking), psychodynamic therapy (resolving unconscious conflicts and adverse childhood experiences), interpersonal therapy (remedying unhealthy ways of interacting with others), and person-centered therapy (helping clients to find their own solutions to life problems).

As early as 1936, Washington University psychologist Saul Rosenzweig concluded after perusing the literature that one therapy works about as well as any other. At the time, many of the principal treatments fell roughly into the psychodynamic and behavioral categories, which are still widely used today. Rosenzweig introduced the metaphor of the Dodo Bird, after the feathered creature in Lewis Carroll's Alice in Wonderland, who declared following a race that “everyone has won, and all must have prizes.” The “Dodo Bird verdict” has since come to refer to the claim that all therapies are equivalent in their effects.

This verdict gained traction in 1975, when University of Pennsylvania psychologist Lester Luborsky and his colleagues published a review of relevant research suggesting that all therapies work equally well. It gathered more momentum in 1997, when University of Wisconsin–Madison psychologist Bruce E. Wampold and his co-authors published a meta-analysis (quantitative review) of more than 200 scientific studies in which “bona fide” therapies were compared with no treatment. By bona fide, they meant treatments delivered by trained therapists, based on sound psychological principles and described in publications. Wampold's team found the differences in the treatments' effectiveness to be minimal (and they were all better than no treatment).

One explanation for the Dodo Bird effect is that virtually all types of psychotherapy share certain core features.
Read the whole article.

Thursday, September 13, 2012

High Fructose Intake Linked to Poor Liver Health


Unlike glucose (the simplest sugar, and the foundation of carbohydrate energy use in the body), which can be metabolized and used for energy by cells throughout the body, fructose (fruit sugar) must be metabolized through a rather complex process in the liver. There, some of it eventually becomes glucose, and a substantial portion is bound into triglycerides, destined to become very-low-density lipoproteins (VLDL) or to be stored as fat (or to clog arteries as VLDL cholesterol).

A new study shows that this process, fructolysis, increases uric acid levels in the liver and decreases ATP levels (ATP being the energy that powers cells). The study was done with obese and diabetic subjects, but there is ample evidence that these effects are not confined to this population.
For the present study, 244 obese and diabetic adults from the Look AHEAD Study were evaluated, with dietary fructose consumption estimated by the food frequency questionnaire. Liver ATP and uric acid levels were measured in 105 patients who participated in the Look AHEAD Fatty Liver Ancillary Study. Researchers assessed the change in liver ATP content using an IV fructose challenge in 25 subjects, comparing patients with low fructose consumption (less than 15 grams per day) to those with high fructose consumption (greater than 15 grams per day).

The team found that participants with a high intake of dietary fructose had lower liver ATP levels at baseline and a greater change in ATP content following the fructose challenge than those who consumed a lower amount of fructose. Patients with high uric acid levels (5.5 mg/dL or more) displayed lower ATP stores in response to fructose.

Dr. Abdelmalek concludes, “High fructose consumption and elevated levels of uric acid are associated with more severe depletion of liver ATP. Our findings suggest that increased dietary fructose intake may impair liver ‘energy balance.’ Further research to define the clinical implications of these findings on metabolism and NAFLD is necessary.” The authors highlight the importance of public awareness of the risks associated with a diet high in fructose.
The study citation, as given in the press release:
“Higher Dietary Fructose Is Associated with Impaired Hepatic ATP Homeostasis in Obese Individuals with Type 2 Diabetes.” Manal F. Abdelmalek, Mariana Lazo, Alena Horska, Susanne Bonekamp, Edward W. Lipkin, Ashok Balasubramanyam, John P. Bantle, Richard J. Johnson, Anna Mae Diehl, Jeanne M. Clark, and the Fatty Liver Subgroup of the Look AHEAD Research Group. Hepatology; (DOI: 10.1002/hep.25741); Print Issue Date: September, 2012. URL: http://onlinelibrary.wiley.com/doi/10.1002/hep.25741/abstract

Here is some additional information on fructose metabolism from Wikipedia:

Metabolism

In a 2012 meta-analysis of controlled feeding clinical trials, fructose was not an independent factor for weight gain. Fructose consumption did cause weight gain in a diet with excessive calories, which could be due to the extra calories rather than fructose per se.[40]

Excess fructose consumption has been hypothesized to be a cause of insulin resistance, obesity,[41] elevated LDL cholesterol and triglycerides, leading to metabolic syndrome.[42] In preliminary research, fructose consumption was correlated with obesity.[43][44] A study in mice showed that a high fructose intake may increase adiposity.[45]

Although all simple sugars have nearly identical chemical formulae, each has distinct chemical properties. This can be illustrated with pure fructose. A journal article reports that, "...fructose given alone increased the blood glucose almost as much as a similar amount of glucose (78% of the glucose-alone area)".[46][47][48][49]

In Wistar fatty rats, a laboratory model of diabetes, 10% fructose feeding as opposed to 10% glucose feeding was found to increase blood triglyceride levels by 86%, whereas the same amount of glucose had no effect on triglycerides.[50] Neither glucose nor fructose influenced insulin or blood sugar in this model. The authors concluded: "These results show that in genetically obese, diabetic rats feeding fructose and glucose is associated with an increase in hepatic lipogenic enzyme activities and triglyceride production, and suggest that fructose stimulates triglyceride production but impairs triglyceride removal, whereas glucose stimulates both of them."[50]

Another study in humans concluded that fructose and sucrose are metabolized similarly,[51] whereas a different analysis "produced significantly higher fasting plasma triglyceride values than did the glucose diet in men" and "...if plasma triacylglycerols are a risk factor for cardiovascular disease, then diets high in fructose may be undesirable".[52]

Fructose is a reducing sugar, as are all monosaccharides. The spontaneous chemical reaction of simple sugar molecules binding to proteins is known as glycation. Showing potential cause of skin and bone damage in a rat model of diabetes, investigators suggested "that long-term fructose consumption negatively affects the aging process."[53] Another study using human proteins showed that the glycation caused by fructose appears to be equivalent to glucose and so does not seem to be a better answer for diabetes for this reason alone, save for the smaller quantities required to achieve equivalent sweetness in some foods. It also found evidence for glycation of human lens proteins caused by fructose.[54]

The people most at risk of increased triglycerides and fat storage from eating fructose are those who are consuming more calories than they need. High-fructose corn syrup, as a concentrated source of fructose, poses a similar risk for anyone, healthy or overweight.

If you eat fruits, and you should, be sure your daily calorie intake is in check and that you are exercising regularly. Then, try to choose fruits lower in sugar, such as apples, berries, melons, and so on, that also offer good nutrient and antioxidant benefits.

Dr. Kristin Neff - The Science of Self-Compassion


This is a cool video of Dr. Kristin Neff presenting her talk, The Science of Self-Compassion, offered via The Center for Compassion and Altruism Research and Education at Stanford University. This talk was part of The Science of Compassion: Origins, Measures, and Interventions conference, which took place July 19th to 22nd in Telluride, Colorado. Her lecture was part of the panel Self-Report, Autonomic and Behavioral Measures of Compassion.

Buddhist Geeks 264: McLuhan and Buddhism | How is the Medium Changing the Message?

This cool Buddhist Geeks talk with Ken McLeod is from the 2012 Buddhist Geeks Conference. You can stream the podcast directly at the Buddhist Geeks site; just follow the title link below.

Buddhist Geeks 264: McLuhan and Buddhism | How is the Medium Changing the Message?

BG 264: McLuhan and Buddhism | How is the Medium Changing the Message?
by Ken McLeod


Episode Description:

What is the message of Buddhism today? Self-improvement? A fulfilling life? An understanding of the mysteries of the human condition? How does McLuhan’s famous dictum “the medium is the message” apply now that people are connecting with Buddhism in radically different ways?

In this episode, taken from the Buddhist Geeks Conference in 2012, Ken McLeod explores how McLuhan’s famous dictum “the medium is the message” might apply to Buddhism.


Wednesday, September 12, 2012

Thupten Jinpa, Ph.D. - The Science of Compassion: Origins, Measures, and Interventions


This cool talk was posted at The Center for Compassion and Altruism Research and Education - hosted by Stanford University.
The Science of Compassion: Origins, Measures, and Interventions, which took place July 19th to 22nd in Telluride, Colorado, was the first large-scale international conference of its kind dedicated to scientific inquiry into compassion. The conference convened a unique group of leading world experts in the fields of altruism, compassion, and service to present their latest research. This talk, by Thupten Jinpa, Ph.D., was part of the panel Origins and Conceptual Models of Compassion.

Tami Simon & A.H. Almaas - Love of the Truth, Without End

In this new podcast from Sounds True, Tami Simon speaks with A.H. Almaas (Hameed Ali) about his integral psycho-spiritual system, The Diamond Approach.

Love of the Truth, Without End


Tuesday, September 11, 2012 

Tami Simon speaks with A.H. Almaas. A.H. Almaas is the pen name of Hameed Ali, best known as the originator of the wisdom path known as the Diamond Approach. He is the author of 14 books, including The Unfolding Now, and his works with Sounds True include the audio learning course The Diamond Approach and Realization Unfolds, a dialogue with Adyashanti.

In this episode, Tami speaks with Hameed about some of the distinct characteristics of the Diamond Approach as an approach to investigating both reality and oneself as a path to liberation, why he makes no distinction between a psychological and spiritual approach to inquiry, and how the love of truth drives the process of realization. (73 minutes)


A Quick Guide to Mirror Neurons

This comes from Cell Press's Current Biology, a rare open access article available through the Science Direct portal. This quick guide to mirror neurons accompanied several temporarily open access articles from previous issues that examined mirror neurons, their functions, and the validity of the claims made for them. It is from 2009, so some details may be out of date.
Current Biology
Volume 19, Issue 21, 17 November 2009, Pages R971–R973

Mirror neurons 

Christian Keysers; Social Brain Lab, Department of Neuroscience, University Medical Center, Groningen, Groningen and Netherlands Institute for Neuroscience, Royal Netherlands Academy of Arts and Sciences (KNAW), Amsterdam
http://dx.doi.org/10.1016/j.cub.2009.08.026

What are mirror neurons? Mirror neurons are multimodal association neurons that increase their activity during the execution of certain actions and while hearing or seeing corresponding actions being performed by others. Neurons responding to the sound or sight of some actions, but only to the execution of different actions, are not mirror neurons.
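The definition is strict enough to state as a toy predicate: a neuron qualifies only if the set of actions it fires for during execution overlaps the set it fires for during observation. (The function and its arguments here are invented for illustration.)

# Toy formalization of the definition above.
def is_mirror_neuron(fires_on_execution, fires_on_observation):
    """True only if the neuron responds to the SAME action in both modes."""
    return bool(set(fires_on_execution) & set(fires_on_observation))

print(is_mirror_neuron({"grasp"}, {"grasp"}))  # True: a mirror neuron
print(is_mirror_neuron({"tear"}, {"grasp"}))   # False: responds in both
                                               # modes, but to different actions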

Where are mirror neurons found? Three research groups have reported the existence of mirror neurons in three regions of the macaque cortex (Figure 1). Pending systematic explorations, we do not know whether mirror neurons exist elsewhere in the macaque brain. Recently, mirror neurons have also been reported in the songbird.
Figure 1. Mirror neurons. Left: regions in which mirror neurons have been recorded in the macaque; right: voxels showing activity both during observation and execution in the human brain (from Gazzola and Keysers (2009)). Both brains have been partially inflated to reveal the sulci. Many brain regions have not yet been explored for mirror neurons in the monkey, hence the ‘?’s. IPS, intraparietal sulcus; PF/PFG, areas of the inferior parietal lobule.
Do humans have mirror neurons? This issue has been highly contentious, with no individual piece of evidence generally accepted as definitive, but quite a lot of indirect evidence for human mirror neurons has been reported. First, if a subject moves, the power of the mu-rhythm in the electro-encephalogram (EEG) recorded from his or her brain decreases. Similarly, the EEG rhythm desynchronizes when the subject observes somebody else move. Second, behavioral experiments indicate that the execution of an action is facilitated by viewing someone else execute a similar action, but hindered by viewing an incompatible action. Moreover, transcranial magnetic stimulation (TMS) studies show that watching the performance of an action facilitates the motor cortical representation of the muscles involved in doing the same action. This shows that some neurons involved in performing an action are indeed selectively activated by seeing a similar action — in other words, mirror neurons do exist somewhere in the human brain.
Read the whole article.

Tuesday, September 11, 2012

Georg Northoff - From Emotions to Consciousness – A neuro-phenomenal and neuro-relational approach


This excellent article comes from the always open access Frontiers in Emotion Science (part of the "Frontiers" series of journals). In this article, Northoff extends the James-Lange theory of emotions, which posits that the environment has an indirect but modulating role on emotional feelings "via the body and its sensorimotor and vegetative functions." In his extension, the environment has a direct and "constitutional role in emotional feelings." Northoff calls his model the relational concept of emotional feeling, and he suggests that the "environment itself is constitutive of emotional feeling rather than the bodily representation of the environment."

This is a little geeky, but it represents the shift that is occurring in neuroscience toward a more embedded and embodied view of human experience.

From emotions to consciousness – a neuro-phenomenal and neuro-relational approach

  • Mind, Brain Imaging and Neuroethics Research Unit, Institute of Mental Health Research, University of Ottawa, Ottawa, ON, Canada
The James–Lange theory considers emotional feelings as perceptions of physiological body changes. This approach has recently resurfaced and been modified in both neuroscientific and philosophical concepts of the embodiment of emotional feelings. In addition to the body, the role of the environment in emotional feeling needs to be considered. I here claim that the environment has not merely an indirect and instrumental, i.e., modulatory, role on emotional feelings via the body and its sensorimotor and vegetative functions. Instead, the environment may have a direct and non-instrumental, i.e., constitutional, role in emotional feelings. This implies that the environment itself is constitutive of emotional feeling rather than the bodily representation of the environment. I call this the relational concept of emotional feeling. The present paper discusses recent data from neuroimaging that investigate emotions in relation to interoceptive processing and the brain’s intrinsic activity. These data show the intrinsic linkage of interoceptive stimulus processing to both exteroceptive stimuli and the brain’s intrinsic activity. This is possible only if the differences between intrinsic activity and intero- and exteroceptive stimuli are encoded into neural activity. Such relational coding makes possible the assignment of subjective and affective features to the otherwise objective and non-affective stimulus. I therefore consider emotions to be intrinsically affective and subjective, as is manifest in emotional feelings. The relational approach thus goes together with what may be described as a neuro-phenomenal approach. Such a neuro-phenomenal approach does not only inform emotions and emotional feeling but is also highly relevant to better understand the neuronal mechanisms underlying consciousness in general.

Full Citation: 
Northoff, G. (2012, Aug 31). From emotions to consciousness – a neuro-phenomenal and neuro-relational approach. Front. Psychology 3:303. doi: 10.3389/fpsyg.2012.00303

Here is the introduction to the paper, which offers some background on the way emotions have been understood in contemporary neuroscience.

Introduction

The well-known James–Lange theory determined feelings as perceptions of physiological body changes in the autonomic, hormonal, and motor systems. Once we become aware of physiological bodily changes induced by danger, we feel fear and subjectively experience emotional feelings. James (1884, p. 190) consequently considered bodily changes as central to emotional feelings; “we feel sorry because we cry, angry because we strike, afraid because we tremble, and not that we cry, strike, or tremble, because we are sorry, angry, or fearful, as the case may be.” Modern empirical versions of this theory resurface in current neuroscientific models of emotion as, for instance, in Damasio and others (Damasio, 1999, 2010; Craig, 2003, 2004, 2005, 2009, 2011; Bechara, 2004; Niedenthal, 2007).

Conceptually, the embodied approach to emotion emphasizes the crucial role of the body in emotional feeling. If the body and its vegetative and sensorimotor function play a crucial role in constituting emotional feelings, the body can no longer be considered in a merely objective way but rather as subjective and experienced – the mere Koerper as objective body must be distinguished from the lived body as subjectively experienced body in emotional feeling (Colombetti and Thompson, 2005, 2007; Colombetti, 2008).

The emphasis on the body raises the question of the role of the environment in constituting emotional feelings. The body stands in direct contact with the environment via its sensorimotor functions, which are emphasized in recent body-based, e.g., embodied, concepts of emotional feelings (see Niedenthal et al., 2005; Niedenthal, 2007). The body is supposed to represent the environment in sensorimotor terms, and it is these bodily representations that are considered crucial in constituting emotional feelings. The environment may then have an indirect and modulatory role, via the body, in the constitution of emotional feelings.

One could also imagine that the environment has a direct and constitutive role in emotional feeling; the environment may then directly constitute emotional feeling independent of the body’s sensorimotor (and vegetative) functions. In this case, emotional feelings should be constituted directly by the respective person’s and its brain’s relation to the social environment (see below for definition) rather than indirectly via bodily representations. Since the person-environment relation is crucial here, I call such approach the relational concept of emotional feeling (see Northoff, 2004 for a general outline of such relational approach and Ben-Ze’ev, 1993 for the characterization of perception as relational).

The general aim of the present paper is to review recent human imaging data on emotional feelings in relation to both interoceptive processing and the brain’s intrinsic activity. This will be accompanied by discussing the empirical and conceptual implications of these data which I assume to favor a relational approach to emotions. Such relational concept characterizes emotions and emotional feeling to be intrinsically affective and subjective. Neuronally I assume this to be related to the interaction of the stimuli with the brain’s intrinsic activity, i.e., rest-stimulus interaction (see below for definition). Finally, the empirical and conceptual implications of such relational approach to emotions for consciousness are pointed out.