Jonathan Shedler, PhD, is something of a hero of mine. He has offered the best meta-analysis and most convincing evidence available that cognitive behavioral therapy (CBT) is, in general, not very effective (and that few CBT therapists actually practice manualized CBT), and that psychodynamic therapy is considerably more effective on nearly every measure (see The Efficacy of Psychodynamic Psychotherapy, 2010).
He also authored That Was Then, This Is Now: Psychoanalytic Psychotherapy for the Rest of Us (2006/2010), a work-in-progress on the current state of psychoanalytic psychotherapy. For anyone who thinks psychoanalytic therapy is still about lying on a couch with the therapist acting as a "blank slate" and offering little in the way of interaction, aside from abstract interpretations, this article will get you up to speed.
If you would like a little more, see his Scientific American article, Getting to Know Me: Psychodynamic therapy has been caricatured as navel-gazing, but studies show powerful benefits (2010). This is a shorter, more accessible version of "The Efficacy of Psychodynamic Psychotherapy."
In the post below, Dr. Shedler has started what may be a multi-part series (we know there will be at least two parts) looking at the lack of evidence for the so-called evidence-based therapies, such as CBT, REBT, and so on.
Where is the Evidence for Evidence-Based Therapies?
A study from a prestigious psychology journal recently crossed my desk. It found that clinicians who provide Cognitive Behavior Therapy—including the most experienced clinicians—routinely depart from CBT techniques described in treatment manuals. “Only half of the clinicians claiming to use CBT use an approach that even approximates to CBT,” the authors wrote.
The finding is not surprising, since there is no evidence that manualized therapy leads to better outcomes, and therapists in the real world naturally adapt their approaches to the needs of individual patients. Their practice methods also evolve over time as they learn through hard-won experience what is helpful to patients and what isn’t.
In fact, studies show that when CBT is effective, it is at least in part because the more skilled practitioners incorporate methods that are fundamentally psychodynamic. These include open-ended, unstructured sessions (versus following an agenda from a manual), working with defenses, discussing the therapy relationship, and drawing connections between the therapy relationship and other relationships.
So the research finding was no surprise. Something would be seriously amiss if experienced clinicians practiced like beginners, following an instruction manual like a consumer trying to assemble a new appliance. What caught my eye was the authors’ conclusion that clinicians should be trained to adhere to CBT interventions “to give patients the best chance of recovery.”
The study did not evaluate treatment outcome, so the authors had no way of knowing which clinicians were effective or which patients got better. They just presumed, a priori, that departure from treatment manuals means poorer therapy. And this presumption—which flies in the face of actual scientific evidence—slipped right past the “evidence oriented” reviewers and editors of a top-tier research journal. They probably never gave it a second thought.
The Big Lie
Academic researchers have appropriated the term “evidence based” to refer to a group of therapies conducted according to step-by-step instruction manuals (manualized therapies). The other things these therapies have in common are that they are typically brief, highly structured, and almost exclusively identified with CBT. The term “evidence based therapy” is also, de facto, a code word for “not psychodynamic.” It seems not to matter that scientific evidence shows that psychodynamic therapy is at least as effective as CBT. Proponents of “evidence based therapies” tend to denigrate psychodynamic approaches (or, more accurately, their stereotypes and caricatures of psychodynamic approaches). When they use the term “evidence based,” it is often with an implicit wink and a nod and the unspoken message: “Manualized treatments are Science. Psychodynamic treatment is superstition.”
Some explanation is in order, since this is not how things are usually portrayed in textbooks or psychology classes. In past decades, most therapists practiced psychodynamic therapy or were strongly influenced by psychodynamic thought. Psychodynamic therapy aims at enhancing self-knowledge in the context of a deeply personal relationship between therapist and patient.
Psychodynamic or psychoanalytic clinicians in the old days were not especially supportive of empirical outcome research. Many believed that therapy required a level of privacy that precluded independent observation. Many also believed that research instruments could not measure important treatment benefits like self-awareness, freedom from inner constraints, or more intimate relationships. In contrast, academic researchers routinely conducted controlled trials comparing manualized CBT to control groups. These manualized forms of CBT were therefore described as “empirically validated” (the preferred term later morphed into “empirically supported” and later, “evidence based”).
Research findings never suggested that manualized CBT was more effective than psychodynamic therapy. It was just more often studied in controlled trials. There is obviously a world of difference between saying that a treatment has not been extensively researched and saying it has been empirically invalidated. But academic researchers routinely blurred this distinction. A culture developed in academic psychology that promoted a myth that research had proven manualized CBT superior to psychodynamic therapy. Some academics and researchers (those with little regard for actual scientific evidence) went so far as to assert that it was unethical to practice psychodynamic therapy since research had shown CBT to be more effective. The only problem is that research showed nothing of the sort.
This may shed some light on why the authors of the study I described above could so cavalierly assert that clinicians should adhere to CBT treatment manuals to give patients the best chance of recovery—and how such an empirically false assertion could sail right through the editorial review process of a prestigious research journal.
Where is the Evidence for Evidence-Based Therapies, Part 2
Stay tuned. In the next installment, I will discuss whether “evidence based therapies” help people. The answer may surprise you.