Recently, when I was consulting with someone on a particularly difficult case, he simply said, “You know, the research says…”
Unfortunately, this is a common occurrence. Of course, most of the time comments about what “research says” are well-meaning and potentially useful. But every time it happens, or someone bandies about the phrase “empirically supported treatment,” I feel my stomach tighten and can almost see the conversation collapsing into a binary: either I admit my allegiance to “science” (and hence can be trusted), or I admit I practice without regard to whether what I am doing is scientifically supported.
The problem is that psychotherapy research does not actually say anything. Authors do. That is one of the first things I was taught in graduate school. I was also taught that research is not published or presented with the goal of revealing truth, but to create a narrative that supports an argument. What’s more, I learned that some of the very assumptions of evidence-based treatments are, themselves, not empirically supported (see Westen, Novotny, & Thompson-Brenner, 2004). But I chose not to say these things, as I didn’t believe it would be helpful.
Had I said something, I would have replied that I know the literature, and that almost all of the studies were short-term. Yet the authors did not review the literature on the length of time it takes to treat the disorder. Instead, the research was likely designed around pragmatic concerns, including the risk of losing experimental control. And the condition being discussed takes a long time to change and is invariably intertwined with stable characteristics, such as temperament and personality style. Thus, the experiments were incapable of effectively treating the disorder.
Another complication is that the authors appeared to assume mental health can be compartmentalized, such that one system can be changed without affecting others. My experience is that changing one aspect of a person’s life invariably produces unanticipated and uncontrollable changes in other aspects. Yet these studies chose a single outcome measure and failed to “measure” the overall experience of the person. While participants may have improved in one area, I found no reference to other important markers.
The choice of participants in these studies was also highly problematic. Homogeneous, uncomplicated individuals are far less common in clinical populations than the opposite. Honestly, I’m not sure I have ever treated one. It is therefore almost impossible to know how well the outcomes translate to my patients. But the results were presented as if all individuals would have the same response. Not only does this deny the overdetermined nature of human development, it could also lead to inappropriate treatments.
Don’t get me wrong: I believe it is important to reference research. In fact, I spend a great deal of time reading peer-reviewed articles. But I suggest we need to read and reference them properly, just as we do anecdotal information and clinical experience. While commitment to the rigors of science is appealing, it is also, in my opinion, potentially harmful. Ultimately, I believe we, as a community of professionals, benefit from tolerating a level of uncertainty and recognizing that psychotherapy is a collective science and an individual art.
Michael Roeske