When your therapist is an algorithm: the role of AI in therapy
Therapy has always been about human connection, empathy, and a safe space. But what happens when the therapist is an algorithm? AI in mental health care is expanding rapidly, promising support that is free and available 24/7. At first glance, AI seems like a natural complement to the therapy relationship: what if you want to ask your therapist a question late at night? Could AI step in to help? At the same time, the rise of AI in therapy raises real risks around safety, privacy, and ethics. Like any tool, it has the potential to help or harm depending on how it is used.
The benefits of AI are undeniable. As someone who uses AI to summarize documents, correct errors in code, plan trip itineraries, and proofread long emails, I can personally attest to its utility. It can offer a specific workout plan more cheaply than a trainer and provide legal advice faster than you could hire an attorney. But can it provide therapeutic insight that rivals a trained professional's?
For now, the answer is no. There is still a great need for psychologists, even with the rise of AI. Here are several reasons why:
AI can be dangerously wrong. There are recent reports of people disclosing suicidal ideation to chatbots, only for the chatbots to respond with a list of the tallest buildings nearby. AI can miss red flags that a clinician would observe in a person's behavior, and it can even nudge people toward self-harm and delusions. It falls flat in complicated situations that require clinical judgment.
AI lacks clinical nuance. While its responses may feel personalized and validating, AI is not yet capable of the nuanced, evidence-based treatment decisions that personalized care requires.
AI can become addictive. When AI responds to you, it reinforces a pattern: answers provide relief. Over time, people may come to rely on that immediate reassurance rather than practicing the skills that help in the long term.
Over-reliance on AI can lead to social isolation. Substituting AI for human contact can increase symptoms of social anxiety, withdrawal, and loneliness over time.
Data privacy is questionable. Not all apps are secure or honest; many gather information and learn from you for purposes you may never know.
On the plus side, AI can be used effectively to detect and analyze patterns of behavior and to make predictions from data. Used wisely, it can augment therapy: think of it as a helpful tool alongside treatment, not a replacement for it. It has the potential to expand access for people who would otherwise go without care, but for now it should be used with caution.
- by Lauren Rutter, PhD
Dr. Rutter is a clinical psychologist in practice and an academic researcher. She uses social media data and natural language processing/AI to understand depression and risk for depression across the lifespan.