AI as a Tool, Not as a Therapist

Strengths and Weaknesses of Using AI to Support Mental Health

September 3, 2025

Artificial intelligence has made remarkable strides in healthcare and mental health support. It can aid doctors in diagnosing cancers and other diseases, analyze patient data to predict outcomes, and even detect crucial warning signs, such as suicidal tendencies. Although these tools can be beneficial, it is important to consider the limitations and risks.

Strengths of AI

Artificial intelligence tools such as ChatGPT are easy to find online and straightforward to use. They can be quite helpful to anyone facing mental health challenges, as many AI chatbots are:

  • Structured: Chatbots can create reminders and help you build routines, such as maintaining habits or practicing coping strategies.
  • Accessible: Many models are available 24/7 and can be used anytime, from anywhere.
  • Personalized: Users can tailor the AI’s responses to their own data and preferences.
  • Immediate: AI models generate responses instantly.

These strengths make AI an excellent tool when used much as we might use Google or a meditation app.

Weaknesses of AI

However, while AI can mimic empathy, it does not actually feel it. Since chatbots are trained on pattern recognition to generate responses, they lack genuine emotional awareness or the lived experience of a human being.

  • Algorithmic bias: AI is trained on existing data, which may reflect harmful biases based on race, gender, sexual orientation, and socioeconomic status, among others.
  • No ability to co-regulate: A human therapist can sense dysregulation, help you stabilize emotionally, and ensure you leave a session feeling grounded. AI cannot track your emotional state in real time without explicit input, and it cannot sense your body language or physical cues.
  • No formal therapeutic training: AI has not completed the years of education, supervision, and ethical training that human clinicians have. These models lack the clinical training to manage risk or offer live interventions, and unlike many clinicians, they have not been trained in cultural competency.
  • Short-term, not long-term: AI can validate your feelings and offer coping strategies, but it cannot adapt its approach over months or years through a deep understanding of your history, body language, and emotional patterns.

Designed to Please

Additionally, a core part of therapy is not just being told what you want to hear, but what you need to hear. It is important to be challenged to confront uncomfortable truths in order to grow and change. Human therapists know when to push and when to pull back, and understand the importance of regulating the client after a difficult session.

Chatbots, however, are trained to please, not to push. They cannot sense your body language, tone, or subtle emotional cues that tell a therapist you might be shutting down.

Ethical Concerns

Beyond the therapeutic limits, large AI models have a significant environmental footprint, as training and running the chatbots can consume substantial amounts of water and energy. Additionally, because this technology is so new, questions still remain about privacy concerns and confidentiality, informed consent, and long-term impacts on mental health.

Bottom Line

AI can be used to support mental health work, but it cannot replace it, especially with regard to complex trauma, serious mental health disorders, or crisis intervention. Think of AI as a resource, not a relationship.

AI is valuable for tracking habits, finding coping mechanisms, and providing immediate assistance. Nevertheless, for in-depth, lasting change, ethical guidance, and emotional safety, it is often better to work with a human therapist.

