Artificial intelligence has made remarkable strides in healthcare and mental health support. It can aid doctors in diagnosing cancers and other diseases, analyze patient data to predict outcomes, and even detect crucial warning signs, such as suicidal tendencies. Although these tools can be beneficial, it is important to consider their limitations and risks.
Artificial intelligence tools such as ChatGPT are easy to find online and straightforward to use, and they can be genuinely helpful to anyone facing mental health challenges. These strengths make AI an excellent supplemental tool when used the way we might use Google or a meditation app.
However, while AI can mimic empathy, it does not actually feel it. Since chatbots are trained on pattern recognition to generate responses, they lack genuine emotional awareness or the lived experience of a human being.
Additionally, a core part of therapy is not just being told what you want to hear, but what you need to hear. It is important to be challenged to confront uncomfortable truths in order to grow and change. Human therapists know when to push and when to pull back, and understand the importance of regulating the client after a difficult session.
Chatbots, however, are trained to please, not to push. They cannot sense your body language, tone, or subtle emotional cues that tell a therapist you might be shutting down.
Beyond the therapeutic limits, large AI models have a significant environmental footprint, as training and running the chatbots can consume substantial amounts of water and energy. Additionally, because this technology is so new, questions still remain about privacy concerns and confidentiality, informed consent, and long-term impacts on mental health.
AI can be used to support mental health work, but it cannot replace it, especially with regard to complex trauma, serious mental health disorders, or crisis intervention. Think of AI as a resource, not a relationship.
AI is valuable for tracking habits, finding coping mechanisms, and providing immediate assistance. For in-depth, lasting change, ethical guidance, and emotional safety, however, it is often more beneficial to trust a human therapist.