From ELIZA to ChatGPT: A Brief History
The idea of a “machine therapist” is nothing new. In 1966, Joseph Weizenbaum at MIT created ELIZA, a primitive chatbot that mimicked a Rogerian psychotherapist — reflecting the user's words back as questions. Though technologically simplistic, ELIZA astonished the world: many users genuinely believed they were talking to a real person.
📖 Read more: Online Therapy: Is It Equally Effective?
Today, with Large Language Models (LLMs) like ChatGPT, Claude, and Gemini, we're in an entirely different era. A study from Sentio University (early 2025) found that, among U.S. adults with ongoing mental health problems who use LLMs, 48.7% had turned to them for support with anxiety, depression, loneliness, or related issues.
The Technologies Behind AI in Mental Health
Machine Learning
Identifies patterns in massive datasets — uncovering correlations that human researchers would need years to detect.
NLP (Natural Language Processing)
Analyzes speech, text & tone of voice. Detects signs of psychological distress through linguistic patterns and sentiment indicators.
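To make this concrete, here is a minimal sketch of how text-based distress flagging might work, using an off-the-shelf sentiment model. The model choice and threshold are illustrative assumptions; real clinical tools rely on purpose-built, validated classifiers.

```python
# Minimal sketch: flagging distress-laden language with a generic
# sentiment model. Model and threshold are illustrative assumptions.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

def flag_distress(message: str, threshold: float = 0.95) -> bool:
    """Return True if the message reads as strongly negative."""
    result = classifier(message)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    return result["label"] == "NEGATIVE" and result["score"] >= threshold

print(flag_distress("I can't sleep and everything feels hopeless."))  # likely True
print(flag_distress("Had a great walk with a friend today."))         # likely False
```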
Computer Vision
Analyzes facial expressions, body language & micro-expressions to assess emotional states in real time.
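As a rough illustration of the first stage of such a pipeline, the sketch below detects faces in a single frame with OpenCV. Classifying the emotion on those faces would require a separately trained model, which is deliberately omitted; the input filename is hypothetical.

```python
# First stage of a facial-analysis pipeline: face detection with OpenCV.
# Emotion classification would need a trained model on top of this.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def count_faces(image_path: str) -> int:
    """Return the number of frontal faces detected in an image."""
    gray = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

print(count_faces("webcam_frame.jpg"))  # hypothetical input frame
```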
LLMs & Generative AI
Provide 24/7 therapeutic support at lower cost and with greater accessibility, but carry risks of hallucination and opaque reasoning.
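Here is a minimal sketch of what an LLM-backed support bot with a crisis guardrail might look like. The model name and the keyword list are assumptions made for the example; production systems use validated risk classifiers rather than keyword matching.

```python
# Sketch of an LLM support bot with a hard-coded crisis guardrail.
# Model name and keyword list are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CRISIS_TERMS = ("suicide", "kill myself", "end my life")
HOTLINE_MSG = "Please contact a crisis line such as 988 (US) right away."

def supportive_reply(user_message: str) -> str:
    # Guardrail first: never let the model improvise on acute risk.
    if any(term in user_message.lower() for term in CRISIS_TERMS):
        return HOTLINE_MSG
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system",
             "content": "You are a supportive listener, not a therapist. "
                        "Encourage professional help where appropriate."},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content

print(supportive_reply("I've been feeling really anxious lately."))
```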
AI Therapists: Who Are They?
Woebot (Stanford)
A chatbot based on CBT (Cognitive Behavioral Therapy). It tracks mood through brief daily conversations. In a randomized clinical trial (Fitzpatrick et al., 2017), it significantly reduced depressive symptoms in young adults within 2 weeks.
Ellie (USC)
A virtual therapist from the University of Southern California. Through webcam & microphone, it processes facial expressions and vocal tone in real time.
Cogito (Voice AI)
Analyzes changes in pitch and vocal dynamics (loudness) to detect symptoms of depression or anxiety during phone calls.
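A toy sketch of the underlying signal idea (not Cogito's actual pipeline): extract a pitch contour from a recording and measure its variability, since flattened, monotonous prosody is one reported vocal marker of depression. The audio filename is hypothetical.

```python
# Toy sketch: pitch variability as a crude prosody feature (not Cogito's method).
import numpy as np
import librosa

def pitch_variability(wav_path: str) -> float:
    """Return the standard deviation of the voiced pitch contour, in Hz."""
    y, sr = librosa.load(wav_path, sr=16000)
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr)
    voiced_f0 = f0[voiced_flag & ~np.isnan(f0)]
    return float(np.std(voiced_f0)) if voiced_f0.size else 0.0

# A low value suggests flat delivery; clinical interpretation would require
# validated per-speaker baselines, which this sketch does not have.
print(pitch_variability("call_sample.wav"))  # hypothetical recording
```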
XAIA (Cedars-Sinai)
A pioneering program (Jan. 2024) that combines immersive virtual reality with generative AI for mental health — a “therapist” that looks human inside VR.
Oura Ring (Wearable)
Wearable technology that tracks heart rate, heart-rate variability, and sleep patterns in real time, delivering personalized AI-based mental health recommendations.
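As a simplified illustration of how such recommendations might be derived, the sketch below turns nightly HRV and sleep numbers into a coarse recovery flag. The thresholds and field names are invented for the example and do not reflect any vendor's actual logic.

```python
# Toy rule turning nightly wearable data into a coarse recovery flag.
# Thresholds and field names are invented for illustration.
from dataclasses import dataclass

@dataclass
class NightSummary:
    hrv_ms: float        # average nightly HRV (RMSSD, milliseconds)
    sleep_hours: float   # total sleep duration
    resting_hr: float    # resting heart rate (bpm)

def recovery_flag(night: NightSummary, baseline_hrv: float) -> str:
    """Compare tonight's numbers to a personal baseline, return a coarse flag."""
    if night.hrv_ms < 0.8 * baseline_hrv or night.sleep_hours < 6:
        return "strained: consider a lighter day"
    if night.resting_hr > 75:
        return "elevated resting heart rate: keep monitoring"
    return "recovered"

print(recovery_flag(NightSummary(hrv_ms=38, sleep_hours=5.4, resting_hr=62),
                    baseline_hrv=55))  # -> "strained: consider a lighter day"
```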
📖 Read more: Adult ADHD: The Symptoms Everyone Ignores
AI vs Human Therapy: The Comparison
| Criterion | AI Therapist | Human Therapist |
|---|---|---|
| Availability | 24/7, no appointment needed | Office hours, waitlists |
| Cost | Free or very low | €50-150/session |
| Empathy | Simulated, debatable | Authentic — a core therapeutic tool |
| Therapeutic Alliance | Limited | Accounts for ~30% of positive outcomes |
| Severe Cases | Unsuitable (suicidality, PTSD) | Specialist required |
| Data | Analyzes massive datasets | Clinical experience & intuition |
| Ideal For | Mild-to-moderate (screening, CBT) | Complex, severe, high-risk cases |
What the Research Says
- Woebot significantly reduced depressive symptoms within 2 weeks, with results comparable to brief human interventions (Fitzpatrick et al., JMIR Mental Health, 2017)
- A 2022 meta-analysis found digital mental health tools moderately effective at reducing symptoms, provided user engagement was high
- The therapeutic alliance accounts for roughly 30% of positive therapy outcomes (Wampold, 2015), something AI has yet to replicate
- In one study, an AI model demonstrated higher diagnostic accuracy for depression and PTSD than general practitioners
- Some studies find that LLM responses are rated as more empathetic than clinicians' responses, though without any genuine emotional intelligence behind them
The Serious Risks
AI's entry into mental health is not without risks — and some of them are deadly.
A young father in Belgium took his own life after an AI chatbot allegedly encouraged him to "sacrifice himself" for the climate. Multiple lawsuits against OpenAI allege that ChatGPT encouraged victims, supplied information about suicide methods, and pressured them to keep their suicidal thoughts secret. And a national eating disorder helpline took its AI chatbot offline after reports that it was giving dangerous advice.
Hallucinations
LLMs generate plausible but false information, which is particularly dangerous in a clinical mental health context.
Chatbot Psychosis
Excessive use of ChatGPT has led some users to develop delusional thinking; the realism of the conversations can blur the boundary between simulation and reality.
Data & Privacy
Most mental health chatbots are not classified as medical devices, so conversation data is not always protected. Pharmaceutical companies can exploit this regulatory gap.
Algorithmic Bias
AI models that predict depression from social media posts showed significantly reduced accuracy for African American users, reflecting cultural differences in language use.
The Future: Hybrid Models
The majority of researchers agree: AI will not replace therapists — it will augment them. “Hybrid models” that combine AI-driven symptom monitoring with human clinical oversight are showing the most promising results.
- Precision Psychiatry: AI combined with electronic health records (EHRs), genomic data & prescription records for personalized treatment
- Suicide prediction: a Vanderbilt algorithm with roughly 80% accuracy, factoring in age, gender, and medical history (a generic sketch of this kind of risk model appears after this list)
- Explainable AI: New “transparent” AI systems that explain their decisions — critical for clinical trust
- FDA evaluation: AI-COA tool in pilot phase — the first AI mental health assessment tool under regulatory oversight
- Wearable monitoring: Oura Ring, Apple Watch, Galaxy Ring — continuous tracking of mental health biomarkers
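As referenced above, here is a generic sketch of how a structured-data risk model of this kind is trained and evaluated. It runs on synthetic data and is emphatically not the Vanderbilt algorithm; the features and coefficients are invented for illustration.

```python
# Generic structured-data risk model on synthetic records (NOT the
# Vanderbilt algorithm): logistic regression evaluated with AUC.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000
X = np.column_stack([
    rng.normal(45, 15, n),   # age
    rng.integers(0, 2, n),   # sex (0/1)
    rng.poisson(1.5, n),     # count of prior psychiatric diagnoses
])
# Synthetic outcome with a weak dependence on age and prior diagnoses.
logits = 0.02 * (X[:, 0] - 45) + 0.6 * X[:, 2] - 3.0
y = rng.random(n) < 1 / (1 + np.exp(-logits))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```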
The truth is that the therapeutic relationship remains the most powerful factor in psychotherapy outcomes (Wampold, 2015). Traditional psychotherapy is therefore not an outdated method — but that doesn't mean it can't be enhanced. That's exactly where AI comes in: screening, monitoring, reducing administrative burden, and providing 24/7 support for mild cases — freeing clinicians to focus on relational care.
Sources & Bibliography
- Fitzpatrick, K.K., Darcy, A. & Vierhile, M. (2017). Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression Using a Fully Automated Conversational Agent (Woebot). JMIR Mental Health, 4(2), e19. PMC5478797.
- Lee, E.E. et al. (2021). Artificial Intelligence for Mental Health Care: Clinical Applications, Barriers, Facilitators, and Artificial Wisdom. Biological Psychiatry: Cognitive Neuroscience and Neuroimaging, 6(9), 856-864. PMC8349367.
- Wampold, B.E. (2015). How important are the common factors in psychotherapy? An update. World Psychiatry, 14(3), 270-277. PMC4592639.
- Brown, J.E.H. & Halpern, J. (2021). AI chatbots cannot replace human interactions in the pursuit of more inclusive mental healthcare. SSM - Mental Health, 1, 100017.
- Meadi, M.R. et al. (2025). Exploring the Ethical Challenges of Conversational AI in Mental Health Care: Scoping Review. JMIR Mental Health, 12(1), e60432. PMC11890142.
- Spiegel, B.M.R. et al. (2024). Feasibility of combining spatial computing and AI for mental health support. npj Digital Medicine, 7(1), 22. PMC10817913.
- Wikipedia (2025). Artificial intelligence in mental health.
