Will AI Replace Your Therapist? The Uncomfortable Truth
One of the bigger discussions happening right now is whether AI will replace your mental health therapist.
Many industries are asking similar questions and are already watching AI models do a pretty good job of handling the most routine aspects of their work.
It’s exciting and terrifying all at the same time.
Make the statement, “AI will very soon replace your therapist,” and you can cause quite a kerfuffle on the interwebs.
Mental health providers don’t want to hear it.
And why would we?
Obtaining a license as a mental health counselor (or the licensed professional counselor equivalent) requires:
- A master’s degree in counselor education, from a university program accredited by the Council for Accreditation of Counseling and Related Educational Programs (CACREP),
- Around 1,000 hours of a university-level practicum,
- Post-graduation, registering with your state as a practicing intern under a qualified supervisor,
- Completing anywhere from 1,500 to 3,000 face-to-face clinical hours, generally over no fewer than two years, and
- Passing the National Clinical Mental Health Counseling Examination (NCMHCE).
It’s insulting to think that a Large Language Model could simply replace that considerable effort and expense. Never mind asserting it might do a better job.
But the question is not “Should AI replace therapy?”
The reality is that it most likely will.
Common arguments against AI therapy
Here’s the typical rationale for why AI should not replace your therapist:
- AI can’t replicate the human touch and shared human experience. While an AI might create the impression of empathy by reflecting your words back to you or asking thoughtful questions followed by positive affirmations, this is vastly different from unconditional positive regard and a genuine therapeutic alliance.
- AI presents ethical concerns about privacy, confidentiality, and security. Where does all the data go, and who keeps it safe? Who holds accountability in this infrastructure, and who gets fined when there’s a breach? How might this sensitive information be used against you in legal situations without confidential protections? (All data goes somewhere, you know?)
- AI misses subtle nonverbal cues crucial for risk assessment and crisis management. AI can ask some of the right questions but can’t pick up when verbal responses are incongruent with nonverbal signals, or when humor or sarcasm is involved. It struggles to make nuanced judgments about client safety when those judgments are based solely on data collection and synthesis.
- AI can be overly generalized in its approach, missing the unique needs of individual clients. It’s not dynamic and can’t easily have back-and-forth discussions. This makes it almost impossible to flex enough to adapt to a particular client. The model only knows what it’s been fed from existing sources.
- AI models contain inherent biases and struggle to effectively check these biases to ensure cultural sensitivity. Because they train on nearly everything available, teams managing these models bear the responsibility to address bias, while also managing their own biases.
- AI runs the risk of creating a knock-off version of therapy. Therapy isn’t simply about finding quick solutions to problems. It’s also about growth and self-discovery. AI will relegate the field to providing answers clients want to hear, rather than challenging them within a relationship where someone listens, validates, and supports them when all that insight gets hard to handle.
- AI models lack built-in accountability and oversight, like those provided by state licensing boards and professional associations that oversee ethical guidelines and regulations. There’s no straightforward way to determine if AI possesses the basic understanding and professional judgment to avoid harm while also managing complex human situations. There’s no implied stamp of approval.
I don’t disagree that AI therapy is a poor substitute.
What I’ve outlined here represents some of the fundamental elements of Therapy 101. Any practicing therapist should be delivering all of this at minimum.
Therapists must navigate multiple complexities at the same time.
We don’t listen to problems.
Okay, yes, we listen to problems. But we also validate experiences and support clients through their darkest moments. We assess for what we might have missed before, pivot on a dime, and challenge ourselves to guide clients toward choices that line up with their values.
This is difficult, ambiguous work that demands your best energy.
It’s a highly customized experience for each client. That’s why we make the big bucks. 😁
The reality of “will,” not “should”
So, should AI be capable of replicating this level of care before we hand over the keys? Yes, ideally.
But even though it falls short, I think AI will still replace all but the most highly specialized therapists.
Many people already accept less-than-ideal experiences when addressing their interpersonal challenges.
Many of my clients are perfectly comfortable having serious disagreements with their spouses via text. They understand on some level that they’re missing 90% of the communication from the nonverbals. But they appreciate being able to edit their thoughts in real time and express themselves without interruption, all while going through the Chick-fil-A drive-through.
Texting definitely offers a lesser experience compared to face-to-face conversation, but we appreciate the convenience, and so we keep right on using it.
Social media platforms are another substandard substitute for the connection that a genuine real-life community offers.
People turn to social media groups, communities, or comment sections for emotional support, validation, and advice with some pretty personal problems.
I know, I know, your group is really good.
But these interactions don’t have the same depth, consistency, and accountability as real-world friendships or support groups in your church, neighborhood, PTA, and community programs.
Strangers online can’t fully grasp your unique circumstances, and the advice can be all over the place. That warm fuzzy feeling is generally just the dopamine hit from likes and replies to your comments.
It’s not the same as someone calling you to say they missed you at Bible study last night, or giving you an oxytocin hit with a hug to let you know they sure are glad to see you.
Still, the online community wins because it’s immediately accessible, requires minimal vulnerability, and provides instant gratification.
No playdates to put together, no schedules to move around, no donuts to pick up.
We’ll accept a limited but convenient solution if it meets our immediate needs, even when we recognize it lacks the richness of the alternative.
People are busy as hell.
Most simply want to solve their problems and get on with their lives.
They’ll settle for the answers that get them past this roadblock right now.
AI can give you some pretty actionable answers, and a quantifiable, measurable plan to implement.
You won’t get a human touch, but you’ll get the impression of one, along with something tangible you can apply to your life.
In many cases, that will suffice, for now.
No expensive therapist, no battles with insurance companies over coverage, no scheduling inconveniences, no pressure.
The question isn’t whether AI should replace therapists.
It’s recognizing that, for many people who need mental health support, AI will become the path of least resistance.
And history shows us that’s usually enough to change an industry forever.
What do you think about this?