AI Won't Replace You. But Therapists Who Use It Might
AI is coming to mental health care. In many ways, it’s already here. From AI-assisted documentation and smart EHR systems to chatbots providing between-session support, the landscape is shifting fast. The question is no longer whether AI will shape our field, but how.
Conversations are happening now about how these tools will be used, and clinicians need to be at the table. Those of us who will work with AI in our practices need to have a say. Therapists who lean in now, learn the tools, and develop an informed perspective will be far better equipped to advocate for ethical, client-centered AI use.
I should start by saying that I’m not an AI expert. I’m a psychotherapist and practice owner. I run a residential treatment program and an outpatient practice, train new clinicians, and make programmatic decisions for both programs. I oversee compliance and administrative tasks, and my business partners oversee our admissions, financial, outreach, and human resources departments. When I got started with all of this, I had no concept of how heavily we would rely on digital tools or what AI was, let alone how quickly it would evolve.
Now, it's top of mind for all of us. Across our roughly 30 employees, AI has been a hot topic in business meetings, professional gatherings, and informal conversations among staff.
So I’ve been learning a lot.
I’ve spoken with colleagues who are using AI and colleagues who refuse to touch it. I’ve talked with business owners who are planning and testing AI systems in their organizations. I’ve interrogated the tight-lipped developers at our EHR company, mining them for what’s in the works. I’ve even asked clients what they think about having an AI ride-along in therapy sessions. There is also plenty to read in the popular press, and academic publications are rolling out with urgency.
I’ve also been using AI a lot.
Clearly, there is a lot of potential here. From building clinical expertise, to managing efficiency, to increasing accessibility, AI is on the cusp of transforming just about everything. That’s true in the broader healthcare landscape, and it’s true in mental health care, from large institutional providers to single-member PLLC practices.
There are certainly risks. There are certainly ethical concerns. There are environmental issues and an unknown financial fallout. All of that needs to be talked about. But this isn’t that article.
Rather, I’m sharing what I’ve tried and what you might try as you bring AI into your own practice. We're all figuring this out together as a field. Whether we like it or not, small private practices and privately owned treatment centers are operating in a competitive environment, head-to-head with PE-funded, corporate mental health care companies and tech firms that leverage massive investments and cutting-edge technologies.
There are big dollars barging into this space, and if you’re like me, you’ve invested a lot into building your practice. You care deeply about the integrity of the field. You believe in the healing power of human connection. You work hard to serve your clients well. It’s your career, livelihood, and passion.
You should get a say in how it moves forward.
So here are some suggestions for how you can begin to tool up and get familiar with AI tools.
A quick caveat. If you are using AI in any of the ways below, make sure you’ve obtained proper informed consent from clients, de-identify any information used outside of your HIPAA-compliant EHR, and double-check AI output for hallucinations.
Use AI to level up your clinical work.
Start by recording your session through your EHR's AI features. De-identify the transcript and paste it into your AI platform of choice. I’ve found Claude to be particularly good at this task. Then start asking questions. The more specific your question, the better the feedback. For instance, you could ask, "I practice from a DBT perspective, where in this session could I have incorporated skills training more effectively?" or "Where did I miss opportunities to validate my client's emotional experience?"
Save this information and bring it into your clinical supervision. Rather than working from our faulty memories and sparse notes, we’ve got real info. You and your supervisor can review both the transcript and the AI feedback to inform your supervision topics.
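If your EHR doesn't de-identify transcripts for you, it's worth building a scrubbing step before anything leaves your HIPAA-compliant environment. As a rough illustration (the names, sample transcript, and `deidentify` helper below are invented for this sketch, and no find-and-replace pass is a substitute for reading the output yourself), here is the kind of minimal Python pass you might start from:

```python
import re

def deidentify(transcript: str, names: list[str]) -> str:
    """Replace known names and obvious identifiers with placeholders.
    A starting point only -- always re-read the output by hand."""
    scrubbed = transcript
    for name in names:
        # \b word boundaries avoid clobbering substrings inside other words
        scrubbed = re.sub(rf"\b{re.escape(name)}\b", "[CLIENT]",
                          scrubbed, flags=re.IGNORECASE)
    # crude pass for US-style phone numbers
    scrubbed = re.sub(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b", "[PHONE]", scrubbed)
    return scrubbed

sample = "Therapist: How was your week, Jordan? Jordan: I called 555-123-4567 twice."
print(deidentify(sample, ["Jordan"]))
```

A pass like this will miss nicknames, workplaces, dates, and other indirect identifiers, which is exactly why the manual review step stays non-negotiable.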
Dig into your session analytics.
AI can analyze the time you spent talking vs. listening. It can tell you how many interventions you used from a specific modality, or identify the modalities associated with the interventions you used. It can comment on your word choice. It can count your open-ended questions. It can track validating and challenging interventions. It can point out your verbal patterns, like how often you use the word “just” (ok, that one came from one of my own AI feedback sessions - I know it can come across as minimizing, but I just keep saying it). Whatever transcript patterns you’re curious about, AI can turn them into data you can track.
Store this data and track it over time. If you set goals for improving any of these metrics, you can see how you're actually doing.
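For the simpler counts, you don't even need an AI model; a plain script over a de-identified transcript gets you started, and it's a useful sanity check on whatever numbers the AI hands back. A sketch under some assumptions (the `Speaker: utterance` line format and the list of open-ended question stems are mine, not a standard):

```python
import re
from collections import Counter

def session_stats(transcript: str) -> dict:
    """Rough per-speaker stats from a 'Speaker: utterance' formatted transcript."""
    words = Counter()          # words spoken per speaker (a talk-vs-listen proxy)
    open_questions = 0         # therapist questions starting with open-ended stems
    just_count = 0             # how often the therapist says "just"
    stems = ("what", "how", "tell me", "describe", "why")
    for line in transcript.strip().splitlines():
        speaker, _, utterance = line.partition(":")
        words[speaker.strip()] += len(utterance.split())
        if speaker.strip().lower() == "therapist":
            just_count += len(re.findall(r"\bjust\b", utterance, flags=re.IGNORECASE))
            if "?" in utterance and utterance.strip().lower().startswith(stems):
                open_questions += 1
    return {"words": dict(words), "open_questions": open_questions,
            "just_count": just_count}

sample = """Therapist: How did that feel in the moment?
Client: I just shut down, honestly.
Therapist: It's just hard to sit with that."""
print(session_stats(sample))
```

Drop the weekly numbers into a spreadsheet and you have the longitudinal view described above, without the transcript ever leaving your machine.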
Use AI to sharpen your clinical conceptualization.
I have found this particularly helpful. Share a de-identified conceptualization with an AI tool and ask it to expand or challenge your thinking. Does your conceptualization align with your clinical framework? What other frameworks might offer meaningful inroads? Request alternative ways of considering the clinical picture.
Try prompts like "Are there aspects of this presentation that might be better understood through an attachment lens?" or "What might a somatic therapist notice about this client's experience that I haven't addressed?" You don't have to agree with everything that comes back, but even the ideas you push back on can sharpen your thinking in valuable ways.
The best part is that you can do this conversationally. Talk into your device. And it talks back. Kinda weird, I know, but think of it as a low-stakes second opinion that's available any time you need it. And it's not about replacing your judgment, it's about expanding it.
Use AI to supplement treatment planning.
Once you have a solid clinical conceptualization, plug it into your AI tool and ask it to suggest treatment goals, objectives, and interventions. You can ask it to make suggestions based on your clinical framework and ask that it suggest ideas that are adjacent to the interventions you are using. Ask specific questions. You’ll get better answers. For instance, you can ask "Based on this clinical presentation, what are the most evidence-based interventions for this client?" or "What treatment goals would a CBT therapist prioritize here?"
AI can also generate worksheets and psychoeducation material that augment your treatment plan. Provide a few prompts, and in no time, you can offer your client handouts tailored to their specific challenges and goals. This degree of bespoke service is a real level up.
AI isn’t a substitute for your own clinical judgment. Review all suggestions closely and think through treatment planning carefully. Used that way, AI can help ensure your treatment plan is thorough, evidence-based, and creatively considered.
Use AI to analyze your outcome data and spot meaningful trends.
Every therapist should be collecting outcome data. (If you're not yet, that's a conversation for another day.) If you are tracking client progress through measures like the PHQ-9, GAD-7, or any other outcome tool, AI can help you actually do something meaningful with those numbers. We use the OQ-45 and pay for the online analytics, so much of that work is done through their software. But you can do this for free (or close to it), right in Google Sheets or Microsoft Excel.
Plug your data into a spreadsheet, share it with your AI tool, and ask it to identify trends. Are certain clients plateauing around session six? Are your anxiety clients responding faster than your depression clients? Is a particular intervention consistently correlated with progress? AI can spot patterns in your data that are easy to miss when you're in the weeds of day-to-day clinical work.
AI is nowhere close to replacing human-to-human services. But we are in competition with it, and with those who are using it effectively. Your instincts, your training, your ability to sit with another human being in their most vulnerable moments, that's the good stuff, and no algorithm is touching it. But the therapists who thrive in the years ahead will be the ones who figure out how to pair those irreplaceable human skills with the tools that are reshaping our world. You don't have to be an expert. You just have to be curious and willing to experiment. Start small, mind your ethical obligations, keep your clients at the center, and see how your practice evolves as you bridge into this new era of mental health care.