
Why Mental Health Clinicians Shouldn't Dismiss AI Tools Just Yet: A Grounded Look at Support Between Sessions

Spend any time in professional therapist spaces (Facebook groups, supervision rooms, conference hallways) and a pattern emerges almost immediately. The moment AI enters the conversation, shoulders tighten. The energy shifts. Someone cites ethical concerns. Another warns about misinformation. A third insists that artificial intelligence threatens the therapeutic alliance itself.


These reactions are not unfounded. They reflect the seriousness and integrity with which clinicians approach their work. Therapy is not a commodity. The therapeutic relationship is not replaceable. And the complexities of trauma, grief, dysregulation, or chronic stress cannot be solved by an algorithm or a chatbot. AI should be handled with caution because the stakes are high.


But something else is true as well: the realities of modern clinical caseloads, client challenges, and nervous-system overwhelm aren’t getting easier. Clients are carrying more stress, more chronic dysregulation, and more environmental instability than ever before. Clinicians are holding heavy emotional labor, navigating administrative strain, and witnessing the limits of once-a-week work. Between-session dropout points (forgetting tools, becoming overwhelmed, losing motivation, spiraling into shame) continue to erode progress.


This tension between ethical caution and practical need is where the AI conversation becomes more nuanced. Not all AI tools are designed to mimic therapy or step into roles they have no business occupying. Some exist to reinforce the work clinicians are already doing by offering small, supportive touchpoints that help clients stay anchored between appointments. These tools don’t pretend to be “therapists.” They don’t attempt to diagnose, interpret pathology, or render clinical judgment. Their purpose is much simpler: to help clients stay connected to their own bodies, their own capacity, and their own inner signals when stress tries to pull them off-center.


And this is where curiosity becomes clinically relevant.


One of the most consistent challenges clinicians report is the gap between what happens in a session and what clients are actually able to sustain outside of it. A client may access clarity while sitting with you (nervous system supported, emotional safety present, co-regulation available) only to lose access to that clarity the second real life pulls them back into chaos. Even the most well-designed therapeutic tools lose their potency when the client forgets to use them, feels too overwhelmed to start, or doesn’t recognize the early signs of dysregulation quickly enough to intervene.


Most clinicians know that consistent, micro-level self-connection is what creates lasting change, not intensity. But most clients struggle with follow-through, not because they’re resistant or unmotivated, but because dysregulation disrupts memory, attention, and executive functioning. Shame adds another layer: the “I know what to do but didn’t do it” spiral. This is where supportive tools can be powerful: not as replacements for therapy, but as reinforcers that help clients practice the skills you teach them.


Some clinicians fear that integrating external tools will dilute the therapeutic alliance. In practice, the opposite is often true. When a client has a way to reflect, track sensory information, or reconnect with their body between sessions, they usually arrive more grounded and more prepared to make meaningful progress. Instead of spending half the session catching up on what went wrong, you get to deepen the work. Patterns become clearer. Interventions land more effectively. Clients feel more empowered and less dependent. And clinicians feel less pressure to be the only source of regulation in a client’s week.


This is the context in which tools like SomaGuide by Somyn were built: not to disrupt clinical work but to support it. Unlike many AI-driven apps marketed as “therapy alternatives,” this approach intentionally avoids clinical claims. Instead, it focuses on guiding users through tiny, nervous-system-friendly check-ins and somatic reflections that take only a minute or two. By helping clients notice their internal states without judgment, it strengthens the very body awareness that clinicians are trying to cultivate. And because the tone is gentle, non-directive, and regulation-focused, it becomes less of a solution and more of a companion: a supportive pause point clients can turn to when they feel overwhelmed.


For clinicians, this means their clients now have a tool that makes the work between sessions feel more accessible and less intimidating. For clients, it means the moments when they would normally shut down, avoid, or dissociate become moments where they can slow down just enough to reconnect. For the therapeutic relationship, it means the client shows up with more data, more insight, and more nervous system capacity, allowing deeper and safer clinical work.


The resistance to AI in mental health is understandable. It is rooted in ethics, care, professional responsibility, and healthy skepticism. But dismissing every AI tool outright might inadvertently deprive clients of something that genuinely supports their healing. Not all AI in the wellness space is designed to replace clinicians. Some tools are intentionally built to respect boundaries, reinforce somatic literacy, and strengthen the client’s ability to stay connected to themselves in the moments they need it most.


Curiosity doesn’t compromise ethics. Thoughtful integration doesn’t replace clinical expertise. And supportive tools don’t diminish the therapeutic relationship; they can strengthen it.


The question isn’t whether AI should replace therapy. It’s whether small, grounded, nervous-system-informed tools can help clients stay connected between sessions, in the moments that matter most. If a tool can do that (gently, ethically, and without overstepping), then it might be worth a second look.
