AI in therapy: what can AI help with – and what shouldn't it do?
AI can structure, mirror, and lower the threshold for reflection. But it can't listen, judge, or treat – and the difference is crucial. AI should be a mirror, not a counsellor, and transparency about its limitations makes it more useful, not less.
You open an app and type: "My teenager won't talk to me." The AI responds within seconds. It sounds empathetic. It gives advice. It suggests conversation openers, reflection questions, a new approach. It feels like being heard.
But were you?
Pattern recognition is not empathy
AI is good at finding patterns. Really good, in fact. It can see that families with low emotional openness and high conflict tolerance often get stuck in the same dynamics. In principle, AI can spot patterns across many responses and identify which combinations typically create friction – and which create connection.
That's useful. But let's not confuse it with understanding you.
A therapist listens to what you don't say. She notices your hesitation. She sees that you look away when you mention your son. AI sees scores and patterns. It sees numbers, not faces. And that's an important difference.
What AI is actually good at
Let's be honest about what AI does well – because there's a lot:
- Structure. AI can organise what feels chaotic. When you don't know where to start, a structured overview of your family patterns gives you a place to begin.
- Accessibility. Therapy costs money. There are waiting lists. AI can lower the threshold for reflection – at 10pm, on the sofa, without waiting six weeks.
- Scale. AI can reach many people at once. No therapist can do that.
- Putting words to the unclear. Sometimes you know something is off, but you can't say exactly what. An AI-generated overview can give language to what you're feeling.
In SAMRUM, the AI model only receives aggregated scores in the prompt – never the raw responses. And because each family member answers individually, the picture is built on multiple perspectives – not just one person's narrative. That creates a form of objectivity that neither a chatbot nor a single conversation can deliver. The AI describes patterns between family members, not the person behind them. It says: "Here's your dynamic" – not: "Here's what's wrong with you."
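That data flow can be sketched in a few lines. Everything below is illustrative, not SAMRUM's actual code: the member names, dimension names, and function names are invented to show the principle that only aggregates reach the prompt, never the raw per-person answers.

```python
from statistics import mean

# Hypothetical questionnaire data: each family member's score per
# dimension on a 1-5 scale. Names and dimensions are made up.
responses = {
    "parent":   {"emotional_openness": 2, "conflict_tolerance": 4},
    "teenager": {"emotional_openness": 1, "conflict_tolerance": 5},
}

def aggregate(responses):
    """Collapse individual answers into family-level averages.

    Only these aggregates are passed on -- the raw per-person
    responses never leave this function.
    """
    dims = next(iter(responses.values())).keys()
    return {d: round(mean(r[d] for r in responses.values()), 1)
            for d in dims}

def build_prompt(aggregates):
    """Build a prompt describing the dynamic, not the individuals."""
    lines = [f"- {dim.replace('_', ' ')}: {score}/5"
             for dim, score in sorted(aggregates.items())]
    return ("Describe the family dynamic suggested by these "
            "aggregated scores. Do not diagnose or advise "
            "treatment:\n" + "\n".join(lines))

prompt = build_prompt(aggregate(responses))
print(prompt)
```

Note that the prompt contains averages only: the model can say "here's your dynamic," but it literally cannot attribute a score to the parent or the teenager, because that information was discarded before the prompt was built.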
The mirror and the counsellor
There is a crucial difference between a mirror and a counsellor.
A mirror shows you what is. It doesn't interpret. It doesn't judge. It doesn't tell you what to do. It reflects.
A counsellor does something different. A counsellor assesses, contextualises, adapts. A counsellor knows your history, your body, your tears. A counsellor can say: "I think there's something more here" – and be right, because she knows you.
AI should be a mirror. It can show you patterns you haven't seen yourself. It can start a conversation you didn't know you needed. But it shouldn't pretend to be the counsellor. Because the moment it starts diagnosing, treating, or prescribing, it moves into territory where mistakes cost – not in data, but in people.
Why transparency is essential
The problem isn't that AI exists in the therapeutic space. The problem is when you don't know it's AI speaking. Or when you don't know what data it uses. Or when it presents its patterns as truths rather than possibilities.
Transparency comes down to three things:
- What does the AI see? What data does it have access to – and what doesn't it?
- What can it do? Is it reflection or treatment? Patterns or diagnoses?
- What can't it do? Where does it stop – and where does the human begin?
When AI is honest about its limitations, it becomes more useful, not less. You know what you're getting. And you know when you need something else.
AI can open doors. It can show you patterns. It can give you language for what's creaking. But it can't sit with you in the silence afterwards. That takes a person.
SAMRUM is not therapy and does not make diagnoses. If you or your family need professional help, contact your doctor or a licensed psychologist.