The role of AI in counselling: Friend or disruptor?
In an age of accelerated change, artificial intelligence (AI) is reshaping every corner of our lives, including the most human of all spaces: the therapy room. For counsellors and clients alike, AI presents both promise and paradox. Is it here to replace the healing power of being heard, or could it deepen it?

As therapists, we’re being called to explore this new terrain not with fear but with fierce curiosity.
Is AI replacing the need to be heard or enhancing it?
One of the most exciting developments in mental health tech is the potential for AI to support emotional expression outside of traditional sessions. AI chatbots, like Woebot and Wysa, offer 24/7, judgment-free check-ins for users experiencing anxiety, low mood, or stress.
They’re not pretending to be therapists, but they are creating space for people to feel seen between appointments, helping them name emotions and track patterns. For some, this can feel like a lifeline. For others, it raises a haunting question: Are we outsourcing empathy to machines?
The truth may lie in the collaboration between AI and human therapists, not competition. AI can’t replicate the warmth of human presence, the nuance of a knowing look, or the sacred space of shared silence. But it can offer tools that extend our reach, enhance psychoeducation, and support clients in developing awareness and resilience.
Will paid compassion go digital, and what does that mean for real therapists?
AI is already being used to streamline administrative tasks, freeing therapists from hours of unpaid labour such as note-writing, appointment reminders, and form-filling.
Tools like TheraNest, SimplePractice, and Tilda are evolving to include smart automation, allowing counsellors to spend more time with clients and less time at a screen.
But if emotional support becomes digitised, if clients bond with bots, what does that mean for the therapeutic relationship? Here’s the reassuring truth: AI lacks intuition, ethical attunement, and emotional depth. It doesn’t carry the lived experience, the humour, or the heart of a therapist. Paid compassion may become augmented, but it won’t become obsolete.
Our role may evolve into something even more powerful, guiding clients through a sea of automated information with integrity, empathy, and discernment.
AI companions in grief, loss, and trauma recovery: Friend, fraud, or future?
AI companions are now used in bereavement support, trauma healing, and crisis response. Some platforms use voice and text data to simulate conversations with loved ones who have passed. It’s a controversial frontier: Are these tools comforting people, or keeping them stuck?
In trauma work, apps like Replika or Tess offer space to debrief and de-escalate. While they’re not a substitute for therapy, they may offer a sense of continuity and emotional safety for those between sessions or on a long waiting list for care.
Used ethically, these tools can be friends on the path to healing. Used unethically or without therapeutic oversight, they can become frauds, offering false intimacy without the grounding of real attunement.
Are we training ourselves out of a job?
A genuine fear for many in our profession is this: By using AI, are we training the algorithm to do what we do and ultimately replace us? It’s a valid concern. AI learns from the data it’s fed. But therapy isn’t just a technical skillset; it’s a relational art. The therapeutic relationship is built on trust, safety, and connection, which a machine cannot authentically replicate. Our use of AI should be intentional and guarded.
As counsellors, we must advocate for transparency, client consent, and ethical guidelines that prevent commercial tech from exploiting vulnerable users. Just because something can be automated doesn’t mean it should.
Ethics to consider:
- Informed consent: When interacting with AI, clients must know how their data is used.
- Confidentiality: Where is the data stored? Who has access? Is it GDPR-compliant?
- Bias and inequality: Algorithms can reflect racial, gender, and cultural biases embedded in training data.
- Dehumanisation: We must ensure that efficiency never replaces empathy, and speed never overrides safety.
- Professional boundaries: AI tools must support, not undermine, qualified therapeutic work.
AI tools that help, not harm:
- Wysa: An AI-based mental health coach using evidence-based CBT techniques.
- Woebot: A chatbot offering brief emotional support and mood tracking.
- TheraNest / SimplePractice: Practice management software with smart scheduling, note-taking, and client portals.
- ChatGPT (with therapist supervision!): Can assist with writing psychoeducational materials and session summaries, or with creating metaphors and grounding exercises.
These tools can empower us to work smarter, not harder, so we can show up more fully for our clients.
The future of counselling: A hybrid model of healing
Picture this: A client begins their journey with an AI triage tool that gently guides them toward the right level of care. Between sessions, they use an app to track sleep, mood, and triggers; their counsellor reviews this data to offer deeper insights. Sessions are in-person, online, or both.
Supervision includes AI feedback but is always held by a human. In the future, counselling may be hybrid, flexible, and deeply personalised, but it will still centre on relationships. The therapist becomes a healer and a guide through a world of tools, teaching clients how to use tech wisely while never forgetting what it means to be fully seen and held.
Hope in the age of machines
AI has the potential to democratise mental health support, reaching people who’ve never had access to therapy before. It can reduce loneliness, extend care beyond session walls, and create new pathways to healing.
As counsellors, we are not being replaced. We are being called to lead this evolution with courage, wisdom, and compassion. Because in the end, it’s not the tech that changes lives; it’s the human heart behind it. A future where everyone’s mental health is supported isn’t just possible; it’s within reach. And that’s a future worth fighting for.
