The risks of using AI for mental health support
Artificial intelligence is everywhere. It helps us write emails, create videos, and answer questions, and it's even built into some of our household appliances. More and more, we're turning to AI for everyday tasks. While AI can be helpful in many areas, there are situations where it raises serious concerns, like relying on it for mental health support.
Our counsellors are starting to notice this shift in real time. Some clients have shared that they use AI to work through their thoughts and build on the tools learned in counselling sessions. When it's used that way, as a small extra support alongside real care, it can be genuinely helpful. For others, especially youth who may be struggling with loneliness, isolation, or feeling misunderstood, AI can feel like an easier option than talking to someone face-to-face. It's easy to understand why. AI is available anytime, even at 2am. It doesn't require vulnerability in the way a real conversation does. For someone who finds face-to-face support intimidating or inaccessible, the appeal is real.
While these factors help explain why people are turning to AI, it’s important to understand that using it as a replacement for professional support and connection is inherently dangerous.
AI can’t replace human connection
While AI can provide information, suggestions, and structured responses, it can’t replicate a counsellor–client relationship.
When AI becomes a replacement for professional mental health support, the risks are significant. AI cannot assess risk, recognize crisis, or respond with the clinical judgment that comes from years of training and human experience. It cannot form a therapeutic relationship, and that relationship is, in many ways, the mechanism through which healing happens.
Counsellors can adapt in real time, ask questions that deepen understanding, and build on emotional cues that aren’t explicitly spoken.
Feeling understood, supported, and validated doesn’t just happen on its own. It takes time, attention, and genuine curiosity, qualities that are often the most impactful parts of counselling and that AI can’t replicate.
For some, building that human connection and being vulnerable may cause stress or anxiety, but it can also be a powerful source of support. Learning how to navigate discomfort, communicate needs, and experience connection with other people unfolds within relationships, and counselling offers a safe space to build those skills in ways AI can’t provide.
What makes it dangerous?
One of the biggest concerns with using AI for mental health support is its inability to assess safety and risk.
Counsellors are trained to recognize when someone is in distress, assess how urgent the situation is, and help connect them with the right support. AI tools don’t operate that way and aren’t equipped to safely support someone in crisis. There have been instances where AI tools missed safety cues and gave people information that contributed to self-harm or suicide.
Limitations and bias
For many people, a big part of counselling is building skills: ways to regulate emotions, techniques to calm down when feeling overwhelmed, or healthier ways to communicate with a partner. While AI can offer suggestions, many emotional and interpersonal skills are learned through role-playing conversations, tolerating discomfort in real interactions, and receiving feedback.
AI can also give people an unrealistic idea of how conversations and relationships work. Its responses are often consistently supportive and affirming, which can feel comforting in the moment but differ from real relationships, where misunderstandings and imperfect moments are part of connection and growth.
There’s also the issue of bias when people use AI to talk about their mental health. If someone approaches AI already believing they have a certain issue or diagnosis, the way AI is designed to respond — supportive, validating, agreeable — can actually work against them. The responses they receive may reinforce that assumption rather than exploring other possibilities. In counselling, there’s a collaborative process to better understand what someone is going through. Counsellors are trained to reflect on bias and adapt their support to each person’s unique experience.
Finding a balance
These concerns don’t mean you can’t use AI at all. It can be a helpful tool when it’s used safely, as support alongside professional care.
While AI can offer accessibility and comfort, counsellors play an essential role in providing effective mental health support.
If you or someone you know is struggling, reach out to us! We offer in-person and online counselling, so you can get support that feels comfortable and fits your lifestyle.
Information for this blog post was provided by Sarah Rosenfeld, a registered social worker.