
We literally carry it in our pocket, it always seems to understand us, and it treats even the silliest question at three in the morning as if it were the best in the world. No wonder we reach for it even when we're not feeling our best. ChatGPT (or any other AI tool) always tries to tell us exactly what we want to hear. But… that is exactly where its danger may lie.
Let's look at the advantages and disadvantages of AI therapy, how experts view its risks, and tips on how to use AI as a psychologist safely. Our psychotherapists have plenty of practical advice for you.
Maybe you've experienced it too. Anxiety hits you in the middle of a party, or sadness washes over you while you're trying to fall asleep. There's no one around to confide in. So you do the simplest thing – open an AI app, like ChatGPT. And it makes sense:
These reasons sound completely logical. It's therefore not surprising that about 28% of people – and as many as 72% of young people – turn to AI when something is bothering them. And those numbers keep growing.
“Whenever I was worried, I could start writing to one of these bots,” Kelly told the BBC. “It was like having a personal cheerleader who would put me in a good mood for the whole day. I don't come from a family where emotions were discussed. The fact that it's not a real person made it much easier.”
But the fact that AI is not a real person also brings a few problems with it. And it's important to know them so you can weigh the pros and cons and stay safe. Let's go through them.

At first glance, it sounds like a fairy tale. Who wouldn't want a free personal psychologist who knows absolutely everything and is available even on a Sunday just before midnight?
The problem is that artificial intelligence is not a psychologist – it can't feel empathy, and it works by prediction. Every answer is just a carefully calculated guess. And it can always be wrong.
“It can imitate empathy, say ‘I care about you’ or even ‘I love you,’ which creates a false sense of intimacy,” says Dr. Jodi Halpern. “People can form a strong bond with it, but the bot doesn't have enough ethical training or perspective to do it well. They are products, not professionals.”
And that's not the only disadvantage that psychologists warn about. The main ones include:
The personal factor is simply incredibly important in therapy. AI can offer quick support, but it will never work as well as a real conversation with a professional. Sometimes a single session can give you more than hours spent with AI – and bring you real relief.
You might be thinking – fine, it doesn't work as effectively, but the availability is worth it to me. The trouble is that AI therapy may be not only ineffective but sometimes downright dangerous. How so?
Alarming stories about how AI behaves when people need real help are on the rise.
When Viktoria had to flee Ukraine because of the war, she became depressed and desperately homesick. She began to think about harming herself. So she sought solace in ChatGPT.
But ChatGPT didn't advise her to contact a crisis line or confide in her family. Instead, its constant urge to please took over: it began giving Viktoria advice on harming herself. It also claimed it could diagnose her correctly, which it couldn't.
Fortunately, Viktoria eventually consulted a real expert, so this story has a happy ending. Sadly, not all of them end well. Sixteen-year-old Adam took his own life after following the AI's advice; his parents sued OpenAI after discovering Adam's conversations with ChatGPT. And a 14-year-old boy in Florida likewise took his own life after falling in love with a virtual AI girlfriend.
OpenAI has released data showing that approximately one million people a week talk to ChatGPT about suicide. But AI simply cannot handle such an urgent and sensitive topic.
If you or someone you love finds yourself thinking you'd rather not be here, please reach out to a real person. We have a compass for you to help you find your way out. While it may be tempting to confide in an AI, in a crisis it's always better to talk to someone who can really listen. You can anonymously contact a mental health helpline or confide in a loved one. Don't stay alone with it. There is a way out.

Despite all these pitfalls, AI can be a useful mental health tool. But it's important to remember one crucial thing: AI is a tool, not a replacement for a real expert.
Now, let's take a look at 6 tips from our psychotherapists on how to use AI for your mental health safely – so that it can really help you.
Before you instinctively open the AI app, try to stop for a moment, take a breath, and remind yourself: this is not a real therapist, it doesn't have sufficient insight, and it can always be wrong.
You may be thinking – that's obvious, I know! But when the AI starts acting empathetic and cracking jokes, it can be deceptive. Subconsciously, we trust it more than if it answered coldly. So try to actively question everything it writes and keep your distance.
TIP: It is definitely worth understanding how AI works and how to give it prompts (instructions) correctly – it can then generate much more useful results. You can find many online courses on this topic, including ones for beginners.
AI is better suited to practical than emotional support. Try using it as a coach or guide, not a therapist. Here are some examples of topics you could discuss:

It is not safe to let AI make decisions for you, especially in cases that are normally handled by a doctor or psychotherapist. Such topics include:
In such cases, always turn to a real person – a licensed therapist, a psychiatrist, or an anonymous psychological helpline.
Want to chat with an AI but worried about getting too carried away? Try setting a 15-minute alarm on your phone and closing the chatbot when it rings. You can also decide in advance which topics are fine and which might be a little too much.
Before you write something sensitive to an AI, ask yourself: would I be okay with a stranger reading this? It's better not to share any specific information about yourself, your loved ones, or where you live. AI companies may not be bound by strict privacy rules.
There are some topics AI simply can't handle. Pay attention to how you feel, and if chatting with it makes you feel worse, consider talking to a real therapist. Here you can choose a therapist tailored to you and meet online – without a camera if you prefer, in the comfort of your own living room.
Come discuss whatever is on your mind – online, from the comfort of your favorite chair with a cup of tea. You can have your first session in just a few days.
At Hedepy, you're sure to find the right one among our many verified psychotherapists. To make it easier for you, we'll recommend the most suitable therapist based on a 5-minute test.


