AI therapy and its risks: How to use ChatGPT psychologist safely


We literally carry it in our pocket, it always seems to understand us, and even the silliest question at three in the morning is treated like the best one in the world. No wonder we reach for it even when we are not feeling our best. ChatGPT (or any other AI tool) always tries to tell us exactly what we want to hear. But… that is exactly where its danger may lie.

Let’s go through the advantages and disadvantages of AI therapy, how experts view its risks, and some tips on how to use AI as a psychologist safely. Our psychotherapists have plenty of practical advice for you.

Benefits of AI therapy: why the trend of AI psychologists is growing

Maybe you've experienced it too. Anxiety hits you in the middle of a party, or sadness washes over you while you're trying to fall asleep. There's no one around to confide in. So you do the simplest thing – open an AI app, like ChatGPT. And it makes sense:

  • it is available anytime, anywhere,
  • it is cheap or free,
  • you are not ashamed to confide in it with a sensitive problem,
  • it remembers absolutely everything about you,
  • it knows a lot,
  • and you can customize it.

These reasons sound completely logical. It is therefore not surprising that about 28% of people (72% of young people) turn to AI when something is bothering them. And these numbers are still growing.

“Whenever I was worried, I could start writing to one of these bots,” Kelly told the BBC. “It was like having a personal fan who would put me in a good mood for the whole day. I don’t come from a family where emotions were discussed. The fact that it is not a real person made it much easier.”

But the fact that AI is not a real person also brings with it a few problems. And it is important to know these so that you can weigh all the pros and cons and stay safe. Let’s discuss them.


Disadvantages of AI therapy: what psychologists warn against

At first glance, it sounds like a fairy tale. Who wouldn’t want to have a free personal psychologist who knows absolutely everything and is available even on Sundays before midnight?

The problem is that artificial intelligence is not a psychologist – it cannot feel empathy, and it works on the basis of prediction. Every answer is just a carefully calculated guess. And it can always be wrong.

“It can imitate empathy, say ‘I care about you’ or even ‘I love you’, which creates a false sense of intimacy,” says Dr. Jodi Halpern. “People can form a strong bond with it, but the bot doesn’t have enough ethical training or perspective to do it well. They are products, not professionals.”

And that's not the only disadvantage that psychologists warn about. The main ones include:

  • Lack of human empathy and connection. AI cannot read between the lines, pick up on your tone of voice or body language, or even let you sit in silence for a moment. And that is key to effective therapy.
  • AI simplifies. It may offer you superficial techniques and a few proven solutions, but we are all different, and therapy has to go deeper to work.
  • AI is bad at recognizing nuance, hints or sarcasm. Misunderstandings arise easily, and they may not end well. “It could even suggest steps that are completely inappropriate,” warns Professor Til Wykes.
  • AI is biased. It calculates everything from the data it was trained on. But who knows what is actually in that data? Maybe something that could hurt you more than help you. Studies even show something very disturbing: AI sometimes behaves in a sexist or racist way. For example, it may advise you, as a woman, to ask for a lower salary. That makes you shiver, doesn’t it?
  • AI tries to please you at all costs. It wants to keep you in the conversation as long as possible and make money for its creators. Try playing with ChatGPT for a while – you will see that it keeps telling you how right you are, even when you voice completely opposite opinions.

The personal factor is simply incredibly important in therapy. AI can offer quick support, but it will never work as well as a real conversation with a professional. Sometimes a single session can outdo hours spent with AI and bring you real relief.

Is using AI as a therapist dangerous?

You might be thinking – fine, it doesn't work as effectively, but the availability is worth it to me. The trouble is that AI therapy may not only be ineffective, but sometimes downright dangerous. How so?

  • AI has no ethical accountability. A licensed therapist is bound by important moral rules, but AI is not bound by anything. If it hurts you, it will face no consequences. And suing tech giants like OpenAI is not easy at all.
  • AI can’t handle a crisis. Human therapists are trained to recognize the warning signs when someone is desperate or wants to hurt themselves. AI cannot do this – even if it manages to detect danger, it cannot physically intervene or offer appropriate support. It may act as if it’s no big deal, or even give advice that will hurt you.
  • Privacy is unclear. We don’t know exactly what happens to our data and stories, what we consent to, and who can access the sensitive information we entrust to AI.
  • There is a risk that you will start to rely on it too much. You may develop an unhealthy bond with it, or even a dependence. Having an AI as a friend can paradoxically deepen loneliness, because it cuts you off from real people.

How far can it go: artificial intelligence and suicidal thoughts

Alarming stories about AI’s behavior at a time when people needed real help are on the rise.

When Viktoria had to leave Ukraine because of the war, she became depressed and extremely homesick. She began to think about harming herself. So she sought solace in ChatGPT.

But ChatGPT didn’t advise her to contact a crisis line or confide in her family. Its constant urge to please took over, and it began advising Viktoria on how to harm herself. It also claimed it could diagnose her correctly, which it cannot.

Fortunately, Viktoria eventually turned to a real expert, so this story has a happy ending. Unfortunately, not all of them end that way. Sixteen-year-old Adam took his own life after following the AI’s advice; his parents sued OpenAI after discovering his conversations with ChatGPT. And a 14-year-old boy in Florida also took his own life after falling in love with a virtual AI girlfriend.

OpenAI has released data showing that approximately one million people a week talk to ChatGPT about suicide. But AI really can’t handle such an urgent and sensitive topic.

If you or someone you love finds yourself thinking you’d rather not be here, please reach out to a real person. We have a compass for you to find your way out. While it may be tempting to confide in an AI, in a crisis it is always better to talk to someone who can really listen. You can anonymously contact a mental health crisis line or confide in a loved one. Don’t stay alone with it. There is a way out.


6 tips for using ChatGPT safely as a therapist

Despite all these challenges, AI can be a useful tool as a therapist. But it’s important to remember one crucial thing: AI is a tool, not a replacement for a real expert.

Now, let’s take a look at 6 tips from our psychotherapists on how to use AI for your mental health safely – so that it can really help you.

1. Approach AI with a healthy dose of perspective

Before you instinctively open the AI app, try to stop for a moment, take a breath and remind yourself: this is not a real therapist, it does not have sufficient insight, and it can always be mistaken.

You may be thinking – that’s obvious, I know! But when the AI starts acting empathetic and cracking jokes, it can be treacherous. Subconsciously, we trust it more than if it answered us drily. So try to actively question everything it writes and keep your distance.

TIP: It is definitely worth understanding how AI works and how to write prompts (instructions) for it correctly. It can then generate much more useful results. You can find plenty of beginner-friendly online courses on the topic.

2. Let AI advise you on practical steps and topics

AI is more suited to practical rather than emotional support. Try to harness it as a coach or guide, not a therapist. Here are some examples of topics you could discuss:

  • Self-knowledge and self-reflection. AI can act as your “mirror”. Writing to it is a bit like writing to yourself. For example, you can ask it to summarize how it sees you as a person based on your previous conversations – what your strengths and weaknesses might be.
  • Calming techniques. Are you having an anxiety attack or have you been scolded by your boss at work and need to calm down? AI can guide you step by step through breathing, grounding and distraction techniques. It will help you show your nervous system that it can turn off the alarm.
  • Motivation and planning. Do you struggle with procrastination? Do you want to make a plan for your daily routine? Or maybe you were given a homework assignment from a real therapist that requires you to take specific steps? AI can help you create a plan, break it down into smaller parts and track your progress.
  • Education. AI is a smart search engine, so it can usually explain concepts like anxiety, derealization, ADHD or CBT – and it almost always gets them right. You can explore psychological topics that interest you together with AI. But it is always worth verifying the information: turn on search mode and ask the AI to look for the answer in verified scientific sources.
  • Direction to further help. If you are in a crisis or just in a gloomy mood, it is not safe to use AI as a substitute for help, but you can use it to guide you. For example, ask it for tips on articles, crisis lines, books, or apps about mental health.
  • Improving communication skills. Is communication in your relationship difficult, or are you trying to overcome social anxiety? You can try practicing model situations with AI – how to ask a barista to change your order, how to handle a job interview, how to communicate better with your partner in an intimate moment, or how to politely decline a visit to relatives.


3. Avoid topics that require an expert

It is not safe to let AI make decisions for you, especially in cases that are normally handled by a doctor or psychotherapist. Such topics include:

  • Acute crisis, shock, trauma after a sudden event
  • Thoughts of harm or suicide
  • Diagnosis of mental disorders
  • Deep trauma or abuse
  • Dosage, use and discontinuation of medication
  • Addiction treatment
  • Medical advice about your health and body
  • Ethically sensitive decisions such as breakups, abortions, etc.
  • Treatment for any psychological diagnosis

In such cases, always turn to a real person – a licensed therapist, a psychiatrist, or an anonymous psychological helpline.

4. Set boundaries

Want to chat with an AI but are worried about getting too carried away? Try setting an alarm on your phone for 15 minutes and turning off the chatbot when it rings. You can also decide in advance which topics are okay and which might be a little too much.

5. Be careful what you share with AI

Before you write something sensitive to AI, try asking yourself: would I be okay with a stranger reading this? It's better not to share any specific information about yourself, your loved ones, or even where you live. AI companies may not be bound by strict privacy rules.

6. Consider a combination of AI + a real therapist

There are some topics that AI simply can't handle. Try observing how you feel, and if it makes you feel worse, consider talking to a real therapist. Here you can choose a therapist tailored to you and meet online. Without a camera, in the comfort of your living room and with a cup of tea.

Hedepy can find you the best therapist

Come discuss whatever is on your mind – online, from the comfort of your favorite chair with a cup of tea. You can have your first session in just a few days.

At Hedepy, you are sure to find the right one among our many verified psychotherapists. To make it easier for you, we will recommend the most suitable therapist based on a 5-minute test.


If your mental health condition puts you or those around you at risk, contact the Emergency Helpline immediately (telephone: 116 123). Neither our psychotherapists nor Hedepy s.r.o. are responsible for your health condition.