Chatbots can now speak with users, which makes conversations with them feel more personal. What happens when people start to use AI as a therapist? Therapy is a traditional practice built on human interaction, trust, and emotional intelligence. The relationship between therapist and client, grounded in trust and confidentiality, is one of the most important factors in treatment success. Yet people are already turning to chatbots for all kinds of purposes, including working through their problems.
OpenAI
In September, an OpenAI executive compared her conversation with the company's chatbot to therapy, while admitting she had never experienced therapy before. She said she used the chatbot to calm down and likened the exchange to a session with a therapist.
Lilian Weng
Weng had what she described as quite an emotional, personal conversation with the chatbot in voice mode, talking about stress and work-life balance, and said she felt heard and warm afterwards. She admitted she had never tried therapy before, "but this is probably it," and urged her followers to try it, especially those who usually use the chatbot only as a productivity tool. The post, published on X (formerly Twitter) on September 26, prompted a torrent of negative commentary accusing her of downplaying mental illness.
Announcement by OpenAI
Just a day earlier, OpenAI had announced an upgrade giving its chatbot voice and image capabilities: users can now talk to it, share photos with it, and listen to its spoken responses. The upgrade is available to all users. OpenAI suggested people could use it to settle family dinner-table debates or even have a bedtime story read to them. The new features make the chatbot more convenient and easier to use.
Criticism
When Weng shared her views, she was criticized for apparently promoting the chatbot as a replacement for a therapist, all the more so after she admitted she had no therapy experience to judge it against. Others accused her of falling for the ELIZA effect: the tendency, named after a 1960s conversational program, for people to believe that a computer program understands human emotions simply because of the way it responds to them.
A day later, Weng posted that people's interactions with AI models differ and that her statements reflected only her personal opinion.
ChatGPT, OpenAI's chatbot, became the world's fastest-growing consumer app this year. For some users, it is a way to find solutions to mental health challenges, or simply a warm and patient listener.
Free of cost and available 24/7
A journalist in Mumbai who goes to therapy also turned to the chatbot for help with life's challenges, despite being aware that this is not recommended and that the chatbot can generate incorrect answers. "I usually book therapy sessions when I need clarification, when my anxiety gets worse, or when there is a new change in my behaviour pattern, since therapy is costly; the chatbot is for when you want to quickly calm yourself down," she said.
She pointed out that a therapist cannot be available at the break of dawn, whereas ChatGPT helped her find ways to calm herself whenever she needed them. The chatbot can be used at any time, while sessions with a therapist happen only at fixed hours.
ChatGPT can help people, but a conversation with a human therapist offers far more. With a therapist, you also do not have to put your thoughts into the particular written formats you rely on when prompting a chatbot.