Friday, December 27, 2024

Chatbot therapy: can generative AI treat loneliness?

There could be serious risks in turning to generic generative AI services for therapeutic help. Photo: UNSW

By Jennifer Dudley-Nicholson

Former AFL player Richard fell on hard times, quite literally, when he missed a step and suffered “a nasty spill” at a game.

The accident hurt his pride as much as his body and the retiree, in his 70s, pulled back from social activities and stopped following football in the aftermath.

“It was embarrassing really. I mean, I’ve tackled blokes twice my size but this, it felt like the whole crowd was watching,” he said.

“Now I just find myself second-guessing every step I take.”

Richard, with his conversational manner and compassionate advice, is not actually human but a virtual helper crafted from pixels, powered by artificial intelligence, and programmed with human experiences by academics.

His character is one of six created at the University of NSW, designed to help people navigate challenges such as dementia, ageing, trauma, eating disorders and depression.

But the researchers, as well as experts in technology and psychology, warn not all AI chatbots can be relied upon to help people in need.

There could be serious risks in turning to generic generative AI services such as ChatGPT, Google Gemini or Meta AI for therapeutic help, they say, as they are not trained to assist with mental health and may simply agree with users rather than helping them.

UNSW’s AI characters were created to be virtual companions to the people they help, loaded with real-life experiences from people in similar situations, says Jill Bennett, who leads its Big Anxiety Research Centre.

The characters felt a little flat and basic before generative AI large-language models appeared, she says, and the technology helped to make them more conversational and interactive.

“The idea is you create a very relatable character on the understanding that people don’t want to be seeking help, people don’t want to sign up for a boring course, and we’re creating someone who can be a mate and chat about footy but also tell his story,” Professor Bennett told AAP.

“Once you have these relatable characters that people trust and find enjoyable and stimulating, you can deliver good information to them.”

One of the group’s most advanced AI characters is Viv, who has been trained to help people with dementia.

The older female character is designed to listen to users, share experiences, provide “reality testing” by explaining surroundings, and provide a calm, even-tempered and non-judgemental presence.

“In aged care, there may not be someone to sit with you and talk for an extended period and if you’re home alone there definitely wouldn’t be,” Prof Bennett said.

“The idea is that a character could be gently calming, could be reassuring, could be clarifying if you don’t know whether something’s real or not, or it could just have a soothing effect from having someone who hears you and is with you and makes you feel that things are all right.”

Other characters include a supportive coach for people with eating disorders and a therapist for those battling depression, and the group is investigating companions to address loneliness.

The AI companions can be delivered on a TV screen or another device, Prof Bennett says, and the group plans to launch characters commercially before the end of 2025.

But using generative AI tools to help with personal problems is a trend that is already taking off, RMIT psychology senior lecturer James Collett says.

“ChatGPT is already out of the bottle, we can’t put it back in, and people are going to use it for psychotherapy and even for companionship,” he said.

Perhaps surprisingly, generative AI services can offer limited help in some instances, he says, even though he does not recommend using them.

AI chatbots could deliver reflective questions that make users consider their own feelings, he says, or help to validate their experiences.

They could not draw on body language and non-verbal cues like a therapist though, Dr Collett says, and they should not be used to diagnose an issue.

“The real risk is not that (users are) going to experience a catastrophe, the risk is that they’re not going to benefit in the long-term,” he says.

“It might take the edge off enough that they avoid engaging in some sort of psychotherapy that could be a better long-term solution.”

AI services are also commonly programmed to agree with users, rather than gently challenging their statements, Dr Collett says, which could prove problematic.

Chatbots have been associated with at least two deaths worldwide, including a 14-year-old American boy suffering from depression who, according to a lawsuit, took bad advice from a virtual companion.

Specialised AI technology could assist people in certain circumstances, UNSW AI Institute chief scientist Toby Walsh says, such as situations in which they may fear judgement or feel more comfortable communicating with software.

Ultimately, he says, AI-based therapy or companions will not be able to supplant human assistance but merely supplement it.

“There’s nothing that can replace our human connections with each other,” he said.

“We’re never going to fully relate to a machine because it’s never going to suffer like us, it’s never going to enjoy the highs of life, it’s never going to fall in love and, equally, it’s never going to lose a loved one or have to contemplate its own mortality.”


Australian Associated Press
