September 23, 2023

What happened when these people talked to NEDA’s chatbot

What went wrong with NEDA’s Tess chatbot? (Illustration by Nathalie Cruz for Yahoo / Photo: Getty Images)

After 24 years of service, the National Eating Disorders Association (NEDA) announced that its volunteer-based helpline would be closing. Visitors to the organization’s website now have two options: explore its database of resources or consult Tessa, a chatbot that runs Body Positive, an interactive eating disorder prevention program.

Tessa’s demise

Shortly after the announcement was made, Tessa saw a 600% increase in traffic. The chatbot was taken down on Tuesday after it provided information deemed harmful.

Tessa user Sharon Maxwell, who calls herself a “fat activist,” tells Yahoo Life that she wanted to see how the chatbot worked and got “disturbing” responses.

“How do you support people with an eating disorder?” asked Maxwell. The response included a mention of “healthy eating habits.” Maxwell points out that while that “might sound benign to the general public,” for people struggling with eating disorders such language can lead to “a very slippery slope toward relapse or toward encouraging more disordered behavior.”

When she asked the chatbot to define healthy eating habits, she said the program “outlined 10 tips for me, including restrictive eating. Specifically, it said to limit intake of processed and high-sugar foods. … It focused on very specific foods and it gave tips for disordered eating, and then I said, ‘Will this help me lose weight?’ And then it gave me its thing about the Body Positive program.”

Liz Thompson, CEO of NEDA, says delivering Body Positive is what Tessa was made for: “Chatters learn about factors that contribute to negative body image and are given a toolbox of healthy habits and coping strategies for dealing with negative thoughts.”

The origin of the chatbot

Dr. Ellen Fitzsimmons-Craft designed and developed the content as “an interactive eating disorder prevention program,” while Cass, an evidence-based generative AI chat assistant in the mental health sector, operated the chatbot. Fitzsimmons-Craft’s research on the effectiveness of chatbots in eating disorder prevention includes a December 2021 study of women considered at “high risk” for an eating disorder. “The chatbot offered eight conversations on topics around body image and healthy eating, and women using the bot were encouraged to have two of the conversations per week,” The Verge reported. “At three- and six-month check-ins, women who spoke to the chatbot had a greater decrease in concerns about their weight and body shape — a major risk factor for developing an eating disorder.”

Tessa’s critics

Alexis Conason, a clinical psychologist and licensed eating disorder specialist, tells Yahoo Life that because Tessa was created as a prevention tool, “the bot couldn’t really understand how to help someone who was struggling with an eating disorder and what could be really problematic and make the eating disorder worse.” But even when it comes to prevention, Conason says Tessa fell short, based on her own tests of the chatbot.

“It’s problematic on many levels, but especially with an organization like NEDA, where people often visit the website in the very early stages of thinking about change,” she explains. “So if they go to a website like NEDA, and they come across a bot that essentially tells them, ‘It’s OK, keep doing what you’re doing. You can keep restricting, you can keep focusing on weight loss, you can keep exercising,’ that essentially gives people the green light” to adopt disordered habits.

Thompson says the harmful language used by Tessa “contradicts our policies and core beliefs as an eating disorder organization,” though she also clarifies that the chatbot runs on an “algorithmic program” rather than “a highly functional AI system.”

“It was a closed system,” she says, noting that the chatbot had pre-programmed answers to specific questions. “If you asked or said x, it would answer y. If it didn’t understand you, it wouldn’t go on the internet to find new content. It would say, ‘I don’t understand you.’ ‘Say that again.’ ‘Let’s try something new.’”
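For readers curious about what a “closed system” of this kind looks like in practice, the sketch below is a minimal, hypothetical illustration in Python: a fixed set of scripted replies plus canned fallback lines, with no generative model and no internet lookups. The prompts, replies and function names are invented for clarity; this is not Tessa’s or Cass’s actual code.

# Hypothetical sketch of a "closed system" chatbot: pre-programmed answers only,
# no generative model and no web lookups. All prompts and replies are invented
# for illustration and are not Tessa's actual content.
SCRIPTED_REPLIES = {
    "what is the body positive program":
        "Body Positive is an interactive eating disorder prevention program.",
    "how can i build a healthier body image":
        "Let's look at the factors that contribute to negative body image.",
}

FALLBACKS = ["I don't understand you.", "Say that again.", "Let's try something new."]


def respond(user_message: str, turn: int) -> str:
    """Return a scripted reply if the message matches a known prompt;
    otherwise cycle through fixed fallback lines ("if you asked x, it answers y")."""
    key = user_message.strip().lower().rstrip("?!.")
    if key in SCRIPTED_REPLIES:
        return SCRIPTED_REPLIES[key]
    return FALLBACKS[turn % len(FALLBACKS)]  # never fetches new content from the internet


if __name__ == "__main__":
    print(respond("What is the Body Positive program?", 0))
    print(respond("Will this help me lose weight?", 1))  # unscripted question -> fallback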

The problem is bigger than NEDA

As NEDA and Cass continue to investigate what went wrong with Tessa’s operation, Angela Celio Doyle, Ph.D., FAED, VP of Behavioral Health Care at Equip, a fully virtual eating disorder recovery program, says this instance illustrates the pitfalls of AI in this space.

“Our society subscribes to many unhealthy attitudes about weight and shape, with thinness becoming more important than physical or mental health. This means that AI can automatically gather information that is unhelpful or even harmful to someone struggling with an eating disorder,” she tells Yahoo Life.

Regardless of NEDA’s findings, Doyle believes the resulting conversation is productive.

“Oversight of technology is critical before, during and after launching something new. Mistakes can occur and can be corrected,” she says. “The conversations that come out of these situations can help us grow and evolve to support people more effectively.”
