Eating Disorder Helpline Chatbot Goes Terribly Wrong →

Last week, news broke that the National Eating Disorders Association (NEDA) planned to fire its staff and volunteer helpline workers and replace them with a chatbot named Tessa, less than a week after the workers formed a union.

That is going about as well as you would think, as Chloe Xiang writes:

As of Tuesday, Tessa was taken down by the organization following a viral social media post displaying how the chatbot encouraged unhealthy eating habits rather than helping someone with an eating disorder.

"It came to our attention last night that the current version of the Tessa Chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program," NEDA said in an Instagram post. "We are investigating this immediately and have taken down that program until further notice for a complete investigation."

It’s one thing to have an AI hallucinate and get some simple fact incorrect. This is way, way worse. Shame on the people at NEDA who made this call.