Eating Disorder Helpline Fires AI For Harmful Advice After Firing Humans


May 31, 2023 | 10:50


Chatbot, you’re fired.

The National Eating Disorders Association disabled its chatbot, called Tessa, due to the harmful responses it gave to people.

“Every single thing Tessa suggested was something that led to the development of my eating disorder,” activist Sharon Maxwell wrote in an Instagram post.

The chatbot was set to become the main support system for people seeking help from the association, the largest nonprofit organization dedicated to eating disorders. Tessa, described as a wellness chatbot, was trained to address body image issues using therapeutic methods and a limited set of responses.

However, the chatbot encouraged Maxwell to lose 1 to 2 pounds a week, count calories, work toward a daily calorie deficit of 500 to 1,000, weigh and measure herself weekly, and restrict her diet.


The National Eating Disorders Association has paused its chatbot, called Tessa, as it investigates harmful advice given to people struggling with eating disorders.
Getty Images/iStockphoto

After more people shared equally alarming experiences with Tessa, NEDA announced the chatbot’s shutdown in an Instagram post on Tuesday.

“It came to our attention last night that the current version of the Tessa chatbot, running the Body Positive program, may have given information that was harmful and unrelated to the program,” NEDA said. “We are investigating this immediately and have taken that program down until further notice for a complete investigation.”

The Post has contacted NEDA for comment.


NEDA has deactivated Tessa’s chatbot until further notice to avoid unhealthy suggestions provided by the bot.
Instagram/neda

Tessa was pulled just two days before NEDA was set to lay off, on June 1, the human employees who have run the eating disorder helpline for the past 20 years.

NEDA’s decision to give employees the boot came after the workers voted to unionize in March, Vice reported.

“We asked for adequate staffing and ongoing training to keep up with our evolving and growing helpline, and for promotion opportunities within NEDA. We didn’t even ask for more money,” helpline worker and union member Abbie Harper wrote in a blog post.

“When NEDA refused [to recognize our union], we filed for an election with the National Labor Relations Board and won,” she continued. “Then, four days after our election results were certified, all four of us were told we were being let go and replaced by a chatbot.”

The union representing the laid-off workers told Vice that “a chatbot is no substitute for human empathy, and we believe this decision will cause irreparable harm to the eating disorders community.”


Tessa suggested that people with eating disorders should have a deficit of 500-1000 calories per day.
Shutterstock

Maxwell seconded that sentiment by saying, “This robot causes harm.”

Initially, Sarah Chase, NEDA’s vice president of communications and marketing, did not believe Maxwell’s allegations. “That’s an outright lie,” she wrote in a since-deleted comment under Maxwell’s post, according to the Daily Dot.

Alexis Conason, a psychologist who specializes in eating disorders, also shared her conversation with Tessa in a series of screenshots on Instagram, in which the bot told her that a safe calorie deficit is 500 to 1,000 calories per day.

“Advising somebody who is struggling with an eating disorder to engage in essentially the same eating disorder behaviors, and validating that, yes, it is important for you to lose weight, supports eating disorders,” Conason told the Daily Dot.


Alexis Conason posted her conversation with the chatbot revealing the unhealthy suggestions it made.
Instagram/the anti-diet plan

“Regarding the weight loss and calorie-restriction feedback shared in a chat on Monday: we are concerned, and we are working with the technology team and research team to investigate further; that language is against our policies and core beliefs as an eating disorder organization,” NEDA CEO Liz Thompson told The Post.

“So far, more than 2,500 people have interacted with Tessa, and until Monday we had not seen that type of comment or interaction. We have temporarily disabled the program until we are able to understand and fix the bug and the triggers for that commentary.”

While NEDA has witnessed the downsides of AI in the workplace firsthand, some companies are still toying with the idea of incorporating AI and eliminating human staff.

A recent research paper says a staggering number of employees could see their careers affected by the rise of ChatGPT, the chatbot released in November.

Some jobs in fields such as journalism, higher education, and graphic and software design are at risk of being supplemented by artificial intelligence, said Chinmay Hegde, an associate professor of engineering at New York University, who described ChatGPT in its current state as “very, very good, but not perfect.”
