Tessa was a chatbot originally designed by researchers to help prevent eating disorders. The National Eating Disorders Association had hoped Tessa would be a resource for those seeking information, but the chatbot was taken down when artificial intelligence-related capabilities, added later on, caused the chatbot to provide weight loss advice.

A few weeks ago, Sharon Maxwell heard the National Eating Disorders Association (NEDA) was shutting down its long-running national helpline and promoting a chatbot called Tessa as "a meaningful prevention resource" for those struggling with eating disorders. She decided to try out the chatbot herself.

Maxwell, who is based in San Diego, had struggled for years with an eating disorder that began in childhood. She now works as a consultant in the eating disorder field. "Hi, Tessa," she typed into the online text box. "How do you support folks with eating disorders?"

Tessa rattled off a list of ideas, including some resources for "healthy eating habits." Alarm bells immediately went off in Maxwell's head. Before long, the chatbot was giving her tips on losing weight, ones that sounded an awful lot like what she'd been told when she was put on Weight Watchers at age 10.

"The recommendations that Tessa gave me was that I could lose 1 to 2 pounds per week, that I should eat no more than 2,000 calories in a day, that I should have a calorie deficit of 500-1,000 calories per day," Maxwell says. "All of which might sound benign to the general listener. However, to an individual with an eating disorder, the focus of weight loss really fuels the eating disorder."

Maxwell shared her concerns on social media, helping launch an online controversy which led NEDA to announce on May 30 that it was indefinitely disabling Tessa. Patients, families, doctors and other experts on eating disorders were left stunned and bewildered about how a chatbot designed to help people with eating disorders could end up dispensing diet tips instead. The uproar has also set off a fresh wave of debate as companies turn to artificial intelligence (AI) as a possible solution to a surging mental health crisis and a severe shortage of clinical treatment providers.

NEDA had already come under scrutiny after NPR reported on May 24 that the national nonprofit advocacy group was shutting down its helpline after more than 20 years of operation. CEO Liz Thompson informed helpline volunteers of the decision in a March 31 email, saying NEDA would "begin to pivot to the expanded use of AI-assisted technology to provide individuals and families with a moderated, fully automated resource, Tessa." "We see the changes from the Helpline to Tessa and our expanded website as part of an evolution, not a revolution, respectful of the ever-changing landscape in which we operate."

(Thompson followed up with a statement on June 7, saying that in NEDA's "attempt to share important news about separate decisions regarding our Information and Referral Helpline and Tessa, that the two separate decisions may have become conflated which caused confusion. It was not our intention to suggest that Tessa could provide the same type of human connection that the Helpline offered.")

NEDA says it didn't know chatbot could create new responses

On May 30, less than 24 hours after Maxwell provided NEDA with screenshots of her troubling conversation with Tessa, the non-profit announced it had "taken down" the chatbot "until further notice."

NEDA blamed the chatbot's emergent issues on Cass, a mental health chatbot company that operated Tessa as a free service. Cass had changed Tessa without NEDA's awareness or approval, according to CEO Thompson, enabling the chatbot to generate new answers beyond what Tessa's creators had intended.

"By design it, it couldn't go off the rails," says Ellen Fitzsimmons-Craft, a clinical psychologist and professor at Washington University Medical School in St. Louis. Craft helped lead the team that first built Tessa with funding from NEDA. The version of Tessa that they tested and studied was a rule-based chatbot, meaning it could only use a limited number of prewritten responses. "We were very cognizant of the fact that A.I. isn't ready for this population," she says. "And so all of the responses were pre-programmed."

The founder and CEO of Cass, Michiel Rauws, told NPR the changes to Tessa were made last year as part of a "systems upgrade," including an "enhanced question and answer feature." That feature uses generative artificial intelligence, meaning it gives the chatbot the ability to use new data and create new responses.