Woebot does not suggest medications or clinical treatment protocols either; instead, it functions like a friend checking in, or, more accurately, a mechanical channel through which someone can feel safe sharing their thoughts without fear of judgment.

"It's more of a choose-your-own-adventure of how the conversation will play out," said Darcy, who did her post-doctoral psychiatry research at Stanford and still lectures there.

Rather than augmenting a real therapist, or even a non-clinical person, Woebot is wholly robotic: open to engage with an individual as often or as little as they want, depending on their needs. San Francisco-based Woebot Labs created the tool (originally intended for college students but later expanded to all adults) based on cognitive behavioral therapy (CBT) techniques.

"Rather than other mental health apps that use chatbots as a go-between or an assistant, we wanted to create an end-to-end therapeutic experience that is available 24 hours a day, seven days per week," Darcy said. "He always knows who you are and remembers everything you told him."
After a brief introduction, Woebot checks in to ask about the user's mood and thoughts, then employs CBT techniques to help reframe the thought patterns or negative emotions associated with mood disorders like anxiety and depression.

Microsoft's Tay offers a cautionary counterpoint. Tay, the creation of Microsoft's Technology and Research and Bing teams, was an experiment aimed at learning through conversations. She was targeted at American 18- to 24-year-olds (primary social media users, according to Microsoft) and "designed to engage and entertain people where they connect with each other online through casual and playful conversation." In less than 24 hours after her arrival on Twitter, Tay gained more than 50,000 followers and produced nearly 100,000 tweets. Soon, Tay began saying things like "Hitler was right i hate the jews" and "i fucking hate feminists." But Tay's bad behavior, it's been noted, should come as no big surprise. "This was to be expected," said Roman Yampolskiy, head of the Cyber Security lab at the University of Louisville, who has published a paper on the subject of pathways to dangerous AI.
"The system is designed to learn from its users, so it will become a reflection of their behavior," he said.

Woebot's memory, by contrast, is a selling point. "That's the best thing about technology: it never forgets," Darcy said. Woebot's brain, so to speak, also sets it apart from other health chatbots because it doesn't rely on data from any source other than the words the user texts to it. As a counter-example, the AI-powered patient engagement chatbots from Conversa overlay clinical conversation modules with internal data from their health system customers, as well as from the data collection platform Validic.

"We're not trying to replicate or replace how therapy with a real psychologist or psychiatrist would unfold, but create a new experience altogether for people who might not otherwise seek out mental health treatment," Darcy said. She was inspired to create an app to do that job by her varied professional experience.
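The check-in flow described earlier — ask the user's mood, then apply a CBT-style reframing prompt while remembering every exchange — can be sketched as a small rule-based program. This is a minimal, hypothetical illustration only; the prompts, mood list, and class names are my assumptions, not Woebot's actual implementation.

```python
# Hypothetical sketch of a rule-based mood check-in like the one the
# article describes. All prompts and names here are illustrative, not
# taken from Woebot itself.

NEGATIVE_MOODS = {"anxious", "sad", "stressed", "depressed"}

class CheckInBot:
    def __init__(self):
        self.history = []  # every exchange is retained; nothing is discarded

    def check_in(self, mood, thought=""):
        """Return the bot's next prompt based on the reported mood."""
        if mood.lower() in NEGATIVE_MOODS:
            # CBT-style reframe: ask the user to examine the evidence
            # behind the negative thought.
            reply = (f"You said you feel {mood} because: '{thought}'. "
                     "What evidence do you have that this thought is true?")
        else:
            reply = f"Glad to hear you're feeling {mood}!"
        self.history.append((mood, thought, reply))
        return reply

bot = CheckInBot()
print(bot.check_in("anxious", "I'll fail my exam"))
print(bot.check_in("happy"))
print(len(bot.history))  # both exchanges are remembered
```

The stored `history` list echoes the "it never forgets" point: each conversation turn stays available for later check-ins, which is what distinguishes a stateful companion bot from a one-shot Q&A widget.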