
Wysa Raises $5.5M to Enhance AI-Powered Mental Health Support

May 21, 2021

The Rise of Emotionally Intelligent AI in Mental Wellness

Talking about personal feelings can be difficult. Jo Aggarwal, founder and CEO of Wysa, envisions a future in which individuals might more readily share their thoughts with an artificial intelligence: specifically, an AI designed with emotional intelligence.

Introducing Wysa: An AI-Powered Mental Health Companion

Wysa is a mental health application powered by AI, developed by Touchkin eServices, the company led by Aggarwal. The company currently operates from locations in Bangalore, Boston, and London. Functioning similarly to a chatbot, Wysa offers affirming responses and guides users through over 150 distinct therapeutic techniques.

From Elder Care to Mental Health Innovation

Wysa represents Aggarwal’s second entrepreneurial undertaking. Her initial venture, focused on elder care, did not achieve the desired market traction. This experience led Aggarwal to confront personal challenges with depression, ultimately inspiring the creation of Wysa in 2016.

Funding and Growth

In March, Wysa was selected as one of 17 participants in the Google Assistant Investment Program. Subsequently, in May, the company secured a Series A funding round of $5.5 million, spearheaded by W Health Ventures of Boston, alongside contributions from the Google Assistant Investment Program, pi Ventures, and Kae Capital.

Aggarwal reports that Wysa has accumulated a total of $9 million in funding. The company employs 60 individuals full-time and currently serves approximately three million users.

Focus on Support, Not Diagnosis

The primary objective, according to Aggarwal, is not to provide mental health diagnoses. Wysa primarily caters to individuals seeking an outlet for expression. A significant portion of Wysa’s user base utilizes the app to enhance their sleep, manage anxiety, or improve interpersonal relationships.

“Approximately 10% of Wysa’s three million users are estimated to require a formal medical diagnosis,” states Aggarwal. When a user’s interactions with Wysa yield high scores on established questionnaires, such as the PHQ-9 for depression or the GAD-7 for anxiety, the app suggests seeking guidance from a qualified human therapist.
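The PHQ-9 itself is a standard instrument: nine items each rated 0–3, summed to a total of 0–27, with 10 the commonly used cutoff for moderate depression. The sketch below illustrates that kind of threshold check; the function names and the escalation policy are invented for illustration, and the article does not describe Wysa's actual implementation.

```python
# Hypothetical sketch: scoring a PHQ-9 questionnaire and deciding
# whether to suggest a human therapist. The escalation policy here
# is illustrative, not Wysa's actual logic.

PHQ9_MODERATE_CUTOFF = 10  # standard cutoff for moderate depression

def phq9_total(item_scores):
    """Sum nine items, each rated 0-3, giving a total of 0-27."""
    if len(item_scores) != 9 or any(s not in range(4) for s in item_scores):
        raise ValueError("PHQ-9 expects nine items scored 0-3")
    return sum(item_scores)

def should_suggest_therapist(item_scores):
    return phq9_total(item_scores) >= PHQ9_MODERATE_CUTOFF

# Example: answers totaling 10 cross the moderate-depression cutoff.
print(should_suggest_therapist([2, 2, 1, 1, 1, 1, 1, 1, 0]))  # True
```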

A Complementary Tool for Mental Wellbeing

It’s important to recognize that a clinical mental health diagnosis isn't a prerequisite for benefiting from therapeutic support.

Aggarwal emphasizes that Wysa is not intended as a substitute for traditional therapy, but rather as an accessible tool for daily interaction and support.

The Power of Being Heard

“A substantial 60% of individuals who engage with Wysa simply need to feel understood and validated. However, when provided with self-help techniques, they can proactively address their concerns and experience improvement,” Aggarwal explains.

Refining Wysa’s Approach

Wysa’s methodology has been continuously refined through user feedback and insights from mental health professionals, according to Aggarwal.

During a conversation, Wysa categorizes user statements and then applies a relevant therapeutic approach, such as cognitive behavioral therapy or acceptance and commitment therapy. The app then presents pre-written questions or techniques developed by therapists to facilitate the interaction.
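As a rough illustration of this categorize-then-route flow, the sketch below classifies a statement and picks a matching technique prompt. The categories, keyword rules, and prompts are all invented for the example; the article does not disclose Wysa's internals, which presumably use a trained model rather than keywords.

```python
# Hypothetical sketch of a categorize-then-route chatbot turn:
# classify the user's statement, then select a therapist-written
# prompt for a matching technique. All labels and prompts are
# placeholders invented for illustration.

TECHNIQUE_PROMPTS = {
    "negative_self_talk": (
        "CBT",
        "What evidence do you have for and against that thought?",
    ),
    "avoidance": (
        "ACT",
        "Can you make room for that feeling while taking one small step?",
    ),
}

def categorize(statement: str) -> str:
    """Toy keyword classifier standing in for a trained model."""
    text = statement.lower()
    if "failure" in text or "can't do anything" in text:
        return "negative_self_talk"
    return "avoidance"

def respond(statement: str) -> str:
    technique, prompt = TECHNIQUE_PROMPTS[categorize(statement)]
    return f"[{technique}] {prompt}"

print(respond("I'm a failure at everything"))
```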

Insights from Millions of Conversations

Aggarwal notes that Wysa has gleaned valuable insights from over 100 million conversations conducted through the platform.

“For example, when addressing anger towards another person, therapists initially favored the ‘empty chair’ technique, encouraging users to consider the other person’s perspective. However, we discovered this technique was ineffective for individuals experiencing feelings of powerlessness or lacking trust, such as teenagers and their parents,” she explains.

“Over 10,000 users with trust issues were unwilling to engage with the empty chair exercise. This prompted us to explore alternative approaches, ultimately shaping the evolution of Wysa.”

Collaboration with Research Institutions

While Wysa’s development is driven by real-world application, research institutions have also contributed to its progress. Pediatricians at the University of Cincinnati assisted in creating a module specifically addressing anxiety related to COVID-19. Ongoing studies at Washington University in St. Louis and the University of New Brunswick are also exploring Wysa’s potential to support individuals coping with the mental health effects of chronic pain, arthritis, and diabetes.

Real-World Implementation

Wysa has been tested in practical settings. In 2020, the government of Singapore licensed Wysa, offering free access to the service to help citizens manage the emotional impact of the coronavirus pandemic. Aetna, a health insurance provider, also offers Wysa as a supplementary resource within its Employee Assistance Program.

Prioritizing Safety and Compliance

A primary concern with mental health apps is the potential to inadvertently trigger a crisis or misinterpret signs of self-harm. To address this, the U.K.’s National Health Service (NHS) has established specific compliance standards.

Wysa is compliant with the NHS’ DCB0129 standard for clinical safety, making it the first AI-based mental health app to achieve this recognition.

To meet these standards, Wysa appointed a clinical safety officer and implemented “escalation paths” for individuals exhibiting signs of self-harm.

Wysa is designed to identify and flag responses indicative of self-harm, abuse, suicidal thoughts, or trauma. If such responses are detected, the app prompts the user to contact a crisis hotline.
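One simple way to picture such an escalation path is a safety check that runs before any normal reply. The keyword list and messages below are placeholders; a production system like the one described would use far more sophisticated detection than string matching.

```python
# Hypothetical sketch of an escalation path: scan a message for
# crisis indicators and, if any match, return a crisis-hotline
# prompt instead of a normal reply. Keywords and wording are
# placeholders, not Wysa's actual safeguards.

CRISIS_INDICATORS = ("want to die", "hurt myself", "end it all")

def crisis_check(message: str):
    text = message.lower()
    if any(phrase in text for phrase in CRISIS_INDICATORS):
        return ("It sounds like you're going through a lot. "
                "Please consider contacting a crisis hotline right now.")
    return None  # no escalation; continue the normal conversation

print(crisis_check("some days I just want to end it all") is not None)  # True
```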

Regulatory Considerations in the U.S.

In the U.S., the Wysa app available for public download aligns with the FDA’s definition of a general wellness app, or “low-risk device.” This is significant because the FDA issued guidance during the pandemic to expedite the distribution of these types of applications.

Continuous Improvement and Refinement

However, Wysa’s categorization of user responses isn’t always perfect. A 2018 BBC investigation highlighted an instance where the app failed to recognize the seriousness of a proposed underage sexual encounter. Wysa responded by updating the app to better handle instances of coercive sexual behavior.

Aggarwal also notes that Wysa maintains a manual list of slang terms and phrases that the AI may not accurately identify or categorize as harmful. This list is regularly updated to ensure appropriate responses. “Our principle is that a response can be 80% appropriate, but must be 0% triggering,” she says.
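A manually maintained list like the one Aggarwal describes can be pictured as an override layer checked before the model's own judgment, so terms the model misses are still flagged. The entries and labels below are placeholders for illustration only.

```python
# Hypothetical sketch: a human-curated override list consulted
# before the model's classification, so known slang the model
# misclassifies is still treated as harmful. Entries are
# placeholders, not Wysa's actual list.

MANUAL_HARMFUL_SLANG = {"unalive", "kms"}  # updated by humans over time

def is_harmful(message: str, model_label: str) -> bool:
    words = set(message.lower().split())
    if words & MANUAL_HARMFUL_SLANG:   # manual list takes precedence
        return True
    return model_label == "harmful"    # otherwise defer to the model

print(is_harmful("i might just kms", "benign"))  # True: manual override
```

Layering a human-curated list over the model matches the stated principle that a response may be imperfect but must never be triggering: the override errs on the side of flagging.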

Future Expansion and Goals

Looking ahead, Aggarwal states the goal is to become a comprehensive service. Instead of referring patients who receive a diagnosis to external resources, Wysa aims to establish its own network of mental health providers.

Technologically, the company plans to expand into Spanish and explore the development of a voice-based system, guided by insights from the Google Assistant Investment Program.

#mentalhealth #AI #Wysa #funding #emotionallyintelligent #chatbot