By Alison Darcy, Founder and President

We recently updated our Privacy Policy with clearer language about how we protect our users’ privacy and security. It was an important update, because privacy is an important topic.

We know that our whole service, which supports people’s mental health, is based on trust. So not only do we believe in data privacy — our business, and the users we aim to serve, depend on it. 

I wanted to confirm some of our most important data practices:

We do not share or sell user data to advertising companies or for advertising purposes

I wrote about this in the blog post Why We’ll Never Sell Your Data to Advertisers nearly four years ago. We didn’t do it then. We don’t do it now. We will never do it. We do work with marketing partners to promote Woebot Health initiatives: for example, we promote research initiatives using Facebook ads and target those ads to relevant populations based on their Facebook interests. No personal data is shared or sold to these marketing/advertising partners.

We treat all user data as Protected Health Information (PHI) and adhere to all HIPAA and GDPR requirements

We do this even when the data is not legally classified as such. We also secure sensitive data in a dedicated environment for clear access control, and we’re assessed by an external party every year to ensure we’re compliant. These are rigorous, clinical-level protocols we choose to take on for the sake of privacy.

When we do share data, it is with a specific purpose and intent

We are transparent with users about how data is processed and used. For the vast majority of users, data is only shared with service providers who make the Woebot app work, or in the rare circumstance when we must comply with law enforcement. For a fraction of users who have chosen to participate in a partner program, such as with a research institution, health system or employer, we may share certain data with those partners, but only when users have provided explicit agreement. In these partnerships, we scrutinize every piece of data we might share to ensure it’s in the service of good. 

We apply AI for learning in specific ways

We periodically review de-identified portions of conversations and compare the AI-suggested path to the path chosen by the user. When these paths diverge, we retrain our algorithms using the additional de-identified data. This is how Woebot’s conversational ability improves and learns.
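The review loop described above can be sketched in a few lines. This is a minimal illustration, not Woebot’s actual pipeline: the `Session` fields and the function name are hypothetical, and the real system operates on de-identified conversation data at much larger scale.

```python
from dataclasses import dataclass

@dataclass
class Session:
    """A de-identified conversation record (illustrative structure only)."""
    session_id: str
    suggested_path: str  # the conversational path the AI proposed
    chosen_path: str     # the path the user actually took

def select_retraining_examples(sessions):
    """Keep only sessions where the user diverged from the AI's suggestion.

    Divergent examples become additional training data, so the model
    learns from the paths users actually prefer.
    """
    return [s for s in sessions if s.suggested_path != s.chosen_path]

sessions = [
    Session("a1", suggested_path="breathing", chosen_path="breathing"),
    Session("a2", suggested_path="journaling", chosen_path="gratitude"),
]
divergent = select_retraining_examples(sessions)
print([s.session_id for s in divergent])  # → ['a2']
```

Only the divergent session (`a2`) is selected; sessions where the suggestion matched the user’s choice carry no new signal for retraining.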

We also have core beliefs that guide these practices

We ask for consent before collecting any data and outline each type of data including when and how that data can be used. Data sharing is not inherently bad. It can lead to better care in a clinical context or be used to answer questions that move the growing fields of technology and healthcare forward for everyone. The most important thing is that every user is properly informed about how their data may be used so they can decide whether they’re OK with it, or not.

We believe users own their data

Long before we were legally required to do so, we were empowering people to delete their data or obtain a copy of it. And we try to make the process simple — users can instantaneously delete their data in the app itself by asking Woebot.

We believe in helping people

Our basic app is free, in keeping with our mission: to make mental health radically accessible. We hear far too often about the crippling cost of healthcare and are proud to be part of the solution for many people who need help.

We believe in regulation

That’s why we’re doing the hard work to seek FDA clearance for our products. This is not a path for the faint-hearted, and very few in the mental health app space have followed it. It denotes not just the highest standard of clinical effectiveness but excellence in important practices like engineering, data handling and privacy as well. 

While we have a published Privacy Policy, that is not the only way that a company communicates its business practices. We’ve talked regularly and publicly about privacy and security since Day 1, nearly five years ago. Some examples: We’ve written about privacy on our blog in posts like Woebot Health & GDPR: We Think About Privacy at Every Level of Development, talked about it in countless articles like this interview with ABC7 News, and even hosted a Reddit AMA.

Open conversations about privacy are important to us. Please reach out to us at privacy@woebothealth.com with your comments or questions. We hope that continued conversation will help illuminate the honorable work being done by many, including Woebot Health, to create accessible, clinically founded, and private spaces at a time when people need them most.