Why we’ll never sell your data to advertisers

In the wake of the worst year on record for social media giants, it’s little wonder that people are skeptical of tech companies and how they will treat our data.

Amid scandal after scandal, and a growing understanding of how our emotions are weaponized online for political ends, how can people judge the motivations of companies that are under pressure to grow their bottom line?

First and foremost, this company was established by psychologists, which means we are subject to an ethical code of conduct. For us, not selling user data is self-evident. But there are also several business reasons why selling your data would be a horrible strategy for us, and those are the focus of this post.

The whole concept behind Woebot is a relational interface that facilitates sharing and challenging your thoughts in confidence. If we were to sell data to advertisers we would instantly undermine that trust, and we would lose our users. This would constitute an incredibly poor business decision that is contrary to the entirety of what Woebot represents.

It’s not the business we’re in. Being a data-driven company does not mean our data are where the value lies; the value we create is in the service we provide. This is why we’re committed to clinical outcomes research: to investigate and demonstrate value in reducing symptoms, and because we strongly believe that apps claiming a benefit should have data showing that benefit. In other words, our investment has been in symptom change outcomes, not in data gathering. If it were the latter, we would have built our product around gathering as much data as possible.

Woebot is a tool that helps people challenge distorted thinking, while advertisers sell to that distorted thinking. One of the best examples is the common distorted thought: “I’m not X enough,” where X can be anything – good, successful, attractive, effective, nice, intelligent, and so on. Advertisers rely on exactly this cognitive distortion. As psychologists, our profession is devoted to learning how to help people challenge this way of thinking. As a mental health service, why would Woebot, a tool that helps people fight these distortions, suddenly align with an industry that creates them?

Selling data is a commercialization strategy, often for businesses that have no obvious pathway to monetization. This is why free apps carry annoying ad banners, and the ones that don’t should have a clear reason why not. Happily, we are not in this boat: we have a clear pathway to monetization based on creating symptom change, as outlined above. When we first launched the Woebot service, we charged individuals $39/month on a subscription basis, and we created sufficient value that enough individuals were happy to pay this amount. After we raised our Series A funding round, we were able to remove the fee so that we could understand how people naturally engage with Woebot over time. We’ve always been transparent about our decisions to charge or not, and we feel other companies should be too.

At this point you may be asking yourself: how can we claim not to sell to advertisers and yet launch the service in Facebook Messenger?

Two important deciding factors here are informed consent and empowerment. We’ve always been transparent about the fact that those using Woebot on Facebook’s platform are subject to Facebook’s data policy, and ultimately we believe it’s up to each individual whether to use the service or not. Our Facebook users are our most international, with the majority coming from outside Western countries and many using the service on 2G phones. For many people, Facebook is the only way to access Woebot; those who do have a choice can use Woebot on our native apps.

We believe that people need to be vigilant about how their data are used and hold companies accountable for improper or exploitative practices. But not all companies are trying to exploit data. If you have a healthy distrust of the motives of all tech companies (understandable), then your criteria for judgment could also include how long a company would be likely to survive if exploitation or negligence were part of its operation.