
The women falling prey to the new wild west of online therapy: Pouring out your darkest secrets via an app has suddenly taken off. But as these alarming stories reveal, paying a stranger (or even, incredibly, a robot) for help with your mental health can do more harm than good

  • Digital therapy is booming and there are three main types of mental health app
  • Dr Melanie Smart warns there are some people practising outside of their remit
  • Vulnerable users claim they have been ‘ghosted’ by online therapists and left anxious
  • Becky Knox, from West Sussex, says money was taken despite not receiving the service

You open your phone and there’s a notification — you’ve got a match. You feel excited and hopeful and hurriedly compose a message saying hello. Then you wait eagerly for a reply.

No, this isn’t Tinder or any other modern dating app, where singles swipe through potential partners looking for love — though it bears many similarities to them.

This is an online gateway into a far less frivolous world, where the match is not a date but a mental health ‘therapist’, who promises to banish your anxiety, help you sleep better, lift your depression or overcome your OCD. All at the touch of a smartphone screen.

Sound too good to be true? As Covid sends us online for all manner of services, digital therapy is booming. Yet it’s an industry subject to a raft of disturbing complaints from vulnerable users, who claim they’ve been ‘ghosted’ by online therapists, triggered by unsuitable advice and left more anxious than they were.


‘There are people out there practising massively outside of their remit,’ says registered psychologist and clinical director of Chichester Child Psychology, Dr Melanie Smart, who warns of potentially ‘fatal consequences’.

‘Some are pretending to be way more qualified than they are. They could cause further trauma and even suicide.’

Her warning comes at a time of rising demand for accessible therapy of the kind these apps claim to deliver.

The mental health charity Mind says 60 per cent of adults feel their mental health has deteriorated during the pandemic, while figures from digital health specialists ORCHA (Organisation for the Review of Care and Health Applications) show between the summers of 2019 and 2020, downloads of mental health apps increased by 200 per cent.

There are about 370,000 health-related apps available online and, in terms of mental health, there are three main types:

  • Apps that offer self-help tips and techniques with no real-time interactive features (this would include NHS-approved apps such as Calm Harm and distrACT)
  • Portals that match you with a human therapist, to whom you talk via text whenever you like, or by video appointment
  • AI therapy apps offering automated ‘chatbot’ services

This last invites you to divulge your mental health problems directly into the ether, with no real sense of what or who might be listening.

Many apps are free to download with in-app purchases, and some work on subscription-style models similar to Netflix.

Those that offer video calls do not always reveal the full name of your online therapist, making it impossible to check their credentials.

Catherine Burgess, 36, (pictured) from Bedford, said her online therapist was a chatbot who wasn’t empathetic and left her feeling worse

The former Lib Dem spokeswoman for health in the House of Lords, Baroness Jolly, calls it ‘a complete Wild West’.

‘Mobile phone apps abound to increase wellbeing, gain confidence and sort personal problems,’ she says. ‘But there’s not a lot of research on this, and there is no regulation.

‘Often there is no way of knowing where the site is based, or the qualifications of the counsellor, and some may not guarantee you will be connected to the same person each time.’

Professionals have long complained that therapy lacks regulation in the UK, both offline and online — and you do not need accredited qualifications to call yourself a psychotherapist or counsellor.

‘Other health related job titles are protected,’ explains Baroness Jolly. ‘If you’re not a doctor you can’t prescribe medicine, but there is nothing to regulate individuals who practise bogus therapy.’

It’s a problem further amplified by services that offer ‘on-demand’ text therapy and aren’t even located in the UK, meaning they may fall outside the reach of the UK’s data-protection rules, including the General Data Protection Regulation (GDPR).

Take California-based BetterHelp, one of the most aggressively promoted e-counselling apps on the market, launched in 2013 by an Israeli serial entrepreneur with no medical background.

At BetterHelp, therapy means subscribing at a cost of $55 (£39) a week, being ‘matched’ with a therapist (often in America but potentially worldwide) and beginning an online dialogue — spilling your innermost worries to an anonymous recipient somewhere on the internet.


You’re shown your therapist’s credentials and location but only their first name, rendering it impossible for you to verify the information yourself.

For British woman Becky Knox, from West Sussex, the experience was disturbing. The 26-year-old began using the service this time last year to get support for her anxiety, depression and eating disorder.

She’d seen it advertised on YouTube and Facebook and was attracted by its apparent affordability and accessibility compared with face-to-face counselling. ‘Anyone who struggles can get help anytime, anywhere,’ reads the mission statement on the website.

‘When you first sign up, they ask you what kind of therapist you’d like. I asked for a woman in the LGBT community — I was given a straight man. It wasn’t the best start,’ says Becky.

She sent a message nonetheless — and waited a week for her therapist to reply. ‘It’s like when you’re ignored on a dating app, but worse.’

Victoria Wade, 26, (pictured) from Northamptonshire, claims the advice she was given made her anxiety spike

She gave up with her first ‘match’ and tried another. In six months, she cycled through three therapists, all in the U.S., all of whom took days to reply. When they did, she says the advice was rudimentary and seemed to lack expertise.

‘One time I was told to have a bath. Having a bath isn’t going to help an eating disorder.’

Becky requested phone sessions but could never be accommodated. Money was taken out of her account on a weekly basis, but she didn’t feel she was getting the service she’d signed up for — the therapy certainly wasn’t ‘anytime, anywhere’.

‘I thought it was going to be cheaper [than conventional therapy], but it ended up costing me a lot more.’ She also felt uncomfortable about the fact she didn’t know who she was speaking to.

When Becky cancelled her subscription, a further three payments were taken from her account. BetterHelp then took two weeks to respond to complaints.

‘The whole experience was awful. I wouldn’t recommend it to anyone. It made me much more anxious than I was to start with.’

BetterHelp did not comment on Becky’s claims.

However, Alon Matas, founder and president of BetterHelp, responded to points made by the Mail: ‘We strive to ensure that every one of our members receives the best level of care and the highest quality of service on every occasion and interaction.

‘Our member success team is always available to assist when this expectation is not met to the member’s satisfaction.

‘Our members’ privacy and the security of their data is extremely important to us. We are highly focused on the tools, resources and procedures we have in place to protect and safeguard member privacy and confidentiality, and to comply with the applicable data privacy laws.


‘BetterHelp is always transparent with our members with respect to their data, as explained in our Privacy Policy.’

The very notion of therapy-by-text is one that rings alarm bells for professionals, however. Nuance is hard to detect, body language impossible to observe, rapport tricky to build and misinterpretation likely.

‘Even with your friends and family you can have misunderstandings by text,’ says Steve Flatt, director of the Psychological Therapies Unit, a therapy centre in Liverpool. ‘People can be sloppy with their language and therapists are no exception. In this case it’s bloody dangerous, quite frankly.’

And leaving vulnerable people waiting days for a reply could have ‘very serious consequences’, he adds. ‘It’s a big deal to reach out. Not getting a reply might make patients retreat further and become more isolated.’

Elsewhere online, there are more troubling stories. Catherine Burgess, a 36-year-old buyer from Bedford, has tried ten apps since her divorce three years ago.

‘I’ve always been an optimistic person and never had any trouble with my mental health until the breakdown of my marriage. But that’s when I spiralled. I didn’t know what to do.’

Short of money because of the divorce, she looked online and tried Woebot, an AI chatbot developed by a team at Stanford University in 2017, which claims to help with ‘depression, anxiety, relationship problems, procrastination, loneliness, grief, addiction, pain management and more!’

But Catherine was not impressed. After exchanging a few texts with the bot, she found its responses strange and irrelevant. ‘It was like one of those horrible chatbots you get on banking. It wasn’t real or empathetic. It made me feel worse.’

Catherine felt she was somehow at fault for not ‘succeeding’ in getting better with the apps, and fell further into depression. Eventually, she decided to see a real-life therapist, who reacted with shock when she shared some of the ‘advice’ she’d been given via text.

Founder and president of Woebot Health, Alison Darcy, replies: ‘Woebot is a constrained system that is built within strict ethical standards and in accordance with best practices. Being a constrained system means that sometimes the range of pre-scripted options people can use to respond don’t fully reflect the nuance of how they’re feeling.

‘We’re always working to better understand people’s experience and fine tune the service.

‘We don’t do any advertising in our app and never sell data to third-party advertisers. When we share data with third parties it is de-identified, aggregated, and with health systems partners, and only after explicit user consent is given, in accordance with GDPR.’

As Catherine says, however, it’s clear many of these apps ‘were made by tech-developers rather than mental health experts’.

When ORCHA reviewed 600 mental health apps, it found less than 30 per cent met its quality thresholds. ‘There are some extremely unsafe products out there,’ says ORCHA’s CEO, Liz Ashall-Payne. ‘People are downloading trash apps with no proven efficacy. It’s dangerous from a clinical perspective.’

Victoria Wade, 26, a bartender from Northamptonshire, used an app called Bloom for her anxiety. Founded by two app creators who say its methods worked for them, it costs £10 a month and includes self-help videos, an ‘emotion tracking’ feature and CBT exercises.

For Victoria, it only added to her struggles. ‘It told me anxiety was triggered by certain foods such as sugar and caffeine which made me scared to consume those things.’ Her anxiety spiked, leaving her unable to sleep due to near-constant racing thoughts.

Leon Mueller, chief executive of Bloom, responds: ‘All our content is developed together with Seth Gillihan, a leading CBT therapist in the U.S. and best-selling author.

‘There are many studies that show that Mobile CBT treatment is helpful and effective.’

He adds: ‘Data privacy is something we care about. We have just updated our privacy policies with our lawyers to meet all the guidelines but will continue to take more measures. We are not selling and will never sell any data to third parties.’

However, researchers warn that health apps pose an ‘unprecedented’ privacy risk. Even when data isn’t sold, it’s still often shared with social media platforms — a BMJ study looking at 24 apps found that 19 shared user data with third parties, including Facebook, Google and Amazon.

According to ORCHA, you’re more likely to download an unhelpful mental health app than a helpful one — so be warned. When it comes to a mental health problem requiring fully-qualified, empathetic, real-time human contact, there probably isn’t an app for that.

For a list of apps that did meet ORCHA’s standards, visit
