Liz Ashall-Payne is the founding CEO of ORCHA (the Organisation for the Review of Care and Health Apps). She speaks to Emma Wilkinson about what healthcare leaders need to understand about making the best use of the technology.
Can you tell us about ORCHA and your work on making digital health safer?
I trained as a speech and language therapist 25 years ago and recall the frustration of long waiting times. With the advent of digital health, there was an opportunity for more people to access care, but a gap remains between the technology and the NHS.
When it comes to medicines, there is a process of testing them and then introducing them into the formulary. Prescribers are trained in how to use them, and you can track their impact. I thought digital health apps needed the same, so I launched ORCHA in 2015.
Over the past eight years, we have built an infrastructure that allows us to assess and evaluate technologies. There are currently 360,000 digital health apps and we’ve assessed about 25,000 of them.
Why is this important? What are the pitfalls of health apps?
Every day, 5 million people download a digital health app. But you need a process of governance. Of the 25,000 we have looked at, only 20% meet the quality criteria. Our sample size is such that we can say many technologies out there are unsafe.
In terms of why it matters, there is a specific example from another country where a new product was launched to advise women using a tracker during the early stages of labour when to go to hospital to give birth. It told everybody to go too early, and the hospital was inundated.
How does ORCHA assess an app and what are you looking for?
We have built a technology platform that allows us to do this evaluation.
In the NHS, the Digital Technology Assessment Criteria (DTAC) brings together all the relevant standards, regulations, and legislation. Within that are data privacy, data security, clinical and professional assurance, evidence of impact and usability, and accessibility.
We’ve all heard of GDPR (the General Data Protection Regulation), but you also need to consider consent, authentication, and verification. Data security relates to what information is stored, where it is held, and how it is used. Then there are medical device regulations, but also very nuanced legislation around clinical safety, as well as NICE evidence standards. From a regulation point of view, the weakest area is usability and accessibility, but we are starting to see more interest there.
Not every product needs to meet every criterion. A very low-risk product that plays music to help relaxation and holds no data is very different from one that is clinical. For example, there is an app called Neomate that lets professionals input information about a child and tells them which drugs to give, and when. That is a medical device, so it has a different risk profile.
You have only looked at a small proportion of what’s out there, but 80% do not meet the standards. Why is that?
In most cases, it’s because the supplier didn’t know what they had to do. Against the NHS DTAC, most fail on the clinical safety side. If you’ve got clinical and professional medical leads, they’ll help you meet some requirements, but they may not know about data security and data privacy. Products also update and change frequently, and you need to make sure each update has been reassessed.
What responsibility does the NHS have in guiding people to the best apps?
There was an NHS Apps Library, but reassessment was never part of the approach, so it was shut down. NHSX decided to revisit the standards and ask the NHS, at organisational and regional levels, to do the assessment. However, there was a lot of duplication in the system, and organisations may not have the capacity or capability to assess. We work with about 70% of the NHS and do that job for them. There are products out there that claim that if you put your thumb on the screen, they will tell you your blood pressure. Sometimes they give false readings, so that person could go on to have a stroke or heart attack.
And when you have a good app, you need people to know about it?
NICE recently assessed and approved various apps for depression and anxiety. But how would you know if you didn’t see that press release or look on their website? And that is the case for clinicians as well as patients.
It’s a missed opportunity. There are 21,000 mental health apps on the market. You wouldn’t have 21,000 mental health drugs on the open market with NICE approving six and never telling anybody. I’m obsessed with supporting companies to become sustainable in a crowded marketplace because we’ll lose access to these innovations if we don’t.
How do you help NHS organisations provide access and information for patients on health apps?
The NHS in Bolton identified some specific areas – smoking, supporting people while waiting for surgery, heart health, and weight management. We created a digital health app library around them, which is promoted in local NHS campaigns. ORCHA powers it, but it’s their site.
You can search by factors such as whether it’s free or for use on an Android phone. Then, you get a list that is ranked with a simple explanation of what it’s for and that it’s certified.
If you’re in an area where this isn’t available, does it become a postcode lottery?
This is where we could be heading. It’s not about the libraries; it’s more about what has been procured and commissioned.
What do you think is the next step for the NHS?
I would push for centralised evaluation, continuously monitored, so that everybody has access to a repository of evaluated technologies. Then, I would advocate for workforce training. We have to do it for the current workforce, but we also have to get this into clinical training; otherwise, we’re just making the problem worse for ourselves.
This is about the NHS setting up solid processes to make sure that there’s governance. A website with some products listed is not a safe clinical system. You need the whole infrastructure and an ongoing process to ensure people are accessing good quality, safe products.
Ultimately, who is responsible when things go wrong?
If I recommend an app and you download the wrong product – or you download the product and it has an unintended consequence – is it me as the recommender, or is it the product owner? That’s still a grey area. I’m desperate for us to address it before we have any law in this area.