There are well over 150 000 health apps available in Europe1—from those designed to improve general wellness to apps that monitor medical conditions, apps for clinicians, and apps that function as medical devices. There have been more than 102 billion downloads of health apps worldwide, yet there is little regulation or guidance available for doctors or patients on quality, safety, or efficacy.
What is the problem?
Since the UK government founded the Cochrane Centre in 1992, evidence based medicine has been at the heart of healthcare.2 With the burgeoning apps market, however, things are different.
“There's a huge and growing number of health apps out there, and with that comes a wide variation in quality, testing, and evaluation,” says Sarah Williams, senior health information officer at Cancer Research UK. “As with any new technology there's a lot we still need to understand about whether they can be effective, especially in the long term, and, perhaps more importantly, whether they're helping the people who really need it.”
Technically, under the European Directive on Misleading Advertising, any app that makes efficacy claims must be able to produce evidence to support them. That evidence, however, need not include the high quality randomised controlled trials that clinicians have come to expect.
“People are increasingly using apps to monitor, manage, and even treat conditions but have no information on whether or how the apps have been calibrated, and it's hard to find any information on the research used in development,” explains Patricia Wilkie, chair of the National Association for Patient Participation. “It's a very murky area.”
Risks of uninformed choice
“People choose health apps the same way as they choose any apps—quality of design and how easy they are to use,” explains Satish Misra, cardiology fellow at the Johns Hopkins Hospital in Baltimore and managing editor at app review site iMedicalApps. “That's why many of the top downloaded apps are not evidence based—and some don't even make sense.”
Six months ago Jeremy Wyatt, chair in eHealth Research at the Leeds Institute of Health Sciences and clinical adviser on new technology to the Royal College of Physicians, tested 19 apps on the iTunes store—both free and paid for—that claimed to diagnose the user's risk of a heart attack over the next 10 years. “Given the same data, one app gave a risk of 19%, another gave a risk of 96%, and a third gave a risk of 137%,” he explains. “There was bad coding and poor research in half of the apps tested, with paid for apps performing worse than free apps.”
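The impossible 137% figure points to a basic coding failure: a 10-year risk is a probability and by definition cannot exceed 100%. A minimal sketch (hypothetical; the function names and the logistic transform are illustrative, not the algorithm of any app tested—real tools such as QRISK or Framingham use published coefficients) shows how a bounded model plus a sanity check makes such outputs impossible:

```python
import math

def ten_year_risk(score: float) -> float:
    """Map a linear risk score to a 10-year event probability.

    Illustrative only: the point is that a logistic transform is
    bounded in (0, 1), so an output of 137% is mathematically
    impossible and can only come from a coding error.
    """
    return 1.0 / (1.0 + math.exp(-score))

def as_percentage(score: float) -> float:
    """Return the risk as a percentage, rejecting invalid values."""
    risk = ten_year_risk(score)
    assert 0.0 <= risk <= 1.0, "risk must be a valid probability"
    return round(100.0 * risk, 1)
```

A validation step like the final assertion would have flagged the 137% result before it ever reached a user.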
Yet patients and health professionals are increasingly using medical apps. Studies report that over 85% of health professionals use a smartphone and 30–50% use medical apps in clinical care.3 4 A survey of 233 NHS general surgical trainees working in Scotland5 found 82% had downloaded at least one medical app, 35% had used apps to help make clinical decisions, and 13% thought they had encountered errors. Some 58% thought that apps should be compulsorily regulated but none knew the name of any regulatory body.
What is the current system?
All apps are regulated by the Data Protection Act and the European Directive on Misleading Advertising. In addition, the European Medical Device Directive considers apps used in “diagnosis, prevention, monitoring, treatment, or alleviation of disease, injury or handicap as well as investigating, replacing or modifying the anatomy or a physiological process or controlling conception” to be medical devices.6 These are regulated by the Medicines and Healthcare Products Regulatory Agency (MHRA) and have to undergo a conformity assessment by MHRA notified bodies to secure a CE certificate.
The Mersey Burns App (figure 1), which calculates how much fluid a burns patient needs, was the first to win a CE mark. It was accredited after clinical trials showed it was more accurate and quicker at calculating the result than doctors working out the figure by hand. It is hard to get a clear picture of how many apps have a CE mark: an MHRA spokesman said that the authority kept no register or list of CE marked apps, leaving that to the individual notified bodies (there are 15 in the United Kingdom alone).
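The calculation at the heart of such an app can be sketched with the Parkland formula, the standard estimate of crystalloid fluid needed in the first 24 hours after a burn (a simplification for illustration; the function name is mine, and a clinical tool such as Mersey Burns also handles paediatric dosing, maintenance fluids, and time since injury, none of which is modelled here):

```python
def parkland_fluid_ml(weight_kg: float, tbsa_percent: float) -> tuple[float, float]:
    """Parkland formula: total crystalloid (mL) for the first 24 hours
    after a burn, with half of it given in the first 8 hours.

    tbsa_percent is the percentage of total body surface area burned.
    """
    total = 4.0 * weight_kg * tbsa_percent  # 4 mL x weight (kg) x %TBSA
    first_8_hours = total / 2.0
    return total, first_8_hours
```

For a 70 kg patient with 30% burns this gives 8400 mL over 24 hours, 4200 mL of it in the first 8 hours—exactly the kind of arithmetic that is quicker and less error prone in software than by hand, provided the software itself has been validated.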
The definition of a medical device is “a very grey area,” according to Charles Lowe, president of the telemedicine and ehealth section at the Royal Society of Medicine. “I was at a meeting of the London Health Technology Forum last November with two representatives of the MHRA. They were shown examples of telehealth and asked which was a medical device but they couldn't agree.”
Some apps—such as uChek, a smartphone app that sends photographs of urine test strips to clinicians—are effectively in vitro medical devices but are not currently classified as such, Julian Hitchcock, counsel at Denoon Legal, a law firm specialising in medical technology, argues.
In the United States the Food and Drug Administration regulates apps classified as medical devices with a very light touch.7 Only 100 apps have so far been classified as a medical device and none has been banned for safety or efficacy reasons. However, the Federal Trade Commission has taken action against apps that make misleading claims, including an acne cure app that could provide no evidence of its claim that the iPhone screen backlight could reduce acne.8
What if an app definitely isn't a medical device?
“If it isn't a medical device there is no reliable framework or standard beyond the NHS Choices library guidelines, which were created two years ago and haven't been reviewed since,” warns Maureen Baker, chair of the Royal College of General Practitioners. “We're trying to apply a regulatory framework that was created in the 1960s and 1970s—it's not fit for purpose 50 years on.”
NHS England has so far taken a relatively informal approach. The NHS Choices Health Apps Library—launched in March 2013—lists apps found to be clinically safe, compliant with the Data Protection Act, and relevant to people living in England.9
In May, the UK standards body the British Standards Institution published a voluntary best practice framework for developers with advice from a steering group including BUPA, NHS South West Academic Health Science Network, app developers, notified bodies, the Royal College of Physicians, and medical device manufacturers.10 There are also unofficial endorsement systems such as myhealthapp.net, which focuses on user recommendations and reviews.
There is disagreement over how heavily health apps need to be regulated—and uncertainty about what is practically feasible. But in April, the Royal College of Physicians advised its members not to use any apps that didn't have a CE mark—not least, it advised, because “if it is missing, then you are leaving yourself open to . . . possible litigation.” Wyatt, however, stops short of calling for National Institute of Health and Care Excellence (NICE) or MHRA regulation of every app.
Moves towards better information
Major efforts are afoot to help doctors and patients decide which apps to choose. Launched in March 2015, the Mental Health Apps Library—hosted by the NHS Choices Apps Library—lists apps that offer NICE approved evidence based treatments for mild to moderate depression and anxiety disorders. The apps are selected, according to NHS England, because there is “a strong evidence base of digital tools being effective in helping sufferers of mild to moderate depression and anxiety to manage their conditions.” This is the first stage of Personalised Health and Care 2020, the government framework for digitisation of the health service,11 which includes a library of endorsed apps and digital tools. Although inclusion will still be a recommendation, the endorsement process is intended to be much tougher than current systems.
This month the National Information Board (NIB), the body overseeing the roll-out of technology across the NHS, begins consulting clinicians and patient groups on proposals for a stronger endorsement process. The proposals suggest assessment in four stages: self assessment; community endorsement through crowdsourced feedback from professionals and the public; and, at the third and fourth stages, robust independent assessment, possibly involving NICE, to confirm the effectiveness of apps.12
The NIB foresees some 10 000 apps entering stage 1 each year, with roughly 2000 invited to move into stage 2. Of these, only around 100 are expected to move into stage 3 with roughly 10 reaching stage 4.
It is not clear whether this process will have any legal powers. “The only genuine protection consumers and patients have is via the MHRA, data protection, or mis-selling legislation,” explains Hitchcock. “Mis-selling rules, however, are powerful. They can force developers to prove apps fulfil their claims, and with healthcare that could effectively mean [they have to produce] NICE standard evidence.”
But Baker is concerned: “Until we have case law in this area, who knows about any of this? And there has been no case law to date.”
This autumn, the European Commission will begin negotiating proposed reforms to existing directives on medical devices, in vitro diagnostic devices,13 and use of mobile health.14 The proposals, which include apps, point to stronger supervision of independent assessment bodies by national agencies and the EU and give those assessment bodies more powers to ensure thorough testing and regular checks, impose stricter requirements for clinical evidence, update risk classification rules for medical devices, and establish better coordination between regulators. Adoption of the new rules is expected by the end of this year or at the beginning of 2016.