
Observations: Medicine and the Media

How do we know whether medical apps work?

BMJ 2013; 346 doi: https://doi.org/10.1136/bmj.f1811 (Published 20 March 2013) Cite this as: BMJ 2013;346:f1811

This article has a correction.

Margaret McCartney, general practitioner, Glasgow
margaret@margaretmccartney.com

Smartphone apps have the potential to transform the way the public manage their health and interact with health services, says Margaret McCartney, but regulation of medical apps has only just started

Angry Birds, Cut the Rope, and Fruit Ninja are favourite games among smartphone owners, but many apps are for function rather than fun: maps, shopping lists, and a host of medical apps that claim to offer us ways to better health.

Some are aimed at healthcare professionals but are available to all. The National Institute for Health and Clinical Excellence (NICE), the Scottish Intercollegiate Guidelines Network, and the British National Formulary have free apps allowing easy and rapid access to their advice. But other apps do not just reproduce advice available elsewhere. In January the UK Medicines and Healthcare Products Regulatory Agency (MHRA) approved its first app: Mersey Burns is a free tool that calculates burn area percentages and fluid requirements.
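
Burns fluid calculations of this kind are usually based on standard formulas; the best known is the Parkland formula, under which the fluid requirement for the first 24 hours is 4 ml multiplied by body weight in kilograms and by the percentage of total body surface area burned, with half given in the first eight hours. As a rough illustration only, and not a description of how Mersey Burns itself is implemented, a minimal sketch in Python might look like this:

```python
def parkland_fluid_ml(weight_kg: float, tbsa_burned_pct: float) -> dict:
    """Illustrative Parkland-style estimate of 24 hour fluid requirement.

    total (ml) = 4 ml x weight (kg) x % total body surface area burned;
    half is given over the first 8 hours, the rest over the next 16.
    """
    total = 4.0 * weight_kg * tbsa_burned_pct
    return {
        "total_24h_ml": total,
        "first_8h_ml": total / 2,
        "next_16h_ml": total / 2,
    }

# Example: a 70 kg adult with 20% of body surface area burned
print(parkland_fluid_ml(70, 20))  # total 5600 ml, 2800 ml in the first 8 hours
```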

Other medical apps are aimed at the public. Many advise on diet and exercise, and these vary widely in quality,1 2 but newer apps purport to help diagnosis. The NHS Healthcare Innovation Expo this month featured an app from Skin Analytics that offers to track changes in skin moles to “raise early warning signs” by comparison with an online database.3 Its website says that, for £30 a year for an individual or £50 for a family, the app can “baseline you and your family” using “patent pending technology” that can “detect small changes in both the geometrical structure and colour composition of your moles with an exceptional 95% accuracy.”4

A recent study in JAMA Dermatology showed that most previously marketed apps had a failure rate in melanoma diagnosis of about 30%.5 Julian Hall, director of Skin Analytics, said that this app, which is not yet available to buy, was not a diagnostic service but was instead “trying to implement the self examination advice from public health bodies and answer the question, ‘Has the lesion changed or not changed?’—prompting people to see their GP or dermatologist.” Clinical trial data on the app are lacking, but Hall says a trial is planned for later this year.

Yet the question of evidence is crucial. Do apps simply gather more data, or misleading data, with little useful signal?

Several apps offer to check pulse rate using a phone’s camera light. One app claims 25 million users after promotion in the United States,6 with the ability to record serial pulse rates, but it is not clear what advantage this offers over manual pulse measurement, should this be desired. It is also possible to buy a small plug-in device that turns your phone into a pulse oximeter, although this is described as “not for medical use” and is marketed as useful for mountain climbers or private pilots and retails at about $250 (£165, €190).7 Some free apps offer “health checks” that are really just adverts for cosmetic surgery.
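
Camera-based pulse apps generally rely on photoplethysmography: the flash illuminates the fingertip and the heart rate is inferred from small periodic fluctuations in the brightness of successive video frames. The sketch below is a simplified illustration of that idea, assuming a hypothetical pre-extracted list of per-frame brightness values and a known frame rate; real apps add filtering and error handling.

```python
import math


def estimate_pulse_bpm(brightness: list[float], fps: float) -> float:
    """Rough pulse estimate from per-frame fingertip brightness values.

    Each heartbeat briefly changes blood volume in the fingertip, altering
    how much light reaches the camera; counting peaks in the brightness
    signal and converting peaks per second to beats per minute gives a
    crude heart rate estimate.
    """
    # Centre the signal on its mean so peaks stand out above zero
    mean = sum(brightness) / len(brightness)
    centred = [b - mean for b in brightness]

    # Count local maxima above zero as candidate heartbeats
    peaks = sum(
        1
        for i in range(1, len(centred) - 1)
        if centred[i] > 0
        and centred[i] > centred[i - 1]
        and centred[i] >= centred[i + 1]
    )

    duration_s = len(brightness) / fps
    return 60.0 * peaks / duration_s


# Example: a synthetic 10 s recording at 30 frames/s with a 72 beats/min pulse
fps = 30.0
signal = [math.sin(2 * math.pi * 1.2 * (i / fps)) for i in range(int(10 * fps))]
print(round(estimate_pulse_bpm(signal, fps)))  # ~72
```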

Specsavers, which the BMJ recently reported had been advertising for contracted NHS services,8 offers a free app described as a “sight check.” Users cover an eye, and test their visual acuity with images on the phone. (Despite having had a recent prescription, I was still “strongly recommended” to speak to my optometrist.)

The interactivity that apps provide, responding to the information that users enter, makes them distinct from books or leaflets, and their handheld nature and capacity to record data give them a reach beyond that of websites. This can widen the potential for unintended outcomes. The NHS Commissioning Board last week launched a “library of NHS-reviewed phone apps to keep people healthy” because it is “committed to improving outcomes for patients through the use of technology.” More than 70 apps have been approved in a review that includes a “clinical assurance team,” to ensure that they “comply with trusted sources of information, such as NHS Choices,” with assessment of the potential to “cause harm to a person’s health or condition.”9

However, a high standard of evidence should surely be crucial in a product approved by the NHS. For example, the charity Beat Ovarian Cancer offers a “symptom tracker,” which “helps women recognise the signs and symptoms of ovarian cancer,” but, without real world trials to show effects and quantify harms, we do not know whether this is beneficial. The NHS Commissioning Board said that, through its review process, it is “ensuring that the apps listed in the Library are clinically safe and suitable for people who are living in the UK,” and that apps “have been checked by the NHS and adhere to NHS safety standards.” Yet these apps could be tested in real life situations for evidence that they are beneficial and free of unintended harms. Why not?

Another NHS recommended app is iBreastcheck, which can be set to remind women to check their breasts weekly, fortnightly, or monthly. The app includes videos of women examining themselves and a link to donate to the charity Breakthrough Breast Cancer, which devised it. It would be possible to trial this app to find evidence of benefit and harm in the same way that other trials have investigated breast self examination,10 but this has not been done. Breakthrough Breast Cancer said that the content was reviewed by a panel of experts and that it was “not a breast self examination app. It is a breast awareness app.” It said that it was created because “women want more practical information on what to check for and also a gentle nudge to remind them to check regularly.”

The US Food and Drug Administration published draft guidance for medical apps in 2011.11 Straightforward information or recording devices are not subject to its guidance—as long as these apps do not offer to diagnose, treat, or cure a condition. Instead, it suggested that its oversight should apply to apps that, for example, turn a smartphone into a stethoscope, or that offer risk assessments of disease or diagnosis based on information entered.

In the United Kingdom apps that are “medical devices” must be registered with the MHRA. It has the power to withdraw products from the market, but what constitutes a medical device is a grey area—for example, the agency said that an app that charted changes in skin moles would not be a device, whereas one that offered diagnosis would be.

But registration with the MHRA does not imply efficacy. Approval is instead granted by Europe-wide “notified bodies,” which can award the CE mark if their standards are met. These are “principally trade measures designed to remove technical barriers to trade,”12 and for apps they do not insist on evidence of better outcomes, such as from randomised controlled trials.

It would be a great pity if apps that have solid evidence behind them, such as decision making aids (several of which are approved by the NHS app library), became confused with ones that do not. Apps are likely to be a new source of information that enables patients to interact with the NHS in a different way. We need to ensure that these are safe, useful, and effective. If they work we should use them; but, as with any medical intervention, they need fair tests in the real world before we can know.


Footnotes

  • Competing interests: I have read and understood the BMJ Group policy on declaration of interests and have no relevant interests to declare.

  • Provenance and peer review: Commissioned; not externally peer reviewed.

References
