July 17, 2019

Alexa, will you be my next doctor?

Much has been said this week about the announcement by Matt Hancock (the Secretary of State for Health & Social Care) regarding the new links between Alexa and the NHS, with many calling the move outrageous and an attempt at replacing doctors with technology.

On the other hand, Mr Hancock has said the following:

“We want to empower every patient to take better control of their healthcare and technology like this is a great example of how people can access reliable, world-leading NHS advice from the comfort of their home, reducing the pressure on our hardworking GPs and Pharmacists”.

So what exactly does the new partnership consist of? And, more importantly, does it represent a shift in how medical advice is provided?

We live in a world of information, where any query or doubt can be answered with a quick click of the fingers. This is true for everything, including our health; many of us, if not all, have at some point googled an aspect of our health or symptoms we were experiencing. For instance, ‘how long does a cold last?’ or ‘what can help with constipation?’.

Search engines can be incredibly useful for quite mundane questions; however, one needs to be vigilant, as not all information is created equal. When googling a medical issue yourself, you would normally attach far more weight and trust to official NHS advice than to a random blog.

Essentially, that is what the new initiative will do: when you ask Alexa for medical advice, she will give you the official NHS advice rather than any other information found online.

Looked at that way, it is difficult to see the downsides: if you are not the kind of person who would ask Alexa for medical advice, nothing changes. If, on the other hand, you are, the initiative ensures that you get the best information available.

This will be particularly useful for those whose disabilities prevent them from sitting in front of a computer or using a mobile phone, and for those who are not technologically savvy, although, arguably, the latter are unlikely to own a smart speaker if they do not know how to use a computer.

What is probably more worrying is Mr Hancock’s suggestion that this initiative will help reduce the pressure on NHS doctors and nurses. It is not obvious how this could be the case in the circumstances, making many wonder whether the links between Amazon and the NHS will grow significantly over the years.

What are the issues here?

There are two main issues which could arise: one relating to the data Alexa will be able to store regarding your health, and the other to the fact that, while artificial intelligence will most likely revolutionise medical care, most of these new technologies are designed and owned by private limited companies.

Where does the data go?

The main issue is that, by default, Alexa stores all of the information fed to her. This means that, in theory, Amazon, as owner of the Alexa smart speaker, would be able to access all of your searches and sell them to the highest bidder.

In theory, that is. Under the GDPR, you have control over what information is stored about you and always retain the right to have it permanently deleted, should you wish.

Further, in the UK, where the NHS provides healthcare free at the point of access, who would want that data anyway?

The announcement would have significantly bigger implications if we lived in a country with a private health insurance system, such as the USA. Should a similar system ever be introduced in the UK, the announcement would be significantly more worrying, as it would mean that an insurance company could ask Amazon for records of your searches and use them to void your policy.

For example, let’s say that you find a mole on your back and you are somewhat worried about it, but not so much that you seek medical advice. You ask Alexa, ‘what are the symptoms of skin cancer?’. Let’s assume that you do not immediately meet the criteria and you are satisfied that your mole is not cancerous. Following this, you take out insurance and tell the insurer about your pre-existing conditions, but do not mention the mole. As far as you are aware, the mole is just that, a mole. Years pass and it turns out that the mole was indeed cancerous and you need very expensive medical care. Your insurer (if the American system were copied) would probably be allowed to void your policy on the grounds that you did not disclose all pre-existing conditions.

However, in the current circumstances in the UK, these potential issues seem very unlikely to materialise.

Is A.I. going to replace medical practitioners?

Probably more applicable to the current state of affairs in the UK is the criticism that Mr Hancock wants to cheapen the care provided by the NHS by using artificial intelligence such as Alexa to replace doctors and nurses. The reality is that the Alexa initiative does not threaten the role of medical staff (at least not from what has been reported in the media), but significantly more concerning announcements have been made which have not been as widely publicised.

For example, back in May The Guardian reported on University Hospitals Birmingham NHS Foundation Trust’s development of a revolutionary artificial intelligence triage tool. The idea is that people will be able to download an app created by Babylon Health, input their symptoms, and be told what type of medical attention they need, if any. Now, this is a lot more problematic in terms of accountability when things go wrong.

The tool will be part of the NHS, so to some extent the NHS will be responsible. However, the NHS is contracting services from a private company which created the algorithm behind the application, so to some extent that company too would be responsible. The tool also requires the patient to be able to describe their symptoms coherently, which, as most doctors will tell you, is a rarity.

So, if the app decides, based on your responses, that you do not need urgent medical attention, but it later transpires that you did and you suffer severe injury or death as a result, who is at fault? The NHS, as they are under a duty to provide a service that is fit for purpose? Babylon Health, for creating an application that resulted in you not seeking medical advice, thereby causing you injury? Or the person answering the queries, for not answering them ‘correctly’? If things go right 100% of the time, which is unlikely to say the least, then it could be a great development; but, when things inevitably go wrong, it will become a liability nightmare, with the different parties blaming each other.

Indeed, one only has to look at the terms and conditions on Babylon’s own website to see that even Babylon does not consider its triage tool fit for the purpose for which the NHS plans to use it:

“4. Except for our Clinical Services where we provide medical advice, all our other services, such as our symptom checker and triage tool, provide healthcare information and not medical advice, diagnosis and/or treatment. They provide information to you based on information entered. They do not diagnose your own health condition or make treatment recommendations for you. Our information services are not a substitute for a doctor or other healthcare professional. You shouldn't take or stop taking any action (such as taking medicines) based on information from our information services. We make no warranties in relation to the output of our information services.”

But even the idea of creating applications, or hotlines staffed by non-medical personnel, to tell you what type of help you need is not that novel or different from the current system. Think of the 111 service, or the online symptom checker; they provide a very similar service to that of the triage tool. The main difference, however, is that 111 is part of the NHS and therefore liability is not so complex; the NHS has to make guarantees about the services it provides.

Further, in terms of data, the triage tool raises the same issues as Alexa. Babylon Health is a private limited company, very much like Amazon, and so the risks around controlling what the data is used for are very similar. Indeed, ‘GP at Hand’ (a virtual GP practice you can sign up to, also run by Babylon) requires you to provide significantly more information than you would ever give Alexa when asking a question about your health.


All in all, Alexa providing official NHS advice is not that revolutionary or worrying; however, many of the issues that have been raised as a result of the announcement very much are.

The implications of the introduction of diagnostic A.I. into our healthcare system will not be known for quite some time. It may be that in a few years we see a drastic surge in clinical negligence claims for A.I. misdiagnosis. Or we might not see them at all. We might find that the NHS becomes liable for the mistakes of A.I. diagnostic tools it did not create, or it may be that the companies which designed them take responsibility. The simple yet frustrating answer is that we just don’t know yet.

What we do know, however, is that the medical research carried out by Babylon (at least up to November 2018) is not conclusive, even if at times the data is presented to make it seem as though it is. Scientific research into the safety of a product requires expensive, tedious and repetitive work. It requires careful thought and statistical analysis showing a clear pattern in a very large sample. Unfortunately, according to the review paper “Safety of patient-facing digital symptom checkers”, published in The Lancet less than a year ago, this does not appear to have taken place when testing the capabilities of the triage tool designed by Babylon.

Will these A.I. diagnostic tools become the next ‘Mesh Scandal’ due to the lack of appropriate research into their potential safety concerns? Will there be a surge of clinical negligence claims as a result of their increased use? Will they trigger a liability war between the patient, the NHS and the company that designed them? Only time will tell…