Putting aside sci-fi projections of the ‘rise of the machines’ and romantic visions of near-humanised droids, here we look at the current and future applications of artificial intelligence (AI) and virtual reality (VR) in surgery, and perhaps the most practical current tech application: augmented reality (AR).
In the UK, the Royal College of Surgeons has launched an independent commission to explore the future of surgery, which aims to set out a ‘compelling and credible vision of the future advances in medicine and technology, and how those developments will affect the delivery of surgical care’. In addition to the topics outlined above, commissioners are considering advances in surgical innovation, including minimally invasive surgery, genomics, 3D printing, and robotics.
We spoke with one of the commissioners on the Future of Surgery panel, plastic surgery registrar and honorary lecturer Dr Nadine Hachach-Haram, who also co-founded the device-agnostic AR platform Proximie, which enables doctors to work virtually with colleagues. Hachach-Haram recently received a British Empire Medal in this year’s Queen’s Birthday Honours list for her innovative work in surgery and medicine. We also spoke with Professor Kaspar Althoefer, Director of Advanced Robotics at Queen Mary University of London, and Peter Wells, Head of Policy at the Open Data Institute.
Tech: The right way
“We don’t want tech that’s just cool and exciting – we want tech that actually has impact here and now and in the future,” says Hachach-Haram, emphasising the importance of patient need driving the tech, rather than the other way round. “We are on the front line and we’ve seen the problems. We want to adopt technology that’s going to make that difference; that’s going to help us deliver optimal care to our patients; a better experience, a better patient journey – that’s what’s important.” More efficient, better tech benefits the surgeon too – streamlining processes and avoiding duplication.
"We wanted to take our inspiration from biology – so octopus tentacles were our inspiration....making something that was made of soft materials, to achieve more dexterous behaviour but also to increase patient safety."
Kaspar Althoefer, Queen Mary University of London
Clinician-replacing robots?
When asked the inevitable ‘will robots replace humans?’, Hachach-Haram states: “Not at all. I believe there are some areas where technology in general can play a bigger role, for example machine learning or precision medicine, as we are seeing a shift towards ‘predictalytics’ and preventative care”. She underscores that, rather than replacing clinicians, the power of tech lies in scaling up and sharing expertise, building a trusted community or ecosystem – by clinicians, for clinicians – that supports shared decision making across multi-disciplinary teams.
“Ultimately these technologies should be adopted to help augment and support the clinical team, to grow and crowdsource the content of a community. It’s about how we share expertise, how we share our knowledge, how we deliver that care to patients across the country and around the world. It’s also about finding new tools to help us deliver treatments more precisely and effectively with less morbidity.”
Responding to the same question, Althoefer makes the point that the current tailored robotics solutions require significant human collaboration, as with the Da Vinci robot – with embedded intelligence complementing the work of the surgeons: “Intuitive Surgical, probably the most successful surgical robotics company to date, is using advanced algorithms to make minimally invasive surgery more intuitive for the surgeon,” he explains. “When traditional laparoscopic surgery is done, the surgeon has to learn to do counter movements, because laparoscopic tools go through this narrow fulcrum point around which everything rotates, and that makes it tremendously difficult for the surgeon to conduct surgical tasks in the abdomen. It takes years to master this skill. And in the Da Vinci system, there’s already some intelligence in there that overcomes that problem.”
He adds: “Another thing that has been implemented in the Da Vinci system is the tremor reduction, and the scaling of motion – so a large movement by the surgeon turns into a very small motion at the tip of the operating system, allowing for more steady and accurate tissue handling and cutting as well as suturing. Again, this is some kind of intelligence that is built in to make things easier for the human user.”
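To make that built-in assistance concrete, here is a minimal sketch, in Python, of how motion scaling and tremor reduction can be expressed in code: the surgeon’s hand motion is low-pass filtered to suppress high-frequency tremor, then scaled down before being sent to the instrument tip. The function names, filter coefficient and scaling factor are illustrative assumptions, not Intuitive Surgical’s implementation.

```python
# Illustrative sketch of motion scaling and tremor reduction in a
# teleoperated surgical robot. All names and constants are hypothetical,
# not taken from any real system.
import random

SCALE = 0.2    # 5:1 motion scaling: a large hand movement becomes a small tip movement
ALPHA = 0.1    # low-pass filter coefficient; smaller values smooth (filter tremor) more

def filter_and_scale(hand_positions):
    """Map raw surgeon hand positions (mm) to instrument-tip commands (mm)."""
    tip_commands = []
    smoothed = hand_positions[0]
    for p in hand_positions:
        # Exponential moving average suppresses high-frequency tremor.
        smoothed = ALPHA * p + (1 - ALPHA) * smoothed
        # Scaling shrinks the movement for steadier tissue handling.
        tip_commands.append(smoothed * SCALE)
    return tip_commands

if __name__ == "__main__":
    # A 10 mm hand sweep with ~0.5 mm of simulated tremor becomes a smooth ~2 mm tip sweep.
    hand = [i + random.uniform(-0.5, 0.5) for i in range(11)]
    print([round(x, 2) for x in filter_and_scale(hand)])
```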
With the caveat that he can’t predict the future, he says that the process of automation will continue and in some cases fully-automated systems may replace human–robot systems in the long run: “I suppose there will be a time when certain tasks will be completely automated, like suturing – maybe the surgeon will only need to highlight the areas that are to be sutured, and the rest will be done by the system. And that will continue, and possibly one day there will be a completely automated system to conduct entire surgical procedures; [however], we are very far from this,” he says.
Althoefer discusses the hype around the ‘threat’ of AI, making the point that these fears should not stifle progress: “There are these predictions that the robots will take over and that AI will be more intelligent than humans, and it is theoretically possible. How should we react to this potential threat? Should we stop now doing anything that is related to the development of intelligent technology? Abandon everything? Stop certain types of research? I think that’s not what we want either. So I think we have to move forward and see the positive things that we can achieve.” He adds: “I also don’t think it will happen in the foreseeable future. It’s pretty far away – what these colleagues talk about.”
When asked if we are overly reliant on technology, he makes the comparison to GPS and satnav technology – that humans indeed rely very much on these systems, but the benefits far outweigh the ‘risks’ of reliance. He adds: “I can only stress the point that we are very much at the beginning – I believe the more technology is developing the more we will be relying on the systems. And it can give us better outcomes. There are certain things humans cannot do as well as technology can do it, so why reject that?”
"Healthcare professionals talk to each other, we collaborate, we share ideas, we share experiences, we share difficult cases, we learn from each other. It’s very much a social way of working and AR and mixed reality promote this very effectively."
Nadine Hachach-Haram, MD, Proximie
Virtual reality, machine learning and artificial intelligence
“We can see the effective use of virtual reality in simulation, in terms of preparation and rehearsal – the challenge here is the cost of scaling as there are tens of thousands of different procedures with nuances in each of them,” says Hachach-Haram. “And while the full range of the possibilities of artificial intelligence has not yet been realised, theoretically, we know the scale and impact will be significant,” she says.
As well as verbal evidence, the commission has received written submissions about the ‘game changing’ nature of VR in areas such as anatomical education, particularly in the ‘virtual dissection table replicating a real cadaver bed’, with the benefits of global availability and bypassing the legal, consent and supply issues.
Outside of training, Hachach-Haram notes that “… VR has very strong uses in areas like paediatric pain management and enhanced recovery,” but stresses the importance of recognising the limitations and benefits of each type of emerging tech, and being able to identify where they can be beneficial but also where they don’t necessarily add value.
Discussing the use cases of AI in more depth, Hachach-Haram says: “If we think about the areas in healthcare that will be the first to benefit from machine learning, I can see radiology being one of those: it is a lot about pattern recognition – it’s very visual.” She describes the work in supporting and accelerating diagnoses, and notes: “There’s work already being done on skin lesions and skin cancer, as well as ophthalmology.”
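As a toy illustration of that pattern-recognition framing – and nothing like a clinical system – a supervised classifier can be trained on labelled images and then asked to score unseen ones. The sketch below uses scikit-learn’s small bundled digits dataset as a stand-in for medical images; real diagnostic models are deep neural networks trained on large, curated clinical datasets.

```python
# Toy image pattern recognition, NOT a clinical model: scikit-learn's
# bundled 8x8 digits dataset stands in for labelled medical images.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

digits = load_digits()  # small greyscale images with ground-truth labels
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.25, random_state=0)

clf = LogisticRegression(max_iter=2000)  # simple baseline classifier
clf.fit(X_train, y_train)                # learn visual patterns from the labelled set
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```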
"If we are to retain trust and make the most of the potential of data then handling it ethically needs to become normal practice, just like medical ethics."
Peter Wells, ODI
The role of data
AI and machine learning rely on high-quality algorithms, but they are only as accurate and effective as the data that feeds them. And with recent data scandals, notably Facebook and Cambridge Analytica, what are the ethical questions around data? “Ultimately, if data is used with the right intent, with the proper consents in place, and the necessary/appropriate compliance, then it can be done, and with the introduction of blockchain this will become more and more important,” says Hachach-Haram. She notes her experience that patients generally say they are happy for personal data to be used, with the right consents and safeguards in place. With consent as one of the lawful bases for processing personal data under the General Data Protection Regulation (GDPR), this is a topical point.
Althoefer describes the importance of data being used in the right way: “Obviously there’s always a downside: people can use things in a ‘bad’ way. I hear in the States [USA] there are some insurers who don’t want to insure people anymore because they can already predict how ill they will get and how quickly they will die, and that is of course a problem if we use data in this kind of way.”
To help address issues around data ethics, an independent non-profit company, the Open Data Institute (ODI), has published the Data Ethics Canvas, a tool for decision-making around the use of data, taking ethical considerations into account. It prompts users to ask questions such as ‘who could be negatively affected by this project?’ and ‘what is your primary purpose for using data in this project?’
Peter Wells, Head of Policy at the ODI, said: “If we are to retain trust and make the most of the potential of data then handling it ethically needs to become normal practice, just like medical ethics. Unfortunately it’s not yet normal practice and the necessary debates haven’t happened at a wide enough scale to know what normal practice would even look like.
“The ODI’s Data Ethics Canvas is intended to help on that journey by helping more people, project teams and organisations understand the questions, have the necessary debates and make better decisions.”
Augmented reality and scaling
A key technology in healthcare, Hachach-Haram argues, is AR. “I find AR really exciting because it’s about taking your natural world, your real world, and superimposing and augmenting with computer-generated digital content to break current boundaries and achieve more,” she says.
“If you think about the future of mixed reality in which you could be in your workspace, in your day-to-day activity, but being fed information and content from various virtual sources: this adds value and is very relevant to helping you deliver more efficient, better quality care,” she says.
Hachach-Haram describes a recent case where an AR system meant that a surgeon in Amsterdam could work with a surgeon in Cardiff on a complex cancer case: “If the technology hadn’t been there, it would have been a struggle,” she says.
Scaling is where AR really shines, says Hachach-Haram, explaining that as well as the immediate benefit of layering virtual content onto real situations, it provides the opportunity to share and replicate expertise. “At Proximie, we believe in scaling expertise and being able to connect clinicians to engage and collaborate. AR and mixed reality are very collaborative, lending themselves very well to healthcare. However, given the global pressures in healthcare, we believe the key is to scale these technologies without having to invest vast amounts in hardware – Proximie allows clinicians to extract the true value from AR technology with their existing hospital infrastructure,” she says.
“This technology is available and accessible now and I believe that the collaborative nature of it is very exciting. Because that’s what healthcare professionals do – we talk to each other, we collaborate, we share ideas, we share experiences, we share difficult cases, we learn from each other. It’s very much a social way of working and AR and mixed reality promote this very effectively.”
STIFF-FLOP project and working across industries
As part of EU-funded project STIFF-FLOP, which ran from 2012 to 2015, Althoefer worked with a team to create a ‘highly dextrous soft robotic arm able to locally control its stiffness from a soft state to a stiff one as required by the task, i.e., for the arm to be compliant with the environment when, for example, advancing towards the surgical site and to stiffen up, for example, when manipulating tissue’.
“The interesting thing is that we wanted to take our inspiration from biology – so octopus tentacles were our inspiration.” He adds: “I think what we developed was great – but to achieve something that performs as well as an octopus – we’re definitely still a long way from that.” Althoefer explains that the idea of the STIFF-FLOP project was to move away from the very traditional type of robot-assisted, minimally invasive surgery, like the Da Vinci robot, and instead focus on “… making something that was made of soft materials, to achieve more dexterous behaviour but also to increase patient safety.”
“There are two main areas that we’re interested in,” he says. “One is robots made from silicone rubber, so very soft, and they have pneumatic chambers integrated which we can inflate, and by doing so we can navigate our robots – bend them in different directions. And the other technique is actually based on textiles – so we’re using fabrics to create chambers and structures, and again these can be inflated, and cables attached to the outside of these inflatable textile structures can be used to move the structures around. The textile type of robot arm is exciting because like its role model, the octopus, it can not only move about but also change its stiffness.”
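A common way to reason about such pneumatically actuated segments is a constant-curvature approximation: with three chambers spaced 120° apart, the differences in chamber pressure determine which way, and how strongly, the segment bends. The sketch below is a hedged illustration of that idea only; the gain constant and chamber layout are assumptions, not parameters from the STIFF-FLOP project.

```python
# Hedged sketch of a constant-curvature model for a soft segment with three
# pneumatic chambers at 120 degrees. The pressure-to-curvature gain is an
# assumed constant, not a measured STIFF-FLOP parameter.
import math

K_BEND = 0.002  # assumed curvature per unit of differential pressure (1/mm per kPa)
CHAMBER_ANGLES = [0.0, 2.0 * math.pi / 3.0, 4.0 * math.pi / 3.0]

def bending_from_pressures(p1, p2, p3):
    """Return (bend direction in radians, curvature in 1/mm) from chamber pressures in kPa."""
    pressures = (p1, p2, p3)
    # Project the chamber pressures onto the segment's cross-section; the
    # segment deflects away from the most inflated chamber, hence the minus signs.
    x = -sum(p * math.cos(a) for p, a in zip(pressures, CHAMBER_ANGLES))
    y = -sum(p * math.sin(a) for p, a in zip(pressures, CHAMBER_ANGLES))
    direction = math.atan2(y, x)           # which way the segment bends
    curvature = K_BEND * math.hypot(x, y)  # how strongly it bends
    return direction, curvature

if __name__ == "__main__":
    # Inflating chamber 1 (at 0 degrees) more than the others bends the segment away from it.
    d, k = bending_from_pressures(40.0, 10.0, 10.0)
    print(f"bend direction {math.degrees(d):.0f} deg, curvature {k:.4f} 1/mm")
```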
“But we were also interested in automating things, and we have moved things forward in that area quite a bit. We’ve developed systems that could learn from the surgeon and then carry out tasks in an automatic fashion.”
“We’re now looking into quite a number of application areas beyond surgery, for example robots that can collaborate with humans in the manufacturing environment. But we’re also looking at extreme environments like the nuclear industry, and trying to develop robots that can collaborate in this environment, with humans, to handle waste decommissioning, for example. So quite a range of applications – it’s not only surgery.”
Will tech create generalists or specialists?
Hachach-Haram notes the trend towards subspecialising in surgery, regardless of tech: “Even within my specialty, you have four or five different subspecialties,” she says. Inevitably, this leads to more niche skills held by fewer people. And this leads back to tech and AR: “So then what happens? – because you’ve become so specialised, because there are so few people doing what you do in the required volume [with the required] expertise, you want to continue to share that expertise globally. So how do we scale you?” she asks.
“We can scale you with technology, we can scale your expertise: we can share/proliferate globally, and help people using those new skills to perform them better, faster and more efficiently.” Hachach-Haram concludes: “This is what technology will do for surgery.”