Knowledge Institute Podcasts
AI in Patient Care and Clinical Decision Making with Roche UK’s Dr. Rebecca Pope
September 27, 2024
Insights
- Artificial intelligence is revolutionizing the NHS by improving diagnostic accuracy, streamlining operations, and personalizing patient care, demonstrating its significant potential to enhance healthcare delivery.
- Ensuring AI technologies are grounded in rigorous evidence-based science is essential for their successful integration into healthcare. This validation process helps maintain high standards and ensures that AI solutions are both effective and dependable.
Kate Bevan: Hello and welcome to this episode of Infosys' Podcast on being AI-first, the AI Interrogator. I am Kate Bevan of the Infosys Knowledge Institute, and my guest today is Dr. Rebecca Pope. Rebecca is Roche's digital and data science innovation lead. However, Rebecca described herself to me as having staggered into AI, which doesn't really do her and her distinguished career justice.
Rebecca, what's the road that's led you to where you are today? Because I know it's very varied.
Dr. Rebecca Pope: 100% it is, Kate. I still wouldn't describe it as a road, but rather a rockery with wonderful barriers in the way.
Yes, I'm fundamentally fascinated by how the human brain works, and by the sadness and consequences when it is damaged in some way, maybe by neurodegenerative disease or as a consequence of an acute injury, maybe a road traffic accident or a sudden growth of a tumor. That led me to psychology, which in turn led me to a PhD in clinical neuroscience. I had the privilege of working with families going through different types of brain surgery for different conditions, particularly the epilepsies, where the affected area of the brain is removed. My PhD was about understanding what happens as a consequence of that surgery from a cognitive and psychiatric perspective. Can we find structural biomarkers ahead of time, before surgery, to help us understand brain function, ultimately so that those patients and families can be counseled?
Then I did a few post-docs, one in medicinal cannabis. And then I got to this crossroads in my career of going, do I carry on with what I'm doing, or do I do something different? I decided to do something different. I had this wonderful opportunity to join a health think tank, The Health Foundation, where I learned all about NHS data. I live in the UK, and we are very privileged to have the National Health Service.
And then this place called IBM were recruiting for this thing called a data scientist. I remember saying to you, "Well, I work with data, and I identify as a scientist, and if you put the two things together, how hard could it be?"
Kate Bevan: You are a data scientist.
Dr. Rebecca Pope: It turns out it was actually quite difficult.
But I was really, really fortunate to get a role at IBM as a data scientist, where they actually trained people up irrespective of their backgrounds. What they were looking for were people who were curious and wanted to solve real-world problems using this machine learning type of approach. I was there for a number of years, and then I went into technical consulting at what's termed one of the big four, where I led their AI practice with a particular focus on public health.
And then a friend of mine who had left that consulting organization said, "There's this place called Roche and they're looking for somebody in the data and AI space. Are you interested?" That's how I ended up where I am. It was in no way planned, and I'm absolutely loving the ride.
Kate Bevan: What does your job entail? What does the digital and data science innovation lead do at Roche?
Dr. Rebecca Pope: I think it depends where you sit in the pharma value chain. What I mean by that is that I work in the UK, so primarily my focus is thinking about UK patients, the NHS, and our regulator, the MHRA.
What I do is work with that ecosystem of partners, whether that be the NHS or the MHRA or the charity sector or universities. Fundamentally, how do we move a thing through the software development life cycle, whether that be an algorithm or a digital health technology? Because we all know we can't do it alone, but how do we collaborate and partner so that patients...
I often think we need to just take a pause and consider what we mean by patients. We mean you and me. I'm already a patient in the NHS. We're all going to be patients at some point, or know somebody who is. How do we really drive impact in partnership with that ecosystem?
Kate Bevan: I think one of the things we talked about here was taking these really promising ideas further, because we've seen so many promising pilot schemes and ideas coming through in the NHS. I'm thinking of the scheme to read mammograms, for example, which picked up all kinds of potential cancers in women. How do we get that beyond the proof of concept or the pilot scheme to actually being part of everyday healthcare?
Because, as you say, we are all patients. I'm getting closer to the age where I'm probably going to have more involvement with the NHS on a regular basis, so I have a stake in knowing what the technologies are and where they're going to be used with me.
Dr. Rebecca Pope: I think if we just take a step back and think about how you get anything into everyday healthcare, it is ultimately through responsible and evidence-based science, and AI is no different.
I've often seen the attitude of: we have a healthcare problem here, so let's take this AI technology and apply it. It's a square-peg, round-hole mentality that doesn't really think about the local context in which that technology is being deployed.
But again, there are so many steps prior to deployment that are so very important. If I just put that into a number for you: in 2019 there was a piece of research that asked, of the studies applying this kind of technology to scans of the eye, how many meet a threshold the community would accept as performing the task as well as a clinician? Fewer than 1% of the almost 32,000 papers could evidence that they had done what was needed to show the work could move through that software development life cycle.
I think we have a real responsibility, even more so in healthcare, because, and I don't mean this in a derogatory way, if I and my colleagues build a model and it sends 1,000 extra flyers because of a marketing campaign, that's nowhere near as impactful or devastating as building models that are deployed and miss a cancerous growth. The end impact is significantly higher.
Your question was about why aren't we getting there? That's a question we're trying to tackle at Roche UK, and we're doing that in partnership, for example, with Great Ormond Street Hospital.
Kate Bevan: Great Ormond Street is the leading children's hospital in the UK and I think probably in Europe and around the world.
Dr. Rebecca Pope: Yeah, that's correct. They're also a very digitally mature hospital, which is another reason we chose them. One of the problems we see is that there is a lot of hubris around NHS data as a data asset. If you could unlock it, yes, it would drive precision medicine and operational and productivity gains, but it's not machine learning ready.
Essentially, artificially intelligent systems are systems where machines learn from data rather than being pre-programmed; the rules of the game are not hard-coded or crafted into the software program. That means the input to these systems is data, because that's what they're learning from. It's almost like having a parent who calls a cat a dog: when you see a cat, you are always going to think it's a dog, right? Because you've learned from the wrong input. If the data is not collected, curated, labeled, and annotated in a way that can teach a machine to learn from it, we get a real problem with data quality. It's that classic garbage in, garbage out.
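To make that garbage in, garbage out point concrete, here is a minimal sketch, assuming an entirely hypothetical dataset and a simple off-the-shelf classifier, of how the same model degrades as a growing share of its training labels are flipped, the "cat labeled as dog" problem in miniature.

```python
# Illustrative sketch only: the dataset, model choice, and noise rates are
# hypothetical. The same model, trained on increasingly mislabeled data,
# performs worse even though the algorithm itself never changes.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for noise_rate in [0.0, 0.1, 0.3]:
    rng = np.random.default_rng(0)
    y_noisy = y_train.copy()
    flip = rng.random(len(y_noisy)) < noise_rate  # mislabel a fraction of examples
    y_noisy[flip] = 1 - y_noisy[flip]

    model = LogisticRegression(max_iter=1000).fit(X_train, y_noisy)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"label noise {noise_rate:.0%}: test accuracy {acc:.3f}")
```

Nothing about the algorithm changes between runs; only the quality of the labels it learns from.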
And so, we wanted to learn, and we are on a journey of learning with Great Ormond Street Hospital. They have collected data for the purposes of hopefully developing machine learning algorithms to improve patient care, to optimize hospital workflows, et cetera.
And so, what we wanted to learn in partnership is: what are those barriers? Because I think most people would say that the NHS and pharma working together is a positive thing. We've seen that through the pandemic; we see it in clinical trials, et cetera. It's a little bit like world peace: everyone agrees with it, but it's operationally really hard to realize. We are working with Great Ormond Street Hospital to ask: what are those barriers to deployment? Why can't you take a proof of concept of a thing, let's say an AI algorithm, and just put it into clinic? How do you go from code to clinic?
Some of the things we are finding are things we already know as a community. Data isn't collected for machine learning in healthcare. It's there, and collected, so that you hopefully get optimal treatment along your pathway.
But the other piece, which I think is an often-overlooked problem, is that the workforce, technical skills, and capabilities aren't infused into the system to take a proof of concept further. Take your earlier point: there's an image, from an MRI machine, let's say, and I'd like to understand whether that image has any cancerous pixels. How can you take that along into clinic? Who is responsible for making sure that data doesn't have bias in it, or for what happens if that data or that environment changes in some way?
I often say this to people when I have the opportunity to talk to them at a conference. Put your hand up if you know a chief medical officer in the NHS. Lots of people put their hand up. Now put your hand up if you know a chief algorithmic officer. No one puts their hand up, because whose job is it to make sure that the governance, the processes, and the regulatory frameworks are being applied, and that post-deployment monitoring, surveillance, and safety signals are being tracked? If it's nobody's job, and the workforce isn't there, and we are not infusing these skills into the system in a way that is financially sustainable for that healthcare system, then we aren't going to realize the ambition of precision medicine.
Kate Bevan: Yeah, we talk a lot in our content at Infosys and the Knowledge Institute particularly about the real importance of ethical AI, of responsible AI, of having all those procedures in place. As you say, if you don't have those, you cannot be AI first. You cannot bring these AI tools in.
Dr. Rebecca Pope: Yeah, absolutely. There's a wonderful unit at Great Ormond Street called DRIVE, which had been going for many years prior to our partnership and houses an MDT, a multidisciplinary team, of data engineers, data scientists, and ML engineers.
You may make something, and I've done this in my career: made a wonderful algorithm that works on retrospective data and has a wonderful evaluation performance metric, but that was on data from the past. What about prospectively? What happens when you put it into deployment, and the algorithm starts to see things it never saw in the retrospective data?
A linked point is: how do you ensure that you pick up discriminatory signals that are being systemically amplified by these systems because they have never seen that data before?
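One common way teams watch for that kind of shift, offered here as an illustrative sketch rather than anything Roche or Great Ormond Street actually runs, is to compare the distribution of each input feature in live data against the retrospective data the model was trained on. The feature values, threshold, and alert wording below are all hypothetical.

```python
# Illustrative sketch: flag distribution drift between retrospective training
# data and the prospective data a deployed model starts to see.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
# Hypothetical feature observed at training time vs. in deployment.
training_values = rng.normal(loc=0.0, scale=1.0, size=5000)
live_values = rng.normal(loc=0.4, scale=1.2, size=500)  # shifted population

statistic, p_value = ks_2samp(training_values, live_values)
if p_value < 0.01:
    print(f"Drift alert: KS statistic {statistic:.3f} (p={p_value:.1e}) "
          "- review the model before trusting its outputs.")
else:
    print("No significant drift detected for this feature.")
```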
Kate Bevan: Where does AI bring value, and where doesn't it? Where should we be focusing, particularly in healthcare, and particularly for the patients at Great Ormond Street, some of the most vulnerable of all? How do we make sure that we are using AI in the best way for them?
Dr. Rebecca Pope: There are a few things we're working on through that partnership, and one is around operational analytics. There are what are termed back-office activities. For example, are there tests or orders that ultimately culminate in potential over-testing and higher spend for the hospital, without actually providing anything useful from a data perspective to augment human decision-making in looking after that patient? So there are those back-end things.
There's also the administrative side of running a hospital and understanding patient flow, and so that's another area that we're looking into.
Also, of course, all of this should be about the public, patients, and their families, whether at Great Ormond Street Hospital or another hospital, or you and me, right? How do we understand and use certain tools, maybe natural language processing, which is a form of artificial intelligence, to understand patient feedback that comes to the hospital in what's called free-text form, not in a structured way? Are there signals in there suggesting there could be better care for patients and better quality for them during their often long pathway at somewhere like Great Ormond Street Hospital?
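As a flavor of what mining free-text feedback could look like, here is a minimal sketch of a text classifier that sorts comments into themes. It is illustrative only, not the hospital's system; the feedback strings, labels, and model choice are invented, and a real service would need far more data, validation, and clinical oversight.

```python
# Illustrative sketch: surface signals in free-text patient feedback with a
# simple text classifier. All feedback strings and labels are invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

feedback = [
    "Waited four hours before anyone spoke to us",
    "The nurses explained every step clearly and kindly",
    "No one told us when the results would arrive",
    "The play specialist made my daughter feel at ease",
]
labels = ["concern", "praise", "concern", "praise"]  # hypothetical annotations

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(feedback, labels)

new_comment = "We were never updated about the delay to the scan"
print(model.predict([new_comment])[0])  # flags the comment for follow-up
```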
Ultimately, we want to be able to develop what's called a clinical informatics service. We want to augment human intelligence so that if a doctor is sitting in clinic and sees a new patient, they can literally say, "Well, given your parameters," so maybe you have this rash, or you have a family history of X, or you have a genomic test that tells us this, "what does the future look like for you? What do some of the optimal treatments look like for you?"
Now, in no way is the AI making that decision. I want to be very clear about that. But rather it's an algorithmic stethoscope that enables a clinician to use data at their fingertips to inform their decision. I think that that's what we're trying to do.
Kate Bevan: Well, that leads to the next question, really, which is about trust. You're building trust with the clinicians. How do you also build trust with the patients and with their parents?
Dr. Rebecca Pope: It's about transparency.
When we first started out on this journey, I think there were understandable reservations about, actually, why does pharma want to do this? What's the end goal here? What are you really about?
I think we've worked extremely hard in collaboration with Great Ormond Street to showcase that this is about bringing in skills from pharma: people who are data scientists but don't yet understand NHS data. You also have data scientists, data engineers, and others on your team. How do we blend that workforce? Part of that has been Great Ormond Street being really open and transparent, as we have been, about what we are trying to achieve here.
Also, what is the level of data that we analyze, for example? To be very clear, myself and the other Roche colleagues who have the privilege of working at Great Ormond Street Hospital are on honorary GOSH contracts, for example. We can only see and process anonymized information. That means I can never know who the patient is. For example, I have type one diabetes. If I see type one diabetes in the data, I will never know that it's Rebecca Pope.
Another thing is around IP, and I think this is a really important piece when you have a private and a public organization coming together. It was really clear to me when we were putting this partnership together that the intellectual property, the tools and all the learnings, is owned by the NHS, for the NHS. When people hear that, and when we articulated it very clearly, I think it engendered trust, because the NHS is trusted. It's trusted with our most valuable thing, which is our life.
Kate Bevan: I am going to ask you my final question now, which is a bit of a pivot from that, which is, do you think AI's going to kill us all?
Dr. Rebecca Pope: I think it will if we just let people operate in a Wild West in healthcare.
I also think, paradoxically, that it may do if we don't have very serious conversations about how we engage the public and help them understand that some of this data, if there are fair principles and it's handled safely and securely, could potentially, and we don't know yet because we haven't got the evidence, enable a much earlier diagnosis of, let's say, cancer.
And so I think the answer is both yes and no; it's a genuine AI Schrödinger's cat, right? It could or it couldn't. I think the parameters under which that could happen depend, one, on a regulatory pathway, and two, on having really open and difficult discussions, which are happening.
That is to say, if we don't do this, and we can't get access to data in a safe and secure way, as OpenSAFELY did brilliantly for NHS England during COVID and is still doing, then we won't be able to use these potentially revolutionary technologies in what is, in my view, the most critical sector of society: healthcare.
Kate Bevan: I'm going to leave it there because that's a great place to finish. Dr. Rebecca Pope, thank you so very much for your time.
Dr. Rebecca Pope: Always. It's a pleasure. Thank you for the invitation.
Kate Bevan: The AI Interrogator is an Infosys Knowledge Institute production in collaboration with Infosys Topaz. Be sure to follow us wherever you get your podcasts and visit us on infosys.com/iki.
The podcast was produced by Yulia De Bari and Christine Calhoun. Dode Bigley is our audio engineer. I'm Kate Bevan of the Infosys Knowledge Institute. Keep learning, keep sharing.
About Dr. Rebecca Pope
Dr. Rebecca Pope (she/her) is a Clinical Neuroscientist and Roche UK's Digital & Data Science Innovation Lead. Rebecca's focus is working with the NHS, governments, regulators, clinicians, and patients on how we can bring the advanced technologies we're used to using in our everyday lives, like artificial intelligence, into healthcare at scale. She established and co-leads the five-year Roche UK and Great Ormond Street Hospital partnership and is part of multiple partnerships focusing on AI and scalable digital health solutions across the patient pathway within oncology, ophthalmology, neurology, and rare diseases.
She is also an honorary Senior Research Fellow and PhD supervisor in AI at UCL and a TEDx speaker on the application of AI within the NHS, and has been consistently voted one of 'The Most Influential Women in UK Tech'. Rebecca has published a number of academic research papers, written several scientific commentaries for broadsheet newspapers, and is the recipient of national awards for her research endeavours.
Rebecca is deeply enthusiastic about equality, diversity & inclusion in STEM & society and is part of several initiatives & networks that encourage an inclusive environment where everyone feels empowered. In 2021, Rebecca established the Roche UK Health MBA Scholarship for underrepresented groups.
- On LinkedIn
About Kate Bevan
Kate is a senior editor with the Infosys Knowledge Institute and the host of the AI Interrogator podcast. This is a series of interviews with AI practitioners across industry, academia and journalism that seeks to have enlightening, engaging and provocative conversations about how we use AI in enterprise and across our lives. Kate also works with her IKI colleagues to elevate our storytelling and understand and communicate the big themes of business technology.
Kate is an experienced and respected senior technology journalist based in London. Over the course of her career, she has worked for leading UK publications including the Financial Times, the Guardian, the Daily Telegraph, and the Sunday Telegraph, among others. She is also a well-known commentator who appears regularly on UK and international radio and TV programmes to discuss and explain technology news and trends.
- On LinkedIn
Mentioned in the podcast
- “About the Infosys Knowledge Institute”
- “Generative AI Radar” Infosys Knowledge Institute
- Roche UK
- Roche UK and Great Ormond Street Hospital
- How can AI help our NHS, and should we be concerned? | Rebecca Pope | TEDxFolkestone
- OpenSAFELY