Victoria Vaughan: In this roundtable we’ll discuss what changes need to happen to bring the NHS into the digital age and why trust is such an obstacle. Let’s start with everyone introducing themselves and articulating why trust is a barrier to the sharing of healthcare data.
Mavis Machirori: I’m a senior researcher at the Ada Lovelace Institute, which is part of the Nuffield Foundation. The work we’ve done has shown that trust is related to transparency in the ways that governments and the NHS create partnerships because people think of these things as connected rather than as separate. Trust becomes a barrier if it’s never clear exactly what information is collected and who’s going to have access to it and why.
Dr Rowan Sil: I’m a GP for three days a week and chief clinical information officer at Leicester, Leicestershire and Rutland ICB for two days. There is trust in public institutions, but it ebbs when there are high-profile cases where you see data getting into the wrong hands – for example, when police lists in Northern Ireland were leaked. For me, it’s trying to balance that between my GP colleagues, who are data controllers, and our health informatics service, who are trying to look for health inequalities. It’s having to explain that if there are fewer than six or seven cases, we have to suppress the data. Those are the cases we really want to target, but we don’t want those individuals to be recognised at a practice level. So, there are different intricacies in that trust, and it can go all the way through the system.
Dr Dan Bunstone: I’m a GP in Cheshire and Merseyside, clinical director for Warrington Innovation Network PCN and clinical advisor at ETC Health, which is part of BT Group. I’m also chair of an NHS Confederation design group exploring data and digital in primary care. Trust is an issue for sure. We’ve talked about nefarious characters, but what we’ve not really talked about is someone like me releasing information ‘incorrectly’ to the wrong person. I don’t mean a scammer; I mean the police or some apparently reasonable organisation requesting data for insurance.
Anybody who’s a clinician in the room will have completed an insurance report for somebody who had a sore throat on holiday and needed antibiotic treatment. The patient will sign a consent form to say, yes, you can have my information, not expecting questions about everything that’s ever happened to them. There’s very commonly something on the last page asking whether this patient ever had any problems affecting any organ system you can think of. So, there’s a big trust issue there.
We need to universally push back on those questions and say, quite politely, I’ve answered on their sore throat and the other stuff isn’t relevant to you.
On the one hand, the public expects us to be ultra-connected but there are groups who are very concerned…it can be a bit bizarre because the Facebook app on your phone will know more about you than your GP
David Sgorbati: I’m a computer scientist by background and now chief analyst in the Health Economics Unit and I work at NHS Midlands and Lancashire commissioning support unit.
I agree with everything that’s been said so far. It is complex. On the one hand, the public expects us to be ultra-connected and be able to share, but then there are groups who are very concerned. And that can be a bit bizarre because sometimes the Facebook app on your phone will know more about you than your GP knows about you, and people don’t think about that.
But the philosophy tends to be overcautious, in my opinion. There were allowances during Covid and great things happened. I don’t think there was a series of catastrophic incidents because those relaxations happened, which should really make us think about how much we can trust the system, how much we can trust each other, and how much we can trust the setup.
And it feels that, unfortunately, there are often a lot of conversations about how ‘legally we need to do this’, but the interpretation of that is not always the clearest. And let’s be honest, there is a lot of complexity to what we do.
Helen Duckworth: I’m director of Business Intelligence transformation at NHS Arden and Gem commissioning support unit. In my previous role, I was the associate director of business intelligence at Cheshire and Merseyside ICS.
For me, I think the data controller needs to believe that they’re making the right decision. There is fear. Covid took away that fear because there was a legal framework that everyone could point at. And they could say to their patients, this is okay, because I’ve got this piece of paper. The complexity of the national IT environment is that there’s no longer that piece of paper to point at.
VV: You’ll all be familiar with Professor Ben Goldacre’s review into how the efficient and safe use of health data for research and analysis can benefit patients. He talks about trusted research environments. Helen, I think you’re working on this. Can you talk about that? How can we make it safe to share data?
HD: The NHS is desperately fragmented; the data is often moving all over the place, going to different places for different purposes. There was a thread through the Goldacre review about changing the way we do that.
It’s going to be much more standardised and, hopefully, curated only once. The thrust of secure data environments from a technical perspective is that we bring people to view the data and have much more central control over it.
It will enable a single access point. We’d be able to provide assurance to the public, and patients, that we have checks in place and rules around which data is accessed.
That’s not to stifle things for analysts. But, for example, if a research organisation wants access to the data, then there’s a stringent data access request process that requires them to have done certain things around security and patient public involvement. At the moment, there’s less standardisation of that process.
With all of this, the idea of incremental change is important. When I was in Cheshire and Merseyside and we were thinking about our data sharing framework, we started with direct care, and we got everybody used to flowing data for the shared care record. And then we thought, can we repurpose that data for population health? We’ll provide it back to you so the clinicians can use it to risk stratify and provide more analytical and proactive care for patients.
So, then we were using it for population health. Now we’re saying, can we provide the data to some academic institutions to use for research? Each of those changes has taken a year to 18 months. Incremental change – like trust – takes time.
Data quality is one of our biggest issues, to be honest…We’re trying to address the quality of the data that’s coming out of GP systems.
Dr Rowan Sil
RS: We’ve got a bid in our region for a secure data environment that encompasses three universities and local NHS institutions. So, we’ve had this exact conversation. We want to make sure it’s transparent and get those messages out there.
We have several ways that we share data at the moment. We have an information-sharing agreement for all GP practices into our local health informatics service so we can pull data from GP systems and the ICB can look at health inequalities. Data quality is one of our biggest issues, to be honest. Once you’ve got the information-sharing agreement in place and look at the data itself, you start to think, ‘Oh, my goodness, this can’t be right’. We’re trying to address the quality of the data that’s coming out of GP systems.
DS: It is true that sometimes there are problems with data quality. And the conversation around data quality is very strongly related to the idea of trust because it’s all part of a culture where we trust each other. We’re comfortable sharing data because we know that we’re sharing data for a purpose – we can look after our population better. And the moment we know that we are sharing data, we look after what we put in our systems.
MM: I understand that we need to standardise it because data is coded in multiple ways and that somehow needs to be aggregated to make sense. But we’ll have to think about what we’re doing when we’re standardising data if we’re to maintain trust. We need to think about how we do it in a meaningful way for both the analysts and the people providing the data.
When it is standardised, certain things are going to get missed out, and if you’re not represented in the data – your experiences of health or poverty, or whatever social situation is impacting your health – you’re not going to trust that the data will benefit you.
We need to think about the contractual obligation that the NHS is bringing to people when it comes to data. What is the value and how is that value being transferred back to communities? The NHS is made up of so many different private players who are using the NHS, and I think that also needs to be articulated a little bit more to people.
DS: I think we need to make good use of the data. We spend a lot of time digging into the data so, when we get it, let’s make sure it’s really actionable – the story usefully told with the data rather than left sitting in an Excel spreadsheet.
VV: In terms of helping with trust, what needs to happen in the next six to 12 months? By the end of October, for example, all patients should have automatic access to their patient record. How will this change things?
RS: A lot more patients are now requesting to see their records, and it might save some time around getting results out to people. But the other side of that is: what if some things are released on the patient record before we get a chance to tell people or to explain what those results mean?
If we’re talking about data quality, patients are going to be the best advocates of what that data is. You know, if they see something that was coded back in 1984, and they’ve never heard of it. As time goes on, we can start to improve the data quality of those records. It’s people being in charge and knowing what their record shows.
DB: My experience is that patients get their records, and you have that difficult conversation where something was recorded in 1972 that they haven’t got, or they have something that was never recorded. And then it’s a bit of a treasure hunt to try and find out what happened and whether the code was incorrect.
It’s a conversation that we increasingly need to have because there’s so much software out there in a variety of different guises, whether it’s triage-based software or smarter AI-based software. There’s a whole host of ways that patients walk into a noose or, I would suggest, have [situations] thrust upon them in our attempt to deal with capacity and demand.
MM: We really need to be engaging the public early and bringing them to an understanding about what it is they want out of their records. Just because someone is engaging with a GP app, it doesn’t mean that they have no interest in what happens to their records or who accesses them.
And we also need to recognise that not everyone is digitally connected. What happens to people who are not digitally connected?
We really need to be engaging the public early and bringing them to an understanding about what it is they want out of their records
RS: I absolutely agree. The data that we’re probably going to find the most useful is going to be from those areas of the population that are digitally excluded.
We need to be looking at the bigger picture – where and how we’re going to have those conversations. It’s actually going into the communities and talking to the people that you wouldn’t normally get on a patient representation group. It’s being clever about targeting the right people at the right time for this, and I don’t think there’s an answer that will work for everyone. It all depends on what your cities and communities are like. The ICB and local authorities can help to target those groups. It’s time intensive and it’s resource intensive, but if we don’t get it right now, we’re going to find that we’re not able to take full advantage of this data.
HD: I think it comes down to having a well thought through and financed engagement plan. You’ve got to have the investment; a good engagement campaign costs money, right? And then I think it’s about being imaginative. We can’t see it as a box ticking exercise – anyone can set up a system and a jury, and get 10 people there who give their views, but are they the right 10 people? We’ve got to send people to Wetherspoons or wherever, and you’ve got to make sure that the engagement is being done properly, which means you’ve got to design engagement campaigns that are bottom-up.
This is where, nationally, they really struggle because they’re not bottom-up. Even at ICB level, you probably struggle. So you know, it’s going to practices to really understand how to engage properly, in my opinion and, like you say, that’s just really intensive. Everyone’s got a day job, and it’s hard to find the right people to have the conversation. And it’s continual. It’s part of your operating model – it’s not something you do once, and then you stop doing it.
DB: We have to be in a position where we can democratise data. Your hospital records and your GP records will contain things – and so will your smart wearables. I think we must balance out data control versus the human benefit. It’s the ability to prevent, say, 10,000 heart attacks and strokes nationally versus the fear of accidentally releasing 100 people’s blood pressure.
Part of the way we can get there is by supporting GP surgeries. Help is needed, in particular with data control; larger organisations tend to have much better support, or a dedicated person to do that. It could be proactive support or it could be training.
We also need the removal of consequences – maybe not make it quite so draconian. We know that a data release is undesirable, but sometimes it happens because you’ve made the wrong decision. And I think there’s a whole host of fear there. GDPR is supposed to support and not stifle. I think we probably sit too far on the stifle end of the seesaw at the moment.
VV: What about the longer term? Where can we be in five to 10 years? Given that it has to go slow for patients even though the tech is going really fast, what could the future look like?
MM: I think it could be positive. To get there, we need to think about data standards and impact assessments for decisions that are made around the data. If public engagement is done and we are serious about lived experience and think about how we turn that into decision-making, then I think the future is really promising.
DS: At the end of the day, it’s a culture problem. We need to make sure that we work together and have a good relationship and then a shared culture of really making the best of the data.
It’s really about how the analysts can think more creatively about how we can bring in these different data sources and how we can better support decision-makers. And also decision-makers having bigger ideas on what we can do with the data – how many tens of thousands of things we can prevent.
I’m optimistic about the future. I think there is a lot of potential for us to have a society where we actually figure out this data stuff, you know. People are semi-voluntarily giving away a lot of personal information to private companies without really thinking too much about it. I think that’s probably going to change – as an individual, I hold personal information that is precious and powerful. It’s important that I have a relationship with the health and care system that allows them to use my information for greater benefits.
And it’s important for the health and care system to really start thinking about the population from many points of view, and truly, as a system. We’ve not focused a lot on wider determinants, such as housing, justice, and education, but for me, that’s going to be the big next step – when we really start working together, across the entire complexity of a person’s life to ensure health and wellbeing.
RS: For me, it’s using that data to really push forward how we spend the public pound. If we can move people towards benefits in their lives – around activity and their social circumstances – we know that it could safeguard the NHS in the future. It’s the reduction in the use of prescribed medication and in long-term conditions. That is why I am a massive advocate for this.
If we can encourage people to use all the public services and, from their data, help shape public funding towards making their lives better, it will safeguard our NHS for the future.
The next generation coming through are on TikTok and they’re all sharing data. And I really think that we’ve got to work with them to design a patient and public engagement campaign
HD: The power of data is the biggest driver. I agree with David about changing the culture and changing the conversation – nationally and locally – around data. We might all have data locally, but we’ve got to try and fix that dynamic between local and national. We’re all old people really – no offence to anyone here but, you know, the next generation coming through are on TikTok and whatever else, and they’re all sharing data. And I really think that we’ve got to work with them to design a patient and public engagement campaign because there will be some imaginative ideas to try and make the sharing of health and care data relevant.
This isn’t about a six to 12-month plan. It’s about a 10 to 20-year plan. And what does that arc need to look like to change the dial?
DB: Yeah, I mean, I agree entirely with what Rowan said. I think the shift could be seismic. It could improve the health of the nation and help us to break down the barriers of health inequalities.
If we can share data, we can move the NHS from a place that treats illness to one that is about better managing health. It’s not a place where we’re trying to prevent people going to A&E but, instead, stopping people from having a heart attack because drugs are better at managing their blood pressure. I’m doing as much work as I can in this area at the moment and you see the difference. The impact that huge data can have would augment that and catalyse the change.