Matt Hennessey, chief intelligence and analytics officer at NHS Greater Manchester, tells Healthcare Leader’s editor, Victoria Vaughan, how data and information are being used to transform care.
Victoria Vaughan (VV): What’s your current role and focus on data and transformation in the Greater Manchester integrated care system (ICS)?
Matt Hennessey (MH): One of the things I am really keen on is that data doesn’t become the focus point; it’s intelligence, it’s the application of curated and interpreted data that’s important. Data is inert. It’s the actual intelligence that matters.
I see my role as ensuring we’ve got the right connections, improving data quality, and all that technical stuff. But actually, it’s more about making sure we derive the value from that data rather than just having lots of information. We’ve all been in meetings where you’re presented with graphs, and you still leave the meeting saying, ‘I know quite a lot, but what do I do about it?’
VV: Data quality is cited as a major barrier when it comes to applying data. Are you working on specific ways of improving that, particularly GP data quality?
MH: We have a shared care record, which is about delivering direct care and allowing the clinicians anywhere in the system to see the patient’s record. That’s an amazing resource because that allows us to benchmark some of the data quality challenges and understand where it might require improvement. It’s really useful to have the resource to record things like ethnicity or protected characteristics and it enables us to do analysis on the levels of recording compliance. We’re only going to be able to improve data quality by almost manifesting that there is an issue. That’s the conversation I’m having with a lot of primary care networks (PCNs) and GP practices – it’s about trying to manifest the issue and understand what might be driving it. Sometimes it’s technical, sometimes it can be rectified by human intervention. The other thing that happens at a technical level in the establishment of the shared care record is that you are applying standards. So in order to connect the data, in order to interoperate with these systems, you start to create standards that source IT systems can start to adhere to.
VV: Are PCNs and practices on board with this? Do they see it as an issue?
MH: There’s a recognition that if people are going to use data, and they want it to inform decision-making, it has to be good quality. There is an understandable tension that people feel if it’s not presented sensitively, and it’s not presented as an opportunity for improvement. It then becomes a sort of performance-type conversation. And that’s not what we want. Because we want everybody to improve the data quality and to benefit from that improvement rather than having a sort of ‘you’re better than they are’ or introducing that performance element. I’m really keen to stay away from those sorts of conversations when we talk about data quality.
VV: How are you applying the shared care record in terms of new ways of using data for healthcare?
MH: We think about how data can benefit patients in terms of supporting clinicians to make decisions about care. And that's the direct care component of what the shared care record provides. But there's also an opportunity to redesign services, optimise patient pathways, and develop new and innovative treatments and interventions. That's a secondary use. One of the things we've embarked on is creating a longitudinal patient record, which is de-identified. This is now up and running in preview, having been through a clinical roadshow, and it goes live fully next month.
We have data for every registered patient in Greater Manchester, and it tells us not just about the interactions they have in primary care but in adult social care, mental health services, cancer services, acute hospitals, and secondary care more generally.
We can see which prescriptions are issued, A&E attendances, and 999 and 111 calls. So we're getting a complete picture of a patient's individual pathway without actually knowing who that patient is. And because of that, we can cohort up patients that have similar characteristics.
We could put together all the longitudinal records for white males over 50 who have had a stent fitted and have been to A&E in the last six months. You've got a cohort there. And for that cohort we're able to see: what are the antecedents? How many of them saw their GP in the previous three weeks? And after their A&E visit, what was their general discharge profile – were many of them discharged home, were many of them admitted as inpatients, and what were they prescribed?
We can start to see patient flows for cohorts of patients that can be defined either by socio-demographics – so we can understand the inequalities between different communities, age groups, ethnic groups – or by a collection of medical conditions. We can see what the standard patient pathways are for people who have both diabetes and cancer, for example.
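As a purely illustrative sketch of the kind of cohorting described above – the record schema, field names and values here are invented for the example, not Greater Manchester's actual data model – selecting a cohort from de-identified longitudinal records might look like this:

```python
# Hypothetical de-identified longitudinal records: each row carries a
# pseudonymous ID plus demographic and event fields (illustrative schema only).
records = [
    {"pid": "a1", "sex": "M", "ethnicity": "white", "age": 67,
     "procedures": ["stent"], "ae_visits_last_6m": 1},
    {"pid": "b2", "sex": "F", "ethnicity": "white", "age": 58,
     "procedures": [], "ae_visits_last_6m": 2},
    {"pid": "c3", "sex": "M", "ethnicity": "white", "age": 72,
     "procedures": ["stent"], "ae_visits_last_6m": 0},
]

def build_cohort(rows):
    """Select white males over 50 with a stent fitted and a recent A&E visit."""
    return [
        r for r in rows
        if r["sex"] == "M"
        and r["ethnicity"] == "white"
        and r["age"] > 50
        and "stent" in r["procedures"]
        and r["ae_visits_last_6m"] > 0
    ]

cohort = build_cohort(records)
print([r["pid"] for r in cohort])  # only "a1" meets every criterion
```

Because every filter operates on pseudonymous records, the analysis never needs to know who the individual patients are.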
VV: Is this in operation now?
MH: We've built the capability to do it. We have had a linked dataset for all the secondary care data – mental health, prescribing data, A&E – and we've had the shared care record. In September, we had to make a specific application to the national Confidentiality Advisory Group to be able to link the primary care data with all the secondary care data. This is something we could do during Covid, but post-Covid we have to go through a process to set aside the common law duty of confidentiality, which allows us to do this.
We’re in the process of making that linkage, and what that will enable us to do is not just to identify the pathways and those cohorts, but if there is a direct care case, then a clinician would be able to re-identify that anonymised patient so that they can provide direct care. And we’ve got a couple of projects in development that specifically speak to that ability of being able to identify the risk of a population and then the clinicians being able to re-identify individuals within it so that they can mitigate the risk of that population.
VV: And does this cover all patients in Greater Manchester?
MH: Yes, all patients who are registered with a Greater Manchester GP. There are about 2.8 million residents in Greater Manchester, and about 3.1 million registered patients. Maybe they work in the inner city and live outside Greater Manchester, but they opt for their nearest GP, which is in Greater Manchester.
VV: Did you have to go and ask them to opt into this as well as a shared care record?
MH: When we made the application to the national advisory group, they wanted to see exactly what communications we’d have with patients and the public. They wouldn’t sign this off unless we had a really good engagement process. A lot of this is based on opt-out rather than opt-in and there are different types of opt-out.
Essentially, you can opt out of your data leaving the GP practice. That’s what we call GDPR lockdown people. The next is that the data can leave the GP practice and go into the shared care record, but it can only be used for direct care. The final opt-out is that it can leave the GP practice, it can be used for direct care, but it can’t be used for secondary use or research. That’s referred to as the national opt-out, because at the same time that the data comes out of the GP practice, to us as a system, it also goes into the national reporting. And so you’re effectively opting out of the national teams using it for secondary use and research.
We’ve had to try and explain all of that complexity to the public, but we’ve done it through the user case stories that show the benefits of data being used in the right way and providing assurances about the security of the data. We do talk about data leaving practices and it makes it sound like data is flowing everywhere, and what we’re trying to do, particularly in Greater Manchester, is to get away from the model of data sharing and talk about data access. So the data sits in one place and the analysts go to the data rather than the data going to the analysts – these are known as secure data environments.
One of the things we’re concerned about is the longer people are on a waiting list, there’s a potential for individuals and their condition to deteriorate. We wanted to use the data to produce a kind of risk assessment as to who was likely to deteriorate the longer they wait.
We're able to create a kind of risk stratification that highlights those people who are at highest risk due to the nature of their condition, but also their external circumstances – for example, maybe they don't have carers' support. So you've got a high-risk cohort of people who might experience health deterioration the longer they wait.
There are two ways we can address that by surfacing the intelligence. Either the clinicians who are managing the waiting lists can consider whether it's worth reprioritising – moving people up the waiting list to try and reduce the likelihood of that deterioration – or, more importantly, we're able to support community teams, to say: 'This person is at high risk of deterioration while they wait, and these are the factors that might drive that deterioration. So can the GP practice or community teams provide wraparound support to mitigate the likelihood of that risk coming to fruition?'
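As a toy illustration of the waiting-list risk stratification described above – the factors, weights and field names here are invented for the example and are not the model used in Greater Manchester:

```python
# Toy deterioration-risk score for a waiting list. A longer wait, a more
# severe condition and a lack of carer support all push the score up.
def deterioration_risk(patient):
    score = min(patient["weeks_waiting"], 52) / 52  # longer wait -> higher risk
    score += {"low": 0.0, "medium": 0.3, "high": 0.6}[patient["condition_severity"]]
    if not patient["has_carer_support"]:            # external circumstances
        score += 0.2
    return score

waiting_list = [
    {"pid": "a1", "weeks_waiting": 40, "condition_severity": "high",
     "has_carer_support": False},
    {"pid": "b2", "weeks_waiting": 10, "condition_severity": "low",
     "has_carer_support": True},
]

# Surface the highest-risk patients first, so clinicians can consider
# reprioritisation or ask community teams for wraparound support.
ranked = sorted(waiting_list, key=deterioration_risk, reverse=True)
print([p["pid"] for p in ranked])  # highest risk first: ["a1", "b2"]
```

The ranking itself is only the intelligence layer; as the interview stresses, the action – reprioritising or arranging support – remains a clinical decision.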
VV: How far back does this data go?
MH: Each service has existed for a certain length of time. We've had a shared care record of some sort for about 11 years, but not all GP practices were signed up to it. I think since the pandemic – since 2021 – we've had 99.9% coverage of all the GP practices. In terms of the data, the GP dataset has 4 billion rows. It's a huge amount of historical data. Some services are relatively new – virtual wards, for example: we've got a lot of data on them, but only from recent years. The idea is that we start as far back as the patient record starts, and then we supplement with whatever service provision data we have from that start point.
VV: Who are you working with? Who is interpreting and managing those 4 billion rows of data?
MH: The only people who are working on the data itself are NHS analysts who work in my team.
We’re also working closely with universities, local government and NHS providers. And in particular, we’re starting to do a huge amount of work with the voluntary, community and faith and social enterprise sectors. There’s a big piece of work to understand what data they have, what data capability they have and how we could share that. We all benefit from that. We’re doing quite a bit of mapping work to try and bring the systems together so that we’ve got a really rich picture that’s available at a personal level.
VV: What technology are you using to interpret the data?
MH: One of the things I've tried to do for the Greater Manchester system is build an analytics and data science platform (ADSP). A lot of the time, when people describe platforms, they mean proprietary single-supplier platforms – Cerner or Graphnet, say, or other big suppliers.
Our platform is supplier-agnostic; I describe it more as a tool belt. We have something called Snowflake, which is our data cloud. And we’ve got another component, which is Matillion, which is what moves the data around and helps transform the data.
We use DataRobot as our enterprise machine learning and AI tool. We use Tableau to visualise our data. There's a whole suite of components in the tool belt. And the idea, from my perspective, was that we didn't get vendor lock-in – there wasn't anything that was too sticky. If our visualisation software fell behind the latest developments, we could swap it out without destroying the entire infrastructure.
The ADSP also deals with data on finance, workforce, events, and some supply chain data. For example, have we got enough PPE? The platform deals with the entire data landscape, I suppose, in respect of both health and care.
VV: So you can see a patient’s interaction with the health and care services along their journey. How do you then translate that into direct care?
MH: As part of the platform, we’ve created a single front door because you’ve got all these technologies, and there are always different stakeholders. The idea is everyone should go through the single front door and have a login; that login identifies the legal basis that you have to operate.
So, if you’re logging in as a GP, you would have the legal basis to identify patients in your practice. If you’re logging in as a strategic manager, planner, or commissioner, you wouldn’t have the legal basis because you’re not providing direct care. It’s not designed for patients to access it; it’s more about the public sector access. But all the people going in there will get to see the data that they need to do their job.
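The "single front door" described above can be sketched as a simple mapping from login role to legal basis, with re-identification gated on that basis. The role names and structure here are invented for illustration, not the platform's real access model:

```python
# Illustrative single front door: each login role carries a legal basis,
# which determines whether identifiable patient data can be shown.
LEGAL_BASIS = {
    "gp": "direct_care",           # providing direct care to registered patients
    "commissioner": "planning",    # strategic planning, no direct care
    "analyst": "secondary_use",    # works only on de-identified data
}

def can_reidentify(role):
    """Only roles with a direct-care legal basis may see identifiable patients."""
    return LEGAL_BASIS.get(role) == "direct_care"

print(can_reidentify("gp"))            # True
print(can_reidentify("commissioner"))  # False
```

Everyone still comes through the same door and sees the data they need for their job; the legal basis simply controls whether that data is identifiable.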
We've got visualisations and presentations at the population health level – which community in Greater Manchester has the highest risk of a particular disease or deterioration. You can visualise that, and if you go in as a clinician you can say, 'There are 30 people in this cluster of high-risk health conditions and I want to see who they are so I can phone them up and bring them in and do some blood pressure checks', or whatever it may be.
VV: Does it support the ICB system control centre?
MH: Yes, it does. The urgent and emergency care hub is checked on a daily basis and looks at how many ambulances are going where, if mutual aid is required, and if care coordination is required. We have live A&E feeds. We can see how many people are waiting in each A&E and how they got there.
VV: How does the ADSP feed into winter planning?
MH: It provides an opportunity for us to realise some of the efficiencies that will ease the pressure in the system more generally. For example, if we do an analysis that highlights people’s risk of deterioration, we can [anticipate more] and start to free up some of the flow. And we can start to use machine learning and AI to identify risks related to acute deterioration, just more generally in the population, or to put mobile services in place in those communities.
Another thing it adds is information on natural communities. People forget that right across Greater Manchester, there are natural communities – they may be defined by people's culture and ethnicity or location rather than public administration boundaries, which are just lines on a map. Having a longitudinal record enables us to understand the needs of the natural community. What are the needs of LGB+ people, those of certain ethnicities, or people who come to the city for work but don't live there? We're addressing the causes rather than just responding to the symptoms as they present.
VV: What about the people who don’t show up in the data, the people who don’t present to health and care providers? There are concerns this reliance on data exacerbates health inequalities. How do you guard against that?
MH: We can do it to a degree with the longitudinal record. In the past, say I had diabetes and I didn’t declare my ethnicity to my GP. If I was trying to do an analysis of how many people of white ethnicity in this GP practice have got diabetes, I wouldn’t be in that because I’d be down as unknown. If I went to hospital and I have declared my ethnicity, and they were looking at diabetes in hospital, I would be in that dataset. What we’ve been able to do is enrich the data to plug some of the gaps.
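The gap-plugging just described – filling an "unknown" ethnicity in the GP record from the same patient's hospital record – can be sketched like this. The field names and pseudonymous IDs are hypothetical, for illustration only:

```python
# De-identified GP and hospital records keyed by the same pseudonymous ID
# (illustrative schema, not the actual Greater Manchester data model).
gp_records = {"p1": {"ethnicity": "unknown", "diabetes": True},
              "p2": {"ethnicity": "white", "diabetes": True}}
hospital_records = {"p1": {"ethnicity": "white"}}

def enrich(gp, hospital):
    """Fill gaps in the GP record from a linked hospital record."""
    merged = {}
    for pid, rec in gp.items():
        rec = dict(rec)  # copy, so the source record is untouched
        if rec["ethnicity"] == "unknown" and pid in hospital:
            rec["ethnicity"] = hospital[pid]["ethnicity"]
        merged[pid] = rec
    return merged

enriched = enrich(gp_records, hospital_records)
# Before enrichment only p2 would count; now both diabetic patients
# appear in a white-ethnicity diabetes analysis.
print(sum(1 for r in enriched.values()
          if r["ethnicity"] == "white" and r["diabetes"]))  # 2
```

Without the linkage, the patient with the unrecorded ethnicity would simply vanish from any analysis broken down by ethnicity – exactly the missing-data problem the interview warns about.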
Personally, I’m really committed, and my mantra is that the most important data for your analysis is probably data you’re not looking at. That’s because the data you have is what you’ve been told to collect over years and it’s for someone else’s purpose, maybe not the purpose that you’re actually setting out on. We’re trying to instil that view.
As a Marmot City Region, we had Professor Michael Marmot come and give us some support; he challenged us around data use and intelligence, and I think we've really taken that forward. We don't take data at face value.
I'm always trying to push the notion that things like AI are brilliant, but AI can only learn from what it's seen. If there are baked-in inequalities, it will learn them and try to predict a future that has baked-in inequality. It's really important that, as analysts in the intelligence community, we're always alive to the issue of the data that is not there. What might we be missing? How can we test that, challenge it and remedy it?
VV: Another key concern around data and AI is that there are too many products to choose from. Does the ICB have a role in helping navigate that for primary care?
MH: We've got some way to go, because we're starting from a point where PCNs and practices have been relatively sovereign organisations with their own budgets and, over the years, they might have invested in various different things. We are looking at the opportunities for converging – finding cost-effectiveness and efficiency through single procurement, with everybody using a particular application – to see if we can get economies of scale.
I think we are halfway there. We’ve got a lot of infrastructure that we bought once and that the whole system can use.
We start by slowly moving through each stakeholder to work out how we can maximise the support we provide to them, and create the governance where we can make those decisions – things like a technical design authority at a system level. So that, if there's some new technology out, we have a load of experts in the data and digital space, plus some clinical experts, to actually kick the tyres properly and ask, 'Have we got this already? What problem does it solve?'
VV: In that case, does a practice come to the technical design authority to ask if this is a good idea, or do you send out advice? How does that work in practice?
MH: That’s more in the digital space in terms of software procurement. But what we have is a primary care digital board, which has representation from the entirety of primary care, not just GP practices.
We would rather people come to us with the problem rather than the suggested solution. That’s what tends to happen. People say, ‘Can we buy this because it helps us?’ and we say, ‘Well, maybe we’re missing a trick – what is it you need help with?’ The primary care digital board will hear the asks and the interests of primary care. And then, on the occasions that they feel that there is a weight of interest or critical mass of interest, they’ll refer things into a technical design authority for some expert support.
VV: In terms of finance, all integrated care systems are pushed. Do you feel that digital and data are being prioritised through funding as a future cost saver?
MH: It is really challenging, because it's not so much the focus on the need to save money; it's the focus on trying to do it quickly. The quickest and simplest thing to do – salami slicing – is to say everybody reduces everything by 10%.
But I've been really fortunate with the ICB and the executive leadership, which created my role – one of the first in the country – and has always put a huge amount of stock in intelligence. NHS England CEO Amanda Pritchard recently said, 'A good analyst can save more lives than a good anaesthetist'. I don't think she was having a go at the anaesthetists. She was just making the point that if you use the data intelligently, you can completely transform outcomes.
As a system, in Greater Manchester, we’ve recognised that even if we had the most efficient and effective healthcare system in the world, we still probably couldn’t balance the books unless we address ill health in the population. The hospitals will still fall over unless we get upstream and address some of the underlying health inequalities and the burden of disease that exists through poverty, poor housing, smoking and things like that.
VV: In hiring, it’s difficult to attract top data analysts and digital experts when you’re competing with the likes of Google in terms of salary. How is the employment market working for you?
MH: As an analytical community, we’re moving as fast as we can to adopt new technologies. But you know, there’s still a huge number of health services where paper forms are being filled in. The digitisation agenda is something we have to tackle before we can move on to the fancy analysis.
In terms of the analytical workforce, it's absolutely a challenge that, for roughly the same skills and capabilities, the Agenda for Change salary structure will always lose to the private sector, which can pay more. There is some national work going on around the national competency framework, and there are conversations happening nationally about what can be done within the scope of salaries to attract talent. Until that work comes out, we're looking for creative solutions.
In Greater Manchester, the creative solution we've got draws on some genuine world experts in data science at our academic institutions – the University of Bolton, the University of Salford, Manchester Metropolitan and the University of Manchester. There's a huge amount of work I'm doing with them to try and create collaborative units where NHS analysts can learn from leaders at the cutting edge of their technical discipline. The benefit for the universities is that they're not necessarily working on just hypotheticals – they're working on real data to solve real-world problems. There's a mutual benefit, and that's the sort of creativity we're looking at.
VV: What can you see as the possibility of this work? What are you aiming and hoping for?
MH: I'm hoping that we get to a point where we automate what can be automated, and we use AI in that space. And that, actually, we recognise that there is more data and more information than anyone could ever consume, so we need to move our workforce to interpretation and application.
You start with data. If you transform it, you can make it into something presentable, which becomes information. Then, if you link it with evidence, and you link it with wisdom, and you connect it to clinical and patient experience, you turn it into something that’s insightful and intelligent – and that’s the thing that makes a decision work.