UK digital chiefs emphasise the need for data foundations to unlock government transformation

Speaking at Public Service Data Live, senior UK data leaders stressed the need to invest in government’s data infrastructure and capabilities to realise the huge potential of emerging digital technologies
“As I look into my very professional crystal ball, I see quite a lot of clouds. It’s a murky picture,” said Sarah Munby, permanent secretary at the UK’s Department for Science, Innovation and Technology (DSIT). “We are on the brink of an AI revolution, but our ability to describe what that revolution really looks and feels like is actually pretty poor.”
Speaking on Thursday at Global Government Forum’s Public Service Data Live event in London, Munby explained that this lack of certainty is not rooted in “ignorance or stupidity or lack of thought or lack of work. It’s fundamental to what it means to be standing in front of technology-driven change at an enormous scale. When you’re in a pre-revolution society, your ability to describe the revolution is usually quite poor.”
This has been true throughout history, said Munby. “When people invented the spinning jenny, they didn’t know what they were unleashing,” she noted. “And if I look back at how the internet has changed the global economy and society, I don’t think we knew at the beginning what we would be dealing with now. Indeed, it feels like we’re currently catching up on some of the things that you might wish you had done earlier in the sequence.”
Alongside AI, Munby said other technologies promise equally radical change. “Quantum is just around the corner. Engineering biology is one of the priority technologies for government – part of a broader family of biological technologies that are going to have potentially equally revolutionary effects,” she commented.
“Quantum cryptography is going to have really profound implications. But I don’t think we know the picture of what our society and economy will look like as those technological revolutions emerge.”
AI in action
Craig Suckling, the UK government’s chief data officer, offered a glimpse of that future. The Department for Work and Pensions, for example, is using generative AI to parse the 22,000 letters it receives each day from people needing assistance. “It used to take them two weeks to get through all of that,” he explained. “With AI and with data, they’ve been able to deliver automatic sentiment detection to understand who is most in need, and they’ve taken the process from two weeks down to a day to get help to those people.”
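In practice, this kind of triage amounts to scoring each letter for urgency and sorting the queue. Below is a minimal, illustrative Python sketch; the `score_urgency` function is a hypothetical stand-in for the generative-AI classifier, and none of this reflects DWP’s actual pipeline.

```python
# Minimal sketch of sentiment-based triage. score_urgency stands in for a
# generative-AI classifier (hypothetical); DWP's real system will differ.
from dataclasses import dataclass


@dataclass
class Letter:
    letter_id: str
    text: str


def score_urgency(text: str) -> float:
    """Placeholder for a model call: a trivial keyword heuristic here,
    so the sketch runs on its own."""
    urgent_terms = ("eviction", "no money", "urgent", "crisis")
    return sum(term in text.lower() for term in urgent_terms) / len(urgent_terms)


def triage(letters: list[Letter], threshold: float = 0.25) -> list[Letter]:
    """Rank letters so the most urgent cases surface first."""
    scored = [(score_urgency(letter.text), letter) for letter in letters]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [letter for score, letter in scored if score >= threshold]


inbox = [
    Letter("A1", "I am facing eviction and have no money for food."),
    Letter("A2", "Please update my address on file."),
]
print([letter.letter_id for letter in triage(inbox)])  # ['A1']
```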

In the Cabinet Office, meanwhile, the Redbox Copilot scheme is helping to produce ministerial briefings. “They take thousands of documents and interrogate and summarise them into briefings for private offices,” said Suckling. “It’s a great story of taking unstructured data and bringing it together with AI to drive really good benefits.”
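Condensing thousands of documents into a single briefing is typically done with a map-reduce pattern: summarise each document, then summarise the summaries. A rough sketch under that assumption, with `summarise` as a stand-in for a real LLM call rather than anything from the actual Redbox codebase:

```python
# Map-reduce summarisation sketch. summarise() is a placeholder for an
# LLM call; here it simply truncates, so the example is self-contained.
def summarise(text: str, max_words: int = 60) -> str:
    return " ".join(text.split()[:max_words])


def brief(documents: list[str], chunk_words: int = 60) -> str:
    # Map: condense each source document independently.
    partials = [summarise(doc, chunk_words) for doc in documents]
    # Reduce: merge the partial summaries into one briefing-length note.
    return summarise(" ".join(partials), chunk_words * 2)
```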
Read more: Digital Leaders Study 2024: Embedding AI across government in the UK
Suckling emphasised that projects like this are as dependent on high-quality data as on AI functionality. “Take a rocket ship analogy: you can think of AI applications as the payload, and the data as the fuel,” he said.
“And just like in a rocket ship, raw fuel is not going to get you very far. You need to refine it: it needs to be harnessed and put into the right containers for the rocket to lift off. The same with data: we have so much rich data, but we need to focus on how we create the right foundations to unlock it.”
Those foundations, he explained, include the ability to exchange information “across the system so that we can create the right data applications,” and “the right data activation layer: stores of data that are in the right format, in the right place, at the right time for AI applications to use them.”
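One way to read that “activation layer” is as a validation step that converts raw, inconsistent records into a fixed schema before any AI application touches them. A small sketch under that assumption; the schema and field names below are invented for illustration.

```python
# Sketch of an "activation" step: raw dicts are validated into a typed,
# AI-ready record. The schema here is invented for illustration.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class CaseRecord:
    case_id: str
    received: date
    body: str


def activate(raw: dict) -> CaseRecord:
    """Normalise one raw record, rejecting anything a downstream AI
    application could not use."""
    if not str(raw.get("body", "")).strip():
        raise ValueError(f"empty body in record {raw.get('id')!r}")
    return CaseRecord(
        case_id=str(raw["id"]),
        received=date.fromisoformat(raw["received"]),
        body=str(raw["body"]).strip(),
    )


record = activate({"id": 101, "received": "2024-09-19", "body": " Help with my claim. "})
print(record.case_id, record.received)  # 101 2024-09-19
```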
Invest in infrastructure
Suckling’s message chimed with that of Neil McIvor, head of data for event knowledge partner esynergy. “When you’re thinking of policy solutions, ensure you think about the data needs and the end-to-end data pipelines and processes you need, especially at the design and thinking phase,” he said. “This is even more critical if you’re thinking about AI.”
Those pipelines require investment, said McIvor: “Don’t treat data as a free good. Just like roads and rails, data needs infrastructure to work effectively and smoothly, and this can cost time and money to set up properly and to maintain – especially if you want to stop building unsustainable, expensive cottage industries in your organisation.”

Tariq Khan, chief data and information officer at the London Borough of Camden, suggested that on some occasions, it may be best to construct an entirely new infrastructure rather than trying to adapt existing systems. “There’s a lot of legacy architecture, there’s a lot of legacy technology, and there’s a lot of people making a lot of money out of keeping to incremental changes rather than big moves,” he said. “And that’s not just externally, but obviously internally, with the people that you’ve got.”
In many cases, digital and data leaders will encounter “some very powerful forces that you have to navigate in order to make the change”.
What’s important, Khan argued, is that public bodies develop the infrastructure, processes and capabilities required to gather, organise and process high-quality data – then they can always find new ways to realise value as technologies evolve and emerge. “The world of AI and GenAI is moving very fast,” he said. “If you look at OpenAI, they’re releasing stuff on a regular basis; and as that’s happening, secondary products are being built and then being collapsed, built and collapsed, built and collapsed.”
Read more: Public Service Data Live 2024 – as it happened
Don’t predict, prepare
“If we’re putting our money into the foundations at this point, that keeps us in a strong position irrespective of where we’re going with these applications,” he concluded. “It’s public money well spent in order to give us a really good launchpad.”
Here, Khan backed Munby’s messages around the need to invest in underlying data capabilities. DSIT is playing its part, Munby noted. “We’ve been set, as one of the core priorities for the department in the coming months and years, the creation of a national data library,” she said, adding that delivering the “underlying change required” to make this project work will demand “a lot of work on the foundations”.
This work will pay off, Munby concluded, as new data technologies emerge – none of which will bear fruit without a ready supply of high-quality, accurate, well-organised data. “In these times of tremendous uncertainty, one of the things you’re trying to do is build robust capabilities that allow you to operate successfully in a whole range of different environments,” she said. “Because the idea that you can predict the future and then make your way towards it, well, that’s not how life is. What you’re trying to do is invest in capability that allows you to manage the future, however it may come.”
Public Service Data Live took place on 19 September at the Business Design Centre, London. The conference was supported by platinum knowledge partner esynergy, and gold knowledge partners Capgemini, Iron Mountain, Hewlett Packard Enterprise, Netcompany, SAP, SAS and WR Logic. See all the knowledge partners for Public Service Data Live 2024 here.