Cleaning up and restructuring data is going to be critical to the success of Government-as-a-Platform - there’s no doubt about that. What’s the point in creating reusable platforms that are easy to tap in and out of if at the same time you are scrambling to suck in tonnes of messy data?
It doesn’t make sense. Which is why the Government Digital Service has prioritised the creation of canonical data registers - the first of which will be a list of country names held by the Foreign & Commonwealth Office (FCO).
The idea is that if the government can create a series of data sources that are considered ‘true and clean’ representations of whatever they are meant to refer to (e.g. companies, addresses, etc.), then these registers can be tapped into easily by departments, and controlled by the citizen, for the delivery of digital services.
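To make the idea concrete, here is a minimal sketch of what "tapping into" a canonical register might look like from a department's digital service. Everything here is illustrative - the URL, the response shape and the function names are assumptions for the sake of the example, not a real GDS API.

```python
# Hypothetical sketch: a digital service validating a country field
# against a single canonical register instead of keeping its own copy.
# The register URL and JSON shape are illustrative assumptions.
import json
from urllib.request import urlopen

REGISTER_URL = "https://country.example.gov.uk/records.json"  # hypothetical


def canonical_countries(url: str = REGISTER_URL) -> dict:
    """Fetch the register once and map country code -> official name."""
    with urlopen(url) as response:
        records = json.load(response)
    return {code: entry["name"] for code, entry in records.items()}


def validate_country(code: str, countries: dict) -> str:
    """Accept only values that appear in the canonical register."""
    if code not in countries:
        raise ValueError(f"Unknown country code: {code}")
    return countries[code]
```

The point of the pattern is that the service never maintains its own country list - when the FCO corrects a name, every consuming service picks up the change.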
Messy data = incoherent and siloed services. Think of it like the transport network: if the government wants businesses to be able to easily move goods between each other and to citizens, it needs a good underlying infrastructure to make that happen.
Data registers and ‘data-as-a-service’ will be the new underlying infrastructure for next generation public digital services.
Well, that’s the thinking anyway. As we know, data is a thorny topic and mishandling of it makes for good headlines. Data restructures and data sharing have typically led to a backlash from the media, privacy advocates and citizens.
Which is why communicating the benefits of ‘data-as-a-service’ and instilling trust in the registers being built will be critically important for Paul Maltby - the director of data for the British government.
Get with the programme
Maltby took to the stage at the Government Digital Service’s Sprint event last week to speak about his data programme. It is a three-point plan that focuses on: making better use of the data government already has; introducing better legislative and policy frameworks for data governance; and, finally, the data registers.
I want to start with what I hope is quite an obvious point, in that using data in government is hardly something new. There is a long and proud tradition of using analysis to support public services and you can trace it way back. But clearly something has changed dramatically and government has been keeping up.
If you think through what we did through the last parliament, the hard work from people across departments, meant that the UK was a world leader on open data. That’s enabled companies and organisations to do some wonderful things. Think about how it’s transformed how we have access to public transport data.
But this pace of change is relentless. This data revolution has great opportunities, but for those of us in government it presents a number of quite big challenges.
Maltby went on to say that the first part of the programme, making better use of the data government has, largely focuses on the “unfinished business of open data”. He said that there should be a focus on improving the quality of this data, as well as the accessibility of it, both for users inside and outside of government. Maltby also mentioned that skills will be critical to the success of the data programme. He said:
We have got to get serious about how we use data inside government and public services.
Part of that story is about bringing data science techniques closer to the hands of decision makers across government. We think there is perhaps more than 100 analysts now across the domestic departments in government with data science capability.
And we are seeing an increasing number of projects across the work we do. A lot of the time data science can do a lot to just put the data in the hands of those operational leads. I really like this example from the Department for Education, where the data scientists there are just providing really interactive visual tools for people in the system to understand for each particular school, where do the pupils live? What are the characteristics there? They are doing a really great job of helping to understand and predict when academy schools might be getting into some financial difficulties.
Maltby went on to say that whilst some of this data agenda is “different and new”, it’s not the “far-flung future” and is “happening right now”. The second part of his data programme is the area of data policy, where there is a need to make sure that the underlying policy and legislative frameworks are fit for purpose.
He said that one aspect of this is how legislation does not support data access across government. Maltby said:
We are hoping to bring forward some firm consultation proposals in a couple of weeks on that. But if I’m honest there’s a much bigger picture on this point around data policy. Data is going to transform the relationship between the citizen and the state, and it’s for us to recognise that and harness it.
What do I mean by that? Well at one level I think for transactional services we are going to see much greater visibility about the data held by government about you and greater control for the citizen about that. That degree of clarity is a real step change and we will see much more of that for those direct transactional services.
But of course a lot of what government does isn’t directly transactional. We make decisions that impact on others, in our family, in our neighbourhood and as taxpayers. We still need to have appropriate, clear rules about what happens and what doesn’t happen. A small part of that is we have been working across government with the data science community as it emerges to think about, what are a set of quite practical principles that help us be appropriate as we bring these powerful new tools to bear on government’s challenges?
Finally, Maltby addressed the most “radical and important” part of the programme, which focuses on the data infrastructure. I’ve written a fair bit about data registers, about how they’re going to work and why they’re important, so I won’t go over all that again, but I thought it was interesting that Maltby introduced this idea of ‘data-as-a-service’. He said:
The friction caused by badly curated data makes building digital services harder than it needs to be. It makes doing analysis more expensive than it needs to be. Part of that is because we are still slaves to the direct text entry form, where it originated in paper and was transferred willy-nilly to the web. Part of that problem with poor quality of data across government is because data is duplicated and held in many different places by those that are not most expert at holding that. We want to move to a world where we have individual registers of our core data layer. They are provided as a service to those in government and outside.
So across departments we need to do the hard work to make data simple. It means separating out the data layer from our applications, whether they are digital services or layers on which we do analysis. And we need a change in the mindset that we think about data being provided as a service. Minimum viable data sets, not large bulky databases. And provided through the GDS software-as-a-service product for registers.
The really important thing here is about trust. Moving to this world requires us to trust other departments and other services to provide expert data to consume within our own worlds. And it means that we have to be trusted by others to look after the user needs of others outside of government, not just our own specific services that we are providing.
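Maltby’s “separate the data layer from the application” point can be sketched in code. The interface and class names below are my own illustration of the idea, not anything GDS has published: the application depends only on a minimal read interface - a “minimum viable data set” - that any register, remote or local, could implement.

```python
# Illustrative sketch of separating the data layer from the application.
# Names and interfaces are hypothetical, not a real GDS design.
from typing import Optional, Protocol


class Register(Protocol):
    """Minimum viable interface a register-as-a-service might expose."""

    def get(self, key: str) -> Optional[str]: ...


class InMemoryRegister:
    """Stand-in for a remote register, e.g. for tests or local development."""

    def __init__(self, records: dict):
        self._records = records

    def get(self, key: str) -> Optional[str]:
        return self._records.get(key)


def render_address_country(register: Register, country_code: str) -> str:
    """Application layer: look up the canonical name, never duplicate it."""
    name = register.get(country_code)
    return name if name is not None else "(unknown country)"
```

Because the application only knows the `Register` interface, swapping the in-memory stand-in for a department’s live register service requires no change to application code - which is exactly the trust relationship Maltby describes: consume expert data from elsewhere rather than hold your own copy.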
So much of what GDS has planned is dependent upon Maltby’s data plans. People are wary about changing how data is worked with, for obvious reasons, but that doesn’t mean it should be left alone.
As we’ve seen with NHS projects, it is critically important to communicate plans early, openly and to get everyone to understand the benefits. If you don’t, plans will fail.
Trust will be central to this. This will be helped by the disaggregated nature of control that Maltby has laid out (registers will be controlled by a number of different departments), so there is no single point of failure. However, this also means that all departments need to be clued up and upskilled on how to protect and work with data.
These plans are promising, but execution will be key.