A new government hub in the UK aims to put data quality on the map

Derek du Preez, February 8, 2021
James Tucker, head of the Government Data Quality Hub, wants to help government departments make their data ‘fit for purpose'

Image of the Thames and Houses of Parliament

The COVID-19 pandemic has highlighted to governments around the world the importance of access to high-quality data. Throughout the crisis, data has played a critical role in guiding healthcare provision, directing lockdown efforts and supporting policy decision making. As the pandemic has persisted, we've heard different sections of the public sector talk about the need to sustain data efforts in the ‘post-COVID-19 world'.

The fragmented nature of government institutions, which often operate in silos and rely on ageing legacy systems, means that access to high-quality information can face numerous obstacles. On top of this, when working with citizen data, placing privacy and security front and centre is an imperative - and often a barrier to making change (as officials fear negative outcomes).

With this in mind, a new team at the centre of the Office for National Statistics (ONS) in the UK, which received funding from the Treasury in the middle of last year, is thinking about how data quality models can be applied to the broader public sector - making data valuable for organisations' decision making. The Government Data Quality Hub is taking the statistics expertise of the ONS and sharing that knowledge with other departments, helping them understand how data can be made ‘fit for purpose'.

James Tucker, head of the Government Data Quality Hub, was speaking recently at an Institute for Government online panel about some of the approaches the team is taking and the work that it is doing in this area. On the problem facing departments and public sector organisations, he said: 

For our work on data quality we've really set about unpicking what that means and how culture applies to the way government departments address data quality. I think that the top issue that we've come across is that data quality isn't always seen as a priority. It can often be seen as sort of a backroom task or something that's done at the very end of the data lifecycle, when data is analysed and used. 

And also, when it comes to recognition of the importance of data quality it often doesn't come into the limelight until there's been some sort of error or poor decision made with the data. 

With this in mind, the historical attitude towards data in the public sector has often been one of ‘tolerance', in that government officials find workarounds for poor quality data only once problems have arisen. Tucker says this is akin to an ‘Elastoplast fix' to data quality issues, which isn't helped by the disparate nature of government. He added: 

In government we're dealing with so many different departments, with their historical ways of processing and managing data, and we have a lot of inconsistent approaches, as well as a lack of knowledge sharing in departments, which does really impede what we're trying to do with data across government.

Tucker said that these problems are taking place against a backdrop of a huge increase in the volume and variety of data sources and new ways of collecting data, which are in turn throwing up even more issues that people are perhaps not familiar with.

However, the Government Data Quality Hub isn't looking to create a checklist or a tick-box exercise for government operators and practitioners. Instead it wants to produce guidance that is principles-based and to encourage people to think creatively about how they manage data quality.

Some examples of this include, Tucker said: 

I think the first one is about commitment to data quality. And the motivation for this really has to come from the most senior levels in organisations. I think the way to generate that is to really tie the importance of data quality into departments' strategic objectives, which makes it really tangible for people.

Data quality by itself can also come across as maybe quite a nebulous term, something that sounds quite general. And then, data quality is also often only considered at the end of the pipeline - so we want to encourage a data quality culture right across the data lifecycle. And by that I mean, understanding how everything from data collection, and people on the front line collecting the data, through to how it's processed and then used, all have an impact on whether that data is fit for its intended purpose. 

Fit for purpose

The crux of the Government Data Quality Hub's mission is to broaden the public sector's understanding of what it means to make data ‘fit for purpose' and what compromises are allowable in order to achieve that.

Tucker and his team are working with the Data Management Association to create a ‘data quality framework', which aims to establish the various dimensions along which you can think about data quality and whether it is fit for purpose, based on your organisational needs. The key components can be seen in the image Tucker shared below:

Image of Government Data Quality Hub Framework
(Image sourced via ONS)

Tucker explained: 

I think the best way to think about data quality is fitness for purpose. It's tempting to say that we want to have high data quality, but that isn't really all that helpful. You don't always need data to be absolutely gold standard; it has to be fit for its intended purpose, and that might mean some sacrifices in accuracy for timely data, for example.

If you want data instantly to meet a particular policy need, for example, you might be prepared to sacrifice a bit of accuracy in favour of that timeliness. So, something that we became conscious of as we were developing this framework was that it's not sufficient just to produce a framework or a guidance document, which is why we developed a plan to create a government data quality hub based at the ONS.
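The trade-off Tucker describes - weighting quality dimensions differently depending on the intended use - can be sketched in a few lines of code. To be clear, this is a hypothetical illustration: the dimension names, weights and threshold below are invented for the example, not taken from the Hub's actual framework.

```python
# Illustrative only: dimension names, weights and the 0.8 threshold are
# hypothetical, not taken from the Government Data Quality Hub's framework.
from dataclasses import dataclass


@dataclass
class QualityScores:
    completeness: float  # share of required fields populated (0-1)
    accuracy: float      # share of records passing validation checks (0-1)
    timeliness: float    # 1.0 if the data is fresh enough for the use case


def fit_for_purpose(scores: QualityScores,
                    weights: dict[str, float],
                    threshold: float = 0.8) -> bool:
    """Weight each dimension by how much the intended use cares about it.

    A fast-moving policy question might weight timeliness highly and
    tolerate lower accuracy; official statistics might do the reverse.
    """
    total = sum(weights.values())
    weighted = (
        weights.get("completeness", 0) * scores.completeness
        + weights.get("accuracy", 0) * scores.accuracy
        + weights.get("timeliness", 0) * scores.timeliness
    ) / total
    return weighted >= threshold


# A rapid policy need: timeliness matters most, some accuracy is sacrificed.
scores = QualityScores(completeness=0.9, accuracy=0.7, timeliness=1.0)
print(fit_for_purpose(scores, {"completeness": 1, "accuracy": 1, "timeliness": 3}))
```

The same scores could fail the check under an accuracy-heavy weighting - which is the point of ‘fitness for purpose': the data hasn't changed, only the intended use has.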

Since being established, the Hub has been working hard to roll out this service across government. Its work can be divided into two main areas, according to Tucker. Firstly, it is focused on setting direction and secondly it is thinking about how it can incentivise improvement.

For instance, Tucker said that he is a big fan of data maturity models, which could provide a framework for departments to understand ‘where they are at' in terms of data capability and where they want to get to. The Hub hopes that it can work with departments to help them advance through their maturity scales and reduce the risks associated with data quality.
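A data maturity model of the kind Tucker favours typically maps a department's current practices onto a ladder of levels. The sketch below shows the general shape of such a model; the level names and criteria are a generic illustration, not the actual model the Hub uses with departments.

```python
# Illustrative only: the levels and criteria here are a generic maturity
# ladder, not the Government Data Quality Hub's actual model.
LEVELS = {
    1: "Initial: data quality handled ad hoc, after problems surface",
    2: "Managed: basic checks exist but vary by team",
    3: "Defined: shared standards and documented processes",
    4: "Measured: quality tracked with metrics across the lifecycle",
    5: "Optimising: quality built in and continuously improved",
}


def maturity_level(practices: set[str]) -> int:
    """Map a department's reported practices to a level on the ladder."""
    criteria = {
        2: {"basic_checks"},
        3: {"basic_checks", "shared_standards"},
        4: {"basic_checks", "shared_standards", "quality_metrics"},
        5: {"basic_checks", "shared_standards", "quality_metrics",
            "continuous_improvement"},
    }
    level = 1  # everyone starts at 'Initial'
    for lvl, required in sorted(criteria.items()):
        if required <= practices:  # all required practices are in place
            level = lvl
    return level


# A department with checks and standards, but no metrics yet, sits at level 3.
print(LEVELS[maturity_level({"basic_checks", "shared_standards"})])
```

The useful property of a model like this is that it makes ‘where they are at' and ‘where they want to get to' concrete: the gap between the current level's criteria and the next level's is the improvement plan.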

And Tucker is hopeful that the ONS's experience with data work will prove beneficial in working with other government departments and the broader public sector. He said: 

One of the great things about the Data Quality Hub is that we're building on a service that was already operational for the government's statistical service. So we have a tried and tested set of approaches for working with departments and techniques for improvement. So really, our task has been to apply this to a much broader community that's involved in all government data, not just data that feeds into statistics.

So it's been great to have something to build on but make it available to that wider audience as well.
