The revised security implications of G-Cloud 6 = buyer beware. Or do they?

By Martin Banks, February 10, 2015
The security implications of the changes introduced with G-Cloud 6 put new responsibilities onto both users and SME vendors, but future solutions may come from elsewhere.


The changes in G-Cloud 6 have, amongst other things, changed the way security issues are addressed by both vendors and users, putting new responsibilities on both camps with the aim of increasing each side's understanding of the security capabilities on offer. There are concerns, however, that the greater responsibilities may yet stretch the capabilities of both vendors and users further than they might expect.

That stretch formed the backbone of discussion in the security panel session at the Think Cloud for Government conference this week.

All three panellists - Ian McCormack, Technical Director for Public Sector Service with CESG, John Godwin, Head of IA and Compliance at Skyscape Cloud Services, and Tony Richards, Head of Security, G-Cloud and Digital Commercial Programme with the Government Digital Service - stressed the shift in emphasis in security provision.

Better communications

Richards talked of where the changes stemmed from: an accreditation process that became a bottleneck as it had to handle other services as well as G-Cloud. More of the responsibility has been put on suppliers - see G-Cloud passes £200 million milestone as suppliers get to mark their own security homework - which now have to state explicitly what they are doing for customers. Richards said:

They also say what evidence they can provide to support their assertions of their security capabilities.

McCormack observed that determining security capabilities had not been overly transparent. The changes put a lot more emphasis on the buyer, and the Cloud Security Principles introduced last April have, in their own way, not necessarily helped. One delegate at the event suggested that understanding them in detail can be a time-consuming and brain-numbing exercise. So there are now efforts to provide a framework for both suppliers and users to work with.

According to Godwin, the approach now is to create an environment where users can 'come in and look' at the suppliers to see if their claims stack up. This puts new responsibilities on both parties.

Vendors need to know not only what the customer wants in terms of security, but also what the customers' customers and end users want. Both parties need to know what their risk appetite really is.

If there is a problem with the security provided by a supplier, either in the evaluation stage or after an application or service is up and running, the Crown Commercial Service (CCS) will have the right to remove that supplier from the whole G-Cloud process.

Richards explained that this usually starts with an audit of the security processes and what the vendor has claimed for them, and if they are found wanting the vendor can be invited to update their processes:

If they don't do that and they are removed from the process, their customers will be informed.

Godwin acknowledged that the changes do suggest that there will be a commensurate change in buying patterns in the user community, not least because the changes do presuppose that buyers are technically literate. In practice, of course, it is inevitable that there are still many technically illiterate buyers around:

There is now responsibility on all suppliers to ensure that everyone understands what they are signing up to and what it all means. There is also a need for more comprehensive information from suppliers. It is no longer good enough for them to give just yes/no responses to questions on a form. It now requires them to give short explanations as to what the 'yes' or 'no' actually means.

A repeated sub-text at the conference was the issue of data sovereignty and the fact that users now have to state where their data is to be stored. This raised the question of whether the security questions should validate this, or whether it was, in fact, a genuinely important issue any more. Richards acknowledged the growing complexities around this whole subject:

It is not just an issue of where the disks are, it is also where the data management is located. Data can be here in the UK, but management services can now be in China or anywhere. It does depend on local legal issues on where data sits, and the law is lagging behind on this issue. In the end it doesn't matter too much where the data is stored if the right protections are in place around it.

As Godwin observed, the underlying issue now is the end users' risk appetite:

If they are happy with having their data in a foreign country then fine. But there is this feeling amongst the security community that home is best.

Off-the-shelf security

This allusion to security tools and services being better than some security experts believe was backed by Sharon Bagshaw, VP Central Government, Defence and Health with IBM UK. She noted, during her earlier presentation, that one enterprise customer's analysis of security offerings showed that the off-the-shelf security from Microsoft Azure was better than that provided by its own security specialists.

It was also indirectly backed up by the security seminar provided by Verizon's Director of Product Strategy, Ryan Shuttleworth, and Andy Bates, who is now leading the company's move into G-Cloud service provision. As Bates indicated, Verizon now 'bakes' security management into all its cloud offerings, so the capabilities are available at all levels:

We are giving users a bit of a take it or leave it strategy, based on using our global network. They can also use our managed services, or they can run their applications on-premise. In our experience so far the majority of public sector cloud users seem happy with that.

According to Shuttleworth the security drivers now are global connectivity and workloads. The connection now is usually over public networks and the trend is towards collecting clouds around the world together:

This creates new challenges for security. It is now a complex, hybrid world, so users have to understand the different ways different data is handled on different environments, especially when local choice is an issue in a global infrastructure and environment. There is now no longer the need to have everything in one big data center. It can be almost anywhere, using programmable everything on elastic resources.

And the underlying use case is now all about the apps, as they drive the workloads. This, he suggested, makes workloads now more important than the infrastructure as the criticality of the app and its data, plus the users’ risk appetites or intolerance, become the key drivers.

Verizon's answer is the provision of a series of bundled cloud offerings. There is a Public Cloud service, which he described as suitable for some data and applications but certainly not all. There is a Dedicated Public Cloud offering for private off-premise use. Of particular relevance to the conference is a Government Cloud, which he described as a deviation from standard services that comes with tighter controls. And there is a Private On-Premise offering which uses standard technologies coupled with user-defined controls.

Lastly, there is also a Bespoke Cloud capability, which is built to meet specific user needs.

The choice is based on risk appetite/intolerance. The vast majority of breach investigations show that problems are down to things like miscellaneous errors, such as incorrect setup or subsequent changes.

On Premise – Pah!

Some sage advice on security was also offered by Ronald Duncan, chairman and CIO of Cloudbuy. When asked about the best options for achieving a secure system, he voted squarely for what conventional user wisdom would reject:

The easiest way to go about it is not to use on-premise systems. They can be broken into in a few minutes, every time. And don't think of cloud security as defending a whole system. Different data requires different security so has to be defended differently.

This thinking gets close to one of the newer approaches to security now evolving, that of self-defending data. Here, data files carry with them the security policies that apply to them.

This gives a highly granular security scenario where some files can be totally unprotected, while others have very tight policies associated with them: who has authorisation to view the data, who can change or act on the data, how they authenticate themselves, what reports the data should make about itself back to the management system, and what actions the data should take if a breach occurs, up to and including self-destruction rather than allowing unauthorised access or actions to take place.
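As an illustration of the idea, a minimal sketch in Python might look like the following. All the class and field names here are hypothetical, invented for this example; real self-defending data products implement this with cryptography and hardened agents rather than a simple wrapper object.

```python
# Illustrative sketch of "self-defending data": the policy travels with
# the data, and the data enforces it and reports on itself.
from dataclasses import dataclass, field


@dataclass
class Policy:
    viewers: set = field(default_factory=set)   # who may read the data
    editors: set = field(default_factory=set)   # who may change the data
    self_destruct_on_breach: bool = True        # destroy rather than disclose


@dataclass
class SelfDefendingFile:
    payload: str
    policy: Policy
    audit_log: list = field(default_factory=list)  # reports about itself

    def read(self, user: str) -> str:
        self.audit_log.append(("read", user))
        if user not in self.policy.viewers:
            return self._breach(user)
        return self.payload

    def _breach(self, user: str) -> str:
        self.audit_log.append(("breach", user))
        if self.policy.self_destruct_on_breach:
            self.payload = ""                   # self-destruct on breach
        raise PermissionError(f"{user} is not authorised")


doc = SelfDefendingFile("payroll data",
                        Policy(viewers={"alice"}, editors={"alice"}))
print(doc.read("alice"))        # authorised read succeeds
try:
    doc.read("mallory")         # unauthorised read triggers self-destruction
except PermissionError:
    print("breach recorded:", doc.audit_log[-1])
```

The point of the design is that there is no single perimeter: each file carries and enforces its own rules, so two files in the same store can have entirely different protections.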

My take

As someone with an acknowledged lack of detailed knowledge of G-Cloud minutiae, I found that the security themes which emerged showed an interesting diversity, from the hand-on-heart assertion of capabilities and form-filling beloved of civil servants, through the increasingly accepted security capabilities of major service providers, and on to the potential of data defending itself.

It did seem that there is now an opportunity to make wider use of what the major service providers can offer off-the-shelf, especially if at least half an eye is kept focused on what may come from future tech developments.

