Back in 2009, Philippe Courtot, chairman and CEO of vulnerability management specialist Qualys, spoke at the annual RSA security conference to outline a dream. He saw that the typical security solution was to add layers of security onto the network currently in use, and that a far better solution was likely to be found in internet and cloud technologies.
His key premise was that, in a cloud-based environment, it would at last be possible to manage security, and the potential introduction of vulnerabilities, effectively.
He saw security requirements changing in ways that old technologies and approaches could no longer match, in much the same way that the advent of client/server architectures had rapidly and completely demolished the world market for minicomputers (remember them?). The new direction, in his view, had to be based on the internet and cloud services. If the technology had the potential to deliver compute, information and data services around the globe, the security threat would have the same potential, and the defences against it had to follow suit.
The fact that it has taken so long for security services to really exploit the potential of the cloud is, in his view, down to two main factors. One has been the lack of suitable technology from the security industry, and the other has been resistance from security people, both within user businesses and among security tools vendors.
"We have had a lot of resistance from security people that were very against a cloud solution," he says. "And they had good reason, in a way. I'm not criticising the judgement at the time, for it was new. And of course, people don't like to change."
The emergence of open source has been the catalyst in allowing the technology side of the argument to gain strength. Open source software has brought the flexibility to assemble complex services and applications from a vast supply of code options, and with it the ability to scale them to match the capabilities of the cloud.
It has also allowed the Dev/Ops movement to grow, as it has taken away the restrictions that came with commercial applications and their limited operational view of the world. At the same time, Dev/Ops engineers have become the main buyers of code across the board, making them the new key targets for security tools and services. In that role they are also bringing their own perspectives on what makes good security.
Eating a whole cloud at one go
It is against this background that Qualys has now brought Courtot's original dream to life in the form of a Global IT Asset Discovery and Inventory app. This is, in effect, an agent-based service that provides monitoring capabilities right across a business network, identifying what is on the network, when it is on, and what it is doing.
There are two interesting aspects here: one is that it is free to download and use, and the other is that this is no mere snapshot-taking sales gimmick. Once installed, it carries on working, providing real-time monitoring. It is therefore capable of maintaining a continuous, real-time inventory of known and unknown assets across the global IT footprint of a business, from on-premises systems and endpoints to multi-cloud, mobile, containers, OT and IoT.
It can automatically classify, normalise and categorise assets to ensure consistent data, search across them so that any asset can be identified and examined, and detect any external device that connects to the business network using passive scanning technology.
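As a rough illustration of what classification and normalisation involve, the sketch below maps raw discovery records from different sources onto one common schema, so the same device is reported consistently however it was found. The `Asset` schema and keyword rules here are hypothetical assumptions for illustration, not the Qualys implementation, which will use far richer fingerprinting.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Asset:
    hostname: str
    category: str  # e.g. "server", "mobile", "iot", "unknown"

# Illustrative keyword -> category rules; a real product would use
# detailed OS, traffic and hardware fingerprints instead.
CATEGORY_RULES = {
    "esxi": "server",
    "ubuntu-server": "server",
    "iphone": "mobile",
    "camera": "iot",
}

def normalise(raw: dict) -> Asset:
    """Map one raw discovery record onto the common schema."""
    hostname = raw.get("host", "").strip().lower()
    os_name = raw.get("os", "").strip().lower()
    category = "unknown"
    for keyword, cat in CATEGORY_RULES.items():
        if keyword in os_name or keyword in hostname:
            category = cat
            break
    return Asset(hostname=hostname, category=category)

records = [
    {"host": "CAM-Lobby-Camera", "os": ""},
    {"host": "web01", "os": "Ubuntu-Server 22.04"},
]
assets = [normalise(r) for r in records]
```

The point of the normalisation step is that downstream search and reporting can then treat every asset identically, whichever discovery source produced the record.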
Of course, Qualys then has a range of additional monitoring and management tools available, at a price, to allow users to examine devices more deeply and manage them as part of the business network. For example, the company offers a self-updating agent that can turn an unknown, external device into a managed device, and/or scan it for vulnerabilities. If necessary, the device can then be quarantined if it is compromised or falls outside compliance rules.
Additional features in this category include the compilation of hardware and software lifecycle and licence information, the tagging of business-critical assets and, quite important these days, the synchronisation of assets with the ServiceNow CMDB. According to Courtot, the service can now handle 3 trillion data points on the company's Elasticsearch clusters, with the whole service managed via a single-pane-of-glass console.
"You couldn't do that with enterprise software. But you can do it with a cloud architecture, because you can put all the data into one place. And then we also have come up with a passive scanning solution, which allows us to detect anything that connects to your network, and essentially fingerprint and identify what it is. Essentially, we really provide the single source of truth."
In his view we are now at another 'minicomputer' tipping point, with more and more changes coming faster and faster. He sees the buyers and technology changing fast, and that many established vendors will have increasing difficulty in re-architecting their security products and services to come up with client-centric solutions.
"You saw what happened with the minicomputer; they really didn't last very long."
Once users have a real time, global view of their entire environment, his expectation is that it becomes very easy to identify any anomaly. And in a cloud environment where third parties can be legitimately connecting and interacting with any point across the business network, any anomaly is potentially dangerous, and dangerous in real time.
In his view, what businesses really need most is therefore an overall context within which any connection of any device is made. Systems in head office, chuntering away on product development, sales call management and accounts ledgers, have an obvious context that is immediately clear, but the part played by a system from a third-party business partner may not be so clear until it is identified and the context of its connection understood. And because the agent sits on every approved device on the network, the lack of one on any device rings the 'anomaly bell' loud and hard.
One issue here is that most malware, these days, is designed to mutate. That way, it can become something different and, most important of all, not identified as a known threat. To overcome this, Qualys has developed the means to classify vulnerabilities and malware instances into families.
"So instead of looking for just the pure 100% Indication of Compromise match, we look for a family match. But we don't really know for sure whether that device is already compromised, or whether it could be an artefact. So then that's where we use our passive scanning to look at the network traffic and see what's coming in and out of the device, what is the command and control type of communication that we see. That can equate to a compromise, and now we can quarantine it automatically."
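One way to read the family-match approach Courtot describes: rather than demanding an exact indicator-of-compromise match, compare an observed behaviour set against known family profiles and accept a similarity score above some threshold. The family names, behaviour features and use of Jaccard similarity below are illustrative assumptions, not Qualys's actual technique.

```python
def jaccard(a: set, b: set) -> float:
    """Similarity of two feature sets: |intersection| / |union|."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

# Hypothetical behaviour profiles for two malware families.
FAMILY_PROFILES = {
    "emotet-like": {"spawns_powershell", "writes_run_key", "beacons_http"},
    "mirai-like": {"scans_telnet", "beacons_irc", "kills_watchdog"},
}

def classify(observed: set, threshold: float = 0.5):
    """Return (best_family, score) if the best match clears the
    threshold, otherwise (None, score) - i.e. no family match."""
    best, score = None, 0.0
    for family, profile in FAMILY_PROFILES.items():
        s = jaccard(observed, profile)
        if s > score:
            best, score = family, s
    return (best, score) if score >= threshold else (None, score)

# A mutated sample shares 2 of 3 behaviours with the emotet-like
# profile, so it still matches despite no exact indicator match.
family, score = classify({"spawns_powershell", "writes_run_key"})
```

As the quote makes clear, a family match alone is only a suspicion; confirming compromise still depends on watching the device's actual traffic.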
Closing the contextual loop, as well as quarantining compromised devices, the system uses APIs to connect to most of the popular remediation tools, so that businesses can be kept up and running.
This is an interesting example of the shift in emphasis that comes with working with cloud-delivered services. First of all, cloud services are already pretty dispersed entities for users, and this will only get better/worse (depending on your point of view). Tracking down vulnerabilities is therefore only going to become a harder and more complex job that current security technologies may have difficulty keeping pace with. Secondly, that subject of 'context' will become all-important. What is on your network, and what it is doing while it is there, has to be mapped against what you want to achieve using the network, and whether the outcomes meet the business objectives. Knowing 'how' to deal with vulnerabilities is one thing; knowing 'why', 'when' and 'whether' are more complex, and now more important, objectives.