National Cyber Security Centre Director - ‘I want less scare tactics, more data’

By Derek du Preez November 1, 2016
Summary:
Dr Ian Levy, technical director of the recently announced National Cyber Security Centre, argues that cyber crime is all smoke and mirrors at the moment.


The Chancellor of the Exchequer, Philip Hammond, yesterday announced Britain’s new £1.9 billion cyber security strategy - which will support the nation’s progress in defending against online criminal activity over the next five years.

Core to the strategy was the announcement of a new National Cyber Security Centre, which will be based out of Victoria in Central London, and be the UK’s “outward facing authority on cyber”.

Speaking at Microsoft’s Future Decoded event in London this week, Dr Ian Levy, Technical Director at the National Cyber Security Centre, gave delegates insight into some of the objectives of the newly formed organisation.

Dr Levy’s talk centred around the idea that if we, as a society, want to take advantage of the latest technologies - such as artificial intelligence - then we are going to have to take the fear out of cyber crime. He believes that much of what we hear in the media at present is largely scare tactics born of a lack of understanding, and that if we inform users with data and root cause analysis of problems, then informed decisions can be made based on risk.

Using the example of aeroplane safety, Dr Levy said a similar approach could be applied to the internet. He explained:

Does anybody know why planes don’t have square windows? You look around everywhere, and windows are square. Apart from aeroplanes and a couple of other places where they are squarish. In 1953 three aeroplanes exploded in mid-air, which is obviously a bad thing for aeroplanes to do.

They found some wreckage and around the corners of the windows they found diagonal stress fractures from metal fatigue; the windows had popped out and bad things happened.

They wanted to prove that that was the case. So they stuck one in a water tank and filled and drained it 1,500 times to simulate the pressurisation cycles. Then they causally proved that that was the problem. So now you have a system with a fielded vulnerability. What do you do? Do you run around yelling ‘nobody go on a plane’?

No, because every plane had square windows. So they put in place monitoring, they put in place design changes. And over time you start to reduce the vulnerability and remove it from the system. That’s systemic root cause analysis.

Levy added that the consequence of this systemic root cause analysis, applied to aeroplane technology, is that aircraft safety has improved dramatically over the past four decades. Whilst in 1970 you were relatively likely to be in a plane accident, and relatively unlikely to survive if you were, by 2005 these risks had diminished: it was relatively unlikely that you would be in a plane accident, and relatively likely that you would survive if you were.

Dr Levy said:

That’s good, that’s harm reduction. That’s what we should be doing in cyber security, we have to change the narrative to talk about harm reduction, rather than vulnerability reduction.

Fear

Levy went on to say that the image in the media of hackers and cyber crime is one that typically perpetuates fear, driven by the unknown. He believes that this approach to cyber safety takes people back to “medieval witchcraft” where they don’t know what the problem is, are not entirely sure how to fix it, and will likely use some “magic” solution that will only work for half of the people.

He said:

I’ve been doing this for 15 years. And let me tell you, for the majority of the attacks I’ve seen, I don’t think APT stands for Advanced Persistent Threat. I think it stands for Adequate Pernicious Toe-rags.

They’re adequate because they use vulnerabilities that were patched years ago. They’re pernicious, because let’s be honest, they are. And they’re toe-rags because that’s the least sweary word I could come up with beginning with a T. This is how we have to start talking about this stuff. A lot of the attacks we see on the internet today are not perpetrated by winged ninja cyber monkeys. My attackers have to obey the laws of physics, they cannot do stuff that’s physically impossible.

So let’s talk about how they actually do stuff. Let’s change the narrative so that people can make rational, risk management decisions on their own by giving them high quality information.


Dr Levy added that telling people to do things like not open emails they don’t recognise, or change their passwords every 30 days for every application, simply amounts to blaming the user for a system that was designed wrong. He believes that this means getting the user to compensate for bad system design, which is “stupid”.

He said:

If we are trying to secure the UK. And we are trying to make the UK a better and safer place, this kind of advice has to go. We have to make it much more user centric, stop blaming the user. Give them the information and let them make decisions.

Cyber security today, it all runs on fear. The entire industry runs on fear. Everything that we do as an industry is about making it sound really, really bad. Because then you can’t possibly defend yourself and buy the magic thing. There is no other part of public policy that allows this to happen. Nowhere else in public policy do you allow fear to rule the public’s perception. And so, my job is to change that fear into evidence. Driven by data.

Becoming data-driven

Instead of focusing on ‘winged ninja cyber monkeys’, Dr Levy wants to take a data-driven approach and create an evidence-based discipline - which will be the primary focus of the new National Cyber Security Centre.

Over the next five years, the centre will spend time collecting data and being transparent about the threats facing the UK and internet users. He said:

I want to start generating real, national scale data, so that we can have metrics that mean something to the average person on the street. I want to explain to them how we are spending their money, so they can understand the measured effect we are having.

More importantly, we need to do it transparently. Transparency in cyber security is unheard of as far as I can tell. We have never had a national-scale, data-driven understanding of what is actually going on. What is the national threat picture? What does it actually look like? Who are these people that are attacking us? How successful are they? What does it mean to you?

Until we have that kind of conversation, where people make sensible, value based risk management decisions every day, we will never reap the benefits of all the new stuff that’s coming. People will be too scared. We have to get underneath the hyperbole, and start to do this in public, transparently. That’s the goal of the new National Cyber Security Centre.