Cyber-security, a town called Wassenaar and the software ostriches

By Martin Banks, June 8, 2015
Summary:
A trans-national arrangement on cybersecurity, coupled with software vendors playing ostrich, may proliferate software vulnerabilities to everyone’s disadvantage.

Let's get cyber

We all know there is a healthy black market for 'researchers' looking to make money by selling details of the code vulnerabilities they find on to those that would seek to use them for malicious intent. Software vendors may wring their hands about it, but there is an alternative if they care to use it – bug bounty schemes.

One of the best ways of killing a black market is to turn it into a white market, which is what the bug bounty programs set out to do.

There are, however, two problems with this suggestion. One is that many of the software vendors seem none too keen on going along with the idea.

The other is arguably more important, for it involves many national governments that seem set on clamping down on the movement of information within the bug hunting community, significantly reducing that community's effectiveness in finding security vulnerabilities.

Not only that, but their collective actions are likely to put many more businesses at risk of malicious attack, not just indirectly through vulnerabilities being left open, but also more directly through some of those researchers then being tempted to part with their information for a fat payment from players out on the dark side.

This is especially so if the authorities believe they are being 'bad', for which 20 years in jail and $1 million fines are a likely result.

The reason this might happen, indeed is very likely to happen, is the Wassenaar Arrangement. This dates back to 1996 and is aimed at stopping the proliferation of uranium enrichment technology and chemical weapons precursors, as well as the proliferation of conventional weapons. The USA, Europe and Russia are signatories, while China, Africa and the Middle East are not.

In this context cryptography has become designated as a munition, and is therefore subject to controls. As of last year, cyber-weapons have been added to the list. Indeed, the US Bureau of Industry and Security (BIS) is now proposing rules to implement the Wassenaar regulations.

A very comprehensive blog post on the subject by security researcher Robert Graham highlights many of the implications, not least that trying to build any community of like-minded bug-hunters will probably fall foul of the Wassenaar regulations.

Graham observes that the proposed BIS implementation of these rules actually goes beyond what Wassenaar set out, affecting a large number of cyber-security products and, perhaps more importantly, cyber-security research.

As Graham writes:

These rules further restrict anything that may be used to develop a cyberweapon, which therefore makes a wide number of innocuous products export-restricted, such as editors and compilers.

And as he points out, the real fun here is that these rules then require a huge bureaucracy to oversee and apply them, which in turn means that mistakes will be inevitable.

The bureaucracy will miss some cyber-weapon precursors as they are transmitted to a 'collaborator' somewhere in the world, while somebody else pondering a bug bounty opportunity will forget to tick a box on some form, send an email to another collaborator, and be banged up for 20 years.

That word – collaborator – is also important here, for this is how the bug hunting community tends to operate. Just like any open source development model, people with like minds and similar areas of interest start to hang together online and discuss things. This goes against the need for structure, a sense of order and the immutable impact of contract terms found within every business.

There are security research businesses that operate along these lines, and established applications vendors may well use them. But, as Graham points out in his blog, some of the biggest and best-known vendors are seriously anti the idea of using them – or anyone – to go bug-hunting around their products.

It does seem as though they would rather carry the costs, and the reputation damage, than acknowledge that they need help in making their code reliable. (And given the way that the reputational damage caused by serious breaches has slid into the laps of users – their customers – some of these vendors do seem to have an arguable point, since breaches often never even end up being perceived as their fault.)

Tapping into the research


Be that as it may, there is some traction amongst applications vendors for using security researchers. But it does seem as though the business model ends up being skewed towards not finding any real bugs. According to sources, the typical interaction is based on contract terms, with the primary one being $XXX spent on Y weeks of research.

With 'Y' often equalling a small number, the researchers search for the low-hanging fruit and both honour and contract terms are satisfied. The vendors feel they have done their bit and shown their code to be pretty bomb-proof.

However, the bug bounty-hunters are the kindred spirits of the open source applications builders. There are thousands of them around the world, most communicating and collaborating in loose teams drawn together to try their chances with an application. And that key word, communicating, is now likely to be considered an illegal act: passing messages or code samples between themselves will be seen as potentially dangerous.

One of the troubles here is that many of the vendors don't even want to talk about bounty schemes, and word has it only a minority take the idea seriously. Some of those are said to reward successful bug discoverers with nothing more than a T-shirt – scant reward for saving a company from dropping its customers in the reputational cock-up mire.

Coupled with the fact that even indulging in unearthing an application bug is now legally on a par with messing with the precursors of a nuclear bomb, the results are likely to be reasonably predictable. Many in the bug hunting community will be put off from getting involved, while those that aren't are more likely to be the ones with more malicious intent anyway.

Those governments that are signed up to the Wassenaar Arrangement will achieve the exact opposite of what they aim for, as the number of application vulnerabilities found and repaired will likely go down. Meanwhile, those with a specific interest in exploiting those vulnerabilities will use their skills to find ways of continuing their practices. The governments' objectives may be honourable, but their tactics are potentially very counter-productive.

A software vendor near you?

What is needed now is for the majority of software vendors to publicly institute bug bounty programmes. They must stop thinking it a 'bad thing' to pay outsiders to demonstrate that their application architecting and code-writing skills may not always be that good. It is a good thing that shows they are human and that they care about the potential holes into which they drop their customers.

They could at least sign up with emerging operations such as Bugcrowd, which operates in an agency role for the bug hunting community, based around its Crowdcontrol vulnerability management system. It even allows vendors to sign up for private services if they want to pretend they don’t need it.

With software vulnerabilities coming front and centre as a primary attack vector for not only criminals but terrorists and national governments, finding them and plugging them is now more important than ever. There is now a need for a drastic rethink on how this issue is tackled.

Vendors pretending it doesn’t happen to them for the sake of their image, and national governments disarming the very people that might protect them best, are no longer appropriate or sensible approaches.

My take

Yes, bug hunters may live on the divide between nice and nasty, but so do informers and 'undercover' police, and these are accepted as inevitable. It is time for the software business to show some maturity – and tell national governments the facts of software life.