Confessions of an Internet pioneer - Vint Cerf's 2020 vision with the benefit of 20/20 hindsight

By Chris Middleton, July 6, 2020
What one of the founding fathers of the Internet would do differently...if he thought he could get away with it!


There are some things that I wish I could have done, but I'm not sure that I would have gotten away with it.

So says Vint Cerf, VP and Chief Internet Evangelist at Google. Along with TCP/IP co-developer Bob Kahn, Vinton Gray Cerf is traditionally regarded as one of the fathers and architects of the Internet, thanks to the pair’s 1974 paper describing what became TCP/IP, work on which began the previous year. 

One of the things Cerf would like to do much more of in the future, he says, is highlight the many mothers of the Internet: women such as Dr Radia Perlman, Judy Estrin, Susan Estrada, Dr Yvonne Marie Andrés, Elise Gerich, and other Internet Hall of Fame inductees, whose work was either critical in the early days of Internet development or has been instrumental in advancing it since. 

Aside from that important point of recognition, there are other things that Cerf now wishes he had done differently in the mid-1970s, as the Internet’s 50th birthday looms in a world that has been changed beyond recognition by it, the World Wide Web, and mobile technology. 

Addressing the situation

The first is the address space, which was based on a back-of-an-envelope calculation of how many termination points would be needed for a global command-and-control network for military forces. Cerf and Kahn estimated 128 countries – “There was no Google to ask” – with a possible two networks per country. That’s 256 networks, or eight bits. 

Then we said, ‘How many computers will there be in each network?’ Remember, this is 1973 when computers were big, expensive things in air-conditioned rooms and they didn't get up and move around. And we said, ‘How about 16 million or something?’ So in the end, we had eight bits of network and 24 bits of host ID. 

That was a 32-bit number and, of course, if we could allocate that fully and densely that would have been 4.3 billion terminations, which was more than there were people in the world at that time. And this was just an experiment, we didn't know if it was going to work, so we adopted a 32-bit address space. 

We considered 128 bits, but it didn't pass the ‘red face’ test; in 1973, I couldn't imagine going to somebody and saying, ‘I need 128 bits of address space to do this experiment’, so we stuck with 32. It lasted until 2011, but the Internet Engineering Task Force recognised around 1992 that we were consuming the address space, so we introduced IPv6 after some arm wrestling.
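The arithmetic behind that decision is easy to reproduce. A quick sketch of the back-of-the-envelope sums, using the bit splits exactly as Cerf describes them:

```python
# The 1973 back-of-the-envelope address arithmetic:
# 8 bits of network ID plus 24 bits of host ID = a 32-bit address.
network_bits = 8
host_bits = 24

networks = 2 ** network_bits        # 256 networks (2 each for ~128 countries)
hosts_per_network = 2 ** host_bits  # 16,777,216 hosts per network
total_addresses = 2 ** (network_bits + host_bits)

print(networks)           # 256
print(hosts_per_network)  # 16777216
print(total_addresses)    # 4294967296, i.e. ~4.3 billion terminations

# The 128-bit space Cerf couldn't justify in 1973 is what IPv6 now provides:
print(2 ** 128)           # ~3.4 x 10^38 addresses
```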

The second issue concerns security, he says, picking up on a complaint all too familiar to IT leaders today:

Lots of people say, ‘You idiot! Why didn't you make a more secure system? It’s falling apart everywhere. What's the matter with you?’ The honest answer is that at the time, I had worked with the NSA on a version of the Internet that could be secured. But the equipment we were using turned out to be classified and I couldn't share information about it with people who didn't have clearances. 

In 1976, two years before the point where we froze the design to start implementing it, a paper was published by Martin Hellman and Whitfield Diffie [and Ralph Merkle], who were colleagues at Stanford, called New Directions in Cryptography, all about public key crypto. That was a powerful conceptual shift in the way in which crypto works, because up until then the keys had to be symmetric.

The public key crypto idea really struck me. However, no software or algorithms were readily available at that time. By this time I'm running the programme and I really wanted to demonstrate its functional capability, so we released the system without the benefit of public key crypto. But it’s retrofittable and, of course, we now use it all the time to support HTTPS and other things.

Actually, you should know that the public key crypto idea was discovered by GCHQ in the UK somewhere around 1974, a couple of years before Marty came up with it. But they didn't publish it, because they wanted to keep it a secret.
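The core idea of the Diffie and Hellman paper – two parties agreeing on a shared secret over a public channel without ever exchanging a private key – can be sketched with deliberately tiny numbers (illustrative only; real deployments use groups of 2,048 bits or more):

```python
# Toy Diffie-Hellman key exchange. The prime and generator are public;
# only a and b are secret. Insecure toy parameters, for illustration only.
p = 23  # public prime modulus
g = 5   # public generator

a = 6   # Alice's private key
b = 15  # Bob's private key

A = pow(g, a, p)  # Alice publishes g^a mod p
B = pow(g, b, p)  # Bob publishes g^b mod p

# Each side combines its own secret with the other's public value.
shared_alice = pow(B, a, p)  # (g^b)^a mod p
shared_bob = pow(A, b, p)    # (g^a)^b mod p

assert shared_alice == shared_bob
print(shared_alice)  # 2 -- the same secret on both sides
```

An eavesdropper sees only p, g, A, and B; recovering the secret from those is the discrete logarithm problem, which is what makes the scheme practical.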

2020 power

Fast forward to 2020, and Cerf fields mounting questions about governance, disinformation, and the growing power of companies like Facebook, Amazon, and his own employer, Google:

When you see a phenomenon like the Internet, which is rich in its evolution, new ideas, and new applications, it is a very open architecture that invites people to invent new ways of using it. But this introduces new kinds of governance concerns: what we do about misinformation, about malware propagating through the network, about someone in one country who is harmed by someone in another. For anyone who is interested in governance, there is simply a wide-open space here for hard work and for international agreements, in order to manage this very complex and very rich environment that we call the Internet and the World Wide Web.

That sounds like a lengthy ‘don’t know’, particularly with the explosion of troll farms, 'Fake News' – and allegations of 'Fake News' – all of which seem to be engendering widespread dishonesty in public life and a mistrust of information itself. Cerf opines: 

It’s an extremely hard problem to solve. The platform is largely neutral, so if you think of YouTube, Facebook, Twitter, they are, generally speaking, neutral in their form and anybody can inject anything into them. Then the question is, how do you cope with the things that you conclude should not be visible? How do you filter that out – at scale? 

We estimate that there are 400 hours of video uploaded into YouTube every minute. There is no way that a cadre of human beings is going to be able to look at all that video and do something about it, so we have to invent automated methods. We train those automated methods against videos that we're concerned about, and then we use machine learning to try to detect other videos that might be similar. But we also need to do something about detecting false positives. 
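The approach Cerf describes – flag new items that resemble known-bad examples – can be illustrated with a toy nearest-neighbour filter. The feature vectors and threshold below are invented for illustration; production systems use learned embeddings at vastly larger scale:

```python
from math import sqrt

def cosine_sim(a, b):
    # Cosine similarity between two feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

# Known-bad examples: each row is a (made-up) feature vector for a video.
flagged = [[0.9, 0.1, 0.8],
           [0.8, 0.2, 0.9]]

# New uploads to screen.
uploads = [[0.85, 0.15, 0.85],   # resembles flagged content
           [0.10, 0.90, 0.10]]   # does not

# The threshold trades recall against false positives: lower it and more
# bad videos are caught, but more legitimate ones are wrongly flagged too.
THRESHOLD = 0.95

for video in uploads:
    score = max(cosine_sim(video, f) for f in flagged)
    print(score > THRESHOLD)  # True for the first upload, False for the second
```

The false-positive problem Cerf mentions lives entirely in that threshold: there is no setting that catches everything bad without also catching something legitimate.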

We clearly have to apply automation. But there is also a filter that we could train which might be even better. It's called wetware up here, but I call it critical thinking. People should exercise critical thinking about what they see and hear. Where did this come from? Is there any corroborating evidence to support a claim? What was the underlying purpose behind putting this up? But that takes work. I'd like to see kids trained to think that way.

In other words, it’s our fault for living in a world of surface noise – partly encouraged by the same platforms. Google was set up to make information easier to find, but content providers now routinely change content to make it easier for Google to find, bending the universe of data around one massive object, distorting the underlying fabric. 

Blockchain isn't the answer

Some have suggested that Blockchain may be a technological solution for authenticating information – notwithstanding the energy problems associated with it, which often see those costs offloaded onto users. Cerf isn’t convinced by any of the arguments in the technology’s favour (and he was speaking at a Blockchain conference):

I’m a well-known blockchain skeptic. It is, roughly speaking, a theoretically immutable distributed database, and it has some useful properties. But I have not seen it scale terribly well. I’m not sure that blockchain is really going to change the Internet. It is a useful tool for aggregating things. Nonetheless, there are some issues, like how long do you have to remember everything in the blockchain in order to validate something that occurred a long time ago? 

In the case of real estate, or federal income tax and the Internal Revenue Service, there are reasons why you have to hold on to things for a long time, and the question is whether blockchain will do that efficiently. I see blockchain simply in the application space. Now the question is: does blockchain have a role to play down in the core of the protocols? I have not seen that, unless it’s getting into domain name assignment.

So what can be done about the growing power of Facebook, Google, et al, and about the approach of some countries, such as China, to using the Internet as a surveillance tool rather than a medium for the free exchange of ideas? Cerf has obviously been asked this before...a lot: 

Wow, this is sort of like the question ‘When did you stop beating your wife?’ I think that, if there are dangers at all – and perhaps there are – they are not all the same. It is not unusual to see concentrations of capacity in the commercial market. It happens a lot in the automobile market in the US. There are economies of scale that often dictate that the number of successful actors in a market will diminish over time. But there is still vigorous competition in the Internet space, even though a person might argue that there should be more. 

It's very important to recognise that the success of some Internet-oriented companies is not guaranteed, nor is it necessarily guaranteed to persist forever. We've seen the visibility and success of some very successful entities diminish over time. As a Google executive, I feel like we have to keep running like crazy, because there's probably a couple of students in some dorm somewhere inventing something better than we have.

As for China: 

China is an interesting alternative. It has invested very heavily in the Internet. It is certainly the country with the largest population of Internet users anywhere in the world – at least 800 million, possibly as many as a billion. They've invested in the infrastructure and they have companies like Alibaba, WeChat, and others. 

On the other hand, China has exhibited a philosophy that is less than open – in fact, dramatically less – and surveillance is an increasingly significant component of the way in which they've chosen to implement the Internet and applications on it. In that sense, there are risk factors, especially if their sense of the Internet propagates more broadly, versus the Western openness view. We can already see that there are regimes other than the Chinese that are also interested in controlling and/or observing what their population does and limiting what they can see or what they can say. 

He concludes: 

I think it's harmful to the fundamental philosophy of the Internet, which has always been to keep it open, allow it to evolve, allow people to share information, and to essentially accelerate progress.

My take

An architect, a prime mover, but also a flawed human being who admits to mistakes with the benefit of 20/20 hindsight. What is interesting is that Cerf now finds himself working in a world of command and control that has, in many ways, returned his ideas to their roots, but perhaps not in the way he intended.