The TL;DR version goes like this: the Dreamscope founders set out to build a deep learning powered image editor for general use. The Dreamscope app took off, but the cost of running it on AWS was prohibitive. So they invented a way to build scalable servers from much cheaper GPUs, and Lambda Labs was the outcome. Note: for those interested, you can buy a deep learning laptop from Lambda for as little as $2,493. Blades come somewhat more expensive, but still within the realms of a modest IT budget.
A decentralized computing architecture
Such is the pace of innovation in the blockchain world that Gilder didn't have time to expand on his theory with further examples. He did, however, outline the benefits of a decentralized computing architecture compared with the centralized model epitomized by Google, Facebook, Amazon, and Apple.
According to Gilder, Google faces a major problem. The amount of computing and storage capacity it needs, especially in imaging, is growing so fast that it must invest billions of dollars in infrastructure just to keep pace with demand, all while facing a ferocious competitor in Facebook. Microsoft and AWS face similar challenges, but not to the same extent, because their business models are not built on giving pretty much everything away for 'free'.
Gilder says that in the days when the Internet consisted of email messages, documents passing between learned academies and the odd cat photo, 'free' of course made sense. But the moment you introduce the ability to transact in money's worth, free becomes a shackle that guarantees a race to the bottom, while the economies of vast scale that Google has so far successfully ridden begin to tail off. That race to the bottom is exacerbated by the likes of AWS, which has relentlessly driven down computing and storage costs for cloud hosting.
In a decentralized environment, none of those constraints apply because the network is effectively the spare computing and storage capacity that you, I and anyone else are willing to share in a virtual environment. Dfinity is encouraging crypto miners to provide dedicated server capacity with the end goal of developing:
...a public network of client computers providing a "decentralized world compute cloud" where software can be installed and run with all the usual benefits expected of "smart contract" systems hosted on a traditional blockchain.
In order to make this viable, Dfinity had to solve scalability, one of the major constraints holding back Ethereum-based blockchain deployment in the real world. The company says that it has overcome this problem and expects to go into production with over a million virtual machines at an as-yet-undefined point in 2019.
Why would an enterprise buyer or IT person care? Try this from the Dfinity Medium presence:
...we want to see mass reengineering of enterprise IT systems to take advantage of the special properties blockchain computers provide and to dramatically cut costs. This last point is non-obvious because computation on blockchain computers is far more expensive than on traditional clouds such as Amazon Web Services. But dramatic savings are possible because the vast majority of costs involved in running enterprise IT systems derive from supporting human capital, not computation per se, and the DFINITY cloud will make it possible to create systems that involve far less human capital.
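Dfinity's cost argument can be made concrete with a little arithmetic. The sketch below is illustrative only: every figure in it is an invented assumption, not Dfinity or AWS pricing. The structural point survives the made-up numbers: when human capital dominates the bill, dearer compute can still yield a cheaper system overall.

```python
# Illustrative TCO comparison. All figures are invented assumptions,
# not Dfinity's or AWS's real pricing; only the structure matters.

def total_cost(compute_cost_per_year, admin_headcount, cost_per_admin):
    """Annual TCO = raw compute plus the human capital needed to run it."""
    return compute_cost_per_year + admin_headcount * cost_per_admin

# Hypothetical traditional cloud: cheap compute, heavy ops/security staffing.
traditional = total_cost(compute_cost_per_year=50_000,
                         admin_headcount=5, cost_per_admin=150_000)

# Hypothetical blockchain computer: compute ten times dearer, but the
# self-maintaining platform needs far fewer people to operate securely.
blockchain = total_cost(compute_cost_per_year=500_000,
                        admin_headcount=1, cost_per_admin=150_000)

print(f"traditional cloud:   ${traditional:,}")   # $800,000
print(f"blockchain computer: ${blockchain:,}")    # $650,000
```

On these assumed numbers, the blockchain system wins despite compute being ten times more expensive; tilt the staffing assumptions the other way and it loses, which is precisely why Dfinity calls the point non-obvious.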
That's slap bang in the middle of the market wheelhouse we address for enterprise apps. It is, therefore, an opportunity for many of our partners to get out from under the shackles of near-permanent security patching while providing the basis for reduced costs in the hands of end users. In turn, that should mean the enterprise vendors can expand their reach at a fraction of today's pricing.
However, there's a gotcha. We don't see any way that Dfinity's heady ambitions will work for deployed systems in the near future. Quite apart from the obvious engineering issues, there is the question of governance. As I understand the Dfinity system, it will be impossible to know the location of data being processed in these systems. That's a no-no under many forms of regulation around the world.
So on the one hand, Dfinity is offering us an early look into the future, but it is one that will require considerable technical work to make viable, combined with regulatory rethinking about data protection and sovereignty. In a worst-case scenario, the decentralized operators are confined to dev and test instances only for years to come.
Having said that, in years past we thought that AWS would only ever be used for dev and test instances. Today, it is clear that the partnerships we see developing between Microsoft, Google, and AWS and a variety of enterprise vendors have opened the door to cloud production computing.
Of course, this might all crash and burn but it is perhaps a firm sign of confidence that according to Fortune, this week:
Dfinity raised $102 million from the prominent venture capitalist firm Andreessen Horowitz, the crypto fund Polychain Capital and other investors. The new funding, which comes after Dfinity raised $61 million in March, will help build its vision of a blockchain-based “Cloud 3.0.”
The video below is well worth the time investment because it shows what is possible and provides clues as to where Dfinity sees this going.
Storj - a new way to provide and use storage
What about Storj? This video, recorded by Alex Williams of The New Stack with the co-founder and interim CEO of Storj, is instructive.
— The New Stack (@thenewstack) August 29, 2018
For those who don't have the time to sit through half an hour of chat, the TL;DR goes like this: Storj is using the principles of decentralized computing enabled by the blockchain to offer on-demand storage for open source communities. Most interesting to me is the fact that Storj has created a two-sided market where 'farmers' (who provide the storage capacity) are rewarded while users naturally pay a fee. Ben Golub's explanatory blog post sums the model up.
Critically, Storj has opened its doors to partners participating in the beta program (the company hopes to go fully public in early 2019) and has attracted interest from Confluent, Couchbase, FileZilla, InfluxData, MariaDB, Minio, MongoDB, Nextcloud, Pydio and Zenko.
To give you an idea of the economics involved, Storj has raised a total of $35.4 million across seven rounds in two years, of which $30 million was from an ICO last year. Even with that relatively small amount of funding:
Storj managed to build a highly distributed, performant, secure, and economical storage network (over 150 PB, in nearly 200 territories and countries.) We did not do so by massive capital investments, but instead built incrementally by compensating people who operated underutilized drives and incenting them to build up supply. The margin picture is different too, as we leverage existing equipment, and require our farmers to make little or no incremental investments in power, bandwidth, or people.
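To make the two-sided mechanics tangible, here is a toy settlement in Python. The storage rate, the revenue split and the pro-rata payout rule are all my assumptions for illustration; Storj's actual pricing and token mechanics differ.

```python
# A toy settlement for a two-sided storage market, in the spirit of the
# farmer/user model described above. All rates are invented assumptions;
# Storj's real pricing and token mechanics are different.

STORAGE_RATE = 0.010   # $ per GB-month charged to users (assumed)
FARMER_SHARE = 0.60    # fraction of revenue paid out to farmers (assumed)

def settle(usage_gb_months, farmers):
    """Charge users for usage, split the farmer pool pro rata by capacity served."""
    revenue = usage_gb_months * STORAGE_RATE
    pool = revenue * FARMER_SHARE
    total_served = sum(farmers.values())
    payouts = {name: pool * served / total_served
               for name, served in farmers.items()}
    return revenue, payouts

revenue, payouts = settle(
    usage_gb_months=100_000,                    # users stored 100k GB-months
    farmers={"alice": 60_000, "bob": 40_000})   # GB-months each farmer served

print(revenue)   # 1000.0 -> users paid $1,000
print(payouts)   # alice earns $360, bob $240; the network keeps the rest
```

The structural point is that supply grows without capital expenditure by the operator: farmers front the hardware, and the settlement simply routes a share of user fees to them.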
In this case, the key has been incentivizing the community of those participating in the network. In that sense, the marketing requirements that usually saddle DC operators start to fall away. Yes, attention matters, but an ecosystem of people benefitting financially is a much stronger magnet than a few screeds in the trade or open source media.
Where does this fit in the enterprise? I can imagine myriad uses of this technology for the many projects that enterprises undertake as part of their internal development. I can see agencies that specialize in Couchbase, MariaDB, MongoDB and others using this type of service as secure repositories for ongoing projects.
Dfinity and Storj are not alone in this field but they are confident and moving forward at pace.
The traditional enterprise mindset has been one of command and control. Don't put anything on the public network that even has the faintest whiff of being sensitive. These decentralized models - provided they are proven to be as secure as claimed - challenge that way of thinking.
There is a reason some of us have been nervous about the meteoric rise of AWS, beyond the potential vulnerability of the massive data centers that, from time to time, go down. In a centralized system, there is very much a 'winner takes all' theory of future economics. That, in turn, brings the specter of monopoly power and the pricing ratchets that go with it. Decentralization neutralizes that problem, which should mean enterprises can predictably cost out their computing and storage needs in dev and test situations, and at a lower cost than is possible from current DC operators.
Does this mean it is game over for the likes of AWS? Absolutely not. If anything, I see this as an opportunity for a significant expansion of the addressable market, one in which those who believe in the decentralized model can be profitable participants.
I am, though, interested to see whether this foundational infrastructure leads to the emergence of fresh ways of transacting across whole supply chains that we have not seen before.