Commodity AI is coming - more than just AIRI fairy thinking?

By Martin Banks, March 28, 2018
Summary:
A partnership between Nvidia and Pure Storage is leading to the development of AIRI, an AI and machine learning 'appliance' that aims to bring AI to the 'masses'.

In the same way that turning servers into commodity items has opened up serious levels of compute power to the masses via the cloud, a partnership between Nvidia and Pure Storage is setting out to use the same commoditization model to achieve a similar result for Artificial/Augmented Intelligence.

As partnerships go, this is the kind of non-legal 'buddying up' operation we might see more of in future: a coming together of two vendors that have spotted an opportunity where a whole new product or service can be created between them that is potentially bigger than the sum of the (probably quite specific) parts.

The target market is a good bit more defined than just 'the masses'. According to Alex McMullan, CTO of Pure Storage, the target is the data science community across the board. But it is not too difficult to speculate that, if the partnership hits the sweet spot it is aiming for, this could be the precursor of AI spreading much further.

If the cost of processing AI applications and storing the relevant data can be brought down to the right price point, while at the same time convenience and ease of use are pushed up to the point where little thought or pre-planning is required, then putting AI resources where the data already is becomes a distinctly feasible option.

The result is a packaged data-hub 'appliance' approach that can be used on premises or as a SaaS offering. The latter will come from the partners targeting cloud service providers as a potential market. This also maps onto the growing trend of distributing compute resources out towards the network edge, close to the coalface of everyday business activity.

Are you ready for AIRI?

Known as AIRI (AI Ready Infrastructure), it is a combination of Pure Storage FlashBlade storage systems, which have been developed with modern analytics and AI in mind, and four Nvidia DGX-1 supercomputers, based on the company's Tesla V100 GPUs and delivering 4 petaFLOPS of performance. Arista 100GbE switches tie it all together to run Nvidia's GPU Cloud deep learning stack and the Pure Storage AIRI Scaling Toolkit. McMullan says:

This is the next generation of AI infrastructure, making AI and machine learning accessible to everyone. These were previously only really available to academia. We are setting out to solve the problem many data specialists in business have of answering questions such as, 'How do I get into it?', 'How do I demonstrate it to my board?' or 'How can I start making use of the data I have available?'. You can now get into AI using a third or even a quarter of a data center rack rather than on a turnkey basis.

The potential here could be significant. Market research firm Gartner has predicted that some 80% of enterprises will deploy AI by 2020. Given that McMullan claims an AIRI setup can shrink the 60-70 data center racks committed to AI down to around half a rack, giving significant savings on space utilisation, power consumption and heat management requirements, something of this type will be needed if the Gartner prediction is to be even close over the next two years.

AIRI is already compatible with Docker containers, so AI applications can be ported into the environment. This should make it possible for applications to be up and running without lengthy porting or implementation/optimisation issues. McMullan also expects to see a move towards Kubernetes-based services orchestration in the future, as integration higher up the stack becomes more generally available:

We have put considerable effort into helping users work with data gravity and mobility issues, and with how difficult it can be to move large datasets around as it becomes harder to move data than to store it. But if users want to be in control of data behaviour then it should be mainly in one place, and that is the idea behind building a data hub.

It all sits in one place and feeds GPUs, PC grids, VMware files and the rest from a central repository. Feedback from data scientists suggests they spend a good part of their day on data cleansing and simply moving data around, and they really want to keep their expensive GPUs working as much as possible.
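
To make that 'feed the GPUs from one repository' idea a little more concrete, here is a minimal sketch of how a containerised training job might be launched against such a shared data hub. It assumes a Docker host with GPUs and the hub exposed as a local mount; the container image tag, mount path and training script are illustrative placeholders, not anything taken from the AIRI documentation.

    # Illustrative sketch only: launch a GPU training container that reads its
    # data directly from a shared data-hub mount instead of copying it locally.
    # Image name, paths and script are hypothetical placeholders.
    import docker

    client = docker.from_env()

    container = client.containers.run(
        "nvcr.io/nvidia/pytorch:latest",               # assumed NGC-style image tag
        command="python /workspace/train.py --data /data",
        # Mount the data hub read-only inside the container; the dataset stays put.
        volumes={"/mnt/datahub": {"bind": "/data", "mode": "ro"}},
        # Request all available GPUs on the host for the job.
        device_requests=[docker.types.DeviceRequest(count=-1, capabilities=[["gpu"]])],
        shm_size="16g",
        detach=True,
    )

    # Stream the training logs back to the data scientist's session.
    for line in container.logs(stream=True):
        print(line.decode().rstrip())

The point of the sketch is simply that the dataset is mounted in place rather than shuffled onto local disk first, which is what keeps the GPUs busy and the data scientists out of the data-moving business.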

As a slightly different take on the move of the overall marketplace beyond the technology of IT, McMullan also observed that he doesn't see AIRI as an IT product. Rather, he sees it as a data product being pitched at the data science community in just about every branch of industry and business. In effect, the idea is to sell it as a consumable that can be plugged in, switched on, and used.

The long-term benefit for users here is that, presuming it establishes itself in the minds of data scientists during the inevitable pilot project phases, it should then allow applications to be scaled out to large production environments in much the same way as hyper-converged systems do today. The approach of wheeling in and connecting up new resources and letting the system self-discover and utilise them has the potential to make scaling another 'no-brainer' exercise.

My take

Think about where this could end up: using AI out at the network edge is a definite goal, with direct IoT implications for economically viable distributed intelligence for management and control. Those same implications apply to non-IoT applications across many areas of business and commerce processing and management.

Longer term, but with AI still very much in mind, it should be remembered that Nvidia makes GPUs – Graphics Processing Units – and Pure makes fast storage. There is scope here for them to take the AI appliances and, as a first step, make pretty good facsimiles of in-memory processing systems.

Any sign of that being a market could soon lead to a much deeper partnership, with a unified in-memory processing chip as the end result. I know nothing at all of such a possibility, but I would just observe that HPE is already moving inexorably along that road with its 'The Machine' project, so if these two are not thinking along such lines, they really ought to be by now.