The quid pro quo of the need for business-led innovation in the application of IT is that the vendor community must produce development environments in which those innovations can not only emerge but also be brought into production as quickly and effectively as possible.
In practice, this means packaging up the underlying technology so that it is not the focal point of the development process; the business applications and processes are.
This notion formed part of the theme underpinning the presentation given by Timo Elliott, SAP’s EVP and Global Innovation Evangelist, at the recent Leaders Meet Innovation conference in London. Later that day, he sat down to talk through some of the issues involved in delivering such tools and to unveil the company’s latest moves in that direction with the announcement of Leonardo.
Named after – and for – the famous Italian artist, engineer and polymath Leonardo da Vinci, this is half of a twin thrust by SAP into advancing the development of applications in and around IoT. The second element is a Jump Start program, which is part of the company’s recently announced €2 billion investment in IoT. They are expected to be available later this year.
According to a blog by Tom Raftery, SAP’s Global IoT Evangelist, Leonardo will consist of six pre-configured solutions, each targeting a broad application area and all using a ‘Connected…’ name. So, there will be:
- Connected Products aimed at end-to-end visibility of product-centric operations
- Connected Assets to link production systems and assets with manufacturing and maintenance business processes
- Connected Fleet for managing any assets that move
- Connected Infrastructure for delivering new forms of digital operational intelligence
- Connected Markets for optimising the utilisation of resources and assets
- Connected People, which is pretty much self-explanatory.
The Jump Start package is outlined by Raftery in his blog:
The offering consists of an executive design thinking session to kick off the customer’s project and identify an area for innovation, a rapid prototyping workshop to develop a real prototype to validate the vision, and finally an implementation phase to convert the prototype into a live pilot project and define an IoT roadmap for further business processes. To my mind, the most interesting aspect of the offer is that it has a fixed price for the software and services to cover the pilot and first year of usage of SAP Leonardo solutions.
Up with the umbrella
As Elliott sees it, Leonardo is the umbrella brand for an IoT solutions platform that bundles services with some of the company’s existing IoT products. This will include providing increasing levels of connectivity and collaboration between IoT systems and the ERP tools of standard business management. The idea is to bring them ever closer into lockstep, says Elliott:
Every industrial customer we’re working with is investing strongly in the Internet of Things and connected machines. What is new is that it’s increasingly connected and used to create new products and services.
As an example of how this allows businesses to find new markets and reinvent themselves to meet those new needs, he referenced a German company that traditionally sold air compressors but realised, through the use of predictive analytics, that what it was really selling was a reliable supply of compressed air.
The ERP collaboration work is well under way, with SAP now working with most of the main IoT technology vendors. Some, such as GE Digital with its Industrial Internet focus, are an obvious target for some specific collaboration tools between SAP’s ERP bailiwick and GE’s Predix management suite, though Elliott was in no mood to confirm any developments beyond the general:
This is our live business. Our tagline is ‘Seamlessly connected’. But I don’t know the details of the GE deal. There are various consortia being created working very closely with Siemens, with GE, with various platforms. Obviously, ultimately, we’re going to do what’s right for our customers. We have customers in all different types of industries and they’re all working with lots of different IoT platforms. They want to be able to connect the data in their SAP systems to all of these new platforms, so it’s going to be a period of co-opetition for us.
Leonardo can obviously serve as a model to help users develop other, business management-focused innovations, and the company already has a huge repository of data models, management tools and raw historical data which can be sliced and diced in any number of ways to form starting platforms for business users.
The company is particularly keen to make data of this type widely available to the small and medium business sector as well as its traditional market of large enterprises. This, Elliott said, is part of the SAP Startup Application, and one of the key goals here is to help users innovate new applications that exploit data analytics to best effect:
When we came up with the cloud platform we really wanted to make sure that it was something that would also be suitable for startups. So we ensured we included small organisations and made sure they could integrate that. First let’s start with reality. Where we are today, most analytics projects run into problems or are ultimately disappointing because it’s really hard. So we’ve got a massive opportunity just to do a better job of the data we’ve already got available. Honestly, that alone is huge.
This, he suggested, goes back to integrating data. One of the biggest problems in using data effectively is integrating it across different sources. He also said that SAP has some new technologies that are likely to make that much more powerful, so that artificial intelligence and predictive algorithms can also be used for the data matching. The idea is to use some of the power of analytics to improve the analytics themselves:
When it comes to the privacy side of things, it does mean that you can find different data sources and see insights that wouldn’t otherwise be available. Again, I tend to believe that, unfortunately, people's most embarrassing secrets are stored in a database somewhere, and now it’s just a question of whether we have the right controls in place to ensure that someone with bad intentions can’t get that data out.
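The cross-source data matching Elliott describes can be sketched in miniature. The example below scores candidate record pairs from two systems by string similarity and links those that clear a threshold; the field names, sample data and threshold are illustrative assumptions, not SAP APIs, and real matching engines would learn such weights rather than hard-code them.

```python
# A minimal sketch of cross-source record matching: score candidate
# pairs by string similarity so likely duplicates across two systems
# can be linked automatically. Names and thresholds are illustrative.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Return a 0..1 similarity score for two field values,
    ignoring case and surrounding whitespace."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_records(source_a, source_b, threshold=0.8):
    """Pair each record in source_a with its best match in source_b,
    keeping only pairs whose name similarity clears the threshold."""
    matches = []
    for rec_a in source_a:
        best = max(source_b, key=lambda rec_b: similarity(rec_a["name"], rec_b["name"]))
        score = similarity(rec_a["name"], best["name"])
        if score >= threshold:
            matches.append((rec_a["id"], best["id"], round(score, 2)))
    return matches

# Hypothetical records from an ERP system and an IoT platform.
erp = [{"id": 1, "name": "Acme GmbH"}, {"id": 2, "name": "Globex Ltd"}]
iot = [{"id": "A", "name": "ACME GmbH "}, {"id": "B", "name": "Initech"}]
print(match_records(erp, iot))  # [(1, 'A', 1.0)] — Acme links, Globex does not
```

The point of the sketch is the shape of the problem: fuzzy matching replaces brittle exact-key joins, which is where predictive algorithms can add value.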
Getting the right data
Providing data itself to users as part of the toolset of innovation has the potential to be something of a double-edged sword for any vendor. There is the chance that it might resell data that is subject to privacy constraints that are unknown and not obvious, for example, and some of the data may be of limited value through being a generalised, third-party dataset of indeterminate history, and therefore of arguable value to any analytical process.
There is also the possibility that such data, even if based on recently anonymised real records, will be associated with business processes that may not match the needs of the new business processes that analytics-based innovations are likely to throw up.
Elliott does not see this as too much of a problem, however:
Ultimately, the data stored from SAP systems or any other system, that’s robust. It’s just facts about what happened in the past. There is no such thing as bad facts or wrong facts, they’re going to be a solid foundation for anything you want to do with your customers in the future. Certainly data dates. More recent data is more effective and there are some industries where it turns around so fast that yes, old data is not necessarily important. But if you’re doing anything to do with supply chains or manufacturing, that data lasts a good few years.
The company is also working to build and refine more datasets to meet a growing potential. Examples of the more intelligent apps that SAP is working on include invoice matching, which provides the ability to match huge numbers of incoming payments with multiple purchase orders. This uses machine learning technologies to observe staff actions and learn when there is a very high probability of a payment/purchase order match.
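The matching step above can be sketched as a scoring problem: each open purchase order is scored against an incoming payment, and a pairing is auto-accepted only when the score implies a very high match probability. The features, weights and threshold below are illustrative assumptions; SAP's actual models learn these from observed staff actions rather than hard-coding them.

```python
# A hedged sketch of payment-to-purchase-order matching: combine simple
# signals into a score and auto-match only above a high threshold,
# leaving ambiguous cases for a human. All values are illustrative.

def match_score(payment, po):
    """Combine two signals: an exact amount match, and whether the
    payment reference text mentions the PO number."""
    amount_signal = 1.0 if abs(payment["amount"] - po["amount"]) < 0.01 else 0.0
    reference_signal = 1.0 if po["number"] in payment["reference"] else 0.0
    return 0.6 * amount_signal + 0.4 * reference_signal

def best_match(payment, open_pos, threshold=0.9):
    """Return the best-scoring PO number, or None when no candidate
    clears the auto-match threshold (a clerk then decides)."""
    score, number = max((match_score(payment, po), po["number"]) for po in open_pos)
    return number if score >= threshold else None

open_pos = [
    {"number": "PO-1001", "amount": 2500.00},
    {"number": "PO-1002", "amount": 740.50},
]
payment = {"amount": 740.50, "reference": "Invoice for PO-1002, thanks"}
print(best_match(payment, open_pos))  # PO-1002
```

In a learned system the weights would come from training on staff-confirmed matches, which is what lets the model reach the "very high probability" bar before automating the decision.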
The company is using a similar approach in areas of HR management, in particular staff recruitment. Tasks such as shortlisting candidates can be automated, based on data collected from past CVs, interviews and eventual choices, says Elliott:
It’s a good example of both the power and the danger of this. For example, what happens if your hiring practices have been very discriminatory? Your training dataset would be the wrong dataset, and it would be embedding the wrong decisions into your future hiring or supply chain or anything else. So then you need to go beyond that. So as part of our SuccessFactors HR technology, we’re making sure that we’re using the artificial intelligence to do thorough reviews to check for common types of discrimination.
This move towards providing datasets as part of the innovation toolsets for business users raises one further obvious question. Will we see the company taking to the acquisition trail to snap up major sources of current data relevant to aspects of business management, in the same way that IBM moved in on The Weather Channel as a complement to its Watson services?
Of SAP’s four acquisitions last year, three - Roambi, PLAT.ONE and Altiscale - are technology additions. One, however, Abakus, is an information source and as it is in the marketing information sector, this is probably a good indicator of what can be expected from SAP in the future.
Information as a Service has, of course, been around for as long as the ability to write the stuff down. But its digitization and coupling with AI and analytical tools is opening up a new market for information exploitation as a tool – be that on premise or as a cloud service. One of the key problems with analytics is not the ability to derive ‘answers’ from data but the ability to know what data to use to arrive at the types of answers that are required.
The next step here, it seems to me, is for SAP to ‘big up’ whatever data science expertise it has resident in the company – and there is bound to be a good deal. Pulling together analytical technologies with data sources, then overlaying data science expertise, and placing it all under an umbrella brand name, as with Leonardo, marks the start of what might be termed Applied Data Science as a Service. If this can shortcut the road to DIY innovation for business managers, the impact could be significant.