With HANA’s in-memory approach, SAP appears able to significantly reduce the need for redundant tables. These tables of pre-calculated data have plagued relational-database deployments and data warehousing for years. In the past, significant complexity, fragility and added storage requirements were introduced into designs in the effort to eke out performance. One focus of S4/HANA is to eliminate much of the derived data that had to be stored in previous versions and instead perform those calculations on demand, since the in-memory approach has improved performance so significantly. This allows SAP to cut storage consumption dramatically while also simplifying the database structures.
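To make the trade-off concrete, here is a toy sqlite3 sketch (illustrative only, not SAP code or HANA behavior) contrasting the two designs: a redundant, pre-calculated totals table that every write must keep in sync, versus computing the same aggregate on demand. With slow disk-based stores the first pattern was often unavoidable; a fast in-memory store can make the second pattern quick enough to drop the derived table entirely.

```python
# Toy contrast between a maintained derived table and on-demand aggregation.
# All table and function names are hypothetical illustrations.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE line_items (order_id INTEGER, amount REAL)")
cur.execute("CREATE TABLE order_totals (order_id INTEGER PRIMARY KEY, total REAL)")

def add_line_item_with_derived(order_id, amount):
    """Old pattern: every write also maintains the redundant total table."""
    cur.execute("INSERT INTO line_items VALUES (?, ?)", (order_id, amount))
    cur.execute(
        "INSERT INTO order_totals VALUES (?, ?) "
        "ON CONFLICT(order_id) DO UPDATE SET total = total + excluded.total",
        (order_id, amount),
    )

def order_total_on_demand(order_id):
    """New pattern: no derived table; the aggregate is computed at query time."""
    cur.execute("SELECT SUM(amount) FROM line_items WHERE order_id = ?", (order_id,))
    return cur.fetchone()[0]

add_line_item_with_derived(1, 10.0)
add_line_item_with_derived(1, 25.0)
print(order_total_on_demand(1))  # 35.0, identical to the stored total
```

The point of the sketch is the second write path: once the aggregate can be computed fast enough at query time, the `order_totals` table, and all the code keeping it consistent, simply disappears.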
Consider the example described in Hasso Plattner’s keynote:
- A typical R3 system on a standard relational database, together with its failover system, takes 54TB in total.
- Move the same system to HANA and the storage requirement drops to 9.5TB.
- Move the equivalent workload to S4/HANA and the storage requirement falls to 3TB.
Taking the same workload from traditional R3 to S4/HANA reduces the storage costs from $20M down to $550K, which is quite a shift in relatively non-value-added costs. Naturally, this level of impact needs to be validated against your own specific needs, and I am not sure how the software costs compare for the computing infrastructure required.
At the same time as the in-memory approach addresses storage needs, the complexity of customizations should also be reduced. In fact, I’d say the first step in moving to S4 should be a real inventory and assessment of all the customizations put in place over the years. Many may no longer be necessary. Those that are still needed will likely be much simpler to implement and maintain, since the database design they interact with will be more straightforward.
I had the opportunity to talk with Deepak Krishnamurthy (Chief Strategy Officer) and Nayaki Nayyar (Senior VP for Cloud and Customer Engagement), along with others from diginomica, while at the conference. The development strategy used in S4/HANA relies heavily on microservices, particularly in the consumer engagement space that Nayaki pointed out. SAP’s use of microservices to aggregate smaller processing components into larger functionality should not only improve the quality and capability of SAP’s code but also allow for improved customization (assuming the APIs are publicized), more rapid prototyping, and faster value delivery.
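The composition idea can be sketched in a few lines. This is a deliberately simplified illustration with entirely hypothetical names (none of these are SAP APIs): two small services each expose one narrow capability, and an aggregator composes them into a larger response. In a real deployment each service would sit behind its own HTTP endpoint.

```python
# Minimal sketch of microservice composition; all names are hypothetical.
def inventory_service(sku: str) -> dict:
    """Stand-in for a stock-lookup microservice."""
    stock = {"SKU-1": 12, "SKU-2": 0}
    return {"sku": sku, "in_stock": stock.get(sku, 0)}

def pricing_service(sku: str) -> dict:
    """Stand-in for a pricing microservice."""
    prices = {"SKU-1": 19.99, "SKU-2": 5.49}
    return {"sku": sku, "price": prices.get(sku)}

def product_view(sku: str) -> dict:
    """Aggregator: composes the smaller services into one larger result.
    Either service can be upgraded or replaced without touching the other."""
    return {**inventory_service(sku), **pricing_service(sku)}

print(product_view("SKU-1"))  # combined stock + price record
```

The design benefit is the one claimed in the paragraph above: because the aggregator only depends on each service’s published interface, customization and rapid prototyping happen at the composition layer rather than inside a monolith.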
With the big shifts in business expectations of IT made possible by abundant computing capability, the agility and capability that appear to be built into S4/HANA should be part of the tactical plan of any organization that has deployed R3, within its broader strategic efforts. The time to assess impact is now.
SAP appears to be shifting how it goes to market. Business demands for how information is consumed are shifting across all industries. As businesses move to take advantage of IoT, analytics (both traditional and predictive) and process automation, understanding where SAP is investing may shift the definition of what’s possible, the definition of value, and the timeframe in which goals can be accomplished.
I am not saying SAP is simple yet, but it does seem to be addressing many of the concerns and complaints I’ve heard over the years about how much it costs to keep these systems running. Let’s see how it pans out.
Disclosure: SAP is a premier partner at time of writing and covered most of the author’s T&E for attending Sapphire Now 2015.