A modern data platform - the foundation for success

Patrick Smith, November 28, 2023
Summary:
As businesses grapple with ways to get maximum value from data, Patrick Smith of Pure Storage argues that an all-flash data center can lay the foundations for success.

(© gorodenkoff - Canva.com)

In today’s complex business environment, many organizations are grappling with how to create value from all their data while doing more with less and maintaining productivity. Many are looking for solutions that make their business run in a smoother, more streamlined way and maximize outcomes. Understanding their data is the key to this, but there is a lot of complexity in how data is stored, accessed, managed, protected and analyzed. The platform that data sits on has to be modern, built for current and future purposes. The right platform enables business leaders to make data-driven decisions about investment options and timing, solve challenges, improve customer experience, and build in the flexibility to try new things and innovate.

This two-part series looks at the foundations for success: how to identify a modern data storage platform that supports business outcomes and a future-proof data strategy. That includes examining how an all-flash data center can meet business needs; understanding an organization's requirements; meeting all workload needs; managing an infrastructure estate; and, ultimately, how infrastructure can support a business's data needs.

The all-flash data center

For many years, flash wasn’t considered suitable for data that didn’t need fast access. However, with flash now at cost parity with legacy spinning disk, and with advances in flash management, it’s possible to use flash for all storage workloads, even large-capacity, price-sensitive ones.

Some workloads have traditionally been seen as Tier 1 and flash-suitable, such as latency-sensitive relational databases, analytics or AI, but organizations should now be leveraging flash for all workloads. The dynamic has changed: organizations are no longer thinking of flash purely for performance, because there is no longer any justification for a disk or hybrid system.

The competitive landscape adds further support for the all-flash data center. Some vendors chose iterative change, repackaging existing solutions and programming their flash to behave like disk. Vendors who have been all-flash from day one don’t carry this baggage. The legacy approach ultimately doesn’t meet modern requirements and creates more complexity, because HDD-era designs weren’t developed with modern purposes in mind.

What makes this transition a reality now is cost efficiency, driven by larger capacities and system efficiency. Pure Storage has announced 75TB drives, with a roadmap that targets 150TB next year and 300TB within the next three years. Given the rate of investment in HDD, it’s impossible for disk to compete with the benefits and capabilities of all-flash.
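As a rough illustration of why larger drive capacities translate into efficiency, the short Python sketch below compares how many drives are needed for a fixed amount of raw capacity. Only the 75TB flash module size comes from the announcement above; the 24TB HDD capacity and the 1PB target are illustrative assumptions, not vendor specifications.

```python
# Back-of-the-envelope drive-count comparison for a fixed raw capacity.
# Assumptions: 1 PB target and 24 TB per HDD are illustrative figures;
# the 75 TB flash module size is the capacity cited in the article.
import math

RAW_CAPACITY_TB = 1000   # target raw capacity: 1 PB (assumption)
FLASH_DRIVE_TB = 75      # per-module flash capacity cited in the article
HDD_DRIVE_TB = 24        # typical nearline HDD capacity (assumption)

flash_drives = math.ceil(RAW_CAPACITY_TB / FLASH_DRIVE_TB)  # 14 drives
hdd_drives = math.ceil(RAW_CAPACITY_TB / HDD_DRIVE_TB)      # 42 drives

print(f"Flash modules needed: {flash_drives}")
print(f"HDDs needed:          {hdd_drives}")
print(f"Slot reduction:       {hdd_drives / flash_drives:.1f}x fewer drives")
```

Under these assumptions the same raw capacity needs roughly a third as many drive slots, and fewer slots generally means less rack space, power and cooling, which is where the cost-efficiency argument comes from.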

Understanding business goals

To choose the right data platform, organizations need to take a step back and understand their data requirements, which can be complex. They need to consider what success looks like: how IT and data can both support wider business goals and create end-user benefits. The complexity includes avoiding lock-in; enabling flexibility of data movement; supporting application development in containers; and managing existing storage inefficiency, since few organizations start from a greenfield situation with nothing in their data centers. Understanding all of this makes it easier to identify a vendor who can support these needs.

Meeting all price and performance requirements

Most, if not all, organizations have a full range of data storage needs: fast access for immediate, customer-facing apps; scale-out repositories for unstructured data; and cold storage such as long-term archive and backup.

These modern but complex needs cannot be shoehorned into one specific storage format. Depending on its goals, an organization should have the flexibility to use block, file or object storage in any combination that works for it. These options should be able to support diverse workloads across a range of price and performance requirements. Organizations should take a workload-first model to determine which they need. This more holistic approach means organizations can better match business needs. For example, unstructured data is expected to continue its exponential growth, and most AI algorithms are trained on unstructured data; supporting it with fast file and object storage is a must if organizations are to get the best out of their data and meet modern goals.

Organizations should find a vendor who can support all data and application requirements, and who has a concise portfolio to meet those needs, whether they are scale-up, latency-sensitive workloads or scale-out, throughput-optimized ones. The right platform needs to scale to an organization’s data needs, rather than having vendors impose limits based on legacy technology and approaches. The right vendor can help IT leaders extract the most from their data for both critical and non-critical systems with the optimal platform, delivering efficiency, simplicity and performance.

Mind the skills gap 

Managing skills requirements can be difficult, especially when organizations are dealing with multiple environments: block, file and object storage, on-premises and cloud. Historically, it was commonplace for vendors to expect customers and channel partners to spend a week learning how to operate their kit, or to commit to expensive professional services engagements. Many were even proud of technology so complex that it took several days of training to understand. It was treated as part of the investment that customers should accept, and then repeat for each tech refresh and training cycle.

However, it shouldn’t be this way. Vendors should be simplifying infrastructure operation and management, and ensuring customers have a single view of all their storage. There should be software that uses AI-driven insights to deliver recommendations, updates and upgrades when needed. This simplification and additional support in infrastructure management aids business-level decision making, lowers overheads and delivers more efficiency.

Single operating environment 

Listening to customers complain about complexity has been a driver for Pure Storage to develop a single operating environment. Vendors should be optimizing not only the hardware but also the software that manages it. One environment for everything should make processes simpler and faster for customers.

One operating environment means consistency of operation: the concepts for managing block, file and object storage become the same, because it’s the same management paradigm. If a team spends less time learning new systems, it has more time for higher-value tasks and innovation. By reducing the complexity of disparate software and hardware platforms, customers can accomplish the same tasks faster and with less stress.

Coming up

Part two of this article will examine the technicalities of flash versus disk; why organizations aren’t rushing to the cloud, but rather demanding the benefits of cloud delivered on-premises; how Service Level Agreements hugely impact user experience and financials; and how to ensure sustainability goals are incorporated into choosing a data platform provider.
