Software-Defined Storage Market Growing, but Obstacles to Adoption Remain

By: Jacqueline Lee

Software-defined everything (SDx) adoption is accelerating, and software-defined storage (SDS) is expected to grow along with it. According to MarketsandMarkets, SDS will grow at a 36.7 percent compound annual growth rate through 2021, reaching $22.56 billion.

Growth drivers include skyrocketing data generation and the need to find cost-effective ways to generate business benefits from this data. To take advantage of the potential, vendors must overcome these four main obstacles to SDS adoption.

1. Simplicity

For many enterprises, software-defined storage is built into the DNA of a hyperconverged architecture (HCA). With HCA, storage is pooled and accessible by every VM in a cluster. This eliminates the need for a separate storage network and cuts total cost of ownership related to storage. It also means single-vendor support, eliminating the inconvenience of dealing with multiple vendors while troubleshooting problems.

Unfortunately, most data centers aren’t brand-new projects, and they aren’t in a position to gut their legacy systems and replace them with HCA. In a data center that attempts to incorporate HCA, legacy storage often isn’t replaced by the new shared virtual volume. Instead, as Storage Swiss founder George Crump notes, HCA ends up as yet another storage silo.

Crump says there are ways to bring the benefits of SDS to existing legacy storage systems, chief among them setting up a single point of management for storage. Through this point, leveraging the existing storage area network and dedicated compute resources, data centers get the management simplicity of a shared-everything environment without remaking their architecture from the ground up.

The key is to invest first in unified data services and then augment existing systems with low-cost commodity storage. Businesses that help data centers leverage existing storage, while setting up a single data console for abstracted storage intelligence, occupy an important market niche.

2. Reliability

IT teams are reluctant to give up legacy storage solutions because they trust the reliability of their current setup. In an HCA setup where network, power and memory are shared by everything, a spike in application demand can limit other applications’ access to storage. Many SDS solutions also don’t come with complete data services. For instance, automatic replication to a remote location often isn’t part of the SDS package, which causes business continuity concerns.

3. Scalability

In many data centers, storage grows along with compute: when new physical servers are added to the mix, their storage capacity gets pooled into the aggregate. But as more nodes contribute to the shared virtualized volume, network performance can suffer, which makes it difficult to scale SDS reliably and efficiently.

One trend speeding up access to solid state drives (SSD) is the nonvolatile memory express (NVMe) protocol. NVMe, according to Network Computing, slashes latency and boosts bandwidth to all SSDs, delivering as much as six times higher throughput. It’s easy to implement in existing SSD arrays with changes to the interconnect layout, but there is a catch: NVMe SSDs with dual ports aren’t cheap. Hopefully they will become more accessible in the near future, as their price is expected to drop.

4. Cost

In addition to the upfront purchase of SDS appliances, the current pricing structure for SDS software doesn’t make sense for many organizations. Crump, in another Storage Swiss post, points out that the pricing tiers offered by SDS vendors often fit customers’ actual needs poorly. For example, an organization licensing storage services for up to 10 terabytes but actually in need of 11 TB shouldn’t have to jump to the 25-TB payment tier. More granular pricing would bring the HCA model more in line with cloud economics.
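The gap between tiered and granular licensing is easy to see with a small sketch. The tier sizes and prices below are invented for illustration, not actual vendor figures:

```python
# Hypothetical SDS licensing tiers: (capacity in TB, annual price in dollars).
# Figures are invented for illustration only.
TIERS = [(10, 5000), (25, 11000), (50, 20000)]

def tiered_price(needed_tb, tiers=TIERS):
    """Price of the smallest tier that covers the requirement."""
    for capacity, price in sorted(tiers):
        if needed_tb <= capacity:
            return price
    raise ValueError("requirement exceeds the largest tier")

def granular_price(needed_tb, price_per_tb=500):
    """Hypothetical per-TB pricing, closer to cloud economics."""
    return needed_tb * price_per_tb

# An 11-TB requirement is forced into the 25-TB tier:
print(tiered_price(11))    # 11000
print(granular_price(11))  # 5500
```

Under the tiered model, needing one terabyte beyond the 10-TB tier doubles the cost; per-TB pricing charges only for the extra terabyte.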

Software-defined storage offers significant benefits to both existing and new data centers. The key is to work with vendors that meet organizations where they are, rather than demanding significant up-front investments that risk degrading storage performance.



About The Author

Jacqueline Lee

Freelance Writer

Jacqueline Lee specializes in business and technology writing, drawing on over 10 years of experience in business, management and entrepreneurship. Currently, she blogs for HireVue and IBM, and her work on behalf of client brands has appeared in Huffington Post, Forbes, Entrepreneur and Inc. Magazine. In addition to writing, Jackie works as a social media manager and freelance editor. She's a member of the American Copy Editors Society and is completing a certificate in editing from the Poynter Institute.
