NetApp is in a unique position to integrate legacy, hybrid, and cloud storage, having announced their “Data Fabric” vision a few years back. Until NetApp Insight 2018, this all seemed like so much marketecture or vaporware. But the company is starting to make it happen, with real data services, orchestration, and management solutions coming to market.
Orchestration and integration may be what people think of most when it comes to DevOps. But in this post, Ned Bellavance looks at how to close the DevOps feedback loop by measuring and optimizing application performance.
If you’re not familiar with ioFABRIC, they make Vicinity, a data fabric solution that lets you get better utilization of all your storage. It does this by presenting applications with a virtual data plane that amalgamates all available storage. This is governed independently by their own control plane. Essentially, the virtual data plane presents itself to the application as whatever kind of storage it natively needs (block, file, SMB, etc.).
To date, ioFABRIC Vicinity has supported storage in your data center, whether it’s a SAN, SSDs, NVMe, or emerging NVDIMMs. But with their 3.0 release, they are fundamentally changing the product.
Hyperconverged infrastructure has been around for a while. We’ve seen companies go public on the strength of the market, and companies get acquired for the same reason. It’s a way to simplify the often complex world of provisioning and managing a virtualization infrastructure. But HCI has been around long enough that the limitations of that model have become clear to the enterprise. Any new entrant to the crowded market should have solutions to those problems.
Today, NetApp announced their entry into the HCI market. In their messaging, they hammered home those limitations.