If data is locked in the datacenter, so are applications. The first step towards overcoming data gravity is to discard the concept of data locality and begin building a new infrastructure. Once data is “there”, in the cloud, applications may begin moving as well.
If I had to guess the next buzzword in enterprise IT, “intent driven” seems to be the new hotness. For one, it sounds a lot more humanistic than saying automation. But it also represents a larger shift: companies are moving away from prescribing how something has to be done, and toward expressing what they want a given IT goal to achieve.
But as much as “intent driven” products seem to be catching on, we often see companies struggling to identify the actual intent behind their solutions.
Open source is not entirely new to NetApp; the company has had an OpenStack team since 2011, contributing mainly to the Cinder project, which provides on-demand block storage in OpenStack. In the past 18 months, this has been consciously expanded into an open ecosystem team, organized around thePub.
Late last year, I wrote an overview of ClearSky Data. The company has a unique product. They offer an alternative to the usual state of cloud storage, with its high latency and multiple data copies that you pay for individually. What continues to strike me about their offering is its completeness. Make no mistake, this is a fully managed storage solution.
The company has recently announced some exciting developments coming down the pipeline.
When a category becomes settled, a bit of tedium begins to set in. Room for innovation rapidly shrinks, and the conversation becomes more about efficiency and refinement than redefinition. That’s the direction the hyperconverged infrastructure market seemed to be settling into. There are still marked differences in price, features, and capability between the players, but the actual configuration of hardware seemed to have homogenized.
Datrium is trying to change the expectations of hyperconvergence, billing their concept instead as Open Convergence. This is their response to what they see as the traditional issue with HCI: their basic format separates bulk storage from compute, flash, and networking.
Can a framing metaphor be a product differentiator? In Turbonomic’s case, I think it can. They use a supply and demand model for their application assurance platform. This brings some interesting implications into the overall solution.
In this week’s Gestalt Server News:
– Get a look at how Dell EMC is handling their merger in the VxRail division
– Next IT is finding ways to put AI to work
– A New York airport finds out why you should check your server configuration.
Plus, what else you can buy for $8988 instead of Intel’s top-of-the-line Xeon.
Gestalt News has a fresh batch of mobility news for you. In this iteration:
– Nokia bets big on IoT networking
– Qualcomm releases 802.11ax chipsets
– A look at client-side networking
Plus more great reads from the community!
In this iteration of the Gestalt IT Server Newsletter:
– Scale Computing rethinks HCI from the ground up
– Nvidia keeps growing in the data center
– Alastair Cooke reviews how to make an app fit into a container
Plus Diablo Technologies’ terabyte memory solution, and a look at a compact hyperconverged home lab!