Data centers often need to deploy and manage multiple types of storage devices to provide the needed performance and storage tiers. Meeting an organization’s storage capacity and performance requirements can be a juggling act across multiple storage devices. VAST Data’s Universal Storage solution can simplify this complexity with a single device that provides one cost-effective flash tier.
With ever-expanding data sets and machine learning capabilities, organizations are exploring cloud solutions for access to resources not available in their on-premises datacenters. Hybrid cloud is becoming the de facto approach to cloud for many. This post explores NetApp Data Availability-as-a-Service from their presentation at Storage Field Day 18.
The challenge of data management is nothing new, and the enterprise storage market has responded at both ends of the spectrum by providing mature, high-performance solutions for primary datasets and lower-cost archival solutions for less important bulk data. But many businesses are discovering a capability gap when attempting to store secondary data that falls into the “everything else” category. For this reason, Cohesity has begun to leverage its core capabilities to attack new problems as customer requirements, and the definition of secondary storage, continue to evolve.
Two of the most impressive technological advancements in recent years are Artificial Intelligence (AI) and Machine Learning (ML). But AI and ML workloads require vast amounts of storage and high-performance access from multiple servers. This is the ideal market for the WekaIO distributed storage solution, as demonstrated at Storage Field Day in February. It offers scale, performance, and compatibility with AI and ML workloads.
Datacenter architectures must meet the changing needs of many applications, while also supporting demanding workloads today. Datera’s design focuses on data movement, to allow flexibility for growth as requirements change. It uses commodity hardware and machine learning to create an optimal and cost-effective storage platform for the modern datacenter.
As discussed at Storage Field Day in February, the idea of reusing and repurposing stored data makes sense from both a business intelligence perspective and from a cyber-resiliency standpoint. IBM’s solution is based on a solid foundation with years of enterprise use, and enhances the snapshot capabilities already found in most enterprise storage systems. Spectrum Protect Plus really moves IBM’s data protection offering forward, “upcycling” data for the enterprise.
Before buzzwords like “cloud” and “software-defined storage” (SDS) became part of our everyday vocabulary, software developers like Boyan Ivanov, Chief Executive and Co-Founder of StorPool Storage, were already hard at work devising ways to build extremely fast, reliable, and scalable storage solutions for public and private use cases. While the solution may not have had a fancy moniker, it aims to re-energize block storage, combining high-end storage area network (SAN) capabilities, delivered through software, with the performance of local solid-state drive (SSD) storage.
When discussing modern applications, infrastructure is often not given much thought except to say that it should be easy to provision according to the needs of the application itself. Those who provide infrastructure to application developers and end users know this is no trivial task. Ken Nalbone looks at how Datera’s software-defined storage platform helps bridge this gap.
While data storage and recovery remain critical elements of business success, tech companies are increasingly bringing greater innovation to data protection, as we see with IBM Storage.
Get up to date with all that’s new at Gestalt IT with Gestalt News. This week we had posts about why picking a cloud provider doesn’t need to be a binary choice, a podcast episode on the death of the storage array, and a post on refactoring monolithic apps for the cloud.