Datacenter architectures must meet the changing needs of many applications while also supporting today's demanding workloads. Datera's design focuses on data movement, allowing flexibility for growth as requirements change. It uses commodity hardware and machine learning to create an optimal, cost-effective storage platform for the modern datacenter.
The Challenge of Datacenter Architecture
With all the different technologies and vendors available, building out datacenter infrastructure can feel overwhelming. There are just too many options! Datacenter architects need to consider several requirements in creating their infrastructure plan. While meeting a single application’s requirements can be simple, most organizations have many different applications, each with its own performance requirements. This creates a need for an infrastructure solution that is both flexible and capable of meeting the needs of demanding workloads.
The first thing to consider is that the various applications and workloads that run in any given datacenter will change over time. Planners need to create an environment that is not only capable of meeting the needs of multiple applications, but also easily provides for new or changing requirements. Many datacenter infrastructures are configured to meet a specific set of requirements, but these can be difficult to reconfigure without disruption when requirements change. Infrastructure needs to be flexible.
The second thing to consider is the pace of technology development. New technologies, and improved versions of current ones, are coming out all the time. This is happening across all parts of the infrastructure, as new CPU, memory, storage, and networking technologies appear. Infrastructure planning must allow organizations to take advantage of new and improved technologies when they become available, rather than waiting for infrequent tech refresh cycles. The datacenter needs to allow new tech to be added simply, and in a way that doesn’t disrupt anything currently running.
The third consideration, and usually the first on everyone’s list, is cost. Every organization is looking to do new things inexpensively and to reduce existing costs. Any plan for datacenter infrastructure needs to include cost as a factor.
Software-Defined Storage as a Foundation for the Modern Datacenter
One technology that answers all three considerations above is software-defined storage (SDS). While any type of storage obviously requires both software and hardware, the industry has come to call it “software-defined” if it is not tied to any single piece of hardware. This includes not just devices but also technologies, brands, and configurations.
A modern “software-defined storage” platform must meet the following criteria:
- It should have a scale-out architecture, allowing new storage to be easily added as needed
- It should be able to meet high-performance demands, with response times measured in microseconds not milliseconds
- It must run on standard components, including commodity servers, and avoid lock-in to a specific vendor
- It must be fully automated, requiring little interaction from administrators
An SDS solution can be adapted as workloads change, and allow new technology to be incorporated without the disruptions that usually accompany a tech refresh. And the use of commodity servers in a scale-out configuration tends to make it cost-effective as well.
Introducing Datera’s Data Services Platform
One software-defined storage solution worth considering is the Datera Data Services Platform, which we saw at Tech Field Day and Storage Field Day earlier this year. Datera, founded in 2013, is actively working to make the idea of a fully software-defined datacenter (SDDC) a reality.
While Datera easily meets the requirements of enterprise SDS listed above, what really makes their solution stand out is their automation. Datera prefers the term data orchestration for what it automates, which is largely focused on data placement. Since SDS can run on multiple types of hardware, it can create storage volumes with a wide variety of performance characteristics. Datera uses application-specific templates to help define storage volumes and initial data placement.
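To make the template idea concrete, here is a minimal sketch of what template-driven volume provisioning could look like. The field names and structure are purely illustrative assumptions for this post, not Datera's actual schema or API:

```python
# Hypothetical application template: a reusable policy that captures an
# application's performance and placement requirements. All field names
# here are illustrative assumptions, not Datera's actual schema.
database_template = {
    "app_class": "oltp-database",
    "replicas": 3,
    "media": "flash",        # bias initial data placement toward flash nodes
    "max_latency_us": 500,   # target response time in microseconds
    "qos": {"iops_min": 20000, "bandwidth_mb": 500},
}

def volumes_from_template(template, names, size_gb):
    """Stamp out volume definitions that inherit the template's policy."""
    return [dict(template, name=name, size_gb=size_gb) for name in names]

# Two volumes for the same application share one performance policy.
vols = volumes_from_template(database_template, ["db-data", "db-log"], 100)
print([v["name"] for v in vols])  # → ['db-data', 'db-log']
```

The point of the template approach is that administrators describe the application once, and the platform derives per-volume configuration and initial placement from that description instead of hand-tuning each volume.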
Datera’s data orchestration stands out thanks to its use of machine learning algorithms to place data. This allows it to non-disruptively automate optimizations as workloads change, creating greater efficiencies while reducing time and costs.
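Datera has not published the details of its placement algorithms, but the general idea of scoring candidate targets from observed metrics can be sketched in a few lines. The weights, metrics, and node names below are all assumptions made for illustration:

```python
# Illustrative sketch of metrics-driven data placement: score each
# candidate storage node on observed latency and utilization, then
# place a volume on the best-scoring node. This is a generic example,
# not Datera's actual algorithm; the weights are arbitrary assumptions.

def placement_score(node, weights=(0.6, 0.4)):
    """Higher score is better: low latency and low utilization win."""
    w_lat, w_util = weights
    latency_penalty = node["latency_us"] / 1000.0  # normalize to ~0-1
    return 1.0 - (w_lat * latency_penalty + w_util * node["utilization"])

def place_volume(nodes):
    """Choose the candidate node with the highest placement score."""
    return max(nodes, key=placement_score)

nodes = [
    {"name": "node-a", "latency_us": 900, "utilization": 0.30},
    {"name": "node-b", "latency_us": 150, "utilization": 0.55},
    {"name": "node-c", "latency_us": 200, "utilization": 0.20},
]

print(place_volume(nodes)["name"])  # → node-c
```

A real system would learn the weights from workload history rather than fixing them by hand, and would re-evaluate placement continuously so data can migrate non-disruptively as conditions change, which is the behavior described above.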
With so many storage options available, datacenter architects need to focus on solutions that cost-effectively meet their requirements. In many cases software-defined storage can help accomplish this. For more information about Datera’s Data Services Platform, watch their presentation at Tech Field Day 18.