
Focusing on AI Data Infrastructure Next Season on Utilizing Tech with Solidigm | Utilizing Tech 06×13

Great AI needs excellent data infrastructure in terms of capacity, performance, and efficiency. This episode of Utilizing Tech serves as a preview of Season 7, brought to you by Solidigm, and features co-hosts Jeniece Wnorowski and Ace Stryker along with Stephen Foskett. Solidigm’s partners are discovering just how important it is to optimize every element of the AI infrastructure stack. With ever-larger AI datacenters being built, efficient storage can make a big difference, from power and cooling to physical density to performance. As we will hear throughout Season 7, different AI environments will need specialized data infrastructure, from the edge to the cloud. And with retrieval-augmented generation (RAG) emerging as a new trend in AI, high-performance storage becomes even more important at run time.

Apple Podcasts | Spotify | Overcast | Audio | UtilizingTech.com

Keeping AI Datacenters Green with Solidigm’s Sustainable Storage Solutions

The AI boom has set off a massive scramble for energy that only the biggest datacenters can satisfy. These megastructures, with their vast supplies of power, will support capacity at gigawatt scale. For ordinary datacenters, this is a turning point.

This episode of the Utilizing AI podcast, a teaser for Season 7, discusses the importance of energy-efficient, AI-optimized storage solutions in datacenters.

Season 7 will focus on storage and AI data infrastructure. As a prelude, host Stephen Foskett sits down with guests Jeniece Wnorowski and Ace Stryker of Solidigm to discuss the implications of AI at the data infrastructure level.

A Surge in Energy Use in Datacenters

Several factors drive the surge in energy demand. Traditional datacenters do not have the infrastructure required to sustainably deploy cutting-edge AI workloads. Major plumbing upgrades, including new facilities, the latest behind-the-scenes technologies, and a shockingly large expansion of the power grid, are required to make it happen.

For companies sitting on piles of capital, jumping into this spending frenzy is simply a smart investment decision. But smaller companies face a rough financial test. The combined capital expense of setting up a datacenter for artificial intelligence is jarringly big. Even more shocking is the operating expense.

“When you step back and look at the AI data pipeline as a whole, and what your hardware and software requirements to do AI work are, a lot of the focus is on compute, and rightly so,” says Stryker, director of market development. “You need a lot of compute horsepower to clean data, to train and validate a model, and then deploy it in the real world and make it useful.”

But datacenters running AI workloads also require an eye-popping amount of storage. This storage system, made up of the fastest drives and the meatiest arrays, can account for 35% of the total energy footprint.

“The GPUs are churning all the time and they’re big and hungry and hot and all that’s true but more than a third of that is going to power your storage devices,” explains Stryker. “Very quickly you can run up an astronomical power bill.”

HDD arrays are at a serious disadvantage in space and power consumption. Besides being inherently lower in performance, the drives use up a lot of space and energy owing to their bulky form factors.

Hardware efficiency is an important part of the AI picture. “It’s really a function of two things going on that move the meter the most,” Stryker explains. “One is density – how much storage per device are you getting. The denser it is, the higher capacity the drives have, and the more power efficient they are going to be.”

The second factor is utilization, which pertains to data replication, or the number of copies kept across drives.

“If you have hard drives and you’re short-stroking them to meet the minimum IOPS requirement, these are the two things that create a bunch of inefficiency and opportunity for optimization,” he says.
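To see how density and utilization move the meter together, here is a minimal back-of-envelope sketch in Python. All figures in it (capacities, wattages, utilization rates) are hypothetical assumptions for illustration, not measurements or vendor specifications from the episode.

```python
# Back-of-envelope storage power efficiency: watts per usable terabyte.
# All numbers below are illustrative assumptions, not vendor specs.

def watts_per_effective_tb(capacity_tb: float, active_watts: float,
                           utilization: float) -> float:
    """Power drawn per terabyte of storage actually usable.

    utilization < 1.0 models short-stroking (using only part of each
    drive to hit IOPS targets) or capacity lost to extra replica copies.
    """
    effective_tb = capacity_tb * utilization
    return active_watts / effective_tb

# Hypothetical short-stroked 20 TB HDD at ~8 W, using half its capacity
hdd = watts_per_effective_tb(capacity_tb=20, active_watts=8, utilization=0.5)

# Hypothetical high-density 60 TB SSD at ~25 W, 90% utilized
ssd = watts_per_effective_tb(capacity_tb=60, active_watts=25, utilization=0.9)

print(f"HDD: {hdd:.2f} W/TB  SSD: {ssd:.2f} W/TB")  # HDD: 0.80 W/TB  SSD: 0.46 W/TB
```

Under these assumed numbers, the denser, better-utilized drive lands well below the short-stroked HDD in watts per usable terabyte, which is the optimization opportunity Stryker describes.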

Fast, Efficient Data Storage Is Organizations’ Big Opportunity for AI Success

For Solidigm, these factors top the agenda. “The focus from our company’s point of view has been to help and support folks who are spending all that money, energy and effort on really high-powered datacenters and make sure that the right storage is there to feed the GPUs, maximize utilization and ultimately improve two things – performance, and total cost of ownership,” Stryker says.

Organizations have no plans of slowing down, so they need a technology infrastructure that is fully optimized for AI. Moving to higher-density drives in slimmer form factors reduces the number of individual drives in servers.

This produces massive savings in dollar cost, as well as in cooling, space, and weight.

Solidigm partners closely with industry players like CoreWeave and DDN, and with OEMs like Dell and HPE, to fuel the drive for energy-efficient storage. “We’re just as passionate about our partners as the products we build to support them,” says Wnorowski, datacenter product marketing manager.

As customers look to move away from HDDs at the edge and adopt comparable SSD solutions, Stryker and Wnorowski hope they will find Solidigm’s high-density, max-performance flash solutions a great fit for their use cases. Thin and mighty, the drives pack a lot of capacity into their slight frames without performance tradeoffs. As a bonus, they are low-maintenance and do not need frequent servicing.

Solidigm’s storage solutions are an equally good fit for RAG applications that typically require ultra-high performance SSDs.

Keep an eye out for the upcoming season of the Utilizing AI podcast, which will dive deeper into all of these areas and more.


Podcast Information:


Thank you for listening to Utilizing AI, part of the Utilizing Tech podcast series. If you enjoyed this discussion, please subscribe in your favorite podcast application and consider leaving us a rating and a nice review on Apple Podcasts or Spotify. This podcast was brought to you by Tech Field Day, now part of The Futurum Group. For show notes and more episodes, head to our dedicated Utilizing Tech Website or find us on X/Twitter and Mastodon at Utilizing Tech.


Gestalt IT and Tech Field Day are now part of The Futurum Group.

About the author

Sulagna Saha

Sulagna Saha is a writer at Gestalt IT where she covers all the latest in enterprise IT. She has written widely on miscellaneous topics. On gestaltit.com she writes about the hottest technologies in Cloud, AI, Security and sundry.

A writer by day and reader by night, Sulagna can be found busy with a book or browsing through a bookstore in her free time. She also likes cooking fancy things on leisurely weekends. Traveling and movies are other things high on her list of passions. Sulagna works out of the Gestalt IT office in Hudson, Ohio.
