Focusing on AI Data Infrastructure Next Season on Utilizing Tech with Solidigm | Utilizing Tech 06×13

Great AI needs excellent data infrastructure in terms of capacity, performance, and efficiency. This episode of Utilizing Tech serves as a preview of season 7, brought to you by Solidigm, and features co-hosts Jeniece Wnorowski and Ace Stryker along with Stephen Foskett. Solidigm's partners are discovering just how important it is to optimize every element of the AI infrastructure stack. With ever-larger AI datacenters being built, efficient storage can make a big difference, from power and cooling to physical density to performance. As we will hear throughout season 7, different AI environments will need specialized data infrastructure, from the edge to the cloud. And with retrieval-augmented generation (RAG) emerging as a new trend in AI, high-performance storage becomes even more important at inference time.

Keeping AI Datacenters Green with Solidigm’s Sustainable Storage Solutions

The AI boom has set off a massive scramble for energy that only extraordinarily large datacenters can satisfy. These mega-campuses will be equipped to support gigawatt-scale power capacity. But for ordinary datacenters, this is a rude awakening.

This episode of the Utilizing Tech podcast, a teaser for the next season, explores the importance of energy-efficient, AI-optimized storage solutions in datacenters. Season 7 will focus on storage and AI data infrastructure. As a prelude, host Stephen Foskett sits down with guests Jeniece Wnorowski and Ace Stryker of Solidigm to discuss the implications of AI at the data infrastructure level.

A Surge in Energy Use in Datacenters

A number of factors drive the surge in energy demand. Traditional datacenters lack the infrastructure to deploy cutting-edge AI workloads sustainably. Making it happen requires major upgrades: new facilities, the latest behind-the-scenes technology, and a shockingly large power supply.

For companies sitting on a pile of capital, joining this spending frenzy is a smart investment. But smaller companies face a rough financial test. The combined capital expense of setting up a datacenter for artificial intelligence is jarringly large, and the operating expense is even more shocking.

“When you step back and look at the AI data pipeline as a whole, and what your hardware and software requirements are to do AI work, a lot of the focus is on compute, and rightly so. You need a lot of compute horsepower to clean data, to train and validate a model, and then deploy it in the real world and make it useful,” says Stryker, Director of Market Development.

But datacenters running AI workloads also require an eye-popping amount of storage, built from the fastest drives and massive arrays. As much as 35% of the energy consumed can go to this storage infrastructure.

“The GPUs are churning all the time and they’re big and hungry and hot and all that’s true but more than a third of that is going to power your storage devices,” explains Stryker. “Very quickly you can run up an astronomical power bill.”

HDD arrays are at a serious disadvantage in space and power consumption. Inherently lower in performance, the drives take up a lot of space owing to their bulky designs and consume a great deal of energy.

This makes hardware efficiency an important part of the picture.

“It’s really a function of two things going on that move the meter the most. One is density – how much storage per device are you getting. The denser it is, the higher capacity the drives have, the more power efficient it is going to be.”

The second factor is utilization, which relates to data replication, or the number of copies kept across drives. “If you have hard drives and you’re short stroking them to meet the minimum IOPS requirement, these are the two things that create a bunch of inefficiency and opportunity for optimization.”
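The interplay of density and utilization can be illustrated with a back-of-the-envelope calculation. The drive capacities, power figures, and utilization rates below are illustrative assumptions for the sake of the sketch, not measured or vendor-published numbers:

```python
# Illustrative storage power-efficiency comparison. All figures below are
# assumptions chosen for illustration, not published specifications.

def watts_per_effective_tb(drive_tb, drive_watts, utilization):
    """Power draw per terabyte of *usable* capacity.

    utilization < 1.0 models overhead such as short-stroking HDDs to hit
    IOPS targets, or extra replica copies: only a fraction of the raw
    capacity delivers useful data.
    """
    return drive_watts / (drive_tb * utilization)

# Assumed: a 20 TB HDD short-stroked to 50% usable capacity, versus a
# 61.44 TB high-density SSD used at 90% utilization.
hdd = watts_per_effective_tb(drive_tb=20, drive_watts=9.0, utilization=0.5)
ssd = watts_per_effective_tb(drive_tb=61.44, drive_watts=25.0, utilization=0.9)

print(f"HDD: {hdd:.2f} W per effective TB")  # 0.90
print(f"SSD: {ssd:.2f} W per effective TB")  # 0.45
```

Even though the denser SSD draws more watts per device in this sketch, its far higher capacity and better utilization cut the power per usable terabyte roughly in half, before counting the rack space and cooling saved by deploying fewer drives.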

Fast and Efficient Data Storage Is Organizations’ Big Opportunity for AI Success

For Solidigm, these are top priorities. “The focus from our company’s point of view has been to help and support folks who are spending all that money, energy and effort on really high-powered datacenters and make sure that the right storage is there to feed the GPUs, maximize utilization and ultimately improve two things – performance, and total cost of ownership,” Stryker says.

With organizations showing no signs of slowing down on AI, they need a technology infrastructure fully optimized for the task. Moving to higher-density drives in slimmer form factors reduces the number of individual drives needed in the servers. This produces massive savings in cooling, space, weight, and cost, he emphasizes.

Solidigm partners closely with industry players like CoreWeave and DDN, and with OEMs like Dell and HPE. “We’re just as passionate about our partners as the products we build to support them,” says Wnorowski, Datacenter Product Marketing Manager.

As customers look to move away from HDDs at the edge and adopt comparable SSD solutions, Stryker and Wnorowski hope they will find Solidigm’s high-density, high-performance flash solutions a great fit. The drives pack a lot of capacity into lightweight frames without performance tradeoffs. As a bonus, they are low-maintenance and do not need frequent servicing.

Solidigm’s storage solutions are an equally good fit for RAG applications that typically require ultra-high performance.

Keep an eye out for the upcoming season of the Utilizing Tech podcast, which will explore all of this and more in depth.

Podcast Information:

Thank you for listening to Utilizing AI, part of the Utilizing Tech podcast series. If you enjoyed this discussion, please subscribe in your favorite podcast application and consider leaving us a rating and a nice review on Apple Podcasts or Spotify. This podcast was brought to you by Tech Field Day, now part of The Futurum Group. For show notes and more episodes, head to our dedicated Utilizing Tech Website or find us on X/Twitter and Mastodon at Utilizing Tech.

Gestalt IT and Tech Field Day are now part of The Futurum Group.

About the author

Sulagna Saha

Sulagna Saha is a writer at Gestalt IT where she covers all the latest in enterprise IT. She writes about the hottest technologies in Cloud, AI, Security and more.

A writer by day and reader by night, Sulagna can be found busy with a book or browsing through a bookstore in her free time. She also likes cooking fancy things on leisurely weekends. Traveling and movies are other things high on her list of passions. Sulagna works out of the Gestalt IT office in Hudson, Ohio.