
AI Needs Non-Traditional Storage Solutions with James Coomer of DDN | Utilizing AI: 2×13

AI applications involve large data volumes and many concurrent clients, and conventional storage systems aren't a good fit. In this episode, James Coomer from DDN talks about the lessons they have learned building storage systems to support AI applications. Inferencing requires terabytes or petabytes of data, often as large files and streaming data. For example, autonomous driving applications generate hundreds of terabytes of data per vehicle drive, resulting in petabytes of data to ingest and process. DDN's parallel filesystem goes a step further than NFS with an intelligent client that directs I/O to leverage all available network links and storage endpoints. Deep learning loves data, and a smart client can make the whole application faster. Because data is the biggest AI challenge today, an advanced storage solution can really help deliver AI solutions in the enterprise. Although most companies realize that finding expertise (data scientists, etc.) is a major challenge, building infrastructure to support them is just as critical.
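The idea of an intelligent client that spreads I/O across every available link and storage endpoint can be illustrated with a minimal conceptual sketch. This is not DDN's actual client code; the endpoint names, stripe size, and thread-based fan-out below are illustrative assumptions, showing how striping a large read across several endpoints lets all links carry traffic at once:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical endpoints; a real parallel filesystem client would discover
# these from the storage cluster rather than hard-coding them.
ENDPOINTS = ["nic0", "nic1", "nic2", "nic3"]
STRIPE_SIZE = 4  # bytes per stripe; tiny here purely for illustration

def read_stripe(endpoint, data, offset):
    # Stand-in for a network read issued over one specific link/endpoint.
    return data[offset:offset + STRIPE_SIZE]

def parallel_read(data):
    """Split an object into stripes and fetch each stripe over a different
    endpoint concurrently, so no single link becomes the bottleneck."""
    offsets = range(0, len(data), STRIPE_SIZE)
    with ThreadPoolExecutor(max_workers=len(ENDPOINTS)) as pool:
        futures = [
            pool.submit(read_stripe, ENDPOINTS[i % len(ENDPOINTS)], data, off)
            for i, off in enumerate(offsets)
        ]
        # Futures are kept in stripe order, so the joined result is intact.
        return b"".join(f.result() for f in futures)

payload = bytes(range(20))
assert parallel_read(payload) == payload
```

By contrast, a plain NFS client typically funnels traffic to a single server endpoint, which is why a striping client of this kind can feed data-hungry deep learning workloads faster.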

Guests and Hosts:

For your weekly dose of Utilizing AI, subscribe to our podcast on your favorite podcast app through Anchor FM, and watch more Utilizing AI podcast videos on the dedicated website.

About the author

Stephen Foskett

Stephen Foskett is an active participant in the world of enterprise information technology, currently focusing on enterprise storage, server virtualization, networking, and cloud computing. He organizes the popular Tech Field Day event series for Gestalt IT and runs Foskett Services. A long-time voice in the storage industry, Stephen has authored numerous articles for industry publications and is a popular presenter at industry events. He can be found online and on Twitter at @SFoskett.
