AI applications combine huge data volumes with many concurrent clients, a workload that conventional storage systems aren't built to handle. In this episode, James Coomer of DDN shares the lessons the company has learned building storage systems to support AI applications. Inferencing can require terabytes or even petabytes of data, often as large files and streaming I/O. Autonomous driving applications, for example, generate hundreds of terabytes per vehicle drive, leaving petabytes of data to ingest and process. DDN's parallel filesystem goes a step beyond NFS with an intelligent client that directs I/O across all available network links and storage endpoints. Deep learning thrives on data, and a smart client can make the whole application faster. Because data is the biggest AI challenge today, an advanced storage solution can go a long way toward delivering AI in the enterprise. Most companies recognize that finding expertise (data scientists and the like) is a major challenge, but building the infrastructure to support those experts is just as critical.
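To give a feel for what an "intelligent client" means, here is a minimal toy sketch (not DDN's actual client or protocol) of the general idea: instead of funneling every read through a single server, the client spreads chunk reads across several storage endpoints concurrently so that every network link contributes bandwidth. All names and the chunk size are illustrative assumptions.

```python
# Toy sketch of a parallel-I/O client: NOT DDN's implementation, just the
# general pattern of fanning reads out across multiple storage endpoints.
from concurrent.futures import ThreadPoolExecutor

CHUNK = 4  # bytes per read request in this toy example (real stripes are far larger)

def read_chunk(endpoint, offset, length):
    # Stand-in for a network read from one storage endpoint; here each
    # "endpoint" is just an in-memory byte string holding a full replica.
    return endpoint[offset:offset + length]

def parallel_read(endpoints, total_len):
    # Round-robin chunk reads across endpoints and fetch them concurrently,
    # so no single link or target becomes the bottleneck.
    offsets = range(0, total_len, CHUNK)
    with ThreadPoolExecutor(max_workers=len(endpoints)) as pool:
        futures = [
            pool.submit(read_chunk, endpoints[i % len(endpoints)], off, CHUNK)
            for i, off in enumerate(offsets)
        ]
        # Results are joined in offset order, reassembling the original data.
        return b"".join(f.result() for f in futures)
```

A single NFS-style mount would issue all of those reads against one server; the point of a parallel filesystem client is that the fan-out happens transparently, which is why it helps data-hungry deep learning pipelines.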
Guests and Hosts:
- James Coomer is Senior Vice President for Products at DDN. Connect with James on LinkedIn or learn more on Twitter at @DDN_Limitless
- Andy Thurai, technology influencer and thought leader. Find Andy’s content at theFieldCTO.com and on Twitter at @AndyThurai
- Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen’s writing at GestaltIT.com and on Twitter at @SFoskett
For your weekly dose of Utilizing AI, subscribe to our podcast on your favorite podcast app through Anchor FM and watch more Utilizing AI podcast videos on the dedicated website https://utilizing-ai.com/