What makes AI applications special, and how does this change the infrastructure required to support them?
In this episode, we ask Red Hat about the platform requirements for AI applications in production. The demand for flexibility, scalability, and distribution maps closely to the hybrid cloud, which is emerging as the preferred model for AI infrastructure.
Red Hat supports the container-centric hybrid cloud with OpenShift, and containers are also critical to AI workloads.
Red Hat has customers in the healthcare, manufacturing, and financial services industries deploying ML workloads in production right now.
Episode Hosts and Guests
Abhinav Joshi, Senior Manager, Product Marketing, OpenShift Business Unit, Red Hat. Find Abhinav on Twitter at @Abhinav_Joshi.
Tushar Katarki, Senior Manager, Product Management, OpenShift Business Unit, Red Hat. Find Tushar on Twitter at @TKatarki.
Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen’s writing at GestaltIT.com and on Twitter at @SFoskett.
Chris Grundemann, a Gigaom Analyst and VP of Client Success at Myriad360. Connect with Chris at ChrisGrundemann.com and on Twitter at @ChrisGrundemann.
Catch this and other episodes on Anchor FM, and watch more Utilizing AI podcast videos on the dedicated website at https://utilizing-ai.com/.