
Expanding ML Models Beyond Current Limits with Groq | Utilizing AI 2×20

Machine learning models have grown tremendously in recent years, with some now having hundreds of billions of parameters, and we wonder how big they can get. How do we deploy even bigger models, whether in the cloud or on captive infrastructure? Models keep getting bigger, are then distilled and annealed into smaller forms, and then grow bigger still. In this episode, Dennis Abts of Groq discusses the scalability of ML models with Stephen Foskett and Chris Grundemann. HPC architecture and concepts are coming to the enterprise, enabling us to work with previously unthinkable amounts of data, even as we reduce the precision and complexity of models to shrink their size. The result is that businesses will be able to work with ever-larger data sets in the future.
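Reducing precision is one concrete way model size comes down. As a minimal illustrative sketch (not Groq's actual method), the snippet below applies simple symmetric per-tensor post-training quantization, mapping float32 weights to int8 for a four-fold memory reduction, with reconstruction error bounded by the scale factor; all names and the random "weights" are hypothetical.

```python
import numpy as np

# Fake layer weights standing in for a trained model tensor (assumption).
rng = np.random.default_rng(0)
weights = rng.standard_normal(1024).astype(np.float32)

# Symmetric per-tensor quantization: map [-max, max] onto [-127, 127].
scale = np.abs(weights).max() / 127.0
q = np.round(weights / scale).astype(np.int8)

# Dequantize to approximate the original values.
dequant = q.astype(np.float32) * scale

print(weights.nbytes // q.nbytes)                       # 4x smaller in memory
print(float(np.abs(weights - dequant).max()) < scale)   # rounding error is bounded
```

In practice production schemes refine this idea (per-channel scales, calibration data, quantization-aware training), but the storage arithmetic is the same: int8 is one quarter the footprint of float32.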

Three Questions

  1. How long will it take for a conversational AI to pass the Turing test and fool an average person?
  2. Will we ever see a Hollywood-style “artificial mind” like Mr. Data or other characters?
  3. How small can ML get? Will we have ML-powered household appliances? Toys? Disposable devices?

Guests and Hosts

This episode features Dennis Abts of Groq as guest, with hosts Stephen Foskett and Chris Grundemann.

For your weekly dose of Utilizing AI, subscribe to our podcast on your favorite podcast app through Anchor FM, and watch more Utilizing AI podcast videos on the dedicated website.

About the author

Stephen Foskett

Stephen Foskett is an active participant in the world of enterprise information technology, currently focusing on enterprise storage, server virtualization, networking, and cloud computing. He organizes the popular Tech Field Day event series for Gestalt IT and runs Foskett Services. A long-time voice in the storage industry, Stephen has authored numerous articles for industry publications and is a popular presenter at industry events. He can be found online and on Twitter at @SFoskett.
