Today’s storage devices, both disks and SSDs, already contain processors and memory, and this is the foundation of computational storage. If drives can process data locally, they can relieve the communication and processing burden, reducing the amount of data that must be sent to the CPU or GPU. In this episode, Vladimir Alves and Scott Shadley join Chris Grundemann and Stephen Foskett to discuss the AI implications of computational storage. Modern SSDs already process data for tasks like encryption and compression, and they are increasingly taking on applications like machine learning. Just as industrial IoT and edge computing devices are taking on ML processing, so too are storage devices. Current applications for ML on computational storage include local image and video recognition and natural language processing, but these devices may eventually be able to execute ML training locally, as in the case of federated learning.
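To make the federated learning idea concrete, here is a minimal sketch of federated averaging (FedAvg), in which each "drive" trains a model on its own local data and shares only the resulting weights, never the raw data. The device names, data, and model are hypothetical illustrations, not anything from NGD Systems' actual products.

```python
# Minimal federated averaging sketch: each simulated drive trains a 1-D
# linear model (y = w * x) on local data; a coordinator averages weights.
# All data and names here are hypothetical, for illustration only.
import random

def local_train(weights, data, lr=0.1, epochs=5):
    """One device's local SGD pass over (x, y) pairs; raw data never leaves."""
    w = weights
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of squared error (w*x - y)^2
            w -= lr * grad
    return w

def federated_average(global_w, device_datasets):
    """Each device trains from the shared weights; only weights are averaged."""
    local_weights = [local_train(global_w, d) for d in device_datasets]
    return sum(local_weights) / len(local_weights)

# Three simulated drives, each holding local data generated from y = 3 * x.
random.seed(0)
drives = [[(x, 3 * x) for x in (random.random() for _ in range(20))]
          for _ in range(3)]

w = 0.0
for _round in range(10):
    w = federated_average(w, drives)
print(round(w, 2))  # converges toward 3.0
```

In a real deployment the "devices" would be computational storage drives and the model far larger, but the privacy-preserving pattern is the same: computation moves to the data, and only compact model updates travel back.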
- Are there any jobs that will be completely eliminated by AI in the next five years?
- Can you think of any fields that have not yet been touched by AI?
- How small can ML get? Will we have ML-powered household appliances? Toys? Disposable devices?
Guests and Hosts
- Vladimir Alves, CTO and Co-Founder at NGD Systems. Connect with Vladimir on LinkedIn.
- Scott Shadley, VP of Marketing at NGD Systems. Connect with Scott on LinkedIn or on Twitter at @SMShadley.
- Chris Grundemann, Gigaom Analyst and Managing Director of Grundemann Technology Solutions. Connect with Chris at ChrisGrundemann.com and on Twitter at @ChrisGrundemann.
- Stephen Foskett, Publisher of Gestalt IT and Organizer of Tech Field Day. Find Stephen’s writing at GestaltIT.com and on Twitter at @SFoskett
For your weekly dose of Utilizing AI, subscribe to our podcast on your favorite podcast app through Anchor FM and check out more Utilizing AI podcast episodes on the dedicated website https://utilizing-ai.com/