Thomas LaRock wrote up a post about AI, Deep Learning, and Machine Learning. These sophisticated tools allow for automation of a lot of work we thought might only ever be done by humans. But Thomas outlines why he’s not waiting for SkyNet quite yet.
AMD returned to the data center CPU market in a big way with their Epyc platform in 2017. But what about competing with Nvidia on the GPU side? The release of the Vega 20-based Radeon Instinct cards signals that AMD is ready to try.
The terms training and inferencing get thrown around a lot with machine learning, but what do they actually mean? This video by Thomas Henson breaks down the concepts.
Nvidia’s new Turing architecture marks a significant departure for the company, offering dedicated ray tracing and tensor processors. On their workstation cards this makes sense. But for consumers, are the added complexity and power draw worth the benefits?
Searching for files across cloud storage and on-premises sources can be a disjointed and time-intensive process. Cloudtenna’s DirectSearch looks to ease that problem, providing a single login and UI to get unified file search across a wide range of repositories.
Let’s face it, AI gets thrown around a lot in the enterprise these days. It often gets conflated with Machine Learning, Deep Learning, and neural networks. But does the term actually mean anything? Are there solutions out there that actually qualify as AI? The roundtable debates.
In our first IT Origins Survey, we’re asking the community one of our standard interview questions: What are the best and worst trends in IT right now? We’ve pulled together some early responses, but we’d love your feedback as well.
The phrases “Machine Learning” and “Artificial Intelligence” get thrown around a lot in enterprise IT. Every solution seemingly has one of the two baked in. But what do those terms actually mean? How can we tell the difference between actual implementations and marketing bluster? We talked to mathematician Dr. Rachel Traylor to find out.
It’s become common now for IT companies to list deep learning algorithms as a major platform feature, from analytics to automation. But how does deep learning compare to actual human intelligence? Ray Lucchesi looked at some of its issues in the context of the MIT Intelligence Quest.
In this edition of Gestalt News:
– we generate new IT slogans using machine learning
– Matt Leib sits down for the IT Origins interview
– the Gestalt IT Rundown discusses the chip market crunch, Samsung surpassing Intel, and what a potential Dell EMC–VMware merger would mean