Is your wireless network producing more data than you can manage? Is there a way to sort through it all to provide real information? And how does that all scale in a stadium? Tom Hollingsworth takes a look at the new hardware and software releases from Extreme Networks and how they’re working together to build the next generation of wirelessly connected stadiums.
Thomas LaRock wrote up a post about AI, Deep Learning, and Machine Learning. These sophisticated tools allow for automation of a lot of work we thought might only ever be done by humans. But Thomas outlines why he isn’t expecting SkyNet quite yet.
Red Hat and Nvidia are officially partnering to bring better GPU training for AI applications on OpenShift.
Let’s face it, AI gets thrown around a lot in the enterprise these days. It often gets conflated with Machine Learning, Deep Learning, and neural networks. But does the term actually mean anything? Are there solutions out there that actually qualify as AI? The roundtable debates.
Is the GPU stranglehold on AI workloads about to be disrupted by FPGAs and ASICs?
In our First IT Origins Survey, we’re asking the community one of our standard interview questions: What are the best and worst trends in IT right now? We’ve pulled together some early responses, but we’d love your feedback as well.
The phrases “Machine Learning” and “Artificial Intelligence” get thrown around a lot in enterprise IT. Every solution seemingly has one of the two baked in. But what do those terms actually mean? How can we tell the difference between actual implementations and marketing bluster? We talked to mathematician Dr. Rachel Traylor to find out.
It’s become common now for IT companies to list deep learning algorithms as a major platform feature, from analytics to automation. But how does deep learning compare to actual human intelligence? Ray Lucchesi looked at some of its issues in the context of the MIT Intelligence Quest.
Technical debt is more than the cost of not adopting a new technology. Dr. Rachel Traylor points out that it can also be the cost of hastily adopting a new technology without considering how it will fit into your bigger strategy.
If you’re not familiar with ioFABRIC, they make Vicinity, a data fabric solution that lets you get better utilization of all your storage. It does this by presenting applications with a virtual data plane that amalgamates all available storage. This is governed independently by their own control plane. Essentially, the virtual data plane presents to the application as whatever kind of storage it natively needs (block, file, SMB, etc.).
To date, ioFABRIC Vicinity has supported storage in your data center, whether it’s a SAN, SSDs, NVMe, or emerging NVDIMMs. But with their 3.0 release, they are fundamentally changing the product.