You probably don’t need to be told that networks are getting faster. There are a ton of reasons why. Cloud-based applications. Changing consumption models. 5G networks. Enhanced Wi-Fi. You name it, and it’s a reason why there’s more data flowing through your switches than ever before. You still need to make sure everything is on the up-and-up though, right? Letting all that data fly around without analysis is just asking for trouble. Tom Hollingsworth looks at how CounterFlow.ai helps to analyze this ever-increasing data flow.
Not all AI consumers are interested in diverting their limited resources and time into becoming an IT infrastructure system integrator. The breadth of work needed to ready the IT infrastructure so that developers can start their work can be dizzying. In highly competitive industries, three to six months of delay on critical, competition-driven business initiatives can make a huge difference. With that in mind, Max Mortillaro looks at Pure Storage’s AIRI, a fully integrated AI-ready infrastructure stack that enables organizations to spend more time on delivering value and achieve a faster ROI.
How can you hope to fight viruses when you’ve never seen them before? Is the world of heuristic detection dead? Tom Hollingsworth takes a look at Ziften and how they are harnessing the power of AI to provide a way to find unknown viruses.
Is your wireless network producing more data than you can manage? Is there a way to sort through it all to provide real information? And how does that all scale in a stadium? Tom Hollingsworth takes a look at the new hardware and software releases from Extreme Networks and how they’re working together to build the next generation of wirelessly connected stadiums.
Thomas LaRock wrote up a post about AI, Deep Learning, and Machine Learning. These sophisticated tools allow for automation of a lot of work we thought might only ever be done by humans. But Thomas outlines why he’s not waiting for SkyNet quite yet.
Red Hat and Nvidia are officially partnering to bring better GPU training for AI applications on OpenShift.
Let’s face it, AI gets thrown around a lot in the enterprise these days. It often gets conflated with Machine Learning, Deep Learning, and neural networks. But does the term actually mean anything? Are there solutions out there that actually qualify as AI? The roundtable debates.
Is the GPU stranglehold on AI workloads about to be disrupted by FPGAs and ASICs?
In our first IT Origins Survey, we’re asking the community one of our standard interview questions: What are the best and worst trends in IT right now? We’ve pulled together some early responses, but we’d love your feedback as well.
The phrases “Machine Learning” and “Artificial Intelligence” get thrown around a lot in enterprise IT. Every solution seemingly has one of the two baked in. But what do those terms actually mean? How can we tell the difference between actual implementations and marketing bluster? We talked to mathematician Dr. Rachel Traylor to find out.