James Green takes his turn in the IT Origins hot seat. He reveals how he got started in IT (a career slightly delayed thanks to girls, beer, and video games), what he’s reading, how caffeine changed his life, and what tools he uses to stay organized.
Technical debt is more than the cost of not adopting a new technology. Dr. Rachel Traylor points out that it can also be the cost of hastily adopting a new technology without considering how it will fit into your bigger strategy.
Wireless IT also seems to affect end users personally. Perhaps it’s because Wi-Fi is easier for them to isolate as the source of their frustration; it seems less bundled into the rest of the IT infrastructure (even if it really isn’t).
This makes these end users both insanely frustrating, with their blanket declaration that “Wi-Fi sucks,” and useful as the ultimate arbiters of performance. Reactions are generally binary: approving apathy or vocal derision.
Checkers is the game I played to kill time while waiting for tables at restaurants. But solving checkers turns out to be a fascinating exercise. Recently, Alphabet’s AlphaGo team has made headlines with neural networks that can beat human Go masters. But Ray Lucchesi looks back at earlier attempts to solve checkers with much more limited hardware and fundamentally different approaches.
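For contrast with AlphaGo’s neural-network approach, here is a minimal sketch of the kind of classic game-tree search those earlier checkers programs were built on: minimax with alpha-beta pruning. The toy “game” below is hypothetical (positions are just integers, and each move adds 1 or 2); it only illustrates the search technique, not an actual checkers engine.

```python
def alphabeta(position, depth, alpha, beta, maximizing, moves, evaluate):
    """Return the minimax value of `position`, searched `depth` plies deep."""
    if depth == 0:
        return evaluate(position)
    if maximizing:
        value = float("-inf")
        for child in moves(position):
            value = max(value, alphabeta(child, depth - 1, alpha, beta,
                                         False, moves, evaluate))
            alpha = max(alpha, value)
            if alpha >= beta:  # prune: the minimizer will never allow this line
                break
        return value
    else:
        value = float("inf")
        for child in moves(position):
            value = min(value, alphabeta(child, depth - 1, alpha, beta,
                                         True, moves, evaluate))
            beta = min(beta, value)
            if beta <= alpha:  # prune: the maximizer already has a better option
                break
        return value

# Toy game: from position p the legal "moves" are p+1 and p+2, and a
# position's evaluation is just its value. Searching 4 plies, the maximizer
# adds 2 on its turns and the minimizer adds 1 on its turns.
best = alphabeta(0, 4, float("-inf"), float("inf"), True,
                 lambda p: [p + 1, p + 2], lambda p: p)
print(best)  # 2 + 1 + 2 + 1 = 6
```

Exhaustive search like this, paired with handcrafted evaluation functions and endgame databases, is what made checkers tractable on the limited hardware of the day.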
Today, the term artificial intelligence is a lot like a baseball at a tee-ball game: it gets thrown around a lot, albeit not very accurately. Often, in the rush to brand something as trendy, all meaning gets tossed out the window. So when I saw Trove in the iOS App Store claiming to bring AI to email, I was skeptical.
To be clear, the answer to “what is big data?” isn’t the On-Premise IT Roundtable. Nevertheless, our panelists discuss what exactly they mean when they use the term, why it’s the new hotness, and how they’ve seen it impact organizations.
At the Open Compute Summit, AMD went into more detail about its high-end server CPU, codenamed “Naples”. At one time, the company’s Opteron processors were used in supercomputers. While never the dominant force in the data center, AMD had carved out a niche. The last decade has proven more problematic in the enterprise. AMD thinks Naples is not only competitive with the best from Intel but will also serve as a bulwark against what it describes as the problem of server “incrementalism”.
Nvidia’s data center division made $296 million in revenue for the quarter. In the exorbitant world of technology, this might not seem all that notable as a raw figure. But compared to Q4 2015, it’s a 205% increase. This isn’t just a one-time blip, either: in Q3, the division saw year-on-year growth of 193%. If anything, growth is accelerating.
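As a back-of-the-envelope check on those figures (an illustration, not audited financials): a 205% year-on-year increase means the quarter’s revenue is 3.05x the year-ago number, which implies a year-earlier figure of roughly $97 million.

```python
# Sanity-check the quoted growth figure. A 205% increase means
# current = prior * (1 + 2.05), so prior = current / 3.05.
current_quarter_revenue_m = 296.0   # Nvidia data center revenue, in $M
yoy_growth = 2.05                   # the quoted 205% year-on-year increase

implied_prior_year_m = current_quarter_revenue_m / (1 + yoy_growth)
print(round(implied_prior_year_m, 1))  # ~97.0, i.e. roughly $97M a year earlier
```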
The rise of the virtual assistant points to the important element that’s been added to voice recognition: artificial intelligence. This is really what can turn it from a minor convenience into something that can shape lives and businesses. That’s what Next IT is banking on.
Ray Lucchesi of RayOnStorage Blog comments: At the Google I/O conference this week, they revealed (see Google supercharges machine learning tasks …) that they had been designing and operating their own processor chips to optimize machine learning. They call the new chip a Tensor Processing Unit (TPU). According to Google, the TPU provides an order of magnitude more […]