Those who follow trends with a keen eye know what a trailblazing technology edge computing is. A brand-new take on the earlier paradigms of cloud and datacenter computing, edge computing heralds a new era in which data processing happens right where data originates.
Experts believe that it is going to shift and shape the corporate landscape in the next few years. Already, a dash for edge computing is palpable in certain industries. In a matter of a few years, the rest of the verticals are predicted to follow in their footsteps, making edge computing the new standard.
There’s no doubt value in edge computing, but if you magnify the networking side of it, there’s more than meets the eye. During the Delegate Roundtable at the recent Networking Field Day event, we urged the panel to parse these questions and identify what, in their opinion, are the true use cases of edge computing.
“Is there value in doing things along the edge?” asked host Tom Hollingsworth.
When we talk of edge networking, several questions come to mind. For example, what does the edge network look like? What part of the network is it? And what is networking like at the edge?
The edge network, as the name suggests, is the access layer of the network. One or more security boundaries separate it from the deeper recesses of the network. Think of it as a delta that distributes compute to the various edge devices users hold.
You might ask: what is the purpose of putting a data architecture at the far end of the network when there is already one at the center? The idea is to offload compute from the central servers onto the smaller devices and let them do the work, so that computing is fast, results are near-instantaneous, and waste is minimal.
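To make the offloading idea concrete, here is a minimal sketch (all names and numbers are hypothetical, not from the roundtable) of an edge node that aggregates raw sensor readings locally and ships only a compact summary upstream, trading a flood of raw data for one small payload:

```python
from statistics import mean

def summarize_at_edge(readings, threshold):
    """Aggregate raw sensor readings locally at the edge; return a
    compact summary for the upstream datacenter, plus any anomalous
    values that need immediate attention."""
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,
    }

# 1,000 raw readings collapse into one small upstream payload.
raw = [20.0 + (i % 50) / 10 for i in range(1000)]
payload = summarize_at_edge(raw, threshold=24.5)
print(payload["count"], payload["mean"], len(payload["anomalies"]))
```

The point of the sketch is the shape of the pattern, not the math: process where the data lands, and send upstream only what the center actually needs.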
Odds and Obstacles
But networking at the edge is not exactly a picnic. It has its pains, but there are ways around them too. Disaggregated networking, a trend that has been taking hold over the past few years, holds the key to unlocking real agility. Unbundle the software from the hardware and you have no problematic vendor lock-ins, annoying scaling limits, or everyday deployment issues. Everything is a breeze.
The edge is one of the biggest use cases for that, says Drew Conry-Murray, tech blogger and podcaster. “This is the opportunity to break the traditional vendors’ hold on the network with new ideas and capabilities,” he says.
On one hand, it can safely be said that the distributed nature of edge computing opens avenues for innovation; on the other, that same fact limits the possibilities. Edge sites are far-flung, often in remote and out-of-range locations. We’re talking oil rigs, power grids, and wind turbines. These sites have limited power supply, and when it comes to AI, ML, and HPC workloads, which are known to consume enormous amounts of power, enterprises have rather slim pickings.
“When you have an edge computing solution, you’re going to have to be very careful about what you run on it,” warns Chris Cummings, Network Engineer. “Those workloads are going to be way more expensive than something running way off in a datacenter where it’s almost free to run them because of how cheap power is. Power is that main currency of all this,” he adds.
Running workloads wholesale at the edge is out of the question because the power supply and cost barriers are, at the moment, too big. But does that make edge computing untappable, a half-baked technology for the masses?
A clever alternative is to be selective about which workloads to run at the edge, proposed Cummings. If companies are smart about deciding which workloads they need to compute at the edge, a reserve of benefits waits to be tapped.
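One simple way to picture that selectivity, sketched below with purely hypothetical latency and power figures: place each workload at the cheapest site that still meets its latency requirement, so the edge's power premium is paid only when speed demands it.

```python
def place_workload(max_latency_ms, sites):
    """Pick the cheapest site that meets the latency requirement.
    sites maps a site name to (latency_ms, relative_power_cost).
    Returns None when no site is fast enough."""
    feasible = {name: cost for name, (lat, cost) in sites.items()
                if lat <= max_latency_ms}
    if not feasible:
        return None
    return min(feasible, key=feasible.get)

# Hypothetical sites: the edge is fast but power-expensive,
# the datacenter is slow to reach but nearly free to run.
sites = {"edge": (2, 5.0), "datacenter": (80, 1.0)}

print(place_workload(10, sites))    # real-time alerting -> edge
print(place_workload(5000, sites))  # batch reporting -> datacenter
```

A real placement decision would weigh far more than two numbers, but the shape holds: latency sets the feasible set, and power cost picks within it.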
To work around the energy barrier, several vendors are making use of unused facilities within small geographic footprints that sit physically closer to the workloads. More work is underway on networking solutions that are light, compact, and energy-efficient. As these come to market, they will yield tiny datacenters at the edge that unlock the same cost advantages as their full-scale counterparts.
A Matter of Selection
Beyond the cost factor lies a bigger element – the use case. With edge computing, it really comes down to who it makes the most sense for, says the panel.
The questions to ask are, “What makes the most sense in terms of where you put your workload and what sizing and capabilities are there, and do they provide something that supersedes that capability?” said Ed Horley, Founder and CEO of HexaBuild, an IT consultancy company.
Horley mentioned that the agriculture and rural manufacturing industries are among the biggest beneficiaries of edge computing. These verticals generate massive amounts of telemetry, and faster processing of that data greatly benefits their everyday work.
For latency-sensitive applications used in industries like high-frequency trading, transportation, supply chain and retail, real-time analytics is pure gold. Edge computing is the first choice for these organizations because of the speed and flexibility it provides by letting them react to changes faster.
A Grain of Doubt
However, the panelists are doubtful that edge sites can become self-sufficient in the near future, and self-sufficiency is key to uptake. It is a distant possibility without more innovation in the application stack that allows local data processing, says Carl Fugate, Cloud Technology Advisor. “We have only so much power at the edge to be able to do these things. What we can keep in scope, or being able to make those local decisions is fairly small. We’re not going to scale that out,” he said.
There’s also considerable anxiety around security. The edge network ingests massive amounts of data every day, and without a solid security structure guarding operations, that data can easily end up in the pockets of bad actors and very well be the downfall of organizations working with sensitive data.
Put that way, the prospects of edge computing may look a bit dim, but it is worth remembering that bandwidth saturation is as much a reality as security in any network. Going back to the old way of doing things is undoubtedly proven and cost-effective, but it is not ideal for new-age applications that require millisecond response times. The question, then, is: shouldn’t we be pushing vendors to optimize their solutions to process data on the system for faster output?
A New Way
For the whole of the last decade, enterprises have known only one way to compute: aggregate data from scattered sources, send it upstream for processing, and bring the results back down, losing both time and money along the way. Wouldn’t it be smarter to use those resources to optimize the network for computing at the edge?
Adoption is on the rise, and even though cost continues to be a constraint, the work on energy efficiency will likely soften it to some extent.
“We are slowly but surely increasing the power efficiency and compute processing power of the smaller machines. Look at what Arm has been able to do in the past few years. With CPUs working inside GPUs, they’ve effectively become offload engines and as they become more and more optimized for edge computing, it gives us more flexibility to do computing at the source,” said Hollingsworth.
To listen to the full discourse, be sure to check out the Delegate Roundtable Discussions from the recent Networking Field Day event.