Modern applications are widely deployed in the cloud, but they’re coming to the edge as well. This episode of the Tech Field Day podcast features Alastair Cooke and Paul Nashawaty from The Futurum Group, Erik Nordmark from ZEDEDA, and host Stephen Foskett discussing the intersection of application modernization and edge computing. As enterprises look to deploy more applications at the edge, they are leveraging technologies like Kubernetes and containers to enable portability, scalability, resilience, and high availability. In many cases customers are moving existing web applications to the edge to improve performance and security, but not all webscale technologies are appropriate given the limited hardware, environmental conditions, and connectivity found at the edge. The question is whether to improve the edge compute platform or build resiliency into the application itself. But there are limits to this approach, since edge locations don’t have the elasticity of the cloud and many of the features of Kubernetes were not designed for limited resources. It comes down to developer expectations, since developers are now accustomed to the experience of modern webscale platforms and expect this environment everywhere. In the future, we expect WASM, LLMs, and more to be used regardless of location.
Apple Podcasts | Spotify | Overcast | Amazon Music | YouTube Music | Audio
The modernization of applications, from datacenter to cloud to edge, is rapidly progressing. Technologies drawn from the hyperscale world are finding their way to edge locations, where data processing and analysis occur closer to the source of data and customer transactions. This shift is driven by the need for real-time processing, reduced latency, and enhanced security, and technologies like Kubernetes and containers are increasingly used to facilitate this transition.
The Benefits of Containerized Applications at the Edge
Containerization offers many benefits essential for modern applications, especially those deployed at the edge. It provides a level of portability that allows applications to be easily moved and managed across different environments, from the cloud to the edge, without the need for extensive reconfiguration or adaptation. This is particularly important given the diverse and often resource-constrained nature of edge environments, which can vary greatly in terms of hardware, connectivity, and operational conditions.
Scalability is another critical aspect of containerization that aligns well with the needs of edge computing. Containers enable applications to be decomposed into microservices, allowing for more granular scaling and management. This microservices architecture facilitates the efficient use of resources, enabling applications to scale up or down based on demand, which is particularly valuable in resource-constrained edge environments.
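As a rough illustration of this granular scaling, the Kubernetes sketch below runs a single hypothetical microservice as its own Deployment and attaches a HorizontalPodAutoscaler with a deliberately low ceiling to fit a small edge node. The service name, image, and thresholds are illustrative assumptions, not details from the discussion.

```yaml
# Hypothetical example: scaling one microservice independently of the rest
# of the application. Names, image, and thresholds are illustrative only.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: telemetry-ingest
spec:
  replicas: 2                      # baseline capacity for this one service
  selector:
    matchLabels:
      app: telemetry-ingest
  template:
    metadata:
      labels:
        app: telemetry-ingest
    spec:
      containers:
        - name: ingest
          image: registry.example.com/telemetry-ingest:1.4.2
---
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: telemetry-ingest
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: telemetry-ingest
  minReplicas: 2
  maxReplicas: 4                   # low ceiling: the edge lacks cloud elasticity
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70
```

The low replica ceiling is the point: each service can still grow and shrink with demand, but within the hard capacity of the edge site rather than the effectively unlimited capacity of the cloud.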
Resilience and high availability are further enhanced through containerization. By deploying applications as a set of interdependent but isolated containers, developers can achieve a level of redundancy and fault tolerance that is difficult to match with monolithic architectures. This is crucial at the edge, where the risk of hardware failure, network disruptions, and other environmental factors can pose significant challenges to application availability and reliability.
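The sketch below shows one common way this redundancy is expressed in Kubernetes: multiple replicas of a hypothetical service, plus liveness and readiness probes so that a failed container is restarted and an unhealthy one stops receiving traffic. The paths, ports, and names are assumptions for illustration.

```yaml
# Hypothetical sketch: Kubernetes restarts a container that fails its
# liveness probe and stops routing traffic to one that fails readiness,
# giving per-container fault tolerance. Paths and ports are illustrative.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: gateway
spec:
  replicas: 2                       # two copies so a single failure is survivable
  selector:
    matchLabels:
      app: gateway
  template:
    metadata:
      labels:
        app: gateway
    spec:
      containers:
        - name: gateway
          image: registry.example.com/gateway:2.0.1
          livenessProbe:            # restart the container if this check fails
            httpGet:
              path: /healthz
              port: 8080
            periodSeconds: 10
          readinessProbe:           # withhold traffic until this check passes
            httpGet:
              path: /ready
              port: 8080
            periodSeconds: 5
```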
The security benefits of containerization should not be overlooked in the context of edge computing either. Containers provide a level of isolation that helps mitigate the risk of cross-application interference and potential security breaches. This isolation is complemented by the ability to apply granular security policies at the container level, enhancing the overall security posture of edge deployments. And containerized applications are easier to keep up to date as security patches are developed.
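A minimal sketch of what container-level policy can look like in Kubernetes: the hypothetical pod below runs as a non-root user with a read-only filesystem and all Linux capabilities dropped, and a NetworkPolicy restricts which peers may reach it. All names, labels, and images are illustrative assumptions.

```yaml
# Hypothetical sketch of container-level security policy: the pod drops
# Linux capabilities and runs as non-root, and a NetworkPolicy limits
# which peers may reach it. Names and labels are illustrative only.
apiVersion: v1
kind: Pod
metadata:
  name: sensor-reader
  labels:
    app: sensor-reader
spec:
  containers:
    - name: reader
      image: registry.example.com/sensor-reader:0.9.0
      securityContext:
        runAsNonRoot: true
        allowPrivilegeEscalation: false
        readOnlyRootFilesystem: true
        capabilities:
          drop: ["ALL"]
---
apiVersion: networking.k8s.io/v1
kind: NetworkPolicy
metadata:
  name: sensor-reader-ingress
spec:
  podSelector:
    matchLabels:
      app: sensor-reader
  policyTypes: ["Ingress"]
  ingress:
    - from:
        - podSelector:
            matchLabels:
              app: gateway          # only the gateway pods may reach this service
```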
Challenges for Modern Applications at the Edge
Despite these advantages, the deployment of containerized applications at the edge is not without its challenges. The resource limitations of edge environments, including constraints on compute power, storage, and network bandwidth, require careful consideration of the containerization strategy employed. Additionally, the management and orchestration of containers at the edge introduce complexity, particularly in highly distributed environments with potentially thousands of edge locations.
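One concrete way the resource question shows up is in explicit requests and limits on each container, which let the scheduler pack workloads onto a small edge node without any single container starving the rest. The hypothetical figures below are illustrative, not sizing guidance.

```yaml
# Hypothetical sketch: explicit requests and limits let the scheduler place
# containers on a constrained edge node and cap what each may consume.
# The figures are illustrative, not recommendations.
apiVersion: v1
kind: Pod
metadata:
  name: inference-worker
spec:
  containers:
    - name: worker
      image: registry.example.com/inference-worker:1.1.0
      resources:
        requests:
          cpu: "250m"        # a quarter of one core reserved for scheduling
          memory: "256Mi"
        limits:
          cpu: "500m"        # hard ceiling on a resource-constrained node
          memory: "512Mi"
```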
The choice between improving the edge compute platform to better support containerization and building resilience into the application itself is a critical decision. Enhancing the edge platform can provide a more robust foundation for containerized applications, but it may require significant financial and technological investment. Designing applications with inherent resilience and adaptability can offer a more immediate solution, but it may not deliver all of the benefits of containerization.
The expectations of developers, accustomed to the rich features and flexibility of modern cloud-native platforms, also play a significant role in the adoption of containerization at the edge. Developers seek environments that offer the same level of agility, ease of use, and comprehensive tooling they are familiar with in the cloud, driving the demand for containerization technologies that can replicate this experience at the edge.
Looking forward, the evolution of containerization at the edge is likely to be influenced by emerging technologies such as WebAssembly (WASM) and large language models (LLMs). WASM promises to enhance the portability and efficiency of applications across diverse computing environments, including the edge, by enabling more lightweight and adaptable application architectures. The integration of AI and machine learning capabilities, particularly for processing and analyzing data at the edge, will further drive the modernization of applications in these distributed environments.
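One emerging pattern for running WASM workloads alongside containers is a Kubernetes RuntimeClass backed by a containerd WASM shim, sketched below. This assumes such a shim is already installed on the edge node; the handler name, runtime class, and image are illustrative placeholders rather than a specific product's configuration.

```yaml
# Hypothetical sketch: scheduling a WASM module onto a node via a
# RuntimeClass backed by a containerd WASM shim. The handler name must
# match whatever the installed shim registers; everything here is
# illustrative, not a specific vendor's configuration.
apiVersion: node.k8s.io/v1
kind: RuntimeClass
metadata:
  name: wasm
handler: spin                      # assumption: handler name registered by the shim
---
apiVersion: apps/v1
kind: Deployment
metadata:
  name: wasm-anomaly-filter
spec:
  replicas: 1
  selector:
    matchLabels:
      app: wasm-anomaly-filter
  template:
    metadata:
      labels:
        app: wasm-anomaly-filter
    spec:
      runtimeClassName: wasm       # run this workload on the WASM runtime
      containers:
        - name: filter
          image: registry.example.com/anomaly-filter:0.1.0   # WASM module packaged as an OCI artifact
```

The appeal at the edge is the footprint: a WASM module starts faster and consumes less memory than a full container image, while still being scheduled and managed through the same Kubernetes machinery.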
Containerization and the Edge
Containerization is a fundamental enabler for the modernization of applications in the cloud, and this is true at the edge as well. It offers the portability, scalability, resilience, and security necessary to address the unique challenges of edge computing, while also meeting the expectations of developers for a modern application development environment. As enterprises continue to push the boundaries of what is possible at the edge, containerization will play a pivotal role in shaping the future of edge computing, driving innovation and enabling new levels of performance and efficiency.
Podcast Information:
Stephen Foskett is the Organizer of the Tech Field Day Event Series, now part of The Futurum Group. Connect with Stephen on LinkedIn or on X/Twitter.
Alastair Cooke is a CTO Advisor at The Futurum Group. You can connect with Alastair on LinkedIn or on X/Twitter and you can read more of his research notes and insights on The Futurum Group’s website.
Paul Nashawaty is a Practice Lead focused on Application Development Modernization at The Futurum Group. You can connect with Paul on LinkedIn and learn more about his research and analysis on The Futurum Group’s website.
Erik Nordmark is the CTO and Co-founder at ZEDEDA. You can connect with Erik on LinkedIn and learn more about ZEDEDA on their website.
Thank you for listening to this episode of the Tech Field Day Podcast. If you enjoyed the discussion, please remember to subscribe on YouTube, Apple Podcasts, Spotify, or your favorite podcast application so you don’t miss an episode. Please do give us a rating and a review; it helps with discoverability. This podcast was brought to you by Tech Field Day, home of IT experts from across the enterprise, now part of The Futurum Group. For upcoming events and more episodes, head to the Tech Field Day website.