Kubernetes at the Edge - Part 1


This is the first in a two-part series on Kubernetes at the edge. We discuss the topic from the perspective of telecommunication organizations (telcos), and how they will adopt cloud native technologies like Kubernetes and use GitOps to implement and manage what's been called "The Third Act of the Internet."

Kubernetes is the clear choice for many telcos because it is agnostic to the underlying infrastructure and capable of managing diverse workloads running on different compute resources. Automation is another crucial feature that makes Kubernetes so attractive, and GitOps is the leading way to enable model-driven automation. Managing Kubernetes with GitOps increases production and deployment speed, which affects how fast application teams can bring innovative features to end customers. These ideas also apply to any organization looking to expand its network and increase innovation at the edge. (Download our latest whitepaper to learn how velocity equates to competitive business success.)

What is IoT and Edge computing?

The Internet of Things (IoT) refers to modern internet-connected devices like smart watches, self-driving cars, smart home appliances, industrial sensors, and more. The number of these devices is growing exponentially every year.

These IoT devices need to operate in real time, which means there isn't enough time for a request to make a round trip to a centralized cloud server and back with a response. The solution is to move processing closer to the IoT devices, at the edge of the network. This type of processing is what we call edge computing.

For end users, edge computing enables real-time experiences that were not possible before. For organizations, it opens up a world of opportunities to expand their products and services and create the future of technology. For technology teams, the benefit is being able to run self-managed Virtual Private Clouds (VPCs) on relatively inexpensive hardware, and to operate thousands of these mini data centers, managed centrally and virtually.

Chick-fil-A, for example, discusses how it uses GitOps to manage more than 2,000 Kubernetes clusters at the edge. The 'things' in its stores, such as kitchen equipment and trays, connect to the Kubernetes clusters locally, function in real time, and sync data to the cloud as needed.

What's evident from this example is that the edge is not so much a thing as a location, one that enables connected things to function in real time. Building this edge infrastructure is the next wave of innovation in the cloud, and cloud native technologies like Kubernetes are what make it a reality.

Cloud native makes real-time processing possible

Edge computing relies on ultra-low latencies in the millisecond range. Without near-zero latency, real-time data processing would not be possible. For example, CenturyLink targets 5 milliseconds of latency for its edge computing needs. Similarly, when launching its 5G services in select cities, Verizon projected a latency of 30 milliseconds.

[Chart: State of edge latencies. Source: State of the Edge Report, https://www.lfedge.org/projects/stateoftheedge/]

IBM explains that a typical round trip from a connected device to a centralized cloud data center and back can take as long as 250 milliseconds. While 5G speeds improve this latency only marginally, by about 2%, the big difference comes when workloads are shifted to the edge, where latencies under 20 milliseconds become possible.
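To make that budget concrete, here is a minimal sketch comparing the three paths against a real-time deadline. The 250 ms cloud round trip, the roughly 2% 5G improvement, and the sub-20 ms edge figure come from the article above; the 50 ms deadline is an assumed example value, not a number from the source.

```python
# Illustrative latency-budget comparison (not a benchmark).
# Cloud, 5G, and edge figures are taken from the article above;
# the 50 ms real-time deadline is an assumed example value.

CLOUD_ROUND_TRIP_MS = 250          # typical device -> central cloud -> device
FIVE_G_IMPROVEMENT = 0.02          # ~2% improvement from 5G alone
EDGE_ROUND_TRIP_MS = 20            # achievable when processing moves to the edge
REAL_TIME_DEADLINE_MS = 50         # assumed budget for a "real-time" response

paths = {
    "central cloud":      CLOUD_ROUND_TRIP_MS,
    "central cloud + 5G": CLOUD_ROUND_TRIP_MS * (1 - FIVE_G_IMPROVEMENT),
    "edge node":          EDGE_ROUND_TRIP_MS,
}

for name, latency in paths.items():
    verdict = "meets" if latency <= REAL_TIME_DEADLINE_MS else "misses"
    print(f"{name:>20}: {latency:6.1f} ms -> {verdict} a {REAL_TIME_DEADLINE_MS} ms deadline")
```

Only the edge path comes in under the assumed deadline; faster backhaul alone barely moves the number.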

Containers are lightweight instances that are ideal for powering edge nodes. They are ephemeral and can be scaled according to a telco's changing needs. Managing containers at the edge requires an architecture that is highly fault tolerant. Aside from providing a needed layer of abstraction on top of physical infrastructure, Kubernetes' cluster-based architecture and self-healing capabilities are ideal here. Additionally, with its growing ecosystem of pluggable tools, Kubernetes equips administrators with the necessary management, monitoring, and governance for containers at the edge.
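As a small illustration of what centralized monitoring across many edge clusters can look like, here is a hedged sketch using the official Kubernetes Python client. The kubeconfig context names are hypothetical, and a production fleet would rely on purpose-built monitoring tooling rather than a script like this.

```python
# Sketch: report node readiness across a few edge clusters.
# Assumes the `kubernetes` Python client is installed and that the local
# kubeconfig contains one context per edge cluster; the context names
# below are made up for illustration.
from kubernetes import client, config

EDGE_CONTEXTS = ["edge-store-001", "edge-store-002", "edge-store-003"]  # hypothetical

def node_readiness(context_name: str) -> list[tuple[str, bool]]:
    """Return (node name, is_ready) pairs for one cluster."""
    config.load_kube_config(context=context_name)
    nodes = client.CoreV1Api().list_node().items
    results = []
    for node in nodes:
        ready = any(
            cond.type == "Ready" and cond.status == "True"
            for cond in (node.status.conditions or [])
        )
        results.append((node.metadata.name, ready))
    return results

if __name__ == "__main__":
    for ctx in EDGE_CONTEXTS:
        for name, ready in node_readiness(ctx):
            print(f"{ctx}: {name} is {'Ready' if ready else 'NotReady'}")
```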

What's clear is that faster broadband cannot, on its own, deliver real-time experiences. Organizations looking to enable modern experiences need to avoid looping data back to a central data center; they need to enable data processing at or near the edge with cloud native technology like Kubernetes.

Micro data centers at the edge

Edge computing resources need to be managed in much the same way a data center is - with robust security practices, high fault tolerance, and the ability to scale according to workload. We need to look at them as micro data centers.

Over the past decade, we've mastered the art of maintaining a data center, whether that's on-premises, in the cloud, or both. Particularly with the rise of Kubernetes, we're now able to automate deployments and infrastructure with GitOps across multiple clouds and locations, and to monitor every part of the system in great detail.

The challenge with edge computing is to take all these learnings from Kubernetes in the cloud and transfer them to the edge. How does a single organization manage thousands of Kubernetes clusters, hundreds of thousands of applications, and the underlying infrastructure with only a small platform team?

3 layers of edge infrastructure

[Diagram: Kubernetes as the common orchestrator across the cloud/data center, regional, and edge layers]

As this diagram shows, the entire edge infrastructure consists of 3 layers - a centralized cloud and data center, a middle regional layer, and a last-mile edge layer. What's interesting to note is that Kubernetes is the common orchestrator across all three layers. This unifies management of each layer and simplifies a very complex system.

The cloud and data center layer is well established, having evolved over the past decade. The middle-mile and last-mile delivery networks, however, are new, and they will define how edge computing plays out. The edge nodes perform data acquisition and processing at the edge, while the regional nodes handle data aggregation and perform the vital task of transferring data back and forth between the cloud and the edge nodes.

Infrastructure as a Service platform managed with GitOps

The system becomes more distributed as you move out from the centralized layer towards the IoT devices and applications. The number of Kubernetes clusters, containers, and networking devices increases as you approach the edge layer. This creates operational challenges at massive scale, with organizations left to manage thousands of Kubernetes clusters at the edge. It calls for centralized, streamlined Kubernetes management and an Infrastructure as a Service platform that can be managed with GitOps. This type of setup can effectively handle massive scale even with only a small platform team (Go deeper on GitOps).
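The core mechanism behind this kind of management is the GitOps reconciliation loop: the desired state of every cluster lives in Git, and an agent continuously pulls it and applies it. The sketch below illustrates the idea only; the repository path, branch, and interval are assumptions, and in practice a controller such as Flux or Argo CD does this work rather than a hand-rolled script.

```python
# Minimal sketch of the GitOps reconciliation idea: pull the desired state
# from Git, then apply it so the cluster's actual state converges.
# The repo path and interval are illustrative assumptions; production
# setups use a controller such as Flux or Argo CD instead.
import subprocess
import time

REPO_DIR = "/opt/gitops/edge-cluster-config"   # local clone of the config repo (assumed)
MANIFEST_DIR = f"{REPO_DIR}/clusters/edge"     # manifests for this edge cluster (assumed)
INTERVAL_SECONDS = 60                          # how often to reconcile

def reconcile() -> None:
    # Fetch the latest declared state from Git.
    subprocess.run(["git", "-C", REPO_DIR, "pull", "--ff-only"], check=True)
    # Apply it; `kubectl apply` is idempotent, so unchanged resources are untouched.
    subprocess.run(["kubectl", "apply", "-f", MANIFEST_DIR, "--recursive"], check=True)

if __name__ == "__main__":
    while True:
        try:
            reconcile()
        except subprocess.CalledProcessError as err:
            # Log and retry on the next cycle rather than crashing the agent.
            print(f"reconcile failed: {err}")
        time.sleep(INTERVAL_SECONDS)
```

Because the same repository drives every cluster, rolling a change out to thousands of edge sites becomes a Git commit rather than thousands of manual operations.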

Kubernetes is the underlying ‘operating system’ for each of the edge architecture layers. It is not just a technology that powers clusters of containers, but it also includes an ecosystem of tools to better deploy, maintain, monitor, and govern those clusters. Along with tooling, Kubernetes enables telcos to take advantage of cloud-native approaches such as GitOps to manage edge infrastructure.

Vuk Gojnic of Deutsche Telekom talks about how they run over 10,000 edge locations powered by containers. They take a declarative GitOps approach to do this. Vuk comments that “The DT team doesn’t build or develop any of the containers, instead the platform is designed to shift the containers left or right and deploy them wherever they need to run in a secure and reliable way.” This deliberate approach to managing Kubernetes at the edge empowers application teams with all the tooling and add-ons they need to create outstanding edge experiences.

Stay tuned for part 2, where we will go into more detail on the middle-mile and last-mile layers of the edge computing architecture. In the meantime, learn more about GitOps at the edge.
