Section’s Kubernetes Edge Interface simplifies the Kubernetes deployment process — and broadens its appeal beyond CDN

Section.io, a provider of edge orchestration and computing services, announced the release of its Kubernetes Edge Interface (KEI) to simplify the Kubernetes deployment process.

Managing distributed multi-cluster Kubernetes environments has always been challenging. Section said the Kubernetes Edge Interface makes the process easier by presenting edge clusters as a single cluster, allowing customers to distribute Kubernetes workloads across the edge network with an API call. The company developed a patent-pending application called the Adaptive Edge Engine to handle all of the workload management behind the scenes.

“Edge deployment is simply better than centralized data centers or single clouds in almost every important metric: performance, scale, efficiency, resilience, usability, etc.,” said Stewart McGrath, Section’s CEO in a statement. “Yet organizations historically put off edge adoption because it’s been complicated. With Section’s KEI, teams don’t have to change tools or workflows; the distributed edge effectively becomes a cluster of Kubernetes clusters, and our AEE automation and Composable Edge Cloud handles the rest.”

One of the key highlights of KEI is its Kubernetes API, which lets developers use familiar tools and frameworks such as kubectl and Helm to manage the edge environment. The interface extends the Kubernetes API to connect Kubernetes resources with Section's own edge platform, so existing applications can move to the edge with little change. In short, KEI lets developers deploy workloads as easily as they would to a single cluster while still getting the performance and scalability benefits of a multi-cluster environment.
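For a feel of what that looks like in practice, here is a minimal sketch using the official Kubernetes Python client to apply a Deployment against an edge-wide endpoint. The kubeconfig context name ("section-edge"), the image, and the labels are hypothetical stand-ins for illustration, not details Section has published; the point is simply that the call is the same one a developer would make against a single cluster, while the platform decides where the replicas actually run.

```python
# Hypothetical sketch: deploying a containerized workload through a
# Kubernetes-compatible API like the one KEI exposes, using the official
# Python client. Context name, image, and labels are assumptions.
from kubernetes import client, config

# Load a kubeconfig whose context points at the edge-wide API endpoint
# (the context name here is a placeholder).
config.load_kube_config(context="section-edge")

deployment = client.V1Deployment(
    api_version="apps/v1",
    kind="Deployment",
    metadata=client.V1ObjectMeta(name="hello-edge"),
    spec=client.V1DeploymentSpec(
        replicas=3,
        selector=client.V1LabelSelector(match_labels={"app": "hello-edge"}),
        template=client.V1PodTemplateSpec(
            metadata=client.V1ObjectMeta(labels={"app": "hello-edge"}),
            spec=client.V1PodSpec(
                containers=[
                    client.V1Container(
                        name="hello-edge",
                        image="nginx:1.25",  # any containerized application
                        ports=[client.V1ContainerPort(container_port=80)],
                    )
                ]
            ),
        ),
    ),
)

# The same call a developer would make against a single cluster; the edge
# platform is responsible for spreading the replicas across locations.
apps = client.AppsV1Api()
apps.create_namespaced_deployment(namespace="default", body=deployment)
```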

Section’s Adaptive Edge Engine (AEE) is a key part of the equation; it continuously monitors the edge delivery network to optimize the balance of performance and cost across a multitude of user-defined variables. The Kubernetes Edge Interface provides control over container scheduling, health management, and traffic routing. To take advantage of KEI, developers do not need to modify container size, location, or any other related settings to deploy applications to Section. The last piece of the puzzle is Section’s Composable Edge Cloud service, which runs on heterogeneous hardware from a variety of providers, including Equinix Metal, Lumen, and others.
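To make the idea of optimizing along user-defined variables more concrete, the toy sketch below scores candidate edge locations by weighing latency against cost and picks placements up to a replica budget. The policy fields, weights, and location data are invented for illustration; Section has not published the internal policy format AEE uses.

```python
# Hypothetical sketch of policy-driven placement, in the spirit of what an
# engine like AEE automates behind the scenes. All fields and numbers are
# invented for illustration.
from dataclasses import dataclass

@dataclass
class EdgeLocation:
    name: str
    latency_ms: float      # observed latency to nearby users
    cost_per_hour: float   # infrastructure cost of a replica at this site

# A user-defined policy: how much to weight performance versus cost.
policy = {"latency_weight": 0.7, "cost_weight": 0.3, "max_replicas": 3}

def score(loc: EdgeLocation) -> float:
    """Lower is better: a weighted blend of latency and (scaled) cost."""
    return (policy["latency_weight"] * loc.latency_ms
            + policy["cost_weight"] * loc.cost_per_hour * 100)

def place(candidates: list[EdgeLocation]) -> list[str]:
    """Pick the best-scoring locations up to the replica budget."""
    ranked = sorted(candidates, key=score)
    return [loc.name for loc in ranked[: policy["max_replicas"]]]

locations = [
    EdgeLocation("nyc", latency_ms=12, cost_per_hour=0.40),
    EdgeLocation("lon", latency_ms=18, cost_per_hour=0.35),
    EdgeLocation("syd", latency_ms=45, cost_per_hour=0.30),
    EdgeLocation("sfo", latency_ms=15, cost_per_hour=0.50),
]
print(place(locations))  # ['nyc', 'lon', 'sfo']
```

In a real system this evaluation would run continuously as latency, load, and cost change, which is the role the article describes for AEE.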

Section.io transitions from running CDN workloads to a broader set of workloads

Section.io first came to the attention of developers who needed to run customized content delivery applications. While CDN service providers offer a wide array of services, most of them are a take-it-or-leave-it proposition with little room for customization. Companies like Cloudflare and Fastly have been offering more programmability through the use of serverless functions, but Section’s McGrath told EdgeIR that his company is able to help developers run any containerized application at the edge.

That capability is part of why CenturyLink (now Lumen) partnered with Section for an edge application delivery service. The low-hanging fruit in the market was content delivery and web application firewalls. Now, after closing a $12 million Series B funding round in 2021, the company is ready to talk about new uses.

“We’re focusing on general purpose workloads across true distributed edge infrastructure,” said McGrath. KEI is important because it allows developers to scale beyond the limit of roughly 100 host nodes that most organizations bump up against, and to deploy and manage workloads across many thousands of locations.

Managing cost is another key issue, and McGrath noted that AEE was built to continuously optimize the placement, scale, and traffic routing of containerized workloads based on user-defined policies. That idea alone isn’t novel — AWS customers might deploy a cost-optimization platform from a third-party vendor, for example — but having the management engine tightly coupled to an edge cloud platform and the Kubernetes workflow developers already use could bring Section into competition with a different set of players, ranging from MobiledgeX (edge workload management) to managed Kubernetes platforms from companies like Rafay Systems.

McGrath said that companies are already using KEI. One unnamed game console manufacturer is developing a way to run game logic in edge containers for better performance. Another customer in e-commerce is running its website rendering engine on Section.io.
