Akamai targets AI inferencing bottlenecks with new edge cloud solution

Akamai has launched Akamai Cloud Inference, a service designed to improve AI inference performance. The company claims better throughput, 60% less latency, and 86% lower costs compared with traditional hyperscale infrastructure.

The service focuses on running AI inference closer to users and devices, addressing limitations of centralized cloud models. Akamai Cloud Inference provides tools for developers to build and run AI applications with reduced latency and improved efficiency. 

The platform integrates with NVIDIA’s AI ecosystem and partners with VAST Data for optimized data management and real-time access. It supports containerized AI workloads using Kubernetes for scalability, resilience, and cost optimization. 
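To make the containerized-workload model concrete, here is a minimal sketch of the kind of inference service that Kubernetes would scale out as replicas behind a load-balanced endpoint. This is not Akamai's actual stack; the `/infer` route, port, and the trivial stand-in "model" are all hypothetical, chosen only to illustrate the pattern of packaging inference as a small, stateless HTTP service.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hypothetical stand-in for a real model: counts tokens in the input.
# In a containerized deployment, Kubernetes runs many replicas of this
# process behind a Service, giving the scalability and resilience the
# article describes.
def run_model(text: str) -> dict:
    return {"tokens": len(text.split()), "label": "ok"}

class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        if self.path != "/infer":
            self.send_error(404)
            return
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        result = run_model(payload.get("text", ""))
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep request logging quiet; a real service would emit
        # structured logs instead.
        pass

# To serve locally (a container would do the equivalent in its entrypoint):
# HTTPServer(("0.0.0.0", 8080), InferenceHandler).serve_forever()
```

Because the handler is stateless, horizontal scaling is just a matter of running more copies of the container, which is exactly the property that makes Kubernetes a natural fit for this kind of workload.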

Akamai’s edge compute capabilities enable low-latency AI applications by executing lightweight code directly at the edge. The platform leverages Akamai’s distributed network with over 4,200 points of presence in 130+ countries for scalability and performance. 

“Getting AI data closer to users and devices is hard, and it’s where legacy clouds struggle,” says Adam Karon, Chief Operating Officer and General Manager, Cloud Technology Group at Akamai. “While the heavy lifting of training LLMs will continue to happen in big hyperscale data centers, the actionable work of inferencing will take place at the edge where the platform Akamai has built over the past two and a half decades becomes vital for the future of AI and sets us apart from every other cloud provider in the market.”

Akamai emphasizes a shift away from training large language models (LLMs) toward running lightweight, industry-specific AI models for practical business use, and highlights distributed cloud and edge architectures as essential for real-time, actionable insights and operational intelligence.

Early use cases include in-car voice assistance, AI-powered crop management, virtual shopping experiences, and automated product descriptions. 

Akamai positions AI inference as the next frontier for AI, focusing on delivering faster, smarter decisions and personalized user experiences. This underscores a significant evolution in edge computing, addressing critical latency and cost-efficiency challenges associated with centralized cloud AI inferencing. 
