Nvidia’s AI and edge announcements get a surprisingly warm welcome
Game architectures will probably be part of Nvidia Corp.’s genetic makeup for as long as the company exists, but the GPU giant could one day be known primarily as a data center and AI leader. That’s one of the takeaways from Nvidia’s annual conference, where the announcements focused on GPU chips for data center deployments. The key development: the new designs offer massive gains in the performance required for AI and other cutting-edge applications while consuming far less power.
The recent product announcements surprised many, including some prominent sell-side stock analysts. Company executives at Nvidia’s annual conference said next to nothing about interactive game infrastructure, and analysts hardly took notice.
Of particular interest was the DGX A100, Nvidia’s third-generation AI system. The new box boasts an upper performance limit of five petaflops. Executives describe the DGX A100 as a data center in itself.
In a release, the company claimed that the machine, with eight new A100 Tensor Core GPUs, is the first AI system capable of handling everything “from data analytics to training to inference.” Each GPU can be partitioned into as many as seven instances, for up to 56 across the system.
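For readers who want to see what that partitioning looks like in practice, here is a minimal sketch (not from Nvidia’s materials) that enumerates Multi-Instance GPU slices on a node using pynvml, Nvidia’s Python bindings for the NVML management library; it assumes an A100-class GPU with MIG mode already enabled.

```python
# Sketch: count the MIG instances exposed on a node via pynvml (NVML bindings).
# Assumes MIG-capable GPUs (e.g., A100) with MIG mode already turned on.
import pynvml

pynvml.nvmlInit()
total_instances = 0
for i in range(pynvml.nvmlDeviceGetCount()):
    handle = pynvml.nvmlDeviceGetHandleByIndex(i)
    try:
        current_mode, _pending_mode = pynvml.nvmlDeviceGetMigMode(handle)
    except pynvml.NVMLError:
        continue  # this GPU does not support MIG
    if current_mode != pynvml.NVML_DEVICE_MIG_ENABLE:
        continue
    # An A100 can expose up to seven MIG instances; count the ones actually created.
    for mig_index in range(pynvml.nvmlDeviceGetMaxMigDeviceCount(handle)):
        try:
            pynvml.nvmlDeviceGetMigDeviceHandleByIndex(handle, mig_index)
            total_instances += 1
        except pynvml.NVMLError:
            pass  # this slot has no instance configured
print(f"MIG instances visible on this node: {total_instances}")
pynvml.nvmlShutdown()
```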
Nvidia claims that a $1 million data center of five DGX A100 systems, drawing 28 kilowatts and doing AI training and inference work, is the equal of an $11 million data center with 50 DGX-1 systems and 600 CPUs drawing 630 kilowatts.
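As a back-of-the-envelope check on those figures (a sketch using only the numbers quoted above), the claimed savings work out to roughly 11x on cost and about 22x on power:

```python
# Ratios implied by Nvidia's comparison, using the figures quoted in the release.
legacy_cost_usd, legacy_power_kw = 11_000_000, 630  # 50 DGX-1 systems plus 600 CPUs
a100_cost_usd, a100_power_kw = 1_000_000, 28        # five DGX A100 systems

print(f"Cost ratio:  {legacy_cost_usd / a100_cost_usd:.0f}x")   # ~11x
print(f"Power ratio: {legacy_power_kw / a100_power_kw:.1f}x")   # ~22.5x
```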
The new GPUs are generally being positioned as appropriate for centralized cloud services. However, as Edge Industry Review previously reported, Nvidia’s GPUs have been making their way into many more data centers around the globe, including those of providers such as EdgeConneX, which has built facilities at the aggregation edge in numerous smaller cities.
This follows Nvidia’s fall announcement of its EGX Edge AI platform, Kubernetes- and container-native software that delivers GPU-accelerated AI to both x86- and Arm-based systems.
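To illustrate what “Kubernetes- and container-native” means in practice, here is a minimal sketch of requesting a GPU for a containerized workload with the official kubernetes Python client. The pod name, image and namespace are placeholders, and the nvidia.com/gpu resource assumes the cluster runs Nvidia’s device plugin; none of this is EGX-specific code.

```python
# Sketch: schedule a GPU-accelerated container on a Kubernetes cluster.
# Assumes a reachable cluster with the NVIDIA device plugin installed.
from kubernetes import client, config

config.load_kube_config()  # read the local kubeconfig

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="edge-inference-demo"),  # placeholder name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="example.com/edge-inference:latest",  # placeholder image
                resources=client.V1ResourceRequirements(
                    limits={"nvidia.com/gpu": "1"}  # ask the scheduler for one GPU
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```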
Enabling AR/VR on the edge
During last week’s conference, Nvidia also announced the debut of CloudXR 1.0, software for streaming augmented, virtual and mixed reality (collectively, XR) content over high-performance networks, especially 5G.
CloudXR is supposed to work with any head-mounted display and with connected Microsoft Corp. Windows and Alphabet Inc. Android devices, delivering pro-grade graphics. Executives are pitching the software to businesses “from architecture to retail.”
Wall Street reacts
The list of new edge and AI products announced was extensive, but the main takeaway from Wall Street is that Nvidia’s revenue growth opportunities still have plenty of upside, thanks in part to edge and AI.
Four sell-side analysts surveyed by Yahoo! Finance heard the news and maintained either “buy” or “outperform” ratings.
This is significant because analysts tend to run for cover when a company best known for one category of product, as Nvidia is, takes up their day with news of non-core activity.
Analysis
In terms of the new chips and server systems, our take is that saving money on the servers is important, but the huge reduction in power consumption is perhaps even more important. Lower power consumption means data center providers can pack more processing power into the same physical space as previous-generation systems without also having to reconfigure power and cooling systems. For them, that can translate into more revenue per square foot.
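To make the density point concrete, here is a rough sketch using only the figures above plus an assumed 20-kilowatt rack power budget (the budget is our assumption, not an Nvidia number):

```python
# Illustrative rack-density math. The 20 kW budget is an assumption; the per-system
# power (28 kW across five systems) and 5-petaflop rating come from the article above.
rack_budget_kw = 20.0          # assumed usable power per rack
per_system_kw = 28.0 / 5       # ~5.6 kW per DGX A100 under the quoted workload
per_system_pflops = 5.0

systems_per_rack = int(rack_budget_kw // per_system_kw)
print(f"{systems_per_rack} DGX A100s per rack, roughly "
      f"{systems_per_rack * per_system_pflops:.0f} petaflops")
```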
Nvidia also knows as well as any other chip company that it needs to build software development tools to go along with its chips, and supporting Kubernetes on multiple chip architectures is a smart move that should help developers move more AI models to edge data centers.
Jim Davis, Principal Analyst, Edge Research Group