OVHcloud adds NVIDIA GPUs to boost its AI solutions

European cloud computing company OVHcloud has added new NVIDIA GPU offerings to its solutions portfolio as part of its artificial intelligence strategy.

OVHcloud is a member of the NVIDIA Partner Network and says it continues to expand its AI solutions portfolio.

The company says it is designing AI-enabled infrastructures, which include new NVIDIA H100 and A100 Tensor Core GPUs, through a vertically integrated industrial model. Its customers will be able to choose from several options to power their machine learning workloads, including large language models.

OVHcloud says it already offers competitive pricing on older-generation NVIDIA V100 and V100S GPUs, and has now added NVIDIA H100, A100, L40S and L4 GPU offerings.

According to Michel Paulin, CEO of OVHcloud: “AI will seriously transform our clients’ businesses and we are in a unique position to help them easily transition to this new era. With a one-of-a-kind AI infrastructure that leverages all our expertise and the most sought-after GPUs, we provide world-class performance with all the benefits of the cloud. Our AI solutions customers will also benefit from these novelties through our easy-to-use AI Notebooks, AI Training and AI Deploy offers.”

New GPU instances powered by the NVIDIA A100 80GB enable AI specialists to run complex projects on its highly specialized Tensor Cores, the company notes, and the instances are also well suited to inference, including LLM-related projects, thanks to optimizations for such workloads.
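To give a sense of what such an inference workload looks like in practice, the following is a minimal sketch, assuming a PyTorch and Hugging Face Transformers environment on a CUDA-capable GPU instance; the model name, prompt and token count are illustrative placeholders, not part of OVHcloud's offering.

# Minimal LLM inference sketch on a CUDA GPU instance (illustrative only)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; substitute the LLM you actually deploy
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,  # half precision to exercise the GPU's Tensor Cores
).to("cuda")

inputs = tokenizer("Edge data centers are", return_tensors="pt").to("cuda")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))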

“Enterprises are looking for flexible cloud service options to drive innovation internally and to help customers adopt new generative AI applications,” says Matthew McGrigg, director of global development for cloud partners at NVIDIA.

“By offering a full complement of NVIDIA accelerated computing, OVHcloud is able to handle a variety of inference and training workloads.”

OVHcloud is also announcing upcoming H100-based GPU instances built around NVIDIA’s latest accelerator, with compute power starting at 26 teraFLOPS (FP64) per PCIe GPU.

The company also unveiled GPU instances featuring NVIDIA L4 GPUs with 24GB of memory. The L4, based on the NVIDIA Ada Lovelace architecture, is a universal GPU aimed at AI and video workloads.

OVHcloud plans to add NVIDIA H100 and A100 options to its set of AI PaaS solutions designed to accompany the data life cycle: AI Notebooks, AI Training and AI Deploy.
