
USI targets design/build services for customers with edge AI hardware


Network storage product design and manufacturing services provider USI recently partnered with brand customers under a joint design and manufacturing (JDM) arrangement to support edge AI computing servers. USI provided the overall architecture and conceptual design, chassis design, hardware, firmware design (switch firmware, BIOS, and BMC firmware), and thermal design. The product complies with the NEBS telecommunications standard, which imposes stricter requirements than those for standard rack servers.

The 2U rack-mount edge AI server features Intel’s third-generation Xeon Platinum processors, four single-width or two double-width NVIDIA GPUs, and a short chassis that fits into tight spaces, while remaining quiet, rugged, dust-proof, and shock-proof. It can run continuously at 5-45 degrees Celsius and can be deployed in outdoor telecommunication boxes, computer rooms, factories, shops, and other environments.

Market research firm Gartner predicts that 75% of enterprise-generated data will be processed at the edge by 2025. Global spending on edge computing will reach $10.96 billion by 2026, according to market research firm Statista.

Daimon Lee, General Manager of USI’s Data Network & Storage BU, says that edge computing brings enterprise applications closer to IoT devices or edge servers in a distributed computing architecture. Being closer to data sources can deliver substantial business benefits, improve responsiveness, and provide higher bandwidth availability. USI notes it will continue to develop the server storage market with efficient turnkey services covering design, production, testing, and verification.
