Aetina unveils ASIC-based edge AI system powered by Blaize Pathfinder P1600
Aetina Corporation, an edge device manufacturer and AI solution provider, has unveiled an embedded computer powered by the Blaize Pathfinder P1600 system-on-module. Its compact, fanless design allows it to run reliably and efficiently in extreme temperatures. The system can be integrated into large-scale embedded systems targeting computer vision applications such as object detection, human motion detection and automated inspection.
The AIE-CP1A-A1 embedded computer's low memory bandwidth requirements and low latency make it well suited for neural network deployment. It offers several input/output ports for deployment in commercial and industrial AI systems.
“We are now expanding our edge computing product line to bring more GPU and ASIC-based solutions to help developers build their AI systems in various verticals and industries,” said Jackal Chen, senior product manager at Aetina. “In the future, by integrating different types of chips and modules, we can offer heterogeneous computing devices that are suitable for different AI models.”
At the core of the AIE-CP1A-A1 edge system is the Blaize Pathfinder P1600 system-on-module, capable of delivering 16 TOPS of AI performance. With its software-programmable AI inference accelerator, the Blaize P1600 module allows users to keep their systems up to date even after deployment at edge locations.
The compact edge AI system supports H.264/H.265 video encoding and decoding, so users can deploy the machine for image recognition and video analytics workloads. Blaize offers users access to the NetDeploy and Picasso software development platforms, allowing them to develop AI algorithms for these applications.
“Keeping AI processing and inferencing workloads at the edge, rather than sending data to the cloud, is critical for cost-effective and almost latency-free AI applications,” said Dinakar Munagala, CEO and co-founder of Blaize. “The Pathfinder SoM and our GSP architecture enable this key business objective for the demands of computer vision applications.”
In December, Aetina also launched an MXM module based on the Hailo-8 inference processor to support high-mobility applications that demand high performance, low latency and high throughput.