Neuromorphic edge AI powers faster water rescues with drone-based detection

BrainChip has partnered with Arquimea to develop an AI-powered detection solution for enhancing water safety.
The solution uses BrainChip’s Akida processor and Prophesee’s event-based Metavision camera on low-power drones to detect distressed swimmers and surfers. Akida processes vision data directly without converting it into traditional frame-based formats, improving efficiency and reducing latency.
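To make the frame-free idea concrete, here is a minimal, illustrative sketch of how event-camera output differs from frames. It uses synthetic events and hypothetical helper names; it is not the Prophesee Metavision or Akida API, just a demonstration of the sparse (x, y, timestamp, polarity) representation such systems consume.

```python
# Illustrative sketch only: synthetic event-camera data, not the actual
# Prophesee Metavision or BrainChip Akida APIs.
from collections import namedtuple

# An event camera emits a sparse event whenever a pixel's brightness
# changes, instead of dense full frames at a fixed rate.
Event = namedtuple("Event", ["x", "y", "t_us", "polarity"])

def events_in_window(events, start_us, end_us):
    """Select events inside a time window; an event-driven pipeline
    processes only these, not every pixel of every frame."""
    return [e for e in events if start_us <= e.t_us < end_us]

def activity_map(events, width, height):
    """Accumulate per-pixel event counts: a minimal stand-in for the
    sparse activity a neuromorphic processor can consume directly."""
    counts = [[0] * width for _ in range(height)]
    for e in events:
        counts[e.y][e.x] += 1
    return counts

# Three synthetic events from a 4x4 sensor (timestamps in microseconds).
events = [Event(1, 2, 100, 1), Event(1, 2, 150, -1), Event(3, 0, 900, 1)]
window = events_in_window(events, 0, 500)   # keeps the first two events
amap = activity_map(window, 4, 4)
print(len(window), amap[2][1])
```

Because most pixels change rarely, the event stream stays small relative to a full frame, which is the source of the latency and memory savings described above.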
The system enables faster detection, lower power consumption, better tracking, and reduced computational and memory requirements compared to conventional methods. The technology aims to assist lifeguards in monitoring large beach areas, improving search and rescue operations and saving lives.
“The objective of event-based vision solution providers has long been a way to achieve faster and more accurate object detection with substantially fewer memory and computation requirements,” says Jonathan Tapson, Chief Development Officer of BrainChip. “BrainChip’s Akida, working in conjunction with Prophesee’s event-based vision systems, provides the ability to compute drone data efficiently, reducing latency for faster detection and minimizing demands on power to provide companies like ARQUIMEA with advantages that have previously been unavailable. We are proud to have our IP leveraged in such an important life-and-death application.”
BrainChip’s Akida mimics the human brain, analyzing essential sensor inputs locally on the chip, independent of the cloud, enhancing privacy and energy efficiency.
This deployment shows how event-driven, neuromorphic computing at the device edge, running on lightweight, power-efficient drones, can deliver ultra-low-latency inference and high energy efficiency in search and rescue scenarios.