Nota nabs $14.7M funding for edge AI model optimization

Nota, which provides technology to optimize AI models, announced that it has closed a $14.7 million Series B funding round. The company’s technology is another important piece of the puzzle when it comes to helping resource-constrained edge devices run applications such as traffic control, worker safety, and biometric identification.

Participants in the funding round included Stonebridge Ventures, LB Investment, DS Asset, Intervest, and Company K Partners. The fresh funding comes roughly a year after Nota closed its Series A round with $6.7 million. Nota has raised a total of $23 million to date.

“This will be a step-up opportunity for us to reach a more diverse range of industries with our technology, which has already gained validation from top-tier enterprises,” said Myungsu Chae, CEO of Nota, in a prepared statement. The technology has been tested in Pyeongtaek, South Korea, where the company deployed a real-time traffic control solution that uses object detection to measure traffic volume and automatically adjust traffic signals. The company has also partnered with Musma of Korea to develop construction site management systems that use video to track people and machinery and help prevent worksite accidents.

Nota’s products include NetsPresso, a proprietary hardware-aware AutoML platform that automates the AI model development process using only datasets. The company claims that NetsPresso, which has been in beta testing, automates the process of optimizing AI models for different uses and hardware platforms. This translates into a significant reduction in the time required to develop and deploy models, according to Nota.

In addition to the funding news, Nota said that it has released a beta version of its NetsPresso Compression Toolkit module, which allows deep learning engineers to use a GUI to apply different AI model compression techniques, again letting them optimize AI models more quickly than with previous manual tuning methods.
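Nota has not published the internals of the Compression Toolkit, so the sketch below is purely illustrative: it uses PyTorch's open-source pruning utilities (an assumed stand-in, not Nota's API) to show one of the manual compression steps, magnitude-based weight pruning, that a GUI-driven toolkit of this kind would automate.

```python
# Illustration only: magnitude-based weight pruning with PyTorch's
# torch.nn.utils.prune. This is not Nota's toolkit; it shows one of the
# manual compression steps that a GUI-driven tool would automate.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Hypothetical layer standing in for part of a trained detection model.
layer = nn.Conv2d(in_channels=64, out_channels=64, kernel_size=3)

# Zero out the 30% of weights with the smallest L1 magnitude.
prune.l1_unstructured(layer, name="weight", amount=0.3)

# Make the pruning permanent (folds the mask into the weight tensor).
prune.remove(layer, "weight")

sparsity = float((layer.weight == 0).sum()) / layer.weight.numel()
print(f"Layer sparsity after pruning: {sparsity:.0%}")
```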

Nota’s offerings enable AI model compression that can be tuned by user-selected parameters, including optimization for specific devices based on systems from Nvidia, Raspberry Pi and other providers. Nota claims inference performance can be increased eight-fold while consuming less power than the unoptimized model.
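Nota does not detail how the eight-fold figure is achieved, but post-training quantization is one common compression step behind such gains on edge hardware. The example below is a generic sketch using PyTorch's built-in dynamic quantization API (not NetsPresso): it converts a model's weights to int8 and compares the serialized sizes before and after.

```python
# Generic illustration of post-training dynamic quantization with PyTorch.
# Not Nota's API; it demonstrates the kind of device-oriented compression
# such platforms automate.
import os
import torch
import torch.nn as nn

# A small stand-in model (hypothetical); in practice this would be a trained
# network destined for an edge device.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)
model.eval()

# Post-training dynamic quantization: Linear-layer weights are converted to
# int8, shrinking the model and typically speeding up CPU inference.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

def size_mb(m: nn.Module) -> float:
    """Serialize the state dict and report its size in megabytes."""
    torch.save(m.state_dict(), "tmp.pt")
    mb = os.path.getsize("tmp.pt") / 1e6
    os.remove("tmp.pt")
    return mb

print(f"fp32 model: {size_mb(model):.2f} MB")
print(f"int8 model: {size_mb(quantized):.2f} MB")
```

The size reduction shown here is only one part of the picture; latency and power gains on a specific target (an Nvidia module or a Raspberry Pi, for example) depend on the runtime and hardware, which is where device-aware tooling earns its keep.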

Analysis

Nota’s place in the AI ecosystem is in what is referred to as the MLOps segment, where object detection and inferencing models are refined and deployed in production on edge devices such as video surveillance cameras. With its optimization technology, adjustments to models can be pushed to deployed systems more quickly while still ensuring that the hardware performs efficiently. The larger picture is that Nota’s technology can bring new capabilities to edge devices without necessitating hardware upgrades; from a people perspective, it means data scientists (a precious resource) spend less time creating and maintaining AI models.
