Sony bakes AI directly into image sensors, partners with Microsoft
Sony Corp. says it has two CMOS image sensors that will ship with machine learning capabilities baked onto the sensors themselves.
Care to order Microsoft Azure AI with that? Sony says it has also partnered with Microsoft Corp. to embed Azure AI capabilities in its new chips.
Sony said its sensors are the world’s first AI-equipped image sensors and is pitching them to the retail and industrial equipment sectors. Each sensor is manufactured in a stacked configuration consisting of a pixel chip and a logic chip. The pixel chip sends data to the logic chip, which has built-in AI processing capabilities, eliminating the need to send data to an external high-performance processor or memory. This approach is said to improve both performance and power consumption over traditional designs.
The new sensors are the latest example of moving capabilities to edge devices that would normally be handled by a server or the cloud. More than 80% of global chipsets will be AI-equipped by 2024, and the market in the United States alone will be worth $12 billion, according to forecasts from Research and Markets.
To garner a larger slice of that market, it will be important to make it easier for developers to customize chips with AI models suited to a given task. Recognizing products on a store shelf and spotting a defective part are but two examples of the need for different vision models.
Sony will develop an application that works with Microsoft’s Azure IoT and Cognitive Services offerings, enabling independent software vendors (ISVs) specializing in computer vision and video analytics, as well as smart camera original equipment manufacturers (OEMs), to quickly bring new smart video surveillance offerings to market. The idea is to make model development and training easier by offering the chips with Microsoft’s software already integrated.
The sensors themselves are capable of real-time object tracking, and AI models can be changed by rewriting internal memory, according to Sony. For instance, Sony says an AI model that generates heat maps on a sensor can be swapped out for one assigned an entirely different task.
The sensors can analyze what they capture and push along only metadata to the cloud or data centers. Beyond addressing privacy concerns and latency, the architecture conserves power by not transmitting everything that hits the chip.
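The metadata-only flow Sony describes can be sketched roughly as follows. This is an illustrative stub, not Sony's actual API: the `Detection` format, class names, and the stand-in inference function are all assumptions made for the sake of the example.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    """Compact metadata emitted in place of raw pixels."""
    label: str
    confidence: float
    bbox: tuple  # (x, y, w, h) in pixel coordinates

def on_sensor_inference(frame_bytes: bytes) -> List[Detection]:
    """Stand-in for the logic chip's built-in model.
    A real sensor would run a neural network here; this stub
    returns a fixed detection purely for illustration."""
    return [Detection("person", 0.91, (120, 80, 64, 128))]

def frame_to_payload(frame_bytes: bytes) -> List[dict]:
    """What leaves the device: metadata only, never the frame."""
    return [
        {"label": d.label, "confidence": d.confidence, "bbox": d.bbox}
        for d in on_sensor_inference(frame_bytes)
    ]

# A dummy 640x480 single-channel frame: 307,200 bytes in,
# a few dozen bytes of metadata out.
payload = frame_to_payload(b"\x00" * 640 * 480)
```

The point of the sketch is the asymmetry: the raw frame stays on the sensor, and only the small `payload` structure is ever transmitted, which is where the bandwidth, latency, and privacy gains come from.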
Samples of the IMX500 are shipping this month, and samples of the IMX501 go out next month. Sample prices are ¥10,000 ($92.75) for a bare-chip IMX500 and ¥20,000 ($185.53) for a packaged IMX501.
Market Background
Sony’s announcement comes as specialized chip designs are gaining a lot of investor interest.
Recent market developments include venture funding for San Diego-based Kneron, a developer of AI algorithms and AI-optimized processor designs for edge devices such as surveillance cameras and other smart home products. Kneron raised $40 million in January 2020 to further develop its products.
Earlier the same month, Apple acquired Xnor.AI, a developer of software and chip technology that makes the job of object recognition easier for edge devices.
The development of new chip technologies for edge AI is linked in part to the fact that running AI on the device means there’s no need to send large amounts of data to brawnier servers at the access edge or core cloud. Applications such as image recognition can take place without latency penalties.
Article Topics
Azure | edge AI | image processing | IoT | Microsoft | Sony | video surveillance