Microsoft on Tuesday launched Azure Percept – a full stack of technologies to run AI code at the network edge – during its annual Ignite conference.
The materials are aimed at organizations deploying, or considering deploying, machine-learning software on network-connected embedded devices and equipment in remote locations. Said systems are expected to employ a level of intelligence to make decisions wherever they are, rather than rely completely on backend services.
For example, cameras that selectively record when they detect certain objects, situations, or people, and sensors that adjust machinery and report back summaries rather than transmit every single reading and await instructions. Y’know, stuff that’s possible without AI but could be enhanced and made more efficient with a trained neural network.
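That summarize-at-the-edge pattern can be sketched in a few lines of Python. To be clear, this is purely illustrative edge logic, not anything from Microsoft's stack – the threshold and the summary fields are our own assumptions:

```python
from statistics import mean

def summarize_readings(readings, alert_threshold):
    """Reduce a batch of sensor readings to one compact report.

    Rather than transmitting every single reading to a backend,
    the edge device sends a small summary, and only raises an
    alert flag when a reading crosses the (hypothetical) threshold.
    """
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": mean(readings),
        "alert": any(r > alert_threshold for r in readings),
    }

# Four temperature readings collapse into one small report;
# the 25.7 reading trips the alert.
report = summarize_readings([20.1, 20.3, 19.8, 25.7], alert_threshold=25.0)
```

The bandwidth win is the point: the device ships one dictionary per batch instead of a stream of raw values, and the backend only needs to react when `alert` is set.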
Connecting up all the bits and pieces of such a deployment – the backend platform, suggested electronics to use at the edge, and the software to glue it all together – is the aim of Percept.
For instance, the stack includes a hardware developer kit, comprising a small computer and a camera unit dubbed the Azure Percept Vision system, which can be trained using Microsoft’s Azure Cognitive Services to perform a range of tasks.
The Azure Percept dev kit contains a modest NXP i.MX 8M processor, and has Wi-Fi and Bluetooth connectivity. The Azure Percept Vision system has an RGB camera and a Movidius Myriad X machine learning accelerator from Intel.
There is also a separate Azure Percept Audio system for speech applications. “We’ve started with the two most common AI workloads, vision and voice, sight and sound, and we’ve given out that blueprint so that manufacturers can take the basics of what we’ve started,” said Roanne Sones, corporate vice president of Microsoft’s edge and platform group. “But they can envision it in any kind of responsible form factor to cover a pattern of the world.”
The Azure Percept Audio (left), Trusted Platform Module (center), and Azure Percept Vision (right). Source: Microsoft
Sones said the platform is designed to make the process of deploying AI on edge devices much easier. “We integrated hardware accelerators with Azure AI and Azure IoT services and designed them to be simple to use and ready to go with minimal setup,” she added.
Developers don’t have to be experts in artificial intelligence to use this gear; various online tutorials show how to connect the hardware to the software and begin training it to perform simple tasks without much coding.
The developer kits appear to be aimed at those producing prototype designs to hook into Azure. For full-scale production use, especially for non-trivial applications, one would expect to use custom or specialist hardware that’s compatible with Percept, should one choose to use Redmond’s stack.
As such, Microsoft is working with device and component manufacturers to have their gear certified to work with Percept, so that customers can purchase the hardware and, hopefully painlessly, slot it into an Azure-backed environment and deploy AI workloads to the gadgets. ®