IT Brief New Zealand - Technology news for CIOs & IT decision-makers
GTC18 - NVIDIA and Arm partner to make IoT smarter
Wed, 28th Mar 2018

NVIDIA and Arm have announced that they are partnering to bring deep learning inferencing to IoT devices and consumer electronics.

An inference engine is the aspect of AI that allows a device to figure out new information based on a set of rules and what it already knows.
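That idea can be made concrete with a toy sketch. The code below is illustrative only (not NVIDIA or Arm code): a minimal forward-chaining inference engine that derives new facts from a set of rules and facts it already knows, using made-up rule names for a hypothetical smart-home device.

```python
def infer(facts, rules):
    """Forward-chain: apply rules until no new facts can be derived.

    facts -- a set of known facts (strings)
    rules -- a list of (premises, conclusion) pairs, where premises
             is a set of facts that must all hold
    """
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire the rule if all its premises are known facts.
            if conclusion not in facts and premises <= facts:
                facts.add(conclusion)
                changed = True
    return facts

# Hypothetical IoT example: a light that reasons about sensor input.
known = {"motion_detected", "after_sunset"}
rules = [
    ({"motion_detected", "after_sunset"}, "turn_on_light"),
    ({"turn_on_light"}, "log_event"),
]

print(infer(known, rules))
```

The engine loops until a full pass adds nothing new, so chained rules (here, turning on the light triggers a log event) resolve automatically.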

This collaboration means that IoT chip companies can integrate AI into their designs, which should help to make intelligent products more affordable and accessible for consumers.

The companies will integrate NVIDIA's open source Deep Learning Accelerator (NVDLA) architecture into Arm's Project Trillium platform for machine learning.

“Inferencing will become a core capability of every IoT device in the future," says NVIDIA autonomous machines vice president and general manager Deepu Talla.

"Our partnership with Arm will help drive this wave of adoption by making it easy for hundreds of chip companies to incorporate deep learning technology."

Arm executive vice president and IP group president Rene Haas adds, "Accelerating AI at the edge is critical in enabling Arm's vision of connecting a trillion IoT devices.

"Today we are one step closer to that vision by incorporating NVDLA into the Arm Project Trillium platform, as our entire ecosystem will immediately benefit from the expertise and capabilities our two companies bring in AI and IoT."

Based on NVIDIA's Xavier system-on-chip, NVDLA is a free, open architecture that promotes a standard way to design deep learning inference accelerators, speeding the adoption of deep learning inference.
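The workload such an accelerator speeds up is, at its core, a fixed-weight forward pass through a trained network. The sketch below is purely illustrative (the weights are made up, and real accelerators operate on tensors in hardware): a tiny two-layer network run in plain Python to show what "inference" means computationally.

```python
def relu(x):
    # Rectified linear activation, applied element-wise.
    return [max(0.0, v) for v in x]

def dense(x, weights, bias):
    # One fully connected layer: y = Wx + b.
    return [sum(w * v for w, v in zip(row, x)) + b
            for row, b in zip(weights, bias)]

# Hypothetical "pre-trained" weights for a 2-input, 2-hidden, 1-output net.
W1, b1 = [[0.5, -0.2], [0.1, 0.4]], [0.0, 0.1]
W2, b2 = [[1.0, -1.0]], [0.0]

def predict(x):
    # Inference: weights are fixed; only the input changes.
    return dense(relu(dense(x, W1, b1)), W2, b2)

print(predict([1.0, 2.0]))
```

Training determines the weights once; an inference accelerator like an NVDLA-based block then executes this multiply-accumulate-heavy forward pass repeatedly and efficiently on-device.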

It is supported by NVIDIA's suite of developer tools, including upcoming versions of TensorRT, NVIDIA's software for optimising and running deep learning inference.

The open source design allows for new features to be added regularly, including contributions from the research community.

The integration of NVDLA with Project Trillium will give deep learning developers high levels of performance as they leverage Arm's flexibility and scalability across the wide range of IoT devices.

“This is a win/win for IoT, mobile and embedded chip companies looking to design accelerated AI inferencing solutions,” says Moor Insights &amp; Strategy lead analyst for deep learning Karl Freund.

"NVIDIA is the clear leader in AI and Arm is the leader in IoT, so it makes a lot of sense for them to partner on IP."