Google is bringing artificial intelligence to the edge of networks with a custom processor and an enhanced cloud platform to bulk up its internet of things portfolio.
The merging of AI and IoT will be powered by the new Edge Tensor Processing Unit (TPU), an ASIC designed to run TensorFlow Lite inference—that is, to execute machine learning models built with the lightweight version of Google's machine learning framework without connecting to the cloud—Injong Rhee, Google's vice president for IoT, told Next conference attendees in a keynote.
The tiny chip—four of which can fit on a penny—can pair with sensors to execute machine learning while delivering efficiencies in cost and power consumption, Rhee said.
"Edge TPU will bring a brain to your edge devices," Rhee said, delivering affordable and power-efficient data processing and machine learning computation without performance compromises.
To support development of data processing software for gateways, cameras and other connected devices, Google introduced Cloud IoT Edge.
The integrated platform, released through an early access program, has two components: Edge IoT Core and Edge ML.
Edge IoT Core delivers a runtime environment for securely connecting devices to the cloud—enabling software and firmware updates and managing data exchanges with Google's Cloud IoT Core platform.
Edge ML, based on TensorFlow Lite, executes pre-trained machine learning models in the field, where latency is a concern.
Customers can train models in the cloud and, once the services launch later this year, run them locally. Those workloads can take advantage of the new Edge TPU hardware, or of general-purpose CPUs and GPUs, Rhee said.
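The train-in-the-cloud, run-at-the-edge workflow described above follows the standard TensorFlow Lite pattern: a trained model is converted to the TFLite flat-buffer format and then executed on the device by the TFLite interpreter. The following is a minimal sketch of that pattern using the public TensorFlow API; the tiny one-layer model here is purely a stand-in for a real cloud-trained model, and the example does not target Edge TPU hardware specifically.

```python
import numpy as np
import tensorflow as tf

# Stand-in for a model trained in the cloud: a single dense layer
# mapping 4 input features to 1 output value.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# Convert the trained model to the TensorFlow Lite flat-buffer format,
# the form in which models are shipped to edge devices.
tflite_model = tf.lite.TFLiteConverter.from_keras_model(model).convert()

# On the device: load the model into the TFLite interpreter and run
# a single inference locally, with no cloud connection required.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.ones((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)
```

In a real deployment the converted model bytes would be saved to a `.tflite` file and loaded on the device with `tf.lite.Interpreter(model_path=...)`; delegating execution to an accelerator such as the Edge TPU requires additional, device-specific tooling not shown here.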
In October, Google will also release a development kit, Edge TPU SOM (system on module), to further empower partners and customers to build custom IoT solutions, he said.
-Gina Narcisi contributed to this report