Google Announces Edge TPU

Accepted submission by takyon at 2018-07-26 02:24:09
Hardware

Google unwraps its gateway drug: Edge TPU chips for IoT AI code; Custom ASICs make decisions on sensors as developers get hooked on ad giant's cloud [theregister.co.uk]

Google has designed a low-power version of its homegrown AI math accelerator, dubbed it the Edge TPU, and promised to ship it to developers by October. Announced at Google Next 2018 today, the ASIC is a cut-down edition of its Tensor Processing Unit [theregister.co.uk] (TPU) family of in-house-designed coprocessors. TPUs are used internally at Google to power its machine-learning-based services, or are rentable [theregister.co.uk] via its public cloud. These chips are specifically designed to train neural networks and perform inference.

Now the web giant has developed a cut-down inference-only version suitable for running in Internet-of-Things gateways. The idea is you have a bunch of sensors and devices in your home, factory, office, hospital, etc, connected to one of these gateways, which then connects to Google's backend services in the cloud for additional processing.

Inside the gateway is the Edge TPU, plus potentially a graphics processor, and a general-purpose application processor running Linux or Android and Google's Cloud IoT Edge software stack. This stack contains lightweight TensorFlow-based libraries and models that access the Edge TPU to perform AI tasks at high speed in hardware. This work can also be performed on the application CPU and GPU cores, if necessary. You can use your own custom models if you wish.
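The article doesn't show the Cloud IoT Edge APIs, but as a rough illustration of the kind of on-gateway inference it describes, here is a minimal sketch using the TensorFlow Lite runtime with an Edge TPU delegate. The model file name and input handling are assumptions, not anything Google has published for this product.

	import numpy as np
	import tflite_runtime.interpreter as tflite

	# Load a TensorFlow Lite model compiled for the Edge TPU and attach the
	# Edge TPU delegate so supported ops run on the ASIC rather than the CPU.
	# "model_edgetpu.tflite" is a hypothetical model file.
	interpreter = tflite.Interpreter(
	    model_path="model_edgetpu.tflite",
	    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
	)
	interpreter.allocate_tensors()

	input_details = interpreter.get_input_details()
	output_details = interpreter.get_output_details()

	# Feed a dummy sensor frame shaped like the model's input tensor.
	frame = np.zeros(input_details[0]["shape"], dtype=input_details[0]["dtype"])
	interpreter.set_tensor(input_details[0]["index"], frame)
	interpreter.invoke()

	scores = interpreter.get_tensor(output_details[0]["index"])
	print("top class:", int(np.argmax(scores)))

Falling back to the application CPU, as the article notes, would amount to dropping the delegate and running the same interpreter on the plain TensorFlow Lite kernels.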

The stack ensures connections between the gateway and the backend are secure. If you wanted, you could train a neural network model using Google's Cloud TPUs and have the Edge TPUs perform inference locally.
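A rough sketch of that train-in-the-cloud, infer-at-the-edge split, assuming TensorFlow 2.x, a Cloud TPU named "my-tpu", and a toy Keras model; the extra step of compiling the converted model for the Edge TPU is a separate tool not shown here.

	import tensorflow as tf

	# Train on a Cloud TPU via the TPU distribution strategy.
	resolver = tf.distribute.cluster_resolver.TPUClusterResolver(tpu="my-tpu")
	tf.config.experimental_connect_to_cluster(resolver)
	tf.tpu.experimental.initialize_tpu_system(resolver)
	strategy = tf.distribute.TPUStrategy(resolver)

	with strategy.scope():
	    model = tf.keras.Sequential([
	        tf.keras.layers.Dense(64, activation="relu", input_shape=(16,)),
	        tf.keras.layers.Dense(4, activation="softmax"),
	    ])
	    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

	# model.fit(train_dataset, epochs=5)  # training data omitted in this sketch

	# Convert the trained model to a quantized TensorFlow Lite file for
	# local inference on the gateway.
	converter = tf.lite.TFLiteConverter.from_keras_model(model)
	converter.optimizations = [tf.lite.Optimize.DEFAULT]
	open("model.tflite", "wb").write(converter.convert())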

Google announcement [blog.google]. Also at TechCrunch [techcrunch.com], CNBC [cnbc.com], and CNET [cnet.com].

Related: Google's New TPUs are Now Much Faster -- will be Made Available to Researchers [soylentnews.org]
Google Renting Access to Tensor Processing Units (TPUs) [soylentnews.org]
Nvidia V100 GPUs and Google TPUv2 Chips Benchmarked; V100 GPUs Now on Google Cloud [soylentnews.org]


Original Submission