
posted by martyb on Friday May 19 2017, @12:34AM
from the Are-you-thinking-what-I'm-thinking? dept.

Google's machine learning oriented chips have gotten an upgrade:

At Google I/O 2017, Google announced its next-generation machine learning chip, called the "Cloud TPU." The new TPU no longer does only inference--now it can also train neural networks.

[...] In last month's paper, Google hinted that a next-generation TPU could be significantly faster if certain modifications were made. The Cloud TPU seems to have received some of those improvements. It's now much faster, and it can also do floating-point computation, which means it's suitable for training neural networks, too.

According to Google, the chip can achieve 180 teraflops of floating-point performance, six times the FP16 half-precision performance of Nvidia's latest Tesla V100 accelerator. Even when compared against Nvidia's "Tensor Core" performance, the Cloud TPU is still 50% faster.
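For context (assuming Nvidia's published V100 figures of roughly 30 TFLOPS for general-purpose FP16 and 120 TFLOPS for the Tensor Cores), the ratios work out to 180 / 30 = 6x and 180 / 120 = 1.5x, which is where the "six times" and "50% faster" comparisons come from.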

[...] Google will also donate access to 1,000 Cloud TPUs to top researchers under the TensorFlow Research Cloud program to see what people do with them.

Also at EETimes and Google.

Previously: Google Reveals Homegrown "TPU" For Machine Learning
Google Pulls Back the Covers on Its First Machine Learning Chip
Nvidia Compares Google's TPUs to the Tesla P40
NVIDIA's Volta Architecture Unveiled: GV100 and Tesla V100


Original Submission

 
  • (Score: 2) by LoRdTAW on Friday May 19 2017, @04:38PM (1 child)


    So when will the first open source chip show up?

    I have a feeling that it won't happen. TPUs might never see the light of day outside of a Google data center, because Google would lose control of the AI race. We may see open source tools to talk to said AI systems, but the platform itself will be a proprietary black box.

    We are moving rapidly towards a closed computing world where the likes of Google, Amazon, and others seek to aggregate all of your computing needs into THEIR data centers. The idea is that everyone either pays a recurring fee or is duped into freely working for said companies by letting them commoditize and sell data mined from every corner of your digital life. You will also be duped into doing the dirty footwork of gathering other data for them, handing over your videos, photos, locations, and more.

    All we will be left with are locked-down dumb terminals in the form of tablets, "smart" TVs and phones. Desktop/laptop computers will eventually be abandoned by said companies to "focus on delivering a user friendly multimedia platform". They will be about as modern and stylish as 1970s decor. The walled gardens, which are for now avoidable, might one day be the only choice. I once thought it was paranoia to think like this, but it is becoming more and more real at a faster and faster pace. VR, AR, AI, and every other buzz acronym will have us enslaved one day. But not to a giant computer; to the almighty dollar, the man behind the curtain.

  • (Score: 2) by kaszz on Friday May 19 2017, @04:47PM


    It will be the data center behind the fiber ;-)

    The component that needs to be freed is the process of making chips. Once that is accomplished, their monopoly decreases significantly.