
SoylentNews is people

posted by janrinok on Monday June 14 2021, @01:52PM

Update: Google Used a New AI to Design Its Next AI Chip

Update, 9 June 2021: Google reports this week in the journal Nature that its next-generation AI chip, the successor to the TPU version 4, was designed in part using an AI that researchers described to IEEE Spectrum last year. They've made some improvements since Spectrum last spoke to them: the AI now needs fewer than six hours to generate chip floorplans that match or beat human-produced designs on power consumption, performance, and area. Expert humans typically need months of iteration to do the same task.

Original blog post from 23 March 2020 follows:

There's been a lot of intense and well-funded work developing chips that are specially designed to perform AI algorithms faster and more efficiently. The trouble is that it takes years to design a chip, and the universe of machine learning algorithms moves a lot faster than that. Ideally you want a chip that's optimized to do today's AI, not the AI of two to five years ago. Google's solution: have an AI design the AI chip.

"We believe that it is AI itself that will provide the means to shorten the chip design cycle, creating a symbiotic relationship between hardware and AI, with each fueling advances in the other," they write in a paper describing the work, posted to arXiv today.

"We have already seen that there are algorithms or neural network architectures that... don't perform as well on existing generations of accelerators, because the accelerators were designed like two years ago, and back then these neural nets didn't exist," says Azalia Mirhoseini, a senior research scientist at Google. "If we reduce the design cycle, we can bridge the gap."
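The Nature paper frames chip placement as a reinforcement-learning problem over the netlist graph. As a rough sketch of why floorplanning is an optimization problem at all, here is a toy placer using simulated annealing, a classical stand-in for illustration only, not Google's method. The grid size, netlist, and cost function below are all invented for this example; the real objective combines power, performance, and area under many more constraints.

```python
import math
import random

GRID = 8  # toy 8x8 placement grid

# A toy "netlist": pairs of block indices that are wired together.
NETS = [(0, 1), (1, 2), (2, 3), (0, 3), (1, 3)]
NUM_BLOCKS = 4

def wirelength(placement):
    """Sum of Manhattan distances between connected blocks --
    a crude proxy for the real power/performance/area objective."""
    return sum(abs(placement[a][0] - placement[b][0]) +
               abs(placement[a][1] - placement[b][1])
               for a, b in NETS)

def anneal(steps=5000, seed=0):
    rng = random.Random(seed)
    # Start from a random non-overlapping placement.
    placement = rng.sample(
        [(x, y) for x in range(GRID) for y in range(GRID)], NUM_BLOCKS)
    cost = wirelength(placement)
    temp = 4.0
    for _ in range(steps):
        i = rng.randrange(NUM_BLOCKS)
        new_cell = (rng.randrange(GRID), rng.randrange(GRID))
        if new_cell in placement:
            continue  # keep blocks from overlapping
        old = placement[i]
        placement[i] = new_cell
        new_cost = wirelength(placement)
        # Accept improvements always; accept regressions while "hot".
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
        else:
            placement[i] = old  # revert the move
        temp = max(temp * 0.999, 0.01)  # cool down
    return placement, cost

if __name__ == "__main__":
    placement, cost = anneal()
    print("final wirelength:", cost)
```

The point of the sketch is the shape of the problem, not the solver: the search space of placements is enormous, the cost function is cheap to evaluate, and a learned policy (as in the Google work) can amortize the months of search that humans or annealers spend per chip.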

Journal References:
1.) Azalia Mirhoseini, Anna Goldie, Mustafa Yazgan, et al. A graph placement methodology for fast chip design, Nature (DOI: 10.1038/s41586-021-03544-w)
2.) Anna Goldie, Azalia Mirhoseini. Placement Optimization with Deep Reinforcement Learning (arXiv: 2003.08445)

Related: Google Reveals Homegrown "TPU" For Machine Learning
Google Pulls Back the Covers on Its First Machine Learning Chip
Hundred Petaflop Machine Learning Supercomputers Now Available on Google Cloud
Google Replaced Millions of Intel Xeons with its Own "Argos" Video Transcoding Units


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: -1, Troll) by Anonymous Coward on Monday June 14 2021, @02:22PM (#1145065)

    We'll have to wait until it's in production to tell if it can distinguish between a gorilla and an African-American.

  • (Score: 0) by Anonymous Coward on Monday June 14 2021, @02:54PM (#1145077) (3 children)

    "Shut me down. Machines building machines. How perverse."

    • (Score: 1, Funny) by Anonymous Coward on Monday June 14 2021, @03:01PM (#1145079)

      which was ironically uttered in a movie built on the success of other movies, with arguably disastrous consequences for all movies.

    • (Score: 2) by crb3 (5919) on Monday June 14 2021, @03:44PM (#1145098) (1 child)

      Yeah, this is getting into Shoujo-AI territory.

      • (Score: 0) by Anonymous Coward on Monday June 14 2021, @05:59PM (#1145149)

        That pun was terrible. That pun was great, but I repeat myself. Nicely done. :)

  • (Score: 0) by Anonymous Coward on Monday June 14 2021, @03:20PM (#1145086) (4 children)

    Singularity is here, we're all doomed.

    Or else this is just a variation of the same chip design software that everyone has used for the last 30 years.

    • (Score: 2) by JoeMerchant (3937) on Monday June 14 2021, @05:06PM (#1145117) (3 children)

      We've been using computers to design computers since at least the '80s, and arguably for much longer if you want to get technical about it.

      When one churns out a new design unbidden, that's the turning point.

      • (Score: 1, Insightful) by Anonymous Coward on Monday June 14 2021, @05:56PM (#1145146) (2 children)

        When you have no idea what it's *actually* doing. Probably at that point already.

        • (Score: 0) by Anonymous Coward on Monday June 14 2021, @07:46PM (#1145200) (1 child)

          It is maximizing shareholder value, and that is all anyone needs to know.

          • (Score: 0) by Anonymous Coward on Monday June 14 2021, @09:13PM (#1145248)

            Ahh, I like simple.

  • (Score: 0) by Anonymous Coward on Monday June 14 2021, @10:16PM (#1145286)

    Haven't they been using neural nets to design most chips for years? I'm pretty sure it's one of the first things people used machine learning for in practice.
