
SoylentNews is people

posted by Fnord666 on Wednesday February 28 2018, @02:13AM   Printer-friendly
from the is-that-an-AI-in-your-pocket-or.... dept.

Arthur T Knackerbracket has found the following story:

Neural networks are powerful things, but they need a lot of juice. Engineers at MIT have now developed a new chip that cuts neural nets' power consumption by up to 95 percent, potentially allowing them to run on battery-powered mobile devices.

Smartphones these days are getting truly smart, with ever more AI-powered services like digital assistants and real-time translation. But typically the neural nets crunching the data for these services are in the cloud, with data from smartphones ferried back and forth.

That's not ideal, as it requires a lot of communication bandwidth and means potentially sensitive data is being transmitted and stored on servers outside the user's control. But the huge amounts of energy needed to power the GPUs neural networks run on make it impractical to implement them in devices that run on limited battery power.

Engineers at MIT have now designed a chip that cuts that power consumption by up to 95 percent by dramatically reducing the need to shuttle data back and forth between a chip's memory and processors.
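The claim is easiest to see with a back-of-the-envelope count of memory traffic. A rough sketch in Python (the fetch-counting model below is an illustration of the general idea, not the MIT chip's actual design):

```python
# Toy accounting of off-chip/off-array data movement for one dense layer,
# comparing a fetch-per-operand design with an in-memory scheme where
# the weights never leave the memory array. All numbers are illustrative.

def naive_fetches(n_inputs, n_outputs):
    # Every multiply-accumulate fetches one weight and one activation.
    return 2 * n_inputs * n_outputs

def in_memory_fetches(n_inputs, n_outputs):
    # Weights stay inside the memory array; only the input activations
    # go in and the output sums come out.
    return n_inputs + n_outputs

naive = naive_fetches(256, 256)        # 131072 operand fetches
in_mem = in_memory_fetches(256, 256)   # 512 boundary crossings
savings = 1 - in_mem / naive           # well over 99% fewer fetches
```

Since moving a word across a memory boundary typically costs far more energy than the multiply itself, cutting the traffic by this much is where the headline power reduction comes from.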

-- submitted from IRC


Original Submission

  • (Score: 2) by FatPhil (863) on Wednesday February 28 2018, @08:52AM (#645044) Homepage
    "reducing the need to shuttle data back and forth between a chip's memory and processors."

    Read any book on the theoretical model of neural nets, going back to Minsky and beyond, and at no point are the nodes imagined as compute units that must fetch and store data in an external memory. They are described as holding their own internal state, which they communicate only, and directly, to other compute nodes. (While running, they also have a constant set of parameters, but passing those in is "programming" the net; that data needs to flow only once, and in one direction.)
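    The model the parent describes can be sketched as follows: each node owns its parameters (set once, at "programming" time) and its state, and the only run-time data flow is activations passed directly between nodes. A minimal illustrative sketch (the class and function names are hypothetical, not from any textbook):

    ```python
    import math

    class Node:
        """A compute node that owns its parameters and state outright;
        nothing is fetched from an external memory at run time."""

        def __init__(self, weights, bias):
            self.weights = list(weights)  # set once: "programming" the net
            self.bias = bias
            self.output = 0.0             # internal state

        def fire(self, inputs):
            # Activations arrive directly from upstream nodes.
            z = sum(w * x for w, x in zip(self.weights, inputs)) + self.bias
            self.output = 1.0 / (1.0 + math.exp(-z))  # sigmoid activation
            return self.output

    # A two-input node configured to behave like a soft AND gate.
    node = Node(weights=[4.0, 4.0], bias=-6.0)
    hi = node.fire([1.0, 1.0])   # both inputs active  -> output near 1
    lo = node.fire([0.0, 0.0])   # both inputs off     -> output near 0
    ```

    Nothing in that model involves a round trip to a shared RAM per operation; that round trip is an artifact of how the model gets mapped onto conventional hardware.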

    The fact that previous neural-net processors, and all common processors, have implemented this badly isn't a good reason to herald this as novel. Those who care about the true cost of computation (taking area into account) have been ranting about the wastefulness of RAM for decades.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves