
posted by martyb on Sunday April 29 2018, @01:35PM
from the robots-processed-this-story dept.

They probably weren't inspired by [Jeff Dunham's] jalapeño on a stick, but Intel has created the Movidius Neural Compute Stick, which is in effect a neural network in a USB-stick form factor. These sticks don't rely on the cloud, they require no fan, and you can get one for well under $100.

SiliconAngle has more:

What distinguishes AI systems on a chip from traditional mobile processors is that they come with specialized neural-network processors, such as graphics processing units or GPUs, tensor processing units or TPUs, and field-programmable gate arrays or FPGAs. These AI-optimized chips offload neural-network processing from the device's central processing unit chip, enabling more local autonomous AI processing.

Are we about to see another computing revolution, and what will the technological and sociopolitical landscape look like afterward?
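For a sense of what cloud-free, fanless inference on the stick looks like from the host side, here is a rough sketch using the Python bindings (mvnc) that shipped with Intel's original NCSDK for the Movidius stick. The graph file name, input shape, and variable names are placeholders, and API details changed between SDK releases, so treat this as an illustrative outline rather than reference code.

    # Sketch: offloading inference to a Movidius Neural Compute Stick via the
    # NCSDK v1 Python bindings (mvnc). File name and input shape are placeholders.
    import numpy as np
    from mvnc import mvncapi as mvnc

    # Find and open the first attached stick.
    devices = mvnc.EnumerateDevices()
    if not devices:
        raise RuntimeError("No Movidius device found")
    device = mvnc.Device(devices[0])
    device.OpenDevice()

    # Load a network pre-compiled into a "graph" file by the SDK's offline compiler.
    with open("network.graph", "rb") as f:
        graph = device.AllocateGraph(f.read())

    # Run one inference entirely on the stick: it expects half-precision input
    # and hands back the network's output tensor -- no cloud round trip.
    dummy_input = np.random.rand(224, 224, 3).astype(np.float16)
    graph.LoadTensor(dummy_input, "user object")
    output, _ = graph.GetResult()
    print("top class:", int(np.argmax(output)))

    # Clean up.
    graph.DeallocateGraph()
    device.CloseDevice()

The host CPU's only job here is shuffling tensors over USB; the convolution and matrix work happens on the stick's vision processing unit, which is the offloading the quoted article describes.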


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Monday April 30 2018, @08:36AM (#673644)

    Neural networks are not AI; they're simply statistics. It's easiest to see in a single-layer neural network, where each weight simply says that, given this input, there's this much probability of this output. Multi-layer neural networks work in exactly the same way; it just becomes "given this input, there's this much probability that the correct answer lies in this general direction".
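    To make the "just statistics" point concrete, here is a minimal NumPy sketch (not part of the original comment) of a single-layer network: learned weights turn an input into per-class scores, and a softmax turns those scores into the probabilities described above.

        # A single-layer "neural network" is a weighted sum plus a softmax:
        # the weights are learned statistics mapping inputs to class probabilities.
        import numpy as np

        rng = np.random.default_rng(0)
        n_inputs, n_classes = 4, 3
        W = rng.normal(size=(n_inputs, n_classes))  # learned weights
        b = np.zeros(n_classes)                     # learned biases

        def predict_proba(x):
            scores = x @ W + b                    # "given this input..."
            exps = np.exp(scores - scores.max())  # numerically stable softmax
            return exps / exps.sum()              # "...this much probability of each output"

        x = rng.normal(size=n_inputs)
        print(predict_proba(x))  # three class probabilities summing to 1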

    Yes, neural networks are modeled on what we know about the brain, but we are not anywhere near creating any form of artificial intelligence. Something is missing. And that's a good thing, because we have no idea how to handle it if we were to succeed in making AI. Should it have human rights? If it's intelligent (that's what the I in AI stands for), why wouldn't it? Doesn't that include the right to life, a.k.a. the right not to be switched off? And what if it decides that we are no more intelligent than dolphins or the several other species that don't have human rights, and thus should have no more rights than they do?