
SoylentNews is people

posted by martyb on Wednesday October 18 2017, @04:39PM   Printer-friendly
from the The-singularity-is-coming!-The-singularity-is-coming! dept.

Intel will release a machine learning chip intended to compete with Nvidia's Tesla chips and Google's TPUs (Tensor Processing Units):

When the AI boom came a-knocking, Intel wasn't around to answer the call. Now, the company is attempting to reassert its authority in the silicon business by unveiling a new family of chips designed especially for artificial intelligence: the Intel Nervana Neural Network Processor family, or NNP for short.

The NNP family is meant as a response to the needs of machine learning, and is destined for the data center, not your PC. Intel's CPUs may still be a stalwart of server stacks (by some estimates, it has a 96 percent market share in data centers), but the workloads of contemporary AI are much better served by the graphical processors or GPUs coming from firms like Nvidia and ARM. Consequently, demand for these companies' chips has skyrocketed. (Nvidia's revenue is up 56 percent year on year.) Google has got in on the action, designing its own silicon named the Tensor Processing Unit to power its cloud computing business, while new firms like the UK-based Graphcore are also rushing to fill the gap.

Also at Engadget, TechCrunch, and CNET.

Previously: Nervana Deep Learning Chip


Original Submission

Related Stories

Nervana Deep Learning Chip 4 comments

http://www.nextplatform.com/2016/08/08/deep-learning-chip-upstart-set-take-gpus-task/

Bringing a new chip to market is no simple or cheap task, but as a new wave of specialized processors for targeted workloads brings fresh startup tales to bear, we are reminded again how risky such a business can be.

Of course, with high risk comes potential for great reward, that is, if a company is producing a chip that far outpaces general purpose processors for workloads that are high enough in number to validate the cost of design and production. The stand-by figure there is usually stated at around $50 million, but that is assuming a chip requires validation, testing, and configuration rounds to prove it's ready to be plugged into a diverse array of systems. Of course, if one chooses to create and manufacture a chip and make it available only via a cloud offering or as an appliance, the economics change—shaving off more than a few million.

These sentiments are echoed by Naveen Rao, CEO of Nervana Systems, a deep learning startup that has put its $28 million in funding to the TSMC 28 nanometer test, with a chip expected in Q1 of 2017. The company runs a cloud-based deep learning business that keeps customers, including Monsanto, on the hook for deep learning workloads crunched on its on-site Titan X GPU cluster, stacked with its own "Neon" software libraries for accelerated deep learning training and inference. Its focus has been the potential for dramatic speedups via the stripped-down tensor-based architecture of the forthcoming Nervana Engine processor.

UPDATE: Intel bought the company for around $350 million to $408 million.


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 1, Interesting) by Anonymous Coward on Wednesday October 18 2017, @05:37PM (#584021)

None of these solutions allow 'bare metal' programming, and unless you can trust the software not to leak your proprietary neural network applications, why would you trust Intel, Nvidia, or anyone else with your code when both of them have infringed on others' work in the past?

This is a dangerous time for tech, even more so than the mainframe days.

  • (Score: 3, Interesting) by frojack (1554) on Wednesday October 18 2017, @06:40PM (#584042) Journal (2 children)

    The NNP family is meant as a response to the needs of machine learning, and is destined for the data center, not your PC.

    Machine learning isn't even well defined yet, but largely means simply pattern recognition trained by feeding it examples.

Make no mistake about it: what it is aimed at is facial recognition. It's being fancied up with generic-sounding names, but it's all about facial recognition, based on all your Facebook pictures: your selfies and others' photos that include you.

Sure, it will have uses in text analysis, DNA, maybe even in DDoS attack analysis. But the push now is for control and tracking of people.
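    To make "pattern recognition trained by feeding it examples" concrete, here is a minimal sketch (all data and names hypothetical, no real library assumed): a nearest-centroid classifier, where "training" just averages labelled example vectors per class and "recognition" picks the closest learned pattern.

    ```python
    # Minimal "trained by examples" pattern recognizer: nearest-centroid.
    # Training = average the feature vectors of each class's examples;
    # prediction = return the class whose centroid is nearest.

    def train(examples):
        """examples: list of (feature_vector, label) pairs -> {label: centroid}"""
        sums, counts = {}, {}
        for vec, label in examples:
            counts[label] = counts.get(label, 0) + 1
            acc = sums.setdefault(label, [0.0] * len(vec))
            for i, x in enumerate(vec):
                acc[i] += x
        return {lab: [x / counts[lab] for x in acc] for lab, acc in sums.items()}

    def predict(centroids, vec):
        """Return the label whose centroid is nearest (squared Euclidean)."""
        def dist2(c):
            return sum((a - b) ** 2 for a, b in zip(c, vec))
        return min(centroids, key=lambda lab: dist2(centroids[lab]))

    # Toy training set: two clusters of 2-D points.
    examples = [([0.0, 0.1], "a"), ([0.2, 0.0], "a"),
                ([1.0, 0.9], "b"), ([0.9, 1.1], "b")]
    model = train(examples)
    print(predict(model, [0.1, 0.0]))   # nearest to cluster "a"
    ```

    Real systems swap the averaging for gradient descent over millions of parameters, but the training loop is the same shape: show examples, adjust the model, repeat.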

    --
    No, you are mistaken. I've always had this sig.
    • (Score: 0) by Anonymous Coward on Wednesday October 18 2017, @07:14PM (#584060)

      It sounds like the man who sold the world facial recognition went ahead and put it in a heart-shaped box.

    • (Score: 0) by Anonymous Coward on Thursday October 19 2017, @12:09AM (#584264)

Phew, just as long as it's not cock pics. I mean, not that I send a lot of cock pics but I mean, privacy right? Am I right?

  • (Score: 2) by Some call me Tim (5819) on Wednesday October 18 2017, @08:15PM (#584111)

    At least they didn't name it Nickelback!

    --
    Questioning science is how you do science!
  • (Score: 2) by crafoo (6639) on Thursday October 19 2017, @12:46AM (#584297) (1 child)

    Google is putting the cloud in a silicon chip?! Well fuck then. I'll take a dozen. Silicon what does the machine learning in the cloud!
