
SoylentNews is people

posted by janrinok on Tuesday June 02 2015, @01:53PM   Printer-friendly
from the big-business dept.

Two members of our community have submitted information on the Intel purchase of Altera:

Intel To Buy Altera for $16.7 Billion

Intel has bought chipmaker Altera for $16.7 billion. This follows another huge purchase in the semiconductor industry last week, when Avago snapped up Broadcom for $37 billion. This has been a record year for consolidation within the industry, as companies struggle to deal with slowing growth and stagnating stock prices. Altera had already rejected an offer from Intel, but shareholders pressured it to reconsider. "Acquiring Altera may help Intel defend and extend its most profitable business: supplying server chips used in data centers. While sales of semiconductors for PCs are declining as more consumers rely on tablets and smartphones to get online, the data centers needed to churn out information and services for those mobile devices are driving orders for higher-end Intel processors and shoring up profitability." Altera has a huge FPGA business.

Perhaps this will impact Altera FPGA Linux support?

Intel Acquires Altera for $16.7 Billion

Intel Corporation has announced that it is buying Altera Corporation for $16.7 billion in cash. The deal will allow Intel to access potentially valuable field-programmable gate array (FPGA) revenue streams and integrate FPGAs into Xeon chips to try to maintain its dominance in datacenters. Altera has already been using Intel's 14nm process to make its Stratix FPGAs.

The Platform has more in-depth analysis of the deal:

The first hedge that Intel is making with the Altera acquisition is that a certain portion of the compute environment that it more or less owns in the datacenter will shift from CPUs to FPGAs.

In the conference call announcing the deal for Altera, Intel CEO Brian Krzanich said that up to a third of cloud service providers – what we here at The Platform call hyperscalers – could be using hybrid CPU-FPGA server nodes for their workloads by 2020. (This is an Intel estimate.) Intel's plan is to get a Xeon processor and an Altera FPGA on the same chip package by the end of 2016 – Data Center Group general manager Diane Bryant showed off a prototype of such a device in early 2014 – and ramp its production through 2017, with a follow-on product that actually puts the CPU and the FPGA circuits on the same die in monolithic fashion "shortly after that."

Intel plans to create a hybrid Atom-FPGA product aimed at the so-called Internet of Things telemetry market, and this will be a monolithic design as well, according to Krzanich; the company is right now examining whether it needs an interim Atom-FPGA product that shares a single package but is not etched on a single die.


Original Submissions

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by forkazoo (2561) on Tuesday June 02 2015, @08:01PM (#191245)

    I've never actually played with one, but FPGAs will never be fast enough to replace a modern CPU. That's not the point. Spending X transistors on building a CPU will get you a CPU. Spending X transistors on an FPGA and configuring it as a CPU will get you a much slower CPU. The value is in flexibility. You can configure it as a MIPS CPU today, then reconfigure it as an experimental video compressor chip tomorrow. And it's entirely possible that the video compressor in an FPGA will be much, much faster than running the same codec in software on a CPU. It just won't be as fast as implementing that custom video compression chip natively in fixed hardware.

    I kind of always wanted to play with one, but I never really needed to. At this point, computers as fast as the original Power Macs cost $6 and GHz computers cost $20, so stuff that used to require custom hardware is increasingly possible in software.

  • (Score: 1) by anubi (2828) on Wednesday June 03 2015, @07:46AM (#191485) Journal

    I will chime in to say that even something as basic as an Arduino, when supervising a Parallax "Propeller" chip, can be quite formidable in an industrial control application.

    Never in my life - in my wildest imaginings as a kid - did I think I would ever see the day such powerful machines would be available for a pittance of currency.

    The price/performance specs of things like the Raspberry Pi blow me away. Priced like a toy. Performs like the most powerful machines I had at a university campus.

    One thing I regret is I never got involved with FPGA programming of things like Altera chips.

    I would have loved to have made an ultra-cheap frame buffer that would let me take legacy signals (old 9-pin monochrome) from the old monochrome graphics displays and re-emit them in the modern 15-pin VGA format. This does not seem as simple as connecting horizontal sync and vertical sync and mixing the color lines down to monochrome; apparently there are completely different horizontal scan rates and vertical interlace issues involved. Anyone seen a CHEAP box that does this? It may even be that this capability is already built into new LCD panels, but I know the old VGA displays had their horizontal circuits pretty tightly tuned to only work in certain ranges.

    The reason I ask is I occasionally still see old monochrome monitors in use, but they are wearing out, and their cathodes are losing emissivity - leading to blurry images on the screen. I would love to simply replace the old display with a modern flat-panel.

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]