SoylentNews is people

Breaking News
posted by janrinok on Tuesday June 02 2015, @01:53PM   Printer-friendly
from the big-business dept.

Two members of our community have submitted information on the Intel purchase of Altera:

Intel To Buy Altera For $16.7 Billion

Intel is buying chipmaker Altera for $16.7 billion. This follows another huge purchase in the semiconductor industry last week, when Avago snapped up Broadcom for $37 billion. It has been a record year for consolidation within the industry, as companies struggle to deal with slowing growth and stagnating stock prices. Altera had already rejected an offer from Intel, but shareholders pressured it to reconsider. "Acquiring Altera may help Intel defend and extend its most profitable business: supplying server chips used in data centers. While sales of semiconductors for PCs are declining as more consumers rely on tablets and smartphones to get online, the data centers needed to churn out information and services for those mobile devices are driving orders for higher-end Intel processors and shoring up profitability." Altera has a huge FPGA business.

Perhaps this will impact Altera FPGA Linux support?

Intel Acquires Altera for $16.7 Billion

Intel Corporation has announced that it is buying Altera Corporation for $16.7 billion in cash. The deal will allow Intel to access potentially valuable field-programmable gate array (FPGA) revenue streams and integrate FPGAs into Xeon chips to try and maintain its dominance in datacenters. Altera has already been using Intel's 14nm process to make its Stratix FPGAs.

The Platform has more in-depth analysis of the deal:

The first hedge that Intel is making with the Altera acquisition is that a certain portion of the compute environment that it more or less owns in the datacenter will shift from CPUs to FPGAs.

In the conference call announcing the deal for Altera, Intel CEO Brian Krzanich said that up to a third of cloud service providers – what we here at The Platform call hyperscalers – could be using hybrid CPU-FPGA server nodes for their workloads by 2020. (This is an Intel estimate.) Intel's plan is to get a Xeon processor and an Altera FPGA on the same chip package by the end of 2016 – Data Center Group general manager Diane Bryant showed off a prototype of such a device in early 2014 – and ramp its production through 2017, with a follow-on product that actually puts the CPU and the FPGA circuits on the same die in monolithic fashion "shortly after that."

Intel plans to create a hybrid Atom-FPGA product aimed at the so-called Internet of Things telemetry market, and this will be a monolithic design as well, according to Krzanich; the company is right now examining whether it needs an interim Atom and FPGA product that shares a single package but are not etched on a single die.

Original Submissions

This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by Anne Nonymous on Tuesday June 02 2015, @01:56PM

    by Anne Nonymous (712) on Tuesday June 02 2015, @01:56PM (#191135)

    Given Intel's acquisition track record, this is probably great news for Xilinx.

    • (Score: 2) by kaszz on Tuesday June 02 2015, @08:38PM

      by kaszz (4211) on Tuesday June 02 2015, @08:38PM (#191261) Journal

      Any specific acquisitions in mind? :P

      • (Score: 2) by Anne Nonymous on Tuesday June 02 2015, @09:04PM

        by Anne Nonymous (712) on Tuesday June 02 2015, @09:04PM (#191271)

        What I mean is that Intel will fuck up Altera like they have everything else they've bought.

        • (Score: 0) by Anonymous Coward on Wednesday June 03 2015, @05:32AM

          by Anonymous Coward on Wednesday June 03 2015, @05:32AM (#191458)

          My coworker said the exact same thing today! Seriously, how can this work? How did McAfee turn out for them?

  • (Score: 0, Offtopic) by Anonymous Coward on Tuesday June 02 2015, @02:23PM

    by Anonymous Coward on Tuesday June 02 2015, @02:23PM (#191144)

    Come on people! Even the dreaded Dice newsvertisement site had this story YESTERDAY.

    • (Score: 0) by Anonymous Coward on Tuesday June 02 2015, @02:26PM

      by Anonymous Coward on Tuesday June 02 2015, @02:26PM (#191145)

      Well, they were probably paid specifically to have it.

    • (Score: 3, Informative) by janrinok on Tuesday June 02 2015, @02:45PM

      by janrinok (52) Subscriber Badge on Tuesday June 02 2015, @02:45PM (#191152) Journal

      Well I edited the story and removed the 'Breaking News' from the heading. But, as a result of the new software update, we now have nexuses - and one nexus is called Breaking News. So the box got ticked so that the story went into that nexus and would have gone out immediately. But none of us knew/realised that ticking that option automatically puts the 'Breaking News' prefix back into the title. So, here we are, a long time after it was edited, and it hits the front page with the prefix replaced. I discovered something about editing and software updates from that.....

      I am not interested in knowing who people are or where they live. My interest starts and stops at our servers.
      • (Score: 4, Touché) by zeigerpuppy on Tuesday June 02 2015, @02:57PM

        by zeigerpuppy (1298) on Tuesday June 02 2015, @02:57PM (#191157)

        Breaking News Breaking News...
        The prefix that was became the break
        Dreaming, flailing, like pepper dusted wings
        The news broke over a bug filled dawn
        Chattering, screaming over vexatious ledges
        And ledgers and sledges,
        There was no end.

  • (Score: 2) by tibman on Tuesday June 02 2015, @07:00PM

    by tibman (134) Subscriber Badge on Tuesday June 02 2015, @07:00PM (#191227)

    They never seemed fast enough to replace a modern cpu. I have seen articles where people emulate old cpus though. Has anyone here played with FPGAs before? How easy are they to get started with?

    SN won't survive on lurkers alone. Write comments.
    • (Score: 2) by forkazoo on Tuesday June 02 2015, @08:01PM

      by forkazoo (2561) on Tuesday June 02 2015, @08:01PM (#191245)

      I've never actually played with one, but they'll never be fast enough to replace a modern CPU. That's not the point. Spending X transistors on building a CPU will get you a CPU. Spending those same X transistors on an FPGA and configuring it as a CPU will always get you less CPU than building one directly. The value is in flexibility. You can configure it as a MIPS CPU today, then reconfigure it as an experimental video compressor chip tomorrow. And it's entirely possible that the video compressor in an FPGA will be much, much faster than running the same codec in software on a CPU. It just won't be as fast as implementing that custom video compression chip natively in fixed hardware.

      I kind of always wanted to play with one, but I never really needed to. At this point, computers as fast as the original Power Macs cost $6 and GHz computers cost $20, so stuff that used to require custom hardware is increasingly possible in software.

      • (Score: 1) by anubi on Wednesday June 03 2015, @07:46AM

        by anubi (2828) on Wednesday June 03 2015, @07:46AM (#191485) Journal

        I will chime in to say that even something as basic as an Arduino, when supervising a Parallax "Propeller" chip can be quite formidable in an industrial control application.

        Never in my life - in my wildest imaginings as a kid - did I think I would ever see the day when such powerful machines would be available for a pittance.

        The price/performance specs of things like the Raspberry Pi blow me away. Priced like a toy. Performs like the most powerful machines I had at a university campus.

        One thing I regret is I never got involved with FPGA programming of things like Altera chips.

        I would have loved to have made an ultracheap frame buffer that would let me look at legacy signals (old 9-pin monochrome) from the old monochrome graphics displays and re-emit them in the modern 15-pin VGA format. This does not seem as simple as connecting horizontal sync, vertical sync, and mixing the color lines to monochrome; apparently there are completely different horizontal scan rates and vertical interlace issues involved. Has anyone seen a CHEAP box that does this? It may even be that the ability to do this is already built into new LCD panels, but I know the old VGA displays had their horizontal circuits pretty tightly tuned to work only in certain ranges.

        The reason I ask is I occasionally still see old monochrome monitors in use, but they are wearing out, and their cathodes are losing emissivity - leading to blurry images on the screen. I would love to simply replace the old display with a modern flat-panel.

        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 3, Informative) by kaszz on Tuesday June 02 2015, @08:47PM

      by kaszz (4211) on Tuesday June 02 2015, @08:47PM (#191264) Journal

      Implementing a CPU in an FPGA requires more gates than said CPU, because the FPGA needs a configuration support structure, and the latency between those gates will be higher than in said CPU. So if one implements a CPU on an FPGA and runs a video compression codec on it, it will be slower.

      The nice thing about an FPGA is that you can implement the codec directly as a gate structure. In that situation you have no CPU, but a codec that runs way faster than most CPUs can even dream of. The same goes if you need to handle I/O at high speed: an FPGA can easily clock bit by bit in 2 ns, something most CPUs just can't do without specific sub-circuitry, which makes them lose flexibility.

      So if you use a wrench to hammer nails, it will work, but it's suboptimal. If you use it for the right task, it's awesome.

      • (Score: 4, Informative) by bob_super on Tuesday June 02 2015, @09:10PM

        by bob_super (1357) on Tuesday June 02 2015, @09:10PM (#191274)

        Pretty much this.

        I deal with FPGAs on a daily basis. You don't use them for sequential CPU operations; you use them for parallel pipelined processing. If the clock seems slow at only 400MHz, it also means you're pipelining a new chunk of data every 2.5ns. Apply this to a 1000-bit-wide bus, and suddenly you're processing a 400Gb/s Ethernet pipe in real time. Good luck doing that with a CPU.
        The same consideration applies to running multiple filters in real-time on HD/4k video and compressing it. CPUs can do it, but you won't like the electric bill. FPGAs love that stuff.
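        The arithmetic behind that claim is easy to verify (my own sketch, using only the numbers in the comment):

```python
# A 400 MHz clock feeding a 1000-bit-wide datapath moves 400 Gb/s.
clock_hz = 400e6    # one new data word per cycle
bus_bits = 1000
throughput_gbps = clock_hz * bus_bits / 1e9
print(throughput_gbps)  # 400.0
print(1 / clock_hz)     # 2.5e-09 -> one cycle every 2.5 ns
```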

        Of course, an ASIC gives you higher performance and less power than the same logic in FPGA gates. But the ginormous upfront cost and time-to-market restrict ASICs to cost-sensitive high-volume applications.

    • (Score: 3, Informative) by Katastic on Wednesday June 03 2015, @02:27AM

      by Katastic (3340) on Wednesday June 03 2015, @02:27AM (#191402)

      I've consistently tried to tell people about FPGAs but they're all so far up their butts they refuse to even read about them.

      FPGAs aren't CPUs. "Soft cores" (CPUs implemented on an FPGA) are not faster than regular CPUs, and are not the purpose of FPGAs. Soft cores are ONLY for experimenting with designs--nobody uses them in production except as a low-end auxiliary processor to talk to the rest of the design and deal with higher-order problems (e.g. your bitstream coupled to a TCP/IP stack).

      CPUs do one complex thing, very well. GPUs do many things, less complex. FPGAs are further down the line: they do thousands if not millions of things at the same time. They are programmable logic gates that are all always running. Unlike 99% of code, where only one piece executes at a time, every piece of "code" in an FPGA is always running. (We're talking in general, layman's terms here.) A CPU uses memory reads, registers, and branches all in order, one at a time. An FPGA is always reading, always processing, always outputting, all at the same time. FPGAs running @ 100 MHz can have encryption throughputs rivaling clusters of Xeons. (See below.) But their designs have to be much, much simpler. They tend to focus on low-level problems that allow everything to run in parallel. They also consume TINY amounts of power and produce little heat.

      So why put an FPGA inside a CPU? I'll tell you why! So a CPU can have access to custom HARDWARE at a moment's notice. Running a server? The FPGA will reflash itself to have more encryption units. Start Doom 5? It'll reflash itself to have hardware texturing units. Want to do super fast, super low latency sound, software radio, or video processing? BAM.

      Altera specializes in FPGAs that allow "partial reconfiguration." That means they can reflash only REGIONS of the FPGA while the rest keeps running. So whereas older FPGAs had to be shut off, completely reflashed, and restarted, these new ones can keep running live at all times.

      It's a dream world of new possibilities, and all we needed was one company to commit the big bucks to design a new tool chain.

      Check out these jaw-dropping stats of an FPGA cluster vs a Quad-Core 2.5 GHz Xeon in 2009: []

      A 64-FPGA module required 104 watts, and the equivalent Xeon "Cluster" would require 72 KILOWATTS of energy consumption. Yet these guys then made a 512-FPGA cluster which ran 753,000,000 RC4 keys a second. They ran the entire key space in 3 minutes. All with common hardware, some professors and grad students on a grant. Imagine what real businesses can and DO do every day with these things. (Not to mention what the NSA already DOES do unless they're complete idiots.) Then imagine what they could do inside your computer with a super easy GCC-like toolchain and no need to play around with wiring schematics and protocols to get data to and from your FPGA. FPGAs and ASICs dominate the bitcoin world because they are pound-for-pound the highest performance in terms of hardware cost and power cost.
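      Those figures check out on paper (a sketch from the numbers in the comment; the power-efficiency ratio and total-keys count are my own derived values, not stated in the paper):

```python
# Power comparison from the cited study.
fpga_watts = 104.0      # 64-FPGA module
xeon_watts = 72_000.0   # equivalent Xeon cluster
print(round(xeon_watts / fpga_watts))  # 692 -> roughly 692x less power

# Throughput of the 512-FPGA cluster over the stated 3-minute run.
keys_per_sec = 753_000_000
seconds = 3 * 60
print(keys_per_sec * seconds)  # 135540000000 keys searched
```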

      It's gonna be a fun next few years!

      Lastly, here's the paper I cited: []