Intel's Skylake-X line-up has been finalized, ranging from the i7-7800X for $389 to the i9-7980XE for $1,999. 18 cores for over $2,000 (after tax)? Someone will buy it:
Intel's high brass made a decidedly un-Intel move last August. During a routine business meeting at the company's Santa Clara headquarters, they decided to upend their desktop CPU roadmap for 2017 to prepare something new: the beastly 18-core i9-7980XE X-series. It's the company's most powerful consumer processor ever, and it marks the first time Intel had been able to cram that many cores into a desktop CPU. At $2,000, it's the sort of thing hardware fanatics will salivate over, and regular consumers can only dream about.
The chip's very existence came down to a surprising revelation at that meeting last year: Intel's 10-core Broadwell-E CPU, which was only on the market for a few months and cost a hefty $1,723, was selling incredibly well. And for Intel, that was a sign that there was even more opportunity in the high-end computing world.
"The 10-core part was absolutely breaking all of our sales expectations," Intel's Anand Srivatsa, general manager of its Desktop Platform Group, told Engadget in an interview. "We thought we'd wait six months or so to figure out whether this was actually going to be successful. But within the first couple months, it was absolutely clear that our community wanted as much technology as we could deliver to them."
[...] If you've been feeling nostalgic for an old-school computing hardware war, we're about to get one. AMD also announced its Threadripper CPUs for high-end desktops a few months ago, and, as usual, they're significantly cheaper than Intel's offerings. The 16-core AMD 1950X will cost $999, with speeds between 3.4GHz and 4GHz. That's the same price as Intel's 10-core i9 X-series processor, while Intel's 16-core model will run you $1,699.
Obligatory Intel Management Engine / AMD Secure Processor comment.
Also at Intel Newsroom.
(Score: 0) by Anonymous Coward on Tuesday August 08 2017, @05:31PM (8 children)
why is there so little open source hardware? there are thousands of FOSS devs releasing their work. Why are there so few hardware engineers releasing their work under Free licenses? Are hardware engineers just a bunch of whores (no offense to actual prostitutes)?
(Score: 2, Informative) by Anonymous Coward on Tuesday August 08 2017, @05:36PM (2 children)
Because if you don't own a fab, there isn't much you can do with open source hardware. How much open source software would there be if a compiler cost $100 million and filled a large warehouse?
(Score: 2, Informative) by Anonymous Coward on Tuesday August 08 2017, @06:22PM (1 child)
Because manufacturing hardware is so expensive, firms have done everything that they can to protect their margins; the natural consequence is that hardware (even at the most fundamental levels) is trapped in a giant web of legal obligations (patents, NDAs, etc.). In such a litigious environment, it's not possible to cultivate a culture that respects the idea of free and open source hardware.
(Score: 2) by lentilla on Wednesday August 09 2017, @04:39AM
This might be a good time to reflect on the timely success of the free software movement. By the time average businesses started to think about monetising software (and by monetising, I mean locking it up so others can't have any), the free software movement was already well under way.
Serious business investment in hardware started at the end of the second world war, but it wasn't until the 1970s that software became something that was considered in its own right. So perhaps that explains the timing.
So I, for one, am certainly grateful that a number of prescient individuals took up the fight to allow the sharing of software. Today's landscape might have been considerably more hostile without their efforts. I don't like litigious environments.
(Score: 2) by LoRdTAW on Tuesday August 08 2017, @07:17PM
As in silicon? Obviously, you can't build and test hardware like you can software. Software needs only a compiler/interpreter, an editor and maybe some libraries: build/test/fix bugs/repeat until it works. Pretty simple.
Hardware isn't as easy: there are real-world physical design issues beyond the HDL and logic. A design can work in simulation, but a real piece of silicon could flat out fail or exhibit strange behavior due to impedance or other physical effects. Lots of testing and careful design is needed, and those skills are not as common. Then there is the whole fabrication problem.
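To make the contrast concrete, here's a minimal sketch (not anyone's real project; the module and test names are made up) of that software-style build/test/fix loop applied to an HDL design in simulation, using Chisel, a Scala-embedded HDL, plus its chiseltest library. A passing simulation like this still tells you nothing about timing, impedance or any of the other physical problems above.

  import chisel3._
  import chiseltest._
  import org.scalatest.flatspec.AnyFlatSpec

  // A deliberately trivial design: an 8-bit accumulator.
  class TinyAccumulator extends Module {
    val io = IO(new Bundle {
      val in    = Input(UInt(8.W))
      val total = Output(UInt(8.W))
    })
    val acc = RegInit(0.U(8.W))
    acc := acc + io.in
    io.total := acc
  }

  // Simulation-only unit test: poke inputs, step the clock, check the output.
  class TinyAccumulatorSpec extends AnyFlatSpec with ChiselScalatestTester {
    "TinyAccumulator" should "sum its inputs over time" in {
      test(new TinyAccumulator) { dut =>
        dut.io.in.poke(3.U)
        dut.clock.step()          // acc becomes 3
        dut.io.in.poke(4.U)
        dut.clock.step()          // acc becomes 7
        dut.io.total.expect(7.U)
      }
    }
  }

Run it, fix whatever breaks, repeat; that part really is as cheap as software development. Everything after that (timing closure, layout, fabrication) is where the cost lives.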
(Score: 1) by crafoo on Tuesday August 08 2017, @08:45PM (3 children)
Hardware and digital circuit design actually requires engineering skills, unlike software development.
(Score: 0) by Anonymous Coward on Tuesday August 08 2017, @08:51PM (2 children)
Even if that were true, it's utterly irrelevant to the question.
(Score: 2) by arcz on Wednesday August 09 2017, @03:34AM (1 child)
Hardware development is something that takes enough effort that you aren't going to be able to do it in your spare time with a small uncoordinated team. It's also not something that can be done for free.
Prototyping hardware designs requires money.
Prototyping software designs only requires time.
People with spare time are often willing to give some for free software, but given the extreme expenses of hardware design and manufacturing, it's unlikely altruism will be a sufficient source of money.
(Score: 3, Informative) by TheRaven on Thursday August 10 2017, @08:23AM
Our research processor began as an R4K equivalent that was developed by a single PhD student. Not exactly a competitor for a modern system, but fast enough to boot FreeBSD and be used for experimentation. We're using some quite expensive FPGAs (though they're now 6-7 years old, so are probably much less expensive), but you can simulate in software. The simulation isn't quite usable interactively, but is fast enough for running our test suite (it's also useful in the test suite to be able to dump the register contents with a single magic nop at certain points).
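The magic-nop trick looks roughly like this (illustrative only: the encoding, names and register-file shape here are invented, and the R4K-style processor above is MIPS rather than anything this value implies). The idea is just to reserve one otherwise-unused NOP-like encoding and have the simulated pipeline print the architectural registers when it retires.

  import chisel3._

  // Illustrative only: an arbitrary 32-bit value standing in for the
  // reserved "magic" debug encoding.
  object DebugHooks {
    val MagicNop = "h00000013".U(32.W)
  }

  // Something a simulated pipeline could instantiate at writeback: when the
  // magic NOP retires, print the register file. chisel3.printf only fires
  // during simulation, so synthesised hardware is unaffected.
  class RegDumpHook(nregs: Int = 32, xlen: Int = 64) extends Module {
    val io = IO(new Bundle {
      val retired = Input(UInt(32.W))            // instruction leaving the pipeline
      val regs    = Input(Vec(nregs, UInt(xlen.W)))
    })
    when(io.retired === DebugHooks.MagicNop) {
      for (i <- 0 until nregs) {
        printf(s"x$i = 0x%x\n", io.regs(i))
      }
    }
  }

The test suite can then drop that one instruction wherever it wants a register snapshot.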
We use a high-level HDL called BlueSpec. It's very powerful, but unfortunately it's insanely expensive for non-academic users. It's sufficiently simple to learn that I was able to add an improved branch predictor to our processor with no prior hardware experience after attending a couple of hours of lectures and have subsequently made some other quite significant changes (added a bunch of instructions, changed a data representation and so on).
In contrast, the most popular RISC-V implementations use a high-level HDL developed at Berkeley called CHISEL. CHISEL is a Scala DSL and is pretty easy to pick up. The most complex RISC-V implementation that I'm aware of is the Berkeley Out of Order Machine (BOOM), which is an out-of-order superscalar design that has most of the features of a modern chip. Again, most of the development is done using the simulator. This is true even for a lot of commercial microprocessors: you're 99% done by the time you fab the first prototype.
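As a taste of why a Scala-embedded HDL is easy to pick up, here's a hypothetical width- and depth-parameterised delay line (nothing taken from BOOM itself): ordinary Scala runs at elaboration time to spell the structure out, and only the register operations become hardware.

  import chisel3._

  // Hypothetical example: a parameterised delay line. The Scala foldLeft runs
  // when the circuit is elaborated and strings together `depth` registers;
  // only RegNext describes actual hardware.
  class DelayLine(width: Int, depth: Int) extends Module {
    val io = IO(new Bundle {
      val in  = Input(UInt(width.W))
      val out = Output(UInt(width.W))
    })
    io.out := (0 until depth).foldLeft(io.in)((sig, _) => RegNext(sig))
  }

Asking for a different variant is just new DelayLine(64, 3); there's no copy-and-pasting of near-identical Verilog, which is a big part of the appeal of generator-style designs like BOOM.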
The expensive part comes when you want to either use an FPGA (comparatively cheap - a few hundred to a few thousand dollars) or fab a chip. The economics there are quite odd. It's actually pretty cheap to get a little bit of space on the edge of someone else's wafer for very low volume runs, as long as you don't have any time constraints. If you're a new company or project then you can get some very good deals, because the fabs want to tie you into using their cell libraries so that when you ramp up production they can charge you more. That said, these costs are easily amortised if you have enough people that want the chip. This is what a few of my colleagues are hoping for with the lowRISC effort: they're producing an entirely open source SoC and will aim to ship a few million of them on an RPi-like board (most of them were also involved in the RPi effort).
sudo mod me up