https://www.hpcwire.com/2017/05/18/ibm-d-wave-report-quantum-computing-advances/
IBM said this week it has built and tested a pair of quantum computing processors, including a prototype of a commercial version. That progress follows an announcement earlier this week that commercial quantum computer developer D-Wave Systems has garnered venture funding that could total up to $50 million to build its next-generation machine with up to 2,000 qubits.
[...] Meanwhile, IBM researchers continue to push the boundaries of quantum computing as part of the company's IBM Q initiative, launched in March to promote development of a "universal" quantum computer. Access to a 16-qubit processor via the IBM cloud will allow developers and researchers to run quantum algorithms. The new version replaces an earlier 5-qubit processor.
The company also rolled out on Wednesday (May 17) the first prototype of a 17-qubit commercial processor, making it IBM's most powerful quantum device to date. The prototype will serve as the foundation of IBM Q's commercial access program. The goal is to eventually scale future prototypes to 50 or more qubits.
The article also notes Hewlett Packard Enterprise's prototype of "The Machine", with 160 terabytes of RAM.
(Score: 2, Informative) by khallow on Sunday May 21 2017, @01:03PM (18 children)
(Score: 0) by Anonymous Coward on Sunday May 21 2017, @04:40PM (10 children)
Is there a proof that parsing, rendering, and dependency calculation cannot benefit from quantum computing, or are you assuming that currently known algorithms are representative of the usefulness of QC?
(Score: 1) by khallow on Sunday May 21 2017, @11:38PM (9 children)
(Score: 0) by Anonymous Coward on Monday May 22 2017, @12:24AM (8 children)
Power efficiency matters, especially given the increasing amount of tech relying on batteries.
(Score: 1) by khallow on Monday May 22 2017, @01:58AM (7 children)
There's huge room for improvement in power efficiency for classical computation, as in many orders of magnitude.
(Score: 0) by Anonymous Coward on Monday May 22 2017, @02:18AM (6 children)
Any efficiency difference captured by a constant multiplicative factor will only hold for problems below a certain size; this is about how the amount of resources required grows with the size of the problem. At some problem size QC will pull ahead forevermore, regardless of how many times faster classical computers are.
Possibly that problem size will be large enough that QC isn't worth it, but there's absolutely no way to know at present if that will be the case.
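The crossover the parent describes can be sketched with toy numbers (all hypothetical): give the classical machine a million-fold per-step speed advantage and linear O(N) scaling, versus a Grover-style O(√N) quantum search. The constant factor only moves the crossover point; it never removes it.

```python
import math

# Hypothetical cost models (illustrative only):
# classical machine: 1e-9 s per step, needs N steps
# quantum machine:   1e-3 s per step (a million times slower), needs sqrt(N) steps
def classical_time(n):
    return 1e-9 * n

def quantum_time(n):
    return 1e-3 * math.sqrt(n)

# Find roughly where the quantum machine pulls ahead.
n = 1
while classical_time(n) <= quantum_time(n):
    n *= 2
print(f"quantum wins for N around {n:.0e}")  # crossover near N = 1e12
```

With these made-up constants the break-even point is around a trillion items; a bigger constant-factor advantage just pushes that number further out.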
(Score: 1) by khallow on Monday May 22 2017, @02:30AM (5 children)
And almost all real world problems, particularly for power limited applications, are far below a "certain size". Further, we're nowhere near a level of technology development where this is relevant. IBM has to get there from here. They can't be doing that by eliminating their current business.
(Score: 0) by Anonymous Coward on Monday May 22 2017, @02:57AM (4 children)
We have no idea what "real world problems" will be in a decade or two. People today parse kilobytes of data literally hundreds of times a day. How much parsing is involved in a normal person's daily web browsing? Hundreds of parses, each of several kilobytes, is probably a significant underestimate, especially considering that many people have a few shittily designed apps that poll frequently, and how much JS gets pulled into a mundane webpage. This state of affairs would have seemed ridiculous a few decades back.
Kilobytes may seem tiny, but that's also the point: megabytes are starting to look tiny now, and with HD HDR 60fps video on the horizon, gigabytes will start to look small pretty fast too. Literally billions of bytes will look small.
Furthermore, this all assumes that classical computers will maintain their large efficiency advantage over quantum computers. If QC turns out to be significantly useful (and who knows what mundane stuff it's useful for yet; the research probably doesn't focus on using QC to make video decoding faster, since that won't matter until QC is cheap and widely available), then unless there's a fundamental reason classical computers will keep their advantages, I just don't see them withstanding the potential power efficiency of an equally developed QC.
Sure, QC memory at the moment is sub-tubes-of-mercury in terms of manufacturability, but modern machines don't use tubes of mercury as memory, so limits to its efficient manufacture aren't relevant.
Unless some property of the physics being exploited requires hard to manufacture things, I just don't see QC perpetually lagging behind CC.
A decent QC implementation for scheduling or TSP has the potential to save airlines and shipping companies enough money that their industries could support IBM offering QC-as-a-service. Granted, development is currently slow, but given the rate of tech development I'm not willing to accept that IBM-funded research couldn't make progress fast enough to be commercially viable.
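For reference, the classical baseline any such QC service would have to beat can be as simple as a greedy heuristic. A minimal nearest-neighbor TSP sketch (random cities, purely illustrative; real logistics solvers are far more sophisticated):

```python
import math
import random

# Generate a small random TSP instance in the unit square.
random.seed(0)
cities = [(random.random(), random.random()) for _ in range(50)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def nearest_neighbor_tour(cities):
    """Greedy heuristic: always hop to the closest unvisited city."""
    unvisited = set(range(1, len(cities)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(cities[last], cities[j]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

tour = nearest_neighbor_tour(cities)
length = sum(dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
             for i in range(len(tour)))
print(f"nearest-neighbor tour length: {length:.2f}")
```

Heuristics like this are cheap but typically a fair margin above optimal, which is exactly the gap where a fraction-of-a-percent improvement translates into real money at airline scale.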
(Score: 1) by khallow on Monday May 22 2017, @03:55AM (3 children)
QC won't help at all with the need for larger data pipes. And it's worth noting that classical computation is way ahead of bandwidth. Existing computers have no trouble keeping up with data decoding and decompression, which, let us note, is not that hard a problem for classical computers either.
The thing is, current computers are way more than fast enough to do that.
Such as the need for sufficient isolation from the rest of the universe?
(Score: 0) by Anonymous Coward on Monday May 22 2017, @04:57AM (2 children)
I meant to imply that problems are quickly becoming large enough to benefit from QC despite its efficiency issues while it catches up with CC.
They're certainly good enough, but even fraction-of-a-percentage-point improvements in these approximations can mean millions saved. I doubt current implementations are so close to optimal that there wouldn't be a market, but that is an ill-informed guess.
More importantly, knocking the average-case running time's exponent down a bit would enable optimizations which are currently infeasible for larger problems. I can't think of an example off the top of my head, but I'm sure there are a fair few algorithms people would prefer had their exponents slashed.
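One textbook case of an exponent being slashed: Grover's algorithm searches an unstructured space of 2^n items in roughly (π/4)·√(2^n) oracle queries instead of ~2^n, i.e. the exponent is halved. A quick illustration (the query counts below are the standard asymptotics, not a simulation):

```python
import math

def classical_queries(n_bits):
    """Exhaustive search over an n-bit space: up to 2**n checks."""
    return 2 ** n_bits

def grover_queries(n_bits):
    """Grover-style search: about (pi/4) * sqrt(2**n) oracle calls."""
    return math.ceil((math.pi / 4) * math.sqrt(2 ** n_bits))

for n in (20, 40, 60):
    print(f"{n}-bit space: classical ~{classical_queries(n):.1e}, "
          f"Grover ~{grover_queries(n):.1e}")
```

For a 40-bit space that's roughly a trillion checks classically versus under a million oracle calls, which is the kind of gap that makes previously infeasible brute-force approaches plausible.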
I'm ignorant of the requirements, but since the definition of what is "hard to manufacture" is entirely dependent on current tech levels, and robotic assembly is growing more prevalent, I wouldn't be at all surprised to see assembly automated. The materials and space required seem the limiting factors. Materials will probably become slowly cheaper as we gain a better understanding of engineering these systems, but maybe not. As for the space required, that may be a limiting factor, but particles are pretty small compared to humans, and I'd be somewhat surprised if qubits required roughly human-scale space to remain isolated.
(Score: 1) by khallow on Monday May 22 2017, @05:33AM (1 child)
(Score: 0) by Anonymous Coward on Monday May 22 2017, @05:53AM
You're right, it isn't worth the risk; if they do get in early though it may be very valuable in the future.
(Score: 2) by takyon on Sunday May 21 2017, @07:26PM (1 child)
One day, quantum computing could come to consumer devices and even SoCs. Of course, it would have to work at room temperature.
Google's TPUs will be coming to cell phones, providing a fairly narrow neuromorphic kind of function. I don't think it is impossible for consumer SoCs to ship with quantum computing components, even if they were as limited as D-Wave's "quantum annealer".
(Score: 1) by khallow on Sunday May 21 2017, @11:54PM
Second, IBM would be throwing away some big businesses already. While everything can be sold with more advanced security algorithms (which is one of the few niches where QC might work), there isn't that compelling a need to make things more secure. I'm just not seeing how QC will help that much with IBM's typical service-oriented businesses.
(Score: 2) by JoeMerchant on Monday May 22 2017, @01:09AM (4 children)
But, is there any profit growth left in "big iron" pushing HTML or spreadsheets?
QC is new territory, with mostly undefined applications, a chance for growth. HTML and spreadsheets are a race to the bottom with every other vendor on the planet.
(Score: 1) by khallow on Monday May 22 2017, @02:05AM (3 children)
I don't see the point of abandoning a mature industry with a high profit margin for an unproven one with extremely little development at present and little relevance to IBM's current business model. More likely is that IBM has grown so inefficient that it competes poorly in considerable portions of its old markets.
(Score: 2) by JoeMerchant on Monday May 22 2017, @03:45AM (2 children)
Laying off your high priced older employees isn't abandoning an industry, it's cost control. They can still service the old accounts and continue to milk them.
(Score: 1) by khallow on Monday May 22 2017, @03:59AM (1 child)
(Score: 2) by JoeMerchant on Monday May 22 2017, @01:21PM
It's a really shitty thing to do as a "corporate citizen", and it is often not as effective as the spreadsheets say it will be, since one older employee earning $125K per year can often bring more income to a company than three younger employees earning $50K per year each (and the cost difference to the company is more like $225K for the old one vs. $375K for the three younger ones).