
posted by janrinok on Tuesday September 24 2019, @09:35PM   Printer-friendly
from the did-they,-didn't-they dept.

Submitted via IRC for SoyCow2718

IBM and Google's race for quantum computing takes a mysterious turn

The battle for top-dog status in the emerging field of quantum computing took a strange turn last week when rivals IBM and Google both made important and—in Google's case—mysterious claims about where they are in a quest that most experts believe is still at least a decade away from the finish line.

IBM announced that it will add its 14th quantum computer to its fleet in October, a new 53-qubit model that it says is the largest universal quantum system made available for external access in the industry to date. IBM also announced the opening of the first IBM Quantum Computation Center in Poughkeepsie, NY, bringing the number of quantum computing systems available online via its IBM Q Experience platform to 10, with an additional four systems scheduled to come online in the next month.

Meanwhile, Google scientists posted, and then quickly took down, a research paper on a NASA website claiming that Google had achieved a major milestone called "quantum supremacy," meaning its machine can solve problems that even the most powerful conventional supercomputers cannot.

According to a report in the FT, the paper claimed that Google's 72-qubit quantum computing chip Bristlecone, introduced in March 2018, performed a calculation in just over 3 minutes that would take 10,000 years on IBM's Summit, the world's most powerful supercomputer. The paper reportedly said:

To our knowledge, this experiment marks the first computation that can only be performed on a quantum processor.
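
For a sense of scale, the reported gap works out to roughly a billion-fold speedup. A quick back-of-the-envelope check, using only the "just over 3 minutes" and 10,000-year figures reported above (the exact runtime in the withdrawn paper is not public):

    # Rough scale of the reported gap between the quantum chip and Summit.
    # The ~200-second quantum runtime and the 10,000-year classical estimate
    # come from the article; everything else is plain unit conversion.
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    quantum_runtime_s = 200
    classical_runtime_s = 10_000 * SECONDS_PER_YEAR

    print(f"claimed speedup: ~{classical_runtime_s / quantum_runtime_s:.1e}x")
    # -> claimed speedup: ~1.6e+09x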

If true, this would be a very big step in the advance toward quantum computing, but it appears the researchers may have gotten a little too far out over their skis, and the post was quickly taken down. Since then, Google PR and marketing have refused to discuss the topic, and the paper has gone the way of the whistleblower's account of President Trump's phone call with Ukraine's president, vanishing in a puff of digital smoke.

[...] For all the kerfuffle and analyst excitement, we are still some distance away from a quantum advantage. Most experts believe the first quantum computer that can do the miraculous things its advocates promise is still a decade off, but that hasn't stopped IBM, Microsoft, Google, AT&T, and other heavyweights from pressing ahead in a race that represents the next Mt. Everest of computing challenges. As with the sudden disappearance of the 'supremacy' claims, keep on the lookout for more strange and mysterious turns before we get there.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 3, Interesting) by Rupert Pupnick on Wednesday September 25 2019, @12:58PM (2 children)

    I frequently hear this presumption from a lot of tech enthusiasts that once quantum computing hits some magical threshold, there will be this rapid increase in capabilities you make reference to. I think this expectation is based on the experience of the gigantic advances in CMOS technology of the past several decades, but I think drawing this kind of parallel-- which many enthusiasts are making implicitly-- is completely wrong.

    In CMOS, it's the ability to shrink feature size that gives you all of these benefits that improve performance together: faster switching times, lower power dissipation, and higher density. There are NO such scaling advantages with qubits. Even if you figure out a way to mass-produce them in ever larger quantities, the key problem of entangling them all remains, and it only gets worse as the number of qubits increases.

    I'm not an expert by any means, but I do try to follow along. And I believe that the theoretical foundation for quantum computing is sound. It's the engineering that's the huge problem: maintaining these tiny but elaborate probabilistic energy distributions of systems on an atomic scale in the presence of noise, and then somehow interfacing this to a machine based on classical principles so that people can make use of it.
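
    To put a toy number on why that engineering problem is so hard: if every gate fails independently with some small probability, the chance that a whole circuit runs cleanly collapses exponentially with the gate count. The 0.5% error rate and the circuit sizes below are illustrative assumptions, not measured figures:

        # Toy noise model: probability that no gate in the circuit errs,
        # assuming independent gate errors and roughly n_qubits gates per layer.
        def circuit_fidelity(n_qubits: int, depth: int, gate_error: float = 0.005) -> float:
            return (1.0 - gate_error) ** (n_qubits * depth)

        for n in (5, 53, 500):
            print(n, f"{circuit_fidelity(n, depth=20):.2e}")
        # -> roughly 0.61 for 5 qubits, ~0.005 for 53, ~2e-22 for 500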

    I see a really slow, painful, and expensive slog to get enough capability to be commercially viable (which, incidentally, is the real motivation here when you have players like IBM and Google involved).

  • (Score: 0) by Anonymous Coward on Wednesday September 25 2019, @01:49PM (1 child)

    We don't want a room-sized quantum computer, and we have already spent decades getting transistors to the nanoscale. It's a no-brainer to scale qubits down using existing silicon manufacturing processes. And it's already happening:

    https://www.quantumsilicon-grenoble.eu/cmos-based-silicon-qubits/

    https://www.sciencedaily.com/releases/2019/05/190513112223.htm

    The researchers say the study is further proof that silicon as a technology platform is ideal for scaling up to the large numbers of qubits needed for universal quantum computing. Given that silicon has been at the heart of the global computer industry for almost 60 years, its properties are already well understood and existing silicon chip production facilities can readily adapt to the technology.

    If you can't entangle every qubit, just make the biggest group of entangled qubits that you can, and make that into a tile/core. Then make a grid of those. You could run the same problem on all of them simultaneously and compare the results for error correction. Or use them in some other way.
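
    A crude sketch of that tile-and-vote idea (the tile count, answer length, and 10% flip probability are all made-up numbers, just to show the mechanism):

        import random

        def noisy_tile(answer: str, flip_prob: float = 0.1) -> str:
            # Each tile solves the same problem, but noise flips some output bits.
            return "".join(b if random.random() > flip_prob else str(1 - int(b)) for b in answer)

        def majority_vote(results: list[str]) -> str:
            # Per-bit majority vote across all tiles.
            return "".join("1" if sum(r[i] == "1" for r in results) * 2 > len(results) else "0"
                           for i in range(len(results[0])))

        tiles = [noisy_tile("101101") for _ in range(9)]
        print(majority_vote(tiles))  # usually recovers "101101"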

    They need thousands or millions of physical qubits for problems like Shor's, so there will be a relentless pursuit of more entangled qubits.
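
    For a rough sense of those numbers, using commonly cited rules of thumb (roughly 2n + 3 logical qubits to run Shor's algorithm on an n-bit modulus, and an assumed ~1,000 physical qubits per logical qubit for error correction; both figures are ballpark assumptions, not values from the links above):

        def shor_physical_qubits(modulus_bits: int, physical_per_logical: int = 1_000) -> int:
            logical = 2 * modulus_bits + 3          # rule-of-thumb logical qubit count
            return logical * physical_per_logical   # assumed error-correction overhead

        for bits in (1024, 2048, 4096):
            print(f"RSA-{bits}: ~{shor_physical_qubits(bits):,} physical qubits")
        # RSA-2048 lands around 4 million physical qubits under these assumptions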

    • (Score: 3, Interesting) by Rupert Pupnick on Wednesday September 25 2019, @05:09PM

      But tiling of qubit cores doesn't solve the fundamental problem of entangling a large enough number of qubits to do useful problem solving-- like the kind you mention in your last sentence above, presumably citing Aaronson (whom I respect and admire, but whose work I confess to not fully understanding).

      Even Aaronson says it's a terrifying engineering problem, and I don't believe he has full appreciation for all the issues involved. Things don't just scale because you build lots of them and you want them to scale. Noise just becomes a bigger and bigger problem as you open up the entanglement "aperture", for lack of a better word. If these QC guys had something like a credible plan that addressed the issue of large entanglements, I'd be less of a doubter, but when I read the tech press, I just hear claim after claim after claim with approximately zero quantifiable data to back it up. And in an age of rampant tech overvaluations, that's really scary.
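
      One way to quantify that "aperture" worry: for a fixed target probability of an error-free run, the allowed per-gate error shrinks roughly in proportion to one over the total gate count. Purely illustrative numbers again, nothing from any real roadmap:

          def max_gate_error(n_qubits: int, depth: int, target_success: float = 0.5) -> float:
              # Largest per-gate error rate that still leaves a 50% chance
              # the whole circuit runs without a single gate error.
              return 1.0 - target_success ** (1.0 / (n_qubits * depth))

          for n in (53, 500, 5_000):
              print(n, f"{max_gate_error(n, depth=20):.1e}")
          # 53 qubits tolerate ~7e-4 per gate; 5,000 qubits need ~7e-6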