
posted by chromas on Saturday June 22 2019, @02:34AM
from the 575-zettaflops-by-2371-sounds-reasonable dept.

Neven's Law is an observation about the growth of quantum computing, somewhat akin to Moore's famous law, and describes how quickly quantum computers are gaining on classical ones. It is happening faster than you might think.

In December 2018, scientists at Google AI ran a calculation on Google's best quantum processor. They were able to reproduce the computation using a regular laptop. Then in January, they ran the same test on an improved version of the quantum chip. This time they had to use a powerful desktop computer to simulate the result. By February, there were no longer any classical computers in the building that could simulate their quantum counterparts. The researchers had to request time on Google's enormous server network to do that.
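Why the simulators kept falling behind is plain arithmetic: simulating n qubits classically by the straightforward state-vector method means storing 2^n complex amplitudes, so every added qubit doubles the memory bill. A minimal sketch of that scaling in Python (illustrative qubit counts, not the actual sizes of Google's chips):

# Memory needed to hold the full state vector of an n-qubit system,
# assuming one complex128 amplitude (16 bytes) per basis state.
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 40, 50):
    print(f"{n} qubits: {statevector_bytes(n) / 2**30:,.0f} GiB")

# 30 qubits:         16 GiB  (a beefy laptop)
# 40 qubits:     16,384 GiB  (a large server)
# 50 qubits: 16,777,216 GiB  (no single machine; hence the server network)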

Neven's law suggests that, following current trends, quantum supremacy—that point where an efficient quantum calculation cannot be simulated in any reasonable time frame on the most powerful classical computer—could happen within one year.

The rule began as an in-house observation before [Hartmut Neven, director of Google's Quantum Artificial Intelligence lab] mentioned it in May at the Google Quantum Spring Symposium. There, he said that quantum computers are gaining computational power relative to classical ones at a "doubly exponential" rate—a staggeringly fast clip.

With double exponential growth, "it looks like nothing is happening, nothing is happening, and then whoops, suddenly you're in a different world," Neven said. "That's what we're experiencing here."

Even exponential growth is pretty fast. It means that some quantity grows by powers of 2[.]

The first few increases might not be that noticeable, but subsequent jumps are massive. Moore's law, the famous guideline stating (roughly) that computing power doubles every two years, is exponential.

Doubly exponential growth is far more dramatic. Instead of increasing by powers of 2, quantities grow by powers of powers of 2[.]
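To feel the difference between the two regimes, it helps to tabulate them side by side. A toy comparison in Python (the step numbers are purely illustrative, not real hardware data):

# Exponential growth: powers of 2. Doubly exponential: powers of powers of 2.
for k in range(1, 7):
    exp = 2 ** k             # Moore's-law-style doubling
    dexp = 2 ** (2 ** k)     # Neven's-law-style growth
    print(f"step {k}: 2^{k} = {exp}   2^(2^{k}) = {dexp:,}")

# By step 6 the exponential column has reached only 64, while the doubly
# exponential column is 2^64, roughly 1.8 * 10^19. "Nothing is happening,
# nothing is happening, and then whoops."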

Not all are convinced; classical computers are still improving in line with Moore's law (more or less), and quasi-quantum algorithms on classical computers continue to improve, pushing the goalposts further out as well.

Still, even though the rate at which quantum computers are gaining on classical ones is debatable, there's no doubt quantum technology is racing towards an inflection point and the writing is, or is not, on the wall.


Original Submission

  • (Score: 0) by Anonymous Coward on Saturday June 22 2019, @02:52AM (2 children)

    by Anonymous Coward on Saturday June 22 2019, @02:52AM (#858747)

I remember reading about flying cars in Popular Mechanics as a kid, back when Scientific American was a serious magazine. Now Scientific American is just a tech cheerleader. What's holding up that Bristlecone chip that was "previewed" over a year ago? Are there any scientific papers about it?

    • (Score: 2) by takyon on Saturday June 22 2019, @03:40AM (1 child)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday June 22 2019, @03:40AM (#858759) Journal

      Ask Google. Maybe they want a competitive advantage. Maybe the NSA has all the working quantum chips.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by krishnoid on Saturday June 22 2019, @05:03AM

        by krishnoid (1156) on Saturday June 22 2019, @05:03AM (#858779)

        So *that's* why they're cool with possibly getting defunded -- they can now run all their datamining right on their phones, and use the remaining computing power to mine the cryptocurrency of the week and arbitrage it. So obvious now that I think about it.

  • (Score: 1, Interesting) by Anonymous Coward on Saturday June 22 2019, @04:00AM (4 children)

    by Anonymous Coward on Saturday June 22 2019, @04:00AM (#858762)
    • (Score: 2) by takyon on Saturday June 22 2019, @02:08PM (3 children)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday June 22 2019, @02:08PM (#858844) Journal

      The guy's case is a bit certain, had to be amended (see editor's note), and doesn't consider quantum machine learning.

      Repeating "10300 continuous parameters" seems to be for shock value. It doesn't mean you need that many variables, bytes of data storage, or whatever.

      I think we'll know that quantum computing is dead if Google, IBM, Intel, Microsoft, and others [wikipedia.org] lose interest. Which they might if it becomes clear that 100+ qubit chips are unusable or not better than classical computers, even with error rates and other problems addressed.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 2) by Rupert Pupnick on Sunday June 23 2019, @01:59PM (2 children)

        by Rupert Pupnick (7277) on Sunday June 23 2019, @01:59PM (#859069) Journal

I’m a Quantum Skeptic, but I agree with you. Talking about big numbers isn’t necessarily scary; on the contrary, it can appear to reinforce the argument that Quantum Advocates are making (“look at all those possible states”).

        As far as Neven’s Law goes, it depends on where you are along the curve (way left!) before things start bending upward, and then you have to not hit any natural physical limits first (see Moore’s Law).

        Surprised to see this in a SA article.

        • (Score: 2) by takyon on Sunday June 23 2019, @02:35PM (1 child)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday June 23 2019, @02:35PM (#859072) Journal

          Surprised about the SA article (actually a reprint from Quanta magazine, which has skeptical [quantamagazine.org] articles [quantamagazine.org]) or IEEE?

          Meant to say "uncertain" above but "certain" works too (insert quantum joke).

I only just read the Neven's law TFA this morning. If Big Google man says they are seeing those kinds of improvements, then it could well be real. And we may only have to wait months before they announce concrete details. It's not clear what they would actually be using it for, but machine learning is a strong bet. Real quantum computers (not D-Wave's annealers) are supposed to be able to do everything a classical computer can, so maybe they can use it in place of many classical computers too.

          Article also quotes Scott Aaronson:

          “I think the undeniable reality of this progress puts the ball firmly in the court of those who believe scalable quantum computing can’t work,” wrote Scott Aaronson, a computer scientist at the University of Texas, Austin, in an email. “They’re the ones who need to articulate where and why the progress will stop.”

          AFAIK he has been the biggest quantum skeptic for years. Pack it up, folks. It's time for the Quantum Age.

          One thing that could be annoying: as long as quantum computers require stuff like liquid helium cooling, big companies like Google will control the powerful quantum sauce and only give you a taste through da cloud. We need room temperature quantum processors.

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by Rupert Pupnick on Sunday June 23 2019, @02:54PM

            by Rupert Pupnick (7277) on Sunday June 23 2019, @02:54PM (#859076) Journal

            I always considered Aaronson to be a Quantum Advocate, but only from the mathematical/theoretical side. I have his book, but only made it through three or four chapters before getting lost and picking up something else. I should take another crack at it. Check out his blog if you haven’t. I have a lot of respect for his work, but I don’t agree with what he says in that quote. The burden of proof is on those who say they can make it work.

It’s when engineers start talking about QC implementations that things get even fuzzier, and my gut tells me that shouldn’t happen. I’d love to hear an explanation of how the molecular-level I/O works. Guess it’s proprietary and I’d have to sign an NDA...

  • (Score: 5, Interesting) by HiThere on Saturday June 22 2019, @04:11AM (5 children)

    by HiThere (866) Subscriber Badge on Saturday June 22 2019, @04:11AM (#858767) Journal

    The thing is, "Nevin's Law" is a bit less soundly based than "Moore's Law" was, and that wasn't soundly based at all.

    Additionally, for many kinds of problem, quantum computers have no advantage, and lots of drawbacks...and for this step I'm not even counting the fragile engineering and the need for cryogenic computing.

    Additionally, the engineering is fragile and cryogenics don't scale well into a small form factor. So Quantum computers look to be "mainframe only" until some rather different approach is tried.

    But it's still past time to stop depending on prime factorization for encryption.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
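The prime-factorization point above is the textbook motivation for post-quantum cryptography: RSA-style public keys stay safe only while factoring n = p*q stays expensive, and Shor's algorithm on a large, error-corrected quantum computer would factor in polynomial time. A toy sketch of the classical side (illustrative numbers only; real moduli are ~2048 bits and far beyond trial division):

# Smallest prime factor by trial division; fine for toy numbers,
# hopeless for cryptographic ones.
def trial_factor(n: int) -> int:
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n itself is prime

n = 104723 * 104729    # product of two (small) primes
p = trial_factor(n)
print(p, n // p)       # 104723 104729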
    • (Score: 2) by JoeMerchant on Saturday June 22 2019, @11:37AM (2 children)

      by JoeMerchant (3937) on Saturday June 22 2019, @11:37AM (#858812)

      I'm not current in cryptocurrency (meaning, I haven't looked deep into it for about 90 days), but so much of the crypto market's tech is based on signatures that are relatively easily broken by a powerful quantum computer. Of course there are "quantum safe" crypto signatures that will, in theory, just drop right in - but they significantly expand the data size required to record a signature, not to mention the conventional computing effort required to check it.

      Your point about quantum computers only being good for quantum problems is an excellent one - they're not going to be found in smartwatches anytime soon, they're not going to do much video transcoding, database processing, or other conventional computer work. Calling it "quantum supremacy" is kind of like calling the pine bark beetle "king of the world" - they will make dramatic and important changes felt around the world, but they still have very limited actual reach.

      --
      🌻🌻 [google.com]
      • (Score: 1, Funny) by Anonymous Coward on Saturday June 22 2019, @12:37PM (1 child)

        by Anonymous Coward on Saturday June 22 2019, @12:37PM (#858826)

        You lost trees too, I can feel it.

        I lost many. That damned beetle TOOK them. Dozens, just on my 1 acre lot. They're GONE, and almost as bad was the sudden emergence of woodpeckers that came with them.

        I used to think the laugh in Woody Woodpecker was made up, but no.. they sorta sound like that. And the sizes? There are these woodpeckers as large as ravens, others as small as robins, and they're EVERYWHERE LAUGHING!

LAUGHING AND LAUGHING at me, and my poor choice of shirt. ME. The man once synonymous with 'the well dressed'! Well, Myrtle, this shirt was fine in 1976, and it's FINE NOW thank you very much, and STOP LAUGHING AT ME!

    • (Score: 2) by takyon on Saturday June 22 2019, @01:59PM (1 child)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday June 22 2019, @01:59PM (#858839) Journal

      Additionally, for many kinds of problem, quantum computers have no advantage, and lots of drawbacks...and for this step I'm not even counting the fragile engineering and the need for cryogenic computing.

      Additionally, the engineering is fragile and cryogenics don't scale well into a small form factor. So Quantum computers look to be "mainframe only" until some rather different approach is tried.

A quantum coprocessor that could be useful for something, anything, could see use in supercomputing scenarios or even consumer devices, just as we don't try to do everything on the CPU, GPU, or TPU/NPU alone (some smartphone SoCs supposedly have a "machine learning chip"). Quantum machine learning [technologyreview.com] is also a possibility.

      Work on room temperature quantum chips [medium.com] is also well underway.

      While quantum computing is useless for everyone right now, once the challenges are worked out we could just apply our nanolithography technologies to easily scale up to having millions or billions of qubits.

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 3, Interesting) by HiThere on Saturday June 22 2019, @04:28PM

        by HiThere (866) Subscriber Badge on Saturday June 22 2019, @04:28PM (#858874) Journal

        Room temperature quantum chips, if they come, which is, indeed, likely, would certainly change my argument. Then I'd be arguing that they'd be co-processors rather like GPUs, highly useful for certain jobs.

Actually, a good enough room temperature quantum chip might replace current CPUs entirely, as there's no inherent reason they couldn't do classical operations too. But that "good enough" seems a long way off, and may not be possible. Still, plants use quantum effects to make photosynthesis more efficient (shortest-path "calculations", etc.), so it's likely there's no theoretical bar. But I think the engineering approach would need to change radically.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 1, Touché) by Anonymous Coward on Saturday June 22 2019, @07:45AM

    by Anonymous Coward on Saturday June 22 2019, @07:45AM (#858793)

I do not care what they SAY they need to simulate some unnamed "test" of some undescribed "quantum chip". What I would care about is an example - ANY example - of a practical application. THEN we could compare what it costs in HW, time, and money.
