posted by chromas on Saturday June 22 2019, @02:34AM
from the 575-zettaflops-by-2371-sounds-reasonable dept.

Neven's Law is an observation of the growth of quantum computing, somewhat akin to Moore's famous law, and describes how quickly quantum computers are gaining on classical ones. It is faster than you might think.

In December 2018, scientists at Google AI ran a calculation on Google's best quantum processor. They were able to reproduce the computation using a regular laptop. Then in January, they ran the same test on an improved version of the quantum chip. This time they had to use a powerful desktop computer to simulate the result. By February, there were no longer any classical computers in the building that could simulate their quantum counterparts. The researchers had to request time on Google's enormous server network to do that.

Neven's law suggests that, following current trends, quantum supremacy—that point where an efficient quantum calculation cannot be simulated in any reasonable time frame on the most powerful classical computer—could happen within one year.

The rule began as an in-house observation before [Hartmut Neven, director of Google's Quantum Artificial Intelligence lab] mentioned it in May at the Google Quantum Spring Symposium. There, he said that quantum computers are gaining computational power relative to classical ones at a "doubly exponential" rate—a staggeringly fast clip.

With double exponential growth, "it looks like nothing is happening, nothing is happening, and then whoops, suddenly you're in a different world," Neven said. "That's what we're experiencing here."

Even exponential growth is pretty fast. It means that some quantity grows by powers of 2[.]

The first few increases might not be that noticeable, but subsequent jumps are massive. Moore's law, the famous guideline stating (roughly) that computing power doubles every two years, is exponential.

Doubly exponential growth is far more dramatic. Instead of increasing by powers of 2, quantities grow by powers of powers of 2[.]
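
To make the difference concrete, here is a minimal Python sketch (ours, not from the article) that prints the two sequences side by side: an exponentially growing quantity (2^n, roughly the shape of Moore's law) and a doubly exponentially growing one (2^(2^n)).

    # Compare exponential growth (2**n) with doubly exponential growth (2**(2**n)).
    for n in range(1, 7):
        print(f"step {n}: 2**n = {2 ** n:>3}, 2**(2**n) = {2 ** (2 ** n)}")

By step 6 the exponential sequence has only reached 64, while the doubly exponential one is already past 10^19, which is the "nothing is happening, and then whoops" behaviour Neven describes.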

Not all are convinced; classical computers are still improving more or less in line with Moore's law, and quasi-quantum algorithms on classical computers continue to improve as well, pushing the goalposts out further.

Still, even though the rate at which quantum computers are gaining on classical ones is debatable, there's no doubt quantum technology is racing towards an inflection point and the writing is, or is not, on the wall.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by takyon on Saturday June 22 2019, @01:59PM (1 child)

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Saturday June 22 2019, @01:59PM (#858839) Journal

    Additionally, for many kinds of problems, quantum computers have no advantage, and lots of drawbacks... and for this step I'm not even counting the fragile engineering and the need for cryogenic computing.

    Additionally, the engineering is fragile and cryogenics don't scale well into a small form factor. So quantum computers look to be "mainframe only" until some rather different approach is tried.

    A quantum coprocessor that could be useful for something, anything, could see use in supercomputing scenarios or even consumer devices, just as we don't try to do everything on the CPU, GPU, or TPU/NPU alone (some smartphone SoCs supposedly have a "machine learning chip"). Quantum machine learning [technologyreview.com] is also a possibility. (A rough sketch of that coprocessor idea is at the end of this comment.)

    Work on room temperature quantum chips [medium.com] is also well underway.

    While quantum computing is useless for everyone right now, once the challenges are worked out we could just apply our nanolithography technologies to easily scale up to millions or billions of qubits.
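
    As a purely hypothetical sketch of that coprocessor idea (the QuantumCoprocessor class below is an invented stand-in, not any real vendor API), the host program would keep doing ordinary work on the CPU and offload only the pieces a quantum device might help with:

        # Hypothetical only: a quantum device treated as a coprocessor, the way
        # work is already split today between CPU and GPU/TPU.
        class QuantumCoprocessor:
            def sample(self, circuit, shots=1000):
                # A real device would execute the circuit; this stub just
                # returns made-up measurement counts.
                return {"00": shots // 2, "11": shots - shots // 2}

        def solve(problem, qpu=None):
            classical_part = sorted(problem)        # ordinary CPU work
            if qpu is not None:                     # offload only what might benefit
                quantum_part = qpu.sample(circuit="bell_pair")
            else:
                quantum_part = None                 # CPU-only fallback
            return classical_part, quantum_part

        print(solve([3, 1, 2], qpu=QuantumCoprocessor()))

    The fallback path is the point: everything still runs without the quantum part, which only handles the piece it might accelerate.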

    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
  • (Score: 3, Interesting) by HiThere on Saturday June 22 2019, @04:28PM

    by HiThere (866) Subscriber Badge on Saturday June 22 2019, @04:28PM (#858874) Journal

    Room temperature quantum chips, if they come (which is, indeed, likely), would certainly change my argument. Then I'd be arguing that they'd be co-processors, rather like GPUs: highly useful for certain jobs.

    Actually, a good enough room temperature quantum chip might totally replace current CPUs, as there's no inherent reason they couldn't do classical operations also. But that "good enough" seems a long way off, and may not be possible. Still, plants use quantum effects to make photosynthesis more efficient (shortest-path "calculations", etc.), so it's likely there's no theoretical bar. But I think the engineering approach would need to change radically.

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.