

posted by Fnord666 on Monday October 23 2017, @06:39PM
from the simulated-threat dept.

Arthur T Knackerbracket has found the following story:

Just when it was looking like the underdog, classical computing is striking back. IBM has come up with a way to simulate quantum computers that have 56 quantum bits, or qubits, on a non-quantum supercomputer – a task previously thought to be impossible. The feat moves the goalposts in the fight for quantum supremacy, the effort to outstrip classical computers using quantum ones.

It used to be widely accepted that a classical computer cannot simulate more than 49 qubits because of memory limitations. The memory required for simulations increases exponentially with each additional qubit.

The closest anyone had come to putting the 49-qubit limit to a test was a 45-qubit simulation at the Swiss Federal Institute of Technology in Zurich, which needed 500 terabytes of memory. IBM's new simulation upends the assumption by simulating 56 qubits with only 4.5 terabytes.
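For a sense of scale, here is a rough back-of-the-envelope estimate (not from the article) of the memory a naive state-vector simulation would need, assuming each of the 2^n amplitudes is stored as a 16-byte double-precision complex number:

```python
# Rough estimate only: an n-qubit state vector has 2**n complex amplitudes,
# assumed here to be stored as 16-byte double-precision complex numbers.
def statevector_bytes(n_qubits: int, bytes_per_amplitude: int = 16) -> int:
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (45, 49, 56):
    print(f"{n} qubits: ~{statevector_bytes(n) / 1e12:,.0f} TB")

# 45 qubits: ~563 TB        (roughly the 500 TB of the Zurich run)
# 49 qubits: ~9,007 TB
# 56 qubits: ~1,152,922 TB  (over an exabyte, versus IBM's 4.5 TB)
```

Under that assumption, IBM's 4.5 terabytes for 56 qubits sits many orders of magnitude below what a brute-force state vector would require.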

The simulation is based on a mathematical trick that allows a more compact numerical representation of different arrangements of qubits, known as quantum states.

A quantum computing operation is typically represented by a table of numbers indicating what should be done to each qubit to produce a new quantum state. Instead, researchers at IBM's T. J. Watson Research Center in Yorktown Heights, New York, used tensors – effectively multidimensional tables augmented with axes beyond rows and columns.
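As a loose illustration of the difference between "a table of numbers" and "a tensor with extra axes" (a standard textbook construction, not IBM's actual code), a two-qubit state can be kept as a 2×2 array with one axis per qubit, and a single-qubit gate applied by contracting over that qubit's axis:

```python
import numpy as np

# Generic illustration, not IBM's method: store a 2-qubit state as a
# 2x2 tensor with one axis per qubit rather than a flat length-4 vector.
state = np.zeros((2, 2), dtype=complex)
state[0, 0] = 1.0  # the |00> state

# A single-qubit gate is a 2x2 table of numbers; here, the Hadamard gate.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Applying the gate to the first qubit is a contraction over its axis.
state = np.einsum('ab,bc->ac', H, state)

print(state)  # |00> and |10> each now have amplitude 1/sqrt(2)
```

The appeal of the tensor view is that such contractions can be grouped, ordered, and split across machines flexibly, which is presumably where the reported memory savings and parallelism come from.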

[...] they've upped the ante in the race to outperform classical computers with quantum systems. Google previously said they were on track to build a working 49-qubit processor by the end of 2017, but that will no longer win them the achievement of quantum supremacy.

[...] IBM's goal is to build a quantum computer that can "explore practical problems" such as quantum chemistry, says Wisnieff. He hopes to check the accuracy of quantum computers against his simulations before putting real quantum computers to the test.

"I want to be able to write algorithms that I know the answers for before I run them on a real quantum computer," he says.

-- submitted from IRC


Original Submission

Related Stories

IBM Announces Working Prototype of a 50-Qubit Quantum Computer

IBM Raises the Bar with a 50-Qubit Quantum Computer

IBM established a landmark in computing Friday, announcing a quantum computer that handles 50 quantum bits, or qubits. The company is also making a 20-qubit system available through its cloud computing platform.

IBM, Google, Intel, and a San Francisco startup called Rigetti are all currently racing to build useful quantum systems. These machines process information in a different way from traditional computers, using the counterintuitive nature of quantum physics.

The announcement does not mean quantum computing is ready for common use. The system IBM has developed is still extremely finicky and challenging to use, as are those being built by others. In both the 50- and the 20-qubit systems, the quantum state is preserved for 90 microseconds—a record for the industry, but still an extremely short period of time.

[...] IBM is also announcing an upgrade to its quantum cloud software system today. "We're at world record pace. But we've got to make sure non-physicists can use this," Gil says.

The announcement should perhaps be treated cautiously, though. Andrew Childs, a professor at the University of Maryland, points out that IBM has not published details of its system in a peer-reviewed journal. "IBM's team is fantastic and it's clear they're serious about this, but without looking at the details it's hard to comment," he says. Childs says the larger number of qubits does not necessarily translate to a leap in computational capability. "Those qubits might be noisy, and there could be issues with how well connected they are," he says.

Also at The Mercury News and SiliconANGLE.

Previously: IBM Promises Commercialization of 50 Qubit Quantum Computers
IBM and D-Wave Quantum Computing Announcements
Intel Ships 17-Qubit Quantum Chip to Researchers
Google's Quantum Computing Plans Threatened by IBM Curveball (doesn't this undermine IBM's quantum system as well?)

Related: Microsoft is Developing a Quantum Computing Programming Language


Original Submission

  • (Score: 4, Interesting) by RamiK on Monday October 23 2017, @07:01PM (11 children)

    by RamiK (1813) on Monday October 23 2017, @07:01PM (#586490)

    Yeah, we all know rank 2 is a matrix and rank >= 3 is a tensor, and a few of us even know quantum mechanics is basically just EE's complex math plus some stats and diffs, so it stands to reason you're going to store results in tensors... But telling us this is their "magic sauce" is about as useful as telling us the principle of running a logic engine using mechanical parts or silicon is using arrays.

    --
    compiling...
    • (Score: 1, Interesting) by Anonymous Coward on Monday October 23 2017, @07:40PM (1 child)

      by Anonymous Coward on Monday October 23 2017, @07:40PM (#586510)

      As briefly noted in the actual article, encoding quantum operations in the form of tensors yields a much more compact data structure. On top of this compact form of information, they discovered that the simulation could be implemented with tensors in an "embarrassingly parallel" fashion (that's the actual terminology for a computation that can be performed in a highly parallel manner).

      So, their new tensor data structure and corresponding computational algorithm have turned out to be the bee's knees.

      Now, FUCK OFF!

      • (Score: 0) by Anonymous Coward on Tuesday October 24 2017, @09:24AM

        by Anonymous Coward on Tuesday October 24 2017, @09:24AM (#586789)

        Now, FUCK OFF!

        It is basic etiquette to refrain from hurling insults and harsh language at logged-in users while posting under the provisional mask of anonymity.
        This post, though, is fair game, so go ahead if you will.

    • (Score: 1) by khallow on Monday October 23 2017, @07:50PM (7 children)

      by khallow (3766) Subscriber Badge on Monday October 23 2017, @07:50PM (#586525) Journal
      It's probably the structure of these tensors which they are exploiting. Maybe the tensors are sparse (mostly zero), or maybe they have an algebraic structure which, as with the fast Fourier transform (FFT), can be exploited numerically for a significant reduction in computation.
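Purely to illustrate the "sparse (mostly zero)" idea above (nothing from IBM's paper), storing only the nonzero entries can shrink memory from the size of the full table down to the number of nonzeros:

```python
import numpy as np

# Illustration only: a mostly-zero 1000x1000 matrix stored densely
# versus as a dict keyed by the positions of its nonzero entries.
n = 1000
dense = np.zeros((n, n))
dense[::100, ::100] = 1.0  # only 100 nonzero entries out of a million

sparse = {(i, j): v for (i, j), v in np.ndenumerate(dense) if v != 0.0}

print(dense.nbytes)  # 8,000,000 bytes for the dense table
print(len(sparse))   # 100 stored entries, i.e. kilobytes rather than megabytes
```

Whether IBM's tensors actually have that kind of exploitable structure is exactly the detail the article doesn't spell out.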
      • (Score: 0) by Anonymous Coward on Monday October 23 2017, @08:05PM (3 children)

        by Anonymous Coward on Monday October 23 2017, @08:05PM (#586536)

        The AC already said as much.

        • (Score: 0) by Anonymous Coward on Monday October 23 2017, @08:53PM (1 child)

          by Anonymous Coward on Monday October 23 2017, @08:53PM (#586567)

          Yeah but then he did that mic drop thing.

          • (Score: 0) by Anonymous Coward on Monday October 23 2017, @09:36PM

            by Anonymous Coward on Monday October 23 2017, @09:36PM (#586596)

            Did not!

        • (Score: 1) by khallow on Tuesday October 24 2017, @12:40AM

          by khallow (3766) Subscriber Badge on Tuesday October 24 2017, @12:40AM (#586666) Journal
          Actually, the AC did not. This would be additional structure on top of the tensor setup.
      • (Score: 2) by frojack on Monday October 23 2017, @08:28PM (2 children)

        by frojack (1554) on Monday October 23 2017, @08:28PM (#586555) Journal

        And maybe we are lucky they found this before everyone and their dog started depending on quantum computing methods for encryption, only to find them cracked in the next cycle of discoveries. Equally lucky the government didn't move in and declare it a munition and keep it for themselves.

        It's been less than three years since so-called quantum computers became a thing, although they've been speculated about since the '80s. Already we see Moore out in his garage revving its engines.

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 0) by Anonymous Coward on Monday October 23 2017, @09:13PM (1 child)

          by Anonymous Coward on Monday October 23 2017, @09:13PM (#586575)

          Moore died. No one told you?

          • (Score: 1, Informative) by Anonymous Coward on Monday October 23 2017, @09:34PM

            by Anonymous Coward on Monday October 23 2017, @09:34PM (#586594)

            he still kickin it [wikipedia.org]

    • (Score: 0) by Anonymous Coward on Monday October 23 2017, @10:28PM

      by Anonymous Coward on Monday October 23 2017, @10:28PM (#586615)

      about as useful as telling us the principle of running a logic engine using mechanical parts

      Have you seen some of those logic circuits that have been built in Minecraft?

  • (Score: 5, Informative) by stormwyrm on Tuesday October 24 2017, @12:19AM (4 children)

    by stormwyrm (717) on Tuesday October 24 2017, @12:19AM (#586661) Journal

    All known methods for simulating a quantum system are exponential in the number of particles. The naïve way to simulate a quantum system with n particles (whose states may be entangled with each other), each of which can be in t quantum states, is by using a Hilbert space with dimension t^n, i.e. exponential in the number of particles. I suppose they managed to find a clever way to make the representation of the Hilbert space much more compact, which reduces the amount of space required by some constant factor or reduces the base of the exponent, but it still grows exponentially.

    The odd thing though is that no one has yet been able to conclusively prove that P ≠ BQP, meaning that there might possibly exist some algorithm to do quantum simulation on a classical computer in polynomial time and using at most polynomial space.

    --
    Numquam ponenda est pluralitas sine necessitate.
    • (Score: 3, Interesting) by anubi on Tuesday October 24 2017, @05:43AM

      by anubi (2828) on Tuesday October 24 2017, @05:43AM (#586733) Journal

      This topic reminds me of a little thing in our history where Ptolemy tried to describe the Universe. Earth centered. Boy, did he have a complex model.

      Then Copernicus came around and saw things from another angle. Maybe we aren't at the center of the Universe. Things got much much simpler.

      It won't be the first time that things that appear to be impossibly complex turn out to be quite elegant when "seen at the right angle".

      It would be quite foolhardy of us to think we have seen it all by now.

      We are just getting started.

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 0) by Anonymous Coward on Tuesday October 24 2017, @09:33AM (1 child)

      by Anonymous Coward on Tuesday October 24 2017, @09:33AM (#586791)

      The naïve way to simulate a quantum system with n particles (whose states may be entangled with each other), each of which can be in t quantum states, is by using a Hilbert space with dimension t^n, i.e. exponential in the number of particles.

      When the particles are entangled, the system can no longer have as many (t^n) degrees of freedom. Entangled sets of particles become sort of single entities, thus reducing the overall complexity. In a way, that complexity reduction is what quantum computing is.

      • (Score: 1, Informative) by Anonymous Coward on Tuesday October 24 2017, @09:49AM

        by Anonymous Coward on Tuesday October 24 2017, @09:49AM (#586793)

        When the particles are entangled, the system can no longer have as many (t^n) degrees of freedom.

        Wrong. Quite the opposite: Entangled states need many more parameters to describe than separable states. Which is exactly why classical simulation of quantum systems is so expensive.

        Entangled sets of particles become sort of single entities, thus reducing the overall complexity.

        Not reducing, increasing. The most general description of ten qubits in a completely separable state requires 20 real numbers (two per qubit). The most general description of ten qubits in an arbitrarily entangled state requires 2046 real numbers (2 × 2^10 − 2).
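For what it's worth, a quick check of those counts (standard pure-state parameter counting, nothing beyond the arithmetic in the parent comment):

```python
# Pure states of n qubits:
# - fully separable: 2 real parameters per qubit (its Bloch-sphere angles)
# - arbitrarily entangled: 2**n complex amplitudes = 2 * 2**n real numbers,
#   minus 2 for normalisation and the irrelevant global phase
n = 10
separable = 2 * n
entangled = 2 * 2**n - 2
print(separable, entangled)  # prints: 20 2046
```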

    • (Score: 1, Interesting) by Anonymous Coward on Tuesday October 24 2017, @09:37AM

      by Anonymous Coward on Tuesday October 24 2017, @09:37AM (#586792)

      All known methods for simulating a quantum system are exponential in the number of particles.

      Exponential in time, definitely. Exponential in space, not.

      Quite some time ago I stumbled upon an article on arXiv [arxiv.org] (which is also published in a refereed journal) that, IIUC, only needs memory linear in the number of qubits. Note that this article is already from 2008. And it is not the only one; here's another one from 2009 [arxiv.org] (also published), which I found while searching for the first one. And I'm sure there are more proposals for it.

      In particular, in the scheme of the first article, each qubit is represented independently, so in terms of memory requirements it should be no problem to simulate even a million qubits. It's just that you couldn't wait for the result (as you have to run 3^N evaluations of the circuit); even the 56 qubits IBM simulates would probably be out of reach in a reasonable time.

      I didn't look very closely at the second article mentioned above, but from a short glance its memory requirement also seems to scale linearly with the number of gates.

      Both articles thus disprove that you need exponential memory to simulate a quantum circuit.
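Just to put that 3^N figure into perspective (taking it at face value):

```python
# Taking the 3**N circuit-evaluation count at face value: even at a
# generous billion evaluations per second, 56 qubits is out of reach.
evals = 3 ** 56
years = evals / 1e9 / 3.15e7  # evaluations -> seconds -> years
print(f"{evals:.2e} evaluations -> ~{years:.1e} years")
# 5.24e+26 evaluations -> ~1.7e+10 years
```

So linear memory doesn't contradict the summary's framing; the exponential cost just moves from space to time.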

      The odd thing though is that no one has yet been able to conclusively prove that P ≠ BQP

      Given that nobody has been able to conclusively prove that P ≠ NP either, despite people trying to do so for longer than quantum complexity classes have even been around, I wouldn't call this odd.

  • (Score: 0) by Anonymous Coward on Tuesday October 24 2017, @04:46PM

    by Anonymous Coward on Tuesday October 24 2017, @04:46PM (#586926)

    Why does this announcement matter, or put anything into question? The hard thing about mathematically exponential problems is that they get exponentially more difficult as the number increases.

    So let's say that, due to clever tricks, IBM figures out how to make the problem of simulating quantum mechanics 4x easier. That only makes things linearly faster, and they bump into the upper limit a bit later.

    So unless there is a particular reason to say "we'll never need more than 56 qubits" (or 60, or 80, or whatever), does their announcement really affect anything long-term?
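To put a number on that intuition (simple arithmetic, not from the comment or the article): since a brute-force state vector grows as 2^n, a constant-factor saving of k only buys about log2(k) extra qubits.

```python
import math

# A constant-factor memory saving of k extends a 2**n-scaling simulation
# by only about log2(k) additional qubits.
for k in (4, 100, 1000):
    print(f"{k}x cheaper -> ~{math.log2(k):.1f} extra qubits")

# 4x cheaper -> ~2.0 extra qubits
# 100x cheaper -> ~6.6 extra qubits
# 1000x cheaper -> ~10.0 extra qubits
```

Whether IBM's trick is a constant factor or a change in how the cost scales is, as the earlier comments note, the open question.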
