posted by janrinok on Thursday February 29 2024, @12:49AM

Researchers Adopt Innovative Method to Boost Speed and Accuracy of Traditional Computing:

Quantum computing has been hailed as a technology that can outperform classical computing in both speed and memory usage, potentially opening the way to making predictions of physical phenomena not previously possible.

Many see quantum computing's advent as marking a paradigm shift from classical, or conventional, computing. Conventional computers process information in the form of digital bits (0s and 1s), while quantum computers deploy quantum bits (qubits), which store quantum information in superpositions of 0 and 1. Under certain conditions, this ability to process and store information in qubits can be used to design quantum algorithms that drastically outperform their classical counterparts. Notably, this ability to hold superpositions is what makes it difficult for classical computers to perfectly emulate quantum ones.
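In standard textbook notation (an aside for clarity; this notation is not from the article itself), a single qubit's state is a superposition of the two basis states:

    |\psi\rangle = \alpha |0\rangle + \beta |1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1

Measurement yields 0 with probability |\alpha|^2 and 1 with probability |\beta|^2, and describing n qubits classically requires 2^n complex amplitudes, which is why exact emulation becomes so costly.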

However, quantum computers are finicky and have a tendency to lose information. Moreover, even if information loss can be avoided, it is difficult to translate quantum information into classical information, which is necessary to yield a useful computation.

[...] The scientists' results show that classical computing can be reconfigured to perform faster and more accurate calculations than state-of-the-art quantum computers.

This breakthrough was achieved with an algorithm that keeps only part of the information stored in the quantum state—and just enough to be able to accurately compute the final outcome.

"This work shows that there are many potential routes to improving computations, encompassing both classical and quantum approaches," explains Dries Sels, an assistant professor in New York University's Department of Physics and one of the paper's authors. "Moreover, our work highlights how difficult it is to achieve quantum advantage with an error-prone quantum computer."

[...] In seeking ways to optimize classical computing, Sels and his colleagues at the Simons Foundation focused on a type of tensor network that faithfully represents the interactions between the qubits. Those types of networks have been notoriously hard to deal with, but recent advances in the field now allow these networks to be optimized with tools borrowed from statistical inference.

The authors compare the work of the algorithm to the compression of an image into a JPEG file, which stores large images in less space by discarding information, with barely perceptible loss in image quality.
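As a toy illustration of that compression idea, the sketch below (hypothetical Python/NumPy, not the algorithm from the paper, which optimizes a tensor network matched to the qubit interactions) reshapes a state vector into a matrix and keeps only its largest singular values:

    import numpy as np

    # Toy sketch: compress a 10-qubit state vector by truncated SVD,
    # keeping "just enough" information, in the spirit of the JPEG analogy.
    n = 10
    psi = np.random.randn(2**n) + 1j * np.random.randn(2**n)
    psi /= np.linalg.norm(psi)

    # Split the qubits into two halves and reshape into a matrix.
    M = psi.reshape(2**(n // 2), 2**(n - n // 2))
    U, s, Vh = np.linalg.svd(M, full_matrices=False)

    k = 4  # keep only the 4 largest singular values
    psi_c = (U[:, :k] @ np.diag(s[:k]) @ Vh[:k, :]).reshape(-1)
    psi_c /= np.linalg.norm(psi_c)

    # How much of the original state survived the compression?
    print("overlap with original:", abs(np.vdot(psi, psi_c)))

For a random state the overlap after such heavy truncation is poor; what tensor-network methods exploit is that physically relevant states often have rapidly decaying singular values, so a small k already captures nearly everything.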

Journal Reference:
Joseph Tindall, Matthew Fishman, E. Miles Stoudenmire, and Dries Sels, "Efficient Tensor Network Simulation of IBM's Eagle Kicked Ising Experiment," PRX Quantum 5, 010308 (2024). https://doi.org/10.1103/PRXQuantum.5.010308


Original Submission

 
This discussion was created by janrinok (52) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by anubi on Thursday February 29 2024, @12:21PM (1 child)

    by anubi (2828) on Thursday February 29 2024, @12:21PM (#1346784) Journal

    Somehow, I get the idea that a quantum computer is much like the old hybrid analog computers I worked with (and built) at university. In its day, the analog computer could solve nonlinear differential equations far faster than the digital computer could... so fast that we time-scaled the integrators/differentiators and the solved equations refreshed on the oscilloscope quickly enough to appear as a continuous solve to the eye. Using analog potentiometers, we could tweak various coefficients on the fly to observe their influence on the solution.

    A typical solve was optimizing control systems for stability and damping.

    We could measure what we had when we actually built a machine, then figure out how to drive it so it would not become unstable under certain conditions (usually at some resonance or combination of conditions).

    But the analog stuff, while fast, wasn't all that accurate. We had to compensate by tweaking the coefficients. Typical component tolerances on the resistors and capacitors were 1% or so.

    The analog voltage that represented the data was typically in the range of +/- 10 Volts. Nearly infinite resolution. In the milli-Hz to kHz timescale. Programming was done using patch cords... usually "banana" plugs, so named for the spring contacts that held them in the patch-panel holes connecting to the various analog components. You had a fixed number of operational elements... integrators, differentiators, inverters, summers, multipliers, dividers, absolute-value, log and antilog operators, potentiometers, maybe some special function blocks. Wire them up to solve your analog equation. Drive it with a signal generator and watch the solve on the oscilloscope, or timescale it down (by a factor of 1000 or more) and plot the same display on a mechanical pen plotter driven by servo motors.
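    For anyone who never patched one of these machines, here is a hypothetical digital re-creation in Python of the kind of setup described above: two integrators and a summer solving a damped oscillator, with the coefficient variables standing in for the potentiometer settings.

        # Digital sketch of an analog patch: two integrators and a summer
        # solving  x'' + 2*zeta*w*x' + w^2 * x = 0.
        w, zeta = 6.28, 0.1      # natural frequency and damping ("pot" settings)
        dt, steps = 1e-4, 100_000

        x, v = 1.0, 0.0          # initial conditions set on the integrators
        for _ in range(steps):
            a = -2.0 * zeta * w * v - w * w * x   # the "summer" output
            v += a * dt                           # first integrator
            x += v * dt                           # second integrator

        print(f"x(t = {steps * dt:.0f} s) = {x:.4f}")

    The analog machine did the same thing in continuous time with op-amp integrators, which is why it was so much faster than stepping through dt one multiply at a time.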

    So, if the data in a quantum computer is represented by a value between 1 and -1 with "infinite" resolution... it almost sounds like a UHF version of my old analog machine.

    I had no entanglement operator, though. I haven't the foggiest idea how that works.

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 3, Informative) by bzipitidoo on Thursday February 29 2024, @09:11PM

      by bzipitidoo (4388) on Thursday February 29 2024, @09:11PM (#1346849) Journal

      Quantum computing is definitely not analog. A superposition is not a value between 1 and 0; it is an unresolved probability of whether the quantum bit will collapse to a discrete value of 1 or 0. Schrodinger's cat is simultaneously dead and alive. It is not 20.1% dead and 79.9% alive (or any other split besides 100% and 0%); it is all the way alive or all the way dead, but which one is not known until the wave function collapses.

      Quantum computing attempts to use this uncertainty to sway the probabilities so that when the quantum bits collapse into ordinary classical bits, they contain a correct answer. Somehow, with quantum bits, it is possible in effect to test all possible answers simultaneously. The best-known problem with a quantum algorithm is number factoring. There is no known classical way to find the factors of an arbitrary number in anything close to the short time it takes merely to determine whether the number is composite or prime, at least not without massive parallelism. If you had enough computers, you could figure out the factors very quickly: just have each computer test a different prime, from 2 to the square root of the number. Really large numbers would take ridiculous quantities of computers, but in principle it is possible. Quantum superposition may make it possible, in effect, to try all the potential factors simultaneously on one computer.
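      For what it's worth, that brute-force search is a few lines of (hypothetical) Python; each trial divisor is an independent test, which is exactly why it farms out to as many machines as you can afford:

          from math import isqrt

          def smallest_factor(n):
              # Test every candidate divisor up to sqrt(n). The scheme above
              # would hand each candidate (or each prime) to its own machine,
              # since the tests are completely independent.
              for d in range(2, isqrt(n) + 1):
                  if n % d == 0:
                      return d
              return None  # n is prime

          print(smallest_factor(3 * 1_000_003))  # -> 3
          print(smallest_factor(1_000_003))      # -> None (prime)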

  • (Score: 3, Interesting) by VLM on Thursday February 29 2024, @01:09PM

    by VLM (445) on Thursday February 29 2024, @01:09PM (#1346791)

    It's a variation on the classic supercomputer problem where computational problems are turned into I/O problems.

    It's "easy" to solve certain problems with a quantum computer, but very hard to get the data in and out fast with a reasonable bit error rate.

  • (Score: 2) by DadaDoofy on Thursday February 29 2024, @10:43PM (3 children)

    by DadaDoofy (23827) on Thursday February 29 2024, @10:43PM (#1346861)

    Why is quantum computing sold as the future of computing when, by definition, it seems inherently broken to begin with? Can someone explain in layman's terms how something so randomly prone to error will ever be of commercial use? Most of the articles I've read say the error rate is something that will need to be dealt with down the line, but obviously solving it is a prerequisite for quantum computing to ever take hold and surpass conventional computing. What's the path forward to achieving that goal?

    • (Score: -1, Troll) by Anonymous Coward on Friday March 01 2024, @04:44AM (1 child)

      by Anonymous Coward on Friday March 01 2024, @04:44AM (#1346907)

      No thanks. It sounds like explaining it to you would be like explaining math to a monkey.

      • (Score: 3, Funny) by driverless on Friday March 01 2024, @10:41AM

        by driverless (4770) on Friday March 01 2024, @10:41AM (#1346945)

        Almost right, it'd be like explaining religion to a monkey, not math. "You see, there's this sky-fairy who's so insecure he/she/it has to have humans constantly praising it, and occasionally it decides to kill lots of said humans because it loves them so much, and then...".

    • (Score: 3, Insightful) by driverless on Friday March 01 2024, @10:37AM

      by driverless (4770) on Friday March 01 2024, @10:37AM (#1346944)

      It doesn't solve problems, it attracts funding. Ten years ago, to attract large amounts of funding, you had to include the word "blockchain" in your proposal. Five years ago it was "quantum". Now it's "AI". So expect the quantum hype to die down as it's superseded by AI hype. Now if only I could predict what the next bit of hype will be, I could get millions in funding... maybe I just need an AI-powered quantum computer based on blockchain technology to sort it out.

      (OK, technically it does solve at least one problem, "what do I put in a grant proposal to get funding?", but that's probably not the problem you were thinking of).
