
posted by Fnord666 on Sunday October 13 2019, @07:53PM   Printer-friendly
from the double-time dept.

Submitted via IRC for Bytram

New compiler makes quantum computers two times faster

A new paper from researchers at the University of Chicago introduces a technique for compiling highly optimized quantum instructions that can be executed on near-term hardware. This technique is particularly well suited to a new class of variational quantum algorithms, which are promising candidates for demonstrating useful quantum speedups. The new work was enabled by uniting ideas across the stack, spanning quantum algorithms, machine learning, compilers, and device physics. The interdisciplinary research was carried out by members of the EPiQC (Enabling Practical-scale Quantum Computation) collaboration, an NSF Expedition in Computing.

[...] To match the constraints of current and near-term quantum computers, a new paradigm for variational quantum algorithms has recently emerged. These algorithms tackle similar computational challenges as the originally envisioned quantum algorithms, but build resilience to noise by leaving certain internal program parameters unspecified. Instead, these internal parameters are learned by variation over repeated trials, guided by an optimizer. With a robust optimizer, a variational algorithm can tolerate moderate levels of noise.
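
As a rough, hedged illustration of that feedback loop (a toy sketch, not the researchers' code; the Hamiltonian coefficients and the single-parameter ansatz are invented for the example), a classical optimizer repeatedly evaluates a parameterized trial state and adjusts the unspecified parameter until the measured energy stops improving:

    # Toy variational loop: a classical optimizer learns an unspecified
    # internal parameter by repeated trials (illustrative sketch only).
    import numpy as np
    from scipy.optimize import minimize

    # Made-up single-qubit Hamiltonian H = 0.5*Z + 0.3*X.
    X = np.array([[0, 1], [1, 0]], dtype=complex)
    Z = np.array([[1, 0], [0, -1]], dtype=complex)
    H = 0.5 * Z + 0.3 * X

    def ansatz_state(theta):
        # Parameterized trial state |psi(theta)> = Ry(theta)|0>.
        return np.array([np.cos(theta[0] / 2), np.sin(theta[0] / 2)], dtype=complex)

    def energy(theta):
        # Expectation value <psi|H|psi>: the quantity the optimizer drives down.
        psi = ansatz_state(theta)
        return float(np.real(psi.conj() @ H @ psi))

    # The internal parameter is left unspecified and learned over repeated trials.
    result = minimize(energy, x0=[0.1], method="Nelder-Mead")
    print("optimal theta:", result.x, "minimum energy:", result.fun)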

While the noise resilience of variational algorithms is appealing, it poses a challenge for compilation, the process of translating a mathematical algorithm into the physical instructions ultimately executed by hardware.

[...] The researchers address the issue of partially specified programs with a parallel technique called partial compilation. Pranav Gokhale, a UChicago PhD student, explains, "Although we can't fully compile a variational algorithm before execution, we can at least pre-compile the parts that are specified." For typical variational algorithms, this simple heuristic alone is sufficient, delivering 2x speedups in quantum runtime relative to standard gate-based compilation techniques. Since qubits decay exponentially with time, this runtime speedup also leads to reductions in error rates.
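
The partial-compilation idea can be sketched in ordinary code. The following is an illustrative NumPy model, not the paper's pulse-level compiler, and the gate constants and helper names are invented for the example: the fully specified blocks are folded together once, up front, so each variational trial only has to rebuild the small parameter-dependent block:

    # Illustrative model of partial compilation (not the paper's compiler).
    import numpy as np

    H_GATE = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)   # Hadamard
    Z_GATE = np.array([[1, 0], [0, -1]], dtype=complex)                # Pauli-Z

    def rz(theta):
        # Parameterized rotation: unknown until the optimizer picks theta.
        return np.array([[np.exp(-1j * theta / 2), 0],
                         [0, np.exp(1j * theta / 2)]], dtype=complex)

    # Fully specified blocks are known before execution, so fold them together now.
    precompiled_prefix = Z_GATE @ H_GATE   # compiled once, reused every trial
    precompiled_suffix = H_GATE            # compiled once, reused every trial

    def run_trial(theta):
        # Per-trial work: only the parameterized middle block is built fresh.
        return precompiled_suffix @ rz(theta) @ precompiled_prefix

    for theta in (0.1, 0.2, 0.3):          # values an optimizer would supply
        unitary = run_trial(theta)
        print(theta, np.round(unitary, 3))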

For more complicated algorithms, the researchers apply a second layer of optimizations that numerically characterize variations due to the unspecified parameters, through a process called hyperparameter optimization. "Spending a few minutes on hyperparameter tuning and partial compilation leads to hours of savings in execution time", summarizes Gokhale. Professor Fred Chong notes that this theme of realizing cost savings by shifting resources—whether between traditional and quantum computing or between compilation and execution—echoes in several other EPiQC projects.
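
As a hedged sketch of what that tuning buys (again illustrative, not the paper's optimizer; the step-size candidates and toy objective are invented), a handful of cheap trials pick an optimizer setting before it is reused for the expensive full runs:

    # Toy hyperparameter tuning: spend a few cheap trials choosing a setting,
    # then reuse it for the expensive execution (illustrative sketch only).
    import numpy as np

    def objective(theta):
        # Same toy energy landscape as above: 0.5*cos(theta) + 0.3*sin(theta).
        return 0.5 * np.cos(theta) + 0.3 * np.sin(theta)

    def gradient(theta):
        return -0.5 * np.sin(theta) + 0.3 * np.cos(theta)

    def descend(step_size, steps=50):
        # Toy gradient descent whose quality depends on the hyperparameter step_size.
        theta = 0.1
        for _ in range(steps):
            theta -= step_size * gradient(theta)
        return objective(theta)

    # A few cheap trials characterize the hyperparameter...
    candidates = [0.01, 0.1, 0.5, 1.0]
    best_step = min(candidates, key=descend)

    # ...and the chosen setting is then reused for the full, expensive run.
    print("chosen step size:", best_step, "energy reached:", descend(best_step))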

The researchers' paper, "Partial Compilation of Variational Algorithms for Noisy Intermediate-Scale Quantum Machines" (arXiv link), will be presented at the MICRO computer architecture conference in Columbus, Ohio on October 14. Gokhale and Chong's co-authors include Yongshan Ding, Thomas Propson, Christopher Winkler, Nelson Leung, Yunong Shi, David I. Schuster, and Henry Hoffmann, all also from the University of Chicago.


Original Submission

  • (Score: 0) by Anonymous Coward on Sunday October 13 2019, @09:11PM (#906709) (3 children)

    Two Times. 2 Freaking Times.

    • (Score: 0) by Anonymous Coward on Sunday October 13 2019, @09:54PM (#906727)

      2 times as fast or 2 times faster?

    • (Score: 0) by Anonymous Coward on Monday October 14 2019, @01:57AM (#906808)

      I am waiting to see if they can do it a third time

    • (Score: 0) by Anonymous Coward on Monday October 14 2019, @02:11AM (#906811)

      Hit me baby one more time

  • (Score: 0) by Anonymous Coward on Sunday October 13 2019, @09:28PM (#906712)

    It lets you write quantum goto statements... they go everywhere at once.

  • (Score: 1, Insightful) by Anonymous Coward on Sunday October 13 2019, @09:54PM (#906728) (5 children)

    I am starting to believe there is something wrong with quantum mechanics. It does not seem to lead to anything useful, just endless stories about how "strange" and "cool" things really are and promises of tech that is always 20 years away.

  • (Score: 2, Disagree) by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Sunday October 13 2019, @10:00PM (#906732) (3 children)

    "can be executed on near-term hardware"

    "Can" has present aspect and tense here.

    "near-term" and "promising candidates" kinda sound like forward-looking statements rather than right-now facts. And the killer seems to be the first line of the abstract of the paper: "Quantum computing is on the cusp of reality". So it's not actually at reality yet? So these things might [be executed], but won't necessarily? Near-term hardware isn't hardware, near-term hardware is just ideas, designs, plans, aspirations. "Can be executed on near-term hardware" means "can't be executed on current hardware", which means "can't yet be executed".

    Build it, and run it, then we can talk about what your advances *can* do.

    But no - they said they ran and benchmarked it, I hear you protest - how can FatPhil be so negative about this quantum computation breakthrough?

    I'll let the paper explain that: "Our results were collected using over 200,000 CPU-core hours on Intel Xeon E5-2680 processors"

    My argument stands unchanged.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 2) by PartTimeZombie (4827) on Sunday October 13 2019, @10:16PM (#906742)

      I translated the phrase "near-term hardware" into "hardware that hasn't been invented yet".

    • (Score: 2) by martyb (76) on Tuesday October 15 2019, @09:47AM (#907298)

      Disagree.

      Cross compiler [wikipedia.org].

      Does it make sense to not write any software at all until the hardware is fully baked? You'd have software folk twiddling their thumbs waiting on that critical path. Instead, there is a way to get some overlap and not have to wait until the hardware is solid. I've worked on a project where, while the hardware folk were implementing new instructions in the hardware (according to a specification, think API), the software folk were writing their code against that same specification (API).

      How to debug/test?

      Emulation. Hence the "Our results were collected using over 200,000 CPU-core hours on Intel Xeon E5-2680 processors".

      Looks to me like SOP for any new hardware development process.

      In the matter at hand, I read it to say that, according to the specs that they have seen and the current trajectory of hardware development, when the hardware folk finally get the engineering challenges out of the way, we've got code that is ready to be run on it, and we've even done emulation to test it out.

      --
      Wit is intellect, dancing.
    • (Score: 2) by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Wednesday October 16 2019, @07:11AM (#907763)

      Dear person who disagrees: What's your counter-argument?

      I claim something doesn't exist, it's impossible to prove nonexistence of something whose existence can only be proved by proof of existence except through exhaustive search of the whole universe, and I claim that the fact they're running simulations on x86 clusters implies that the likelihood of existence is low. My claim is not extraordinary, it does not require extraordinary proof, and all the evidence is evidence of absence. If you disagree, then you are claiming that it does exist. That's extraordinary, where's your evidence? All you would need is *one* instance to disprove me - where's that counterexample?
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 0) by Anonymous Coward on Monday October 14 2019, @12:29AM (#906783)

    is still zero
