
posted by martyb on Monday January 10, @05:13AM   Printer-friendly
from the doing-quantum-pushups dept.

Making quantum computers even more powerful:

Engineers at École Polytechnique Fédérale de Lausanne (EPFL) have developed a method for reading several qubits—the smallest unit of quantum data—at the same time. Their method paves the way to a new generation of even more powerful quantum computers.

"IBM and Google currently have the world's most powerful quantum computers," says Prof. Edoardo Charbon, head of the Advanced Quantum Architecture Laboratory (AQUA Lab) in EPFL's School of Engineering. "IBM has just unveiled a 127-qubit machine, while Google's is 53 qubits." The scope for making quantum computers even faster is limited, however, due to an upper bound on the number of qubits. But a team of engineers led by Charbon, in collaboration with researchers in the U.K., has just developed a promising method for breaking through this technological barrier. Their approach can read qubits more efficiently, meaning more of them can be packed into quantum processors. Their findings appear in Nature Electronics.

[...] The number of qubits is currently limited by the fact that there's no technology yet available that can read all the qubits rapidly. "Complicating things further, qubits operate at temperatures close to absolute zero, or −273.15°C," says Charbon. "That makes reading and controlling them even harder. What engineers typically do is use machines at room temperature and control each qubit individually."
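The paper's title mentions "multiplexed dispersive readout": instead of one wire and one room-temperature instrument per qubit, many qubits share a single readout line, each probed at its own frequency. The following is a toy numeric sketch of that frequency-multiplexing idea only — the frequencies, the phase-flip state model, and all variable names are illustrative assumptions, not the EPFL chip's actual parameters.

```python
import numpy as np

# Toy model of frequency-multiplexed readout (illustrative, not the
# EPFL circuit): each qubit's resonator is probed at its own tone,
# one shared line carries the sum of all tones, and every qubit state
# is recovered at once by demodulating that line at each frequency.

fs = 1_000_000                        # sample rate, Hz (arbitrary)
t = np.arange(2048) / fs              # time samples
tones = [50_000, 120_000, 210_000]    # one probe frequency per qubit (hypothetical)
states = [1, 0, 1]                    # qubit states to encode (0 or 1)

# Model a qubit in state |1> as flipping the phase of its tone.
line = sum(np.cos(2 * np.pi * f * t + np.pi * s)
           for f, s in zip(tones, states))

# Demodulate the single shared line at each tone: the sign of the
# recovered in-phase component reveals each qubit's state.
readout = []
for f in tones:
    iq = np.mean(line * np.exp(-2j * np.pi * f * t))
    readout.append(1 if iq.real < 0 else 0)

print(readout)  # recovers [1, 0, 1]
```

Because the tones are well separated relative to the averaging window, the cross terms average out and all three states are read from one line — which is the payoff the article describes: fewer wires into the cryostat per qubit.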

Journal Reference:
Andrea Ruffino, Tsung-Yeh Yang, John Michniewicz, et al. A cryo-CMOS chip that integrates silicon quantum dots and multiplexed dispersive readout electronics, Nature Electronics (DOI: 10.1038/s41928-021-00687-6)

Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 1, Interesting) by Anonymous Coward on Monday January 10, @11:23PM (3 children)


    If you start the timeline of classical computing in 1936, when Turing published his paper, within five years the first useful computers had been built, within 20 years the first stored program digital computers had been built, and within 42 years the 8-bit micro era had begun.

    If you start the timeline of quantum computing in 1980, when Benioff published his paper... It's been 42 years and quantum computing is still not accomplishing anything that couldn't be done by hand.

  • (Score: 0) by Anonymous Coward on Tuesday January 11, @05:08AM


    The length of the timeline doesn't necessarily mean anything. You could say that quantum computing is orders of magnitude harder, requiring advancements in material science to even get started.

  • (Score: 2) by maxwell demon on Tuesday January 11, @09:01AM (1 child)


    You could just as well start the timeline of classical computing with Babbage's Analytical Engine. It was never built, but it was already the concept of a general-purpose computer, and it would have worked in principle. It's just that the technology of the time wasn't advanced enough to make that concept practical. Sound familiar?

    The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 0) by Anonymous Coward on Thursday January 13, @02:18PM


      If someone had said that the Analytical Engine had no practical applications, they would have been completely right.

      Be sure your goalposts leave a forwarding address!