
SoylentNews is people

posted by martyb on Monday January 10 2022, @05:13AM
from the doing-quantum-pushups dept.

Making quantum computers even more powerful:

Engineers at École Polytechnique Fédérale de Lausanne (EPFL) have developed a method for reading several qubits—the smallest unit of quantum data—at the same time. Their method paves the way to a new generation of even more powerful quantum computers.

"IBM and Google currently have the world's most powerful quantum computers," says Prof. Edoardo Charbon, head of the Advanced Quantum Architecture Laboratory (AQUA Lab) in EPFL's School of Engineering. "IBM has just unveiled a 127-qubit machine, while Google's is 53 qubits." The scope for making quantum computers even faster is limited, however, due to an upper bound on the number of qubits. But a team of engineers led by Charbon, in collaboration with researchers in the U.K., has just developed a promising method for breaking through this technological barrier. Their approach can read qubits more efficiently, meaning more of them can be packed into quantum processors. Their findings appear in Nature Electronics.

[...] The number of qubits is currently limited by the fact that there's no technology yet available that can read all the qubits rapidly. "Complicating things further, qubits operate at temperatures close to absolute zero, or –273.15°C," says Charbon. "That makes reading and controlling them even harder. What engineers typically do is use machines at room temperature and control each qubit individually."
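The scaling problem Charbon describes can be made concrete with a toy calculation (not from the article): if every qubit needs its own dedicated line into the cryostat, wiring grows linearly with qubit count, while a multiplexed readout scheme like the one in the paper lets many qubits share a line. A minimal Python sketch, where the qubits-per-line figure is an illustrative assumption rather than the paper's number:

```python
# Toy wiring comparison (illustrative only, not the paper's figures):
# dedicated per-qubit readout lines vs. a shared, multiplexed line.

def lines_individual(n_qubits: int) -> int:
    """One dedicated readout line per qubit."""
    return n_qubits

def lines_multiplexed(n_qubits: int, qubits_per_line: int = 8) -> int:
    """Qubits share a readout line, each addressed separately.
    The 8-qubits-per-line value is an assumption for illustration."""
    return -(-n_qubits // qubits_per_line)  # ceiling division

for n in (53, 127, 1000):
    print(f"{n:>4} qubits: {lines_individual(n):>4} dedicated "
          f"vs {lines_multiplexed(n):>3} multiplexed lines")
```

The point of the sketch is only that per-qubit wiring is the linear term that multiplexed readout attacks, which is why the article frames faster shared readout as the route to packing in more qubits.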

Journal Reference:
Andrea Ruffino, Tsung-Yeh Yang, John Michniewicz, et al. A cryo-CMOS chip that integrates silicon quantum dots and multiplexed dispersive readout electronics, Nature Electronics (DOI: 10.1038/s41928-021-00687-6)


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 2, Touché) by Anonymous Coward on Monday January 10 2022, @07:08AM (#1211449) (6 children)

    Now with twice the number of no practical applications

    • (Score: 2) by maxwell demon (1608) Subscriber Badge on Monday January 10 2022, @08:27AM (#1211458) Journal (4 children)

      I guess you would have said the same thing about the first classical computers.

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 1, Interesting) by Anonymous Coward on Monday January 10 2022, @11:23PM (#1211646) (3 children)

        If you start the timeline of classical computing in 1936, when Turing published his paper, within five years the first useful computers had been built, within 20 years the first stored program digital computers had been built, and within 42 years the 8-bit micro era had begun.

        If you start the timeline of quantum computing in 1980, when Benioff published his paper... It's been 42 years and quantum computing is still not accomplishing anything that couldn't be done by hand [stackexchange.com].

        • (Score: 0) by Anonymous Coward on Tuesday January 11 2022, @05:08AM (#1211712)

          The length of the timeline doesn't necessarily mean anything. You could say that quantum computing is orders of magnitude harder, requiring advancements in material science to even get started.

        • (Score: 2) by maxwell demon (1608) Subscriber Badge on Tuesday January 11 2022, @09:01AM (#1211734) Journal (1 child)

          You could just as well start the timeline of classical computing with Babbage's Analytical Engine. It was never built, but it was already the concept of a general-purpose computer, and it would have worked in principle. It's just that the technology of the time wasn't advanced enough to make that concept practical. Sound familiar?

          • (Score: 0) by Anonymous Coward on Thursday January 13 2022, @02:18PM (#1212408)

            If someone had said that the Analytical Engine had no practical applications, they would have been completely right.

            Be sure your goalposts leave a forwarding address!

    • (Score: 2) by driverless (4770) on Monday January 10 2022, @12:53PM (#1211478)

      Dear quantum computer fans,

      Please get back to me when a quantum computer can factor a number better than a dog trained to bark three times.

      Love,
      IT person.
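For context on the joke above: the published quantum factoring demonstrations have targeted tiny numbers (Shor's algorithm was first run on 15, and later demonstrations reached 21), which plain trial division dispatches instantly. A few lines of Python, included purely as illustration:

```python
# Trial division: classically factors the numbers used in published
# quantum factoring demos (15, 21) in microseconds.

def smallest_factor(n: int) -> int:
    """Return the smallest prime factor of n (or n itself if n is prime)."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n

for n in (15, 21):
    p = smallest_factor(n)
    print(f"{n} = {p} * {n // p}")
# prints: 15 = 3 * 5
#         21 = 3 * 7
```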

  • (Score: 0) by Anonymous Coward on Monday January 10 2022, @11:45AM (#1211475)

    Now with twice the number of unreadable bits!

    "no technology yet available that can read all the qubits rapidly"

  • (Score: 0) by Anonymous Coward on Monday January 10 2022, @05:39PM (#1211548)

    Might make a more accurate headline for the story...
