
posted by cmn32480 on Sunday May 21 2017, @10:31AM
from the maybe-IBM-isn't-all-services dept.

https://www.hpcwire.com/2017/05/18/ibm-d-wave-report-quantum-computing-advances/

IBM said this week it has built and tested a pair of quantum computing processors, including a prototype of a commercial version. That progress follows an announcement earlier this week that commercial quantum computer developer D-Wave Systems has garnered venture funding that could total up to $50 million to build its next-generation machine with up to 2,000 qubits.

[...] Meanwhile, IBM researchers continue to push the boundaries of quantum computing as part of its IBM Q initiative launched in March to promote development of a "universal" quantum computer. Access to a 16-qubit processor via the IBM cloud would allow developers and researchers to run quantum algorithms. The new version replaces an earlier 5-qubit processor.

The company also rolled out the first prototype of a 17-qubit commercial processor on Wednesday (May 17), making it IBM's most powerful quantum device. The prototype will serve as the foundation of IBM Q's commercial access program. The goal is to eventually scale future prototypes to 50 or more qubits.

The article also notes Hewlett Packard Enterprise's prototype of "The Machine", with 160 terabytes of RAM.


Original Submission

 
  • (Score: 2) by JoeMerchant on Sunday May 21 2017, @11:51AM (19 children)

    by JoeMerchant (3937) on Sunday May 21 2017, @11:51AM (#512981)

    Perhaps this is why IBM is doing a big reduction in force on their traditional computing staff? Having recognized that their traditional "big iron" business is now superseded by hardware that's ubiquitous and very affordable, and that the software that runs on it is becoming freer year after year, is their new competitive bet quantum computing?

    --
    🌻🌻 [google.com]
  • (Score: 2, Informative) by khallow on Sunday May 21 2017, @01:03PM (18 children)

    by khallow (3766) Subscriber Badge on Sunday May 21 2017, @01:03PM (#513000) Journal
    That would be insane, if true. There is a huge gap between quantum computing (QC) and the service contracts that IBM makes its money on. And for most basic tasks, QC just doesn't add value. Your computer isn't going to be any better at displaying HTML or pushing spreadsheets, if it were a QC.
    • (Score: 0) by Anonymous Coward on Sunday May 21 2017, @04:40PM (10 children)

      by Anonymous Coward on Sunday May 21 2017, @04:40PM (#513054)

      Your computer isn't going to be any better at displaying HTML or pushing spreadsheets, if it were a QC.

      Is there a proof that parsing, rendering, and dependency calculation cannot benefit from quantum computing, or are you assuming that currently known algorithms are representative of the usefulness of QC?
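
      For scale, the one broadly applicable quantum speedup known today is Grover's quadratic one for unstructured search. Here's a minimal sketch of what that buys in raw query counts (plain Python arithmetic, no quantum library; whether parsing or rendering even maps onto such a search is exactly the open question):

          import math

          # Expected probes to find one marked item among N: classical linear
          # scan vs. Grover's ~(pi/4)*sqrt(N) iterations. Pure arithmetic.
          for n in (10**3, 10**6, 10**9):
              classical = n / 2                    # average-case linear search
              grover = math.ceil((math.pi / 4) * math.sqrt(n))
              print(f"N={n:>13,}  classical ~{classical:>14,.0f}  Grover ~{grover:>8,}")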

      • (Score: 1) by khallow on Sunday May 21 2017, @11:38PM (9 children)

        by khallow (3766) Subscriber Badge on Sunday May 21 2017, @11:38PM (#513200) Journal
        Rather, a third option: current technologies are already vastly more than fast enough to keep up with human I/O and a huge variety of real-world applications.
        • (Score: 0) by Anonymous Coward on Monday May 22 2017, @12:24AM (8 children)

          by Anonymous Coward on Monday May 22 2017, @12:24AM (#513220)

          Power efficiency matters, especially given the increasing amount of tech relying on batteries.

          • (Score: 1) by khallow on Monday May 22 2017, @01:58AM (7 children)

            by khallow (3766) Subscriber Badge on Monday May 22 2017, @01:58AM (#513257) Journal

            Power efficiency matters, especially given the increasing amount of tech relying on batteries.

            There's huge room for improvement in power efficiency for classical computation, as in many orders of magnitude.
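
            A back-of-envelope for "many orders of magnitude", using the Landauer limit (the thermodynamic minimum energy to erase one bit); the per-operation figure for today's hardware is an assumed ballpark, not a measurement:

                import math

                k_B = 1.380649e-23                  # Boltzmann constant, J/K
                T = 300.0                           # room temperature, K
                landauer = k_B * T * math.log(2)    # ~2.9e-21 J per bit erased
                cpu_per_op = 1e-11                  # ASSUMED J per logical op today

                print(f"Landauer limit: {landauer:.1e} J/bit")
                print(f"headroom: ~10^{math.log10(cpu_per_op / landauer):.0f}")  # ~10 orders of magnitude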

            • (Score: 0) by Anonymous Coward on Monday May 22 2017, @02:18AM (6 children)

              by Anonymous Coward on Monday May 22 2017, @02:18AM (#513270)

              Any efficiency difference captured by a constant multiplicative factor will only hold for problems below a certain size; this is about how the amount of resources required grows with the size of the problem. At some problem size, QC will pull ahead forevermore, regardless of how many times faster classical computers are.

              Possibly that problem size will be large enough that QC isn't worth it, but there's absolutely no way to know at present if that will be the case.
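
              A minimal sketch of that crossover, with every number an illustrative assumption: give classical hardware a million-fold constant-factor advantage per step but a worse growth rate, and see where the quantum side pulls ahead:

                  C = 1_000_000   # assumed constant-factor speed advantage of classical hardware

                  def classical_steps(n):  # hypothetical classical algorithm: n^3 steps
                      return n ** 3

                  def quantum_steps(n):    # hypothetical quantum algorithm: n^2 steps, each C times costlier
                      return C * n ** 2

                  n = 1
                  while classical_steps(n) <= quantum_steps(n):
                      n *= 2
                  print(f"classical loses somewhere below n = {n:,}")  # crossover near n = C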

              • (Score: 1) by khallow on Monday May 22 2017, @02:30AM (5 children)

                by khallow (3766) Subscriber Badge on Monday May 22 2017, @02:30AM (#513275) Journal

                Any efficiency difference captured by a constant multiplicative factor will only hold for problems below a certain size; this is about how the amount of resources required grows with the size of the problem.

                And almost all real world problems, particularly for power limited applications, are far below a "certain size". Further, we're nowhere near a level of technology development where this is relevant. IBM has to get there from here. They can't be doing that by eliminating their current business.

                • (Score: 0) by Anonymous Coward on Monday May 22 2017, @02:57AM (4 children)

                  by Anonymous Coward on Monday May 22 2017, @02:57AM (#513289)

                  real world problems, particularly for power limited applications, are far below a "certain size"

                  We have no idea what "real world problems" will be in a decade or two. People today parse kilobytes of data literally hundreds of times a day. How much parsing is involved in a normal person's daily web browsing? Hundreds of parses, each of several kilobytes, is probably a significant underestimate, especially considering that many people have a few shittily designed apps that poll frequently, and how much JS gets pulled into a mundane webpage. This state of affairs would have seemed ridiculous a few decades back.

                  Kilobytes may seem tiny, but that's also the point: megabytes are starting to look tiny now, and with HD HDR 60fps video on the horizon, gigabytes will start to look small pretty fast too. Literally billions of bytes will look small.
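
                  Rough arithmetic behind that (the format is an assumed example: 1080p, 60 fps, 10 bits per colour channel, uncompressed):

                      width, height, fps = 1920, 1080, 60
                      bits_per_pixel = 3 * 10            # three colour channels, 10-bit HDR

                      raw_rate = width * height * fps * bits_per_pixel  # bits/second
                      print(f"~{raw_rate / 8 / 1e9:.2f} GB/s of raw pixels")        # ~0.47 GB/s
                      print(f"~{raw_rate * 60 / 8 / 1e9:.0f} GB/minute before compression")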

                  Furthermore, this all assumes that classical computers will maintain their large efficiency advantage over quantum computers. If QC turns out to be significantly useful (and who knows what mundane stuff it's useful for yet; the research probably doesn't focus on using QC to make video decoding faster, since that won't matter until QC is cheap and widely available), then unless there's a fundamental reason classical computers will keep their advantages, I just don't see them withstanding the potential power efficiency of an equally developed QC.

                  Sure, QC memory at the moment is sub-tubes-of-mercury in terms of manufacturability, but modern machines don't use tubes of mercury as memory, so limits on its efficient manufacture aren't relevant.

                  Unless some property of the physics being exploited requires hard to manufacture things, I just don't see QC perpetually lagging behind CC.

                  we're nowhere near a level of technology development where this is relevant

                  A decent QC implementation for scheduling or TSP has the potential to save airlines and shipping companies enough money that their industries could support IBM offering QCaaS. Granted, development is currently slow, but given the rate of tech development, I'm not willing to accept that IBM-funded research couldn't make progress fast enough to be commercially viable.
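
                  As a toy version of the scheduling/TSP point (everything here is illustrative; real routing problems have thousands of stops plus side constraints), even a simple heuristic leaves a measurable gap against the true optimum, and that gap is the money on the table:

                      import itertools, math, random

                      random.seed(1)
                      pts = [(random.random(), random.random()) for _ in range(8)]
                      dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])

                      def tour_length(order):
                          return sum(dist(pts[order[i]], pts[order[(i + 1) % len(order)]])
                                     for i in range(len(order)))

                      # brute force is only viable for tiny n -- the scaling problem itself
                      best = min(itertools.permutations(range(8)), key=tour_length)

                      # greedy nearest-neighbour heuristic starting from city 0
                      unvisited, tour = set(range(1, 8)), [0]
                      while unvisited:
                          nxt = min(unvisited, key=lambda c: dist(pts[tour[-1]], pts[c]))
                          tour.append(nxt)
                          unvisited.remove(nxt)

                      gap = tour_length(tour) / tour_length(best) - 1
                      print(f"heuristic tour is {gap:.1%} longer than optimal")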

                  • (Score: 1) by khallow on Monday May 22 2017, @03:55AM (3 children)

                    by khallow (3766) Subscriber Badge on Monday May 22 2017, @03:55AM (#513305) Journal

                    Kilobytes may seem tiny, but that's also the point: megabytes are starting to look tiny now, and with HD HDR 60fps video on the horizon, gigabytes will start to look small pretty fast too. Literally billions of bytes will look small.

                    QC won't help at all with the need for larger data pipes. And it's worth noting that classical computation is way ahead of bandwidth: existing computers have no trouble keeping up with data decoding and decompression, which, let us note, is not that hard a problem for classical computers either.

                    A decent QC implementation for scheduling or TSP has the potential to save airlines and shipping companies enough money that their industries could support IBM offering QCaaS. Granted, development is currently slow, but given the rate of tech development, I'm not willing to accept that IBM-funded research couldn't make progress fast enough to be commercially viable.

                    The thing is current computers are way more than fast enough to do that.

                    Unless some property of the physics being exploited requires hard to manufacture things, I just don't see QC perpetually lagging behind CC.

                    Such as the need for sufficient isolation from the rest of the universe?

                    • (Score: 0) by Anonymous Coward on Monday May 22 2017, @04:57AM (2 children)

                      by Anonymous Coward on Monday May 22 2017, @04:57AM (#513319)

                      QC won't help at all with the need for larger data pipes

                      I meant to imply that problems are quickly becoming large enough to benefit from QC despite efficiency issues while it catches up with CC.

                      The thing is current computers are way more than fast enough to do that.

                      They're certainly good enough, but even a fraction of a percentage point of improvement in these approximations can mean millions saved. I doubt current implementations are so close to optimal that there wouldn't be a market, but that is an ill-informed guess.
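
                      The arithmetic is simple; the $25B annual fuel spend below is a hypothetical round number for a large carrier group, picked only to show the scale:

                          annual_fuel_spend = 25e9   # ASSUMED, USD/year, hypothetical large carrier group
                          improvement = 0.001        # 0.1% better routing/scheduling

                          print(f"${annual_fuel_spend * improvement / 1e6:.0f}M saved per year")  # -> $25M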

                      More importantly, knocking the average-case running time's exponent down a bit would enable optimizations which are currently infeasible for larger problems. I can't think of an example off the top of my head, but I'm sure there are a fair few algorithms people would prefer had their exponents slashed.
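
                      One way to see what slashing an exponent buys: with a fixed budget of B elementary steps, an O(n^k) algorithm handles inputs up to roughly n = B^(1/k). The budget below is an arbitrary illustrative constant:

                          budget = 1e15   # assumed total step budget

                          for k in (3, 2):
                              print(f"O(n^{k}): feasible up to n ~ {budget ** (1 / k):,.0f}")
                          # O(n^3) tops out near 100,000; O(n^2) near 31,600,000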

                      Such as the need for sufficient isolation from the rest of the universe?

                      I'm ignorant of the requirements, but since the definition of what is "hard to manufacture" is entirely dependent on current tech levels, and robotic assembly is growing more prevalent, I wouldn't be at all surprised to see assembly automated. The materials/space required seem the limiting factors. Materials will probably become slowly cheaper as we gain a better understanding of engineering these systems, but maybe not. As for the space required, that may be a limiting factor, but particles are pretty small compared to humans, and I'd be somewhat surprised if qubits require roughly human-scale space to remain isolated.

                      • (Score: 1) by khallow on Monday May 22 2017, @05:33AM (1 child)

                        by khallow (3766) Subscriber Badge on Monday May 22 2017, @05:33AM (#513326) Journal
                        The thing is, current computation is already vast overkill for a variety of computation needs, while quantum computers are extremely primitive (a handful of qubits, for example). Gambling a huge company on the remote possibility that QC can be massively accelerated at some point in the near future seems a bad idea to me.
                        • (Score: 0) by Anonymous Coward on Monday May 22 2017, @05:53AM

                          by Anonymous Coward on Monday May 22 2017, @05:53AM (#513335)

                          You're right, it isn't worth the risk; if they do get in early, though, it may be very valuable in the future.

    • (Score: 2) by takyon on Sunday May 21 2017, @07:26PM (1 child)

      by takyon (881) <takyonNO@SPAMsoylentnews.org> on Sunday May 21 2017, @07:26PM (#513116) Journal

      One day, quantum computing could come to consumer devices and even SoCs. Of course, it would have to work at room temperature.

      Google's TPUs will be coming to cell phones, providing a fairly narrow neuromorphic kind of function. I don't think it is impossible for consumer SoCs to ship with quantum computing components, even if they were as limited as D-Wave's "quantum annealer".

      --
      [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
      • (Score: 1) by khallow on Sunday May 21 2017, @11:54PM

        by khallow (3766) Subscriber Badge on Sunday May 21 2017, @11:54PM (#513206) Journal
        The first problem comes with the term, "one day". I doubt that they are foreseeing serious QC business in the short term. There isn't much that QC can do right now.

        Second, IBM would be throwing away some big businesses already. While everything can be sold with more advanced security algorithms (which is one of the few niches that QC might work in), there isn't that compelling a need to make something more secure. I'm just not seeing how QC will help that much with IBM's typical service-oriented businesses.
    • (Score: 2) by JoeMerchant on Monday May 22 2017, @01:09AM (4 children)

      by JoeMerchant (3937) on Monday May 22 2017, @01:09AM (#513236)

      But, is there any profit growth left in "big iron" pushing HTML or spreadsheets?

      QC is new territory, with mostly undefined applications, a chance for growth. HTML and spreadsheets are a race to the bottom with every other vendor on the planet.

      --
      🌻🌻 [google.com]
      • (Score: 1) by khallow on Monday May 22 2017, @02:05AM (3 children)

        by khallow (3766) Subscriber Badge on Monday May 22 2017, @02:05AM (#513263) Journal

        But, is there any profit growth left in "big iron" pushing HTML or spreadsheets?

        I don't see the point of abandoning a mature industry with a high profit margin for an unproven one with extremely little development at present and little relevance to IBM's current business model. More likely is that IBM has grown so inefficient that they compete poorly in considerable portions of their old markets.

        • (Score: 2) by JoeMerchant on Monday May 22 2017, @03:45AM (2 children)

          by JoeMerchant (3937) on Monday May 22 2017, @03:45AM (#513300)

          Laying off your high priced older employees isn't abandoning an industry, it's cost control. They can still service the old accounts and continue to milk them.

          --
          🌻🌻 [google.com]
          • (Score: 1) by khallow on Monday May 22 2017, @03:59AM (1 child)

            by khallow (3766) Subscriber Badge on Monday May 22 2017, @03:59AM (#513307) Journal
            I think that's a better take on the situation. It's still short-sighted, since the experience is in the people they're getting rid of, but at least it's a saner course of action.
            • (Score: 2) by JoeMerchant on Monday May 22 2017, @01:21PM

              by JoeMerchant (3937) on Monday May 22 2017, @01:21PM (#513471)

              It's a really shitty thing to do as a "corporate citizen", and it is often not as effective as the spreadsheets say it will be, since one older employee earning $125K per year can often bring more income to a company than three younger employees earning $50K per year (and the cost difference to the company is more like $225K for the old one vs. $375K for the three younger ones).

              --
              🌻🌻 [google.com]