
posted by janrinok on Saturday October 22 2016, @04:38AM   Printer-friendly
from the hopefully-for-the-better dept.

When the world's first digital computer was completed in 1946, it opened up vast new worlds of possibility. Still, early computers were only used for limited applications because they could only be programmed in machine code. It took so long to set up problems that they were only practical for massive calculations.

That all changed when John Backus created the first programming language, FORTRAN, at IBM in 1957. For the first time, real-world problems could be quickly and efficiently transformed into machine language, which made computers far more practical and useful. In the 1960s, the market for computers soared.

Like early digital computers, quantum computing offers the possibility of technology millions of times more powerful than current systems, but the key to success will be translating real-world problems into quantum language. At D-Wave, which is already producing a commercial quantum computer, that process is underway and it is revealing massive potential.

[Continues...]

... while these are major advances, our newfound knowledge has also revealed our limitations. Unlocking the secrets of DNA exposed how little we know about the proteins it codes for, just as early successes with targeted therapies have shown us how much more we can achieve by working with complete genomes rather than just isolated markers in our chromosomes.

Unfortunately, conventional computers aren't powerful enough to perform these tasks well, but early indications are that quantum computers can close the gap. Scientists at Harvard have found that quantum computers will allow us to map proteins much as we do genes today. D-Wave has also formed a partnership with DNA-SEQ to use its quantum computers to explore how to analyze entire genomes to create more effective therapies.

Mapping the human genome was a triumph of technology as much as it was an achievement in biology. It was, essentially, more powerful computers that allowed us to analyze human DNA on a massive scale. However, if we are to advance further, quantum systems will likely be a big part of the answer.

Take a look on the Internet and you can find hilarious lists of phrases mangled by autocorrect, like "I don't" being changed to "idiot" in a text to your mom. These are embarrassing mistakes, but they usually don't cause too much trouble. However, in other applications, like picking a terrorist out of a crowd through facial recognition, the stakes are far higher.

These problems arise because of how machine learning algorithms are designed and trained. Like our brains, they process different aspects of an experience, such as colors and shapes, and integrate them into larger concepts, such as a human face, a type of hairstyle or the signature style of a popular designer.

However, in order for this process to work well, the more elemental aspects need to be correctly identified or they will pass on bad information to the higher levels of the system. Because of the limited capacity of conventional computers, data is lost in the training process and things are not recognized correctly, resulting in insults to your mom and misidentified terrorists.
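
To make that failure mode concrete, here is a purely illustrative Python sketch (not from the article): a toy recognizer that combines two "elemental" detectors into a higher-level concept, and that starts misjudging as soon as noise corrupts the low-level features.

    import numpy as np

    rng = np.random.default_rng(0)

    def low_level_features(image, noise=0.0):
        # Crude "elemental" detectors: mean brightness of each half of the image.
        left, right = image[:, :2].mean(), image[:, 2:].mean()
        # Limited capacity / lost data is modeled here as additive noise.
        return np.array([left, right]) + noise * rng.normal(size=2)

    def high_level_concept(features):
        # Integrate the elemental features into a larger concept: "face-like"
        # here just means the two halves look roughly symmetric.
        return "face" if abs(features[0] - features[1]) < 0.2 else "not a face"

    image = np.ones((4, 4))                                           # a perfectly symmetric toy image
    print(high_level_concept(low_level_features(image)))              # face
    print(high_level_concept(low_level_features(image, noise=0.5)))   # usually wrong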

[...] In 1968, just a decade after John Backus introduced FORTRAN, Douglas Engelbart presented the results of his project to "augment human intellect" and it turned out to be so consequential that it is now called The Mother of All Demos. Until that point computers were, much like quantum technology today, merely computational devices that few people ever saw.

[...] I'm not implying that we will all have quantum computers in our homes, but we will likely be able to access them in the cloud and they will help us solve problems that seem impossible today. D-Wave's Hilton told me "the quantum computing revolution may be even more profound than the digital computing revolution a half century ago and it will happen much faster."


Original Submission

  • (Score: 0) by Anonymous Coward on Saturday October 22 2016, @05:13AM (#417526)

    ... you should at least stick to products that actually work.

    • (Score: 3, Informative) by driverless (4770) on Saturday October 22 2016, @07:29AM (#417543)

      Yup. What next, "How antigravity machines will change the world", "How eternal-life potions will change the world", "How free-energy generators will change the world"?

    • (Score: 0) by Anonymous Coward on Saturday October 22 2016, @09:45AM (#417558)

      Nice product placement SN

      If you can afford a D-Wave quantum computer you're not reading SN ...but your butler's personal assistant's tech contact might be.

      • (Score: 2) by fubari (4551) on Sunday October 23 2016, @04:44PM (#417886)

        What product was placed? Do you mean the text where some DWave guy said 'blah blah quantum blah blah Cloud'? I'm guessing here because the parent posts are kind of vague. Help me understand what you took offense to.

        I for one am genuinely interested to see if DWave is yielding practical results.
        One way (maybe the only way?) to find out about that is to see what kinds of "problem nails" people are using a "DWave hammer" to pound. Over the last year or so, what I've read suggests "maybe, maybe not - results ambiguous so far."

        The dna-seqalliance [dna-seqalliance.com] page is largely a "we're going to try this" fluff piece, nothing solid yet. So I'm glad they're trying some stuff; maybe they'll get results. Time will tell.
        But being put off by "product placement"?

        This DigitalTonto article [digitaltonto.com] was a genuinely interesting summary of a researcher's career. It would be a nice introduction for anyone not already versed in the goals of quantum computing. (fwiw, I didn't see any products being placed there.)

  • (Score: 0) by Anonymous Coward on Saturday October 22 2016, @05:22AM (#417528)

    The intro sounds like something the green site would publish.

  • (Score: 0) by Anonymous Coward on Saturday October 22 2016, @06:17AM (#417538)

    Not even D-Wave, despite the hype they're clearly trying to generate with this submarine ad that they tricked you folks into running for them. Now, if they really have a working quantum computer, let's see them try to get us the prime factorisation of: 412023436986659543855531365332575948179811699844327982845455626433876445565248426198098870423161841879261420247188869492560931776375033421130982397485150944909106910269861031862704114880866970564902903653658867433731720813104105190864254793282601391257624033946373269391. That's an 896-bit number, should be easy as pie if you had a real quantum computer capable of doing computations on superpositions the way Shor's algorithm requires.
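
    For scale, here is a toy Python sketch of the classical half of Shor's algorithm. The period-finding loop below is brute force, which is exactly the part a real quantum computer would have to do on superpositions; N here is a toy modulus, nothing remotely like the 896-bit number above.

      from math import gcd

      def find_period(a, N):
          # Smallest r > 0 with a^r = 1 (mod N): the "quantum" step, faked by brute force.
          r, x = 1, a % N
          while x != 1:
              x = (x * a) % N
              r += 1
          return r

      def shor_factor(N, a):
          if gcd(a, N) != 1:
              return gcd(a, N)          # the guess already shares a factor
          r = find_period(a, N)
          if r % 2 == 1:
              return None               # odd period: try another a
          y = pow(a, r // 2, N)
          if y == N - 1:
              return None               # trivial root: try another a
          return gcd(y - 1, N)

      print(shor_factor(15, 7))   # 3, a factor of 15
      print(shor_factor(21, 2))   # 7, a factor of 21

    That brute-force loop is what blows up exponentially at 896 bits; Shor's quantum period finding is the known way around it, and it isn't something an annealing machine runs.
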
    • (Score: 0) by Anonymous Coward on Saturday October 22 2016, @02:30PM (#417576)

      It's 42. The remainder is turtles, turtles all the way down...

    • (Score: 0) by Anonymous Coward on Saturday October 22 2016, @10:20PM (#417683)

      as easy as pie

      A long wait even for a working quantum computer to equate pi as prime, maybe if it were an apple?

  • (Score: 2) by TGV (2838) on Saturday October 22 2016, @07:20AM (#417540)

    Is it a computer generated summary of a whole year of Scientific American? Or was the author high? Whatever the cause, it's pure gibberish.

    • (Score: 0) by Anonymous Coward on Saturday October 22 2016, @07:33AM (#417546)

      Be thankful the summary is only 11 paragraphs, not the full 20. Muahaha!

  • (Score: 2) by HiThere (866) Subscriber Badge on Saturday October 22 2016, @06:49PM (#417629) Journal

    Quantum computing is theoretically faster than standard computing for a limited set of uses. And if Fujitsu can succeed in their latest attempt, not even as many of those as was once thought. The Fujitsu attempt, while also specialized, claims to scale well and to address those problems that can be posed as simulated annealing, which was one large group of problems where quantum computing was supposed to have an advantage.

    Actually, the benefits of the Fujitsu approach seem to boil down to 1) stability over long periods of time (less state decay), 2) lower cost, and 3) room-temperature operation...no need to fiddle around with liquid helium, or even nitrogen.

    That said, the Fujitsu approach is still in pre-beta, while there actually ARE existing quantum computers; but those are so expensive to buy and operate that I expect Fujitsu's approach to dominate in the area of simulated annealing...IF they can make it work.

    There will still be some problems that can only be addressed well with quantum computers, but it's not at all clear how many. Secure communication might be one of them, but I suspect that there will be encryption approaches that don't yield to quantum computers ... and I mean besides one-time tables.
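
    For concreteness, here is a bare-bones toy sketch (my own, not Fujitsu's or D-Wave's method) of the problem class in question: simulated annealing nudging a tiny Ising-style energy toward its minimum.

      import math, random

      random.seed(1)
      J = {(0, 1): 1.0, (1, 2): -1.0, (0, 2): 0.5}   # toy couplings between three spins

      def energy(spins):
          return sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())

      spins = [random.choice([-1, 1]) for _ in range(3)]
      T = 2.0                                        # start hot, cool slowly
      while T > 0.01:
          i = random.randrange(3)
          before = energy(spins)
          spins[i] *= -1                             # trial move: flip one spin
          dE = energy(spins) - before
          if dE > 0 and random.random() >= math.exp(-dE / T):
              spins[i] *= -1                         # reject the uphill move
          T *= 0.99
      print(spins, energy(spins))                    # a low-energy assignment

    Quantum and digital annealers are pitched as exploring the same kind of energy landscape, just with different physics doing the exploring.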

    --
    Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.

    • (Score: 1, Informative) by Anonymous Coward on Sunday October 23 2016, @02:02AM (#417714)

      There already are several cryptosystems that are not easily solvable by quantum computers, in that there is no known quantum algorithm that will break them. Of course, there's no proof that no efficient quantum algorithm for them exists, but remember there's also no proof that no classical algorithm can factor numbers or compute discrete logarithms efficiently either.

      There's the McEliece cryptosystem (based on error correcting codes), schemes based on supersingular elliptic curve isogeny, lattice-based cryptosystems like Ring-LWE, and a few more. They all require bigger (and in some cases much bigger) keys than traditional algorithms.

      Symmetric key cryptosystems can be decrypted by quantum computers as though they had only half the bits in their key, but that is easily enough compensated for without resorting to completely new algorithms: triple encryption can be employed to increase key lengths to the point where not even quantum computers can deal with them. Triple-AES for instance would instantly negate any advantage a quantum computer might have in breaking the encryption.
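
      As a rough illustration of the triple encryption idea (my own sketch using the Python 'cryptography' package; a real deployment would also need proper key management and authentication), three independent AES-256-CTR layers can be stacked like this:

        import os
        from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

        def aes_ctr(key, nonce, data):
            # CTR mode is its own inverse, so one function both encrypts and decrypts.
            ctx = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
            return ctx.update(data) + ctx.finalize()

        keys   = [os.urandom(32) for _ in range(3)]    # three independent 256-bit keys
        nonces = [os.urandom(16) for _ in range(3)]

        def triple_encrypt(plaintext):
            data = plaintext
            for k, n in zip(keys, nonces):
                data = aes_ctr(k, n, data)
            return data

        def triple_decrypt(ciphertext):
            data = ciphertext
            for k, n in reversed(list(zip(keys, nonces))):
                data = aes_ctr(k, n, data)
            return data

        msg = b"post-quantum enough for a forum demo"
        assert triple_decrypt(triple_encrypt(msg)) == msg

      Against a Grover-style search an n-bit key behaves roughly like n/2 bits, which is the gap the extra layers are meant to cover.
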
  • (Score: 2) by gidds (589) on Monday October 24 2016, @12:46PM (#418124)

    When the world's first digital computer was completed in 1946 [...]

    I guess it's referring to ENIAC, which shows a real ignorance of early computers.  There were many firsts in the field, but several digital computers pre-dated that, some by quite a long time...

    Even in the USA, the Harvard Mark I [wikipedia.org] was a programmable digital (electromechanical) computer in 1944.

    Colossus [wikipedia.org] was a programmable electronic digital computer working in 1943.

    Konrad Zuse's Z3 [wikipedia.org] was a programmable digital (electromechanical) computer in 1941.

    And of course Babbage's Analytical Engine [wikipedia.org] was a programmable digital computer designed back in 1837! (Parts were built by 1862; no-one's yet constructed the whole thing, but it's believed it would work as designed.)

    (And if you relax the 'digital', then the Antikythera mechanism [wikipedia.org] was an analogue computer well over 2000 years ago!)

    --
    [sig redacted]