
SoylentNews is people

posted by cmn32480 on Wednesday July 20 2016, @05:17PM   Printer-friendly
from the i-just-don't-get-it dept.

Submitted via IRC for Bytram

This week Samuel Arbesman, a complexity scientist and writer, will publish "Overcomplicated: Technology at the Limits of Comprehension," a well-developed guide to dealing with technologies that elude our full understanding. In the book, Arbesman writes that we're entering the entanglement age — a phrase coined by Danny Hillis — "in which we are building systems that can't be grasped in their totality or held in the mind of a single person." In the case of driverless cars, machine learning systems build their own algorithms to teach themselves, and in the process become too complex to reverse engineer.

And it's not just software that's become unknowable to individual experts, says Arbesman.

Machines like particle accelerators and Boeing airplanes have millions of individual parts and miles of internal wiring. Even a technology like the U.S. Constitution, which began as an elegantly simple operating system, has grown to include a collection of federal laws "22 million words long with 80,000 connections between one section and another."

In the face of increasing complexity, experts are ever more likely to be taken by surprise when systems behave in unpredictable and unexpected ways.

Source: http://singularityhub.com/2016/07/17/the-world-will-soon-depend-on-technology-no-one-understands/

For a collection of over three decades of these (among other things) see The Risks Digest - Forum On Risks To The Public In Computers And Related Systems. It's not so much that this is a new problem, as it is an increasingly common one as technology becomes ever more complicated.


Original Submission

 
  • (Score: 2) by hendrikboom on Thursday July 21 2016, @05:47PM

    by hendrikboom (1125) on Thursday July 21 2016, @05:47PM (#378043) Homepage Journal

    Compilers got to be better than humans sometime in the 70's. Not that all of them are, nor that humans can't improve a lot of the code fragments they generate. But when you start to look at programs of thousands of lines of assembler, people get tired, and stop optimizing each fragment at their peak ability.

    Compilers may do worse in small pieces of code, but they don't get tired and in the long stretches they end up excelling. It's like the difference between a sprinter and a long-distance runner.

    Hand-optimizing the few bits of code that are time-critical can, of course, still improve efficiency.
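    The kind of local rewrite being discussed can be sketched in C. This is an illustrative example (the function names are made up): the "hand-optimized" version hoists a multiply out of the loop and strength-reduces it to a shift — exactly the sort of transformation a modern compiler at -O2 will usually perform on the naive version by itself:

    ```c
    #include <stdio.h>
    #include <stdint.h>
    #include <stddef.h>

    /* Naive version: what a programmer might write first. */
    static uint32_t sum_scaled_naive(const uint32_t *a, size_t n) {
        uint32_t s = 0;
        for (size_t i = 0; i < n; i++)
            s += a[i] * 8;          /* a multiply on every iteration */
        return s;
    }

    /* "Hand-optimized" version: sum first, then replace the repeated
     * multiply with a single shift after the loop.  A compiler at -O2
     * typically does this (and may vectorize the loop) on its own. */
    static uint32_t sum_scaled_hand(const uint32_t *a, size_t n) {
        uint32_t s = 0;
        for (size_t i = 0; i < n; i++)
            s += a[i];
        return s << 3;              /* one shift instead of n multiplies */
    }

    int main(void) {
        uint32_t a[] = {1, 2, 3, 4, 5};
        printf("%u %u\n", sum_scaled_naive(a, 5), sum_scaled_hand(a, 5));
        return 0;
    }
    ```

    Both versions print the same result, which is the point: the human's clever rewrite buys nothing once the compiler applies the same transformation, so hand work only pays off in the few hot spots the compiler handles badly.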

    Not to mention that writing a lot of assembler can blind you to the possibility of changing data representations or algorithms — gains I'm not even counting here.

    But I'll agree that on small pieces of code, like the ones presented in this discussion, human programmers can often outdo compilers.
