posted by Fnord666 on Tuesday April 11 2017, @09:11PM
from the do-you-know-how-you-think? dept.

Will Knight writes:

No one really knows how the most advanced algorithms do what they do. That could be a problem.

Last year, a strange self-driving car was released onto the quiet roads of Monmouth County, New Jersey. The experimental vehicle, developed by researchers at the chip maker Nvidia, didn't look different from other autonomous cars, but it was unlike anything demonstrated by Google, Tesla, or General Motors, and it showed the rising power of artificial intelligence. The car didn't follow a single instruction provided by an engineer or programmer. Instead, it relied entirely on an algorithm that had taught itself to drive by watching a human do it.

Getting a car to drive this way was an impressive feat. But it's also a bit unsettling, since it isn't completely clear how the car makes its decisions. Information from the vehicle's sensors goes straight into a huge network of artificial neurons that process the data and then deliver the commands required to operate the steering wheel, the brakes, and other systems. The result seems to match the responses you'd expect from a human driver. But what if one day it did something unexpected—crashed into a tree, or sat at a green light? As things stand now, it might be difficult to find out why. The system is so complicated that even the engineers who designed it may struggle to isolate the reason for any single action. And you can't ask it: there is no obvious way to design such a system so that it could always explain why it did what it did.
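
For readers unfamiliar with what "taught itself to drive by watching a human" looks like in practice, here is a minimal sketch of that kind of end-to-end "behavioral cloning" setup: camera pixels in, a steering command out, trained only to imitate logged human driving. The layer sizes, input resolution, optimizer, and training loop are illustrative assumptions, not details of Nvidia's actual system.

import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    """Maps a raw camera frame directly to a steering angle (illustrative sizes)."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(          # convolutions learn road/lane features on their own
            nn.Conv2d(3, 24, 5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, 5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, 5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, 3), nn.ReLU(),
            nn.AdaptiveAvgPool2d((3, 8)),       # fixed-size feature map regardless of camera resolution
        )
        self.head = nn.Sequential(              # fully connected layers regress a single number
            nn.Flatten(),
            nn.Linear(64 * 3 * 8, 100), nn.ReLU(),
            nn.Linear(100, 1),                  # predicted steering angle
        )

    def forward(self, frame):
        return self.head(self.features(frame))

model = SteeringNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

def train_step(frames, human_steering):
    """One imitation step: shrink the gap between the net's steering and the human's."""
    optimizer.zero_grad()
    loss = loss_fn(model(frames), human_steering)
    loss.backward()
    optimizer.step()
    return loss.item()

# Hypothetical batch of dashcam frames and the human driver's recorded steering angles.
frames = torch.randn(8, 3, 66, 200)
human_steering = torch.randn(8, 1)
print(train_step(frames, human_steering))

After training, nothing in the model corresponds to a rule like "stop at red lights": the behaviour is smeared across millions of learned weights, which is why asking such a system to explain any particular decision has no obvious answer.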

The mysterious mind of this vehicle points to a looming issue with artificial intelligence. The car's underlying AI technology, known as deep learning, has proved very powerful at solving problems in recent years, and it has been widely deployed for tasks like image captioning, voice recognition, and language translation. There is now hope that the same techniques will be able to diagnose deadly diseases, make million-dollar trading decisions, and do countless other things to transform whole industries.

[...] The U.S. military is pouring billions into projects that will use machine learning to pilot vehicles and aircraft, identify targets, and help analysts sift through huge piles of intelligence data. Here more than anywhere else, even more than in medicine, there is little room for algorithmic mystery, and the Department of Defense has identified explainability as a key stumbling block.

[...] At some stage we may have to simply trust AI's judgement or do without using it. Likewise, that judgement will have to incorporate social intelligence. Just as society is built upon a contract of expected behaviour, we will need to design AI systems to respect and fit with our social norms. If we are to create robot tanks and other killing machines, it is important that their decision-making be consistent with our ethical judgements.

https://www.technologyreview.com/s/604087/the-dark-secret-at-the-heart-of-ai/

What do you think, would you trust such AI even if you couldn't parse its methods? Is deep learning AI technology inherently un-knowable?


Original Submission

 
  • (Score: 3, Insightful) by Anonymous Coward on Tuesday April 11 2017, @09:20PM (27 children)

    by Anonymous Coward on Tuesday April 11 2017, @09:20PM (#492464)

    All of the same questions apply to humans behind the wheel, too.

  • (Score: 2) by NotSanguine on Tuesday April 11 2017, @09:49PM (3 children)

    All of the same questions apply to humans, regardless of what they're doing.

    There. FTFY.

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr
    • (Score: 2) by maxwell demon on Wednesday April 12 2017, @07:24AM (2 children)

      by maxwell demon (1608) on Wednesday April 12 2017, @07:24AM (#492649) Journal

      Not really. We know that humans generally have a common inherited system that evolved to prevent things going too wrong. This system is mostly encoded in our emotions. Sure, emotions can themselves go wrong and cause problems, but all in all, they are what prevents most humans from becoming a major danger (and not surprisingly, the most dangerous humans are those whose emotional system is defective in a way that it doesn't properly control them; we tend to call such people sociopaths).

      Also note that in the specific case of driving, humans don't just imitate other humans, they get explicitly taught certain rules, including rules for situations they might not ever have experienced first hand (like, how to behave when the car engine fails while the car is still moving).

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 2) by massa on Wednesday April 12 2017, @01:11PM

        by massa (5547) on Wednesday April 12 2017, @01:11PM (#492724)

        Not really. We know that humans generally have a common inherited system that evolved to prevent things going too wrong.

        This was only true while we were hunter-gatherers, 50,000 years ago.

        Also note that in the specific case of driving, humans don't just imitate other humans, they get explicitly taught certain rules, including rules for situations they might not ever have experienced first hand (like, how to behave when the car engine fails while the car is still moving).

        This (and a lot of traffic rules, etc.) is encoded in the driving AIs' preloaded knowledge base, too.

      • (Score: 2) by NotSanguine on Wednesday April 12 2017, @04:08PM

        by NotSanguine (285) <NotSanguineNO@SPAMSoylentNews.Org> on Wednesday April 12 2017, @04:08PM (#492843) Homepage Journal

        The post I replied to was (IIUC) referring to this:

        What do you think, would you trust such AI even if you couldn't parse its methods? Is deep learning AI technology inherently un-knowable?

        If one were to rewrite the above as:

        What do you think, would you trust a human even if you couldn't parse its methods? Is human behavior inherently un-knowable?

        Do you trust all humans? Do you trust any humans? What criteria do you use when deciding to trust someone?

        Trust is earned through experience with a person. Unfortunately, we don't have experience with most other people, or most other drivers. Which is, I assume, why, when I was learning to drive, one of the first things the instructor told me was that I should assume that all other drivers are deaf, blind and stupid.

        While, as you correctly pointed out, we do have some understanding of the basis for human behavior, the neural processes (which is what the questioner was referring to) that underlie decision making are pretty poorly understood.

        --
        No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 3, Insightful) by Justin Case on Tuesday April 11 2017, @09:59PM (2 children)

    by Justin Case (4239) on Tuesday April 11 2017, @09:59PM (#492481) Journal

    But what doesn't apply to humans is utter lack of fear, pain, or consequences for misbehavior.

    • (Score: 3, Insightful) by inertnet on Tuesday April 11 2017, @11:24PM

      by inertnet (4071) on Tuesday April 11 2017, @11:24PM (#492525) Journal

      Maybe the lack of hormones, emotions and animal instincts will compensate for that.

    • (Score: 2) by darkfeline on Wednesday April 12 2017, @05:32PM

      by darkfeline (1030) on Wednesday April 12 2017, @05:32PM (#492912) Homepage

      It sounds like you never heard of teenagers.

      --
      Join the SDF Public Access UNIX System today!
  • (Score: 1) by butthurt on Tuesday April 11 2017, @10:45PM (18 children)

    by butthurt (6141) on Tuesday April 11 2017, @10:45PM (#492508) Journal

    In some places, human drivers are expected to pass a written test. This software is somewhat like a human who couldn't.

    • (Score: 0) by Anonymous Coward on Tuesday April 11 2017, @11:10PM (9 children)

      by Anonymous Coward on Tuesday April 11 2017, @11:10PM (#492514)

      You could train a neural net on every single driving test on the planet; it would be far better at taking such exams than any human.

      • (Score: 2) by Unixnut on Tuesday April 11 2017, @11:19PM (5 children)

        by Unixnut (5779) on Tuesday April 11 2017, @11:19PM (#492521)

        > You could train a neural net on every single driving test on the planet; it would be far better at taking such exams than any human.

        And when they actually do that, and the AI proves itself by passing the test at the competence level of a human, then we can consider them fit for purpose, and can start trusting them as much as a human in that situation (which isn't much, admittedly, but OK). Yet despite the AI supposedly being "capable" of this feat (as stated by humans), not one of them has so far achieved it.

        Until an AI proves itself as capable as a human by those tests, it is still a lab curiosity, not fit for the real world (except to act under the direction of humans, as a sort of "mental crutch" to help with specific thinking problems, which is what these DNNs are actually good for).

        • (Score: 1, Informative) by Anonymous Coward on Tuesday April 11 2017, @11:39PM

          by Anonymous Coward on Tuesday April 11 2017, @11:39PM (#492531)

          Driving is all about experience, and that's what neural nets are: Pure experience.

        • (Score: 2) by massa on Wednesday April 12 2017, @01:14PM (3 children)

          by massa (5547) on Wednesday April 12 2017, @01:14PM (#492726)

          Until an AI proves itself as capable as a human by those tests, it is still a lab curiosity, not fit for the real world (except to act under the direction of humans, as a sort of "mental crutch" to help with specific thinking problems, which is what these DNNs are actually good for).

          AFAICT self-driving vehicles pass those specific (driving) tests with flying colors, just like fresh 16 year-olds (or 18 down here in Brasil). I would trust my life on the highway to one of those AIs much sooner than to one of the college freshmen I see driving on the street.

          • (Score: 2) by Unixnut on Wednesday April 12 2017, @02:34PM (2 children)

            by Unixnut (5779) on Wednesday April 12 2017, @02:34PM (#492771)

            > AFAICT self-driving vehicles pass those specific (driving) tests with flying colors, just like fresh 16 year-olds (or 18 down here in Brasil). I would trust my life on the highway to one of those AIs much sooner than to one of the college freshmen I see driving on the street.

            Based on that statement, all I can say is that you should have harder driving tests. I'm not being flippant, and I don't know about Brazil, but I do know Europe: in some countries you can basically get a driving licence by looking pretty, or through a bribe or "service exchange" with the examiner, while others take it seriously and have tests that most people would fail regardless of age (e.g. Finland).

            Needless to say, EU road-accident statistics correlate with the country where the driver passed their "test". I'm personally in a country whose test isn't as hard as Finland's, but is still pretty hard, so we have a higher standard of driving on the roads as a result. I see no reason why it should be easy to get a licence. You are operating a ton or more of metal carrying a lot of kinetic energy; driving is a privilege, not a right.

            • (Score: 2) by massa on Wednesday April 12 2017, @03:45PM (1 child)

              by massa (5547) on Wednesday April 12 2017, @03:45PM (#492816)

              I think you missed the point. However difficult the test is in whatever country, it really only tests whether the fresh driver can drive the car around some city blocks, safely. And THAT the driving AIs already do far better than fresh drivers. Even in Finland (or wherever), the driving test is not a crash-avoidance test, nor a radical defensive-driving test, let alone a Kobayashi Maru style "what to do in a no-win situation" test. And the problems we (as human drivers) encounter (besides the fatigue, alcohol, and Twitter problems -- which the AIs already solve) arise in exactly those situations that are NEVER tested in a driver's-license test.

              • (Score: 3, Informative) by Unixnut on Wednesday April 12 2017, @04:16PM

                by Unixnut (5779) on Wednesday April 12 2017, @04:16PM (#492851)

                And I think you missed my point. Driving is hard, really hard. Not the technical skill required to go round the block a few times (hell, I did that as a 7-year-old in my dad's car), but the ability to anticipate events, to read what other drivers are doing and understand their intent, to silently agree on ways to deal with obstacles and negotiate rights of way (even if it goes against the road rules), to infer events from things as subtle as reflections off another car indicating that someone is approaching a blind corner, and to hear engine sounds and infer whether something you can't see is accelerating, decelerating or holding steady.

                Then you get into things like bad weather, worn-out (or non-existent) road markings, non-functioning traffic lights, potholes, temporary redirections due to road works, water on roads, etc. The world is not a perfectly clean and well-maintained ideal.

                Yes, superior sensors can mitigate some of the above, but not enough to meet the standards that a human (even a "fresh" teenage one, if properly trained) can achieve. Humans are very adaptable, and we have millions of years of evolutionary training in this. No AI can reach that (at this point in time) so no, they can't do far better than even fresh drivers (unless the drivers were trained poorly, which is the fault of the training, not the human, hence my point about better driving tests).

                That an AI can beat a poorly trained human in a really restricted set of circumstances doesn't surprise me. I am sure we could program an automated car directly (without ML) and it would also beat the human in that test. However that doesn't mean the car can "drive" better than humans can, or that we are close to some technological singularity where everyone will have a robot chauffeur (unless of course, you just want to go round your block endlessly).

                We are still a long way off from an AI that can be considered a fit replacement for a human driver.

      • (Score: 1) by butthurt on Wednesday April 12 2017, @01:45AM (2 children)

        by butthurt (6141) on Wednesday April 12 2017, @01:45AM (#492573) Journal

        > You could train a neural net on every single driving test on the planet; it would be far better at taking such exams than any human.

        I don't disagree. I was, however, writing about the neural net described in the article, which hasn't had such training. The one in the article has been trained to actually drive, not to take tests. If a neural net were trained to both drive and take written tests, its knowledge (as it were) of how to drive could be independent of its knowledge of how to answer test questions, could it not?

        • (Score: 0) by Anonymous Coward on Wednesday April 12 2017, @04:08AM (1 child)

          by Anonymous Coward on Wednesday April 12 2017, @04:08AM (#492601)

          People cram for that worthless exam, and then immediately forget it.

          Nobody drives by the book; everybody drives by their experience—driving is 100% instinct.

          • (Score: 2) by butthurt on Wednesday April 12 2017, @05:49AM

            by butthurt (6141) on Wednesday April 12 2017, @05:49AM (#492632) Journal

            Reading a book, listening to a lecture in a classroom, and talking to someone while driving are experiences too. Is that what you're saying? If not, let's leave that aside. People are often taught by methods such as those, which supplement their time in a simulator or alone in a car. An instructor can say "this sign means you must stop for a few seconds" and expect the pupil to learn right away from being told. A human who already drives proficiently can be told "it's fine to park here today, because it's a holiday" or "'U' turns will be forbidden in the village starting 1 January" and will react accordingly. There's more to it than observation and experience.

    • (Score: 2) by frojack on Wednesday April 12 2017, @04:16AM (6 children)

      by frojack (1554) on Wednesday April 12 2017, @04:16AM (#492604) Journal

      In some places, human drivers are expected to pass a written test

      Where's that?

      I'm 68, and I took my first driver's test on a machine, multiple choice.
      Not certain passing the test assured any driving skill.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 2) by butthurt on Wednesday April 12 2017, @05:18AM (4 children)

        by butthurt (6141) on Wednesday April 12 2017, @05:18AM (#492626) Journal

        > Where's that?

        Malaysia, for one.

        http://www.expatgo.com/my/2012/05/25/driving-test-in-malaysia/ [expatgo.com]

        > I'm 68, and I took my first driver's test on a machine, multiple choice.

        I suppose you answered via a touch screen or a mark sense card, and you're asserting that that wasn't really writing? All right, but you were still communicating.

        > Not certain passing the test assured any driving skill.

        Indeed, the page I linked advises:

        Some of the questions and answers can be strangely worded, so it’s best to just memorise all of the questions and answers rather than relying on common sense.

        • (Score: 5, Funny) by NotSanguine on Wednesday April 12 2017, @05:28AM

          by NotSanguine (285) <NotSanguineNO@SPAMSoylentNews.Org> on Wednesday April 12 2017, @05:28AM (#492627) Homepage Journal

          Some of the questions and answers can be strangely worded, so it’s best to just memorise all of the questions and answers rather than relying on common sense.

          Pittsburgh driver's test
          2: A traffic light at an intersection changes from yellow to red, you should
          a) stop immediately.
          b) proceed slowly through the intersection.
          c) blow the horn.
          d) floor it.
          The correct answer is d.
          If you said c, you were almost right, so give yourself a half point.

          3: When stopped at an intersection you should
          a) watch the traffic light for your lane.
          b) watch for pedestrians crossing the street.
          c) blow the horn.
          d) watch the traffic light for the intersecting street.
          The correct answer is d.
          You need to start as soon as the traffic light for the intersecting
          street turns yellow.
          Answer c is worth a half point.

          5: Your car's horn is a vital piece of safety equipment.
          How often should you test it?
          a) once a year.
          b) once a month.
          c) once a day.
          d) once an hour.
          The correct answer is d.
          You should test your car's horn at least once every hour,
          and more often at night or in residential neighborhoods.

          7) The car directly in front of you has a flashing right tail light
          but a steady left tail light. This means

          (a) one of the tail lights is broken; you should blow your horn
          to call the problem to the driver's attention.
          (b) the driver is signaling a right turn.
          (c) the driver is signaling a left turn.
          (d) the driver is from out of town.

          The correct answer is (d). Tail lights are used in some foreign
          countries to signal turns.

          (8) Pedestrians are

          (a) irrelevant.
          (b) communists.
          (c) a nuisance.
          (d) difficult to clean off the front grille.

          The correct answer is (a). Pedestrians are not in cars, so they are
          totally irrelevant to driving; you should ignore them completely.

          --
          No, no, you're not thinking; you're just being logical. --Niels Bohr
        • (Score: 3, Insightful) by mhajicek on Wednesday April 12 2017, @06:41AM (2 children)

          by mhajicek (51) on Wednesday April 12 2017, @06:41AM (#492642)

          If you think a 68-year-old took his first driving test on a touch screen, I think you're a bit out of touch.

          --
          The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
      • (Score: 2) by maxwell demon on Wednesday April 12 2017, @07:30AM

        by maxwell demon (1608) on Wednesday April 12 2017, @07:30AM (#492650) Journal

        Not certain passing the test assured any driving skill.

        Of course not. The test proves that you know the rules.

        Driving skill: You know how to keep the car on the road at a certain speed.
        Rule knowledge: You know to keep your speed below a certain limit even if the conditions would allow for more.

        --
        The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2) by VLM on Wednesday April 12 2017, @12:43PM

      by VLM (445) Subscriber Badge on Wednesday April 12 2017, @12:43PM (#492714)

      In some places, human drivers are expected to pass a written test. This software is somewhat like a human who couldn't.

      It's a white privilege to take driver's ed and possess a driver's license. In some of the more vibrant multicultural areas of my nearest big city, about 1/3 of the drivers do not follow that aspect of whitey's lifestyle. It's interesting that if you live and work in the (white) burbs, uninsured motorist coverage is literally like $4/year, while working near the very multicultural hood it's more like $300/year, because let's face it, practically all accidents are caused by unlicensed, uninsured drivers.

      Just something to think about: an AI driving at the level of an experienced, sober, wide-awake white male is probably very difficult, but people generally tolerate some pretty unusual behavior in the hood.

      As long as it stays in the hood. Which is going to be interesting. "For the safety of our staff and students, XYZ middle school forbids self-driving cars at student drop-off and pickup." Note that they can't issue traffic tickets or take you to court, but they can and will make your child's school experience a living hell if you don't obey.

      I suspect that as the death toll from self-driving cars increases, there will be discrimination against them. Sorry, no self-driving cars allowed in our parking lot or parking structure. Sorry, no self-driving cars allowed in our drive-thru. Sorry, no self-driving cars allowed around our school (kinda like how smoking has been banned in public near schools). You'll be free to own self-driving cars as property, and they might be legal to use on the public roads, but what good is that if you can't put the car on any private property other than your home?

      I predict self-driving cars will go through a phase kinda like open firearm carry in practice. Technically it's legal. In practice, doing it is "disturbing the peace" and you'll get arrested, although maybe not charged. You can do it anywhere in public except virtually all private retail property, schools, the courthouse, and any government office (post office etc.). So you can own it, and do it, but it's so heavily restricted you can't live a normal life while doing it.

      That's kinda my hard sci-fi plot of the day: how self-driving cars will be discriminated against in, let's say, 2025 or so.

  • (Score: 2) by c0lo on Wednesday April 12 2017, @11:04AM

    by c0lo (156) Subscriber Badge on Wednesday April 12 2017, @11:04AM (#492686) Journal

    Artificial human acts as human; scientists scared.

    You mean... some artificial humans start receiving blowjobs while driving, too?
    Now I'm scared, even if not a scientist.

    (grin)

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford