
posted by martyb on Thursday July 26 2018, @08:01AM
from the if-it-walks-like-a-duck,-sinks-like-a-duck,-oh,-wait... dept.

The Los Angeles Times reports:

The duck boat that sank in a Missouri lake last week, killing 17 people, was built based on a design by a self-taught entrepreneur who had no engineering training, according to court records reviewed by the Los Angeles Times.

The designer, entrepreneur Robert McDowell, completed only two years of college and had no background, training or certification in mechanics when he came up with the design for "stretch" duck boats more than two decades ago, according to a lawsuit filed over a roadway disaster in Seattle involving a similar duck boat in 2015.


Original Submission

 
  • (Score: 3, Interesting) by bobthecimmerian on Thursday July 26 2018, @11:07AM (9 children)

    by bobthecimmerian (6834) on Thursday July 26 2018, @11:07AM (#713013)

    I'm no fan of Tesla, Google, Elon Musk, Larry Page, or Sergey Brin. That said, my impression is that Google will get this right and Tesla won't, simply because of money. Tesla is struggling to be profitable, so Musk is rushing products out the door in a hurry. Google has the resources it needs to develop the product carefully and make damn sure nobody who buys it can sue them for making it unsafe.

    I'm a Free Software Foundation member, and I would genuinely like to see free-as-in-freedom/open source software everywhere for everything. But I don't see how that can work for automated driving unless some incredibly wealthy people or groups back the project. I'm not going to put something with (pardon the capitalization, I'm just copy-pasting) this disclaimer in control of my safety: "THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM “AS IS” WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION."

  • (Score: 0) by Anonymous Coward on Thursday July 26 2018, @02:32PM (3 children)

    by Anonymous Coward on Thursday July 26 2018, @02:32PM (#713113)

    Waymo is using a different model than Tesla is. The extra money is important, but Waymo seems to actually be testing things before handing them over to consumers. I'm sure that by the time Waymo offers anything for sale to consumers, it won't kill anybody. There will probably be a lot of other things wrong with it, like spying on people as they drive, but I doubt it'll kill anybody.

    • (Score: 0) by Anonymous Coward on Thursday July 26 2018, @08:46PM (2 children)

      by Anonymous Coward on Thursday July 26 2018, @08:46PM (#713358)

      Yes, Waymo is using a different model and they are testing in depth; for example, see this recent press release:
          https://www.automotivetestingtechnologyinternational.com/news/vehicle-testing/fca-us-adds-62000-chrysler-pacifica-hybrid-minivans-waymos-self-driving-fleet.html [automotivetestingtechnologyinternational.com]

      Yes -- they are adding 62,000 minivans to their test fleet. For comparison, that's double the total number of vehicles Tesla made in the first quarter of this year.

      The old saying that you can't "test in reliability" may still be true, but if you do enough testing, maybe you can approach an acceptable level of reliability?
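
      A rough sketch of why "testing in reliability" takes fleet-scale numbers: if human drivers average on the order of one fatality per 100 million miles (a commonly cited US figure; treat it, and the confidence level below, as illustrative assumptions rather than anything from this thread), even a zero-fatality demonstration that a system is at least that safe takes hundreds of millions of test miles:

      import math

      # Assumed human baseline: ~1 fatality per 100 million vehicle miles
      # (a commonly cited US figure -- an illustrative assumption here).
      human_rate = 1e-8   # fatalities per mile
      confidence = 0.95   # confidence that the fleet is at least as safe

      # Treat fatalities as Poisson events: zero fatalities in n miles has
      # probability exp(-rate * n). To support "our rate is below the human
      # rate" at this confidence, we need exp(-human_rate * n) <= 1 - confidence.
      miles_needed = -math.log(1 - confidence) / human_rate

      print(f"{miles_needed:,.0f} failure-free miles")  # roughly 300 million

      That is far more than any individual driver ever logs, which is one reason a 62,000-vehicle test fleet is not as absurd as it sounds.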

      • (Score: 2) by Knowledge Troll on Thursday July 26 2018, @09:57PM (1 child)

        by Knowledge Troll (5948) on Thursday July 26 2018, @09:57PM (#713384) Homepage Journal

        if you do enough testing, maybe you can approach an acceptable level of reliability?

        I'm not looking forward to a future where we can no longer prove a control system is operating correctly. We are moving away from control systems based on state-transition charts and a dynamics model, which can be analyzed and proven to have the properties they were designed for. There can be errors in design that lead to problems, but fundamentally you can go back and say "this is exactly what went wrong".

        You can't do that with a NN control system. The best you can say is "we made a test case and it no longer faults in that test case." Maybe that will change as network analysis gets better, but certainly not right now.
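
        To make the contrast concrete, here is a minimal sketch of the kind of analysis a table-driven controller allows (the states, inputs, and safety property are invented for illustration, not taken from any real vehicle): its entire behavior is a finite table you can enumerate and check, while a neural network offers no comparable case-by-case proof.

        from itertools import product

        # A toy brake controller as an explicit state-transition table.
        # States, inputs, and the safety property are invented for
        # illustration -- this is not any real vehicle's logic.
        STATES = {"CRUISE", "BRAKING", "STOPPED"}
        INPUTS = {"obstacle", "clear", "halted"}

        TRANSITIONS = {
            ("CRUISE",  "obstacle"): "BRAKING",
            ("CRUISE",  "clear"):    "CRUISE",
            ("CRUISE",  "halted"):   "STOPPED",
            ("BRAKING", "obstacle"): "BRAKING",
            ("BRAKING", "clear"):    "CRUISE",
            ("BRAKING", "halted"):   "STOPPED",
            ("STOPPED", "obstacle"): "STOPPED",
            ("STOPPED", "clear"):    "CRUISE",
            ("STOPPED", "halted"):   "STOPPED",
        }

        # Exhaustive check over every (state, input) pair: the table is
        # total, and "an obstacle never leaves us in CRUISE" holds.
        for state, event in product(STATES, INPUTS):
            nxt = TRANSITIONS[(state, event)]  # KeyError = undefined case
            assert nxt in STATES
            if event == "obstacle":
                assert nxt != "CRUISE", f"unsafe transition from {state}"

        print("all", len(STATES) * len(INPUTS), "cases verified")

        Nine cases here; a real controller has far more, but the point is that the count is finite and every case can be inspected.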

  • (Score: 3, Interesting) by Knowledge Troll on Thursday July 26 2018, @03:05PM (4 children)

    by Knowledge Troll (5948) on Thursday July 26 2018, @03:05PM (#713148) Homepage Journal

    That said, my impression is that Google will get this right and Tesla won't simply because of money.

    I think you are right that Google has the best chance out of all three of the organizations I mentioned, though I'm not sure that Google can actually pull it off, because I'm not sure anyone can actually pull it off.

    Google has the resources it needs to develop the product carefully and make damn sure nobody that buys it can sue them for making it unsafe.

    I think this is a pretty good analysis, but I'll go a step further and bring someone else into this argument that I left out originally: GM. Google hasn't been sued even once for killing someone, that I know of. GM has been sued for killing people more times than I can count. Out of everyone working on robot cars, I think only the current automobile manufacturers really have an intuitive, organizational-level understanding of what it means to be sued and how unpleasant that is (let's not kid ourselves that people dying is a problem for them unless it hits the news) -- and even so, there are still issues with faulty ignition locks and automatic transmission shifters and crap like that.

    Out of the list I originally gave, Google is the only one that hasn't killed someone with their machine. I'm pretty sure GM's robot cars haven't either. I personally believe the car manufacturers have the best chance of doing this "right" in terms of safety and working with the driver, because of this history of being sued and a very long history of understanding their customers. Google wants to take the steering wheel away (they are very public about this), but GM wants to give drivers something they will like.

    Google is also the only one of them with any real understanding of AI. I think this has helped them greatly in closing the gap in industry-specific automobile knowledge. I'm not sure what is going to happen with engineers at GM who have all the car experience but zero NN experience.

    The whole robot car thing is fucking insane!

    • (Score: 2) by darkfeline on Thursday July 26 2018, @07:19PM (3 children)

      by darkfeline (1030) on Thursday July 26 2018, @07:19PM (#713321) Homepage

      > I'm not sure that Google can actually pull it off because I'm not sure anyone can actually pull it off.

      Depends on what you mean by "pull it off", but Waymo has been serving public riders for almost half a year now. If you mean "can self-drive through a blizzard, tornado, or thunderstorm on a country road", then I agree that I don't think that will be possible (not that humans could do better), but if you mean "can be successfully commercialized for a large proportion of use cases", then the horse has already left the barn and won the race.

      --
      Join the SDF Public Access UNIX System today!
      • (Score: 2) by Knowledge Troll on Thursday July 26 2018, @08:36PM (2 children)

        by Knowledge Troll (5948) on Thursday July 26 2018, @08:36PM (#713351) Homepage Journal

        I generally define success as level 5 autonomy, though I can see how correctly operating level 4 autonomy (minus the part where those vehicles are currently killing people) could be commercialized.

        I don't live in an area that would be served well by level 4, so I don't even think in those terms. I think people who live in cities forget that lots of us live in places where there won't be the level of data and hinting needed for a level 4 vehicle to operate correctly.

        • (Score: 2) by darkfeline on Saturday July 28 2018, @03:18AM (1 child)

          by darkfeline (1030) on Saturday July 28 2018, @03:18AM (#713913) Homepage

          I don't think level 5 is well-defined enough for that to be a useful metric. By simple fact of reality, there will always be situations which a self-driving AI is incapable of handling, just like there are situations that human drivers have proven incapable of handling. Even if we assume level 5 is well-defined by an ideal example of a human driver, and we assume that we cannot achieve that via AI, it still wouldn't be a good metric, because it has already been demonstrated that self-driving AI can be made good enough to replace a large number of use cases while providing huge benefits in safety, cost, traffic, and parking. The additional benefit of making that last 1% advancement to level 5 is almost certainly not worth the cost. Thus, it seems questionable to me why that metric even exists.

          I predict that AI will stay at level 4/4.5, making incremental improvements in the number of situations it can handle, until one day everyone just shrugs and agrees that it satisfies this ill-defined level 5 metric. It's not possible to handle all situations; level 5 is just a hand-wavy mark that says "anything a human can handle", with no provisions about the particular human (and human driving skill varies wildly, generally for the worse). Thus, it just boils down to "level 4 but works 99.99...% of the time", which has been demonstrated to be commercially feasible.

          --
          Join the SDF Public Access UNIX System today!
          • (Score: 2) by Knowledge Troll on Saturday July 28 2018, @03:56AM

            by Knowledge Troll (5948) on Saturday July 28 2018, @03:56AM (#713927) Homepage Journal

            and providing huge benefits in safety

            Please cite that. Who has achieved that? Google?

            "level 4 but works 99.99...% of the time", which has been demonstrated to be commercially feasible.

            Very reasonable point and a decent example of "pulling it off." I don't really disagree with that, but I do disagree vehemently that the steering wheel can be taken away. And Google has gone as far as advocating for laws to do that.

            I spent about 10 minutes trying to find a source to cite for that, but I can't. It is from my recollection, but it is not as outlandish as it sounds. Thrun, the guy who ran the team that built Stanley, wanted to solve robot cars because his family suffered a death in a car accident. He made no secret of this in The Great Robot Race from NOVA. After the second Grand Challenge he went to Google to start the car project there, and I believe they had entirely identical viewpoints, at least at that time. It has been 10 years, but I don't think Google changed.