
posted by martyb on Thursday July 26 2018, @08:01AM
from the if-it-walks-like-a-duck,-sinks-like-a-duck,-oh,-wait... dept.

The Los Angeles Times reports:

The duck boat that sank in a Missouri lake last week, killing 17 people, was built based on a design by a self-taught entrepreneur who had no engineering training, according to court records reviewed by the Los Angeles Times.

The designer, entrepreneur Robert McDowell, completed only two years of college and had no background, training or certification in mechanics when he came up with the design for "stretch" duck boats more than two decades ago, according to a lawsuit filed over a roadway disaster in Seattle involving a similar duck boat in 2015.


Original Submission

 
  • (Score: 3, Interesting) by Knowledge Troll on Thursday July 26 2018, @03:05PM (4 children)

    by Knowledge Troll (5948) on Thursday July 26 2018, @03:05PM (#713148) Homepage Journal

    > That said, my impression is that Google will get this right and Tesla won't simply because of money.

    I think you are right that Google has the best chance out of the three organizations I mentioned, though I'm not sure Google can actually pull it off, because I'm not sure anyone can.

    > Google has the resources it needs to develop the product carefully and make damn sure nobody that buys it can sue them for making it unsafe.

    I think this is pretty good analysis, but I'll go a step further and bring someone else into this argument that I left out originally: GM. Google hasn't been sued even once for killing someone, that I know of. GM has been sued for killing people more times than I can count. Out of everyone working on robot cars, I think only the current automobile manufacturers really have an intuitive, organizational-level understanding of what it means to be sued and how unpleasant that is (let's not kid ourselves that people dying is a problem for them unless it hits the news) - and even so, there are still issues with faulty ignition locks and automatic transmission shifters and crap like that.

    Out of the list I originally gave, Google is the only one that hasn't killed someone with their machine. I'm pretty sure GM hasn't either. I personally believe the car manufacturers have the best chance at doing this "right" in terms of safety and working with the driver, because of this history of being sued and a very long history of understanding their customers. Google wants to take the steering wheel away (they are very public about this), but GM wants to give drivers something they will like.

    Google is also the only one with any understanding of AI. I think this has helped them greatly in making up the gap in industry-specific automobile knowledge. I'm not sure what is going to happen with the engineers at GM who have all the car experience but zero NN experience.

    The whole robot car thing is fucking insane!

  • (Score: 2) by darkfeline on Thursday July 26 2018, @07:19PM (3 children)

    by darkfeline (1030) on Thursday July 26 2018, @07:19PM (#713321) Homepage

    > I'm not sure that Google can actually pull it off because I'm not sure anyone can actually pull it off.

    Depends on what you mean by "pull it off", but Waymo has been serving public riders for almost half a year now. If you're talking about "can self-drive in a blizzard, tornado, or thunderstorm on a country road", then I agree that I don't think that will be possible (not that humans could do better), but if you're talking about "can be successfully commercialized for a large proportion of use cases", then the horse has already left the barn and won the race.

    --
    Join the SDF Public Access UNIX System today!
    • (Score: 2) by Knowledge Troll on Thursday July 26 2018, @08:36PM (2 children)

      by Knowledge Troll (5948) on Thursday July 26 2018, @08:36PM (#713351) Homepage Journal

      I generally think of success in terms of level 5 autonomy, though I can see how correctly operating level 4 autonomy (minus the part where those vehicles are currently killing people) could be commercialized.

      I don't live in an area that would be served well by level 4, so I don't even think in those terms. I think people who live in cities forget that a lot of us live in places where there won't be the level of data and hinting needed for a level 4 vehicle to operate correctly.

      • (Score: 2) by darkfeline on Saturday July 28 2018, @03:18AM (1 child)

        by darkfeline (1030) on Saturday July 28 2018, @03:18AM (#713913) Homepage

        I don't think level 5 is well-defined enough for that to be a useful metric. As a simple fact of reality, there will always be situations that a self-driving AI is incapable of handling, just like there are situations that human drivers have proven incapable of handling. Even if we assume level 5 is well defined by an ideal example of a human driver, and we assume that we cannot achieve that via AI, it still wouldn't be a good metric, because it has already been demonstrated that self-driving AI can be made good enough to cover a large number of use cases while providing huge benefits in safety, cost, traffic, and parking. The additional benefit of making that last 1% advancement to level 5 is almost certainly not worth the cost. Thus, it seems questionable to me why that metric even exists.

        I predict that AI will stay at level 4/4.5, making incremental improvements in the number of situations it can handle, until one day everyone just shrugs and agrees that it satisfies this ill-defined level 5 metric. It's not possible to handle all situations; level 5 is just a hand-wavy mark that says "anything a human can handle", with no provision for which particular human (and human driving skill varies wildly, generally for the worse). Thus, it just boils down to "level 4 but works 99.99...% of the time", which has been demonstrated to be commercially feasible.

        --
        Join the SDF Public Access UNIX System today!
        • (Score: 2) by Knowledge Troll on Saturday July 28 2018, @03:56AM

          by Knowledge Troll (5948) on Saturday July 28 2018, @03:56AM (#713927) Homepage Journal

          > and providing huge benefits in safety

          Please cite that. Who has achieved that? Google?

          "level 4 but works 99.99...% of the time", which has been demonstrated to be commercially feasible.

          Very reasonable point and a decent example of "pulling it off." I don't really disagree with that, but I do disagree vehemently that the steering wheel can be taken away. And Google has gone as far as advocating for laws to do exactly that.

          I spent about 10 minutes trying to find a source to cite for that but couldn't. It is from my recollection, but it is not as outlandish as it sounds. Thrun, the guy who ran the team that built Stanley, wanted to solve robot cars because his family suffered a death in a car accident. He made no secret of this in NOVA's The Great Robot Race. After the second Grand Challenge he went to Google to start the car project there, and I believe they had entirely identical viewpoints, at least at that time. It has been 10 years, but I don't think Google has changed.