
posted by FatPhil on Thursday November 09 2017, @04:44PM   Printer-friendly
from the sorry,-bot,-I-didn't-see-you dept.

On day one of its normal operations, a driverless shuttle bus in Las Vegas was involved in a minor crash [Luddites - content is there, but hidden by scripts/stylesheets - Ed.(FP)]. But neither the bus nor its human attendant were at fault:

A driverless shuttle bus was involved in a minor crash with a semi-truck less than two hours after it made its debut on Las Vegas streets Wednesday in front of cameras and celebrities. The human behind the wheel of the truck was at fault, police said. Las Vegas police officer Aden Ocampo-Gomez said the semi-truck's driver was cited for illegal backing. No injuries were reported.

"The shuttle did what it was supposed to do, in that it's (sic) sensors registered the truck and the shuttle stopped to avoid the accident," the city said in a statement. "Unfortunately the delivery truck did not stop and grazed the front fender of the shuttle. Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided."

The oval-shaped shuttle that can transport up to 12 people has an attendant and computer monitor, but no steering wheel and no brake pedals. It uses GPS, electronic curb sensors and other technology to make its way. It was developed by the French company Navya and was tested in January in Las Vegas.

At the unveiling ceremony, officials promoted it as the nation's first self-driving shuttle pilot project geared toward the public. Before it crashed, dozens of people had lined up to get a free trip on a 0.6-mile loop in downtown Las Vegas. City spokesman Jace Radke said the shuttle took two more loops after the crash.

Also at DW, TechCrunch, and ZDNet.

Previously: Self-Driving Shuttle Bus Tested in Las Vegas


Original Submission

Related Stories

Self-Driving Shuttle Bus Tested in Las Vegas 9 comments

A company is testing an autonomous electric shuttle bus on Fremont Street in Las Vegas, and intends to fully deploy shuttles later in the year:

Las Vegas Mayor Carolyn Goodman took a cruise down Fremont Street Tuesday afternoon that made history. Goodman rode in the first completely autonomous, fully electric shuttle to ever be deployed on a public roadway in the United States. The driverless vehicle — called Arma and developed by the Paris-based company Navya — will be making trips down Fremont Street from today to Jan. 20 as developers test the product. "What a wonderful day for all of us to witness this," Goodman said. "Being the control freak that I am, I was very nervous to get on this vehicle, but it is clean, has beautiful air and moves sort of swiftly but so beautifully down Fremont East."

The vehicle holds a dozen passengers and operates safely at up to 27 miles per hour but will be limited to 12 mph during the trial period. [...] While the trial will last only two weeks, Cervantes says the driverless vehicles could be in full effect by late summer to early fall. "It's a matter of fine-tuning the technology to make sure it's safe," Cervantes said. "The last thing we want is for something to happen." The vehicle uses radar to detect and avoid obstacles in the road, as well as GPS technology to navigate the roads. Keolis, a world leader in public passenger transport, has partnered with NAVYA in the endeavor.


Original Submission

  • (Score: 2) by realDonaldTrump on Thursday November 09 2017, @05:05PM (4 children)

    by realDonaldTrump (6614) on Thursday November 09 2017, @05:05PM (#594703) Homepage Journal

    People used to say, "Oh, the computer screwed up, so sorry!" But now they blame the guy, they say the guy screwed up. I say screwed up, you know what I mean. I think you know what I mean.

    • (Score: 0) by Anonymous Coward on Thursday November 09 2017, @05:27PM

      by Anonymous Coward on Thursday November 09 2017, @05:27PM (#594716)

      Aren't you going to yell "You're Fired!" at the guy, or have you retired that particular juvenile catchphrase already?

    • (Score: 3, Funny) by DannyB on Thursday November 09 2017, @05:30PM

      by DannyB (5839) Subscriber Badge on Thursday November 09 2017, @05:30PM (#594718) Journal

      In any collision, there are fine drivers, on many sides. On many sides.

      --
      To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 0, Disagree) by Anonymous Coward on Thursday November 09 2017, @06:01PM (1 child)

      by Anonymous Coward on Thursday November 09 2017, @06:01PM (#594730)

      It's enough to make one think that there are certain interests who may be arranging these "accidents."

      • (Score: 2) by chromas on Thursday November 09 2017, @10:48PM

        by chromas (34) Subscriber Badge on Thursday November 09 2017, @10:48PM (#594897) Journal

        Big Yellow? Taxi drivers evrawhere: "they terk er jerbs!"

        Remember the old adage "if cars were made like computers…"?

  • (Score: 5, Insightful) by DannyB on Thursday November 09 2017, @05:34PM (12 children)

    by DannyB (5839) Subscriber Badge on Thursday November 09 2017, @05:34PM (#594720) Journal

    From the picture in TFA, and the text, I am making some assumptions about how the accident happened.

    “The shuttle did what it was supposed to do, in that it’s (sic) sensors registered the truck and the shuttle stopped to avoid the accident,”

    “Unfortunately the delivery truck did not stop and grazed the front fender of the shuttle. [. . . .]

    So I assume the shuttle stopped, and the truck continued backing and hit the FRONT of the shuttle.

    Maybe the shuttle could be a bit smarter and back up a few more feet if it realizes OMG, that stupid puny ignorant small-brained human descended from ape is still backing up!

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 5, Insightful) by Anonymous Coward on Thursday November 09 2017, @05:41PM (3 children)

      by Anonymous Coward on Thursday November 09 2017, @05:41PM (#594724)

      ...or maybe the shuttle could do what a human driver does? Lay on the horn and shout obscenities until the truck driver notices and stops...

      • (Score: 2) by Thexalon on Thursday November 09 2017, @06:25PM (1 child)

        by Thexalon (636) on Thursday November 09 2017, @06:25PM (#594742)

        That reminds me of a car alarm that instead of making annoying siren sounds, instead played the sound of a scary-sounding man yelling "BACK OFF, BUSTER!" It would be interesting to learn if that was any more effective.

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
        • (Score: 2) by Nerdfest on Friday November 10 2017, @01:15PM

          by Nerdfest (80) on Friday November 10 2017, @01:15PM (#595104)

          I always wanted one that sounded like James Brown.

      • (Score: 2) by krishnoid on Thursday November 09 2017, @11:18PM

        by krishnoid (1156) on Thursday November 09 2017, @11:18PM (#594910)

        That's something we should be able to crowdsource from the online gaming community.

    • (Score: 3, Interesting) by Anonymous Coward on Thursday November 09 2017, @06:04PM (1 child)

      by Anonymous Coward on Thursday November 09 2017, @06:04PM (#594734)

      Well, from what I read this morning, the shuttle doesn't have reverse. Now that is monumentally stupid, certain groups who may be interested in arranging such accidents notwithstanding.

      • (Score: 2) by bob_super on Thursday November 09 2017, @07:39PM

        by bob_super (1357) on Thursday November 09 2017, @07:39PM (#594786)

        > from what I read this morning, the shuttle doesn't have reverse.

        Just turn it around and teach the computer to only drive in reverse. Problem solved!

    • (Score: 5, Insightful) by tftp on Thursday November 09 2017, @06:47PM (1 child)

      by tftp (806) on Thursday November 09 2017, @06:47PM (#594758) Homepage
      Every driver in a similar situation asks themselves: "Does the other guy see me?" This is a very obvious question to ask when you are stopped behind a large trailer, in the blind zone of the driver. You watch him like a hawk, and if he does something dangerous you immediately take action. If the computer ignored all this and sat as a helpless victim, then it may need to be cited as well.
      • (Score: 2) by realDonaldTrump on Friday November 10 2017, @01:02AM

        by realDonaldTrump (6614) on Friday November 10 2017, @01:02AM (#594953) Homepage Journal

        Pigeons are very smart. They say if you put a pigeon in front of a mirror, and he has a smudge of lipstick, he sees it in the mirror. And he cleans himself off. A dog won't do that. A cat won't do it. A gorilla will do it, or a pigeon. Very smart. Instead of the cyber, maybe they can teach pigeons to drive.

    • (Score: 3, Insightful) by tangomargarine on Thursday November 09 2017, @10:13PM (3 children)

      by tangomargarine (667) on Thursday November 09 2017, @10:13PM (#594871)

      Maybe the shuttle could be a bit smarter and back up a few more feet if it realizes OMG, that stupid puny ignorant small-brained human descended from ape is still backing up!

      I mean, that's what pragmatically might make sense, but it's also duplicating the behavior that was causing the problem in the first place. Vehicles in traffic aren't supposed to be backing up, so let's back up ourselves to avoid him?

      --
      "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 1) by tftp on Friday November 10 2017, @12:27AM (2 children)

        by tftp (806) on Friday November 10 2017, @12:27AM (#594933) Homepage

        Vehicles in traffic aren't supposed to be backing up

        They do that quite often after stopping too far into the intersection. It is expected.

        • (Score: 2) by tangomargarine on Friday November 10 2017, @04:16PM (1 child)

          by tangomargarine (667) on Friday November 10 2017, @04:16PM (#595167)

          Maybe where you live. Near me I can probably count the number of times I've seen that in the last 6 months on the fingers of one hand, and have a few fingers left over.

          --
          "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
          • (Score: 0) by Anonymous Coward on Friday November 10 2017, @06:45PM

            by Anonymous Coward on Friday November 10 2017, @06:45PM (#595268)

            Yea, but you're here and as a result fall into the 10 types of people who can count to 1023 on their fingers, barring additional encoding or previous accident.

  • (Score: 4, Insightful) by Anonymous Coward on Thursday November 09 2017, @06:31PM (2 children)

    by Anonymous Coward on Thursday November 09 2017, @06:31PM (#594749)

    Although it seems to be the human driver's fault, I do have to wonder if the accident would have been avoided if another human had been behind the wheel. There are many incidents on the road that I see every day that I honestly don't think modern AI or computers can deal with yet.

    For example, will a self-driving car take a calculated risk, swerving into another lane close to another car (thus risking another accident), because it is being tailgated and some idiot pulled out of a parking lot too close in front of it? The lawful and most appropriate response would be to stop/slow and take the rear-end accident. This would place primary fault on the tailgater (thus human error), and would put no fault on the vehicle or its designers. However, if it were designed to swerve over into the next lane, taking the risk and potentially avoiding an accident, the company behind its design would easily be more legally liable and likely to face a lawsuit in the event of an accident.

    I feel like it's unlikely we will find many issues with self-driving cars during this testing phase, as there are few of them and they can always take a passive/cautious approach, avoiding any issues. However, if they want to fill the roads with these kinds of vehicles, we're going to see all kinds of issues. They will be due to human-generated error and chaos, but human resourcefulness and risk-taking often helps avoid accidents potentially caused by these incidents.

    • (Score: 3, Insightful) by Virindi on Thursday November 09 2017, @08:34PM (1 child)

      by Virindi (3484) on Thursday November 09 2017, @08:34PM (#594818)

      It's not illegal to do what is reasonably necessary in an emergency situation to save lives. The problem is, that is based on a "reasonable man" standard, and a computer by definition cannot make such an evaluation. So, the computer is forced to be conservative.

      Also keep in mind that the company that made the car is likely to have a whole lot more money to go after than the average driver. If Joe Blow does something wrong in traffic, it is pointless to ask for billions from him in court. But some juries have no problem awarding millions to billions against superbigco (not that the huge numbers tend to stand up, but the numbers in general can still be bigger than with individuals).

      • (Score: 3, Insightful) by tonyPick on Friday November 10 2017, @06:13AM

        by tonyPick (1237) on Friday November 10 2017, @06:13AM (#595050) Homepage Journal

        The problem is, that is based on a "reasonable man" standard, and a computer by definition cannot make such an evaluation.

        If it can't make that evaluation then you don't have a self driving car - just driver assist, and it shouldn't be on the road without a human driver in charge to make those decisions.

  • (Score: 2) by All Your Lawn Are Belong To Us on Thursday November 09 2017, @11:42PM (1 child)

    by All Your Lawn Are Belong To Us (6553) on Thursday November 09 2017, @11:42PM (#594919) Journal

    Did it signal its presence? Did it try to signal its presence? Would a reasonable person in the same circumstances have tried to signal their presence? My guess: "No, No, Yes." Which would mean the car bears some measure of liability. There is no stand-your-ground law for cars.

    --
    This sig for rent.
    • (Score: 2, Insightful) by tftp on Friday November 10 2017, @03:19AM

      by tftp (806) on Friday November 10 2017, @03:19AM (#595011) Homepage
      I suspect that if cars of such driving ability show up on the roads, there will be a surge of avoidable incidents just because the computer is too dumb to read the world correctly. One way to avoid that is to wait for a true AI. Another is to replace all cars with computers, at least on public roads. Driving is a highly collective, cooperative endeavor, but there will be an impaired link between humans and computers. Take that delivery truck, for example. A human knows that those trucks make deliveries, and on the back they often carry warnings. A human knows to stay away when trucks start maneuvering. Does the dumb computer know, for example, from what lane long trailers make their turns? A dumb computer knows nothing of that and will cheerfully follow a truck at the minimum legal distance. I am concerned that the human driver was charged. Nothing would have happened if a human driver had operated the shuttle. If in the future only humans are charged for incidents caused by computers misreading the situation, there will be a revolt.
  • (Score: 3, Insightful) by drussell on Friday November 10 2017, @03:53AM

    by drussell (2678) on Friday November 10 2017, @03:53AM (#595017) Journal

    Had the truck had the same sensing equipment that the shuttle has the accident would have been avoided.

    Alternatively, had the autonomous bus had the intelligence of a mediocre-level driver, it would have never strayed into the path of said inattentive truck driver....

    Excelsior!! Future autonomous everything, HAZAA!!

    :facepalm:

  • (Score: 2, Insightful) by MostCynical on Friday November 10 2017, @05:42AM (1 child)

    by MostCynical (2589) on Friday November 10 2017, @05:42AM (#595043) Journal

    Either the people bothering to comment in this thread are techno-luddites, who seem to want men with red flags walking in front of AI vehicles, or they claim There Is Always An Exception Where A Human Would Have Made A Better Choice, then spend a while concocting more and more ludicrous scenarios to justify this position.

    Face it, a human truck driver reversed into another vehicle.
    Many, if not most, drivers would stop, then sit, watching as their car gets crunched. As all the comments show, you are all Superior Drivers, and would swerve, reverse, or Do Something Risky. Great: from "stopped, not liable", you have moved to "liable for a different crash".
    Congratulations, you have traded no deaths for a possibly worse crash, with who knows what added damage, injuries or deaths.

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
    • (Score: 3, Insightful) by Nuke on Friday November 10 2017, @02:39PM

      by Nuke (3162) on Friday November 10 2017, @02:39PM (#595128)

      I have reversed and/or sounded the horn in similar situations. In fact I avoid going anywhere close to a truck which looks like it is maneuvering. I don't consider myself a "superior" driver.

  • (Score: 2) by Nuke on Friday November 10 2017, @02:36PM

    by Nuke (3162) on Friday November 10 2017, @02:36PM (#595127)

    FTFA:

    "The shuttle did what it was supposed to do, in that it's (sic) sensors registered the truck and the shuttle stopped to avoid the accident,"

    They say that as if it was the right thing to do, but it might not have been. It was what a (stupid?) programmer told it to do. Is the programmer always to be assumed to be in the right? Not in my experience of software.
