
posted by martyb on Friday October 15 2021, @04:27PM   Printer-friendly
from the turnabout-is-fair-play? dept.

Dead-End SF Street Plagued With Confused Waymo Cars Trying To Turn Around 'Every 5 Minutes':

SAN FRANCISCO (KPIX 5) — A normally quiet neighborhood in San Francisco is buzzing about a sudden explosion of traffic. Neighbors say their Richmond District dead-end street has suddenly become crowded with Waymo vehicles.

[...] They come all day, right to the end of 15th Avenue, where there’s nothing else to do but make some kind of multi-point turn and head out the way they came in. Not long after that car is gone, there will be another, which will make the same turn and leave, before another car shows up and does the exact same thing. And while there are some pauses, it never really stops.

“There are some days where it can be up to 50,” [resident Jennifer] King says of the Waymo count. “It’s literally every five minutes. And we’re all working from home, so this is what we hear.”

At several points this Tuesday, they showed up on top of each other. The cars, packed with technology, stop in a queue as if they are completely baffled by the dead end. While some neighbors say it is becoming a bit of a nuisance, everyone finds it a little bizarre.

[...] In an emailed statement, a Waymo spokesperson said, “We continually adjust to dynamic San Francisco road rules. In this case, cars traveling North of California on 15th Ave have to take a u-turn due to the presence of Slow Streets signage on Lake. So, the Waymo Driver was obeying the same road rules that any car is required to follow.”


Original Submission

  • (Score: 2) by Tork on Friday October 15 2021, @04:29PM

    by Tork (3914) Subscriber Badge on Friday October 15 2021, @04:29PM (#1187303)
    I've had a few concerns about self-driving cars for a while, but somehow GTA-style glitches were never on my radar until now.
    --
    🏳️‍🌈 Proud Ally 🏳️‍🌈
  • (Score: 0) by Anonymous Coward on Friday October 15 2021, @04:44PM (6 children)

    by Anonymous Coward on Friday October 15 2021, @04:44PM (#1187305)

    If it's a bug in a program, normally the users are irritated by it (sitting at a desk, etc).

    A bug in self-driving code is a _lot_ more public.

    > "...So, the Waymo Driver was obeying the same road rules that any car is required to follow.”

    I don't believe this alibi for a second; the residents haven't noted a big increase in other traffic, just Waymo. And where is the vaunted "intelligence" that is supposed to propagate between all the self-driving cars, so the mistake is only made once (by the first car that finds the problem)?

    How long before one of those neighbors snaps and does something more drastic than going public with their annoyance?

    • (Score: 4, Insightful) by maxwell demon on Friday October 15 2021, @05:00PM (3 children)

      by maxwell demon (1608) on Friday October 15 2021, @05:00PM (#1187313) Journal

      Well, in principle it could indeed be the case that the local authorities created a situation where, formally, the only legal option is to do exactly this, but humans are intelligent enough to recognize that it would be a dumb thing to do and ignore the rule in this case.

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 2, Insightful) by Runaway1956 on Friday October 15 2021, @06:02PM (1 child)

        by Runaway1956 (2926) Subscriber Badge on Friday October 15 2021, @06:02PM (#1187329) Journal

        but humans are intelligent enough to recognize that this would be a dumb thing to do and ignore the rule in this case. /sarcasm

        You forgot your sarcasm tag. An intelligent human being is alert to his surroundings. The average human today seems to be engrossed in his technological toys. Do you really think intelligent people would be texting, watching movies, and working a GPS device while driving? But you see it all the time.

      • (Score: 1, Informative) by Anonymous Coward on Friday October 15 2021, @08:36PM

        by Anonymous Coward on Friday October 15 2021, @08:36PM (#1187375)
        Just put u-turn signs at both ends of a long set of blocks, and "no exit" signs at the intervening intersections, and watch the hilarity ensue as the streets fill up with cars going nowhere fast.
    • (Score: 2) by krishnoid on Saturday October 16 2021, @03:10AM (1 child)

      by krishnoid (1156) on Saturday October 16 2021, @03:10AM (#1187432)

      I'm more surprised that enterprising criminals haven't started waiting for a car to come to the end, blocking it in, stripping the sellable parts off, and repeating for every car that shows up. You'd think that would take care of the problem in a big hurry.

      • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @05:32PM

        by Anonymous Coward on Saturday October 16 2021, @05:32PM (#1187520)

        That is probably because dirty influence has not propagated to this neighborhood yet, and it remains a nice place to live. But don't worry, where there are nice places to live, there are vested interests in selling off the niceness piecemeal until there is nothing left. After we get rezoning, and tax payer funded redevelopment that transfers the property values into the pockets of select few, the new more diverse element will become enterprising enough to try your idea.

  • (Score: 5, Insightful) by SomeGuy on Friday October 15 2021, @04:44PM (6 children)

    by SomeGuy (5632) on Friday October 15 2021, @04:44PM (#1187306)

    To err is human. To really fuck up -and still get praised by idiots- takes an IA.

    Waymo's statement does not say why they are going there in the first place. I'm assuming they are looking for some humans to kill.

    These things are essentially trains on an invisible track. When they decide that track runs through your house, you are screwed.

    Welcome to the future. Today it is just hundreds of cars making a u-turn. Next time it will be thousands off of a cliff. And consumertards will still love it.

    • (Score: 2) by SomeGuy on Friday October 15 2021, @04:49PM (5 children)

      by SomeGuy (5632) on Friday October 15 2021, @04:49PM (#1187308)

      Edit: AI, not IA. (Umm, I had to mistype that or AIs would automatically flag me as hostile toward them. Yea. :P )

      • (Score: 2) by BsAtHome on Friday October 15 2021, @06:35PM (3 children)

        by BsAtHome (889) on Friday October 15 2021, @06:35PM (#1187338)

        No, it is proof that you are an AI and its training model is insufficiently complete.
        Your owner may want to scrape all of SN comments and use it as a new baseline. And with the great depth of the comments on this site, you are encouraged to invest in a very large data center with accompanying power station to improve your language skill ratings.

        • (Score: 0) by Anonymous Coward on Friday October 15 2021, @06:55PM (2 children)

          by Anonymous Coward on Friday October 15 2021, @06:55PM (#1187346)

          Yeah. Humans don't make stupid mistakes. Only computers make stupid mistakes. And you should be afraid. Isn't that the whole point of the article?

          • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @01:57PM (1 child)

            by Anonymous Coward on Saturday October 16 2021, @01:57PM (#1187485)

            um. to err is human.

      • (Score: 2) by hendrikboom on Monday October 18 2021, @03:35AM

        by hendrikboom (1125) Subscriber Badge on Monday October 18 2021, @03:35AM (#1187869) Homepage Journal

        Lol! IA is a vocaloid singer -- completely artificial. Not quite an AI yet, but ...

  • (Score: 4, Funny) by looorg on Friday October 15 2021, @04:46PM (3 children)

    by looorg (578) on Friday October 15 2021, @04:46PM (#1187307)

    If this happens this often, there is clearly some kind of map/grid issue: the system thinks this would be a great shortcut (low traffic and everything) if the road just continued and didn't stop in a dead end. Either the system doesn't plot the entire route to begin with, or doesn't plot far enough ahead, so they end up in this dead end over and over again only to have to U-turn out of it. They could just blacklist this little dead-end street from their maps and be done with it. Someone clearly fucked up the mapping of the streets, otherwise they wouldn't ever turn in there.

    ... or the people that live there should just dig a gigantic hole and watch all the Waymo cars fall into their trap MAD MAX style.
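In routing terms, the blacklist fix suggested above is just dropping an edge from the street graph before planning. A toy sketch of the idea (the street graph, names, and costs here are all invented; this implies nothing about Waymo's actual planner):

```python
import heapq

def shortest_path(graph, start, goal, blocked_edges=frozenset()):
    """Dijkstra over an adjacency list, skipping any blacklisted edges."""
    dist = {start: 0}
    prev = {}
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            # Rebuild the path by walking predecessors back to the start.
            path = [node]
            while node in prev:
                node = prev[node]
                path.append(node)
            return list(reversed(path))
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, cost in graph.get(node, []):
            if (node, nxt) in blocked_edges:
                continue  # edge blacklisted, e.g. a known dead end
            nd = d + cost
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(queue, (nd, nxt))
    return None  # goal unreachable

# Toy grid: A -> B -> D is shortest, but B -> D is the "dead end" edge.
graph = {
    "A": [("B", 1), ("C", 2)],
    "B": [("D", 1)],
    "C": [("D", 2)],
}
print(shortest_path(graph, "A", "D"))                # ['A', 'B', 'D']
print(shortest_path(graph, "A", "D", {("B", "D")}))  # ['A', 'C', 'D']
```

With the edge blacklisted, the planner routes around the problem street without any other change to the map.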

    • (Score: 0) by Anonymous Coward on Friday October 15 2021, @04:58PM

      by Anonymous Coward on Friday October 15 2021, @04:58PM (#1187312)

      > dig a gigantic hole

      Sounds like a heffalump trap to me! (Pooh fell in his own hole.)

      Better to put a traffic sawhorse in front of these AI cars... and another one behind. That should trap it and cause Waymo to wake up.

      https://www.trafficsafetywarehouse.com/A-Frame-Barricade-2-Legs-and-1-I-Beam-Board/productinfo/200-KITS/ [trafficsafetywarehouse.com]
       

    • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @07:18AM (1 child)

      by Anonymous Coward on Saturday October 16 2021, @07:18AM (#1187448)

      Doesn't seem to be a map issue. Thing is, if I got the location right ( https://goo.gl/maps/aFK4RzrpPeehmD3e6 [goo.gl] ), 15th Avenue is a long street, and if you head north there's a junction where (due to Slow Streets or something) cars aren't allowed to go left or right. 15th continues north past the junction for a short stretch and then becomes a dead end for cars (bicycles allowed; more Slow Street stuff?).

      So if a robocar ever happens to turn north onto 15th Avenue, it can only U-turn at the end - there's no out for it.

      Humans who have experienced this would probably U-turn much earlier, but a lot of this AI navigation stuff doesn't like to tell drivers (human or robo) to U-turn on normal roads - it has to be at certain "safe" points OR when forced.

      Of course the real issue is why so many of these cars are turning north onto 15th Avenue instead of using a different road. The only good reason would be if many robocars have destinations on that stretch of 15th Avenue. Which I doubt.

      So most likely the top-level navigation thinks there's a way to the destination via 15th Avenue, but the car-level navigation realizes too late (based on sensors) that there isn't. And there are no smarts to share that info back to the top-level navigation.
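The missing feedback loop described above - on-car sensing that never informs fleet-level routing - can be sketched in a few lines. Everything here (class names, the edge tuple, the return strings) is hypothetical and says nothing about Waymo's real architecture; it only illustrates why sharing one car's discovery would stop the repeats:

```python
class FleetPlanner:
    """Fleet-level route planner that learns about blockages from cars."""

    def __init__(self):
        self.blocked = set()  # edges any car has reported impassable

    def report_blockage(self, edge):
        # One car's discovery becomes every car's knowledge.
        self.blocked.add(edge)

    def is_routable(self, edge):
        return edge not in self.blocked


class Car:
    def __init__(self, planner):
        self.planner = planner

    def drive(self, edge):
        if not self.planner.is_routable(edge):
            return "rerouted before entering"
        # In this toy, sensors discover the closure only on arrival,
        # but the car then reports it back up.
        self.planner.report_blockage(edge)
        return "dead end, U-turn"


planner = FleetPlanner()
first, second = Car(planner), Car(planner)
print(first.drive(("15th Ave", "Lake St")))   # dead end, U-turn
print(second.drive(("15th Ave", "Lake St")))  # rerouted before entering
```

Only the first car pays the U-turn; without the `report_blockage` call, every car repeats it - which matches what the residents describe.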

      • (Score: 0) by Anonymous Coward on Sunday October 17 2021, @08:22PM

        by Anonymous Coward on Sunday October 17 2021, @08:22PM (#1187782)

        https://www.sfmta.com/sites/default/files/imce-images/2021/san_francisco_slow_streets_080132021-01.png [sfmta.com]

        Pretty much.

        Lake Street is designated Slow Street, while the one north of that is temporarily closed for traffic.

        Likely Waymo has not gotten their system updated with the temp closure.

        What is weird is the mention of having to do a multi-point U-turn, as according to the map, 15th should end in a cul-de-sac to allow for an easy turnaround. I guess the temp closure included the intersection.

  • (Score: 3, Interesting) by jelizondo on Friday October 15 2021, @05:01PM (13 children)

    by jelizondo (653) Subscriber Badge on Friday October 15 2021, @05:01PM (#1187314) Journal

    Not only Waymo - I've had that happen with Google Maps. Near where my daughter lives there is a non-existent street, which Maps always tells me to take! I suppose on some urban planning map the street exists, but in reality it was never built.

    (My daughter lives in another city, so I'm not really familiar with the area and use Maps to get around.)

    • (Score: 2) by looorg on Friday October 15 2021, @05:44PM

      by looorg (578) on Friday October 15 2021, @05:44PM (#1187325)

      That could very well be it in this case, and in yours: there was a planned street that just never became one. Still somewhat weird - planned streets don't tend to end up on the same maps if they were never built. Perhaps there was a street there once that is now gone. Or it was a street they decided to dead-end into some kind of cul-de-sac due to previous traffic issues, or to create a little calm urban area. That, or it's just some kind of map error where some corruption took place and the system thinks that road connects to something else when it just doesn't.

    • (Score: 0) by Anonymous Coward on Friday October 15 2021, @05:58PM (1 child)

      by Anonymous Coward on Friday October 15 2021, @05:58PM (#1187327)

      I can do you one better: I drove past the Google campus in Sunnyvale and Maps told me to take a nonexistent road through/near their campus!

      • (Score: 0) by Anonymous Coward on Friday October 15 2021, @07:32PM

        by Anonymous Coward on Friday October 15 2021, @07:32PM (#1187359)

        They're calling you to the mothership.

    • (Score: 5, Interesting) by Thexalon on Friday October 15 2021, @06:37PM (9 children)

      by Thexalon (636) on Friday October 15 2021, @06:37PM (#1187339)

      One reason this can happen is the map intentionally including trap streets [wikipedia.org] - streets that aren't there in real life or aren't routed as depicted, which map-makers can look for to see if somebody copied their map and sold it as their own work.

      Looking at this specific spot on Google Maps [google.com], it sure looks to me like the problem is that Waymo's mapping software either doesn't know about the "Slow Streets" setup (the blue dashed line), or hasn't yet concluded that such a street is impassable for vehicles for routing purposes, but their sign response software sees it and decides it is impassable. But to see it, it has to drive down 15th Ave, see the sign, and turn around.

      I'm reminded of the 10-20 year period where signs had to go up in places reading "Your GPS is wrong, turn around." At one point, there were GPS's routing drivers across an active rural airport runway until local authorities blocked it off and had some nasty exchanges with the GPS company. Whoops!

      --
      The only thing that stops a bad guy with a compiler is a good guy with a compiler.
      • (Score: 0) by Anonymous Coward on Friday October 15 2021, @07:17PM (3 children)

        by Anonymous Coward on Friday October 15 2021, @07:17PM (#1187353)

        Using Street View, I was able to figure out that 15th is closed to through traffic a block north of Lake. It does not simply end. I don't see any other marking on the map indicating it is not a through street.

        • (Score: 2) by Thexalon on Friday October 15 2021, @07:52PM (2 children)

          by Thexalon (636) on Friday October 15 2021, @07:52PM (#1187365)

          The date on the street view is important, though: This has changed in the last year.

          What's more interesting here is that apparently Waymo hasn't thought to have its cars phone home to say "Hey, there's something blocking this route, use another one if possible for the next 24 hours" and then phone home to check for those kinds of updates when routing. And then if something gets flagged several days in a row, have a human look into updating the map there. That would be useful not only for this situation, but also for things like massive accidents blocking intersections and the like.

          --
          The only thing that stops a bad guy with a compiler is a good guy with a compiler.
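The phone-home scheme proposed above - blockage reports that lapse after 24 hours, plus escalation to a human when the same spot keeps getting flagged - might look roughly like this. All names, thresholds, and the injectable clock are invented for the sketch:

```python
import time

class TemporaryBlocklist:
    """Blockage reports expire after a TTL; an edge flagged on several
    distinct days gets escalated for human map review."""

    TTL = 24 * 3600        # a single report stays active for 24 hours
    REVIEW_THRESHOLD = 3   # distinct days of flags before a human looks

    def __init__(self, clock=time.time):
        self.clock = clock   # injectable so tests can fake the time
        self.expiry = {}     # edge -> timestamp when its report lapses
        self.flag_days = {}  # edge -> set of day numbers it was flagged

    def report(self, edge):
        now = self.clock()
        self.expiry[edge] = now + self.TTL
        self.flag_days.setdefault(edge, set()).add(int(now // 86400))

    def is_blocked(self, edge):
        return self.clock() < self.expiry.get(edge, 0)

    def needs_human_review(self, edge):
        return len(self.flag_days.get(edge, ())) >= self.REVIEW_THRESHOLD


# Simulated clock: one car reports the closure, routing avoids the edge
# for a day, then the report lapses unless cars keep flagging it.
t = [0.0]
bl = TemporaryBlocklist(clock=lambda: t[0])
edge = ("15th Ave", "Lake St")
bl.report(edge)
print(bl.is_blocked(edge))   # True
t[0] = 25 * 3600.0           # 25 hours later, the report has lapsed
print(bl.is_blocked(edge))   # False
```

The TTL handles transient obstructions (accidents, double-parked trucks) automatically, while the distinct-day counter surfaces persistent problems like a Slow Streets closure to a human map editor.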
          • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @07:29AM (1 child)

            by Anonymous Coward on Saturday October 16 2021, @07:29AM (#1187449)
            Companies etc. hype their driving AIs, but their stuff doesn't seem much smarter than ants.
            • (Score: 3, Insightful) by Runaway1956 on Saturday October 16 2021, @04:40PM

              by Runaway1956 (2926) Subscriber Badge on Saturday October 16 2021, @04:40PM (#1187511) Journal

              Please don't denigrate the ants. They outlived the dinosaurs, and will probably outlive us.

      • (Score: 0) by Anonymous Coward on Friday October 15 2021, @09:32PM (2 children)

        by Anonymous Coward on Friday October 15 2021, @09:32PM (#1187388)

        It is not just streets. There are also towns, or lack there are.

        Look for Woolsey, CA. I lived in the "downtown" there for years. - not - town was found in the 1880 but disppeared when Woolsey when bust. Area sold off into 23 acre tracks.

        The main road is Woolsey RD, that connect Woolsey with Fluton. - true - still there today..

        Down the street is Olivet School (Woolsey and Olivet RDs meet) - not. Sold to be house when Piner-Olivet School district was formed and new school was built south of Fulton in early 1960's.

        There is even a road connecting Woolsey RD to Slusser RD. It can still be seen on some online maps. It was "removed" (blocked) when the River RD was pushed through. It was twisty with 3 U-turns in a hundred ft asphalt road as it when down the hill front. The rest of the hill faced was used for fill when River RD was push through. Now looks like a bluff, with a grove of trees at one end.

        So all the facts are true and real, but do not reflex what there today.

             

        • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @08:51AM (1 child)

          by Anonymous Coward on Saturday October 16 2021, @08:51AM (#1187454)
          I tried reading your post, but I literally can't parse it. (" I lived in the "downtown" there for years. - not - town was found in the 1880 but disppeared when Woolsey when bust." WTF?) The interruptions in the middle don't make any sense. It's like you're trying really hard to English but don't actually know it. Like, seriously, capital RD? It's an abbreviation, not an acronym.
          • (Score: 2) by kazzie on Sunday October 17 2021, @08:28AM

            by kazzie (5309) Subscriber Badge on Sunday October 17 2021, @08:28AM (#1187677)

            Those capitals threw me completely, and I assumed they stood for "Rural District" or somesuch.

            Even swapping for "road" doesn't seem to help that much.

      • (Score: 1, Interesting) by Anonymous Coward on Saturday October 16 2021, @06:00AM (1 child)

        by Anonymous Coward on Saturday October 16 2021, @06:00AM (#1187443)

        With everybody having access to satellite pictures and GIS data, is it still reasonable for mapping companies to include trap streets?

        • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @08:54AM

          by Anonymous Coward on Saturday October 16 2021, @08:54AM (#1187455)
          No, they should be sued for damages every time they cost someone time by deliberately misrouting them, or cause an accident due to confusion, sudden U-turns, etc.
  • (Score: 0, Informative) by Anonymous Coward on Friday October 15 2021, @07:00PM (14 children)

    by Anonymous Coward on Friday October 15 2021, @07:00PM (#1187349)

    >> “There are some days where it can be up to 50,” [resident Jennifer] King says of the Waymo count. “It’s literally every five minutes"

    50 x 5 minutes = 250 minute days... or this could just be another woman who literally has trouble with math and/or grammar.

    • (Score: -1, Flamebait) by Anonymous Coward on Friday October 15 2021, @07:34PM

      by Anonymous Coward on Friday October 15 2021, @07:34PM (#1187360)

      Is any more proof needed that women shouldn't work in tech?

    • (Score: 0) by Anonymous Coward on Friday October 15 2021, @08:06PM (12 children)

      by Anonymous Coward on Friday October 15 2021, @08:06PM (#1187367)

      It's entirely plausible that the reporter, Wilson Walker [cbslocal.com], is misquoting her. Or perhaps the editor cocked up. The "up to 50" and "literally every 5 minutes" are in separate quotes so Jennifer King may not have been talking about the same things.

      But yes, that paragraph doesn't really make sense. The editor should have caught it, but nobody pays for newspapers anymore, so...

      • (Score: 1, Insightful) by Anonymous Coward on Friday October 15 2021, @08:54PM (11 children)

        by Anonymous Coward on Friday October 15 2021, @08:54PM (#1187380)

        More likely that "literally every 5 minutes" is just the typical misuse of "literally" and "every 5 minutes" is shorthand for "it happens a lot."

        Of course since it was a woman the misogyny kicks it up a notch. Usually it would just be some bitching about bad science journalism, but SN has an alt-right infestation in case the racism and sexism wasn't enough of a tip off.

        • (Score: 0, Funny) by Anonymous Coward on Friday October 15 2021, @10:58PM (2 children)

          by Anonymous Coward on Friday October 15 2021, @10:58PM (#1187402)

          No one said she was black... or are you just assuming that because of her ignorance?

          • (Score: 2, Informative) by Anonymous Coward on Saturday October 16 2021, @12:16AM (1 child)

            by Anonymous Coward on Saturday October 16 2021, @12:16AM (#1187412)

            Oh man lol. The funny thing is that GP didn't say that she was Black either.

            • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @02:02PM

              by Anonymous Coward on Saturday October 16 2021, @02:02PM (#1187488)

              you fool, this is SF, Jennifer will be asian.

        • (Score: 2, Informative) by janrinok on Saturday October 16 2021, @08:06AM (4 children)

          by janrinok (52) Subscriber Badge on Saturday October 16 2021, @08:06AM (#1187450) Journal

          SN has an alt-right infestation

          This has nothing to do with politics except in your mind. But as you frequently post rubbish such as this I am not surprised.

        • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @08:57AM (2 children)

          by Anonymous Coward on Saturday October 16 2021, @08:57AM (#1187456)
          This. People can't English for shit.
          • (Score: 2) by kazzie on Sunday October 17 2021, @09:16AM

            by kazzie (5309) Subscriber Badge on Sunday October 17 2021, @09:16AM (#1187684)

            Meh, it's all Greek to me.

          • (Score: 0) by Anonymous Coward on Sunday October 17 2021, @05:50PM

            by Anonymous Coward on Sunday October 17 2021, @05:50PM (#1187762)

            The irony is beautiful: conservative twat waffles projecting any time they feel threatened. Janni the alt-right-denying defender, and the cowards that don't want to go on record defending someone that says stuff like "yet another woman who sucks at math and grammar." Happy to keep pointing out the alt-wrong stupidity; bonus points when janni insists there are no bigots on SN cuz he is so old he doesn't even recognize it. Think he sweeps it under as locker room talk, or boys will be boys?

  • (Score: 0) by Anonymous Coward on Friday October 15 2021, @11:39PM

    by Anonymous Coward on Friday October 15 2021, @11:39PM (#1187406)

    They must be using google maps.

  • (Score: 0) by Anonymous Coward on Saturday October 16 2021, @02:29PM

    by Anonymous Coward on Saturday October 16 2021, @02:29PM (#1187491)

    The people are complaining because your cars are being a nuisance on their normally quiet, dead-end street. They aren't complaining just because your cars are turning around "like anyone else would."
