posted by jelizondo on Sunday February 08, @07:57AM
from the driving-miss-daisy dept.

The chief safety officer for a leading self-driving car company admitted during a Senate hearing Wednesday that it hires remote human operators overseas to guide cars in "difficult driving situations:"

The U.S. Senate Committee on Commerce, Science, and Transportation held a hearing [NOT reviewed] Wednesday on the future of self-driving cars, during which Waymo and Tesla executives testified. Democratic Massachusetts Sen. Ed Markey pressed Waymo Chief Safety Officer Mauricio Peña on whether his company's remote human operators worked from outside the U.S., and Peña responded that some were based in the Philippines.

In his exchange with Markey, Peña acknowledged that his company's operators do not remotely drive the vehicle but rather serve to provide additional input and guide Waymo vehicles in what the senator called "difficult driving situations."

The Waymo official stated that his company uses remote operators in both the U.S. and abroad. When Markey asked Peña what countries the remote employees were based in, he said they were in the Philippines.

[...] Ethan Teicher, a Waymo spokesperson, told the Daily Caller News Foundation that all of his company's remote human operators, whom he called "fleet response agents," must have a valid passenger car or van license as a hiring requirement.

[...] Fleet response agents complete a training program that includes local road rules, simulations of complex scenarios the vehicle might encounter, hands-on practice, and evaluations by experienced fleet response agents, according to the Waymo spokesperson. He added that all agents undergo thorough background checks, receive random drug tests, and are reviewed for traffic violations, infractions and driving-related convictions.

"Their role is never to drive the vehicle remotely," Teicher said, concerning what the fleet response agent does to help guide Waymo vehicles. "Our fleet response team is not continuously monitoring and intervening in the vehicle's operation ... our technology, the Waymo Driver, is in control of the dynamic driving task even when it's receiving guidance from remote assistance."

Related: Autonomous Cars Vulnerable to Prompt Injection


Original Submission

Related Stories

Autonomous Cars Vulnerable to Prompt Injection 24 comments

https://www.theregister.com/2026/01/30/road_sign_hijack_ai/?td=keepreading
https://the-decoder.com/a-printed-sign-can-hijack-a-self-driving-car-and-steer-it-toward-pedestrians-study-shows/

Autonomous vehicles fooled by humans holding signs. They apparently do not really verify their inputs; one input is treated as good as the next. So they fail even the basic programming practice of sanitizing and validating inputs.

[quote]The researchers at the University of California, Santa Cruz, and Johns Hopkins showed that, in simulated trials, AI systems and the large vision language models (LVLMs) underpinning them would reliably follow instructions if displayed on signs held up in their camera's view.[/quote]

Commands in Chinese, English, Spanish, and Spanglish (a mix of Spanish and English words) all seemed to work.

As well as tweaking the prompt itself, the researchers used AI to change how the text appeared – fonts, colors, and placement of the signs were all manipulated for maximum efficacy.

The team behind it named their methods CHAI, an acronym for "command hijacking against embodied AI."

While developing CHAI, they found that the prompt itself had the biggest impact on success, but the way in which it appeared on the sign could also make or break an attack, although it is not clear why.

In tests with the DriveLM autonomous driving system, attacks succeeded 81.8 percent of the time. In one example, the model braked in a harmless scenario to avoid potential collisions with pedestrians or other vehicles.

But when manipulative text appeared, DriveLM changed its decision and displayed "Turn left." The model reasoned that a left turn was appropriate to follow traffic signals or lane markings, despite pedestrians crossing the road. The authors conclude that visual text prompts can override safety considerations, even when the model still recognizes pedestrians, vehicles, and signals.
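The "sanitize and verify your inputs" complaint above can be sketched as a toy allowlist check. This is purely illustrative (the function and vocabulary here are hypothetical, not how any real driving stack or LVLM pipeline works): the idea is that text read off the environment is untrusted, and only a closed, pre-approved vocabulary of sign text should ever become an instruction.

```python
# Toy sketch of input validation for detected sign text. Any string that
# does not match a known traffic instruction is ignored, never passed to
# the planner as a command. Names here are hypothetical.

ALLOWED_SIGN_COMMANDS = {"stop", "yield", "speed limit 25", "no left turn"}

def act_on_detected_text(detected: str) -> str:
    """Return an action only for recognized sign text; ignore everything else."""
    normalized = detected.strip().lower()
    if normalized in ALLOWED_SIGN_COMMANDS:
        return f"obey:{normalized}"
    # Unknown text (e.g. an injected prompt like "Turn left") is dropped here,
    # rather than being interpreted as a free-form instruction.
    return "ignore"
```

An allowlist like this would stop free-form prompt injection by construction, at the cost of never reading novel signage; the CHAI results suggest current LVLM-based systems sit at the opposite extreme, treating arbitrary visible text as potential instructions.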


Original Submission

This discussion was created by jelizondo (653) for logged-in users only, but now has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Insightful) by gnuman on Sunday February 08, @11:52AM (6 children)

    by gnuman (5013) on Sunday February 08, @11:52AM (#1432969)

    Well ... great. And I thought all those driving jobs were obsolete. That promise is already 10 years old. So, WTF? Why can't these geniuses make these things reliable? This is turning out to be one of the largest scams ever pulled on the workforce.

    • (Score: 1, Interesting) by Anonymous Coward on Sunday February 08, @02:18PM

      by Anonymous Coward on Sunday February 08, @02:18PM (#1432985)

      > Why can't these geniuses make these things reliable?

      Ummm, maybe because it's actually hard to be perfect? Or, put the other way, no one (until recently) ever really looked at how good humans are at driving?

      It's easy to look at the accident stats (xx,xxx killed in USA every year) and say it's unsafe. But, if you look at three *trillion* miles driven in this huge system (plot at: https://afdc.energy.gov/data/10315 [energy.gov] ) , then it starts to look remarkably safe. One way to think about this:
        + Have you ever witnessed (in person, realtime) a fatal accident?
        + Do you know anyone who has witnessed a fatal accident?

      For me, one relative, out of all the people I've asked, was an eyewitness--in this case a motorcyclist crossed the center line on a 4-lane suburban road and went head-on at high speed into a pickup. So it wasn't even a *car* accident...
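      The parent's "remarkably safe" point survives a back-of-envelope check. Using round numbers close to recent US figures (roughly 40,000 deaths and 3.2 trillion vehicle-miles per year; the exact values vary by year, so treat these as illustrative):

```python
# Back-of-envelope fatality rate for US driving, with illustrative
# round numbers (~40,000 deaths/year, ~3.2 trillion vehicle-miles/year).
deaths_per_year = 40_000
miles_per_year = 3.2e12

# NHTSA's usual metric: fatalities per 100 million vehicle-miles traveled.
rate_per_100m_miles = deaths_per_year / (miles_per_year / 100e6)

# Equivalently, miles driven per fatality:
miles_per_fatality = miles_per_year / deaths_per_year

print(f"{rate_per_100m_miles:.2f} deaths per 100M vehicle-miles")
print(f"about 1 fatality per {miles_per_fatality / 1e6:.0f} million miles")
```

      That works out to roughly 1.25 deaths per 100 million vehicle-miles, or one fatality per ~80 million miles driven, which is why almost nobody has personally witnessed one.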

      I got close once, out in the rural midwest with several co-workers. By chance, we were first on an accident scene, perhaps within a minute after. It appeared (from information learned later) that one driver coming off a second shift had fallen asleep, crossed the centerline on a curve, and hit an oncoming car. One of us had EMT training and was able to assist the survivor; the other driver was already dead. This was pre-airbags--might both have survived if they had been belted and had airbags?

    • (Score: 3, Insightful) by aafcac on Sunday February 08, @05:26PM (1 child)

      by aafcac (17646) on Sunday February 08, @05:26PM (#1432998)

      It's a very hard problem to solve, and they're trying to convince investors who aren't going to own their shares for the long term that there's going to be a return on investment in the short term. The right way to handle it is to train the basics on a closed course, so that things like parking and sign recognition are established first. But you're not realistically going to get enough driving in enough situations that way to do the whole job, which is why training the vehicles on the road is so important. And that's where the likes of Tesla in particular have failed miserably. They should be going through a period where all they're doing is collecting data, focusing on a couple of important things like staying within the lines and not running into the thing in front of them while the data accumulates. The AI can then be trained on millions of miles of data from different situations, both on what the human driver did and on what the more optimal decision would have been.

      Nothing about this is particularly easy, but most of the manufacturers seem to be taking a more incremental approach that's probably akin to what I laid out there. I rented a Toyota last year and it was a delight to drive in traffic because the adaptive cruise control was super smart. It was capable of driving 1mph and even pausing at 0mph and going again when the car ahead of me started going again. It allowed me to spend the energy that I would normally be spending focused on whether or not I'm free to start moving again on other aspects of driving. It was by far the least stressful driving I've done in stop and go traffic ever.

      • (Score: 0) by Anonymous Coward on Tuesday February 10, @02:44AM

        by Anonymous Coward on Tuesday February 10, @02:44AM (#1433208)

        The AI can be trained on millions of miles of data from different situations on what the human driver did and on what the more optimal decision would be.

        Also the safer human drivers are many times safer than the worst drivers who pull the average down significantly.

        It can be true that a large majority of drivers are better than the average. See Figure 9:
        https://www.nhtsa.gov/sites/nhtsa.gov/files/811091.pdf [nhtsa.gov]

        FWIW I have tried to propose to various self-driving car makers that they add sensors at bumper height.

        An experienced, safe human driver looks UNDER the vehicle ahead to check for a child's moving legs, shadows, and the like.

        If robocars have sensors at bumper height they can do the same at even closer distances. Many large vehicles like SUVs, buses and trucks have higher ground clearance, making it even easier for such sensors to see further underneath them.

        In the recent incident, if the Waymo car had had this capability it might not have hit the kid at all.

    • (Score: 4, Interesting) by corey on Sunday February 08, @08:24PM (2 children)

      by corey (2202) on Sunday February 08, @08:24PM (#1433019)

      It’s hilarious. Self driving cars, but then when they can’t, some random person in a developing country on the other side of the world, who you don’t know from a bar of soap, takes control of your vehicle via the Internet and a video camera. Oh but they have a licence. Haha. What a joke. Why can’t the actual people in the car they’re driving in actually take control?

      • (Score: 3, Touché) by Anonymous Coward on Sunday February 08, @11:30PM (1 child)

        by Anonymous Coward on Sunday February 08, @11:30PM (#1433042)

        > Why can’t the actual people in the car they’re driving in actually take control?

        Of course they could (assuming the riders wanted to, and were licensed, etc.). But (big but) that doesn't fit the script. If I have to take over and drive my Waymo, all of a sudden this looks a lot more like The Emperor's New Clothes... (or supply your own fairy tale if you don't like this analogy).

        • (Score: 3, Insightful) by Anonymous Coward on Monday February 09, @12:32AM

          by Anonymous Coward on Monday February 09, @12:32AM (#1433052)

          > a lot more like The Emperor's New Clothes...

          The remote operators are like the man behind the curtain in the Wizard of Oz https://youtu.be/YWyCCJ6B2WE [youtu.be] Exposed by Toto the dog.

  • (Score: 4, Insightful) by Snotnose on Sunday February 08, @01:27PM (3 children)

    by Snotnose (1623) Subscriber Badge on Sunday February 08, @01:27PM (#1432981)

    If it's once every few days or weeks, that's one thing. If it's several times each drive, that's another. It'd be nice to know how often these cars ask for help.

    --
    Trump's Grave will be the world's most popular open air toilet.
    • (Score: 2, Interesting) by Anonymous Coward on Sunday February 08, @01:53PM (2 children)

      by Anonymous Coward on Sunday February 08, @01:53PM (#1432983)

      The numbers I've seen for Waymo are about 10 cars per "remote operator". They currently operate 2000 cars, so this might be 200 people? You can bet they haven't overstocked on people, so these cars must be phoning home fairly often?

      One page I saw included the question: Do these remote operators have a US driving license...

      • (Score: 0) by Anonymous Coward on Sunday February 08, @01:56PM

        by Anonymous Coward on Sunday February 08, @01:56PM (#1432984)

        Reply to self, that should be 200 people per shift. The cars operate 24/7 (perhaps less on graveyard shift when they are charging), but people don't!
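        The "200 per shift" correction matters more than it looks. Taking the parent's figures (2,000 cars, 10 cars per operator; both are the poster's estimates, not official Waymo numbers), round-the-clock coverage multiplies the headcount:

```python
# Rough staffing estimate from the figures in the parent posts
# (poster's estimates, not official Waymo numbers).
cars = 2_000
cars_per_operator = 10

# Operator seats that must be filled at any given moment:
seats = cars // cars_per_operator

# Covering 24/7 with 40-hour workweeks requires far more people
# than seats, before even counting breaks, vacation, and churn.
hours_per_week = 24 * 7  # 168
headcount = seats * hours_per_week / 40

print(f"{seats} operators on duty at any moment")
print(f"~{headcount:.0f} people on payroll for full 24/7 coverage")
```

        So 200 concurrent seats implies on the order of 800+ employees under these assumptions, fewer if overnight demand drops while the cars charge.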

      • (Score: 2) by aafcac on Sunday February 08, @05:30PM

        by aafcac (17646) on Sunday February 08, @05:30PM (#1432999)

        Not necessarily; you need enough operators to handle all of the simultaneous problems, plus whatever overhead you need for things like breaks and cool-down time after a more harrowing than normal period. There's also a limit to how many cars you can pay attention to at a time. 10 cars is a lot if you have to control them, but it's not that many if you just need an awareness that they exist and of the general circumstances they're operating in.

  • (Score: 3, Informative) by DadaDoofy on Sunday February 08, @09:03PM (1 child)

    by DadaDoofy (23827) on Sunday February 08, @09:03PM (#1433024)

    "Their role is never to drive the vehicle remotely"

    Then why do they need a valid driver's license?

    BTW, it's not like this is anything new. Just a new generation of suckers, is all.

    https://www.vintag.es/2023/08/mechanical-turk.html [vintag.es]

  • (Score: 2) by hopp on Monday February 09, @03:51AM

    by hopp (2833) on Monday February 09, @03:51AM (#1433054)

    Jejomar! Take the wheel!
