from the Daneel-Olivaw-will-be-your-driver-today dept.
If they ever hit our roads for real, other drivers need to know exactly what they are:
It will soon become easy for self-driving cars to hide in plain sight. The rooftop lidar sensors that currently mark many of them out are likely to become smaller. Mercedes vehicles with the new, partially automated Drive Pilot system, which carries its lidar sensors behind the car's front grille, are already indistinguishable to the naked eye from ordinary human-operated vehicles.
[...] We could argue that, on principle, humans should know when they are interacting with robots. [...] If self-driving cars on public roads are genuinely being tested, then other road users could be considered subjects in that experiment and should give something like informed consent. Another argument in favor of labeling, this one practical, is that—as with a car operated by a student driver—it is safer to give a wide berth to a vehicle that may not behave like one driven by a well-practiced human.
There are arguments against labeling too. A label could be seen as an abdication of innovators' responsibilities, implying that others should acknowledge and accommodate a self-driving vehicle. And it could be argued that a new label, without a clear shared sense of the technology's limits, would only add confusion to roads that are already replete with distractions.
From a scientific perspective, labels also affect data collection. If a self-driving car is learning to drive and others know this and behave differently, this could taint the data it gathers. [...] "I'm pretty sure that people will challenge them if they are marked by doing really harsh braking in front of a self-driving car or putting themselves in the way," he [Volvo exec] said.
To better understand and manage the deployment of autonomous cars, we need to dispel the myth that computers will drive just like humans, but better. [...]
Until now it has largely been left to self-driving car companies to decide how to advertise themselves. This lack of standardization will create confusion and jeopardize public trust. [...] Clear, standardized labels would be a first step toward acknowledging that we are facing something novel on the road. Even though the technology is still in its infancy, clarity and transparency are well overdue.
Do you now, or do you think you would, behave differently if you are driving in the vicinity of a car that is driving itself? There's at least anecdotal evidence that it happens, and it may be more common than we think.
(Score: 3, Insightful) by MostCynical on Tuesday May 17 2022, @07:56AM
self-driving cars will be the ones obeying the road rules
OR
they will be the ones stopped for no reason in the middle of an intersection.
or both
"I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
(Score: 5, Interesting) by FatPhil on Tuesday May 17 2022, @08:16AM (13 children)
However, I don't object to some indicator that the car is self-driving. If 'L' plates work for learners, and in some places 'P' plates for those in a probationary period, why not have 'R' plates that have to be carried for the first 5000 self-driven miles on the road? If the owners are arrogant enough to find that demeaning, tough titty. Heck, if they want KITT lights instead, I say that's fine. They might think it's actually cool, and not realise that we're laughing at them behind their backs, and because of that we're now laughing even more.
Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
(Score: 3, Insightful) by PiMuNu on Tuesday May 17 2022, @08:21AM (6 children)
> If you change your behaviour because you don't trust bots, you're putting a bot in a situation he's not trained for.
"he"?
The image matching algorithms and Kalman filters should work regardless of the speed at which I am driving.
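The Kalman filters mentioned above estimate another vehicle's state from noisy sensor readings regardless of how that vehicle is driven. As a minimal sketch only (a scalar filter with a random-walk motion model and made-up noise values, not any actual automotive implementation):

```python
def kalman_1d(measurements, meas_var=4.0, process_var=0.5):
    """Minimal scalar Kalman filter smoothing noisy range readings
    to a car ahead. All parameter values here are hypothetical."""
    x, p = measurements[0], 1.0   # state estimate and its variance
    out = []
    for z in measurements:
        p += process_var          # predict: uncertainty grows over time
        k = p / (p + meas_var)    # Kalman gain: trust in new measurement
        x += k * (z - x)          # update with the measurement residual
        p *= (1 - k)              # uncertainty shrinks after the update
        out.append(x)
    return out
```

The point the comment makes holds here: nothing in the filter depends on who is generating the motion being tracked, only on the noise statistics assumed.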
(Score: 3, Funny) by FatPhil on Tuesday May 17 2022, @08:32AM
(Score: 0, Funny) by Anonymous Coward on Tuesday May 17 2022, @10:23AM (4 children)
>> "he"?
If I programmed automobile neural networks, they would identify when passengers were SJWs like you and then drive straight into a wall at very high speeds.
(Score: 3, Touché) by PiMuNu on Tuesday May 17 2022, @10:55AM
WTF?
Not "he", "it". As in "it's" a machine, or "it's" a dumbass computer program written by some beardy hipster delivered from FriendFace.
(Score: 0) by Anonymous Coward on Tuesday May 17 2022, @11:08AM
"I'm not familiar with that address!" [talkingpointz.com]
(Score: 1, Insightful) by Anonymous Coward on Tuesday May 17 2022, @02:35PM (1 child)
"Humans can have a variety of genders"
Rightwinger: RAGE!
"Haha, you called a computer male"
Rightwinger: MURDEROUS RAGE!
Why are rightwingers so damn violent? I thought words couldn't hurt you, but you think someone should be shot for teasing you about calling a computer "he"? Get a grip you violent nutter.
(Score: 4, Insightful) by DannyB on Tuesday May 17 2022, @05:20PM
If you had to listen to Tucker Carlson, you would be violent too.
Enraged!
Secretly wanting to shoot people. Sometimes hinting online about enraged violent attitudes towards others who are different.
You too would be a True BelieverTM ready to overthrow the majority and install an illegitimate minority government.
If you listened to Tucker Carlson.
To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
(Score: 2) by isostatic on Tuesday May 17 2022, @08:26AM (5 children)
The neural net that's controlling the car isn't individual to a car. If there are just 1,000 self-driving cars on the road, they'll hit that 5,000-mile range in 10 minutes.
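The comment's arithmetic checks out under a reasonable assumption (the comment doesn't state an average speed; 30 mph is assumed here):

```python
# Back-of-envelope check of the fleet-mileage claim.
# Assumed: 1,000 cars averaging 30 mph (speed not given in the comment).
cars = 1000
avg_speed_mph = 30
fleet_miles_per_minute = cars * avg_speed_mph / 60   # 500 fleet-miles/min
minutes_to_5000 = 5000 / fleet_miles_per_minute
print(minutes_to_5000)  # 10.0
```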
(Score: 2) by FatPhil on Tuesday May 17 2022, @08:35AM (4 children)
(Score: 1, Touché) by Anonymous Coward on Tuesday May 17 2022, @09:22AM
... which arguably gives them an even better driving experience than your average human driver's first 5000 miles.
(Score: 2) by DannyB on Tuesday May 17 2022, @05:23PM (2 children)
Self driving vehicles may have encountered more situations and novelties than the average driver -- but they are still not equivalent to the human brain.
The most obvious example is Lidar. A human drives using eyesight and possibly hearing. Humans don't need Lidar to make sense of the world. That is the trick self driving needs to figure out.
(Score: 0) by Anonymous Coward on Wednesday May 18 2022, @03:55AM (1 child)
Humans would definitely benefit from lidar if we had the capability. The ability to see through deep shadows at night, through rain at any hour of the day or night, and in all directions with little downtime to swivel around would definitely help. Also, human drivers suffer from a bunch of issues like inattentional blindness and the inability to see behind themselves at the same time as they're looking at the road ahead. Accurately judging the distance of a motorcycle or a grey car can be an issue for humans that wouldn't be an issue for lidar.
(Score: 2) by FatPhil on Wednesday May 18 2022, @07:06PM
(Score: 4, Interesting) by Anonymous Coward on Tuesday May 17 2022, @08:27AM (9 children)
I don't trust human drivers. I would trust fully self-driving cars even less. There's a famous example of attaching stickers to a stop sign so a deep neural network misclassifies it as speed limit 45 mph [arxiv.org]. A big weakness of neural networks is that it's difficult to explain why one produces the results it does, which means it's also going to be difficult to show that it's not prone to unintended behaviors. I'm more trusting of simple systems that detect when a car is braking in front of it and automatically apply the brakes if the driver fails to do so. It's relatively easy to explain how that system works, and so I'm reasonably trusting of it to behave correctly. It's intended as a failsafe, so one would hope that such a car is driven at least as competently as a human would drive with no failsafe systems. But if control is completely handed over to automated systems, I absolutely don't trust that the black boxes controlling the car's behavior will work correctly. If nobody can explain why the AI in the black box behaves the way it does, I don't trust the black box.
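The "simple, explainable failsafe" the commenter prefers can be captured in a few lines. This is a toy time-to-collision rule, not any production AEB system; the threshold and names are made up for illustration:

```python
def should_brake(gap_m, closing_speed_mps, ttc_threshold_s=2.0):
    """Hypothetical automatic-emergency-braking rule: brake when the
    time-to-collision with the car ahead drops below a threshold.

    gap_m: distance to the lead vehicle in metres
    closing_speed_mps: rate at which the gap is shrinking (<= 0 means
    the gap is steady or growing)
    """
    if closing_speed_mps <= 0:
        return False                        # not closing: no action
    time_to_collision = gap_m / closing_speed_mps
    return time_to_collision < ttc_threshold_s
```

Every branch here can be inspected and justified, which is exactly the explainability property the comment says a learned end-to-end driving policy lacks.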
(Score: 5, Insightful) by darkfeline on Tuesday May 17 2022, @08:40AM (3 children)
Are you saying humans don't randomly treat stop signs as if they were 45 mph signs?
>it's difficult to explain why it produces the results it does
I agree, humans are most mysterious creatures.
Join the SDF Public Access UNIX System today!
(Score: 3, Interesting) by garfiejas on Tuesday May 17 2022, @10:35AM
> humans are most mysterious creatures.
Agree 100% - but on the whole fairly predictable - that's one of the reasons for that expensive piece of wetware between our ears - machines on the other hand - especially ones with ML "networks" - not so much - the black boxes are strong with these. Add a few painted stripes/reflectors/metal bits (really confuse those LIDARs) to a road/sign/house/kerb and voilà - random behavior. If they're done right (they're not, on the whole) you can get them to display "default behavior", i.e. pull over, stop or generally *not* accelerate away...
In the recent past we had a spate of people emergency-stopping in front of drivers who were following too close - you hit them - bingo - insurance fraud/claim - they'd get the car repaired at mates' rates whilst claiming top dollar, personal injury, you name it. Now imagine those folks with new toys to play with and "stupid" driverless cars - do they bunch up when someone is close behind? And don't assume it's a single individual doing these things.
As Mr Musk said a few weeks ago, driverless cars *require* general-purpose AI - https://en.wikipedia.org/wiki/Artificial_general_intelligence [wikipedia.org] - to be able to operate in a "human" environment. (Note that Interstates/Motorways are not "human" environments - they're for motor vehicles only - we have laws to keep people/horses etc. off them - and they're pretty much monitored 24x7.) Current driverless cars and people don't mix at the moment ... https://www.theguardian.com/technology/2021/jan/03/peak-hype-driverless-car-revolution-uber-robotaxis-autonomous-vehicle [theguardian.com]
And don't get me started on their current implementation "cos cloud". A comment a few days ago about "mobility" being one of the things that sets us "free" caught my attention - i.e. if we're watched all the time in our cars, and the AI in the car refuses to take you somewhere "cos reasons" (at least my phone can't refuse to go to Wigan) - there's a whole mountain of civil-society rules/obligations that have been deliberately ignored by the tech-bros pushing these things. When only the rich can go where they want, when they want, what happens to equality, equal opportunities, etc.? We are supposed to be making things better - not far worse...
Don't get me wrong - I'm all up for GP-AI-driven cars - when I can tell "Jeeves" to go pick up the 90-year-old mother-in-law and take her to the hospital it will be AWESOME - but it will be my AI - for me, with shared history with no one else except perhaps my family. I may not even be too fussed about the car - it will be changed every few years - but my AI will be mine, not someone else's, selling my data to anyone, tracking me everywhere, improving their AI that they can then charge me more for, with useless bits that should have been done 10 years before (Apple, I'm looking at you)... Sorry, almost dropped into a rant there :-)
(Score: 3, Interesting) by Freeman on Tuesday May 17 2022, @02:06PM (1 child)
No, they're saying that most people would recognize a stop sign that has graffiti on it and stop. Whereas every car with that flaw will treat every stop sign with said graffiti in the same manner. Thus effectively changing the stop sign into a "speed up" or "slow down" or "stay the same speed" sign. Which is, definitively, much worse.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 0) by Anonymous Coward on Wednesday May 18 2022, @03:57AM
Not likely, stop signs are the only signs that have that shape. They have that shape because humans look for the shape first before looking at what it says. A car is going to be looking for the shape and may or may not even bother to read what it says. Now, for rectangular, triangular and circular signs, that may be a risk, but definitely not for stop signs.
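The "shape first, text second" ordering this comment describes can be sketched as a toy lookup. This is illustrative only, not how any real perception stack classifies signs; the shape-to-family mapping reflects common US/European conventions:

```python
def sign_shape(vertex_count):
    """Toy 'shape first' classifier: map a detected polygon's vertex
    count to a sign family, as a human (or a hypothetical vision
    pipeline) might before reading any text on the sign."""
    families = {
        3: "yield/warning",            # triangles
        4: "regulatory/information",   # rectangles and diamonds
        8: "stop",                     # the octagon is reserved for stop
    }
    return families.get(vertex_count, "unknown")
```

Because only stop signs are octagonal, a shape-first pass is hard to fool with graffiti alone, which is the comment's point about rectangular and triangular signs being the riskier cases.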
(Score: 0) by Anonymous Coward on Tuesday May 17 2022, @10:33AM
Neither do I. That's why I chose Linux/BSD instead of Windows.
(Score: 5, Interesting) by SomeGuy on Tuesday May 17 2022, @12:33PM
The way I see it, the only way to make self-driving cars work reliably is to re-think how we treat roads. They cannot be a place for both people and toasters on wheels.
Roads will have to be kept in tip-top physical condition, and road signs will need some kind of authentication mechanism or be pre-programmed into a map. Essentially, roads will have to become railroad tracks.
If this all worked 100.0000000000%, even I'd love a self driving car. But I know full well that this is not realistic.
I'm just waiting for some edge case to kill a bunch of people. Perhaps rain washes out a road surface but the autoputz thinks it is still there and winds up sliding off the "road" down a cliff. Yea, a human might make that mistake too, but 10,000 other self driving cars following it will keep making the same mistake until the error is caught.
Or perhaps a truck spills something on the highway that humans would just drive through, but it sends the autoputzes into a panic lockdown for miles. Police then have to spend days on the phone with unhelpful tech support trying to get things moving again.
I still would love to see how companies think these things should handle badly degraded roads that are common around here where striping is worn down and crisscrossing with old striping showing through, weird combinations of weather, random animals and people in the middle of the road, all kinds of different trash on the road, downed power lines, and so on and so on.
(Score: 0) by Anonymous Coward on Tuesday May 17 2022, @02:03PM (1 child)
And then you have things called maps. So if the car is approaching an intersection and the AI says "I suggest you hit the gas", there is that part of the program with fixed logic where you can look at the map and say: hey, we are approaching an intersection, does the AI have bad data?
So, safe driving is defense in depth. You don't rely on one construct or you will fail. This is actually the reason why semis are now mandated (at least here in the EU) to have crash sensors that will lock their brakes to prevent a disaster from proceeding. It prevents runaway idiots from trying to crush people with them in extreme cases, and stops the truck when the driver has a medical emergency.
So, what's the problem? Require that maps are used for self-driving cars so they don't drive onto a lake with a sign next to it. AI is supposed to be a tool in the toolbox, not the only hammer on one's belt to fix every problem.
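The defense-in-depth idea above is easy to sketch: fixed, auditable logic sits between the learned policy and the actuators and can veto its suggestion against map data. Everything here is hypothetical (the action names, the map encoding, the veto rules):

```python
def sanity_check(ai_action, position, hd_map):
    """Defense-in-depth sketch: veto a learned policy's suggested
    action using fixed logic over pre-surveyed map data.

    ai_action: string suggested by the black-box model
    position:  current map cell, e.g. a (row, col) tuple
    hd_map:    dict mapping positions to known road features
    """
    feature = hd_map.get(position)
    if feature == "water":
        return "stop"                # never drive onto a lake, full stop
    if feature in {"intersection", "crossing"} and ai_action == "accelerate":
        return "hold_speed"          # suspicious suggestion: AI may have bad data
    return ai_action                 # no map-based objection: pass through
```

The veto layer is dumb by design: it cannot drive the car, but every rule in it can be read, tested, and defended, unlike the model it supervises.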
(Score: 2) by DannyB on Tuesday May 17 2022, @05:26PM
The AI is approaching the intersection and hits the gas. Maybe it knows what it is doing according to the rules:
* red means stop
* green means go
* yellow means go extremely fast
(Score: 1, Interesting) by Anonymous Coward on Wednesday May 18 2022, @01:05AM
> I'm more trusting of simple systems that might detect when a car is braking in front of it, and automatically applying the brakes if the driver fails to do so. It's relatively easy to explain how that system works, and so I'm reasonably trusting of it to behave correctly.
Automatic braking has been available for some years now, I guess you haven't been shopping for a new car?
Unfortunately, current systems are not very reliable, for example AAA just ran some tests, report from last week is here: https://newsroom.aaa.com/2022/05/consumer-skepticism-toward-active-driving-features-justified/ [aaa.com]
My cars predate these ADAS features and my plan is to wait until they actually work correctly before I buy a new car with automatic braking, lane keeping (in the rain), etc.
I did the same thing with airbags--the first gen (roughly during the 1990s) were powered to catch a large person and had the nasty habit of hurting smaller people. My first car with airbags was from 2001 and had the 2nd gen system that mostly solved that problem.
Perhaps I know too much to be a first adopter (aka, beta tester)?
(Score: -1, Flamebait) by Anonymous Coward on Tuesday May 17 2022, @10:21AM
A yellow star might work, or perhaps a scarlet 'A' for 'autonomous'.
(Score: 4, Interesting) by Snospar on Tuesday May 17 2022, @10:33AM
The simple answer here is to continue driving well and treat any other road users (including automated or assisted vehicles) with courtesy and respect. If you're spending time and effort trying to decide whether a vehicle is being driven by a human or not then your concentration is not focused on the job in hand: Driving safely.
If you see a vehicle driving erratically give them a wide berth and potentially inform the police. There could be a hundred reasons for poor driving and, at the present time, they are much more likely to be related to the condition of the human driver than some algorithm.
Huge thanks to all the Soylent volunteers without whom this community (and this post) would not be possible.
(Score: 4, Insightful) by Runaway1956 on Tuesday May 17 2022, @11:49AM (3 children)
Defensive driving requires that you assume every vehicle on the road is 'out to get you'. You are scanning traffic all around, and looking for a way out, if all that traffic decides to gang up on you. If you are practicing defensive driving, it's not going to matter to you whether any vehicle is being driven by a reckless teen, a timid granny, an inattentive fool on the cell phone, or an AI. They are all equal, in that they are potential aggressors who want to take the space you occupy. That 18-wheeler hauling a D9 Caterpillar coming at you is equally a hazard, no matter who or what is operating it. You should eyeball it frequently, to make sure it remains in its proper lane, and isn't crowding you. If it crowds you, then YOU must make adjustments. Change lanes, slow down, speed up, drive off of the road if you must - defensive driving says that YOU are responsible for your own safety. There is no good reason to change your driving just because you have identified another driver as being male, female, old, young, black, white, pretty, ugly, or artificial. Always assume the other guy is out to get you, and drive defensively so that he can't get you!
(Score: 2) by JoeMerchant on Tuesday May 17 2022, @12:37PM
Around here, I am sure there is a portion of the driving population (say: 0.01%, or about 50 drivers within a 1000 square mile area - more than double that on a Friday night after they've "had a few") that would actively target and harass robot driven cars.
Were it not for that factor, I would suggest that today's "self driving" cars should carry labeling similar to the "Student Driver" label on driving school cars.
🌻🌻 [google.com]
(Score: 2) by JoeMerchant on Tuesday May 17 2022, @12:40PM
>That 18-wheeler hauling a D9 caterpillar coming at you is equally a hazard
The old Carlin skit about making a risky pass on a two-lane road - "maybe a VW Bug, but not a logging truck!" - has always missed the point: if you hit a VW Bug head-on at 120 mph closing speed, I don't care what you're in - that's going to be expensive and time-consuming at a minimum, and potentially lethal to the occupants of both vehicles.
(Score: 2) by bradley13 on Tuesday May 17 2022, @06:16PM
I think this will happen gradually. Right now, roads rely on people having common sense. Example: near me is a freshly repaved road with zero markings. A bit farther along, they aren't finished, and there is a chaotic mess of orange cones.
All of this could be improved, easily enough, just by having and using standardized construction markers. Which self-driving cars could be programmed to recognize.
As driver assistance systems become more prevalent, we will see changes in this direction. In 10 years or so, things will be much better.
Everyone is somebody else's weirdo.
(Score: 2) by Phoenix666 on Tuesday May 17 2022, @12:57PM (3 children)
Labeling cars that have full self-driving would invite shenanigans from Jehus and other bad drivers. In no time they would figure out how to monkey with your car's AI so that they could gain another two car lengths progress on the highway.
Washington DC delenda est.
(Score: 0) by Anonymous Coward on Tuesday May 17 2022, @02:38PM (2 children)
Jehus? Wtf are Jehus?
(Score: 0) by Anonymous Coward on Tuesday May 17 2022, @07:58PM
Obviously Phoenix has been toking on too much Tucker Carson.
(Score: 0) by Anonymous Coward on Wednesday May 18 2022, @01:04AM
It's Mexican for Yahoos.
(Score: 0) by Anonymous Coward on Tuesday May 17 2022, @01:54PM
Put something inconspicuous on the license tag to indicate that the car has the capability to self drive.
In the more nanny states, this might also come with an annual inspection checking more things.
Alternatively, tractors have big triangles to indicate slow moving vehicle. Self driving could have a big cute, round set of eyes on the front and back ;)
If we are going to put a flashing green light on top of a self-driving car, we also need a color for the guy on the cell phone, etc.
(Score: 3, Informative) by Anonymous Coward on Tuesday May 17 2022, @02:06PM (1 child)
Every horseless carriage shall have a flagman preceding it to alert other road users.
http://www.oceansplasticleanup.com/Politics_Plastics_Oceans_Cleanup/Red_Flag_Act_Locomotive_1865_Cars_Speed_Limits_Man_Running_Carrying_A.html [oceansplasticleanup.com]
(Score: 0) by Anonymous Coward on Wednesday May 18 2022, @04:01AM
At the time, it made a great deal of sense. Cars were rare, and horses were common and easily spooked. Cars were effectively experimental contraptions that came with risks that weren't entirely understood. It's a bit like the folks climbing on the outside of trains to rob them in the 19th century: there wasn't as good an understanding of how that would work out as there is now.
(Score: 2) by kazzie on Tuesday May 17 2022, @05:06PM
The Vauxhall Corsa has been able to do this for years [youtube.com], according to the TV ads.
(Score: 3, Insightful) by PinkyGigglebrain on Tuesday May 17 2022, @05:30PM
Any fully self-driving vehicle should be required by law to have a reflective, signal-flare-orange paint job so everyone, including pedestrians, can tell at a distance that said vehicle may not be acting like the other vehicles on the road.
I don't mind sharing the road with automated vehicles, but I want to be able to spot them at a distance so I know what to expect and can adjust how I drive appropriately, just like I do with trucks and emergency vehicles, which are easy to spot at range.
"Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."
(Score: 1) by jman on Wednesday May 18 2022, @02:35PM
So long as it wasn't a big yellow star, some identifying mark would not be out of order...