The racetrack is the ultimate test of driving skill: managing power, traction, and braking to produce the fastest times. Now the BBC reports that engineers at Stanford University have raced their souped-up Audi TTS, dubbed ‘Shelley’, around the racetrack at speeds above 120 mph. In a timed test against David Vodden, the racetrack CEO and an amateur touring-class champion, the driverless race car was faster by 0.4 seconds. “We’ve been trying to develop cars that perform like the very best human drivers,” says Professor Chris Gerdes, who tested Shelley at Thunderhill Raceway Park in Northern California. “We’ve got to the point of being fairly comparable to an expert driver in terms of our ability to drive around the track.”
To get the cars up to speed, the Stanford team studied drivers, even attaching electrodes to their heads to monitor brain activity in the hope of learning which neural circuits are at work during difficult manoeuvres. The scientists were intrigued to find that during the most complex tasks, the experts used less brain power; they appeared to be acting on instinct and muscle memory rather than exercising judgement the way a computer program would. There is very little difference between the path a professional driver takes around the course and the route charted by Shelley's algorithms, but until now the very best human drivers were still faster around the track, if only by a few seconds. The researchers predict that within the next 15 years, cars will drive with the skill of Michael Schumacher. What remains to be seen is how Shelley will do when running fender to fender with real human race drivers.
(Score: 4, Interesting) by kaszz on Tuesday February 24 2015, @06:24AM
The problem isn't speed, but reliability. Enjoying a 90 km/h ride and having the vehicle computer suddenly decide to swerve to the side because of some corner case isn't nice. That's where humans come into play. They have judgment; computers don't.
Every attempt to make computer programming into a linear art will fail, because its nature isn't that way..!
(Score: 2) by tibman on Tuesday February 24 2015, @06:32AM
The fine summary says that using less judgement in a complex (corner?) case is what humans do. Why can't a computer do the same? Yes, you may abruptly fly into a wall. But if that's what the human driver was going to do too, then I think the system is reliable enough, lol. Obviously the computer gets to learn from the incident, where the normal driver likely would not.
SN won't survive on lurkers alone. Write comments.
(Score: 4, Insightful) by kaszz on Tuesday February 24 2015, @06:40AM
Measuring the thought process of humans is a very imprecise art. The human brain has tons of layers of control algorithms that science has yet to untangle.
(Score: 2) by frojack on Tuesday February 24 2015, @07:37AM
To this, you have to add that little deek that some drivers will use to prevent other drivers from passing, or, if that isn't enough, that little bump in the back stretch.
There is a lot more going on than just getting around the track.
No, you are mistaken. I've always had this sig.
(Score: 2) by kaszz on Tuesday February 24 2015, @03:24PM
What's a "deek"?
(Score: 3, Insightful) by Anonymous Coward on Tuesday February 24 2015, @07:46AM
The situations in which a driverless car can't cope have declined and will continue to decline dramatically before mass-market use. In the meantime, humans will continue to fall asleep behind the wheel, become cognitively "bored" during long trips, drive drunk or high, have wildly varying levels of driving ability, continue to drive while losing vision and cognitive ability to aging, experience cardiac arrest or seizures, be distracted by phones and passengers, and have the same comparatively poor reaction times (hundreds of milliseconds).
These systems have been driven hundreds of thousands of miles, exposed to varying real-world conditions, and are being developed by many competing companies and universities. Even if eventual commercial models fail on some edge cases, they will save many more lives than they destroy. Driverless cars have the reliability edge, not humans.
(Score: 2) by c0lo on Tuesday February 24 2015, @02:17PM
So, your estimate, please: how many deaths away are we from the point at which driverless cars get approved on public roads?
https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
(Score: 2) by quacking duck on Tuesday February 24 2015, @03:01PM
Millions more.
I'm assuming you're talking about deaths due to human drivers, of course.
And the approval process will be a highly emotional, illogical political fight. Humans as a group seem incapable of the higher reasoning that would let them objectively evaluate evidence, odds, and risks/rewards. Witness the anti-vaxxers, blind supporters of any political party, and people who keep gambling and buying lottery tickets.
(Score: 2) by gnuman on Tuesday February 24 2015, @05:12PM
So, your estimate, please: how many deaths away are we from the point at which driverless cars get approved on public roads?
You may want to note that approximately 1,240,000 people died on the roads last year.
http://en.wikipedia.org/wiki/List_of_countries_by_traffic-related_death_rate [wikipedia.org]
36,000 in the USA alone. It's like a 9/11 every month on the roads in the USA, but I guess that's "normal" so no one cares. Instead, everyone just engages in stupid comments about how we should be driving faster, presumably because we don't kill each other fast enough? The acceptability of this as "normal" reminds me of a quote from the Joker in The Dark Knight.
http://www.imdb.com/character/ch0000180/quotes [imdb.com]
Joker: I just did what I do best. I took your little plan and I turned it on itself. Look what I did to this city with a few drums of gas and a couple of bullets. Hmmm? You know... You know what I've noticed? Nobody panics when things go "according to plan." Even if the plan is horrifying! If, tomorrow, I tell the press that, like, a gang banger will get shot, or a truckload of soldiers will be blown up, nobody panics, because it's all "part of the plan". But when I say that one little old mayor will die, well then everyone loses their minds!
So, people killing themselves on roads by the millions, and that's OK. That's "normal" somehow. But if software were to glitch and 100 people a year died on the roads because of software problems (before those edge cases could be fixed), then indeed that would no longer be acceptable??
In the US, you have about a 1% chance of getting killed on the roads over your lifetime. That's more than almost any other non-medical cause. You should not be worried about getting killed by guns in the US, or by terrorists in Pakistan. You should be worried about getting killed on the roads in either country.
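The "9/11 every month" and "1% lifetime chance" figures above check out as back-of-envelope arithmetic. Here is a minimal sketch; the population and life-expectancy numbers are my own assumptions, not from the comment:

```python
# Back-of-envelope check of the road-death figures claimed above.
ANNUAL_ROAD_DEATHS = 36_000    # US road deaths per year (from the comment)
US_POPULATION = 320_000_000    # assumed, roughly the 2015 figure
LIFE_EXPECTANCY_YEARS = 79     # assumed

annual_risk = ANNUAL_ROAD_DEATHS / US_POPULATION
lifetime_risk = annual_risk * LIFE_EXPECTANCY_YEARS  # crude: ignores age structure

monthly_deaths = ANNUAL_ROAD_DEATHS / 12  # compare with ~3,000 killed on 9/11

print(f"annual risk:   {annual_risk:.5%}")
print(f"lifetime risk: {lifetime_risk:.2%}")  # roughly 0.9%, close to the 1% claim
print(f"deaths/month:  {monthly_deaths:.0f}")
```

Treating lifetime risk as annual risk times life expectancy is a simplification (it ignores changes in population and death rates over time), but it is enough to show the 1% claim is the right order of magnitude.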
(Score: 2) by carguy on Tuesday February 24 2015, @08:02PM
There are lots of ways to cut these statistics. Here is another, which breaks out "non-disease" causes of death:
http://www.cdc.gov/injury/images/lc-charts/leading_causes_of_injury_deaths_highlighting_unintentional_injury_2012-a.gif [cdc.gov]
In the USA in 2012, unintentional poisoning killed somewhat more people (across all age groups) than unintentional motor vehicle accidents did.
Unintentional falls were not far behind unintentional motor vehicle deaths. Circa 1970 (before non-skid bathtub mats became common), I heard that those two categories were switched -- more fatal accidents in the home than on the road (sorry, no link to back that up).
Home page for this and many other similar charts:
http://www.cdc.gov/injury/wisqars/leadingcauses_images.html [cdc.gov] (.gif /.jpg formats)
http://www.cdc.gov/injury/wisqars/leadingcauses.html [cdc.gov] (.pdf format)
(Score: 2) by c0lo on Tuesday February 24 2015, @02:14PM
You mean humans have got a taste for riding in driverless cars that brake/veer suddenly?
If so, their play won't last long enough to make for an enjoyable experience.
https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
(Score: 2) by darkfeline on Tuesday February 24 2015, @06:58PM
No, computers have judgement just like people do; it's just a different kind of judgement.
Let me begin by demonstrating a point with facial recognition. Computers have gotten pretty good at facial recognition, but they still make mistakes, labeling some faces as non-faces and some non-faces as faces. Humans, hey, they must be a lot better at facial recognition, right? Except people see faces in non-faces all the time (a face in toast, a face in a mirror, a face on the side of a mountain, faces in clouds, etc.), and I'd imagine there are times when people fail to recognize faces as well (ignoring special cases like prosopagnosia).
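The two error directions described above (faces labeled as non-faces, non-faces labeled as faces) are exactly the false negatives and false positives of a detector. A minimal sketch with hypothetical counts, purely for illustration:

```python
# Confusion-matrix view of the facial-recognition errors described above.
# All counts are hypothetical placeholders, not real benchmark data.
true_positives  = 90   # real faces correctly detected
false_negatives = 10   # real faces labeled as non-faces (misses)
false_positives = 15   # non-faces labeled as faces (face in toast, clouds...)
true_negatives  = 885  # non-faces correctly rejected

precision = true_positives / (true_positives + false_positives)
recall    = true_positives / (true_positives + false_negatives)

print(f"precision: {precision:.3f}")  # how often a detected 'face' is real
print(f"recall:    {recall:.3f}")     # how many real faces are found
```

The point is that humans and machines sit at different spots on this trade-off: pareidolia (seeing faces in toast) is a high-false-positive failure mode, while missing a face is a false negative, and neither kind of judge is free of both.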
Humans are biased; they think only human judgement counts as judgement, or that human judgement is somehow less fallible than machine judgement, or animal judgement, or scientific judgement, or statistical judgement, etc. But that's not true; in some cases human judgement may be superior, but in most cases it's not. We humans are royally bad at judging; it doesn't help that most, if not all, of our decisions are made spontaneously and only justified logically after the fact.
A computer may make an error in driving judgement: it detects a person where there is none, swerving and putting its rider at risk. A human also makes errors in driving judgement: rubbernecking at accidents, new construction, new road signs, or a hot woman; misjudging safe driving speed or current driving conditions; misjudging distances; misjudging other drivers' behavior.
Now let us ask: which makes more errors, resulting in more net cost (damage)?
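The question above is an expected-cost comparison: error rate times cost per error, summed over miles driven. A minimal sketch, where every rate and dollar figure is a hypothetical placeholder (the comment offers no data), just to make the structure of the comparison concrete:

```python
# Illustrative expected-cost comparison of human vs. machine driving errors.
# ALL rates and costs below are hypothetical, for illustration only.
def expected_cost(errors_per_million_miles: float,
                  cost_per_error: float,
                  miles: float) -> float:
    """Expected damage (in dollars) over `miles` driven."""
    return errors_per_million_miles * (miles / 1_000_000) * cost_per_error

MILES = 100_000_000  # fleet miles to compare over (hypothetical)

human   = expected_cost(errors_per_million_miles=5.0,
                        cost_per_error=40_000, miles=MILES)
machine = expected_cost(errors_per_million_miles=1.0,
                        cost_per_error=60_000, miles=MILES)

print(f"human:   ${human:,.0f}")
print(f"machine: ${machine:,.0f}")
```

With these made-up numbers, the machine comes out ahead even though each individual machine error is costlier, because its error rate is lower; the real answer, of course, depends entirely on measuring those rates and costs.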
Join the SDF Public Access UNIX System today!