
posted by Fnord666 on Monday March 19 2018, @10:31PM   Printer-friendly
from the bound-to-happen dept.

Related Stories

Video Released of Fatal Uber - Pedestrian Accident, and More 191 comments

A few Soylentils wrote in to tell us about a fatal accident between a pedestrian and an autonomous Uber vehicle.

Update - Video Released of Fatal Uber - Pedestrian Accident

I debated just replying to the original story, but this seemed a pretty significant update to me:

The Uber vehicle was operating in autonomous mode when it crashed into 49-year-old Elaine Herzberg on Sunday evening. Herzberg was transported to a hospital, where she later died from her injuries, in what may be the first known pedestrian fatality in a self-driving crash.

The video footage does not conclusively show who is at fault. Tempe police initially reported that Herzberg appeared suddenly; however, the video footage seems to show her coming into view a number of seconds before the crash. It also showed the vehicle operator behind the wheel intermittently looking down while the car was driving itself.

The link shows video of the seconds just before the accident.

The pedestrian did not step out in front of the vehicle; she was essentially out in the middle of the road, and all her lateral movement was nearly irrelevant. She might as well have been a stationary object in the middle of the road. You can see the headlights bring her feet into view first (meaning she was pretty much in the lane before the headlights could reach her), then move up her body; she's already in the middle of the road in front of the car when she comes into view.

If I were driving that car, I think I'd have had time to hit the brakes (but not stop in time). I also think that if the camera view is an accurate representation of what was really visible, then the car was overdriving its headlights. Although, given my experience with cameras, I wouldn't be surprised if actual visibility was better than what the video shows.
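"Overdriving its headlights" means the stopping distance exceeds the distance the low beams usefully illuminate. A rough back-of-the-envelope check, where the reaction time, braking deceleration, and headlight reach are all illustrative assumptions rather than measured values:

```python
# Rough stopping-distance estimate to put "overdriving its headlights"
# in numbers. All inputs are illustrative assumptions, not measured values.

MPH_TO_MS = 0.44704  # mph -> m/s

def stopping_distance_m(speed_mph, reaction_s=1.5, decel_ms2=7.0):
    """Reaction distance plus braking distance (v^2 / 2a), in metres."""
    v = speed_mph * MPH_TO_MS
    return v * reaction_s + v * v / (2 * decel_ms2)

for mph in (35, 38):
    print(f"{mph} mph -> {stopping_distance_m(mph):.1f} m to stop")
```

With these assumed figures the car needs roughly 40-46 m to stop; if the usable seeing distance at night is much under that (as the dashcam view suggests it may have been), the car is overdriving its lights.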

This, in my opinion, is pretty damning.

Police Chief: Uber Self-Driving Car "Likely" Not At Fault In Fatal Crash

First Autonomous Car Death Reportedly Caused by Faulty Software 83 comments

The first fatality caused by a machine acting entirely on its own initiative was "Likely Caused By Software Set to Ignore Objects On Road," according to a new report on the collision, which happened last March:

The car's sensors detected the pedestrian, who was crossing the street with a bicycle, but Uber's software decided it didn't need to react right away. That's a result of how the software was tuned. Like other autonomous vehicle systems, Uber's software has the ability to ignore "false positives," or objects in its path that wouldn't actually be a problem for the vehicle, such as a plastic bag floating over a road. In this case, Uber executives believe the company's system was tuned so that it reacted less to such objects. But the tuning went too far, and the car didn't react fast enough, one of these people said.
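The tuning trade-off described above can be sketched as a simple confidence threshold: detections scoring below it are written off as false positives and ignored. The object labels, scores, and threshold values here are hypothetical, purely to illustrate how tuning the threshold "too far" suppresses a real obstacle along with the plastic bags:

```python
# Minimal sketch of false-positive suppression by confidence threshold.
# Detections below the threshold are ignored; set it too high and a
# real obstacle is discarded along with the floating plastic bags.

def objects_to_brake_for(detections, threshold):
    """Keep only detections whose confidence clears the threshold."""
    return [label for label, confidence in detections if confidence >= threshold]

detections = [
    ("plastic bag", 0.20),             # genuine false positive
    ("pedestrian with bicycle", 0.55), # real obstacle, but an unusual shape
]

print(objects_to_brake_for(detections, threshold=0.40))  # conservative tuning
print(objects_to_brake_for(detections, threshold=0.70))  # tuned "too far": nothing
```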

Fast enough? She walked across three and a half lanes in what should have been plain view of the car's LIDAR the entire time.

takyon: Also at Reuters. Older report at The Drive.

Previously: Uber Pulls Self-Driving Cars After First Fatal Crash of Autonomous Vehicle
Video Released of Fatal Uber - Pedestrian Accident, and More


Original Submission #1 | Original Submission #2

Uber Shutting Down Self-Driving Truck Division 5 comments

Uber's controversial self-driving truck division shuts down

Uber is shutting down its self-driving truck program, the company acknowledged on Monday. It's the latest example of Uber scaling back its self-driving technology efforts in the wake of a deadly Uber self-driving car crash in March.

Uber's self-driving truck program has been embroiled in controversy since Uber acquired the unit two years ago. The acquisition price was reportedly $680 million, though the actual cost may have been much less than that. Previously, it had been a startup called Otto, led by controversial ex-Waymo engineer Anthony Levandowski. Waymo sued Uber, arguing that Levandowski had taken Waymo trade secrets with him on the way out the door.

[...] "We've decided to stop development on our self-driving truck program and move forward exclusively with cars," said Eric Meyhofer, the leader of Uber's self-driving technology program, in a statement to The Verge. Personnel from the truck division will be folded into the company's self-driving car efforts.

Previously: Uber Buys Autonomous Truck Startup Otto
The Fall of Uber CEO Travis Kalanick
Uber Pulls Self-Driving Cars After First Fatal Crash of Autonomous Vehicle


Original Submission

An Introduction to the ARMv8-M Architecture 36 comments

Although a wide range of potential applications exists for ARMv8-M processors, developers working on secure real-time applications will certainly see the largest benefit. So far, the ARMv8-M architecture can be found in the Cortex-M23, Cortex-M33, and Cortex-M35P processors. Let’s take a look at the new features included in ARMv8-M and how these processors differ from previous-generation ARMv7-M parts.

[...] The ARMV8-M feature that really sets the M23, M33, and M35P apart is their support for ARM TrustZone. TrustZone is a security extension that provides hardware isolation within the microcontroller so that developers can create secure and unsecure regions. These regions can be locations in RAM, Flash, or even interrupts and peripherals. The separation between secure and unsecure regions creates isolation within the microcontroller, allowing developers to protect mission-critical code and data.

The isolation creates two new modes that the processor can be running in: secure and unsecure. When in secure mode, the executing code can access all memory within both the secure and unsecure zones. However, if the processor is executing in the unsecure zone, only the unsecure regions can be seen. The secure regions are hidden and cannot be executed from the unsecure state without special code being added, which creates a gateway to access a secure call. This makes it possible to use secure functions while hiding what is happening behind the scenes. 

There are several other new features that developers will find interesting besides the TrustZone extension. These include:

  • Simpler MPU setup
  • Flexible breakpoint configuration
  • Improved trace support
  • Instruction set enhancements
  • Dynamic reprioritization of interrupts
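As a rough mental model of the secure/unsecure split described above, the sketch below assigns address ranges a security attribute and rejects accesses to secure memory from the unsecure state unless they go through a registered gateway function (the analogue of ARMv8-M's secure-gateway veneers). The region layout, names, and gateway mechanism are invented for illustration; on real hardware the attribution is enforced by the SAU/IDAU, not by software:

```python
# Toy model of TrustZone-M memory attribution: unsecure code may reach
# secure resources only through an explicit gateway (veneer) function.
# Hardware (SAU/IDAU) enforces this for real; this only illustrates the rule.

SECURE_REGIONS = [(0x1000_0000, 0x1000_FFFF)]  # invented memory layout

def is_secure(addr):
    return any(lo <= addr <= hi for lo, hi in SECURE_REGIONS)

def access(addr, state):
    """state is 'secure' or 'unsecure'; returns True if the access is allowed."""
    if state == "secure":
        return True             # secure code sees both worlds
    return not is_secure(addr)  # unsecure code sees only unsecure memory

def secure_gateway(fn):
    """Entry veneer: callable from unsecure code, body runs in secure state."""
    def entry(*args):
        return fn(*args)  # the state switch would happen here on real hardware
    return entry

@secure_gateway
def read_key_digest():
    assert access(0x1000_0040, "secure")  # secure code may touch the key
    return "digest-of-secret"             # only derived data crosses back

print(access(0x1000_0040, "unsecure"))  # False: direct access is blocked
print(read_key_digest())                # allowed via the gateway
```

This captures the point in the article: the unsecure side can *use* a secure function while the data behind it stays hidden.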

Original Submission

Uber Sells Off Self-Driving and Flying Taxi Units 13 comments

Uber Sells Off Self-Driving and Flying Taxi Units

Uber sells its self-driving unit to Aurora

Uber's self-driving unit, Advanced Technologies Group (ATG), is being acquired by its start-up competitor Aurora Innovation, the companies announced Monday.

The deal, expected to close in the first quarter of 2021, values ATG at approximately $4 billion. The unit was valued at $7.25 billion in April 2019, when SoftBank, Denso and Toyota took a stake.

[...] Uber's co-founder and former CEO Travis Kalanick had viewed self-driving as an essential investment, saying in 2016 he believed the world would shift to autonomous vehicles. ATG had been a long-term play for Uber, but the unit brought high costs and safety challenges. Throughout the course of a pandemic-stricken year, Uber has made efforts to stem losses in its ride hailing business, control business costs -- including with major layoffs in the spring -- and to grow its delivery business.

Uber is also reportedly selling its flying taxi division to Joby Aviation, presumably putting an end to its involvement with the U.S. Army.

Uber has been scaling back its driverless car efforts since it caused the death of a pedestrian in 2018. Uber has never had a profitable quarter.

Also at NYT, Ars Technica, TechCrunch, and The Verge.

Previously: The Fall of Uber CEO Travis Kalanick
Uber Pulls Self-Driving Cars After First Fatal Crash of Autonomous Vehicle
Uber Shutting Down Self-Driving Truck Division
Uber Allegedly Ignored Safety Warnings Before Self-Driving Fatality
Will Car Ownership Soon Become "Quaint"?
Uber Freezes Engineering Hires Amid Mounting Losses

This discussion has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by Knowledge Troll on Monday March 19 2018, @10:37PM (25 children)

    by Knowledge Troll (5948) on Monday March 19 2018, @10:37PM (#655162) Homepage Journal

    I'm positive that the company that is known for pushing the limits, ignoring regulations, and in general being total shits and douche bags will cooperate fully with giving the investigators all the footage from their black boxes that make them look innocent. Uber can be trusted, yessir.

    Nothing can go wrong with Uber doing this at all.

    • (Score: 2) by Fluffeh on Monday March 19 2018, @10:46PM

      by Fluffeh (954) Subscriber Badge on Monday March 19 2018, @10:46PM (#655167) Journal

      The problem is that with the reputation that Uber has, the more it holds back, the more it looks like Uber is just being Uber here - and that's not going to work well in the court system for them. The downside is that they will just keep throwing more litigation at it until everyone gets weary of reading about it.

      I really think that this is one of those cases where the state/federal government should step in and sue on behalf of the family/estate of the deceased - just to make it an even fight in the courts rather than Joe Three-Partner-Practise up against the might of hundreds of Uber Lawyers.

    • (Score: 5, Interesting) by Eristone on Tuesday March 20 2018, @12:01AM (23 children)

      by Eristone (4775) on Tuesday March 20 2018, @12:01AM (#655201)

      There is a preliminary report that the car was not at fault. https://www.sfgate.com/business/article/Exclusive-Tempe-police-chief-says-early-probe-12765481.php?t=11286f4b07 [sfgate.com]

      • (Score: 5, Interesting) by vux984 on Tuesday March 20 2018, @12:46AM (12 children)

        by vux984 (5045) on Tuesday March 20 2018, @12:46AM (#655212)

        That's quite interesting. It's normally *really* hard to prove you aren't at fault when you hit a pedestrian. The claim that they came out of nowhere is usually not terribly convincing. Even in the event the pedestrian comes out from behind a parked truck or something, the police and insurance will usually counter that one was driving too fast for the conditions if there is insufficient time to react to someone stepping into traffic.

The fact that the vehicle was technically speeding, even if only by a few mph, is also often cited as an aggravating factor in the accident, as well as increasing the severity of the injury, and it impacts your liability: not only was the driver going too fast for the conditions, but was going too fast, period.

        • (Score: 2) by MostCynical on Tuesday March 20 2018, @02:09AM

          by MostCynical (2589) on Tuesday March 20 2018, @02:09AM (#655225) Journal

          "too fast for the conditions" includes "too fast to avoid a pedestrian in full suicide mode"
          Driver has a one-to-two tonne vehicle; pedestrian has skin.

          --
          "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex
        • (Score: 3, Informative) by pTamok on Tuesday March 20 2018, @08:02AM (4 children)

          by pTamok (3042) on Tuesday March 20 2018, @08:02AM (#655297)

          According to the reports I've seen, it wasn't 'technically' speeding - it was actually speeding - that is, its speed was greater than the applicable speed limit. Many people take the view that the posted speed limit is advisory, and being a few miles per hour (or kilometres per hour) over the limit is O.K., so long as you don't exceed some imaginary leeway. Such leeway is assumed by drivers in the UK as 10% + 2 mph*, so 35 in a 30 zone is OK, but legally, anything over the posted number is breaking the law, even if it is difficult to prove (radar sensor accuracy and stability since the last calibration is arguable in court).
          I think it is reasonable to require an autonomous vehicle to observe the limits - if you can only measure velocity to a certain accuracy, make sure your error margins don't exceed the posted speed. Car speedometers already do this: they deliberately read over the actual speed, as manufacturers are held to high standards on this point - speedometers are allowed to show higher than the actual speed, but not lower.

          Doing 38 in a 35 zone doesn't seem like much, but it adds 17.9% to the kinetic energy of the vehicle and reduces the reaction time by 7.9% - that's enough to turn an accident that 'merely' injures a pedestrian (or just misses them) into a fatal accident. Someone else in this discussion has posted the pedestrian fatality rates by speed of collision at 20/30/40 mph. It's one reason (of several) why 20mph limits in residential areas are becoming more popular.

          *Some semi-official bodies have publicised this as a recommendation, basically saying that most (not all) police forces in the UK tend to apply that rule, but will apply the limits exactly if it is deemed justified i.e. you have been driving like a pillock and/or had an accident.
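The 17.9% and 7.9% figures above follow directly from the speed ratio - kinetic energy scales with the square of speed, and the time available to cover a fixed distance scales with its inverse. A quick check:

```python
# Check of the 17.9% / 7.9% figures: KE ~ v^2, and for a fixed distance
# ahead, the time available to react scales as 1/v.

v_limit, v_actual = 35.0, 38.0  # mph, from the article

ke_increase = (v_actual / v_limit) ** 2 - 1  # kinetic energy gain
time_reduction = 1 - v_limit / v_actual      # lost reaction time

print(f"kinetic energy: +{ke_increase:.1%}")    # +17.9%
print(f"time to react:  -{time_reduction:.1%}") # -7.9%
```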

          • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @11:16AM (1 child)

            by Anonymous Coward on Tuesday March 20 2018, @11:16AM (#655324)

            > reasonable to require an autonomous vehicle to observe the limits

            Disagree -- it should be a judgement call, and this could be quite difficult to turn into a useful algorithm:

              + Sometimes (as mentioned elsewhere in this discussion), the speed limits are stupidly high. Not all roads near schools have low limits during school hours. Personally, if I'm on a dark street with cars parked on both sides, I may be going slower than the limit. Same for suburban streets at dawn and dusk where there are deer (this is when deer seem to be most likely to move around and cross the road)...deer move a lot faster than pedestrians and pop out of wooded areas very quickly.

              + Sometimes it makes sense to observe the limits, in particular where the limits have been sensibly posted in towns/cities. Much of the time the limits are set by traffic engineers that have a clue (but not everywhere).

              + Other times it makes much more sense (and I believe is generally accepted to be safer) to move with the flow of traffic. One car obeying the limit makes trouble...on a freeway/motorway where everyone else is speeding.

            I believe that "driving at imprudent speed" is a judgement call cops can make just about any time they like. For example, failure to slow down through a temporary construction zone.

            Based on the limited data available at this time, I think Uber has a lot of work left to do.

            • (Score: 2) by vux984 on Tuesday March 20 2018, @08:49PM

              by vux984 (5045) on Tuesday March 20 2018, @08:49PM (#655611)

              "+ Sometimes (as mentioned elsewhere in this discussion), the speed limits are stupidly high."

              One isn't required to drive "stupidly fast" though; and in fact you are legally obligated to reduce your speed where safety requires it. The speed limit is the *maximum* it is legal to go; it's not a requirement.

              In theory if the majority of vehicles end up autonomous and they are programmed to observe the limits then they will effectively dictate the flow.

              Until then though, your note that it is safer to go with the flow is not wrong. But it's also a catch-22; speeding to go with the flow may be safer, but it is still speeding - it's still illegal, and if there is an accident, the damage will have been increased by the speeding factor. There is no real winning move there.

              There are so many such cases when driving, where you are damned if you do, and damned if you don't. Where the legal system is at odds with safety. Where the liability conflicts with safety. Where you can avoid an incident that won't be your fault, but in doing so increases the chances of an incident that will be your fault.

          • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @12:36PM (1 child)

            by Anonymous Coward on Tuesday March 20 2018, @12:36PM (#655340)

             I drive a car with more than 100 HP (which is nothing by American standards, btw). That's enough that if I look away from the speedo for a couple of seconds, the speed will easily climb from just below the speed limit to high enough to get a ticket (2 km/h over the limit).

            Yet, for some strange reason, they expect me to spend more time looking at traffic and pedestrians, than I do looking at the speedo.

            • (Score: 2) by vux984 on Tuesday March 20 2018, @05:53PM

              by vux984 (5045) on Tuesday March 20 2018, @05:53PM (#655521)

              That's just it though. Speedometers read a touch low. So if you are intentionally keeping your speed at 60km/h as shown by the speedo, and then climb to 62km/h by accident you are still ok. The manufacturers deliberately calibrate the speedos to overestimate your speed.

              The standard in the UK, for example, is that a speedo must *never* show less than the actual speed, and must never show more than 110% of actual speed + 6.25mph. So at 100mph, it's legal for your speedo to show anything from 100mph to 116.25mph. And from my (limited) experience most will probably show around 105-110. So if you are driving and intentionally keeping it at 100mph as read by the speedo, you are *really* going mid-90s, and if you drift up to 102 now and again that's fine.

              Plus most speed traps themselves give a small bit of grace; to account for their own potential for error, except they are calibrated 'the other way'. So your speedo is always overestimating your speed (to ensure it never reads lower than you really are going), and police radar is underestimating it to ensure they never give you a ticket when you are within the limit, and the upshot is that if your speedo says 2km/h over the limit, you should have nothing to worry about. Unless your speedo is broken.
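The UK tolerance quoted above (never under-read; never over-read by more than 10% of actual speed + 6.25 mph) is easy to turn into a range check:

```python
# Allowed speedometer indication under the rule quoted above:
# indicated >= actual, and indicated <= 1.10 * actual + 6.25 mph.

def indicated_range(actual_mph):
    """Return (min, max) legal speedometer reading for a given true speed."""
    return actual_mph, 1.10 * actual_mph + 6.25

lo, hi = indicated_range(100.0)
print(f"at a true 100 mph the speedo may legally read {lo:.2f}-{hi:.2f} mph")
```

This reproduces the 100-116.25 mph window from the comment; the same function at 60 mph gives 60-72.25.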

        • (Score: 2) by Knowledge Troll on Tuesday March 20 2018, @08:08AM (1 child)

          by Knowledge Troll (5948) on Tuesday March 20 2018, @08:08AM (#655298) Homepage Journal

          I'm untrusting of the reporting on the topic of autonomous vehicles because there is a really strong narrative being pushed that the robot cars are safer than human drivers. This narrative is being pushed very hard and without justification, because the technology is simply too new. Objective criticism of the robot car's actions is almost impossible - for some reason people don't fault the self-driving car for driving like crap.

          Maybe it's because most people can't drive very well to begin with, I don't know, but criticism of these machines is extremely difficult even when they behave poorly.

          • (Score: 2) by Wootery on Thursday March 22 2018, @10:52AM

            by Wootery (2341) on Thursday March 22 2018, @10:52AM (#656548)

            Agree that there doesn't seem to be any serious analysis of this stuff - or if there is, no-one's talking about it.

            How safe are the autonomous cars really? It doesn't do to just look at fatalities per mile - that ignores not only non-fatal incidents, but also the fact that city driving (which Uber presumably disproportionately favours) isn't the equivalent of high-speed driving.

            If a decent study on this question exists, please point me to it!

        • (Score: 2) by Freeman on Tuesday March 20 2018, @05:31PM (3 children)

          by Freeman (732) on Tuesday March 20 2018, @05:31PM (#655511) Journal

          The pedestrian was pushing a bicycle and apparently wasn't paying enough attention while crossing the road. Accidents happen, it sucks, but you can't stop a big heavy machine on a dime. My 2-year-old knows they shouldn't cross the street without Mommy or Daddy and that you're supposed to look both ways for cars. While going 38 in a 35 is speeding, I wouldn't call it malicious.

          Uber's autonomous car being the first to kill anyone isn't the kind of publicity they need, and it definitely sends up all kinds of red flags, since they have a history of doing unethical and possibly illegal things. I just don't think this is one of those things. There definitely should be a thorough investigation, though.

          --
          Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
          • (Score: 3, Insightful) by vux984 on Tuesday March 20 2018, @11:14PM (2 children)

            by vux984 (5045) on Tuesday March 20 2018, @11:14PM (#655703)

            Take a look at the photo of the site:

            https://jalopnik.com/video-shows-pedestrian-in-fatal-uber-crash-stepped-in-f-1823922228 [jalopnik.com]

            This *seems* to show a fairly open area. A fairly wide median. The bike had white plastic bags on it that would have caught the headlights. The person was walking the bike so not moving terribly fast. I understand that these kinds of accidents happen, but it's kind of hard to believe that an attentive driver would have had zero warning, zero chance to react. And would have failed to at least hit the brakes before striking the pedestrian. Even if the accident couldn't have been avoided, I'm still surprised the brakes weren't even applied; especially by a computer with better reaction times than me.

             I also find the comments that she wasn't at the crosswalk to be ... misleading. Would the vehicle really have stopped if she'd been at the crosswalk? Is Uber running a different algorithm for crosswalks? If she'd been approaching the crosswalk and just walked out into it without looking, would the vehicle have stopped in time then? Crosswalks at best have better lighting, and you aren't allowed to park right adjacent to them, which helps sight lines, but I'm not seeing a sightline issue here, and I'd thought autonomous vehicles could cope with low light levels better than people, so the "shadows" should have been less of a problem.

             I'm also seeing some follow-up that the vehicle didn't have "time to stop", which I don't really find suspicious. What I find worrying is that everything I can see suggests the vehicle really should have had time to at least hit the brakes.

            • (Score: 1) by pTamok on Wednesday March 21 2018, @08:37AM (1 child)

              by pTamok (3042) on Wednesday March 21 2018, @08:37AM (#655991)

              There is a really good Twitter thread about the infrastructure design at the accident site.

              https://twitter.com/EricPaulDennis/status/975891554538852352/photo/1 [twitter.com]

              There are pedestrian footways built on the 'median', exactly where a good engineer would put them, but 'closed off' with signs - though no fences.

              Well worth reading.

              • (Score: 3, Insightful) by vux984 on Wednesday March 21 2018, @10:29PM

                by vux984 (5045) on Wednesday March 21 2018, @10:29PM (#656369)

                Wow, yeah... I mean, it's still clear the pedestrian shouldn't have been there. But talk about setting things up to fail. And I find it ever less convincing that a human driver would ever be found completely faultless for striking a pedestrian walking there without so much as attempting to brake. It's just wide open. The pedestrian was walking. It's not like they were hiding behind a truck and jumped out in front of traffic to try and commit suicide.

                The pedestrian has responsibility too; they aren't innocent but the driver/vehicle should have had some time to react.

                But I just don't see how Uber could get a complete walk on this. Even if they weren't charged for the accident (which I'd understand; lots of accidents end without charges - accidents happen, and I'm not at the point of saying Uber was criminally malicious or negligent), you'd think in terms of insurance they'd have at least some liability.

      • (Score: 1, Informative) by Anonymous Coward on Tuesday March 20 2018, @02:35AM (6 children)

        by Anonymous Coward on Tuesday March 20 2018, @02:35AM (#655235)

        "Traveling at 38 mph in a 35 mph zone on Sunday night, the Uber self-driving car made no attempt to brake."

        • (Score: 2, Interesting) by Anonymous Coward on Tuesday March 20 2018, @06:28AM

          by Anonymous Coward on Tuesday March 20 2018, @06:28AM (#655281)

          "Traveling at 38 mph in a 35 mph zone on Sunday night, the Uber self-driving car made no attempt to brake."

          Which means it's at fault. Period. A stupid human may not be liable, but a computer that is supposed to be reactive to situations like that should have hit the brakes.

        • (Score: 4, Interesting) by Knowledge Troll on Tuesday March 20 2018, @08:13AM (4 children)

          by Knowledge Troll (5948) on Tuesday March 20 2018, @08:13AM (#655299) Homepage Journal

          If that machine can't hold a constant velocity of 34.9 mph in a 35 zone it isn't fit for the purpose of being on the road. That control loop is one of the very first things they would be doing with the control system.

          • (Score: 2) by pendorbound on Tuesday March 20 2018, @03:26PM (3 children)

            by pendorbound (2688) on Tuesday March 20 2018, @03:26PM (#655415) Homepage

            That's unrealistic given the way gasoline automobiles work. It doesn't take much external force to alter the car's velocity a small amount. The nature of the drive train means over & under corrections are inevitable. A slight rise or dip in the terrain or a gust of wind would be plenty to change the car's velocity by a few MPH. The physical system that the computer is controlling (engine & brakes) isn't accurate enough to avoid inevitable overshoot when the computer applies control. You can see that at work with a simple cruise control on human-driven cars.

            The only way you'd get a control system to keep the car exactly at a given speed would be to have it constantly alternate gas & brake to speed up and slow down. That's ruinously bad for mileage and the mechanical lifespan of the car, and would make for a really bumpy ride, but that's the least of the problems. A car constantly braking & accelerating would appear erratic to other drivers, probably resulting in them speeding around it to "get away from" the erratic driver.

            Until/unless 100% of cars become fully autonomous, self-driving cars need to behave in a fashion that's similar to what human drivers expect of other human drivers. Drifting a few MPH over or under the posted limit is normal, and thus it's actually the safest way for an autonomous car to behave. The reactions of human drivers around it are the most likely source of accidents.

            • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @08:03PM

              by Anonymous Coward on Tuesday March 20 2018, @08:03PM (#655600)

              That's unrealistic given the way gasoline automobiles work.

              I don't agree at all with this statement.

              It doesn't take much external force to alter the car's velocity a small amount.

              And if the control loop on those cars runs at under 1kHz I'd be surprised - that means corrections can be applied to any external disturbance within 1/500th of a second. Running at 1kHz is trivial for the velocity control loop if they wanted to.

              The nature of the drive train means over & under corrections are inevitable.

              True but that is actually the nature of any control system. They always go over and under their set point and perform constant adjustments when that happens. They effectively always have error that is always being corrected with an average that is the correct value. This is a standard problem solved all the time.

              A slight rise or dip in the terrain or a gust of wind would be plenty to change the car's velocity by a few MPH.

              The sensors are accurate enough that the deceleration can be measured as soon as the car starts traveling uphill at a fraction of a degree. Within a tiny fraction of a second the throttle will be adjusted and another reading from the sensors is taken.

              The physical system that the computer is controlling (engine & brakes) isn't accurate enough to avoid inevitable overshoot when the computer applies control.

              Control systems always overshoot, this is not realistic to complain about. Perhaps you could say the performance is too high and the car will overshoot too far and not maintain the speed requirements but that's a problem the control loop tuning solves.

              The only way you'd get a control system to keep the car exactly at a given speed would be to have it constantly alternate gas & brake to speed up and slow down.

              Is that how you drive? Jesus christ - you are always on the throttle or brakes? Do you even have a license or drive a car?

              A car constantly braking & accelerating would appear erratic to other drivers, probably resulting in them speeding around it to "get away from" the erratic driver.

              Dude the brakes and the gas pedal are not binary.

              Until/unless 100% of cars become fully autonomous, self-driving cars need to behave in a fashion that's similar to what human drivers expect of other human drivers. Drifting a few MPH over or under the posted limit is normal, and thus it's actually the safest way for an autonomous car to behave. The reactions of human drivers around it are the most likely source of accidents.

              Well moving above the speed limit because it's safer in traffic isn't really a thing to worry about in the middle of the night, now is it?
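The back-and-forth above - overshoot, disturbance rejection, loop rate - is standard closed-loop control. A minimal sketch of a proportional-integral speed controller holding a set point against a constant hill disturbance; the gains, loop rate, and vehicle model are invented for illustration, and a real controller would also handle actuator limits, lag, and gear changes:

```python
# Toy PI cruise control at a 100 Hz loop rate: hold 35.0 mph against a
# constant uphill disturbance. All constants are invented for illustration.

DT = 0.01            # 100 Hz control loop
KP, KI = 0.8, 0.5    # proportional / integral gains
DISTURBANCE = -0.5   # mph/s lost to the grade

def simulate(seconds, target=35.0, start=35.0):
    """Integrate the loop and return the final speed in mph."""
    speed, integral = start, 0.0
    for _ in range(int(seconds / DT)):
        error = target - speed
        integral += error * DT                       # integral term winds up
        throttle_accel = KP * error + KI * integral  # commanded accel, mph/s
        speed += (throttle_accel + DISTURBANCE) * DT
    return speed

print(f"speed after 30 s on the hill: {simulate(30):.2f} mph")
```

The integral term settles at exactly the value needed to cancel the disturbance, so the speed returns to the set point rather than sagging - which is the sense in which holding "34.9 in a 35" is a solved problem, even if the instantaneous speed always wobbles slightly around the target.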

            • (Score: 0) by Anonymous Coward on Wednesday March 21 2018, @10:04PM

              by Anonymous Coward on Wednesday March 21 2018, @10:04PM (#656360)

              While I don't do it very often, I can keep within 1 mph of any speed limit on a reasonably flat, straight road - oftentimes even on curves and differing grades, if I have the patience and attention. A robot has none of these limitations, and having worked on cars from the 90s through the late 2000s, I can tell you almost all of them have both the sensors and the ECU/ABS speedometer accuracy to hold speed to within tenths of a mph across the vehicle's operating range - and that is on MANUAL cars. Automatic cars with ABS, TCS, and throttle and steering by wire should have no problems, under conditions outside of 5mph or 5 percent grade, holding their speed steady at or below the posted speed limit. Furthermore, having had a discussion about this just the other day: most of the modern net-enabled GPS navigation systems have sign-accurate speed limit markers included with their navigation service, to notify you to slow down/speed up on a stretch of road. Given that, there really is no excuse for an autonomous car not to be driving almost exactly the speed limit. The entire point of autonomous vehicles is to provide better safety, reliability, and reproducibility than human drivers can. If they aren't doing that then they fail at autonomous cars.

            • (Score: 0) by Anonymous Coward on Thursday March 22 2018, @12:54AM

              by Anonymous Coward on Thursday March 22 2018, @12:54AM (#656407)

              It's unrealistic to expect the car to be able to maintain 35.00 MPH, but we're talking 38 in a 35 MPH zone here. That's almost 10% off. The simple cruise control on my relatively uninteresting car would have no problem with that, and it doesn't even have control of the brakes, only the throttle. Most human drivers could do a pretty reasonable job at it too, which makes me wonder what it must be like following you around if you can't maintain your speed within 35 +/- 3 MPH.

      • (Score: 2) by rigrig on Tuesday March 20 2018, @08:15AM (2 children)

        by rigrig (5129) <soylentnews@tubul.net> on Tuesday March 20 2018, @08:15AM (#655301) Homepage

        it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,

        Not the same as not being at fault, especially as the car was speeding (a bit), and aren't autonomous vehicles supposed to have a bunch of sensors that don't care about lighting conditions?

        --
        No one remembers the singer.
        • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @12:42PM (1 child)

          by Anonymous Coward on Tuesday March 20 2018, @12:42PM (#655341)

          based on how she came from the shadows right into the roadway

          Wait, wasn't a big part of the argument for why autonomous vehicles will be safer that they are using radar and lasers to locate obstacles, and so won't be affected by shadows...

          • (Score: 2) by pendorbound on Tuesday March 20 2018, @03:31PM

            by pendorbound (2688) on Tuesday March 20 2018, @03:31PM (#655420) Homepage

            Better isn't the same thing as perfect. If you're only willing to accept a completely perfect and 100% safe car, you should probably never leave your home.

  • (Score: 0) by Anonymous Coward on Monday March 19 2018, @10:38PM (5 children)

    by Anonymous Coward on Monday March 19 2018, @10:38PM (#655163)

    Uber's liable to end up in the dumpster a few years before we all expected it to.

    • (Score: 2) by c0lo on Monday March 19 2018, @10:47PM (3 children)

      by c0lo (156) Subscriber Badge on Monday March 19 2018, @10:47PM (#655168) Journal

      But... was the passenger injured? If not, this is not a business-busting incident**. Uber promises you get to the destination in spite of those service-refusing pedestrians, and delivers.
      Always safer to travel with Uber!
      Our latest self-driving light armoured cars deliver the best of both worlds - nimble enough in an urbanized environment, offering unparalleled protection for the passengers.

      ** we'll take care of those pesky regulations in the next few financial quarters

      (GRIN)

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 3, Funny) by bob_super on Monday March 19 2018, @10:57PM (2 children)

        by bob_super (1357) on Monday March 19 2018, @10:57PM (#655175)

        Tired of being slowed down by traffic jams? For only 10x the price of Uber Black, you may now enjoy Uber Tank.

        • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @02:42AM

          by Anonymous Coward on Tuesday March 20 2018, @02:42AM (#655239)

          At this price we'll throw in enough gear* for the 2nd amendment faithful too, satisfaction guaranteed.
          Now you can mow down up to ten schools before the police apprehend you.

          * ammunition not included

        • (Score: 3, Touché) by darkfeline on Tuesday March 20 2018, @06:13PM

          by darkfeline (1030) on Tuesday March 20 2018, @06:13PM (#655531) Homepage

          You mean Uberpanzer? There's nothing like German superiority for crushing the inferior masses underneath.

          --
          Join the SDF Public Access UNIX System today!
    • (Score: 5, Insightful) by srobert on Tuesday March 20 2018, @02:15AM

      by srobert (4803) on Tuesday March 20 2018, @02:15AM (#655227)

      The company basically ignores laws, pushes the costs of operation onto the drivers, unlike a decent cab company that buys cabs and hires drivers, and so on. I feel bad for just about everyone being abused by the gig economy, including Uber's drivers. Call me a social justice warrior, and all that shit if you want, but I don't get rides from Uber. If I can't drive my own car, I'll take the bus or call a cab. I understand that Lyft is a little better, so I might be open to that.

  • (Score: 2) by bob_super on Monday March 19 2018, @10:46PM (29 children)

    by bob_super (1357) on Monday March 19 2018, @10:46PM (#655166)

    The problem with the self-driving industry isn't the tech and the bugs. It's the liability.
    Car manufacturers occasionally get hit with multi-billion-dollar recalls, but they deflect (correctly) liability for most crashes onto the driver.

    With autonomous cars, they are liable, because there's really nobody else to blame (mandatory maintenance will be a problem). And the amounts could easily bankrupt them.

    • (Score: 2) by c0lo on Monday March 19 2018, @10:49PM (22 children)

      by c0lo (156) Subscriber Badge on Monday March 19 2018, @10:49PM (#655170) Journal

      And the amounts could easily bankrupt them.

      Bah... some regulation. Nothing a good lobbying and electoral contributions can't solve.

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by bob_super on Monday March 19 2018, @10:55PM

        by bob_super (1357) on Monday March 19 2018, @10:55PM (#655174)

        Meh. I need to express my pain. Money is speech. You can't tell me I'm limited to a specific amount of pain (multiplied by the class size).

        Silly? It only takes 5 judges agreeing with me.

      • (Score: 3, Interesting) by fyngyrz on Monday March 19 2018, @10:59PM (12 children)

        by fyngyrz (6567) on Monday March 19 2018, @10:59PM (#655176) Journal

        Might also make pedestrians a bit more careful.

        I've been legally driving since 1971, and I couldn't possibly count the number of times someone has walked out into traffic in some unreasonable fashion and required me to stand on my brakes.

        Might even make us build safer roads - pedestrian bridges in towns, anti-wildlife-fenced highways elsewhere, etc. A lot of death and destruction is simply a result of cheaping out on how roads should actually be built. After all, the vehicle can't hit 'em if they aren't there.

        • (Score: 5, Informative) by bob_super on Monday March 19 2018, @11:12PM

          by bob_super (1357) on Monday March 19 2018, @11:12PM (#655180)

          Outside of towns, sure. Inside cities, it might be wiser to remove the cars, or force them to really low and therefore safe speeds. There are way too many places in the US where a 4-lane road, where people go 50mph, is right against houses, shops and schools.

        • (Score: 2) by Nuke on Monday March 19 2018, @11:40PM (2 children)

          by Nuke (3162) on Monday March 19 2018, @11:40PM (#655191)

          The idea of self-driving cars is being pushed on the claim that they are safer than human-driven ones and that no infrastructure changes will be required to accommodate them. But there you go already.

          • (Score: 2) by pendorbound on Tuesday March 20 2018, @03:35PM (1 child)

            by pendorbound (2688) on Tuesday March 20 2018, @03:35PM (#655423) Homepage

            Self-driving cars don't have to be all that safe to be "safer" than human drivers. The accident stats for all self-driving cars to date back that up. Given the number of total miles driven by self-driving cars, human drivers would statistically have caused far more accidents and deaths.

            • (Score: 1, Insightful) by Anonymous Coward on Tuesday March 20 2018, @05:05PM

              by Anonymous Coward on Tuesday March 20 2018, @05:05PM (#655492)

              > human drivers would statistically have caused far more accidents and deaths.

              Sorry, I don't give a rat's ass about the stats for the total driving population. I want the self-driving car to be better, with fewer accidents, than my demographic niche. I don't drink and drive, I'm not a teenager, and I've been to advanced driving schools--I do my racing off the public roads. My strong suspicion is that my demographic is a good order of magnitude safer than the general average. That's the target I want to see before I hand off driving to an AI.

        • (Score: 3, Informative) by MichaelDavidCrawford on Monday March 19 2018, @11:45PM (1 child)

          by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Monday March 19 2018, @11:45PM (#655195) Homepage Journal

          I saw a fenced highway in Banff National Park.

          It had underpasses here and there to enable animals to migrate.

          --
          Yes I Have No Bananas. [gofundme.com]
          • (Score: 2) by fyngyrz on Tuesday March 20 2018, @11:53AM

            by fyngyrz (6567) on Tuesday March 20 2018, @11:53AM (#655334) Journal

            In Indiana, the highway has (or had, it's been some years since I drove it) infrared animal detection and warning systems set up that would signal drivers if animals were near the pavement. I saw it work a couple of times.

            It has always struck me that the calculus for road engineering vs. saving lives – both human and animal – is pretty harsh, most places.

        • (Score: 3, Interesting) by takyon on Tuesday March 20 2018, @02:45AM (3 children)

          by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday March 20 2018, @02:45AM (#655241) Journal

          Might also make pedestrians a bit more careful.

          An autonomous car that isn't in beta (death) testing phase should be far safer than a human driver. LIDAR can be used to see around corners and the computer can react to circumstances in milliseconds rather than hundreds of milliseconds. So I don't see it having an effect on pedestrian behavior. In fact, if pedestrians know that autonomous cars will always halt when a pedestrian gets in the way, they may be motivated to jaywalk more.

          We always knew that people would die from autonomous cars. "Zero fatalities" isn't realistic.
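          To see what the claimed reaction-time advantage means in distance, here is a back-of-envelope calculation (the 7 m/s² deceleration and the 1.5 s vs. 50 ms reaction times are illustrative assumptions, not measured values for the Uber vehicle):

```python
# Back-of-envelope stopping distances: reaction distance plus
# braking distance, d = v*t + v^2 / (2a). The deceleration is a
# rough dry-pavement assumption, not measured data.

def stopping_distance_m(speed_mps, reaction_s, decel_mps2=7.0):
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

v = 38 * 0.44704  # 38 mph in m/s (~17 m/s)
human = stopping_distance_m(v, reaction_s=1.5)      # typical human reaction
computer = stopping_distance_m(v, reaction_s=0.05)  # assumed ~50 ms
print(round(human, 1), round(computer, 1))
```

          At roughly 38 mph the braking distance itself (about 21 m) is the same for both; nearly all of the difference between the two totals is the distance covered while reacting.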

          --
          [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @03:34AM (1 child)

            by Anonymous Coward on Tuesday March 20 2018, @03:34AM (#655255)

            How does lidar see around corners? Bendy light?

            • (Score: 3, Informative) by takyon on Tuesday March 20 2018, @03:43AM

              by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday March 20 2018, @03:43AM (#655261) Journal

              Stanford Researchers Develop Non-Line-of-Sight LIDAR Imaging Procedure [soylentnews.org]

              NLOS [(Non Line Of Sight)] imaging reconstructs the shape and albedo of hidden objects from multiply scattered light. Despite recent advances, NLOS imaging has remained impractical owing to the prohibitive memory and processing requirements of existing reconstruction algorithms, and the extremely weak signal of multiply scattered light. Here we show that a confocal scanning procedure can address these challenges by facilitating the derivation of the light-cone transform to solve the NLOS reconstruction problem. This method requires much smaller computational and memory resources than previous reconstruction methods do and images hidden objects at unprecedented resolution. Confocal scanning also provides a sizeable increase in signal and range when imaging retroreflective objects.

              --
              [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
          • (Score: 2) by HiThere on Tuesday March 20 2018, @05:57PM

            by HiThere (866) Subscriber Badge on Tuesday March 20 2018, @05:57PM (#655523) Journal

            I don't think that "lidar seeing around corners" is realistic outside of a controlled environment. It's one thing for "over the horizon radar" to work, as the sky is a pretty uniform environment, it's another for lidar to see around corners. Yes, they've done it in a controlled environment, but that's not a typical use case.

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
        • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @03:36AM

          by Anonymous Coward on Tuesday March 20 2018, @03:36AM (#655258)

          Might even make us build safer roads - pedestrian bridges in towns and etc.

          Apparently pedestrian bridges do not, in fact, make the roads safer.

        • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @06:25AM

          by Anonymous Coward on Tuesday March 20 2018, @06:25AM (#655280)

          Nice, put it on the pedestrians. Drivers are often inattentive, drive too fast*, and are aggressive - all the while feeling invincible because they are surrounded by tons of steel. The pedestrian is a grease spot waiting to happen. If the pedestrian could kill *you* by their actions, then you would have a right to complain; as it is, it is a certainty that you have come closer to killing a pedestrian while driving than any pedestrian has come to causing your death, unless that pedestrian was packing and shot at you, but that is another thing entirely.

          I walk a lot. I've been almost killed by idiot drivers more times than I can count (cars running red lights, driving up onto the sidewalk while looking at their phones, blowing through intersections while pedestrians are in the crosswalk, etc.). I've taken to carrying rocks to throw at the glass of vehicles that try to kill me, and once happened to have a bag of dog poop in hand at the right moment. Fuck car drivers; they are the problem, along with the idiots who design terrible intersections / too-wide streets / too-high speed limits, making things as unsafe as possible for pedestrians.

          *car - pedestrian collisions:
          40 mph = 85% probability of death to pedestrian
          30 mph = 40% probability of death to pedestrian
          20 mph = 5% probability of death to pedestrian

          There are several 45 mph streets in front of schools in my town. This is fucking insane.
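          The figures above can be turned into a crude linear interpolation to see how steeply risk climbs between the quoted points; this is illustrative arithmetic on the comment's numbers, not a real epidemiological model:

```python
# Linear interpolation over the quoted speed/fatality-risk points.
# The data points come from the comment above; values between them
# are straight-line estimates, not real epidemiology.

DATA = [(20, 0.05), (30, 0.40), (40, 0.85)]  # (mph, P(pedestrian death))

def fatality_risk(speed_mph):
    if speed_mph <= DATA[0][0]:
        return DATA[0][1]
    if speed_mph >= DATA[-1][0]:
        return DATA[-1][1]
    # Find the bracketing pair and interpolate linearly between them.
    for (s0, p0), (s1, p1) in zip(DATA, DATA[1:]):
        if s0 <= speed_mph <= s1:
            t = (speed_mph - s0) / (s1 - s0)
            return p0 + t * (p1 - p0)

print(round(fatality_risk(38), 2))
```

          Interpolated this way, the ~38 mph the Uber vehicle was reportedly doing corresponds to roughly a 76% chance of death for a struck pedestrian; published curves are nonlinear, so treat this strictly as a rough reading of the quoted data.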

      • (Score: 5, Insightful) by AthanasiusKircher on Tuesday March 20 2018, @12:25AM (7 children)

        by AthanasiusKircher (5291) on Tuesday March 20 2018, @12:25AM (#655206) Journal

        Perhaps. Right now it's a bit of a race over whether they improve things fast enough to get widespread adoption before major accidents happen. All it might take is a few serious accidents, and politicians might be loath to touch this issue... and might even regulate the industry enough to set it back a decade or two.

        Just imagine an incident like this involving a school bus collision where some kids die. Won't matter if a human driver would have also made an error. Won't matter if the autonomous car wasn't even primarily at fault.

        People keep saying self-driving cars only need to be better than the average human. Not so. Like any new tech, they may be judged harshly while they are still unfamiliar... And even a couple serious freak accidents could damage the industry.

        • (Score: 4, Insightful) by acid andy on Tuesday March 20 2018, @01:01AM (5 children)

          by acid andy (1683) on Tuesday March 20 2018, @01:01AM (#655218) Homepage Journal

          You're so right.

          People keep saying self-driving cars only need to be better than the average human.

          And those people consist mainly of astroturfing shills and those who've been brainwashed by them. We'll need more details of this accident but imagine how an owner of the autonomous vehicle would feel in such a situation, if it was an accident they felt sure they could have avoided had they been driving instead. They don't just need to be better than average, they at least need to be better than most drivers that want to buy one.

          Also, we need to be careful what we mean by "better than average". It's no good if your self-driving car is better than average for the first five years you use it and then turns out to be lethally dangerous in a rare situation you happen to encounter in the sixth year. For the long term owner's safety, it needs to be at least as good as them in all circumstances.

          --
          If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
          • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @01:14AM (2 children)

            by Anonymous Coward on Tuesday March 20 2018, @01:14AM (#655220)

            They could also allow worse and worse human drivers to get licenses in order to lower the standard.

            • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @02:33AM (1 child)

              by Anonymous Coward on Tuesday March 20 2018, @02:33AM (#655234)

              Hard to lower the standards for getting a driving license much further than they are currently in much of the USA. Other parts of the world have much more stringent tests.

              Same for registering cars--some parts of the world (like Japan & Germany) have very rigorous inspections for used cars more than a few years old. Much of the USA has no annual safety inspection at all, old cars may be driving behind you that have no brakes (for one extreme example).

              • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @02:37AM

                by Anonymous Coward on Tuesday March 20 2018, @02:37AM (#655237)

                Fine, then raise the standards to revoke them. Same thing, except that can get you more votes.

          • (Score: 1, Insightful) by Anonymous Coward on Tuesday March 20 2018, @03:38AM

            by Anonymous Coward on Tuesday March 20 2018, @03:38AM (#655260)

            It needs to be much better, since you are giving up control and your genes have a much reduced influence over what happens. Having a widespread potential lethal danger where the DNA doesn't have much influence on survival is not so good for the species. Maybe in toughness and recovery, but that's it.

            This is why elevators have to be much safer than the stairs and why airplanes, trains and self-driving cars have to be much safer than you driving a car for yourself.

            p.s. that's also why it's great for the species to have young guys do crazy stupid potentially fatal stunts. Guys are far more expendable (you just need the males for a few minutes, you need the females for nine months), so evolution has made the males more keen to push the limits in all sorts of stuff. If some fail and die it's good for the species. If they succeed they might have something to contribute to the gene pool. That's why you see more guys in "fail" and "awesome" videos.
          • (Score: 2) by bob_super on Tuesday March 20 2018, @04:12PM

            by bob_super (1357) on Tuesday March 20 2018, @04:12PM (#655447)

            > to be lethally dangerous in a rare situation you happen to encounter in the sixth year

            Haven't you been paying attention to the companies designing those? They won't get safety updates after three years, five at most.
            "Your autonomous car is no longer supported, have you looked at our latest model?"
            "But, it's still in good shape, I want to keep using it!"
            "You're on your own, maam. We have to disable the self-driving for liability reasons. You may buy our Steering Wheel package for $15000, and you get 50% off the pedals at $5000"

        • (Score: 2) by tonyPick on Tuesday March 20 2018, @06:34AM

          by tonyPick (1237) on Tuesday March 20 2018, @06:34AM (#655283) Homepage Journal

          People keep saying self-driving cars only need to be better than the average human.

          Yeah, only for an actual self-driving car that's "better than the average human" at driving, you need a general-purpose artificial intelligence with the whole "reasoning, planning, learning and knowledge" thing baked in.

          Good Luck With That.

    • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @05:12AM (4 children)

      by Anonymous Coward on Tuesday March 20 2018, @05:12AM (#655272)

      With autonomous cars, they are liable, because there's really nobody else to blame (mandatory maintenance will be a problem). And the amounts could easily bankrupt them.

      The only thing that makes any sense is for the owner of the autonomous vehicle to be liable for any injuries caused by the use of their vehicle.

      Vehicle owners today carry liability insurance to cover exactly this sort of accident. They will continue to do so with autonomous vehicles.

      If the problem turns out to be the fault of the vehicle manufacturer, then owners (or more realistically, their insurance providers) can attempt to recover costs from the manufacturer through the usual means.

      • (Score: 2) by bob_super on Tuesday March 20 2018, @04:20PM (3 children)

        by bob_super (1357) on Tuesday March 20 2018, @04:20PM (#655456)

        > If the problem turns out to be the fault of the vehicle manufacturer

        Hungry lawyers will sue manufacturers for every bump, scratch, and crash.
        While cameras will help dismiss most lawsuits, that's a whole lot of money spent right there.
        Then there will be the crashes actually caused by bugs or weird circumstances, and those will cost a whole lot of cash.

        I honestly don't know how you can have autonomous cars in a heavily litigious society like the US. The "shut up this was made by a government manufacturer" Chinese stand a much better chance.

        • (Score: 2) by maxwell demon on Tuesday March 20 2018, @07:03PM (2 children)

          by maxwell demon (1608) on Tuesday March 20 2018, @07:03PM (#655571) Journal

          A Chinese autonomous car will also have it easier with the decision whether to prefer killing the passenger or the pedestrian. It will just access their social scores and decide to kill the one with the lower score.

          --
          The Tao of math: The numbers you can count are not the real numbers.
          • (Score: 2) by vux984 on Tuesday March 20 2018, @11:18PM (1 child)

            by vux984 (5045) on Tuesday March 20 2018, @11:18PM (#655707)

            " It will just access their social scores and decide to kill the one with the lower score."

            Even if it has to swerve out of the lane and onto the sidewalk to make the kill!

            That's what you get for wearing green mittens that your neighbors reported were an eye sore... and also not the party color!

            • (Score: 2) by bob_super on Wednesday March 21 2018, @07:39PM

              by bob_super (1357) on Wednesday March 21 2018, @07:39PM (#656298)

              But not a real green dress, that's cruel!

    • (Score: 2) by Freeman on Tuesday March 20 2018, @05:34PM

      by Freeman (732) on Tuesday March 20 2018, @05:34PM (#655512) Journal

      With a fully autonomous car that doesn't have a steering wheel, I can see your point. Assuming they market it more like Tesla's Autopilot feature, they won't be held liable.

      --
      Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
  • (Score: 2, Insightful) by cocaine overdose on Monday March 19 2018, @11:17PM

    I, for one, welcome our new automaton overlords. With every new accident, autonomous cars move one step closer to being banned. If Tesla could now allow one of their semis to plow into one of the many private girls-only Catholic schools of children that are overrunning the streets of Manhattan at lunch, then we could finally can these high-capacity assault vehicles. You thought school shootings were bad? Now how about millions of Cloud(TM)-connected vehicles, more complex than the human need to be a blithering asshat at every turn, and imbued with the expected security that comes with the "move fast and break things" attitude. Think of the children! Long live autonomous cars.
  • (Score: 3, Insightful) by MichaelDavidCrawford on Monday March 19 2018, @11:42PM

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Monday March 19 2018, @11:42PM (#655193) Homepage Journal

    I expect many drivers have become completely dependent on Uber for their livelihood.

    As a result they are just barely making it with the aid of food stamps, and will soon lose their jobs completely.

    --
    Yes I Have No Bananas. [gofundme.com]
  • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @12:07AM

    by Anonymous Coward on Tuesday March 20 2018, @12:07AM (#655202)

    The network needs more incidents like this as negative data points to train on. It literally becomes smarter after each additional accident, and the more deadly ones will be given ever larger weights as examples of what *not* to do under any circumstance. Then what happens if somehow all the weights are multiplied by -1 someday? Possibly a single cosmic ray could do it.

  • (Score: 2) by srobert on Tuesday March 20 2018, @02:21AM (4 children)

    by srobert (4803) on Tuesday March 20 2018, @02:21AM (#655229)

    ... but won't stop it. I'm still concerned about the macro-economic impact of driverless cars. In terms of safety, however, this setback will only delay the research. Eventually, driverless cars will be proven safer than their human counterparts. If human drivers want to delay this further, they will need to stop driving drunk, stop driving enraged, pay more attention, and do everything possible to make no mistakes. I've been trying to do that, but even then I've had some accidents, and over the course of several decades some of them have been my fault. Driverless cars are inevitable.

    • (Score: 2) by c0lo on Tuesday March 20 2018, @02:53AM

      by c0lo (156) Subscriber Badge on Tuesday March 20 2018, @02:53AM (#655243) Journal

      Eventually, the driverless cars will be proven safer than their human counterparts.

      I'm an excellent driver [youtube.com]

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 2) by theluggage on Tuesday March 20 2018, @03:57PM (1 child)

      by theluggage (1797) on Tuesday March 20 2018, @03:57PM (#655436)

      Eventually, the driverless cars will be proven safer than their human counterparts.

      [Citation needed]

      Seriously, what's the basis of this claim, apart from optimism and the desire to talk up tech shares? At the moment, there are plenty of narrowly-defined "low-hanging fruit" party tricks like lane-keeping on a freeway or parallel parking that happen to demand the sort of technical precision and consistency that machines are good at, but are low on things like human judgement/problem solving when faced with the unexpected. I believe freeway driving (on roads designed from the ground up for safe driving at high speed with the minimum of obstacles) is one of the safest forms of driving (although when accidents happen they tend to be serious), cf. driving around town and on country roads. Parallel parking is just geometry when you have lidar and know the exact angle of your wheels - an area in which humans are lacking (and I don't think it is the source of many accidents more serious than minor dents and broken lights).

      Fact is, humans are rather good at driving - we're built for it, with binocular vision and an instinctive ability to judge velocities and trajectories - as long as we're not drunk, distracted, half-asleep or just plain stupid. It's a tough challenge for AI.

      Seems to me that self-driving is about where speech recognition was in the 80s-90s - plenty of proof-of-concept, but 30+ years later we're still mostly using keyboards (cue demonic cackle from Alexa). I'm sure it will come, but a quantum leap is required before it's ready to take over from a human who is drunk/distracted/asleep/stupid (or otherwise unfit to be behind the wheel), and that's not a gulf that can be safely crossed via public betas.

      Meanwhile, there are other ways to solve the drunk/distracted/asleep/stupid problem - like better public transport (which, e.g. stays on at weekends, he says having spent last weekend jumping through the hoops necessary to get to and from an alcohol-consumption venue via public transport) - which also makes it easier to avoid handing out driving licenses to people falling in the "stupid" category without effectively putting them under house arrest.
         

      • (Score: 2) by srobert on Tuesday March 20 2018, @11:35PM

        by srobert (4803) on Tuesday March 20 2018, @11:35PM (#655716)

        No citation. Maybe it's just a hunch. I kind of hope you're right, because I don't know what's going to happen to the millions of people who drive to earn a living. Lately technology has been displacing people without much creating an equal number of comparable jobs.

    • (Score: 2) by HiThere on Tuesday March 20 2018, @06:04PM

      by HiThere (866) Subscriber Badge on Tuesday March 20 2018, @06:04PM (#655524) Journal

      There's a fair chance that automated cars are already safer than the average driver. But they aren't safer than average drivers believe themselves to be.

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
  • (Score: 2) by Pslytely Psycho on Tuesday March 20 2018, @03:00AM (1 child)

    by Pslytely Psycho (1218) on Tuesday March 20 2018, @03:00AM (#655247)

    It begins.....

    https://xkcd.com/1656/ [xkcd.com]

    --
    Alex Jones lawyer inspires new TV series: CSI Moron Division.
    • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @06:55AM

      by Anonymous Coward on Tuesday March 20 2018, @06:55AM (#655287)

      An AI could post what you posted. Cheaper, faster and more reliable.
  • (Score: 2) by Kawumpa on Tuesday March 20 2018, @08:43AM (4 children)

    by Kawumpa (1187) on Tuesday March 20 2018, @08:43AM (#655306)

    You know what would be progress for society? No, not self-driving cars, getting people out of cars on to bicycles, cycling on properly designed safe bike lanes. Improving people's fitness and health as well as the environment.

    Yeah, yeah, I know off-topic...

    • (Score: 2) by PiMuNu on Tuesday March 20 2018, @12:53PM (1 child)

      by PiMuNu (3823) on Tuesday March 20 2018, @12:53PM (#655343)

      Self-driving bicycles - best of both worlds.

      • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @05:20PM

        by Anonymous Coward on Tuesday March 20 2018, @05:20PM (#655502)

        Coming soon to the Cornell U campus -- self driving bicycles:
            https://www.youtube.com/watch?v=AryfMOR0iIQ [youtube.com]
        As I heard the story, they will just ride around the campus for fun...stopping back at base for battery charging when needed.

    • (Score: 2) by Freeman on Tuesday March 20 2018, @05:37PM (1 child)

      by Freeman (732) on Tuesday March 20 2018, @05:37PM (#655514) Journal

      Have you seen serious bicycle crashes? You should watch more bicycle racing, then. Admittedly, the average bicycle crash is probably less lethal than your average car crash. I have twisted the frame on a bicycle by going over the handlebars in the rain, though. Didn't do good things to my ribs, either.

      --
      Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
  • (Score: 2) by All Your Lawn Are Belong To Us on Tuesday March 20 2018, @04:41PM (2 children)

    by All Your Lawn Are Belong To Us (6553) on Tuesday March 20 2018, @04:41PM (#655469) Journal

    I'm curious.

    If this were medicine or nursing you'd have Institutional Review Boards and required justifications before any drug or technique comes close to a human being. Even then there are errors.
    If this were aviation you'd have gobs of FAA requirements and licensing to complete before a plane is allowed to fly even with a test pilot. Even then there are errors.
    If this were Facebook you'd.... ok. Bad example but semi-legitimate parallel I hope.

    And the point is: Who has processed the ethics of allowing Uber (and Google and....) to be driving their experimental autonomous cars in live traffic with live pedestrians? Yes, momentary pause for the hatred of bureaucracy we all have, and I know I am preaching Libertarian heresy here. But regulation and ethics help to limit the negative impacts of experimental things. What regulations do autonomous car companies face?

    --
    This sig for rent.
    • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @05:23PM (1 child)

      by Anonymous Coward on Tuesday March 20 2018, @05:23PM (#655504)

      Regulations, ethics review boards? Ha! The big tech companies lobby state gov't and get permission. There may be some additional hand waving, but I think it is another case where money talks.

      • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @05:25PM

        by Anonymous Coward on Tuesday March 20 2018, @05:25PM (#655506)

        Replying to my own AC comment -- I almost forgot that Uber was caught testing self-driving cars without getting any permission -- was that in the SF area or somewhere else in California?

  • (Score: 2) by All Your Lawn Are Belong To Us on Tuesday March 20 2018, @04:55PM (2 children)

    by All Your Lawn Are Belong To Us (6553) on Tuesday March 20 2018, @04:55PM (#655482) Journal

    ..... In Arizona it may be true that the car can't be at fault. I might be wrong, but I think when I used to live in California the law was that if you hit a pedestrian you are at fault, at all times and in all places. "The car is not at fault" would not be a legitimate statement there. Which may well mean that companies venue-shop for friendly testing environments.

    --
    This sig for rent.
    • (Score: 2) by HiThere on Tuesday March 20 2018, @06:09PM (1 child)

      by HiThere (866) Subscriber Badge on Tuesday March 20 2018, @06:09PM (#655528) Journal

      Uber did.

      They didn't want to accept the California regulations. Somehow I don't think California lost anything when Uber decided to test elsewhere. (Several companies, however, have accepted the California regulations. I think Nevada also has provisions for self-driving cars being tested or used.)

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 0) by Anonymous Coward on Tuesday March 20 2018, @07:35PM

        by Anonymous Coward on Tuesday March 20 2018, @07:35PM (#655582)

        We might guess that some CA regulators are patting themselves on the back. Their rules drove Uber to test in AZ and saved the life of a Californian (assuming that the Uber car would have eventually had a similar pedestrian accident in CA...)

  • (Score: 2) by takyon on Tuesday March 20 2018, @11:54PM

    by takyon (881) <takyonNO@SPAMsoylentnews.org> on Tuesday March 20 2018, @11:54PM (#655722) Journal
    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]