
Breaking News
posted by martyb on Friday July 01 2016, @01:49AM   Printer-friendly
from the Pay-attention! dept.

Two Soylentils wrote in with news of a fatal accident involving a Tesla vehicle. Please note that the feature in use, called "Autopilot", is not the same as an autonomous vehicle. It provides lane-keeping, cruise control, and safe-distance monitoring, but the driver is expected to be alert and in control at all times. -Ed.

Man Killed in Crash of 'Self-Driving' Car

Tech Insider reports that an Ohio man was killed on 7 May when his Tesla Model S, with its autopilot feature turned on, went under a tractor-trailer.

Further information:

Tesla Autopilot - Fatal Accident

http://www.cnbc.com/2016/06/30/us-regulators-investigating-tesla-over-use-of-automated-system-linked-to-fatal-crash.html

The accident is reported to have happened in May and to have been reported to NHTSA/DOT immediately by Tesla, but it was not made public until the end of June -- something seems a bit fishy about this reporting lag.

On the other hand, the accident is described as one that might have also been difficult for an alert human to have avoided:

The May crash occurred when a tractor trailer drove across a divided highway, where a Tesla in autopilot mode was driving. The Model S passed under the tractor trailer, and the bottom of the trailer hit the Tesla vehicle's windshield.

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

This was the first reporting found--by the time it makes the SN front page there may be more details. Because this is a "first" it seems likely that a detailed investigation and accident reconstruction will be performed.


Original Submission #1 | Original Submission #2

Related Stories

Elon Musk's "Top Secret Tesla Masterplan, Part 2" 26 comments

Yes, the phrase used in the headline is a direct quote. Tesla CEO Elon Musk is teasing new details about the company's future, set to be announced later this week. The news may be in reaction to slipping stock prices and troubles with regulators following a recent crash:

While offering no other details, the master plan is likely a follow-up to a 2006 blog post titled "The Secret Tesla Motors Master Plan (just between you and me)," in which Musk laid out his vision for Tesla, including eventual plans for the Tesla Roadster, the Model S sedan and the upcoming (and more affordable) Model 3 sedan.

It may not be a bad idea for Musk to roll out some optimistic news. In recent weeks, the electric car company has become the subject of a federal safety investigation following at least two crashes — one fatal — possibly related to its highly touted autopilot feature; Tesla has announced a drop in Model S shipments; and Musk himself has come under fire after proposing that Tesla purchase SolarCity, of which he is also chairman, much to the chagrin of shareholders.

[...] Tesla shares are down almost 10% year-to-date, and down more than 16% in the past 12 months.

You may also be interested in this NYT editorial about "Lessons From the Tesla Crash".


Original Submission

  • (Score: 2, Insightful) by Anonymous Coward on Friday July 01 2016, @02:01AM

    by Anonymous Coward on Friday July 01 2016, @02:01AM (#368230)

    Was the driver of the tractor trailer at fault? If the truck unexpectedly crossed over the median strip that's one thing, but if it happened at an intersection that's quite another. And one would expect the intersection to be governed by traffic lights or stop signs - who had the right of way?

    As for the Tesla driver not noticing the truck either, that's pretty weak. S/he was probably messing around on a phone or something because the car was driving itself.

    Eventually the Feds are going to step in and call for regulation. It's not going to be enough for the Silicon Valley guys to say we're the elite, we have engineers with multiple PhD's who are the cream of the crop, and besides, this is all in beta. That may be, but public safety is directly at stake here, just as it is with trad drivetrain vehicles.

    • (Score: 3, Insightful) by quintessence on Friday July 01 2016, @02:08AM

      by quintessence (6227) on Friday July 01 2016, @02:08AM (#368233)

      Depending on the re-creation of the accident, it may be that neither human nor AI could have avoided it.

      Even with the best of programming, I anticipate errors. The question will be the average accident rate compared to a human driver. My suspicion is that even a poor AI will perform better than most drivers.

      • (Score: 2) by quintessence on Friday July 01 2016, @02:12AM

        by quintessence (6227) on Friday July 01 2016, @02:12AM (#368236)

        And a hell of a lot better than my spelling correction.

        • (Score: 1) by nitehawk214 on Friday July 01 2016, @04:28PM

          by nitehawk214 (1304) on Friday July 01 2016, @04:28PM (#368470)

          If only we had a spellcheck AI. :)

          --
          "Don't you ever miss the days when you used to be nostalgic?" -Loiosh
      • (Score: 3, Informative) by FatPhil on Friday July 01 2016, @05:14AM

        > My suspicion is even poor AI will perform better than most drivers.

        "This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles,"
        --
        Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
        • (Score: 4, Interesting) by butthurt on Friday July 01 2016, @06:52AM

          by butthurt (6141) on Friday July 01 2016, @06:52AM (#368311) Journal

          If drivers tend to turn on the autopilot feature mainly when travelling on expressways, but turn it off for surface streets, then the statistics reflect different driving conditions.
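
          A quick sketch of the arithmetic behind the two figures quoted above, plus a toy illustration of the selection-bias point; the highway/street split and relative risk below are invented for illustration, not real data. -Ed.

              # Fatality rates implied by the quoted figures, per 100 million miles.
              autopilot_rate = 1 / 130e6 * 100e6   # ~0.77 fatalities per 100M Autopilot miles
              overall_rate   = 1 / 94e6  * 100e6   # ~1.06 fatalities per 100M US vehicle miles
              print(autopilot_rate, overall_rate)

              # The Autopilot figure rests on a single event, and the US average mixes
              # road types. Toy numbers (assumed, not real): if surface streets are
              # ~3x riskier per mile than highways, the fair baseline for a system
              # used almost solely on highways is the highway-only rate.
              highway_share, street_share = 0.55, 0.45   # assumed split of US miles
              relative_street_risk = 3.0                 # assumed
              highway_rate = overall_rate / (highway_share + street_share * relative_street_risk)
              print(highway_rate)   # ~0.56 per 100M miles -- below the Autopilot figure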

    • (Score: 2) by frojack on Friday July 01 2016, @02:09AM

      by frojack (1554) on Friday July 01 2016, @02:09AM (#368234) Journal

      Eventually the Feds are going to step in and call for regulation.

      Agreed. Autopilot is not self-driving, but it's temptingly close enough to cause drivers to get careless.
      It probably shouldn't be offered.

      --
      No, you are mistaken. I've always had this sig.
      • (Score: 2) by black6host on Friday July 01 2016, @02:35AM

        by black6host (3827) on Friday July 01 2016, @02:35AM (#368244) Journal

        In the driver's video, shot in bumper-to-bumper traffic, he states that although it's a slower drive, at least he doesn't have to worry about anything. "You just let it go." I think it's quite possible that if someone has spent a fair amount of time using autopilot, it would be easy to start taking things for granted.

        I have no way of knowing if the driver was complacent or acutely alert when the accident happened. From his comments in the video I doubt he was acutely alert.

        • (Score: 3, Insightful) by Immerman on Friday July 01 2016, @03:16AM

          by Immerman (3985) on Friday July 01 2016, @03:16AM (#368250)

          Agreed. In fact it would be extremely difficult to maintain attention on the road for an extended period without any need to react to it. There's a name for such inactive attention: meditation. And most people find it quite unpleasant without a lot of practice.

          • (Score: 0) by Anonymous Coward on Friday July 01 2016, @08:20AM

            by Anonymous Coward on Friday July 01 2016, @08:20AM (#368328)
            I thought it's called watching TV :)
            • (Score: 0) by Anonymous Coward on Friday July 01 2016, @03:18PM

              by Anonymous Coward on Friday July 01 2016, @03:18PM (#368439)

              exactly: very unpleasant without practice.

    • (Score: 1, Redundant) by CirclesInSand on Friday July 01 2016, @04:45AM

      by CirclesInSand (2899) on Friday July 01 2016, @04:45AM (#368279)

      we have engineers with multiple PhD's who are the cream of the crop

      So. Engineers who are well qualified, have a financial interest in performing well, and as you say are "the cream of the crop" aren't good enough.

      You want government employees to be in charge. Perhaps that is not such a good idea.

      • (Score: 2) by cubancigar11 on Friday July 01 2016, @06:03AM

        by cubancigar11 (330) on Friday July 01 2016, @06:03AM (#368293) Homepage Journal

        Government doesn't design, it makes something unlawful. Government regulation is nothing but a disincentive, and it ought to be there if my personal safety is being bargained against profit.

        • (Score: -1, Troll) by Anonymous Coward on Friday July 01 2016, @05:07PM

          by Anonymous Coward on Friday July 01 2016, @05:07PM (#368486)

          the disincentive is supposed to be well informed consumers who vote with their currency, not your beloved slave master state. you're like a grown baby with a dirty diaper crying for your mommy to change you.

      • (Score: 3, Insightful) by Thexalon on Friday July 01 2016, @02:39PM

        by Thexalon (636) on Friday July 01 2016, @02:39PM (#368416)

        You want government employees to be in charge. Perhaps that is not such a good idea.

        What if, and I know this is crazy talk, the government hires some of those engineers with multiple PhDs who are the cream of the crop, and has those engineers check their fellow engineers' work and make sure it won't break catastrophically? And we give them the power to nix management if management decides to overrule their engineers with multiple PhDs? Wouldn't that be effectively putting the smart engineers in charge, while reducing the power of not-so-smart or at least not-so-cream-of-the-crop management? Why is there this idea that as soon as somebody becomes a government employee, they lose all the skill and professionalism they had when they went into the job in the first place?

        That's where government involvement can be valuable: Stopping companies with an incentive to think as follows: "Now, should we initiate a recall? Take the number of vehicles in the field, A, multiply by the probable rate of failure, B, multiply by the average out-of-court settlement, C. A times B times C equals X. If X is less than the cost of a recall, we don't do one."
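
        A minimal sketch of that recall calculus (the quote is the famous formula from the film Fight Club); all numbers below are made up for illustration. -Ed.

            # "Take the number of vehicles in the field, A, multiply by the probable
            # rate of failure, B, multiply by the average out-of-court settlement, C.
            # A times B times C equals X. If X is less than the cost of a recall..."
            def recall_is_cheaper(vehicles, failure_rate, settlement, recall_cost_per_car):
                expected_settlements = vehicles * failure_rate * settlement   # A * B * C = X
                recall_cost = vehicles * recall_cost_per_car
                return recall_cost < expected_settlements

            # Hypothetical numbers: 500k cars, 1-in-10,000 failure rate,
            # $2M average settlement, $150 per car to recall.
            print(recall_is_cheaper(500_000, 1e-4, 2_000_000, 150))   # True -> recall is cheaper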

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
        • (Score: 2) by frojack on Friday July 01 2016, @05:13PM

          by frojack (1554) on Friday July 01 2016, @05:13PM (#368490) Journal

          Why is there this idea that as soon as somebody becomes a government employee, they lose all the skill and professionalism they had when they went into the job in the first place?

          I think you'll find that those with all that much skill and professionalism invariably end up in government, which often pays well enough to attract just these. For every Wernher von Braun employed by government, there tend to be a lot of posers and industrial washouts.

          --
          No, you are mistaken. I've always had this sig.
        • (Score: 2) by CirclesInSand on Friday July 01 2016, @11:14PM

          by CirclesInSand (2899) on Friday July 01 2016, @11:14PM (#368684)

          It is a sign of political immaturity to talk about goals rather than policies.

          The policy you want to put in place is "create a government organization that has veto power over which products go to market". In your imagination, this achieves a goal of government employees hiring the best and brightest engineers, who for some reason are out for hire rather than working in the industry. The motivation of the business owners not to kill their customers isn't enough; those government employees have a better motivation: altruism. So the government employees will save us from the greedy businessmen who don't realize that killing your customers is bad business.

          Now this is what happens in reality. The government program is created. They are immediately swamped by lobbyists hired by the incumbent owners in the industry. Those lobbyists will tell the government employees which "engineers" to hire. Those government fake engineers will pretend to understand safety better than the actual designers. The government employees now have 3 groups of people to try to appease: (1) the incumbent business owners who have a lot of money and clout enough to make life and election hell for the regulators, (2) customers, who never had a problem anyway, because the business owners don't want to kill their customers, and (3) voters who don't actually know what the new government program does. Guess which one will have the most influence.

          This is called regulatory capture. They pretend it takes a long time for this to happen, but in reality it happens immediately and people pretend not to notice for as long as possible.

          Now you have an organization with the power to shut down competition with no one to worry about except the incumbent business owners. The business owners are now compelled by law to exploit every legal ability they have to promote their own business (it is called fiduciary responsibility), not to mention a personal interest in doing so. So what do you think happens when a new company shows up with a competing product? If you said "healthy competition, both sides trying to show customers that their product is safer", you got zero points. The new company will be forced to go through a gauntlet of hurdles put in place by the government, none of them will be motivated by safety, all of them will be motivated by the lobbyists of the incumbent industry. A time tested technique is to simply make tests expensive, rather than useful, so that only the incumbent businesses can afford to pass them. Another common technique is to make the tests subjective, so it all is decided by who can bribe the regulators more.

          A lack of competition actually makes things less safe because no one is at the mercy of customers any more. Management will be replaced by those who are better at litigating rather than those who are better at designing (Steve Jobs was exactly this).

          If anyone at this point is thinking "well, we'll create a regulatory agency to regulate the regulators", then I pity them.

    • (Score: 2) by DutchUncle on Friday July 01 2016, @05:24PM

      by DutchUncle (5370) on Friday July 01 2016, @05:24PM (#368500)

      Most intersections do NOT have traffic lights or stop signs in rural areas. Many places where a truck would be turning off a road into a factory or warehouse lot would also not have traffic lights.

      OTOH should the truck have turned left at that time? Not if the truck driver saw the car approaching, but if there was a curve or hill shortening distance of view, or if the car was speeding (suggested by the distance traveled after the collision), then the truck driver's judgement would have been incomplete.

      OTOOH would an attentive driver have seen a white truck against a bright sky, in time to hit the brakes? Again, distance of view matters a lot. And a driver might have noticed the tractor and not been as sure of the trailer, but still slowed down just in case.

      I have seen white and grey trailers that would blend with a cloudy sky; one would think that reflective strips all along the length of a truck (if there is no other bright logo) would not be an impairment of aerodynamic efficiency. A very minor change for a big improvement in visibility.

      • (Score: 3, Insightful) by frojack on Friday July 01 2016, @05:38PM

        by frojack (1554) on Friday July 01 2016, @05:38PM (#368508) Journal

        Reflective material all along the trailer is already the law in the US, but that only works if your headlights are on and bright enough to overpower a setting sun in your eyes.

        This was a 4-lane divided highway. These are seldom built with at-grade "unprotected crossings" in the US (where "unprotected" means no lights or stop signs). You will occasionally see them as temporary work-site access, with plenty of warnings about crossing trucks, etc.

        If the divided highway had stop signs for a truck crossing, that's a design flaw, but very occasionally you find such.
        If the truck crossing didn't have stop signs, that's a design flaw.
        If the truck failed to stop, that's a moving violation.
        If the truck pulled out in front of oncoming traffic, that's a moving violation (usually a 12-point violation).

        UNLESS there was some form of stop sign/signal for the divided highway, this accident was probably caused by the truck driver.

        Autopilot and car driver inattention and setting sun were merely contributory, not causal.

        --
        No, you are mistaken. I've always had this sig.
  • (Score: 4, Informative) by frojack on Friday July 01 2016, @02:03AM

    by frojack (1554) on Friday July 01 2016, @02:03AM (#368231) Journal

    Most non-camera-based forward collision avoidance systems would have detected this.

    Some companies (Subaru) have standardized on cameras to the exclusion of all other sensors. These are no good in fog or low oncoming sun. I actually have no idea what Tesla uses.

    Other technology uses 17 or 25 GHz radar, which is much better for this kind of stuff.

    --
    No, you are mistaken. I've always had this sig.
    • (Score: 2) by LoRdTAW on Friday July 01 2016, @02:29AM

      by LoRdTAW (3755) on Friday July 01 2016, @02:29AM (#368238) Journal

      That was my first thought: how did the car not see the truck? Then I realized it was using a camera. Why not both a camera and radar? It seems to me you'd want redundancy, possibly triple redundancy, so two sensors/computers can vote against a third misbehaving one.
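
      A minimal sketch of the two-out-of-three voting idea described above; the tolerance and readings are hypothetical, and this is not how any actual production system is structured. -Ed.

          # Triple modular redundancy over three independent range estimates.
          def vote_range(readings_m, tolerance_m=2.0):
              """Return the median range if at least two sensors agree within
              tolerance; otherwise raise so the system can fail safe."""
              a, b, c = sorted(readings_m)
              if b - a <= tolerance_m or c - b <= tolerance_m:
                  return b   # the median outvotes one misbehaving sensor
              raise RuntimeError("sensors disagree; alert the driver")

          # A camera fooled by a white trailer against a bright sky reads "no
          # obstacle" (infinite range); radar and lidar agree on roughly 38 m.
          print(vote_range([float("inf"), 38.0, 37.2]))   # -> 38.0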

      • (Score: 5, Interesting) by PocketSizeSUn on Friday July 01 2016, @06:49AM

        by PocketSizeSUn (5340) on Friday July 01 2016, @06:49AM (#368310)

        Because Tesla's autopilot is a nifty software hack using NVidia's GPUs and on-board cameras. It is miles away from, and tens of thousands of USD cheaper than, using LIDAR, which is, as far as I am aware, the only computer-vision input being taken seriously for actual autonomous driving.

        You can use fast GPUs and visual cameras to prototype, but production is LIDAR. The kind of money'd id10t that is Tesla's target market can show off to their friends. It's surprisingly safe on the happy path ... unfortunately, now that someone has spectacularly failed, we can expect a lot of overreaction, conflating what Tesla is doing with real engineering toward autonomous driving, followed by a lot of new legal roadblocks to make sure it's safe ... thanks Elon ... you and Bill G. are my heroes.

        Tesla relies on visual-spectrum cameras, which is far from the industry standard of LIDAR for autonomous driving. This is a classic GI/GO problem. Human vs. computer image recognition when the image is 'polar bear in a snowstorm': everybody loses.

        • (Score: 2) by LoRdTAW on Friday July 01 2016, @12:38PM

          by LoRdTAW (3755) on Friday July 01 2016, @12:38PM (#368374) Journal

          I was referring to the lower-cost radio radar on existing cars that assists with emergency braking and adaptive cruise control. It could easily be integrated into such a system as a comparison mechanism, and this collision would not have happened.

        • (Score: 3, Insightful) by frojack on Friday July 01 2016, @05:26PM

          by frojack (1554) on Friday July 01 2016, @05:26PM (#368503) Journal

          I assure you GHz radar in cars is not tens of thousands of dollars. It's usually included in a safety-tech option package that includes backup cams, blind-spot detection, adaptive cruise control, and front collision avoidance.

          Usually that whole package goes for something in the neighborhood of $2000 to $2500 additional, which you may well recover in lower insurance premiums if you keep the car 5 years.

          Lots of different car companies are offering this package, which is available from three or four third parties. (Almost no car manufacturer develops their own).

          The quality of the programming has improved dramatically over the last decade. I did quite a bit of research on this when buying my last car.

          Lane following is usually done with cameras, because the radar does not see paint well, and paint is poorly maintained. My friend's beamer with lane following often gets confused and alarms. He considers it an annoyance.

          --
          No, you are mistaken. I've always had this sig.
    • (Score: 0) by Anonymous Coward on Friday July 01 2016, @02:29AM

      by Anonymous Coward on Friday July 01 2016, @02:29AM (#368239)

      Came here to say the same thing, from http://www.techinsider.io/tesla-model-s-autopilot-fatal-crash-2016-6 [techinsider.io]

      Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied.

      This sounds like camera-only, and one camera cannot measure the distance (range) to an object. Clever processing can probably decide what is in front and what is in the background, but AFAIK this is all inferred. When calibrated, stereo cameras can range out to some distance, but it seems like radar or lidar (both transmit and then time when the reflection returns) are required to measure distance repeatably.
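
      A minimal sketch of the two ranging approaches the parent describes; all numbers are illustrative, not from any real sensor. -Ed.

          C = 299_792_458.0   # speed of light, m/s

          def tof_range_m(round_trip_s):
              """Radar/lidar: range from the time-of-flight of the echo."""
              return C * round_trip_s / 2.0

          def stereo_depth_m(focal_px, baseline_m, disparity_px):
              """Calibrated stereo pair: depth Z = f * B / d. As disparity shrinks
              toward sub-pixel levels at long range, depth error blows up, which
              is why stereo only ranges reliably out to modest distances."""
              return focal_px * baseline_m / disparity_px

          print(tof_range_m(250e-9))             # 250 ns echo -> ~37.5 m
          print(stereo_depth_m(1000, 0.3, 8.0))  # 8 px disparity -> 37.5 m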

    • (Score: 2) by MadTinfoilHatter on Friday July 01 2016, @05:51AM

      by MadTinfoilHatter (4635) on Friday July 01 2016, @05:51AM (#368291)

      I actually have no idea what Tesla uses.

      Apparently they bought theirs from Volvo [youtube.com]

  • (Score: 2) by Scruffy Beard 2 on Friday July 01 2016, @02:31AM

    by Scruffy Beard 2 (6030) on Friday July 01 2016, @02:31AM (#368240)

    I have noticed that competing systems use duplicates of all the sensors, including cameras.

    I suspect that the Tesla design is eventually going to be ruled unsafe. They don't stand behind it themselves, saying the driver has to be aware and ready to take over at all times.

    It may work well for lane-keeping 99.9% of the time, but even a 0.1% failure rate is not ideal. I guess the big question is whether the COTS computer failure rate is lower than the human failure rate.

    • (Score: 2) by Gravis on Friday July 01 2016, @02:38AM

      by Gravis (4596) on Friday July 01 2016, @02:38AM (#368245)

      It may work well for lane-keeping 99.9% of the time, but even a 0.1% failure rate is not ideal.

      Riddle me this: how many times have you stopped on the highway because a truck jumped the divider and blocked the road? I'm not talking about being stuck behind traffic; I'm saying you actually had to stop because there was a giant white truck in the way. This is closer to a 0.0000001% failure rate, which is far closer to ideal than what you are claiming.

      • (Score: 0) by Anonymous Coward on Friday July 01 2016, @03:07AM

        by Anonymous Coward on Friday July 01 2016, @03:07AM (#368248)

        So, the truck was at a right angle to the flow of traffic on the other side of the road after crossing the median.

        You gotta expect the other guy to do something stupid. [wikipedia.org]

        ...and, if you like to go fast on top of that, count on that going really bad. [wikipedia.org]

        Until AI can totally compensate for the unpredictability of humans, shit is still gonna happen.

        -- OriginalOwner_ [soylentnews.org]

      • (Score: 2, Informative) by tftp on Friday July 01 2016, @04:50AM

        by tftp (806) on Friday July 01 2016, @04:50AM (#368280) Homepage

        riddle me this: how many times have you stopped on the highway because a truck jumped the divider and blocked the road?

        Don't know how often it happens, but in construction zones it's very common. Just in the last few weeks I had to stop a few times at the behest of workers holding large STOP signs while various construction machinery crawled across the road. All in all, I'd say you ought to be careful nearly everywhere. A computer has a long, long way to go before it achieves the sentience level of a small dog - and we usually don't allow dogs to drive.

      • (Score: 2) by frojack on Friday July 01 2016, @05:44PM

        by frojack (1554) on Friday July 01 2016, @05:44PM (#368509) Journal

        There is no indication the truck jumped the divider.

        The May crash occurred when a tractor trailer drove across a divided highway, where a Tesla in autopilot mode was driving.

        This was an at-grade crossing incident.

        --
        No, you are mistaken. I've always had this sig.
  • (Score: 2) by Gravis on Friday July 01 2016, @02:32AM

    by Gravis (4596) on Friday July 01 2016, @02:32AM (#368242)

    It seems to me that the tractor trailer would have shown up just fine if the camera had included near-infrared. Though if they had included an actual infrared camera sensor, it would have been exceptionally clear that something was in the way. Obviously a laser range sensor would have worked, but that's expensive enough that it's only for autonomous cars.

  • (Score: 1, Informative) by Anonymous Coward on Friday July 01 2016, @03:24AM

    by Anonymous Coward on Friday July 01 2016, @03:24AM (#368253)

    Don't think this merits the "breaking news" category.

    • (Score: 5, Funny) by Doctor on Friday July 01 2016, @03:31AM

      by Doctor (3677) on Friday July 01 2016, @03:31AM (#368254)

      More like "braking news"...

      --
      "Anybody remotely interesting is mad in some way." - The Doctor
      • (Score: 0) by Anonymous Coward on Friday July 01 2016, @03:36AM

        by Anonymous Coward on Friday July 01 2016, @03:36AM (#368255)

        Or lack thereof...

      • (Score: 3, Touché) by CoolHand on Friday July 01 2016, @12:04PM

        by CoolHand (438) on Friday July 01 2016, @12:04PM (#368371) Journal
        *groan*
        --
        Anyone who is capable of getting themselves made President should on no account be allowed to do the job-Douglas Adams
  • (Score: 5, Insightful) by RedBear on Friday July 01 2016, @03:40AM

    by RedBear (1734) on Friday July 01 2016, @03:40AM (#368257)

    Oh boy, do I have a problem with this part of the quotes:

    the accident is described as one that might have also been difficult for an alert human to have avoided:

    The May crash occurred when a tractor trailer drove across a divided highway, where a Tesla in autopilot mode was driving. The Model S passed under the tractor trailer, and the bottom of the trailer hit the Tesla vehicle's windshield.
    "Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

    There is no possible way that Tesla could know whether or not the driver noticed the trailer. It may be argued to be a reasonable assumption, but it is still a very theoretical assumption. All they know for certain is that the driver did not apply the brake manually. Another assumption could easily be that the driver trusted the autopilot way too much and failed to apply the brake because he EXPECTED the car to do it; when one is engaged in allowing autopilot to drive one's vehicle, there is a tendency to be unsure of whether you need to take over driving or not.

    You have to actively disengage your brain from the driving task in order to keep your own automatic motor-memory driving actions from interfering with the autopilot, and putting yourself back in "driver mode" can easily take a second or two, which is many car lengths at highway speeds. Anyone who hasn't spent time practicing instantaneously taking over from the autopilot can easily have a brain fart that simply takes too much time, because until the autopilot goes "beep beep" and explicitly hands control back to the driver, the driver will have a tendency to believe the autopilot is operating normally and everything should be fine. This problem will only get worse as more people put more trust in autopilot systems.

    This is why I believe we shouldn't even be experimenting with autopilot systems yet on public roads. We should be sticking to things like emergency braking systems, lane-keeping alarms, and other aids that help keep the attentive human driver from making fatal mistakes, rather than relying on human drivers who aren't even paying attention to keep the artificial autopilots from making fatal mistakes. We have things totally backwards, in my opinion. The human driver should never be encouraged to take their attention off the road or their hands off the wheel as long as they are sitting in the driver's seat, but that's exactly what these autopilot systems are encouraging. I just saw a GIF yesterday of a Tesla driver asleep behind the wheel while his car drove him down the highway. That is what autopilot systems are enabling.

    I like Tesla, but I find the above quote to be very self-serving. Whether he was looking in a different direction at that moment instead of straight ahead at the road, or simply hesitated a moment too long, with misfiring neurons keeping him from interfering with what he expected the autopilot to do, it is easily possible that the driver's excessive trust in the autopilot was the primary thing that got him killed. But expressing that assumption would be extremely detrimental to Tesla, even if it was only expressed as a remote possibility.
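
    A back-of-the-envelope check of the "a second or two is many car lengths" point above; the speeds, reaction times, and car length are assumptions for illustration. -Ed.

        # Distance covered while a driver re-engages after trusting the autopilot.
        CAR_LENGTH_M = 4.8   # assumed typical sedan length

        def distance_m(speed_mph, reaction_s):
            return speed_mph * 0.44704 * reaction_s   # mph -> m/s, then * time

        for speed in (55, 65, 75):
            for lag in (1.0, 2.0):
                d = distance_m(speed, lag)
                print(f"{speed} mph, {lag:.0f} s lag: {d:5.1f} m "
                      f"(~{d / CAR_LENGTH_M:.0f} car lengths)")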

    --
    ¯\_ʕ◔.◔ʔ_/¯ LOL. I dunno. I'm just a bear.
    ... Peace out. Got bear stuff to do. 彡ʕ⌐■.■ʔ
    • (Score: 0) by Anonymous Coward on Friday July 01 2016, @01:35PM

      by Anonymous Coward on Friday July 01 2016, @01:35PM (#368388)

      According to some witnesses, the driver had Harry Potter playing, so I think I can guess he did not fuckin' notice the trailer, but you are right, it is not guaranteed.

  • (Score: 2) by CirclesInSand on Friday July 01 2016, @06:44AM

    by CirclesInSand (2899) on Friday July 01 2016, @06:44AM (#368309)

    There still hasn't been an accident between 2 autonomous cars.

  • (Score: 2) by theluggage on Friday July 01 2016, @12:43PM

    by theluggage (1797) on Friday July 01 2016, @12:43PM (#368376)

    Please note that the feature in use, called "Autopilot"

    ...is stupidly named if it's not suitable for hands-off driving, because - sure as eggs is eggs - that's how some people will use it. That's why we can't have nice things...

    Sounds like the Tesla driver and/or the tractor-trailer driver is primarily responsible for this particular accident (more data needed). Either the car had time to brake but didn't, or it happened so quickly that the car stood no chance. That really needs to be determined first, completely independently of the autopilot issue.

    Whatever the outcome, Tesla need to think long and hard about pushing pseudo-autonomous tech out there before it is bulletproof. Having a clear disclaimer might get them out of direct liability - but deciding liability doesn't bring people back to life (sometimes, you'd think that it did) and deluding yourself that people will observe a disclaimer when (other) lives are at stake is just plain irresponsible.

    Autonomous vehicles are a great idea and, when perfected, could make the roads a much safer place. Trouble is - I don't see a half-way-house between the currently available systems (lane/speed/proximity warnings, emergency braking, assistance with parallel parking) - where the computer keeps an eye on the driver - and full "hands off, phone out" auto-drive (because drivers can not be relied upon to keep an eye on the computer).

    Not a good application for Agile development...

  • (Score: 3, Insightful) by SpockLogic on Friday July 01 2016, @12:48PM

    by SpockLogic (2762) on Friday July 01 2016, @12:48PM (#368378)

    In Europe, semi-trailers have substantial rear and side underride guards. These substantially reduce the decapitation of sedans in a wreck, allowing crumple zones to work.

    The National Transportation Safety Board has recommended that side underride guards be required on trailers with GVW ratings above 10,000 pounds. The trucking industry has resisted.

    I'm not saying that side underride guards would have prevented this fatality, but shouldn't we have them?

    --
    Overreacting is one thing, sticking your head up your ass hoping the problem goes away is another - edIII
    • (Score: 0) by Anonymous Coward on Friday July 01 2016, @01:37PM

      by Anonymous Coward on Friday July 01 2016, @01:37PM (#368391)

      Let me rebut:
      1) Moneyz
      2) Moneyz means JOBZ

      • (Score: 0) by Anonymous Coward on Friday July 01 2016, @06:29PM

        by Anonymous Coward on Friday July 01 2016, @06:29PM (#368538)

        1) It doesn't cost that much.
        2) It quickly pays for itself in improved efficiency at highway speeds.
        A skirt under the trailer is one of the 1st things they do in making a big rig more aerodynamic. [google.com]

        -- OriginalOwner_ [soylentnews.org]

    • (Score: 2) by timbim on Friday July 01 2016, @09:59PM

      by timbim (907) on Friday July 01 2016, @09:59PM (#368656)

      Tractor trailers shouldn't even be on the same roads as passenger cars. They should be driven autonomously in dedicated, isolated lanes.