Elon Musk's "Top Secret Tesla Masterplan, Part 2"

posted by CoolHand on Monday July 11 2016, @06:14PM   Printer-friendly
from the keep-it-secret-keep-it-safe dept.

Yes, the phrase used in the headline is a direct quote. Tesla CEO Elon Musk is teasing new details about the company's future, set to be announced later this week. The news may be in reaction to slipping stock prices and troubles with regulators following a recent crash:

While offering no other details, the master plan is likely a follow-up to a 2006 blog post titled "The Secret Tesla Motors Master Plan (just between you and me)," in which Musk laid out his vision for Tesla, including eventual plans for the Tesla Roadster, the Model S sedan and the upcoming (and more affordable) Model 3 sedan.

It may not be a bad idea for Musk to roll out some optimistic news. In recent weeks, the electric car company has become the subject of a federal safety investigation following at least two crashes — one fatal — possibly related to its highly touted autopilot feature; Tesla has announced a drop in Model S shipments; and Musk himself has come under fire after proposing that Tesla purchase SolarCity, which he is also the chairman of, much to the chagrin of shareholders.

[...] Tesla shares are down almost 10% year-to-date, and down more than 16% in the past 12 months.

You may also be interested in this NYT editorial about "Lessons From the Tesla Crash".


Original Submission

Related Stories

Breaking News: First Reported Fatal Accident of Tesla Model S Operating in Autopilot Mode 45 comments

Two Soylentils wrote in with news of a fatal accident involving a Tesla vehicle. Please note that the feature in use, called "Autopilot", is not the same as an autonomous vehicle. It provides lane-keeping, cruise control, and safe-distance monitoring, but the driver is expected to be alert and in control at all times. -Ed.

Man Killed in Crash of 'Self-Driving' Car

Tech Insider reports that an Ohio man was killed on 7 May when his Tesla Model S, with its autopilot feature turned on, went under a tractor-trailer.

Further information:

Tesla Autopilot - Fatal Accident

http://www.cnbc.com/2016/06/30/us-regulators-investigating-tesla-over-use-of-automated-system-linked-to-fatal-crash.html

The accident is reported to have happened in May and to have been reported to NHTSA/DOT immediately by Tesla, but it was not made public until the end of June -- something seems a bit fishy about this reporting lag.

On the other hand, the accident is described as one that might have also been difficult for an alert human to have avoided:

The May crash occurred when a tractor trailer drove across a divided highway, where a Tesla in autopilot mode was driving. The Model S passed under the tractor trailer, and the bottom of the trailer hit the Tesla vehicle's windshield.

"Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky, so the brake was not applied," Tesla wrote.

This was the first report found -- by the time it makes the SN front page there may be more details. Because this is a "first", it seems likely that a detailed investigation and accident reconstruction will be performed.


Original Submission #1 | Original Submission #2

Elon Musk Releases Master Plan, Part Deux 23 comments

https://www.tesla.com/blog/master-plan-part-deux

So, in short, Master Plan, Part Deux is:

Create stunning solar roofs with seamlessly integrated battery storage
Expand the electric vehicle product line to address all major segments
Develop a self-driving capability that is 10X safer than manual via massive fleet learning
Enable your car to make money for you when you aren't using it

Previously: Elon Musk's "Top Secret Tesla Masterplan, Part 2"


Original Submission

  • (Score: 5, Interesting) by kanweg on Monday July 11 2016, @07:10PM

    by kanweg (4737) on Monday July 11 2016, @07:10PM (#373256)

    Under investigation because someone dies in a car crash? Because the driver was distracted, the car manufacturer is investigated? Do we get an investigation into smartphone manufacturers too?

    Plus, what you won't read about, because it's much less newsworthy, is someone who lived to tell the tale because of the technology.

    About 30,000 people die each year in the US in traffic accidents (the 2015 figure was roughly 35,000). That is some 600 per week, or more than 80 per day. And this one gets investigated? The Tesla crash was on 7 May. That is more than 5,000 fatalities ago.

    What about the other crashes? Just because they died in a run-of-the-mill accident? Can't the other crashes teach us anything?
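
    [A rough back-of-the-envelope check of those rates, as a minimal Python sketch; the round 30,000-per-year figure is taken from the comment, while the 65-day span between 7 May and 11 July is our own assumption. -Ed.]

        # Sanity check of the US traffic-fatality rates cited above.
        deaths_per_year = 30_000          # round figure used in the comment (the 2015 count was ~35,000)

        per_week = deaths_per_year / 52   # ~577 per week
        per_day = deaths_per_year / 365   # ~82 per day

        days_since_crash = 65             # 7 May to 11 July 2016
        since_crash = per_day * days_since_crash  # ~5,300 fatalities over that span

        print(f"~{per_week:.0f} per week, ~{per_day:.0f} per day")
        print(f"~{since_crash:.0f} fatalities since the 7 May crash")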

    Bert
    (Who drives a Toyota and doesn't have shares in any company)

     

    • (Score: 3, Insightful) by Anonymous Coward on Monday July 11 2016, @08:19PM

      by Anonymous Coward on Monday July 11 2016, @08:19PM (#373281)

      It's quite clear there is a concerted effort by the media (maybe encouraged or financially backed by competing automotive or oil companies) to make these few incidents seem as bad as possible, with the end effect of negatively affecting Tesla's share price. The best example of this disingenuous campaign was the media fervor about the suspension failure. It was made out to be a fault in the car itself, while in reality it was an isolated incident where a handful of drivers used their Teslas in ways that rapidly degraded their suspensions. Tesla's official statement answered all the questions and supposition the media was putting out, but the media basically ignored it. Oil companies especially have much to lose by letting Tesla succeed. This is only the beginning of their media campaign.

      • (Score: 0) by Anonymous Coward on Monday July 11 2016, @08:49PM

        by Anonymous Coward on Monday July 11 2016, @08:49PM (#373292)

        Seems pretty obvious to me: there is an investigation because this accident involves something new.

        Just saw an article in the newspaper this morning that NTSB is also investigating, in addition to Florida cops and NHTSA/DOT. NTSB investigations are usually very complete & detailed, I'm looking forward to reading their report...which (based on past performance) might be available in 6 months or a year.

        In the meantime I'm withholding judgement. Who else is patient enough to wait for it?

        • (Score: 2) by davester666 on Tuesday July 12 2016, @07:12AM

          by davester666 (155) on Tuesday July 12 2016, @07:12AM (#373501)

          We have a 15 minute news cycle for a reason.

          Because we have a 1-minute attention span, and hearing about something for 15 whole minutes makes us turn the TV to another station.

  • (Score: 5, Informative) by Anonymous Coward on Monday July 11 2016, @07:28PM

    by Anonymous Coward on Monday July 11 2016, @07:28PM (#373264)

    From a private automotive engineering forum (sorry, links won't work unless you have a password), someone posted this note, which they also submitted to NHTSA. Posted here as AC, with permission from the original author:

    My concern is that the instrumentation is substantially inferior to a human. For instance, the forward-looking camera in the Tesla is HD resolution (1280x720) with a wide-angle lens that has a 5.6 arc-min resolution per pixel, compared to a human eye, which resolves about 1 arc-min. Or to put it in human terms, the Tesla auto-pilot would fail the DMV vision test. I estimate the Tesla has 20/110 vision; in other words, it can see the big E at the top of the eye chart, but will make a couple of mistakes on the second line.

    In the Florida accident, the Tesla came over the crest of the hill just under 1/2 mile (2250') from the truck. If a human were driving, when cresting the hill it would be obvious that a large tractor-trailer was crossing in front of them. The human would let off the gas and give the truck the extra 5 seconds it needed to clear the intersection. But for the computer the truck is an 8x3 pixel blob at 2250'. At 1/4 mile, it's still an unidentifiable blob -- 15x4 pixels: something is there, but not enough data to identify it. At 1/8 mile, there's finally enough data (30x8, 240 pixels) to identify that it's a truck, under optimal conditions. Not until 500 ft can the computer clearly identify that it's a truck, and the writing/text on the trailer would still be blurry. At 250 ft, only the largest letters (>4 ft tall) on the trailer would be legible. Not until 100 ft is there any detail. Attached below is a simulation of what a tractor-trailer looks like.

    For a self-driving car to match human vision, it would need an 8K UHD camera (33 megapixels) -- impractical on so many levels. Another solution might be multiple cameras, such as a set of three: one telephoto, one wide-angle, and one normal lens, with the three merged into a varying-resolution image. It would probably also need a fourth camera that the computer can steer and change the focal length of to fill in detail where needed.

    The photos he prepared aren't posted anywhere (yet?) -- but match the description above.

    For reference, the Tesla and most cars can stop from 60 mph (~100kph) in something over 100 feet (30 meters), with the distance getting longer quickly as speed goes up. Also, to aid mental calculations, 60 mph = 88 feet/second.
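
    [As a rough illustration of the pixel arithmetic above, a minimal Python sketch. The 5.6 arc-min-per-pixel figure comes from the comment; the trailer dimensions (~40 ft of visible side, ~13.5 ft tall) and the small-angle approximation are our own assumptions, so the counts land in the same ballpark as the comment's figures rather than matching them exactly. -Ed.]

        import math

        ARC_MIN_PER_PIXEL = 5.6   # claimed resolution of the Tesla forward camera (from the comment)

        def pixels_spanned(size_ft, distance_ft, arc_min_per_px=ARC_MIN_PER_PIXEL):
            """Pixels an object of a given size spans at a given distance (small-angle approximation)."""
            angle_arc_min = math.degrees(size_ft / distance_ft) * 60.0
            return angle_arc_min / arc_min_per_px

        # Hypothetical trailer: ~40 ft of visible side, ~13.5 ft tall.
        # Distances in feet: crest of the hill, 1/4 mile, 1/8 mile, then closer.
        for d in (2250, 1320, 660, 500, 250, 100):
            w = pixels_spanned(40.0, d)
            h = pixels_spanned(13.5, d)
            print(f"{d:>5} ft: ~{w:.0f} x {h:.0f} px")

        # For context (from the follow-up note): 60 mph is 88 ft/s, and the car needs
        # something over 100 ft to stop from that speed.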

    • (Score: 1, Insightful) by Anonymous Coward on Monday July 11 2016, @08:06PM

      by Anonymous Coward on Monday July 11 2016, @08:06PM (#373272)

      Autopilot is not there to give people autonomous vehicles - it's there to aid them. Which means people still need to be watching the road! This dude didn't. So whose fault is it? Seems the dude didn't see the truck any more than the car did.

      The purpose of autopilot is to watch the road and keep the lane. That alone saves lives, considering the number of people who "drift" and side-swipe pedestrians, cyclists, motorcyclists, and other drivers.

      • (Score: 5, Insightful) by ilPapa on Monday July 11 2016, @08:25PM

        by ilPapa (2366) on Monday July 11 2016, @08:25PM (#373283) Journal

        Autopilot is not there to give people autonomous vehicles - it's there to aid them.

        The AC backed into the truth here: There will not be widespread self-driving cars in any of our lifetimes. It's a fantasy. Sure, we'll have something like cruise-control on steroids, but if you think you're going to be able to call a driverless Uber to take you to your 4 hour per day job, you might as well just wait for a personal jet pack.

        As an article in the NYT put it just this weekend,

        The most realistic industry projection about the arrival of autonomous driving comes from the company that’s done the most to make it possible. Google, while never explicitly saying so, has long intimated that self-driving cars would be available by the end of the decade.

        In February, though, a Google car caused its first accident, a bus collision with no injuries. A few weeks later, Google made a significant, if little-noted, schedule adjustment. Chris Urmson, the project director, said in a presentation that the fully featured, truly go-anywhere self-driving car that Google has promised might not be available for 30 years, though other much less capable models might arrive sooner.

        Historians of technology know that “in 30 years” often ends up being “never.” Even if that’s not the case here, if you’re expecting a self-driving car, you should also expect a wait. And so you might want to do something to pass the time. Maybe go for a nice drive?

        --
        You are still welcome on my lawn.
        • (Score: 2) by bob_super on Monday July 11 2016, @09:02PM

          by bob_super (1357) on Monday July 11 2016, @09:02PM (#373299)

          > There will not be widespread self-driving cars in any of our lifetimes. It's a fantasy. Sure, we'll have something
          > like cruise-control on steroids, but if you think you're going to be able to call a driverless Uber to take you to your
          > 4 hour per day job, you might as well just wait for a personal jet pack.

          I'm going to have to disagree, because the tech is almost there, at least for roads in "first-world" countries. It's not a pipe dream with physics getting in the way.

          We're still missing cheap enough sensors and processing power to deal with the worst corner cases, during which a single operator based halfway around the world could already get pinged to tell the safely-stopped machine what to do next.

          Are there enough dollars/euros/yuan/yen to make it widespread? Good question. I expect constant monitoring, insurance rates, mandatory maintenance and such to create a strong incentive (heck, you can't drive an old car inside Paris anymore). But the tech will work better than the humans before we have time to finish hunting all the lawyers.

          • (Score: 2, Insightful) by tftp on Tuesday July 12 2016, @05:32AM

            by tftp (806) on Tuesday July 12 2016, @05:32AM (#373472) Homepage

            because the tech is almost there

            The last 10% of any project always consumes 90% of the budget.

            An automatic car on a road where only automatic cars are allowed? Not a problem. Can be done tomorrow.

            An automatic car on a road where any vehicle may appear? No way. Hey, even pretty flexible humans have problems with that - and a computer does not even come close to a sentient being who actually *understands* what is happening.

            That truck half a mile ahead? That would have been an obvious observation for a human, prompting effortless, light braking - up to a full stop, if need be (a fire truck, an accident, construction, etc.). Even if the driver cannot *clearly* see the obstacle, he will be watching it and decreasing speed if necessary - humans are not that bad at detecting deviations from a pattern (such as an empty, straight or curved road). If a computer does not see an obstacle, it presumes that there is none and barges ahead.

            A fully automated self-driving car that can tackle any situation has to be equipped with a decent AI. There is no workaround. You cannot depend on fixed algorithms if you sometimes have to stop and ask the officer who has the road blocked. I have done that.

          • (Score: 2) by ilPapa on Wednesday July 13 2016, @02:27AM

            by ilPapa (2366) on Wednesday July 13 2016, @02:27AM (#373928) Journal

            I'm going to have to disagree, because the tech is almost there, at least for roads in "first-world" countries. It's not a pipe dream with physics getting in the way.

            The project director for Google Cars disagrees with you. This week, he changed his prediction for go-anywhere driverless cars from "end of the decade" to "30 years away".

            And thirty years away in tech is the same as "never".

            You'll have a personal 3D-printed jet-pack before you have a driverless car that can take you where you want to go.

            --
            You are still welcome on my lawn.
            • (Score: 2) by bob_super on Wednesday July 13 2016, @04:21PM

              by bob_super (1357) on Wednesday July 13 2016, @04:21PM (#374095)

              > The project director for Google Cars disagrees with you.

              No, he doesn't.
              I said "[at] least for roads in "first-world" countries", and you reply with "go-anywhere".
              I've spent enough time in Asia to know the difference.

              Billions of people take the bus and/or the train every day. Those don't go everywhere; they just get you close enough. If you restrict self-driving cars to roads on which certain standards are met, the tech is already there to carry over 99% of first-worlders daily.

              The major problem with automatic cars is that the manufacturer will have to defend every accident and enforce all maintenance, because people expect to discharge all liability. And that's not a line item any accountant, or their CEO, wants to see on a balance sheet.
              Which probably implies that the Chinese will beat the West to it.

              • (Score: 2) by ilPapa on Thursday July 14 2016, @02:39PM

                by ilPapa (2366) on Thursday July 14 2016, @02:39PM (#374368) Journal

                and you reply with "go-anywhere".

                "Go-anywhere" was not my reply. It was the exact phrase used by the project director of Google Cars.

                And please tell me, if they're only going to stick to the same routes as trains and buses, what is the possible benefit of replacing one train with 500 cars?

                --
                You are still welcome on my lawn.
                • (Score: 2) by bob_super on Thursday July 14 2016, @04:30PM

                  by bob_super (1357) on Thursday July 14 2016, @04:30PM (#374405)

                  > "Go-anywhere" was not my reply. It was the exact phrase used by the project director of Google Cars.

                  Which you quoted to object to my statement...

                  > And please tell me, if they're only going to stick to the same routes as trains and buses, what is the possible benefit of replacing one train with 500 cars?

                  Did I type the "same routes"? Don't be daft.

        • (Score: 2) by TrumpetPower! on Monday July 11 2016, @09:29PM

          by TrumpetPower! (590) <ben@trumpetpower.com> on Monday July 11 2016, @09:29PM (#373308) Homepage

          No, robot cars are coming much sooner than mid-century. Much, much sooner.

          A robot car doesn't need to be Platonically perfect.

          It doesn't need to be a better driver than Mario Andretti.

          It doesn't even need to be able to always get a perfect score on every DMV test.

          It just needs to get in about 10% fewer crashes than the median human driver on the road today -- a bar that we've probably already cleared.

          For individual owners, the rate break you'd get from the insurance company will probably convince enough people to sustain sales -- but that's peanuts compared to commercial driving. A robot truck, for example, isn't limited to eight hours on the road per day; nor does it cost hundreds of thousands of dollars per year in salaries and health insurance and liability insurance and all the rest. Similar calculations apply for all other professional drivers.

          Amazon alone has an overwhelming interest in roboticising all its jobs, including indirect jobs such as moving goods to and between warehouses, which it likely subcontracts to the likes of UPS today. Robot trucks would likely cost Amazon a tenth of what it currently pays, once you factor in increased productivity, reduced overhead, not paying UPS shareholder profits, and the rest...

          ...and if you think any sort of conspiracy could keep them from cutting those costs, let me invite you to bid on my auction for the Brooklyn Bridge.

          Cheers,

          b&

          --
          All but God can prove this sentence true.
          • (Score: 1) by kurenai.tsubasa on Monday July 11 2016, @11:07PM

            by kurenai.tsubasa (5227) on Monday July 11 2016, @11:07PM (#373360) Journal

            Catch-22: there need to be enough robot cars on the road to actually collect the data about safety. I'd bet that insurance companies will actually raise rates at first for robot cars, just because they'll be unproven outside of Google's data.

            - A robot car does need to be Platonically perfect
            - A robot car does need to be a better driver than Mario Andretti
            - A robot car must get a perfect score on every DMV test

            The press shows, over and over again, that every time there's a collision involving either a Tesla or a Google car, it's going to frame Tesla/Google as inherently unsafe. Never underestimate the ability of mass brainwashing to overcome what should be an easy, logical conclusion.

            Sure, eventually the data will speak for itself, but that's going to take quite a while.

            • (Score: 2) by quintessence on Tuesday July 12 2016, @04:34AM

              by quintessence (6227) on Tuesday July 12 2016, @04:34AM (#373460)

              When ABS was introduced, it took a few decades for it to gain traction, and even as early as the 1980s, when it was just starting to become widespread, there were doubts from even innovative manufacturers like BMW as to its effectiveness (which is the reason they gave for not employing it sooner).

              Even today, ABS increases braking distances on slick surfaces. It is still mandatory on all vehicles, and it has still reduced the total number of crashes by a fair amount.

              You're overlooking the history of how features are adopted in the automotive world. An overall gain is what matters from a regulatory standpoint. The brainwashed masses simply won't be early adopters, but short of an outright ban, the numbers will trickle forward with every tech advancement, which comes far faster now than in the 1980s.

              • (Score: 3, Funny) by VanessaE on Tuesday July 12 2016, @11:58AM

                by VanessaE (3396) <vanessa.e.dannenberg@gmail.com> on Tuesday July 12 2016, @11:58AM (#373570) Journal

                When ABS was introduced, it took a few decades for it to gain traction [...]

                If your ABS system is taking decades to gain traction, you're either driving incredibly fast, or you have the worst implementation of that system that was ever devised.

          • (Score: 2) by ilPapa on Thursday July 14 2016, @02:42PM

            by ilPapa (2366) on Thursday July 14 2016, @02:42PM (#374369) Journal

            No, robot cars are coming much sooner than mid-century. Much, much sooner.

            You know more about robot cars than the project director of Google Cars?

            http://www.nytimes.com/2016/07/10/opinion/sunday/silicon-valley-driven-hype-for-self-driving-cars.html?ref=opinion&_r=1 [nytimes.com]

            --
            You are still welcome on my lawn.
      • (Score: 4, Insightful) by tempest on Monday July 11 2016, @08:40PM

        by tempest (3050) on Monday July 11 2016, @08:40PM (#373290)

        Autopilot is not there to allow people autonomous vehicles

        That's the problem: it's called "autopilot", but it doesn't automatically pilot. It should be called driver assist.

      • (Score: 2) by LoRdTAW on Monday July 11 2016, @09:30PM

        by LoRdTAW (3755) on Monday July 11 2016, @09:30PM (#373309) Journal

        You are correct, but the GP is pointing out another angle. There is nothing wrong with pointing out flaws and deficiencies in a system which could cause a catastrophic failure. Fact is, both the driver and the car's auto-pilot failed to function as required. Though the blame still lies with the driver, who did not operate the auto-pilot properly.

        A few questions remain for me, though:
        - Was this system always available and activated by a software update?
        - How are Tesla drivers trained to use such a system?
        - Is there any strict form of education in place, such as requiring the driver to click and/or "sign" an agreement?

    • (Score: 2) by tibman on Monday July 11 2016, @08:30PM

      by tibman (134) Subscriber Badge on Monday July 11 2016, @08:30PM (#373284)

      That does sort of ignore the fact that the car has other cameras and global positioning. I can see the confusion at the DMV though. Sir, you literally have eyeballs in the back of your head, you never get lost, you always have a co-pilot, and you react to road hazards faster than humanly possible. But your vision is so bad that you can't read any advertisements or informational signs. License revoked. It seems like some trade-offs would be acceptable in this case, imo.

      --
      SN won't survive on lurkers alone. Write comments.
    • (Score: 2) by Scruffy Beard 2 on Tuesday July 12 2016, @01:48AM

      by Scruffy Beard 2 (6030) on Tuesday July 12 2016, @01:48AM (#373418)

      As I commented on the other Tesla story, stereo cameras would help in this case.

      You don't have to identify an 8x3 px blob to know it is intercepting your path.

      And human vision is not 33 megapixels: we rely on scanning with our center-of-focus to get high detail.

    • (Score: 1, Insightful) by Anonymous Coward on Tuesday July 12 2016, @09:53AM

      by Anonymous Coward on Tuesday July 12 2016, @09:53AM (#373537)

      My concern is that the instrumentation is substantially inferior to a human. For instance, the forward-looking camera in the Tesla is HD resolution (1280x720) with a wide-angle lens that has a 5.6 arc-min resolution per pixel, compared to a human eye, which resolves about 1 arc-min.

      Human vision is sharp only in a very narrow angle of view. Scene awareness is (arguably badly) established through more-or-less constant scanning of directions of interest, which are chosen by the mind (experience). Instead of a wide-angle lens, they should use scanning cameras on gimbals.

      • (Score: 2) by Scruffy Beard 2 on Tuesday July 12 2016, @02:50PM

        by Scruffy Beard 2 (6030) on Tuesday July 12 2016, @02:50PM (#373638)

        It may still be cheaper and more reliable to just push up the resolution and processing. It probably comes down to power budget.

      • (Score: 1) by tftp on Wednesday July 13 2016, @12:33AM

        by tftp (806) on Wednesday July 13 2016, @12:33AM (#373894) Homepage

        Scene awareness is (arguably badly) established through more-or-less constant scanning of directions of interest, which are chosen by the mind (experience). Instead of a wide-angle lens, they should use scanning cameras on gimbals.

        It's much easier to process a multi-megapixel scene at 60 fps than to gain scene awareness. The latter requires intelligence. I read that one of the tough problems for self-driving cars is distinguishing between an empty plastic bag that gently drifts across the road and a heavy concrete block that just lies in wait for you. Just like that metal piece that skewered one of the Teslas a couple of years ago (no autopilot was involved then).

        Scene awareness is also not something that humans are born with. It takes a lot of training to scan correctly, even though humans can easily classify objects and sort them out by the threat they represent. Can a computer tell the difference between a squirrel and a child, for example? That is an important difference! Can the camera realize that the shape on the road ahead is just a shadow? Or that it is NOT a shadow?

  • (Score: 2) by MostCynical on Monday July 11 2016, @10:03PM

    by MostCynical (2589) on Monday July 11 2016, @10:03PM (#373334) Journal

    http://www.investopedia.com/articles/markets/071116/tesla-stock-higher-despite-production-fears-tsla-scty.asp [investopedia.com]

    Shares were worth $27 in July 2011
    $119 in July 2013
    $220 in July 2014
    The low point since then was $151 in February this year; shares are now back to $224.

    Tl;dr: lots of share price pundits are miffed they didn't recommend Tesla five years ago, or sold too early.

    --
    "I guess once you start doubting, there's no end to it." -Batou, Ghost in the Shell: Stand Alone Complex