
posted by Fnord666 on Sunday November 24 2019, @10:58PM
from the If-only-you-could-see-what-I’ve-seen-with-your-eyes dept.

Arthur T Knackerbracket has found the following story:

It's never good when a giant of the technology business describes your product as "a fool's errand".

But that's how Tesla's chief executive Elon Musk branded the laser scanning system Lidar, which is being touted as the best way for autonomous cars to sense their environment.

In April he said Lidar was "expensive" and "unnecessary". He believes that cameras combined with artificial intelligence will be enough to allow cars to roam the streets without a human driver.

Lidar emits laser beams and measures how long they take to bounce back from objects, producing so-called point clouds that are used to draw 3D maps of the surroundings.

These point clouds can be analysed by computers to recognise objects as small as a football or as big as a football field, and to measure distances very accurately.
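
To make the time-of-flight arithmetic concrete, here is a minimal sketch in Python (the names and numbers are illustrative, not taken from any particular Lidar system): the beam travels out and back, so the one-way distance is half the round-trip time multiplied by the speed of light.

    # Minimal time-of-flight sketch: one-way distance is half of
    # (speed of light * round-trip time).
    C = 299_792_458.0  # speed of light, in m/s

    def tof_distance_m(round_trip_s: float) -> float:
        return C * round_trip_s / 2.0

    # A return arriving ~66.7 nanoseconds after emission implies an
    # object roughly 10 metres away:
    print(tof_distance_m(66.7e-9))  # ~10.0

Repeating this measurement across many pulses as the scanner sweeps yields the point cloud described above.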

Despite Mr Musk's criticism, some argue these $10,000 (£7,750) pieces of kit are going to be essential. "For a car to reach anything close to full autonomy it will need Lidar," says Spardha Taneja of Ptolemus Consulting Group, a mobility consultancy.

But why are experts so divided, and how should investors judge this potential gold mine?


Original Submission

 
  • (Score: 3, Interesting) by JoeMerchant on Monday November 25 2019, @03:36AM (2 children)

    by JoeMerchant (3937) on Monday November 25 2019, @03:36AM (#924386)

    Cameras are still objectively inferior to LIDAR in extremely poor lighting conditions

    Cool thing they came up with for automobiles back in the 1900s: headlights.

    I've often wondered if computer vision systems for cars could isolate an IR band for the cameras and establish some kind of TDM (time-division multiplexing) with oncoming cars, so both could use super-bright high beams, each only on for something like a 25% duty cycle, with the phase controlled by direction of travel.
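
    Something like this back-of-the-envelope sketch, perhaps (the slot scheme and all the numbers are purely hypothetical, just to make the duty-cycle idea concrete):

        # Hypothetical 4-slot time-division scheme: each car picks a
        # slot from its compass heading, so oncoming traffic (roughly
        # opposite headings) never fires IR high beams at the same time.
        SLOTS = 4      # 25% duty cycle per car
        SLOT_MS = 10   # slot length; full cycle = 40 ms

        def slot_for_heading(heading_deg: float) -> int:
            # Quantise heading into one of four 90-degree sectors.
            return int((heading_deg % 360) // 90)

        def beam_on(heading_deg: float, time_ms: float) -> bool:
            # Fire this car's IR high beams only during its own slot.
            return int(time_ms // SLOT_MS) % SLOTS == slot_for_heading(heading_deg)

        # A northbound car (0 deg) gets slot 0 and a southbound car
        # (180 deg) gets slot 2, so they never illuminate each other.

    All the cars would need a shared clock for the phases to line up; GPS time would probably do.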

    --
    🌻🌻 [google.com]
  • (Score: 2) by edIII on Monday November 25 2019, @09:27PM (1 child)

    by edIII (791) on Monday November 25 2019, @09:27PM (#924643)

    So what? I'm talking about recovering from, or being more reliable in, adverse conditions. Cameras could be set up for multiple spectrums, including infrared. Heck, you could hook it up with night vision and IR headlights and drift silently through the night :)

    However, cameras are not foolproof, and they're still fundamentally making assumptions. LIDARs don't make assumptions. The laser is informing us that there is absolutely no doubt that something reflected the laser light back to us, and that it was exactly X distance from the collector at Y time. That's superior data all day long.

    Pick any sensor type and we can find limitations with it. BTW, headlights can actually be the problem too, and not the solution. If the camera's spectrum is completely saturated (snow blind), the AI can no longer make assumptions about the pixels or derive spatial information from them. This is where a different sensor type that explicitly operates within the "blind spots" of other sensors can help.

    In the end you guys are still arguing for just eyes, ears, or noses. Why not all of the above?
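
    Even a toy fusion scheme shows the benefit (everything below is made up for illustration; real sensor fusion is far more sophisticated):

        # Toy fusion: each sensor reports a confidence that an obstacle
        # is present, or None if it is blinded (e.g. a snow-saturated
        # camera). Blinded sensors abstain; the rest still decide.
        def obstacle_detected(readings: dict, threshold: float = 0.5) -> bool:
            votes = [c for c in readings.values() if c is not None]
            return bool(votes) and sum(votes) / len(votes) >= threshold

        # Camera is snow-blind, but LIDAR and radar still see something:
        print(obstacle_detected({"camera": None, "lidar": 0.9, "radar": 0.7}))  # True

    One blinded sensor then degrades the system instead of killing it.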

    --
    Technically, lunchtime is at any moment. It's just a wave function.
    • (Score: 2) by JoeMerchant on Tuesday November 26 2019, @02:30AM

      by JoeMerchant (3937) on Tuesday November 26 2019, @02:30AM (#924757)

      LIDARs don't make assumptions

      No, people make assumptions. Assumptions like: the laser has free space to travel through and will not be obscured by dust, bugs, rain, sleet or snow; backscatter from the other 200 self-driving cars on the nearby road will not confound the time-of-flight calculations of MY LIDAR; and hoodlums with spray-glitter won't hit my car with it.

      Optical camera users make lots of similar assumptions, and sonar - whoa Nellie, assumption city.

      That's superior data all day long.

      No, that's active illumination, and it works until it doesn't.

      If the camera's spectrum is completely saturated (snow blind)

      Then you need to spend an extra $0.50 on the lens setup and put in some kind of iris, or polarizing light damper, or whatever other cheap and reliable light amplitude reduction trick has been invented since the first pinhole camera.

      you guys are still arguing for just eyes, ears, or noses. Why not all of the above?

      Not so much arguing, just pointing out that multiple sensors are better than single points of failure. If your sensor budget is $10K, do you blow it all on a single LIDAR, or on 100 fixed optical cameras scattered all over the vehicle? Of course, if you've got $20K then it makes even more sense not to just blow it on 2 LIDARs.

      Why not drive by ears, or noses? Nope, not even starting a rebuttal for that.

      --
      🌻🌻 [google.com]