

posted by Fnord666 on Wednesday April 24 2019, @09:55AM   Printer-friendly
from the take-me-to-Anchorage,-Alaska dept.

According to a [PDF] paper to be presented at the 2019 Conference on Computer Vision and Pattern Recognition, June 15-21 in Long Beach, California, researchers have discovered a "simple, cost-effective, and accurate new method" of enabling self-driving cars to recognize 3D objects in their path.

Currently, LiDAR (Light Detection and Ranging) sensors rely on bulky, expensive lasers, scanners, and specialized GPS receivers mounted on top of the vehicle. This increases drag, is unsightly, and adds roughly $10,000 to the price tag. Until now, it has been the only viable option.

Cornell researchers have discovered that a simpler method, using two inexpensive cameras mounted on either side of the windshield, can detect objects with nearly LiDAR's accuracy and at a fraction of the cost. The researchers found that analyzing the captured images from a bird's-eye view, rather than the more traditional frontal view, more than tripled their accuracy, making stereo cameras a viable, low-cost alternative to LiDAR.

According to the paper, which goes into this in considerable depth, the difference in accuracy stems not from the quality of the images and data but from how the data are represented. Changing the representation brings 3D object detection using far less expensive camera data up to nearly the effectiveness of much more expensive LiDAR.
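The representation change can be sketched in a few lines. The summary does not include the paper's code, so the following is an illustrative reconstruction of the general idea only: back-project each pixel of an estimated depth map into a 3D point cloud using the pinhole camera model, then flatten the points onto the ground plane to obtain a bird's-eye-view grid. All function names, parameter values, and grid ranges here are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def depth_to_point_cloud(depth, fx, fy, cx, cy):
    """Back-project a per-pixel depth map into 3D camera coordinates.

    depth: (H, W) array of metric depths (e.g. estimated from stereo).
    fx, fy, cx, cy: pinhole-camera intrinsics (focal lengths, principal point).
    Returns an (H*W, 3) array of (x, y, z) points.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth                       # depth along the optical axis
    x = (u - cx) * z / fx           # lateral offset
    y = (v - cy) * z / fy           # vertical offset
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def birds_eye_grid(points, x_range=(-40.0, 40.0), z_range=(0.0, 80.0), cell=0.1):
    """Quantize the (x, z) ground-plane coordinates of a point cloud into
    a 2D occupancy grid: the 'bird's-eye view' representation.
    Ranges are in meters and chosen here purely for illustration."""
    xs, zs = points[:, 0], points[:, 2]
    keep = (xs >= x_range[0]) & (xs < x_range[1]) & \
           (zs >= z_range[0]) & (zs < z_range[1])
    xi = ((xs[keep] - x_range[0]) / cell).astype(int)
    zi = ((zs[keep] - z_range[0]) / cell).astype(int)
    grid = np.zeros((int((z_range[1] - z_range[0]) / cell),
                     int((x_range[1] - x_range[0]) / cell)))
    grid[zi, xi] = 1.0              # mark occupied cells
    return grid
```

In the frontal view, distant objects shrink and overlap; in the bird's-eye grid, object shapes and distances stay metrically consistent, which is one plausible reading of why the representation alone makes such a difference to the detector.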

Kilian Weinberger, associate professor of computer science and senior author of the paper, notes that

stereo cameras could potentially be used as the primary way of identifying objects in lower-cost cars, or as a backup method in higher-end cars that are also equipped with LiDAR.

The paper concludes that future work may further improve image-based 3D object detection by exploiting the denser data feed from cameras, fully closing the gap with LiDAR.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 5, Interesting) by JoeMerchant on Wednesday April 24 2019, @01:05PM (4 children)

    by JoeMerchant (3937) on Wednesday April 24 2019, @01:05PM (#834311)

You can keep your windshield clear enough to see ahead, can't you?
    Why couldn't you use the same methods for the cameras?

Most of the time, anyway; early-morning low sun angles with condensation and other glitter/glare sources on the glass can become pretty challenging even for a wetware neural net.

    The thing about a windshield is that it's a huge area, and if a bug splatters in the middle of it you can adjust your sight lines to compensate. Picture, instead of a windshield, a pair of goggles on your eyes - when a bug splatters on a small area like that, it's a much bigger problem.

    In-car cameras for racing have used rolling protective films, with limited success, and that's just for an entertainment feed. Open cockpit drivers have tear-off covers on their helmet lenses, and again, even in a race that lasts just a few hours it's not uncommon for those to become less than adequate when something like a rainstorm happens.

    --
    🌻🌻 [google.com]
  • (Score: 2) by PiMuNu on Wednesday April 24 2019, @01:33PM (1 child)

    by PiMuNu (3823) on Wednesday April 24 2019, @01:33PM (#834322)

    I wear glasses and often find myself wandering around with all sorts of gloop on them.

    • (Score: 2) by JoeMerchant on Wednesday April 24 2019, @01:43PM

      by JoeMerchant (3937) on Wednesday April 24 2019, @01:43PM (#834331)

      Me too, and my neural net somehow compensates. What amazes me is that sometimes I can't read with my right or left eye alone, but the two together manage to do it.

      --
      🌻🌻 [google.com]
  • (Score: 2) by RedIsNotGreen on Thursday April 25 2019, @01:00AM (1 child)

    by RedIsNotGreen (2191) on Thursday April 25 2019, @01:00AM (#834580) Homepage Journal

    Luckily, with the current rapidly progressing insect extinction event, bug splatter won't be an issue for much longer.

    • (Score: 2) by JoeMerchant on Thursday April 25 2019, @01:59AM

      by JoeMerchant (3937) on Thursday April 25 2019, @01:59AM (#834588)

      I don't think it really works like that - with fewer species I'd expect bigger and more frequent plagues of the survivors...

      --
      🌻🌻 [google.com]