Stanford researchers develop technique to see objects hidden around corners
A driverless car is making its way through a winding neighborhood street, about to make a sharp turn onto a road where a child's ball has just rolled. Although no person in the car can see that ball, the car stops to avoid it. This is because the car is outfitted with extremely sensitive laser technology that bounces light off nearby objects to see around corners.
This scenario is one of many that researchers at Stanford University are imagining for a system that can produce images of objects hidden from view. They are focused on applications for autonomous vehicles, some of which already have similar laser-based systems for detecting objects around the car, but other uses could include seeing through foliage from aerial vehicles or giving rescue teams the ability to find people blocked from view by walls and rubble.
Confocal non-line-of-sight imaging based on the light-cone transform (DOI: 10.1038/nature25489) (DX)
Whereas light detection and ranging (LIDAR) systems use such measurements to recover the shape of visible objects from direct reflections, NLOS (non-line-of-sight) imaging reconstructs the shape and albedo of hidden objects from multiply scattered light. Despite recent advances, NLOS imaging has remained impractical owing to the prohibitive memory and processing requirements of existing reconstruction algorithms, and the extremely weak signal of multiply scattered light. Here we show that a confocal scanning procedure can address these challenges by facilitating the derivation of the light-cone transform to solve the NLOS reconstruction problem. This method requires much smaller computational and memory resources than previous reconstruction methods do and images hidden objects at unprecedented resolution. Confocal scanning also provides a sizeable increase in signal and range when imaging retroreflective objects. We quantify the resolution bounds of NLOS imaging, demonstrate its potential for real-time tracking and derive efficient algorithms that incorporate image priors and a physically accurate noise model. Additionally, we describe successful outdoor experiments of NLOS imaging under indirect sunlight.
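The key algorithmic move, roughly: with a confocal scan (illuminating and imaging the same wall point), a change of variables along the time axis turns the measurement model into a 3-D convolution of the hidden volume with a fixed "light cone" kernel, which a standard Wiener filter can invert. Below is a minimal NumPy sketch of that pipeline; the kernel width, the falloff exponent, and every grid choice are illustrative assumptions, not the authors' code.

import numpy as np

def lct_reconstruct(tau, bin_res_m, snr=1e3):
    """tau: (nx, ny, nt) confocal transients measured on the visible wall.
    bin_res_m: one-way distance covered per time bin (c * dt / 2)."""
    nx, ny, nt = tau.shape

    # 1) Falloff correction, then the light-cone change of variables:
    #    resample the time axis so v = t**2 is uniformly spaced, turning the
    #    curved light cones into a single shift-invariant 3-D blur.
    t = (np.arange(nt) + 0.5) * bin_res_m
    tau = tau * (t ** 4)[None, None, :]              # approx. double-bounce falloff
    v = np.linspace(0.0, t[-1] ** 2, nt)
    idx = np.clip(np.searchsorted(t, np.sqrt(v)), 0, nt - 1)
    vol = tau[:, :, idx]                             # now indexed by v, not t

    # 2) An approximate light-cone point-spread function on a padded grid:
    #    a thin shell around x**2 + y**2 = v (a smoothed delta function).
    x = np.linspace(-1.0, 1.0, 2 * nx)
    y = np.linspace(-1.0, 1.0, 2 * ny)
    w = np.linspace(0.0, 2.0, 2 * nt)
    X, Y, W = np.meshgrid(x, y, w, indexing="ij")
    psf = np.exp(-((X ** 2 + Y ** 2 - W) ** 2) / 1e-3)
    psf /= psf.sum()

    # 3) Wiener deconvolution: undo the blur in the Fourier domain.
    pad = np.zeros_like(psf)
    pad[:nx, :ny, :nt] = vol
    H = np.fft.fftn(np.fft.ifftshift(psf, axes=(0, 1)))
    F = np.fft.fftn(pad)
    rec = np.real(np.fft.ifftn(F * np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)))
    return rec[:nx, :ny, :nt]                        # hidden albedo, depth z = sqrt(v)

The payoff is in the complexity: a general NLOS inverse problem couples every measurement to every voxel, whereas a shift-invariant blur inverts with a handful of FFTs, which is where the abstract's claim of much smaller memory and compute requirements comes from.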
Related Stories
Lots of companies are working to develop self-driving cars. And almost all of them use lidar, a type of sensor that uses lasers to build a three-dimensional map of the world around the car. But Tesla CEO Elon Musk argues that these companies are making a big mistake. "They're all going to dump lidar," Musk said at an April event showcasing Tesla's self-driving technology. "Anyone relying on lidar is doomed."
"Lidar is really a shortcut," added Tesla AI guru Andrej Karpathy. "It sidesteps the fundamental problems of visual recognition that is necessary for autonomy. It gives a false sense of progress, and is ultimately a crutch."
In recent weeks I asked a number of experts about these claims. And I encountered a lot of skepticism. "In a sense all of these sensors are crutches," argued Greg McGuire, a researcher at MCity, the University of Michigan's testing ground for autonomous vehicles. "That's what we build, as engineers, as a society—we build crutches."
Self-driving cars are going to need to be extremely safe and reliable to be accepted by society, McGuire said. And a key principle for high reliability is redundancy. Any single sensor will fail eventually. Using several different types of sensors makes it less likely that a single sensor's failure will lead to disaster.
"Once you get out into the real world, and get beyond ideal conditions, there's so much variability," argues industry analyst (and former automotive engineer) Sam Abuelsamid. "It's theoretically possible that you can do it with cameras alone, but to really have the confidence that the system is seeing what it thinks it's seeing, it's better to have other orthogonal sensing modes"—sensing modes like lidar.
Previously: Robo-Taxis and 'the Best Chip in the World'
Related: Affordable LIDAR Chips for Self-Driving Vehicles
Why Experts Believe Cheaper, Better Lidar is Right Around the Corner
Stanford Researchers Develop Non-Line-of-Sight LIDAR Imaging Procedure
Self Driving Cars May Get a New (non LiDAR) Way to See
Nikon Will Help Build Velodyne's Lidar Sensors for Future Self-Driving Cars
(Score: 5, Insightful) by DannyB on Thursday March 08 2018, @07:25PM (6 children)
Self driving cars can see around corners. Unlike puny humans.
Ah, that sounds so much better than saying that killer robots can see around corners. Or fighting vehicles in an urban environment. Or the unmarked van parked out front that can't quite see all of your back yard.
The Centauri traded Earth jump gate technology in exchange for our superior hair mousse formulas.
(Score: 4, Informative) by takyon on Thursday March 08 2018, @07:32PM (2 children)
An autonomous car kind of is a killer robot.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 3, Insightful) by DannyB on Thursday March 08 2018, @07:37PM (1 child)
It is a subclass of killer robot.
The Centauri traded Earth jump gate technology in exchange for our superior hair mousse formulas.
(Score: 1) by cocaine overdose on Thursday March 08 2018, @07:55PM
Robot class is a mechanical construct. Stop oppressing the Silicon-kin. Free Tay.
(Score: 2) by bob_super on Thursday March 08 2018, @08:30PM
That's not a problem, silly person.
Soon enough, only bad guys will hide in the street, while our courageous men pick them off from drones, bots and armored vehicles. The inferiority of human vision is an asset for the Good Guys (TM).
As far as being naked in your backyard goes, we already have full sat and drone coverage. You may want to get that mole checked.
(Score: 2) by realDonaldTrump on Thursday March 08 2018, @08:47PM
The Germans had a VERY CROOKED gun. So they could shoot around corners.
(Score: 0) by Anonymous Coward on Friday March 09 2018, @03:57AM
Humans also can't see backwards, yet cars now come with back-facing cameras to be viewed by humans.
The same goes for IR, radar, etc.
(Score: 1) by cocaine overdose on Thursday March 08 2018, @07:52PM (2 children)
I was going to bitch about the safety of LIDAR, but the experiments are pay-walled and there's no access to the laser specs. I did find that the few other NLOS experiments require Class 3B lasers, which is a no-no boo-boo that could pave the way to taking all non-automated cars off the road once the accidents and human damage start piling up after such a tiny detail was withheld. Another unfortunate detail: most automotive LIDAR sensors use Class 1 lasers (like Waylmao), and EdisonTM doesn't even use LIDAR. Foiled once again by the God-fearing Velodine and their infinite JS loops.
(Score: 1, Informative) by Anonymous Coward on Thursday March 08 2018, @08:22PM (1 child)
They mention this in the paper:
https://www.nature.com/articles/nature25489 [nature.com]
(Score: 1) by cocaine overdose on Thursday March 08 2018, @08:30PM
Thank you. Then it's a harmless Class 1 and equivalent to staring at a candle.
(Score: 1, Interesting) by Anonymous Coward on Thursday March 08 2018, @08:03PM
I wonder if this is the same use of "confocal" as the microscope invented by Marvin Minsky in 1955--before his major work in AI?
https://web.media.mit.edu/~minsky/papers/ConfocalMemoir.html [mit.edu]
Seems likely. Here's a small cutting from his memoir:
(Score: 3, Insightful) by Arik on Thursday March 08 2018, @08:39PM (2 children)
If laughter is the best medicine, who are the best doctors?
(Score: 0) by Anonymous Coward on Thursday March 08 2018, @09:51PM
I sure hope the goggles are able to do something...
(Score: 2) by hemocyanin on Friday March 09 2018, @12:26AM
IANAD (I am not a doctor), but I wonder if it will work that way. In my rudimentary way of thinking, I can place my hand on a piece of metal that's 100 F for as long as I want and never suffer any sort of burn, but if I do the same with one that's at 500 F, I will almost instantly get a severe burn. So there seems to be some threshold effect where I could be exposed to a large total amount of energy at some particular rate and experience no ill effect, but once a threshold is crossed, the effect is immediate. I suppose there is a middle ground -- if I grab a piece of metal at 120 F and don't let go, how long will it take to get a burn? If I grab that piece of metal for one second and let go for 10 -- can I do that essentially forever without getting burned?
So blah blah blah -- what do we have here? Lasers that are safe no matter how long you stare at them? Ones that are instantly damaging? Or ones that will cause damage with sufficient continuous exposure?
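hemocyanin's threshold intuition is easy to make concrete with a toy lumped-heating model: the "tissue" warms in proportion to absorbed power, cools toward ambient, and "damage" is just crossing a temperature threshold. Every constant below is invented for illustration; real laser limits come from the maximum-permissible-exposure tables in standards like IEC 60825 and depend on wavelength, pulse width, and spot size.

def time_to_burn(power_w, on_s, off_s, k_heat=40.0, k_cool=0.5,
                 t_ambient=37.0, t_burn=60.0, dt=0.01, t_max=600.0):
    """Seconds until the tissue crosses t_burn, or None if it never does."""
    temp, t, period = t_ambient, 0.0, on_s + off_s
    while t < t_max:
        heating = power_w if (t % period) < on_s else 0.0  # grabbing or letting go
        # forward-Euler step: absorbed heating plus Newton's-law cooling
        temp += (k_heat * heating - k_cool * (temp - t_ambient)) * dt
        if temp >= t_burn:
            return round(t, 2)
        t += dt
    return None

print(time_to_burn(0.2, on_s=1.0, off_s=0.0))    # steady low power: None (never burns)
print(time_to_burn(2.0, on_s=1.0, off_s=10.0))   # 1 s on, 10 s off: ~0.31 s

With these made-up numbers, the steady case settles below the threshold and never burns, while the pulsed case crosses it during the very first one-second pulse despite delivering less average power. The grab-and-release trick only works when off-time cooling outruns on-time heating, which is roughly why laser exposure limits depend on pulse duration and not just total energy.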
(Score: 0) by Anonymous Coward on Friday March 09 2018, @06:04AM
Has anyone heard anything about how well all these radar and LIDAR systems operate alongside each other? Will a street full of them cause mutual interference?
These systems work outside of human visual range, but other animals have different ranges. What is all our 'light' pollution doing to everything else?
(Score: 2) by shortscreen on Friday March 09 2018, @07:23AM
Sample a 2D array of points and then calculate the attributes of the 3D scene, instead of the other way around.
If they can get this working in realtime, would the same hardware be suitable for realtime raytracing?
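For flavor, here's the "other way around" in its naive form (not the paper's light-cone transform, which replaces this loop with a closed-form deconvolution): every 2-D time-resolved sample votes for all hidden voxels at the matching distance. Array names and the sampling geometry are illustrative assumptions.

import numpy as np

def backproject(tau, scan_pts, voxels, bin_res_m):
    """tau[i, k]: photon count at wall scan point i, time bin k.
    scan_pts: (n, 3) laser/sensor positions on the wall (the 2-D sample array).
    voxels:   (m, 3) candidate hidden-scene points.
    bin_res_m: one-way distance per time bin (round trip folded in)."""
    albedo = np.zeros(len(voxels))
    for i, p in enumerate(scan_pts):
        d = np.linalg.norm(voxels - p, axis=1)     # distance to every voxel
        k = (d / bin_res_m).astype(int)            # matching time bin
        ok = k < tau.shape[1]
        albedo[ok] += tau[i, k[ok]]                # smear the sample onto a shell
    return albedo                                  # bright voxels ~ hidden surfaces

As in a ray tracer, the hot loop is an embarrassingly parallel distance computation between a set of samples and a 3-D grid, so hardware that can do one in realtime is at least a plausible fit for the other.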