Lots of companies are working to develop self-driving cars. And almost all of them use lidar, a type of sensor that uses lasers to build a three-dimensional map of the world around the car. But Tesla CEO Elon Musk argues that these companies are making a big mistake. "They're all going to dump lidar," Elon Musk said at an April event showcasing Tesla's self-driving technology. "Anyone relying on lidar is doomed."
"Lidar is really a shortcut," added Tesla AI guru Andrej Karpathy. "It sidesteps the fundamental problems of visual recognition that is necessary for autonomy. It gives a false sense of progress, and is ultimately a crutch."
In recent weeks I asked a number of experts about these claims. And I encountered a lot of skepticism. "In a sense all of these sensors are crutches," argued Greg McGuire, a researcher at MCity, the University of Michigan's testing ground for autonomous vehicles. "That's what we build, as engineers, as a society—we build crutches."
Self-driving cars are going to need to be extremely safe and reliable to be accepted by society, McGuire said. And a key principle for high reliability is redundancy. Any single sensor will fail eventually. Using several different types of sensors makes it less likely that a single sensor's failure will lead to disaster.
"Once you get out into the real world, and get beyond ideal conditions, there's so much variability," argues industry analyst (and former automotive engineer) Sam Abuelsamid. "It's theoretically possible that you can do it with cameras alone, but to really have the confidence that the system is seeing what it thinks it's seeing, it's better to have other orthogonal sensing modes"—sensing modes like lidar.
Previously: Robo-Taxis and 'the Best Chip in the World'
Related: Affordable LIDAR Chips for Self-Driving Vehicles
Why Experts Believe Cheaper, Better Lidar is Right Around the Corner
Stanford Researchers Develop Non-Line-of-Sight LIDAR Imaging Procedure
Self Driving Cars May Get a New (non LiDAR) Way to See
Nikon Will Help Build Velodyne's Lidar Sensors for Future Self-Driving Cars
Related Stories
Smaller, cheaper lidars are being developed. One of the most promising comes in the minuscule form of a silicon chip. Prototypes have been delivered to several big automotive-component suppliers, including Delphi and ZF. If all goes well, within three years or so lidar chips should start popping up in vehicles.
[...] Typically, a lidar employs revolving mirrors to direct its laser beam, which is usually in the invisible near-infrared part of the spectrum, rather than the visible part. Commercial lidar can cost $50,000 or so a pop, but smaller, lower-powered versions are now available for $10,000 or less. A number of lidar makers, such as Velodyne, a Californian firm, are trying to develop what they call "solid-state" lidars, which are miniaturised versions with no moving parts. Some researchers are using a flash of laser light instead of a beam, and capturing the reflections with an array of tiny sensors on a chip.
Infineon, however, has taken a different tack and is using a micro-electro-mechanical system (MEMS). This particular MEMS was invented by Innoluce, a Dutch firm which Infineon bought in October 2016. The device consists of an oval-shaped mirror, just 3mm by 4mm, contained on a bed of silicon. The mirror is connected to actuators that use electrical resonance to make it oscillate from side to side, changing the direction of the laser beam it is reflecting. This, says Infineon, permits the full power of the laser to be used for scanning instead of its light being dispersed, as it would be in a flash-based system.
The MEMS lidar can scan up to 5,000 data points from a scene every second, and has a range of 250 metres, says Ralf Bornefeld, Infineon's head of automotive sense and control. Despite its moving mirror, he thinks it should prove as robust and reliable as any other silicon chip. In mass production and attached to, say, a windscreen, the MEMS lidar is expected to cost a carmaker less than $250. These tiny lidars would have other applications, too—in robots and drones, for example.
On November 3, 2007, six vehicles made history by successfully navigating a simulated urban environment—and complying with California traffic laws—without a driver behind the wheel. Five of the six were sporting a revolutionary new type of lidar sensor that had recently been introduced by an audio equipment maker called Velodyne.
A decade later, Velodyne's lidar continues to be a crucial technology for self-driving cars. Lidar costs are coming down but remain high. Velodyne and a swarm of startups are trying to change that.
Some experts believe the key to building lidar that costs hundreds of dollars instead of thousands is to abandon Velodyne's mechanical design—where a laser physically spins around 360 degrees, several times per second—in favor of a solid-state design that has few if any moving parts. That could make the units simpler, cheaper, and much easier to mass-produce.
Nobody knows how long it will take to build cost-effective automotive-grade lidar. But all of the experts we talked to were optimistic. They pointed to the many previous generations of technology—from handheld calculators to antilock brakes—that became radically cheaper as they were manufactured at scale. Lidar appears to be on a similar trajectory, suggesting that in the long run, lidar costs won't be a barrier to mainstream adoption of self-driving cars.
https://arstechnica.com/cars/2018/01/driving-around-without-a-driver-lidar-technology-explained/
-- submitted from IRC
Stanford researchers develop technique to see objects hidden around corners
A driverless car is making its way through a winding neighborhood street, about to make a sharp turn onto a road where a child's ball has just rolled. Although no person in the car can see that ball, the car stops to avoid it. This is because the car is outfitted with extremely sensitive laser technology that reflects off nearby objects to see around corners.
This scenario is one of many that researchers at Stanford University are imagining for a system that can produce images of objects hidden from view. They are focused on applications for autonomous vehicles, some of which already have similar laser-based systems for detecting objects around the car, but other uses could include seeing through foliage from aerial vehicles or giving rescue teams the ability to find people blocked from view by walls and rubble.
Confocal non-line-of-sight imaging based on the light-cone transform (DOI: 10.1038/nature25489) (DX)
Whereas light detection and ranging (LIDAR) systems use such measurements to recover the shape of visible objects from direct reflections, NLOS [(Non Line Of Sight)] imaging reconstructs the shape and albedo of hidden objects from multiply scattered light. Despite recent advances, NLOS imaging has remained impractical owing to the prohibitive memory and processing requirements of existing reconstruction algorithms, and the extremely weak signal of multiply scattered light. Here we show that a confocal scanning procedure can address these challenges by facilitating the derivation of the light-cone transform to solve the NLOS reconstruction problem. This method requires much smaller computational and memory resources than previous reconstruction methods do and images hidden objects at unprecedented resolution. Confocal scanning also provides a sizeable increase in signal and range when imaging retroreflective objects. We quantify the resolution bounds of NLOS imaging, demonstrate its potential for real-time tracking and derive efficient algorithms that incorporate image priors and a physically accurate noise model. Additionally, we describe successful outdoor experiments of NLOS imaging under indirect sunlight.
Tesla promises 'one million robo-taxis' in 2020
Submitted via IRC for ErnestGoesToSpace
Tesla Promises Investors 'One Million Robo-Taxis' by 2020
To kick things off, the company shared that it had built its very own computer for self-driving cars. The neural network chip was built from the ground up; the project started back in 2016. Each computer (which is stored behind the glove box) has redundancy so that if one chip fails, the second chip can take over.
This is the company's first time building its own silicon. CEO Elon Musk was quick to boast that Tesla, "which has never designed a chip, designed the best chip in the world."
Musk reiterated what he's said before about the hardware available in Teslas. "All Tesla cars right now have everything necessary for self-driving available today. All you need to do is improve the software."
That hardware includes the company's reliance on cameras and radar. When the subject of LiDAR (Light Detection and Ranging) came up, Musk said "LiDAR is a fool's errand. Anyone that's relying on LiDAR is doomed." He later added that "it's fricking stupid. It's expensive and unnecessary."
Source: https://www.engadget.com/2019/04/22/tesla-elon-musk-self-driving-robo-taxi/
Tesla Vaunts Creation of 'The Best Chip in the World' for Self-Driving
At its "Autonomy Day" today, Tesla detailed the new custom chip that will be running the self-driving software in its vehicles. Elon Musk rather peremptorily called it "the best chip in the world... objectively." That might be a stretch, but it certainly should get the job done.
Called for now the "full self-driving computer," or FSD Computer, it is a high-performance, special-purpose chip built (by Samsung, in Texas) solely with autonomy and safety in mind. Whether and how it actually outperforms its competitors is not a simple question and we will have to wait for more data and closer analysis to say more.
Robotaxis also at TechCrunch.
According to a [PDF] paper to be presented at the 2019 Conference on Computer Vision and Pattern Recognition, June 15-21 in Long Beach, California, researchers have discovered a "simple, cost-effective, and accurate new method" of enabling self-driving cars to recognize 3D objects in their path.
Currently, the bulky, expensive lasers, scanners, and specialized GPS receivers used in LIDAR (Light Detection And Ranging) sensors are mounted on top of the vehicle. This increases drag, is unsightly, and adds roughly $10,000 to the price tag. Until now, this has been the only viable option.
Cornell researchers have discovered that a simpler method, using two inexpensive cameras on either side of the windshield, can detect objects with nearly LiDAR's accuracy and at a fraction of the cost. The researchers found that analyzing the captured images from a bird's eye view rather than the more traditional frontal view more than tripled their accuracy, making stereo cameras a viable and low-cost alternative to LiDAR.
According to the paper, which goes into this in considerable depth, it is not the quality of the images and data that causes the difference in accuracy, but the representation of the data. Adjusting that representation brings object-detection results using far less expensive camera data up to nearly the effectiveness of much more expensive LiDAR.
Kilian Weinberger, associate professor of computer science and senior author of the paper, notes that
stereo cameras could potentially be used as the primary way of identifying objects in lower-cost cars, or as a backup method in higher-end cars that are also equipped with LiDAR.
The paper concludes that future work may improve image-based 3D object detection using the denser data feed from cameras further, fully closing the gap with LiDAR.
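The representation change at the heart of the paper can be sketched: given a depth map estimated from the stereo pair and the camera intrinsics, every pixel back-projects to a 3D point via the standard pinhole model, yielding a LiDAR-like point cloud that can then be viewed from above. A minimal illustration (the function name and intrinsics here are invented for the example; this is not the authors' code):

```python
import numpy as np

def depth_to_pseudo_lidar(depth, fx, fy, cx, cy):
    """Back-project a depth map (metres) into an N x 3 point cloud
    using the pinhole model: X = (u - cx) * Z / fx, Y = (v - cy) * Z / fy."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return np.stack([x, y, depth], axis=-1).reshape(-1, 3)

# Toy 2x2 "depth map": everything 10 m away, 1000 px focal length.
cloud = depth_to_pseudo_lidar(np.full((2, 2), 10.0), 1000.0, 1000.0, 0.5, 0.5)
# Dropping the Y (height) coordinate of these points gives the
# bird's-eye view the researchers found so much more accurate.
```

A 3D detector can then consume `cloud` much as it would consume real LiDAR returns, which is the core of the trick.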
Submitted via IRC for ErnestTBass
Nikon will help build Velodyne's lidar sensors for future self-driving cars
With the notable exception of one automaker, most companies are generally in agreement that lidar is a vital component of the hardware necessary to enable some degree of vehicle autonomy. However, with all that demand out there, any company that wants its product all over the industry will need to build at scale. To achieve that scale, one lidar manufacturer is reaching out to a company with a lot of lens experience.
Velodyne announced on Thursday that it has signed an agreement with Nikon, in which the company most famous for its cameras will manufacture lidar sensors for Velodyne. Nikon plans to start mass production of Velodyne's lidar in the second half of 2019.
"Mass production of our high-performance lidar sensors is key to advancing Velodyne's immediate plans to expand sales in North America, Europe, and Asia," said Marta Hall, president of Velodyne Lidar, in a statement. "It is our goal to produce lidar in the millions of units with manufacturing partners such as Nikon. Working with Nikon, an expert in precision manufacturing, is a major step toward lowering the cost of our lidar products."
Nikon has already invested $25 million in Velodyne's business, so this manufacturing announcement represents the first big step in their partnership. Velodyne didn't specify how else the two companies plan to join forces, saying only that the pair "will continue to investigate further areas of a wide-ranging and multifaceted business alliance." Velodyne did say, though, that it wants its lidar to be used beyond automotive applications, including agriculture, mapping and security.
Velodyne Will Sell a Lidar for $100
Velodyne claims to have broken the US $100 barrier for automotive lidar with its tiny Velabit, which it unveiled at CES earlier this month.
"Claims" is the mot juste because this nice, round dollar amount is an estimate based on the mass-manufacturing maturity of a product that has yet to ship. Such a factoid would hardly be worth mentioning had it come from some of the several-score odd lidar startups that haven't shipped anything at all. But Velodyne created this industry back during DARPA-funded competitions, and has been the market leader ever since.
"The projection is $100 at volume; we'll start sampling customers in the next few months," Anand Gopalan, the company's chief technology officer, tells IEEE Spectrum.
The company says in a release that the Velabit "delivers the same technology and performance found on Velodyne's full suite of state-of-the-art sensors." Given the device's small size, that must mean the solid-state version of the technology. That is, the non-rotating kind.
Related: Why Experts Believe Cheaper, Better Lidar is Right Around the Corner
Nikon Will Help Build Velodyne's Lidar Sensors for Future Self-Driving Cars
Contrary To Musk's Claims, Lidar Has Some Advantages In Self Driving Technology
Artificial Eyes: How Robots Will See In The Future
(Score: 5, Informative) by edIII on Wednesday August 07 2019, @11:10PM (24 children)
Lidar is not doomed, nor does it "sidestep the fundamental problems of visual recognition".
It's a SENSOR. Using several different types to provide input to the AI programs only makes sense. Cameras don't have depth perception, while lidar does. Using multiple cameras from multiple angles can help you get that data using image-reconstruction algorithms; that's how a Kinect works. But how is determining the distance to an object with AI reconstruction methods on static images superior to using an actual farking laser beam? Last I checked, lasers could actually determine the surface temperature of an object too.
These technologies complement one another in the same way a human body can detect heat and pressure. It's all just information, and we're feeding it to AI anyway. More information, and more redundant systems for providing that information, can only aid the AI in its task. It's like a human being that may be temporarily blinded, but can still attempt to navigate by feel, touch, and sound.
Elon Musk is just being a dick trying to discredit other navigational systems while hawking his own crap as superior.
Technically, lunchtime is at any moment. It's just a wave function.
(Score: 3, Funny) by Anonymous Coward on Wednesday August 07 2019, @11:21PM (2 children)
Mod parent down. Elon Musk is not an arrogant prick, he is a gift from God to all mankind.
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @01:03AM (1 child)
* [citation needed]
(Score: 1, Funny) by Anonymous Coward on Thursday August 08 2019, @01:19AM
https://twitter.com/elonmusk [twitter.com]
(Score: 2, Informative) by Runaway1956 on Thursday August 08 2019, @12:24AM (6 children)
Musk should take at least a brief look at warships. The typical warship has radar, of course, but looking more closely, there is obvious repetitive redundancy in the masts: multiple straight antennae, along with multiple radar domes, each with an intimidating name. Ships have several radio antennas on those masts as well. Sonar is redundant, in that multiple different frequencies are used, each in several different ways. As if that weren't enough, one of the ships I served on had a passive sonar fish that was towed behind the ship, on miles of cable, that could dive below the thermocline. Repetitive redundancy is everywhere on a warship.
Why all of that redundancy? It's to make the ship SURVIVABLE, both in peacetime and in combat.
Few motorists would consider survivability a "bad thing".
Stick those damned "extra" sensors in there. The car already costs multiple tens of thousands of dollars. Lidar increases the cost by $800 per vehicle? Don't be a cheapskate, put it in there!
“I have become friends with many school shooters” - Tampon Tim Walz
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @12:39AM (4 children)
Lidar adds approximately $10 to $100,000 to the cost of the vehicle.
(Score: 2) by Mykl on Thursday August 08 2019, @12:48AM (2 children)
Am I right in assuming that you meant $10,000 to $100,000?
(Score: 1, Interesting) by Anonymous Coward on Thursday August 08 2019, @12:59AM
NO! [spar3d.com]
(Score: 2) by deimtee on Thursday August 08 2019, @01:03AM
Probably not.
A single distance-sensing laser (e.g. a cheap handheld distance meter) can probably be sourced for less than $10 per unit.
A full-on 'warship' LIDAR array as described by Runaway above could easily go above $100,000.
200 million years is actually quite a long time.
(Score: 2) by Runaway1956 on Thursday August 08 2019, @01:36PM
Y U HATE CAPITALISM????
Just because you can find a chip for ten bucks, doesn't mean you will find anyone who will install that chip for you, along with all the rest of the hardware and electronics to make the chip useful for ten bucks. I picked $800 as a semi-rational figure, because I know it's not super expensive, but no one is going to do it for free.
(Score: 2) by c0lo on Thursday August 08 2019, @02:54AM
As a motorist, I confirm I'd like to see cars with miles of cable in tow if those cars are more survivable ⚆_⭕ (grin)
https://www.youtube.com/@ProfSteveKeen https://soylentnews.org/~MichaelDavidCrawford
(Score: 2) by legont on Thursday August 08 2019, @02:38AM (7 children)
Not trying to get into a lidar-vs-whatever discussion, but more information does not mean a better decision. It is good to have, say, vision and hearing, while it is not good to have two visions.
"Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @02:55AM (1 child)
(Score: 1) by khallow on Thursday August 08 2019, @12:32PM
Yet.
(Score: 3, Touché) by c0lo on Thursday August 08 2019, @02:56AM
Ummm... binocular vision, not good?
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @10:44AM
Have you had one of your eyes removed yet?
(Score: 2) by Runaway1956 on Thursday August 08 2019, @01:37PM
So - you would prefer to have bat sonar, rather than lidar? QUICK - to the Batmobile, and I'll show you how it works!
(Score: 2) by Runaway1956 on Thursday August 08 2019, @01:43PM (1 child)
Let's say you've got six sensing systems. You can figure that one goes out now and then. So, fail-safe it. If three or more of your systems agree that you're good to go, you can go - at reduced speed. Nobody gets stranded at the side of the road because one out of six isn't happy.
Back aboard ship again - we had four big-ass boilers for main power. If one boiler was sick, we ran with three. That's how repetitive redundancy is supposed to work.
Your autonomous vehicle should have lots of different sensors, and the car should only shut down if multiple sensors agree that conditions are actually unsafe.
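The quorum rule described above could look something like this (a hypothetical sketch; the sensor names and the three-of-six threshold are invented for illustration):

```python
def can_drive(sensor_ok, quorum=3):
    """Fail-soft check: keep driving (at reduced speed when degraded)
    as long as at least `quorum` sensing systems report healthy."""
    healthy = sum(1 for ok in sensor_ok.values() if ok)
    degraded = healthy < len(sensor_ok)
    return healthy >= quorum, degraded

# Six sensing systems, one of them down: still good to go, but slowly.
go, slow = can_drive({"camera": True, "radar": True, "lidar": False,
                      "ultrasonic": True, "gps": True, "imu": True})
# go == True (5 of 6 healthy), slow == True (run at reduced speed)
```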
(Score: 2) by legont on Thursday August 08 2019, @07:06PM
The only thing I wanted to point out is that the statement "two systems are better than one" is often not true. When two systems provide similar information, the decision is often worse than with one system. That does not mean that a backup is bad. It does not even mean that having two systems is necessarily bad. It simply means that a designer should be aware of the fact that adding a sensor may make the results worse, which the original poster does not seem to be aware of.
More generally, the philosophy "more is better" is typical for an American design and it is a weakness, I believe.
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @02:47AM (1 child)
(Score: 1) by khallow on Thursday August 08 2019, @09:49AM
Yet. An obvious way to extend the dynamic range of a camera is via an iris, just like what the human eye has. And human night vision is based on chemical processes that aren't that reliable. You can ruin night vision for minutes to hours (depending on how light sensitive you're aiming for) by brief exposures to bright light (such as headlights from a passing car).
(Score: 1) by khallow on Thursday August 08 2019, @09:43AM (2 children)
To be fair, it's a combination energy projection and sensor system, an active system as opposed to a passive system which doesn't require such energy projection (except under the usual low light conditions that a human driver would experience as well).
As to more sensors complementing each other, let us not forget the large, synergistic downside - more failure modes. All this redundancy sounds great, but what happens when it creates more simple opportunities for a car to become undrivable? I think that ultimately is the catch. The more "redundant" systems you add to a car, the more "This car can't drive" failure modes you create. And keep in mind that people are notorious for driving cars with serious problems. If they can drive with broken sensors (and thus, reduced redundancy), they will.
Presently, external sensors that supplement a human driver aren't essential. Every last one of them can be disabled without creating a car that can't drive. But with an autonomous car, those additional sensors create additional liability. In particular, whatever else you can say about lidar, it has two components (the emitter and the receiver) that need to work in order for it to function, rather than the one component of a passive video system.
A vehicle with a small number of critical sensors (that is, where a single failure makes the car undrivable for either safety or liability reasons) is going to have better operational reliability than a system with lots of critical sensors, even if all those sensors are individually somewhat more reliable. If I double the number of critical sensors (assuming all have equal rate of failure), I need to halve the likelihood of failure of those sensors in order to maintain the same reliability.
I think that's the calculus behind Tesla's decision.
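That calculus is just series-versus-parallel reliability, and it can be made concrete in a few lines (illustrative failure rates, obviously not Tesla's real numbers):

```python
def p_fail_series(p, n):
    # A chain of n CRITICAL sensors fails if ANY one of them fails.
    return 1 - (1 - p) ** n

def p_fail_parallel(p, n):
    # n truly REDUNDANT sensors fail only if ALL of them fail.
    return p ** n

p = 0.01                               # 1% failure chance per sensor
one_critical = p_fail_series(p, 1)     # 0.01
two_critical = p_fail_series(p, 2)     # ~0.0199: doubling the critical
                                       # sensors roughly doubles failures
two_matched = p_fail_series(p / 2, 2)  # ~0.009975: each sensor must be
                                       # about twice as reliable to break even
two_redundant = p_fail_parallel(p, 2)  # 0.0001: genuine redundancy helps
```

The catch identified above is that a sensor only counts as "redundant" if the car can still drive safely (and legally) without it; otherwise it sits in the series chain.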
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @12:31PM (1 child)
fail on ignorance--
Tesla currently uses both cameras and radar (passive and active in your usage). Here's a Tesla forum post on the radar, https://teslamotorsclub.com/tmc/threads/where-is-the-front-radar-on-hw2-cars-located.87415/ [teslamotorsclub.com]
If you look down a few posts, it appears that the cruise control in some models of Tesla depends on the radar working...and the radar doesn't work if covered with snow/ice.
(Score: 1) by khallow on Friday August 09 2019, @12:41AM
Still means two more systems that have to work on top of what they already have. And as I noted, now you need five systems to work, not three.
In other words, a failure mode. I mentioned that this happens. It's not like a lidar-equipped car would be racing along in those conditions either.
(Score: 2) by digitalaudiorock on Thursday August 08 2019, @02:37PM
Correct me if I'm wrong, but it sure seemed like that first widely reported self-driving fatality happened because his beloved "visual recognition" couldn't see a white semi-truck against a white background? Sounds more like Elon is trying to "sidestep" the cost of a real sensor(??).
(Score: 2) by Coward, Anonymous on Wednesday August 07 2019, @11:35PM
Maybe Musk is preparing his legal defense for when he gets sued by people who bought "full self-driving capable" cars that don't have LIDAR, when all actual self-driving cars will have LIDAR.
(Score: 1, Interesting) by Anonymous Coward on Thursday August 08 2019, @12:09AM (6 children)
I listened to the press conference where Musk and his team made these claims (cameras and radar only). He's already killed three of his customers in accidents that should have been easy to avoid by a normal, alert driver. Of course, he may have also saved a number of his customers by the poorly-named "Autopilot" avoiding accidents--but public opinion isn't going to count those successes against the deaths.
If he keeps killing people with his beta testing, he may well set back progress in this field for years; no one will trust robot cars that kill you.
Uber and other tech companies are also hell-bent on ruining this business for other players--among other things, they are using programmer-types (instead of trained test drivers) as the check/safety driver for their test cars. These people are not trained for a very difficult job--the check drivers need to maintain situational awareness in a very boring situation...which can turn into a life-or-death situation in a very short time.
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @02:35AM
Cars don't kill people, CEOs kill people!
Check and mate, gun lover!
(Score: 2) by maxwell demon on Thursday August 08 2019, @11:00AM (1 child)
The marketing departments of Tesla's competition will surely tell you that it's only robot cars without Lidar that kill you!
The Tao of math: The numbers you can count are not the real numbers.
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @03:29PM
Right, the robot cars *with* Lidar kill pedestrians instead (Uber in AZ had Lidar).
Ha, Ha ...but not funny.
(Score: 1) by khallow on Thursday August 08 2019, @11:30PM (1 child)
Most auto makers have killed a lot more than that.
Anyone who is serious in the field will be killing people with their beta testing. It's unavoidable. And at this point, we need a lot more than trust. Those robot cars need to work first.
(Score: 0) by Anonymous Coward on Friday August 09 2019, @03:42AM
> Most auto makers have killed a lot more than that.
But not with self-driving cars, at least not yet. And, many of the other car companies vastly out-produce luxury car maker Tesla (so their exposure is higher).
I don't think human sacrifice is necessary to develop self-driving cars. For example Alphabet/Google appear to be taking a very conservative approach, start slow and take baby steps as their tech improves. So far they have been doing OK, from what I've read the worst they've had are fender benders.
(Score: 0) by Anonymous Coward on Sunday August 11 2019, @08:06PM
Musk has not claimed that any current Tesla on the road is actually currently capable of self-driving. It isn't his fault the morons that were supposed to be driving their cars ignored the damn warnings about paying attention when using Autopilot.
(Score: 4, Interesting) by PinkyGigglebrain on Thursday August 08 2019, @12:28AM (10 children)
What happens when two or more LIDAR systems of the same or different make/model operate within range of each other? Can/do they interfere with each other?
How susceptible are they to interference or jamming?
How would they affect someone taking a picture*? All the IR lasers would be visible to the CMOS sensor in the camera and would mess up the image.
How would they affect other imaging systems like toll/red light cameras, license plate readers, or security cameras?
Anyone know the answers?
*there was an article a while back where the LIDAR of a car at a show permanently damaged a photographer's camera
https://arstechnica.com/cars/2019/01/man-says-ces-lidars-laser-was-so-powerful-it-wrecked-his-1998-camera/ [arstechnica.com]
"Beware those who would deny you Knowledge, For in their hearts they dream themselves your Master."
(Score: 1, Interesting) by Anonymous Coward on Thursday August 08 2019, @02:39AM (4 children)
A good lidar generates its own, fairly unique pseudorandom sequence and modulates its transmitter with it. Then it looks for that same sequence in the received signal, using the synchronous-receiver principle. The correlation peak's shift is proportional to the distance. Other lidars' codes do not get received. See CDMA.
Jamming of any receiver is possible.
Cameras of all kinds need to have notch filters for the lidar frequency.
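The code-division trick described above is easy to demonstrate with a toy signal (the amplitudes, noise level, and delay are invented for the demo; real lidars work at nanosecond timescales):

```python
import numpy as np

rng = np.random.default_rng(0)
code = rng.choice([-1.0, 1.0], size=1024)      # our pseudorandom code

# Received signal: our echo delayed by 37 samples, an interfering
# lidar's (different) code on top, and some receiver noise.
delay = 37
rx = np.zeros(2048)
rx[delay:delay + code.size] += 0.5 * code
rx[:1024] += 0.5 * rng.choice([-1.0, 1.0], size=1024)  # other lidar
rx += 0.1 * rng.standard_normal(rx.size)

# Cross-correlating against our own code picks out our echo; the
# lag of the peak gives the round-trip time, hence the distance.
corr = np.correlate(rx, code, mode="valid")
estimated_delay = int(np.argmax(corr))         # recovers 37
```

The interferer's code is nearly orthogonal to ours, so it barely registers in the correlation, which is why two such lidars can share a road.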
(Score: 2) by PinkyGigglebrain on Thursday August 08 2019, @02:49AM (2 children)
So jamming/interference isn't anything to worry about. Good to know, thank you.
And cameras can be equipped with filters, also good to know. But what about Human eyes? Or other animal's eyes?
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @03:01AM
(Score: 3, Funny) by c0lo on Thursday August 08 2019, @03:09AM
Interference-wise, human eyes are safe too, as they don't interfere with one another.
But leaving interference aside, the advantage of LIDAR over using human eyes is that LIDAR can cast laser beams for as long as it is powered; if you try to cast human (or animal) eyes you'll run out of eyes pretty quickly. Besides, the bounce-back of a cast eye is quite poor; most of them will be squished on impact.
(large grin)
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @03:36PM
> Cameras of all kinds need to have notch filters for the lidar frequency.
What are the chances that car manufacturers will offer upgraded filters for their cameras, after some new generation of Lidar starts to use a new frequency (different wavelength)? Seems like this needs to be a safety recall.
I have a nice digital camera, does this mean that it won't work right around Lidar (perhaps on every street in a few years) unless I buy an extra filter? Who do I take to small claims court to pay for this (times many billions of small cameras that didn't come with IR filters)?
(Score: 4, Interesting) by PinkyGigglebrain on Thursday August 08 2019, @02:42AM (4 children)
How do these automotive LIDARs affect the human eye?
We're not talking bright headlights here, we're talking IR LASERs. Like the kind they warn you not to point at anyone's eyes.
And there are going to be a lot of them on the roads eventually.
(Score: 2) by c0lo on Thursday August 08 2019, @03:13AM (3 children)
For now.
But keep in mind that UV light allows for a better estimation of distances; why, you only have to look at the integrated-circuit industry - they started with visible-light lithography and are doing EUV now. (large grin)
(Score: 2, Funny) by Anonymous Coward on Thursday August 08 2019, @12:36PM (2 children)
The future's so bright, I gotta wear shades (??)
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @01:41PM
Better get an arc welding helmet, yes.
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @06:59PM
Just wait until they start replacing the blue LEDs with lasers everywhere.
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @01:02AM (3 children)
This got me thinking about failure modes of self-driving cars from the perspective of other drivers. As the article points out, sensors do fail, which is why redundancy is important. But considering the worst edge case possible in which multiple systems fail – including the driver – I would assume some sort of fail-safe mode would take control to bring the vehicle to a safe stop. In such an instance it would seem self-driving cars would need some type of visual alert system other than the conventional four-way hazard lights to allow surrounding drivers to yield as quickly as possible. I've seen instances in my daily commute where someone turned on their hazards but still had difficulty moving to the shoulder because other commuters did not yield. I've also seen cars driving down the freeway with their hazards on with traffic flowing around them as if everything was perfectly normal (I'm guilty of this myself). Perhaps we've become a bit numb to hazard lights, or they've been used casually so frequently as to diminish their importance. Thus my question: should self-driving cars have a new type of visual warning system, one that cannot be triggered manually by the driver?
(Score: 0) by Anonymous Coward on Thursday August 08 2019, @12:43PM
Looking further down the road (sorry), if we ever get to the place where all cars are self-driving (which I doubt), zombie cars need to let the functioning cars know that they have a problem. Perhaps by the V2V (vehicle-to-vehicle) short-range secure communication channel that keeps getting mentioned...
There are a lot of problems yet to be solved to make this version of the future work.
(Score: 2) by legont on Thursday August 08 2019, @07:20PM (1 child)
My biggest question is how the car is going to triage between the owner/occupant's life and a "small object" running across the street. Is it going to err on my side or the child's? Assuming it somehow does, would that be disclosed, and adjustable by owners? What about adjustable by hackers?
(Score: 0) by Anonymous Coward on Friday August 09 2019, @03:51AM
One analysis of the Uber pedestrian fatality that I read noted that the developers had turned off at least some aspects of the auto-braking system(s). It was crying wolf much too often, slamming on the brakes for nothing, and potentially causing a lot of following cars to rear-end the self-driving cars.
My interpretation: rather than improve the auto-braking algorithms to reduce the false alarms, the developers (or their managers??) turned them off so they could focus on other aspects of self-driving. We are a long way from baking ethics into self-driving systems if they can't determine that it's OK to hit a plastic bag (or tumbleweed) blowing across the road.