
posted by martyb on Wednesday February 19 2020, @09:45PM   Printer-friendly
from the Do-these-trick-other-vendor's-systems? dept.

Hackers can trick a Tesla into accelerating by 50 miles per hour:

This demonstration from the cybersecurity firm McAfee is the latest indication that adversarial machine learning can potentially wreck autonomous driving systems, presenting a security challenge to those hoping to commercialize the technology.

Mobileye EyeQ3 camera systems read speed limit signs and feed that information into autonomous driving features like Tesla's automatic cruise control, said Steve Povolny and Shivangee Trivedi from McAfee's Advanced Threat Research team.

The researchers stuck a tiny and nearly imperceptible sticker on a speed limit sign. The camera read the sign as 85 instead of 35, and in testing, both the 2016 Tesla Model X and that year's Model S sped up 50 miles per hour.

This is the latest in an increasing mountain of research showing how machine-learning systems can be attacked and fooled in life-threatening situations.

[...] Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its cameras that in preliminary testing were not susceptible to this exact attack.

There are still a sizable number of Tesla cars operating with the vulnerable hardware, Povolny said. He pointed out that Teslas with the first version of hardware cannot be upgraded to newer hardware.

"What we're trying to do is we're really trying to raise awareness for both consumers and vendors of the types of flaws that are possible," Povolny said "We are not trying to spread fear and say that if you drive this car, it will accelerate into through a barrier, or to sensationalize it."

So, it seems this is not so much that a particular adversarial attack was successful (and fixed), but that it was but one instance of a potentially huge set. Obligatory xkcd.


Original Submission

Related Stories

Slight Street Sign Modifications Can Completely Fool Machine Learning Algorithms 31 comments

Submitted via IRC for Bytram

It's very difficult, if not impossible, for us humans to understand how robots see the world. Their cameras work like our eyes do, but the space between the image that a camera captures and actionable information about that image is filled with a black box of machine learning algorithms that are trying to translate patterns of features into something that they're familiar with. Training these algorithms usually involves showing them a set of different pictures of something (like a stop sign), and then seeing if they can extract enough common features from those pictures to reliably identify stop signs that aren't in their training set.

This works pretty well, but the common features that machine learning algorithms come up with generally are not "red octagons with the letters S-T-O-P on them." Rather, they're looking [at] features that all stop signs share, but would not be in the least bit comprehensible to a human looking at them. If this seems hard to visualize, that's because it reflects a fundamental disconnect between the way our brains and artificial neural networks interpret the world.

The upshot here is that slight alterations to an image that are invisible to humans can result in wildly different (and sometimes bizarre) interpretations from a machine learning algorithm. These "adversarial images" have generally required relatively complex analysis and image manipulation, but a group of researchers from the University of Washington, the University of Michigan, Stony Brook University, and the University of California Berkeley have just published a paper showing that it's also possible to trick visual classification algorithms by making slight alterations in the physical world. A little bit of spray paint or some stickers on a stop sign were able to fool a deep neural network-based classifier into thinking it was looking at a speed limit sign 100 percent of the time.

Source: http://spectrum.ieee.org/cars-that-think/transportation/sensors/slight-street-sign-modifications-can-fool-machine-learning-algorithms

OpenAI has a captivating and somewhat frightening background article: Attacking Machine Learning with Adversarial Examples.
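
For the curious, the core trick behind such adversarial examples is a gradient step on the input rather than on the weights. Below is a minimal sketch of the fast gradient sign method, using a toy linear classifier built purely for illustration (not any production vision model):

import numpy as np

rng = np.random.default_rng(0)

# Toy linear classifier: logits = W @ x + b. A stand-in for a real vision model.
n_classes, n_features = 3, 64
W = rng.normal(size=(n_classes, n_features))
b = rng.normal(size=n_classes)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def input_gradient(x, label):
    """Gradient of the cross-entropy loss w.r.t. the input, for the linear model above."""
    p = softmax(W @ x + b)
    p[label] -= 1.0          # d(loss)/d(logits) = p - one_hot(label)
    return W.T @ p           # chain rule through logits = W @ x + b

x = rng.normal(size=n_features)         # "clean" input
label = int(np.argmax(W @ x + b))       # treat the model's own answer as the true label

eps = 0.5                               # perturbation budget (a small per-feature change)
x_adv = x + eps * np.sign(input_gradient(x, label))   # FGSM: nudge each feature the "wrong" way

print("clean prediction:    ", np.argmax(W @ x + b))
print("perturbed prediction:", np.argmax(W @ x_adv + b))   # often differs for a large enough eps

The physical attacks described above play the same game, except the perturbation also has to survive printing, lighting, and viewing angle.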


Original Submission

3D Printed Turtles Fool Google Image Classification Algorithm 15 comments

MIT researchers have fooled a Google image classification algorithm into thinking that a turtle is a rifle and a baseball is an espresso:

The team built on a concept known as an "adversarial image". That's a picture created from the ground-up to fool an AI into classifying it as something completely different from what it shows: for instance, a picture of a tabby cat recognised with 99% certainty as a bowl of guacamole.

Such tricks work by carefully adding visual noise to the image so that the bundle of signifiers an AI uses to recognise its contents get confused, while a human doesn't notice any difference.

But while there's a lot of theoretical work demonstrating the attacks are possible, physical demonstrations of the same technique are thin on the ground. Often, simply rotating the image, messing with the colour balance, or cropping it slightly, can be enough to ruin the trick.

The MIT researchers have pushed the idea further than ever before, by manipulating not a simple 2D image, but the surface texture of a 3D-printed turtle. The resulting shell pattern looks trippy, but still completely recognisable as a turtle – unless you are Google's public object detection AI, in which case you are 90% certain it's a rifle.

The researchers also 3D printed a baseball with patterning to make it appear to the AI like an espresso, with marginally less success – the AI was able to tell it was a baseball occasionally, though still wrongly suggested espresso most of the time.

The researchers had access to the algorithm, making the task significantly easier.
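
The robustness to viewpoint that makes the turtle result notable is typically obtained by optimizing the perturbation over a whole family of transformations instead of a single image. Here is a rough sketch of that loop under toy assumptions: a small linear model and random orthogonal "poses" stand in for Google's classifier and real 3D rendering.

import numpy as np

rng = np.random.default_rng(1)
n_classes, n_features = 3, 64

# Toy linear classifier standing in for the image model.
W = rng.normal(size=(n_classes, n_features))
b = rng.normal(size=n_classes)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# A small, fixed family of "viewing transformations" (stand-ins for pose/lighting changes).
poses = [np.linalg.qr(rng.normal(size=(n_features, n_features)))[0] for _ in range(6)]

x = rng.normal(size=n_features)   # the "turtle"
target = 0                        # the class we want every view to report (the "rifle")
delta = np.zeros(n_features)      # the texture perturbation being optimized

for step in range(300):
    grad = np.zeros(n_features)
    for T in poses:                               # average the gradient over the transformation family
        p = softmax(W @ (T @ (x + delta)) + b)
        p[target] -= 1.0                          # d(cross-entropy toward target)/d(logits)
        grad += T.T @ (W.T @ p)
    delta -= 0.05 * (grad / len(poses))           # descend: make `target` likely in every view
    delta = np.clip(delta, -1.0, 1.0)             # keep the change bounded ("still looks like a turtle")

hits = sum(np.argmax(W @ (T @ (x + delta)) + b) == target for T in poses)
print(f"{hits}/{len(poses)} sampled views now classified as the target class")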

Also at The Verge.


Original Submission

A Simple Sticker Tricked Neural Networks Into Classifying Anything as a Toaster 55 comments

Image recognition technology may be sophisticated, but it is also easily duped. Researchers have fooled algorithms into confusing two skiers for a dog, a baseball for espresso, and a turtle for a rifle. But a new method of deceiving the machines is simple and far-reaching, involving just a humble sticker.

Google researchers developed a psychedelic sticker that, when placed in an unrelated image, tricks deep learning systems into classifying the image as a toaster. According to a recently submitted research paper about the attack, this adversarial patch is "scene-independent," meaning someone could deploy it "without prior knowledge of the lighting conditions, camera angle, type of classifier being attacked, or even the other items within the scene." It's also easily accessible, given it can be shared and printed from the internet.
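
"Scene-independent" just means the patch is pasted into arbitrary images at arbitrary positions with no other knowledge of the scene. A small, hypothetical illustration of that compositing step, with random pixels standing in for the actual optimized patch:

import numpy as np

rng = np.random.default_rng(2)

def apply_patch(image, patch, top, left):
    """Composite a square patch into an image at (top, left); the rest of the scene is untouched."""
    out = image.copy()
    h, w = patch.shape[:2]
    out[top:top + h, left:left + w] = patch
    return out

patch = rng.random((50, 50, 3))       # stand-in for an optimized "toaster" patch
scene = rng.random((224, 224, 3))     # any unrelated photo

top = rng.integers(0, 224 - 50)
left = rng.integers(0, 224 - 50)
attacked = apply_patch(scene, patch, top, left)

# `attacked` would then be fed to the classifier; the paper's claim is that the prediction
# flips to "toaster" regardless of scene, position, or lighting.
print(f"{(attacked != scene).any(axis=-1).sum()} pixels replaced by the patch")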


Original Submission

A New Clothing Line Confuses Automated License Plate Readers 52 comments

Garments from Adversarial Fashion feed junk data into surveillance cameras, in an effort to make their databases less effective.

The news: Hacker and designer Kate Rose unveiled the new range of clothing at the DefCon cybersecurity conference in Las Vegas. In a talk, she explained that the hoodies, shirts, dresses, and skirts trigger automated license plate readers (ALPRs) to inject useless data into systems used to track civilians.

False tags: The license-plate-like designs on a garment are picked up and recorded as vehicles by readers, which frequently misclassify images like fences as license plates anyway, according to Rose (pictured above modeling one of her dresses). The idea is that feeding more junk data into the systems will make them less effective at tracking people and more expensive to deploy.

[...] Fashion fights back: Though it's the first to target ALPRs, this isn't the first fashion project aimed at fighting back against surveillance. Researchers have come up with adversarial images on clothing aimed at bamboozling AI, makeup that lets you hide your face from recognition systems, and even a hat that can trick systems into thinking you're Moby.


Original Submission

Protecting Smart Machines From Smart Attacks 12 comments

Machines' ability to learn by processing data gleaned from sensors underlies automated vehicles, medical devices and a host of other emerging technologies. But that learning ability leaves systems vulnerable to hackers in unexpected ways, researchers at Princeton University have found.

In a series of recent papers, a research team has explored how adversarial tactics applied to artificial intelligence (AI) could, for instance, trick a traffic-efficiency system into causing gridlock or manipulate a health-related AI application to reveal patients' private medical history. As an example of one such attack, the team altered a driving robot's perception of a road sign from a speed limit to a "Stop" sign, which could cause the vehicle to dangerously slam the brakes at highway speeds; in other examples, they altered Stop signs to be perceived as a variety of other traffic instructions.


Original Submission

  • (Score: 4, Insightful) by fustakrakich on Wednesday February 19 2020, @09:52PM (9 children)

    by fustakrakich (6150) on Wednesday February 19 2020, @09:52PM (#960027) Journal

    A cleverly disguised sign will fool a human driver too, but humans know when other conditions should raise doubts. The machine has to "know" its surroundings with many different kinds of sensors.

    These machines will be barely usable and kind of dangerous, only because of extreme cost-cutting measures. The wrong kinds of people are in charge of the process.

    --
    Politics and criminals are the same thing..
    • (Score: 3, Insightful) by ikanreed on Wednesday February 19 2020, @09:57PM

      by ikanreed (3164) Subscriber Badge on Wednesday February 19 2020, @09:57PM (#960030) Journal

      Sanity checks are a fool's game. Move fast, break things, especially the spines of a whole family!

    • (Score: 0) by Anonymous Coward on Wednesday February 19 2020, @10:50PM

      by Anonymous Coward on Wednesday February 19 2020, @10:50PM (#960055)

      I would think that there should be a hierarchy of conditions that should be met. If the car is in a residential area then the max speed limit is 25 MPH no matter what the sign apparently says.

      The conditions should be set so that the car takes the safest precaution when there are two conflicting signals (residential speed limit = 25 MPH, posted speed limit = 75 MPH: obviously it should choose the 25 MPH if it's in a residential area).
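
      A minimal sketch of the precedence rule described above, with made-up zone caps rather than any vendor's actual logic:

# Hypothetical zone caps; illustrative values only, not any vendor's actual table.
ZONE_CAP_MPH = {"school": 20, "residential": 25, "urban": 45, "rural_freeway": 85}

def effective_limit_mph(zone: str, posted_mph: int) -> int:
    """Trust the camera-read sign only up to the cap implied by the surrounding zone."""
    return min(posted_mph, ZONE_CAP_MPH.get(zone, 55))

print(effective_limit_mph("residential", 75))   # -> 25: conflicting signals resolve to the safer one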

    • (Score: 1, Interesting) by Anonymous Coward on Thursday February 20 2020, @01:51AM

      by Anonymous Coward on Thursday February 20 2020, @01:51AM (#960126)

      How about a missing sign? Years ago a friend was fooled by a stop sign where a vandal had turned the pole 90 deg (in top view) -- from his viewpoint the sign was edge-on so he didn't see it at all. He t-boned (at low speed) someone rolling properly along the cross street, thinking that they had the 2-way stop.

      My friend got the "failure to stop" ticket anyway, the cop didn't care if the sign was visible or not.

    • (Score: 2) by Hyperturtle on Thursday February 20 2020, @03:19PM (5 children)

      by Hyperturtle (2824) on Thursday February 20 2020, @03:19PM (#960297)

      Wow, image recognition is fooled by vandalism of the image? Who knew? It's not like the cars are doing a careful analysis of previous speed signs, known traffic corridor speed rules, whether there is construction, and all sorts of other things that a strict rule-abiding system wouldn't expect when it comes to established rules that are not subject to unanticipated changes.

      Laws are based on trust. People that break the law are punished so that other people are incentivized to not do the same thing and hopefully recognize the value of a framework we can all trust and put our faith in.

      Articles or studies that spell out for us that if we change a 3 into an 8 by whatever means, the machine doing optical recognition sees it as an 8, are telling us nothing different from what I learned programming on my Commodore 64 -- Garbage In, Garbage Out.

      Should every smart car system everywhere have a complete and full understanding of all speed limits everywhere in order to prevent situations like this?

      Probably not. Should people be punished for vandalizing signs in such a way that people could get killed as a result? Absolutely. What type of punishment that should be, I don't know, but we're not supposed to hurt each other either and yet we have laws in place to punish the people that go around punching people.

      Should the cars be limited to how fast they can go on non-highways? That sounds like a good idea, and anyone that wants their car to continue to have free speech or whatever they feel is violated can just grab the wheel again and press the accelerator before those options are taken from us later. Otherwise, the cars follow the laws and could employ a speed limit (imagine that!) based on more than just the signage--maybe some human can insert some logic into the process like "85 in a school zone is not permissible no matter what the signs state". (GPS has certainly allowed for the mapping out of school zones, just ask Google...)

      It is likely true that too much trust is being placed into a system that expects everything to follow the established rules. There are way too many variations of how rules can be broken to account for and program a solution to them all.

      I think that a good part of the solution would be to properly incentivize people to not be jerks, punish the ones that do, and maybe for really important traffic corridors--have a means of updating the known speed limits in real time, which of course opens up its own can of worms considering some clown could broadcast fake speed limits or hack into the database and change the values or...

      Security and trust will never be a resolvable issue for open systems like this that pretend they are closed, especially when it comes down to a person and their intentions from the outside of that system deciding to shoot paintballs at signs or cleverly design stickers to apply to various signs to change their meanings in order to, I don't know, generate ad revenue on their blog when they report more obvious things, like how changing a 1 into a 7 increases something by 6 in the same way changing a 3 to an 8 increases it by 5.

      • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @06:43PM

        by Anonymous Coward on Thursday February 20 2020, @06:43PM (#960398)

        " GPS has certainly allowed for the mapping out of school zones, just ask Google"

        Yeah but sometimes the GPS gets confused and thinks you're on a freeway when you're not just because you're in close proximity to a freeway. Perhaps more should be done to protect against this as well (terrain mapping?).

      • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:00PM (2 children)

        by Anonymous Coward on Thursday February 20 2020, @07:00PM (#960407)

        There are much easier ways to be vandals these days as well. You can put some crazy obstacle on the freeway and that can cause all sorts of accidents. A bunch of nails or a spike strip or a piece of wood with nails in it. Imagine the things a vandal can do to train tracks to cause a train wreck. Yet somehow we manage to get along. We don't ban cars and trains because they can be vandalized. Otherwise everything would get banned.

        I don't think the assumption being made should be that people will be vandals whenever possible. Sure there should be ways to catch vandals (the cars should have cameras, there should be security and cameras around, etc...) and punish them when they do get caught to deter them and make systems resistant to vandals but to not do anything because something can be vandalized would result in everyone living in a bubble. It wouldn't be practical.

        • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:13PM (1 child)

          by Anonymous Coward on Thursday February 20 2020, @07:13PM (#960409)

          My car is parked in a parking lot right now. Someone can take a rock and bash my windows in. OH NO!!! I should have walked to work!! LETS BAN CARS!!!!!

          • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @07:22PM

            by Anonymous Coward on Thursday February 20 2020, @07:22PM (#960412)

            Errr ... Let's *

      • (Score: 1) by speederaser on Friday February 21 2020, @01:29AM

        by speederaser (4049) on Friday February 21 2020, @01:29AM (#960545)

        Should every smart car system everywhere have a complete and full understanding of all speed limits everywhere in order to prevent situations like this?

        My Garmin does that now without a camera or an internet connection. Always knows the speed limit everywhere I go, and whether an intersection is controlled by a light or stop signs. Shouldn't be a problem at all for smart cars.

  • (Score: 2, Informative) by Anonymous Coward on Wednesday February 19 2020, @10:30PM (6 children)

    by Anonymous Coward on Wednesday February 19 2020, @10:30PM (#960042)

    These types of attacks work because the state of the art in ML is to throw different types of neural nets at a problem. The top research is going into making those nets more efficient to train rather than coming up with better algorithms to do specific things. For example, if speed sign reading was done with OCR analyzing...

    ...I had to stop myself right there after double-checking the article. I had mistakenly assumed they put a static, QR-like sticker on the sign to mess with a neural net. What they actually did was extend the middle of a 3 to make it look like an 8. Not being told ahead of time and being given only a glance at that sign, most people would probably see 85 too. They're basically saying they re-wrote the number on the sign and it fooled the computer. Well of course it did! If you put white tape over the left half of an 8 you could fool everyone into thinking it was a 3 as well.

    If we want to continue to argue about this, then a better designed system should have reported both a 3 and an 8 with similar probabilities, and then some other component of the car could have checked its surroundings (highway, city road, everyone else doing 40?) to determine which was more likely.
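
    A toy sketch of that kind of fusion: combine the sign reader's candidate digits with a context prior instead of trusting the top guess outright (all probabilities below are made up for illustration):

# Candidate readings from the sign classifier (made-up numbers).
sign_probs = {35: 0.55, 85: 0.45}

# Context prior: on a city street with traffic doing ~40, an 85 limit is implausible.
context_prior = {35: 0.90, 85: 0.10}

# Combine and renormalize (a simple Bayesian-style product of the two sources).
posterior = {v: sign_probs[v] * context_prior[v] for v in sign_probs}
total = sum(posterior.values())
posterior = {v: p / total for v, p in posterior.items()}

print(posterior)                           # 35 dominates once context is taken into account
print(max(posterior, key=posterior.get))   # -> 35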

    • (Score: 3, Informative) by Anonymous Coward on Wednesday February 19 2020, @10:38PM (4 children)

      by Anonymous Coward on Wednesday February 19 2020, @10:38PM (#960046)

      most people would probably see 85 too

      Yes. But people aren't AI, they are real-intelligent: They will pick up that it is unreasonable for the speed limit to suddenly go from 35 to 85 on a little street, and will guess someone fucked with the sign.

      • (Score: 2, Insightful) by Ethanol-fueled on Wednesday February 19 2020, @10:52PM (1 child)

        by Ethanol-fueled (2792) on Wednesday February 19 2020, @10:52PM (#960057) Homepage

        The same assholes driving in these cars are the same assholes watching Harry Potter with self-driving enabled and totally oblivious to the outside world. They ain't gonna notice shit until they get punched in the face with an airbag and then locked inside to die in a fiery inferno.

        • (Score: 3, Funny) by c0lo on Wednesday February 19 2020, @11:11PM

          by c0lo (156) Subscriber Badge on Wednesday February 19 2020, @11:11PM (#960070) Journal

          The same assholes driving being driven in these cars...

          FTFY

          They ain't gonna notice shit until they get punched in the face with an airbag and then locked inside to die in a fiery inferno.

          I would totally pay more taxes if I knew they are gonna be used for installing public, free of charge, "asshole punching and incinerating" facilities where this can happen without endangering the rest of decent humans.

          (who am I kidding, tho', the "decent human" race got extinct. Probably with Neanderthals)

          --
          https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 1) by anubi on Thursday February 20 2020, @05:47AM

        by anubi (2828) on Thursday February 20 2020, @05:47AM (#960207) Journal

        This is a good example of coding skill.

        One can be quite nebulous in telling a human to do something, and still get the point across.

        One can have all sorts of leadership skills when working with people, but this rarely works with state machines. If the company depends on their machines to work, a good coder and engineer is worth a helluva lot. But in reality the best often get poor reviews over people skills, as they are not a people person at heart...they deal with machines.

        Machines do exactly what they are told to do...not what they think you want. One does not impress a machine by having a corner office, private jet, three piece suit, and a pad of evaluation forms.

        --
        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
      • (Score: 0) by Anonymous Coward on Friday February 21 2020, @12:33AM

        by Anonymous Coward on Friday February 21 2020, @12:33AM (#960517)

        And once we move to all autonomous cars, governments will simply have to publish maps in a standard format so the car can look up the correct limits without having to try to parse some marks painted on a piece of metal. Then the car will have access to better information than the human. The virtue of human intuition in knowing not to drive fast in a residential zone only arises from the fact that we are in a transition period where the information is mainly being made for human consumption.

    • (Score: 2) by c0lo on Wednesday February 19 2020, @11:19PM

      by c0lo (156) Subscriber Badge on Wednesday February 19 2020, @11:19PM (#960073) Journal

      If we want to continue to argue about this, then a better designed system should have reported both a 3 and an 8 with similar probabilities, and then some other component of the car could have checked its surroundings (highway, city road, everyone else doing 40?) to determine which was more likely.

      No need for better designed systems. Just switch the speed limit signs to use numerals less prone to adversarial attacks (cheaper for driverless car makers too: the change of the limit signs is gonna be supported by public money, and a small investment in lobbying can go a long way).

      I don't know, maybe use Mandarin numerals? Because sooner or later, those are gonna be lingua franca anyway.

      (large grin)

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 2, Interesting) by Anonymous Coward on Wednesday February 19 2020, @10:31PM (2 children)

    by Anonymous Coward on Wednesday February 19 2020, @10:31PM (#960043)

    Something is driving the stock up to crazy heights recently; it hit over $900 today. A company that makes no money is traded higher than GE or IBM.

    • (Score: 1, Informative) by Anonymous Coward on Wednesday February 19 2020, @10:45PM (1 child)

      by Anonymous Coward on Wednesday February 19 2020, @10:45PM (#960051)

      Something drove up the price of DeLorean stock too.

      • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @03:48PM

        by Anonymous Coward on Thursday February 20 2020, @03:48PM (#960312)

        Musk seems to be drawn to weed though.

  • (Score: 2) by SomeGuy on Wednesday February 19 2020, @10:35PM

    by SomeGuy (5632) on Wednesday February 19 2020, @10:35PM (#960045)

    "He pointed out that Teslas with the first version of hardware cannot be upgraded to newer hardware."

    In other words, get ready to be bombarded with advertising and propaganda that will brainwash 99.9% of people into thinking they should just throw their old cars away and buy a new one every year.

    News anchor 1: ...the car he was driving was more than TWO YEARS old.
    News anchor 2: So unsafe! And now to Tom with the weather...

  • (Score: 1, Touché) by Anonymous Coward on Wednesday February 19 2020, @11:13PM (1 child)

    by Anonymous Coward on Wednesday February 19 2020, @11:13PM (#960071)

    Now you're a "hacker" for putting a sticker on something!? Man, my toddler's gonna get us raided by the FBI...

    • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @10:51PM

      by Anonymous Coward on Thursday February 20 2020, @10:51PM (#960474)

      Used to be blowing a whistle in a phone made you a hacker.

  • (Score: -1, Offtopic) by Anonymous Coward on Wednesday February 19 2020, @11:45PM (8 children)

    by Anonymous Coward on Wednesday February 19 2020, @11:45PM (#960080)

    Last I checked, acceleration is measured in m/s^2. So is the title implying that the Tesla accelerated by 50 miles/hour/second, or 50 miles/hour/hour?

    If the former, that's pretty quick, given that the acceleration record currently is roughly 35 miles/hour/second (0-60 in 2.1 seconds, but the acceleration isn't linear). The interesting question then becomes, was this a short burst or a sustained acceleration? And if so, how long did they sustain the acceleration?

    If the latter, I'm not impressed. I can push a car uphill faster than that.

    No, I didn't RTFS, thank you.

    • (Score: 1, Redundant) by isostatic on Thursday February 20 2020, @12:07AM (5 children)

      by isostatic (365) on Thursday February 20 2020, @12:07AM (#960088) Journal

      It went from 35mph to 85mph when it should have stayed at 35mph

      • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @02:25AM (4 children)

        by Anonymous Coward on Thursday February 20 2020, @02:25AM (#960142)

        My esteemed AC colleague is correct, that is still not an acceleration. Divide your 50 mph by however long it took to do it, and then we can talk acceleration.

        • (Score: 3, Insightful) by maxwell demon on Thursday February 20 2020, @01:36PM (3 children)

          by maxwell demon (1608) on Thursday February 20 2020, @01:36PM (#960270) Journal

          The Tesla accelerated to a speed that was 50 miles per hour faster than the allowed speed.

          Note that the title does not say "Accelerating 50 Miles per Hour" — there is a preposition in between.

          --
          The Tao of math: The numbers you can count are not the real numbers.
          • (Score: 0, Disagree) by Anonymous Coward on Thursday February 20 2020, @02:56PM (2 children)

            by Anonymous Coward on Thursday February 20 2020, @02:56PM (#960290)

            Indeed, and the preposition "by" explicitly says that the acceleration was increased 50 MPH, not the speed.

            I've increased my weight by 5 lbs. I cut the power by 3 Watts. "By" uses the same units. "to" would have worked in the sentence. In the context of the article, the acceleration is irrelevant anyway. It is the speed limit that is being manipulated, not the acceleration.

            • (Score: 3, Informative) by acid andy on Thursday February 20 2020, @06:53PM (1 child)

              by acid andy (1683) on Thursday February 20 2020, @06:53PM (#960404) Homepage Journal

              Acceleration means an increase of speed (per unit of time).

              I've increased my weight by 5 lbs.

              The Tesla increased its speed by 50 mph. That's the same as saying it was accelerating by 50 mph (over some unspecified amount of time).

              TFS didn't say it accelerated to 50 mph or that it was accelerating at 50 mph (not complete units for acceleration).

              --
              If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
              • (Score: 2) by isostatic on Tuesday March 03 2020, @09:48AM

                by isostatic (365) on Tuesday March 03 2020, @09:48AM (#965923) Journal

                "accelerated to 50 mph" would be fine.

                Accelerated (by an unspecified amount from a speed less than 50mph) to 50mph (in an unspecified time)

                For example

                Accelerated by 20mph from 30mph to 50mph (in an unspecified time)
                Accelerated by 20mph from 30mph to 50mph in 20 seconds
                Accelerated from 20mph at 1mph/second for 20 seconds

                The headline

                "Accelerating by 50 Miles Per Hour"

                is

                Accelerating (from an unspecified speed) by 50 Miles Per Hour (in an unspecified amount of time)

                "accelerating at 50mph" is of course nonsense.

    • (Score: 2) by Nuke on Thursday February 20 2020, @11:06AM

      by Nuke (3162) on Thursday February 20 2020, @11:06AM (#960249)

      Someone modded this offtopic, but I read the headline twice and still did not understand WTF they were on about. I only understood when I RTFA. The headline is poorly worded.

    • (Score: 0) by Anonymous Coward on Thursday February 20 2020, @06:36PM

      by Anonymous Coward on Thursday February 20 2020, @06:36PM (#960392)

      You're one of those people who, being told you're ignoring the gravity of the situation, complains that an object cannot possibly ignore a fundamental force. Ain'tcha?

  • (Score: 2) by rigrig on Wednesday February 19 2020, @11:52PM (1 child)

    by rigrig (5129) Subscriber Badge <soylentnews@tubul.net> on Wednesday February 19 2020, @11:52PM (#960084) Homepage

    This was not some convoluted, hand-crafted modification of signs; it was something a bird could do accidentally.

    Tesla has since moved to proprietary cameras on newer models

    In other words: "All these science guys are causing bad press, so let's make out system harder to study."
    Totally unrelated: the next time this kind of problem pops up, it'll be because someone investigates all those Tesla crashes on that one stretch of road with the peculiar bird droppings on the traffic signs.

    --
    No one remembers the singer.
    • (Score: 2, Informative) by jlv on Thursday February 20 2020, @07:46PM

      by jlv (3756) on Thursday February 20 2020, @07:46PM (#960418)

      Tesla has since moved to proprietary cameras on newer models

      The article is wrong here. They've moved to their own AutoPilot hardware and software. The cameras aren't proprietary. The AP1 hardware (used through early 2017) used processors from Mobileye and was non-upgradable. Since 2017, AP2 and later use more cameras and Tesla's own software, and are OTA upgradable.

      The bulk of Tesla cars in the world don't have this problem.

  • (Score: 3, Informative) by TheGratefulNet on Thursday February 20 2020, @01:52AM (1 child)

    by TheGratefulNet (659) on Thursday February 20 2020, @01:52AM (#960128)

    sigh. copied from the green site? should not have.

    tesla does not use mobile eye anymore.

    this article is out of date.

    but don't let that stop a good HATE SCREED of tesla. always in fashion, it seems.

    --
    "It is now safe to switch off your computer."
    • (Score: 2) by martyb on Thursday February 20 2020, @09:36AM

      by martyb (76) Subscriber Badge on Thursday February 20 2020, @09:36AM (#960238) Journal

      tesla does not use mobile eye anymore.

      this article is out of date.

      From the summary:

      Tesla has since moved to proprietary cameras on newer models, and Mobileye has released several new versions of its cameras that in preliminary testing were not susceptible to this exact attack.

      --
      Wit is intellect, dancing.
  • (Score: 2) by bzipitidoo on Thursday February 20 2020, @01:58AM

    by bzipitidoo (4388) on Thursday February 20 2020, @01:58AM (#960132) Journal

    One time coming back from a long road trip of 2000 miles, I was stunned to discover that the last 10 miles, on a road I know well but hadn't used in a few years, was by far the hardest to drive. It was a country road around which housing developments had recently sprung up. Lot of road work was in progress to turn it from a 2 lane highway into a 6 lane street. They had lane shifts and barrels and changes of pavement. Several times, the single lane would split into 2, and it was not clear whether the through lane was the right lane with a left turn lane, or the left lane with a right turn lane. Further, it had gotten dark, and all these newly opened stores had hastily erected illumination that was not well positioned, so that many of the lights were shining in the drivers' faces. Bad enough being blinded by oncoming traffic, without having to deal with that. To add to the fun, it was a bad kind of busy, with a lot more oncoming traffic than traffic on my side, which meant no one to follow through the maze, and lots of headlights glaring in my face.

    Texas does a bad job of directing traffic in road construction zones. I wonder how well these self-driving cars would handle a hell drive like that one.

    There are worse drives. Try a 100 mile trip in winter weather, in those hours just after a blizzard and before the snowplows have had a chance to clear the roads. If you can get through at all, you may be doing stuff like ramming your way through snowdrifts. Back up, get a running start, and plow into the snow. Gets you a few feet of progress. Then back up and do it again. And again, and again. For hours. Can't be too deep, for that to work. If the road is buried under 8 feet of snow for miles, you are not ramming through that with a car, you are stuck until the snowplows clear the road. Then you get the thrill of driving in a snow canyon. I've never done any of that, but in his younger days, my father did, many times. I always wondered what was so urgent that he couldn't wait until the next day. In later years, he did wait. Anyway, would like to see an AI handle that.

  • (Score: 2) by stretch611 on Thursday February 20 2020, @03:35AM (3 children)

    by stretch611 (6199) on Thursday February 20 2020, @03:35AM (#960180)

    This hack is unlikely to get people to drive 85mph.

    First, before we say that the Tesla really f'd up, the article I read earlier [bloomberg.com] mentioned how this can even fool human drivers into thinking the speed limit is 85.

    If you think of how this hack worked, and the actual numbers, there are not that many options. After all, other than turning digits into 8's, how many numbers can be changed in this way? Maybe a 5 into a 6 (which for speeding won't even get cops to notice you in most places). Another possibility is turning a 1 into a 7, but how many places have speed limits of only 10 or 15mph... the only ones I can think of are parking lots.

    While changing a 2, 3, 5, or 6 into an 8 is possible, the fact is that only a single state (TX) allows 85mph anywhere (only 7 other states allow 80mph). That is an obvious red flag for this hack. (reference: https://en.wikipedia.org/wiki/Speed_limits_in_the_United_States [wikipedia.org] )

    If you are not on a rural freeway, it is a dead giveaway that the sign is wrong, as even in the 8 states that allow 80mph or more, the only place that happens is on rural freeways.

    Even if a person/computer is fooled into thinking that this speed limit is correct, the fact is that you can't go faster than the car in front of you. So traffic will not allow you to go this fast.

    Another point brought up by the article I read is:

    Manufacturers are also integrating mapping technology into systems that reflect the proper speed limit.

    i.e., GPS and map software will know the actual speed limit regardless of what the signs say. If you have used any GPS system in the past 20 years you would realize this too. Most roads in the US have their associated speed limit included in GPS/map databases.

    So while this "hack" can fool a camera/sensor, in many cases, common sense and/or computer databases would dispute it quite effectively.
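
    A sketch of that cross-check, with a hypothetical map table standing in for a real GPS/map database:

# Hypothetical per-segment limits; a real system would query a GPS/HD-map database.
MAP_SPEED_LIMITS_MPH = {"segment_42": 35}

def reconcile_mph(segment_id: str, camera_reading_mph: int) -> int:
    """Prefer the mapped limit when the camera disagrees wildly with it."""
    mapped = MAP_SPEED_LIMITS_MPH.get(segment_id)
    if mapped is not None and abs(camera_reading_mph - mapped) > 15:
        return mapped                      # the tampered "85" loses to the database
    return camera_reading_mph

print(reconcile_mph("segment_42", 85))     # -> 35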

    --
    Now with 5 covid vaccine shots/boosters altering my DNA :P
    • (Score: 2) by dry on Thursday February 20 2020, @07:10AM (2 children)

      by dry (223) on Thursday February 20 2020, @07:10AM (#960224) Journal

      In Canada, where we also have Teslas, common speed limits are 50 in town and 80 on many highways, with quite a few rural or major roads being 60.
      All speeds are in km/h.

      • (Score: 2) by maxwell demon on Thursday February 20 2020, @01:41PM (1 child)

        by maxwell demon (1608) on Thursday February 20 2020, @01:41PM (#960272) Journal

        Which raises another question: Will those Teslas always reliably know when to interpret a road sign as MPH or km/h?

        --
        The Tao of math: The numbers you can count are not the real numbers.
        • (Score: 2) by dry on Thursday February 20 2020, @04:10PM

          by dry (223) on Thursday February 20 2020, @04:10PM (#960322) Journal

          'twas another thing I was wondering. I'd think their mapping software is good enough to know where the border is but we still get the odd American who crosses the border, sees the 80 speed sign and goes flying down the highway at 140 km/h.
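
          A tiny sketch of the unit question, assuming the car at least knows which country it is in (the country codes and logic are purely illustrative):

KM_PER_MILE = 1.609344

def sign_value_to_kmh(value: int, country: str) -> float:
    """Interpret the bare number on a speed sign according to the local convention."""
    return value * KM_PER_MILE if country == "US" else float(value)

print(sign_value_to_kmh(80, "US"))   # an American "80" means ~128.7 km/h
print(sign_value_to_kmh(80, "CA"))   # a Canadian "80" means 80 km/h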
