
posted by janrinok on Wednesday June 13 2018, @11:54PM
from the we've-all-done-it dept.

Milorad Trkulja was shot by an unknown gunman in Melbourne in 2004, then discovered that Google searches of his name brought up images of mob figures, including prolific drug trafficker Tony Mokbel. Gangland activity in the city was prevalent at the time.

Trkulja successfully sued Google in the Victorian Supreme Court in 2012, receiving AU$200,000 in damages (roughly US$150,000). He then launched a second defamation action in 2013, alleging that Google's autocomplete predictions, as well as searches for phrases such as "Melbourne underworld criminals", wrongly brought up his name and image. Google took the case to the Victorian Court of Appeal and won that round.

Now the High Court has granted Trkulja special leave to appeal against that decision.

"In each of the pages on which images of such persons appear," the judgement said according to the ABC, "there are also images of persons who are notorious criminals or members of the Melbourne criminal underworld... coupled with images of persons, such as Mr Trkulja whose identity is relatively unknown."

Google tried to stop the case, but the High Court ruled there was clear potential for defamation.


Original Submission

 
  • (Score: 4, Insightful) by tfried on Thursday June 14 2018, @06:58AM (3 children)

    by tfried (5534) on Thursday June 14 2018, @06:58AM (#692739)

    Some guessing/generalization here, as I cannot be bothered to dig into this particular case: I think a large part of the problem is that Google (and its competitors) have stopped simply returning matches, i.e. content that really contains the search terms, and are instead trying to be clever.

    So, here, that guy got shot by an underworld criminal. There have been news reports on the incident, and these will be found when searching for his name or for "underworld criminal". Nobody is to blame for this, and there is also no sign of a problem up to this point. The nature of the association between Trkulja and the criminals remains fully transparent to anybody presented with these results.

    Well ... unless that somebody is an AI. Google's AI sees that association, but has no idea about its nature. It simply concludes that "Trkulja" and "underworld criminal" are related search terms, and that if you typed one, you are probably also interested in the other. So now, when you search for Trkulja, you are also presented with a diffuse array of underworld activities, where the nature of the association is no longer transparent. Similarly, when searching for underworld activities, you get hits for Trkulja without the original context. It's not hard to see how the guy would be seriously pissed about that outcome.
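
    To make the failure mode concrete, here is a minimal, purely hypothetical Python sketch (nothing to do with Google's real ranking code; the corpus and names are made up for illustration) of how a bare co-occurrence count turns "appears in the same news report" into "related term":

      # Hypothetical illustration of co-occurrence-based "related terms".
      from collections import Counter

      STOPWORDS = {"a", "an", "by", "in", "of", "the"}

      # Toy corpus: reports in which Trkulja appears only as the victim.
      documents = [
          "trkulja shot by unknown gunman in melbourne",
          "melbourne underworld criminal suspected in trkulja shooting",
          "tony mokbel melbourne underworld criminal drug trafficking",
      ]

      def co_occurring_terms(docs, query):
          """Count how often other terms share a document with the query
          term -- with no record of *why* they appear together."""
          counts = Counter()
          for doc in docs:
              terms = set(doc.split()) - STOPWORDS
              if query in terms:
                  counts.update(terms - {query})
          return counts

      print(co_occurring_terms(documents, "trkulja"))
      # Links "trkulja" to "underworld" and "criminal" just as strongly as
      # to "gunman" or "shooting"; that he was the victim is invisible here.

    The point is not that Google does exactly this, only that any purely statistical notion of "related" will happily strip away the context that made the original association harmless.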

    So: A qualitatively new association has formed. And it has not formed in the real world, it is not implied in the input data; it has formed inside Google's AI. If that same mistake - naively associating "Trkulja" and "underworld criminal", as if these were synonyms - had been made by a human being, then clearly that human being and their employer could be held responsible. But if it's an AI messing up, then nobody is to blame? Or perhaps it's not all that unreasonable to say "if you let your AI play in the wild, then you are responsible for its screw-ups"?

  • (Score: 1) by tfried on Thursday June 14 2018, @07:08AM

    by tfried (5534) on Thursday June 14 2018, @07:08AM (#692743)

    Addendum: The possible path out is also pretty clear, then: Just go back to searching for the terms the user has actually typed. And if you really want to add synonyms on top of that, then use a proper - human-controlled - list.
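
    For contrast, a toy sketch of what that would look like (again hypothetical; the synonym table and documents are invented for illustration): only documents containing the literal query terms are returned, and the only expansion allowed comes from an explicit, human-maintained list.

      # Hypothetical illustration: exact-term search plus a curated synonym list.

      # Maintained by a human editor; nothing is added to it automatically.
      CURATED_SYNONYMS = {
          "melbourne": {"victoria"},
      }

      def expand(term):
          """Return the typed term plus its hand-approved synonyms, if any."""
          return {term} | CURATED_SYNONYMS.get(term, set())

      def search(docs, query):
          """Return documents that literally contain every (expanded) query term."""
          results = []
          for doc in docs:
              words = set(doc.split())
              if all(words & expand(term) for term in query.split()):
                  results.append(doc)
          return results

      documents = [
          "trkulja shot by unknown gunman in melbourne",
          "tony mokbel melbourne underworld criminal drug trafficking",
      ]

      print(search(documents, "trkulja melbourne"))    # only the report naming him
      print(search(documents, "underworld criminal"))  # Trkulja's report is not a hit

    Nothing clever, nothing surprising: the association a reader sees is exactly the one present in the documents themselves.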

  • (Score: 3, Interesting) by PiMuNu on Thursday June 14 2018, @09:39AM (1 child)

    by PiMuNu (3823) on Thursday June 14 2018, @09:39AM (#692789)

    AI is a trigger acronym for me. Please, just write "Google's algorithm". It really isn't a neural network anyway, and a neural network isn't an AI, so please don't write that.

    https://www.google.com/search/howsearchworks/algorithms/ [google.com]

    • (Score: 4, Insightful) by tfried on Thursday June 14 2018, @10:01AM

      by tfried (5534) on Thursday June 14 2018, @10:01AM (#692798)

      That's always a valid remark, and I've even spent half a thought on it while typing my post. But actually I found "AI" (pronounced with an ever so slightly ironic tone of voice) quite fitting this time, because:

      - In forming that (silly) association, the algorithm is actually exhibiting something that looks a lot like "creativity", which is a property that we do not generally think of when talking about algorithms.
      - The algorithm is trying to work with meanings and intentions, or at least it is employed in a setting where an understanding of meaning would be essential. As much as the algorithm is failing at the task, dealing with meaning is a core feature of intelligence.

      So you are right, this is actually an algorithm (of course), but it is deployed with that characteristic confidence that it can run without close supervision, as if it actually were intelligent.