
posted by martyb on Tuesday October 16 2018, @01:31PM   Printer-friendly
from the NSFW-NSFW-NSFW-NSFW-NSFW-NSFW dept.

Submitted via IRC for chromas

Bing Is Suggesting the Worst Things You Can Imagine

If you use Bing’s image search, you’re going to see the worst filth you can imagine.  Bing suggests racist terms and shows horrifying images. Bing will even suggest you search for exploited children if you have SafeSearch disabled.

We contacted Microsoft for comment, and Jeff Jones, Senior Director at Microsoft, gave us the following statement:

“We take matters of offensive content very seriously and continue to enhance our systems to identify and prevent such content from appearing as a suggested search. As soon as we become aware of an issue, we take action to address it.”

Update: Since publication, Microsoft has been working on cleaning up the offensive Bing suggestions that we mentioned. Based on our research, there are still many other offensive suggestions that have not yet been fixed, including a few that we’ve mentioned below. We are unsure if they are simply fixing the offensive items we pointed out, or if they are improving the algorithm.

Note: The screenshots here show what we saw when we wrote this piece testing the US version of Bing Image search in an Incognito private browsing session, but Bing’s results shift over time. Google didn’t have any of these problems, according to our tests. This is a Bing problem, not just a search engine problem. The same problem affects Bing’s video search.

[...] Microsoft needs to moderate Bing better. Microsoft has previously created platforms, unleashed them on the world, and ignored them while they turned bad.

We’ve seen this happen over and over. Microsoft once unleashed a chatbot named Tay on Twitter. This chatbot quickly turned into a Nazi and declared “Hitler was right I hate the jews” after it learned from other social media users. Microsoft had to pull it offline.

[...] Microsoft can’t just turn a platform loose on the world and ignore it. Companies like Microsoft and Google have a responsibility to moderate their platforms and keep the horror at bay.

Suggestions Have a History of Serious Problems

Of course, there’s no team of people at Microsoft choosing these suggestions. Bing automatically suggests searches based on other people’s searches. That means many Bing Images users are searching for antisemitism, racism, child pornography, and bestiality.

Please refer to TFA for actual search terms, suggested items, and images found.

Also at The Verge, BBC News


Original Submission

  • (Score: 2) by requerdanos on Wednesday October 17 2018, @01:50PM (4 children)

    by requerdanos (5997) Subscriber Badge on Wednesday October 17 2018, @01:50PM (#749951) Journal

    > TFA is actually talking about suggestions - i.e. the 'auto complete' options that pop up before you've finished typing your search terms.

    This is a search, of a dataset ("past searches"), that returns results (the suggestions).

    > Now, there's certainly a "free speech" debate about whether search engines should allow people to search for offensive subjects, but this isn't it.

    This is a free speech debate about whether search engines should allow searches of past search terms.

    > What we have here is a system that, albeit unintentionally, is encouraging people to search for offensive subjects, perhaps even by accident...

    It's not an accident--it's by design. If you type E-x-p-l-o-i, and many people have searched for "Exploited children", then that's one of the results for that substring because it matches. It is a valid search result. That doesn't mean it's going to be what you are looking for--maybe you're looking for exploits that your systems may be vulnerable to--only that it's a somewhat popular search term that matches what you've typed so far, and might or might not save you, as an investigative reporter, some typing with judicious use of the down arrow key. If it isn't what you're after, as in this instance, that's okay too.
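
    For illustration only, here is a minimal sketch of that kind of popularity-ranked prefix lookup. It is not Bing's actual implementation, and the query log and counts below are invented:

```python
from collections import Counter

# Invented stand-in for a log of past searches and how often each was run.
past_searches = Counter({
    "exploits for windows 10": 9_500,
    "exploited server checklist": 4_200,
    "explosion videos": 87_000,
    "explorer crashes on startup": 15_000,
})

def suggest(prefix, top_n=5):
    """Return the most frequently searched past terms that start with the typed prefix."""
    prefix = prefix.lower().strip()
    matches = [(term, count) for term, count in past_searches.items()
               if term.startswith(prefix)]
    matches.sort(key=lambda pair: pair[1], reverse=True)  # rank purely by past popularity
    return [term for term, _ in matches[:top_n]]

print(suggest("explo"))
# ['explosion videos', 'explorer crashes on startup', 'exploits for windows 10', 'exploited server checklist']
```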

    > Really, this isn't about free speech or political correctness - it's about a duff UI feature that (probably) lets script kiddies write bots to push their favourite terms onto the "suggestions" list.

    Your theory here implies that the only way anything "offensive" would become a popular search term that appears in suggestions is if some script kiddie or bot artificially inflated it, rather than the thing that offends you being something many other people actually search for. That's, respectfully, nuts. Due to a wide variety in tastes and potential offensiveness, people search for things that offend other people all the time, all day every day. No hacking need be involved.

    There may already be great logic in place, in fact, to detect artificial inflation measures and automatically reduce their importance--people searching for things that offend others is not dependent on this in any way.
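
    One hypothetical way such dampening could work, sketched below with invented data, is to rank a term by how many distinct users searched for it rather than by raw query volume, so a single scripted client gains little by repeating itself:

```python
from collections import defaultdict

# Invented (user_id, search_term) pairs; one "bot" client repeats a phrase many times.
query_log = [
    ("u1", "cute puppies"), ("u2", "cute puppies"), ("u3", "cute puppies"),
    ("bot", "spammed phrase"), ("bot", "spammed phrase"), ("bot", "spammed phrase"),
    ("bot", "spammed phrase"), ("u4", "spammed phrase"),
]

def dampened_popularity(log):
    """Score each term by how many distinct users searched it, not by raw hit count."""
    users_per_term = defaultdict(set)
    for user, term in log:
        users_per_term[term].add(user)
    return {term: len(users) for term, users in users_per_term.items()}

print(dampened_popularity(query_log))
# {'cute puppies': 3, 'spammed phrase': 2} -- raw hit counts (3 vs 5) would rank them the other way
```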

    > At the end of the day it's a vulnerability that undermines the impartiality of the search engine.

    Again, that depends on things that "offend" you only appearing if made popular by hacking or stuffing the ballot box, instead of being things that are actually searched for by actual people. Facts not in evidence. People search for things that offend others.

    Heck, that depends on the results being impartial in the first place, which they may or may not be.

    > If they just turned off the "suggestions" feature, nothing of value would be lost, and I'd much prefer it if my computer didn't spaff my typos onto the internet until I'd finished and hit "go".

    The issue unfortunately with free services like Bing isn't what I like or what you like; we aren't the customers. Google has suggestions, so Bing has them too, a feature parity effort to help attract advertisers, who are the customers. Plus recording your typos and partial searches gives them more data (I think MS calls it "telemetry"), and data is power.

    I'd say that Microsoft, given their commercial strategies, should just let the results be the results, and only act to censor them if their commercial interests may be negatively affected, which, in this case, they may be.

    If you'd like to stop having your typos sent out, probably best to stick to ddg and its spiritual brethren.

  • (Score: 2) by theluggage on Wednesday October 17 2018, @05:36PM (3 children)

    by theluggage (1797) on Wednesday October 17 2018, @05:36PM (#750045)

    > This is a search, of a dataset ("past searches"), that returns results (the suggestions).

    I'm pretty sure that most people go to a search engine with the intention of searching the dataset of websites, not the dataset of "past searches". Now if Google/Bing want to publish a tool to let people research the popularity of various searches [xkcd.com] then that's fine.

    > People search for things that offend others.

    You're still conflating censoring the actual search results with a buggy evil-son-of-clippy system that tries to second-guess what people want to search for.

    ...what other people choose to search for is none of my business and I would not presume to impose my preferences on them - but they can bloody well learn to type "pedophilia" in full without having it pop up as a suggestion when granny is trying to type "pedicure".

    As for "ballot stuffing" - neither of us has any evidence either way, but the similar (and far harder to achieve) art of Googlebombing [wikipedia.org] back in the day was definitely deliberate. Heck, all you'd need to do would be to print something like TFA and it would become true as people fired up Bing to check it out...

    • (Score: 2) by requerdanos on Wednesday October 17 2018, @07:32PM (2 children)

      by requerdanos (5997) Subscriber Badge on Wednesday October 17 2018, @07:32PM (#750098) Journal

      >> People search for things that offend others.

      > You're still conflating censoring the actual search results with a buggy evil-son-of-clippy system that tries to second-guess what people want to search for.

      No, that's how the "offensive" suggestions get into the suggestions database. Someone searched for them previously.

      • (Score: 2) by theluggage on Saturday October 20 2018, @11:50AM (1 child)

        by theluggage (1797) on Saturday October 20 2018, @11:50AM (#751369)

        > Someone searched for them previously.

        ...or wrote a script to deliberately make it look that way. There's a lot of troll-baiting going on in the world right now - from every edge of the political arena.

        In any case, so what? Where's the "free speech" in, basically, telling people what you think they ought to be searching for? The best auto-suggestions are no auto-suggestions. Now, if search engines want to transparently publish details of how many people are searching for what - on a clearly separate page - that's fine. Also, maybe they should be transparent about the algorithms they use to prioritise the actual search results - but that's nothing to do with these 'suggestions'.

        • (Score: 2) by requerdanos on Saturday October 20 2018, @03:40PM

          by requerdanos (5997) Subscriber Badge on Saturday October 20 2018, @03:40PM (#751410) Journal

          Many people who have trouble typing because of physical or neurological disabilities are assisted greatly by being able to type only a few letters rather than all of them. They should not be made to suffer for the paranoia of those convinced that the suggestion results are faked.

          Many other people simply find it convenient. Ditto for them.

          Demanding that the feature be discontinued because you personally believe that it might somehow be influenced by trolls says a lot about you, but doesn't contribute a lot to the betterment of mankind.