Submitted via IRC for chromas
Bing Is Suggesting the Worst Things You Can Imagine
If you use Bing’s image search, you’re going to see the worst filth you can imagine. Bing suggests racist terms and shows horrifying images. Bing will even suggest you search for exploited children if you have SafeSearch disabled.
We contacted Microsoft for comment, and Jeff Jones, Senior Director at Microsoft, gave us the following statement:
“We take matters of offensive content very seriously and continue to enhance our systems to identify and prevent such content from appearing as a suggested search. As soon as we become aware of an issue, we take action to address it.”
Update: Since publication, Microsoft has been working on cleaning up the offensive Bing suggestions that we mentioned. Based on our research, there are still many other offensive suggestions that have not yet been fixed, including a few that we’ve mentioned below. We are unsure if they are simply fixing the offensive items we pointed out, or if they are improving the algorithm.
Note: The screenshots here show what we saw when we wrote this piece testing the US version of Bing Image search in an Incognito private browsing session, but Bing’s results shift over time. Google didn’t have any of these problems, according to our tests. This is a Bing problem, not just a search engine problem. The same problem affects Bing’s video search.
[...] Microsoft needs to moderate Bing better. Microsoft has previously created platforms, unleashed them on the world, and ignored them while they turned bad.
We’ve seen this happen over and over. Microsoft once unleashed a chatbot named Tay on Twitter. This chatbot quickly turned into a Nazi and declared “Hitler was right I hate the jews” after it learned from other social media users. Microsoft had to pull it offline.
[...] Microsoft can’t just turn a platform loose on the world and ignore it. Companies like Microsoft and Google have a responsibility to moderate their platforms and keep the horror at bay.
Suggestions Have a History of Serious Problems
Of course, there’s no team of people at Microsoft choosing these suggestions. Bing automatically suggests searches based on other people’s searches. That means many Bing Images users are searching for antisemitism, racism, child pornography, and bestiality.
Please refer to TFA for actual search terms, suggested items, and images found.
(Score: 2) by requerdanos on Wednesday October 17 2018, @01:50PM (4 children)
This is a search, of a dataset ("past searches"), that returns results (the suggestions).
This is a free speech debate about whether search engines should allow searches of past search terms.
It's not an accident--it's by design. If you type E-x-p-l-o-i, and many people have searched for "Exploited children", then that's one of the results for that substring because it matches. It is a valid search result. That doesn't mean it's going to be what you are looking for--maybe you're looking for exploits that your systems may be vulnerable to--only that it's a somewhat popular search term that matches what you've typed so far and might or might not save you as an investigative reporter some typing with judicious use of the down arrow key. If it doesn't match, as in this instance, then that's okay too.
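The mechanism the parent describes--matching a typed prefix against a table of past searches and ranking matches by popularity--can be sketched in a few lines. All names here are illustrative; Bing's actual suggestion pipeline is certainly far more elaborate:

```python
from collections import Counter

def suggest(prefix, past_searches, limit=5):
    """Return the most popular past searches beginning with prefix."""
    counts = Counter(q.lower() for q in past_searches)
    matches = [(q, n) for q, n in counts.items()
               if q.startswith(prefix.lower())]
    # Rank by popularity, most-searched first; break ties alphabetically.
    matches.sort(key=lambda pair: (-pair[1], pair[0]))
    return [q for q, _ in matches[:limit]]

log = ["exploits", "exploits", "exploit database", "pedicure", "exploits"]
print(suggest("explo", log))  # → ['exploits', 'exploit database']
```

Note that the function has no notion of what any term means--only of what was typed before and how often--which is exactly why offensive terms surface if enough people search for them.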
Your theory here implies that the only way anything "offensive" would become a popular search term that appears in suggestions is if some script kiddie or bot artificially inflated it, rather than the thing that offends you being something many other people actually search for. That's, respectfully, nuts. Given the wide variety of tastes and thresholds for offense, people search for things that offend other people all the time, all day every day. No hacking need be involved.
There may already be great logic in place, in fact, to detect artificial inflation measures and automatically reduce their importance--people searching for things that offend others is not dependent on this in any way.
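One plausible shape for such logic--purely hypothetical, since neither of us knows what Bing actually does--is to cap how much any single source can contribute to a term's popularity, so that a bot hammering one query barely moves the needle:

```python
from collections import Counter

def weighted_counts(search_log, cap=3):
    """Count searches per term, but cap each source's contribution
    to any one term, damping ballot-stuffing by a single bot."""
    per_source = Counter((src, q) for src, q in search_log)
    totals = Counter()
    for (src, q), n in per_source.items():
        totals[q] += min(n, cap)  # a source counts at most `cap` times
    return totals

log = [("bot", "spam term")] * 100 + [("u1", "pedicure"), ("u2", "pedicure")]
print(weighted_counts(log))  # → Counter({'spam term': 3, 'pedicure': 2})
```

Under a scheme like this, a bot issuing the same query 100 times contributes no more than `cap`, while genuinely widespread searches still rise to the top--which is the parent's point: organic popularity doesn't depend on any hacking.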
Again, that depends on things that "offend" you only appearing if made popular by hacking or stuffing the ballot box, instead of being things that are actually searched for by actual people. Facts not in evidence. People search for things that offend others.
Heck, that depends on the results being impartial in the first place, which they may or may not be.
The issue with free services like Bing, unfortunately, isn't what I like or what you like; we aren't the customers. Google has suggestions, so Bing has them too--a feature-parity effort to help attract advertisers, who are the customers. Plus, recording your typos and partial searches gives them more data (I think MS calls it "telemetry"), and data is power.
I'd say that Microsoft, given their commercial strategies, should just let the results be the results, and only act to censor them if their commercial interests may be negatively affected, which in this case, they may.
If you'd like to stop having your typos sent out, it's probably best to stick to ddg and its spiritual brethren.
(Score: 2) by theluggage on Wednesday October 17 2018, @05:36PM (3 children)
I'm pretty sure that most people go to a search engine with the intention of searching the dataset of websites, not the dataset of "past searches". Now if Google/Bing want to publish a tool to let people research the popularity of various searches [xkcd.com] then that's fine.
You're still conflating censoring the actual search results with a buggy evil-son-of-clippy system that tries to second-guess what people want to search for.
...what other people choose to search for is none of my business and I would not presume to impose my preferences on them - but they can bloody well learn to type "pedophilia" in full without having it pop up as a suggestion when granny is trying to type "pedicure".
As for "ballot stuffing" - neither of us has any evidence either way, but the similar (and far harder to achieve) art of Googlebombing [wikipedia.org] back in the day was definitely deliberate. Heck, all you'd need to do would be to print something like TFA and it would become true as people fired up Bing to check it out...
(Score: 2) by requerdanos on Wednesday October 17 2018, @07:32PM (2 children)
No, that's how the "offensive" suggestions get into the suggestions database. Someone searched for them previously.
(Score: 2) by theluggage on Saturday October 20 2018, @11:50AM (1 child)
...or wrote a script to deliberately make it look that way. There's a lot of troll-baiting going on in the world right now - from every edge of the political arena.
In any case, so what? Where's the "free speech" in, basically, telling people what you think they ought to be searching for? The best auto-suggestions are no auto-suggestions. Now, if search engines want to transparently publish details of how many people are searching for what - on a clearly separate page - that's fine. Also, maybe they should be transparent about the algorithms they use to prioritise the actual search results - but that's nothing to do with these 'suggestions'.
(Score: 2) by requerdanos on Saturday October 20 2018, @03:40PM
Many people who have trouble typing because of physical or neurological disabilities are assisted greatly by being able to type only a few letters rather than all of them. They should not be made to suffer for the paranoia of those convinced that the suggestion results are faked.
Many other people simply find it convenient. Ditto for them.
Demanding that the feature be discontinued because you personally believe that it might somehow be influenced by trolls says a lot about you, but doesn't contribute a lot to the betterment of mankind.