Google is wrong, wrong, wrong to remove 'are Jews evil' from search autocomplete suggestions


Over the last few days there has been much wailing and gnashing of teeth over the discovery that if you type "are Jews" into Google, one of the suggested searches is "are Jews evil". The same is true for the searches "are women" and "are Muslims" ("bad" being the suggestion in the latter case). Or at least that was the case.

Following cries of anti-Semitism, the search giant folded like a moist tissue and removed the "offensive" suggestion. Clearly Google is able to do -- by and large -- whatever the hell it wants... but that doesn't make it right. And the removal of the "are Jews evil" suggestion is not only wrong, but also worrying and dangerous. If you disagree you can let off steam in the comments and cast a vote in the poll, but hear me out first.

I'm already anticipating several possible responses to this article. 1) No, Mark, you're wrong. Of course Google should censor this. 2) What are you? Some sort of bloody socialist? 3) Surely there are bigger things to be concerned about. 4) Think of the children! 5) You hateful anti-Semitic bastard. 6) (and I'm aware that this will be a tiny minority) Yes, Mark, you're absolutely bloody right.

Before we get into things too much, let's hear from Google. The company talks a little about the action it decided to take:

We took action within hours of being notified on Friday of the autocomplete results.

Our search results are a reflection of the content across the web. This means that sometimes unpleasant portrayals of sensitive subject matter online can affect what search results appear for a given query. These results don't reflect Google's own opinions or beliefs -- as a company, we strongly value a diversity of perspectives, ideas and cultures.

Autocomplete predictions are algorithmically generated based on users' search activity and interests. Users search for such a wide range of material on the web -- 15 percent of searches we see every day are new. Because of this, terms that appear in autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we acknowledge that autocomplete isn't an exact science and we're always working to improve our algorithms.
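Google doesn't publish the details, but the broad approach it describes -- ranking candidate completions by aggregate search activity, then suppressing ones on a manually maintained blocklist -- can be sketched in a few lines of Python. Everything below (the query log, the counts, the blocklist, the function names) is purely hypothetical and illustrative; it bears no relation to Google's actual implementation. It simply makes concrete the kind of after-the-fact intervention this article is about.

```python
from collections import Counter

# Toy model of frequency-ranked autocomplete with a manual blocklist.
# All data here is invented for illustration; it is not Google's system.

# Hypothetical aggregate query log (query -> number of times searched).
QUERY_LOG = Counter({
    "are jews evil": 12000,
    "are jews white": 9000,
    "are jews a race": 7500,
    "are women evil": 11000,
    "are women funny": 8000,
    "are muslims bad": 10000,
})

# Suggestions the operator has decided to suppress after the fact --
# the editorial intervention discussed in this article.
BLOCKLIST = {"are jews evil", "are women evil", "are muslims bad"}


def suggest(prefix, limit=3, censor=True):
    """Return the most-searched completions for a prefix.

    With censor=True, blocklisted suggestions are silently dropped,
    so the list no longer reflects what people actually search for.
    """
    candidates = [
        (query, count)
        for query, count in QUERY_LOG.items()
        if query.startswith(prefix.lower())
    ]
    candidates.sort(key=lambda item: item[1], reverse=True)
    if censor:
        candidates = [c for c in candidates if c[0] not in BLOCKLIST]
    return [query for query, _ in candidates[:limit]]


print(suggest("are jews", censor=False))  # ['are jews evil', 'are jews white', 'are jews a race']
print(suggest("are jews"))                # ['are jews white', 'are jews a race']
```

The point of the sketch is only this: the ranking is driven by what users actually type, and the blocklist is a human decision layered on top of it.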

Even though there is not a great deal of detail from Google here, there are a couple of interesting things to jump on straight away. The first is that Google appears to have fallen foul of its own algorithms -- just as happens to Facebook on an almost weekly basis. The second is the statement that "results don't reflect Google's own opinions or beliefs". If that is the case, what is the problem with suggesting that someone might want to ask the question "are Jews/women/Muslims evil?"? It's a reasonable question, even if the answer is pretty obvious (yes, some of each group are, just as with any given group).

If the initial, offensive, suggestions did not reflect Google's own opinions or beliefs, presumably the same applies to the censored/edited/interfered-with suggestions. If so, why bother interfering? If Google does indeed "strongly value a diversity of perspectives, ideas and cultures", why does it not strongly value the perspective of someone who believes that, or wants to question whether, Jews are evil?

What's worrying is that tinkering with suggestions in this way sets something of a dangerous precedent. Of course it does not stop you from performing any search you might care to -- although you might not necessarily see links to all relevant pages -- but part of the point of autocomplete suggestions is to provide a snapshot of what people around the world are searching for right now.

If Google starts censoring suggestions because they might be deemed offensive, we're not getting a true reflection of what's going on. Google is, in effect, playing editor of the web. That's worrying and dangerous, particularly for a company of the size and influence of Google. Just imagine if search result suggestions had been tweaked during the US election -- all sorts of terms could have been censored according to Google's whim... and, importantly, we'd probably never know about it.

It's important to point out that as well as not blocking the actual searches for the terms set out above, Google is also still delivering the same results as before -- censorship has not gone quite that far. But, if nothing else, this episode highlights just how much power and influence Google wields, or how much it is perceived to wield. So what if people are searching for pages that could reveal to them whether the entire global population of Muslims is bad? So what if people are looking to find out whether there are any pages out there that tell them womankind is evil? So bloody what?

Yes, the suggestion that Jews are evil is nasty and hateful, but there are nasty and hateful people out there with nasty and hateful views. Google providing a sanitized version of events will not change this. You don't make things go away by trying to hide them. The fact that this particular example of interference from Google centers on such a sensitive topic should not detract from the fact that Google could do the same thing for any search. That's powerful and scary stuff.

But back to the question posed earlier: do you think Google was right?

[poll="29"]

Image credit: Evan Lorne / Shutterstock.com

