Everyone knows exactly why they don't make search recommendations, and is there really anything we can say about it that won't devolve into trolling, flaming, and deliberate misunderstanding?
I don't think it's removed for the reasons you think it is.
It seems to be driven by what people search for. First option under "dead b" is "dead baby jokes". Try "death and r" and "death and return of superman" is above "death and resurrection of jesus". And one of the options under "death squads" is "death squads health care". If people don't use "Islam is ..." all that often, it's not going to be part of their suggestion tree.
PS: First option under "muslim is " is "muslim is wrong". Muslim is not actually a religion, but I think far more people in the US use it than islam.
Hmm - you could be right, but as the article takes pains to note, all of the obvious analogous searches do return suggestions, which I think is evidence against your idea.
Still, it's been pointed out that queries like "Muhammad is" still "work", which I think is evidence for your idea. :)
Right, I think the author of the article is not really interested in uncovering the (presumably) interesting heuristics for inclusion/exclusion in the autosuggest list, and is rather just throwing his prejudices out in the open.
I don't know what your theory is ("everyone knows exactly why"), but I'll put forth my thesis.
1) Autosuggestions are a sample of the most frequent phrases google users type in
2) This autosuggest list is analyzed statistically and a sentiment score is established, ranging from 0 (friendly) to 1 (hateful).
3) If the autosuggest list statistics reveal a certain characteristic (e.g. all hateful, or very controversial, etc.), then no suggestions get returned.
If I'm wrong that at least in principle this is an automatic, data-driven behavior, then I really feel sorry for the sod who is maintaining a list of what phrases aren't kosher.
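The three-step thesis above could be sketched roughly like this. Everything here is invented for illustration (the toy scoring function, the word list, the thresholds); it is a guess at a data-driven mechanism, not anything Google has published:

```python
# Hypothetical sketch of the three-step thesis above. The scoring
# function, word list, and thresholds are all invented for
# illustration; nothing here reflects Google's actual implementation.

def sentiment_score(phrase):
    """Toy stand-in for a real sentiment model: 0.0 (friendly) to 1.0 (hateful)."""
    hateful_words = {"evil", "false", "stupid", "terrorist"}
    words = phrase.lower().split()
    hits = sum(1 for w in words if w in hateful_words)
    return min(1.0, 2.0 * hits / max(len(words), 1))

def dropdown(candidates, max_mean=0.5, max_any=0.9):
    """Step 3: return the suggestions, or nothing at all if the list
    as a whole looks hateful or any single entry is extreme."""
    if not candidates:
        return []
    scores = [sentiment_score(c) for c in candidates]
    if sum(scores) / len(scores) > max_mean or max(scores) > max_any:
        return []  # suppress the entire dropdown, not just the bad entries
    return candidates[:10]
```

Note that suppressing the whole dropdown, rather than filtering individual entries, matches the observed behavior: for blocked prefixes there are no suggestions at all.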
Actually, I tried a few other examples using "minorities" as the start of the phrase and then using "are". They are all censored. Non-minorities aren't; e.g. "men are" works fine.
So my algorithm implosion theory is clearly wrong.
Probably not. "Island is" works. It's not even that you don't find offensive things in the dropdown; it's that there's suddenly no dropdown.
Weirder still is that "Muslim is $ADJECTIVE" is ungrammatical. Typing "Christian is" turns up some nonsensical results, not the same negative ones that "Muslim is" or "Christianity is" do.
I think this may very well be true. It has always been the case that some search terms simply don't yield suggestions. The fact that 'islam' is among them could very well be a mere coincidence.
Why tune the algorithm in a way that would make the service less useful just because some people are sensitive? I guarantee you I've been helped by google suggestions that occur less frequently on the web than, say, "islam is false" (which, judging from other religions, would be suggested if it were not blocked).
"white people are stupid" (without quotes) has 218 million results. It's one of the suggestions that google (presumably) blocks. That's many more results than a lot of useful suggestions ("michael jackson songs" has 88 million).
It turns out "(race or religion or nationality) are" is blocked for most races, religions, and nationalities.
Google is just being wise here. Why would they risk offending people who are already offended?
Not many would be offended when the completion suggests "niggers are.." - it is a thing of the past; "jews are.." - they are sane and educated enough not to go batshit; "indians are.." - pretty well integrated with western society. Muslims are neither that educated nor that well integrated with the western world, and we have a stinking situation on our hands, like occupied countries and the threat of more occupied countries.
The minute after Google suggests completions about Islam there will be forwarded mails like "GOOGLE IS RUN BY JEWS AND THEY HUMILIATE US ALL!!!1". It is an effective bit of propaganda and there's a possibility of causing a loss of market share. And mind you this is the least of what could happen. So, again, why would they even risk it at all?
I don't think you're right, BTW (generally in your comment). I expect it is simply that Google is avoiding any fallout. It's one of those current-affairs terms that eventually someone would think to Google, then write a blog post titled "look what Google suggests for islam is..."
Lose-lose all round. Best to go for the option the fewest people would give a crap about.
EDIT: Google doesn't offer suggestions for any of your other examples either, BTW.
You are saying terrorism works. You are saying that one of the most important companies in the world should discriminate in favour of one religion, because the adherents of that religion will use violence to get their way.
Google has chosen to position itself as an ethical company. Google's motto is "do no evil". They want us to think they enshrine all that is good.
Discriminating in favour of irrational and violent religious fundamentalists is evil.
Interestingly, the same is not true for Islam was ("founded", "founded by", "spread by the sword") or Muhammad is ("a false prophet", "satan", "the antichrist") or was ("born and raised in", "born", "illiterate"), or Allah is ("satan", "great", "not obliged") or was (which, intriguingly, has just a single suggestion, "a moon god").
I'm actually having a hard time finding anything else that is similarly blocked. Makes you wonder what the completion list had been popping up before they blocked it, eh?
It's not that hard to find other blocked terms. Just try something that's socially sensitive or offensive. Such as 'n----rs are ' or even 'chinese are ' (note: 'swedish are ' works).
I wouldn't be very surprised if they had some sort of algorithm to generate the blacklisted terms as opposed to manually cherrypicking them. We're talking about Google here.
Google could learn a lot by analysing which suggestions are presented and how the user reacts [chooses one, presses enter as is, leaves site, types something else]. You might be able to correlate this with how useful / how hateful the suggestions are.
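The reaction-analysis idea above could be sketched as a simple log-aggregation pass. The reaction names and weights below are invented for illustration; this is a guess at how one might turn the listed user reactions into a per-suggestion score, not anything Google is known to do:

```python
# Hypothetical sketch of the idea above: weight each user reaction to
# a shown suggestion and aggregate into a per-suggestion score.
# Reaction categories and weights are invented for illustration.

REACTION_WEIGHTS = {
    "chose_suggestion": 1.0,   # user picked it: clearly useful
    "pressed_enter":    0.2,   # ignored the dropdown: weak signal
    "retyped_query":   -0.5,   # suggestions missed the mark
    "left_site":       -1.0,   # possibly put off by what was shown
}

def usefulness(events):
    """events: list of (suggestion, reaction) pairs from query logs.
    Returns {suggestion: mean reaction weight}."""
    totals, counts = {}, {}
    for suggestion, reaction in events:
        totals[suggestion] = totals.get(suggestion, 0.0) + REACTION_WEIGHTS[reaction]
        counts[suggestion] = counts.get(suggestion, 0) + 1
    return {s: totals[s] / counts[s] for s in totals}
```

A score like this could in principle feed the kind of filtering discussed elsewhere in the thread: suggestions that consistently drive users away score low.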
Perhaps there is too much noise or too even a distribution - if there are too many nearly-equally weighted results for "Islam is" to make a good suggestion, does it not even try?
My pet theory was that it was based on the length of the query, since "Islam" was the shortest word on the list of terms. That got shot down by "Allah is ". I guess it really is censoring.
I haven't seen it pointed out yet that it doesn't actually block searches for "Islam is". It only blocks recommendations. It's not like they're censoring actual results here.
What they're doing is simply disabling a minor feature for a small number of potentially inflammatory use cases at the expense of about 10 seconds of amusement for you. If this offends you, I would think it says more about you than Google.
It doesn't exactly offend me; I just find it interesting and potentially troubling. It's part of a trend of going out of the way to avoid offending Muslims specifically.
I fear that things like Ireland's new blasphemy law, which came into force a few days ago, are the result of this trend. (Of course, it applies to mockery of all religions; in the West at least one can hardly get a law passed outlawing just mockery of Islam.)
I just read a comment elsewhere that such a trend "incentivises outrage", which I think is exactly right.
What I wonder is how the autosuggestions are generally generated.
From what users type in?
If so, and certain phrases get censored from autosuggest ("Islam is" is specifically on the censor list, but it is not the only phrase on it), then my conclusion would be that the majority of phrases googlers type starting with "Islam is" are hateful; hence the exclusion. In fact, which phrases get excluded might even be data-driven, as you can deduce sentiment from the most common phrases that users type in.
I can't speak for anyone else here, obviously, but I've never had my head cut off, whether by Islam, Muslims, or any related noun. "The plural of anecdote is not data" and all that, though, so take my story with a grain of salt, eh?
To be fair, religious affiliation seems unusually tightly bound to some people's sense of who they are. What's unclear to me is why certain religions have this worse than others, e.g. compare Muslim reaction to the Muhammad cartoons with Christian reaction to "Jerry Springer: The Opera".
I suspect if Jerry Springer: The Opera had been written by a group or country that certain sects of Christianity want dead, then the response would be the same.
It was groups of hardliners with their own agenda (aka not representing the religion) who, for the most part, provided the extreme reaction. The majority showed roughly the same response as to JS:TO.
So much for letting the algorithms rule their results. They could at least be somewhat equal about it and do the same thing about christianity and judaism, but then again, they know that nobody is going to chop someone's head off when they see that the first result for "christianity is" is "christianity is bullshit".
How long is the civilized world going to submit to this insanity? When is enough, enough?
Define "algorithm" because it looks like they still are letting algorithms rule results, they've just changed the algorithm. That seems like a pretty good idea to me, because if by 'algorithms rule results' you mean "page rank as released on Google 1.0", I'd be seeing some pretty crappy search results.
It's tough to get every case of this and still have an autosuggest. They also got "muslims are" in the block list, but even then "muslims can" returns "muslims can not be trusted" within the top couple results.
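The gap described above ("muslims are" blocked, "muslims can" slipping through) is exactly what you'd expect from an exact-prefix blocklist. This is a guess at the mechanism, with illustrative entries, not anything confirmed:

```python
# Guess at the mechanism: an exact-prefix blocklist suppresses the
# dropdown only when the typed query starts with a listed prefix,
# so near-miss phrasings slip through. Entries are illustrative.

BLOCKED_PREFIXES = ["islam is", "muslims are"]

def dropdown_enabled(query):
    """Return False when the query matches a blocked prefix."""
    q = query.lower().strip()
    return not any(q.startswith(p) for p in BLOCKED_PREFIXES)
```

Under this model, covering "muslims can" would require a separate blocklist entry, which is why an approach like this inevitably leaks.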
There's apparent favoritism here, or bias against, depending how you want to look at it. If there were no filtering going on either way, I'd have no complaint.
This is rather disturbing. I didn't live through a dictatorial government in my country, but about 30 years ago one that had ruled for about 50 years was put down. The worst thing about a dictatorship is not the censorship itself, which will cut from a text or a poem the parts they don't want read; it is when the author himself limits his expressiveness and free will to the walls built by fear of the censorship, and instead writes the version of his feelings that he thinks will not get cut.
Google is doing the same... actively self-censoring, the worst form of censorship.
Aren't they simply withholding phrase suggestions which they deem wrong? I.e. you can still search for anything ranging from the angelic to the vile, and they probably don't censor (but of course, there's no way for an outsider to ever know).
What I do wonder is whether Google, for example, actually filters out from its search results websites that are blacklisted by the Australian government, when people down under google.
There's been a popular trend of posting screenshots from Google search recommendations over the past few months; I wouldn't be surprised if one of those shots made it to Google Headquarters (though I'm certainly surprised they didn't notice it sooner!).
Also shame on PZ Myers for yoinking that article from atheists.org.
I submit something potentially useful, a guy who is going to open-source something that might be really interesting (http://news.ycombinator.com/item?id=1032278), and I get no love, while this ahem "sensationalist" blog with no proof whatsoever for its claims gets so many points. Wow!
It doesn't make any claims (aside from claims about the behaviour of the Google search field, which are easily verified). It asks a question. But don't feel bad - whether your submission gets voted up is half luck: either a couple of people spot it and vote it up so it becomes visible to everyone, or not.
Fair enough, I am unfamiliar with the misspellings. "Musli(s)m" sounds perfectly normal to me as a non-native speaker of English, and, judging from Google's suggestions, to a bunch of other people on the internet as well.
Search engines should avoid being in the business of egging people on. That has to be one messy program/algorithm/filter. We put one together for a retail company's emails and it stopped 20% of the incoming mail. It was pretty funny what combos you can get.