I'm a firearms owner and a supporter of the Second Amendment, and I still agree with the parent: highlighting Gun Control as a relevant article makes complete sense, since gun control is fundamentally about rights related to firearm ownership.
Right - of course. So the question is: are the other "things that matter" designed to:
1) Inform the user and help the user quickly find facts, narratives, and meaningful content related to the search phrase?
2) Sell things?
3) Promote modes of political control by narrowing the scope of acceptable thought and connection between search phrase and content?
We all hope for #1. We all accept #2. But I think that most of us, at least until recently, had thought that #3 wasn't happening to any significant degree.
I agree that results like this one are befuddling - a search for "Politically controversial and anti-state concept <foo>" brings as the first result, "Related but saccharine pro-state concept <bar>" when both foo and bar appear in approximately equal stature in the domain in question.
Yeah, I think that's odd. And I don't see a way to easily explain it in the context of numbers 1 and 2 above.
So I think you're right to be skeptical here. On the other hand, after a scan of each, I see two plausible reasons that the "Gun control" article might be elevated.
One, it's somewhat longer, with more references and cross-links (it even has a "See also" link to the "Gun rights" article).
Two, superficially, the "Gun control" article is a bit more internationally focused. The title of the "Gun rights" article is straight out of the U.S. Constitution. The appearance doesn't really hold up under scrutiny -- they both spend similar shares of their content on U.S. stuff -- but it might make a small difference.
Regardless of the reason, it's definitely good to pay attention to this sort of thing.
Try Googling the term "white couple" as an image search - you will be shown a page of interracial couples, not white couples. Not that I have a problem with interracial couples; my own family is interracial. But it's obvious how Google is attempting to influence the perspectives of its users.
I just opened up a private tab and did this (and by the way, going in, I was and am already suspicious that Google is in fact using search results to manipulate public opinion, at least some of the time and in an experimental way).
In the top two rows (i.e., eight images), three are from a Reddit post about this very controversy.
The remaining five are images of couples. Of these, three are what look to my eyes to be white couples. One is an image of an ethnically Indian couple in an article about not being able to adopt a white child. The other is a photo of two dudes doing a hand symbol together with a caption "black and white couple gesturing with hands."
All of these images seem very relevant to the phrase "white couple."
So, actually, this is a case where Google Image Search gives back perfectly appropriate results, at least for me as of today.
Is it though? The term “white couple” is already racially charged and its use across the internet will often be in that context.
Are you implying that there are Google employees manually tinkering with the result of millions of search terms for the purposes of influencing the American population? And nobody at Google has leaked it? That’s hilarious.
And how is that influence campaign going to work on a practical level? Are you assuming people are so dumb as to change their opinion on core political matters because their racially charged search term didn’t acid wash the internet of its diversity??
Any attempt to algorithmically shape perspectives is crude with today’s state of the art.
How, precisely, would such an algorithm work without creating obvious artifacts and errors throughout the search results for everyone using the service?
And how would it work only for people in the US and not elsewhere?
> For me or anyone to believe this, is a huge ask.
Agreed.
> Any attempt to algorithmically shape perspectives is crude with today’s state of the art.
I suspect that the truth of this statement is very difficult to ascertain at the moment. Reasonable people might dispute both which technologies represent the "state of the art" and also which outcomes represent successfully shaped perspectives.
Although I don't have the data to back up this assertion (and again, I'm not sure such data is even obtainable in the current environment), I think that, for some reasonable definitions of the above, perspectives can indeed be successfully shaped algorithmically today - though, as I said above, I think this "white couple" nonsense is not a meaningful example.
> How, precisely, would such an algorithm work without creating obvious artifacts and errors throughout the search results for everyone using the service?
Again, it may not be clear to anyone, including the authors, precisely how such an algorithm works. That's why A/B (and other) testing is needed.
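To make the A/B testing point concrete: a minimal sketch of how deterministic experiment bucketing typically works (the names and numbers here are hypothetical illustrations, not anything Google actually does):

```python
import hashlib

def assign_bucket(user_id: str, experiment: str, n_buckets: int = 2) -> int:
    """Deterministically map a user to an experiment bucket.

    Hashing (experiment, user_id) gives each user a stable bucket per
    experiment, so no per-user state needs to be stored, and different
    experiments slice the population independently of each other.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % n_buckets

# Hypothetical usage: bucket 1 sees a ranking tweak, bucket 0 is the control.
# Comparing click/engagement metrics between buckets is how an operator could
# learn what a ranking change does without understanding it precisely upfront.
variant = "experimental" if assign_bucket("user-42", "ranking-exp-7") == 1 else "control"
```

The point is that the experimenter doesn't need to predict the algorithm's behavior in advance; the bucketed comparison reveals it after the fact.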
And I think people in this thread are saying that indeed obvious artifacts are starting to creep up (although again I think that the "white couple" thing is a total misfire and not at all an example - the "gun rights" thing appears to my eyes to be much closer).
> And how would it work for people only in the us and not elsewhere?
Isn't this already trivial? Doesn't Google already do this all the time? If you travel outside North America, the Google experience is palpably different from the one in the US.
1. It's subjective whether the results are wrong. _You_ know what you are looking for; Google, however, has to guess. And its algorithms aren't perfect at figuring out _your_ intent - even though they may do a fine job for the average person. Right now, for me, the top result is from a Reddit thread about this very query. The rest are things like "black and white couple" and "couple with white background". These may not be what you expected, but it makes sense why they are being returned.
2. Your test does nothing to establish intent. A pretty reasonable explanation for the search results is just that that is what the algorithm does.
3. I'm sure someone can come up with a theory about how tweaking the results of this very specific search is a nefarious plot to promote interracial relationships - but it requires some pretty tortured logic to conclude that such a plan, even if real, would be even remotely effective at accomplishing its goal. That's basically the hallmark of a conspiracy theory.