> The author ignores the more pernicious problem: ML fairness. Google is editorializing results by up-ranking positions it prefers. Check the results on any controversial topic like race, politics, abortion, or covid treatments.
Not the OP, but I think he might be referencing this study from Columbia, where they found that the “Featured” articles were mostly left-leaning: https://dl.acm.org/doi/10.1145/3290605.3300683
The cause of this is debatable, though: it could be editorialisation, unconscious bias on the part of the people working on the algorithms, or qualities common to the websites that lean a certain way politically.
No, it's overt. I wish I could find this for you, but I saw a tweet from an apparently reputable source: a Google engineer saying "I'm so proud of the work our team has done to make it more difficult to find misinformation using Google". Naturally, "misinformation" means whatever the G-engineers want it to mean.
I'm not necessarily saying I don't believe you; obviously you'd know more about working there than I would. But can you explain why, in 2010, googling "evidence the moon landing was faked" returned a huge list of conspiracy sites, while the same query today only returns articles and websites alleging the exact opposite of what I searched for?
I currently work for YouTube. We identify conspiracy-seeking queries and then boost results from authoritative sources. I don't see why we wouldn't also be doing this on Google. Maybe people don't bother searching for things when the results suck (or maybe they just use alternative means of searching)?
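To make that concrete, here's a minimal sketch of what such a re-ranking step could look like. The keyword heuristic, domain list, and boost factor are all invented stand-ins for illustration, not anything from YouTube's or Google's actual systems:

```python
from dataclasses import dataclass

# Illustrative sketch only: a toy "authority boost" re-ranker.
# The keyword heuristic, domain list, and boost factor below are
# invented stand-ins, not the real system's classifier or values.

CONSPIRACY_HINTS = {"hoax", "faked", "cover-up", "conspiracy", "exposed"}
AUTHORITATIVE_DOMAINS = {"nasa.gov", "who.int", "britannica.com"}

@dataclass
class Result:
    url: str
    domain: str
    base_score: float  # relevance score from the normal ranking stage

def is_conspiracy_seeking(query: str) -> bool:
    """Crude stand-in for a trained query classifier."""
    return any(hint in query.lower() for hint in CONSPIRACY_HINTS)

def rerank(query: str, results: list[Result], boost: float = 2.0) -> list[Result]:
    """If the query looks conspiracy-seeking, multiply the scores of
    results from authoritative domains before sorting by score."""
    if not is_conspiracy_seeking(query):
        return sorted(results, key=lambda r: r.base_score, reverse=True)
    def score(r: Result) -> float:
        return r.base_score * boost if r.domain in AUTHORITATIVE_DOMAINS else r.base_score
    return sorted(results, key=score, reverse=True)

# Example: the fringe site has the higher base relevance score,
# but the authoritative source outranks it once the boost applies.
results = [
    Result("https://moonhoax.example/proof", "moonhoax.example", 0.9),
    Result("https://nasa.gov/apollo-evidence", "nasa.gov", 0.6),
]
for r in rerank("evidence the moon landing was faked", results):
    print(r.domain, r.base_score)
```

The net effect matches what the commenter above observed: for a flagged query, the sites that match the query's framing can still rank below "authoritative" sources even when their base relevance is higher.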
Source/Proof?