
So really the only way to identify high-quality information is by what some would call "peer-reviewed, authenticated content". That means you know something about the person who generated the content, and a bunch of people (whom you also know something about) review it and vote it up. That's why a large percentage of high-ranked search results come from Wikipedia, Stack Overflow/Exchange, and Quora.
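(To make the idea concrete, here's a rough sketch of what that kind of authenticated, reputation-weighted voting could look like. The names, weights, and formula are purely illustrative assumptions, not any particular site's ranking scheme.)

    from dataclasses import dataclass

    @dataclass
    class Reviewer:
        name: str          # identity is known ("authenticated")
        reputation: float  # how much we trust this person's judgement

    def content_score(author_reputation: float,
                      votes: list[tuple[Reviewer, int]]) -> float:
        """Score an item by its author's standing plus reputation-weighted votes.

        votes: (reviewer, +1/-1) pairs from people we also know something about.
        """
        weighted_votes = sum(r.reputation * v for r, v in votes)
        return author_reputation + weighted_votes

    # e.g. an answer from a known author, upvoted by two trusted reviewers
    # and downvoted by an account with no track record:
    score = content_score(
        author_reputation=2.0,
        votes=[(Reviewer("alice", 1.5), +1),
               (Reviewer("bob", 1.0), +1),
               (Reviewer("anon", 0.1), -1)],
    )

The point isn't the arithmetic; it's that both the author and the voters carry identities and track records, which is exactly what open web pages lack.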

LinkedIn and Facebook are leading the market in the authenticated, peer-reviewed content business, which is locked into their platforms and which search engines cannot index.

Extrapolate ten years down the line, and that points to a scenario in which the position of today's search-engine leaders is severely compromised.

And that's why the push for G+. It's do or die.

tl;dr: the goal is to gain search access to authenticated, peer-reviewed content, and to mitigate the risk that the existing market leaders in that space will cannibalize the search business, and with it the lucrative advertising business.




> the only way to identify high-quality information is by what some would call "peer-reviewed, authenticated content".

That's not the only way.

It is a method that scales.

You can also assess content yourself, based on known sources (whether fully identified or pseudonymous) or on indicators within the text itself.

Traditionally, the issue has been resolved through editors (not necessarily peer review) who would judge content on its merits and/or the reputation of its author(s). As Clay Shirky has noted, "it's not information overload, it's filter failure": we've moved from pre-publication filters to post-publication filters. The incentives on and for publication have also shifted, with a huge increase in low-quality information being promoted (most of what goes "viral"), something I'm increasingly sick of.

As for G+: as I've posted elsewhere (and Homer Slated points out very eloquently in a recent post), Google's formerly razor-sharp relevance algorithms are becoming increasingly vague. It's reached the point where it's glaringly obvious to me, and it manifests most painfully on G+ specifically.

Some of this might be attributed to SEO gaming of search, but tool design and selection (see "What's Hot" and the extremely limited search and noise controls on G+) make me increasingly think it's deliberate.



