
Just because most reactions seem to imply shady SEO problems:

Cutts clearly states that they are giving preferential treatment to a high-visibility site such as Gawker, while normal websites (i.e. internet peasants) would have been de-indexed in the same situation. Cutts' full-time job is communicating ranking-related issues, so I am quite certain this would have sounded a lot different if it had been implemented as a general rule for disaster-ridden IP ranges, etc.

I don't think he "clearly" states that. To me, it sounds like he is saying: our algorithms are robust to temporary service failures, so websites aren't penalised heavily.

As a genuine question: do you have any evidence for the de-indexing time of normal websites? (And evidence that they aren't restored as soon as the website is back up?)
