Hacker News

On one of my sites I received a million requests from Googlebot before lunchtime today. Those requests were spread across hundreds of IPs. I get similar traffic from them and other hyperscalers daily. I'm just one guy; I don't have load balancers and an endless engineering budget. I'm sick of this.



Do they not respect the crawl-delay directive in robots.txt?
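For reference, the directive being asked about is a non-standard robots.txt rule. A minimal sketch of what it looks like (note: Google's own documentation says Googlebot ignores `Crawl-delay`, though some other crawlers such as Bingbot honor it):

```
# robots.txt at the site root — asks crawlers to wait 10 seconds
# between successive requests. Non-standard: honored by some bots
# (e.g. Bing), but Google documents that Googlebot ignores it.
User-agent: *
Crawl-delay: 10
```

Because Googlebot does not honor this rule, reducing its crawl rate has historically required Search Console or server-side measures (e.g. returning 429/503 responses) instead.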



