But it's not a "light" protocol when you're serving 36MB per day when 500KB would suffice. RSS/Atom is lightweight, if clients play by the rules. This could just as well have been a news website; imagine how much traffic would be wasted on pointless transfers of unchanged data. Traffic isn't free.
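
To be concrete, "playing by the rules" mostly means HTTP conditional requests: the client stores the ETag/Last-Modified validators from its last fetch and sends them back, so an unchanged feed costs a 304 response with no body instead of the full document. A minimal sketch in Python using the requests library (the feed URL is made up):

    import requests

    FEED_URL = "https://example.com/feed.xml"  # hypothetical feed

    # Validators from the previous fetch; persist these between runs.
    etag = None
    last_modified = None

    def poll_feed():
        """Fetch the feed only if it changed since the last poll."""
        global etag, last_modified
        headers = {}
        if etag:
            headers["If-None-Match"] = etag
        if last_modified:
            headers["If-Modified-Since"] = last_modified

        resp = requests.get(FEED_URL, headers=headers, timeout=30)
        if resp.status_code == 304:
            # Unchanged: the server sent headers only, no body.
            return None

        # Store the new validators for the next poll.
        etag = resp.headers.get("ETag")
        last_modified = resp.headers.get("Last-Modified")
        return resp.text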

A similar problem arises from the increase in AI scraper activity. Talking to other SREs, the problem seems pretty widespread. AI companies will just hoover up data, but revisit so frequently and aggressively that it's starting to affect the transit fees for popular websites. Frequently the user-agent isn't set to anything unique, or is deliberately hidden, and the traffic originates from AWS, making it hard to target individual bad actors. Fair enough that you're scraping websites, that's part of the game when you're online, but when your industry starts to drive up transit fees, then we need to talk compensation.
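
For what it's worth, the "originates from AWS" part is at least checkable: AWS publishes its address space as JSON, so you can tell whether a request came out of AWS even when the user-agent lies. A rough sketch (Python, standard library only; the sample address is just for illustration):

    import ipaddress
    import json
    import urllib.request

    # AWS's published address ranges (documented public endpoint).
    AWS_RANGES_URL = "https://ip-ranges.amazonaws.com/ip-ranges.json"

    def load_aws_networks():
        with urllib.request.urlopen(AWS_RANGES_URL) as resp:
            data = json.load(resp)
        return [ipaddress.ip_network(p["ip_prefix"]) for p in data["prefixes"]]

    def is_aws(ip, networks):
        addr = ipaddress.ip_address(ip)
        return any(addr in net for net in networks)

    networks = load_aws_networks()
    print(is_aws("52.95.110.1", networks))  # True if currently in AWS space

Of course this only tells you a request is AWS-hosted, not which scraper sent it, which is exactly why individual bad actors are so hard to target.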
