Hacker News

Hold up, one of those things is not like the other. Are we really blaming webmasters for 100x increases in costs from a huge wave of poorly written and maliciously aggressive bots?

> Are we really blaming...

No, they're discussing the recent increase in fingerprinting / browser profiling and how it affects low-market-share browsers.


I saw that, but I'm still not sure how this fits in:

> The enemy has been trying to spin it as "AI bots DDoSing" but one wonders how much of that was their own doing...

I'm reading that as `enemy == fingerprinters`, `that == AI bots DDoSing`, and `their own == webmasters, hosting providers, and CDNs (i.e., the fingerprinters)`, which sounds pretty straightforwardly like the fingerprinters are responsible for the DDoSing they're receiving.

That interpretation doesn't seem to match the rest of the post though. Do you happen to have a better one?


"their own" = CloudFlare and/or those who have vested interests in closing up the Internet.

Your costs only went up 100x if you built your site poorly

I'll bite. How do you serve 100x the traffic without 100x the costs? It costs something like 1e-10 dollars to serve a recipe page with a few photos, for example. If you serve it 100x more times, how does that not scale up?
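To put the commenter's own numbers in perspective, here is the arithmetic as a quick sketch. The per-serve cost (~1e-10 dollars) is the commenter's hypothetical, and the baseline traffic figure is an assumption for illustration, not anything from the thread:

```python
# The commenter's hypothetical: ~1e-10 dollars to serve one recipe page.
cost_per_serve = 1e-10
baseline_serves = 1_000_000   # assumed monthly page views, purely illustrative
bot_multiplier = 100          # the "100x" wave of bot traffic

baseline_cost = baseline_serves * cost_per_serve
bot_cost = baseline_serves * bot_multiplier * cost_per_serve

print(f"baseline: ${baseline_cost:.4f}/month")  # $0.0001/month
print(f"with bots: ${bot_cost:.4f}/month")      # $0.0100/month
```

So yes, the cost does scale 100x under these assumptions, which is the crux of the disagreement: 100x of a tiny number may still be tiny, but 100x of a real hosting bill is not.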

It might scale up, but if you're anywhere near efficient you're way overprovisioned to begin with. The compute cost should be minuscule due to caching, and bandwidth is cheap if you're not with one of the big clouds. As an example, according to dang, HN runs on a single server, and yet many websites that get posted to HN (and thus receive a fraction of its traffic) go down under the load.
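The caching point can be sketched in a few lines. This is a minimal illustration (using Python's `functools.lru_cache` as a stand-in for any page cache, with a hypothetical `render_page` function), not anything from the thread: once a page is cached, repeated hits skip the expensive render entirely.

```python
import functools

RENDER_COUNT = 0  # counts how many times we actually do the expensive work

@functools.lru_cache(maxsize=1024)
def render_page(path: str) -> bytes:
    """Expensive render (DB queries, templating); cached by path."""
    global RENDER_COUNT
    RENDER_COUNT += 1
    return f"<html><body>Recipe at {path}</body></html>".encode()

# Serve the same page 100 times: only the first request pays the render cost.
for _ in range(100):
    body = render_page("/recipes/42")

print(RENDER_COUNT)  # 1
```

The same idea applies at every layer (reverse-proxy caches, CDN edge caches): 100x the requests for the same handful of pages costs roughly 1x the compute plus bandwidth.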

You got 100x the traffic if your traffic was near zero to begin with.




