
I'm investigating the same thing. But my bet is that they will either change the terms or lower your CDN cache size (thereby lowering performance; you can't serve popular videos without a CDN).

And the difference is that when that time comes you'll fail your customers, because you'll just get suspended (we've seen some cases here on the forum) and you'll have to come here to complain so the CEO/CTO restores service for you.




I don’t believe anybody on a paid plan has been suspended for using R2 behind the CDN? (I’ve seen the stories you’re alluding to; IIRC the cached files weren’t on R2.)

In their docs they explicitly present it as an attractive feature to leverage, so that’d surprise me.
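
For anyone unfamiliar, the pattern in question is just fronting an R2 bucket with a Worker so the edge cache absorbs repeat reads. A minimal sketch (the MY_BUCKET binding name and the TTL are mine, not from their docs; types from @cloudflare/workers-types):

    export default {
      async fetch(request: Request, env: { MY_BUCKET: R2Bucket }): Promise<Response> {
        // Serve from Cloudflare's edge cache when possible.
        const cache = caches.default;
        const hit = await cache.match(request);
        if (hit) return hit;

        // Cache miss: read the object from R2 by pathname.
        const key = new URL(request.url).pathname.slice(1);
        const object = await env.MY_BUCKET.get(key);
        if (!object) return new Response("Not found", { status: 404 });

        const headers = new Headers();
        object.writeHttpMetadata(headers); // stored Content-Type etc.
        headers.set("etag", object.httpEtag);
        headers.set("cache-control", "public, max-age=86400"); // let the CDN keep it

        const response = new Response(object.body, { headers });
        await cache.put(request, response.clone());
        return response;
      },
    };

(You can get a similar effect without a Worker by attaching a custom domain to the bucket and letting ordinary cache rules apply.)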

That being said, I’m not planning to serve particularly large files with any meaningful frequency, so in my particular case I’m not concerned about that possibility. (I’m distributing low-bitrate audio and small images, mostly.)

If I were trying to build YouTube or whatever I’d be more concerned.

Even so, with their storage pricing and network set up as they are, I think they’d make plenty of money off of a hypothetical YouTube clone.

I do think they’ll raise prices eventually. But it’s a highly competitive space, so it feels like there’s a stable ceiling.


See https://news.ycombinator.com/item?id=34639212. They got suspended for using Workers behind the CDN.

> I’m distributing low bitrate audio, and small images, mostly

This means the cache size needed would be much smaller, though.


Right, but as far as I understand from that thread, they were serving content that wasn't from R2. Not saying that justifies their treatment, only that it doesn't apply to my use case. They were also seeing ~30TB of daily egress on a non-enterprise plan, which would absolutely never happen in my case – 1TB of daily egress would be a p99.9 event.

Re cache size: maybe I've misunderstood what you mean by cache-size limiting, but yeah, that's my point – I don't need a massive cache for my application. My data doesn't lend itself to large, widely distributed spikes. Egress is spiky, but concentrated in a few files at a time: e.g. if there were a single day where 1TB was downloaded at once, 80% of it would be concentrated in ~20 files of ~400MB each (rough numbers worked out below).
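
Back-of-the-envelope with those (assumed) numbers:

    working set: ~20 files x 400MB = ~8GB resident in cache
    egress:      1TB x 80%         = ~800GB served from those files
    =>           800GB / 20 files / 400MB ≈ 100 downloads per hot file

So even a worst-case day needs only a single-digit-GB working set at the edge.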


He was within the terms, though. Workers had/have the same terms that R2 had before R2 got its new terms.

> They were also seeing ~30TB of daily egress on a non-enterprise plan, which would absolutely never happen in my case – 1TB of daily egress would be a p99.9 event.

I don't understand what media company you'd be competing against if you're using just ~30TB/month of bandwidth.



