
Worrying about page weight is kind of a red herring and a micro-optimization. You've really got to get into binary data streams before it's possible to blow through a data cap like this. For example, you could download the full text of English Wikipedia twice a day, every day, and stay under 1 TB for the month[1]. Text is cheap.
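Back-of-the-envelope, and note the ~12 GB figure is my ballpark for the compressed text-only dump (see the linked size page), not an exact number:

    # Rough monthly total for downloading the compressed English
    # Wikipedia text dump twice a day; the dump size is an assumption.
    dump_gb = 12            # compressed pages-articles dump, approx.
    downloads_per_day = 2
    days_per_month = 30
    monthly_gb = dump_gb * downloads_per_day * days_per_month
    print(monthly_gb)       # 720 GB, still under a 1 TB monthly cap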

This is part of why I chuckle at people tearing their hair out to minify and obfuscate JS or CSS files to shave off a handful of KBs, while their pages load a dozen uncompressed full-resolution images and a 1080p auto-playing video.
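For a sense of scale, here are some illustrative numbers (all of these are assumptions, not measurements of any particular page):

    # Compare a typical minification win against the media on the same page.
    minified_savings_mb = 0.05      # shaving ~50 KB off JS/CSS via minification
    images_mb = 12 * 3              # a dozen full-resolution JPEGs at ~3 MB each
    video_mb = 60 * 5 / 8           # 60 s of 1080p video at ~5 Mbps
    print(images_mb + video_mb, minified_savings_mb)   # ~73.5 MB of media vs. 0.05 MB saved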

[1] https://en.m.wikipedia.org/wiki/Wikipedia:Size_of_Wikipedia



