
Who cares about performance for a random blog? As long as you don't get hit by HN, of course.



On the other hand, why bother writing a blog if you don't expect any readers?

There's a hidden assumption here that "the HN crowd" = "a huge number of visitors". The typical HN crowd for a front-page article is on the order of 10,000 visitors. That's really not much on the scale of the internet.


Why implement premature optimisations if you expect 100 or 1000 readers a day? There's a huge difference between 10k an hour and 10k a day.


This argument doesn't make any sense to me.

It might be true for self-written blog software, but amusingly that tends to have quite good performance, either because it is a static generator, or because pages are rendered on the fly with very simple and hence fast templates.

However, there is no such excuse for popular CMSes. Those have been developed for 10+ years and serve a wide range of users, small blogs as well as large ones.

Finally, I wouldn't call static HTML pages "premature optimization", but rather "the natural thing to do". Let's look at the data access pattern: on average, articles are written once, seldom updated, and read at least 10x as often as they are written. With increasing popularity, this ratio shifts even further towards "read". [1] Since the datasets are small (on the order of KiB or MiB), complete regeneration is feasible. Moreover, it is much simpler and less error-prone than caching. And you can speed up site generation with classic build tools (make, tup, etc.), if you want.

[1] That is, with increased popularity more articles are written due to the increased motivation, but disproportionately more readers will arrive.
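
To illustrate the "regenerate everything, skip what's up to date" idea, here is a minimal Python sketch. The posts/ and public/ directories, the one-plain-text-file-per-article layout, and the inline template are assumptions for the example, not anything a particular CMS or generator does:

  #!/usr/bin/env python3
  """Minimal static regeneration sketch: one HTML page per article,
  rebuilt only when the source file is newer than its output (make-style)."""

  import html
  from pathlib import Path

  SRC = Path("posts")    # assumed input dir: one plain-text article per file
  OUT = Path("public")   # assumed output dir for the generated HTML

  TEMPLATE = """<!doctype html>
  <html><head><meta charset="utf-8"><title>{title}</title></head>
  <body><h1>{title}</h1><pre>{body}</pre></body></html>"""

  def build():
      OUT.mkdir(exist_ok=True)
      for src in SRC.glob("*.txt"):
          dst = OUT / (src.stem + ".html")
          # Skip articles whose output is already newer than the source.
          if dst.exists() and dst.stat().st_mtime >= src.stat().st_mtime:
              continue
          text = src.read_text(encoding="utf-8")
          dst.write_text(
              TEMPLATE.format(title=html.escape(src.stem),
                              body=html.escape(text)),
              encoding="utf-8",
          )
          print(f"rebuilt {dst}")

  if __name__ == "__main__":
      build()

The mtime check is the same trick make uses: serving the result is just static file delivery, and the write path stays a cheap batch job.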



