It might be true for self-written blog software, but amusingly those tend to have quite good performance ... either because they are static generators, or because they render pages on the fly with very simple, and hence fast, templates.
However, there is no such excuse for popular CMSes. They have been developed for 10+ years and serve a wide range of users, from small to large blogs.
Finally, I wouldn't call static HTML pages "premature optimization", but rather "the natural thing to do". Let's look at the data access pattern: on average, articles are written once, rarely updated, and read at least 10x as often as written. With increasing popularity, this ratio shifts even further toward reads. [1] Since the datasets are small (on the order of KiB or MiB), complete regeneration is feasible. Moreover, it is much simpler and less error-prone than caching. And if you want, you can speed up site generation with classic build tools (make, tup, etc.), which only rebuild outputs whose inputs have changed.
[1] That is, with increasing popularity more articles are written thanks to the increased motivation, but disproportionately more readers arrive.
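The make-style incremental regeneration mentioned above boils down to a timestamp comparison: rebuild a page only if its output is missing or older than its source. A minimal sketch in Python (all names here are hypothetical; a real generator would run templates instead of the trivial HTML wrapping shown):

```python
from pathlib import Path


def needs_rebuild(src: Path, dst: Path) -> bool:
    """Make-style check: rebuild if the output is missing or older than the source."""
    return not dst.exists() or src.stat().st_mtime > dst.stat().st_mtime


def generate_site(src_dir: Path, out_dir: Path) -> list:
    """Regenerate only the pages whose sources changed; return the rebuilt paths."""
    out_dir.mkdir(parents=True, exist_ok=True)
    rebuilt = []
    for src in sorted(src_dir.glob("*.txt")):
        dst = out_dir / (src.stem + ".html")
        if needs_rebuild(src, dst):
            # Stand-in for real template rendering.
            dst.write_text("<html><body><pre>%s</pre></body></html>" % src.read_text())
            rebuilt.append(dst)
    return rebuilt
```

Running this twice in a row rebuilds nothing the second time, which is exactly why full regeneration stays cheap at blog scale: the common case touches only the one article that changed.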