Does it have to be "sustainable"? It wouldn't work for Google, but it works for Gumroad (and, like they said, most open source projects). Gumroad is a pretty simple site (and I don't mean that as an insult). They don't deal with huge customers, sales processes, large data, complicated tech, etc. Most "hard" things can be outsourced to a SaaS tool.
You say that, but the other side of this argument is having to rebuild every. Single. Part. Of a program because nothing was designed with forethought and scalability in mind.
Great code is both simple and able to scale if that's needed in the future.
If it doesn't imply excess complexity or effort, it's not premature to optimize it. Why write the inefficient algorithm when you remember the efficient one and it costs no more effort? (Tiny example below.)
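A minimal sketch of that "no extra effort" case (the banned-users scenario is mine, not from the thread): picking a set over a list for membership tests is the same amount of code, but turns each lookup from a linear scan into a constant-time check.

```python
# Membership checks: the "no extra effort" optimization. Both versions are
# the same amount of code; the set version does O(1) lookups instead of O(n).

def banned_users_slow(usernames, banned_list):
    # list membership scans the whole list for every username
    return [u for u in usernames if u in banned_list]

def banned_users_fast(usernames, banned_list):
    banned = set(banned_list)  # one extra call, constant-time lookups
    return [u for u in usernames if u in banned]
```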
But if an optimization will imply excess complexity or effort, and you aren't sure it's needed, it's premature. Worrying about load when you have 3 users is almost assuredly premature. That's not to say you shouldn't ask yourself while building, "What happens if I get a bunch of users? Are there easy decisions that barely cost me any effort now but will let this scale more easily later, such as writing things with minimal shared state?" (see the sketch after this comment). But once you have optimization design or implementation choices that are going to noticeably delay launch, it's premature to select them.
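A sketch of that "cheap decision now, easier scaling later" idea, under assumptions of my own (the page-visit counter and names are hypothetical, not from the thread): keeping state behind a parameter instead of a module-level global costs almost nothing today, but leaves the door open to a shared store if a second server ever shows up.

```python
# Hypothetical sketch: keep state behind a parameter instead of a module
# global, so one process today can become several later without a rewrite.

# The "paint yourself into a corner" version: state lives in this process only.
_page_counts = {}

def record_visit_global(page):
    _page_counts[page] = _page_counts.get(page, 0) + 1

# Barely more effort now: the store is passed in, so a plain dict can be
# swapped for Redis or a database when a second server appears.
def record_visit(page, store):
    store[page] = store.get(page, 0) + 1

counts = {}                      # a dict is plenty while you have 3 users
record_visit("/home", counts)
```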
Premature optimization is extremely subjective. You can't actually make any decisions based on it, because the data you need to know whether it's premature lies in the future.
Conversely, we know you can't make an unscalable design scalable after the fact. You may be able to micro-optimize hotspots or fix the egregiously bad parts, but you've painted yourself into a corner.
The good news is that most scalable architectures are simpler and better in most other facets of the design so it’s largely a false dichotomy.
Yep; the quote was originally about in-line optimizations, like profiling to avoid cache misses, tightening inner loops, and such. Nowadays, with service throughput as your main concern, the bigger worry is your architecture, not your implementation.
Which is where the confusion is, possibly. Architectures you can't easily undo; that loop or choice of data structure you can. So it's generally worth giving careful consideration to your architecture (it's not premature to ask "What happens when this one thing becomes two?"); it's generally not worth giving the same up-front thought to your implementation (sketch below).
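To illustrate why implementation choices are easy to undo (a hypothetical top-k example of mine, not from the thread): as long as callers only see the function, the data structure behind it can be swapped in one place if profiling ever shows it matters.

```python
import heapq

# Implementation choices stay local: callers only see top_k(), so replacing
# a full sort with a heap later is a one-function change, not a redesign.

def top_k(scores, k):
    # simple version; fine until a profiler says otherwise
    return sorted(scores, reverse=True)[:k]

def top_k_heap(scores, k):
    # drop-in replacement if the simple version ever shows up hot
    return heapq.nlargest(k, scores)

assert top_k([3, 1, 4, 1, 5], 2) == top_k_heap([3, 1, 4, 1, 5], 2) == [5, 4]
```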
"Premature optimization is the root of all evil" is a thought-terminating cliché and a tautology - if it's premature, it was done too early.
When people discuss the phrase, the conversation degenerates into random proclamations about when one should worry about things, proclamations that immediately break when applied to specific situations.
Well, yes. Sustainability doesn't only mean "surviving expected large growth"; it also means, for example: here's a culture that the company likes; can that culture continue to thrive through the coming years?
So why overcomplicate things?