> there is basically no situation in which it is important to optimize for binary size, embedded sure, but nowhere else

Not disagreeing that there are many upsides to static linking, but there are (other) situations where binary size matters: rolling updates (or scaling horizontally), for example, where the time is dominated by how long it takes to copy the new binaries to each host.
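
To make that concrete, a back-of-the-envelope sketch (every number is a made-up assumption, just to show where the time goes and how it scales with binary size):

    # Rough rollout-time estimate dominated by copying the binary to each host.
    # All numbers are illustrative assumptions, not measurements.
    binary_size_gb = 2.0      # statically linked binary with everything baked in
    hosts = 500               # fleet size
    batch_size = 50           # hosts updated per rolling-update batch
    bandwidth_gbps = 1.0      # effective per-host download bandwidth

    # Assume hosts within a batch pull the binary in parallel.
    seconds_per_batch = binary_size_gb * 8 / bandwidth_gbps  # ~16 s for 2 GB at 1 Gbps
    batches = hosts / batch_size
    print(f"copy time per rollout: ~{seconds_per_batch * batches / 60:.1f} minutes")

The copy time scales linearly with binary size, so halving the binary roughly halves that figure, which starts to add up once you're doing many rollouts a day.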

> the model where an OS is one to many with applications works fine for personal machines, it's no longer relevant for most servers

Stacking services with different usage characteristics to increase utilization of the underlying hardware is still relevant. I wouldn't be surprised if loading very commonly included libraries dynamically could save significant memory across a fleet... and while the standard way this is done is fragile, it's not hard to imagine something as reliable as static linking, especially when you're already using something like Buck to build the world on every release anyway.
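
For a rough sense of what page sharing already buys on a single Linux box, you can compare Rss (shared pages counted in full for every process) against Pss (shared pages split proportionally among the processes mapping them); the gap approximates what sharing of libraries, forked copies, etc. saves. A sketch only, assuming Linux and permission to read other processes' /proc/<pid>/smaps_rollup:

    # Compare Rss vs Pss across all readable processes; the difference approximates
    # memory saved by page sharing (shared libraries, forked copies, mmapped files).
    # Sketch only: assumes Linux with /proc/<pid>/smaps_rollup available.
    import glob, re

    rss_kb = pss_kb = 0
    for path in glob.glob("/proc/[0-9]*/smaps_rollup"):
        try:
            text = open(path).read()
        except (PermissionError, FileNotFoundError, ProcessLookupError):
            continue  # not our process, or it exited mid-scan
        rss_kb += sum(int(v) for v in re.findall(r"^Rss:\s+(\d+) kB", text, re.M))
        pss_kb += sum(int(v) for v in re.findall(r"^Pss:\s+(\d+) kB", text, re.M))

    print(f"Rss {rss_kb / 1024:.0f} MiB, Pss {pss_kb / 1024:.0f} MiB, "
          f"saved by sharing ~{(rss_kb - pss_kb) / 1024:.0f} MiB")

With one-static-binary-per-service deployment, the library-sharing part of that gap mostly disappears; whether it's large enough to matter depends on the fleet and how many services share the same big dependencies.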
