Heh. I'm doing work for a "MegaCorp" right now trying to shave a few milliseconds off of ad render times. We have metrics that show milliseconds do matter. Especially since a few milliseconds on desktop can translate to hundreds of milliseconds on mobile.
I'm sure they do care, but they seem to care more about everything else. Or they're just victims of decades of bloat. Or of too many teams, where one team obliterated the tuning a hundred other teams did.
Because the end result is typically something that takes orders of magnitude longer than you'd imagine physically possible on today's hardware.
1.3 MB on every embed? Is that the best the brightest engineers can come up with?
Or maybe the goal is to slow down the entire internet so that their own sites don't seem so slow in comparison?
I know it's easy for me to talk shit, but have you actually used the web lately? It is a miserable experience.
I'm probably misremembering the details because I can't find it, but didn't Netflix have an absolutely enormous favicon or something that went unnoticed for a really long time? Feels like there are a few of those in any big codebase. The question is whether the codebase really needs to be big.