It's not objectively wrong to do this. Suggesting that "SoA is always right" tells me that maybe you don't understand the trade-offs between expediency, readability, and performance... or the trap of premature optimisation. Always measure before optimising; otherwise, write code that's easier to read and simpler to write.
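For concreteness, here's a minimal sketch of the two layouts being traded off; the names (`Particle`, `Particles`, `integrate_x`) are illustrative, not from any particular codebase:

```cpp
#include <vector>

// Array-of-structures: easy to read and write; each particle is one object.
struct Particle {
    float x, y, z;
    float vx, vy, vz;
};
std::vector<Particle> particles_aos;

// Structure-of-arrays: each field lives in its own contiguous array,
// which is friendlier to the cache when a loop only touches some fields.
struct Particles {
    std::vector<float> x, y, z;
    std::vector<float> vx, vy, vz;
};
Particles particles_soa;

// The SoA win only materialises in hot loops like this one, which streams
// through x and vx contiguously; the unused y/z/vy/vz arrays never enter
// the cache lines being read.
void integrate_x(Particles& p, float dt) {
    for (std::size_t i = 0; i < p.x.size(); ++i) {
        p.x[i] += p.vx[i] * dt;
    }
}
```

Outside of loops like that, the AoS version is usually the simpler code to maintain, which is the trade-off in question.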
> Twitter is so bad it crashes my browser. It's 90% text!
Not sure how knowing how to micro-optimise for cache effects fixes those issues. Profiling and spending time on perf does, but they've decided it's just not worth their time.
> Ehh, skeptical. Unless your data & algorithms are structured as contiguous arrays of primitives to begin with you're going to have trouble refactoring the abstraction.
The key word is refactor. Developers and organisations do it all the time - or at least they ought to.
Disclaimer: Used to work in the video games industry. Literally worked for years doing perf, and cache-level optimisation.
I don't think Twitter is crappy because of cache misses; I think it's crappy because they don't have a solid understanding of what their code is doing. Ignorance of caching is just another symptom.
If you build your game on an entity-component system, you can't "just refactor" that to make it contiguous. Your structure is pretty much baked in.
> I don't think Twitter is crappy because of cache misses; I think it's crappy because they don't have a solid understanding of what their code is doing.
It's my hunch that you have little idea of what Twitter's codebases are like, or which factors are actually behind the usability issues.
> If you build your game on an entity-component system, you can't "just refactor" that to make it contiguous.
Moving to object pooling of components is very doable. There might be some extra work, like extracting a generic matrix hierarchy and the physics primitives, but you'd only do that if you found that d-cache misses on v-table lookups, or i-cache misses from update-code churn, were a real concern... and they usually can be mitigated with homogenised object pools for the particular use case.
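As a rough illustration of what pooling components of one type into contiguous storage could look like - a sketch only, with a hypothetical `TransformComponent`; a real ECS refactor involves far more plumbing:

```cpp
#include <cstddef>
#include <vector>

// Hypothetical component; in a real engine this would come from the ECS.
struct TransformComponent {
    float x, y, z;
};

// A homogenised pool: every component of this type lives in one contiguous
// array, so the per-frame update walks memory linearly instead of chasing
// pointers through individually heap-allocated entities.
class TransformPool {
public:
    std::size_t create() {
        components_.push_back(TransformComponent{});
        return components_.size() - 1;  // handle = index into the pool
    }

    TransformComponent& get(std::size_t handle) { return components_[handle]; }

    // One tight loop over contiguous data: keeps the d-cache warm and
    // avoids virtual dispatch entirely.
    void update_all(float dx) {
        for (auto& t : components_) {
            t.x += dx;
        }
    }

private:
    std::vector<TransformComponent> components_;
};
```

Entities then hold handles into pools like this rather than owning their components directly, which is the "extra work" mentioned above.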
Edit: I'd expect anyone on an engine team, or technical programmers at a game dev shop, or maybe even someone working on a browser renderer to have such knowledge from day 0, rather than the average FAANG employee.