There's an attitude I've seen bandied about a lot in recent years that "unused RAM is wasted RAM." In a literal sense, this is true. However, it's nearly always misapplied. Unless your program is likely to be the raison d'être for that computer's existence, you shouldn't assume the user has all that RAM just so your program can use it. The user probably bought that RAM for something else, and you shouldn't feel justified in slurping it all up yourself.
I've only ever seen this when explaining to people why Linux appears to be using all their RAM - it caches your disk to make subsequent reads faster, and when an application needs more memory, the cache is evicted immediately and at almost no performance cost.
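You can see the distinction for yourself in /proc/meminfo, which reports both MemFree (RAM that's literally unused) and MemAvailable (the kernel's estimate of what it could hand to applications after dropping reclaimable caches). A minimal sketch in Python, assuming a Linux kernel recent enough (3.14+) to expose the MemAvailable field:

```python
# Minimal sketch: parse /proc/meminfo (Linux-only) to show why "free" RAM
# looks low while available RAM is plentiful. Values are reported in kB.
def meminfo():
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            fields[key] = int(value.split()[0])  # take the number, drop "kB"
    return fields

m = meminfo()
print(f"MemTotal:     {m['MemTotal']:>12} kB")
print(f"MemFree:      {m['MemFree']:>12} kB  (looks alarmingly low)")
print(f"Cached:       {m['Cached']:>12} kB  (page cache, reclaimed on demand)")
print(f"MemAvailable: {m['MemAvailable']:>12} kB  (what applications can actually get)")
```

The `free` command from procps-ng makes the same distinction with its "available" column, which is why "used" memory on Linux is a misleading number on its own.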
It's completely insane to suggest the user's RAM is yours to consume. Some people have 64 GB of memory in their desktops; others have 4 GB in a $300 laptop because that's all they could afford; some have 2 GB in a cheap phone.
> I've only ever seen this when explaining to people why Linux appears to be using all their RAM - it caches your disk to make subsequent reads faster, and when an application needs more memory, the cache is evicted immediately and at almost no performance cost.
That's where it's taught, I think, and it's certainly true in that context. But more than a few times I've encountered it as a defense for things like bloated chat programs slurping up gigabytes of RAM.
With respect to hardware diversity, I think part of the problem is that most programmers do their development on powerful hardware and become accustomed to it. Certainly nobody wants to sit around for an hour waiting for a build to finish on low-end hardware when a powerful computer, which they or their employer can easily afford, could finish it in minutes. But because of that, they lose touch with the end users who will be running the software on very modest hardware.