We set up a Jitsi Meet server for those cases when the customer does not have Teams, or Teams decides not to work.
Jitsi does work flawlessly for us in Chrome, Firefox and Edge.
I set up a private server for friends, too, to keep in contact with their family.
Jitsi is a fine tool, especially with headsets (wtf, use a headset, dad!). The main bottleneck seems to be the clients' ISP downstream, and the dedicated apps on tablets seem to get out of sync.
Just to clarify: the performance penalty of these apps is not what bothers me most. It's more the registering of protocols that are susceptible to attacks, opening local ports on your computer that are browser-accessible, installing startup drivers/services, and other things done in the name of "user convenience" that persist well beyond me closing the apps.
I can live with a fat app, but anything that insists on staying resident and carries serious negative trade-offs even when I'm not using it falls, in my opinion, under malware.
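If you want to verify the "browser-accessible local ports" complaint on your own machine, here's a minimal sketch assuming a Linux box with iproute2's `ss` installed (a desktop app's helper process will show up here if it keeps a socket listening on localhost after you close the window):

```shell
# List TCP sockets listening on the loopback interface.
# Helper services that persist after an app is "closed" often appear here,
# and anything on 127.0.0.1 is reachable from a page running in your browser.
ss -ltn src 127.0.0.1
```

On other platforms the equivalent would be `netstat -an` or `lsof -iTCP -sTCP:LISTEN`; the point is the same, checking what stays bound after you quit the app.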
The funny thing is that if you disable all the GIF and animated-GIF features, it behaves much better.
I also disabled spell checking, automatic replacement of emojis, and that kind of not-really-useful thing.
On my work computer it went from using around 60% of a core to almost nothing.
I went looking for these options after reading your message. I now wonder why in the world "Allow animated images and emoji" is in the Accessibility section of the preferences, separated from the other options for images and media? It seems oddly located to me.
Still doesn't excuse it from eating up all your resources. It's not a great productivity tool if you can't do anything else on your computer when it's open.
There's an attitude I've seen bandied about a lot in recent years that "unused RAM is wasted RAM." In a literal sense, this is true. However, it's nearly always misapplied. Unless your program is likely to be the raison d'être for that computer existing, you shouldn't assume the user has all that RAM so that your program can use it. The user probably bought all that RAM for something else, and you shouldn't feel justified in slurping it all up yourself.
I've only ever seen this when explaining to people why Linux appears to be using all their RAM - it caches your disk to make subsequent reads faster, and when an application needs more memory the cache will be evicted immediately and at almost no performance cost.
It's completely insane to suggest the user's RAM is yours to consume. Some people have 64 GB of memory in their desktops, others have 4 GB on their $300 laptop because that's all they could afford, and some have 2 GB on their cheap phone.
> I've only ever seen this when explaining to people why Linux appears to be using all their RAM - it caches your disk to make subsequent reads faster, and when an application needs more memory the cache will be evicted immediately and at almost no performance cost.
That's where it's taught I think, and certainly it's the truth in that context. But more than a few times I've encountered it as a defense for stuff like bloated chat programs slurping up gigabytes of RAM.
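The page-cache behavior described above is easy to see directly on a Linux machine (assuming `/proc/meminfo` is readable, which it is on any stock kernel):

```shell
# The kernel's own memory accounting. "Cached" is the reclaimable disk
# cache, while "MemAvailable" estimates how much memory applications could
# still claim without swapping -- cache counts as available, app RSS does not.
grep -E '^(MemTotal|Cached|MemAvailable):' /proc/meminfo
```

The distinction is exactly the one that matters in this argument: cached pages are given back the instant an application asks, whereas gigabytes held by a chat program's heap are not.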
With respect to hardware diversity, I think part of the problem is most programmers do their development on powerful hardware and become accustomed to it. Certainly nobody wants to sit around for an hour waiting for their build to finish on low-end hardware when a powerful computer, which they or their employer can easily afford, could finish the build in minutes. But because of that, they lose touch with end users who will be running that software on very modest hardware.
And I hate that Covid means that I am pretty much forced to use half a dozen of them each day.