There are many reasons. The elephant in the room is that it’s JavaScript and includes a full-blown browser with a JS VM.
When I write a C or C++ app that uses libcurl, and a library I depend on uses it too, both link against the same shared libcurl (mapped once in RAM), as do all other running processes. With NPM, you can’t share libraries across process boundaries, and you’ll often find the same package imported multiple times within a single app. In JS, it’s like statically linking all of your dependencies, but even less efficient. It can be argued that this is as good as it is bad.
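To make that duplication concrete, here's a toy sketch of how npm's nested resolution can install separate copies of the same package when different dependencies pull it in. The package names and tree shape are hypothetical, just for illustration:

```javascript
// Hypothetical nested node_modules tree: both my-app and some-lib depend
// on "request", which in turn depends on "tough-cookie", so two copies
// of each end up installed side by side.
const tree = {
  "my-app": {
    "request": { "tough-cookie": {} },
    "some-lib": { "request": { "tough-cookie": {} } },
  },
};

// Walk the tree and count how many separate copies of each package exist.
function countCopies(node, counts = {}) {
  for (const [name, deps] of Object.entries(node)) {
    counts[name] = (counts[name] || 0) + 1;
    countCopies(deps, counts);
  }
  return counts;
}

const counts = countCopies(tree["my-app"]);
console.log(counts); // → { request: 2, 'tough-cookie': 2, 'some-lib': 1 }
```

With shared C libraries, every process in the count above would map one copy of the code; here each copy is separate bytes on disk and separate objects in the heap.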
When you build a code ecosystem atop JS, the underlying inefficiencies multiply as each layer is added. VS Code is more efficient, but even 192MB is still pathetically bloated.
Expect Atom to eventually approach and surpass 1GB of RAM, not less... at least in the short term.
At a high level, it's because Atom is built from components, and each of those components is made up of hundreds of components, and each of those is made from hundreds of components, etc. Each component is general-purpose in some way, and includes code that isn't used by Atom. To be slightly more specific, when I pull apart components like this to figure out why they're so large (like a 100MB XML-parsing library), the vast majority of the bulk is usually huge tables of data, such as unicode mappings. Most likely, Atom contains multiple components which contain the same data tables, but because of the hierarchical nature of modern software development, no one can cut through the layers and DRY up (denormalize?) that data.
I think developers probably tree-shake the HTML/CSS/JS that is running inside of Electron, but tree-shaking all of Chromium can't be the easiest thing to do. Not that I disagree with you; that should be a part of the Electron publishing/packaging step.
Atom is ALSO a text editor. It's really a web browser in "to go" mode, so it brings all its dependencies and a whole lot more along with it. I'm sure there are multiple copies of some components brought along for the ride. I've seen this with quite a few NPM-based projects, e.g. three copies of the same module because of dependencies in other modules.
"Imagine if the apple you were eating for breakfast had 291 ingredients, or if the car you drove to work had 291 parts. You’d be worried, wouldn’t you?"
That's literally comparing...apples and cars. I would be shocked if a car only had 291 parts.
I'm not in front of a Mac right now, nor do I use Atom on said Mac, but you can right-click on atom.app and choose Show Package Contents to get a directory layout of what's in there. It won't break out the binary and statically compiled libraries, but you can see if it's using large assets, or dragging along a bunch of dynamic libs.
FWIW, I am shopping for low-end laptops for kids in my extended family, and Dad could use a new machine, but he wants Windows OS. These machines spec out to 4GB RAM, 64 or 128GB eMMC or SSD.
That's a surprising expectation to an older developer coming from the days when a home computer was less than a decent used car and a developer's workstation was more than some new cars.
it's not sarcasm. 4GB is a pathetic amount of ram for a new computer in 2018. the last computer I had with 4GB of ram was a laptop I bought in the late '00s. a 2x4GB kit of DDR4 for a laptop costs about $50 on newegg right now.
the kids should be glad to have any computer a relative will buy for them, but it's probably wasteful to purchase such an underprovisioned machine for an adult who will actually take care of it.
Looking at laptops with my dad (he's buying for himself as a retired person), the laptops he liked were either $500-600 with decent specs (i3 with hyperthreading) except low RAM, or overkill (i7) for $800+.
I found it difficult to justify the more beefy machine aside from RAM, and paying $200+ for 4gb more RAM is silly, so he'll probably get the cheaper option and upgrade RAM if it becomes a problem.
The sad part is that Windows takes nearly half of that, but he said he only needs word processing, viewing pictures of grandkids, and some light browsing. That same thing was quite reasonable a few years ago on 2gb, but now everything is bloated to the point where 4gb is feeling tight...
As a developer, I consider 16gb to be a minimum for my dev machine, but why do basic computers need 8gb these days? Tablets do about the same workload, but only need 3-4gb max...
first of all, you are definitely right that it is silly to pay $200 more when all you care about is the extra 4GB ram. if i were in that position, i would buy the 4GB laptop and immediately upgrade the ram. the problem is that not everyone is comfortable upgrading ram themselves, and many modern ultraportables have the ram soldered in anyway. 4GB is honestly okay as a bare minimum for 2018, but it will rapidly become the system bottleneck in the coming years.
> why do basic computers need 8gb these days? Tablets do about the same workload, but only need 3-4gb max...
mobile software is still designed pretty differently than its desktop/laptop counterparts. take the browser for instance, probably the most widely used software by light users. while i type this post, firefox is taking up over 1GB just to have five tabs open, and it seems to use 100-200 MB more for each additional tab. that's a lot of memory, but the upside is that each tab is fully loaded in memory and can be switched to in an instant. on my phone, firefox uses way less memory, but only the current tab (or maybe a couple recently viewed ones) is actually loaded in memory. switching tabs on mobile triggers a page reload from disk or over the network, orders of magnitude slower than the desktop experience. certainly there are some lazy devs out there that waste our memory, but at the core there is an intrinsic tradeoff between memory usage and performance. it's always faster to load from memory than to read from storage. i doubt you would want to trade desktop/laptop class performance for mobile class memory footprint on your main device.
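that tradeoff can be modeled as a tiny recency cache. this is just a sketch of the idea, not how any real browser manages tabs: keep only the N most recently viewed tabs "in memory", and pay a slow reload whenever you switch back to an evicted one:

```javascript
// Toy model of mobile-style tab management: only `capacity` tabs stay
// loaded; older ones are evicted and cost a "reload" on the next visit.
class TabCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.loaded = new Map(); // Map preserves insertion order = recency
    this.reloads = 0;
  }
  visit(url) {
    if (this.loaded.has(url)) {
      this.loaded.delete(url); // cache hit: instant switch, refresh recency
    } else {
      this.reloads++; // cache miss: slow reload from disk/network
      if (this.loaded.size >= this.capacity) {
        const oldest = this.loaded.keys().next().value;
        this.loaded.delete(oldest); // evict to cap memory use
      }
    }
    this.loaded.set(url, true);
  }
}

const cache = new TabCache(2); // small memory budget, like a phone
["a", "b", "c", "a"].forEach((tab) => cache.visit(tab));
// "a" was evicted when "c" loaded, so revisiting it costs a fourth reload.
console.log(cache.reloads); // → 4
```

raise the capacity and the reload count drops but memory use grows; that's the whole desktop-vs-mobile dial in one parameter.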