
The fact that you need me to "show data" that Moore's Law exists is, in and of itself, hostile, but on top of that this comment pretty solidly demonstrates that my intuition is correct.



You are the one who came in here and disputed my claim (and those of others as well) without a single shred of evidence. You still refuse to substantiate your disagreement.

Nobody is disputing Moore's Law (and this is the first time you've brought it up!). The dispute is over whether people still experience slowdowns and other bad experiences as a result of excessive memory consumption relative to available resources, despite Moore's Law. Several other people and I have told you right here that we have, and you refuse to acknowledge our collective experience largely because you personally haven't shared it.

My ego isn't bruised. But I and others reasonably expect significantly more than a "drive-by" non-substantive negation of a claim that comes directly from personal and professional experience (i.e., gaslighting) -- especially on Hacker News, where the audience is supposed to be largely composed of mathematicians, scientists, and others who possess better-than-average ability to think deeply, logically, and in a nuanced fashion.


You are wrong when you say, "Memory is a precious resource." No "precious resource" doubles every two years. You can either accept that objective fact, or you can continue to try and weasel your way around it, but the fact will remain.


First, average primary system memory in a typical laptop does not double every two years. Growth of average installed memory has been linear, not exponential (see, e.g., https://techtalk.pcmatic.com/research-charts-memory/). Second, even if it did double, software can consume memory faster than it can be provisioned; no law prevents developers from writing software that uses an arbitrary amount of memory. Put differently, adding more memory does not ensure that software won't consume it, just as adding more freeway lanes doesn't guarantee against gridlock, and moving from an apartment into a mansion doesn't guarantee it won't fill up with stuff.
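
To make the linear-vs-exponential gap concrete, here's a back-of-envelope sketch (the 4 GB starting point and 1 GB/year rate are illustrative assumptions, not figures from that chart):

    # Illustrative only: linear RAM growth vs. Moore's-Law-style doubling.
    for year in range(0, 11, 2):
        linear = 4 + year                # assumed ~1 GB/year linear growth
        doubling = 4 * 2 ** (year / 2)   # doubling every two years
        print(f"year {year:2}: linear {linear:2} GB vs doubling {doubling:3.0f} GB")

    # year  0: linear  4 GB vs doubling   4 GB
    # year 10: linear 14 GB vs doubling 128 GB

Even from the same starting point, the curves diverge quickly, which is why "it doubles every two years" and "it grows linearly" are very different claims.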


Aaand now you've gone on to dispute Moore's Law, as if the specificity of the "two years" part was at all critical or even important to our conversation.

Also, that link you provided seriously undercuts your own argument -- you do realize that, right? It very clearly shows that over 90% of computers have 4 GB or more of memory installed, which is plenty to run multiple Electron apps.


I encourage you to read my argument from the top again. I speak in terms of probabilities, not absolutes. I don't disagree that many people might not notice performance degradation when running multiple Electron apps. However, it is an incontrovertible fact that (all other things being equal) an Electron app will consume more memory than a native app; and some people will experience swapping and reduced performance when running Electron apps that they wouldn't experience if they were running only native apps. Also, keep in mind that people often run a healthy mix of apps at once--both native and Electron--and they could run more of them without risking swapping if fewer of those apps were Electron (again, all other things being equal). The closer you get to exhaustion, the more economy of consumption matters.
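
As a concrete back-of-envelope (all figures here are hypothetical, chosen only to illustrate the division; real footprints vary widely):

    # Hypothetical figures for illustration only.
    total_mb = 4096          # the 4 GB machine from the chart you cited
    baseline_mb = 2560       # assumed OS + browser baseline (~2.5 GB)
    electron_mb = 500        # assumed resident size per Electron app
    native_mb = 100          # assumed resident size per native equivalent

    headroom_mb = total_mb - baseline_mb               # 1536 MB left for apps
    print(headroom_mb // electron_mb, "Electron apps before swapping")  # 3
    print(headroom_mb // native_mb, "native apps before swapping")      # 15

The exact numbers don't matter; the point is that the same headroom divides out very differently, and a mixed workload starts swapping sooner the more of it is Electron.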

I just can't see how this is that controversial a claim.


What you have here is not a controversial claim. It's also not what you've been arguing until this moment, but for whatever reason you've softened your position substantially, now to the point of (IMO) banality.

What's controversial (because it's false) is the claim that, "Memory is a precious resource." That is not a true statement.


We call memory a "precious resource" because it is often a fixed quantity in a given computer, and often the most expensive component after the display unit. Many laptops these days do not offer upgradeable memory, and even when they do, they often have very few slots in which to add it. So for many people, an upgrade involves an entire unit replacement at significant cost. I think most people understand this, so again, I don't see how it's particularly controversial.


Humans are not limited to one computer in their lifetime, so the resources of one computer at one moment in time are not relevant to this discussion.

Additionally, your own citation shows that at this moment in time a vanishingly small number of computers have an amount of memory that would result in any kind of performance degradation due to the use of one or a few Electron apps.

Therefore, Moore's Law applies, and we can safely say that resources which double every two years are not scarce.

Your continued insistence on a false fact will continue to be "controversial".


This discussion has never been about whether the aggregate amount of computer memory in the world is a fixed quantity. (At least, that's not what I meant to discuss, or how I think most people would interpret my claim.) It's about the impact on real people who have laptops with fixed amounts of memory in their hands today.


The impact on real people who have laptops is low because the aggregate amount of computer memory in the world is doubling every two years.

It doubled, people bought new laptops, and they reaped the benefits by running multiple Electron apps seamlessly on them.

So it remains a false statement to say, "Memory is a precious resource."



