Hacker News

What do you need 32GB for? Not saying you don't, but what's the use case vs. using someone else's cloud machine in a datacenter?



Virtual machines

Edit: for comparison, Lenovo T460p can be configured with quad-core i7, Nvidia GPU, 32GB RAM, 1 SSD + 1 HD/SSD, replaceable battery, 1.9mm key travel, base starts at $800 with many components user-replaceable, http://shop.lenovo.com/us/en/laptops/thinkpad/t-series/t460p...


Hell, my three and a half year old W530 has a quad-core i7, Nvidia GPU (Optimus), 32 GB RAM, and a pair of Samsung PRO 850 SSDs.

My one year old MBP (which is really a "2013 MBP") has half the memory and storage of the ThinkPad. In fact, pretty much the only noticeable difference between it and the "new" one is the touch strip. :/


What kind of battery life do you get on the W530 vs your 2013 MBP?


I have a rMBP, but these T460p's are excellent machines; they can even come with a touchscreen. If I had to have only one laptop, and it had to be Windows....


Has much work been done on truly memory-light virtual machines? It seems like a really big opportunity, even if it's really hard.


It's not that it's hard, it's just the logistics of running the virtual machine(s) and the applications they contain. You could use a minimal Linux distro like Alpine or Tiny Core, but you still need to run applications on top of that if you're testing or developing.

Spinning up a basic devstack instance (for example) takes a minimum of 6GB, and that's before you even deploy any test VMs inside that infrastructure. Another example: if you're doing config management development you may need several VMs running, which in turn may host (say) large Java apps with heavy memory requirements even when fairly unladen. So, I guess the answer is: it depends on what you're doing and on the memory requirements of the thing you're running in the VMs.


Hyper-V has dynamic memory, so memory resources can be reallocated as needed, and it has driver hooks so that Linux VMs can be resized too. There's also Intel's Clear Containers push, which virtualizes for Linux but shares a lot of kernel structures between the host and the VM.


> It seems like a really big opportunity, even if it's really hard.

It's not, because it's a problem that's easily solved by spending a small amount on better hardware. 16GB of RAM costs $80, which is cheap if you're only going to use it for VMs.


Unikernels are one area of research, http://unikernel.org


Yeah if someone worked out how to use less memory in a VM we could use it in normal machines too ;)


For a 512GB SSD with no 2TB upgrade option, a 14" vs. 15" screen, and the crappiest Nvidia graphics option even on the high-end $1500 model... What a joke, those are MBP 2013 specs.


The point was that even the low-end Lenovo has 32GB RAM, the topic of this subthread. If you want the latest hardware features, there's a 15" Xeon P50 model with USB-C, etc, http://shop.lenovo.com/us/en/laptops/thinkpad/p-series/p50/


Different use cases. Any laptop that offers 32GB is going to need a more powerful chipset, which affects heat, battery life, and portability.


It's not hard to bump into the 16 GB limit when doing video editing, photo editing, software development with virtual machines, etc. All of those tasks are commonly done by freelancers on-the-go, which necessitates a pro-level laptop. Instead we got a gimmicky touch bar and lost compatibility with decades of peripherals.


Not to mention the perfectly reasonable expectation to leave a few hundred tabs open, each with a window for their own project.


Well, couple this fact with OS X being absolutely SHITTY at window management and task-switching when you're doing this, and you've got a good argument for abandoning the platform.


Used Windows for 15 years, OS X for the last 10, and I haven't found a panacea between the OS, the applications I need to run, and my essential human right to run 100+ tabs. Maybe 32 GB of RAM will do it.


This.


I'm a programmer with 64GB machine and I'm running out of RAM daily. Can't wait until IT upgrades our machines to 192GB some time next year.


I don't think I've ever used more than a gigabyte of RAM programming. I could even do it on a Raspberry Pi. What exactly in your workflow uses 64GB of RAM?


C++ games programming. Build process uses several gigabytes, but running the game in debug configuration takes 50-55GB because we store every allocation at the moment. If I need to run my own servers or bake data I go over that easily.


When you say "store every allocation" do you mean you never release anything, or do you just mean you store info about every allocation? If it's the former, that sounds kind of crazy, is it common for game developers to do that? If it's the latter, you could always write it to disk (which is what malloc stack logging does).


We had a random memory corruption problem recently, so we started storing every allocation without releasing, to verify periodically. We do free up old memory, but only every 10 million allocations or so.

Maybe it's not super common for games programming, but it's definitely common to not use ref counted pointers or anything that could help you here.
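[The "store info about every allocation" alternative mentioned above (what malloc stack logging does) can be sketched with Python's stdlib `tracemalloc`; this is a hypothetical illustration of the idea, not the poster's actual C++ tooling, and the sizes are made up:]

```python
import tracemalloc

# Record a stack trace (up to 10 frames) for every allocation from here on.
tracemalloc.start(10)

# Simulate a workload making many large allocations.
blobs = [bytearray(1024 * 1024) for _ in range(20)]

# A snapshot groups recorded allocations by source line, so the biggest
# allocation sites can be inspected later without keeping the memory alive.
snapshot = tracemalloc.take_snapshot()
top = snapshot.statistics("lineno")[0]
print(f"largest allocation site: {top.size / 1e6:.1f} MB across {top.count} blocks")
```

Logging metadata like this (and optionally writing snapshots to disk) is much cheaper than never freeing the allocations themselves, at the cost of not catching use-after-free corruption the way the keep-everything approach can.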


Makes sense. Thank you!


Machine learning stuff - whilst training datasets are usually cloud-deployed, dev data alone can use up a lot of RAM. I've recently started dumping my matrices to disk for dev work. Or I turn off Chrome and Firefox, which turn out to be the largest memory hogs on my Ubuntu machine.


16 GB RAM: perhaps 13 GB available to the user. If you run Chrome/Spotify/Slack/an editor, you're often left with only 8 GB usable.

ML work commonly uses data that is 8 GB+ -- and regularly 32 GB+ -- just for the data itself. Yes, you can work on remote servers, but it's convenient to be able to work and develop locally.
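[One way to "dump matrices to disk" as described above is to back the data with a memory-mapped file, so the OS pages rows in on demand instead of holding the whole matrix in RAM. A minimal stdlib sketch with made-up dimensions and file name; in practice one would more likely reach for `numpy.memmap`:]

```python
import mmap
import os
import struct
import tempfile

ROWS, COLS = 1000, 256            # toy matrix dimensions (assumption)
ITEM = struct.calcsize("d")       # 8 bytes per float64

# Write the matrix to disk once, row by row, so it never lives in RAM whole.
path = os.path.join(tempfile.mkdtemp(), "dev_matrix.bin")
with open(path, "wb") as f:
    for r in range(ROWS):
        f.write(struct.pack(f"{COLS}d", *([float(r)] * COLS)))

# Memory-map it: reading row 500 touches only the pages holding that row.
with open(path, "rb") as f:
    mm = mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ)
    off = 500 * COLS * ITEM
    row = struct.unpack(f"{COLS}d", mm[off : off + COLS * ITEM])
    print(row[0])  # 500.0
    mm.close()
```

The resident memory cost is then bounded by the working set of pages actually touched, not the on-disk size of the data.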


I remember upgrading my notebook from 1 to 2 GB of RAM (back in 2009) to speed up Git. It really benefits from additional filesystem cache.



No, that's not why. If you have any application that consumes memory without releasing it, adding more memory just delays the inevitable.

As for the RAM in the rMBP: I'm only half as disappointed as I would have been a year ago. My workflow is pushing more and more into the cloud, so instead of running a bunch of small VMs locally I can have VMs sized properly for the task at hand rather than limited by my laptop configuration.


A lot of major projects, including Android and Chrome, recommend 32 GB of RAM to compile.


4 tabs.


I wish this was a joke


Seriously?

Somebody reply with a link so I can be more upset, please.


Modeling, like COMSOL. It runs much faster when there is enough RAM to keep it from hitting disk, even when the disk is an SSD.


What do you need "a brighter display" for? Or "2.5x more bass in the speakers"?


That's the beauty of Capitalism. It doesn't matter if someone, no matter the position in society, doesn't understand why someone would need 32GB. As long as there's sufficient demand, it'll be produced by some business.


It's not someone else's cloud.


You are the entirely predictable and, sorry, very irritating "who needs more RAM than I have ever needed" stereotype of every comment board ever. 16GB is peanuts for anybody doing anything serious in graphics, video, machine learning, statistics, or finance, or ....[put your professional subject here].

Not everybody wants to have the weight on their back of a clock-ticking cost of doing their R&D online in the cloud. Many of us, including me, want a highly capable machine with an upfront, quantifiable cost, but that is professionally credible.


Hear, hear. The anecdata being thrown around in these comments is ridiculous. Like the above, and also the people saying "everyone I know has a desktop". So what? I need a powerful laptop, and I don't need to justify it to you. End of story. Apple's insistence on limitations is ridiculous and just as offensive as Bill Gates'.


I run VMWare Fusion on my 2012 Macbook Pro; call it a travel/demo computer. 16GB allows only a couple of minimally configured VMs to run at once.


640K ought to be enough for anybody.


Rubbish, I've got 1280K of EMS for Borland C++.

http://mos6581.com/pictures/5170-canvas/DCSF0009.jpg


This is highly subjective and anecdotal, but I find OS X to use a lot of RAM.

The in-OS memory compression helps, but when I still had a 16GB Macbook Pro, the system always found a way to use up all of the RAM to the point that the compression would kick in to handle the overage over my physical memory.

My habits aren't any different in terms of extraneous windows/apps open on Windows, and I rarely hit 100% RAM utilization on my 16GB Windows machine.


The system should use all the RAM. You paid for RAM, why have it sit there unused? As long as the next user process gets the RAM it asks for, I want the system to use all my RAM to cache everything.

Edit: even to the point of compressing pages, since it's faster to uncompress them than fetch from disk.
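[A toy stdlib illustration of why compressed pages pay off, with assumed sizes: memory pages are often highly redundant (a zeroed 4 KB page is the extreme case), so the compressed copy is tiny, and decompressing it is generally far cheaper than a round trip to disk:]

```python
import zlib

page = bytes(4096)                 # a zeroed 4 KB page: the common, redundant case
packed = zlib.compress(page)       # what a compressed-memory store would keep
restored = zlib.decompress(packed) # "page in" by decompressing, not by disk I/O

print(f"{len(page)} bytes -> {len(packed)} bytes compressed")
```

The tradeoff is CPU time for decompression versus I/O latency for a page fault to disk, which is why compressed memory sits between RAM and swap rather than replacing either.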


If your 16 GB machine is sitting around with 8 GB free, it's not doing you any good. It's much better for the OS to be actively using it.



