Edit: for comparison, the Lenovo T460p can be configured with a quad-core i7, Nvidia GPU, 32GB RAM, 1 SSD + 1 HDD/SSD, a replaceable battery, and 1.9mm key travel; the base model starts at $800 and many components are user-replaceable: http://shop.lenovo.com/us/en/laptops/thinkpad/t-series/t460p...
Hell, my three-and-a-half-year-old W530 has a quad-core i7, an Nvidia GPU (Optimus), 32GB RAM, and a pair of Samsung 850 PRO SSDs.
My one-year-old MBP (which is really a "2013 MBP") has half the memory and storage of the ThinkPad. In fact, pretty much the only noticeable difference between it and the "new" one is the touch strip. :/
I have a rMBP, but these T460p's are excellent machines, and can even come with a touchscreen. If I had to have only one laptop, and it had to be Windows...
It's not that it's hard, it's just the logistics of running the virtual machine(s) and the applications they contain. You could use a minimal Linux distro like Alpine or Tiny Core, but you still need to run applications on top of that if you're testing or developing.
Spinning up a basic DevStack instance (for example) takes a minimum of 6GB, and that's before you even deploy any test VMs inside that infrastructure. Another example: if you're doing config management development, you may need several VMs running, which in turn may host (say) large Java apps with heavy memory requirements even when fairly unladen.
So I guess the answer is: it depends on what you're doing and what the memory requirements are for whatever you're running in those VMs.
Hyper-V has dynamic memory, so memory can be reallocated between VMs as needed, and it has driver hooks so that Linux VMs can be resized too. There's also Intel's Clear Containers push, which virtualizes Linux but shares a lot of kernel structures between the host and the VM.
>It seems like a really big opportunity, even if it's really hard.
It's not, because it's a problem that's easily solved by spending a small amount on better hardware. 16GB of RAM costs $80, which is cheap if you're only going to use it for VMs.
You get only a 512GB SSD with no upgrade option vs. 2TB, a 14" screen vs. 15", and the crappiest Nvidia graphics option even on the high-end $1500 model... What a joke; those are 2013 MBP specs.
It's not hard to bump into the 16 GB limit when doing video editing, photo editing, software development with virtual machines, etc. All of those tasks are commonly done by freelancers on-the-go, which necessitates a pro-level laptop. Instead we got a gimmicky touch bar and lost compatibility with decades of peripherals.
Well, couple this with OS X being absolutely SHITTY at memory management and task-switching when you're doing this, and you've got a good argument for abandoning the platform.
Used Windows for 15 years, OS X for the last 10; haven't found a panacea between the OS, the applications I need to run, and my essential human right to run 100+ tabs. Maybe 32GB of RAM will do it.
I don't think I've ever used more than a gigabyte of RAM programming. I could even do it on a Raspberry Pi. What exactly in your workflow uses 64GB of RAM?
C++ games programming. The build process uses several gigabytes, but running the game in a debug configuration takes 50-55GB because we store every allocation at the moment. If I need to run my own servers or bake data, I go over that easily.
When you say "store every allocation," do you mean you never release anything, or just that you store info about every allocation? If it's the former, that sounds kind of crazy; is it common for game developers to do that? If it's the latter, you could always write it to disk (which is what MallocStackLogging does).
We had a random memory corruption problem recently, so we started keeping every allocation around without releasing it, in order to verify them periodically. We do free up old memory, but only every 10 million allocations or so.
Maybe it's not super common in games programming, but it's definitely common not to use ref-counted pointers or anything else that could help you here.
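For anyone curious, the pattern looks roughly like this. This is a toy sketch, not our actual code: the canary value, the fixed-size table, and the sweep interval are all invented for illustration, and real code would want a lock and a hash map instead of a linear scan.

    #include <cstdint>
    #include <cstdio>
    #include <cstdlib>
    #include <cstring>
    #include <new>

    namespace {
    constexpr uint64_t kCanary        = 0xDEADBEEFCAFEF00Dull;
    constexpr size_t   kMaxRecords    = 10'000'000;  // fixed table: ~240MB of static storage
    constexpr size_t   kSweepInterval = 10'000'000;  // "every 10 million allocations or so"

    struct Record { void* ptr; size_t size; bool freed; };

    Record g_records[kMaxRecords];  // zero-initialized, usable before dynamic init
    size_t g_count  = 0;
    size_t g_allocs = 0;

    // Check every tracked block's canary, then finally reclaim the ones
    // whose delete already happened, compacting the table as we go.
    void sweep() {
        size_t kept = 0;
        for (size_t i = 0; i < g_count; ++i) {
            Record& r = g_records[i];
            uint64_t canary;
            std::memcpy(&canary, static_cast<char*>(r.ptr) + r.size, sizeof canary);
            if (canary != kCanary) {
                std::fprintf(stderr, "corruption: %p (%zu bytes)\n", r.ptr, r.size);
                std::abort();
            }
            if (r.freed) std::free(r.ptr);   // reclaim old memory now
            else         g_records[kept++] = r;
        }
        g_count = kept;
    }
    }  // namespace

    void* operator new(size_t size) {
        void* p = std::malloc(size + sizeof kCanary);
        if (!p) throw std::bad_alloc{};
        // Canary lives just past the user-visible bytes to catch overruns.
        std::memcpy(static_cast<char*>(p) + size, &kCanary, sizeof kCanary);
        if (g_count < kMaxRecords) g_records[g_count++] = {p, size, false};
        if (++g_allocs % kSweepInterval == 0) sweep();
        return p;
    }

    void operator delete(void* p) noexcept {
        if (!p) return;
        // Don't release yet: mark it so the next sweep can both check the
        // canary and free it. This is why memory use balloons.
        for (size_t i = 0; i < g_count; ++i)
            if (g_records[i].ptr == p) { g_records[i].freed = true; return; }
        std::free(p);  // untracked block (table was full)
    }

The point is just that nothing is truly released until a sweep runs, so every block stays resident in between, which is where the tens of gigabytes come from.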
Machine learning stuff - whilst training datasets usually live in the cloud, dev data alone can use up a lot of RAM. I've recently started dumping my matrices to disk for dev work, or turning off Chrome and Firefox, which turn out to be the biggest memory hogs on my Ubuntu machine.
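(To make the dump-to-disk trick concrete: write the matrix out once, then memory-map it back so the pages are file-backed and evictable instead of pinning anonymous RAM. In numpy this is roughly what np.memmap gives you; below is a bare C++ sketch of the same idea, with the file name and matrix size invented for the example.)

    #include <cstdio>
    #include <vector>
    #include <fcntl.h>
    #include <sys/mman.h>
    #include <unistd.h>

    int main() {
        const size_t rows = 20000, cols = 10000;        // ~800MB of float32
        const size_t bytes = rows * cols * sizeof(float);

        // 1. Build the matrix, write it out once, and let the heap copy die.
        {
            std::vector<float> m(rows * cols, 1.0f);
            FILE* f = std::fopen("features.bin", "wb");
            if (!f || std::fwrite(m.data(), 1, bytes, f) != bytes) return 1;
            std::fclose(f);
        }   // the ~800MB leaves the heap here

        // 2. Map it back read-only: pages are faulted in on demand, and the
        //    kernel can evict them under pressure, unlike anonymous heap RAM.
        int fd = open("features.bin", O_RDONLY);
        if (fd < 0) return 1;
        void* raw = mmap(nullptr, bytes, PROT_READ, MAP_PRIVATE, fd, 0);
        if (raw == MAP_FAILED) return 1;
        const float* m = static_cast<const float*>(raw);

        double sum = 0;
        for (size_t i = 0; i < rows * cols; ++i) sum += m[i];  // streams from disk
        std::printf("sum = %.0f\n", sum);

        munmap(raw, bytes);
        close(fd);
        return 0;
    }

Same data, but reloading it becomes the page cache's problem rather than your process's heap.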
16GB of RAM: perhaps 13GB is actually available to the user. If you run Chrome/Spotify/Slack/an editor, you're often left with only 8GB usable.
ML work commonly uses data that is 8GB+ -- and regularly 32GB+ -- just for the data itself. Yes, you can work on remote servers, but it's convenient to be able to work and develop locally.
No, that's not why. If you have any application that sucks up memory without releasing it, adding more memory just delays the inevitable.
As for the RAM in the rMBP, I'm only half as disappointed as I would have been a year ago. My workflow is pushing more and more stuff into the cloud, so instead of running a bunch of small VMs locally, I can have VMs sized properly for the task at hand rather than limited by my laptop's configuration.
That's the beauty of capitalism. It doesn't matter if someone, no matter their position in society, doesn't understand why anyone would need 32GB. As long as there's sufficient demand, it'll be produced by some business.
You are the entirely predictable and, sorry, very irritating "who needs more RAM than I have ever needed?" stereotype of every comment board ever. 16GB is peanuts for anybody doing anything serious in graphics, video, machine learning, statistics, finance, or... [put your professional subject here].
Not everybody wants the clock-ticking cost of doing their R&D in the cloud hanging over them. Many of us, including me, want a highly capable machine with an upfront, quantifiable cost that is still professionally credible.
Hear, hear. The anecdata being thrown around in these comments is ridiculous. Like the above, and also the people saying "everyone I know has a desktop". So what? I need a powerful laptop, and I don't need to justify it to you. End of story. Apple's insistence on limitations is ridiculous, and just as offensive as Bill Gates'.
This is highly subjective and anecdotal, but I find OS X to use a lot of RAM.
The in-OS memory compression helps, but when I still had a 16GB MacBook Pro, the system always found a way to use up all of the RAM, to the point that compression would kick in to handle the overflow beyond my physical memory.
My habits aren't any different on Windows in terms of extraneous windows/apps left open, and I rarely hit 100% RAM utilization on my 16GB Windows machine.
The system should use all the RAM. You paid for RAM, why have it sit there unused? As long as the next user process gets the RAM it asks for, I want the system to use all my RAM to cache everything.
Edit: even to the point of compressing pages, since it's faster to decompress them than to fetch from disk.