How are companies still releasing hardware with 8GB RAM? My MacBook Air from ~2016 has 8GB of RAM.
My first PC which had 8GB of RAM was built in ~2010 if I remember correctly. My current machine has 32GB. In all honesty, 32GB is not even much, considering my motherboard can take 128GB (in the old days, if your motherboard could take max 8GB RAM, people would fill all the slots to capacity). Also, I have a Lenovo machine that I paid $200 for that has 8GB RAM.
WHY any power user would buy a MacBook is beyond me. They've become devices that only my mother would use, not to get work done.
Apple should've made the baseline 32GB with 128GB as the max spec. It would've forced the whole industry to give us more RAM. So for the next two years RAM will still be stagnant; all the other manufacturers that copy Apple will keep shipping 8GB without blinking.
This was an expected step of the evolution of the Memory Hierarchy. RAM is just bigger-but-slower CPU cache. Swapfile is just bigger-but-slower RAM. There's always a trade-off between size and speed.
Incorporating RAM into the CPU package gives a big performance boost at the expense of maximum capacity. To compensate for the reduced performance in tasks that require more memory, you can either add another layer into the Memory Hierarchy (i.e. CPU Cache -> Fast RAM -> Slow RAM -> Swapfile), or you can improve Swapfile speed. It appears Apple have done the latter - they claim their SSD is up to 2x faster.
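The size-versus-speed trade-off can be made concrete with rough ballpark figures. The numbers below are illustrative assumptions only, not measurements of any particular machine:

```python
# Illustrative memory-hierarchy tiers: (name, approx. latency in ns, approx. capacity in bytes).
# These figures are rough, commonly cited orders of magnitude, not vendor specs.
TIERS = [
    ("L1 cache",  1,       64 * 1024),
    ("L2 cache",  4,       512 * 1024),
    ("L3 cache",  20,      16 * 1024**2),
    ("DRAM",      100,     16 * 1024**3),
    ("NVMe swap", 100_000, 1024**4),
]

def check_tradeoff(tiers):
    """Verify the hierarchy property: each step down is bigger but slower."""
    for (_, lat_a, cap_a), (_, lat_b, cap_b) in zip(tiers, tiers[1:]):
        assert lat_b > lat_a and cap_b > cap_a
    return True
```

The roughly 1000x latency gap between DRAM and NVMe in this sketch is why a faster SSD can meaningfully soften the cost of swapping.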
Video game consoles have done something similar with the latest generation. Only a 2x increase in RAM from the last generation (compared to an 8x/16x increase from the generation before), but a focus on storage performance.
Jeez, to be honest the new PlayStation and Xbox hardware is amazing, specifically how the memory/CPU/motherboard and streaming system will work. Wish we had something like that in PC land. Then again, their workload is basically streaming data from disk, so this round they optimized the hell out of that chunk of the pipeline.
Hopefully AMD learns lessons from that sphere and brings some of that to ours.
The problem I see with the PlayStation 5, at least, is repairs. If the soldered SSD kicks the bucket you will probably need to either buy a new chip from Sony and resolder the faulty one (or even worse, resolder all the SSD chips, because you can't know which cells physically hold which data, and swapping a single chip would probably corrupt the entire disk), or buy a new PS5/motherboard.
You can't just swap the hard disk like before; I'm sure this will be a major headache for repairs.
Battery life is easier to market than an acronym that sounds like it's either related to trucks or sheep.
RAM eats battery. Apple has made a strategic decision to pretend RAM doesn't exist, ship underpowered hardware, then brag about the battery life.
This is also the reason RAM is not listed on iPhone specs.
If I remember correctly the iPhone Pro 12 Max has twice as much RAM as the other 12 models and it's nowhere in the marketing materials. This is not accidental, it's a fundamental part of Apple's strategy.
EDIT: The max has 6GB and the rest of the line has 4GB. That is not twice as much. I did not remember correctly.
The performance of the phones seems smooth compared to Android phones with similar (or more) RAM. The difference comes from vertical optimization, so the raw numbers on RAM don't give us the full picture.
The relative openness of Android means you are not always getting the most optimized ROM; a lot of vendors add custom crapware to their phones out of the box.
While Java GC on Android may not always be optimal in regard to memory utilization (given that there are other concerns), and AFAIK doesn't allow tuning the way you can for enterprise Java apps, it seems like quite a few improvements have been made over time and many GC implementations have resulted from this: https://proandroiddev.com/collecting-the-garbage-a-brief-his...
Personally, I'd look at how common cross-platform apps are and how many of the frameworks (like Ionic and React Native) don't always use the native system UI in an efficient manner.
It seems a bit like running Electron apps on GNU/Linux and being puzzled about where all of the RAM goes.
The native UI in React Native is where the "Native" in the name comes from. It has nothing to do with an Electron app, which is a full-fledged browser with a Node process attached to it. Not a fair comparison by a long shot.
And yet, that is not the case! While ReactNative might not result in quite as big of a performance penalty as embedding a browser engine, its performance can still be worse than actual native apps, given that it adds additional abstractions to make developer's lives easier and allow for cross-platform development: https://medium.com/swlh/flutter-vs-react-native-vs-native-de...
While one could argue that there aren't that many visually driven experiences like that out in the wild, the benchmarks still prove that most abstractions have drawbacks.
Of course, there are good reasons for using them, which is also why many choose even the previously mentioned Electron platform for development, instead of something like Qt/GTK on desktop.
But for the discussion at hand, it simply serves as a suggestion that there's more at play here than just "Java uses a lot of memory and can be slow".
Lots and lots of people use their laptops in a way where there's little to no noticeable difference between 8GB and 16GB RAM. Why should those people pay extra for something they don't need? I think the HN crowd - with 4 docker VMs, 3 electron apps, 50 Chrome tabs, ML training, long compile times, etc - need to realize that their workloads and needs are not representative of the majority of users. It's ok for Apple to also make computers for people that are not you.
> It's ok for Apple to also make computers for people that are not you.
Yes, it's perfectly fine to sell machines like these with specs from 2012, but that economy should be reflected in the price. Honestly, what Apple is doing now seems like pushing power users who are buying new hardware towards non-macOS systems.
Uhm. Adding 16GB RAM instead of 8GB costs almost nothing and, by and large, is useful to all users. The only reason they ship 8GB is that they want to sell you upgrades, similar to why Apple products come with minuscule SSDs. Certainly, only professionals would need something like 512GB, right?
Yes, but those certain professionals can now only upgrade to 16GB on a MacBook Pro in 2020, which is shameful. Oh, and it's soldered, so you have to buy the beefed-up version today; you cannot upgrade the RAM 6 months from now.
Go look at a non-techie's web-browsing habits alone: they have 50 tabs open in Chrome, some of which are months old. And they install all sorts of other crappy software: anti-virus, browser extensions. Some of them use Chrome, Firefox and Internet Explorer at the same time. And/or if they use Slack/Steam/Spotify/VS Code/etc., they are running another 300MB of RAM per app.
If you are a programmer, try all that with Visual Studio or Eclipse running, sometimes at the same time as Android Studio with an emulator, maybe with a local Postgres or SQL Server, etc. It easily eats up that 16GB. Oh, let's sprinkle some Docker/Vagrant on top...
Good luck if you are into video/image editing, 3D modeling, illustrating, CAD, etc. Then MacBook Pros are basically the wrong tool for the job today, whereas they became famous for being the right tool for the job around 2010.
Same problem with Intel CPUs... they've been riding the wave of success of their initial chip redesign for 10 years, and now AMD seems to be wrecking them every 6 months.
My problem with the RAM situation is that we've been paying ~$100 for 16GB for 5 years now, instead of things scaling so that ~$100 buys 64GB and a 16GB DIMM drops to ~$25. The RAM companies are riding the wave too at the moment. Luckily SSDs forced HDD manufacturers to up their game, or we would still be paying ~$200 for a 500GB HDD with an 8MB cache; now we at least have 10TB+ drives available for consumers.
That's (slightly) unfair because obviously the RAM produced in 2012 isn't the same as the RAM produced in 2020. Better to compare Apple's price against the cost of top-of-the-line RAM available for other laptops. I checked around and it looks like you can get 8GB 3200MHz DDR4 from a highly respected brand for 56 USD, while Apple will sell you some mystery RAM for 200 USD. It's not an (ahem) apples-to-apples comparison because the Apple RAM is closely integrated with the SOC, but I think it's fair to say they're making a comfortable margin.
Do we truly believe that on-die memory has similar costs? I don't see how this is remotely possible, but it's going to take teardowns of both configurations to check what is going on: are the 8GB units using chips that never had the extra capacity, or is it still there but disabled in hardware?
Yes, right now the memory is built into the CPU package, so you need to buy overpriced memory from Apple (and it makes me think about leaving the Apple platform).
For a work laptop that only serves as an interface to a personal workstation set up in a server room, Macs are so much more enjoyable to use. No worrying about wifi drivers, being able to use the Mac touchpad etc.
Absolutely agree. - First they took away the ability to upgrade RAM, then to change HD/SSD. What‘s next: Can‘t boot any other OS. - Taking away freedom step by step.
I am a die-hard Apple products aficionado (I don't use any other computing device), but why many HN commenters stubbornly defend every Apple decision that limits its users is beyond me.
Edit: Re taking away freedom: Let‘s not forget the many app store dramas on iOS. This is Apple‘s vision for macOS as well: Install only approved apps, show a warning for all other apps (and finally disallow them?)
There’s a fairly significant performance difference between Apple’s new cores and the Pi 4’s A72s.
If you ignore that and the high-end display, touchpad, and keyboard, then I suppose it’s not too far off from a Pi in a fancy case, but then you could say the same for any laptop.
In 2012, the same Mac mini case held a cooling system for a 45W chip, 2x2.5" storage bays (or one storage bay and a DVD-Drive), 2 RAM slots, and had three more ports and a card reader. They really used every last corner.
None of these new M1 devices will be densely packed. They’re putting smaller boards into the same enclosures. I assume next gen is when they will take advantage of space savings.
My 2012 Macbook Pro was also slowly dying. As a Hail Mary I opened it up, cleaned it and renewed the thermal paste on CPU and GPU. Now it's back to old speeds!
And this is not only about 2020; most likely people will use the machine till 2025. (Also consider the fact that Apple solders the RAM to the motherboard.)
Not only is the base spec 8GB, but the largest possible upgrade is 16GB. My previous-generation MacBook Pro had the option of 32GB, which is what I bought because I think that's a decent minimum for a computer I want to last.
We will probably see the next round of new Macs in either March or June. I suspect that is when we will see the next version of the chipset, which would support higher RAM limits and more I/O ports. We might start seeing the new hardware design language, too.
Because not everyone is on the container fashion, running full-blown clusters on their laptops, and 8GB is more than good enough for enterprise developers.
One doesn't even need to be a developer to fill up 8GB with just a browser, Teams and Slack in the background (or any other Electron apps of their employer's choice).
Another wise choice is not to use anything Electron based unless forced by the employer, or customer, and in that case, it is expected that they pay for the hardware.
This is a confusing comment. If your mother doesn't need more than 8GB RAM, why should she have to pay for more? It's not as though you can't order more if you want it.
It wouldn't force the industry to offer more. It would just mean purchasers such as your mother wouldn't buy a Mac.
I only upgraded from my 2013 MBP this time last year. Through most of that time the macbook was my workhorse as a contractor. It's still more than sufficient for many of the web and data jobs I did.
For this, 8GB is fine though, right? I get the idea: we want Apple to push the bar. But my 2011 4GB MacBook Air with Catalina does those things fine... swapping to an SSD is reasonably fast.
Not true at all.
Maybe in the US, but not elsewhere. I know more than a few developers that work on sub-$999 laptops/machines. Specifically, Dell/Lenovo laptops seem to be the go-to, since they can get them with 16GB RAM + 1TB HDD + non-discrete graphics, then just swap out the HDD for a 256/512GB SSD.
About half of my dev friends have given up on laptops (specifically Apple gear, incl. iPhones), and ALL of them built Ryzen-based machines and work from home. My previous workplace refused to buy us laptops and built typical workstation PCs for everyone (quad-core i5 + 16GB RAM + 256GB SSD + 1TB HDD; this was around 2017). The guys that do have iPads tend to hang on to them, but swapped their iPhones for Android phones when they ditched their MacBooks. Main complaints? Ports, and performance vs. price vs. a Ryzen PC at the same price.
> How are companies still releasing hardware with 8GB RAM?
8GB is a huge amount of RAM. I cannot imagine writing a serious program that needs more than 2% of that. The real problem is programmers using "frameworks" and shit that requires inordinate amounts of unused memory.
Mostly Electron. I don't like it either, but Electron is a fact of life now.
While you don't have to join in on that as a developer, you still have to be aware that 95% of users will have no idea that their instant messaging application uses 800MB of RAM to idle, or whether that's a bad thing.
I run regressions on half a million data points. My 2017 MacBook Pro with (checks) 8GB of RAM works fine. I'm not sure what all you guys' mothers are doing.
Half a million data points sounds like a lot, but it's smaller than a 1k texture. With integrated graphics, RAM is VRAM too; just rendering to the screen takes a ton of it, for every open window.
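As a sanity check on that comparison (the element sizes are assumptions, since neither comment specifies formats: 64-bit floats for the data points, and an uncompressed 1024x1024 RGBA8 texture):

```python
# Half a million float64 data points vs. one uncompressed 1k RGBA texture.
data_points_bytes = 500_000 * 8    # 8 bytes per float64 -> 4,000,000 bytes
texture_bytes = 1024 * 1024 * 4    # 4 bytes per RGBA8 pixel -> 4,194,304 bytes

# The regression dataset really is slightly smaller than a single 1k texture.
assert data_points_bytes < texture_bytes
```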
Not OP, but my main laptop is a 2013 MacBook Air with 8GB of RAM. It's old enough to have developed dead pixels, the backlight is now noticeably uneven, the USB ports have become loose, several keys now work only unreliably, and it does an emergency low-power shutdown when the battery reports 55%.
(Naturally this means it is now permanently plugged into mains power and an external monitor, keyboard, and mouse).
Yet the RAM is still sufficient for my use of macOS, dozens of Chrome and Safari tabs, Xcode, TextWrangler, iTunes, etc.
A few dozen tabs, as in often in the range of 24-60.
That one aspect never even seems problematic or limiting, and it’s not like I’m oblivious to all the other things wrong with such an old piece of hardware.
Perhaps this pricing was brought about by the same head injury that caused someone in management to believe 8GB is an adequate base spec for RAM in 2020.
They aren't doing it to save $20 per machine; they are doing it to make the higher price of a 16GB machine seem more reasonable than if it were the base model.
Yeah, this screams classic Apple. Advertise the cost of the Mac as lower, but with a shitty base option that most people will have to upgrade. The majority are not going to want to buy an 8GB laptop for that kind of money and not just bump it up to 16GB.
This reminds me of how movie theatres offer a small, medium and large but the large popcorn only costs slightly more than the medium, thus enticing you to go for the large.
I'll add classic Lenovo to the list, whose base-level ThinkPads are underspecced and have terrible screens. The fact that Apple doesn't perpetually claim their machines are on sale is their last bastion of superiority.
You used to be able to replace/extend ThinkPad parts quite easily. We'd buy base models exclusively and just add an extra stick of RAM and swap the CD-ROM for another hard drive or battery. Those were super cheap, high-quality laptops.
Only recently did Lenovo start soldering parts, so base models lost that advantage :(
They're also charging $200 for a 256GB storage upgrade - spot price of 3D TLC is $3/256Gb ($24 for 256GB) - you can currently buy a 256GB NVMe SSD (full stick with controller) at retail for <$30 as well.
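Taking the quoted $3-per-256Gbit spot price at face value, the arithmetic behind those numbers works out like this (a sketch only; real BOM costs include the controller, packaging, and validation):

```python
# 256 Gbit = 32 GB per die, so 256 GB of flash needs 8 such dies.
chip_capacity_gb = 256 / 8              # 32 GB per 256Gbit die
chips_needed = 256 / chip_capacity_gb   # 8 dies for a 256 GB upgrade

raw_flash_cost = chips_needed * 3.0     # $24 at $3 per die (quoted spot price)
apple_upgrade_price = 200.0             # Apple's 256GB storage upgrade price
markup = apple_upgrade_price / raw_flash_cost
```

Even being generous about controller and integration costs, that is roughly an 8x multiple over the raw flash.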
That might be a somewhat relevant analogy if you had the choice to buy your bread elsewhere, but since both memory and storage are now soldered, you are forced to buy, at purchase time, all the capacity you'll want for the lifetime of the product. The relevance of the parts pricing here is to point out that Apple is being extremely abusive in its pricing to its customers.
Many laptops now have soldered RAM, but other vendors have not chosen to do what Apple has done. In HP's premium laptop (Spectre x360 13t), adding 8GB of RAM is a +$70 option. For reference, an 8GB DDR4 SODIMM at retail pricing is about $30.
Also, AFAIK, no other major laptop or mini-desktop manufacturer uses soldered storage like Apple does, so in this case, the retail cost for storage is even more relevant.
These are commodity parts, so there's not any R&D to recoup - this is just profiteering on Apple's part. Especially for the Mac Mini, where there's not the same space constraints, the lack of storage upgradability is also a rather egregious form of forced obsolescence. Fine from a business perspective, but hypocritical for a company that claims to care about the environment.
Well, no. Apple buys RAM chips at that price, and they just have to swap one chip for another. The differential in price from one to the other including all costs is that much. Labour costs and OpEx are not affected by choosing Chip A or Chip B that are essentially identical.
This is pretty much the reason why I'll never buy Apple devices again. Recently I bought an iPhone 11 for my girlfriend as a present, and the price difference between the 64GB model (which honestly might as well be a dead brick in this data age) and the 256GB model was around 30% in my country, which is beyond absurd.
You can call me bitter but this sort of manipulation is making me extremely salty to the point where I'll be having seething hate for the company for the rest of my life.
But I think a more important factor is future sales. They will likely sell more laptops as these 8GB owners upgrade earlier (in 2-3 years) as the OS and apps continue to bloat.
So on one side they save $20 per laptop and they likely sell more laptops.
So charge $50 more per laptop and make $30 more profit while not producing 8 GB junk that'll be e-waste in a few years because that's barely enough RAM to run an average browser session anymore.
>Even if you somehow could argue that 8 is all you need now, what about in few years?
What about it? It's not like someone who mostly browses the web, checks email, works with office documents, and so on, will change what he does in a few years...
The fatal flaw in this argument is the failure to realize that web pages are bloating, video is becoming more prevalent, higher resolution images, etc.
So “browses the web” has ever increasing hardware requirements if you want to maintain the experience.
The large consumer market that uses mac for common computing activities will not need more soon, perhaps, but many of Apple's bulwark clients - graphic designers, musicians, and even developers - probably will.
Despite the software they demoed, the machines they show (with exception of the mini) were all for the non-demanding users.
I would like to see Photos on those benchmarks, that is one program that would benefit from lots of ram, and would be used by many of the Air’s customers.
Don't forget - with unified memory that means that the GPU has to pull from the same pool, so your usable RAM after display buffers is likely much less.
I have Firefox, Slack, MS Teams, MS Word, MS Excel, and OneNote open. The world's most boring office-worker use case.
I know it's a deceptive metric, but Task Manager shows 12.5GB "In Use" and 16.2GB committed.
Each Firefox tab has between 250MB and 1GB of "Memory (active working set)".
HUUUUUGeeeee long tail of 100MB trailing down to 20MB services and things. Adobe has half a dozen at all times, etc etc etc.
I'm game to use any other "more realistic" metric or monitor, as I know that with memory paging and virtualization things are wonky.
I don't think this indicates I MUST have 12 or 16 or 32GB - I have a 4GB media PC that works just fine. But I do believe it's an indicator average user can use 16GB, if it's available. Or in other words, if I buy a $1000USD laptop, I don't think 16GB is going to waste away unused with zero benefit ever :-\
Oh, I don't disagree even remotely! We can have a nice thread about optimized apps and 4k demos :)
But in context of expectations in the $1k laptop market, my point was that perfectly normal usage can grow to benefit from over 8GB of RAM these days, whatever the background reasons may be.
On my MBP with Catalina, just browsing will get me over 8 gigs. Reaching 16 is another thing, which I did not manage even with Docker and multiple IDEs in a typical day.
I personally wouldn't buy something without 32GB these days. But I would tell family/friends to get at least 16GB. With 32GB, I rarely have to think about RAM. But I'm probably 85%+ utilization all the time. I'll admit that I'm a "power user", but my wife has a fairly new laptop with 8GB and it fills up FAST, even when doing nothing serious. It's at 81% right now with just Firefox, Word, and Evernote.
If I were going for a desktop it'd be at least 64GB.
I'm not saying your wrong, but this is such a catch-22.
When people complain about Electron apps, the response is always "modern computers have so much memory anyway, it's fine!"
But then when a manufacturer introduces a computer with slightly less memory (but still a perfectly reasonable amount) the big question is whether it can run all those Electron apps!
Is there a way to break the cycle? This isn't good!
> Never measured whatsapp but it's really pretty performant
Are we really talking about the same WhatsApp? It takes a good 4 seconds to load here. I don't remember ever waiting for MSN Messenger to load, on potato-powered computers.
Electron is cancer... but Skype is native (for now, at least on Linux; although I'm sure they'll switch to the Teams codebase and native Skype will disappear forever too).
I don’t do any tab-hoarding in Chrome (that’s the job of Safari on my machine), but I’m pretty sure that it will suspend tabs long before you run out of RAM. 8 GB is still plenty for web browsing.
The median Macbook user is not a developer. A web browser and videocalling are the only two from that list you'd expect the median user to be doing with a Macbook.
2. Their entire presentation focuses on editing large image files, editing 4k video, machine learning, and graphics-intensive games (games were mentioned more than anything else)
So, given how these are marketed, they are extremely low on RAM and storage.
Does macOS handle memory in a way where it just lets RAM fill up to some point before actively doing something about it? Because on my 16gig Macbook 8 gigs are gone just with browsing & spotify.
Yes. If you run vm_stat in the Terminal, you can see how few pages of RAM are actually left free at any given time. Modern operating systems aggressively cache files in RAM to improve performance.
The green/yellow/red graph in the Memory tab of Activity Manager is the best way of seeing if you need more RAM at a glance. In my experience, macOS won't do any swapping in the green. In the yellow, the system is still pretty usable but there's some swapping going on. When it turns red, your SSD will be getting thrashed pretty hard.
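As a rough illustration of what reading vm_stat output involves, here is a sketch that parses the command's output format. The sample text and the 16 KB page size are assumptions (Apple Silicon uses 16 KB pages; Intel Macs report 4096), and real output has many more fields:

```python
import re

# Abbreviated sample in the format vm_stat prints (values are made up).
SAMPLE = """\
Mach Virtual Memory Statistics: (page size of 16384 bytes)
Pages free:                         12345.
Pages active:                      210000.
Pages inactive:                    180000.
Pages wired down:                   90000.
"""

def parse_vm_stat(text):
    """Return (page_size, {stat_name: page_count}) from vm_stat-style output."""
    page_size = int(re.search(r"page size of (\d+) bytes", text).group(1))
    stats = {
        m.group(1).strip(): int(m.group(2))
        for m in re.finditer(r"^(Pages [^:]+):\s+(\d+)\.$", text, re.M)
    }
    return page_size, stats

page_size, stats = parse_vm_stat(SAMPLE)
free_mb = stats["Pages free"] * page_size / 1024**2
```

The point being that "Pages free" is usually tiny on a healthy system, which is exactly why the Activity Monitor pressure graph is a better signal than any raw free-memory number.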
They're not really "gone", they've just been put to use because there's no point in leaving RAM empty. You can do much more than run a browser and Spotify with 8GB of RAM.
It's a good question -- do most people still use desktop office suites these days? I don't know the answer. I use the Google Docs suite for both work and for personal stuff, and I personally haven't seen the need to pay for Office in awhile.
I have a Mac mini that still had upgradable RAM and could be modded to add a second hard drive. I put 32 gigs of RAM and a 1TB SSD into it. I've had it for roughly 7 years. Any current Mac mini is a downgrade except in processor speed.
We're talking about RAM. RAM usage is a direct function of how many apps you have open at the same time. For an iPad that number is a lot smaller than for a laptop.
I don't think that's the reason the iPad Pro performs well at video encoding (or else you'd be able to say the same thing for any Android tablet). It's a combination of dedicated hardware and a fast CPU.
What I'm saying is that just because the iPad Pro is good at video encoding it doesn't mean that it is capable of running a windowed OS with real multitasking. For that you need more RAM. The iPad is built for a specific workflow where you use one or two apps at a time, laptops are designed for a workflow that involves more multitasking.
What Apple appears to have done is transpose the hardware from iPad to Mac without thinking about the different requirements of each. There is a reason that RAM is not part of the package on laptop CPUs. This might work for casual users but people spending $$$ on a "pro" computer will wise up pretty fast.
I used to run a Windowed OS with real multitasking on a 33MHz 486 DX with 4MB of RAM! Sure, apps are a little more resource hungry these days, but there's nothing about multitasking or windowing that inherently requires tons of RAM. An iPad could for sure enable full multitasking without any problems. Apple has made millions of laptops that ran OS X perfectly fine with less than 6GB of RAM. It's just that iPad users aren't accustomed to having to manually manage which apps are open at any given time.
I have a 2019 Macbook Air with 8GB RAM and it runs OS X fine for dev work. RAM compression and fast SSD storage make a big difference. The newer Macbooks will have even faster SSDs.
I think the main reason Apple don't want to add lots of RAM to the base Macbook models is that it's the wrong compromise between battery life and performance for most users. They are probably betting on continuing gains in SSD performance obviating the need for additional RAM.
Video encoding usually uses special dedicated hardware, so it's more a measure of the hardware encoder than the general speed of the device. Go head to head with AV1 encoding, for example, and you'll see it matches its general benchmarks.
Yes, but Apple are moving in the direction of adding dedicated hardware to support a lot of common use cases. The point remains that video encoding is something that people often use laptops for. Users just want it to go fast; they don't care how exactly the hardware is making that happen.
The iPad Pro has strong raw CPU performance in any case. It certainly outperforms plenty of entry-level Windows laptops.
Seriously though what’s the connection between it being 2020 and there being more default RAM? Are you suggesting that in the year 2030 we should expect 100gb of RAM?
The goal shouldn’t be to increase ageing technology but to replace it with something better.
If you do anything like virtualization 8GB of RAM is absolutely paltry, borderline unusable - due to the need, e.g. to allocate a specific amount of RAM to a running VM.
To be honest, I find even 16GB to be limiting. The lack of offering of a 32GB configuration killed the immediate sale for me, absolutely no exaggeration.
Because you can't upgrade it post-market, I'd consider these to be some of the least future-proofed releases from Apple in a while.
> Are you suggesting that in the year 2030 we should expect 100gb of RAM?
100GB sounds paltry for 10 years of technological advancement. In 2010, DDR4 didn't exist yet and DDR3 only supported up to 16GB per RAM stick (8 on Intel at the time). Now we have RAM sticks with capacities up to 256GB. The 2010 MacBook Pro only came with 4GB! But really, "640K ought to be enough for anybody", am I right?
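For what it's worth, a back-of-the-envelope extrapolation supports that intuition. The doubling period here is an assumption (roughly the pace implied by base configs going from 4GB around 2010 to 16GB by mid-decade), so treat the outputs as illustration, not prediction:

```python
def projected_ram_gb(base_gb, base_year, year, doubling_period_years=2.5):
    """Naive exponential extrapolation of RAM capacity.

    The 2.5-year doubling period is an assumed historical pace, not a law.
    """
    doublings = (year - base_year) / doubling_period_years
    return base_gb * 2 ** doublings

# Starting from the 2010 MacBook Pro's 4GB base spec:
ram_2020 = projected_ram_gb(4, 2010, 2020)   # 64 GB, had the pace held
ram_2030 = projected_ram_gb(4, 2010, 2030)   # 1024 GB
```

On that trendline, 8GB in 2020 is almost an order of magnitude behind, and 100GB in 2030 would itself be conservative.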
Well, I get where you're coming from, but I would compare it to electric or hydrogen vs. gas cars. If I were in the market for a new car and drove certain distances regularly, I probably wouldn't be the first to buy something that could only go half the distance, less conveniently, even if it got there faster.
I've had 16 gigs for the last 8 years. I thought about getting 32, but I haven't noticed needing more memory and it does use (battery) energy so I decided against it.
More RAM is a patch for components that aren't hyper-optimized to work together. The whole point of what Apple is doing with its own silicon is to create that optimization, thus reducing the need for excessive RAM.
It's worked for them on iPad and iPhone. Samsung et al. would boast more RAM, but oddly enough iPhones were and remain faster.
An iPad and iPhone mostly display one thing. One app or maybe a second one in split screen.
A laptop can run many things simultaneously.
Web browser, office suite, music, email client, cloud sync...
Even my mother struggled with 8GB RAM on her work machine, and she is by no means a PC expert.
She writes emails and letters, opens spreadsheets and websites. Sometimes she gets a call and has to open another document or website.
End of RAM.
I would NOT recommend the 8GB Macbook Pro for professional work.
You're confusing CPU and memory pressure, and attributing blame to the wrong component.
Android's lack of smoothness is due to thread contention, whereas the iOS kernel uses a dedicated UI thread running at high priority. RAM size has no impact on UI performance, beyond a sufficient amount for the kernel.
To add on to this, the person you are replying to is also quite wrong. Android devices definitely can be smooth and many go past 60hz now and have higher refresh rates than Apple products.
Agreed. I have a Pixel work phone and an iPhone (XS/12 Pro) for personal usage. Even though the Android is also current-gen, it can be unresponsive at times, and the face unlock is terrible on the Android. I can't remember the last time Face ID failed, but on Android it's at least 2-3 times a day.
MacBooks, and PowerBooks before them, were about that 5% of power users who fell in love with their tools and preached to all their friends and family to get one too.
Now that Apple has the remaining 95%, it is letting its power users down. But that is only natural; establishing a beachhead with the power users was "just" a genius marketing move that worked pretty well.
>MacBooks, and PowerBooks before them, were about that 5% of power users who fell in love with their tools and preached to all their friends and family to get one too.
And they still are. Just not in their base configuration, like they have never been. I've been using their stuff since the Motorola era.
Was there any time Apple gave "ample" RAM/DISK in their base configuration? No.
They might both be an M1, but only the Pro has a cooling fan.
The Air is going a more iPad route; totally silent and more than enough performance for typical tasks, but under sustained load it's going to throttle hard.
Or they may be throttled lower all the time, so that using the four performance cores still stays within its cooling budget. Either way, it's not going to run as fast as the MBP without having a fan.
I'm very curious to see benchmarks on these things. Also wondering if processors are performance binned from one SKU to another, or if it's really just the amount of RAM and storage.
Apple's been known to have secret hardware differences like that, the 2018 iPad Pro had 4GB of RAM unless you specced it up to 1TB storage, which included a secret upgrade to 6GB RAM. You wouldn't know from the specs, since Apple doesn't talk about how much RAM their iPads have.
Now here we are with the M1 and Apple doesn't talk about the clock speed. All MacBook Pros have an M1, but do they all have the same clock speed? Maybe.
Here[1] is an article from The Verge that goes into the full differences with confirmation from Apple. For those too lazy to click through, yes the fan is the biggest difference. Beyond that the screen in the Air is slightly darker, the base model has worse chips due to binning, the battery is smaller, and there is obviously the Touchbar.
Thermals and sustained performance. I expect a 10 W passive CPU in the MBA body to throttle quickly. I currently have a 2017 12" MacBook, and for about 60 (winter) / 15 (summer) seconds of sustained load it's the most awesome little machine in the world, and then it becomes a slog.
It has a 7 W CPU in a slightly smaller body than the new Air, so I expect the Air's performance reviews to be 'the story about throttling'.
The Pro comes with fan cooling.
With that, the Pro can run for much longer without hitting max temperatures.
The Air is fanless, so in an intensive task you will hit thermal limits very soon and the CPU will be slowed down.
I have a 2018 MBP with an i9/Vega 20 - I can testify how terrible their thermals are. But given that this chip can be passively cooled, even Apple's crappy venting will probably do a decent job with it. Need to see what happens when the reviews land.
Oh I am sure judging by how excellent MacBook Pros cool things, having passive cooling will make no difference at all. I mean, during summer I put my MacBook on a large ice block that I freeze over night, this way I can maintain acceptable build & development speeds during the day. Should work the same for the MacBook Air. No?
The emphasis here being on the word "should". I wouldn't buy just yet until some third party has published a sustained load/heat test. Apple have sucked at this for quite some years now, it's folly to touch a MacBook for anything remotely compute-intensive
Only when you compare apples to oranges, but this is an apples-to-apples comparison. Same underlying silicon, almost identical configuration, different chassis. It's impossible for the Pro not to outperform the Air when the only substantial difference for performance is TDP, and we have zero reason to believe that a heat pipe + fan would be outperformed by passive cooling of all things.
Agreed, the Pro's cooling should be better. However, I think it's worth waiting and seeing. There might not be that big of a difference in the end (e.g. both are uncharacteristically well designed and run fine; or conversely both are so badly thermally throttled that no serious intensive tasks can be performed) in which case other factors could end up being more significant.
Apple has always sucked when it came to cooling. The Apple /// and Lisa were plagued by “IC creep” where, due to heat expansion from inadequate cooling, the ICs would wiggle out of their sockets ever so slowly.
You can solve that problem by going into keyboard settings and changing the "Touch Bar shows" setting to "Expanded Control Strip". Or "F1, F2, etc. keys" if you prefer. Whichever one you set, holding down the "fn" key will temporarily toggle to the other one.
You still don't get the same feel as physical keys, but no Siri, no sliders to adjust volume or brightness, no stuff shifting around as you switch apps.
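For those who prefer the command line, the same preference can reportedly be flipped via `defaults`. Note that the `com.apple.touchbar.agent` domain and its keys are undocumented by Apple, so treat this as a sketch based on commonly reported behavior rather than a supported interface:

```shell
# Make the Touch Bar always show the expanded Control Strip
# (equivalent to Keyboard settings > "Touch Bar shows").
# Undocumented domain/key; other commonly reported values are
# "functionKeys" (always show F1, F2, etc.) and "app".
defaults write com.apple.touchbar.agent PresentationModeGlobal -string fullControlStrip

# Restart the Touch Bar process so the change takes effect.
killall ControlStrip
```

Toggling it back is the same command with a different value, or simply changing the setting in the GUI again.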
Yes, it's just annoying that the best workaround involves replacing the touch bar functionality with an inferior replica of the physical keys it replaced.
The M1 chip will for sure sustain higher performance in the Pro because it has active cooling.
Apple ramps their chip performance dramatically and frequently in iOS devices; that’s how they hit such high benchmarks but also deliver long battery life and stay cool. Those chips spend most of their time not performing.
In such a system, having “the same chip” in two different devices will not tell you much about their relative performance under real world loads.
I literally had to cancel orders after noticing this as well, was very confusing even for a developer.
Also, what's the difference between the left and the right versions? just the storage? can't you configure the storage? why do they make it look like there are 2 different options.
I'd also be very interested to see if that $250 difference between the 7-core and 8-core GPU is really just a software toggle. It sounds like they've built one SoC, not two, so I don't know how they came up with that.
The tech specs say 500 nits on the Pro and 400 nits on the Air, so it's possible they are different screens, but also possible they just have different backlights.
The larger chassis allows for better cooling and more battery. Their comments would make it seem that the same processor in the MacBook Pro performs better because of more thermal headroom due to active cooling. The Air is fanless.
I wonder how useful they are. I used to have a gaming laptop in the early 2010s and the nicest cooler pad I could find, but it only dropped the temperature by 2-3 degrees.
My suspicion is that they're not useful at all. Once I bought such a cooler pad, and placed a 2011 13" MacBook on it. Then I benchmarked it (i.e. max out the CPU), and measured the sensor "CPU Near". I could not measure a difference between the cooler pad on or off.
I can't remember the brand of the cooler pad, but it had two 10 cm fans.
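That kind of paired measurement is the right way to sanity-check a cooler pad: log the same sensor under an identical load with the pad off and then on, and compare the means against the run-to-run noise. A minimal sketch (the readings below are made-up illustrative numbers, not real measurements):

```python
from statistics import mean, stdev

def cooling_delta(temps_off, temps_on):
    """Mean temperature drop (deg C) attributable to the cooler pad;
    positive means the pad helped."""
    return mean(temps_off) - mean(temps_on)

# Hypothetical "CPU Near" sensor readings (deg C) under a fixed full load:
off = [92.1, 93.0, 92.7, 92.5]
on  = [92.3, 92.8, 92.6, 92.4]

delta = cooling_delta(off, on)
noise = max(stdev(off), stdev(on))
print(f"delta={delta:.2f} C, run-to-run noise ~{noise:.2f} C")
# A delta smaller than the run-to-run noise means the pad made no
# measurable difference - which matches the result described above.
```

The key point is comparing the delta to the spread of the readings, not just eyeballing a single number.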
Apple mentioned in the event that these Macs will have hardware verified secure boot. Since I’m not very knowledgeable in this area, can someone explain (or even try to guess) what this would/could mean for running Linux on these? I use Macs way beyond Apple’s support timeframe with OS X/macOS, and Linux is the one that runs on some of the older Macs and provides adequate security and security related software updates.
I don't know the answer^, but how old is your current old Mac hardware? I don't know about the desktops, but MacBooks from 2016 on are not well supported hardware-wise in Linux - things like no WiFi even. There was a good GitHub repo tracking it up to, I think, the first Touch Bar Pro, and basically the situation was dismal then and only got worse (according to the repo owner, who consequently stopped bothering, IIRC).
So.. depending what you want to do on these older machines, my point is that this may be the least of your worries.
^(though I think it's fine, because it's the reverse that would be a problem? Bad news for 'hackintosh' if all supported versions of macOS can expect secure boot hardware, I think)
Until 2019 Apple sold 13-inch MBPs without the Touch Bar, and these models did not have a T2 chip. They are still miserable computers to run Linux on, although I think the original 2016 Touch Bar-less MBP performs better than all the rest, albeit (IIRC) with no working audio, very poor suspend/resume functionality, and until pretty recently no keyboard/trackpad functionality.
Oh, and Apple's NVMe interface is non-compliant. This is widely reported as Apple locking Linux out with the T2 chip, but that's not really true. The T2 chip will by default prevent unsigned kernels from reading/writing to the SSD, but this can be disabled.
Even if it's disabled, the controller is not standards compliant, and Linux won't see the underlying block device. I saw some diffs floating around on github a few years ago that fixed it, but I don't think it was ever mainlined.
Basically, post-2016, Apple seems to have incorporated even more custom (and undocumented) hardware that running alternative OSes on them is basically impossible. Windows works because of the Apple-provided HAL + drivers for WinNT.
Even in Boot Camp, Apple did not bother to expose all hardware to Windows. The touchpad is reported as a mouse with a scroll wheel, and there is no option to enable hardware encryption or to use Touch ID to unlock.
Windows supports various biometric logins and provides a rather generic API. Some manufacturers use that to provide login based on the veins in a finger, not fingerprints. Apple could have implemented those APIs.
I have a MacBookPro15,2 (2019, with T2), on which I dual-boot Arch Linux. It is perfectly usable, though the hardware support is not great. In particular, resuming from suspend is very slow, and I haven't gotten the built-in mic to work. And getting the system to work did require using a patched Linux kernel installed from GitHub. So not easy, but possible.
Your claims about "dismal then and only got worse" are unfounded. The repository you refer to is still active. https://github.com/Dunedan/mbp-2016-linux If anything, activity has slowed down in these threads because it was figured out how to make it work.
Even among people who run Linux on these MacBooks, the general recommendation is to keep a macOS partition around for stability. Some of the value you get from any Apple computer is in the software. If you intend on instantly installing Linux or Windows as your only OS, this probably isn't the computer for you. But if you want to or have to use Linux sometimes, these T2-chip Macs can do it.
> There was a good GitHub repo tracking it for up to I think the first touchbar Pro, and basically it was dismal then and only got worse (according to repo owner who consequently stopped bothering iirc).
As I'm said repo owner, let me chime in here quickly to shed some light on that.
I used a 13" MacBook Pro 2016 for 3 years with Debian as my sole machine for work. When ordering it back in 2016 I wasn't sure how difficult it'd be to get Linux properly working on it, as at that point it was only known that it's possible to boot Linux, but nobody had figured out even such basics like support for the integrated input devices or the NVMe SSD yet. However as I was using Linux on Macs since 2006 I figured it'd be somehow possible to get it to work for me.
Fortunately I wasn't the only one serious about running Linux on these 2016+ MacBooks, as I have very limited knowledge of the required low-level programming skills. What I did was to provide and moderate a GitHub repository (https://github.com/Dunedan/mbp-2016-linux) as a central place to document and discuss the status of hardware support for these MacBook Pros, plus some little patches and lots of feedback and bug reports. A big shoutout to all the contributors who did an incredible job at reverse engineering, implementing and upstreaming drivers for various components! That's quite an achievement for such a complex device with no public hardware documentation at all!
After a while it turned out that support for certain components would be rather difficult to get working flawlessly. As an example, even at the end of the 3 years I used the MacBook Pro, I had to use an external adapter to be able to use WiFi. With that in mind I started to reconsider why I bought Apple products: I bought them because of their superior hardware quality. But if I'm not able to use the hardware as intended, what's the point of paying a premium for Apple products? And let's just not talk about the butterfly keyboard or the horrible thermal management. So when it came to replacing my MacBook Pro, I decided to go with a Lenovo Thinkpad X1 Carbon instead. It's not perfect, but I'm way happier now than I ever was with the MacBook Pro 2016, as the hardware just works.
As I don't own any 2016+ Apple device anymore, the help for further Linux support I can provide is limited, but I didn't stop bothering at all! I'm still actively managing said Github repository, but activity in general has significantly dropped there over time. Either the devices work well enough for other people now or they also replaced them with non-Apple hardware.
I bought a 2017 MBP hoping the situation would eventually improve but it never did, so I never got around to installing Linux. I'm expecting it'll be even worse for these M1 systems.
It is a shame, it's not something I ever really did (or not for long, for a period I do recall having Arch on my 2013 Air) but I like the idea - I like Apple's hardware, just not the software.
Oh Apple, why are you doing this, taking freedom away from your customers? I don't want to use Windows, and neither do I want to tinker with Ubuntu. But if you keep going down that path, you are forcing your power users to think about migrating to platforms that respect users' freedom to do whatever they want with their machines.
After two years of using an otherwise beautiful iPad Pro (along with my MBP) I came to realize that a crippled machine that is very limited in how I use a computer is not the future of computing I like. The device collects dust for quite some time as I prefer a computing environment where I use the terminal a lot, where I use my bash and Python scripts a lot to automate, where I use Emacs a lot to write tech docs, do my project planning, writing, automating workflows, and many more things that are not doable on a crippled (iPad)OS.
You keep going toward your vision of a computing platform where your customers are just consumers, not hackers and doers, and so we hackers need to look for alternative platforms, most probably Linux.
You can get XPS Developer Edition, System76, Purism, or many other laptop brands (eg. any of these https://elementary.io/store/ with elementaryOS, whose DE should feel fairly familiar to a macOS user) with GNU preinstalled these days.
Yes the iPad is "crippled" in that sense, but I find it's an excellent accessory to a computer. Not everything I do needs a terminal, my Python scripts, favorite text editor, and rapid multitasking. The iPad is a wonderful (albeit expensive) side device for lighter activities on the couch, in the kitchen, or on the go.
It doesn't need to be our only computing device to be appreciated, and not every computing device needs to be powerful.
What I am lamenting is the observation that iPadOS seems to be Apple's vision for how computers should work: crippled, with not much user control, just content-consuming devices with Apple controlling every aspect of them. That's not a personal computer anymore, not a device we have much control over.
I'm perfectly happy with the division between "consumer machines" and "creative machines".
I obviously count myself among the people who need and want creator capabilities, but for my technologically challenged family and friends there's no reason to learn and manage all the complexities of a classic computer environment if they just want a point of access to YouTube, Netflix, Spotify and social networks.
I still shiver remembering the times of browsers riddled with search bars, trojans and antivirus software slowing computers to a crawl, and people who are "good with tech" being dragged to friends' houses to see what's wrong with the computer.
Same here. I can't stand macOS, the interface is terrible and it's an awful development environment.
But the iPad is an excellent companion, since I use it to scribble around, consume media, edit photos, keep my music sheets, and do all that stuff that would suck on Linux.
What? The Apple II was a fantastically open machine! It even came with the circuit schematic and ROM source code right in the manual! It had lots of slots and there was a massive third-party ecosystem. It was when Jobs got to design machines with the Apple III/Lisa/Mac that things closed up.
Linux does support many ARM architectures already, and even supports the x86/x64 version of Secure Boot in some configurations. If Apple wants to either allow their Secure Boot to be disabled or to allow end users or Linux distros to somehow get their own keys trusted, I'm sure the port can happen in the coming $smallnum years with enough interest + resources + time. (But not $smallnum months, sadly.)
I use a Mac for work (and paid for by work) but refuse to spend my money on something I can't use the way I would like to use it. I think that these companies shouldn't be able to lock you out of using your tractor/car/computer, like everyone seems to be moving towards. It's a real shame. I understand if they want to void the warranty because a user blew away some critical firmware, but that's another ball of wax and it's on the user to suffer the consequences.
Void my warranty, boot with a scary splash screen, whatever, but don’t lock me out of the thing I ostensibly own. Or, maybe change the “buy” button to a “license to use” button in your store.
Maybe, just maybe, it shouldn't be a $1T company? Maybe it should once again become a company that puts its customers and their experience before profits?
People are frustrated with technology lately. Even non-tech people. Apple has everything it would need to change that, but it decides to contribute to technology becoming ever more frustrating time and time again instead.
Agreed. But I can't think of an example of a company that ever voluntarily downsized. Downsizing usually happens because a new competitor arises that makes a product customers prefer. That's a very difficult proposition in the personal computer space.
If history is a lesson, any company that arises to compete with established players, gets acquired by them. And it's a shame no companies actually decline these acquisitions.
>no companies actually decline these acquisitions.
Not true. Yahoo made 2 separate offers to acquire Google, and an established social-media company offered to acquire Facebook (for a billion dollars IIRC).
Craig Federighi said himself that they don't boot other operating systems.
Could you link the talk where they said it can run binaries not signed by Apple? The only thing I could find is where they still allow you to boot older versions which they don’t let you download anymore. To keep the actual mac experience.
Can't find anything in either document that allows booting of non-Apple-signed software. The only thing I see there is something like Secure Boot on PCs, where Apple would need to sign your boot loader in order for it to boot.
It changed. You use kmutil create to create the artifacts and add the hash to the Secure Boot policy. (--help at https://pastebin.ubuntu.com/p/mN3Z2kfJWy/, no manpage)
It is no longer a Personal Computer. And it is a security disaster if you cannot control your own hardware. It should be made illegal for Apple to operate like this. The user MUST have full control of the computer. It is a user's right and should be a human right. The only reason I used Macs is their respecting my ability to use any OS I want, if I want.
Apple's not going to ship drivers for Linux, and it's a SoC. So someone needs to somehow write an open source driver for Apple's proprietary black box of a GPU.
I suspect very little will work at all if someone can get Linux to boot on one, and it will be a very lengthy endeavor to get things up to being usable.
The reality is that we can't answer this until we have some hands on.
It really depends on whether their secure boot architecture can be disabled (unlikely, knowing Apple) or allows adding one's own keys (unlikely). Boot Camp probably won't happen since Windows does not support the architecture: they'll be pushing people to use VMs.
They might also provide some untrusted path to boot without it being able to access certain secure features. I wish they would do this, but I won't hold my breath!
That said, the kernel itself needs to have support for the hardware architecture, and then drivers for all the new hardware they're pushing out. I don't expect this to be soon, though I'd definitely be willing to sponsor anyone willing to work on this.
Who downvoted your comment and why ?
This is a good comment and a good strategy to teach them a lesson. Without some effort those companies will not recall their moral values. Richard Stallman was warning us about this development long ago, and he was right. Crippled hardware is useless for the hacking mind.
Dude they'll just sell them to someone else. It doesn't change anything, the material and resource cost has already been paid. Stop making this about something it absolutely is not.
Companies DO care when people return something, because that is pure signal. "I got it because I thought I would like it and I don't" is a much different signal than "I have no idea what you think because I never interacted with you". That is likely one of the most effective ways to make a company sit up and take notice, the return rate of a product is a key indicator of its success.
I really don't see that there's anything to disagree with there. Loving Apple, as you may, doesn't make the above point wrong.
Do you think Apple is just going to take the computer you touched, turn around, and sell it to another person? No, they're going to take the whole thing apart, replace all the consumables and user-facing parts, then sell it as refurbished. And that's the best case: they might have to strip it for parts or trash it depending on what it was that you bought.
This has nothing to do with a love of Apple or anything, and everything to do with “you’re abusing a program that they are going to either ban you from, or remove because you abused it too much”.
And degrading a Personal Computer into a machine under someone else's control, attacking a person's rights and stripping people of privacy completely - is that not a wasteful thing to do? I mean, it's garbage by definition, and sure, it takes time for people to understand this, but this machine is useless by design for a freedom-respecting society - is that not a waste? It's a huge waste of resources, I would say. Returning the product doesn't add much waste on top of that; it simply tells them what it is.
Dude, you’re arguing about control to the wrong person. Apple knows about this already and you returning a bunch of devices isn’t going to get them to change their policy.
I have not been buying Apple stuff for 5 years already, because I can't stand the stupidity, and their MacBook Pros would just cripple my abilities and mobility with those stupid dongles, unupgradable memory and idiotic touch bars.
The only reason I could bear some of their hardware is because I knew I could put Linux on it when I'd had enough of it, and now what?
I am not buying, sophisticated people are not buying, and it doesn't help, so in my view IF anything is ever going to change their policy it is returning products to SEND A MESSAGE. The other option is to wait until some dumbo gets it when it'll be too late, like when S. Jobs had to return to save them.
Or do you suggest even stronger action than returning?
This is intended. They should stop selling things that attack the privacy and freedom of a person. Or is this concept not your priority, and you are OK having a computer controlled COMPLETELY by someone else, which means ZERO privacy?
PS: Well, this one is downvoted too. Looks like some lost even a sense of what PERSONAL computer means.
OK, keep downvoting! It's a good strategy to silence someone when there are no valid arguments.
They’ll stop selling the device to you or accepting your returns. So you haven’t really done much.
Also, people who complain about downvotes usually attract more. I’d suggest not doing that. Claims that there are “no valid arguments” against your position rather than nobody wanting to deal with you are, well, absurd.
If many people do that, it's a different story. Maybe it's OK with you and you see no danger in their strategy, but I see this issue as a huge attack on freedom and rights, including the right to privacy. History has many examples of how people protected their rights and freedom. Returning a product is a very light way to send a proper message.
Recently I have seen more and more perfectly valid comments being downvoted, and I do not like it. If this forum becomes mob-controlled, with bullying, then I see no reason why bright people would stay here. If one doesn't want to deal with a comment, usually one moves on, like I do. But if an argument is perfectly valid and instead of an answer I see it simply downvoted, that is bullying, as it appears.
I also prefer secure hardware, but I find macOS completely useless for work.
While I can appreciate that some see other OSs as something of a curiosity, for many of us this is a big deal-breaker, and it's a shame Apple is not willing to provide their hardware to so many potential clients who simply don't want their software.
What advantage do you see to "secure hardware", I'm unaware of any recent Mac security issue that would have been prevented by it. It gives Apple a lot more control over the device but I don't see any advantage to the user.
It means that you would need to find a vulnerability in the bootloader and exploit it to break free from Apple's walled garden. Linux has worked on ARM for years, so I'm sure it won't be impossible to port it over, but whether enthusiasts will do it or not is another question, as you would need to write drivers for the proprietary GPU and storage to make it useful.
That’s not really what I’d call a “WWDC talk”, but sure, it mentions that Apple won’t provide Boot Camp, and that they are running their OS demos using virtualization. I didn’t see a claim that they won’t let you reduce the boot security.
"We're not direct booting an alternate operating system, it's purely virtualization. Hypervisors can be very efficient, so the need to direct boot shouldn't really be a concern."
For me that quote means they're not allowing booting of any alternate operating system, and they expect developers to use virtualization if another operating system is needed. I would be happy to be wrong about that.
I understood that as "we weren't booting something else in our demos, we were using virtualization" but not "we can't boot anything else". I am sure, however, that they would like you to use virtualization instead of direct booting.
I'll be sticking with my 2015 Macbook Pro Retina 13". Great machine, not too thin, heavy enough, no stupid touch screen, usb ports, great keyboard. Everything apple has done since hasn't compared.
I agree here. I really don't understand some of the bashing that happens every single time the Macbook Pros come up in HN comments.
I was a longtime happy owner of a 2015 15" MBP until earlier this year when I decided it was time to upgrade to a refurbished 2019 16" MBP. I was a bit nervous at first but I have to say that I have no major complaints, other than the fact that I wish I had F keys instead of the touchbar. Contrary to everyone else, I really like the keyboard.
As for the Touch Bar issue, I use Pock [1], which I read about on HN. It removes the need for using a slider every time I want to adjust the volume or brightness, and lets me control Spotify and see what song is currently playing.
I also have a 2018 15" MBP for work. If you believed the comments here you'd think I am unable to type on or use the damn thing. Honestly, after a week of use I was already used to the keyboard. Not having an escape key kind of sucked, but I have rebound Caps Lock to Escape on all of my MacBooks and enjoy that even more than having an escape key.
My other complaint is that the trackpad is a bit too large - I find myself accidentally hitting it sometimes and it just seems excessive. Finally, it's a shame that you can't mess with the battery/RAM/SSD yourself but unfortunately that's more of a trend for the industry than just Apple.
Overall I'm a totally happy user on both the 2018 and 2019 Macbook Pros. I was quite nervous about the possibility of having to use a Windows PC for work when I started my new job. And don't even get me started on having to use a Pixelbook at Google.
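For anyone wanting the same Caps Lock -> Escape remap without third-party tools, macOS ships `hidutil` for this (the mechanism and usage IDs are documented in Apple's Technical Note TN2450). A sketch; note the mapping only lasts until reboot unless you put it in a launch agent:

```shell
# Remap Caps Lock (HID keyboard usage 0x39) to Escape (0x29).
# 0x700000000 is the keyboard/keypad usage page prefix.
hidutil property --set '{"UserKeyMapping":[
  {"HIDKeyboardModifierMappingSrc": 0x700000039,
   "HIDKeyboardModifierMappingDst": 0x700000029}
]}'

# To undo, clear the mapping:
# hidutil property --set '{"UserKeyMapping":[]}'
```

System Preferences > Keyboard > Modifier Keys offers a GUI version of the same remap, but `hidutil` covers arbitrary keys, not just modifiers.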
> Contrary to everyone else, I really like the keyboard.
Lot of people like it, they just don't post about it.
Mine has a bunch of keys that print 2 or 3 times when I press the key once (the issue that everybody eventually gets with the model I have), but I still prefer that flat keyboard to the old one. It takes a few days to get used to, but then it's great.
I wonder how many people are informing their opinions solely from the gross incompetence exhibited by Apple with the double-typing keys issue you mentioned. I seem to recall that they have fixed it in newer models, but I wouldn't be surprised if it still persists.
I hate to admit it since I've drunk the mechanical keyboard koolaid, but I like the mbp flat keyboard. I easily prefer it to the 2015 keyboard, which imo is really nothing special (at least it doesn't gunk up). I hear people say great things about the thinkpad keyboard, but from my little experience with it, I'd still rank the mbp flat keyboard higher.
With BTT and GoldenChaos you can swipe anywhere on the Touchbar with two fingers to change the volume, and with three fingers to change the brightness of the display.
I use the 2020 MacBook Air personally and the 16-inch professionally, and the keyboard drives me insane, to the point that I bought a Magic Keyboard to be able to function without screaming at my computer for accidentally turning my video camera on yet again during a meeting. I really wish they would offer a no-Touch-Bar option, because I could use more power but still want to be able to touch type. The Touch Bar is the worst technological "improvement" I've ever come across.
Did you know that you can change what buttons are on the Touch Bar? You can make it almost behave the same as other keyboard by having it always show F-keys.
The difficulty isn't the content but the fact that you can't blindly hit the key reliably. Reminds me of back in the day when you could compose and send a complete text message from your phone without taking it out of your hoodie pocket in class. Not the case anymore in our touch-screen world without tactile feedback.
Must be finger positioning. I hit the TouchBar accidentally several times a day, each of which was infuriating, until I managed to semi-disable it by turning it into a row of inferior function keys.
My company gave me the current 16" late last year, and it's been a piece of junk: it suddenly starts to roar for no reason (while plugged into the power supply), so people on a phone conference complain that I should switch off the hoover.
It often freezes with not much open other than a few Chrome tabs, O365 and Sublime (without actually DOING much with any of them at the time).
The keyboard is poor, the Touch Bar is useless but doesn't get in the way that much, and the touchpad is too large and sometimes disturbs me while typing source code. The lack of ports means that without a hub the whole machine is pretty useless (and that means all users have to buy a USB-C hub for another $100 - annoying, but the real annoyance is having that hanging off your laptop all the time, and you mustn't forget to pack it, or you're screwed).
I also dislike the 16" form factor compared to 13", as it breaks all my leather bags when I try to carry it (not so relevant anymore in the days of the home office..).
I was having issues with my 16” until I disabled Turbo Boost. I dunno if it’s specific to the i9 model, but it was running hot and processes were spinning out of control. Since disabling, it runs a lot cooler and I have pretty much zero issues.
Ok, so like a reverse Turbo button. You’re running an i4 most of the time (and that’s fine) and sometimes you crank it up to i9 (when the solo begins, to stretch this metaphor to the limit).
I don't think that's a problem unique to Apple. I have a Dell XPS with an i9 in it that work gave me; I don't use it because the fans ramp up even while idling in Windows. If I use a laptop, I end up using my personal Lenovo ThinkPad X1 Extreme (gen 1), which doesn't have fans trying to take off into space.
The original retina macbooks are still the best. :(
So what should one do if a powerful working machine is needed now - buy the current 16" MBP or go to a ThinkPad? Disappointed they didn't even upgrade the Intel CPU in the 16" MBP to the newer version.
The newest one fixed it enough to satisfy me, i.e. it has a physical escape key again. I never used the rest of the function keys anyway, and I like the customizable nature of the touch bar. The lack of a physical escape key when they first introduced it was my only real gripe.
I was personally really hoping for an Intel 16" refresh today. I'm up for a new laptop at work and it's a little bit of a bummer to drop that much on a year-old machine. Not quite sure what to do at this point.
It's very similar to the 2014 models, it has scissor switches and not the controversial butterfly keys. If you ever tried a magic keyboard then it's basically that.
Work gave me a 2020 MacBook 16" and it's probably the worst business laptop I have ever had. The screen is smudgy and glary, the keyboard is awful, the touch bar is a meme, there are no ports, and finally it literally cuts my wrists because it was designed to be looked at, not used.
You are missing out if you don't have 4 USB-C ports, on both sides of your laptop. You can charge your laptop and connect a display with a single cable, without caring about putting the computer in the right orientation.
My 2017 13” Pro (no touch bar) has two USB-C ports, both on the same side. It is annoying. Especially after it turned out there is a design flaw in the 2017 Pros where charging on one side tends to trash your battery - guess which side both ports are on?
It’s nice to only have to plug in one cable when I dock it at my desk but I would have greatly preferred not having to replace the battery twice in the three years I owned it due to this. Especially since only one replacement was covered under the warranty thanks to the ‘rona shutting down all the authorized service centers in my city.
Overall I have positive feelings about my 2019 16" Macbook pro, and USB-C on both sides is definitely part of that... However I have yet to find a decent adapter that will both charge my laptop and output 4k @ 60FPS. I've gone through several iterations now and the closest I've come is a monitor that can connect via USB-C directly, but even then it only delivers enough power when the machine is near-idle... I'm looking forward to the day when I can have a 1 (or even 2) cable dock setup that won't run me $250+. Until that day, I have to use all 4 ports for:
Looks like the just announced 13" MBP only has 2 usb-c/thunderbolt ports and they're both on the same side.
Never occurred to me they would go backwards on this, I bet this is one of those overlooked details that will get a lot of notice once people actually start receiving these new laptops.
There are two models of 13" MBP - based on the other specs of this new machine, it occupies the lower tier - which in the Intel models also only has two ports.
Presumably, when they fill the higher tier with the new chip it will also have more RAM and more USB ports.
Don't MacBooks mysteriously run slower if you plug in power on the left side instead of the right? I don't remember the cause, but I've personally experienced this handedness issue on my Mac.
I heard yesterday that Apple does not offer a battery replacement for 2014s and you have to go third party. This is an unfortunate development and I'm wondering if I should inquire about getting mine replaced.
I have this machine, it’s a super portable and reliable powerhouse. I recently upgraded to a 2019 15”, mostly to go up to 32G ram, and it feels like a downgrade in terms of design and usability.
Also have this one. It was my first macbook, and I think I got really lucky with getting in when I did. Been going strong for 5 years, and has no difficulty doing anything I need it to do. Only thing I wish I did was get more storage (only got the 128GB model).
FYI the storage is upgradable on these machines (and it's very straightforward to do). It's a nonstandard SSD interface, but you can find official modules 2nd hand on eBay quite cheaply these days.
Do your research on the adapters and drives though. Some have issues with suspend. Make sure the hardware you add has been tested by some brave YouTuber or blogger before buying anything. It looks like the actual act of replacing it isn't that bad though.
Used Apple products maintain their value very well. The 2015 MBP Retina can be found on Ebay between $700-$1000. I've thought about getting one myself.
The Macbooks that were for many years the smallest and lightest Apple laptops would get unbalanced and knocked off your lap if you placed your hands on them slightly asymmetrically.
I think these super thin laptops are ugly, too hot and not ergonomic - and they sacrifice the keyboard. I actually think this model represents the perfect mix of all of these things: thickness (being the thinnest thing I can buy is not a feature for me), weight (I really don't mind 1.3kg) and keyboard.
> Modern web is too heavy for the old dualcore i5, unfortunately.
It's a shame that the web has become this borderline unusable mess. I shouldn't need a quad-core machine with multiple gigabytes of RAM to just read the news online.
I have the same machine too, and I'm starting to notice this as well. It's absolutely ridiculous that LinkedIn for example makes my machine slow down to the extent it does.
I've been waiting to upgrade my 2015 for 5 years now! It's funny, when I bought it I only went with 8GB ram because I figured I would replace it in 2 years-ish. It has served me well, but it's struggling with only 8GB these days. Now I'm ready for 32GB, because I'm planning on keeping the next one for another 5 years. Let's hope the rumors of a 14" MacBook Pro in 2021 (with mini-LED screen) have legs. Hoping this is only the interim state of the 13" Pro, due to really needing a different ARM CPU that is not available yet.
I am in the same boat. I love this Laptop, but Lightroom (CC) is really slow compared to how it runs on iOS. I would assume the new M1-based machines could run the iOS version of Lightroom almost out of the box.
Same boat as you, but for some reason with the new MacOS update my computer kept getting the most random freezes to the point where I gave up and gave it a fresh Linux install. Haven’t looked back and I’m extremely happy with the work I’ve been putting into it.
I have this generation air and a recent pro... call me crazy, but somehow the hard edges of the pro are super uncomfortable when the laptop is on my lap. Maybe an air vs pro thing rather than generation thing
Did I miss something, or are they, once again, not upgrading the camera? The low light performance of the current version is real bad. Since we're all using these cameras way more, I really thought there would be a hardware bump. You can only squeeze so much detail out of an under-exposed, noisy image with software.
If you look at the cooling solutions on the current macs I’d say the inverse: that these chassis were designed with the M1 in mind and they shoehorned Intel CPUs into them.
This is exactly right. They could have done everything in one go, but they had to change the keyboard and call that the 5th generation macbook pro. They didn't even include wifi 6 in the previous model, nor did they hold an event for it. Kuo says a redesign is due in 2021, but apple can't keep getting away with doing the same thing since 2016. There are laptops with dual screens, 4k touch screens, vapor chamber cooling, and apple just keeps swapping out the keyboard for the next gen. This new chip is good for the chip industry, but it's not really a different laptop at all.
I'm curious, are higher-quality cameras even available that fit into the thin lid? (At a reasonable price?)
I was always under the impression that the built-in camera is so much worse than your phone's front-facing camera simply because the phone is thick and has room, while the MacBook lid is much thinner.
The reality is, for most people 720p is more than enough, since most videoconferencing is at a heavily compressed 480p anyways, and it's not like most people really want it super obvious that they missed a spot shaving or have a small zit anyways. And sure the quality arguably falls to below 480p in very low light conditions, but that's not an issue in any normal well-lit office, coffee shop, or kitchen or living room.
I would assume Apple is making the right call here that a higher-resolution camera isn't worth additional thickness.
Yea, I hate the way the background blur feature in google meets suddenly make my face way brighter and "pop" in the video. With winter coming, I'm getting a lot less natural light in my work area and my videos just look dreary.
Considering how good they are at iPhone cameras, there is absolutely no reason to have a poor camera in a laptop nowadays. They know how to make cameras that fit in tight places. They're just being lazy/cheap.
Instead of having a camera as thin as the lid, they could have the camera stick out a little bit and fit into a recess on the other side of the clamshell when closed.
Surprising as it seems, 8GB seems to be enough for most "normal" users. My wife has an 8GB MacBook Pro and I'm always shocked by how many open tabs she keeps in Chrome and how many Office documents are open at any given time.
My 16GB MBP is constraining sometimes and the next one will, doubtlessly, come with at least 32GB. Until Apple can do that, I won't use an ARM-based Mac as my daily driver.
As for the 8-core, it's 4 beefy ones and 4 low-power ones. Load will shift from one kind to the other according to usage patterns. This is how they achieve that 20-hour battery life.
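The shifting between core types is driven by each task's quality-of-service hint. As a very rough conceptual sketch (this toy Python model is not Apple's scheduler; the QoS class names match macOS's dispatch QoS levels, but the placement policy here is a simplification):

```python
# Toy model of big.LITTLE-style scheduling on the M1: interactive work goes
# to the 4 performance (Firestorm) cores, background work to the 4
# efficiency (Icestorm) cores. The policy below is illustrative only.
PERF_CORES, EFF_CORES = 4, 4

def assign_core(task_qos):
    """Map a task's quality-of-service hint to a core type."""
    if task_qos in ("user-interactive", "user-initiated"):
        return "performance"
    # utility/background work stays on the low-power cores, saving battery
    return "efficiency"

tasks = ["user-interactive", "background", "utility", "user-initiated"]
placement = {t: assign_core(t) for t in tasks}
```

The battery win comes from the real scheduler doing essentially this: most of a laptop's wall-clock time is spent on background chores that never need a fast core.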
I bought a decent external camera, and while being able to position it is a must-have, the quality means that if I haven't shaved, or there's a nick or something in the background, it is obvious.
Whereas the people I am on an online call with have about as many pixels dedicated to their hair as Lara Croft in Tomb Raider 3.
I still care enough that I use it, and I am disappointed in Apple not including their iPhone camera, but it might not be something the average person wants.
I agree that the cameras are subpar. I wonder if Apple figures that most Mac users already have an iPhone, and that if they want to use a better camera for videoconferencing, they can always just plug in their iPhone and use it as the camera.
Has anyone done this? I've considered using my iPad Pro's camera this way but have never cared enough to actually do it.
I'm pretty sure there was a part of the presentation talking about a camera upgrade. But it's not something on my wish list, so I didn't pay attention.
Double-check the recording on Apple's web site. The answer may be there.
Super interesting how they kept the touchbar on the Macbook Pro keyboard but not on the Air. As a software developer, I'm going to be more likely to buy an Air just due to the keyboard.
Having bought a Macbook Pro with an Esc key earlier this year, lack of an Esc key was 90% of my issue with the touch bar. My biggest issue now is when I accidentally activate it. I didn't have an issue accidentally hitting the function keys on my 2013, so why not bring those back? Or invent something new. Forward or back, I don't care, just admit the touch bar was a dud and get it off my keyboard.
EDIT: My mother also says she gets distracting autocomplete suggestions on the touch bar while she's typing. I vaguely remember doing something to turn that off on mine, but she is terrified of Covid so I haven't had a chance to get at her laptop and fix it for her. I don't know what human factors genius at Apple decided it would be helpful for people to see words hopping around at the edge of their vision while they're trying to type.
> I vaguely remember doing something to turn that off on mine, but she is terrified of Covid so I haven't had a chance to get at her laptop and fix it for her.
You can use Messages to easily initiate screen-sharing (with you controlling her Mac) to do that. No 3rd party software, no action on her side needed.
It's absurd that you had to resort to this.
I wonder if someone also released a standalone bluetooth escape key that attaches to the side of a keyboard (equally absurd).
This is such a bizarre unforced error. Why would they just arbitrarily remove a row of the keyboard? Taking it away and asking why I need it is like asking why I need right-click, pinch-to-zoom, or my left pinky finger. It makes no sense that we have to choose between fully featured input and active cooling.
Either way, based on this announcement I'll probably hold off on upgrading for another generation. That'll leave some time to see how the transition goes, and with any luck the next ones will include 32 GB RAM, Mini-LED, and 5G (along with a full keyboard).
Don't know why they didn't just add a screen to the blank function key width gap of bare aluminum between the keyboard and the hinge if they were looking to add some decoration to the keyboard. That would have been praised, instead it's been a pariah.
Step Over/Into/Out shortcuts in Xcode (F6/F7/F8) are the really obvious ones which I hit many times per day. Play/pause/etc. with the Fn key are easier to hit without looking than the touchbar.
I prefer that they change labels in a context-sensitive way. For keys that I don't use that often the icon is a better trade-off than a tactile response. Sliders are useful as well.
I'm not the person you replied to, but I frequently use the play/pause key and also the volume keys. I occasionally use the function keys in my text editor too.
I'm with you. My work computer has the TouchBar, and my personal computer has the physical keys.
While I prefer the physical keys, once you get the TouchBar configured properly (I use MTMR), it's really quite nice. Combined with a physical escape key, it would be ideal.
Considering how the people on HN boast so much about being L337 Haxxorz, I'm surprised they don't see the value in a secondary interactive screen that they can make do anything they want.
I don't look when I type, either, except for the function keys, because I don't use them that often. So since I'm already looking down at the function keys, looking at the TouchBar makes zero difference.
One minor annoyance, however, is that the TouchBar isn't visible in direct sunlight, which is to be expected of any screen. Function keys don't have that problem.
It is Cmd+Backspace for me because I remapped my MacBook's keyboard so it matches my ThinkPad's[1] - I really don't want to have to play Twister with my fingers.
[1] Yes, I've remapped/swapped the Ctrl+Fn keys on my ThinkPad too.
I can't fathom why Apple couldn't just include both. I'm using a company-provided touchbar'd Macbook Pro for work, and it's handy (I've got it setup with screenshot and screen lock buttons so I don't have to memorize the absolutely bonkers keyboard shortcuts for them); it just seems boneheaded to treat it as a replacement of the function keys instead of an addition.
I've been using Airs as my primary machine since 2010. I don't use bulky software (XCode, Adobe suite, Office, etc), so if you use any of that I wouldn't recommend it, but for software dev it's been plenty fast and a real joy to use.
The only time I'm speed constrained is deep learning, but generally I just run tiny test sets locally and then run full jobs on a cluster or the cloud.
In the mid 2010's I had a desktop with like 64GB of Ram and four million processors, and I found programming on the Air I was still more productive. Productivity wise I think it's a very high dimensional space to consider.
There's a noticeable latency using gmail.com on my iMac Pro, too.
I've always used powerful desktops (Mac Pro, iMac Pro) and MacBook Airs while traveling, and while the Airs aren't very good for video games like StarCraft II, I've never had a problem with performance while doing anything else (I work in VS Code on TypeScript). My unit tests run in ~20 seconds on the MacBook Air as well as the iMac Pro.
Since the new MacBook Air doesn’t have a fan while the MacBook Pro does, I’m sure there are some differences that Apple isn’t admitting right now. The battery life claimed by Apple is also higher on the Air compared to the Pro. Maybe the Air is throttled or runs at lower clock speeds.
If you configured them the same (chose the more expensive Air), the Air and the 13" Pro have the exact same specs from what I can tell other than a slightly brighter screen (500 nits vs. 400 nits) and slightly bigger battery (58.2 whr vs. 49.9 whr) on the 13" Pro.
I am also not a fan of the keyboard or touch bar, but especially the touch bar. It's a cool idea and I can see useful functions, but not pushed up against a place where you're typing. I am constantly bringing up Siri or changing volume when hitting numbers, -= or delete from a finger swiping across the touch bar accidentally. There used to be a tactile force needed to activate F key functions, but now there's just a capacitive touch bar.
What's wrong with the new keyboard (touch bar aside)? I haven't tried one yet, but I thought it was supposed to be more or less the same as the 2013 - 2015 MBP's.
Personally I tried using it for F-keys, which is quite frankly the only requirement I'd have, and it was awful. With your hand fully extended, which it is if you've just hit a meta key, it's impossible to hit the right F-keys every time. I don't know how anyone writes code on those machines.
It went back to the Apple store within a week and I got an air. Which went to the Apple store in a week because the keybbboooaaarrdd wwaaass aawwwful.
Then I bought a thinkpad. Which is less of a shiny toy.
T495s running Ubuntu 20.04. Absolutely perfect machine as far as I am concerned. Everything works flawlessly that I've tried and the quality is excellent. Really like it.
Like most oldsters, I have presbyopia, which is exacerbated in low light. When I need to turn up the brightness of my monitor, I can actually see the "button" on a touchbar. On all my non-touchbar laptops, I randomly press F keys and hope for the best or pull out my phone for a flashlight.
Yes, I miss the physical esc, but F-keys are usually SW configurable and I can configure splat-1-10. That said, I am not a frequent programmer...
I love the Touch Bar. All those cmd-opt-shift bizarro keyboard shortcuts, I have mapped to custom buttons across lots of apps. A couple popup apps on globally-available buttons. Even a swiping gesture while on iTerm to fly through command history.
It's great if you take the time to actually customize it.
Incidentally, I place the “blame” for the poor uptake of the Touch Bar at Apple’s feet. They relied on uninterested developers to make it useful, and didn't put the necessary tools in the hands of users to make it useful themselves. In the five years since its debut, they’ve done nothing to expand its capabilities.
Everything cool I can do with the Touch Bar is thanks to a third-party tool, BetterTouchTool. Also enables some cool stuff like three-finger-trackpad-swipes to cycle through window tabs! Worth the price of admission for that feature alone, give it a look. https://www.folivora.ai
I really have no clue what apple is thinking with regards to the touchbar.
Why should any developer spend time developing sorely lacking features for the touch bar, when they just released the most popular mac without one?
The only way I can make sense of the decision to release the air without a touch bar, is that it must be really expensive to manufacture and they were struggling to hit the $1000 price point.
As a result the touchbar lives in this weird limbo state. Apple themselves clearly are uncertain what they want from it, and it shows. Since its release the touch bar has been left mostly abandoned and it’s been up to third party developers to make it useful.
It can be useful, at least after heavy customization – something a screen lends itself well to. Why isn’t apple doing more to help users customize it?
macOS comes with an application called Automator. Automator is confusing for power and regular users alike and as a result, nobody uses it. Why not rethink Automator completely with the touchbar in mind? Bring the power of programming to regular users with an easily accessible ‘create your own button’ feature that lets users add custom commands accessible from their touchbar?
My fondest wish is a high-end MBP 16" with no touch bar. The touch bar's sole role is to generate errors and missteps resulting from merely grazing over it.
Even setting the touch bar to vanilla F-keys, grazing triggers F-key actions which is so frustrating.
FWIW there are third-party software options for the Touch Bar such as MTMR (my touch bar, my rules), which at least allows you to activate the haptic feedback in the trackpad when touch bar buttons are pressed. I found that it helped dramatically with accidental touch bar presses.
MTMR also solves the other main problem with the Touch Bar which is that it hides brightness and volume controls behind a tap (so you can't, for example, instantly mash "volume down" when you find it is unexpectedly loud.) With MTMR (and others I believe) you can make multi-touch gestures on the bar to adjust volume and brightness swiftly.
All that said, I'm not convinced that the touch bar adds enough value to justify its cost. If your day-to-day computer use includes tasks it is good at, maybe. As a developer, probably not.
Just my $0.02 as a touchbar-skeptic-cum-macbook-owner.
I use it for exactly what I've always used function keys for- brightness and volume.
I also occasionally use the emoji picker or app specific stuff. For example- my markdown editor has Touchbar selections for code blocks and all that sort of stuff. I don't write enough markdown to remember everything- so the touchbar makes it nicely discoverable without having to click through menus.
I only use it for brightness and volume controls (and TIL about the Emoji picker thanks to a sibling comment). I LOVE the analog-feeling controls on brightness and volume. Even though I know it's the same under the hood as an incrementation button, the UX of it makes me feel substantially happier. I absolutely miss the volume and brightness sliders when I go to use my Chromebook or my wife's laptop.
I normally keep the touchbar configured to only show the ~4 most used buttons (night mode, brightness, volume, and mute), and find that I use the other buttons rarely enough that I forgot what else was on there -- mainly because I use touchpad gestures instead.
I rarely (never?) used F-keys in my IDE (Jetbrains) or other apps, so I don't miss them. I have a physical Esc key (which is nice), though I used it rarely enough on my previous generation touchbar mac that it wasn't very infuriating. Having the physical escape + power keys removed any complaints I previously had about it.
Interesting. Volume and brightness adjustment is my least favorite part of the touchbar. With physical keys I can adjust the volume/brightness without looking down.
If it were a physical slider, I'd totally agree with you.
(Not that there's any right or wrong or answer here.)
I dislike it immensely, mainly because it changes all the time, which makes it hard to use without looking at it every single time. The best way is to use an external keyboard.
Depends on how you use it, if you use a lot of actions (basically macros) via keyboard shortcuts then you probably hate the touchbar, because they can only be bound to f-keys.
VirtualDJ supports using it for a crossfader, but then, it would be a way better move to buy an Air, and a real audio interface and mixing board, or at least an external controller.
If Touch Bar were in addition to the function keys + Escape key, everyone would praise it as yet another brilliant Apple innovation, and rivals would be copying it the way every notebook nowadays looks like the late-2008 unibody Macbook Pro (and really, all the way back to the 2001 titanium PowerBook). But it's not, so they don't and they aren't.
On a tangential note, there were reports about the Intel based MacBook Pro from earlier this year having CPU usage spikes and heating issues when charging from the left side port. For users with this problem, it was effectively “charge using the right side port”.
I definitely get more heat and throttling powering on the left side than the right, on my 2019 16”. There are quite a lot of people who have reported the same, for all models which have USB-c ports on both sides. If you’re not seeing a difference, it’s pretty likely your workload or some background process is still pushing the thermal limits of your machine.
Hmm, I wouldn't say this is nonsense. I can consistently reproduce the issue by charging on the left side (making the computer completely unusable), and then switching to the right side and having it run perfectly fine. Since I started charging on the right side I've had zero issues with it.
You would also think they would keep the MagSafe connector, which was truly revolutionary (and has definitely saved me at least a dozen times), but yet here we are.
I've already tripped or stumbled on my usb-c cable dozens of times working from home. The damn thing is already pulling out at the ends. Why can't apple make good cables? They make their environmental statement by making you buy your phone's charger separately rather than shipping one in the box, just to sell you junk cables that don't last. I had old multipin cables and earbuds from the original ipod that still hold up. All of their textured grippy feeling cables have been crap.
You can’t charge from either side if there are only ports on one side, which, if I understand the design and GP’s comment correctly, is the case for the ARM Macs. Which would be the ones GP meant by “this generation”.
Five years later and I still can't buy a new 13" laptop from Apple with more RAM than my 2015 MBP.
Edit: Apparently you can configure a (Early 2020) Intel-based 13" MBP with 32GB of RAM - I was not aware of that. Hope they bring that option to the ARM versions ASAP, especially if the performance gains are as good as Apple claims.
I'm guessing that since the RAM is now in the SoC package instead of in separate chips, creating models with more RAM becomes more difficult, not only from a space constraint but also from the cost of manufacturing more variants?
The iPad Pro has a mere 4GB of RAM and is faster at almost every task AND multitasking than any of my other beefy machines. I wouldn’t discount this yet.
I think it depends what they've done with Big Sur.
Take a look at your Activity Monitor and see how many background tasks your laptop is running right now (I currently have 481, and I only have 4 apps actually running).
Compare that to the iPad, which can run 2 apps max, and Apple controls all the background activity.
Hitting a wall how? The machine stops working properly? Or you see all the RAM allocated in Activity Monitor? Because MacOS aggressively allocates RAM even under a normal workload.
Well when you start getting memory pressure you start losing cycles to compressing and decompressing pages which makes everything run like complete shit.
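The cost is measurable in a crude way. A sketch using zlib as a stand-in for the kernel's page compressor (macOS actually uses a much faster WKdm/LZ4-class compressor, so the absolute numbers below are illustrative only; the 16 KiB page size matches Apple silicon):

```python
import time
import zlib

# A 16 KiB "page" of semi-compressible data, standing in for app memory.
page = (b"some repetitive app data " * 700)[:16384]

start = time.perf_counter()
for _ in range(1000):
    # Every fault on a compressed page pays a round-trip like this
    # instead of a plain RAM access.
    compressed = zlib.compress(page)
    restored = zlib.decompress(compressed)
elapsed = time.perf_counter() - start
```

Even with a fast compressor, a compressed-page fault costs microseconds where a RAM access costs nanoseconds, which is exactly the sluggishness you feel once the compressor kicks in across thousands of pages.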
That’s not how MacOS works. It “keeps the pressure on” so to speak to try and have as much in-memory before you need it. Have you experienced actual issues with specific software?
Blows my mind, I do most of the same on a 2013 MacBook Air with only 8GB RAM. I’ve never had a perf hiccup unless I try video editing or gaming. But I just don’t really have a want to do those things, hence why I’m still getting by on this thing. I was going to upgrade once for the hell of it but all the keyboard issues kept me seated.
I expected you to say you did something with graphics/media. I guess the VMs could be the difference. I don't use those. But I have 100 Chrome tabs open at most times lol.
All these things work with 8GB RAM. macOS uses all available RAM aggressively, so if you have 32GB RAM it will appear as what you are doing requires 32GB RAM, which is not the case.
On my machine, if it goes more than 5GB into swap it starts to become less responsive.
It really depends on your usage patterns. VM abusers and people who need to keep an eye on more than one project at the same time (as in multiple IDEs) can't get by with 16 GB.
And please don't tell me to start closing software, I'm willing to pay for more ram to have everything handy. Except... I can't.
Right now I have 6GB for VMs (for ~12 docker containers) and 8GB for various IDEs and editors. Probably typical for those on my development team. Surprisingly Chrome isn't even the top ten for memory footprint on my laptop.
So far 16GB has been fine on this 2017 MacBook. Wouldn't turn down 32 though :-)
Could be true. But I get instant feedback from all apps, so what's the measure of "reduced performance" that would matter? I'm actually curious, if nothing lags, would I notice a difference by upgrading?
My view into your comment is that you think I'm used to the lag. That's not the case. It's not my only device, just my only Apple laptop. My newest device, an iPhone 12 Pro, is an upgrade from my 6s that I surely noticed and knew I would. I had been frustrated by the 6s for a while but held out for 5G.
Yeah, I found running all my dev environments in Docker/VMs made everything too slow. So I've just installed everything natively and have my own little docker-compose-style runner to simplify running our in-house microservices. I would definitely appreciate decent Docker performance, but it was actually surprisingly painless to set this up.
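Such a runner can be very small. A minimal sketch of the idea (the service names and commands here are placeholders, not anyone's real setup):

```python
import subprocess
import sys

# Minimal docker-compose-style runner: each in-house service is launched as
# a native process instead of a container. Commands are placeholders; a real
# setup would point at the actual service entrypoints.
SERVICES = {
    "api":    [sys.executable, "-c", "print('api up')"],
    "worker": [sys.executable, "-c", "print('worker up')"],
}

def start_all(services):
    """Launch every service, returning handles so they can be stopped later."""
    return {name: subprocess.Popen(cmd) for name, cmd in services.items()}

def stop_all(procs):
    """Terminate all running services."""
    for proc in procs.values():
        proc.terminate()

procs = start_all(SERVICES)
codes = {name: proc.wait() for name, proc in procs.items()}
```

Add restart-on-crash and per-service env vars and you've replicated the 20% of docker-compose most teams actually use, without the virtualization overhead.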
I did some benchmarking on that public taxi rides dump (~600m records) and could run through it all in under 15 seconds directly from SSD. No way in hell your CPU will keep up with anything close to memory bandwidth or beyond 3GB/s.
The only explanation could be some really aggressive swap situation, offloading anything not needed that second. But that won't work for things where you need all the RAM right now - video editing, for example.
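Sanity-checking the arithmetic on that benchmark (the ~100-byte record size below is an assumption; the taxi dump's real average row width will differ):

```python
# Back-of-envelope: what throughput does "600m records in 15 seconds" imply?
records = 600_000_000       # ~600m rows in the taxi rides dump
seconds = 15
bytes_per_record = 100      # assumed average row width, not measured

gb_per_s = records * bytes_per_record / seconds / 1e9
rows_per_s = records / seconds
# gb_per_s -> 4.0, rows_per_s -> 40_000_000.0
```

At an assumed 100 bytes/row that's 4 GB/s of sequential reads, already above the 3 GB/s mark, so the bottleneck in such a scan is plausibly parsing/CPU rather than the SSD.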
But you don’t have background services on the iPad and apps are hibernated to disk if they aren’t active and iOS needs more ram. It’s possible that Apple is doing the same thing on macOS, but we don’t know (it would break a ton of apps).
This is a good point and raises an interesting question: will they continue using the same ARM chip across the whole line when the 16" MBP and the iMacs make the switch, and if so, will all Macs of the same generation always have the same amount of RAM? Or will they branch the chips (M2 and M2S, or something)? Is RAM becoming less relevant when you have smart integration of software and hardware components, to a point where stratification is no longer necessary in most cases?
> Is RAM becoming less relevant when you have smart integration of software and hardware components, to a point where stratification is no longer necessary in most cases?
If they intend these things to run software development, audio/video editing, CAD, or any other resource-intensive workloads, there will absolutely be demand for more.
I could see the M2 coming with 32GB across the board, at which point the only current outliers (setting aside the Mac Pro as a special case) would be the 64GB configuration for the 16" Macbook Pro and the 64GB and 128GB configurations for the iMac. I could imagine Apple's optimizations closing the performance gap with the current 64GB offerings, and then perhaps they just leave 128GB customers to go all the way for the Mac Pro. I would be surprised to hear that they sell very many 128GB iMacs right now anyway.
I used a MBP with 4 GB of RAM up until 2018 for rather heavy duty data science workloads. Wasn't ideal, but one thing I learned was that you really can't infer the amount of swapping and performance degradation that occurs just by looking at how much RAM is in use vs. how much you have, because the OS will eat up whatever it gets. The little memory pressure graph you can see in Activity Monitor is quite good, and on my current machine with an "unfortunate" 16 GB of RAM, memory's always full but I have no complaints about speed whatsoever.
Crazy to imagine this, but restricting developers to 16GB machines might be the most effective way to fight against (cough... electron driven) software bloat :-)
fwiw, you can configure a 13" Macbook Pro with an Intel chip up to 32GB. But I agree, I wish they launched the new M1 based 13" Pros with up to 32GB RAM
The low-end Air only has 7 GPU cores, compared with 8 on the one with more storage. So they must be disabling a bad core and selling it as the cheap model. Other than that, all these machines use the exact same CPUs. Which means an iMac or 16" MBP is probably going to use an M1X or something with more cores.
Yes. Chip fabrication is super sensitive to the condition of the silicon wafer used. Chip companies talk about yields, because some percentage of chips can't, for example, be run at the highest clock rate. Indeed, some can't run reliably at all. If there is a microscopic flaw on the wafer that ends up being where one of the cores is located, disabling that core altogether is an option to keep that silicon marketable.
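The economics of salvaging dies can be sketched with a toy Poisson yield model. Everything here is an invented assumption for illustration (the function name, the 0.05 defects-per-core rate); it is not Apple's or TSMC's actual data:

```python
import math

def yields(defects_per_core: float, cores: int = 8):
    """Toy Poisson yield model: the probability that a given core has
    zero defects is exp(-d), where d is the expected defect count per core."""
    p_core_ok = math.exp(-defects_per_core)
    p_all_ok = p_core_ok ** cores                             # sellable as a full 8-core part
    p_one_bad = cores * (1 - p_core_ok) * p_core_ok ** (cores - 1)
    return p_all_ok, p_one_bad                                # one bad core: salvageable as 7-core

full, salvage = yields(0.05)  # 0.05 expected defects/core is a made-up number
print(f"perfect dies: {full:.1%}, salvageable as 7-core: {salvage:.1%}")
# → perfect dies: 67.0%, salvageable as 7-core: 27.5%
```

Even at a modest defect rate, a quarter of the dies land in the "one bad core" bucket, which is exactly the silicon a 7-GPU-core SKU can soak up.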
The Intel 13" MBP with 2 TB3 ports also topped out at 16GB. They're actually still selling the Intel 13" MBP with 4 TB3 ports, which is the model that can be configured to 32GB.
This is 16GB which is shared with their GPU so it will be interesting to see what limit Apple puts on the GPU for grabbing memory.
There are stories that Apple is working on a discrete GPU, which would free up space on a future Mx chip for additional processors or memory. However, looking at the space occupied by the DRAM and GPU, it looks like a larger die would be required for any on-package memory expansion.
Do most people with MacBook Airs have workloads that require more than 16GB of RAM, or do they generally value a lower price point?
Cost, performance, light weight - pick two right? Apple seems to have picked a combination of lower cost and light weight. Customers who need more ram probably move up to a MBP.
For the lowest end models right? It's the same tradeoff to bring down prices for people buying the lowest end machines. Presumably they understand that people who need the higher end spec'd machines will wait to buy a new laptop until more software has been migrated to run natively.
If you need a 13" MBP that supports more than 16GB of RAM, they sell it. It seems unlikely they'll stick with a 16GB limit once they start replacing the higher-end devices.
I honestly can't think of a situation where you need more than 16GB of ram on an ultrabook. If your job is heavy video editing - I don't think getting a laptop is a good idea.
I know I use a lot of RAM for compiling and tests, but we have cloud instances with up to 500GB of RAM for that.
Also, with the move to ARM, I suspect heavy tasks like these will be forced to move elsewhere. Businesses might not want to switch for a few years, waiting for dev tools and ARM servers to become available.
Please do explain what your workflow is that requires you to have so much RAM in such a small form factor with a severely limiting CPU and an almost non-existent GPU?
GPU: don't need it except to drive 1 or 2 external monitors if I'm at my desk (thus small screen not an issue either)
CPU: 8 cores is fine for me
RAM: mainly for a local dev environment running in Docker
For reference, I currently have a 2019 16" MBP with 64GB of RAM and an 8-core i9. Right now I'm using ~45GB of it; usually it's up around 50 or 55. The CPU is at ~10%. While I'm developing locally, the workloads are highly bursty, but sometimes they will chew up almost all of the CPU for about 10 minutes.
I used to have a 2016 15" MBP with 16GB (4 core i7) and while I could still run everything, the RAM was always pegged. Before that I had a 2016 13" MBP with 16GB (2 core i7) (I actually swapped with a coworker because I thought his 15" had 32GB). With both of these the RAM was by far the limiting factor, and while the increased CPU cores have been nice, most of the time I haven't noticed. The RAM increase on the other hand is always noticeable. Thus, 16GB is a nonstarter for me.
Edit: To add, I prefer the 13" form factor when I'm not at my desk.
More correct would be to term this limit "absurd" or "clownish".
Even if Apple views the use-cases of these machines as primarily consumption devices and the users of these machines not as creators but as consumers, the bloat of the modern web is expensive and getting more expensive every day.
The end user needs this memory even if they really are just using these as facebook machines.
EDIT: That's not just the notebooks - the mini appears to have this limitation as well (down from 64GB previously).
Not all RAM is created equal. Better memory management means less RAM can be better than more RAM. Pair that with superfast flash storage for swap and you might not even be able to tell the difference between 16 and 32.
Besides, this release cycle is 100% optimized for an impressive speed boost, tempered by a need for a more impressive battery life.
> Better memory management means less RAM can be better than more RAM
No, it can't. If more RAM means it's slower RAM then maybe it can be better to have less of it in some workloads. But otherwise it's never better to have less RAM than more RAM. Better memory management can make the impact of less RAM be less severe, but it's still unambiguously worse.
Especially if you have workloads that actually need the RAM like large ML models or editing 8K videos.
> better memory management absolutely can be better than just throwing more RAM at the problem.
Those are not competing in any way. Better memory management does not require nor benefit from less RAM.
Apple doesn't give you a different kernel when you choose the 8GB SKU instead of the 16GB one. It's the same software, just with less RAM. And having that less RAM is best case break-even in "day to day" experience, but never better.
It's true that if the OS could predict exactly which memory pages to keep and which to swap out, we could save memory wastage, but so far I haven't seen any memory management scheme that can reduce memory consumption by half.
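For intuition: the OS can't see the future, so it approximates "which pages to keep" with recency heuristics. A toy LRU page-replacement simulation (purely illustrative; the function is mine and this is not how macOS's pager actually works) shows how abruptly the hit rate collapses once the working set no longer fits in the available frames:

```python
from collections import OrderedDict

def lru_hit_rate(accesses, frames):
    """Simulate LRU page replacement; return the fraction of accesses
    served from the resident set (i.e. no swap-in needed)."""
    resident = OrderedDict()
    hits = 0
    for page in accesses:
        if page in resident:
            hits += 1
            resident.move_to_end(page)        # mark as most recently used
        else:
            if len(resident) >= frames:
                resident.popitem(last=False)  # evict least recently used
            resident[page] = True
    return hits / len(accesses)

# A looping working set of 8 pages: fits in 8 frames, thrashes in 4.
trace = list(range(8)) * 100
print(lru_hit_rate(trace, 8), lru_hit_rate(trace, 4))
# → 0.99 0.0
```

The cliff from 99% hits to 0% with a cyclic access pattern is the pathological case: no amount of clever eviction halves the memory a working set genuinely needs, which is the point being made above.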
For me personally, I won't even consider a machine with less than 32GB ram in 2020/2021. With 32GB, I never close out of applications that I use regularly, and so it allows me to switch state instantaneously for not that much more money. My workloads are typically read-heavy & multiple GBs - editing/screening/cataloging hundreds of RAW photos, loading of multi-GB games, having about 50 tabs open in FF, etc. After having switched careers I rarely code anymore, and I don't think these are uncommon requirements.
Yes that's true, for the "average consumer" that really only needs that RAM to power the 100+ browser tabs they have open. But if you're doing lots of virtualization or containerized work, super fast SWAP isn't going to cut it.
I no longer code, so I'm probably close to the 'average consumer' now. I personally consider 32GB to be the minimum amount of RAM that anyone should consider in 2020/2021 (with obvious caveats about money). My multi-GB workloads are read-heavy and include loading multi-GB games, editing hundreds of RAW images, opening 50+ tabs, etc., all without leaving the cosy confines of my RAM (which I still sometimes do). I have a 100GB system commit limit on my W10 box, and with my current usage pattern, I hit about 50GB at peak.
I was literally about to order a new mac mini, until I noticed it only has 16GB max memory. How can they do this?! The previous model could be upgraded to 64GB, had 4 USB-C connectors (instead of now 2) and the option for faster ethernet. It really sounds like this new mac mini is a downgrade from the previous model. I wonder if the new M1 chip/architecture really makes that much difference to make up for the downgrade of the rest.
The new Air looks good. The new MBP 13" not so much. If they're going to lower the max RAM, they should at least explain why 16GB in the new architecture is comparable to 32 in the previous one, if that's indeed the case.
Dell XPS: $1000+ with 8GB. Lenovo X1: $1000+ with 8GB. Thinkpad T14s: $1000+ with 8GB. This list could go on. Even System76 Linux computers are $1000+ for an 8GB config. I don't know where you got the idea that $1000+ laptops come with 16GB by default.
I'm sure there are consumer (crap) computers like the HP Pavilion, Dell Inspiron, etc. where one can get a $1000+ computer with 16GB or so of RAM.
My Huawei MateBook 13 2020 Ultra has 16GB and cost ~€1200 i.e. same ball park as the ones you listed (i7, 512GB SSD, 16GB RAM, GeForce MX250, 2736x1824 display).
Apart from the RAM the main reason I went for it was the resolution (I liked the extra height in a 13" form factor). Also, it works with linux - though haven't tried it yet.
Recently bought a Thinkpad E15 Gen2 with 24 GB of RAM for less than $1000. (Different country though (Japan), and ~6 week wait, but extremely reputable retailer)
* As to how that deal worked, I'm guessing that large retailers that agree to buy a lot of laptops get a large discount.
I finally switched to 32GiB this year, and man, what a difference. I hadn't thought I was all that constrained by 16, but now that I'm not, all the edge cases where that was the pain point are thrown into relief.
Not strictly comparable, but my company provided Lenovo P1 that's a few years old got 64 GB. Can't imagine having to deal with 16 GB a few years later..
I think it's actually a good move for the first gen. If they put out the CPU frequency, people would be comparing it to the intel parts. ARM and x86 being different architectures, a 2GHz M1 is going to perform differently than a 2GHz Core i5
There have been dozens of Macs that have come out with a lower baseline frequency than the previous generation. My ancient 2004 iBook G4 has a higher frequency than a 2020 MacBook Air; that doesn't mean customers assume it's faster, nor should they if your marketing is worth a damn.
It is sorta meaningless these days. The 3.6GHz chip in my PC consistently averages 4.3GHz, and climbs into the high 4's when it needs to. It's also much faster than a 3.6GHz processor from 5 years ago.
It's a great baseline for comparing same-generation computers, though. Without benchmarks, none of us has any idea which is faster, the Air or the Pro, or the 15".
Note they say "latest", not "best". That means that a Celeron 5205U, dual core processor with 1.9GHz, no turbo, no hyper threading, 2MB L1, would be a valid baseline.
Apple's style of marketing is better suited to mainstream consumers. Many of them don't know what CPU frequency means. For them, it's additional noise with no added value. Consumers have too many choices, and throwing a spec sheet at them often makes them compare for hours and still be unable to decide, whereas you know that this year's Mac is likely faster and better than last year's.
What I can't find is how many external monitors the M1 chip can support. I don't see any detailed specs. All they say about the M1 is:
"The Apple M1 chip is the first system on a chip (SoC) for Mac. Packed with an astonishing 16 billion transistors, it integrates the CPU, GPU, I/O, and every other significant component and controller onto a single tiny chip. Designed by Apple, M1 brings incredible performance, custom technologies, and unparalleled power efficiency to the Mac.
With an 8‑core CPU and 8‑core GPU, M1 on MacBook Pro delivers up to 2.8x faster CPU performance¹ and up to 5x faster graphics² than the previous generation."
At practically every desk in Silicon Valley for 4+ years, yes. Two 27" 4K Dell UltraSharps and a choice of 13" or 15" Macbook Pro are standard issue for an engineer.
Yeah I’m hoping it’s just a case of them not having the specs in right. Otherwise it would be a pretty poor downgrade for those that use two monitors. I don’t use the laptop screen either when docked so really no reason it shouldn’t be able to drive them.
I was under the impression that Native display port would be able to drive 2 external monitors. I do not consider my self to be an authoritative source on this.
I keep seeing comments about the lack of MST support, but they're ignoring that, like you say, you can already drive dual 4K60. I'm doing it right now.
You probably have monitors with native Thunderbolt input? Not many such monitors exist. Without MST you're out of luck doing this over DisplayPort (via DP, mDP, or USB-C), which is what the majority of monitors on the market have.
No, they're both DP. Thunderbolt 3 can do dual 4k60 over a single cable (including also providing power, ethernet, audio, etc) https://www.caldigit.com/ts3-plus/
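The raw bandwidth arithmetic supports this. A back-of-the-envelope sketch (uncompressed 24-bit RGB, ignoring blanking and protocol overhead, so a real DisplayPort stream needs somewhat more; the helper function is mine):

```python
def display_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Approximate uncompressed video bandwidth in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

one = display_gbps(3840, 2160, 60)   # a single 4K @ 60Hz stream
print(f"one 4K60: ~{one:.1f} Gbps, two: ~{2 * one:.1f} Gbps")
# → one 4K60: ~11.9 Gbps, two: ~23.9 Gbps
```

Two streams at roughly 24 Gbps fit comfortably inside Thunderbolt 3's 40 Gbps, which is why a TB3 dock can carry dual 4K60 plus power, Ethernet, and audio on one cable.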
Perhaps not (I've never tried daisy chaining) but a MacBook Pro can drive two 4K@60Hz monitors over a single Thunderbolt 3 dock cable (I'm doing it right now).
Strange, they list the resolution you can use if you use one external display. What about two monitors? I don't believe you could only have one display connected, that would be ridiculous, but I also don't understand why they only write about one display and not about many?
Exactly, a Thunderbolt 3 dock. If you're trying to use an (inexpensive) adaptor that takes advantage of USB-C DP alt modes you will run into this problem.
Ironically, if you boot into Windows you can use these cheap USB-C adaptors with no issues.
Not likely - lack of ARM driver builds and all. Though with the Mac Pro, I'm curious how they'll handle hardware upgrades. I hope they don't revert to trash-can-Mac-like video card situations.
I backed a game a while ago on Kickstarter with native support for Windows, MacOS and Linux. Several weeks ago they sent us a message saying they were ditching the MacOS version due to the architecture change.
I wonder how it'll affect the whole ecosystem. I think it'll end being like an iOS on steroids.
I think their hope is that the Mac's anemic game ecosystem will be filled in by iOS games which run natively. I hope they're right, and I say that as someone who has a gaming PC and still hasn't found many quality core games on iOS.
Considering most developers who have ported from x86 to MacOS have had few issues, it seems like the ecosystem will be just fine. Also, all the big pro software developers—Adobe, Microsoft, Apple—are onboard.
Hard to say what this will do to gaming on MacOS, but gaming has never been the big strength of MacOS.
Depends on the engine the game uses. Games built on top of frameworks like Unity, Unreal Engine and the like get platform support basically "for free" as long as the framework in question supports the target platform.
Any in-house developed engine might either depend on 3rd-party support (e.g. libraries like SFML[1]) or need to be recompiled for the new platform. Depending on architectural differences, optimisation and asset loading (in case endianness[2] differs) need to be altered accordingly.
The main reason for a decision to cancel a release would be time and effort vs expected sales. The Mac market will be fragmented into Intel and Apple Silicon from now on and every "Mac version" would really be two separate ports (complete with separate QA and optimisation).
An independent studio might just not have the resources and experience to tackle this and a port that doesn't sell can effectively be worse than not releasing on a platform.
It's made by an indie studio which doesn't have the resources to port a game like this to ARM. They also use a lot of third-party software, so it's not even up to them.
Looks like the Mac mini can no longer be configured with a 10GbE Ethernet port. So, we could get 64GB of RAM and 10GbE in the last generation, and now this.
They still sell the Intel version; it's not gone. Also, I think there will be USB adapters for 10GbE Ethernet, which is not as convenient, for sure, but bearable. And the RAM limit is unfortunate indeed. I wouldn't buy 8GB of RAM in 2020 at all, and 16GB is barely enough, so this device won't last long given the lack of upgradability. Probably better RAM options will come next year.
Well, the current USB-C to 10GbE adapter is a big box (about two decks of playing cards) and costs about $150. I'm a bit unhappy about it. When a community college is starting to buy all-10GbE interfaces, it's about time for Apple to move on.
I'm going to get two Mac minis to look at, and then wait to replace my current machine when 32 or 64 GB ARM machines are available.
Straw poll: I need a new development machine; price is no object, but it has to be a Mac. Should I buy this new M1 or go with the Intel 16-inch MBP?
My gut says there’s going to be a year or two of cross-compilation nightmares. I do a lot of Docker-based development. Wondering what everyone else thinks?
I sometimes wonder what overhead Docker costs on a Mac. You have to run a whole Linux VM, with a guest OS, and then send data back and forth over that, especially with volume mapping. I'm sure it can be quantified against running Linux directly on the same hardware.
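It can be quantified, at least roughly. Here's a sketch of a throughput probe you could run once natively and once inside a container against a bind-mounted directory, then compare the two numbers (the function name and the example `docker run` invocation are mine; actual numbers will vary wildly with hardware and Docker's file-sharing backend):

```python
import os
import tempfile
import time

def write_throughput_mb_s(path: str, total_mb: int = 64, chunk_kb: int = 256) -> float:
    """Write total_mb of zeros in chunk_kb chunks, fsync, and return MB/s."""
    chunk = b"\0" * (chunk_kb * 1024)
    n_chunks = (total_mb * 1024) // chunk_kb
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(n_chunks):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force data to disk, not just the page cache
    elapsed = time.perf_counter() - start
    os.remove(path)
    return total_mb / elapsed

# Run natively, then inside e.g. `docker run -v "$PWD":/mnt ...` writing
# under /mnt, and compare the reported rates to estimate the VM/bind-mount tax.
probe = os.path.join(tempfile.gettempdir(), "io_probe.bin")
print(f"~{write_throughput_mb_s(probe):.0f} MB/s")
```

Historically the bind-mount path (osxfs/gRPC FUSE) is where most of the overhead shows up, which matches the complaints in this thread.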
My machine (MacBook Pro 13-inch, 2018, 2.7 GHz Quad-Core Intel Core i7, 16 GB 2133 MHz LPDDR3) often runs screaming hot (with screaming fans too), when I only have a browser, zoom, docker and IntelliJ running, and I wonder what went wrong.
Yeah, it’s pretty crazy especially when you’ve got a bunch of bind mounts. My current machine is a 2013 MBA and after cumulative 100+ hours of fighting with docker over the years, I finally gave up and started using remote development in VSCode on a Linux server. That was a great decision.
VSC is the real workhorse here. It’s way beyond something like an SSHFS mount. The server actually runs an instance of VSC. It’s really quite good and I don’t even notice I’m editing on another machine. Would recommend trying it.
> I sometimes wonder what the overhead of Docker costs on a Mac.
I wouldn't use Docker on a Mac. The fan goes like crazy the whole time, and the battery life is like an hour or less. Whereas with vagrant and virtualbox the fan stays off and the battery lasts all day.
Go for the 16 inch. I’ve upgraded from my 2015 model (couldn’t stand the butterfly keys) and the wait was worth it! Great keyboard, physical escape key, 16 threads, large screen - great machine. It took them 5 years to get there so don’t get your hopes up for this new one ;-)
For such a workflow, 16GB is not enough. Docker on Mac is a VM, and you need a lot of RAM for disk caching to compensate for slow I/O in the VM, even if 16GB looks like it could do it.
Get the Apple Silicon. If it doesn't meet your expectations, return it and get the Intel one. Returns are trivial with Apple if within 2 weeks (might be longer now with covid).
This is terrible advice. With new launches like this there are always a ton of compatibility issues, and you don't want to spend ages getting the device set up only to realise that some specific software you must use isn't supported, or sit around waiting for updates to support the new system.
Why not just wait a few weeks post release to get a reviewers take on using the new device as a development machine?
To be honest, you should wait a few weeks for all the expected reviews, which will not only run performance comparisons but also highlight what software works and what does not.
> Should I buy this new M1 or go with the Intel 16 inch MBP?
IMO, it depends on how long you're going to keep your computer.
If you're the sort of person who keeps using the same computer for eight or nine years, you don't want to end up on the old chips when all the software has migrated years earlier.
Imagine still running a PowerMac in 2010.
I was ready to buy an M1 model, but the deal breaker for me is the screen. I use a 2011 Air. An 11" screen was fine a decade ago, but my eyes aren't what they used to be, and I don't think 13" is going to cut it. When a 15" or 16" option becomes available, I'm there.
I'd say the 16, but with a caveat. By itself it is a great machine. In clamshell mode with an external 4k screen it is a great machine. With the laptop screen open connected to an external 4k screen it is EXTREMELY noisy and hot.
What difference does having it closed make? I keep mine on a stand, but open, really just for Slack. It does make a hell of a racket when I've got a lot of containers running.
Is the reality of the 4K era that if you want dual monitors and performance, you have to go back to 1080p monitors? Looks like I'll be holding onto my $100 Dell screens for a while yet.
Bummer. I just paid $3200 for a lower end Lenovo P620 (only 12 cores), and similarly spec'd Apple machines were running almost $8k, and benchmarks say the Macs are technically slower. These were desktop workstations, though.
I know it doesn't help your decision, but I'm still excited about the new computer
Can you run docker and virtualisation software on these? I read a comment about Federighi mentioning that there's no EFI or dual boot, which I can't find now, and I assumed there's no virtualisation at all.
I'm interested to see the benchmarks vs. comparable AMD systems. Some of the claims, like 2x performance increase on the MBP are impressive, but intel laptops have been absolutely trounced by AMD 4000-series laptops of late.
Also, I will be interested to see benchmarks of the integrated GPU vs. discrete GPU performance.
> Also, I will be interested to see benchmarks of the integrated GPU vs. discrete GPU performance.
If they're "only" claiming 2x over anyone else's integrated graphics, then it's not going to be all that interesting to compare integrated vs. discrete. For comparison, the 2060 Max-Q is around 4x faster than the Vega 8 in the 4800U.
Although that 2x faster than anyone else's integrated also has some really big question marks on the claim. The fine-print on that claim includes:
"Integrated GPU is defined as a GPU located on a monolithic silicon die along with a CPU and memory controller, behind a unified memory subsystem"
Which sounds a bit weasel-y, like it's trying to specifically exclude what's commonly thought of as integrated graphics, such as the two-die approach in Tiger Lake. How the product is packaged shouldn't really matter, should it?
There's no claim of any 5-6x faster than previous there. It says "fastest integrated" (and in the fine print that integrated is specifically unified monolithic die, eliminating chiplet-style products), and then also "M1 delivers significantly higher graphics performance than the very latest PC laptop chip — for up to 2x the graphics speed."
So take the 4800U, which has among the fastest commercially available integrated graphics in a monolithic die, and double the GPU performance. If you needed a discrete GPU in your laptop previously, this probably won't change that.
Actually, I'm not sure that my number is correct, so I'd better not spread misinformation. It's not clear what those Apple flops really mean, and there are multiple characteristics of modern GPUs that are measured in flops with different values: half precision, single precision, double precision, tensor compute, sparse tensor compute.
For me, battery life has been the biggest disappointment with the 2018 MBP. It used to be that if one paid more for a MBP, one got more of everything. With the last couple gens of Intel MacBooks, one had to choose: speed? battery life? Now their lineup (and pricing) makes sense again.
That does seem weird. Probably Apple doesn't care about developers or rather expects them to wait for the 16 inch version?
Personally, I bought a Windows workstation laptop and have pretty much regretted it most of the time. If you are going for that kind of performance, running VMs, etc., you are probably better off with a desktop. I expect that if you build a system with the new Ryzen 3, you can smoke almost any of the new Mac models for 50%-75% of the cost.
Screws you pretty hard if you actually need the computer to be a mac for some reason.
I wonder, is the 8GB vs. 16GB "unified memory" the kind of thing where all of the hardware actually has 16GB, but they disable half of it to sell the lower price version? Like Tesla's Model S 40 kwh?
Probably binning, where faulty memory gets disabled and the part is flagged as an 8GB chip?
Intel did that for the 486SX processor 30 years ago. The 486 was the first x86 CPU with an integrated FPU. Chips that failed the FPU test were sold as cheaper 486SX with the FPU disabled.
That's not the only way they "bin" things. Some chips have one-time-blow fuses, and manufacturers will disable parts of the chip if they are being "lean" and just manufacturing toward the number of each version they expect to sell. It costs about as much to make one with an FPU as without it, from a chip manufacturing point of view.
I've never heard that the 486SX was a binned 486 - my impression was that it was a deliberate choice done during manufacture. Wikipedia backs up my recollection: https://en.wikipedia.org/wiki/Intel_80486SX
Yeah, seems like “SX is a binned faulty DX” was a popular but unconfirmed rumor that has stuck with me for 30 years.
The die is the same, but there’s no evidence Intel rebranded rejected DX chips. Here’s an extremely pedantic discussion (when it comes to obsolete hardware, the best kind of discussion) on the topic:
As with Pavlov's comment about the memory this could be a yield thing. It may be that there are enough dies where one of the cores fails that it's worth turning them into a lower spec model as happened with the 486 SX all those years ago. More than one core failing might not happen often enough to warrant making them a product line.
Lots of people are talking about frequency binning, and binning in general. I would like to add that AMD used to make 100% identical chips for an entire lineup and then set the frequencies and prices according to the actual binning process: if they tested a batch of chips and got a lot that did 3.8GHz, they sold them all as 3.8; if they did 4.1, they sold them as 4.1. It was quite common to buy the lowest, cheapest AMD chip and get one that could essentially be overclocked to match their highest-binned chip. See the FX-8120, the lowest Bulldozer chip, overclocking to be an identical performer to the much more expensive, higher-binned chips.
I don't see people mentioning it here, but the Pro machine only supports one external display. People have called the sales rep to confirm and this is indeed the case.
Entry-level pro? Devaluing the "pro" moniker? The latter isn't new though, they've had to spend years in R&D to try and reinvent the Mac Pro for example.
Yeah, I guess. I just don't understand how my 2015 retina mbp can support up to 4 displays (including the internal one), but this state-of-the-art-3x-gpu-performance chip can't do more than one.
Watching this event was definitely an interesting look at the top rungs of Apple. Their own corporate self-perception has certainly evolved in the last few years. I think many commentators have noted the way that Apple has tried to take its "decisive-break-with-the-past" marketing of prior years and prior hardware generations and make it something that can keep selling into the long term. Making their own chips seems like yet another salvo in that effort. For anyone interested in questions about the corporate structure of Apple behind-the-scenes, you might like this article: https://theorg.com/insights/the-minds-behind-apples-revoluti...
Not every 13" MacBook Pro has had four Thunderbolt ports; the base model always had just two. This computer clearly replaces that one, and Apple will launch another MacBook Pro in the future to replace the rest.
There was a model, even with the Touch Bar, that only had two ports and one fan. The four-port model got you more cores and two fans - quite a different computer internally, despite sharing the same shape and 13" MacBook Pro label. Very confusing for consumers.
Interestingly, the Air and the Pro now are almost identical in specs: same SoC, same display, same connectors, same RAM, same SSDs.
On the plus side I only see the larger battery for the Pro.
On the minus side, it's heavier and a bit more bulky. And it's got the Touch Bar (I'd count that as a negative...).
Honestly, why spend the $300 extra? I'd take the Air any day.
They have Rosetta 2 (translates Intel apps to Arm) and said the integrated graphics are so much faster than Intel's you can expect faster performance in games than before.
Url changed from https://www.apple.com/mac/ to the press release with more info. If there's a more accurate and neutral third-party article, let me know and we can change it again.
Have they said anything about hyperkit on the new platform? And being able to run stuff like Docker for Mac?
I've seen lots said about the lack of Bootcamp support but I've actually never even used Bootcamp, Linux VMS are much more relevant for me and I imagine many other developers.
From their roadmap, it does not appear they have really investigated this yet. [1]
Rosetta 2 translates Mac apps and can deal with JIT, but I do not believe it will work through virtualization.
It should be relatively light lifting to make Docker for Mac run an ARM linux installation and ARM-built containers.
From there, you likely would want to strategically add multi-arch container support to the docker runtime. This might be implemented based on existing qemu containers [2]
During WWDC, they explicitly said that Apple Silicon will support virtualization and Docker. Rosetta doesn't work for virtualization though, so you have to run an ARM version of Linux. They don't support booting directly to a non-Mac OS, AFAIK.
Rumor has it that they'll eventually support virtualized Windows, but it's currently impossible for an end-user to buy a copy of ARM Windows.
Well if I get to choose then I might as well ask for both.
But ultimately I don't really care. I just want to `docker run postgres:latest` to develop web and mobile applications against, and build a Dockerfile to be able to run a server-side Ruby, Python, Java, Go, etc. app locally for development.
Anyone else wonder whether these new Macs will still allow installing software from outside the App Store? I got worried when they were talking about the Secure Enclave.
They returned to the design elements introduced for the iPhone 4 in the iPhone 12. That general form would be pretty interesting for an ultra portable laptop.
From a marketing standpoint it doesn't make sense to update the design (this is typical Apple from what I've observed) - People will buy into the novelty of the new CPUs. For the refresh, you will probably see some design changes.
How would you improve on the Air? It's such a beautiful, sleek little thing. I've currently got a 13" Pro and it just feels so damn clunky next to the succession of Airs it replaced.
I'm pretty sure they're all identical chips. No doubt there are clocking differences. But that may be automatic based on thermals and not actually represent a difference in the actual chip.
This is going to be awkward and then it'll eventually become pretty good. From Apple's perspective, they're making the right move to their own CPU design. Apple has always been about controlling the user experience and by bringing it all in-house it takes things that were out of their control (Intel) and allows them to increase their vertical integration, unifying their product lines, and increasing market differentiation.
It's been clear for a while that ARM processors have hit near desktop performance and it took the megacompany of Apple to push it over the edge.
I'm a dedicated non-Mac user at home, but at work I always chose the MBP option over the craptastic Dell PoS that'll start turning into broken plastic pieces within a year or the latest Microsoft offering which seems to suffer from endless software and driver struggles.
The software story will take a while to work out, but Apple has made this transition TWO times before, which is astonishing. Nobody else has done this successfully. Part of the reason is there is simply less of a need among Mac users to maintain backwards compatibility and simply far less software to carry forward. Even then, it appears that Apple has done some kind of magic to allow older software to work on the new architecture without too much fuss -- I'm awaiting some benchmarks!
(This will hopefully push more of the industry to look at desktop ARM systems)
Lots of comments from people complaining about the lack of 32GB of RAM. Can someone explain to me the need for 32GB of RAM as a developer? Is it the IDE or your particular application area? As for me, I do all my professional development (C++17/Linux, Vim, EDA industry) on a five year old T450s with 12GB of RAM. For me the limiting factor is CPU power for sure. I only run out of RAM running some of the larger customer tests for which I have to go to a beefy machine in our network, but that happens rarely.
I can easily make good use of 32 GB doing normal full stack development: Browser, Visual Studio Code, a few Docker containers including PostgreSQL, Slack.
Actually this should consume a dozen GB of RAM, more or less; the rest goes to caches, which I think you're undervaluing a lot. Having everything you use in hot caches means any application starts in a couple hundred milliseconds, and everything is as fast as your RAM and CPU, especially if you're running an OS with a good implementation of file caching (e.g. Linux).
Having more leeway means I never have to close anything if I forget to. I might have my dev environment chugging along in a desktop workspace, while I'm in another with a few dozen browser tabs, talking on Discord with a full screen game running on the other monitor.
That's what a £150 module of 32 GB of RAM gives me. But actually I'm running 64 GB :-)
Both SolidWorks and Altium are such RAM hogs that I’ve had to upgrade to 64GB.
It would be lovely to work on laptops, since that's the default assumption for most managers. So I either stick out as a weirdo for asking for a desktop every time I switch jobs, or shut up, spend most of my day dealing with slow laptops and random crashes, and make design choices around my computer's (unnecessary) limitations.
Chrome (~1-4GB), Slack with quite a few orgs (~2GB), Discord, again with quite a few orgs (~1GB), Docker for local dev work (~1-8GB), gopls (Go's language server) (~1GB per open VS Code window).
Running out of RAM is not my main problem; continuous swapping is, since for some reason it drags down performance across the whole system.
As with many things, it's not a NEED, it's more that having to care about what I have open at any given moment gets annoying.
I know Android Studio will basically eat up however much RAM is on your system. Also, with containers becoming more common, lots of people use something like Docker Compose locally to mock multiple services or parts of a system, which, depending on how well optimized those containers are, can add up quickly.
Had a T470s (24GB), now on a P1 Gen 2 (32GB). Even 24GB can feel limited when running Kubernetes, a number of Docker images, Oracle RDBMS, Kafka, a Java development environment, a browser, an office suite...
4 GB for Chrome tabs, 6 GB for your Docker VM, 2 GB or whatever Slack requires, 2 GB because your music is now streamed and the app is just a few HTML pages rendered through Chrome. Plus a couple of GB for your IntelliJ IDE.
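Those figures add up fast. As a quick sanity check, here's the tally in Python (the per-app numbers are the rough ballpark quoted above, not measurements):

```python
# Rough per-app RAM figures (GB) -- ballpark guesses, not measured values.
usage_gb = {
    "Chrome tabs": 4,
    "Docker VM": 6,
    "Slack": 2,
    "streamed-music app (Electron)": 2,
    "IntelliJ IDE": 2,
}

total_gb = sum(usage_gb.values())
print(f"{total_gb} GB")  # → 16 GB: the entire max spec, before the OS itself
```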
If you are writing C++ in Vim for an embedded platform, you are limited by CPU power because C++ is so heavy to compile, but you are also most likely not the typical HNer.
I asked the same question. I get why people need more than 16GB of RAM, but honestly a lot of the answers seem to neglect that RAM (in most laptops) is not going to be your limiter, the CPU and/or GPU will be the limiter first for many of the cases people list. At that point, no amount of RAM will save you (though it may help).
Any applications with image processing are usually bottlenecked by the available RAM on a system. If you're running any ML applications, it's most efficient to run models on batches of images and batch size is determined by RAM.
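That batch-size arithmetic can be sketched in a few lines. Everything here is illustrative: the overhead factor, free-RAM figure, and image dimensions are made-up numbers, not from any real pipeline:

```python
def max_batch_size(free_ram_bytes, bytes_per_image, overhead_factor=2.0):
    """Largest batch that fits in RAM, assuming the pipeline holds roughly
    `overhead_factor` transient copies of each image in memory at once
    (decode buffers, tensors in flight, ...)."""
    return int(free_ram_bytes // (bytes_per_image * overhead_factor))

# Example: ~12 GB free, 4K RGB float32 frames (~95 MiB each).
img_bytes = 3840 * 2160 * 3 * 4
print(max_batch_size(12 * 2**30, img_bytes))  # → 64
```

More RAM moves that ceiling up linearly, which is why batch-heavy workloads feel memory-bound long before they're CPU-bound.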
I can see people needing more RAM, but I think people are forgetting that these are just the first few models Apple is bringing to the market. If Apple truly believed that no one needs more, they could have dropped more of the Intel models.
Ehm. What is the actual difference between the MBA M1 and the MBP 13 M1 (except that the latter is thicker)? Checked the comparison and didn't find anything substantial...
The MBP gets all 8 GPU cores, the MBA gets 7 (though you can upgrade the MBA to get all 8 for effectively $50, bundled with a $200 storage upgrade). Touch Bar and bigger battery. I think those are the only differences.
That would seem to imply whatever quality it has resembles a studio microphone. It doesn't say 'Quality Studio Microphone' after all.
It reminds me of my recent purchase of an electric shaver. The front of the box proudly proclaimed 'WAHL: The Brand Professionals Use.'
It wasn't until I got home and started opening it that I saw a little bit of fine print on the side of the box: "not to be used in a professional capacity."
Yeah, that's what I was just looking at. Up to 2 more hours of battery life, TouchBar, and a [slightly?] better microphone is the only difference I can see. Not much price difference either.
Optimistically: I suspect the reason they didn't upgrade the 16" pro is because of the RAM story. In other words: no 16" w/ the M1 is an acknowledgment that configurations with 32gb (and hopefully 64gb) of RAM are not ready, and it would be unreasonable to release the highest end MBP without those options available.
Hopefully we'll see the 13" and 16" MBPs with M1+ and more RAM next Spring.
> M1 delivers up to 3.5x faster CPU, up to 6x faster GPU, up to 15x faster machine learning (ML) capabilities, and battery life up to 2x longer than before
If they can release ARM machines that are better to such a degree, how terrible is Apple software on Intel? Is this down to Apple's software engineers? Is there any reason to use Intel for anything?
These are serious questions. I find the multipliers very surprising.
The peak of the mac mini was the 2012 Mac Mini Server where you could add an extra HDD and had easy access to the memory. That was the Apple I respected.
I'm not sure I understand why the current Mac Minis needed to be a unibody design, since they just sit on your desk.
While I am excited for the M1 chip and the future of ARM, I'm absolutely disgusted by the price gouging in the name of memory and HDD space.
Something I haven’t seen commented on (it could certainly be here, just buried): all these models, in their previous Intel incarnation, only had integrated GPUs. It seems to me that at least one reason for why these particular models were released today is that Apple isn’t quite ready to release models with a discrete GPU (on ARM).
Many things are unclear. There's so much confusion right now.
- Will they ship a more advanced spec for the upper line? Something like an M1X, replacing the need for a third-party discrete GPU?
- Will they market these devices like they do the iPhone: the same SoC for every generation, but with different sizes/extras (MacBook Air, Pro, Mac Mini, just like you choose between an iPhone Mini, Pro, Max)?
- How long will Intel-based machines still be around? What will they do with the Mac Pro?
- How smooth will the transition be for the OS and software (I expect it will take more than the two years planned), and how much will things change with this new ARM ecosystem?
As with any first-generation device from Apple, plus this vast, unknown land, I'd stay away from these machines and wait for at least the next iteration (maybe two).
Performance gains are probably welcome aside from the ram limitation, though I was at 16gb anyway. For those mentioning USB ports on the pro, it seems like 4 are available on the highest spec version. I'm also excited for Big Sur, let's hope it isn't a catastrophe like Catalina was.
Otherwise I'm disappointed that there wasn't a physical design iteration. I was really hoping for something: a MicroLED screen, a faster refresh rate, different unibody colours, smaller bezels, a more durable anti-reflective coating, more dent resistance. But nothing, AFAICT.
I'll be honest, I was really hoping they'd pull a DeX-like thing out of their pocket at the end of that presentation. "Oh, those iPhones you all have? Especially the new ones with MagSafe chargers? Yeah, plug it into/set it on this lapdock/stationary dock and it's running macOS."
Yeah, I know they'd never do it. It's not their style. But they're in the best position to do that well out of any of the major phone manufacturers.
In the meantime I'll be waiting for some of these Linux phones to mature a little...
This is galling. I bought a new top-spec Air earlier in the year to replace the one I'd owned since 2013. I hate it. The fan comes on about every 30 seconds (doing things on the new machine that were fine on the 2013 one). Literally no one at Apple seems to care, and I'm not the only one complaining.
It's taken them less than 6 months to refresh the entire line? What an absolute kick in the teeth - especially on the back of a machine I really dislike. Super shitty move from Apple.
I'm sorry you got a lemon but what does that have to do with how often Apple releases new machines? How often is Apple allowed to refresh their laptop lines for it not to be 'galling'?
The performance improvements here are astounding. I can’t even remember when we last saw a 2x improvement in CPU perf, much less 3-5x and with better battery life to boot!
The Mac mini can be a superior 'Apple TV' replacement for a TV. It fits nicely, can support USB controllers for emulating SNES/MAME/etc., supports AirDrop, etc.
When I was an active software developer, I always had the biggest and beefiest machine possible, plus a MBP for when I was on the move.
Now, off in another career field, I'm really wondering why I would choose a MBP over an iPad Air or Pro. By the time you add a keyboard to the iPad, it kinda feels like the only real difference is whether you want to run Xcode. Other than that, is there really anything you can do on a MBP that you can't do on an iPad?
> is there really anything you can do on a MBP that you can't do on an iPad?
If the App Store gatekeepers get their way then "run a shell".
(I've recently got my first iPad - my first iOS device in a very long time - and it's actually less locked down than I was expecting. I had no idea they had relaxed the rules on scripting on the device. But it's still way, way more locked down than a real computer)
Huh, the Macbook Pro only has the same two thunderbolt/USB ports as the Air? (One of which will be in use by a power adapter if you are plugged in, so you really only have one port).
I was imagining upgrading to the Pro to get more USB ports, but if that's not an option... I can't see why I'd ever pay for a Pro.
I like having more than one USB port so I can copy things from one external HD to another. I guess I'd need a dock for that.
You know what, my mistake actually, but the tone seems unnecessarily derisive. I was looking at the 13" and scrolled all the way down to see 4 thunderbolt ports, but that's on the "2.0GHz Intel Core i5 Quad-Core Processor with Intel Iris Plus Graphics", so not the M1 processor. I guess that shouldn't be too surprising that they're still selling it, but I'll return to being peeved about the regression to 2 ports.
I know that this transition will benefit the real Apple users, managers, influencers, and cool kids. Obviously I will keep and refurbish my 2013 MacBook Pro for Linux, run offline Windows for design (thanks to Affinity), and have fun with Logic on some Apple device/computer. Future computing will be fragmented...
I guess they will also never bring back normal USB-3.1 ports.
Guess I will have to keep my $5000 camera plugged into my Ryzen-based PC, it has 6x USB-3.1 ports. Oh and my studio microphone is also "old" usb. And my keyboard/mouse. And my phone cable. And my expensive Corsair flashdrive. And my Seagate external hdd. And my memory card reader (I have 7 memory cards for different cameras). I can go on.
I'm not going to daisy-chain stuff. I'm not going to buy a dock (just another thing on my desk that takes up another power socket, with more LEDs burning nonstop). I'm not going to buy dongles to convert from one spec to another.
And no, I'm not going to convert everything to wireless.
I have 1Gbps fibre where I live, with an Asus router/modem sitting on top of my PC, hooked up with 1Gbps Ethernet cable. The 5GHz WiFi is unstable if you don't sit in clear line of sight. I have a USB-3-to-Ethernet dongle for my laptops, but the fastest one achieves 800Mbps and eats about 10% CPU when downloading something. My ~2016 MacBook Air gets 600Mbps with the same dongle.
Apple could've fixed their port/dongle/dock mess this time around but chose not to.
Is the Mac Pro worth the cost? I want to keep 300+ browser tabs open, 6+ Chrome user sessions, and many other programs running. My PC usually gets sluggish after a day of this. I don't do graphic design or gaming; rather, I need to keep a lot of tabs and sessions open in Chrome and, to a lesser extent, Firefox.
I think you have to upgrade your laptop often or you won't keep up with the increasing requirements of software. I have a similar flow as you do (tons of apps, tabs, Linux workspaces, code compiling, video running etc). I also get easily annoyed when I do some basic stuff and all of a sudden the computer lags behind, is stuck etc.
In the end I went with a desktop computer (if you have the space and you don't mind losing portability - I also have a Dell XPS in pretty good shape for when I need it). It does waaay better when it comes to multitasking, never gets sluggish, is much easier to upgrade, has no throttling/overheating problems, and is cheaper.
I bought the 13-inch 2017 MacBook Pro and I regret it forever due to its screen issue. Apple currently asks around $800 for the repair on an ~$1100 laptop. They placed the T-CON board near the heatsink, and congratulations: if your CPU gets hot, your screen can suddenly stop working.
So please check how much a machine costs to repair before buying :/
Am I the only person here disappointed that they did not announce a new MacBook? MacBook is my favorite form factor: OK for a slow typist like me, big enough screen when I have a large USB-C monitor on my desk, and I especially like the light weight and small size.
What also stood out for me is how few big names they had for companies moving to Apple Silicon. I really did not recognize most of the companies they did highlight.
I had been hoping for a web page dedicated to showing all companies on board with products coming or expected in the next year
Apple has been getting trashed lately by developers for how hot and loud their MacBooks get. You can tell how much focus they put on cooling in this presentation. The MacBook Air literally doesn't even have a fan!
No; I have one. (Perhaps you're thinking of the 12-inch Macbook, which was fanless, and which they stopped manufacturing a year ago.) And I go to extremes to keep the fan from spinning up: using Turbo Boost Switcher, and/or running a program that repeatedly does the equivalent of "kill -STOP" and "kill -CONT" to a process, hundreds of times per second, to force it to use less CPU.
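For the curious, that pause/resume trick can be sketched in a few lines of Python on POSIX systems. This is a minimal illustration, not the actual tool the commenter runs; the duty-cycle timings are arbitrary, and dedicated utilities do this more carefully:

```python
import os
import signal
import subprocess
import time

def duty_cycle(pid, run_ms=5, pause_ms=5, duration_s=0.5):
    """Throttle a process by rapidly pausing/resuming it with
    SIGSTOP/SIGCONT; a 50% duty cycle roughly halves its CPU use."""
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        os.kill(pid, signal.SIGSTOP)   # freeze the process
        time.sleep(pause_ms / 1000)
        os.kill(pid, signal.SIGCONT)   # thaw it again
        time.sleep(run_ms / 1000)

# Throttle a busy-looping child for half a second, then clean up.
child = subprocess.Popen(["python3", "-c", "while True: pass"])
try:
    duty_cycle(child.pid)
finally:
    child.terminate()
    child.wait()
```

The process never notices it was stopped (no signal handler runs for SIGSTOP), which is why this works even on programs you can't modify.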
I've said this in another thread and thought they might use FaceID as a "Pro" only feature like they do on the iPads but nope, they also released an updated 13" MacBook Pro without FaceID.
As much as Apple touts features as "incredible innovations", I think a lot of them are just engineering solutions.
I see FaceID as the compromise of increasing screen size to fill all the real estate on the iPhone. The Macbooks have tons of room for extra sensors on the keyboard part but not a lot can be put on the screen.
By the way, does anyone have info / advice on whether to stay away from the very first release of brand new hardware from a buggy point of view? Or manufacturing kinks still being worked out?
Maybe I'll have to eat my words in a few weeks but I can't say I'm super concerned. M1 seems a lot like the A14, which is a lot like the A13, which is a lot like the A12, etc...
I'm more concerned about how long it will take for most apps to have full support for both macOS Catalina and Apple Silicon. When the 2012 retina MacBooks came out, quite a few apps were awkwardly pixelated for a long, long time.
When I was a kid I worked with my father building custom cars. We customized a car for the guy who owned a famous brand of liquor. He had a vanity license plate on one of his cars that said something like "3 Stars" on it, and I asked him what that stood for. He said it was a less expensive brand of the liquor he sold, and then he went on to say "it's the same as the expensive stuff, and if you tell anyone that they'll call you a liar, but take it from me, it's the same stuff". And he was right. I did tell people, and those who purchased his expensive brand flat out insisted I was wrong.
Apple can charge more because they always have and people will buy it because they believe it's better.
From my experience, the touchpad and OS experience is decidedly better. This is a subjective judgement, but anecdotally I've heard it from many people.
Well, we're talking about hardware here, or at least I was.
I love my late `09 Mac Mini. Best Mac I've owned yet, but for the same amount we'll pay for a new one we could probably build our own Linux box with much better specs.
Yes, people believe it’s better. The liquor the man in your story sold (unless I’ve read this wrong) was literally the same as the expensive stuff.
Apple’s always been in the experience game. And they’ve charged a premium for the experience, even though underneath the pieces were the same. Some people hate it and some people love it.
This is another change to the experience. I think though that we’re finally beyond (excuse the expression) “Apples to Apples” comparison though. This hardware is literally not the same stuff - unless you’re down to you talking about NAND gates.
Well, I've bought new Apple computers for over 20 years now, and for the first 10 I had to replace them every couple of years because I couldn't run newer software on them - they were pretty much junk by the end of those few years anyway.
The late '09 Mac Mini I'm still using has been a real workhorse though. I upgraded the RAM and hard drive about 5 years ago and it's still very capable for what I do. To be fair, the DVD drive died after just a couple of years, though I didn't really use it much.
But I don't use Photoshop or almost any Apple software anymore (outside of the OS). I write code and I don't need much horsepower for that.
I hope these new Macs will respect the openness we've had on the PC platform for the past 30 years and allow you to run any operating system you like without being beholden to the manufacturer (like they do on the iPhone). I'm really worried that Apple is gaining too much power over the platform and they are going to make it difficult (or near impossible) to boot other operating systems (e.g. Linux)
iOS and MacOS are clearly starting to converge here, this current gen of Macs are even able to run iOS apps. I wonder if we'll ever see macs with touchscreens, or if iPads will become macs before macs become iPads
This M1 chip is really interesting, but they seem to be hard limited in RAM to 16GB. That will surely limit the interest there is in the initial batch amongst the HN crowd. It makes sense maybe for the MacBook Air, but not that much for the Mac Mini or MacBook Pro 13".
I read somewhere about 15 years ago that the term was introduced when laptops would get so hot that the legal departments at laptop vendors started to worry about their liability if they kept calling them laptops and they got sued by someone whose lap got burned.
Now that Apple's laptops run much cooler, my guess is that they continue to say "notebook" because Apple owns an important trademark that ends in "Book".
This is a bugbear of mine. I've never met a real human being who uses that term. I don't even know if people would understand me if I started using it in normal conversation.
They are laptops outside of industry marketing material. Why can't they just admit they lost this rebranding battle?
An iPad with a good keyboard and an operating system that allows the user to do more things would be nice. Like running Linux for development, having a web browser that isn't Safari underneath, having Visual Studio Code...
My first PC which had 8GB of RAM was built in ~2010 if I remember correctly. My current machine has 32GB. In all honesty, 32GB is not even much, considering my motherboard can take 128GB (in the old days, if your motherboard could take max 8GB RAM, people would fill all the slots to capacity). Also, I have a Lenovo machine that I paid $200 for that has 8GB RAM.
WHY any power user would buy a MacBook is beyond me. They've become devices that only my mother would use, not to get work done.
Apple should've made the baseline 32GB with 128GB as the max spec. It would've forced the whole industry to give us more RAM. Instead, RAM will stay stagnant for the next two years: all the other manufacturers that copy Apple will keep shipping 8GB without blinking.