Buy a used Intel Mac today and it will give you several more years of service while the ecosystem handles the ARM transition. You should never buy a first-gen Apple product, especially the first gen of such a radical change. I expect buyers of the first ARM-based Macs to end up like the buyers of the first Intel-based Macs who were saddled with Core Solo processors and 32-bit EFI.
> I expect buyers of the first ARM-based Macs to end up like the buyers of the first Intel-based Macs who were saddled with Core Solo processors and 32-bit EFI.
That’s not going to be the case this time around.
Apple has obviously been planning this transition for several years; it has been designing 64-bit ARM-based SoCs since at least 2013, over seven years ago.
Apple's single-threaded speed surpassed Intel's years ago; it's probably been a matter of getting the supporting silicon up to snuff.
I wouldn't be surprised if what we see this fall will be the 2nd or 3rd iteration of ARM-based Macs Apple has designed internally but the first to be revealed to the public.
I expect these first Apple Silicon Macs will be ridiculously fast, particularly on a performance-per-watt basis, and perhaps even compared to similarly priced Intel hardware from brand-name PC manufacturers.
Apple knows they only have one chance to make a first impression. I expect they will put a stake in the ground with these first consumer-oriented Macs, with the promise of what's to come for pro users over the course of the two-year transition of the entire product line.
You could have said all the same things about the Intel transition.
Apple obviously wasn't designing Intel chips; they had no choice but to take whatever Intel had. Sure, Intel's chips were better than the PowerPC processors from Motorola and IBM, but it was Intel's roadmap that sealed the deal.
This time, Apple doesn't have to settle for the trade-offs it was forced to swallow from Intel back in the day.
The two-year-old processor in the Developer Transition Kit already has very impressive performance; the first ARM-based Macs should be even more impressive with the latest and greatest chips: https://gizmodo.com/a-wild-apple-arm-benchmark-appears-18442....
I owned that machine - the original MacBook Pro from 2006, with a Core Duo - and it lasted five years and could have gone longer. I only replaced it after the aluminum bezel succumbed to mechanical fatigue. Of all the Macs I've ever owned, it was the most reliable and long-lived. The 32-bit ISA was never a problem in practice.
If you had held on to that machine for a few more years it would have been dropped by OS X Yosemite, but then, hilariously, it could have stayed supported by essentially turning it into a Hackintosh!
DosDude's patches are the bomb! I've upgraded my 2009 MBP all the way up to Mojave (I could have gone to Catalina, but I did the upgrade after Catalina came out and decided to stay on Mojave).
The only real issues were that the webcam no longer works (bummer) and that somewhere down the road Homebrew stopped shipping binary releases for the Penryn architecture, so I had to hack it to force it to compile everything from source.
Either way, those first machines were supported way longer than expected. My 2006 MacBook Pro lasted me up until OS X Lion (10.7) and I still use it for BOINC.
I owned that exact machine and it was terribly slow, especially running Rosetta. Luckily the CPU was socketed and you could drop in a Core 2 Duo mobile CPU (up to a T7600) for a big upgrade!
Don't forget your putty knives! The Core 2 Duo upgrade, along with maxed-out RAM, was the only reason I was able to keep it alive as long as I did. Never switched to an SSD; they were just too expensive.
But this chip will be the 14th iteration of the CPU and is already quite mature. The Apple GPU is likewise several years past its first release, and the chip is already 64-bit. The main new things are the ARM build of the OS and the Rosetta layer, and those can be updated with patch releases.
I'm totally confident in their ability to pull off the ARM transition, but I think the fit-and-finish will be way better in the second- and third-gen products.
For another example, many people had first-gen MacBooks that would "moo" as the fan cycled on and off across a single-degree thermal boundary: https://www.youtube.com/watch?v=vEPzlIkBnGs
edit: Almost forgot the build-quality issues with the first-gen MacBook Pros, where the official service manual advised you to use an entire tube of thermal compound each for the CPU and GPU, which obviously led to overheating: https://forums.somethingawful.com/showthread.php?s=&threadid... (images all missing)
… which they helped design and were the first significant user of, and which Intel donated to the USB-IF royalty-free to encourage adoption. Apple announced that they were going to support it and they’re not the kind of company which would do that without having control over their product direction.
That's a bummer though; I was actually hoping they would be the first consumer brand to go all-in on USB4, similarly to how they were the first to ditch the floppy drive, and on numerous other occasions.
Thunderbolt was a joint effort between Apple and Intel, in the works years before it came out under the name "Light Peak." The first Intel demo was on a Mac, and Apple was the first to announce the Thunderbolt commercial name.
> FUD. First Intel Macbook Pro, first iPhone, first iPad, first Apple Watch... these were all excellent devices.
The 2nd gen were all extreme improvements though
> Also, I believe you are misinformed about the types of processors available in the first Intel macs.
As far as I remember, the initial Intel Macs were a Mac mini (Core Solo) and MacBooks (Core Duo).
32-bit EFI was a thing and made life difficult with a Mac Pro. At some point macOS dropped support for 32-bit EFI, and OS upgrades had to be installed with patched installers.
> The 2nd gen were all extreme improvements though
Welcome to the state of the art in the mid-2000s. That's what Intel used to offer, and why it was worth it for Apple to transition from PowerPC to Intel. But Intel hasn't improved much in years. Yes, Apple will probably release significantly better processors in ARM Macs in the upcoming years. But they likely would not be able to release Macs with significantly better Intel processors if they stuck with Intel.
The iPad 3 was pretty terrible very quickly as well - I think it was the slowest iPad they ever made, mainly because a Retina screen was hastily bolted onto hardware (GPU and RAM) that simply couldn't drive that many pixels.
The first-gen iPhone was unusable for anything other than email. The pitch was that you get "the real web," and compared to WAP on feature phones, that was true, but honestly, the mobile web was years away from being ready enough to be useful. Internet functionality was basically unusable on 2.5G; honestly, it wasn't even ready until 4G. I remember trying to use Google Maps on an iPhone a week after launch. It took minutes to load the map of where I was. This was in SF. The App Store hadn't launched yet.
The first iPad was a dead end. The performance was abysmal, since it was basically a 3GS with a bigger screen; it didn't even have extra RAM to handle the larger graphical assets. Software stopped supporting it very quickly because it was impossible to get anything running under the RAM constraints. The iPad 2, which doubled the RAM, had a much, much longer lifespan.
The first Intel Macs were the MacBook Pro and iMac, which had Core Duo processors. Looking on the internet, the first Intel mini came a month later, so I think it's fair to call it one of the first Intel Macs, but it had a Core Duo option as well (check lowendmac or mactracker), so I guess you chose the Core Solo, but you can't say you were "saddled" with it.
Anyway, that Core Solo mini was also a great little computer. I believe that's the one I bought refurb, added RAM and a Core 2 Duo, upgraded the Wi-Fi to N, and used for years as a media head and server.
The first iPhone launched at $599, was quickly slashed to $399, and lived a year before the App Store launched. If you were an early adopter, you were a sucker, period.
I don't expect that, but you're right to say go for a second-hand Intel if you're planning on using Docker for a lot of stuff. Right now, while Docker can work with ARM and there will most likely be an ARM option when things launch, most Docker images require Intel hardware. I run Docker on a Pi 4 and on an Intel machine. On Intel I can just install things like WordPress and MySQL without a total train wreck; finding Docker images that work well on ARM right now is painful. Maybe some of this will be fixed by the time the ARM Macs launch, in which case the Raspberry Pi 4 will also benefit from the growth of Docker support for ARM.
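If it helps anyone in the same spot, here's a minimal sketch of how you can check for an arm64 variant up front instead of finding out at run time. It assumes the Docker SDK for Python (the `docker` package) and that the registry publishes a manifest list for the image; the image names here are just examples.

    # Rough sketch: check whether an image publishes an arm64 build before pulling.
    # Assumes the Docker SDK for Python ("docker" package); image names are examples.
    import docker

    client = docker.from_env()

    for name in ["wordpress:latest", "mysql:8.0"]:
        # Query the registry's manifest data without pulling anything
        data = client.images.get_registry_data(name)
        if data.has_platform("linux/arm64"):
            # Pull the ARM variant explicitly rather than silently getting amd64
            client.images.pull(name, platform="linux/arm64")
            print(f"{name}: arm64 image available and pulled")
        else:
            print(f"{name}: no arm64 build published; needs emulation or a rebuild")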
The first retina iPad (iPad 3) was a great example of this. In hindsight, most people agree that it was rushed out without a powerful enough chipset to handle the higher res screen and so it aged quite badly. It was replaced by the iPad 4 only 6 months later.
I owned a Core Solo Mac Mini and an iPad 3, so maybe that explains my mindset here :p
That iPad is still kicking around for random uses like HLS video testing, luckily on 9.0.2 with the (to this day final) untethered Pangu jailbreak. Part of me wishes I never upgraded it past 8.4.1 though.
My dad was using the one I gave him until it was destroyed in a fire last year. He was still fine with it except that the Amazon app didn’t work any more.
I still have my iPad 2, and it is "technically" a useful tablet. But it is worthless with Safari on modern (bloated) sites and doesn't show Netflix in HD, my two main use cases. No security updates either. I should recycle it but haven't been able to do it.
I tried to set up my iPad 3(?) a couple of weeks ago to control Apple Music on my stereo. It was just too slow and the software was way out of rev. So, yeah, I tossed it back on my "recharging table" but it's pretty much useless.
I'm pretty keen on buying one of the last Intel MBPs late this year or next year. First-generation Apple products are always buggier than the subsequent versions, so even as a consumer I'd want to wait for second gen. More importantly, as a developer, I'm still deploying everything to Linux on Intel, so I value the ability to virtualize Linux without emulation.
That said, I'm excited about the potential for ARM over time with Apple's performance trajectory, and if Apple holds onto developer mindshare and doesn't totally nerf Mac OS in the name of iOS parity, then there could be a strong halo effect pushing ARM servers and cloud hosting forward. I just don't want to be on that bleeding edge of architecture transition pain.
I wasn't the one that downvoted you but to give a typical reason why some programmers don't run Linux on the laptop/desktop even though they deploy to Linux: they need to run critical software that only runs on Windows or macOS. (My previous comments with examples of this:
https://news.ycombinator.com/item?id=20312066, https://news.ycombinator.com/item?id=20312336)
I also write software for Linux but it's backend processing utilities deployed to Linux servers and not client GUI software with GTK/Qt meant for Ubuntu desktops. In that situation, it's more reliable and less hassle for me to run macOS or Windows as the 1st-class desktop and then run Linux & gcc in a vm .... rather than the reverse of running Ubuntu desktop and trying to run Windows in Wine/vm and then finding out something is incompatible.
I completely agree with this sentiment. I remember when I was a TA for an introductory graduate medical informatics class, and a couple of the students insisted on trying to make a "PowerPoint" in LibreOffice (presumably because they had Linux installed on their laptops). I specifically told them that I would be grading a non-insignificant portion on their communication and presentation abilities. But they kept insisting on using LibreOffice. The content of what they were presenting was so utterly undermined by the poor formatting that I had to specifically mention it in my grading. Frankly, I have never seen an aesthetically pleasing presentation made in LibreOffice, and it runs into so many conflicts and so much broken formatting when migrating to PowerPoint.
The reality is that if you're going to be interfacing with non-technical people (managers, customers, exec level), they all expect a certain amount of polish in presentations. I'm willing to dunk on MBA types that drop buzzwords all day, but sometimes you need to be flexible about what tools you use for your job.
That's funny, because we have an organizational policy not to disseminate any materials in ppt/pptx because, as you point out, presentation matters, and you absolutely cannot rely on consistency of appearance with PowerPoint.
The heart of it is that PowerPoint is a product, not a standard.
Because we can't assume everyone is a Microsoft house, we have to support presentations produced in open standard formats and proprietary ones alike. This is why we invest so heavily into our AV and conference spaces - we don't want to make fools of ourselves and our guests by neglecting simple and costless accommodations.
Yes, you can create crappy presentations in PowerPoint, but when you open those crappy presentations they still look exactly how they did when they were created. That's not always the case with LibreOffice when one of its files is opened in Microsoft Office.
When I was in university I had instructors who demanded documents be created in MS Office (which I did not own) and I did those assignments in the library where MS Office is installed on every machine.
Basic documents work just fine between LibreOffice and MS Office but the more advanced you get the worse the compatibility is. In your view, how much responsibility does the author have for broken formatting versus the reader? Basically, if I (as a student) submit a document with broken formatting, how do you as an instructor make the determination that it's because of a software error instead of a lazy student? And after you make that determination, how much time do you as an instructor put into fixing the student's broken formatting versus rejecting the document and moving on?
Things have certainly also changed with the move to subscription models, like O365. Students can now reliably produce MS Office compatible documents in web browsers on any internet-connected computer or mobile device.
This is a big shift from just a few years ago where significantly fewer students had the hard drive space, internet, and know-how to download and manage multiple licensed versions or bitnesses of MS Office products. Since professors might require older versions of software that are incompatible with other courses at the school (or simply not be aware of the newer version available to them), there's not much students can do except hope the professor will listen to reason or not grade too harshly on something they might be getting entirely wrong.
Poor professoring (or TAing) is a good way to get a chair or dean involved, so it's rare to have things escalate too far, but it's still a big source of stress for students who might hold certain parts of the system in higher esteem than they rightfully deserve.
Some of us are not masochists at heart, and value the ability to switch between a GUI for our desktop things and a virtualized Linux image for development/testing.
Of course you should use whatever works best for you, but I disagree with your characterization of Linux users as masochists at heart.
I’m very much a person who values technology that “just works”. While my latest Ubuntu installation admittedly wasn’t quite as smooth as running macOS on Apple hardware, the gap has narrowed considerably. Especially for anyone who already possesses a fair amount of Linux experience from working with servers for work, and who sticks with known-compatible hardware, running a Linux desktop can be a delight.
Linux on a desktop can be fine, depending on what you want out of that desktop experience. But Linux on a laptop still tends to be Sisyphean: by the time you get all the hardware features working correctly and with proper power management and with all the firmware bugs worked around, at least one of the hardware or the distro will be years out of date. The only way to be satisfied is to decide you didn't need that feature anyways, or didn't need that last hour of battery life.
Out of curiosity, does this still hold true if you buy one of the System76 / Dell / Lenovo laptops with Linux pre-installed? I’m currently using a desktop, so don’t have any recent experience with this, but my assumption is the experience has also improved quite a bit on these OEM Linux laptops.
It's better but not perfect. If you tried Linux on commodity laptops a decade ago, you'd frequently run into awful hardware issues. There's less of that now with the OEM laptops (with some exceptions; looking at you, Killer wlan chipsets!).
UI jank is also reduced but not eliminated. Even Pop!/DE Ubuntu have a fair number of rough edges - not so much that you're constantly tearing your hair out, but there's definite room for improvement. Personally, I live with it because I also think Windows and OSX have regressed in usability and stability, but I'd be a liar if I didn't mention all the time spent yelling at things like DPI scaling w/ hardware dock-attached monitors.
e: also, expect to do some manual power tuning if you want anything close to battery life parity.
I use my Macs basically for browsing, as a thin client for Windows AWS instances via RDP, and also Linux development via VS Code Remote Development (ssh) for AWS Linux instances.
I find the overall UX quite good (Microsoft RD + Microsoft VS Code) and the fact that I do not have relevant data on the Macs allows me to hop from office to home, and second homes without having to carry a laptop.
Indeed I'm not a masochist. That's why I avoid staring at a screen with poor LED backlight, using a crappy keyboard or a kitsch GUI, all of this brought to you by Apple[tm].
I was strongly considering it, given the touchbar and keyboard nonsense, but then they at least brought back the escape key and inverted arrows which was enough to keep me onboard.
Native docker would be a strong draw, but the thing that has stopped me so far is that I have refined workflows and a handful of killer apps (eg. OmniFocus) that I don't want to retool.
I am running Linux in a VM on the Mac, as I need software that is not available on Linux. With the Linux VM running full-screen, I miss nothing compared to running Linux natively, yet I can switch in an instant to the Mac desktop with all its niceties. Also, hardware support is no problem, as it is handled by the host system.
The year of "Linux on the Desktop" may actually be coming if enough developers want to keep Intel and Apple finally drops support for x86 with their software updates (and they absolutely will at some point). Dual-booting between Linux for dev and Windows for video gaming could be a compelling use case for a lot of developers. I doubt Apple ARM systems will see much in the way of video gaming. I absolutely would go for Intel/Linux before I follow Apple down the ARM road just because they want to make more money and have more lock-in.
I work in games so the answer for me is Unity editor support on Linux is poor so I can't, although I'd like to. Have most distros gotten solid high-dpi support yet?
In my case, because none of the auditing software required by work will run under Linux. I got away with it for a few years because I started before we had an actual IT department, but eventually they caught up with me and I had to swap my Linux machine for a MacBook. I could have had Windows, and possibly would have gone that way if it were today.
Anyone who develops to target Linux servers should certainly be running Linux on their laptop ... occasionally I fire up an OS X laptop just to see how it compares to Linux, and after years of developing on a Linux laptop the differences are stark ... I encourage any doubters to install Linux on a laptop and see for themselves.
Same here. I have a late-2014 27" which I can probably get about £1k for on eBay, about a third of the cost of a new one of the spec I'm after. I want something I can run Linux on under VirtualBox and Boot Camp into Windows for gaming. A few days ago some results showed up on Geekbench for an unknown Mac running the latest-gen Intel i9 CPU, and delivery times for the 27" machine just slipped to September, so we can expect a refresh soon.
For a pure Mac experience I'm sure the new ARM machines will be absolutely fine. My first Mac was a first-generation Intel machine and it was fantastic, so they're perfectly capable of nailing a transition like this. I'm just in a small minority with some specific reasons to prefer an Intel CPU. Long term it doesn't concern me either; there's no way I can predict what my needs will be in another 6 or 8 years when my new box starts getting long in the tooth. I'll worry about it then.
How do you virtualize Linux on an Intel Mac without it being terrible? Docker has been essentially broken for years. I'm actually looking forward to the ARM Macs because Apple mentioned they're cooperating with Docker, which I suspect could mean Docker will actually be more usable for running x86 Linux than it is now.
I use VMWare Fusion for my day to day work and my experiences couldn't be better. Running it fullscreen is like running Linux on your machine, with the advantages of virtualisation, e.g. being able to suspend your VM.
I accidentally went full screen on an Ubuntu 20.04 VM running on a Nuc that I opened from Fusion. I only installed it to cobble together some instructions for usage for someone who needed help.
Full screen from Fusion makes it really nice. There is a tiny lag when using the trackpad and typing has a very, very slight delay, but I was looking for flaws. It's really compelling. I can only imagine that having it in Fusion but on my own machine would make it even better.
I will be upping the resources it gets and trying again, as it's really nice.
I'm not sure I understand that comment. Docker isn't virtualization, at least as the term is normally used (a full OS running in a virtual machine). It's containerization and, to the degree it's terrible (I haven't used it much), that may be related to the fact that MacOS is based on BSD and not Linux.
Big Sur apparently has native Mac virtualization for, initially, x86, although I'm unclear what the underlying tech is, and a query or two at the time of the original announcement didn't yield any results. So that may improve Linux virtualization.
As for ARM, virtualization is not emulation. There's going to be Rosetta 2. But my understanding is that you shouldn't expect x86 Linux to run on Mac ARM using virtualization, Docker, or anything else. (Apple demoed an ARM build of Linux running in a VM on ARM at WWDC.)
OK. I get that. (Docker can also run natively on MacOS today, which is what I assumed was being discussed. Docker presumably runs just as well in a Linux VM as on native Linux, aside from any VM performance overhead, and the parent was complaining about Docker on a Mac.)
But, yes, Docker can also run in a Linux-on-x86 VM on, well, pretty much anything today, and will presumably at some point be able to run in a Linux-on-ARM VM in the future.
> Docker can also run natively on MacOS today which is what I assumed was being discussed.
It is. The "native" Docker.app is a Hypervisor.framework virtual machine plus an 8 GB Docker.raw image in your home directory. The client utilities are native and communicate with a docker service running inside the VM. It definitely doesn't run containers under the Darwin kernel.
Not a fully responsive answer, but I would just buy a nice Windows 10 computer. Surface, Dell XPS, ThinkPad X1, plus numerous others from various manufacturers, are available. They can run WSL, Docker, and various hypervisors and VM tools, and still be fully AMD/Intel-compatible for deployment. Furthermore, I doubt they will become as closed as macOS is becoming, and Windows will run on them for years and years to come.
I am not looking back. Touch Bar, crappy keyboards, regressive and user-hostile macOS, and now custom CPUs? The walled garden has become a hellish prison and I am out. I won't miss Aperture, Logic, and Final Cut that much. Everything else I use is ... imagine this ... fully cross-platform! Adobe, Ableton, Native Instruments, JetBrains, etc. Anything I cannot do easily in Windows, I can do in WSL or Cygwin.
Thanks for 12 great years and 4 increasingly miserable years, Apple. I'm out, and once my current hardware dies it won't be replaced. I like my OnePlus phone and Surface Book and Go.
I find perspectives like yours interesting because I have the opposite experience - I use almost exclusively Apple applications and I love their hardware (now that new MBPs have Esc keys again...).
From my perspective I find Windows UX hellish and non-Mac hardware extremely cumbersome and unreliable. This is all subjective of course.
Windows comes with a great deal more embedded spyware and adware than macOS, and has a lot more malware targeted at it in the wild. I think the security situation is far better on macOS.
Windows really doesn't, it's the OEMs. It's a pain, but I usually spend like 30 minutes cleaning up a new laptop and it never bothers me again. My last few machines have been Acers and they were pretty clean.
"Spyware" is a bit of an exaggeration. Pretty much all the things it does are also done on Android phones, and it asks you on setup if you'd like to disable any.
Now, MS isn't an ad company like Google, and enabling data collection by default is the wrong approach for users. It feels like it was built with features (that you probably don't care about, like Hey Cortana) in mind before privacy.
As a developer who values running the same docker images as production, I'm out -- no more mac purchases for me.
Gaming on a mac is already a challenge. None of my most valued software is Mac specific anymore. The trend of incredibly locked down software will only continue.
The calculus may be different if I were heavily in the iOS ecosystem, or if the "exciting new Intel-based Macs" buck the trend and represent good value.
If you're not an iOS developer, do you see things the same way?
Absolutely. It’s broken. Given that Apple mentioned explicit support for or cooperation with Docker for the new ARM Macs, I think there’s a decent chance it will actually perform better.
I use Docker Machine. I can switch between a local VM or one spawned on DigitalOcean at a moment's notice. The local VM (backed by either xhyve, VirtualBox, or Fusion depending on my mood) runs circles around Docker's own Docker for Mac, which tries a bit too hard to be magical for its own good.
I do all my dev work in AWS or a Linux box in another room (where it can be as noisy as it likes). macOS is just a nice terminal environment with desktop luxuries and half decent integration with iPhone and iPad.
I'm a full-stack web dev (who's been stuck dealing with cross-platform hybrid apps at $DAYJOB for a while), and the continued lockdown of OS X, its iOS-ification, and now a new CPU complicating Windows / Linux support have me starting the painful task of figuring out what my next personal setup will be.
Currently thinking it may be a Librem 15 with NixOS, but step one is to try building a basic env in a VM and seeing how it feels.
Most of those I would call "devices". I would only call something a "computer" if you can do your software development on it. And I share the previous poster's notion that the x86 monopoly on the machines you develop on should be disrupted.
Yup, agreed. I am already having problems when I charge my MacBook Pro on the left side and the thing overheats. The hardware and software don't hold anything for me anymore; too buggy and poor value for money.
If I were an iOS dev I would have no choice. Luckily I am not.
The benchmarks for x86 emulation on the developer kit are surprisingly good. And that’s on an iPad chip, an actual desktop chip should be quite performant.
I'm skeptical. Geekbench seems like just about the perfect use case for Rosetta 2 translation - I wouldn't be surprised if performance is close to native. If that's the case, the A12Z is only approximately competitive with Intel of a similar vintage despite having a process node advantage.
The answer is simple, the explanation is complex: wait, unless you need to run virtual Windows.

x86 always sucked. It was always a dog. Everything that ever competed with x86 was technically superior to it, including 68k and PPC, and ARM will be no different. If you have some burning need to run Windows VMs, then you need x86. Otherwise, even this first-gen Apple ARM hardware will beat the Intel Macs in performance, even under Rosetta 2 translation; the developer kit, with a two-year-old processor, already competes with today's x86.

When Apple releases their first ARM Macs they will be cutting edge, with faster and far more efficient processing, so expect battery life in ARM MacBooks that no x86 MacBook, or AnyBook, can match.

As for the "don't buy first-gen Macs" folks, they are stuck in 1994. Apple is a different company now. Except for minor missteps like the avant-garde butterfly keyboards they're abandoning, the internal hardware has no competition. You're going to pay for that, but it is not actually more expensive than comparably configured hardware.

So save your money, get another year out of your x86 Mac, and get the ARM Mac. Otherwise you'll have two generations of x86 Macs, and one will gather dust. Wait for ARM, and you'll keep using your x86 Mac in the meantime.
The Osborne effect is a social phenomenon of customers canceling or deferring orders for the current soon-to-be-obsolete product as an unexpected drawback of a company's announcing a future product prematurely.
Totally. Don't buy either, Apple is obviously going under. ><
The thing about Macs is that they have this incredible lifetime and, annoyingly (to those who like to save by buying older hardware), hold high resale value. Try pricing a Mac Mini from 2012, or late 2014, and compare a similar compact Windows desktop from those years. Though it is interesting to see a pattern, and more so when it is famous enough to have a name, I do not think the Osborne effect applies here. Today's Intel Macs will still be unaffordable, NIB or used, when the 3rd-gen ARM Macs are released. I'm not really happy about that.
The thing is, in 4 years, when your Intel Mac has run out of juice and you want to sell it and expect it to have retained its value well like Macs usually do, Apple will drop support for it and it will have zero value.
Probably because there hasn't been a major architecture leap in that time. It's very different when going from 32-bit to 64-bit, or from x86 to ARM.
Apple will want to phase that out as quickly as is acceptable to the public, and just as with the PPC-to-x86 transition, app developers will probably drop support quicker than Apple itself. So even if the OS is updated for a while, I think we'll pretty quickly see some apps that just won't run.
That's what I mean though - yeah, that has been the case, because they've been using Intel CPUs for ages. But how long did they support PowerPC for? This time is different.
Right... And the previous transition was eased by the use of universal binaries containing both PowerPC and Intel code. The same will apply to Universal 2 binaries with Intel + ARM code.
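For anyone curious what a universal binary actually is under the hood: it's just a small "fat" header listing each architecture slice and where it lives in the file. A rough sketch of reading that header in Python (constants taken from Apple's <mach-o/fat.h> and <mach/machine.h>; this only handles the common 32-bit fat header):

    # Rough sketch: list the architecture slices inside a macOS universal (fat) binary.
    import struct
    import sys

    FAT_MAGIC = 0xCAFEBABE            # big-endian fat header magic
    CPU_TYPES = {
        0x00000007: "i386",
        0x01000007: "x86_64",
        0x0000000C: "arm",
        0x0100000C: "arm64",
        0x00000012: "ppc",
        0x01000012: "ppc64",
    }

    def list_slices(path):
        with open(path, "rb") as f:
            magic, nfat_arch = struct.unpack(">II", f.read(8))
            if magic != FAT_MAGIC:
                return ["not a fat binary (single-architecture Mach-O or something else)"]
            slices = []
            for _ in range(nfat_arch):
                # struct fat_arch: cputype, cpusubtype, offset, size, align
                cputype, _, offset, size, _ = struct.unpack(">IIIII", f.read(20))
                name = CPU_TYPES.get(cputype, hex(cputype))
                slices.append(f"{name}: {size} bytes at offset {offset}")
            return slices

    if __name__ == "__main__":
        for s in list_slices(sys.argv[1]):
            print(s)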
As I recall, the first switch had the issue that the early PPC chip, the 601, was kind of slow, and the boxes weren't designed well. The second transition in 2006 was annoying because Adobe doesn't like to rewrite their code... they carbonized, and then just kept charging for the same cruft. But frankly, I liked the Classic Environment, and I liked the first Rosetta. I'd like to run the whole shebang in MAME on whatever hardware, if others were doing it to make it easier for me. If only MAME had an emulated coprocessor, I'd run A/UX on it.
Apple's current ARM chips and the OS running on them are already quite amazing, IMHO. So I don't think the new ARM Macs could disappoint. And the same thing applies here: Universal Binaries and a compiler switch to make it easy for devs; I don't know how it could be easier.
> As I recall, the first switch had the issue that the early PPC chip, the 601, was kind of slow, and the boxes weren't designed well.
I had a first-generation PPC Mac. It was pretty good, I thought. The baseline 601 ran at 60MHz and had a superscalar architecture with 16K instruction / 16K data L1 cache. The high-end 68Ks were not superscalar, ran at 33MHz, and had 4K instruction / 4K data caches.
If anything was slow it was probably because big chunks of code were 68K. My personal experience was that once I switched to PPC, running the same software on 68K felt slow by comparison.
That's interesting. I didn't get to touch a 601 until 604s were common, side by side in a commercial environment, and the older machines were gunked up with more use (old installs) and more users over time, so my opinion is biased and worthless. Thank you for posting.
It's awkward. I have a 2013 rMBP which I'm very happy with.
It was bought in part to start developing OSX/iOS software while also continuing to use Linux for the majority of my work, which I now do in a VM on the Mac.
This setup works really well as an "all rounder" machine with excellent hardware.
In a couple of years' time, having an x86 Mac won't make sense for developing macOS/iOS software. Apple talks of emulating x86 on ARM for macOS, but not the other way around.
But on the other hand, having an ARM Mac won't make sense for running x86 Linux and x86 Windows VMs. And my servers will probably still be x86.
To most people I would say that it depends on what you want to do with your Mac. If you also want/need to run Windows I would pick an Intel one. If you think you’d be content to live exclusively in Apple’s World I would pick the Arm one as it has the upside of also running iOS apps natively.
> it has the upside of also running iOS apps natively
I've been thinking about this lately, and I wonder who the audience is. For example, I've had a Chromebook that can run Android apps for a while now, but I've never wanted to run them when the desktop equivalent is available. Even if a desktop equivalent is not available, it has the downside of not fitting in well with the look-and-feel and ergonomics of the desktop.
However, I think being able to run iOS apps natively is great for developers testing iOS/iPadOS apps.
There is a high likelihood that ARM Macs will be a 'superset' of iPads going forward. That means at least some models will have touch screens, support the Apple Pencil, and come in convertible form factors.
Like now, two ecosystems in one, but the other one is iPad. I'd love a lightweight convertible MacBook, or an iPad Pro that is really an ARM Mac but can 'be' an iPad (with 16 GB of RAM) at any time.
After SwiftUI and Catalyst, I was also pretty surprised by Apple announcing that iOS apps will just run on ARM Macs. After giving us a few options for convergence, they seemingly bolted that one on for good measure because they could. Not sure anyone expected that. Even weirder, I opened up App Store Connect a few days ago and got a message telling me all my iOS apps would be available in the Mac App Store unless I opt out... The only thing it makes sense for is maybe games, and even that's a stretch without a proper touch UI.
Since it seems like Catalyst is not as popular as expected, I'd love to run streamlined native apps in their own windows without having to open a browser tab for everything (e.g. for Reddit, using Apollo instead of the god-awful web UI).
iOS apps on the Mac would give it a robust gaming library overnight.
> iOS apps on the Mac would give it a robust gaming library overnight.
It wouldn't. It would give MacOS a library of games optimised for mobile. Why would you want those on a desktop computer? And how would you play many of them when they require touch input?
Sure, not 100% of the games will work / some gestures may be wonky, but the vast majority of popular iOS games I can think of would work fine out of the box. (much less APM than games like Diablo or MOBAs)
I’d love to play longer iOS games on a system that doesn’t cause it to heat up profusely and massively drain battery life, as there are some unique iOS games that were never ported to a Mac.
> Sure, not 100% of the games will work / some gestures may be wonky
That probably includes the vast majority of games that you would want to play on a desktop system: the ones with perceived console quality.
Nowhere is this more evident than in Apple Arcade. It has cross-platform games, and practically none of them have any appeal for a desktop gamer. Non-ported mobile games will have even less appeal.
Except some casual games like Monument Valley.
> there are some unique iOS games that were never ported to a Mac
Once again, the vast majority of those will most likely be very foreign on a desktop OS.
Expected by whom, exactly? I don't think anyone ever asked for Catalyst; it was just an ugly push from Apple. Its unpopularity was actually fairly predictable.
And on iOS gaming: iOS gaming and desktop gaming cater to different markets. Nobody ever bought a desktop or laptop to play something like Fruit Ninja.
Indeed. I don't imagine games that are designed for two thumbs, like Fruit Ninja and Flappy Bird, are going to contribute significantly to the Mac's gaming experience.
Mobile gaming has evolved quite a bit past 2013. You can get console-quality games now, especially since developers are using MFi more consistently.
The Xcode simulator already runs iOS/iPadOS apps on Intel Macs.
It will be interesting to see how Apple presents iOS apps on the ARM Macs, i.e. how discoverable they will be, not just how useful. Will I get all my compatible iOS apps automatically installed when I log in with my iCloud ID? Will there be a Launcher-style window for them, or a widgets-style overlay?
Note: those are iOS apps compiled for Intel. iOS apps on the Mac will be pretty much transparent, like Rosetta is claimed to be: you get them from the App Store and can launch them like any other app.
Yes, but there's no reason Apple couldn't launch this feature on x86 Macs if they wanted to. Almost everyone is developing their apps in the x86 simulator anyway. No doubt some apps have ARM-only libraries, but I doubt it's many.
I would never buy another Apple laptop in my life and be perfectly happy, except that I need to work on iOS apps pretty regularly. Even for web dev, you need to be able to test on a Mac for compatibility. This is purely because Apple won't allow VMs of their systems, which is a pretty shitty stance to take. It's super easy to run Windows or Android on a Mac, or even Linux.
It depends on how quickly you need to upgrade, how important smooth operations are, and of course, whether you depend on bootcamp/virtualisation.
If you need to upgrade soon and are dependent on the machine "just working", get an Intel-based Mac. Obviously the same if you need virtualisation. You get a machine which works now and will easily do so for the next 5 years to come.
The new ARM Macs are very tempting, but it might take up to 2 years until the device you want to refresh is available with an ARM processor, and there might be teething problems. In any case, it will take some time until most software is available natively. On the other hand, unless there are first-generation issues, with an ARM Mac you are buying something very forward-looking.
So for me, the answer is easy. For my work, I am going to renew my MB Pro on Intel, but privately I am going to try to stretch the life of my iMac until I can get an ARM-based one. The iMac is still running Mojave for compatibility reasons, so if I upgrade, going to ARM shouldn't be any less compatible (the coming transition was probably the reason Catalina was that incompatible).
I bypassed the whole decision entirely and after 15 years of using Macs of various flavors went back to Windows for stuff that doesn't run on Linux (Lightroom, Ableton). It works fine and I got used to it within a week. I think quite a few people will do the same thing, especially when it comes to laptops. MBP pricing in particular makes no sense when Lenovo X1 exists and has a vastly superior keyboard, more ports, and a higher density display. Importantly, it also looks great. Similar configs are literally $1K apart. It's time to give MS another chance, IMO.
Apple discontinued the use of PowerPC processors in 2006.
Mac OS X 10.6 Snow Leopard, released on August 28, 2009, was the first version of OS X to be built exclusively for Intel Macs.
So, extrapolating from Mac history, one can expect the last Intel Macs to be supported for about 3 years.
Similarly for the 68k-to-PPC transition: 1996 was the last year for the PowerBook 190, the last 68k Mac, and Mac OS 8.5 (which required PPC) was released in 1998, two years later to the month.
So going by history we can expect 2-3 years, with a sample size of 2.
I would advise app and mobile developers to wait for the ARM Macs. They will have near native performance for iOS emulation (or even native if everything works out). The demand for people who can do ARM optimizations will also surge.
Apple has already said they will natively run every iOS app on the App Store unless devs opt out so I expect this does indeed mean literally native performance.
I do want to upgrade my 2016 MacBook to an iMac because although everything on the OS works fine, I'm beginning to do work that's hitting performance ceilings.
If new Macs w/ Apple Silicon are being released this fall, I hope the 27/30-in iMacs w/ Apple Silicon rumored to launch next year count as the "second" revision in context of Apple's infamous first-gen issues.
To be fair, nearly all first gens of any tech have infamous issues. Apple’s issues are more glaring because of the expectations associated with the brand. Your advice to wait for second gen is all-around wise :)
The big ones in recent memory are the Apple Watch Series 0 (which got obsoleted unusually quickly) and the 2016 MacBook Pros w/ keyboard and screen flex issues (I keep mine almost-permanently plugged in so it hasn't been an issue for me, yet)
As someone who had a S0 (in steel!), the battery died very quickly after a year or two and it was impossible to actually use any apps on it because loading them took forever.
I eventually upgraded to an S3 and was very happy with it, although I don't think I am rushing to upgrade yet.
That being said, museum pieces! I'm not sure I'd want to own one, but the movement to put modern components inside an old PowerPC Cube chassis is pretty fascinating to read about...
We have several at work and they seem prone to weird shit happening, depending on which config you get. I think there is something that happens with one of the GPU options and it makes the system unstable.
Just curious what kind of workload your 2016 MB is bottlenecking on? Have you considered an eGPU for graphical workloads, or a pre-built desktop for workloads that would function equally well on Windows or WSL2?
Current-workload-wise, I do want to do a lot more work with virtualization (if Docker was not announced for Apple Silicon it would have been a nonstarter), plus the ability to play an occasional game or two via Metal.
I actually bought a weaker 2016 MacBook Pro when it launched under the assumption that eGPUs would take off; unfortunately, the cost premium turned out to be not worth it, and the inability to use Nvidia GPUs without extreme hacking makes it a nonstarter. (I also planned to use it for training ML models; the massive cost drops in cloud computing make having a local GPU unnecessary)
My place isn't big enough for a full-on Windows workstation, plus I'm invested enough in the Mac ecosystem that I get benefit from that synergy.
EDIT: Another performance bottleneck in my work is video recording/livestreaming; currently on my Mac I get slowdown when I record HD video for complicated work, so I would prefer to mitigate that.
There are Intel NUCs these days that are tiny but run desktop (rather than mobile) CPUs. They're about a fifth the size of a typical eGPU box (or, slightly larger than a docked Nintendo Switch). This can work for the sort of filler needs you're describing - a bit more CPU or graphical oompf, a system to offload some tasks to (say, VMs, streaming/encoding) and putz around with Windows/games or a real local Linux install.
This, and I'll expand as I have recently been down the rabbit hole.
The Nuc8 (i5, or the fractionally more powerful i7) is a monster and much loved by those who get one. Its GPU is particularly powerful. The newer Nuc10 is a leap backwards in GPU power, but has 6 cores in the i7 version. This makes it an excellent ESXi host, and the factory support for 64 GB of RAM helps too (the Nuc8 will also run 64, though Intel states support for just 32). The RAM is faster on the Nuc10.
For encoding, both are great as they have the same version of Quicksync and I think my Nuc10 matches the performance of the Nuc8 for transcoding video.
A downside of the Nuc10 is that the newer NIC isn't properly supported yet by VMware, so there is some shagging about to get it to work.
With m.2 drives they fly, and are a great little machine.
I'm quite interested in setting up a Nuc as a stopgap / intermediary GPU dev machine while Apple sorts out whatever is going to be the future of GPU support.
With that in mind I am curious about your comment wrt GPU - are you referring to the on-board Intel graphics, discrete graphics that can be supported on-board to the NUC, or are there limitations to what can be added externally?
I will probably get a chassis and connect an nVidia card via thunderbolt, but I'm interested in what the limitations are around the other options.
As pvg says, there are the Hades Canyon models, which are very capable graphically (and have 2x NICs) and were a strange child of AMD and Intel, but I was referring to the integrated graphics.
The new Nuc9 series have some very serious options but they cost a lot and seem to just be feeding off the Nuc name. They are monsters. They are also about the same volume as 10 traditional Nucs (0.5L versus 5L), though have a built in power supply, which is a huge upgrade imho.
I'm not aware that they are particularly limited in what can be plugged in as an external GPU, but you don't see many people raving about that option. It seems to come with glitchy behaviour and continual irritation - I don't have first-hand experience.
There are a few reddit subs that may interest you - r/intelnuc, r/nuclab r/homelab have some interesting posts.
Yes, the NUC9 is intriguing but super expensive and limited availability (that I could find). On the other hand, the cost of an eGPU chassis is super high and perhaps offsets most of the cost difference.
But in any case, it sounds like if I'm definitely doing discrete graphics your GPU concerns aren't a problem for a NUC10.
If I was running the thing with a monitor attached I'd be getting a Nuc8 BEH/BEK. Cheaper, twice the graphics performance but lacking two cores (though thermal issues likely constrain the usefulness of the extra two).
Some have discrete graphics by AMD, there's one that even takes a half-sized actual nVidia card of some sort. The bang/buck suffers a fair bit there, I think, unless you're really pressed for space. But if you're considering an eGPU box, you probably aren't quite to that degree.
Different poster, but my 2016 12" MB has both an unpleasantly slow CPU for basic web browsing and not enough RAM (8 GB). (And no Thunderbolt/eGPU support.) As soon as Apple releases another fanless laptop, I'll buy it.
Would it be possible to design a MacBook that has both an Intel x86 processor and an ARM one? Sort of like how we turn off the discrete GPU when not needed. Too expensive or too complex?
Actually existing Macs sort of have both, given they include the T2 chip which contains the same ARM cores that were in the iPhone 7.
It's actually not the cost, though having two functional CPUs would cost more. How would it work? How much time and effort would go into building the hardware so they could interoperate and the OS to make it usable?
Just a massive amount of complexity for very little gain. Apple's Rosetta2 will already run x86 code with very good performance on Apple Silicon, in MacOS as a MacOS app. Hundreds of dollars per Mac cheaper, and far less complex to integrate.
This could be fun to make work. Processes would carry a flag for x86 or ARM and would only be scheduled on cores of that architecture.
Most likely you'd pick one architecture as native for the kernel, and the other side would have just enough kernel to trap system calls, call into the native kernel, and switch tasks.
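To make the scheduling half of that concrete, here's a toy sketch of ISA-aware dispatch: every task carries an architecture tag and only ever lands on a core pool of the matching type. All the names and the queue-per-architecture layout are hypothetical, purely for illustration.

    # Toy sketch of ISA-aware scheduling for a hypothetical dual-architecture Mac:
    # each task is tagged x86_64 or arm64 and may only run on cores of that type.
    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Task:
        name: str
        isa: str  # "x86_64" or "arm64"

    class DualISAScheduler:
        def __init__(self, cores):
            # cores: e.g. {"arm64": 8, "x86_64": 4}
            self.cores = cores
            self.run_queues = {isa: deque() for isa in cores}

        def submit(self, task):
            if task.isa not in self.run_queues:
                raise ValueError(f"no {task.isa} cores in this machine")
            self.run_queues[task.isa].append(task)

        def tick(self):
            # Dispatch at most one task per core of each type, never crossing architectures.
            scheduled = []
            for isa, queue in self.run_queues.items():
                for _ in range(min(self.cores[isa], len(queue))):
                    scheduled.append(queue.popleft())
            return scheduled

    sched = DualISAScheduler({"arm64": 8, "x86_64": 4})
    sched.submit(Task("Safari", "arm64"))
    sched.submit(Task("LegacyPlugin", "x86_64"))
    print([t.name for t in sched.tick()])  # both run, each on its own kind of core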
I think the market for that is tiny. I don’t think all that many Mac buyers care all that much about Bootcamp or Docker.
If, within 6 months of ARM Macs coming out, we have Chrome, the Adobe suite, and Microsoft Office running natively, that will cover the vast majority of Mac use cases.
I guess, like the graphics cards, the "weaker" CPU would be connected to the display and just pass information through. So with dual CPUs it would really be like virtualization: input/output would be handled by the newer ARM CPU, and if it needed to run an x86-64 program, it would start up the other CPU, with a translation layer for whatever info it needs from the OS.
Ah, I wasn't aware of these, it's a really interesting approach and could probably save quite a bit of space in a rack. They're solving a slightly different problem though.
Seems like there is a pretty straightforward decision tree that one could create by rearranging these questions: Is this business or pleasure? Do you just run windows on it (as I've seen CEOs and COOs do)? Are you a developer? For binary linux builds (including docker)? For iOS? For Mac? Or Java? Do you have a build farm running on the various platforms and OS you support? Do you target just one specific platform and are you already cross-compiling or building on a remote machine? Do you need to debug binaries locally? Do you require a particular piece of software for your job? Is that software available for an ARM Mac?
I think if a person isn't qualified to answer these questions for themselves, then that gives us enough information to recommend that they buy an ARM Mac. ARM is the future for Apple.
The main thing, based on what we know now, is that virtualized (or Bootcamp) Windows is a non-starter. That's not to say someone won't figure out how to do something hacky with a Surface Pro build or something, but basically no.
Other than that I agree that if you need a system to write/debug MacOS apps that run on Windows, you probably need a MacOS system if you don't have access to one.
Otherwise, if you need a new system, get an ARM and just deal with the migration pains.
The great unknown is of course how the ARM-based Macs will perform vs the Intel models.
Thinking back to the PowerPC-to-Intel transition, the step up in performance per watt was enough to make the MacBook Pro a very, very clear winner; I don't think many people were wishing they had bought a PowerPC laptop. There is clearly the potential for this to happen again.
Apple clearly doesn't want to "Osborne" its existing range, but if the gains are only marginal then the justification for the change becomes quite weak.
If you depend on x86 for Windows or for server development, then buying an Intel Mac will give you what you need. For everyone else, especially anyone buying a laptop, I would have thought that at least hanging on, if you can, would be a sensible option.
The same thing happened to the Sega Saturn/Dreamcast, the MakerBot, and, back in the '80s, the Osborne computer: customers delayed purchases because a new and disruptive product generation was coming.
That effect has been known as the Osborne effect since the '80s.
Let's see what happens with the upcoming Macs.
It’s made all the more interesting by the current MacBook line. For the first time in a very long time it’s a good lineup and is close to the ‘good, better, best’ of the olden times.
Shouldn't there be a leasing program for computers?
So you can have the latest model every 3-4 years, give it back if it breaks down and get a new one pronto, and just pay monthly instalments (with some premium) instead of the whole upfront fee.
I need Windows and macOS support, so it'll be Intel for me. I'm working on a Hackintosh right now, so hopefully either x86 is supported for a while, or we can get ARM-based Hackintoshes before the deprecation of x86 macOS.
This reminds me of the decision to not support Adobe Flash on the original iPad. It was a huge inconvenience to early adopters but eventually changed the face of the internet.
I wonder what effect ARM Macs will have on other x86 based environments like Windows and Linux. Will we start to see a migration from x86 to ARM or will we retain the split? Ubuntu and Arch have ARM distributions but I can't build a gaming rig out with ARM. It's mostly non-upgradable integrated systems. Will that change?
As an iPad 1 early adopter I would hold off on getting an ARM Mac immediately. Within a year half of the apps in the app store wouldn't work for various reasons. I wouldn't be surprised if that happens here.
This is a question that comes up even without huge paradigm shifts like changing the CPU architecture. Don't worry so much - what you DO on the computer is much more important than having the latest spec. Intel Macs will be relevant for almost 10 years, based on my experience with the PPC transition.
My experience with the PPC transition was definitely not that. A 2004 PowerMac G5 lost basically all software support and required community involvement (the likes of tenfourfox) to even get a competent browser by around 2010. Snow Leopard released in 2009 and didn't support PPC Macs.
That said, I think the transition away from Intel will likely be slower. Apple has said they plan to keep releasing Intel Macs, and there are certain things (discrete GPU support?) that might motivate them to do so.
That and upgrade cycles on desktops/laptops have also slowed quite a bit in the last 15 years.
6 years from the last hardware sale is also probably a bit forced, because PowerPC was never as well supported by compilers, libraries, etc., performance was well behind years before the switch, and Apple has migrated a lot of tasks which used to involve assembler to higher-level libraries. This time around I don't think you'll have that pressure, because it's not like LLVM is going to stop supporting Intel and most projects use the same compiler for each platform, so you're not going to need to keep an old toolchain like CodeWarrior on life support just to ship updates.
Depends on what you want to call support. 3.6 (the last PowerPC and 10.4 compatible version) was indeed supported until the first Extended Support Release with Firefox 10, since 3.6 was effectively the first ESR in all but name. But 3.6 aged awfully fast, and even when 3.6 was current certain features never made it to PowerPC (like the JIT), so I think it's also fair to say the support it got was never the same as x86 Macs did when they emerged.
It definitely wasn’t great - and I know a few people who benefited from your work so thanks!
I think this time would be different because things like compilers wouldn’t be asked to add support for an otherwise unused architecture. Getting anyone to support PPC was hard given the limited market share and got much harder once it was deprecated. There’s definitely some Mac-specific code but a lot of the most expensive code is going to be shared with Windows and Linux support which isn’t going anywhere.
I'd also like to offer my appreciation to you and TenFourFox. It is the only browser that still works on my G4, and frankly I think few expected that such a project would be made and actively maintained.
I retired my G4-based Mac from everyday use in 2009. I don't recall having major issues. But I suppose it depended on what you were doing. I was doing mostly web, Java, Python at the time.
That's around the time my Powerbook G3 fell out of my daily use as well. Personally I'd say the day they became "outdated" was when Camino stopped being updated: http://caminobrowser.org/releases/2.1.2/
I traded in my iBook g4 around then. It was just too slow and it couldn’t keep up with basic things we all take for granted. The laptop was the fastest around when it came out but everything changed so quickly that decade. Not being able to stream video at any resolution higher than “low” was the final straw for me.
For something that lasted about 9 years, I'd say that was pretty darn good value for money, at least it was in my case. This led me to continue buying Macs until today, mainly the Mac mini, MacBook Air and MacBook Pro. None of them have given any issues except the GPU issue on the 2015 MBP, which was sorted out by Apple. I notice that people complain a lot about Apple quality having gone down, but one also has to realize the complexity of their hardware went up quite dramatically, along with their ambition to change everything now and then (the keyboard flop comes to mind).
I think it is really a very simple question: do you want to run the software you have now, at the fastest speed possible, with all the current defects? If so, get an Intel machine now. If you want to be able to do future, different things, wait for the ARM machine.
Personally I would use something like an Apple Chromebook. Long, long battery life; more secure (from hardware support like the Secure Enclave, microarch stuff like tagged pointers, and the OS moving towards unprivileged applications and code signing); better iOS integration.
I think we're actually overdue for Apple to offer cloud offload processing, especially for iOS, but maybe offload to the edge (your macbook) becomes easier with ARM.
Given that the future of Mac software will clearly be ARM, I don't see the point of buying Intel now. Unless you plan on replacing it before the transition period ends... but even then the resale value will presumably be shocking by that point.
Given that rMBPs from 2012 and 2013 command a huge premium right now...
You'd basically have to believe that Apple is actually serious about its non-iOS offerings and is actually committed to perfecting the hardware and software to a degree that we haven't seen in a decade.
Remember when Apple "invented" the trashcan workstation that was going to revolutionize the world of desktop computers? People might prefer to have trusty tech that supports all the software written today and more or less just works instead of incredibly ambitious but half-baked and de facto abandoned tech from tomorrow.
If you take into account that a lot of the impetus behind this transition is so they can reuse their hardware/software stack from the iOS world, you'll see that it is quite a leap of faith to take.
Reasons to go Intel: 1. You use Boot Camp, Windows virtualized, or Linux virtualized.
2. You are worried about first gen hardware
3. You believe that a high end Intel iMac will be faster than an Apple Silicon iMac.
4. You believe that Intel Macs will be supported for at least 5 years.
All four are why I'm buying a new Intel iMac when it comes out, and keeping it for 5 years (as has been my Mac cycle).
> Apple has demoed Rosetta 2 with apps and games, and shown that there's no apparent difference between running an Intel app on an Intel machine versus an Apple Silicon machine. Everything works as you'd expect, but if performance is important to you it may take some time for all your software to be updated to support the new processors.
Marketing fluff or what? Because if they are emulating x86 on ARM, doesn’t there have to be a performance issue? Unless they built their ARM chip with lots of suspiciously x86-looking functions?
It’s not really emulation. Emulation is like JIT: it only converts code as it's running, so the emulator and the code have to run at the same time.
This time they are doing AOT translation: converting the entire app at once, ahead of time. Slow to start the first time, but decent speed after. Without the source it is not as optimized as a native build, but still faster than emulation.
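For anyone curious, an app can also ask the OS at runtime whether it is currently being run translated: Apple documents a "sysctl.proc_translated" sysctl for this. A minimal sketch, assuming macOS; the helper name and printed strings are just illustrative:

    /* rosetta_check.c - build with: clang rosetta_check.c -o rosetta_check */
    #include <stdio.h>
    #include <sys/sysctl.h>

    /* Returns 1 if this process is being run through Rosetta translation,
       0 if it is native (or the sysctl doesn't exist on this system). */
    static int running_under_rosetta(void) {
        int translated = 0;
        size_t size = sizeof(translated);
        if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1)
            return 0;
        return translated;
    }

    int main(void) {
        puts(running_under_rosetta() ? "translated (Rosetta)" : "native");
        return 0;
    }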
The list of things stopping me from going to ARM-based Macs:
- OCR software for my scanner (admittedly becoming less and less relevant over time)
- Fortinet VPN client, which is Intel-only
- Steam library (still on Mojave because half my games won’t work on Catalina)
I can resolve the Steam library issue by getting a dedicated games computer, but then that is an extra thing to carry on long trips.
It highly depends on your use case. If your work requires a Mac extensively (iOS app dev?) and you also need mobility, I'd suggest buying a 1-2 year old Mac laptop.
Otherwise, if mobility is not a concern, try a Hackintosh-based system. There are many known, well-working hardware combinations discussed in the Hackintosh community.
I would like to switch from my 2018 Mini to a 16 inch MBP, I miss being able to code anywhere in the house or being able to code elsewhere while waiting for kids.
But it’s not a need yet, so trying to hold on till Q2 2021 when the 16 inch ARM MBP is due. Not worried about buying the first version, Apple has always nailed these transitions incredibly well in the past.
I got the 6-core with a 500 GB SSD, because the bigger the SSD the more write channels it gets. IIRC the 500 GB writes almost as fast as the 1 TB, and way faster than the 256 GB.
I started it with 16 GB RAM to avoid the Apple RAM tax, and just installed 32 GB for only $180. Thought about spending $280 for 64 GB, but cheaped out.
Edit: Hindsight being 20/20, I should have got an 8 GB/1 TB model and upgraded the RAM right then. The RAM upgrade is a lot easier than it first appears and easily done in an hour if you're careful. I think RAM upgrades were more expensive at the time and I was anxious to get using it.
Neither of these ARM CPUs will be used in Apple Silicon. It will be an A14 class chip. It will be significantly faster due to
1) New 5 nm process vs. the older 7 nm process.
2) Greater thermal headroom
i.e. they will be able to use more power in laptops and desktops than in tablets/phones.
3) More cores
It is expected that Apple Silicon will have at least 12 to 20 cores, and more for high end desktop versions.
4) GPU integration
The current A13 has an Apple integrated GPU that’s reputedly much faster than the Intel integrated GPUs. The 5 nm process will provide many more transistors for even faster GPU performance.
My expectation is Apple will have a family of Apple Silicon SOCs next year that start around 1,400 single core, and with multi-core starting at 7,000 and going over 20,000 for 30 or 40 core desktop versions.
They will probably include T2 functionality and more, further reducing power draw.
Also remember that i9s are $200-$300 each. A13s are around $100 each.
MacBooks will get the benefits of longer battery life, faster CPUs & GPUs while also being as much as $200 cheaper.
I thought the Geekbench results for the A13 already compare favorably to an i9, and that's running in a phone. Imagine a version with 10x the thermal headroom of the iPhone. Also, the ARM MB Pros are likely to run a variation of the A14 on the 5 nm TSMC process.
Beyond the raw CPU power, it has been quite overlooked how much performance Apple can get by putting additional compute units onto the chip, like their neural net accelerators. As they own the whole hardware and software stack, they can much more easily make the best use of those additions.
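A rough illustration of how owning the stack pays off: code written against Apple's Accelerate framework doesn't know or care which vector/matrix units the SoC actually has, so Apple can retarget it as the silicon changes. A toy sketch, assuming macOS with Accelerate; build with "clang add.c -framework Accelerate" (the file name is made up):

    #include <stdio.h>
    #include <Accelerate/Accelerate.h>

    int main(void) {
        float a[4] = {1, 2, 3, 4};
        float b[4] = {10, 20, 30, 40};
        float c[4];

        /* Element-wise add; Accelerate dispatches to whatever SIMD/vector
           hardware the machine has, with no per-architecture code here. */
        vDSP_vadd(a, 1, b, 1, c, 1, 4);

        for (int i = 0; i < 4; i++)
            printf("%g ", c[i]);
        printf("\n");
        return 0;
    }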
Sorry to be boring, but I am wondering how ARM Macs are going to get on with Microsoft Office. Among my Mac users, the most used applications are Outlook, Word and Excel. Somehow I can't see them getting the same level of support as Adobe is getting. I'm interested in what other people think.
For work, I expect we will buy some Intel Macs, but for myself, I'll probably go with the ARM since I do some Mac app programming. That whole need for Boot Camp or x86 Windows is a powerful one.
Question: can someone hang an x86 box off the Thunderbolt port to help with virtualization?
I guess something like an Intel PC which would receive power over Thunderbolt and use some protocol to communicate with the "host" OS. I've never heard of anything like it, but it's an interesting idea.
It depends on what you’re buying for: if you’re a professional you buy what runs your software when you need it. If you’re looking to play older games under Windows, buy Intel and see how the transition goes to see whether you need to buy a PC next time.
The promise of ARM-based Macs is delaying my next laptop purchase for a single reason: battery life. I want to see the battery life metrics of the new devices, and I'll decide based on that.
Does anyone here have a good source for how fast vector instructions (AVX2 and the like) will compute under Rosetta2 on the Arm-based Macs vs the current x86 Macs?
I'd buy an Intel Mac from 2015. The modern ones all have their well-documented problems. The ARM-based Macs will almost certainly have a small period of instability and incompatibility when they first launch, just like the Intel Macs did.
I agree about not getting an ARM Mac on launch day, for the reasons you’ve cited. But I have a 2015 Pro for work, and a 2020 Air for myself. The Air is a fantastic machine, and I wouldn’t hesitate to recommend it again, even after the ARM announcement.
What do you think of the keyboard? I'm still lovingly holding on to my 2015 MBP which has keys that actually travel. For context, the last new Mac keyboard I tried was the butterfly variant.
Lots of people actually liked the 2016 keyboards; less travel is fine for many people, including me. The real problem was the susceptibility to debris breaking keys, and not being easily fixable.
The new MacBook keyboards seem to have the debris issue licked, plus a little more travel. That's all I need, but if you want more travel I can see it's not there for you yet.
That time to switch depends entirely on the target demographic, though.
It might be fine for your web developer type or casual user.
Professionals, however, have a lot invested in certain platforms. Be it the ability to publish their work (e.g. Apple's walled garden requires you to use a Mac for publishing to their app stores), workflows established over years, or investment in hardware and software that are tied to the Mac ecosystem.
Switching to something else entirely is of course always an option, but it's not always a realistic or sensible option...
In my book, a professional who marries a given platform that hard is no true professional, even if it might seem oh so sensible and realistic to accept even more strings attached. It should give these professionals pause that they are so utterly dependent on one system.
There are also quite a lot of professionals that could easily switch with some minor inconveniences.
PS: I find it jarring how you are associating the casual user with a web developer type.
> I find it jarring how you are associating the casual user with a web developer type.
How so? Neither requires any special hardware or software environment; in principle, both only need access to a modern browser (and optionally some tools for office work, graphic editing and the like).
You can use cloud-based services like REPL.it to fulfil all your development needs in the browser. Backends are available in the cloud and all your source code lives there anyway. No need for beefy local hardware or any particular OS.
It's the ultimate platform-agnostic development activity, and thus your OS or hardware shouldn't matter - just like it shouldn't matter for casual users; every mainstream OS and even many niche OSes these days are usable enough for everyday tasks and running modern browsers.
If you think it's somehow degrading to say that modern web development doesn't require any more hardware or software than casually browsing the web, watching movies, or editing holiday pictures, then you completely missed the point.
It's actually great that you don't need expensive workstation-level hardware and software to do web development, and that - in principle - an iPad with a keyboard will do.
By "professional" I meant self-employed people who make a living from working with their computer. This should be clear form the context, but alas that's among the things that go by many people these days.
You chose to ignore this meaning of the word and decided to be offended instead. Fair enough, but not my problem.
A professional shouldn't be "married" to a platform, but most professionals have a clear understanding which platform gives them the best productivity and the environment they like to work in. There are quite a few things which keep me going with the Mac over a Windows machine, though WSL makes the latter more interesting than it had been in the past.
The funny thing is, I considered that option before the Apple announcement. I had basically decided, but the big problem was that I couldn't find a really good Ryzen laptop, especially when considering running Linux (I would still be interested in a recommendation).
But so far I find the Apple announcements so exciting, that going Ryzen has been put back into being plan B.
I have the Lenovo Legion 5. Only problem with it is the trackpad doesn't work with Ubuntu 20.04 (some kernel regression that I don't have the time to track down, so I'm using an external mouse).
I am guessing here, and hoping someone can correct me if I'm wrong: existing Intel Mac applications need to be transitioned to ARM and will be either A) recompiled (buggy) or B) emulated (slow).
My options are: 1) ARM - which will be buggy or slow immediately, or 2) Intel - which will have problems in 5-7 years when Intel apps are no longer supported... and by that time I'll be ready for a new computer anyway.
Recompiling, unless you're doing some low-level optimization and the like, isn't tricky or bug prone. I expect a lot of apps to be recompiled and smooth after a few months; making sure dependencies are ARM-ready will slow that down a bit.
And to add on, the translated code is thrown away when an app does sketchy things like JIT or rewriting itself, so Rosetta should be able to avoid many of the bugs you might encounter in that area by just emulating the complicated bits.
This puzzles me. Why don't they just recompile the projects to ARM and go with fat binaries? Stop emulating things. It wastes power. Plus the compiler toolchain already exists.
That’s what they announced, and it’s trivial as you’d expect from a company which has at times supported 4 architectures (32/64-bit PowerPC and Intel). Rosetta 2 is for the apps which don’t do that.
The recommended way to run apps is indeed to recompile to ARM using universal binaries [0]. Rosetta is just to support older apps that have not yet been updated for ARM.
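To make the universal binary point concrete, here's roughly what a fat build looks like with a recent clang/Xcode toolchain that has both SDK slices; nothing in the source needs to change, the compiler just produces both slices (file names here are made up):

    /* hello.c
     * Build both slices into one file:  clang -arch x86_64 -arch arm64 hello.c -o hello
     * Check which slices are inside:    lipo -archs hello   (should print: x86_64 arm64)
     */
    #include <stdio.h>

    int main(void) {
    #if defined(__arm64__)
        puts("running the arm64 slice");
    #elif defined(__x86_64__)
        puts("running the x86_64 slice");
    #else
        puts("running some other architecture");
    #endif
        return 0;
    }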