
> The M1 chip, which belongs to a MacBook Air with 8GB RAM, features a single-core score of 1687 and a multi-core score of 7433. According to the benchmark, the M1 has a 3.2GHz base frequency.

> The Mac mini with M1 chip that was benchmarked earned a single-core score of 1682 and a multi-core score of 7067.

> Update: There's also a benchmark for the 13-inch MacBook Pro with M1 chip and 16GB RAM that has a single-core score of 1714 and a multi-core score of 6802. Like the MacBook Air, it has a 3.2GHz base frequency.

So single core we have: Air 1687, Mini 1682, Pro 1714

And multi core we have: Air 7433, Mini 7067, Pro 6802

I’m not sure what to make of these scores, but it seems wrong that the Mini and Pro significantly underperform the Air in multi core. I find it hard to imagine this benchmark is going to be representative of actual usage given the way the products are positioned, which makes it hard to know how seriously to take the comparisons to other products too.
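One way to see how odd this is: with the same 4+4-core chip in all three machines, the multi-core/single-core scaling ratio should be roughly similar. A quick sketch using the scores quoted above:

```python
# Multi-core / single-core scaling ratios from the quoted Geekbench scores.
scores = {
    "Air":  {"single": 1687, "multi": 7433},
    "Mini": {"single": 1682, "multi": 7067},
    "Pro":  {"single": 1714, "multi": 6802},
}

ratios = {name: s["multi"] / s["single"] for name, s in scores.items()}
for name, ratio in sorted(ratios.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {ratio:.2f}x")  # Air ~4.41x, Mini ~4.20x, Pro ~3.97x
```

The passively cooled Air scaling better than the actively cooled Mini and Pro is backwards from what the cooling would predict, which supports the suspicion that something else was loading the slower machines during the test.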

> When compared to existing devices, the M1 chip in the MacBook Air outperforms all iOS devices. For comparison's sake, the iPhone 12 Pro earned a single-core score of 1584 and a multi-core score of 3898, while the highest ranked iOS device on Geekbench's charts, the A14 iPad Air, earned a single-core score of 1585 and a multi-core score of 4647.

This seems a bit odd too - the A14 iPad Air outperforms all iPad Pro devices?




AFAIK it's pretty common for new Macs to spend a while indexing the hard drive. For that reason, if you want to run benchmarks, you should generally wait until that's done (e.g. an hour, or probably less with these speedybois). It might be that the people running the Pro benchmarks didn't wait for that in their rush to publish the first benchmark. This would be consistent with what we're seeing: the Pro has faster single-core performance but slightly lower multi-core, because some of its "background" cores were busy creating the index, while the Air was done with that task.


Quite likely that's what happened: a second Geekbench score has shown up for the Pro and it matches the Air: https://browser.geekbench.com/v5/cpu/search?utf8=&q=MacBookP...

My guess: in Geekbench the Air and Pro score the same, because Geekbench is short-lived and not thermally constrained. In Cinebench you'll see the Pro pulling ahead.


Who uploads these results, and are they verified?


Anyone on the Internet can upload results. They are not verified.


The geekbench software uploads them automatically.


The variations may be due to differences in the thermal environment when the tests were conducted. I would expect the Pro and Mini to beat the Air, as they should have better thermals, but that may only show up in longer-term tests, and environmental factors could win out in shorter tests. Just a theory.


If I recall correctly, Geekbench runs in short bursts and is designed to find peak performance without taking thermal limitations into account.


SPEC, on the other hand, takes hours to run.

Of course the iPhone chip isn't as beefy as the M1, but the results still speak for themselves.

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...


Apple has since explained that the M1s are slightly different between the Air, Pro and Mini, to account for the different thermal designs of each chassis. (In the case of the Pro they enable an 8th GPU core.) It sounds like they are three different chips rather than the same chip in different configurations; at least I think that's what he said, in marketing speak. https://www.youtube.com/watch?v=2lK0ySxQyrs

Apple makes it clear that in the real world, these machines will only deliver their incredible performance with Metal, iPad/iPhone apps, and any Mac apps that have been ported over to M1 by their developers (using Xcode). When running existing Intel Mac apps, they will only offer performance similar to existing Intel Macs, because the extra headroom goes to Apple's Rosetta 2 software making those unmodified apps compatible.

But what went unsaid, except during the part where they say they 'learned from their experience in past processor transitions', is that by introducing the chip at the low end of the lineup first, they create a market for the (few remaining relevant) Mac developers to invest in porting their code over to ARM. Likewise, because these new machines run iPad apps at full speed on up to 6K displays, there is an incentive for iPad/iOS-only devs to expand their functionality beyond what their wares can do on a tablet/phone. (Any Mac dev that drags their feet porting may find that there are 50 iPad apps that now run fullscreen performing 75% of their functionality, costing them sales in the big-volume accounts where licenses are bought by the thousands.) Meanwhile, the type of users who can get by with two USB ports, 16GB of RAM and a single external monitor probably don't run many third-party Mac apps and are going to have an awesome experience with the iPad apps and Apple's native apps.


Hence Cinebench is often used these days when evaluating real-world performance with sustained workloads.


Different tests. Different purposes.

GB deliberately avoids running up the heat because it is focused on testing the chip, not the machine's cooling ability.

Cinebench, as you say, tests "real-world" conditions, meaning the entire machine, not just the chip.


The chip's ability to run at sustained load is part of its design too. Precisely because modern chips have to throttle to meet power and thermal envelopes, we should be looking at sustained performance as the more accurate measure.

In a majority of cases, burst performance only affects things like responsiveness, and those things should be measured instead for a better reflection of the benefits.


If you perform an integration test, would you not also perform unit tests? A unit test may show areas for easy improvement if other aspects of the total package are changed.

For example, if someone thought the M1 was thermally constrained, they might decide to rip the Mini out of its case and attach a different cooling method.


Not saying that burst performance shouldn't be measured, but it shouldn't be the de facto go-to performance measure like it is now with Geekbench.


If you run only unit tests, you don't get useful data.

> they might decide to rip mini out of the case and attach a different cooling method.

99% of customers will never do this.


That's not true at all about Geekbench.

"Geekbench 5 is a cross-platform benchmark that measures your system's performance with the press of a button. How will your mobile device or desktop computer perform when push comes to crunch? How will it compare to the newest devices on the market? Find out today with Geekbench 5"


I don't see anything in this quote that discounts the parent.


The new R23 release even does multiple runs by default. Excitedly waiting for M1 results to start popping up now that it's released and has support.


The A14 Air just came out and has a brand new CPU. The Pros have much fancier displays, lower pen latency, etc. Subjectively, in most typical use, the Pros already feel like they have more available cycles than iOS apps have got around to soaking up.


Thanks, that makes sense - I didn't realise there was no Pro line on the newest chips yet. It'll be interesting to see how the next iPad Pros compare to these M1 Macbooks.


The iPad Pros never even used the A13 series - they're still back on A12 (though with some variants and binning) - so it's understandable that this could be a fairly big jump.


This is about the new line of Macs and the M1 chip in them, not iPads


My comment also mentioned the part of the article that mentioned the A14 iPad Air being the best performing iOS device - I wasn't sure why that was the case.


Geekbench is a series of extremely short and bursty benchmarks. Because of this, it doesn't really test the steady state speed a system is capable of, it's more testing the peak performance for short periods.

In this view, it's entirely possible that the Air simply did not have time to throttle before the benchmark ran out.


Yep, try doing that test back to back 25 times and see who comes out on top.
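A crude way to try that yourself (a sketch, nothing like Geekbench's actual methodology): time one fixed CPU-bound workload back to back and see whether the later runs get slower as the machine heats up and throttles.

```python
import time

def workload(n=200_000):
    # Fixed CPU-bound task: sum of squares.
    return sum(i * i for i in range(n))

def run_back_to_back(runs=25):
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        workload()
        timings.append(time.perf_counter() - start)
    return timings

timings = run_back_to_back()
head = sum(timings[:5]) / 5   # average of the first 5 runs (burst)
tail = sum(timings[-5:]) / 5  # average of the last 5 runs (sustained)
print(f"first 5 avg: {head:.4f}s, last 5 avg: {tail:.4f}s")
```

On a machine that throttles, the tail average comes out noticeably worse than the head; with good cooling (or a workload this small) they stay close.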


I ran this Geekbench benchmark on several different computers we have on hand, and I can confirm its measurements are totally useless for real-world applications or performance.


Torvalds ripped it apart nearly a decade ago.

It's a useless benchmark. What I want to see is things like time to compile a bunch of different software: tasks that take long enough for the processor/cooling to reach thermal equilibrium, etc.

I.e. stuff that more closely matches the real world


>There’s been a lot of criticism about more common benchmark suites such as GeekBench, but frankly I've found these concerns or arguments to be quite unfounded. The only factual differences between workloads in SPEC and workloads in GB5 is that the latter has less outlier tests which are memory-heavy, meaning it’s more of a CPU benchmark whereas SPEC has more tendency towards CPU+DRAM.

https://www.anandtech.com/show/16226/apple-silicon-m1-a14-de...


He ripped apart a very different benchmark, for what it's worth - that was GB3 at the time, I believe. GB5 was a rewrite to make it equal across platforms. In real-world use it actually is far more relevant than thermally limited benchmarks. It just measures max peak burst performance... which is important, because 90% of all users use their computer only for bursty tasks rather than long-term processing. See exporting a 10-second clip on an iPhone or loading a heavy SPA webpage on a Mac. These are 5-second burst tasks where real-world use would not be thermally limited but would see real change consistent with Geekbench.

It's really only intended to be one of many benchmarks to tell the whole story; of course Linus would attack that because it doesn't make any real sense in his use and isn't the full story for him. If Geekbench was not tested, it would not cover the majority of computing uses and it would weigh cpus that had poor turbo or burst performance unfairly high for most uses.

Geekbench is kinda like 0-60MPH times and other tests (like SPEC2006) are like top speed I guess? The whole story is between them.
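On the scoring side: composite suite scores are typically geometric means of the subtest scores (SPEC does this, and I believe Geekbench's section scores work the same way), which matters because a geometric mean punishes a single weak subtest harder than an arithmetic mean would. A small illustration with made-up subtest numbers:

```python
import math

def geometric_mean(xs):
    return math.exp(sum(math.log(x) for x in xs) / len(xs))

# Hypothetical subtest scores: three steady results and one slow outlier.
subtests = [1700, 1650, 1680, 900]

arith = sum(subtests) / len(subtests)
geo = geometric_mean(subtests)
print(f"arithmetic mean: {arith:.0f}, geometric mean: {geo:.0f}")
# The geometric mean lands visibly below the arithmetic one because of the outlier.
```

Geometric means are used because they treat each subtest's relative speedup equally, regardless of its absolute scale or units.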


I believe this is the discussion OP is talking about: https://yarchive.net/comp/linux/benchmarks.html


The results seem a little weird, but if they're even remotely true then these machines are going to sell like hot cakes.

Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitive as they are?


> Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitive as they are?

- locked bootloader - no bootcamp - can't install or boot linux or windows

- virtualization limited to arm64 machines - no windows x86 or linux x86 virtual machines

- only 2 thunderbolt ports

- limited to 16GB RAM

- no external gpu support/drivers - can't use nvidia or amd cards

- no AAA gaming

- can't run x86 containers without finding/building for arm64 or taking huge performance hit with qemu-static

- uncertain future of macos as it continues to be locked down


Although I personally care about every point you listed, I don’t think most buyers care about any of them.


Maybe the gaming part.


To the extent that games are made available for macOS/ARM in the first place (admittedly a sticking point), it looks like these machines will be able to play most of them reasonably well. Certainly much better than most of Apple's previous machines with integrated graphics or low-end discrete GPUs.


iOS games/apps run on Apple Silicon Mac, so that by itself opens up a huge gaming market.


Casual gaming market.


In other words, the biggest gaming market.


Maybe, but distinct from real "gaming".


A game is a game. There is no "real gaming". Apple Arcade with a Xbox controller paired to a Mac is actually a fun gaming device for some types of gamers.

Genshin Impact is a great game that is on iOS in addition to "real consoles". Oceanhorn 2 is an amazing game that was originally on Apple Arcade and brought to Nintendo's "real console".

There is also quite a number of ports that I think you aren't aware of.


There is a difference between Tetris on Facebook and Dota or CSGO on PC. The latter is "real gaming", the former is not. The border might be a gradient.

It's like calling yourself a programmer because you can set a timer on your VCR. (dated but still accurate)


And since when is CSGO a GPU-demanding game?

You think "real games" are for "hard triers only", but it's just your point of view.


Mine and many others. Rightfully so, see the example.

GPU was not the issue here.


umm there are more ports than you think.


I work on multiple macOS machines, my servers run Linux, and I have a single Windows machine just for the few games that can't be run elsewhere.


If porting iOS games is easy, then gaming for most people shouldn't be an issue.


I feel like the demand for desktop versions of simple phone games is pretty low.

Mac users who hope to play anything from their steam library or dual boot Windows are going to be very disappointed.


> Although I personally care about every point you listed, I don’t think most buyers care about any of them.

Buyers don't especially care about performance either, to be honest, unless one of those factors means they actually need it.


Consumers absolutely care that their Apple MacBook Pro has 20 hours of battery life, when the comparable XPS has only 10 hours or whatever.


The ROI for battery life falls off at a certain point, right? For phones, it's probably about a day -- how often is it a problem to plug in your phone at some point in a 24-hour period? -- and for laptops it's often about a full workday, 8-10 hours. I'm not saying that a 20-hour laptop battery isn't an incredible accomplishment, but I do think that I care a lot less about 20 vs 10 hours than I do about 10 vs 5.


And then sit at a desk all day within reach of a power socket


This. The number of times I've truly needed more than a couple of hours of battery life is small. I think most people think they want more battery life when they really don't need it. Just add more cooling to stop those processors from throttling all the time.


> "no AAA gaming"

This is, arguably, a disadvantage of any Mac.

But Apple Silicon may actually improve the situation over time, as having the same GPUs and APIs on Macs and iOS devices means there is now a much bigger market for game developers to target with the same codebase.


It's more about losing the ability to boot into windows to game there, as well as losing egpu support.


I agree. There is a danger, the same as with apps actually, that developers will target only the iPad (touch interface) and won't care about optimizing for the Mac experience.

But on the whole I am optimistic.


Apple has had support for game pads for a while now. This will just make it more viable.

The only issue might be multi-touch based games on M1


And don’t forget that all of the iPad/iPhone games will work on these laptops. That’s not quite the same thing as having major PC titles, but it’s not nothing either.


That and streaming direct to the browser


> But Apple Silicon may actually improve the situation over time, as having the same GPUs and APIs on Macs and iOS devices means there is now a much bigger market for game developers to target with the same codebase.

Not really. The business models for desktop gaming are completely different to mobile devices, and there is no meaningful common market.

I think people will actually be surprised at how few games from iOS will even run on an ARM Mac because developers will block them.

It used to be possible to do some gaming on a Mac - the vast, vast majority of Steam users have graphics hardware of a level that was perfectly achievable on a Mac, especially with an eGPU. The end of x86 is the end of that market, forever.


> "the vast, vast majority of Steam users have graphics hardware of a level that was perfectly achievable on a Mac"

Exactly. So it was never really the hardware that held back gaming on Mac, but the fact that from a game-development perspective it's an esoteric platform that has limited or no support for the main industry standard APIs (DirectX, Vulkan, etc).

It was never worth the effort for most game developers to bother porting games to the Mac because writing a custom port for Metal was way too expensive to justify for such a niche market.

But now with Apple Silicon, that all changes. If you're going to port your game to iOS (and that's surely tempting - it's a huge platform/market with powerful GPUs and a "spendy" customer base) then you basically get Mac for free with close to zero additional effort.


> Exactly. So it was never really the hardware that held back gaming on Mac, but the fact that from a game-development perspective it's an esoteric platform that has limited or no support for the main industry standard APIs (DirectX, Vulkan, etc).

I think it's more that gaming wasn't held back on the Mac. It's just that Boot Camp was much more common than people think.

> If you're going to port your game to iOS (and why not? It's a huge platform with powerful GPUs and a huge, "spendy" market)

Because mobile gaming and desktop gaming have very little in common. Note that Nintendo didn't port their titles when they released iOS games, they made new games. Users want different experiences, and flat ports of successful console gaming titles to iOS tend to fail. There are, all told, very few ports of successful PC/console games to iOS, and those that exist tend to be brand reuse rather than literal ports.

> then you basically get Mac for free with close to zero additional effort.

Not even remotely. The way you secure your service has to be totally different, the UI paradigm is completely different, you have to cope with totally different aspect ratios etc etc. It's significant effort, and it will be very hard to justify for most game studios. It's certainly more work in most cases than porting a Windows game to MacOS was when using a mainstream engine, and that was not a huge market.


Why would developers "block" their iOS games from running on ARM Macs?


1) Macs are harder to consider secure; they're effectively all jailbroken. Cheating and bypassing in-app purchases will be rampant, reducing the opportunity for cross-play, and the Mac market isn't big enough by itself. These aren't insurmountable issues, but they require investment, and the additional Mac market probably isn't worth the outlay and risk.

2) You have to rebuild the UI, which costs money which the Mac version may well not recoup.

3) You'd have a different version for desktops that costs more upfront, with less reliance on the in-app mechanics you don't want to undermine.


> "Macs are harder to consider secure, they're effectively all jailbroken."

OK, but that's no different to Windows and Android.

> "You have to rebuild the UI"

No. Even with apps this is no longer the case (see: "Mac Catalyst"), and it's certainly not true for games. Maybe you'd need to add some keyboard/mouse bindings, but that's about it. Even iPads support keyboards and mice nowadays!


So if an iOS title uses a multi-touch gesture, how do you replicate that on macOS?


Every MacBook has a multitouch trackpad. It's rare that I ever use a mouse.


Still doesn't account for desktop devices. And no, it's not a given that every desktop mac user has a Magic Trackpad.


You can do those multitouch gestures on TouchBar /s


"uncertain future of macos as it continues to be locked down"

Citation Needed.

Apple detractors LOVE to bring this idea up, but there's nothing to it in any real sense. Do Macs ship with a checkbox filled in that limits software vendors? Yes. This is a good thing. Is it trivial to change this setting? Also yes.

Anyone who buys a Mac can run any software on it they like. There is no lockdown.


There is a lockdown as you cannot even boot linux anymore...


I guess that may count for you, but I mean lockdowns within the OS itself.

I don't care that I can't run Linux on my Mac. If I wanted to run Linux, I'd have different hardware.


You can change that on T2-based Intel Macs, at least, just like on Windows: https://support.apple.com/en-us/HT208330

Of course, Apple as an OEM does not support running non-Mac OSes, so virtualization should still be preferred for most use cases.


That's why I said "anymore". It used to be possible, but it no longer is with the new Apple Silicon laptops.


There's no indication from Apple that they are intentionally not supporting this feature - just that it doesn't exist right now. That said, in practice I never use Boot Camp because the driver support is always sub-par. It's a much nicer experience to virtualize, especially now that most virtualization platforms offer native support for Apple's virtualization libraries, such that installing third-party kernel extensions is less necessary now than ever before. (I think the only one I tend to install now is Tuxera NTFS support, which tends to be really high quality. Apple should just buy Tuxera and ship it natively.)


I guess Microsoft will have to hurry up and build an ARM version of Windows so they can keep the rounding error number of BootCamp users satisfied.


An ARM version of Windows has existed for almost a decade. In fact it's running on quite a few laptops right now.


Since 1997, to be exact. WinCE has run on ARM since time immemorial.


There is an ARM version of Windows 10 that runs on the Surface Pro X [1].

[1] https://www.microsoft.com/en-us/p/surface-pro-x/


Not only that, the first released version of NT on ARM was in 2012.

They had crappy code signing policies (only store apps on Windows RT tablet) which guaranteed poor adoption but that was a policy decision, not a technical one.


Catalina already broke a TON of legacy software and you cannot downgrade newer Macs to Mojave (at least not without some serious hacking efforts, and I know at least one person who tried and failed).


That's not true at all. You can use recovery mode to trivially revert back to the OS that was installed when the computer was purchased. If that's pre-Mojave then you can just upgrade back to Mojave afterwards.


What if the Mac had a newer OS than Mojave originally installed on it? That is how I interpreted the parent poster's comment. Given this interpretation, I don't think I'd expect to be able to install an earlier OS.


With that interpretation, you'd be correct but I don't think you've ever been able to downgrade to something earlier than what it came with since the older OS wouldn't include the appropriate drivers or kexts to properly run the hardware.


So the choice is between running an older OS version that will go EOL sooner, or abandoning the ability to dual-boot? How is that OK?


ARM Macs can't run unsigned software.


Yes, but code signing can be ad-hoc, can be done automatically at build time, and doesn't require notarization. So it's basically just a way to ensure the binary has not been tampered with. I don't really see the problem here, as the code signing itself does not prevent any kind of code from running on macOS Big Sur.
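Conceptually, the tamper-detection part is just recording a digest of the binary when it's built and checking it before the code runs. A toy Python illustration of that idea (real code signing hashes per page, covers entitlements, etc., so this is only the gist):

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # Recorded at build/"signing" time.
    return hashlib.sha256(data).hexdigest()

def verify(data: bytes, recorded: str) -> bool:
    # Checked at load time: any modification changes the digest.
    return fingerprint(data) == recorded

binary = b"original program bytes"
sig = fingerprint(binary)

print(verify(binary, sig))             # untouched binary passes
print(verify(binary + b"patch", sig))  # tampered binary fails
```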


With iOS, you have to be an Apple developer paying $99/yr to do ad-hoc signing; I'm guessing it's the same now for macOS.


That's not true. Anyone with an Apple ID can sign binaries they build and install them on their iOS devices.

The signatures only last 7 days instead of 1 year, but there is no fee for a user to sign and install their own binaries.


My question was about macOS and whether similar behaviour exists there too with the M1 Macs.

To clarify the iOS case: does the app erase itself after 7 days? Or is it that you can only install an app within 7 days of downloading/using Xcode?


To answer my own question, an ad-hoc signed iOS App will deactivate after 7 days unless you pay $99/yr. This behaviour is not present on Big Sur and likely M1 Macs, they can still run notarized and non-notarized apps: https://arstechnica.com/gadgets/2020/11/macos-11-0-big-sur-t...


If you can sign ad-hoc then there's no point, right? Just modify and re-sign.


Can be ad-hoc? For how long?


Cite?


- Starts at $999 for the base laptop version. You can get much cheaper, still-good Windows laptops.


You lost me at "good Windows laptops".

macOS has plenty of warts, but my experience with high quality equipment (Thinkpad, XPS, Alienware) has left me ultimately disappointed with Windows in many day to day situations compared to Mac.

Windows is still clunky, despite many improvements. And aside from a ThinkPad X1 Carbon, I haven't used any laptop with the performance and build quality (for the size/money) of a MacBook Air.


If you need a computer for serious (long-hours) use, I would always go for a desktop, as you can get a vastly superior machine to any laptop, with massive amounts of disk space, memory, tons of cores, screens, etc. If you want a Mac, I'm not familiar with desktop Macs, but I'm guessing the Mac Pro machines blow laptops out of the water the same way high-end desktop PCs do.

For travelling, I don't think anything beats a Macbook due to how light, thin, and resilient they are. But my 2016 MBP is a pretty shit machine for its price. It's also loud (like every other laptop I've had). I avoid using it. Sure, if you take size/design/mechanical quality into account, it is probably unmatched. But for 95% of my computer usage, those are irrelevant, as I just sit at my desk. I had a company provided HP laptop (not sure if stock or upgraded by our IT staff) at my previous job which was far more performant than my Macbook, so I don't really agree that Windows laptops are necessarily bad, but it was even louder than the Macbook, and of course clunky and ugly.

For me personally, the new Macbooks are disqualified as viable work machines if it's really true that you can't use more than 1 external screen. That's just not a viable computer for me (for work). I will always have a Macbook though just because of how much I love them for travel. But a Macbook is more of a toy than a serious computer, especially if the 1 screen limit is true.


I'm in the market for a new work machine myself, and have been eying a final-generation loaded Intel MBP16. I'm sure the AS models will catch up on graphics capability by the end of their transition time, though I'm certainly wondering what the first AS MBP16 will do for graphics. I certainly wouldn't buy less capability than the 5600M myself.


" I'm not familiar with desktop Macs but I'm guessing the Mac Pro machines blow laptops out of the water the same way high end desktop PCs do."

Unfortunately they will also blow your wallet.


Wow, I just checked and yeah those prices are pretty insane, especially if you want a better-than-base model. I guess then in the desktop arena, Macs are at a disadvantage, because you can build a similarly powerful PC for a much more reasonable amount.


Certainly the Pro desktops must be intended for pro users who can quantify the number of billable hours they will save in Final Cut or Logic and come up with a "return on investment" figure.

The iMacs are a mystery to me, but I guess I'm not the target market anyway. (I have a 2018 MBP.)


> you can build a similarly powerful PC for a much more reasonable amount

It's not even a contest or "similarly powerful": spend $3000 on an AMD + Nvidia PC and it's significantly more powerful than the $5000 Mac Pro in both CPU and GPU compute.


In my experience, the Thinkpads are indeed the only real competition, hardware-wise.

When my current Mac dies, that's where I'm headed, but running Linux; Microsoft is less of a danger, so I don't outright boycott anymore, but I still find Windows super annoying to use.


The Windows market is all over the place: everything from leftover-bin junk from 5 years ago sold as new, to high-end cutting edge. Below $900 the market is decidedly on the crap side with respect to Windows machines. There are some exceptions, but usually you have to get to $1200-1800 before you start getting quality items. Not saying you can't find good stuff near the bottom, but you get what you pay for; usually they skimp on the screen, memory, and disk.

I am currently using an MSI Stealth gaming laptop. Other than the keyboard layout being slightly odd, I am liking it a lot. It replaced my previous HP of 8 years, which will find a new home doing something else once I do a full teardown and repasting. Luckily it is one of the last HP laptops where taking it apart is not a total nightmare.

Finding a decent laptop takes a lot of work. Going with Apple has a lot of advantages, as the hassle of 'picking' is cut down to a few models and you have a good shot of it being decent. I personally would not buy an Apple, but that is because of other 'petty' reasons and not quality.


If Lenovo could for once figure out why the speakers on these ThinkPads are SO BAD, I wouldn't be reading this thread because I wouldn't care about Macs. I know there are headphones, but many times when I'm alone I just want to watch a video and actually hear the people talk; can't do that with the ThinkPad.


I bought a thinkpad hoping to avoid quality issues that I’ve experienced with other machines. The hardware is great (except for the wimpy cooling), but I have had various annoying issues with drivers, bios updates, and the behavior of their system update tool.


Maybe that's specific to Windows or the bloatware preinstalled with Windows? The high-end ThinkPads (T/X/P/W/Carbon) are generally well supported by Linux distros, partially thanks to many kernel and distro developers using them.

As for BIOS (well, EFI these days), that should be handled very seamlessly via fwupd on all major Linux distros: https://fwupd.org/lvfs/devices/

(Frankly, this seems much more robust than how it is handled on Windows: not at all, or via half-broken OEM bloatware.)


You can also buy some ThinkPads with Linux preinstalled now if you don't want to worry about hardware compatibility issues.


It's been a couple of generations since I used a ThinkPad, but wiping and carefully reinstalling only the useful drivers/apps was how I did it. Perhaps it's locked down now such that you can't do that (and if so, I wouldn't buy it!)


In my experience, Microsoft and PC laptop manufacturers still haven't figured out how to make a trackpad that works as well as a 2008 era Macbook.


I have both platforms at the office for years, still haven't discovered what is so magic about the trackpad.


The biggest difference between MacBook trackpads and the best Windows ones is the super low hysteresis of pointer motion vs finger motion. I recently bought and returned a Microsoft Surface Book with a "precision touchpad". The main reason for returning it was that pointer control felt sluggish compared to the MacBook, and its pointer speed was too slow even at its fastest setting. The best Dell touchpads are no better, and Lenovo trackpads are even worse.

I understand this may be because PC touchpad hardware reports jitter (sometimes higher than it really is), which causes the Precision Touchpad software to increase the hysteresis. MacBook touchpads have low jitter and the driver is tuned to benefit from it.

If anyone at Microsoft with input into the Precision Touchpad reads this: why don't you fix it, or work with your licensees to fix it?
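The jitter-vs-hysteresis trade-off can be sketched with a toy smoothing filter (purely illustrative; I don't know what filtering Precision Touchpad actually uses): the harder a driver smooths noisy samples, the further the pointer lags behind the finger.

```python
def smooth(samples, alpha):
    """Exponential moving average: alpha near 1 tracks raw input (jittery),
    small alpha smooths heavily (laggy)."""
    out = [samples[0]]
    for s in samples[1:]:
        out.append(alpha * s + (1 - alpha) * out[-1])
    return out

# A finger moving steadily from 0 to 100 units, with alternating sensor jitter.
raw = [i + (1 if i % 2 else -1) for i in range(101)]

light = smooth(raw, alpha=0.9)  # keeps most jitter, barely lags
heavy = smooth(raw, alpha=0.1)  # suppresses jitter, lags the finger

lag_light = raw[-1] - light[-1]
lag_heavy = raw[-1] - heavy[-1]
print(f"lag with light smoothing: {lag_light:.1f} units, heavy: {lag_heavy:.1f} units")
```

If the hardware over-reports jitter, the driver picks a smaller effective alpha than necessary, and you get exactly the sluggish pointer described above.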


Sounds like exactly the kind of thing you can optimize much easier when you control both the hardware and the software.


Like Surface laptops?


Edge to edge uniformity, physical feel, virtual feel of the scroll and motion, gestures. Other than that, sure, same.


Yep, though sadly, in my view, the current lineup of MacBook trackpads isn't as good as the 2008-era MacBooks' either...


I disagree, my razer blade 15 has been amazing.


IIRC, binaries on arm on osx have to be signed.

I.e. "We raised the walls on our garden further"

Balls to that, if I buy hardware I want to be able to run what I want on it or it's not a general purpose computer, it's something else.


The Macintosh, by design, was never a general purpose computer. It was a computer that Steve Jobs allowed you to use. The Apple II was the general purpose computer that Woz championed.


The claim of no AAA gaming is totally uncertain -- Apple seems to think that you'll be able to run games in Rosetta with better performance than what you can get on the existing 16" Macs. I guess we'll have to wait and see, but if these new Macs really are so great, I'd expect devs to start porting their games.


>- locked bootloader - no bootcamp - can't install or boot linux or windows

This has been a claim made about the Macs since the T2 chip came out. It was strictly false then (you just had to boot into Recovery Mode and turn off the requirement that OSes had to be signed by Apple to boot) and we still don't know for sure now. Apple has stated in their WWDC that they're still using SecureBoot, so it's likely that we can again just turn off Apple signature requirements in Recovery Mode and boot into ARM distros.

Whether or not that experience will be good is another thing entirely, and I wouldn't be surprised if Apple made it a bitch and a half for driver devs to make the experience usable at all.

>- virtualization limited to arm64 machines - no windows x86 or linux x86 virtual machines

True, but this isn't a strictly unsolvable limitation of AS and more like one of those teething pains you have to deal with, as it is the first-generation chip in an ISA shift. By this logic, you could say that make doesn't even work yet. Give it some time. In a few months I expect all of these quirks to be ironed out. Although, I suppose if you're concerned about containers it sounds like you want to be in the server market, not the laptop market.

>- only 2 thunderbolt ports, limited to 16GB RAM, no external gpu support/drivers, can't use nvidia or amd cards, can't run x86 containers without finding/building for arm64 or taking huge performance hit with qemu-static

See above about "give it some time".

>- no AAA gaming

I mean, if you're concerned about gaming, you shouldn't buy any Mac at all. Nor should you be in the laptop market, really. Although, this being said, the GPU in the new M1 is strong enough to be noted. In the Verge's benchmarks, Shadow of the Tomb Raider was running on the M1 MacBook Air at 38FPS at 1920x1200. Yes, it was at very low settings, but regardless – this is a playable framerate of a modern triple-A game, in a completely fanless ultrabook ... running through a JIT instruction set translation layer.

>- uncertain future of macos as it continues to be locked down

I disagree. I know we were talking about the M1 specifically, but Apple has shown that the future of ARM on desktop doesn't have to be as dismal as Windows made it out to be. Teething pains aside, the reported battery life and thermal performance on the new AS machines have been absurdly fantastic. I think, going down the road, we'll stop seeing x86 CPUs on all energy-minded machines like laptops entirely.


> - no AAA gaming

I thought Google, Microsoft, Nvidia, etc. were all pushing streaming gaming services that will run on any hardware with a decent internet connection. I would imagine the hardware video decoder in the M1 chip would allow 4K streaming video pretty well.


But these "features" are not highlighted on the product page (aside from memory). The core count and battery performance are listed. I think many people will buy these. And arm64 containers will come in time with adoption.


As a related benefit to a non-Mac user: ARM64 support for packages at build time is going to greatly improve over the next few years!


Most games played by most players are on iOS & iPadOS, and macOS Big Sur will run them on your MacBook.


Hmm. I might buy one anyway and use a remote docker host for x86...


> Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitive as they are?

There are enough people who do not want to deal with macOS and Darwin regardless of the hardware specs.

Also, the path of least friction is usually to use whatever the rest of your team uses. There are even relevant differences between Docker for macOS and Docker for Linux that make cross-platform work difficult (in particular thinking about host.docker.internal, but there are certainly more). Working with C/C++ is another pain point for cross platform, which already starts with Apple's own Clang and different lib naming conventions.

Going away from x86 certainly does not make this situation better.


> Working with C/C++ is another pain point for cross platform, which already starts with Apple's own Clang and different lib naming conventions.

A walk in the park for anyone who has had to deal with coding in C or C++ across UNIX flavours.


Or anyone who has had to deal with Microsoft's own version of everything.


Name one C or C++ compiler vendor that has a pure ISO compiler with zero extensions.

Toy projects don't count.


I do web development and I'm not sure how my locally compiled libs will behave on x86-based servers. We often upload our local build artifacts to the DEV envs... I'm not sure this will work on a different arch.

That said, my wife returned the macbook air she bought 3 weeks ago in favor of this new one, so I'll be able to test on that machine before I dive in.


I'm primarily a Mac user but laptops are cheap. If I were working on a team doing Linux development for x86 I'd certainly have a Linux laptop for that even if I preferred a MacBook for other purposes.


Until all software is ported to ARM, it will run in emulation, which is going to be slower in most cases. People invested in plug-in ecosystems, like DAWs or video editing, will likely have an endless long tail of plug-ins that aren't getting ported, or that require a re-purchase to get an ARM version. And due to Rosetta's architecture, you can't mix ARM and x86 plug-ins (in-process, like VSTs - Apple wants you to use AUv3 which is out of process but nobody does that), so you will be running your entire workflow under emulation until you can make the switch hard and all at once. And some of your software will never make it.

Mark my words, this is going to be a massive shit show for people using those ecosystems, for 5 years if not 10. It already happened with the PPC transition.


Rosetta2 is mind blowing.

“Fun fact: retaining and releasing an NSObject takes ~30 nanoseconds on current gen Intel, and ~6.5 nanoseconds on an M1”

https://mobile.twitter.com/hhariri/status/132678854650246349...

“…and ~14 nanoseconds on an M1 emulating an Intel”


How does it handle DSP inner loops? SIMD? x87 code? Floating point corner cases like denormals? What about the inevitable cases where impedance mismatch between the architectures causes severe performance loss? Is Rosetta2 binary translated code guaranteed to be realtime-safe if the original code was? What about the JIT? There's no way that is realtime-safe. What happens if some realtime code triggers the JIT?

We still can't emulate some 20-year-old machines at full speed on modern hardware due to platform impedance mismatches. Rosetta2 may be good, but until someone runs a DAW on there with a pile of plug-ins and shows a significant performance gain over contemporary Intels (and zero unexpected dropouts), I'm not buying the story of Rosetta2 amazingness.


Rosetta2 is not emulation.

Edit: And Apple has already discussed how Rosetta2 handles complexities like self modifying code. It probably won’t help with performance but the M1 has a lot of power to keep even that code running fast.

But more importantly, video/audio apps aren't going to be using Rosetta2 for very long. 99% of code written for x86 macOS, if not more, is going to be a simple recompile to native. Not going native when your competitors did and got 2-5x faster is corporate suicide.


Rosetta2 is emulation just as much as qemu and Dolphin are emulation, both of which also use binary translation like every other modern emulator. Apple marketing just doesn't want you to call Rosetta2 an emulator because "emulators are slow". Anything running software on a different architecture is an emulator.

If you read my parent comment you'll see how DAWs are going to be using Rosetta2 for years to come, maybe even a decade, for many people. Even if there are ARM versions, you won't be able to use them until all your dozens if not hundreds of plug-ins, some of which won't be actively developed any more or will require a re-purchase for an ARM version, have also migrated.

People invested in such ecosystems aren't just going to up and give up half their collection of software, or spend thousands re-purchasing upgrades to get ARM versions.


Apple is a much bigger market now than during the PPC transition, though.


Won't emulation catch up in those timeframes? Still, 3+ years is a long time...


Three years isn't that long for CPU performance gains anymore, but even if it was, it isn't the emulation that gets faster, it's the hardware. Contemporary ARM machines emulating x64 would still be slower than contemporary x64 machines natively executing x64.

You're also going to be in a bind if Apple decides they don't care about the long tail and stops supporting emulation before all of your plugins have been converted (if they ever are).


There is no emulation per se; there is a one-time AOT translation of Intel code to ARM. Then that native code just runs, so no emulator is running on the CPU while the app is.

There is an exception for apps with JIT and those will perform poorly (think Chrome and every Electron app).


"Emulation" is a catch-all term that includes binary translation, static and dynamic, which every modern emulator uses (Apple just doesn't want you to use that name because people think emulation is slow). Rosetta2 is not a pure static translator, because such a thing can't exist (see: self-modifying code).

Just because binary translation is used doesn't mean it's magically as fast as native code. Converting code that runs on architecture A to run on architecture B always has corner cases where things end up a lot slower by necessity.


> So no emulator is running on the cpu while the app is.

Nonetheless, the translated code is going to be slower than ordinary native code because a lot of the information compilers use for optimization isn't available in the resulting binary, so the translator has to be extremely conservative in its assumptions.


Yet the translated code will still run faster on the M1 than the original code runs on x86.


Citation needed. Not a microbenchmark, or a single example of some software. Actual sustained mixed workload usage of real life applications. Especially realtime-sensitive stuff like DAWs (where you have the added risk that calling into the JIT in the middle of a realtime thread can completely screw you over; keeping realtime-safe code realtime-safe under dynamic or even static binary translation is a whole extra can of worms).


Benchmarks are now out, and x86 Chrome is faster on the M1 MacBook Air than the x86 MacBook Air.


Sustained benchmarks await production hardware. But it will be surprising if Rosetta2-translated apps run slower. Not only will system calls be native, but common operations like retain/release are 2x faster under Rosetta2.

https://mobile.twitter.com/hhariri/status/132678854650246349...


That's a microbenchmark. There are a myriad reasons why one specific thing might be faster under a new CPU even under emulation. That doesn't mean other things won't be much slower.


Oh dear lawd, Electron apps can get slower?


Oh yes. JS -> bytecode -> JIT for x64 -> interpreted and converted to ARM (both using CPU and memory).

And most use electron-builder which does not have Mac Arm support. Expect super slow mode for a while!


We still can't emulate some 20+ year old machines at native speed on modern hardware under certain conditions. Emulation always has corner cases where some awkward impedance mismatch between both architectures causes severe performance loss.


> Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitive as they are?

as a power user I will not be touching anything apple ARM until all my hundreds of software apps are certified to work exactly the same as on x86_64. i will not rely on rosetta to take care of this. i need actual testing.

besides this, 8GB of RAM is how much a single instance of chrome uses. i run 3 chrome instances, 2 firefox and 2 safari. and this is just for web.

this could be a good time to jump the apple ship. it's pretty clear their focus is not their power users' focus.

as such i was looking into a lenovo thinkstation p340 tiny. you can configure it with 64gb ram and core i9 with 10 cores and 20 threads for less $$$ than what an underpowered 6 core mac mini is selling for.


> this could be a good time to jump the apple ship. it's pretty clear their focus is not their power users' focus.

Apple is at day 1 of their two year migration to Apple Silicon. Your judgement seems not just a little premature.


I jumped ship back to Linux (still sucks). This is the first new computer I’ve bought in almost a decade.

I think many professionals who need new hardware will use this as the catalyst to make them move back to PC hardware. The M1 looks amazing, but I need more than just Apple software to do my work. It’ll be a while before all the things I use get migrated.


it depends how you consider it.

“two year migration” sounds just about right for a transition to something non apple.

we can then re-visit apple in 3 years time.


I don't replace laptops that frequently, currently.


> this could be a good time to jump the apple ship. it's pretty clear their focus is not their power users' focus.

Their focus is not on power users? They just completed the first, small step of the migration to ARM. They only updated the very low-end models, those that were never targeted at power users anyway, and we're seeing that their cheapest, lowest-end models are whupping the i9 MBPro's ass.

Sure, the features and RAM may not be there yet, but again, these are the low-end models. If we're seeing this level of performance out of an MBAir or Mini, I can't wait to see what the Mac Pro is going to be capable of.


They also updated the MacBook Pro so that is exactly the performance you are going to get for this generation.

The big screen model might give you more cores and RAM but IPC is going to be exactly the same.


They updated the lesser 13" Pro, but not the high-end 13" Pro (since 2016, it's been separated into two lines, with the high end one distinguished by higher TDP, four thunderbolt ports, and more fans) or the 16". IPC will be the same, sure, but I'd expect the higher end 13" and the 16" will have more cores or higher clock speed or both, to soak up the extra TDP headroom.


The 13" MBP was never a pro "pro" model. I bet the big screen models next year will have more RAM and maybe an M2 chip.

> but IPC is going to be exactly the same.

I am not sure what you mean with this?


It's the same chip, so single-core performance is going to be the same, unless they raise the clock.


Why don’t you think the M2 will increase clock speed?

And the problem with the M1 isn't performance; single core is already off the charts. The M2 is going to provide 32GB and 64GB systems with up to four Thunderbolt/USB4 ports and support for dual 6K monitors.


I doubt that the M1 or M2 is going to have superior single core performance to the upcoming Zen4/5nm laptop chips.

Let alone multicore performance. Apple's cores are also far behind in IO; 64GB of RAM and 4x Thunderbolt is less than what current-gen laptop chips can do.


I agree that Zen4 should be comparable, but it also will cost 4X to make, and more to implement since it doesn’t include RAM.

The M1 is a system on a chip, with all the benefits and drawbacks of that including RAM and port limits.

The next releases will likely be A) a tweaked M1 for higher end PowerBooks with more RAM and ports and B) a desktop version with plenty of ports, significantly higher clock speeds, and off chip RAM.

I think there will always be faster CPUs out there, but not remotely near the M series in power per watt, and cost per power.


Zen is also an SoC, but with off-chip memory, this brings other advantages.

Most importantly, Zen 4 is a chiplet design, so for the same amount of cores it will be cheaper to make than the M1 chip.

As for performance per watt, Renoir in low power configurations matches the A12. I would really doubt that a laptop Zen 4 on 5nm LPP wouldn't pass the M1/M2 in both performance and performance per watt, because Renoir is on 7nm with an older uArch and gets close.


> it's pretty clear their focus is not their power users' focus

Depends on the definition of "power user". Music producers, video editors, and iOS developers will be served quite well.

> lenovo thinkstation p340 tiny. you can configure it with 64gb ram and core i9 with 10 cores and 20 threads for less $$$ than what an underpowered 6 core mac mini is selling for.

When making that calculation, one should also take power consumption into account. $ per cycle is very low now with the new CPU.


Taking power consumption into account makes sense when the machine is running on battery power, but all modern processors are power efficient enough for the cost of electricity to be negligible for a tiny desktop computer.


I agree for the most part; the exception would be if I were running the small machine as a server. I know this is outside of most use cases, but if I were buying a machine to have on all the time (Plex, email, whatever), I'd want to at least feel like it's not driving up my electric bill.


This is where idle power matters. I recently replaced a pretty low-power Atom in my NAS with an i3-9100F. The peak power usage is probably a good 2x higher, but the idle power is just a couple of watts, so I expect my average power draw to be much less, since the power draw under Plex etc. is about the same and the machine sits idle most of the time.


My tiny desktop is on all the time (but idle most of the time) and that was the frame in which I wrote my comment.


Hypothetical scenario: You save 50W (maybe too high, maybe not), use the machine for 10h every day, and a kWh costs you €0.40 (eg in Germany). You save €0.20 per day, €73 per year, and €365 in 5 years. Definitely a factor in areas with high electricity prices.
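To sanity-check that arithmetic (the 50W saving, 10h/day usage and €0.40/kWh price are the scenario's assumptions, not measurements):

```python
# Hypothetical scenario from the comment above; all inputs are assumptions.
watts_saved = 50          # W saved by the more efficient machine
hours_per_day = 10
price_per_kwh = 0.40      # EUR per kWh, e.g. Germany

kwh_per_day = watts_saved * hours_per_day / 1000   # 0.5 kWh/day
daily = kwh_per_day * price_per_kwh                # ~EUR 0.20/day
yearly = daily * 365                               # ~EUR 73/year
five_years = yearly * 5                            # ~EUR 365 over 5 years

print(f"per day: {daily:.2f}, per year: {yearly:.2f}, 5 years: {five_years:.2f}")
```

The saving scales linearly with both the wattage gap and the local electricity price, so halving either assumption halves the result.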


I think for most power users, they probably generate significantly more than €73 in value from the computer every day (or maybe every hour), so they are probably not thinking too much about that savings.

(Of course, power savings are important in their own right for mobile / battery-operated use cases.)


"Video editors" are going to buy a machine with 8GB RAM? (Which, I assume, will be soldered to the motherboard, like all recent Apple products.) Good luck to them, I guess.


They also doubled (!!!) the SSD speeds, at least according to their slides. Presumably swapping will be much more seamless, so I'm not sure low RAM would be a huge issue for most day to day tasks.


It will still be a problem. RAM access is still orders of magnitude faster than an SSD's (tens of nanoseconds vs tens of microseconds). So even if they double SSD speeds, random access of small data chunks will still choke your performance.
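To put rough numbers on that gap (the latencies below are illustrative ballpark figures, not measurements of any particular machine):

```python
# Illustrative latencies (assumed, not measured): RAM vs fast NVMe SSD.
RAM_LATENCY_S = 100e-9   # ~100 ns per random RAM access
SSD_LATENCY_S = 50e-6    # ~50 us per random 4KB read on a fast NVMe drive

accesses = 1_000_000     # e.g. a workload touching a million random pages

ram_time = accesses * RAM_LATENCY_S   # total time if it all fits in RAM
ssd_time = accesses * SSD_LATENCY_S   # total time if it swaps to SSD

print(f"RAM: {ram_time:.1f}s, SSD: {ssd_time:.1f}s, "
      f"ratio: {ssd_time / ram_time:.0f}x")
```

Even with these generous SSD numbers, a swap-heavy random-access workload pays a several-hundred-fold latency penalty, which is why doubling SSD throughput doesn't substitute for RAM.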


Yes, Apple SSDs are back on the leading edge; they were previously about half the speed of the fastest Gen4 SSDs.

Low RAM is still an issue even with such fast SSDs; I say this as someone who ran RAID0 Gen3 NVMe SSDs (so, equivalent to what's in there).


> this could be a good time to jump the apple ship. it's pretty clear their focus is not their power users' focus.

Let's back up a second: Tim Cook said this transition would take place over two years. This is just the first batch of computers running Apple Silicon.

I certainly hope and think that Apple can come out with a beefy 16 inch MacBook Pro with 32 gigs of ram within the next two years. Also, in that time I imagine everything in Homebrew would be ported over natively.


As expected, the Apple M1 is a little faster than Intel Tiger Lake in single-threaded applications, but it is a little slower than AMD Renoir in multi-threaded applications.

So for things like software development, where you frequently compile your projects, the new Apple computers are a little slower than similar computers with AMD CPUs.

So even when taking only CPU performance into consideration, there are reasons to choose other computers than those with Apple Silicon, depending on what you want to do.

Of course, nobody will decide to buy or not buy products with "Apple Silicon" based on their performance.

Those who want to use Apple software will buy Apple products, those who do not want to use Apple software will not buy Apple products, like until now, regardless which have better performance.


> Of course, nobody will decide to buy or not buy products with "Apple Silicon" based on their performance.

That's exactly the reason why you would choose Apple Silicon right now, where you can choose between Intel and the Apple SoC. There are of course other reasons, such as battery life and price.


Not really. Right now Apple Silicon would be translating most code and therefore be slower and possibly have worse battery life. By the time that isn't true anymore, the option of buying Intel from Apple will be gone and your choices will be ARM from Apple or a PC with Intel/AMD.

The x64 options from Apple are also uncompetitive with existing PCs already because they're using Intel processors when AMD's are faster.


Most code? I would imagine that most code run on Apple laptops today would start with Safari. And then Slack, some IDEs, etc etc. These will all get ported extremely fast if they haven't already been.

There will be a long tail of edge case software that runs in emulation, but that won't affect the majority of users.


That's not how the long tail works. Any given esoteric piece of software won't be used by very many people, but very many people will use some esoteric piece of software.

You also have the problem with proprietary software that even if a port exists, it's not the version you have an existing license for, and you may not be able to afford a new laptop and all new software at the same time.


That's partially correct, and partially wrong. Long tail means that few people will buy/use a particular software package, but that if you have lots of such packages, you can make money. In the case of Apple Silicon, if there's an "esoteric" package, by definition it's only used by a small number of people.


The MacBook Air with 16GB RAM and a 512GB SSD isn't really priced competitively here in Germany. It's almost €1,600.


It's not trying to compete with German PCs (which have some nice options, but are still Windows PCs, not Macs).

I'm not an Apple fanboy, and I'm still very displeased with many of their decisions (touchbar being #1 on MBPs). But if you consider the packaging (small, light, sturdy, now-decent keyboard), and consider their performance, and then consider macOS, I think they are more than competitive.

Even if you match every spec, including size/weight and durability, it comes down to Windows vs macOS. Ironically, macOS is free while Windows is not, but macOS is worth more (to me and many others).


I got a new computer from work last year. I spent quite a while carefully studying my options, and what I saw came down to this:

If you're only looking for computers that are comparable according to the usual hardware specs (cpu, ram, etc.), a Mac costs 25-50% more than the cheapest comparable PC.

If you also throw ergonomic factors like weight and battery life into the comparison, there's no price difference.

(This was USA prices.)


Macs are significantly cheaper in the US than in Europe.


> macOS is free while Windows is not

what laptop are you buying where you need to purchase a Windows license?


The manufacturer of your laptop has paid Microsoft for the OS and is passing that cost on to you in the total price (excluding MS hardware such as the Surface).

Or if you buy a bare system or build your own, you need to buy Windows yourself.

Apple gives their OS away, but in theory you can only run it on their hardware.


When you buy a Mac you are subsidizing the development of that free OS. The price of the OS is baked into the MSRP in both cases; comparing the prices is apples-to-apples. Also, Microsoft has made it clear that Windows 10 will be around for a long time, so paid OS upgrades don't work the way they once did on Windows.


>Apple gives their OS away, but in theory you can only run it on their hardware.

If you don't understand why this isn't free then I have a bridge to sell you.


If the bridge auto-upgrades itself for free for the next 8 years after the initial capital expenditures, I'm sure you'll easily find buyers.


I have a MacBook Air 2012, and have been waiting to upgrade for... 2 years already. The laptop will probably end up being 10 years old by the time I upgrade...

- crappy webcam,

- no built-in SD card reader (a 1TB SD card is ~$200, and my music does not need to be stored on an expensive SSD)

- MagSafe... if this were the only downgrade, I'd upgrade, but TBH I love MagSafe on my Mac and I would miss it if I upgraded.


Eliminating MagSafe for power ports was one of the Apple choices I hated. So many times something has happened and ripped my power cable off my 2014 MBP, and MagSafe saved the laptop from damage or a fall. And worse, Apple has now applied new and different meaning to the same name :(.


Just buy a USB magsafe type cable for $20 and be done with it.


can you recommend one? all the ones i've found have horrible reviews


>macOS is free

Oh wow, that's cool, I didn't know that. Do you have a link to where I can download the free edition of macOS? Google doesn't seem to be helping me.


Depends on your country, but here's a US example: https://apps.apple.com/us/app/macos-catalina/id1466841314?ls...


€1665 in Czechia, and we have much lower purchasing power.


To be honest, you'd expect widely traded goods to trade at the same prices, regardless of local purchasing power.


The price seems to be the same except for tax. Germany has a 16% rate (July-Dec 2020) and Czechia 21% for sales tax/VAT.
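That roughly checks out: stripping German VAT from a €1,600 price and re-applying the Czech rate lands near the €1,665 quoted above (a sketch; real list prices also include rounding and market adjustments):

```python
# Rough check: same net price, different VAT rates (rates from the comment above).
german_price = 1600.0          # EUR, incl. 16% German VAT (July-Dec 2020)
net = german_price / 1.16      # price excluding VAT
czech_price = net * 1.21       # re-apply 21% Czech VAT

print(f"{czech_price:.0f}")    # ~1669, close to the EUR 1665 quoted
```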


I think it often depends on big resellers (like Best Buy, for example). They can provide discounts which Apple will not do directly to the customer, and which other resellers can't do because they don't turn enough units.

That's true of many other common goods worldwide. Unless you can buy a locally made item in a lower purchasing power country, you will usually pay a currency exchange equivalent price for the item. Actually you often pay more because the local shop selling the product cannot get bulk pricing and pass along the discount to you.

Finally, when you add the local taxes - 23% in Portugal, for example - the price can be much higher compared to Alaska, US (< 2%). That last bit is really not Apple's fault.


Before Brexit you could sometimes buy Apple stuff for less on amazon.co.uk than in the rest of Europe, because the price was fixed in GBP.


You can get a fanless PC with a 1,700 single-core Geekbench score for less than €1,600 in Europe?


Seems to be priced pretty competitively?


Asking myself that very same question. I've been booting linux, quite happily I might add, off mid/high-end Dells and HPs for a while. The last time I looked Airs were still dual-core, and much more expensive for 16 GB.

I'm not an Apple fan, but the change in value is stunning. I don't need a new laptop currently...


Many apps optimized for the x64 platform won't run as well as the benchmarks suggest.


I think this is an important one to keep in mind. I'm sure most native Mac apps will be compiled to ARM, but a lot of existing apps won't.

Plus there's the brouhaha about Electron apps.

I for one really wouldn't mind if Apple built native apps to replace Electron apps, e.g. a chat app that works as a good client for (what I have open right now in Rambox) Discord, FB Messenger, WhatsApp and multiple Slack workspaces. Or their own Spotify client. Or a big update to Xcode so it can use language servers like VS Code does, making it viable for (what I do right now) TypeScript, PHP and Go development.

They have more than enough money to invest in dozens of development teams or startups to push out new native apps.

One day I'll switch away from Chrome in favor of Safari as well. Maybe.

(I am taking recommendations for native alternatives to apps)


I can't understand why you think Apple should be building apps for their competitors. It's very strange.

Use Apple Music, Messages, Safari, Swift if you want first-class support.

Or one of the better options now might be to use the iOS apps for Slack, Spotify etc.


Most of the electron apps out there already have native iOS versions which will run natively on AS macs too, that should go a long way to smooth the transition (and will be interesting to see how much extra RAM you gain from not needing slack/spotify/notion etc to run on Electron).

I guess there will still be issues for people who need to run VMs or media apps like Adobe CC etc, and also it will take a while for some dev environments to be fully supported (https://github.com/Homebrew/brew/issues/7857 for example shows it will take some time to get to feature parity).

Overall though a lot of the hard work has already been done, and I'm sure in 2 years time or whenever the transition is 'complete', mac owners will be getting much more value for money with few drawbacks (the main one being higher walls around the garden)


> Most of the electron apps out there already have native iOS versions which will run natively on AS macs too, that should go a long way to smooth the transition (and will be interesting to see how much extra RAM you gain from not needing slack/spotify/notion etc to run on Electron).

They don't have desktop UIs, and will be a big step down for most users. You can't seriously argue the UI doesn't matter on a Mac.


Electron seems to have support in recent beta releases.


You still have to rebuild your software and explicitly support a bunch of different architectures.


Apple Silicon implements ARM's FJCVTZS instruction, a floating-point-to-integer conversion with x86-style semantics added specifically to speed up JavaScript engines, so Electron will be fine.


> Plus there's the brouhaha about Electron apps.

Won't this be handled by just porting V8 to the M1?


WhatsApp support in native apps isn't missing because teams or companies lack the will; it's just not possible, as there are no APIs. Everything you see is a workaround or a mashup of the WhatsApp Web feature.


Not just the lack of will - there has been hostile action and threats from WhatsApp against even community projects trying to build a client for a platform not supported by the official ones. This might no longer be the case, but a couple of years ago they still used to do that, so no wonder so few native clients exist.


They did mention in a presentation that some applications ran even quicker under Rosetta 2 than natively. Though Wine isn't an emulator, I've seen the same in Wine numerous times. How many, which, etc., who knows? Interesting to figure out regardless.


That happened regularly in the transition from 680x0 to PowerPC.


This is often because it's translating syscalls rather than emulating them, so for applications that are only asking the OS to do the real work, in those cases it's running native code. And then it's running it on a current day CPU instead of one from two years ago.

Unfortunately, although applications like that exist, they're not the common case.


One of the most common operations in macOS is retain/release of objects. Rosetta 2-translated code is TWICE as fast on the M1 as the original code on x86.

https://mobile.twitter.com/Catfish_Man/status/13262387851813...


Microbenchmarks are meaningless. Where are the benchmarks of real-world applications?


The M1 running native code can retain/release objects five times faster than x86 processors running native code.

x86 code translated by Rosetta 2 on the M1 retains/releases objects TWICE as fast as native x86 processors.

https://mobile.twitter.com/Catfish_Man/status/13262387851813...


I was going to add that you can't do Android development on these, as you need Android Studio, but that seems to be on the way — Support for Apple Silicon is in progress.


I assume they have the OpenJDK JVM ported at this point so all of JetBrains' products should be working or close to working.


They have to keep the claim 'millions of devices run Java' true. But anyway, a lot of programming languages are going to have to support Arm now. Interpreters for languages like PHP and JS must be cross-compiled, and then most things can work. Rust, for example, just brought its Arm support to the tier 1 level, see https://github.com/rust-lang/rfcs/pull/2959


And I’ve been getting Rust to work on Apple Silicon. It’s only tier 2 for now, but that’s mostly because there are no CI providers so we can’t automatically run tests for it. I’ve been running them by hand.

https://github.com/rust-lang/rust/issues/73908


Most popular languages have supported Arm for many years already, on Linux (and more recently, some of them on Android and iOS).

> Rust just brought their arm support to the tier 1 support level

(for Linux)


Yep, most things were ported in the first wave of Linux-on-ARM enthusiasm around the NetWinder/iPAQ craze 20 years ago.


The "netwinder/ipaq craze 20 years ago" would be 32-bit ARM (AArch32), while AFAIK this new chip is 64-bit ARM (AArch64); everything has to be ported again to this new ISA (though yeah, most things were already ported for Linux on AArch64).


Yep. In the intervening time, ARM on Linux became popular enough that the required compiler backend work for 64-bit ARM in GCC, LLVM, etc. being done by commercial interests was a given; there was, e.g., a big push for ARM on servers from various vendors. MS even ported Windows Server. HotSpot/OpenJDK, for example, was ported in 2015.


You also have GraalVM (Oracle), OpenJ9 (IBM) and Corretto (Amazon).

So plenty of enterprise-class JVMs available.


Azul and Microsoft are doing the port.


IIRC, Azul is a hardware vendor for the JVM ... can you please share a public source on this collaboration between Azul & Microsoft?


I don’t think Azul have sold hardware in a few years. Their current offerings, Zing and Zulu, are cross platform JVMs and don’t appear to be sold with any hardware.


Azul was originally a hardware vendor with Java-optimized silicon. They haven't sold hardware AFAIK for probably over a decade.


Yes, they used to have hardware with a ton of cores/CPUs and large amounts of RAM, if I remember correctly.


Yeah, it was around the time that thread-level parallelism was getting a lot of love. As I recall, they had some massive number of cores directly connected to each other and garbage collection at least partly in hardware. They got burned for pretty much the same reason a lot of the other custom CPU hardware of the time got burned; if you could just wait for Intel to double performance in a couple years it wasn't really worth going with some one-off design for a temporary advantage.


This is the actual JEP https://openjdk.java.net/jeps/391.

This is an early access build from today https://github.com/microsoft/openjdk-aarch64/releases/tag/16...


Sure, here you go, sparing you the effort of learning how to use Google or whatever search engine you like.

https://www.infoq.com/news/2020/09/microsoft-windows-mac-arm...


Thanks for the link but there was no need for you to be patronizing about it.

FWIW, I originally thought your mention of Azul was a typo, so I parsed your comment as "Azure and Microsoft" before I realized the tautology, which was why I posted the question. I didn't realize that Azul had pivoted to be a software-based vendor of the JVM.


I answered like that because of what looked like a snarky comment.


Wouldn't the x86 version run under emulation?


Basically anyone not lucky enough to live in a country with an economy comparable to the US, aka 80% of the world's IT.


You can't run Boot Camp on these.

For me, at the moment, that's a dealbreaker... but I still want one.


Even if you could, it wouldn't really help, as it would only be able to boot Windows for ARM, which has even less software support.


There's work being done getting Wine on ARM to emulate Windows/x86-64.


Far more pluralistic software environments would be a huge factor.


I think price comparisons depend on what you're looking for. The cheapest model with 16GB of RAM is $1,800. That's pretty steep, especially considering other laptops will let you upgrade the RAM yourself. And along with that you get the Touch Bar and the garbage keyboard. I'm just one person, but that's why I would never buy one of these.


> cheapest model with 16GB of RAM is $1,800

$1,200 for the MacBook Air with 16GB RAM in the USA. No Touch Bar, no garbage keyboard.


Ah, you're right there. I didn't realize you could modify that.


The old butterfly keyboard that was prone to failure is history.


Maybe they have to run enterprise software which does not get updated every year, so they can't use macOS. Or perhaps they want a fully featured copy of Office. Maybe they want to run an Active Directory network. Maybe they like having ports on their laptop.


Professional use in the creative industry: film, TV, videography, audio production, and a million other things. 16GB of RAM and a souped-up integrated GPU won't cut it for many 'Pro' applications the Mac has traditionally excelled at.


Why would anyone who is not forced to buy a Mac get one of these?


No Linux support is a deal breaker. (current versions with Intel included)


They are not priced competitively. The cheapest MacBook Air starts from $999. The cheapest Dell Inspiron starts from $319.


The cheapest Dell Inspiron doesn't even hold a candle to the MacBook Air. They're not competing in the same class...


That's true, for sure. But I was responding to "Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitively as they are?". The answer is simple: everyone who does not want to spend $999 will buy an Intel/AMD laptop. And $999 is quite a lot for someone who does not need a powerful workhorse. That Inspiron is extremely underpowered, yet it'll launch a web browser and office apps with some swapping here and there. Apple is not going to kill the x86 laptop market with it, just like it won't kill the smartphone market with their $400 SE phone when you can buy a $100 Android phone.

Another thing is that you can buy a "gaming" laptop for $999. Something like an i7-10750H with a GTX 1650. And it's powerful enough to run almost any game on high to medium settings. The Apple GPU is awesome compared to Intel's, but compared to a dedicated Nvidia GPU, not so much. So if you need a GPU for gaming, that's another area where Apple does not compete with their new laptops. At least for now.

Ultrabook with focus on portability and long battery life - Apple is awesome here.


> Ultrabook with focus on portability and long battery life - Apple is awesome here.

Exactly that. I think that's the ultimate reason to have a laptop, and if it doesn't apply, it might make sense to rethink the setup. Why should I buy a $1,500 Intel/AMD mobile workhorse when the battery is empty after two hours? It usually makes more sense to have a server at home, or a VPS, for that. Also, a lot of native apps like Steam have first-class support for this nowadays. For the rest, Parsec might work.


It certainly does when the best one can hope to bring home as a software engineer is around 1000 euros after taxes.


I fully get that the cheapest option might fit your budget and the MBA doesn't. And that's absolutely fine. Been there, done that. Heck, I wish my first laptop had cost $300, but in the end it was more around what the MBA costs today.

But it's not really an Apples to Apples comparison.


I am lucky for having gotten the opportunity to live and work in a country where buying an Apple device is not an issue, and there are plenty of them around the office, but I don't forget my roots nor the monetary possibilities of similar countries that I had the fortune to visit.


Well, buying an Apple laptop is not a necessity or some fundamental human right. It's a nicety for those who can afford it and appreciate the differences (and understand the tradeoffs).

In raw performance per buck you could always get a custom PC setup for cheaper, especially in desktop form.

In some countries, even a $300 laptop amounts to half a year's salary...


Indeed, which gets back to the OP's original point not standing.

> Why would anyone (who is not forced) buy an Intel PC laptop when these are available and priced as competitive as they are?

Apple devices are definitely not priced competitively outside first-world countries.


It is an Apples to Dells comparison :)


That's for each buyer to consider, based on their budget and prospects.


Answer straight from the horse's mouth: https://www.youtube.com/watch?v=eAo8gnUCWzE


The recent keyboard fiasco would say otherwise.


You're free to buy the 50%-plastic Dell Inspiron, which is probably underpowered for Windows 10 and comes with several pieces of free nagware.

You might be better served by wiping it and installing Linux, though.


It's probably short tasks (<20s). If you run the CPU/GPU at full load for extended periods, thermals kick in and the fanless M1 MacBook Air will reduce its clock speed.

iPad Pro: the current 2020-gen iPad Pro has the A12Z (essentially the same chip as the 2018 A12X with extra GPU cores), a significantly older chip than the A14. I think there will be an A14 iPad Pro refresh with a mini-LED display in early 2021.


> thermals ... reduce clock speed.

I see that statement a lot, and yes, at some point that is going to happen.

But the analysis seems to fail to take into account what utterly amazingly low power devices these chips are. So while it will happen, it might take a long time.


I think more results are needed to smooth out the curves. If you check all the results by model so far,

https://browser.geekbench.com/v5/cpu/search?q=Macmini9%2C1

https://browser.geekbench.com/v5/cpu/search?q=MacBookPro17%2...

https://browser.geekbench.com/v5/cpu/search?utf8=%E2%9C%93&q...

it looks like they're all in the same ballpark (i.e. the Air is not leading others, just comparable).


One has to remember that Apple is still selling an Intel Mac mini at the top of the range; that likely says something about the performance to expect from the M1 vs. Intel.


The available RAM and eGPU support could explain that rather than raw performance.

I also imagine not all customers are ready to jump on ARM day 1. Some will want to wait until the software ecosystem has had time to make the transition.


They probably have a lot of customers still demanding Intel CPUs. Mac minis are often used in server farms as build servers, and many companies will require an Intel CPU there for some time.


All of that is true; however, it is notable that the models they replaced (and are no longer selling) are all the lower-end models: the two-port MacBook Pro, the Air, only the lower-end mini.

Seems pretty obvious to me that there will be another, higher-end variant of the M1, though maybe the only difference will be the amount of RAM, the number of GPU cores, the number of supported USB4 ports, or something like that, not raw CPU performance.

Either way, it seems obvious to me that the M1 is their low end Mac chip.

That will be interesting to watch.


Yes, it looks like the M1 was designed mostly for the MacBook Air. The specs are a perfect fit, and it makes a lot of sense, as the Air is their most popular laptop. Having the perfect Air - and the new one is truly impressive - will drive a lot of Mac sales. They also put it in the bottom-end MB Pro and Mini. But indeed, the next variants of Apple Silicon will probably cover the higher-end configurations and other devices.


The M1 Mini is far better than the Intel Mini on cost/performance, performance-per-watt, and heat measures.

Server farms are going to switch rapidly; one leading Mini server farm just announced a 600-unit starter order, and the CEO noted that Big Sur also made significant changes to licensing to make its use in server farms easier.


Of course, I just found out the M1 Mini only supports 1-gigabit Ethernet, but I don't think that changes the decision much.


I remember when Intel simultaneously released the first x86, the 286, and the 386 CPUs all on the same day. What exciting times those were!

Apple released a killer low-end SoC in the M1. It contains the highest-performance single-core processor in the world along with high-end multi-core performance. But it's limited to 16GB and two USB4/Thunderbolt ports, so it's targeted at the low end.

When the M2 is released mid next year, it will be even faster, support four USB4/Thunderbolt ports, and also come in 32GB and 64GB versions.

Greatness takes a small wait sometimes.


Wait.. they've already announced specs for a M2 chip?


No, but it's pretty clear what the next release will be. They will move Apple Silicon to the rest of their MacBooks, and into their iMacs. 16GB of RAM and two ports ain't going to cut it for those.

Where I could be wrong is that Apple might release two chips. First an upgraded M1, let's call it the M1x, that supports a bit more on-chip RAM (24 or 32GB) and four ports. It would be only for high-end MacBook Pros and again optimized for battery life.

And they would release an M1d for desktops that has more cores but moves RAM off-chip. That would improve multi-core performance, but I don't know how much it would hurt single-core with slower memory fetches. They could probably compensate with higher clock speeds, bigger power budgets, and more active cooling.


The A14 Air outperforms all iPad Pros in single-core; multi-core is still faster on the A12X. Keep in mind the fastest iPad Pro is using a two-generation-old CPU.


I have watched a number of reviews comparing the new Air to the Pro. The CPU performance increase isn't noticeable in most cases, and the Pro still offers better pencil latency and display refresh rate (plus camera/lidar, but probably most don't care).

I wouldn't buy a Pro now because I would wait for the next version, but I wouldn't trade a current Pro for a new Air just for the CPU bump...


The latest iPad Pro is still on the A12Z Bionic; the Air just got refreshed with the A14 Bionic.


https://twitter.com/tldtoday/status/1326610187529023488

Has an interesting comparison of an iPhone 12 mini doing similar work to an i9 iMac

Now, I haven't dug into the details to verify that both produced the same results. I believe most of the difference is from software encoding versus hardware encoding; the follow-up tweets suggest similar output.

It does show how workloads can cause people to jump to conclusions from a single test, without having all the details to support the conclusion they want to arrive at.


> “the A14 iPad Air outperforms all iPad Pro devices?”

iPad Pro is still on an older generation of SoC (A12Z), while the iPad Air just got the new A14.


> This seems a bit odd too - the A14 iPad Air outperforms all iPad Pro devices?

Well, yeah. Every year for the last bunch of years the A-series chips have had sizeable IPC improvements, such that the A12-based iPad Pros are slower than the new Air. Apple's chip division is simply industry-leading here.



