Apple Plans to Use Its Own Chips in Macs from 2020, Replacing Intel (bloomberg.com)
1302 points by uptown on April 2, 2018 | 1111 comments



This might just be a bargaining move on Apple’s part, but I don’t think so. I think that long term they are much better off controlling their entire hardware stack. I wouldn’t be surprised to even see them make their own display screens.

As an Apple customer, I like this idea also. For 20 years I was a desktop Linux fanatic, and later became a fan of Android. In the last few years, I have switched to using all Apple devices.

Even though I am a computer scientist, I actually spend more time on my iPad and iPhone, by far, than on my Mac laptops. I only use a laptop for software development (which, given the nature of my work, often just means being SSHed into Linux servers); the rest of my workflow and entertainment is on iOS devices.

The older I get, the more I want to get my work done expediently, leaving time to study new technologies and spend time with family and friends. At least for right now, I am maximally effective in Apple’s environments.

As a bonus, I trust Apple more than Microsoft, Google, Intel, etc.


I am a software developer, and I detest working on my Mac laptop. At $lastjob I had a Linux desktop, and it was, I believe, the most productive environment I have ever developed in. The job before that, I had a Windows desktop, and I prefer that to the Mac.

You say you want to just "get things done expediently," but in my experience apple software is flat out inferior and OSX is the worst of the 3 major operating systems I have to choose from.

Lastly, what does Apple have to gain by switching away from Intel? Not much, at least, not much that benefits me as a customer. Likely they are interested in making their laptops have more in common with their iOS devices, which does little to nothing for me. Apple's behavior towards OSX and macbooks in the past few years should be of great concern for anyone, especially if you actually like the devices.


2/3s of all PR activity on Github is on a Mac, for what it's worth. Certainly the platform punches above its weight relative to market share.

https://developer.apple.com/videos/play/wwdc2017/102/?time=2...


My last couple of jobs have given me no choice aside from a MBP and OS X. They do this because they don't want to manage multiple kinds of machines from an IT perspective. (I'm not sure I agree with this, but it is what it is.) Depending on how widespread this is, usage may not reflect actual user desire (though I would wager quite a large number of users either don't care what they're using, or do care but find an MBP adequate and would also be fine on something else.)


At my second-to-last company (a web development shop), the original policy was that a new developer could choose either a Mac, Linux, or Windows machine. When I arrived it was 75% Mac and the rest used Linux; not one person had chosen Windows. After a while some new people were hired, and one of them wanted to use Windows. We found out immediately that this caused a lot of problems: he had serious trouble getting our projects to run under Windows, even though we thought this shouldn't be an issue because all the dev environments ran in VMs managed by Vagrant. After watching this guy fail to get productive, running from one issue into the next, the CTO made him install Linux on his machine to end the horror show, and Windows was removed from new hires' choices, at least for developers. The company I went to after that and my current one are 100% Mac environments, for the homogeneity reasons you mentioned.


We run a Mac and Linux setup here - me being the admin, this is tolerable. Mac's still-essentially-Unix underlying OS makes this reasonably consistent - I can SSH between either platform fairly easily, for example. We offer new starters the option, and are about 50/50, though I've had a few surprises - one user had only previously used Windows, and in my experience that sort of user switches to Mac more easily than Ubuntu, but he gave his MBP back after a day and switched to Ubuntu instead, which he's stuck with. I could not be more proud :D

It's very nice to give end users the option - when I joined, I asked for Linux and was given a brand-new Dell XPS with Ubuntu pre-installed, root access, and told, customise it to suit you. I keep that spirit with my users.

Throwing Windows into such a setup is a nightmare though - two users actually installed their company laptops with their own personal Windows 10 licenses without my knowledge or approval - I still hold a grudge because I had to actually read the MS EULA to make sure they weren't about to cause trouble! It means I can't manage them (I have no real tools to do so on Ubuntu), so they're on their own for that. Fortunately management has my back, declaring the company to be a *nix shop.

Our actual product is all containerised so it should run anywhere.


> I have no real tools to do so on Ubuntu

Ansible works pretty well for managing Windows machines but it requires a little bit more up-front setup on the Windows side and as always YMMV depending on what you're trying to do.
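
To give a flavour of that up-front setup, a rough sketch (host, user, and transport values here are made up, and it assumes WinRM is already enabled on the Windows box):

    # WinRM support for Ansible on the control machine
    pip install pywinrm

    # Minimal inventory for a Windows host (example values only)
    cat > hosts <<'EOF'
    [windows]
    winbox.example.com

    [windows:vars]
    ansible_user=devuser
    ansible_connection=winrm
    ansible_port=5986
    ansible_winrm_transport=ntlm
    ansible_winrm_server_cert_validation=ignore
    EOF

    # Quick connectivity check
    ansible -i hosts windows -m win_ping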


I would have been pretty upset with that. Unless you're writing machine-native code, development environments should work on anything, especially if they run on VMs deployed by Vagrant.


Well the thing with Vagrant is that it (in theory) runs your application code inside the VM, but you typically run all your development tooling in the host OS.
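
As a rough illustration of that split (the test script name is invented):

    vagrant up                                        # boots the VM defined in the Vagrantfile
    vagrant ssh -c "cd /vagrant && ./run_tests.sh"    # the app itself runs inside the guest
    git status                                        # but git, editors, linters, etc. run on the
                                                      # host, against the same synced /vagrant folder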

I suspect that it was on the local tooling side that things fell apart for this Windows developer, if he was grabbing a project mostly built on Mac or Linux.

A lot of popular dev tooling built around web technologies (node, Ruby, PHP, etc) isn't as mature on Windows as it is on MacOS or Linux.

My source on this is me. I have a Mac and a Windows box at home, and I've had projects fail to set up and build on the Windows machine for these reasons.


I’ve had problems even getting anaconda (scientific python distro) to work in a VM on windows (because the file system of the host OS lacks certain UNIXy features).


In theory there is no difference between theory and practice. If no one is testing the build on platform X, no one should be surprised if it doesn't "just work" on platform X.


You would think that, right? Except then you actually want to run a simple Python script on Windows and observe things like: os.rename not being able to replace opened files, the default encoding not being UTF-8 (wtf), not being able to just `pip install scipy` because of some weird BLAS/MKL dependency...
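
To make those concrete (file names are placeholders; assumes Python 3 on both machines):

    # Default text encoding comes from the locale, not UTF-8; on Windows this
    # typically prints something like cp1252:
    python -c "import locale; print(locale.getpreferredencoding())"

    # os.rename() refuses to overwrite an existing target on Windows; os.replace()
    # does overwrite, but neither can replace a file another process still has open:
    python -c "import os; os.replace('settings.new', 'settings.cfg')"

    # Building scipy from source needs a BLAS/LAPACK toolchain; insisting on a
    # prebuilt wheel sidesteps that (or fails fast if none exists):
    pip install --only-binary :all: scipy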


I guess from a business point of view it makes sense, but I really get frustrated when I find an open source project that's completely unbuildable on Windows.

Maybe the developer in question would have been better using a manually created VM or WSL?


Now, take this with a grain of salt, as I'm no open source maintainer, and I only rarely use Windows, and even then rarely for development. Up until very recently, to get real developer tools that are supported by Microsoft, one had to buy Visual Studio. VS2017 has a Community edition that's usable for open source projects.

Not to mention, the windows environment is completely different to linux and OSX. Until x64, they had a different calling convention. On OSX and linux, also until recently, I could use the same compiler for both platforms and still be supported by the platform vendor.

All of these non-trivial differences make it a lot more resource-intensive to support a codebase on Windows that already runs on Linux and OSX. Aside from my work VMs, I don't even have a Windows device that I could use for development. So to me, it's no surprise that most open source projects don't build on Windows.


Community Edition is really nice nowadays, usable for day-to-day professional usage as well. If ReSharper works - it works for me.


Student and hobby editions of Visual Studio have existed for decades; before they introduced Community, there were the Express editions.

Also, mingw and cygwin have existed since the late 90s.

OS X tools are kind of included in Apple's hardware price.

In many countries, Apple hardware costs about the same as an average PC plus a VS license.


Whilst I agree with you on all but mingw and cygwin, these tools are wildly different. Integrating microsoft's C/C++ toolchain into an existing makefile would be hell. And again, most people who have a mac don't have to buy the compiler. Most people don't buy extra software after the fact just to compile open source libraries. I can't imagine a reason to buy visual studio for personal use, and I can't imagine a reason as to why I'd dick around with visual studio in my free time.

As for mingw and cygwin, most regular people don't have them installed and configured. And cygwin and mingw are not _platforms_ which receive first-party support, as unfortunate as that might be.


I once had to use cygwin to port an old C library from UNIX (I don't recall the flavor of UNIX) to windows. It involved editing makefiles to use the MSVC toolchain, amongst many other things.

It was pure pain. Especially when the boss kept asking me what was taking so long (it's just a recompile right?).


People that have a Mac already paid for the compiler.

500 € PC + Visual Studio vs 1000 € Mac.


I don't believe that the majority of people who buy 500€ PCs would spend money on Visual Studio. In the Mac case, nobody pays for the compiler because Clang is open source. This is a massive oversimplification: the fact that Clang and Make are distributed in the standard OSX base install is a value add in my book, but I don't believe it's something that people pay for. Also, the people who buy Apple hardware are a completely different set of customers to those who buy 500€ PCs. However, this is all a bit moot, as you and many others have pointed out that the Community edition is out there and it is usable.


Windows is a bad product made by a company that has historically been a bad and unethical actor that has attempted to limit user freedom and destroy freedom of choice by illegally destroying competitors. Recently we are to believe that they have found jesus and ethics via quiet contemplation and peaceful regime change.

All projects are ultimately created to scratch somebody's itch. If it doesn't work on Windows out of the box, that isn't their use case. What you are wondering, in effect, is why people don't pay money to purchase a Windows license, which will ultimately fund a bad and evil company, in order to enable the project's software to run on an inferior OS that the dev doesn't run or care about. If it's not end-user software for desktop users, it doesn't even have the positive effect of enabling a substantially bigger group of potential users to benefit from the software. For anything server related, they are either already running Linux or can as easily run a Linux VM as a Windows one.

Further, the users who would benefit will by and large buy a license if the software is non-free, but will probably contribute nothing but complaints, phrased similarly to the support requests they would make for paid products that had failed to perform adequately.


A windows developer just needs to create a build


What was the problem? Just install an X Windows server (the new bash environment should support it) or just turn on the telnet client.


The new bash env is a bit of a let down for development currently, at least for me. I love this move by MS though and am willing to give it another go in the future once it’s more complete.


> My last couple of jobs have given me no choice aside from a MBP and OS X. They do this because they don't want to manage multiple kinds of machines from an IT perspective.

Not that many years ago that was the same reason IT departments would give for only giving people Windows machines.


Apple shot themselves in the foot for this market with the touch bar. It adds such a small amount of utility for annoying every VIM user.


15 year VI and VIM user here. This is a stupid complaint: anyone seriously using VIM remapped Escape a long time ago, or learned to use Ctrl+[.


I've been using vim seriously for years, and I've had escape on capslock for many of them. However, not all users know how to do that or haven't thought of the amazing benefits that can be obtained. There's people who are only picking up vim today. Give them a chance.

To those who haven't: Give it a go. Put Escape on Caps lock. If you've got Control there and you're running Linux or Windows with a standard PC keyboard, you probably actually want to swap Control and Alt. If you haven't, you probably want to swap Control and Alt anyway. (This will give you bindings similar to the Mac's Command key. The thumb is a much stronger finger and far more suited to these combinations than the weak pinky.)
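
For the X11 crowd, the stock xkb options already cover both suggestions (one example invocation; adjust to taste):

    # Caps Lock sends Escape, and Left Ctrl / Left Alt are swapped:
    setxkbmap -option caps:escape -option ctrl:swap_lalt_lctl

    # On recent macOS, Caps Lock -> Escape needs no extra software:
    # System Preferences > Keyboard > Modifier Keys.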


I got lucky on that one! I have been remapping caps lock to escape for years now on every machine I use. It's great, until I need to use literally anybody else's machine.


I have the same experience with the Dvorak layout, which is what I used when I learned to type, at age... um... nine? People crack up when I try to type on anyone else's machine, because my WPM drops by a factor of ten.

This is part of the reason I basically refuse to customize anything on a new linux machine; my needs are already weird enough that I'd rather just adapt myself to the defaults for everything else.


Not to mention every IntelliJ user, and every touch typist.


Vim users commonly map escape to caps-lock.


Caps to Ctrl

Ctrl to Escape

Escape to Caps

I mostly use Ctrl-[ for Escape. I have a lot of weird Ctrl bindings. Not an emacs level of Ctrl bindings, but enough that I use this particular rearrangement.


I don't, because my agency uses a very locked down Windows env and I can't even remap keys (shell?! You jest....)


Then you don’t have the touchbar anyway.


No, my main work environment makes the remapping inaccessible, so I have declined to buy a touchbar. A consultant offered me one free and I still declined, requesting an original rMBP instead.


> map! jk <Esc>

Remap the sequence "jk" to escape, you don't even have to take your hand off the home row to exit a mode.


> It adds such a small amount of utility for annoying every VIM user.

You have to love Emacs and Vim users.

Quite sure Apple's world doesn't revolve around the needs of such a small user base.


FWIW I'm an Emacs user on a MBP with Touch Bar.

No issues.


> My last couple of jobs have given me no choice aside from a MBP and OS X. They do this because they don't want to manage multiple kinds of machines from an IT perspective.

This is a huge improvement over when I started developing and was forced to use Windows.


The managing-multiple-kinds-of-machines issue is a real one, speaking from the other end. For uniquely talented software engineers I might let them use what they want, but when managing an IT department you definitely want anyone who has no reason to be otherwise on the same exact hardware and software configuration.


Yeah, at my last place, the deal was: first we have to find 5 - 10 devs who want to switch to Linux (there were 50 - 100 in total). Then we get a PXE boot image to install Linux and, well, DHCP. We get to try it for a couple of months, while producing documentation on how Linux works with the infrastructure.

That feels like a fair solution.


I don't trust that for two reasons:

1) There is a lot of cargo-culting around the MacBook because hurr durr Windows is terribad (though, to be fair, Docker does work better natively with OS X than Windows, even under WSL), and

2) Most companies give their developers the choice of a MacBook or the shittiest Windows machines known to man (because of irresistible volume deals from Windows OEMs). I bet the spread would be more even if companies were willing to offer the Surface Book as an option. This laptop is nice.


While I do prefer MacOS over Windows, I believe that you have a point in regards to the computers companies (and people in general) run Windows on.

Most of the issues people have with Windows can easily be explained by low-quality hardware. If you need a decent laptop, you're looking at price tags above $1000. Depending on where you live, I wouldn't buy a Windows laptop below $1400 if you expect to be happy with it.

Honestly Microsoft should take steps to prevent the sale of Windows 10 on laptops without an SSD and at least 4GB of RAM. That would help the Windows brand tremendously.


What I do every few years: buy a refurb/clearance business-class laptop (latitude, precision, et. al) from the Dell Outlet. The specs may be merely ok, but the build quality is pretty good. And you'll spend less than $1000 :)


Do you think that they want to cede the entire low end market to chromebooks/linux?


Satisfaction rate among surface owners is abysmal. At this point, I'd call the surface brand a failed experiment.


I was going to say: most people I know brought them back for many reasons. One of my Windows fan friends had a Surface Book 2 and brought it back for repair a few times because of erratic battery life; he ended up buying a Lenovo T470 (I think) instead, as they kept insisting it was his fault rather than the glitchy laptop. No issues with the Lenovo, of course.


Do you have stats for that? Maybe I'm an outlier, but I loved the Surface Pro so much, I bought a Surface Book for work and a Surface Book 2 for personal use.



The first year was really bad. Plain and simple. Now it is a decent device, but I'd take a Lenovo today if I had to make the choice. However, for note taking, the Surface Pro is second to none. The form factor, pen, and OneNote is a really good combo.


Until the screen breaks and you can’t turn it on...


If that kind of failure recovery is important, then no tightly integrated device is a good choice.


Had two surface pros fail just out of warranty, if that’s the price of a tightly integrated device it’s too high for me. No more surface devices here.


That is indeed poor reliability. The SP4 I used, and others I know people have had since launch, are still working fine.


We had the choice between MBPs and surface books (the good ones) and still only management opted for windows.


When I asked for a newer MBP, I got bitched out by the purchasing director. He made the point that I was one of only a handful of people in a 500-person IS department allowed to have a Mac.

Many companies don’t allow mac purchases, even for developers.


A company that balks at spending 2k to enable an asset who costs 50k-250k annually to perform maximally isn't very wise. We are talking about spending 1-4% more.


Docker has had a native Windows client for about a year now. Just FYI.


Yes, it does, if you're on Windows 10. And volume mounting from within WSL doesn't work with POSIX paths (at least it didn't when I used it last over six months ago) regardless of the container engine type you select (HyperV or Windows Native Containers, which has its own quirks).

Docker for Windows doesn't exist for Windows 7 (still used in a number of enterprise environments). You're stuck with Docker-Machine in that case.


My only complaint is that if the mac were truly a developer-centric device, I wouldn't need homebrew to install software. It should just be in the apple store for me to download.


I think that'd be a disservice to both developers and non-developers alike. It'd clutter the app store and force developers to use the app store and whatever approval process they decide.

I think it would be cumbersome to say the least...

  * Would you have to install through the app store gui?

  * How would it manage dependency chains and conflicts (system ruby vs local ruby)?

  * How about explicit paths or build parameters?

  * How would it handle different shells?

I know other operating systems do this but I've never liked it... I guess I just don't mind installing and managing command line/developer tools from a terminal window ¯\_(ツ)_/¯

And, with all that said, I have quite a bit of negative sentiment towards Apple lately for their hardware and platform choices. So I would still say, it's not a developer-centric platform (the Mac)...


> Would you have to install through the app store gui?

Possibly, but not necessarily. Ubuntu gives me both a GUI and a command line option for apt. Thinking about being able to type "app-store install openssh" gives me goosebumps.

> How would it manage dependency chains and conflicts (system ruby vs local ruby)?

First, that example is a problem for ruby, not for the package manager. Second, software can be in the store statically compiled. Third, apt handles dep management really well. Perhaps that's something apple can learn from?

> How about explicit paths or build parameters?

This is a solved problem in ubuntu.

> How would it handle different shells?

Again, this is a solved problem.

> So I would still say, it's not a developer-centric platform (the Mac)

The question I have is if we deploy on linux, why are we using a mac to develop software on?


As someone who lives on Linux and has to touch OSX for things like building for iOS devices, I find it really odd that OSX doesn't just solve these problems like all the FOSS distros do. It's insanely odd that Apple gives me a bash shell which is years out of date, and it feels hacky to just use what I want. I know proprietary and free software kind of have a hard time co-existing on operating systems built on free or mostly free software, but if OSX is half free software and it's POSIX compliant, it's hard to believe that Apple couldn't give you both proprietary software and an easy package system for developers.

The sad thing about Linux is that as much as I love it and its ecosystem, I can't recommend it to anyone who wants things to "just work":

- X and wayland crash on me all the time on this laptop because of its HiDPI screen and my kinda-works-but-is-wonky fixes to work with multiple monitors.

- Hardware support is the best it's ever been, but graphics cards, wifi, exotic devices, laptop power states and embedded devices can still be a pain because manufacturers simply don't care.

- Desktop applications can still be a little glitchy. Web browsers work fine, as do first-party DE apps, but the further you get from the big-name GUI toolkits, into custom controls and behaviour, the more problems you seem to run into.

But aside from all that, I'm happy here in Ubuntu. When your software library feels as easy as picking a book from a shelf, and 90% of the system is updated by the update manager, which says "hey, restart when you feel like it", I'm quite comfortable.


GPLv3 appears to be the reason for shipping that ancient version of bash.

If you look, Apple ships a modern version of zsh with their OS, I believe because it isn't affected by the same potential licensing issues.


But why? That doesn't seem like a rational choice by Apple.


Apple wants to use open source code, but doesn't want to make their proprietary code open or to license patents to anyone who wants to build off of their code.


Ultimately apple will sell developer devices at the price point of their Mac pro devices that come with all the approved tech you are allowed to use to build end user services/apps. In addition to the high purchase price you will have to sign up for a developer account and pay annually.

Everyone else will buy consumer oriented devices that are as open as the ipad.

It's not a bicycle for the mind its a train and if you dress appropriately and pay your fee you may set up a concession stand on the route.


* Would you have to install through the app store gui?

With various Linux package management systems you have multiple interfaces to the same system, thus you can at one moment use the GUI to install foo and, having closed that, fire up your favorite terminal and install bar.

* It'd clutter the app store and force developers to use the app store and whatever approval process they decide.

It also has a concept that neither Microsoft, Apple, nor Google has opted to pick up on, because they desire control of their platform and want to extract a substantial tax on all software sold on it via such control: sources, also called repositories.

Linux package management systems draw not from a single centrally managed source but from a user-editable list of sources. Each source is free to run with its own set of requirements. There is nothing requiring a hypothetical dev tools source to be any more restrictive than whoever maintains the source used by Homebrew.

Further, packages can actually contain sources. It would be entirely trivial to package up a source of dev tools as a package and allow people to install that via the front end of their choice.

* How would it handle different shells?

Different shells are probably the simplest part: the arguments to a hypothetical install command would be simple text strings. If an argument includes characters that the shell considers special, they would have to be escaped or wrapped in quotes, like with any other combination of shell and CLI tool. Generally, package names just don't include characters like ([])$\~`'" in the name, and most commands don't require any particular special attention to shell escapes.

* How would it manage dependency chains and conflicts (system ruby vs local ruby)?

* How about explicit paths or build parameters?

These are implementation details that don't go away by exiling developer tooling to officially unsupported channels. For example, language-specific package managers will probably remain a thing, but it would be vastly easier if you could do officialmacpackage install developerrepo, then officialmacpackage install cargo|node|whateverfloatsyourboat, and then use that to deal with whatever.

All your concerns basically come down to a lack of experience with more reasonable systems; there are zero good reasons not to do this. There are literally no downsides other than the work required for an official solution. However, it seems vastly unlikely that we will ever see such a thing.

The logical endpoint of Apple's vision seems to be two classes of device: one with a very high introductory price and annual maintenance that allows you to create and run whatever you like, so long as such software is distributed via blessed channels and you tithe the required 30% to Apple; and more reasonably priced, but still expensive, devices that only allow you to consume software.

The former will come with xcode and technology to deal with apple approved languages. You will be able to play with other tech on your local devices but it wont be able to be distributed to end users.

The latter will be as open as your ipad is now.


It's not a developer-centric OS and it's not supposed to be. But that doesn't change the fact that a lot of developers prefer it over other OSs. I prefer it because it has been more reliable for me than Windows. It's not right for everyone though, and I completely understand anyone who prefers Windows or Linux.


I could see an argument that Apple should make a tool like Homebrew themselves or officially support Homebrew development (maybe they do, I haven't looked), but including it in the App Store would just be confusing. Plus, I can type 'brew install <package>' far faster than I could type <package> into the search bar and make the necessary clicks to install.


That same argument extends to the users of Homebrew who are overflowing with cash from startup get-rich moments: please consider funding (if you have $1/mo or more to spare) the Homebrew patreon, which currently lists a total of 285 paying users out of the million developers that depend on it to earn a paycheck every month.

https://www.patreon.com/homebrew

(I do not participate in Homebrew other than as a user.)


I've been using homebrew for years and have never heard of this. Maybe the reason there's only 285 patrons is that the patreon page has not been promoted in any way that I can remember?


Can’t say I blame them for not promoting it through the only way any current user would see it: inside the brew command line tool. I can see the angry commenters now:

“How dare they inject advertising into a critical command line tool that I use to get paid money!”

“It’s inexcusable to promote your own financial success when someone types brew upgrade”

“No doubt they’re blowing the money on burrito delivery in SOMA”

“I’ve been a $1 Patreon since they launched three months ago and they haven’t implemented my favorite wishlist feature!”

“If they can’t do it for free, they should give up and shut down and let someone fork it.”


> ... or officially support Homebrew development ...

From memory of speaking with an ex-senior-Apple-dev-person, Apple doesn't support any 3rd-party open source projects or communities, e.g. Homebrew, or even the software they bundle in their OS.


Apple will never distribute GPLv3 software; this is the reason bash is ancient on the Mac and why Samba is gone.

Any license with a patent clause or Tivoization clause will be verboten. As an example, you will never get something that depends on Postgres in the Apple store.


Postgres is BSD licensed. Apple even uses it in its Server app


Apple supports and hosts MacPorts

https://www.macosforge.org


macOS Forge shut down in 2016. All those links are to the new homes of the projects.


Macs are macOS and iOS developer-centric devices; those developers hardly need Homebrew.

Other types of developers were a kind of nice to have regarding sales, but no longer relevant.


probably related to the fact that the top language on Github is javascript.

If you do anything other than UI scripts, I would agree with the above and say you are better off with a Linux box.


I think you're probably right. I feel like the one thing Mac is really good for is writing the porcelain. node.js came out of the porcelain side, so it seems to fit really well there too.

When you get into plumbing, I feel like there are better alternatives.


Yes that, plus it could be that MacBook/Pros are popular with students: many of whom create lots of tiny Hello World etc repos.


That's because most GitHub activity is by frontend web devs. What about the rest of the world?


Those stats also don’t include GitHub Enterprise..


Do you have a non-video source on this? 2/3s seems really high.


That really is "2/3s of all PR activity on GitHub is unix-based". People don't use Macs to develop on because of OS X, but because it is unix-based. It sounds like they are going iOS-based, which makes a Mac much less interesting to develop on for anything but iOS devices.


That does not mean anything. People can use Macs for development and still be miserable. I am practically the only dev in my company using a non-Mac laptop; the amount of envy in people's eyes is astonishing.


Couldn't agree more.

Finder is garbage compared to Windows Explorer. OS hotkey navigation is much better on Windows and GNOME/KDE/Fluxbox/xmonad. (I actually miss using xmonad; it was soooo good at this.) Office applications (still very real in a lot of environments) are mostly awful on the Mac. iTerm2 is fine, but I prefer PuTTY. Starting applications by their binary name with WIN + R is better (for me) than trying to find them within Spotlight. There are other little things where I prefer Windows over OS X, but I'm not remembering them right now.

I also hate the new MacBook keyboards. They take "getting used to," i.e. typing "lighter" than I'm accustomed to. Also, not having USB-A ports and requiring a 90W USB-C to USB-C power brick (i.e. not being able to charge the thing from common chargers, the most lucrative reason for using USB-C, for me) is garbage.

The one thing I like about OS X is that it's a BSD with GNU utilities built in. The thing that I don't like is that it's not a Linux and many of the dev environments I've worked in run on Linux, especially now in the age of Docker. While that's good news in that it forces me to do almost everything within Docker, it's bad whenever I need to maintain parity for whatever reason.


Odd, I find hotkeys much better on Mac. They follow a more logical, predictable pattern and are much more standardized. E.g. what is the standard windows hotkey for getting info/properties of an object. Never found one.

Also if an app doesn’t have configuration of hotkeys you are screwed. On Mac hotkey config is OS wide.

The command console experience on windows is horrible.

Office apps are better on Mac. Pages, Numbers and Keynote offer me a totally superior experience. The ribbon interface on Windows is a disaster. You spend too much time hunting around among a zillion incomprehensible icons.

In fact I do GUI design at work and use MS Office as an example of how to NOT design a UI.

Not sure why you complain about Finder. You got plenty of alternatives on Mac. I use Terminal and Finder a lot in combination and they interact with each other a lot better than on Windows. E.g. does windows have an “open” command yet?


"E.g. what is the standard windows hotkey for getting info/properties of an object. Never found one."

Alt + Enter (since Windows 95, at least)? You can find it in here: https://support.microsoft.com/en-us/help/12445/windows-keybo...


The Mac environment takes some getting used to, it took me years to learn all the little details, like if you select a bunch of folders and double-click, it will open all of them. To close them all, option-click the close button.

I do find Finder to be inferior to explorer in almost every way. That tree pane on the left is too useful.


I'm far more of a Mac user, but I use Windows, too, and I agree that the Finder is quite inferior to Win Explorer. The simplest, most obvious things, such as having 2 side-by-side trees for file organization, or right-click on anything and create a new folder there--things that are easy and obvious--are NEVER going to come to Finder, which occasionally adds trivial eye candy such as "album flow" views or colored "tags", but basic workaday functionality, no.

Yes, you can buy a 3rd party replacement with the hassle of deploying and maintaining on multiple machines, if you have them, but my main complaint is what decades of not caring about the most fundamental of all Mac apps (the only one you can't quit) implies about Apple's strategic attitude toward the Mac overall.

For those of us who find the Mac the best pro dev platform, the implications are not good, because what we tend to like about it (desktop unix where all the client stuff just works) is just a historical accident that Apple would not create again and does not intend to maintain longer than necessary.

Rather than make the Finder more powerful to make the Mac better for serious users, Jobs took the approach that people who couldn't understand the difference between a file and a folder were the real market for Apple, solving the problem by creating iOS with no user view of the file system at all.

This turn from computer company to fashion accessory for those who don't care about computers per se was so successful as a business strategy that I can hardly criticize it. It just bodes ill for those of us who like the Mac for features its only supplier wants to be rid of.

Each time Apple has an event where Cook pointedly emphasizes that the iPad is "what we at Apple see as the future of computing", I google for an update on the current state of "best desktop Linux distro".


On OSX you can create a new folder with Command+Option+N. As for the side-by-side trees, I don't understand what you mean, I usually have 2 Finder windows open, then drag and drop as needed.

When I transitioned from Windows to Mac, 13 years ago, I needed some adaptation time, especially with Finder, but in the end I found it powerful. The main issue coming from Windows Explorer is that basic Explorer workflows such as copypasting don't have a Finder equivalent.


As I said, Windows lets you put a new folder (as well as various types of new documents) in any part of the tree that you right-click, while Mac's cmd-opt-n is limited to the root of the folder you're looking at. And having to separately position two windows that have/lose focus independently and can have one come to the front without the other is ridiculous compared to having both trees side by side in the same window like Win Explorer and every 3rd-party Finder replacement on the Mac.

Having been a Finder user since my 128K Mac in spring, 1984, I've had adequate time to get used to an application that hasn't improved in even the most obvious ways for going on two decades. It's fossilware.


Must be something about a particular way you are working. I have never experienced this as a problem.

Why would losing focus on one window be a problem?

Why would you need to create all these directories in multiple parts of the tree?

I can't quite comprehend what sort of workflow you have which requires these things.

Perhaps I don't see it because I use the Finder and Terminal a lot together. I don't use one exclusively for very long periods.

But I am curious what your workflow is like, because I see a lot of Finder hate, but don't really get what people's problems are, or how they are using Finder in a way that causes them so many problems.


Sounds more like you are not familiar enough with using a Mac. Putting two finder windows next to each other is easy, and so is quickly creating a directory.

Finder also supports tree view you know. OTOH does windows have as good filtering tools, smart folders, labels etc?


Option-click to close is a novelty to me and a very handy one at that... the number of times I’ve shot myself in the foot by selecting and accidentally opening maybe hundreds of folders is not something I’ve kept count of, but it’s definitely happened multiple times. I’ve been using OS X (now macOS) intensively since 2002 and I’ve come to think of myself as a “Power User”, but I’ve never known of the option-close trick. Thanks.


On your last point: START "title" [/D path] [options] "command" [parameters]

It's been there since forever (Win95?).

start (without parameters) - open a new CMD window

start . - open an Explorer window of $PWD

start <DPATH> - open an explorer window of DPATH dir


What? As much as I'd like for the Excel dominance to end, I can't see how anyone would ever work quicker in Numbers over Excel for any work that takes more than an hour a month. The ribbon interface offers shortcuts to pretty much every single function through the keyboard in an interactive way. I can't say the same about Numbers.


What exactly is it you do in Excel which is quicker than Numbers?

Numbers excels at what a spreadsheet application should be about. Once you get very complex sheets you are much better off using more specialized software such as DataGraph, R, Julia, Matlab, Numpy, or SAS.


'open -a appname.app'

is what I generally use to open applications. I never use Spotlight.


I have long said that Mac OS X is only useful as a way to start Emacs, but since 10.10 I am not even sure about that.

Of course it is a matter of taste. I get that. I still have to bite my tongue to not yell profanities whenever I hear anything about productivity and Mac OS.

Most of the time I keep my mouth shut. Now I just wanted to tell you that you are not alone. I have to use Mac OS at work. To avoid frustration I have started to let my soul and will leave my body before logging in and let my lobotomized shell I left behind just go with the flow. At least until Emacs is running full screen.


I don't suppose it's any consolation, but I feel the same way if I have to use Linux or Windows as a workstation. Different strokes for different folks.


I guess it depends on what work you do.

As for writing software macOS is nice since it is quite close to Linux and does contain a lot of the UNIX-thinking.

Windows is, IMHO, a user experience mess. Mac is more gentle and aimed to be simple and useful for everyday people. Windows feels like it was designed by a lot of different groups that in the end glued their pieces together in order to ship it.


> As for writing software macOS is nice since it is quite close to Linux and does contain a lot of the UNIX-thinking.

You know what's even closer to Linux?


OpenBSD?


Nice thing on windows is I can open the command line type >bash and I've got vim, ssh, sftp, bash whatever immediately. I still don't use that because other telnet, x-window setups are better for me. I never found out how to be productive in OSX vs what I can do in Windows.


> Nice thing on windows is I can open the command line type >bash and I've got vim, ssh, sftp, bash whatever immediately.

Yes, macOS has a terminal emulator as well.

> I never found out how to be productive in OSX vs what I can do in Windows.

Open Spotlight, type in "Terminal" and hit return.


Unsure how that magically makes OS X productive. Stuff is just easier and faster on windows or linux. And I’m tired of OS X randomly crashing or freezing. It’s like windows 98 now.


>Stuff is just [...] faster on windows

Windows is the slowest of OSes, see http://www.bitsnbites.eu/benchmarking-os-primitives/


Then how does opening the command line in windows magically make it productive? They're just citing a counterpoint.


It's a trendy way to start Emacs. I used one just to use vim.


I was with you until the "I preferred Windows" part. Windows right now is the most infuriating piece of software I can imagine for a developer. Forced upgrades? Forced advertisement? Gigantic, expensive SDKs with incomprehensible versioning and install times measured in hours, no standard library installation facilities, no standard build facilities, a default compiler that is markedly inferior to the alternatives; it just goes on and on.


No idea where you're getting most of this from.

I was a C# developer as recently as 2017, and I'd say that I prefer Windows 10 to OSX as a development environment, but going through your points:

* Upgrades are handled by the system administrator at most companies, so it's unlikely that automatic upgrades will be set up if you've got anyone remotely competent handling your IT.

* I can't say I've ever seen an advert when working on Windows. There's some Cortana crap, but that takes a few seconds to click away, and you'll never see it again. It's no different to your standard desktop setup for OSX or Linux.

* If you're a .NET dev, it's extremely unlikely that you're paying for the OS or the platform, in the same way that you're not paying for OSX. Admittedly, Microsoft tools take an age to set up, but the latest versions of .NET and Visual Studio are much quicker - if anything, I spend far more time upgrading/installing stuff on OSX. Hell, sometimes setting something up on Homebrew will take longer than a standard Windows installation for a given tool.

* I'm yet to see a Windows machine, outside of a brand-new one, set up without the necessary .NET framework. If it's not on there, Visual Studio will install it for you. Again, not an issue.

* Not sure what you mean by build tools and a default compiler - IMO building/compiling is ridiculously easy for .NET apps, either through the command line or through Visual Studio.

In my view, as someone who has worked on all three sides (OSX, Linux (Debian), and Windows) I'd say that Windows is just as capable as the other platforms for its main use cases. Where Windows struggles is in its differences. It's a very different experience, and people from each side struggle to make the switch, and it's a switch where you feel that you can run before you can walk at times. You have your own way of doing things efficiently, but even though you're looking to do something similar on a different stack you're using entirely different tools.


I don't understand you guys; it depends significantly on the programming language you use. What you said would apply to certain languages only. For example, we use Delphi, and Delphi runs only on Windows; I believe C# programmers are in a similar boat too.


For C# there's .NET Core nowadays, which IIRC is a cross-platform, JVM/JDK-like thing that runs on Linux and everything too.


Developers really should be using Windows Server for Windows development, particularly if the software is going to be running on Windows Server in production. That solves at least some of the issues you mentioned.


Good point. But the license cost would be too much. I don't think there's an upgrade path to server from consumer. You have any helpful tips?


If you are a Silver Partner or better you don't have to pay to use Windows Server for internal purposes, including development and testing. It's not expensive to get that.


As mainly a Windows developer, with UNIX experience going back all the way to Xenix I can enumerate similar complaints about developing on UNIX.

Expensive SDKs? You should have seen UNIX compiler prices before GNU and BSD actually mattered.

Don't try to use a non-UNIX OS as if it were UNIX and the experience will be much better.


This is contrary to my experience. I've been on various Macs for years, tried to switch to a Surface Pro (i7 model) + WSL. Nice machine, but it didn't work. I ended up running Ubuntu in a VM just to get `npm install` to work reliably. And that was horrible and slow, even with VMWare. Installing Linux on that thing looks like a lot of work (there's a whole SurfaceLinux subreddit...)

You know what I ended up doing after nine months of this crap? I switched back to my trusty old 15" MacBook Pro. The backlight is dying, but it works a lot better for me than Windows or Linux!


I wish I could agree. Half of the time on my last Linux laptop was spent frustratedly trying to get things to work. The screen font was too small for some stuff, too big for others. It didn't wake up from sleep properly. Sometimes it didn't GO to sleep properly. It was death by a thousand cuts.


This has been my experience too. I love Linux, but it's just too buggy (and I had all these issues on a ThinkPad that was marked as 'Linux Ready'). It really does give me a nice environment for development, but the hours and days lost to fixing problems made it a total time sink. macOS on the other hand has all the tools I need and works incredibly well. I also cannot stand Windows; it's a total mess and drives me mad. So I am stuck with macOS, and for now I couldn't be happier.


Thing is, I could honestly say the exact same, but with Linux and Mac swapped around.

I had so many issues with the Mac, and all my Linux devices have just worked flawlessly for years without issues.


As a counterpoint, I installed Arch Linux on a Lenovo gaming laptop and I'm delighted with the result. Arch being Arch, I had to configure lots of things manually, but there are no bugs to speak of. Maybe you mean weird behavior instead of bugs? That would make more sense.


What does Linux give you that macOS doesn’t?

I use both a Mac and an Arch Linux running i3/awesomewm, and to me macOS is like any Linux distro with a user-friendly desktop environment.

I guess it doesn’t have an « official » package manager, but homebrew has most packages anyway?


Package management is core to the OS, not bolted on. I can change the desktop environment as I please, and potentially run different environments for different purposes, if I so choose. For any piece of software on the system, I can have a part in its development process, if I choose to. I like a non-GUI-centric system. I'm not limited to Apple's drivers, or Apple hardware in general (it's pretty and sleek, but I find some of the design choices grating).

I feel like Apple provides a computer and OS that are user-friendly to the general population. But it also seems like the whole culture is "No, don't do it that way. We've provided this method as the One True Way."


How is package management via apt any different than using homebrew? They're both equally "bolted on".


Apt is the update mechanism for the system. OS updates, application updates, etc. It's a core piece of software on a Debian-related system. Brew behaves more like an alternate software repository...it's not like you're using it to fetch your kernel updates in macOS.


Why does the method of updating the core OS matter? Unless you're a system admin, and have to do it a lot. If not, that's hopefully something you don't spend a significant amount of time on each day.

Most of my concerns with regards to the OS I use has to do with the stuff I do 10s or 100s of times a day.


> Why does the method of updating the core OS matter

Because updating Arch is about 100x faster than updating a Mac to a new version. The loading bar looks nearly complete and then "About 17 minutes remaining."


I’m not sure Arch Linux is the best thing to compare it to. It might update quickly, but it might not boot up next time.


That's a popular myth, but nothing more. I've been running a single Arch install for close to 4 years now, no problem.


> Why does the method of updating the core OS matter?

Because it's part of the answer to the question "How is package management via apt any different than using homebrew?", and (I think) supports my assertion that Apt is more of a piece of core OS functionality than Brew is.


So, it's more of a philosophical hang-up than a practical one.

I love apt, but I have no philosophical stake in the game. Both allow me to install things from command line. In that respect, they are functionally identical to me.

Then once a quarter or half-year, I need to do an OS upgrade, and then I use two different systems depending on platform (I use both regularly). Let's say one takes 15 minutes and uses apt, and the other takes 45 minutes and uses App Store.

Then I amortize that over the preceding three months, and in both cases the attention required, confusion created, and effort expended approaches zero rapidly, regardless of system.

Like I said, if you're a system admin, then sure.


try 'brew install gnome-desktop' maybe?


Wait this works? Gnome shell? The full gnome experience?


Nope :-/

        $ brew install gnome-desktop
        Updating Homebrew...
        ==> Auto-updated Homebrew!
        Updated 3 taps (caskroom/cask, caskroom/versions, homebrew/core).
        ==> Renamed Formulae
        php70 -> php@7.0

        Error: No available formula with the name "gnome-desktop" 
        ==> Searching for a previously deleted formula (in the last month)...
        Error: No previously deleted formula found.
        ==> Searching for similarly named formulae...
        ==> Searching local taps...
        Error: No similarly named formulae found.
        ==> Searching taps...
        ==> Searching taps on GitHub...
        Error: No formulae found in taps.


I only realized later that it wouldn't really work anyway, judging from all the missing dependencies.


Uniform platform support for multiple architectures. I have Linux desktops on i686, x86_64, arm and aarch64. Same desktop environment, same programs, same easily mirrorable configuration, same firewall system, wireguard, same or fairly equivalent package management between systems.


I work on container-tech and the absence of namespaces and cgroups in the macOS kernel is a continuous source of frustration for my team since you need to work through a VM abstraction for Macs.


What you're talking about is way over my head, but I do understand Linux might be more appropriate for more low-level stuff and hardcore users like you at the kernel level. I'm happy doing React on my Mac :)


> What does Linux give you that macOS doesn’t?

perf, case sensitive file systems, non stupid alt-tab behaviour, strace, pstack, gdb (these don't seem to work without sacrificing animals), gnome-shell (better than finder by a long way, imo).

I've got a mac. I don't install programs except for things through brew. It's basically shitty linux with outlook.


perf, strace, and pstack can be replaced with dtrace on MacOS.

You can have a case-sensitive partition. (Separate from the root volume so it doesn't break some apps.)
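
Rough sketches of both, in case they help (size, volume name, and the pid are arbitrary examples):

    # dtruss is the dtrace-based strace lookalike that ships with macOS:
    sudo dtruss -p 1234

    # A separate case-sensitive volume via a disk image, without touching the root filesystem:
    hdiutil create -size 40g -type SPARSE \
        -fs "Case-sensitive Journaled HFS+" \
        -volname CaseSensitive ~/case-sensitive.sparseimage
    hdiutil attach ~/case-sensitive.sparseimage    # mounts at /Volumes/CaseSensitive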


Do you need to add dtrace to the key chain each time it's updated or do they just not update it like most of their command line software?


What do you mean "add dtrace to the key chain"? I've never done anything related to keychain while using dtrace.


gdb needs to be added to the keychain as a signed program in order to attach to programs. But brew is a rolling release so I might need to add this frequently. It's painful.


DTrace is built-in software.


I took some time to try dtrace out. Sadly it's broken unless one turns off SIP:

https://apple.stackexchange.com/questions/193368/what-is-the...


Only when tracing the operating system, not your program. But yes, that is still a shame. You don't need to turn off SIP globally, you can just enable DTrace while keeping the rest of SIP on.
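
Concretely (this has to be run from the Recovery partition, Cmd-R at boot):

    csrutil enable --without dtrace    # keep SIP on, but let dtrace attach
    csrutil status                     # back in the normal OS, confirm what's enabled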


Thanks for the tip!


I was trying to work with some students with Swift, and the Mac character encoding ended up causing no end of problems.
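
If it was the usual filename-normalization issue (HFS+ stores names in a decomposed form), which is only a guess about our case, the effect is easy to see:

    # The same visible string in composed (NFC) vs decomposed (NFD) form:
    # equal to the eye, unequal byte for byte, so comparisons quietly fail.
    python3 -c "import unicodedata as u; s = 'é'; print(u.normalize('NFC', s) == u.normalize('NFD', s))"   # False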


> the Mac character encoding

You mean UTF-8?


..with outlook?

Sorry man, but you’re doing it wrong.

And the Mac file system is case sensitive.


Alt-tab behavior is more of a preference thing.

>It's basically shitty linux with outlook.

Linux is shitty Linux. Except on the server. (IMO, of course).


I can accept that alt-tab is a preference if you want to use alt-tab and alt-`. It's heresy to prefer alt-`, but whatever - the vim-spaces-only (but automated code formatters are best and I don't care what they use) alt-tabbers will eventually win out.

But on multiple desktops if you alt-tab to the previous application again, it only brings up the application windows in the current screen instead of bringing you to the last window you used (on another screen). Wrong! Broken! Sad!

Also, on mac alt-tab raises ALL the application windows. So if you have shed loads of terminal windows or loads of browser windows open, then alt-tab brings them all to the front. This is definitely broken since it stops common workflows like copying between windows; or finding some text that you want to type into a terminal and then alt tabbing to the terminal only to have the screen covered in terminal windows. SAD.

Mac is low energy (That's the reason I think I have it for a laptop).


I find separate inter-app cycling (cmd-tab) and intra-app cycling (cmd-`) much superior, faster, giving you better control.

However, agreed that on a Mac with multiple "spaces" it's completely broken.


Compared to an optimized Arch, or even a non-overloaded Windows 10, the Mac never appeared to me to be very energy-friendly.


> What does Linux give you that macOS doesn’t?

Stable, long term, OS support. I'm running CentOS 7 x64 now, as I got completely sick and tired of Apple's "new OS release every year" bullshit.


You can run things like the Adobe Suite without any ridiculous overhead. Where Xcode is important you've also got that.

For a lot of people that don't care the differences are largely subjective and the difference between Ubuntu, for example, and macOS are largely academic and aesthetic.


With academic, I'm not sure if you also include 'philosophic'. Personally, I'd rather develop on mac than on windows but I'm the happiest on Linux.

A large factor in this is that I like to contribute to FOSS and find that 'ideology' to be a match with my beliefs regarding software.


For work I prefer the macOS environment partially for the software, but mostly because the machines are standardized and interchangeable. If one machine dies I can swap it for another without any fuss. Restore from Time Machine and get on with life, something that takes about an hour or so.

This is really not the case with Windows or Linux. These require a lot of tinkering and tuning. A recent swap from one Windows 7 machine to a Windows 10 one took days, the migration procedure is basically garbage.

I've never had much luck with desktop Linux even though I use it all the time on servers but those get rebuilt with a new OS when they're out of date. Upgrading them is just too much of a fuss.

If you've got a workflow for keeping desktop Linux up to date and rolling over from one machine to another as you upgrade hardware, that's worth sharing.


> Upgrading them is just too much of a fuss.

Switching to a rolling distro will eliminate the upgrade pain. Keeping dd backups is also relatively easy.

And unless you're on a custom kernel, you can just roll over to a new machine with your image and the appropriate kernel modules would get loaded for the new hardware at boot.
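
For example (device and host names are placeholders; double-check the device before running anything like this):

    # Image the whole disk to another machine over SSH:
    sudo dd if=/dev/nvme0n1 bs=4M status=progress | ssh backup-host 'cat > laptop.img'

    # Restoring onto the new machine's disk is the same pipeline in reverse:
    ssh backup-host 'cat laptop.img' | sudo dd of=/dev/nvme0n1 bs=4M status=progress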


> What does Linux give you that macOS doesn’t?

For one, you largely don't pay strategy tax, which is under discussion in the very title of this submission.


freedom.


Strange, I have the complete reverse experience. Mac is by far the best. Windows is worst.

I use Linux at work daily but miss my mac. It is much more unstable and unpolished. The app selection is really weak and the integration between gui and console is weak although better than on windows.

Windows is an utter mess these days as Microsoft is jumping between so many different UI paradigms.


Can you give examples of the GUI and CLI integration? I have a mac, but mostly use a Linux desktop, so I don't know what I'm missing.


Drag and drop an item from Finder to the terminal and it expands to its full path. Not sure if Linux does something similar as I'm not a regular user.


It does, under a modern DE; KDE/Plasma does this.


No comparison is fair if someone hasn't used Plasma.

I could not care less about "freedom" or these other philosophical aspects of Linux.

Plasma is just straight up amazing on every level.


I also quite like the 'open' command to do the opposite.


The Linux equivalent of this one is "see", an alias to the run-mailcap program, which on Ubuntu is in the "mime-support" package, which is probably installed by default.


Drag and drop almost anything into the CLI and it produces a sensible result. The open command works as if you had double-clicked the item in Finder, which means it can be used to open directories, launch the program associated with a file, etc.

In addition, because the Mac uses Command+C and Command+V for copy/paste rather than Ctrl+C and Ctrl+V as Linux and Windows do, you feel no different working in a CLI than in any other app. You don't have to mentally jump in and out of two different ways of working.

This extends all the way into GUI apps. Typical CLI key bindings work in all Mac GUI apps. I can use readline keys such as Ctrl+A and Ctrl+E for moving the cursor, for example. That works even in office apps like Keynote and Pages.

It is very frustrating to not be able to use these Unix conventions on Linux!

There are lots of little things like this which makes a superior experience IMHO.


I only use KDE, so I can't comment on Gnome etc.

Dragging a file into Konsole (KDE's terminal) gives a menu with the options "Copy here", "Link here" and "Paste location". Seems reasonable.

Dragging a hyperlink or image from Chromium gave the same options. "Copy" downloaded the file, although it set a timestamp in 2106 for some reason. This doesn't work from Firefox.

Dragging selected text pastes it in.

"see ." opens the current directory in the file browser thing, "see thing.odt" opens LibreOffice, "see my.pdf", etc. "see http://example.org" doesn't work, although I can right-click the link to open it. (Naturally, "alias open=see" if you prefer that word.)

I set a custom shortcut for Konsole for "Super+C" etc (Windows/Cmd key), but I don't use it very often. I mostly select + middle click to paste, which is a Unix convention I miss on a Mac! The readline keys are nice, they seem to work about 80% of the time on a Mac, and I haven't found a way to get that working in Linux.

My "lots of little things" favours KDE. Properly maximizing a window, having a "keep above" button for any window, focus-follows-mouse, and the general feeling that the computer does what I ask in a boring way, not what it thinks I want in a stylish way.


So why are you sticking with it? Go out and get a nice Linux machine. Expense it. If you're working at a company too square to approve that, install Linux on your Apple laptop. If you're working at a company that won't even allow that, well, you have my condolences. Things to ask before you sign, I guess.


This works, but only for simple environments. Given a work environment where you have a choice of: a) prepared environment with all application dependencies available and a tested-by-everyone, one-click-install/update on a bad system, or b) preferred system, but you have to do all of that from scratch yourself... Sometimes the reasonable answer is "a", even if you're allowed to do "b" (on your own time of course)


I've always been a b guy, and it's helped me understand how things work. One time, I joined a company, and two weeks after joining, poking around with a weird custom machine setup, I found that the machine was running a world-writable batch file as SYSTEM upon boot. Who knows how long this huge gaping security hole would have gone unnoticed if I hadn't felt compelled to poke around with Cygwin?


I asked for both a and b.

I could make do with an older computer for Office and Outlook.


Windows is good, but getting hardware working is a lottery. I recently built a PC with pretty standard components (Asus MB, i5 CPU, Asus GTX 1060 GPU) and installed Windows 10. It worked fine until I enabled Hyper-V; with Hyper-V enabled it BSODs a few times a day because of a buggy NVidia driver, so I had to disable it. macOS is pretty buggy too and I have experienced crashes, but they are not that frequent, maybe a few times a month or less.


I must be very lucky. I haven't seen a BSOD in 5+ years now, while using Windows nearly every day.


Maybe I'm unlucky, I don't know. The hardware seems to be fine, because with Hyper-V disabled a long stress test passes without issue. I'm trying to prepare a bug report, but NVidia doesn't seem to even have a proper Bugzilla, so I'm not sure it'll go anywhere. I hate having to interact with corporations.


What exactly does your development workflow look like where Linux was so much better than MacOS?


Apple frequently introduces changes to its OS that cater to the average user without any consideration for developers. A good example I had to deal with: I wanted to change the port the SSH daemon listens on. With El Capitan and later versions you have to temporarily disable SIP (System Integrity Protection) - which requires two reboots - in order to make the change to the ssh.plist file. As you can see it's a common problem: https://apple.stackexchange.com/a/208481 It's lots of little things like this that require extra effort on OSX but are straightforward on a decent Linux distro.
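
To make that dance concrete, it's roughly this (a hedged sketch for the 10.11-10.13 era; the plist path and keys may differ on other versions):

  # 1. Boot into Recovery (Cmd+R) and disable SIP, then reboot:
  csrutil disable
  # 2. Back in macOS, edit the launchd job that controls the bundled sshd
  #    (the listen port lives here, not only in /etc/ssh/sshd_config):
  sudo vim /System/Library/LaunchDaemons/ssh.plist   # change Sockets > SockServiceName to e.g. 2222
  sudo launchctl unload /System/Library/LaunchDaemons/ssh.plist
  sudo launchctl load -w /System/Library/LaunchDaemons/ssh.plist
  # 3. Boot into Recovery again and re-enable SIP:
  csrutil enable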


Flatly, this isn't true.

I can still go into /etc/ssh, sudo vim sshd_config, change my values, and they stick.

Just to test it, I did it right now, and I'm running 10.13.4, for what it's worth.

What exactly was the issue with changing the sshd config? This isn't a SIP-protected directory, which is the only thing that would prevent such a change.

Perhaps the requirement of sudo? I feel like most Linux distros require that in the /etc directory too.


Changing /etc/ssh/sshd_config was my first attempt as well. But since this is OSX things are a little different, see here: https://serverfault.com/a/67616/


  this security feature that is great for the vast majority of Apple's user's isn't convenient for me.
Luckily, Apple shipped SIP with the option to disable it, and it's not hard. So you can disable it once and then you never need to deal with issues like that again. It's weird because it sounds like you want the protection of SIP without the inconvenience of SIP, but that's never been possible with pretty much any security measure ever -- more safety means less convenience. That being said, to me Apple has actually been the best when it comes to the safety/convenience ratio. Linux distros don't even have the option of SIP or something similar, so I can't say I find your argument compelling.


It's not the fact that the SIP is there which bothers me. It's that Apple introduces these kinds of things with little notice and without caring if they break compatibility. This has always been Apple's approach and it's just not friendly for developers.


Is there anything stopping you using a separate sshd to run on a different port? Or using one from homebrew instead of Apple's bundled one?


What was the workflow in which windows was better than anything UNIX based....


I’m missing the logic as well.

Linux > Windows > macOS?


Desktop software development, graphical debugging tools.

The only UNIX based that tops it is macOS.


But Linux is better for that?


Not at all. Linux has a command-line culture, where people improving UI/UX on GNOME/KDE always get bashed for taking power away and doing irrelevant work.

There is not a single cohesive stack of desktop technologies, like what macOS calls Kits, Android's frameworks, Windows UWP and such.

Something like Glade still falls short of what Xcode or VS Blend are capable of.

Sure there is something like Qt QML designer, but that isn't Linux specific anyway.


You are missing context. Grandparent was claiming that Linux was the best OS he’d used for development (fair enough), but that he’d rather have Windows over macOS otherwise. That part I was hoping for clarification on.


Not OP, but the LLVM toolchain provided by Apple is a bit clunky and missing features relative to what you get from a typical Linux distro.


Something I don’t quite understand, coming from a scripting-language background: why are you using your OS’s provided compiler toolchain (for anything other than building OS packages for distribution)? Is there no version manager for clang the way there is for e.g. Rust?


Speaking about the classical C/C++ mainstream, and more from the Linux perspective:

Tooling for C and C++ mostly relies on some external package manager, often the OS-provided one (on most Linuxes, for example). There isn't a standard cpan/npm/pip/cargo for C/C++, although there are plenty of tools that can do kind of the same thing.

There's also not much support for virtual environments (there are tools out there, but not ubiquitous tools). It's pretty easy to point the compiler to a different set of header files and libraries, even on a per-file basis, to get a similar effect.

And from the Apple side (which I have a vague understanding of, having dipped my toes in a few times): Most of the documentation assumes that you're using XCode, and I'm pretty sure that the version of the compiler is just tied to whichever version of XCode you're using (which has a somewhat looser tie to the version of MacOS you're running). So in that case, you'd be using the XCode-provided toolchain rather than the OS-provided one.
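
As a hedged illustration of that "point the compiler elsewhere" trick (the ~/sdk path and library name are hypothetical):

  # use a locally built library instead of whatever the system/Xcode provides;
  # -I/-L swap the header and library search paths, -rpath fixes run-time lookup
  clang++ -std=c++14 main.cpp \
      -I"$HOME/sdk/mylib/include" \
      -L"$HOME/sdk/mylib/lib" -lmylib \
      -Wl,-rpath,"$HOME/sdk/mylib/lib" \
      -o main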


You can install GCC from Homebrew if desired. Personally I don't bother because, well, Apple's clang works just fine!


> Is there no version manager for clang the way there is for e.g. Rust?

Not an official one for sure, Cargo is a blessing for people used to dealing with C/C++ dependency management.


Macs ship with outdated versions of common bash commands and tools as well. Homebrew goes a long way to fixing this for me.
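
For anyone curious, that looks roughly like this (the gnubin path is Homebrew's usual one on my setup; it may differ on yours):

  brew install coreutils gnu-sed findutils
  gsed --version     # GNU tools install with a "g" prefix so they don't shadow the BSD ones
  # optionally put the unprefixed GNU names first on PATH:
  export PATH="/usr/local/opt/coreutils/libexec/gnubin:$PATH"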


I have the same problem on Ubuntu LTS distributions. Eg the ancient version of git in 16.04LTS.

You can fix that of course but homebrew does the same job on macOS.

Roll on 18.04LTS.


Homebrew exists on Linux. I like running LTS distros because they just work. For newer per-project exceptions, I put everything into a container (again layered on LTS); it is especially nice for things like specific versions of LLVM, which I wouldn't want to pollute my base machine with. My personal userland gets shipped via cargo, go, pip, npm, etc.

https://github.com/Linuxbrew/brew


I try to put everything our team does into a container for the same reason. I'd point out though that the version of docker listed by apt on 16.04LTS is also really ancient. Pre docker-ce.


Yeah, I install docker on to 16.04LTS via this install guide [0]. I'd happily use something else, and probably will, but docker is low friction.

[0] https://docs.docker.com/install/linux/docker-ce/ubuntu/
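
Roughly, that guide boils down to this (hedged; check the linked page for whatever the current steps are):

  sudo apt-get update
  sudo apt-get install apt-transport-https ca-certificates curl software-properties-common
  curl -fsSL https://download.docker.com/linux/ubuntu/gpg | sudo apt-key add -
  sudo add-apt-repository \
    "deb [arch=amd64] https://download.docker.com/linux/ubuntu $(lsb_release -cs) stable"
  sudo apt-get update
  sudo apt-get install docker-ce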


The most enjoyable and productive dev workstations I have used/set up have all used rolling-release distros for this exact reason.

It does have the overhead of requiring that you actually understand (or know how to google) the system/packages you use, though.


It's not just understanding the packages. When I upgraded from 16.04LTS to 17.10 I had to relearn how to compile the kernel because my laptop would no longer boot.

Wasn't that hard to do in the end but it's not something I've done for many, many years and doesn't fill me with confidence.


BASH is outdated, but I compile the latest version and install it in /bin (I have system integrity protection turned off).

Other UNIX utilities are actually the standard POSIX ones. If you are used to the GNU extensions on Linux, then the Mac ones may seem outdated, but ironically (the macOS kernel is called XNU, which stands for "X is Not Unix") the Mac is certified UNIX, while Linux is only UNIX-like.


StumpWM & a full GNU userland just like the machines we deploy on are killer.


It's sort of apples and oranges .... I'd thought the parent was talking about day-to-day consumer stuff (which even computer scientists and programmers do) like managing music, photos, various online accounts, and so on. Very different from all the stuff that software engineers specifically do. Of course there is some overlap but overall its two vastly different sets of user experiences.


I can concede Linux might be superior, but were you comparing apples to apples with OSX vs. Windows? Most Windows development seems to be on MS-based development stacks. A lot more of platform-independent development happens on OSX, I'd be willing to bet, and that iOS development on OSX would be commensurately productive as .NET development on a Windows machine.


The issue seems to be that many equate developer == UNIX developer, as if there weren't anything else.

Also, the UNIX underpinnings of macOS, just like on NeXTSTEP, were a mere convenience; the OS culture was never about crufty CLI programming, and many who only jumped to Apple after OS X still haven't grasped that.


Aww this thread is such a waste of time and space. Bunch of totally subjective thoughts without any specific examples.


What apple software do you need to use? Sure, it's annoying not having the GNU command-line utilities. But `brew install coreutils` basically fixes that for me. I admit I don't work heavily with compiled languages on macOS so I can't comment on technical aspects of that. iTerm2 is not inferior to linux terminal emulators. The Apple laptops are quite obviously the best hardware experience out there. I mean obviously iTunes is a complete insult to humanity, and I've no fucking clue how to use Finder etc, but does that matter really?


I mean the laptops are great.

Provided you don't need an HDMI port. Or more than 1 spare USB port. Or an ethernet port.

You could get the touch-bar version, but I've read literally nothing but bad reviews and stories about that thing, plus the TB versions have smaller batteries.

Yes, I could get a dock, but here's a better idea: how about Apple chills out on its unnecessary-thinness fetish and gives back some actual functionality in my laptop.


On the Touch Bar I agree, I hate it. But the Thunderbolt dock thingie totally rocks. It's really nice to only plug in one cable to connect all the hardware on your desk (multiple screens, ethernet, power, etc).


In theory the dock is nice but in reality they don't work at all. Does Apple make their own dock?

Every time I come back to my desk with my laptop I have to go through some ritual involving opening and closing the laptop lid and connecting cables in a certain order to maintain screen orientations and try to get the thing to even wake up. I power cycle my dock at least three times a week to try and make it all work

I used to have a 2014 RMBP with the Apple display and it was fantastic because it was all built by Apple and Just Worked (TM).

Hopefully moving to their own chips is the start of them going back to owning the entire device solution.


Yeah, this is why I'm so happy with a Surface Book. In a couple of generations once all the bugs have been ironed out I'm sure third-party Thunderbolt docks will be great, but for the time being I want a first-party dock.


TB3 is definitely an improvement over the TB2 ports on previous Macs. But there's no reason Apple couldn't build a laptop that has both TB3 and HDMI/USB-A ports.


Here's a counterpoint: I have mbp tb and I like it. Only problem I had was esc, but after a while you get used to it. I also like how light and thin it is, I can go to work now with my backpack pretty much empty.


I mean, I used to ride to work/uni with my 2009 MacBook in my backpack without any issue (along with textbooks, notes etc). Compared to today's machines that thing would seem like a tank, so I don't know why everyone goes on about these new ultra-ultra-thin MBPs like they're the first laptops to be portable, lol.


I've heard it said that the greatest minds of our generation are thinking about how to make people click ads. However, I feel like the greatest minds of our generation are spent trying to figure out how to enable people to build Linux software on OSX. Whether it is:

* a convoluted process involving Vagrant, Docker, or both, because you depend on one or more pieces of software that don't run at all on OSX, including Docker itself.

* slogging through bugs specific to services running on OSX because they really only support Linux well, such as Cassandra or Kafka, or even MySQL.

* Getting shell scripts that work reliably on both OSX and Linux (see the sed sketch after this list), especially as the tools involved break backwards compatibility, either from Apple itself or from 3rd-party tools like Homebrew.

* Getting a consistent development environment at all on OSX. Doing so seems to be much easier on a Linux distribution than on OSX.
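
To make the shell-script point concrete, here's the canonical BSD-vs-GNU divergence, as just one example of many:

  # GNU sed (Linux) takes -i with no argument; BSD sed (macOS) demands a backup suffix
  if sed --version >/dev/null 2>&1; then
    sed -i 's/foo/bar/' config.txt       # GNU sed
  else
    sed -i '' 's/foo/bar/' config.txt    # BSD sed: empty suffix means "no backup"
  fi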

And that's not even talking about general issues like:

* incredibly flaky bluetooth drivers, often requiring a full restart to fix, if not having to reset some weird hardware bit.

* My laptop randomly not waking up properly from sleep and requiring a restart.

* My laptop randomly beach-balling more often than I ever saw BSoDs on windows.

* OSX seeming to just run really slowly compared to Linux whenever it is stressed in any meaningful way.

* My laptop's wifi not working with random wifi endpoints, such as at the airport or hotel. Whether it's router software bugs or OSX bugs, I am always able to connect just fine on my windows laptop.

* Having to deal with OSX's incredibly outdated, BSD-specific userland. Yes, there are workarounds, but they are generally a pain to figure out and have not been standardized in our environment in any way.

* For a few months my OSX terminal was segfaulting about twice a week. I learned to be very grateful for screen/tmux during that time. At some point Apple seemed to fix it, at least.

At $lastjob we were actually developing a Linux service that relied heavily on Linux APIs because it was essentially creating full-on Linux containers like Docker. Before I joined, the previous developers, all big apple fans, were actually going through the tremendous effort of trying to make the service at least build and run on OSX, even if it no-oped most of the things it did.

When I joined, one of my first acts was to completely remove OSX support for this service, and I promise you that life got way easier ever since. Our development processes got simpler. Our build system got simpler. And most of all, our source code got simpler and easier to read.

Granted, most of us aren't developing software that directly calls into Linux APIs, but even then I think you'll find huge productivity wins if you just use a Linux laptop or desktop, assuming everything else is equal.


Everything you listed under your general issues section is my exact experience when running Linux. I don't seem to have any of those issues on OSX.


I did maintain a Linux laptop (Ubuntu) for all my personal and development usage, from 2003-2011. Based on my experience at that time, it is entirely incorrect to claim that the experience with device drivers (wifi, bluetooth) and sleep/suspend/hibernate is better on Linux. Those things basically never worked right, battery experience was terrible, and many times during that era I lost a day's work because something was broken early in the boot sequence, and I couldn't even start X Windows.

In 2011 I switched to macOS due to my job and I have never had to deal with any of that. Ever. Perhaps the Linux laptop experience has improved significantly since then though.

Furthermore, there was no hardware nearly as nice as Apple laptops on which to install linux. (Yeah, other than Apple laptops, but I felt like I didn't have the money to justify that).

However I of course agree with your points regarding Vagrant/Docker and shell scripting. It is a shame that MacOS could not be based on Linux.


The device drivers compatibility and availability on Linux these days has never been better. Even better than Windows and OS X out of the box in many cases. Seriously, there has been huge improvements in the last few years.

If you want to run Linux on a laptop and get the best experience, a Mac is not the best option. Thinkpad (business class X and T series) is what you are looking for.


Linux has come far in seven years. I haven't had issues with wifi or bluetooth for years on quality hardware (Thinkpad, Dell.) YMMV


I begrudgingly use an MBP now after many years dealing with similar issues on Linux. The hardware is great, things adjacent to hardware are great (drivers, external monitors, configuration), but I would prefer the windowing system to behave more like Gnome: things like Alt+click to drag a window, Alt-Tab cycling through windows rather than apps (and sticking to the same desktop workspace at that), menu bar location, etc.


FWIW Alt-Tab, for me, does cycle through windows not apps, and only windows within the same workspace. Check System Preferences => Keyboard => Shortcuts => "Move focus to next window".


For everyone who wants to experience the hell of developing software for Linux under OSX, I suggest trying to build the Apache httpd server from source on a Mac. It is a trivial task on Linux but a nightmare on OSX. I mention it because having a custom-built version of Apache is a common need for many web projects.
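
For reference, this is roughly all it takes on Linux (hedged; version number just as an example), whereas on OSX you first have to chase down APR, APR-util and PCRE yourself:

  # on Ubuntu the build dependencies are one apt call away:
  sudo apt-get install build-essential libapr1-dev libaprutil1-dev libpcre3-dev
  curl -O https://archive.apache.org/dist/httpd/httpd-2.4.29.tar.gz
  tar xzf httpd-2.4.29.tar.gz && cd httpd-2.4.29
  ./configure --prefix=$HOME/apache --enable-so
  make -j4 && make install
  ~/apache/bin/apachectl start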


Once you've got to the point of needing to match your production server technology, doesn't it feel like you should be using a VM or docker (assuming you're tied to MacOS as the development environment)?


Yeah, I don't understand this argument. Even all the devs I know working under Linux do all their development in Docker or VMs to match the live environment and to be able to switch between projects more easily. And even when you have only one project, who wants to run Debian stable or CentOS as their main OS?


I do use Docker, don't worry about that part.

But first of all, development without Docker is much faster. Setting up a Docker container takes some time (especially with a custom-built Apache server).

Secondly, running a Docker machine (which needs a Linux VM) on a Mac in parallel with Vagrant (I have to use that too) is CPU-taxing, as both VMs take a lot of power from the main CPU.

Add to this constant port conflicts and network path resolution issues between different Docker containers and hosts. It also forces me to use Oracle VirtualBox (who likes Oracle?). These little things all add up.

My point is that what should be a trivial task is not so trivial when deployment is done on one platform and development is done on another. I agree with the original comment that it probably makes sense for the industry as a whole to switch to Linux development workstations instead of Macs.

The only reason I use a Mac, besides company policy, is that OSX somehow renders fonts better. My eyes get tired looking at poorly rendered fonts on Linux machines. If you use a Mac and want to see what Linux fonts look like, install the SeaMonkey browser. It turns off the proprietary, patented algorithms available on OSX and renders fonts much as they look on pure Linux, where due to licensing issues many of the nicer font rendering techniques are disabled.


Use an intermediate container, and repeat builds shouldn't take very long.
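
Something like this, sketched with hypothetical image names: bake the slow Apache build into a base image once, then layer the frequently changing bits on top so repeat builds hit the cache:

  # base.Dockerfile - built rarely (the "my-team/apache-base" name is made up)
  FROM debian:stretch
  RUN apt-get update && apt-get install -y build-essential libapr1-dev libaprutil1-dev libpcre3-dev curl
  RUN curl -O https://archive.apache.org/dist/httpd/httpd-2.4.29.tar.gz \
      && tar xzf httpd-2.4.29.tar.gz && cd httpd-2.4.29 \
      && ./configure --prefix=/opt/apache --enable-so && make -j4 && make install
  #
  # app.Dockerfile - rebuilt on every change; fast, because the heavy base layer is cached
  FROM my-team/apache-base:2.4.29
  COPY ./site /opt/apache/htdocs/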


> Even all the devs i know working under linux are doing all their development under docker or VMs

I don't, because docker is a buggy, embarrassingly-poorly-designed system and VMs are a pain. I develop on the same system I deploy on: Debian stable.

> And even when you have only one project, who wants to run debian stable or centos as his main OS?

I loathe CentOS, but I love Debian stable. It's a wonderful, solid (one might even say … stable) system. Why wouldn't I want to run it as my main OS?

One of the things I really hate about our development culture is the cult of the new. It's a good thing to use stable, well-tested systems. Let others find the bugs; I'm happy to get work done.


> who wants to run debian stable or centos as his main OS?

I haven't used these two in particular, but if they support flatpak, why not? You'd have the same packages as on the server plus the latest versions of GUI apps.


Interesting point, is flatpak already a mature thing?


I've used flatpak on an Ubuntu 16.04 base without issues. There are limitations, for example you can't easily replace/update your window manager through flatpak or snap. But for user-facing apps, it's really a great step forward.


The macOS system shell, python, compiler etc. are out of date, and out of your control. They're good enough for basic tinkering. However, as soon as you develop on a Mac professionally, you should create your own dev environments and stack independent of the macOS maintained one, starting with brew for example.


I thought it was common knowledge that you should not rely on system provided Python, Ruby, Tcl etc. They are there for scripting the system and nothing else.

If you are a Python developer, install the specific version your project needs and use that. Same for any other language.
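
For Python, for example, something like pyenv does this (just one option, not the only way; the project path is illustrative):

  brew install pyenv
  pyenv install 3.6.4        # the interpreter this project needs
  cd ~/projects/myapp        # hypothetical project directory
  pyenv local 3.6.4          # writes .python-version, pinning this directory to 3.6.4
  python --version           # 3.6.4 here; the system Python is untouched elsewhere
  # (assumes pyenv's shims are active, i.e. eval "$(pyenv init -)" in your shell profile)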

Not sure what you mean by outdated compiler? For what? C, C++ etc are distributed with Xcode and clang is usually standard compliant and recent.

BASH is the only problem. Newer BASH versions switched to GPL v3 and Apple will never upgrade to that.

My personal solution for that is to disable system integrity protection, build latest bash and install in /bin.

This way most scripts that have #!/bin/sh or #!/bin/bash and rely on new BASH features just work.
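
Concretely, it's roughly this (hedged; version and paths as examples, and with SIP already disabled via Recovery):

  curl -O https://ftp.gnu.org/gnu/bash/bash-4.4.tar.gz
  tar xzf bash-4.4.tar.gz && cd bash-4.4
  ./configure && make
  sudo cp bash /bin/bash     # replace the ancient 3.2.x that ships with macOS
  bash --version             # should now report 4.4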


You always ought to build on the same hardware you deploy on - to be blunt having multiple developers messing around on n different macs is just madness!


Just use docker or a VM for matching your production environment. Then your devs can use whatever device they're comfortable with. Add in a CI environment and then suddenly you don't need to force the same OS on everyone. Everybody likes something different.


It's better than nothing, but you have still added extra risk by doing it this way:

Are all the VMs exactly the same?

Can you 100% prove that the VM behaves identically on all the varied hardware?

And no "professional" would consider "everybody likes something different" to ever be valid for a paying job!

No problem if it's a hobby project, but on a project with even a small team of say 5 or 6, the risk isn't worth it.


Once again, this is why CI is important, it acts as a final test to make sure things build and run correctly. If your deployments go through the entire CI process before going anywhere important, the source of the work (developers' particular workstation setup) is not as important.

> Are all the VM's exactly the same

Is the silicon die on your processor exactly the same as everyone else on your team? If not, you're not a TRUE professional.

> Can you 100% prove that the vm behaves identically on all the varied hardware.

No, but I also recognize that I (and many other devs) are not writing space shuttle / train control / self-driving car software that is responsible for human lives. If I were, I wouldn't be advocating for the style of development I mentioned above. That is a different situation which I imagine most people on this site are not dealing with.

> And no "professional" would consider "Everybody likes something different" is ever valid for a paying job!

What does this even mean? Of course it can be "valid" for a paying job. There are literally companies out there who offer different OSes and machines to use for a job and the developers get paid. You get to deem they are not "professional" because...?

> No problem if its a hobby project but on a project with even a small number say 5 or 6 the risk isn't worth it.

The risk doesn't lie with the number of people on a project but the scope of the project itself: basically, can human lives be impacted in a significant way if a dev screws up? Then yeah, there are better, more rigorous ways. Are you writing a web app, a desktop GUI, some CLI tool, or perhaps...the Linux kernel? Then develop on the machine you like! Have a good review process, set up CI to build for the platforms you support, and implement a good set of tests.


"Is the silicon die on your processor exactly the same as everyone else on your team?"

Of course. I have worked on projects where all the test, dev and live hardware was explicitly bought from the same production run, so the hardware was identical down to the rev number of the PCBs - our hardware guy would have liked all the disks to be from the same production run as well.


Can you "100% prove" that your destop Linux distribution has the same kernel modules and packages, and that none of the versions or patches diverge from the server your project is destined for?


I don't even know what hardware I'm deploying on. Amazon/Google/Azure has abstracted that away from me.


> What apple software do you need to use?

Not sure if you're asking about "Apple" software (eg from Apple themselves), or OSX Software.

Personally, I miss Affinity Designer and SnagIt. Both are OSX applications I've paid for (though not made by Apple), and which have not-as-good-to-use equivalents on Linux.

I still find myself switching back to OSX when I need to get things done using them. Rare now, but still happens.


Quite obviously the best hardware experience? That doesn't seem obvious. I've been running Linux on Apples for many many years, but mid last year I bought a Dell. I'm certainly satisfied with it; I definitely miss the magsafe power adaptor, but that was a trade off I knowingly made. It seemed on balance that Dell had the better product.

What was so obvious that after checking out the options, I bought the wrong hardware?


I like Linux better than Mac which I like better than Windows. I don't use Linux for work right now because HiDPI support just isn't there for a lot of Linux software.


What has given you HIDPI trouble?

I have Kubuntu at home and a 4k screen, and also at work with a "3k" screen. I don't do development at home, but the general desktop works fine.


You're lucky that you get to have just one dev environment. I have 2 Windows, 1 Mac, 2 iOS and 1 Android device on my work desk and I have to constantly switch between them.

That said, I do most of my heavy lifting on the Mac, and all primary dev on a Mac. Doing the equivalent tasks is a chore in Windows. I used to be all-Windows, but once I understood that MacOS is file-oriented and not program-oriented, it helped a lot.


Really? I've always found Mac OS unpleasant, because it seems to me very program oriented. I typically work in a task oriented way (a collection of windows, probably from different apps, on one virtual desktop, referring to related files; another virtual desktop has windows from the same apps for another task).

This has always seemed to me to be a species of "file oriented" working, and I prefer apps that behave in a file-oriented way. For instance, a graphical file manager that shows a folder in a window, where clicking to open an item in the folder always gets you a new window - even if the item happens to be just another folder opened by the same file manager. Or office apps which always open each document in a new window, and when I close the current document, the only thing I notice is that the current document is closed - it doesn't try to focus some other window from the same app.

But Mac OS has always seemed an almost perfectly application-based interface, with its fully application-based dock and its application-based global menu (close a window for a file that was loaded by Cool App, and Cool App's menu still shows). I think its Finder is a bit confused: it used to be mostly file-based, at some point in the OS X years it got more and more application-based, but perhaps it's swung back - it's been a few years since I've bothered trying Mac OS. It had some of the most appalling virtual desktop support when I last used it, which made me think "the people who implemented this only work in an application-based way". I can only imagine that's got better. But it seems that in Mac OS the application-based workflow is always prioritised, and the file-based workflow is secondary.

I have found that Linux can[1] excel in this workflow, Windows is ambivalent to all workflows, supporting all of them badly because it supports none at all, and Mac OS prefers you to think "what tool am I using", not "what task am I doing".

Is my experience entirely unique?

[1] i.e. it depends on what particular tooling you're using - it's possible a default setup is appalling.


I'm a software developer too. I've held senior engineering roles at Microsoft, Apple, and Intel (in that order). I grew up loving Windows but soon after getting really deep into development that love vanished.

Anything I do today (everything from Intel microcode, x86 assembly to C/C++) happens on macOS simply because I can do every single thing I need to in one place. Most devs I bump into that really hate macOS have no idea it is really just (open source) BSD with Apple's oddly unique visual facade. There is literally nothing I can't do on my Mac even during the times I run Visual Studio, VTune, etc. I also find it amusing when devs tell me that macOS isn't as customizable as Windows. Sure... sure... ;-)


That's the question. Will you be able to continue developing for non-Apple CPU on a Mac??? That's not what Apple wants.


You seem to be conflating macOS with Macs. Have you tried running Linux on Apple hardware?


Not GP, but I'd like to share my perspective.

Apple creates its products for a specific customer (which, judging by their popularity among people who can afford them, covers most of the population), but not for me. Using Macs or iThings feels like using devices that aren't designed for my workflow.

Macbooks are well-built, but they are optimized for Mac OS. Dells and Lenovos are good enough for me and work nicely with Linux. I like physical buttons and keyboards which are... well, it's hard to say what I don't like about Mac keyboards, but they don't work well for me.



What parts of your workflow do you think are uniquely enhanced by not being on Mac?


For me,

1. Getting packages with apt on ubuntu without installing homebrew. It's vastly simpler and natively supported by ubuntu. I can also update my entire system with sudo apt-get update && sudo apt-get upgrade.

2. Native docker support.

3. On mac, I always ask the question whether something is ctrl-c or cmd-c, say. On linux it's always ctrl.

4. I don't have to login to the apple store to get the software I need.

5. Linux has always been command-line first. On mac it's always been GUI first.

6. Macbooks are terribly built, and fail for a number of reasons. [1]

7. The Apple brand, while it used to stand for the homebrew/hacker spirit, is now akin to a fashion label. It's kind of like seeing all the kids walk around with their "Hollister" t-shirts on.

I could go on I suppose but those are the meat of the issues.

1: Louis Rossmann's macbook repair channel. https://www.youtube.com/watch?v=sfrYOWlKJ_g


2 - Docker works flawlessly on macOS.

4 - I can't think of a single application, outside of maybe Xcode, that a developer might need to sign into the App Store for.

6 - Whatever reasons that YouTuber might have for not using MacBooks, I will never buy a non-Mac notebook ever again. This 2013 13" MBP has been the best and most reliable computer I have ever had in 30 years, and I bought it used!

7 - Sure, the Apple brand is cool, but I don't get your point. Doesn't it mean that you care about the branding so much that you won't use it? Do you see where I'm going?


> docker works flawlessly on macos

He said nothing about "fine". Docker on Mac is through a VM and thus not native. Containerization in MacOS is impossible without a VM.


Exactly: on Linux the container has the entire CPU and memory space to use, since it's just a mapped namespace.

I will add that being able to see the containers' PIDs in ps and other tools is more useful than most Mac/Windows users realize.

xhyve died on the vine it appears, but did make things a bit better.
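
A quick illustration of why that visibility is handy (the image name is just an example):

  docker run -d --name web nginx
  ps aux | grep nginx                            # the container's processes are ordinary host processes
  docker inspect -f '{{.State.Pid}}' web         # the container's init PID as the host sees it
  sudo ls /proc/$(docker inspect -f '{{.State.Pid}}' web)/ns    # ...and the namespaces it lives in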


Containers, sandboxes and zones are all different names for the same thing. macOS and iOS actually have quite good sandboxing support.



6: Go watch a few videos. He repairs macbooks -- for a living. Apple just keeps sending business his way.

Pro-tip: If you spill water on the keyboard, shut the MBP off immediately and send it in to the apple repair shop to get cleaned. If the MBP is on while there's water on the logic board, rust will form, either short circuiting chips, or rusting out the copper traces.

7: Apple used to be an amazing technology company giving the world wonderful computing devices. Today, it's a fashion accessory company. Mac OS doesn't run in the cloud. It is not the core OS that runs our infrastructure or space programs. And lastly it doesn't capture our imagination anymore. That throne has been passed to Elon Musk with Tesla, SpaceX and Hyperloop.


> 2 - docker works flawless on macos

Lol, nope. At the job I just left almost all of our team had MacBook Pros. We ran our entire dev stack with docker-compose and wasted so much time dealing with broken docker crap that ultimately stemmed from the fact that you're not running docker natively, but rather in a VM.

I wish I could still query the Jira database. I could throw out tons of specific issues.


I bought a $3000, 2015 MBP with serious heating issues, which I was told is "normal". That pretty much killed any idea I had that I was dealing with quality devices.

I knew they were fragile; I didn't know that using the actual CPU performance for more than a few minutes was a problem.


>On linux it's always ctrl.

Unless you want to copy from the terminal, where it is ctrl + shift + C.


I highlight the text and middle mouse button for paste.


Didn't you say you liked command-line interfaces, not GUIs?


I'm not sure how using a mouse to select text means that you're using a GUI. I spend most of my working day in an Emacs window, which is ostensibly text, but it doesn't mean that there aren't cases where selecting some text with the mouse is faster than moving the cursor to the text by retyping it. People have been using mice with computers before windowing systems were even invented.


And on a Mac, you do Cmd+C. Neither maintains uniformity.


On macOS, it's very simple, consistent, and intuitive:

GUI menu shortcuts are invoked with Cmd, and that includes copy/paste.

CLI shortcuts work as always (and there's no overlap, because of the extra Cmd key).


ctrl-C not command-c to exit a process.


I'm also a software developer and I also have a Linux/Mac mixed environment, and I feel no difference between the two.

To be honest, Mac does many things really differently. Yes it feels and sounds like double-think, but I do everything differently from my Linux desktop, and I can do the same things at the same speed, if not faster.

On the average, I use them equally, and can do the same thing on both.

One footnote: I do not install anything via Homebrew or anything massive that installs deep into macOS. For that stuff I have a Linux VM, which is fired up rarely.


Maybe they also have an unpublicized reason to build their own - the backdoors on Intel/AMD/Qualcomm/IBM chips. But building their own isn't a guarantee that they won't put one in for "concerned" parties.


Find me a 15" 16:10 or 4:3 laptop with a *nix based OS, 4 cores, and 16GB of RAM and I'll switch any day. Unfortunately, there's only one choice for all those requirements right now.


https://www.maingear.com/custom/notebooks/pulse15/index.php

This is what I have used since 2016 as my personal machine. Comparable in price to a MacBook Pro and easily outperforms an MBP. Specs on mine:

Intel Core i7-6700HQ (4 cores, 3.5GHz)

16GB RAM

15" 4K Display (1080p is standard, I upgraded. No regrets).

GTX 970M

256 GB SSD (system drive) + 1TB hard drive

Installing and running Ubuntu has been a breeze.

Like a MBP it has an aluminum body and weighs less than 4lbs.


XPS 15 and Aero 15 are very *nix compatible. I think the XPS comes in a Developer Edition that ships with Ubuntu by default.


The Dell 7520 DE?


Thinkpads?



My hero. Pretty much I'm the only one in the whole webdev team who prefers linux over mac :D Fortunately now that we have figma, nothing ties me to win or mac.


I'm a fan of using a Mac and a Linux VM. I use "Spaces" to switch between macOS and Linux.


> Not much, at least, not much that benefits me as a customer.

Why would it matter to you, as a customer, what chip is inside your computer? As long as it can run your programs, why do you care?

See also: Microsoft and Windows on ARM


> in my experience apple software is flat out inferior and OSX is the worst

It is almost absurdly bad. The only OS that still hangs, freezes, and crashes regularly. It is like Windows 98 quality wise and seems to get worse instead of better.

The whole UX is also insane. Every feature is hidden behind some obscure keyboard shortcut that you have to google or you just get used to working with this useless toy os.

The terminal is garbage. Everything is slooooow as fk (typing, mouse, etc).

It is shocking that the internet industry has standardized on working on this garbage when they run Linux on their servers and would be far better off developing on the software they actually use.

It just demonstrates the cult mindset and horrible lack of real technical proficiency in the industry.


> The only OS that still hangs, freezes, and crashes regularly.

Tell that to my Windows 7 box. It crashed last week because I tried to copy a PNG on the desktop into an Outlook e-mail message.

(IT still hasn't cleared Win 10 for company-wide deployment)


> some obscure keyboard shortcut

FWIW, according to the old Human Interface Guidelines basically everything was supposed to be discoverable via menus, and then has the keyboard shortcut right there in the menu.

The keyboard shortcut System Preference pane lists additional keyboard shortcuts, and lets you easily change them system wide. But yes, granted, searching online reveals quite a few more shortcuts.

I'm just wondering how Windows or Linux are better in that regard in any way??


You can't easily share media between the Apple ecosystem and others. You can't access some Apple services at all from Linux.

There's no way to run userscripts/WebExtensions on IOS. Your device is no longer a user-focused tool to access media, your browser is closer to being a "smart tv" than a customize-able information explorer and augmenter essentially controlled by no one.

You're giving up an awful lot for the sake of convenience, trends that if amplified could irrevocably change the character of the Internet for the worse.

I also don't get what's missing from a modern Linux desktop, especially since nearly everything is on the web these days.


> what's missing from a modern Linux desktop

Networking, printer stuff, graphics stuff that works immediately after installation, without one having to search for various problems & fixes on the Internet. And that doesn't randomly break after kernel upgrades.

Because of running into networking & graphics driver problems every now and then (and having to revert to older kernel versions), and problems with printers & drivers, I feel I would never never never recommend Linux to people, unless they enjoy troubleshooting things and learning new stuff.


I have used Debian almost exclusively for ten years, not counting the last couple months where I've been doing Windows 10 for some contract work. I can't remember the last time I ran into problems with networking, printer setup, or really anything except suspend/hibernation, barring one astonishingly cheap ($168 at Walmart) machine that had a weird integrated Bluetooth/wireless/something else card without a free driver.

That being said, Windows 10 is great for regular-user stuff --- really great, in fact. It's only for development that it's sometimes a little awkward, and it's really not bad. If I were working with .NET/MSSQL/IIS more, I'd love it: Visual Studio is a nearly perfect IDE on a powerful machine. Debian is still better for me, though.


Well, I suppose there are lots of people who never ran into any problems. I've been using Ubuntu and Mint mainly, and usually that works fine for me too (with most laptops and desktops I've had). In one case though (maybe 3 years ago), on fairly new hardware IIRC, I had to test various kernel versions until I found one that was compatible with the laptop's graphics, and then take care not to accidentally auto-upgrade to another kernel version. In another case, networking in Ubuntu didn't work after installation; I installed another distro instead and then everything (incl. networking) worked fine directly. Finding printer drivers that don't just print random garbage characters is usually super frustrating, largely because the printer companies' websites have terrible UX.

I think one is safer if one uses somewhat older hardware (laptops and printers), because then the Linux people have had time to look into bug reports and incompatibility problems and fix them?

Nowadays when buying a new laptop, I always websearch for the laptop name + "Linux problems" or something like that, to see what bugs & incompatibilities other people have reported already. And then maybe I decide to avoid that laptop. But ... I wouldn't expect my parents or most other people to do this. Instead they'll buy a "random" laptop with new "unknown" hardware, and then there'll be a 20% ? risk that networking or graphics won't work for them? And I think they'll need help to get the printer working. ... And with all that in mind I feel I need to slightly warn them about Linux.


Um, I have never had to download fixes for networking or graphics; Ubuntu worked out of the box for me.


With AMD's open drivers, or even Intel's, there are virtually no graphics support issues on Linux anymore.


I'd be amazed if you even managed to break networking after kernel upgrades in the recent 5 years.


You know that OSX and Linux both use the same printer stack (CUPS) right?


> I also don't get what's missing from a modern Linux desktop

Adobe software, for one.


I have also yet to find an IDE as nice as Visual Studio...


Many of JetBrains' IDEs are better, and they're cross-platform. I've switched to Rider after getting sick of constant white screens of death, crashes, lock-ups and perf issues in VS.NET. Rider is on par with VS.NET/R# for C# and superior for pretty much everything else except visual GUI designers - although I haven't used one seriously in over a decade.


I'm very skeptical but I will try out JetBrains, thanks for the pointer! :)

EDIT: These all seem to be .NET development? I use Visual Studio mostly for C++... for that would you recommend CLion?


Yeah, you'd use CLion for C/C++ development (though AppCode includes C/C++ support as well). I haven't used that variant myself, but they're all built from the same high-quality core platform and share many components, so I'd expect it to provide a quality experience for C/C++ as well.


I just gave CLion a whirl. It's decent, but unless I'm missing something massive, it's nowhere close to Visual Studio in terms of being a good experience IMO. The debugging experience is so rough in comparison. Aside from the fact that the shortcuts are pretty non-standard (F5 is "copy" instead of "debug"??) here are some issues I can see right away:

1. Artificially long delay when I hover on variables before I can see their values. This drives me insane.

2. I can't view the "raw" values of a variable; I can only see whatever it decides to show me. (So if there's an std::string, I can't examine its fields; at least not by default, if there's any way.)

3. The first time I tried running the debugger, the process stays hanging in the background. I couldn't kill it; I had to manually detach it from the debugger. This was literally the "hello world" example it came with.

4. The completion database for their hello world (all it includes is <iostream>) took quite a while longer than Visual Studio's did. I can't imagine what it's like for actual projects.

5. It initially told me I can't debug because I didn't have a project? So I had to reload the CMake project. This makes no sense.

6. There seems to be no filtering of private variables that I can't access (in the completion UI)? Maybe I'm missing something here.

7. There seems to be no obvious "immediate mode" in the debugger. I'm not well-versed enough to know if GDB has this feature, but regardless, whatever you have to type into GDB to get this feature would be more painful than in Visual Studio (where you literally just type the expression into the Immediate tab and press Enter).

8. Even the cursor hiccups repeatedly. Not that Visual Studio's is perfectly innocent in this regard either, but CLion's stuttering is far more pronounced, far more frequent, and far less excusable (see https://imgur.com/a/Huo2V).

This is just after a very quick use. It seems to "work" in the same way Eclipse does -- the functionality is "there" (maybe even more than what Visual Studio has), but it's just clunky and doesn't feel smooth or "integrated" (the 'I' in IDE).


I watched the CLion sales pitch video after reading the recommendation. Binging around for the features shown I then discovered Visual Assist[1] which seems to add comparably nifty code completion/generation etc features to Visual Studio. I'm demoing it just now, it's definitely worth a look if you have an itch that needs scratching.

It's better to add bells and whistles on top of a rock solid foundation such as VS than the other way around imo.

[1] https://www.wholetomato.com/


I used Visual Assist X with VS2008 actually. Functionally it made things better, but UX-wise it was a bit sluggish and glitchy from what I remember. If I get a chance I'll check out the newer versions (thanks for mentioning it), but somehow I expect something similar this time around, haha.


Interesting, okay thanks! I'll give it a try.


I don't know if its still the case, but I can honestly say with a straight face that at various points in the past decade I've used Netbeans for C development and it was fantastic. My understanding is that a lot of the Netbeans C/C++ code was ported over from Sun Studio, which is why it is so solid and mature.


Microsoft released Visual Studio for Linux [1] a little while ago. I find it to be excellent; it gives comparable functionality to Atom but with a lot of the stuff you get with plugins already built in.

[1] https://code.visualstudio.com/


That's Visual Studio Code, not Visual Studio.


VSCode is a good text editor, but it doesn't compare to the full Visual Studio IDE.


Out of curiosity, what are the killer features of the full IDE over Code?


I'd probably suggest debugging being one of the main killer features.

I'm not a VS/VSCode user but I am a Sublime Text and PhpStorm user. You can debug in Sublime Text but I'm not sure you'd want it to be your primary debugger. I'd guess VSCode/VS might be similar.

And again taking Sublime Text and PhpStorm as an example, PhpStorm knows your code far, far better than Sublime Text ever would do and this makes navigating your code quicker and easier.


> There's no way to run userscripts/WebExtensions on IOS. Your device is no longer a user-focused tool to access media, your browser is closer to being a "smart tv" than a customize-able information explorer and augmenter essentially controlled by no one.

I have (personally) never had the desire to write/run a userscript on iOS. If my device being a “smart tv” lets me focus more on other areas of my life (i.e. projects, hobbies, and career) rather than fiddling with my tools, then that’s a trade off I’m more than happy to make.

I suspect many other people share this point of view.


As usual, a lot of the innovation you probably take for granted happens through side channels, user-favouring features that would never appear unless they can be "unauthorized" add-ons. Like ad blocking. I understand that some people value convenience above all else, but maybe you never experienced the bad old days when everything was locked down. It's good that Apple still has competition and others are conscious of what might happen.


>"unauthorized" add-ons. Like ad blocking.

There are iOS ad blockers on the Apple App Store, and extensions for MacOS Safari. I'm not sure how much more authorized it can get than that unless you want Tim Cook to hold your hand through the process.


The point I didn't make clearly enough is ad blockers probably wouldn't exist if others hadn't introduced this kind of feature through userscripts and add-ons. If only Apple can enable these abilities, it's up to other ecosystems to innovate and define access.

In my opinion, the most important branch the Web could take right now is going from transparent content to opaque content. Now I can filter, transform, organize and augment content accessed through my browser, and even remix it, all enabled by the inherent organizing of the technology, which suggests to the larger society more can be done. It's an open question though, and that technology could be changed to restrict what can be done with content.


That only works with Safari, of course, because they don't want Chrome having feature parity or, gasp, being better!


It also works with any app that uses the Safari View Controller - like Feedly.

If Google wanted to include their own ad blocker with Chrome they could - it is highly unlikely that they would want to.


For instance, Firefox for iOS's settings screen indicates it offers some sort of "tracking protection" in Incognito Mode by default. I remember reading about how this works on the desktop[1], but I've not had a chance to understand how well this is implemented on iOS.

Still, it's clear Chrome could offer an adblocker if they wanted to.

[1] https://support.mozilla.org/en-US/kb/tracking-protection


Ad blocking is something Safari (and Apple) does better than any other browser vendor, IMO.

A good ad blocker for Safari (on Mac or iOS) just uses the Content Blockers system, and provides essentially a JSON blob with a bunch of rules in it to Webkit, which will then take the requested actions (mostly block, sometimes hide elements, sometimes force a URL to https) internally - the Ad Blocker "app" never knows what sites you're visiting and is never involved in actual blocking.
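
For the curious, a rule list is just entries like these (a minimal, hedged example; the domains and selector are placeholders):

  [
    { "trigger": { "url-filter": "ads\\.example\\.com" },
      "action":  { "type": "block" } },
    { "trigger": { "url-filter": ".*", "if-domain": ["*tracker.example"] },
      "action":  { "type": "css-display-none", "selector": ".banner-ad" } }
  ]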


> I also don't get what's missing from a modern Linux desktop

A user interface that isn't ugly and laggy.


You know not everyone actually likes the Apple desktop experience either.


What's ugly or laggy?

It's not like there is a de facto user interface, but there are plenty of them that sure don't lag or look ugly!


I bought a system76 bonobo with their pop linux distro. I'm perfectly happy with my desktop experience.


- decent video calling

- imessage (signal is getting warmer, but the integration is still lacking)

- adobe creative suite

- video editing

- audio production apps

- photo management and post processing (darktable is still garbage)

- sleep/wake and connecting/disconnecting external displays

- not-ugly fonts


It’s kind of funny to compare Linux vs. macOS if your experience of macOS primarily comes from using a Hackintosh. (No sleep/wake support for me! Unplugging my HDMI cable causes a hard-reboot! Etc.)

Which is to say: Linux is pretty good on Macs, whereas macOS is pretty broken (after weeks of intense debugging with other Hackintosh folks) on hardware that would run Linux just fine.


That's to be expected, of course. Linux maintainers care about the fact that it can run on just about anything. macOS maintainers only care about whether it runs on Macs.


You admit you're using a hackintosh, and then blame Apple when stuff doesn't work?

That's one heck of a mental disconnect.


I didn’t say I blame Apple. It’s not Apple’s responsibility to write drivers for machines they don’t claim to support.

I’m more saying that—although Linux’s support for things like sleep/wake on random machines might be sometimes wonky, the best efforts of the Hackintosh community don’t get macOS to support those same machines any better. It’s the machines that are wonky. (Usually by having horrible off-spec ACPI tables that get patched over with drivers the manufacturer releases only for Windows.)

Which is further to say: it’s not really a failing of macOS or Linux if your hardware won’t sleep/wake correctly, or won’t connect/disconnect from external displays correctly. In either case, it’s because your machine is nonconformant to the specs the drivers were written to follow. The only way to succeed in such an environment is to spend man-decades (Linux) or at least man-years (Hackintosh) reverse-engineering the brokenness and writing heuristics into your drivers that patch over it—or by being such a monopoly player that the OEM does that for you (Windows.)


Breaking news: An OS with a limited set of drivers for the hardware it's specifically designed to be used on, doesn't work very well on hardware it was never intended to be used on. Full story at 11.


OP is nodding to infamous problems with actual Macs freezing and rebooting when waking with external displays connected, which is a pretty common occurrence on High Sierra machines.


Audio production on Linux is pretty good - well, not for everyone, but it's got a lot to offer. I spent years with Logic / Ableton Live on OSX. I'm moving to a more barebones approach these days with Linux and SuperCollider.

Still nothing for the Creative Suite except for Wine, though.


You might be interested in looking at Bitwig Studio for audio production. I've heard it's similar conceptually to Ableton Live and it runs natively on Linux.

https://www.bitwig.com/en/home.html

I agree that audio production on Linux is pretty good these days. Ardour is a pretty good DAW. The only problems are JACK being a PITA sometimes and the lack of plugins.


I got Bitwig as soon as it came out. It does some things fantastically! The apt-like package manager is badass. I haven't tried it again in about a year, but every time I've used it, the massive instability reminds me of the mediocre DAWs that drove me to Live in 2003.

The reason I went with Bitwig was to support anyone doing anything that wasn't quite Live, because everything is made with Ableton now and I can hear it. I was also drawn by the collaborative features, which didn't work when I last tried them. Moving to SuperCollider has solved all those issues for me. It's also cross-platform and free.


I suppose you only have looked at free Linux software, as opposed to paid-for software?


The only thing I have a hard time trusting Intel with is the way they have handled questions around their Management Engine. I think they have done a poor job convincing us that it's safe or at least benign.

It will be interesting to see if Apple includes a secretive management engine. If they do, then the speculation that it's required by one or more governments will be dialed up to 11.


With Apple's documented interest in owners' control of their devices and in delivering the required documentation for it, they won't tell us what is included in the chips and what isn't.


Fortunately, Apple regularly publishes their iOS Security Guide as criddel mentioned below and there is a vibrant Apple chip reverse engineering community that does microscope analysis of Apple's chips [1].

1. https://www.anandtech.com/show/11596/techinsights-confirms-a...


Apple has done a very good job with documenting iOS security:

https://www.apple.com/business/docs/iOS_Security_Guide.pdf

If they included a management engine and produced a document like that, I'd be satisfied.


I always see this document linked to on HN even though it's a (relatively) high level description of what is happening on the device.

Besides, there is absolutely no guarantee that Apple isn't just being "semi-open" with their security mechanisms. In other words, who is to say that there are no redundant, undocumented mechanisms in place?

But I will concede that Apple is ahead of everyone else in this regard. It's just that I think Apple being ahead doesn't mean that the information they're providing is enough.


You're right that a whitepaper isn't enough to be really sure. That takes disassembly of binaries and then you are still stuck with judging the trustworthiness of the hardware and I'm not sure how you solve that problem.


Apple is probably going to produce a similar document that details at a high level how the security works. If they follow the example that they set with iOS devices, what they won't do is provide the source code for these components.


I mean, of course, it's still their right to release or keep any source code they produce. We shouldn't be expecting them to release the source code, but even a whitepaper is better than what Intel is doing.


How's this different than going back to PowerPC architecture? When they switched to Intel, I recall there being a lot of talk of benefits to running on the same chips as everyone else.

From what I've been hearing lately about Apple, the impression I keep getting is that Apple doesn't think of Mac as a product with a specific target market, but more as a sort of just-a-really-really-big-ipad. The closed garden approach worked well with iOS, where for the most part people just play throwaway games, but I really don't see that being a viable underlying philosophy for laptops, where interop with digital plumbing is important.


Apple is in a much different position now. Making your own chips requires significant resources, and before, they didn't have the leverage or the talent to get the chips they wanted built. Now they have a trillion-dollar product (the iPhone) and a mature chip development team that many consider the best in the industry, and they can throw tons of money at the problem.

The benefit of running the same chips as everyone else was about leveraging the investment of other products into the Intel architecture. Now they're doing that, just with their own chips.


From what I've been hearing lately about Apple, the impression I keep getting is that Apple doesn't think of Mac as a product with a specific target market, but more as a sort of just-a-really-really-big-ipad.

As a long-time Mac user who's still basically happy with the product, this is what worries me the most long-term. YMMV, different strokes, go in peace with Ubuntu, but for me personally, macOS has been my favorite desktop Unix, hands-down, for about fifteen years. There's nothing that I want to do on even a semi-regular basis that I can't do, and a lot that would require, well, heavy adjusting if I moved to Linux. (Yes, I've used Linux within the last couple of years.)

I also love iOS and the iPad, but it's not a general computing platform; it's a computing appliance. It's very good at what it does, but by design it's difficult-to-impossible to do things on it that Apple doesn't want you to in ways that aren't true for macOS. (I'm sorry to those of you who had to disable SIP to recompile your own version of Apache, or who are infuriated you can't replace Finder with ratpoison, but you know that's not what I'm talking about here, right?) If "Project Marzipan" is about creating a new UIKit/AppKit hybrid that allows developers to create codebases that run on both iOS and macOS, that's already a little worrisome; if it's about "letting iOS apps run on macOS," as some of Gurman's reporting has it, that's a lot worrisome. I have a lot of apps that exist for both macOS and iOS, and in every single case, the macOS version is more capable. And iOS's "sandbox everything" model--and, I suspect, attitudes it engenders--make every app feel like an island not just in terms of data but in terms of functionality: there's much less of the "learn the basics of one app, learn them all" feeling that makes macOS, well, macOS.

If the Mac line moves to Apple's A-series chips, that's...not necessarily bad, but if it's being done in conjunction with sweeping software changes, it makes me extremely uneasy about the line's future in a way that even the Touch Bar doesn't (and trust me, I do not take the Touch Bar as a good sign). I'm not planning to switch platforms any time soon, but I'm starting to wonder if maybe I should buy an inexpensive Linux-compatible laptop so I can, you know, practice. Just in case.


I'm not too worried about Marzipan in the short term. I'm thinking the goal of that project might be to get the devs who make pro apps on the Mac side onto a codebase that they could easily port over to the iPad, seeing as the iPad Pro is seriously lacking in app support compared to the Mac because it's unprofitable to maintain those apps.

In the long term this is a play to push Cook's iPad Pro post-PC idea, yet again...


> When they switched to Intel, I recall there being a lot of talk of benefits to running on the same chips as everyone else.

They switched to Intel first and foremost because PowerPC was not going to get them the performance/power ratio that they wanted. It was holding them back in laptops. For example, Apple never shipped a laptop with a G5 chip because it took so much damn power to run a G5. Intel was basically their only option.

A secondary benefit, which they marketed heavily, was the opportunity to boot your machine natively into Windows. They thought this would help overcome the reluctance of "switchers" who were worried that they would miss something about Windows.

These days, the role of the OS as a gatekeeper is basically gone. There's greater diversity in client OS (Windows, macOS, Linux, ChromeOS, iOS, Android, etc.) and most anything important is available through a browser or a native app.

And Intel is no longer their only option. They've proven they can run big businesses making their own chips.

I do wonder how this will affect VM performance on Macs though. I don't see how it can be maintained, unless Apple chips are so much faster than Intel that they can absorb the translation overhead.


> most anything important is available through a browser or a native app

I guess that depends on what people think is "important". If by that you mean Facebook and Youtube, sure. But if we mean things like compilers, imho it would be a net loss if the only way to compile anything for any Apple OS was through Xcode/Swift.


OS as a gatekeeper is still present in a lot of corporate stuff. You'd be amazed at the number of people who are still running IE because of one reporting package or whatnot.

That said, there is no reason in 2020 they couldn't dual boot to MS's Windows on ARM, which is already shipping, supports 32-bit x86 apps, and should be polished enough to cover legacy needs to the limited degree most users require.

I have to say this isn't how I expected Intel to go down. It is basically microcomputers killing minicomputers all over again (which was before my time, TBH).


The x86 emulation is far from reliable. It's far more likely that corporations just get the Intel laptop, because the processor is simply a small fraction of the total price of the machine. When you're buying a Dell laptop that costs $1300, spending another $200 for the Intel version is a drop in the bucket.


Well, certainly I think part of it was that unifying on one chip architecture with the rest of the industry did provide some economies of scale, though rumor has it that Apple has always paid a little bit more for its chips so it could guarantee certain things and dictate terms.

I believe the main driver, though, as pointed out in this CNET article https://www.cnet.com/news/four-years-later-why-did-apple-dro... was that IBM could not deliver a powerful enough PowerPC chip that would also meet the other constraints. At the time, one of the biggest markets Apple had was the notebook market, and its sales there were exploding, yet IBM was unable to deliver a lower thermal envelope for its portable chipsets on top of the performance issues.

I think this more than anything else pushed that reality.


It doesn't need to be a closed garden. Look at what Microsoft is doing with x86 emulation on ARM.


We should all not forget that if IBM, in the 80s, controlled its hardware and software stack like Apple does now, we would not have the modular desktop PC and we would not have Linux.

I consider the extreme vertical integration of Apple a bad development (Apple has recently been trying to control even rare earth metals). I like the market to be "modular", with many different vendors making replaceable parts that fit together like in the desktop PC. And I wish for a future where I still have a choice, instead of being forced to use iDevices for everything from scientific computing to consumer media players.


...we would not have the modular desktop PC and we would not have Linux.

There’s probably no way to support that counter-factual. We could easily have ended up with some other open-architecture personal computer and open source operating system.


There is actually: BSD. In 1991 BSD transitioned to open source

https://en.wikipedia.org/wiki/Berkeley_Software_Distribution

Which of course gave us FreeBSD in 1993

That's just one simple example I can think of. Intel, as well, was never locked up as IBM's exclusive chip supplier; they could sell chips freely to IBM's competitors.

Also, the reverse engineering of the IBM platform is what made Compaq Computer successful in the mid-80s.

https://en.wikipedia.org/wiki/Compaq

I don't think even a quick examination of the histories involved here would yield any other result than open hardware and software was always going to be a big component of desktop computing.

These things ebb and flow anyway. Historically, what you usually see is that for at least a decade (sometimes much more, sometimes dramatically less), vertically integrated solutions are favored because they bring the most harmony to the average customer of the platform ('it just works!' was a slogan of many firms, not just Apple). Then, as complexities are mitigated or eliminated, these things tend to open themselves up.

I believe mobile devices are going through this phase. I think in another decade or two you will have a situation where you start seeing really usable open alternatives to the major platforms, even though right now it does not seem intuitive or obvious how that will work.


To play devil's advocate: while you may trust them more, you can't deny they have had many significant software security issues of late. How could consumers trust the security of their chips?


A bug is bad. A business model based on selling your personal information to third parties without your explicit knowledge -- far worse. A bug is a mistake -- not excusable, but the _intent_ is much different.

The admin access without a password bug -- fixed in 24 hours, and it still required physical access to a machine.

That's not to diminish the seriousness of the flaw, but that's far different from Facebook's (and many others') nefarious, continual and, one might argue, malicious exploitation of personal data.

I feel like Apple actually cares deeply about privacy. Look at Tim Cook's personal life for example -- extremely private person -- someone, given his life story, who probably appreciates privacy more than most.

No company is perfect and no company should get a free pass for negligence, but I think Apple has earned the benefit of the doubt.


Everyone has had security problems since I started in the industry 25 years ago. -everybody-


Given software !== chips and their ARM chips are - hands down - the best on the market... yeah, I'd think people would trust them.


Given the amount of evidence you presented, how could anyone refute your opinion?

OP raised a valid argument - Apple's QC has been crap lately and trending downwards. There's a lot of feedback out on the net to support that line of thinking.


There's been Apple-bashing "feedback out on the net" since Apple was founded.

Most of it has been laughable. The stuff you're talking about is firmly within that laughable camp.

If Apple's QC were "crap" you'd see that in customer satisfaction ratings for their #1 product: the iPhone. But guess what? You don't see that. And there is no downward trend.


Shipping Mac OS so that anyone can just YOLO login as root?

https://www.macworld.co.uk/how-to/mac-software/how-stop-some...

Storing the full disk encryption password in plaintext on the disk?

https://support.apple.com/en-us/HT208168

How do you explain these except as atrocious QC that's trending downwards?


I do not deny that those were both terrible, awful mistakes (and I'm upvoting you for correctly including them in this thread). I would note that they were both relating to what is (perhaps unfortunately, but them's the facts) a very minor product for Apple. And that they're both fixed. And that Apple's security overall is still unquestionably far superior to that of its competition. Ask anyone who has to administrate both Macs and PCs for a living (like me).

My comments above still stand.


Software, you could make a case, that's been a bit of a mess lately.

Hardware, though, you've got nothing. Where have they dropped the ball on their chips?


It's all one company. The reputation, stellar or tarnished, affects the entire brand. Their software QC has been bad lately, and it makes me question their hardware QC too. Those bugs/issues are often harder to find, though - if they exist at all; they might not. But I think it's fair for me to question.


For a company that makes a large volume of a very narrow range of products they have a pretty good track record when it comes to physical hardware, and so far their chip work hasn't hit any major snags.

When Apple's custom CPU has its first F00F-type bug then we'll see how they handle a real issue.


> I think that long term they are much better off controlling their entire hardware stack.

But CPU/GPU aren't the only chips Apple is sourcing from 3rd party. Think wireless modems like WiFi, LTE, etc.


Close, but Apple already produces its own WiFi/BT chipsets (the S1/S2 etc. SoCs in Apple Watches, for example, and the W1 chipset in AirPods and new Beats headphones), and is rumored to be working on an LTE/GSM chipset.

I think Apple is intensely interested in controlling the entire stack.


Why? A lot of these parts are commoditized. Are they really getting that much better battery life from their own stuff?


The W1 chip seems to have better performance and power characteristics than any comparable chip. The A10 and A11 seem to have better performance and power characteristics than any other mobile chip. I see a trend...


For performance. They can outcompete others by moving to bespoke hardware. My iPhone X CPU is roughly on par with a 13" MBP in performance, which is bonkers.


Absolutely! That's why the Apple Watch, for example, is an unqualified success compared with any other wearable. Qualcomm, which is the only real rival in that space, hasn't come out with a new wearable-sized SOC in forever, leading to the downright abandonment of many wearable product lines from pretty much everyone except Apple.


No, that success is because Apple is literally a fashion brand. How many people who wear Apple Watches do you think could even tell you what hardware is inside? Sure, it runs smoothly, but that's not why they have it on. Nobody is streaming HD video or running heavy applications on it.


With the W1, battery life and bluetooth that actually works as it should have from the start.


Apple claims the W2 chip is 50% more power efficient than existing Bluetooth/Wifi chips on the market, and is 85% faster than the W1 chip for data throughput.


Existing Bluetooth hardware has so many hardware and software bugs that it would [hyperbole] a [large thing]. Apple should absolutely control the BT hardware/software stack, there is no other way they can offer the Apple Experience without it.


Produces? Are you sure you don’t mean designs?


Apple is as yet fabless, correct. By choice, clearly, as they have enough cash on hand to build and operate more fabs than anyone else on the planet if they wished.


If Apple were to make their own LTE modem, does that mean we might get one in a MacBook Pro?


Isn't wireless hardware in general more of an IP/patent/licensing issue?


Not if you want to do it better than the competition. We already have a zillion radio chips that all do about the same thing. If Apple (or any other company) wanted to be 'better', a design that does the job better (think efficiency, power, speed, etc.) would give a unique advantage that cannot easily be copied by a competitor. The same goes for the RTOS (or lack thereof) that you need to run those. For some implementations, a softMAC vs. fullMAC split makes sense.


Apple's watch now has a W2 chip that does Wifi and Bluetooth, and maybe even LTE (which is how they crammed it into such a tiny device).

Slowly but surely, they've put together all they'd need for a computer. Their T2 chips in the current iMac Pro handle encryption and NVME storage on the SSD's, as well as the camera image processing, Secure Enclave, and the SMC.


I don’t know what the current status is but they have definitely looked at making their own baseband chips before.


What I'd like is for a company like Apple to provide me (and others) the tools & APIs we need in order to take advantage of that integration, without locking us into their UI paradigms. I would like everything to Just Work™; I wouldn't mind a Macintosh laptop; I would hate to be stuck using the macOS UI.

A lot of what Apple really does is hardware integration. Let me use the hardware to run a free OS, and also take advantage of the excellent hardware integration. Let me use a free operating system, StumpWM, emacs and some nifty APIs which let me talk to the integrated hardware system.

Give me, in the words of Steve Jobs, a 'bicycle for the mind,' not a tram to nowhere.


It's the old story of convenience vs. freedom. I myself am willing to trade a bit of convenience, and put up with some tinkering with the system, for a big chunk of freedom. Not only for me: by using Linux and making it better, it's hopefully also a (small) investment for future generations.

This way I hope they will still have the choice to run software they (can) trust on their own devices. If everyone just went for the convenience, desktop Linux at least would disappear as an option.


I have a Mac. It's a great piece of hardware, and it works great for basic stuff: surfing the web, responding to emails, writing documents. But for my work I find Linux so much simpler to use.

Linux just works. Sure, it isn't pretty like macOS, but it lets you get your work done fast. I know exactly how everything is put together, so if there is a problem I can usually fix it in seconds: just edit some config file and done. It has far more tools for developers, and if you need to install software you have the package manager. Sure, on macOS you have brew, but it has some problems, and a lot of developer tools either don't work on macOS, don't work correctly, or are similar to but not the same as on Linux.

In fact, on my Mac I mostly work either with remote servers or with VMs on my computer, because for development work Linux is far better.


The problem is that Apple’s focus on this distracts from the software that defines the user experience. The MacOS ecosystem is both beautiful and deeply troubled.


On the other hand, with Marzipan (shared libraries on iOS/MacOS) and a single processor architecture, you could say that Apple is putting 'more wood behind fewer arrows' and will have more engineers developing software that reaches MacOS as a result.


> The MacOS ecosystem is both beautiful and deeply troubled.

So, shifting it to ARM and letting universal apps in the iOS App store now run on macOS in addition to iPad opens up the mac to a metric fuckton of developers that are happy to make cross-platform apps, since the iOS app store is SO lucrative.

Apple showed, with the iPad (which was a totally new product category), that the sheer number of iOS developers could push the iPad from a platform with zero apps to the best tablet in the world, with an incredible app ecosystem and universal apps whose UIs adapt to the screen they're made for.


> I wouldn’t be surprised to even see them make their own display screens.

I think they're already working on that: https://www.bloomberg.com/news/articles/2018-03-19/apple-is-...


There was a post on here recently that said Apple was interested in making their own displays.


Maybe they’ll make a TV that doesn’t sell information on everything you watch and say to all and sundry.


Interesting. I only use my iPad as a remote for the AVR so I can play Tidal from my hi-fi... it's gathering dust otherwise.


Has anyone checked with Adobe and Autodesk whether this essentially kills the professional Mac workstation as a viable product going forward?

I have no doubt they can make it fast and usable enough for the naive consumer living inside the bubble of what relatively inexpensive (consumer-grade) App Store apps can do, and that Microsoft will release something pretending to be MS Office for a new ARM MacBook line. But Photoshop, AutoCAD, Mathematica (OK, that one is fairly portable), After Effects, and whatever it is the pros use for video editing these days are a different ballgame.

We saw Microsoft stumble down that path with their successful port of Windows to ARM, where the OS itself worked but no major third-party, business-essential app ever got ported to Windows on ARM, and a lot of IT departments chose to keep Windows 7 around out of fear of what 10 would morph into, until MS began to dial back their ambitions for Windows everywhere.

I'm guessing that the short-run result of Apple going through with OSX on ARM is that a lot of the CAD and video editing heavyweights just drop support for whatever OSX morphs into in order to run on iOS hardware.


If you remember back to the mid 2000's when Apple switched from PowerPC to (Intel) x86, the higher end machines still ran with PowerPC chips in them while the mid-low end products had x86 chips. All of the old applications that weren't yet ported to x86 had to use "Rosetta." It wasn't until a little while later, presumably once most of the professional software companies had good working x86 versions of their software - that the high end mac pro systems started seeing Intel chips in them.

Microsoft started down that path last year with Windows 10 on ARM, and Intel had some legal issues with that - https://arstechnica.com/information-technology/2017/06/intel... It will be interesting to see how Apple handles this.


> It wasn't until a little while later, presumably once most of the professional software companies had good working x86 versions of their software - that the high end mac pro systems started seeing Intel chips in them.

The switch to Intel started six months [correction: seven months] from the announcement, and took seven months to complete.

June 5th, 2005: Intel switch announced at WWDC

January 10th, 2006: First Intel Macs (MacBook Pro and iMac) released

August 7th, 2006: Mac Pro and Intel Xserve released, last PowerPC Macs discontinued

https://everymac.com/systems/by_timeline/index-macs-by-timel...


There are two problems with that story: first, the famous switch from PowerPC to Intel happened when the desktop was king, and app developers had a strong incentive to recompile programs for the new platform. Right now? Not so much.

Second, while Apple's mobile processors are currently the best in ARM world, for "pro" desktop computers Apple needs desktop-grade processor. Making something comparable to i7 isn't such an easy and cheap job.


Adobe for one is making a boatload of money from CC subscriptions, record numbers, so I wouldn't worry too much about that (I imagine MS is equally happy with Office 365 on the desktop).

Besides, there's no Carbon to Cocoa or CodeWarrior to Xcode port, so aside from a very small percentage of optimized assembly code, it shouldn't be nearly as hard.


>Second, while Apple's mobile processors are currently the best in ARM world, for "pro" desktop computers Apple needs desktop-grade processor. Making something comparable to i7 isn't such an easy and cheap job.

Good point, I guess Apple never though of that.


It's quite impressive how many people in the comments can so easily come up with big show-stoppers that Apple never thought about. Must be why Apple is struggling financially and can't get anyone to buy their products.


While being sarcastic you might have missed the sarcasm of the parent :)


I both understood and was adding to the sarcasm.


> the famous switch from PowerPC to Intel happened when the desktop was the king,

I thought one of the major reasons for the switch to Intel was power consumption and performance on laptops.


I'm thinking that in this case "desktop" is cast in opposition to tablets and cell phones and laptop computers can be considered "desktop" devices.


> Second, while Apple's mobile processors are currently the best in ARM world, for "pro" desktop computers Apple needs desktop-grade processor. Making something comparable to i7 isn't such an easy and cheap job.

Qualcomm, NVIDIA, and Cavium have created very powerful ARM chips for non-mobile applications. Apple definitely has the chip architects, the money, and the experience to do such a thing.


I have serious doubts that the marketplace can support 3 desktop/laptop workstation CPU manufacturers.

(AMD, Intel, and now Apple)


I don't think it matters to Apple since they will be using their own chips. Intel will simply sell less, and AMD will not change at all.

By the way, don't forget Broadcom, Marvell, Cavium, IBM etc. as they are doing big bucks with CPU's and SoC's as well.


It matters to Apple: CPUs are a very big expense for them, and the performance of their computers is very important to them.

I'm specifically worried about the performance workstation/gaming CPU market, which Broadcom, Marvell, Cavium, and IBM are not major players in. It's currently AMD and Intel. Apple would need to sell a lot of Macs with the new CPUs to make the R&D cost worth it, and the new CPUs would have to be very fast or they won't sell Macs.


I meant the impact on global CPU markets won't matter to Apple, they don't care if Intel makes less money.


Also worth noting: Windows and x86 desktop dominance at the time likely made porting Mac software to x86 easier.


I remember all the way back to the early 90s and the original x86 emulator running on ARM. Admittedly that was before optimizations like JIT, and a lot of research has gone into binary translation since then, but it was a painful experience at the time. I wonder how much performance would be lost these days.


If they’re making their own CPUs, they can presumably add whatever additional functionality and/or instructions are necessary to make emulation or JIT translation work effectively.

But even if flat-out performance isn’t as good, what about if it proves cheap and easy to scale to more cores? Like, what if you get twice as many cores as the Intel equivalent has threads, albeit at 2/3 the speed (maybe worse or better for some workloads) but no hyperthreading-style contention? This wouldn’t be useless.

And GPU-accelerated functionality will be mostly unaffected.


Couldn't they achieve that today by running legacy apps in a hypervisor?


The x86/x64 applications in the hypervisor would be super slow, as the x86 ones are with Windows 10 on ARM (Windows 10 on ARM doesn't support x64 emulation).


The issue is implementing the features for the hypervisor’s emulation system without Intel-licensed hardware or software.


lol wut? VMware Workstation, Connectix/MS VirtualPC, Oracle VirtualBox, qemu, bochs...what Intel-licensed hardware or software?

How about FX!32 or its latest AArch64 cousin seen on Windows 10?

Of the above, qemu can target x86 and run on anything, Virtual PC ran originally on PowerPC, and FX!32 ran on the Alpha



I see an article that is not congruent with actual tech facts. They can warn all they want, because the precedent says otherwise. Even if you wanted to make the argument that Intel is against emulation on non-x86 systems, that's not true - see Virtual PC for Mac, FX!32, qemu, bochs and countless 8086 emulators used to run VGA ROMs on various legacy-free systems (either in the OS or in firmware). If you wanted to make the case that efficient emulation on non-x86 is disallowed, well, FX!32, qemu and Virtual PC all used and use JIT.

Intel's position amounts to cage rattling and vague insinuations that efficient execution of x86 code is impossible without hardware support. That's nice, and their ham-fisted approach has already hurt customer choice in the past (Transmeta and NVIDIA's Denver, before the latter implemented the Arm ISA), but the Windows on ARM solution is completely software-based and not coupled to any specific ARM chip implementation.


There's not really any real precedent other than people not being sued. That's more the complete lack of precedent rather than precedent that it's legal.


You know, the article refers to Windows ARM emulation, but I now see the Intel post as a warning to Apple, as patent licensing discussions may have started about that time last year.

It usually takes 4-5 years to design a chip, and if Apple planned on releasing the Mac CPU in 2020, well...


Apple has done this twice already. 680x0 to PowerPC, then PowerPC to Intel. Adobe's come along on both of these.


Those changes were all to more powerful processors, such that emulating the old architecture wouldn't be that bad. In this instance, Apple would likely be going to less powerful processors, and thus emulation at any reasonable level of performance wouldn't be viable.


That's not "likely" at all. Apple's CPUs are already faster in single-core benchmarks than the Intel competition, even though the Apple processors in question are designed for mobile devices and have much less power to work with.

Apple would not be making this switch if they didn't think they could improve performance.


That's not remotely true. There is some crossover, which is notable. But that means the fastest iPhone is faster than some of the slower desktops. Intel's current single-core leader is a Coffee Lake running at 4.7 GHz with like 12MB of core-speed cache. Be real.


Gotta remember that all these previous ARM chips have been designed in the context of the phone. If you suddenly tell the chip designers they have 10x the power and thermal headroom, there's a lot you could do.


Sure, but "could do" isn't remotely the same thing as "already faster" now, is it?


Apple's A CPUs are already the best in their category, by far.

Knowing Apple had Intel Macs running in the background for years before they announced the jump to x86, I wouldn't be surprised if they already have the chips and know how well it performs.

And I also wouldn't mind if they did another demo like they used to: https://youtu.be/oxwmF0OJ0vg?t=24m45s


The current A11 chip is faster single-core than the current MacBook Pro.

The Intel chip you mention isn't in any Apple machine at all. (Except maybe the iMac Pro? Is it even in that?) So you be real. I don't think you've actually looked at the benchmarks. You should do so. I think you'll be shocked at just how much overlap there is, just like I was.


>I don't think you've actually looked at the benchmarks.

What benchmarks? Geekbench? That's the only one I've ever seen where the fight is close, and Geekbench (like everything that boils performance down to one number) is nearly completely useless.

If you see a two-watt phone CPU beating a 45-watt actively-cooled laptop CPU, you can either conclude that the phone is alien technology decades ahead of anything else on the market or that the benchmark is broken. Which is more likely?

https://www.realworldtech.com/forum/?threadid=136526&curpost...


Does Linus not realize that Intel processors also have hardware assisted SHA?

Geekbench may not be the best tool for determining the power of a processor, but at the least it gives you a general idea of power across devices in a consistent set of tests, which can be extrapolated.


They didn't at the time he wrote that.

That post is pretty outdated. It's about old hardware and an old version of Geekbench. But people still use Geekbench 3 and old x86 hardware, so it's not totally irrelevant.


We are talking about the MacBook, i.e. a passively cooled Intel Core m3 running at 1.2 GHz on something like 8 Watts TDP.


That's because Apple is fixated on choosing the Intel parts with the lowest TDP they can get away with. It's their choice to make the MBP slow, up to and including the numerous thermal issues that force the CPU out of its nominal turbo mode for longer workloads.


Those are valid points, but it's still shocking that Apple's own CPUs can outperform Intel laptop CPUs with far higher TDP.


Benchmarks aside (because they can be and are gamed), the only real way to see this is loading a big workload that professionals run every single day, such as Photoshop or a kernel compilation. I very much doubt the A11 is even remotely close to a mid-range quad-core Intel CPU, and let's not forget that we're in the middle of moving to 6-8 core CPUs.


The 8700K, which the previous poster mentioned, is a hex-core with 12 threads!


> Apple's CPUs are already faster in single-core benchmarks than the Intel competition

Extraordinary claims require extraordinary evidence. "But Geekbench..." is not extraordinary evidence. Give me a board where I can do an Apple to apples comparison, booting both under the same revision of the same OS, and running industry standard benchmarks (read: plural), and then we can come to that conclusion.


No one is stopping you from doing that experiment.

> Extraordinary claims require extraordinary evidence.

"Extraordinary" is in the eye of the beholder. We have some evidence; you're welcome to believe it or not, but unless you supply alternative evidence you're not really making an argument. You're just saying "it might not be true," which is something anyone can say at any time about anything.


Nice, that was a good explanation for your stance. I am not supporting either, just liked the turnout. Obviously this isn't my usual kind of hn comment.


"booting both under the same revision of the same OS"

It should be irrelevant in Apple's case anyway since they run their own OS. If their OS is faster for a certain task than some other OS then that's part of what you would define as "faster" for a user of the device.

Right? Am I missing something?


Faster, probably, but 2x+ faster like in the PowerPC > Intel transition is very unlikely. Which is why there's much less likely to be enough performance budget for effective emulation.


Presumably their "desktop-grade" A-whatever chip will have a higher TDP and clock speed than their mobile counterparts.


If they could just turn up the TDP and clock speed on their chips and get 2x the performance of Intel's best chips that easily- they would already own the entire desktop market.


And some of us, with actual CPU design backgrounds, have been saying that for a while now.

We've been asking why Apple doesn't already do it.


As someone without a CPU design background: would something like this scale pretty linearly with added power and thermal headroom? I assume there are limits that would have to be overcome, but what would an ARM chip under the conditions of an i7 look like?


TDP typically scales as somewhere between the cube and fourth power of the clock speed if you're pushing the envelope (in the sense of running at frequencies where further frequency increase also needs a voltage increase). So having 10x the thermal envelope means you can probably clock about twice as fast, all else being equal.
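To spell out that arithmetic (rough numbers, assuming P scales as f^3 to f^4): a 10x power budget buys you roughly 10^(1/3) ≈ 2.15x the clock at the cubic end, or 10^(1/4) ≈ 1.78x at the fourth-power end, so "about twice as fast" is the right ballpark.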


It's more of a logarithmic scale.

The same architecture can generally scale to 10x over a few process generations.


This does not hold at 10nm and below.


Seems that it’s more of a logistical and network-effect issue in getting everyone to support it, rather than a technical issue.

Possibly a patent issue also


My only question is if they actually use ARM for their next-generation architecture, or something completely new...


Computer architectures routinely see 3x performance jumps across different power budgets. This rule has held over decades.

Clock speeds alone can probably increase by 30%. Caches and internal datapaths can double or more. Then you can start to add in more execution units, more expensive branch prediction, or even new, more power-hungry instructions.

A 4 Watt Intel Pentium 4410Y Kaby Lake for mobile devices gets about 1800 on Geekbench, while a 115 Watt Intel Core i7-7700K Kaby Lake for desktops gets 5600.

I'm just going to say it: the Apple laptop CPU is going to get Geekbench score... above 9000!

And, yes, I do have a CPU design background.


So artificial benchmarks already do a very poor job of capturing performance. The Apple laptop CPU does not exist. If it did exist, it would likely suffer a very substantial performance hit if forced to emulate x86 software. So why speculate on the meaningless benchmark numbers of an imaginary CPU that will take a wholly unknown hit if everyone doesn't rewrite everything?


Artificial benchmarks do a great job of capturing performance, since they're more controlled and eliminate unnecessary variables.

Once you understand this, then you can understand how CPU designers work to predict future performance. CPU designers use artificial testbenches.


You making up numbers doesn't appear to be a useful endeavor.


I suspect that if Apple designs a desktop CPU, performant x86 emulation will be a key design criterion. I know very little about CPU design, but I imagine it would be possible to have hardware optimisations for x86 emulation, just like we have today for video codecs.

Or even further they could bake a "rosetta" into the chip's microcode and have their CPU natively support the x86 instruction set along with ARM or whatever they come up with.


Which is the previous gen and was sandbagged, as the 50% increase in cores for Coffee Lake shows.


If someone disagrees they should state why


Do you think they will use the same chip? In 2020?


The processors wouldn't necessarily be less powerful. Case in point: the A11 outpacing the Core M processors.

The greater threat is the iOS-ification of macOS. We've seen many companies pull their apps from the Mac App Store due to the limitations of the sandbox.


ARM less powerful? Wait and see what Apple can do with their own ARM-design if they target Desktops / Laptops. We might get to see 32 or even 64 cores in a regular MBP. Now tell me how THAT would be less powerful than the current Intel offering.


Apple has always gone down the path of fewer, more powerful cores instead of packing many, weaker ones. Apple's chips were dual core until A10.


And it has let them eke out performance that absolutely leaves almost all Android phones in the dust, and all that with a battery that's 30-50% smaller than the rest (although that does bring its own problems, as we now know.)

Imagine if they're not held back by the battery at all, or even just have to match the power consumption of the power-hungry x86 chips.


Let's be realistic. 64 A72 or A11 cores is a pipe dream. It's not going to happen without a massive nm process advantage.

If we had an MBP with 64 wimpy A53 cores, it would still only have 50% of the performance of an Intel i7 8700K or AMD Ryzen 1800X on paper. Of course that is faster than most notebooks, but if you truly care about CPU performance then you usually wouldn't use a notebook in the first place.


> In this instance, Apple would likely be going to a less powerful processors…

Arguably, the A11 Bionic is already faster than mainstream desktop CPUs.

"The iPhone 8 even edged out the score from the 13-inch Apple MacBook Pro with a 7th-generation Core i5 processor. That notebook notched 9,213. Is Geekbench 4 really comparable from phone to desktop? According to the founder of Geekbench, John Poole, 'the short is answer is yes that the scores are comparable across platforms, so if an iPhone 8 scores higher than an i5, then the iPhone 8 is faster than the i5.'"[1]

[1] https://www.tomsguide.com/us/iphone-8-benchmarks-fastest-pho...


A 6-core CPU beat a 2-core one in a perfectly scaling multithreaded benchmark? That's just shocking! Truly, that is definitely, unquestionably, beyond any doubt representative of what a battle at the workstation level would be.

But more seriously and less snarkily: there's a reason they used the 13" MacBook Pro there specifically, and that's because the 15" with the quad-core i7 destroys the A11, 15k vs. 10k. Of course that's with wildly different power and thermal budgets, and I haven't seen much reason to think these Geekbench results are at all representative of anything, so grains of salt and all that.


Don’t forget that the current gen A processors are designed to run in something with a tiny battery compared to a laptop. It’s likely that any MacBook bound Apple CPU won’t have such a limitation and will be able to run much faster as a result.

Also if you look at the performance trajectory of A-series in the last five years, 2020 sounds about right for where the crossover will come even between iOS based A-series processors and the fastest of what Intel has to offer, provided they can keep up the pace of performance improvement.


> It’s likely that any MacBook bound Apple CPU won’t have such a limitation and will be able to run much faster as a result.

Geekbench is based on the unthrottled peak performance of the CPU. Increasing the TDP headroom eliminates throttling but Geekbench doesn't measure throttling.

You can perhaps increase performance by overclocking, but this requires the CPU to be designed for higher frequencies. AMD's Ryzen CPUs usually top out at around 4 GHz; Intel's CPUs can be overclocked up to 5 GHz. The maximum clock speed is limited by the slowest component of the CPU: if the slowest operation takes 0.25 nanoseconds to complete, this limits your frequency to 4 GHz. If Apple had enough foresight to design their chips with this in mind, then maybe, but in reality they probably optimized the chip entirely for mobile TDPs.


But the question would be:

will Apple up the battery capacity for their own CPU/GPU, or would they simply shrink the battery while striving for a thinner laptop?

This is from someone who's been buying/using Macs for a long time - I'd love to see some very fast ARM competition, but is that what it'd really turn out to be?


Why would Intel stand still while arm improves dramatically?


Apple has been gaining on Intel while both have been improving. He’s projecting where those lines cross.


> Apple has been gaining on Intel while both have been improving.

Have they really, though? What lines are actually crossing? Intel's mobile CPUs have continued to shed power, so how much of this is actually Apple "catching up" to Intel vs. Intel just optimizing for power at the cost of performance? As in, is there any performance gap that's actually shrinking, or are phone SoCs just getting more power hungry while laptop ones are getting less power hungry?

Because phone SoCs have gotten rather monstrously power hungry compared to years past. They actually draw around 5W under real-world load now, with devices just letting them thermal throttle rapidly to achieve higher burst rates. Laptops, by contrast, are ~10W TDP, vastly less than they were a decade ago.

This would probably make a fascinating in-depth analysis, but the singular data point of a 5w SoC being within spitting distance of a 10w SoC is hardly a revolutionary story. It's pretty much what you expect.


Faster in a benchmark != faster in practice. Adobe and Autodesk likely have many architecture-specific optimizations, and take advantage of the large x86-64 instruction set.


Geekbench is a poor tool. Maybe for simple mathematical operations, but x86-64 has a larger instruction set than what the A11 supports


I don't think there is a strong connection between a larger instruction set and performance.


Using a bad example, let's say x86-64 and the A11 can both add numbers up to 3 digits. The A11 can do that faster than x86-64. But I want to add a 5-digit number. The x86-64 can do that with one instruction, but the A11 has to break that 5-digit number into multiple 3-digit numbers in order to process it.

From my understanding, which might be wrong, Geekbench only tests what both can do. Comparing a pickup truck and a sports car, you need to look at more than just how fast they can go. Both can move furniture, but the sports car will need to do a lot more to be able to do what the pickup can.


On the other hand, Apple hasn't been shy about putting fixed function hardware on the Ax chips to eke out the maximum performance for a given power budget. They were much faster at adding h265 decode/encode than Intel, for instance. I would imagine they'd spend their (larger) transistor budget on a lot of this type of dark silicon that would assist in speeding up common operations.


Geekbench doesn't benchmark single instructions.


This doesn't make any sense.

Yes, Geekbench is a terrible CPU benchmark, but both of them are Turing complete, so there is no functionality difference. And complex instructions are broken into uOps in x86 too.


You are 100% correct.


Faster is not necessarily more powerful.


...in one benchmark, from a heavily Apple-favoring company.

I'm sorry, we need better evidence.


Microsoft is running 32-bit x86 applications on ARM today:

https://channel9.msdn.com/Events/Build/2017/P4171


There are also fat binaries - one recompile for apps that don't do anything too esoteric, and you've got both formats in one bundle.
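For the curious, the build side of that is not much more than the classic lipo dance - a rough sketch only, and note that an arm64 macOS SDK is my assumption here, since nothing of the sort ships today:

    # build the same source once per architecture, then glue the results together
    clang -arch x86_64 -o myapp_x86_64 main.c
    clang -arch arm64  -o myapp_arm64  main.c   # assumes a hypothetical arm64 macOS SDK
    lipo -create myapp_x86_64 myapp_arm64 -output myapp
    lipo -info myapp   # lists the architectures packed into the fat binary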


There's no reason to think that the ARM chips they use won't be as powerful as their intel equivalents in a few years.


This is true. The rate of progress in CPU performance has slowed to a crawl and will only get slower in the future. Because the pace of innovation of fabs has slowed Intel has also lost their fab lead and GF/Samsung/TSMC should all be very competitive with Intel over the next decade.

Apple is probably looking at all these trends and concluding that they don't really need Intel.


I'm envisioning a Mac with multiple A-series chips in it. Why stop at 1 when you could put 8 or 14 of them in it?


After the Meltdown patch and Intel's exposed security flaws, Apple might still be better off with less powerful processors.


When Apple switches, their processors are likely going to be 2x faster than Intel's. It will be a huge jump.


As have Microsoft and other major companies. Unless you're writing large portions of the code in assembly language, porting to a new operating system can be greatly eased by the developer tools. In the past, Apple has provided a transitional emulation layer that meant that even architecture-specific code ran on their new machines.


Don't forget Apple's NeXT heritage. NeXT supported and migrated between even more CPU platforms: 68k, x86, PA-RISC, and SPARC


Whenever there is a big change by Apple, I always see a lot of relatively niche comments like this, and they're really important, IMHO.

These sorts of "great, now I can't use Mac for XYZ" problems might, at first glance, be dismissed as "well, looks like Apple out-grew your niche market... sorry."

Upon further investigation, however, it appears that Apple might be letting go of too many niche customers whose needs will be met by niche products; niche products that will grow concentrically, and the Apple we know now will die (has already been dying) a death of 1000 cuts, and the shell that remains will be the consumer tech toy company we see taking shape now.


The niche market of "software developer" is in an interesting position if you include developers who create the apps that Apple needs to sell phones. Clearly "iOS app developer" is only a subset of "software developer" but I think there is only so far down the "consumer tech toy" road they can go without doing longer term damage to their ecosystem


I would think that non-iOS software developers are still a big money maker for Apple. It seems like half of professional developers use Macbooks.


Even though I am a non-iOS developer who likes using a MacBook, I can see how it would make sense for Apple to push out a small group of power users if their requirements conflict with features that enhance the experience for the majority of their users, for example something like sandboxing the entire OS away from the user. This would (ostensibly) be good for security and would only be a dealbreaker for a small minority.

As long as iOS developers aren't impacted, I'm not sure what the incentive is for Apple to allow that kind of access.


They could make a Linux distro, a "yellow box" for Linux. I know it sounds totally ridiculous, but if Apple really doesn't want to make computers for software engineers, it doesn't stop them making an OS for software engineers.


:-)

Apple did just this, twenty years ago.

https://en.m.wikipedia.org/wiki/MkLinux

http://www.mklinux.org

Interestingly, they claim to have incorporated some of the technology from MkLinux into OSX, according to this page:

https://developer.apple.com/library/content/documentation/Da...

Make of that what you will.


...and they passed that point years ago, to boot.


I don't feel that the parent comment was "niche". This argument comes up every time anyone discusses an architecture switch, and the invariable response is that Apple has already done this twice before with an emulation layer.


And, if anything, it's easier now than in the past. Most code has been abstracted away from the bare metal, so it's more a matter of porting runtime environments over (V8, JRE, etc) than it is rewriting apps.


I'm not really sure how you reach the conclusions you do considering you don't know:

- Adobe's plans

- Apple's transition plans

- Really, anything other than the vague statement that Apple will start using their own chips in two years


Yep, it's the slow death of the general purpose computer (Cory Doctorow reference).


Adobe has a lot of stuff that works well on iOS. ARM has neon instructions for SIMD. I think they will be okay. Changing over from one SIMD architecture to another is not that hard.

AutoCAD isn't ultra optimized for Mac anyhow, it will be okay.

This transition may screw up or significantly slow down VMs of Intel-based OSes like Ubuntu.


I think developers flocked to Intel Macs because it gave them one machine that could run Mac, Linux, Windows in virtual or bootcamp environments. If x86 compatibility goes away, developers may jump ship to Linux or Windows machines.


I used to have a PPC G5 iMac running Linux. It was fine; hitched me onto the Linux-on-Mac train for more than a decade once I finally ditched Mac OS X. And Linux already works nicely on ARM - it's the most common deployment.

This may be a terrible outcome for multi-OS users, but it's not guaranteed.


UNIX developers you mean.


Your last point may be a bit of an understatement. It's going to absolutely annihilate x86 virtualization on the Mac. Like, I don't think parallels or VMware will even attempt to write an x86 emulator for ARM, they'll just pack up and leave.

But the Mac thrived for a very long time without x86 compatibility, and it can do the same again; virtualization was always kind of a happy accident for us anyway.


Parallels and VMWare will just abandon their primary market?

No they won't.


Maybe not Parallels, but VMWare Fusion always seemed like a skunkworks side-project that almost got killed off at least once. Their cash cow is enterprise hypervisors, not $60 consumer windows emulation, unfortunately. (Fusion has been serving me very well for windows and .NET development purposes so far)


They can try, but you can't get blood from a stone. Emulating x86 on ARM is going to give you grotesquely low performance, it's arguably not even the same category of problem that they currently solve (virtualization, not emulation).

One thing they could run with is ARM-on-ARM virtualization, which is a more direct (heh) parallel to what they do on Intel now, but ARM hypervisors are less mature, and their practical applications, less readily apparent.


VMWare's primary market is not Fusion.

VMWare's primary market is vSphere -- and you don't run vSphere on a MacBook.


Didn't Parallels start off with x86 emulation before Apple switched to Intel?


There was "Microsoft Virtual PC for Mac" which was x86 emulation for PowerPC but performance was abysmal.


There's Windows for ARM, so important Windows applications will be recompiled to ARM and those who stay behind probably can take that performance hit.


Except, like, all the games.


What's so special about games? Sure, old games might be abandoned, but supported and new games will probably be compiled for ARM as well. Of course it depends on whether ARM will be a commercial success and whether there will be a sizeable number of ARM laptops.


But old games are the classics of the genre! Imagine if switching platforms meant you couldn't watch movies made from 1960 to 1990 anymore. Also, game companies lose source code all the time, and they often aren't interested in recompiling the code that they still have. (This comes up whenever compatibility-breaking iOS updates happen.)

Frankly, I'd be surprised if even a game like Overwatch got an Arm port.


IIRC Windows games already sucked on Mac hardware, even booting natively.


The hardware isn't top-end by any stretch, but you still have the back catalogue of pretty much the entire history of PC gaming, plus less demanding (but still un-emulatable) indie games — to say nothing of eGPU use for modern gaming. (And even without an eGPU, I can still play Overwatch and CS:GO on my 2013 MBP at 60fps on low settings.)

If my Steam library couldn't come with me, there's no way I'd get an ARM Macbook.


But Adobe's ARM work is probably related to a completely different series of products, directed at the not-quite-so-demanding segment of the market, and you still need to port AutoCAD, something they might not think is worth it.

The problem is that you are not just porting to a low-TDP Atom (by any other name) chip with an Intel graphics card; you are porting software that today runs on a class of computers that includes dual-Xeon systems with two or more high-end ATI cards, to something that doesn't really scale upwards from a MacBook Air in terms of raw power.

And when you remember how badly even the highest-spec Mac Pro is outperformed by its competitors, which are likely still going to be around a decade from now, the question is to what extent a maker of high-margin niche products targeted at demanding professionals (with corporate procurement budgets) would even consider supporting the Mac with anything but a simplified, HTML5-based consumer version that lacks most of the advanced features going forward.


A lot of progress has been made recently in ARM virtualization (see its push into datacenter units). An ARM VM should run reasonably quick, and Ubuntu works fine on ARM.


> An ARM VM should run reasonably quick, and Ubuntu works fine on ARM.

Sure, but if we're deploying to x86 servers running Linux, then running Ubuntu on virtual ARM doesn't seem terribly wise[0] — it won't tickle the right bugs.

[0] For my money, running Ubuntu in general is a poor idea. It's not a great server OS, so it shouldn't be on the server — and if it's not on the server, it shouldn't be on the development systems either. Just run Debian everywhere.


Are there ARM extensions that provide a stronger memory model (matching that of x86) at very low cost? If not, we might not get excellent performance from x86 emulated on ARM in the multithreaded, multicore era we're in.


Server versions of major Linux distributions already run on ARM. Gnome and KDE too, although I hear that there can be issues. Part of the problem with ARM support is the lack of hardware for Linux developers to test on. MacBooks with ARM would actually help this process.


There are several aspects to the performance hypotheses you make, and I don't see why Apple could not design its own CPU with excellent performance.

Apple's current best CPUs already perform very well compared to some Intel CPUs. Not the fastest Intel CPUs, obviously, but a) Apple's ARM implementation is already ahead of all the ARM competitors; b) we are talking about an ARM implementation optimized for low energy consumption and for the other components and functions essential to a high-end mobile phone -- and even then it is starting to catch up with chips that dissipate more power and integrate far fewer features; c) the hypothetical Apple laptop/desktop CPU won't be coming out today.

From a theoretical point of view I see nothing that prevents Apple from achieving what they want, especially with the kind of money they have and the kind of chip-design team they already have. Oh, by the way, the A11 has more transistors than a 4-core Skylake. That does not mean much, except that we are dealing with a similar kind of project, and in that kind of project the performance and thermal budget is not something that suddenly makes the people working on it go: oh my god, we don't know how to make that, we are only good for mobile phone SoCs. I suspect some of the Apple employees working on their chips even worked at Intel before...

Another problem with what you say is that the workloads you consider are especially well suited to acceleration, and it is even simpler to compete in this area compared to old-school CPU power. You just need bulk compute power, which can come in the form of, at least: vector instructions; GPUs; DSPs; fixed-function acceleration units. Those areas are not especially dominated by Intel.

Some last points: top CPU speeds are (mostly) not growing anymore. That lets others catch up with Intel in the few areas where they have not yet -- others are also catching up on the process side.

The result of that equation is that I really don't see why Apple could not manage to create high-performance CPUs to the point of not needing Intel anymore. Maybe at first they would limit it to their laptops and switch the Mac Pro last, but I'm not even sure of that.


>Some last points: top CPU speeds are (mostly) not growing anymore. That lets others catch up with Intel in the few areas where they have not yet -- others are also catching up on the process side.

Meanwhile, in reality, the top-end non-Apple ARM SoCs only have the performance of a single high-end Intel core.

http://browser.geekbench.com/android-benchmarks/ vs http://browser.geekbench.com/v4/cpu/7779910


Yes, but Apple has an edge compared to other ARM vendors (and that's what we've been discussing in the first place), and comparing the CPU speed of an energy-optimized phone SoC to a high-end desktop core is absolutely not representative of what you can expect from a team tasked with creating a CPU core dedicated to performance in the first place, with fewer power constraints. (And that's also why the work Apple has already done on their own CPUs is so impressive, btw.)

Otherwise I could just as well take the performance of an IBM z14 mainframe, compare it to an existing smartphone SoC, and declare that ARM is doomed because it is too slow. Note that even increasing the clock of those ARM cores alone would increase performance greatly -- it might not be possible without modifying them to various degrees right now (if you don't target high frequencies to begin with, you can afford to spend less effort on the length of some critical data paths), but given today's processes it is very probably not extremely hard to tune the design if needed.


Apple has more experience with architecture changes than many other companies. In general, transitioning to a new processor architecture is very straightforward, and way easier than switching to a new OS.

The reason why Windows on ARM never caught on was simply that Windows was still available for x86. With Apple it's different, though, because once (or if) they switch, the only way to be on OS X is through the use of the new architecture.

There are a variety of features of OS X that are going to keep the design community there for the foreseeable future.


Well, except it's not that simple at all. Even if they immediately discontinue all previous models and only introduce new architecture models in one go, you still have several years worth of computers sitting on x86/x64. They won't be able to get rid of it in the OS until at least the mid-2020s.

As an example with the PPC->x86 move, they released Rosetta in 2005, stopped pre-installing it in 2009, and only in 2011 did then completely remove the ability to install it. I foresee a similar experience here.


The technology to do this is already there. Apple will likely extend some piece of open-source work or acquire a small company which specializes in emulating x86 on whatever architecture they transition to. Emulators are complicated pieces of software, but they are well understood ones. We know how to make them reasonably performant, and Apple has had around four major architecture changes under its belt from which it can draw experience.

The existence of Rosetta does not prove the transition to be 'complicated'. It just shows the transition was a thing
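
To give a flavor of why emulation counts as well-trodden ground: at its core, an interpreter-style emulator is just a fetch/decode/dispatch loop. A toy sketch with an invented two-opcode ISA (my own illustration, nothing like a real x86 front end; real products layer dynamic recompilation on top of the same basic structure):

    #include <stddef.h>
    #include <stdint.h>
    #include <stdio.h>

    enum { OP_HALT = 0, OP_ADD = 1, OP_PRINT = 2 };

    /* Toy emulator core: fetch an opcode, dispatch, repeat. */
    void run(const uint8_t *code, int32_t *regs) {
        for (size_t pc = 0;;) {
            uint8_t op = code[pc++];
            switch (op) {
            case OP_ADD: {                       /* r[a] += r[b] */
                uint8_t a = code[pc++], b = code[pc++];
                regs[a] += regs[b];
                break;
            }
            case OP_PRINT:                       /* print r[a] */
                printf("%d\n", regs[code[pc++]]);
                break;
            case OP_HALT:
            default:
                return;
            }
        }
    }

    int main(void) {
        int32_t regs[4] = {1, 2, 0, 0};
        const uint8_t prog[] = { OP_ADD, 0, 1, OP_PRINT, 0, OP_HALT };
        run(prog, regs);                         /* prints 3 */
        return 0;
    }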


I'm not a chip designer, but how practical would it be for Apple to try to make a CPU that has good hardware support for both ARM and x86/x86-64 binaries?

It's a radical direction, but they are designing the chip by themselves and for themselves, they have deep pockets, and if it worked well it would reduce an important risk.

There are already several chips capable of running multiple instruction sets. Most desktop machines have chips that can run either x86 or x86-64. Some(?) ARM chips have a Thumb mode. So that seems very practical, but obviously there is a bigger "impedance mismatch" when trying to do both x86 and ARM on the same chip -- how much worse is it?

If you could do some kind of hardware real-time instruction translation without wasting much silicon and without reducing performance (for x86) that much, it seems like it would take a lot of the pain out of the transition.


I'm not a chip designer either, but IIRC, current architectures already do something similar, where the CISC instructions are broken down into custom RISC-ish micro-ops by a hardware layer, which are then executed by the RISC-ish cores.

And while it might be doable (x64 and ARM on the same chip), I think it might not be a smart decision, since some things will not function correctly or will have to be redone (e.g. speculative execution).

I don't know either architecture (x64 or ARM) at a deep level, but I have a feeling that there are some big architectural differences that would prevent x64 code from running natively on ARM.


AFAIK the biggest impediment might be that Intel would need to license Apple to do so, not to mention patents etc.


See: Transmeta.


How do you think the announcement that OSX will be supporting external graphics cards fits into Adobe's and Autodesk's plans?

I don’t really do much video, photo or 3d editing these days but isn’t that all GPU driven?

If that’s the case then couldn’t someone like Nvidia pick up the slack for Apple with a really nice external modular offering?


I use both Adobe and Autodesk, and neither yet support eGPU well. Essentially, if they wrote their own graphics engines (both adobe and autodesk did) then eGPU doesn't work that well. Lightroom has started to come over, and some features of Fusion360 seem to work OK, too, but for the most part the only eGPU accelerated parts of the apps seem to be the user interfaces.

Given Apple's recent obsession with AR/VR, I wouldn't be surprised if the new apple architecture was really GPU first, CPU second. Apple has always loved graphics, and in all likelihood they recognize that a more parallel OS model might actually suit people's day to day needs better.


Sure, Nvidia could build a traditional co-processor card (just like in the old SBus days), but what are the chances of Apple releasing hardware with that kind of extendability after going all in on the iOS-ification of the Mac? Even Thunderbolt (which is a much lower-level bus than USB 3) did not support the same kind of performance as a proper PCIe slot.

And even if you could, most of the software in question is composed of countless components, some of which predate the Mac itself, and porting that to support anything but the traditional big-core CPU platform is not a trivial thing to do, which is why Xeon-powered workstations and servers still exist and why manycore designs only really work in the supercomputer space.

You could argue that the workstation itself is a dead concept and that the future is "cloud" with HTML5 frontends even for video and CAD workloads, which is something Autodesk and Adobe are likely working on as a long-term strategy, but in the short term they need the workstations that post-Jobs Apple doesn't appear to have any interest in building.


It is also quite likely Apple will try to add some kind of SIMD chip to their ARM efforts. Things just got a whole lot more interesting if we can wrestle Intel out of its current leadership position.


Apple goes where the money is. I'm sure they have run the numbers. If this makes financial sense, I'm sure they are willing to drop the professional graphics market. Still waiting on that new Mac Pro?


If Apple manages to lose both designers and developers, I think the whole ecosystem (not just macOS) will suffer more than Apple's management realizes. I also don't think it's possible to run the numbers on this scenario.

Without the insistence of so many designers on using a Mac, would macOS really have won a foothold in corporate environments? If enough designers and developers switch to Windows, who's to say that macOS won't slowly turn into a hobbyist OS that no IT department wants to support (like Linux, or Macs ~12 years ago)?


Apple doesn't care about the corporate market. They care about individual users. And as long as the CEO wants their shiny Mac (also, see iPhone, iPad) then the IT dept will support it. Your argument might have been valid a decade or so ago, but since then, every home has a laptop or three, and BYOD happened.


In my limited experience, management often uses a shiny iPhone, a shiny iPad, but then a Windows laptop (or none at all). I can't find any statistics to back it up, though.


>this essentially kills the professional mac workstation as a viable product going forward?

Some would argue it's already been dead for a long time. I'm seeing fewer and fewer directors and artists using Macs for their workstations, when it was close to 100% about 10 years ago.

They just didn't supply a competitive workstation machine for so long that many were forced to move to windows if they wanted power. I don't know a 3D artist anymore who isn't doing all their rendering through multiple Nvidia GPUs in a tower PC.

The iMac Pro has only been out for a few weeks after years of neglect and I'd struggle to recommend it to anyone at that price point.

(Ironically many I know still use a Mac laptop but it's relegated purely to office/email/slack/web tasks that back in the day we'd associate with Windows machines, not the actual creative work)


> is going to require significant work by third parties outside Apple who haven't even heard of the possibility of this happening until today

I don't see recompiling their next release with a different version of Xcode qualifying as "significant work".


> whatever it is the pro's use for video editing those days are a different ballgame.

That would be Final Cut Pro X, which Apple itself makes. (Also, Motion, Compressor, and honestly, iMovie does a lot of stuff you wouldn't think it could do for a free app.)


There are very very few pros using Final Cut Pro X (lots of wedding videographers though). Every single edit house I've been in recently is on Avid or Premiere (usually both) with a FCP7 computer somewhere for legacy projects.

I've never encountered anyone using Motion professionally. It's all Nuke and After Effects with Fusion growing.


If Apple needs to, they could buy either Autodesk or Adobe. To be honest, I'd prefer they buy and port SolidWorks.


FTA: the Pro units will remain on Intel until Apple's chips can compete, for this very reason.


My concern is that they are going to completely lock down and cripple MacOS and turn it into iOS, making it worthless as a professional machine. The CPU doesn't matter but the versatility and user-centric-ness of the OS does.

If they're talking about making MacOS more like iOS I think it might be time to start looking for the exit. A laptop/desktop is not just a portable dumb terminal for accessing siloed cloud services.


I see no reason to suspect that changing CPUs would be related in any way to "crippling" macOS. Those are two separate issues.


They are separate issues, but history has shown that changes in architecture (or form factor) are often used as a pretext to also make other changes.

Mobile transformed the computer into a portable surveillance and addictive media device aimed at the user rather than a personal computing device built for the user. Those are two separate issues of course -- there is nothing in mobile that mandates that it be designed to commoditize its users -- but the shift to mobile was used to also smuggle in a total inversion of the user/machine relationship.

Apple is kind of a strange company. On one hand they have stuck up for the privacy and security of their users, but on the other hand they were a factor in this inversion of relationship. They weren't the only or even the primary factor, but the locked down nature of their iOS platform helped other actors such as Facebook and Google implement surveillance-capitalist and gamified attention-capitalist user experiences there.

Whether or not I stick with Apple depends on whether or not they keep my computer mine even as they swap out the CPU. I couldn't care less about the architecture as long as it performs well, but I do care about the nature of my relationship with the technology that I use. I care about this for both personal and professional/pragmatic reasons.


I agree and it's unlikely my next laptop will be an Apple device...


I was thinking the same thing, but Apple doesn't care. Folks like us do not represent the majority of the market.


exactly. It's a shame... written on old macbook air


Why assume this is an arch switch to ARM? What's stopping Apple from making an x86-64 compatible chip? Or even a chip with dual instruction sets? I'm sure they have the in-house talent at this point to make it happen.


Interesting question. I think the advantage Apple has here over MS, is that partners like Adobe already have experience making apps run on Apple's ARM chips. Lightroom on iOS is surprisingly full featured and fast.


Whats the big deal? We all write code that is cross platform, don't we? :)


No one wants to be the next QuarkXPress.


Why are you so sure that it will be ARM processors tho?


Based on this article, it doesn't sound like Bloomberg has enough information to distinguish between Apple being committed to a transition, or Apple developing chips to improve their negotiating position with Intel. It's also very unlikely to happen on such a fast timetable; given the IP situation, it's unlikely Apple could make an x86_64 chip, and any move away from x86_64 is going to require significant work by third parties outside Apple who haven't even heard of the possibility of this happening until today.


> Apple developing chips to improve their negotiating position with Intel

Not necessary. They only need to ship one model of Mac Mini or any small desktop lineup with Ryzen to improve their negotiating position while still being compatible with the existing x86_64 ecosystem.

So no, this is not merely a negotiation jibe. There are definite long-term prospects.

We're already moving to a mobile-first design and development world. Having those mobile/tablet apps expand automatically to the desktop is the next logical conclusion. As an app developer, there is nothing better than write-once-run-everywhere, and the cascading effects of that on the whole Apple ecosystem and future consumer audience are hard to overstate. That's leaving aside power, efficiency, device prices, opportunity costs and much more.

> significant work by third parties outside Apple who haven't even heard of the possibility of this happening until today.

Definitely not today. It's been speculated about for many years now, ever since the A4 chips, and it's still not official news. The usual "people who don't want to be identified". If or when this becomes official news, most partners will be saying "about time", because everyone is in one way or another working on convergence and multi-device. Adobe and Microsoft are examples of pivoting many of their desktop businesses successfully to cloud and apps already.


> Having those mobile/tablet apps expand automatically to desktop is the next logical conclusion.

Thankfully Apple comprises vastly smarter people than me, but I've always thought this is a bad idea.

The two interaction models are so dramatically different that I don't see how merging them makes sense. A finger is not a mouse.


> ... I've always thought this is a bad idea ... A finger is not a mouse

There are no good or bad ideas :) It's all about time, place, knowledge of tradeoffs, execution, business, marketing, taking into account all stakeholders (the user is one of them) and everything holistic :)

What you say is true, a finger is not a mouse. But we're now a decade past the release of the iPhone, and by this time the industry in general (and Apple in particular) has deep knowledge of all the tradeoffs involved here. Five years earlier this could have been considered a reckless bet that required a Steve Jobs to pull off. Now, it's just natural evolution. Responsive design has been around for even longer, and changing interactions based on screens and form factors is a pretty mature problem domain now.

To be very specific:

- whether it's a finger or a mouse can be a runtime decision, not necessarily a compile-time or clean-slate/distribution decision for different platforms

- those decisions are already standardized enough based on existing knowledge that you can let the platform/framework (or 3p libraries) handle it out-of-the-box for you and just register multiple possibilities. instead of debating finger vs mouse - think finger and mouse

I.e., "different interactions requiring entirely different apps" is not necessarily true for all of them. For a 2D application, some amount of standardization is actually good; otherwise it doesn't really help all the interaction patterns and may instead stagnate them.


It's hard to argue with the resounding success of Windows 8.


It's easy to succeed when you have a monopoly on PC operating systems, so the success of Win 8 proves absolutely nothing. A touchscreen on a PC is about as useful as a waterproof towel.


I had assumed that was sarcasm.


I think this was sarcasm :)


I'm also surprised that no one (with a significant voice) has spoken up enough to pressure Apple to think about its developer population. Everywhere I go I see devs using Macs. I'm sure the reason behind this is twofold: supported hardware, and an x64 + Unix platform. So if they make the transition, say, in 2020, the dev world must be prepared by the end of 2019 -- everything from toolchains to dev apps. And that would seem like quite a big endeavour; not that the dev world moves slowly, but the amount of work...

Is my anecdote too far off?


It’s a good point, but I suspect the transition (if it happens) will be staged over time to make it easier.

If you look at the PPC->Intel timeline[1], Apple announced it 6 months in advance, although it went fairly quickly after that.

1: https://en.wikipedia.org/wiki/Apple%27s_transition_to_Intel_...


Just the same treadmill as when toy desktops replaced workstations in the 80's.


Apple did a pretty decent job when they transitioned from PPC to x86_64; they had Rosetta to translate PPC to x86. Microsoft and partners released ARM-based Windows laptops this year that can run Win32 apps in emulation.

What's stopping Apple from shipping Macbooks with a custom SoC that can run existing Apps in emulation until developers can recompile? I would argue that most Air and Macbook owners aren't developers and probably don't have many apps that didn't ship with their system.


And before that they successfully managed the transition from m68k to PPC.

The processor doesn't matter.


Yep. Just to put this into perspective: tiny NeXT shipped NeXTStep on 4 architectures. M68K, Intel, PA-RISC and SPARC. They had at least two more in the lab, M88K and PPC.

Compiling for different architectures was a checkbox in ProjectBuilder (after you took care of endian issues, once). Much easier than in Xcode today.

My favorite was that they apparently shipped an additional architecture by accident: the developer tools came with one of the aforementioned architectures long after it had been officially dropped.
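
For anyone who hasn't run into the "endian issues" part: the usual one-time fix is to stop reading and writing raw structs and instead serialize fields byte by byte, so the same source behaves identically on little- and big-endian CPUs. A minimal sketch of my own (obviously not NeXT's code):

    #include <stdint.h>

    /* Write a 32-bit value in a fixed (big-endian) wire order, byte by
     * byte. The same source compiles and behaves identically on little-
     * and big-endian CPUs, which is what makes multi-architecture builds
     * a checkbox rather than a porting project. */
    void put_be32(uint8_t out[4], uint32_t v) {
        out[0] = (uint8_t)(v >> 24);
        out[1] = (uint8_t)(v >> 16);
        out[2] = (uint8_t)(v >> 8);
        out[3] = (uint8_t)(v);
    }

    uint32_t get_be32(const uint8_t in[4]) {
        return ((uint32_t)in[0] << 24) | ((uint32_t)in[1] << 16) |
               ((uint32_t)in[2] << 8)  |  (uint32_t)in[3];
    }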


The processor doesn't matter to Apple because they don't mind fucking over their customers and their third party developers.


That’s a popular line around here but it doesn’t line up with reality. When they’ve done architecture changes in the past they’ve reached out to large developers for feedback before committing to the changes being made. The original, very OpenStep-like OS X did not ship en masse because it lacked a bridge library to support developers, and Adobe and others cried foul. That’s how Carbon came about. When 64-bit Carbon was shelved some 10 years ago, “the sky is falling” was proclaimed, but Adobe came along for the ride and everything was fine. Ars Technica did a great write-up summarizing both the genesis and end of Carbon development [0]

The PPC transition and Intel transition both had emulators, fat binary support, and, for the Intel transition, early access to developer hardware [1]. I’m not sure how much more you can ask of Apple. The current iOS simulator compiles to native Intel code and then builds for deployment using the appropriate CPU target. The tooling is mature and the execution known.

Apple can certainly do better in a lot of areas, i.e. Swift examples that are either missing or are too old to compile. This is something they’re competent at.

[0] https://arstechnica.com/staff/2008/04/rhapsody-and-blues/

[1] http://vintagemacmuseum.com/the-apple-developer-transition-s...


Apple spent a lot of effort with each transition trying to make it as smooth for customers as possible while still pushing forward with the new system.

I have no doubt some customers were caused pain by the transitions, and some left the Apple world entirely, but characterizing them as being "fucked over" seems a bit over the top.


(gaius is right, however: CodeWarrior saved Apple once, but they've clearly learned a lot with successive transitions.)


It’s true that Apple fumbled 68k —> PPC as far as MPW went but it’s also true that creating fat binaries was a total non-issue with CodeWarrior. Apple had learnt from that mistake for PPC —> x86


The difference is that the x86 chips were so much faster than what they had for PPC, that they could still emulate it without customers feeling like they were going backwards.


How does this fuck over customers or third party devs in any way?


If you are changing CPU architecture, you have three options with respect to backwards compatibility.

1) Old programs don't work, because it is a different CPU architecture. 2) Old programs work, but in a VM, so they can't take full advantage of the hardware. 3) Old programs must be recompiled to work on the new architecture.

The last one is the preferred option, but is only possible for open source software. If options 1 or 3 are taken for proprietary software, the customer needs to buy a new version of the software.


You go through some mental gymnastics to make it seem like #3 is only possible for open source.

Look - Apple or any other vendor isn't beholden to one CPU architecture. Such expectations breed monopolies - like Intel in PC CPUs.

None of your arguments prove that Apple is fucking over customers or developers. If anything, this opens up the market for newer, more nimble companies that'll fill the gaps left by slow moving, irrelevant apps/software.

Cheers


I'm looking at it from the point of view of the customer. If I don't have the source code, I cannot recompile the code, end of story. The company that sold the binary executable might recompile it for the new architecture, but they're probably not going to give the recompiled binary away for free. That is why I say that #3 is, for the user, only possible for open source.

I agree that vendors are not beholden to a CPU architecture, but let's not pretend that switching is immediately beneficial to the user. What you call "opening up the market", I call adding unnecessary obsolescence to programs that chose not to add planned obsolescence in the first place.

If anything, I would take this as further evidence that software should be sold as source code, because the utility of mere build artifacts can be taken away.


You can conjure up fictitious reasons, but the manufacturer never guaranteed the buyer that future hardware versions would use the same CPU.

Then there's your contrived reason to obtain source code - another bogus, non-sensical reason that'll never fly with devs.

You always have the choice of staying with an older model, or better, using Linux on your custom hardware. Don't push your socialism/communism on one of the most capitalistic companies on Earth (Apple)


I doubt that story has changed much.


It does matter that the new processor is a lot more powerful and faster than the old processor. This wouldn't be the case this time.


It's not powerful/faster that sells chips, it's power and speed relative to envelope. Maybe the iMac pro continues to ship with Xeons, the iMac with Core i5 and i7 depending on configuration.

But compare the Intel Core m3 to the Apple A11, and a completely different story will emerge. The A11 is already comparable to relatively recent Macbooks in terms of performance.


The point is, if we're going to bring up what an 'easy' time Apple has had transitioning from one architecture to another, it's worth remembering each transition came with a big performance jump. This made the new platform desirable and emulation bearable. If they are considering a transition again, 'bearable emulation' is not as much of a given.


While single core performance may not be higher, I could see Apple adding more cores than their current Intel offerings have, and still fall within an acceptable power envelope (thermal, battery life, etc).


My memory is that the PPC -> x86 jump was due to PPC supply issues and the fact that the PowerPC 970 / G5 was too power hungry for laptops. I could be wrong, but I administered labs of mixed x86 / PPC Macs during the transition and the performance jump seemed just like the normal difference between successive generations.

Keep in mind that while you don’t see much 68k outside embedded these days, you still see POWER in supercomputer rankings, and it also appeared in game consoles.


They were a lot faster, especially the laptops. And that was really the consumer proposition in both cases - you put up with our switch in exchange for a faster, better, stronger computer.

If Apple ends up doing it, the proposition would almost certainly be different and it seems very unlikely it would involve things like running your x86_64 macOS apps on your brand new Mac except three times slower. Thinking about this in terms of previous changes (or in terms of PPC history details) just seems obviously wrongheaded to me.


People assume that everyone who uses a Mac is running Photoshop or developing advanced AI but they're not. There are a lot of professionals using Apple products but they avoid the low end like the plague. Apple is a lifestyle brand these days and much of the user base has no demand for anything beyond the core Apple Apps.

The Macbook Core i3 is barely enough to run Safari or iTunes and Apple could probably replace the CPU without many of those users ever noticing.


I'd expect a huge performance jump when the new ARM-based Macs come out, both in Single-core (they can ramp up the clocks and increase cache and execution units) and Multi-core (they can add more cores to fit new power budgets as well).

The current A11 chips for iPhones are within 10% of Intel's top mobile chips on Geekbench, and within about 30% on their top desktop CPUs.

It's entirely possible for chip architectures to see 2x-3x speeds when moving from mobile power budgets to desktop power budgets.

An Intel Pentium 4410Y Kaby Lake running at 4.5-6 Watts gets about 1800 single-core on Geekbench, while an Intel Core i7-7700K Kaby Lake running at 115 Watts gets 5600 single-core on Geekbench.


> The current A11 chips for iPhones are within 10% of Intel's top mobile chips on Geekbench

No, they aren't. iPhone X's multithreaded geekbench score is 10k. The 15" macbook pro is 15k. That's a lot more than 10%. It's only close if you look at the lower end Intel chips, the dual core ones (which is what Apple ships in the 13" macbook pro).


Those are multi-core numbers, not the single-core numbers.

What do you think is most important for 250 million desktop users? Because the vast majority of their machines are sitting idle waiting for interaction, like your system right now.


Even browsers are doing a better job of loading multiple cores now and desktop machines often have more processes running in the background.


> Because the vast majority of them are sitting idle waiting for interaction tasks, like on your system now.

My dual Xeon E5-2690 v4 regularly loads all its cores and benefits greatly from them, but keep making assumptions by all means.

But if all you want is a chromebook competitor then sure, A11-class works fine. I'm going to guess that the people using Mac Pros tend to care a bit more about just running Chrome/Safari, though. Maybe Apple is just going to completely give up on their historically strong content creation market.


Geekbench is not representative of application performance, sadly.


The Intel chip is definitely among the biggest line items on a Mac. Apple can cut that cost by 70-80%.


Maybe for the first few hardware revisions it won't be as performant. It's probably going to be an "ARMbook" with the form factor of the Air or the MacBook, but with A12 chips. The current A11s are within striking distance on performance.


The current A-series SoC’s were also designed around the power and thermal requirements of a mobile phone. In a laptop or desktop they would have a lot more wiggle room.


They could also run more than one per machine


> This wouldn't be the case this time.

I don't think we can make that assumption here. I would be very surprised if the ARM chips they put in desktops are identical to the ones they put in phones.

For one, the power budget is going to be a lot larger (even for a notebook), and power is roughly equivalent to speed.


Apple's ARM chips compare favorably to the crap that is in most laptops. Those Intel Core i3 6100U or Core i5 6200U chips? The midrange and low end will be greatly served by ARM chips.

They will not compete at the high end against the Core i7s with 6 cores running at 3.8GHz (12 with HT) though.


> Geekbench 4 scores are calibrated against a baseline score of 4000 (which is the score of an Intel Core i7-6600U)

https://browser.geekbench.com/mac-benchmarks

> iMac (27-inch Retina Mid 2017) | Intel Core i7-7700K @ 4.2 GHz (4 cores) | 5683

https://browser.geekbench.com/ios-benchmarks

> iPhone 8 | Apple A11 Bionic @ 2.4 GHz | 4217

4217 for Apple A11 @ 2.4Ghz vs 5683 for Intel Core i7-7700K @ 4.2GHz

Of course, microbenchmarks don't mean much. But the margins are thin enough for users to notice already. Add in more power, more cores, more Ghz, better optimized instruction set, more vertically integrated system, and who knows.


Geekbench scores look impressive because the tests are very, very short bursts, which prevents power/thermal throttling on iPhones. iPhones are able to scale performance up and down very quickly, x86 CPUs less so, so what is really being tested is how fast a CPU can jump from idle to full speed.

Geekbench is not at all a reliable benchmark that tells you anything about real performance.

It's a total farce to suggest that a CPU with a power budget that is 10x to 20x larger, on a modern process, with a modern architecture, is somehow just as fast or slower.


Intel is dragging 40 years of legacy along. I’m not so sure a new processor can’t be faster and less wasteful of battery power.


Indeed, we have every indication that Apple can already do both of these things. Go look at Geekbench 4 single-core scores for the current iPhone and iPad vs. the current MacBook Pro. Apple is already beating Intel on speed in a design that uses MUCH less power.


I guess that Apple would have the resources and knowledge to create an extended ARM architecture that could match x64 in the same power envelope, if they really really wanted.

The market segment isn't that large though so it seems tough to get it done within the laptop budget. Still, it could benefit the other devices and streamline the hardware development, so maybe they think it's worthwhile.


But what -- and work with me here -- if the processor is slimmer. With rounded corners.


> until developers can recompile?

Will it be that simple? Apple's own Logic Pro ships with a lot of legacy products from Emagic days. I know they've re-skinned a few with the last couple releases, but I'm guessing a lot of the DSP code is still the same.


Creatives who want hardware optimized Adobe apps to continue to be fast will care. I wouldn't count on emulation of vectorized code to work all that well.


Yep, I think people tend to overstate how smooth these transitions really were. From a creative professional's perspective, they were a huge short-term PITA until Adobe etc. had everything ported over. Also, Excel lost VBA support for several years.


I remember everything audio was a mess although I'd imagine a lot of it had to do with smaller developers lacking resources to port existing software.


Losing VBA support stinks, but do any professional Excel users even consider Office for Mac?


At least for me "professional Excel users" sent out stuff that I could not open, and that is not good for the for the platform.

"Spreadsheet macros" were like hot personal computing shit in 1983. Really not good if you cannot create them in 2008. , (Few complaints about the current Mac MS Office, FWIW) I'm just saying that breaking old stuff often takes year to fix.


No, I get it. I just think that, in general, people who spend a big chunk of most of their work days using Excel kind of consider anything other than Windows MS Office Excel a non-starter.


Adobe's transition was somewhat rough because they took the Carbon route, and then had to do it again to switch over to Cocoa. I expect that now that all their apps are built on Cocoa, the transition will be relatively breezy.


Emulating x86 on ARM with any kind of real performance is additionally tricky because the x86 architecture has very strong memory guarantees (it doesn't matter if memory is aligned, it doesn't matter when in the instruction pipeline you access it, etc.) that ARM isn't even close to matching.

Emulating that requires a massive performance hit, because you essentially have to check every single memory access to make sure it's not doing something invalid on ARM.
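
To make the ordering half of that concrete, here is a minimal C11 sketch (my own illustration, not taken from any actual emulator): on x86, two plain stores are already observed in program order by other cores, while on ARM the same guarantee needs explicit release/acquire ordering, which is roughly the extra cost a conservative x86 translator has to pay on shared memory accesses.

    #include <stdatomic.h>
    #include <stdbool.h>

    static atomic_int data;
    static atomic_bool ready;

    /* Producer: on x86 two plain stores already appear in order to other
     * cores; on ARM the release ordering below maps to extra barrier/store
     * instructions that an x86 emulator has to insert conservatively. */
    void publish(int value) {
        atomic_store_explicit(&data, value, memory_order_relaxed);
        atomic_store_explicit(&ready, true, memory_order_release);
    }

    /* Consumer: the acquire load pairs with the release store above. */
    int consume(void) {
        while (!atomic_load_explicit(&ready, memory_order_acquire))
            ;                               /* spin until published */
        return atomic_load_explicit(&data, memory_order_relaxed);
    }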


From what I've heard, x86-to-PPC or x86-to-ARM emulators do dynamic recompilation and instrumentation. They catch the illegal memory accesses and rebuild the responsible code to add slower paths with alignment compensation.

A lot of x86 code is aligned for speed already, so it's a pretty safe bet to assume alignment and fix it up if wrong.
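
A hedged sketch of what that "slower code with alignment compensation" can look like (the helper name is invented for illustration): the safe fallback assembles the value via memcpy, which is legal at any alignment, while the fast path the translator normally emits would just be a single ordinary load.

    #include <stdint.h>
    #include <string.h>

    /* Slow-path load for a possibly unaligned 32-bit guest value. memcpy
     * with a constant size compiles down to byte loads (or a single
     * unaligned load where the core permits it), so it is valid at any
     * alignment, unlike dereferencing a misaligned uint32_t pointer. */
    uint32_t guest_load32_unaligned(const uint8_t *guest_ptr) {
        uint32_t value;
        memcpy(&value, guest_ptr, sizeof value);
        return value;
    }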


…or you design your own ARM CPU that handles unaligned memory accesses fine, and run your emulator on that hardware.


AFAIK the ARM license doesn't allow licensees to invent arbitrary extensions to the ISA, so this might not be allowed.


Apple has an architectural license. I would expect that allows them to add instructions (assuming modern ARM still has the concept of instruction set extensions)

Google didn’t answer that for me, but I found out that there now is a ”cp15 sctlr[1] (alignment bit)”. http://infocenter.arm.com/help/index.jsp?topic=/com.arm.doc.... says this about it:

”3.5.1. Alignment faults

If alignment fault checking is enabled (the A bit in CP15 c1 is set), the MMU generates an alignment fault on any data word access if the address is not word-aligned, or on any halfword access if the address is not halfword-aligned, irrespective of whether the MMU is enabled or not. An alignment fault is not generated on any instruction fetch or any byte access.”


I thought the emulation only covered x86, not x86_64, because patents still cover 64-bit x86. Would that be more or less of an issue for Apple?


As much as Apple made a big deal marketing "Rosetta", behind the scenes it was a product from Transitive called QuickTransit. QuickTransit is gone now, absorbed by IBM. In comparison the 68k to PowerPC emulation was very primitive and slow and mostly appeared to work because of how unbelievably slow 68ks were vs PPC.


> It's also very unlikely to happen on such a fast timetable

The article (thin as it is) claims a multi-step transition. Apple almost certainly has a version of Mac OS that can run on their iPad hardware - the transition path that makes the most sense to me is a 12" MacBook that runs essentially the same internal hardware as an iPad Pro, and a MacOS that can run iOS apps. There would be a great consumer market for a MacBook that runs iOS apps, and it would serve as a hardware test bed for developers to get their MacOS applications ported over to ARM before transitioning the MacBook Pro lineup away from x86.


I’m still very skeptical of the “Mac that can run iOS apps” concept. Tim Cook has been pretty unambiguous in his statements about Mac vs iPad.

https://www.macrumors.com/2015/11/16/tim-cook-no-converged-m...

>We feel strongly that customers are not really looking for a converged Mac and iPad, because what that would wind up doing, or what we’re worried would happen, is that neither experience would be as good as the customer wants. So we want to make the best tablet in the world and the best Mac in the world. And putting those two together would not achieve either. You’d begin to compromise in different ways.


Of course this is from 2015, and it’s possible they’ve been prototyping for years and think they can overcome the fear of a subpar device.

But as a former Surface Pro owner who now has an iPad Pro, I don’t see that happening. The iPad is immeasurably better as a tablet when you have tablet-oriented software available. And when you don’t, obviously the Surface’s compromise of “have a crappy desktop experience too” is usable if you need to have that option. But it’s not good compared to stuff designed for a tablet.

Similarly, the chunks of Windows 10 that are clearly designed for touchscreens (like the new Settings app) are not great on a desktop compared to the older and still more powerful control panel. More consistent, sure, as any ground-up redo would be, but the information density of things like the Add or Remove Programs list is awful compared to what it was before.

I don’t think Apple is going to make those compromises. They might do a more converged developer backend for Mac and iOS to make it easier to target both platforms, but they won’t shove the frontends together.


> I don’t think Apple is going to make those compromises. They might do a more converged developer backend for Mac and iOS to make it easier to target both platforms, but they won’t shove the frontends together.

This, I think, is the most accurate picture of future iOS / Mac convergence. Universal binaries that present either a desktop, tablet, or phone UI based on where they're running.

Microsoft's mistake was trying to converge the desktop and tablet UIs.
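
For what it's worth, shared Apple code can already branch per platform at build time via TargetConditionals.h; a hedged sketch of that existing mechanism (the function here is made up, and a real app would hand off to AppKit or UIKit rather than print):

    #include <TargetConditionals.h>
    #include <stdio.h>

    /* Shared core code picks a front end per platform at build time.
     * Illustrative only -- the UI work itself lives elsewhere. */
    void present_ui(void) {
    #if TARGET_OS_OSX
        puts("desktop front end: menus, windows, pointer");
    #elif TARGET_OS_IOS
        puts("touch front end: full-screen, gestures");
    #else
        puts("some other Apple platform");
    #endif
    }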


They already have one App Store where a single purchase gives you a cross platform app with a different frontend across iPhone, iPad, Apple Watch, and Apple TV.

Unifying the Mac frontend into that ecosystem just seems like the obvious conclusion.

I don't know that they'd strictly be the same executable, but at least as far as the user is concerned they would be the same piece of software. From a developer perspective, multiple UIs built with slightly different flavors of AppKit depending on the UI paradigm, including the Mac which is currently targeted by AppKit.


This doesn’t work for professional apps, many of which are a collection of work areas controlled by tens or even hundreds of menu options.

iOS doesn’t handle that kind of UI well.

There’s also the cost issue. All but the most exotic iOS apps cost less than $100. Many professional apps cost hundreds or thousands of dollars.

It’s not just a fundamentally different market. The products are fundamentally different in critical ways.

I can imagine a hybrid MacPad product - maybe a dual-panel clamshell - but if it’s done badly it would be the worst of both worlds.

I suspect it could be a successful replacement for the iOS product line, with dual MacPadOS and iOS support.

But I can’t see it working for professionals without a lot of breakage.


I don't see it taking over for everything necessarily, but there are a lot of places it would help. Like Paprika Recipe Manager, which costs $5 for the iPhone/iPad universal app, but if I wanted the Mac version it's $30. Part of that is a "because I can" pricing, the market will bear it because there's just not that much competition compared to iOS apps. Another part is "because I have to," it's a much smaller market and a bunch of additional work to make the Mac version, so the price has to cover those costs with fewer users.

If it didn't need a totally separate UI framework, ports like this would take less effort and more apps would do it. Maybe you can't sell it for 6x the cost any more, but a comparatively small amount of work gives you a leg up on the competition.

Twitter is another example. They killed the native Mac client earlier this year and said "For the full Twitter experience on Mac, visit Twitter on web."


Microsoft has been trying convergence for years and has been awful at it, but there are finally some real implemented concepts, like modifying the taskbar based on tablet/desktop modes. Realized at scale, I still think this strategy could work, and with Apple's tight control over the iOS software landscape there is some hope of having iOS as a second-class citizen on macOS.


Apple has a long history of being strongly against things right up until they actually do them. And I think there's a significant difference between a converged iPad and MacBook, and a MacBook that is still primarily a laptop that runs MacOS, but can run iOS apps as a sort of bonus feature and consolation prize for incompatibility with existing Mac apps.

It's also worth noting that when he said that, Chromebooks didn't run android apps.


Apple has a long history of saying one thing and doing another (I don't blame them; the "Elop effect" is real: https://en.wikipedia.org/wiki/Stephen_Elop#%22Burning_Platfo...).


They may never release it but no doubt they have it. There are iMacs in Cupertino running macOS on ARM and there are iPads running macOS. During the PPC transition it was pretty obvious that Apple had been building OS X to work on Intel for a while secretly.


It's not like it was a big stretch: Mac OS X came directly from NeXTStep, which had already been ported to Intel wayyyyy back in the NeXT days.


Which means it's not a big stretch for iOS, either, since it's built on top of MacOS.


This needs to be repeated every time there's a discussion about ARM and the Mac:

Putting an ARM processor in a Mac does absolutely nothing to change the viability of running iOS apps on a Mac.

iOS developers always compile, test, and debug their apps on x86. The challenge of running iOS apps on x86 was solved 10 years ago.


Apple is rumored to allow iOS 12 apps to run on x86 macOS, with support for both touch and mouse interfaces.

https://www.macrumors.com/2018/01/31/apple-still-plans-combi...


I think you hit the nail on the head here


If apple uses its own chips, I would assume that they would be aarch64 and not x86_64. Also, Microsoft has some magics for executing x86_64 binaries on aarch64 with good performance, so Apple may have a similar technology. We could end up with another situation similar to the Power->x86 transition that happened back in 2006.

Or it could be nothing. This is a pretty thin article.


> If apple uses its own chips, I would assume that they would be aarch64 and not x86_64. Also, Microsoft has some magics for executing x86_64 binaries on aarch64 with good performance, so Apple may have a similar technology.

They definitely do have OSX running on ARM64. They had OSX running on x86 for years before the switch (in fact they had OSX running on x86 before it even was OSX: NeXT ran on x86, SPARC, PA-RISC and 68k; PPC is the one Apple had to add), they've already gone through two architectural migrations (68k -> PPC and PPC -> x86), and by all accounts the iOS core is very much shared with OSX, so it wouldn't make sense not to port OSX along the way.


> they had OSX running on x86 for years before the switch (in fact they had OSX running on x86 before it even was OSX

Although NeXTStep ran on x86, the MacOS build on x86 was John Scheinberg's personal skunkworks project until it became Marklar in 2001 (and then kept under the hood for another four years). Mind, Darwin was always written to be portable, but it wasn't a deliberate strategy to take it to Intel. This time around though, I too reckon that they have already a MacBook running on an A10X in the labs.


NeXT actually had prototype PPC hardware, so they may have had that at the ready too. Less developed and with less driver support of course.


What I'm really curious about is what they've still maintained: what they publicly ship now is all little-endian; are they still maintaining any big-endian port? Do they have a big-endian ARM port? Have they preserved the big-endian PPC port?


> ...Microsoft has some magics for executing x86_64 binaries on aarch64 with good performance...

I was just listening to a Windows Weekly podcast about that, and it's limited to 32-bit x86 binaries only, no 64-bit support. They also said it's "unusable" for 32-bit apps (in particular Chrome), because you can watch the system drawing the windows of x86 apps on the screen.

Watching the video, they seem to be exaggerating a bit with "unusable", but Chrome does look sluggish, and the startup of "DrRacket" and its window redraw does look very slow too.

https://www.youtube.com/watch?v=LfYcCSRMkVI&t=60m55s


Ahh that is a bummer. I only read about the translation stuff when it was first announced, before there was much info on the details.

Edit: After watching the video, that didn't seem too bad. Perhaps it would work well enough on top of a beefier ARM processor? Of course the lack of x86_64 support is another issue that may not have a reasonable solution.


> Also, Microsoft has some magics for executing x86_64 binaries on aarch64 with good performance, so Apple may have a similar technology.

I'm reminded of the patent-sharing agreement between the two. Or, given MS' diversification, they may be willing to directly license the tech. Making the x64 -> aarch64 translation as robust as possible has benefits for both companies, and they're not nearly the bitter enemies they used to be.


> Microsoft has some magics for executing x86_64 binaries on aarch64 with good performance,

No, they have tech for executing i686 binaries on aarch64, not x86_64. Big difference.

Meanwhile, High Sierra is the last macOS release to support 32-bit binaries.


> High Sierra is the last macOS release to support 32-bit binaries

"Without compromise". We'll see what this means later, but most likely it'll mean that macOS won't ship with a 32-bit runtime and you'll be able to download as you do with Java.


The PPC -> Intel transition also happened, and yes devs had to do work to make it happen, but it doesn't mean it is unheard of or impossible.


To be fair, their PPC-on-Intel emulation that let old versions of apps run (and the entirety of OS9, IIRC) was a technical marvel on a number of levels - especially considering the hardware limitations of the early chips.


Mac OS 9 never ran on Intel. That was a separate migration path, where Mac OS X would run a Mac OS 9 VM, but this "Classic" experience never made it past PPC.


My apologies, I'm remembering wrong.

On PPC you could either dual-boot or run OS9 (and 8?) apps seamlessly on an OSX desktop, which was pretty impressive.

The seamless emulation of OSX-PPC apps on an Intel processor was extremely impressive though. I remember the majority of stuff working surprisingly well with little slowdown (though this might now be rose-tinted).


Mac OS 9 could run apps written for anything from System 7 onwards, and sometimes even System 6 apps though that was hit or miss. It could even run 68k binaries.

Mac OS 9 inside of “Classic” (the VM that ran OS 9 inside of OS X) wasn’t especially seamless, but what was seamless was “Carbon”, a transitional API that allowed developers to build apps that ran natively on both Mac OS 9 and Mac OS X. It didn’t take nearly as long to port code from OS 9 to Carbon as it would have taken to port to Cocoa, so many early OS X apps were Carbon ports.


as did 68k -> PPC a decade earlier.


Safe to say Apple has quite a bit of experience in the architecture migration department.


> significant work by third parties outside Apple who haven't even heard of the possibility of this happening until today

I dunno, I feel like the writing has been on the wall for Apple to switch to their own ARM chips, for 3 or 4 years now. At first it was "yeah maybe someday", but by this point, I'm just surprised they're waiting until 2020. I was hoping the first ARM Macbooks would be this summer. (Really, last summer, if I'm being honest).


Agree, this was obviously something that was going to happen. My prediction was 2019 though. My thought was that they probably wanted to get more developers over to using bitcode first, as well as ARM being almost at the desired performance but not quite there yet.

In 2020 I doubt anyone will see any problems with ARM performance on a desktop or laptop.


I remember Chris Lattner giving some details about bitcode a while ago, and according to him it's not possible to recompile an app for a different architecture having only the so-called bitcode.

Edit: Found the source: http://atp.fm/205-chris-lattner-interview-transcript/#bitcod...


The article doesn't say that Intel will be gone in 2020, just that Apple will be shipping macs with their own chips starting in 2020. It will likely start with one or two models available for devs, then a launch that covers more of the platform.

Doesn't all have to happen at once, and Apple has been through this before in the PPC-Intel switch, which went rather well.


It will obviously be an ARM transition not x86: https://medium.com/@Jernfrost/in-3-years-apple-will-switch-f...


What is so "obvious" about it?


The fact that Apple develops and ships hundreds of millions of ARM processors every year, which are widely regarded to be the best ARM processors made, and they design and ship no X86 processors?


That alone doesn't make it "obvious" as applied _to a Mac_, which is a very different kind of device that caters to a different audience that often has special needs. See our other discussion threads elsewhere.


That is what I explain in the link. I saw this move coming two years ago when I wrote that article. I noticed how regular users were using iPhones and iPads for almost everything they do. I noticed how office apps like Pages, Keynote and Numbers worked without any performance issues on iPad.

I concluded ARM was fast enough for 90% of users. Once you don't have the restriction of battery life and small enclosure it is not hard to imagine that Apple could beef up ARM a lot for their desktops and laptops.

Why run multiple CPU architectures when one does the job and is much cheaper?

I've been thinking hard about what kind of workload ARM can't handle and I can't think of anything. OK... there is one: first-person shooter games. But iOS is a more successful gaming platform than the Mac.


Macs have at least two major uses that might present technical problems: native MS office apps, and VMs for developers.


> it's unlikely Apple could make an x86_64 chip

Why is it unlikely that Apple would make a x86 chip? Because of IP/licensing, or for technical reasons?


IP/licensing.


I would argue technical reasons too. x86_64 is a huge and complicated beast.


And cost. Making something you can license to others and/or sell to 250000 users each year has a completely different cost structure compared to something you use only in your own 20000 machines (source: [1]). Macs are already overpriced; I don't know how long Apple could sustain in-house production of proprietary chips without hitting their users. They're probably going to ARM.

[1]: https://www.idc.com/getdoc.jsp?containerId=prUS43495918


I think you might have misread the table at the page you linked. Above the table, it says "Shipments are in thousands of units." According to the table, Apple shipped 20 million notebooks and desktops last year, not 20 thousand.


Doh... Same proportions but yeah, damn language barrier, over here the comma and point meanings are reversed compared to English. Thanks!


And the fact that Apple can already beat x86 on speed per unit of electricity with its own architecture.


That only matters for large data centers and mobile. For laptops and desktops, raw power matters.


Uh, actually not. It matters tremendously for laptops.


Shouldn't the patents be expired by then?


It could be that they are just trying to improve their negotiating position. Even if that's true, I'd wager that it is just the short-term contingency of an investment with a long-term view. I bet they have their eye on the end-game.

When I say that, I'm thinking of the old TED Talk by Amory Lovins called "Winning the Oil Endgame". I feel like there's some parallel one could draw between peak oil and process improvements. Intel has slowed down now that it's more difficult to extract increased performance with each iteration. I'm guessing someone at Apple is thinking about where they'd like to be positioned when the well runs dry.


> third parties outside Apple who haven't even heard of the possibility of this happening until today

This announcement is hardly a surprise. The writing has been on the wall for at least the last 5 years since LLVM replaced GCC in XCode. It became even more obvious when Apple started compiling to "Bitcode" so that they could deliver optimized binaries to devices. What that replacement meant is that it would ease Apple's transition away from any particular architecture - developers shouldn't have to do much if the app is installed from the App Store.


> The writing has been on the wall for at least the last 5 years since LLVM replaced GCC in XCode. It became even more obvious when Apple started compiling to "Bitcode" so that they could deliver optimized binaries to devices.

These weren't clear signs of an upcoming ARM switch:

• Apple has been purging software that's under the GPL for quite a while now, and GCC was probably high on the list since Apple (NeXT) had been bitten by its license before[1].

• The Intel switch in 2006 was extremely smooth even using GCC.

• Bitcode is not enabled on macOS, and even if it were, it's not abstract enough to recompile an x86 app for ARM[2].

[1] https://news.ycombinator.com/item?id=9158017 [2] https://news.ycombinator.com/item?id=9728162
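For what it's worth, you can check whether a given Mach-O binary actually embeds bitcode by looking for an __LLVM segment (a rough sanity check; the Safari path is just an example binary):

    # no output means there is no embedded bitcode in this binary
    otool -l /Applications/Safari.app/Contents/MacOS/Safari | grep __LLVM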


Is the IP/licensing situation particularly onerous here to prevent making their own x86_64 chip?


Yeah. There are only three extant architectural license holders (Intel, AMD and VIA) and they're basically in a circular firing squad of patents.

To make their own chips, they'd need to either license from an existing holder (which wouldn't let them tinker unless it was a partnership or they acquired the license) or they'd need to make something so incredibly great the other three would trip over themselves to use it, and bind themselves in the process.


If they wanted to go x86 they would just buy VIA? I think it will be aarch64


I’m not sure they can, for this purpose. I remember people talking about the idea of buying AMD, but it's possible the license terms say the license doesn't survive a purchase, making AMD or VIA worthless for that purpose.


Unless the licensing terms are extremely lopsided, they'd still own AMD64 patents which EM64T is based on, and intel would need to strike a new deal. VIA's situation should be similar, but with other technologies.


Without meaning to comment on whether it is likely or not, they could also easily buy AMD.

(Price tag would be in the ballpark of $15-20 billion, 3 months of income for Apple)


AMD's cross-licensing agreements are unfortunately, or fortunately depending on your point of view, nontransferable.

https://www.kitguru.net/components/cpu/anton-shilov/amd-clar...


I wonder if "non-transferable" covers tricks like Apple loaning AMD the cash for AMD to buy Apple. Then renaming the new AMD to Apple. Then it would pay itself back for the loans.


I vaguely remember that the licensing agreement between Intel and AMD (Intel licenses x86 to AMD, AMD licenses x86_64 to Intel) expires if AMD gets bought. Apple could renegotiate, but it would not be straightforward, I think.


> If they wanted to go x86 they would just buy VIA?

In theory, but then VIA is a subsidiary of Formosa Plastics Group, so they'd need to either negotiate the sale with an entity with which they've locked horns in the past or buy an entire petrochemical group???

They'd probably have an easier time buying AMD.


Right, same group that owns HTC. They did have some big-ish lawsuits going with Apple some years back. So maybe not that easy, in practice.


Money doesn't smell to them; memory in the business world is short if the price is right.


Replying to sibling comments.

They can't buy AMD for the patents (nobody can), as the licenses covering the parts of x64 that AMD doesn't own will be voided if AMD is ever acquired.


Yes. Everything since the Pentium is still encumbered. Intel and AMD have extensive cross-licensing agreements. Anyone wanting to build an x86_64 chip would need to either purchase a company that already has licenses or negotiate with both AMD and Intel.


Couldn't Apple easily buy AMD? (Not to say they wouldn't have a million reasons not to.)


AMD will lose its license in case of a sale as per their agreement with Intel.

Intel will also lose access to AMD patents.


Yes. x86_64 is younger than the patent lifespan, anyway.


I wonder if this is more about coprocessors and other chips than the main CPU.

I know the iPad Pros certainly outperform many cheaper Intel chips while using lower power.

But I doubt they would want to split the line into half ARM half Intel, or move the Mac Pro to ARM.


Apple has been trying to chase power users and professionals off of their platform for years. A switch to ARM is an opportunity to rid themselves of the few remaining holdouts and focus 100% on high-end consumer electronics.


I would say that is more about the boxes they use than the CPU. The difficulty of adding your own hard drives, memory, graphics cards, etc. to a Mac is the biggest problem, I think.

What pro task really requires high single-thread performance? I imagine Apple could match Intel by simply using more cores on their ARM CPUs.


Audio, from what I know, is best with multiple high-speed cores rather than many lower-speed cores. (I use my machines for audio.)


And what are your processing times like? I use my Mac for video editing, but I have never had any problems doing that smoothly even on much weaker Macs.

Only issue I have on Mac is compiling large programs.


The issue is with live streams of audio, or virtual instruments with CPU-heavy effects, using a very small amount of buffering (typically 64 samples @ 48kHz, which is about 1.3ms). Also, processing of the instruments and effects for one track (plus the relevant busses) has to happen in series, so more cores don't help a resource-hungry signal chain. But of course they do greatly help the overall effect/instrument count across many tracks.
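A quick back-of-the-envelope check of that buffer latency, just with shell arithmetic:

    # 64 samples at 48 kHz, expressed in milliseconds
    echo "scale=2; 64 * 1000 / 48000" | bc    # prints 1.33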


True, I miss the old Powermac towers: Workstation class machines, but with low-end configurations for those of us on a budget. When I was faced with the choice between:

- an iMac that didn't really meet my needs

- a ridiculously unaffordable Mac Pro tower

I jumped over to Linux running on commodity PC hardware.


Oh good, I see the downvotes are rolling in now (I was concerned after receiving a wave of upvotes yesterday). I've found there is no more reliable method to collect downvotes on HN than to say something critical about Apple.


I wouldn't know, but hasn't Apple made its Mac App Store apps architecture-agnostic? I'd be surprised if that's still not the case.


No, the Xcode workflow is set up to make applications CPU-agnostic by default, but there's nothing about the Mac App Store that requires it. The executables are ordinary x64 binaries.

It'd be perfectly valid to write an application for the MAS in x64 assembly, if you want, as long as it's sandboxed and doesn't touch any private APIs.
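If you're curious, you can inspect which CPU slices a Mac app actually ships from the shell (Calculator is just an example path):

    # both report the architectures present in the Mach-O binary
    file /Applications/Calculator.app/Contents/MacOS/Calculator
    lipo -info /Applications/Calculator.app/Contents/MacOS/Calculator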


Does anyone remember Intel's random rant about how "emulation of x86 violates our intellectual property" from last year?

https://newsroom.intel.com/editorials/x86-approaching-40-sti...

Intel didn't name names, and everyone at the time assumed it was related to Microsoft and their efforts to ship windows 10 on ARM with a 32 bit only x86 emulator.

But I wonder if that rant was actually aimed at Apple. Microsoft's emulator only needs to target an older subset of x86, one that could avoid any patented instructions.

If Apple is planning to move from x86 to ARM, they will need an emulator that supports the full 64-bit ISA, with some relatively recent extensions that all macOS applications target by default.


The timing and the wording make me fairly sure this was MS & Qualcomm related, not Apple related. For example, it says "there have been reports that some companies may try to emulate Intel’s proprietary x86 ISA without Intel’s authorization".

Also, the level of secrecy at Apple is so high that I'm not even sure anybody outside knew anything concrete this summer. And if they somehow knew, why would they try to dissuade them publicly at this point? Things don't work like that at this level.


That would put Intel in the crosshairs of antitrust. They have a de facto monopoly on personal computer processors, and this sounds like an abuse of their market position to stifle innovation.


How would Apple vs Microsoft change anything in those circumstances, though?


> "A decision to go with ARM technology in computers might lend it credibility where it has failed to gain a foothold so far."

> "Apple is working on a new software platform, internally dubbed Marzipan, for release as early as this year that would allow users to run iPhone and iPad apps on Macs"

Three things here:

1) I'm OK with breaking the Intel near-monopoly on x86. I'm not OK with moving to a walled garden where Apple forces you to publish apps through their App Store with a paid dev account, etc. just for the privilege of reaching users on their platform. ARM doesn't necessarily mean this, but it is a different CPU arch. When Apple transitioned to Intel/x86 from PowerPC, Intel processors were performant enough compared to PowerPC processors to provide a pleasant emulated PowerPC environment for applications built for PowerPC. I don't think that a switch to ARM would provide this benefit, and afaik Intel's mobile offerings aren't that far off from ARM efficiency. So what's the benefit? Just vertical integration, I guess? Escaping Intel's backdoors and high prices?

2) iOS apps on OS X. Why? Does anybody want this? The way I see it, web apps are perfectly adequate for the desktop environment when it comes to stuff like checking my bank account or browsing Hacker News. I don't want to deal with a desktop app to do any of the stuff I can currently do via a browser. Is there actually a use case?

3) Given the hellscape of bugs currently present in iOS/macOS, does anybody have faith that Apple is going to be able to navigate a rewrite of macOS on this scale? It sounds like the sort of thing that requires a lot of talent and a lot of focus. Apple has the capital for this, but not the environment, imo.

Seems to me like this could be the nail in the coffin for Macbooks that's been pending since the merger of the macOS/iOS teams and the introduction of the controversial TouchBar/USB-C Pro.


“2) iOS apps on OS X. Why? Does anybody want this? The way I see it, web apps are perfectly adequate for the desktop environment when it comes to stuff like checking my bank account or browsing Hacker News. I don't want to deal with a desktop app to do any of the stuff I can currently do via a browser. Is there actually a use case?”

Right now many new desktop apps are just badly ported web apps wrapped in Electron. They are slow and eat a lot of memory, as all of their UI is being rendered in a glorified standalone Chrome tab.

This is less about iOS apps on OSX and more about making it easier for the iOS developer ecosystem to build desktop apps. Right now it’s web teams that are building desktop apps because for most companies it’s too expensive to hire a dedicated desktop team. Even big apps like slack/WhatsApp use electron.

Making it easier for iOS developers to build desktop applications with the APIs they currently use should hopefully lead to higher quality apps.


The best case scenario is that Apple has cooked up some sort of awesome translation layer between the Cocoa Touch APIs and Cocoa, letting iOS apps function almost like native macOS apps. Hopefully this will happen. But I won't let it go past hope.


Don't a number of iOS apps use essentially the same technique?


Most if not all of the top 100 mobile apps are native applications.

Hybrid mobile frameworks like Cordova tend to get a bad rap these days especially with the arrival of more performant alternatives like React Native.


I avoid installing apps for most sites, but I kind of remember in the early days, anyway, people were just sneaking a Web view in there for most of the functionality in many apps.


If you can build an iOS app, you can build a macOS app too.

Electron apps are not built by iOS developers, but by developers like myself who would rather hit 3 birds with one stone ;-)

And no, nothing would change, except the MacBook will get even more shitty. We'll fondly remember the 2015 MacBook Pro as the last model that didn't suck.


> 1) .... why? ...

You're already scratching the surface: vertical integration, backdoors and high prices, power and battery life optimization, less effort for an app developer to publish to all platforms, develop-once-run-everywhere, ...

Looks like you're looking for one "The Reason" - but there doesn't need to be one. If a layperson like you or me is able to provide 5-10 reasons, then it's likely there could be 100-1000 reasons internally, and all of them add up.

> 2) iOS apps on OS X. Why? Does anybody want this?

Of course yes. As an app developer and as a consumer, convergence and bringing my apps and data across all platforms is no longer an "optional" thing; it's mandatory even for a to-do list app, or email, or IM, and everything else. Web apps suck at power efficiency - see the situation with Slack/Electron/Chrome/others on the desktop, especially when it comes to hardware-bound work (video/audio, digital image and movie processing, HiDPI work and much more).

If you don't need all this, and all you need is just a Chromebook with a browser, that's fine; it has worked and will keep working. It also ties into why you're confused about "why this is required" in so many ways. You may not be the target audience here.

> 3) Given the hellscape of bugs currently present in iOS/macOS, does anybody have faith that Apple is going to be able to navigate a rewrite of macOS on this scale?

This one answers itself. If a fragmented platform doesn't work and has lots of bugs, then it makes all the more sense to invest all resources in one platform/arch to have better focus and fewer bugs to tackle. Everyone doing rewrites knows that there will be short-term pain, but that has to be balanced against the larger picture - otherwise we'll just keep hating new releases and there will be no solutions other than "let's do only bug fixes for the next 1-2 years", aka platform stagnation, and users still won't be happy :)


Good point on 1) -- basically, Intel has produced enough problems for a high enough cost that there's reason enough to try alternatives.

On 2)... I hate web apps. Probably much more than your average layperson. Electron is miserable in my experience -- laggy, high memory/CPU usage, non-native feel -- but I think a web app in a browser is OK. There are really two different use cases I see here. a) Take TurboTax, for instance. I don't want to download a TurboTax app for my computer. I'd only use it once or twice a year. But it works well in a browser. It's complex enough that a web app is justified. b) Spotify. It needs to interact with local files, and I usually have it up in the background. A web app doesn't work well for this. Unfortunately Electron doesn't work well for this either.

I think if this is executed well it could be amazing -- what if layouts scale beautifully onto a laptop screen, so I can use an iOS app instead of an Electron app for Spotify/Slack/etc.? If this happened, the benefits could trickle down into iOS, making it a more useful platform. On the other hand, layouts might not scale well to larger screens, and iOS apps on macOS could end up neutered and even less useful than a current web app. Hopefully the former happens, but lately Apple makes me feel like the latter is more likely.

3) I'm not really convinced that this will reduce bugs. In my experience, combining two pieces of software into one just makes the resulting monolith harder to reason about because it doubles complexity at high levels and increases complexity exponentially at low levels. But maybe that says something about my development skills :)


If I read that correctly, I don't see where you're getting the idea that Apple is making this move to replace macOS with iOS / a walled garden where nobody can create apps for the Mac except by selling them through the App Store. If Apple wanted to do that, they would just sell iPads and stop selling MacBooks.

As for 2), I think that's a nice convenience - if you're a developer you wouldn't have to worry about emulation.

> does anybody have faith that Apple is going to be able to navigate a rewrite of macOS on this scale

Is it really that crazy they would rewrite parts of their OS to target a new architecture? A large undertaking sure, but not that ridiculous... This is Apple, not some random startup lol.


Absolutely true-- Apple is, after all, one of the most valuable companies in the world. But I think recent issues in iOS and macOS hint that they might not be one of the most talented software companies in the world. If regular maintenance and feature updates are bringing their OSes to their knees, what will a massive spec change do?


Random, probably not true thought: large portions of their OS developers have been working on this rewrite, hence leading to the "increase" of bugs on macOS right now.


Not having to emulate iOS apps on macOS will be a boon for Xcode developers.


I doubt iOS devs are very high up on the list of parties to please with this move


iOS apps do not run via emulation on macOS; they are compiled for x86_64 and run as a normal process.


And it’s actually a plus that every iOS app already works on multiple architectures. If Apple ever wants to change iOS to run on a different type of CPU, it’ll be easy.


And they've already taken advantage of that, with the move from A32 to A64.


> does anybody have faith that Apple is going to be able to navigate a rewrite of macOS on this scale?

In their current yearly OS release cycle? Nope, not at all.


Why rewrite macOS? All they have to do is recompile it for ARM. We know they're already doing basically this for iOS.


>So what's the benefit? Just vertical integration, I guess? Escaping Intel's backdoors and high prices?

Yes, and perhaps better power/performance ratio. Possible iOS binary compatibility as well.

It should be an easier transition than from PPC -> Intel, since at that time, most big apps used CodeWarrior and had to transition to XCode along with the architecture move.
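To illustrate the mechanism, here's a minimal sketch of a fat ("universal") binary, the same trick Apple used for the PPC-to-Intel transition. The i386 slice stands in for whatever second architecture the toolchain supports (an arm64 macOS slice is hypothetical at this point, and 32-bit support depends on your SDK):

    # assuming a trivial hello.c; build two slices into one Mach-O and inspect it
    printf '#include <stdio.h>\nint main(void){puts("hello");return 0;}\n' > hello.c
    clang -arch x86_64 -arch i386 hello.c -o hello
    lipo -info hello    # lists both architectures in the resulting binary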


iOS apps on macOS are a good reason for Apple to incorporate touch screens. Today, they say the trackpad is superior because you aren't raising your hands up to the screen. Touch-optimized iOS apps would be a compelling reason to include this feature, and motivation for many people to upgrade their MacBooks.

Also, the Microsoft Surface Book has a detachable screen that is a standalone tablet. What if Apple is designing for an iPad to be the main screen, with a base that houses an external GPU, battery, keyboard, etc.? iPad production ramps up, iPad users can "upgrade" to a full laptop, and laptop users are automatically in the iOS ecosystem.


Maybe a trackpad with a screen behind it. Apple already has an A series chip running the touchbar. Why not an iPhone under the trackpad?


I believe it's a T series, though your point still stands. I think I'd personally far prefer a display under the trackpad (though I don't think it's necessary at all) to a TouchBar or a touchscreen laptop. But I think I'm very much in the minority.


You are right, T-series.

I'd join you in that minority.


2) macOS tablet.

Android apps on Chrome OS tablets are a thing now.

With macOS on ARM, the iPad Pro and MacBook lines merge, as the hardware internals converge.


Tim Cook has repeatedly stated that the iPad is "the clearest expression of our vision of the future of personal computing", and Apple only sells around 4M Macs per quarter, while iOS devices dwarf this stat. If their plan is to continue moving users from desktop/laptop devices to their iOS ecosystem, and the software can be transformed to cross between both classes of devices, then it only makes sense to unify the processor architecture as well. Especially considering the expertise they've brought in-house to accomplish this, and the success they've demonstrated with their "A" series of chips, complemented by some of the specialized co-processors which have begun popping up in their accessories.

I still expect that they'll offer devices in traditional form factors -- laptop-like devices and desktop devices -- if the market demands this. But most users don't care what's inside if it lets them accomplish what they're trying to accomplish.


I can't take that seriously without them demonstrating an equal amount of work on the software end of the iPad. I am a huge believer in the potential of the iPad -- I was blown away by the original iWork and even more so by Garage Band, they both demonstrated honest attempts at providing differentiation from the desktop computing model. But then they just kind of ... stopped developing the OS and associated apps. We all forgave them for making iPad iOS just a bigger iPhone iOS at first because they were otherwise firing on all cylinders, but then they proceeded to continue ignoring this OS for over 5 years, only now providing what is in my opinion the absolute bare minimum features to make working on the iPad tolerable (multi-tasking, etc.)

It appears that Apple, at least when it comes to the iPad, is increasingly unwilling to deliver new software that isn't explicitly tied to a new piece of hardware (and thus a hardware sale). We got writing APIs in order to sell $99 Apple Pencils, but never an actual vision for what it would really mean to do professional work on the iPad. And at some point, it stops seeming like there's some master plan waiting to be unveiled just around the corner. The iPad has been around for 8 years and the fundamentals of using one have largely remained the same, the same as the original subpar experience we excused since it was obviously a 1.0 product. The iPad has basically gotten thinner, faster, and gotten a pencil (which don't get me wrong, is great). iPad sales reflect this: iOS DEVICES may be doing amazing, but iPad sales have plateaued or declined repeatedly. Apple seems eternally confused by this, as if they think the product is done. They begrudgingly offer a "new" iPad with older parts and slightly cheaper price as if people just don't get what they're missing.

But what's missing is a message, a message other than "this is obviously the next iteration of the computer, why aren't you buying them even though we hardly update the Mac either?". The message and vision would inspire Apple to lead by example in the software they produce for it.


The last few years of iOS development have arguably added more big features to iPads than iPhones. Split screen, picture in picture, new gestures, drag/drop, new multitasking workflows, Pencil, etc.

Yeah, it felt a little stuck for a while, but iPads are ridiculously more useful and viable as work platforms than they were in the iOS 8/9 days.


I still expect that they'll offer devices in traditional form factors

What are iOS developers supposed to develop on?


"What are iOS developers supposed to develop on?"

Presumably laptops and desktops with Apple chips inside.


According to the article, those would still be using Intel chips.


"According to the article, those would still be using Intel chips."

Where do you see that?


Buy two iPads and use one as a keyboard.


I don't know if you're being sarcastic or not :-)

That said, CPUs have gotten so cost effective that it makes possible something which has come in and out of favor for a while, which is a network of devices that do one thing co-operating as a larger system.

So lets say you buy your "iDisk" which is a storage brick that you can put on a short range wireless network. A couple of iPad monitors, a wireless Apple keyboard, and a wireless Apple mouse. You set up this box of stuff and arrange it around on your desk and it is essentially a "single" computer system perceptually which, running a development "app" could work fine for development.


We've been dreaming about this kind of system since the rumors were flying about the Apple "Brick".

The "brick" turned out to be the all-aluminum MacBook, being carved from a single piece of metal.


You'll need the screen to keyboard dongle to accomplish this.


Need at least three in order to emulate a dual monitor setup.


Their own cloud IDE and build farm ?


You don't want to compile directly on your iPhone?

Maybe someday they'll make a cross-platform Xcode.


This is too optimistic but if they don't care about selling macs maybe they'll drop the requirement to develop on mac?


A Mac emulator running on iOS.


I don't see why a separate desktop device is needed at all.

Give us a MacOS-like desktop mode. We could connect a mouse and keyboard using bluetooth. We could airplay to a display. Even better, replace lightning with thunderbolt so we can connect our phones to an external GPU.


More importantly, Apple has clearly demonstrated several times that they want to own a fully locked-down platform where they approve every piece of code and collect a percentage from every sale. iOS devices are that - everything from DRM-protected hardware to App Store apps pays a tax to Apple. Users of desktop platforms are resistant to that (the Mac App Store failed), so it's easier to just introduce a larger iPad to push people over.


Apple has been preparing itself for independence from the underlying CPU architecture for a while. At WWDC 2015 [1], they announced bitcode. App submissions to the App Store are compiled to bitcode. This allows Apple to re-optimize/recompile your app binary in the future for newer hardware without the need to submit a new version of your app to the store.

This allows Apple to switch to intel chip on iPhone, or switch to an ARM chip on MacOS.

[1] https://thenextweb.com/apple/2015/06/17/apples-biggest-devel...
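For context, bitcode is just an Xcode build setting; something like the following opts an archive build into it (the scheme name here is a hypothetical placeholder):

    # ENABLE_BITCODE is a standard Xcode build setting; MyApp is a placeholder scheme
    xcodebuild archive -scheme MyApp ENABLE_BITCODE=YES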


> This allows Apple to switch to intel chip on iPhone, or switch to an ARM chip on MacOS.

I think you're assuming way too much. Says Chris Lattner regarding Bitcode, "It's useful for very specific, low-level kinds of enhancements, but it isn't a panacea that makes everything magically portable." [1]

[1]: http://atp.fm/205-chris-lattner-interview-transcript/#bitcod...


I believe Lattner said there was interest in making a CPU architecture independent bitcode though. At least bitcode is part of the way.


Similar to how Bell Labs' Inferno worked. The kernel-to-userspace interface was a register-based VM called Dis. The entirety of user space was written in Limbo, a predecessor of Go (both co-designed by Rob Pike), which allowed the whole of userspace, and thus the bulk of the OS, to be completely CPU independent. A JIT kept things speedy, and the performance penalty was about 30%. So your entire OS stack was CPU independent, which allowed applications to be written on an x86 machine and then run on an ARM box with zero recompilation. True write-once-run-anywhere.

I have it built on my raspberry pi B+, Linux 32bit chroot, and there is a prebuilt Windows image available that you unzip and run.


Pity people still perpetuate the horrible naming of software we've had since the '60s... How can any outsider take us seriously?


I saw the light when I read the Plan 9 intro. And Rob Pike was right: the operating system still matters. In fact, I'd say it's more important than ever in the age of towering software stacks. We live in the distributed internet age but are hamstrung by '60s OS design.


> This allows Apple to switch to intel chip on iPhone, or switch to an ARM chip on MacOS.

Not easily. It's optional on iOS [1], and doesn't support completely new (non-ARM) CPU architectures. On macOS, the feature doesn't exist at all.

[1]: https://help.apple.com/xcode/mac/current/#/devbbdc5ce4f

In theory an LLVM-based bitcode-like system could allow Apple to change CPUs and automatically recompile App Store binaries without developer intervention, but their current bitcode system does not support that.


I just checked the app store on my Mac. I have just 1 application installed from the app store: Xcode. Even then that's just because Apple makes me have it installed to have command line tools such as git.

I think it's safe to say that the vast majority of applications (Microsoft Office, games, browsers, text editors, IDEs, etc.) installed on Macs are sourced from places other than the app store.


You can get the command line tools without installing Xcode from the App Store by running the command: xcode-select --install
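For anyone following along, the whole dance is just (real commands, no App Store involved):

    xcode-select --install    # installs only the Command Line Tools package
    xcode-select -p           # prints the active developer directory
    git --version             # confirms git works without the full Xcode.app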


Well, there goes the only App Store application then.


Well. We'll see what they do, but Apple had better be very careful. If a lot of OSS software won't run on Apple's new chip, then it becomes very hard for a software developer like myself to stick with Apple. And it is developer energy that sustains the software ecosystem for Apple....including many developers who are in the Mac ecosystem to develop for iOS.

This is not necessarily the wrong move for Apple, but it is fraught with danger. Here be dragons.


Much of the OSS stuff people run on macs - like the stuff that's in brew - already runs on various ARM based platforms. That's why your little raspberry pi can run a 'real desktop'.


OSS software can "just" be recompiled though.


In theory yes, but back in the PPC days a lot of stuff was not available even though theoretically it could have been.


This was complicated by the fact that PPC was big endian and Intel is little endian. If software assumed byte order when twiddling bits, etc, you could not just recompile it.
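You can see the difference from a shell prompt; this little probe prints 1 on a little-endian machine (x86) and 16777216 on a big-endian one (like PPC):

    # write the bytes 01 00 00 00 and let od interpret them as one 4-byte integer
    printf '\001\000\000\000' | od -An -td4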


If iOS is by that time the only mainstream way to make money from developing software (it's already happening), people will care very little about OSS there...


By definition, if the maintainer of some OSS software doesn't have the time/energy/willingness to make the software run on Apple's new chip, you can do it yourself.


True, but right now I don't have to do it myself. Things like brew just work. If the proposal is to throw away something that works, and add the burden of doing it myself to my to-do list, that doesn't sound at all appealing.


Brew is maintained. If there is indeed an architectural switch, you would expect Brew to be one of the first projects to actively embrace it and make things just work.
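In practice that's already how it works today; you can force Homebrew to compile a formula locally instead of pulling a prebuilt bottle (wget is just an example formula):

    # -s / --build-from-source compiles on the local machine for the local architecture
    brew install --build-from-source wget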


Keep in mind that Apple has a veritable army of ARM developers, thanks to an iOS App Store which:

1. doesn't suck

2. pays very well

3. has already demonstrated multi-platform apps across iPad / iPhone


But unless the software has embedded assembly, most likely you can just recompile it and run it fine. Even easier if it is written in languages other than C/C++.


This looks to be a very good move to optimize and significantly improve Apple products' security and reliability.

I do not understand why we argue about which is better between Mac, Windows and Linux. It all comes down to the scope of what we're using it for. All three of these OSes have complex parts, and arguing which one is better is not a good conversation to have. The market already says it all through the choices people are making. These three OSes' feature development will always follow market demand. They're usually not that far off from each other on important features, and whether they choose to focus on or optimize them is a business decision.

I have used all 3 and I like different parts of all 3. I currently use mostly my MacBook Pro, iPhone and sometimes iPad, as they're very reliable, high-quality products. They mostly get my job done without a hiccup. I would choose the product that gets my work done best in the least amount of time so I can focus on other, more important things. My wife, however, prefers her Windows laptop & Android phone, as she can do her work better there, and she prefers Android because of its flexibility.


Apple does have a good record of moving their OS and software across different architectures. 68xxx to PPC to Intel was not a horrible experience. Having gone through those transitions, I don't remember a lot of pain - but maybe I am repressing it!


It is fairly common for newer versions of OS X to break software that worked on previous versions. If you try to use Photoshop 6 on a modern version of OS X it isn't going to work. I think this makes the chip transition less painful because no one really expects their old software to work anyway.


> ... no one really expects their old software to work anyway.

My experience with HN comments leads me to believe otherwise, at least among professionals. :)


Well, everyone (including myself) likes to complain about it, but there is a huge difference in how many OS updates it takes to break a legacy application between OS X and Windows. If we really expected OS X updates not to break things, we'd move to a different OS.


PS6 used the old Carbon framework. This was developed as a stopgap to let developers transition from classic Mac OS to OS X in the early aughts. It goes to show how little respect Adobe had for Apple and Mac users back then that Apple ended up purchasing and developing the Final Cut Pro and Logic Pro IP.

Microsoft Office was another drag on the Carbon library. They also chose to keep their applications in the old format even though they were told repeatedly by Apple that it was going away. Eventually they got their shit together, and the Office suite for OS X is now best in show.


Not really an excuse when Windows 10 can run most Windows NT 4.0 software and can even be made to run Win 3.1 and DOS apps (which are different OSes, not just older versions).

Userspace GNU/Linux is also pretty bad about this, despite the effort the kernel puts into not breaking userspace.


A few simple apps here and there is not "most Windows apps". I keep a Windows 98 VM around simply because developers back then played fast and loose with the security of the OS - for example, a user-editable .ini file in the Windows folder. They would not be able to do that today.


Yeah, Windows 98 is a completely different OS from modern Windows, so it makes sense that compatibility is not strong. It would be like expecting OS X to run classic Mac apps.


I've been through the PPC to Intel migrations so many times for so many school districts that I'm still fairly amazed at how well it usually went as long as the apps weren't trash.

The biggest issues I ran into were always with the low-quality edu applications. So many of the issues we had with those apps were because they were created on Windows and ported halfway to OS X, were an old PPC version that did something special the Intel version couldn't do, or - the worst one - used a special baked-in version of Adobe Flash...

/shivers


Hey, do you remember which title used a special baked in version of Flash?

Asking for a friend ;)


I've been out of the edu game for a while now. I want to say it was the Map Testing system. The school district was using a web based system (something lightning? I can't remember) previously, but the state was mandating the use of MAP instead. The state didn't give the schools any options or much time to prepare for it either. The school district had 70+ schools. I spent a summer fixing / getting that stuff working. (still more fun than iPad deployments)

The app wasn't a universal binary at the time. Which would be fine if the PPC version worked with the PPC machines, and Intel worked with the Intel machines. haha.

*Edit: The Windows version did usually work without issues. The OS X version eventually worked once they redid the whole application.


Oh good, it wasn't mine!

Edit: maybe I should explain a little. We were an educational software developer doing multimedia titles. We had a home-grown multimedia engine which served us well (at the time there weren't alternatives), but Flash came along, and at the time they would license the engine as a C or C++ library, which we could embed. This got us a capable engine with superb integration with the content-creation tools.


phew, I was hoping MAP wasn't going to come back to haunt me after all these years, haha.

If your application let students finish what they were doing without randomly crashing and losing everything, then it is leaps and bounds better than the MAP system.

The school had to extend the MAP testing by 2 weeks just because of how often it crashed and students had to retake it. My coworkers at school districts in other states ran into the same exact issues. Then on Windows, it had to access an SMB share somewhere to dump data to a flat DB... which would get corrupted sometimes when a student's application froze up... haha. It was job security though.


> 68XXX to PPC to Intel was not a horrible experience.

Unless you rely on a piece of software that does not receive updates anymore. Then you're basically screwed. I am rather annoyed that Apple is going to break backward compatibility yet again for no good reason. Intel Macs do a perfectly adequate job, and Intel couldn't really screw Apple because Apple is a rather good customer.


Intel already has screwed Apple as far as they're concerned - there have been numerous delays in macs attributed to Intel's chip delays. Apple isn't one to take kindly to holding up a significant product line for an external vendor if they have an alternative.


Maybe a bit tangential, but from an economic standpoint it's interesting to compare Intel's market cap to TSMC's, given that TSMC is purely a foundry. Nowadays what 10nm or 14nm actually means is a lot fuzzier than it was for previous nodes, but the general consensus seems to be that Intel's fab-tech lead is either pretty precarious or already gone, so I'm just going to assume they're even (which is admittedly a poor, oversimplified assumption).

Then TSMC and Intel are pretty even, and it's mildly interesting to extrapolate all manner of conclusions from that.

Intel has slowly opened its fabs to outsiders; however, the first customer was Altera, which Intel now owns, so... The main point is: in 20 or 30 years, is Intel's main business going to be fabbing their own chips, or someone else's? I dunno, I just enjoy following the industry.


If TSMC's fab is as good as or better than Intel's, then fabless companies like AMD and Apple will reap competitive benefits over Intel, which now has to have IP for CPUs it sells as well as IP for its fab business.


I personally hate closed fences with no room for customization, and many computer scientists are on the same wavelength.

Windows is an awful choice for software development, a job that usually requires a lot of interaction with Unix servers. Microsoft's recent crush on Linux and open source can't fix many years of closed standards, closed protocols, and an Embrace-Extend-Extinguish culture.

And macOS, even though it's still loosely based on a *BSD flavor of Unix, still falls into the "closed fence" category - brew is nice but can't be compared with the maturity of the package managers on any Linux distro; you can't just recompile or update the kernel on the fly to get support for a new device; and the OS internals are purposely obscured from developers. Paying all that money for such a closed box, if you're anything more than an average user, is simply a waste of money.

Linux distros are by far the best environment for a computer scientist - made by developers, for developers - but there's no major vendor that sells machines with Linux pre-installed and supported, and getting Linux installed on the newest Surface or Dell toy is often a challenge that leaves many frustrated.

There's definitely a huge gap in the market, and as a computer scientist I feel that none of the major vendors cares to provide me with a solid machine to do my work.


Here's the problem with saying "this is better for computer scientists": It's the same argument every time someone says "the Macbook Pro is not a professional device". Not every computer scientist has the same needs. There are a lot of CS folks who will never need to recompile a kernel or even get into the OS internals. For those people, having to do that is a major downside to any computer. So the obvious choice is a computer where that's never necessary.

Linux is a great OS. I spend all day SSH'd into it. From a Mac. Because my workstation needs to work, and it's not going to work if I'm copying and pasting kernel code from Stack Overflow to get my trackpad to scroll the correct direction. I'm not a C programmer, so keep me as far from the kernel as you can.

The best environment for a computer scientist is the one that lets them get their work done in the most comfortable fashion. And for most of us, that's Linux on the server and something else on our workstations.


Couldn't agree with you more. Ubuntu was my daily driver from the start of 2007 through early 2014, when I finally gave in and bought a rMBP. Despite missing a "real" *nix environment right under the GUI, I have been much, much happier on the Mac.

I fought software-update and driver issues for years. I ran the most bog-standard HP laptops that I could get, albeit with Nvidia GPUs. Every distribution upgrade, something went wrong - one of the usual suspects: video driver, power management, or wifi. Things were never completely broken; it would be something like not being able to close the lid, sleep, and properly recover. Or not being able to connect to wifi when coming back from sleep. Or not being able to stay connected to wifi after a couple of hours and needing the kernel modules to be removed and reinserted.

I kept a set of hard drives so that I could always move to the next version of the OS on a clean install. I could fall back easily if the next release presented problems. I would tinker and fix the sort of issues you get with these random outlier driver problems and press forward. I had a good regimen going, but ultimately it was the power management issues that wore me down. Randomly finding the battery dead because the laptop didn't sleep when I closed the lid or not being able to properly return from sleep was just too much.

I've been running Linux for 20 years and I'm not willing to put up with these problems on my workstation. It seems like it must be a very small minority that can.

If you're going to have an actual destkop machine maybe you can avoid most of these problems. And maybe these things have been solved finally for laptops, but I wouldn't know, I moved on and haven't looked back.


> I've been running Linux for 20 years and I'm not willing to put up with these problems on my workstation. It seems like it must be a very small minority that can.

What makes you think everyone faces the same issues as you? Maybe the problem is your specific hardware? I've been running Linux-based distros as my daily driver for over 10 years and have literally never had any of the issues you described. I never had issues with power management, wifi always worked out of the box on all the distros I've tried, and suspend/resume always worked flawlessly. If you want a laptop that works well under Linux, buy a ThinkPad or a Dell Developer Edition laptop.


I used Linux as my desktop OS for around ten years.

I learned, the hard way, the truth of the cliché: Linux is free only if your time is not worth money.


>Rather than a "small minority that can".

I'd also like to add my voice as one of those who have used linux for over ten years with excellent out-of-the-box support for hardware. Often, linux will even offer a better experience by including support for obscure hardware out-of-the-box. No internet connectivity or hardware driver media required.

I realize my experience is anecdotal. However, for me and many others it seems, linux just works.


I just bought a Thinkpad T480 to replace my aging (but still functional) T410. I installed Debian (using the nonfree installer to get the wifi firmware), and everything just worked right out of the box.


> and it's not going to work if I'm copying and pasting kernel code from Stack Overflow to get my trackpad to scroll the correct direction

When was the last time you used a linux desktop, 2005? On any linux desktop I can think of (MATE, Gnome, KDE) this is a matter of going to Settings | Mouse & Touchpad | check or un-check the natural scrolling box as desired


Actually, I just installed the latest Lubuntu and am having this exact problem. The preferences don't have a way to select natural scrolling, and I've been "fighting" with it for a couple of days now, changing mysterious values in config files and restarting X. Then, when I think I have it figured out, a system update happens and it resets back to the original scrolling.

This is the real value of macOS. I agree that I liked it better when it was more Unix and less Apple unix-like, but it's still better than fighting with your Linux config on an x86 machine.


Click the Ubuntu (start) button. Type "mouse". Click on Mouse & Touchpad. Check/uncheck natural scrolling.
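The same toggle is scriptable, too, at least on GNOME/Unity-based desktops with libinput (a sketch, assuming the stock GSettings schemas are present):

    gsettings set org.gnome.desktop.peripherals.touchpad natural-scroll true
    gsettings set org.gnome.desktop.peripherals.mouse natural-scroll true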


Ubuntu had an option for a couple years that let you choose, then they removed it... (bastards)

I tend to use it everywhere... running a Hackintosh on my desktop now, an MBP for my personal laptop, and my work-assigned machine is another MBP.

However, if they move off of x86, that may drive me back to Linux to be closer to the prod/deploy environment.


Do either of these work?

- https://askubuntu.com/questions/604002/mouse-wheel-scrolls-i...

- https://askubuntu.com/questions/819662/how-to-invert-touchpa...

(They're mutually exclusive; I put the 1st one 1st as undoing that one is a case of rerunning the command replacing the '1' with a '0')

If the first one succeeds, try putting the command you used into ~/.Xdefaults and see if it sticks. The 2nd approach puts info into a file anyway.

If these fail, what desktop environment are you using?


The problem is not "had to go copy/paste something from the internet to make it work".

The problem is "didn't work out of the box". Using Linux as a desktop OS can be a death-by-a-thousand-cuts experience. Sure, whatever problem you run into there's probably someone who's figured out how to solve it and posted instructions online. But if the solutions are known, then the operating system shouldn't have those problems out of the box.


I agree. In this case I think extra factors must be taken into consideration.

First is the principle of least surprise. Linux, as a UNIX, is traditionalist to a fault, sometimes obsessively so beyond what is helpful. Scrolling is typically done inverted in X11.

So, that means this would need to be a configurable option.

From the standpoint you're talking about (where the comparison is to macOS), now you're in UI territory, which has typically never been one of Linux's strong points.

Someone has to go build the UI, but then that'll only be for one given desktop environment, and so now you have duplication of work. Unless every DE does that work, if you're not using one of the DEs that has done it - well, you get no radio group or dropdown or whatever.

There are very real reasons why Linux isn't best in class. The politics, infighting and greybeard traditionalism do more than anything to hurt the "it just works" flow that other environments have.

(Note that I'm not saying "politics", "infighting" and "greybeard traditionalism" are the same thing; they're very different groups and issues, but they combine to form stalemates and inertial cancellations. For example, when traditionalism met systemd, the result was a vicious catfight instead of unified action. Things could have worked out where the community met Red Hat's lobbying with a list of terms and requirements. Nah; everyone just threw their hands up in the air and screamed instead. This is what's broken in Linux. I think KDE had it sorted in 1999 when they pragmatically grabbed the still-closed-source Qt and made a desktop environment out of it. Then GNOME came along and drowned everyone in more political overhead than the W3C and Internet working groups combined, and the user experience is definitively worse for GTK ever having been created.)


Make a bash script with these commands to fix it:

    # enable natural scrolling on the touchpad (the numeric property IDs are device/driver specific)
    xinput set-prop 'ELAN1200:00 04F3:301A Touchpad' 308 1
    xinput set-prop 'ELAN1200:00 04F3:301A Touchpad' 295 1

You'll need to replace 'ELAN1200:00 04F3:301A Touchpad' with your touchpad device name, which you can find out with 'xinput list'; the property numbers can also differ per device, so check 'xinput list-props <device>' as well.
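For what it's worth, with the libinput driver you can usually address the property by name rather than by those magic numbers, which is less fragile across devices (still substitute your own device name):

    xinput set-prop 'ELAN1200:00 04F3:301A Touchpad' 'libinput Natural Scrolling Enabled' 1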


It kind of works, but it's not as stable as the Mac's touchpad. You need to spend a lot of time to make 4-finger, 5-finger, zoom, and pinch gestures work reliably. There are also endless weird bugs you have to track down. I had Ubuntu and openSUSE crashing right after install.

Traveling and working remotely, battery power is very important. You can't shut down or sleep your computer seamlessly. It's a simple thing. It should just work.

Yes, you can customize it as you wish, but I don't want to. I want to spend my time on actual coding and have time for myself.


>Yes you can customize it as you wish but I dont want to

This was a big moment for me. I was a huge anti-Apple guy, I had built my own laptops and used Android and everything had to be open and hackable. Until one day I just realized it wasn't fun anymore. I would have to stop what I was doing to fix some broken (virtual or real) duct tape that held everything together.

I bought my first Mac the month I replaced the CPU, then the motherboard (wrong socket), then the RAM (incompatible with my new mobo), and then was staring down a dying video card in my desktop. I realized I didn't want to fix it. I picked the wrong parts because I didn't care anymore, I didn't care enough to do the proper research. I spent all day fixing computer problems, I hated doing it on my own time too.

What I want is a computing appliance that lets me do my job and nothing more. I don't have that yet, I still have to deal with system updates and reboots and filesystem maintenance and all that nonsense with my Macbook, but it's closer. I've been considering a Chromebook but I have had bad luck with Google products in the past.

Not a month goes by where I don't re-read Mark O'Connor's "I swapped my MacBook for an iPad+Linode" series and find myself agreeing more and more.

http://yieldthought.com/post/12239282034/swapped-my-macbook-...


I think this depends on the specific flavour of Linux. For instance, Linux + i3wm yields a desktop environment where everything -- including seemingly braindead functions like volume control -- needs to be configured by hand on a command line. Other combinations might be more extensively configured by default but even then it's not hard to find something broken.


Used Linux on the desktop through college before switching to a Mac. It’s still not great. Maybe worse now, considering that now you’ve got X and Wayland to choose from and GNOME has somehow alienated everyone and fragmented. How do you set things up so you can use Remote Desktop/VNC into a GNOME desktop using Wayland?


GNOME has a built in Wayland-compatible VNC server. Install the gnome-remote-desktop package and open the Settings app and enable Screen Sharing.
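On Debian/Ubuntu-style systems that's roughly (package name as above; enabling still happens in the GUI):

    sudo apt install gnome-remote-desktop
    # then: Settings -> Sharing -> Screen Sharing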


Exactly. Sure, if you're trying to revive some old notebook with a Linux distro, you're going to face issues (no different from trying to install the latest macOS on Apple's earlier machines). With a modern machine, Linux is a very reliable and comfortable OS that doesn't restrict you in terms of hardware. The only minor criticism I have is that it lacks a high-quality spreadsheet application like Excel (but with cloud-based solutions catching up, that is becoming a non-issue).


Does LibreOffice's open-source spreadsheet not provide the functionality you need? I took a small HP Stream, replaced Windows with Pop!_OS (from System76), and the spreadsheet comes as part of the package. As far as I know, it's also easily available with vanilla Ubuntu.


It's 80% there, but for an advanced Excel user, LibreOffice Calc feels like Excel from 2005. I'm sure it can do everything advanced users need, but it's slow and clunky at times.


In my experience, having the setting doesn’t make it work. Oh, it works some of the time. In order to make it work MOST of the time, you’ll need to change it in more than one place, at least in X11. And Wayland doesn’t play nice with nvidia, yet. (Not implying who is at fault).


I've never had to change it in more than one place on my xps 13


I never got it working all over the system in ubuntu 17.10 (I just resigned myself to testing scroll direction in each app), but in 16.04, there were XKB options, Unity settings, separate Gnome settings (because Gnome ignored the X11-level scrolling!), separate KDE settings (though KDE at least obeyed the base X11 options), and at least an app or two that had their own settings, or used some lib that implemented its own scrolling and also didn't have options to control this.

The thing is, though, desktop linux users are just used to this. Your choices boil down to:

1) do a lot of research and setup and then constant maintenance to restore settings that updates casually break, and understand that there will still be exceptions and gaps, or

2) try to stay inside the lines of Gnome, or Unity, or whatever ElementaryOS calls their graphical shell, and ignore the thousands of applications that do not match the look and feel of your chosen environment.

I keep going back and trying desktop linux every few years since switching from Gentoo to Mac in 2003, and sometimes I use it exclusively as my personal machine for months at a time, but in the end, frustration always drives me back into Apple's arms. :(


That doesn't sound like normal behavior to me.

In any gnome 3 distro I've ever used, I've only ever had to change it in the gnome control center and it works system wide.

Right now I'm running solus mate and changing it in the mate mouse and touchpad settings is system wide and works in both qt and gtk apps...

Were you using a touchpad driver other than libinput?


I don't know what mouse driver I was using (I don't use a touchpad except on laptops, and all my linux machines have been desktops). I would have assumed it was whatever the default was for Ubuntu over the years. :)

(Actually, I do remember that in some versions you had to figure out how to trick the system into thinking you might have a touchpad before it would provide you a place to change the mouse wheel scroll direction, but that seems like a separate failure of preference panels, rather than the root of the issue).


I'm using Ubuntu 16.04 LTS on a MBP and I cannot get the trackpad to work properly (right-click in the corner). I have to hand-edit config files, the settings involved are undocumented, and it doesn't work and I don't know why. Even regular movement just feels janky compared to macOS.


MBPs have gotten worse with linux compatibility over the past few years. It usually takes at least a year to even get usable. It's just not worth getting a Mac to run linux. Mac has successfully made sure that macs are only for MacOS.

That being said, many laptops which were originally built for Windows have issues like this too. Desktops are usually fine because they tend to use highly standardised components; laptops, not so much. This is pretty much entirely because companies are not willing to write open source drivers, and the good integration of stuff in package managers only works if drivers fit the packaging requirements, which usually means open source. I imagine this is mostly a matter of slightly higher cost, but also embarrassment about the jankiness of the code. For mass-produced systems built with volume in mind, where the target consumer barely cares about reputation, it just isn't a priority.

In other words, you have to go for something expensive, from an OEM that doesn't oppose openness.

tl;dr you need to do your research before you buy. Linux needs cooperation from all parties, you can't expect compatibility with everything.


> MBPs have gotten worse with linux compatibility over the past few years. It usually takes at least a year to even get usable.

I don't see a trend there. Some things always worked, while others didn't. Same for the current MBPs. It's the usual problem for hardware where the vendor doesn't provide Linux drivers: you need somebody to write these drivers in their spare time, and only a few people are willing to do so. I maintain an overview of the Linux compatibility of the MBPs >=2016 (https://github.com/Dunedan/mbp-2016-linux/) and it's absolutely astonishing what a few people can achieve, even without documentation of the hardware. Granted, the hardware support isn't that good yet in Linux, but if there were 10 instead of 2-3 people working on the drivers in their spare time, first-class support would arrive pretty soon. So if you care about Linux compatibility of Macs: get your hands dirty and do something about it!

> It's just not worth getting a Mac to run linux.

For me it's worth it, because it feels like somebody thought really hard about getting things right, which results in really, really nice hardware (except for the latest keyboards and the Touch Bar of course).


Oh pish posh.

I had to go into recovery mode last week because an automatic kernel upgrade broke.

You're doing that thing where you ignore the argument in order to refute a tiny, irrelevant point of it.

That's a rhetorical device for arguing without content, like attacking the person instead of the argument.

Your point is not compelling and does not refute the parent comment.


Yeah that was not the point at all, but thanks for refuting it anyway.


If it works.


The biggest problem with Macs is not the software/openness. The biggest problem is the price. The older models might have had a justifiable price, but the new 15" models with the Touch Bar cost way more and offer less value. They actually even replaced the SD slot, which probably a lot of professional artists needed (or at least used); I used it to extend my hard drive with a 256GB card. If I needed a new model to replace my late-2013 Pro, I would have to pay €1000 more than I did for the late-2013 (I need more disk space, since 512GB wouldn't be enough, and the SSD upgrade pricing is insane). Even worse, the late-2013 had a 512GB SSD, while the new model costs €500 more and has a 256GB SSD. Just insane. Dell sells me a Precision with a dock, more memory (32GB or 64GB), the best processor that could be in a Mac, and a 1TB SSD for the same amount as the cheapest new MacBook Pro (and it comes with Ubuntu 16.04 preinstalled).


1. You can buy MacBook Pros without the touch bar.

2. Professional photographers have for a long time primarily used CompactFlash, which has never had built-in hardware support. So I can't imagine they are suddenly missing something they never had. It's the prosumers who were the heavy users of SD cards and who were inconvenienced by the decision. Although most cameras come with WiFi these days, so you can see Apple's thinking in removing the port.

3. Using an SD card as external storage is an awful idea. It was never designed for that much IO, especially on OSX, which can thrash storage devices pretty badly (check fs_usage). Just buy a USB-C enclosure and a 2.5-inch hard drive like everyone else. Very cheap and comes in large sizes.


> 1. You can buy MacBook Pros without the touch bar.

The old Model (15") (no new processor + no usb-c)


>they actually even replaced the SD slot

Isn't this the same argument people used when the Mac ditched the floppy, and then again when they ditched the CD-ROM drive?


And PS2 ports and phone jacks and ethernet ports, yes


Ethernet ports are still a pain to not have for us. I have staff that travel to sites and need to test hard lines to see if they are live and most of them forget their dongle from time to time. Life was easier when they had built in ports. I do realize for the average user though this is a non-issue.


IMO for MBPs there is no such thing as an ‘average user’. That kinda user buys Chromebooks nowadays. It's almost all professionals and students with different fields and use cases, and almost everyone is inconvenienced by at least one of Apple's recent choices, be it the lack of USB-A, SD, HDMI, mini-DP, MagSafe, more RAM, Ethernet, upgradable components or even optical drives.

Edit: Oh and also lack of CUDA support i.e. NVIDIA cards.


I, too, used to love tinkering with Linux. I would compile my own kernels, hack drivers and fix bugs to make it run better on the hardware I chose to buy. I wanted everything to be just so.

Then, I realized that I have more money than time. I can choose Apple hardware, and run OS X (now macOS), trouble free. The only time I patched my Linux kernel was when some piece of hardware didn't work. Now, I simply choose hardware that works with my mac. It's not that much more expensive, usually.

On the software side, my old workflow with terminals, SSH, and browsers is pretty much unchanged. The only thing I really liked but Apple dropped (and I hate them for it) was virtual desktops. For that, I use TotalSpaces2.

That said, I think you can still get a Dell XPS with Linux preinstalled, no?


I've always kind of wondered what you guys are doing with your computers where you run into issues with Linux. For 95% of users, even software developers, all you want in your off time is a web browser.

If you are developing, Linux is almost certainly better. The vast majority of languages have all their libraries available in the built in package manager, the included versions being guaranteed to work together.

I'm obviously not suggesting you run Arch Linux, but any Debian based distro will pretty much just work. And Ubuntu LTS doubly so.


I second that. I own a System76 laptop (Ubuntu 16.04 pre-installed) and an HP stream that I bought specifically to run POP! OS. I have also replaced several of my colleagues' OS with a Linux distro and I have yet to encounter a problem that took more than an hour to solve...and even those minor problems rarely occur.

I am by no means a Linux guru - I just grew disgusted with Windows and wasn't willing to switch to Apple.


Linux has always had issues on laptop hardware. Poor power management, inability to switch between integrated and discrete video, lack of hardware support (eg fingerprint readers). It seems that for every one of these issues that gets fixed a new one appears.

That said you can get some decent laptops now with full Linux support. You have to stick to that hardware though.


I switched to Macs quite a while ago, when many things usually didn't work out of the box with Linux. Wifi, bluetooth, hibernation, full disk encryption, GPU support, and external monitors were all things you would either spend a lot of time configuring or even compiling kernel modules from source.

The situation today is a lot better. I believe some research into Linux support is still a good idea before purchasing, say, a laptop.


> If you are developing, Linux is almost certainly better.

Assuming UNIX programming is the only thing developers are supposed to do.


Does it feel, though, like Apple is neglecting macOS in favor of iOS recently? iPhones and iPads seem to be marching forward while desktops and laptops seem a bit stagnant.

They gave up the server line entirely, and the "pro" moniker seems to not mean what it meant in the past. Even the super popular Mac mini seems to be lacking investment.

It would be interesting to see an internal view of growth of investment dollars going into MacOS/desktop/laptop vs the tablets and phones. The recent bugs and vulnerabilities certainly seem to point at some neglect.


Yes. And it just works. GNOME is not at macOS levels of polish, but you get a proper dev machine instead of a pretty SSH terminal into your real dev machine.


What are you talking about? MacOS is a certified Unix variant and it has everything you need to do local dev on it.


For server-side software the deployment environment is overwhelmingly Linux, making Linux also the most straightforward choice for the development environment. It's not a slight on macOS; its tooling might be just as wonderful as the Linux one for deploying on macOS.


Not all developers are writing server side software.


I’ve seen way too many Dell lemons to believe it's as simple as “it just works” with Linux.

Also, as someone who used to live next door to the creator of GNOME, saying it isn’t as polished as MacOS is quite an understatement.


Elementary OS is very polished.


> The only thing I really liked but Apple dropped (and I hate them for it) was virtual desktops

What are you talking about here ? They are still there.

Four finger swipe up to get Spaces overview. Then click the plus button to add a new desktop.


Right, there are still virtual desktops in the form of Spaces, of course. Sorry for being vague.

I was referring to the ability to set up virtual desktops in a 2D grid, which was removed in OS X Lion. I got very used to a 3x3 grid of desktops where I can move around with Control + arrows. No animations, just instantly switching to another desktop. That is what I now use TotalSpaces2 for.


Wait! What! Virtual Desktops "Spaces" are gone in OSX?



Spaces are still there, but what they have removed is the “X”. It’s simply “macOS” now.


I really dislike the argument that recompiling a whole kernel to get something to run is a good thing... I would even say it's very good that Windows and Mac don't need a whole kernel recompilation for devices. Just because Linux has this design for its hardware stack does not make it a good choice. Ever tried to get custom hardware drivers working in Linux? It's insane and complicated when you need to recompile your kernel for that! The one good thing about Linux is that many hardware drivers are very good, or at least good enough, and plentiful nowadays. Nothing against Linux... but it's not always the holy grail of architecture!


> I really dislike the argument that recompiling a whole kernel to get something to run is a good thing...

I haven't had to do that in years. On the other hand, knowing that I could gives me peace of mind.


Me neither. I just wanted to say that other choices also have benefits. It's not "Linux is always superior" in every regard.


You don't.

Use a kernel module.
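An out-of-tree module builds against the headers of the kernel you're already running; no kernel rebuild involved. A minimal sketch, assuming a directory containing a hello.c and a one-line Makefile with "obj-m += hello.o":

    # build against the installed kernel headers (linux-headers package)
    make -C /lib/modules/$(uname -r)/build M=$PWD modules
    # load and unload the resulting module
    sudo insmod hello.ko
    sudo rmmod hello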


That does not always work. Especially when you want to replace some existing drivers. I know...


I still prefer the "closed fence" that is macOS over the complete mess that is desktop on linux. macOS was really really liberating after years of frustrations with all of gnome, kde and xfce. Then was x11 vs mir vs wayland. Desktop on linux is pretty much still a toy when compared to desktop on macos.

There seem to be a lot of misconceptions about the whole developer's environment thing. "Made by developers, for developers" and "all that money for such a closed box" sound a lot more like wishful idealism than actual "best tool for the job" pragmatism.

I also used to think linux was the best desktop, and did so for a long time, because of "by developers for developers, open-source" and other righteous sounding things. But believing you're using the best tools and actually doing so are two different things; I wasn't even living in Emacs or aware of how superior the VIM language is back then, so much for using the best tools ;)

It's easy to fool yourself into thinking you have the best tools, and it's near impossible to tell whether you actually do.


I beg to differ: take a look at the current state of macOS. There is nothing wrong with the idea of quality control with an emphasis on security, but the actual OS implementation lacks it. Only Apple could survive this year's root fiasco. I think there is a real need for a commercial desktop Linux distribution. I remember commercial SUSE 11: it was a good desktop, but targeted in the wrong direction. If someone is brave enough to target the creative market with all the good things Apple is famous for, it will be more than a hit. It will be glorious. Microsoft is going in the cloud/AI direction and Apple is mobile-only, but desktop computing will not die so easily. Professionals need a desktop, a bigger screen, and a reliable, expandable hardware architecture.


Where does it say Apple is mobile only?

Last I checked getting a 4K monitor to work under linux was a very involved series of configurations. My current MacBook has a retina display and effortlessly supports a 4K monitor as secondary screen. Retina alone is a game changer.

Combined with better text anti-aliasing it makes macOS by far the most pleasing/productive experience for development I've ever used (disclaimer: I have over a decade of experience in both Windows/Linux desktops.)

I stopped believing in the Linux desktop a few years ago (and I really wanted it to work!). Things are getting more fragmented instead of unified; that adds lots of extra work for users and app developers, which results in a frustrating experience and vastly different UIs depending on the versions/toolkits used. I don't think a commercial desktop will change any of that; even a fully dedicated and financed team would take years to barely get on par with the desktop on macOS.


Just become a professional games developer, then you are forced to use Windows (for example to develop for Xbox One). The nice thing is: you can then say, 100%, Windows is the right tool for the job, and you can ignore the people claiming Linux or MacOS are better.

I have worked at three games development studios, some of the biggest in Europe and every single developer machine runs Windows.


> Linux distros are by far the best environment for a computer scientists - made by developers, for developers - but there's no major vendor that sells machines with Linux pre-installed and supported, and getting Linux installed on the newest Surface or Dell toy is often a challenge that gets many frustrated.

Is there any reason why something like the various Dell machines preloaded with Ubuntu or RHEL wouldn't count?


for laptops, dell seems to love 16:9 aspect ratio, which makes them inappropriate for development. for workstations, every dell i've seen uses custom cases that prevent upgrading the motherboard with an OTS component


I hate the 16:9 ratio and everyone uses it. It's probably cheaper since that's the same ratio as full HD, which is used in many TV panels. A 1920x1200 monitor is much better than a 1920x1080 one for me for coding.


Those are certainly downsides, and that's fair enough. The last Dell tower that I used was from about 2008 (Precision T-something). I remember adding a hard drive and RAM to it without trouble, but I can't talk about their non-business line, or larger modifications like replacing an expansion card or power supply.


>>for laptops, dell seems to love 16:9 aspect ratio, which makes them inappropriate for development.

Inappropriate how, exactly?


They are too short for reading terminals or documents. Macs are 16:10, which is a little bit better.


How about "suboptimal"? 16:9 is OK for games and great for movies, and 1080p is at least wide enough that it can be reasonable to stick 2 terminals side-by-side, but it would be even better if it was a little taller.


MacOS is "real Unix". It's been certified as Unix since they moved to Intel by the Open Group.


How about the Dell XPS line with official Ubuntu support?


> Linux distros are by far the best environment for a computer scientists

I feel like this is a wide enough class that you shouldn't speak for all of us. I spent years on Ubuntu and a bit in centos and then macOS and prefer the macOS UI over both gnome and awesomeWM and brew over apt and yum. For reference I use laptops and spend most of my time on the web, in git, various programming languages, and of course writing latex.


If Linux had proper support for suspend on my MacBook Pro I would use it. The problem with Linux is laptop support: drivers for things like the touchpad, and proper ACPI support. Windows and macOS have this handled, but Linux does not unless you buy Dell or Lenovo. Also, Macs are less likely to need replacement since they have more reliable parts.


MacOS doesn't have it unless you buy a Mac ;)


Check out system76 and Dell's Linux offerings. Both great.


I don't really get that. You can run X Windows to work on the server at the other end, and you can set up local tools to work with it directly (MySQL Workbench, for example).


Can you spell/smell Microsoft Linux? Cause I can. :)

Apple is as good as dead if they actually carry this out.


This will have zero impact. Most consumers will not care at all. For me, if I need to build something for x86, SSHing to a build server works fine. Heck, I have an Intel Mac now and do not build locally: I submit to Jenkins and let the build system sort it out for me in parallel. I moved to a MacBook because locally the weight/size matters more than anything else. Hacker News is not the target Apple market.
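The remote-build loop is nothing exotic either; roughly this, with the hostname and paths made up:

    # sync the working tree to an x86 build box and build there
    rsync -az --exclude .git ./ build-box:proj/
    ssh build-box 'cd proj && make -j"$(nproc)"'

Jenkins then just automates the same kind of thing on each push.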


A transition to aarch64 would provide a unique challenge to Apple.

The other transitions (68k->PPC, and PPC->x86) brought with them nearly an order of magnitude increased raw performance, which let them paper over some of the emulation overhead in Rosetta or their 68k emulator.

They won't have the same intrinsic benefit working for them this time; it's hard to beat Intel at raw performance, especially single threaded.


It’s doable for them. For one thing, even their iPhone chips are currently nipping at the heels of Intel’s low-tdp mobile CPUs, exceeding some ULV parts in performance. For another, they could just push all the difficult stuff into specialized coprocessors. Wrap that into system frameworks and nobody will ever know.


I think they'll get within 5 to 10 percent of Intel by then, but that still means that an x86 emulator starts at a deficit, even before you get into emulation overhead.

By comparison, the 68k emulator on PPC wasn't even a JIT. It just interpreted 68k machine code, and even then it was still 2x faster than a native 68k.

Moore's law really helped them with their previous transitions, but Moore ain't cashing checks like he used to.


I’m pretty sure Rosetta wasn’t an interpreter, and Intel chips were less than 2x the speed of PowerPC.

https://en.m.wikipedia.org/wiki/Rosetta_(software)


Their x86 cores were about 5x faster when you accounted for everything. And even though Rosetta was a JIT it ended up running a little slower than the PowerPC chips it was replacing.

So to start off with chips that, let's assume, are only about 90% of the chips that you're replacing in native code, you're now at under half the performance of x86 while emulating, at best. And that means half the battery life too.


You’re making the unwarranted assumption that they’ll actually emulate anything. The state of tooling is much better than it was back then, and this time there’s no 32 bit mode to worry about.

According to Anandtech report in 2006, btw, the real world difference was far less than 5x. Indeed, Intel chips had substantially worse FP performance back then, something Apple uses quite a bit in their graphics subsystem.


One of the authors of this piece is Mark Gurman. Lately, Gurman has gotten legit scoops, but has then spun the small gem of a scoop into a story that is entirely wrong.

In other words, take this with a huge grain of salt.


As a Mac user since 2004 for my daily driver and professional software engineering as well as light gaming and recreation... I really hope not. Being able to run Windows and Linux easily in both VMs and on metal are much too important to me.

Of course, I am already not buying another Mac until they give me a keyboard with a physical ESC and F-keys so maybe it doesn’t matter at this point.


The backlash against the current MacBook Pro is what prompted Apple's renewed commitment to the pro line. So I think there's still hope. Otherwise I too will be patching up rMBP models like the one I have (probably second-hand ones, though) until they run out.


I'm not going to miss Intel ME.

I am going to miss running windows and linux VMs, and commercial software by microsoft, adobe, and blizzard. I'm also going to miss hand-optimized x86_64 assembly variants of ffmpeg, lame, x265, nodejs/v8, java hotspot, and more.


IME cannot do everything you think it can. https://www.youtube.com/watch?v=JMEJCLX2dtw&list=WL&index=17...


Finally. I was a hardcore Apple fan. I believed in well-thought-out design and optimized software. But after Jobs this company has been pissing me off constantly: the stupid iWatch, a device without a real use case (but a reason for the iPhone to start looking like every other smartphone); ditching the skeuomorphic interface for flat stupidity (instead of simplifying and cleaning it up); the Mac mini fiasco (great idea, left in the dust); the MacBook Air (best laptop form factor, still with a low-quality TN display); killing Final Cut Studio (the most comprehensive audio/video suite ever) and dumbing down the software (my friends still use old versions of Final Cut); no Mac Pro (the iMac Pro is not a professional system); ditching the core user base (its reason for existence in the first place).

Finally, this year, after staring in disbelief at the iPhone X (oh god, Face ID is the new norm), for the first time I started looking at alternatives in the smartphone market. The only device I still cannot replace is my iPad Air 2 (media consumption and good battery life), but even that is questionable. This is the last straw. For me, Apple is gone. It was a good ride, Steve. Now it's time to ditch the bandwagon and say hello to my old penguin friend.

To all developers in the world: start making top-notch software for Linux. Make it expensive and reliable. Start with Photoshop, then Illustrator, then video editing and DAW systems. The hipsters like me, who for years have been paying for Apple's GUI design (now fading in quality by the day), are ready to pay. Go find the old Apple (Snow Leopard) HIG and start your engines. Big money is coming your way, because professional designers, videographers, composers, architects, etc. are really pissed off :)


I think this was inevitable given the insane YoY performance increase we've seen from A-series chips.

Anyone want to chime in on why Intel can't get more than 5% YoY while Apple has been getting 30-40%?


Because Intel has moved past the easy parts of optimization. Apple's chips still have thermal throttling issues (sustained use of an A11 will not beat an Intel chip due to better cooling and IPC).

Compared to other ARM licensees, though, they do beat the pants off them. This is probably why Apple put a T1 chip and a Touch Bar in the MacBook Pro and the T2 chip (dedicated power module) in the iMac Pro. They are prototyping the individual pieces with Intel as the main processor so that they can attempt to swap the chip in 2020.

EDIT: nitpick on t1 and t2 chips.


> sustained use of an A11 will not beat an Intel chip due to better cooling and IPC

In an iPad profile, sure, heat dissipation is very limited. But with even a basic passive heat sink (not just a spreader, but actual fins for surface area), let alone the active cooling most laptops and desktops use, the thermal picture changes dramatically.

This change was inevitable -- recent A## chips have been just astonishing for passively cooled chips. The state of the art in binary translation, and the support for cross compiling across architectures, makes this a very doable transition.


FWIW, the TouchBar MacBook Pros have the T1, and the iMac has the next generation T2 chip.


Thank you for correcting me :)


:)


Because Apple started from a much lower base.

I will probably surpass Pizza Hut's growth in making pizzas this year.

Last year I made just a few, and only this month already two or three!


That's a good joke and I laughed, but this seriously understates the power of Apple's ARM chips. They are pretty impressive. To put this on the pizza scale, Apple is a lot further along than any individual pizza maker. They're a strong regional pizza chain, at the very least.


Intel painted itself into a technological corner, where they have to pursue yields with larger die sizes than AMD. Or, looking at it from the other side, AMD won by steering themselves a course where they could avoid the trap of ever larger die sizes.

https://www.youtube.com/watch?v=ctgAzn5Wx8o

If Apple can come up with something like Infinity Fabric, then they could steer the same course.


Single threaded CPU performance improvements are pretty much tapped out at this point. Process improvements have slowed to a glacial crawl (compared to the past), clockspeed can't really increase much while maintaining the ability to be air cooled, and IPC is mostly mined out. Intel probably has the best processor cache in the industry, and adding more just isn't yielding much for general purpose applications (can't really speak about multi-tenant environments or other specialized applications).

I'd guess, but I don't know, that Apple's performance growth is coming from two places:

1. More multi-threading. Mobile chips in general have been getting more parallel over time. It's pretty easy to construct synthetic benchmarks that benefit from this; it's less easy for general-purpose applications, although they are coming along.

2. The GPU is improving. It seems like there is a lot of headroom here to improve. Apple's GPU is almost certainly substantially better than Intel's. This isn't really a huge problem for Intel, because for people who care, they can always get a discrete GPU. Super powerful GPUs can be very impactful for certain applications, but they just don't have general applicability.

In general, I'm pretty skeptical that Apple's CPUs are improving in the necessary areas to enable them to switch their pro lines. Moving over their Air and Macbook products to in house chips seems pretty straight forward.


Starting from a lower base. Now that ARM chips have nearly caught up, they almost certainly won't have the same rate of performance growth.


Because x86 was technologically mature in the 90s (superscalar, OoO, vector extensions, …) and basically hit its frequency ceiling in the mid-aughts (modern Core-series finally have frequencies beyond what P4 reached back in 2005~2006 but it took a long time).

ARM was nowhere near there (it had no need to be) before modern smartphones. ARM in 2007 was single-issue, in-order, single-core, topping out at ~600MHz.


Cause they started with atrocious performance?


Standard math failure. It's the same with programming languages. I never trust anything that got a "2x" speedup because if a gaping optimization like that exists, it's likely not ready for prime time.


This is such an obvious move that I predicted it 2 years ago: https://medium.com/@Jernfrost/in-3-years-apple-will-switch-f...

My prediction in 2016 was that Apple would switch to ARM-based laptops in 2019, so it looks like I was off by just one year.

A quick summary of my argument back then was that ARM was getting fast ENOUGH while being considerably cheaper. Standardizing on one CPU architecture for all their products makes sense. In addition, it allows them to differentiate themselves from competitors.

The only part I am still uncertain about is the Mac Pro lineup. Since performance matters a lot more in that segment, it is harder to see a transition away from Intel. A possible solution would be for Apple to go massively parallel and simply use a lot of ARM CPU cores. Sure, single-thread performance won't be able to compete with Intel, but they might not need to.

A lot of the stuff you need a Mac Pro for (compiling large programs fast, video processing, etc.) is highly parallelizable.


> I predicted it 2 years ago

Such clairvoyance.

https://www.extremetech.com/computing/139677-the-threat-of-m...


> Sure, single-thread performance won't be able to compete with Intel, but they might not need to.

In regards to audio applications like Logic and Ableton, single thread performance is probably the biggest factor. Individual plugins and effects are all single threaded within a DAW.


What if it turns out not to be based on the ARM architecture?


That would definitely be an interesting and very surprising development. I would really like to hear Apple's rationale for developing an x86 CPU.

x86 is a far more complex CPU architecture, and one they have zero experience making. ARM, in contrast, they have experience with all the way back to the Newton, and it has a nice licensing proposition.

I can't see them doing something different from ARM and x86. Coming up with an entirely new CPU architecture would be an enormous undertaking.

x86 really has nothing going for it architecture wise. It just has a market because Intel makes fast chips that run existing software. ARM is a better architecture, and Apple already has the toolchain to make software run on that architecture.

If you asked two equally competent chip makers to make a fast chip, and one was given the x86 architecture and the other ARM, I am certain the one who got ARM would win. x86 is winning today only because Intel is filthy rich and can pour insane amounts of money into x86 to make up for all its shortcomings.


> x86 really has nothing going for it architecture wise. It just has a market because Intel makes fast chips that run existing software.

People buy computers to run software. If they can't run their existing software on it, the computer has no value. Just like with the original PPC-to-Intel transition, where Apple provided a PPC emulator on Intel, if they go with ARM, Apple will have to provide Intel-to-ARM translation software in macOS for several years.

But now, a significant number of people are using Macs as hosts for x86 Linux and Windows virtual machines. This changes the calculus a bit. It's one thing to emulate an ISA to run software in a user context; quite another to perform cross-ISA virtualization. QEMU can do the latter, but at a significant performance penalty.
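To make that concrete: QEMU will boot an x86_64 guest on a non-x86 host under its TCG binary translator; it just isn't fast. A sketch (the disk image name is a placeholder):

    # full-system emulation via TCG -- no hardware virtualization across ISAs
    qemu-system-x86_64 -accel tcg -m 2048 -smp 2 \
      -drive file=ubuntu-server.qcow2,format=qcow2 \
      -nographic

Expect anywhere from several-fold to an order of magnitude slower than hardware-assisted virtualization of the host's own ISA.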

It should also be noted that Apple is sitting on _much_ more cash than Intel. At the end of last year, Intel had $14B in cash on hand, while Apple had $285B.


"People buy computers to run software. If they can't run their existing software on it"

Obviously, but my point was that this mattered more in the past. Apple has so many tools and arrangements in place today to make a transition less painful. They've got fat binaries and bundles. They've got Bitcode. They have really strong control over the app delivery process by having it all go through their App Store. Almost everybody uses Xcode, which they also control.

I can't remember ever running very much under PPC emulation during the last transition; usually I used fat binaries. Apple has been through so many of these transitions and has gotten better at it every time. This will likely be quite smooth.

"But now, a significant number people are using Macs as hosts for x86 Linux and Windows virtual machines."

Really? I've been a Mac user since OSX came out and have very rarely encountered people who do that. I think this user group is too small for Apple to care. Anyway Linux should work fine, and I believe Windows also runs on ARM although not all software, so I think this is not as big an issue as you might think.


> I've been a Mac user since OSX came out and have very rarely encountered people who do that.

Are you a software developer developing for Linux targets? That describes nearly _all_ back-end development work done in the last 10+ years in SF and Silicon Valley.

> Linux should work fine, and I believe Windows also runs on ARM although not all software

Have you actually tried running ARM virtual machines on ARM hosts for these two environments? This can hardly be described as a mature art. If even remotely usable software exists today for this purpose, I'd love to see it.

Even if it did, the fact that developers usually target x86 builds (for deployment in the server clusters) makes this less useful than one might otherwise think.


Both Windows and Linux have ARM builds available. The emulation for legacy x86 Windows apps can happen inside the virtual machine.


If they are not sticking with x86, it basically has to be ARM. Producing a new CPU architecture is a very costly endeavor when capable CPUs already exist. However, because they already have experience making chips for the iPhone and friends, and ARM is at least "good enough", it makes sense for them to go all in with ARM.


Who's to say they can't manufacture their own x86 chips a la AMD?


Better yet they can probably afford to just buy AMD.


Even at $15-20B all cash, Apple would be overpaying, and then having to deal with all of the continuing business of supporting AMD, including the ATi problem - there's just no way to acquire AMD without destroying the graphics/gpu-compute market (you know, any more than AMD has already done themselves). There's probably some other explosive x86 licensing terms that would have to be hammered out under the transfer as well (iirc, in the past when AMD buyouts have been discussed, to deal with the architecture licensing stickiness they all had to be "reverse mergers" with AMD, and that killed every prospect of that happening...)

Just not worth the hassle compared to the price of an ARM license or sticking with Intel (or even just investing a huge chunk of cash in AMD to build super power efficient laptop chips, if you had any confidence that they could pull it off).


The AMD-VIA-Intel patent cross licensing triumvirate has buyout, sellout, and a bunch of other miscellaneous prevention/mitigation clauses.

Essentially you can't just buy Intel, AMD, or VIA and acquire the rights to make x86/x86_64 processors. You could acquire a controlling or other majority ownership stake in one of the companies and then continue to run them as a wholly owned subsidiary.

In addition, Intel, AMD and VIA can't just license you the rights either. However, you could enter into a joint venture with Intel, AMD, or VIA as a ‘passive partner’, whereby the subsidiary has inherited access to the technology. This route has so far been undertaken by VIA without legal repercussions. Zhaoxin is a joint venture between VIA Technologies and the Shanghai Municipal Government, and has so far produced a line of x86-compatible processors on a 28nm process for the Chinese domestic market that are apparently performance-competitive.

Apple has more than enough cash to fund a joint venture with AMD or VIA that could enable them to bypass any possible complaints from Intel about emulation violating their intellectual property. Apple could take this route to build an ARM based processor that had sufficient x86_64 helpers to speed things up enough that the translation/emulation performance was negligible. Heck they could use this to design their own entire x86_64 CPUs and if it was with AMD they could borrow a GPU core they are familiar with too.

At the end of the day, Apple has enough money they can basically do anything they want with regards to technology.

The challenge will be successful execution in the face of a market that has become increasingly immune to the reality distortion field Apple projects, the demise of Steve Jobs significantly decreased the power of Apple to distort reality in a way that is consistently in its favour. Not so much of a ‘If Steve was still at Apple’ as simply ‘Apple isn’t the same now, and this changes the mathematics of the situation’.


Thanks for the insights on the AMD/VIA/Intel triumvirate. It’s such an odd model that x86 has landed on.

> Apple could take this route to build an ARM based processor that had sufficient x86_64 helpers to speed things up enough that the translation/emulation performance was negligible

Totally agree, this is exactly what I thought when I read the story. There’s a lot of transistor budget these days for special hardware, especially in a larger / more power-hungry part that is substituting for something quite expensive (Intel chips).

The Mac’s role as the heavy lifter in Apple’s ecosystem means it has different trade-offs to consider, and my guess is that this definitely points to a JIT or compatibility layer for legacy x86 apps. If that’s the case, it wouldn’t be worth the effort unless it could be reasonably performant, which means silicon. The optics would be really bad if they launched a platform transition and the result performed at less than ~75% of last year’s machines.

With that said, they’ll be pushing hard for app developers to transition over, and might throw away that compatibility silicon after a few years. It’s a really cool thing to be your own chipmaker and have the levers that Apple does in this situation to add compatibility silicon and not just accept another architecture’s limitations.


What else is there?


RISC V possibly, but unlikely.


would also be too good to be true...


x86_64, but of their own manufacture. A fair number of people use MBPs to run Docker containers and other Linux/Windows VMs; to ask them to sacrifice this functionality would be to risk a fairly significant user population.


If other companies could legally figure out how to license and build x86 chips, nVidia would already be there stamping them out. It's a legal minefield that's simply not worth exploring, even for Apple.


Why would nVidia want to do that in the first place?


...to compete with Intel? Why would Apple want to and nVidia not want to? Apple sells computers, nVidia sells chips.

Apple loves vertical integration and all, but selling chips is the reason nVidia exists. Apple could put out a clothing line, switch to building luxury watches or cars, or commit entirely into doing "fashionable audio" (Beats, Apple Music) to compete with Bose and still exist past computers - it's a lifestyle/fashion company masquerading as a tech company. Apple making chips is means to an end - to fill a hole in the market that Qualcomm and Samsung weren't able to. If it were anything more than that, Apple would be taking customer orders for A11s.

Meanwhile, nVidia's Project Denver and the whole Tegra stuff only came about after their plans to attack Intel head-on failed miserably. They hired up a bunch of Transmeta employees in the hope they could implement a binary translating CPU that would "cheat" them into being able to implement an x86 processor without the licensing stupidity. Intel's lawyers proved them wrong.


They're already breaking into the SoC market with their Tegra chips, which run ARM.


While a x86-ARM translator of some sort is certainly possible, keep in mind that Xcode added support for BitCode back in 2015, which means that binaries are uploaded to the App Store in LLVM IR and compiled to the target architecture by Apple. While this will not cover 100% of Mac OS applications out there, it will certainly minimize the effort of porting stuff to ARM MacBooks.
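Very roughly, the pipeline looks like this; note that bitcode in practice is not target-neutral (pointer sizes, ABI details and intrinsics get baked in by the front end), so treat this as an illustration of the idea rather than a claim that App Store bitcode can simply be re-lowered for ARM:

    # front end emits LLVM bitcode
    clang -O2 -c -emit-llvm app.c -o app.bc
    # back end lowers that bitcode for a chosen target triple
    llc -mtriple=x86_64-apple-macosx app.bc -o app_x86_64.s
    llc -mtriple=arm64-apple-ios app.bc -o app_arm64.s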


Relevant: http://atp.fm/205-chris-lattner-interview-transcript/#bitcod...

I wouldn't assume that Bitcode is a huge help with this. Per Chris Lattner in the linked transcript, "It's useful for very specific, low-level kinds of enhancements, but it isn't a panacea that makes everything magically portable."


Intel’s R&D budget is $13 billion and Apple’s operating income is $61 billion, so Apple has more than enough money to play this game. I do wonder why it makes sense for them, other than a possible unification from top to bottom of the desktop and mobile computing paradigms, with everything from the Apple Watch to the Mac Pro running the same instruction set.


I think scaring Intel into being better about its software and security practices would be a big enough effect to be beneficial for everyone. Not that I'm a big fan of closed-source-everything Apple. But Intel's software is just a horrible mess (note that I'm talking specifically about software: they seem to contract out software development to other firms, and the code quality seems to be objectively bad as a result).


That would be the end of the Hackintosh, which would be kind of a shame.

Even having never built one myself, it remained a last resort of some kind in the back of my mind.


For someone not still in high school that struck me as way more effort than it was worth.


Hah, now that you mention it does seem like a huge chunk of the community is in high school. When I went to college I just got a real Mac and it is 100x better than the MSI Wind I used to use (though that was a great device too!).


That's kind of the crowd for something that saves you a bit of money at the cost of a huge chunk of your time, isn't it?


It's actually pretty astonishing how easy it has become to churn out a decent, stable desktop hackintosh. This announcement seems to give them a four or five year lifespan.


1. apple likely believes intel's hardware division benefits from apple's large investment in compilers. In general, apple is attacking all its technology partners to cut them out of the stack (i.e. qualcomm).

2. apple seems to underestimate the value of openness in driving certain kinds of customers to their hardware (developers, for example). Further splitting the ecosystem will drive some users to linux laptops. (Hear hear).

3. intel's incentives are now highly aligned with the linux community (esp given msft's windows shutdown)

4. this is the first time since 2005 that there's a legitimate spread trade between intc and aapl.


AMD has the rights to produce x86 compatible CPUs. If Apple were to design the chips but AMD manufactured them on contract, would that avoid IP/licensing issues with Intel?


Yes. I am finding it almost bemusing that people are dancing so delicately around the only rational possibility here: a semi-custom AMD-produced x86-64 CPU, very much like the Samsung-produced A-series ARM CPUs for their phones and tablets. I see zero reason that the next Axxxx chip can't be a rebranded Zen core with a Vega GPU and custom Apple IP magic sprinkled in - https://www.amd.com/en-us/solutions/semi-custom

Just because Apple says they are done with Intel and are making their own chips, does not preclude the possibility that they are simply working with a different x86 vendor on a custom/semi-custom design.

I see zero way Apple gets off the x86 architecture in the next few years, simply due to the tremendous software ecosystem that has grown up around the platform since the switch from PowerPC. They would have to offer a dual path for years to get their larger software partners prepared.


That’s a lot of fuss for at best a lateral shift, and a step down in the all-important performance-per-watt metric (Zen and Vega are totally unproven here).

You may be underestimating the possible paths Apple has to a) transition software developers quickly with better tooling, and b) provide sufficiently performant JIT / compatibility layer by sprinkling a bit of x86 compatibility into future A-series chips, and you’re certainly underestimating the value in c) reducing complexity on the software-side.

Apple is already making their own GPU and CPU and throwing all their weight behind that, why would they hop onto the (brilliant but rather unreliable and on mobile totally unproven) AMD train?


That’s assuming Apple mainly uses existing core designs from AMD etc. and doesn’t significantly contribute improvements for their own systems, to gain advantages over the chips available to their competitors. But surely adding a hefty dose of their own secret sauce would be the only thing that would make such a move worthwhile?

What’s being suggested is not that Apple use AMD designs. It’s that Apple partner with AMD in order to gain licensing cover for their own largely or completely in-house engineered chips.


No. Without an architectural license Apple can't design chips without being sued. They could set up a partnership though, similar to Zhaoxin, a joint venture between VIA and the Shanghai Muni.


AMD does not manufacture anything, they've gone 100% fabless. Their former manufacturing arm had been spun off into https://www.globalfoundries.com, which also absorbed IBM's fab tech.


If true, and assuming the chip will be based on ARM, then this may be the final nail in the coffin for Intel's server business. The biggest obstacle to adopting ARM-based servers is development and testing in a local environment. If x86 is being emulated and ARM is native, then developing software for ARM-based servers just got a lot easier.


"according to people familiar with the plans." usually = "stuff we made up." Same with the WSJ. They're known for Apple news stock manipulation. went from 167 to 164, which at a decent volume and buying at the lower price, could net a nice profit.

"Intel shares dropped as much as 9.2 percent, the biggest intraday drop in more than two years, on the news."

Ahh. That too.

Plus all of the ad revenue from all of the people clicking on this startling article. I don't trust Bloomberg for anything related to Apple.


If the current MacBook Pro is any indication, then I would be surprised if people are still buying Macs at all in 2020.


Pretty sure I'll still be using / rebuying the 2015 model


Try an XPS15.


If Apple really moves to custom chips in Macs, it's going to be about one thing only: independence. Apple wants to control the entire stack; and relying on a single vendor is too big of a risk.

It's not a question of price; it's about getting the chips they need.

Intel has missed a lot of targets in recent years, and (according to rumors) more than one recent Mac model has been delayed because Intel did not have the next chip generation ready in time.

Current Macbook Pros max out at 16GB RAM because the Intel chips don't support LPDDR4 RAM.

On iOS, Apple can control everything. They can plan ahead, and know that in 3 years they'll have the chips they need.

On the Mac side, they can only hope that Intel doesn't delay their roadmap longer than usual.

(The story about Marzipan (a shared iOS/macOS UI framework) seems unrelated to the chip issue. It sounds like they included it in this article only to make it look a bit more substantial. There is already a lot of shared code between macOS and iOS, and the CPU architecture isn't holding anything back in that regard.)


Yeah, I can definitely see this as being the reason. The switch from PPC to Intel was partly motivated by IBM dropping the ball. Now Intel is failing to keep up as well, so given that Apple has the resources, why not control their own destiny?


While it's had a rocky start, Microsoft's starting to push Windows 10 laptops based on ARM chips, so it's definitely not out of the realm of sanity that Apple is interested in doing the same.


Probably too early to tell, but I'd imagine this will mark the death of the Hackintosh? It's a shame, because I had a lot of fun hunting down all of the information and files I needed to convert one of my old PCs over.


Eventually, probably. But they'll still support existing Macs with new software for a while after the transition.


For everyone speculating that this means a move to ARM and they'll write an emulator for x86 code, it's good to note that the first Windows tablet based on ARM, the $1,000 HP Envy x2, has pretty terrible performance for x86 code when running on a Snapdragon 835. The Celeron N3450 launched almost 2 years ago found in super low end $200 Windows laptops outperforms it by ~50%. And ARM on Windows can only run 32-bit apps (no 64-bit apps, no drivers, no system level utilities).


Good data point, though they may have just not invested much in writing a good emulator, let alone dedicating some extra silicon to speeding it up. If Apple makes this shift, my strong suspicion is they’d do both.


What will happen to people who use VMs of Intel-centric OSes like Ubuntu, Fedora and other cloud OSes?


For web development, I think Ubuntu for ARM will be fine:

https://www.ubuntu.com/download/server/arm


Seeing a lot of assumptions and speculations on this thread (which makes sense given how early and high level the sourcing is).

I wonder if anyone has seen Apple's and Intel's roadmaps side by side, or projections of those roadmaps? I doubt Apple is moving away from Intel for pricing reasons, given the margins they work with. It seems more likely (or at least as likely) that there are broader technical reasons. Feels like an apples-to-apples (no pun intended) comparison would be helpful.


> I wonder if anyone has seen Apple's and Intel's roadmaps side by side

This would be easy to do, if we knew what was on Apple's roadmap…


So Apple seems to be willing to give up on the developer market almost entirely, and will turn Macs into multimedia centers just like iPads? Maybe that's indeed where the majority of their revenue lies, and they couldn't care less whether fewer developers buy their Macs if that means more home users buy them. But still, I doubt whether it will be the right move in the long run.

Correct me if I'm wrong, but wouldn't a change of processor architecture mean that the majority of the apps that currently run on Mac won't run anymore and would require the app developers to make extensive changes to make them run on the new architecture?

I've parted with my MBP after having used it for 4 years and switched to Arch Linux. It has been a blast so far, apart from the spotty HiDPI support and lack of built-in dictionaries. It's also much nicer than Ubuntu as a dev distro (not as a production distro of course) since you can install the newest version of almost any software you want in a flash without involving the messy PPAs. I was worrying about Apple's shifts in the recent years in the Mac line but they pale in comparison to this change. Gives me less of an incentive to use any Apple product for sure.


The vast majority of apps don't use things like inline assembly or rely on architecture specific features, so it's just a recompile.


I've read through a lot of the comments, but no one asked the question - why would Apple invest the R&D resources to create and support ARM based Macs when Macs are such a small part of their revenue and profits? Would they make more money by selling ARM based Macs?

Why not just add capabilities to iOS and make an iOS laptop? The only thing I see missing is true multi-user support (not the hack they do in the education market) and pointer support.


> why would Apple invest the R&D resources to create and support ARM based Macs when Macs are such a small part of their revenue and profits?

They already invest a lot in the development of their ARM-based processors for iOS. I don't think the additional investment for making them ready for Macs would be that high. Take the 12" MacBook for example. I can even imagine they could simply stick the current A11 processor into it and it'd be fine.

> Would they make more money by selling ARM based Macs?

Yes. Profit margin would be much higher.


They would still have to develop their own custom thunderbolt controller, an emulator, etc.

The only thing changing as far as cost would be the processor. Let's say they save $100 on the processor (not likely). They sell about 14 million Macs a year. That's only $1.4 Billion - pocket change to Apple.


Intel stock is down 7.3% currently.


DJIA and NSDQ are both down about 3% right now so at least half of it is a general slump/profit taking


The whole market is down today because of the TTT (Trump, tweets, and tariffs).


Well yes, but the whole market is taking a dive today. -500 on the DOW last I checked.


-722


I think that, for a site full of developers, the almost universally welcoming reactions to such a change are intriguing.

Don't you worry that you will have to cross-compile to run your programs on an amd64 server? Or about how you are going to debug locally the bugs that may arise from a different architecture? Or that Homebrew could be in an inconsistent state for some years until all software catches up with the new architecture?


Apple dealt with this last time with Universal Binaries https://en.m.wikipedia.org/wiki/Universal_binary
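Mechanically a universal binary is just one slice per architecture glued together. A hedged sketch, assuming a toolchain that can actually target both architectures for the Mac:

    # build one slice per architecture, then stitch them into a fat binary
    clang -arch x86_64 -O2 main.c -o app_x86_64
    clang -arch arm64  -O2 main.c -o app_arm64
    lipo -create -output app app_x86_64 app_arm64
    lipo -info app   # lists the architectures contained in the fat binary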


Recompiling software isn’t as scary as some people seem to think, as long as you have a decent toolchain. And, in my experience, toolchains have improved a lot in the last few decades. Automated build processes are de rigueur these days.

Actual hardware to test on is essential for debugging, but it’s not like ARM based Unix/Linux isn’t already a thing.

Assuming Apple doesn’t lock down the OS, I’d say that open source ports like Homebrew will be available almost immediately. It’s the closed source vendors with outdated development processes that might drag their heels.
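As for the cross-compiling worry upthread: targeting a different deployment architecture from your laptop is already routine in a lot of toolchains. A couple of hedged examples (package and file names are illustrative):

    # Go: the target is just a pair of environment variables
    GOOS=linux GOARCH=amd64 go build .

    # C: a cross toolchain does the same job (package name per Debian/Ubuntu)
    sudo apt install gcc-x86-64-linux-gnu
    x86_64-linux-gnu-gcc -O2 hello.c -o hello-amd64

Debugging still wants real hardware or a VM of the target, which is where the actual pain would be.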


If this is true, I'm pretty interested to see how they intend to pull it off (and how it works out in real life).

I'd be more skeptical if Apple hadn't already pulled off CPU architecture transitions with impressive smoothness multiple times already.

Will Apple do an x64 CPU?

Or maybe x64 emulation? ...maybe with custom hardware assist?

Could they possibly do a crazy "you must recompile for ARM if you want to run on the latest Macs" thing?

The performance of Apple's ARM CPUs is already fine for the lower end of lappys, and they have superior power usage. Moving up into midrange performance lappys doesn't seem like a stretch at all to me, or even higher-end lappys, especially if you're looking for more cores. But what about the iMac, iMac Pro and Mac Pro? Does Apple really have a solid plan that makes ARM CPUs competitive there? Or will there be Intel-based Macs at the high end and ARM-based Macs at the mid to low end?

Could Apple possibly tie all this to some unification of iOS and MacOS?

And/or tie it to a unification of the iOS App Store and Mac App Store?

Interesting stuff.


I wonder if this means that virtualization will be impossible without some sort of emulation layer. If so, that seriously blows.


The thing I've come to realize over the past couple years is that the dream of performant open hardware is still beyond reach for at least a decade or two.

But this is probably the next best thing to have. A company that at least tries to be sensitive in terms of privacy and security building their own hardware, albeit not open. I respect Apple for that.


That would be 30 years after they started ARM together with Acorn.


I cannot imagine that this is a good plan. I just don't see enough innovation coming from the processor space to justify the overhead of design, development, testing, and manufacturing while effectively re-inventing most of the underlying system. Unless they build them as x86 clones... but even then it seems like they would be playing catch-up to new innovation.

If Apple really plans on doing this, it MUST be because of some fundamentally new capability that they are either unwilling to divulge to Intel, or that Intel is unwilling to invest in. Some kind of core change to the way that Macs interact with or behave toward users; otherwise it just doesn't make sense to me.


My questions:

ARM? Walled garden of an app ecosystem? Command line?

There has been convergence between iOS and MacOS for a while now.


I see the question marks but I'm not entirely sure what your questions actually are


It was just a stream of consciousness and I was planning on editing it but got distracted. It also wasn't formatted very well. My mistake.

I'm sure this has been going on for a long time behind the scenes. That is what happened when the PPC -> x86 transition became public. Since Apple has it's own ARM chips/fab, I wonder if ARM is where they are going? It would make sense from a couple different points of view: Lower power use and security. Can there be equivalence between the two architectures, meaning would similar CPUs net you a similar workload?

There is much to like about the iOS app store, but there also is much to dislike. Apple is the gatekeeper here, and if they don't like your app, you are out of luck. That's not the case with MacOS, at least not yet. You can still write what ever kind of app you desire for what ever your client base may pay for and Apple can't really do much to stop you. Merging iOS and MacOS would need to address this.

Finally, I use the command line a lot. I don't have access to that on my tablet or phone unless I jailbreak it, which carries its own risks. If Apple decides this is no longer acceptable, that will certainly change my choices for getting work done.


Your concerns would be better founded if this was news of iOS coming to desktop instead of Apple switching desktop processors.


There are several reports about this happening. Granted it’s rumor: https://www.cultofmac.com/519763/apples-got-secret-plan-merg...


I think I see the questions, but I don't think we can answer them. Will Apple use this as an opportunity to merge MacOS and iOS? What will that mean for developing and distributing software?


Puts them in a unique position where they may see increased sales if they can prove the lack of the invasive portions of a management engine. But how can they create and push to market something as complex as an x86 processor in 2 years? Sounds like a ton of work.


It's kinda hilarious that nobody has commented on the "Kalamata" code name :D


That's because there ar


I just had a flashback to the G4 and G5 era. The day Apple announced the move to Intel there was a huge sigh of relief from everybody I knew, especially those in the media industry.

I don't see a lot of good coming from this move.


People here were anticipating this, so it should be no surprise. Presumably Intel has also known for some time, but their long-term strategy for staying relevant seems unclear. I could imagine them becoming a pure fab company one day.


This is a multi-year transition even after 2020, but it is a step closer towards a cloud-based Mac where the hardware is connected 100% of the time, has super long battery life, and the majority of the processing is done in the cloud.


It would be in line with their warnings on 32-bit apps. You certainly wouldn’t want to have to continue supporting 32-bit as a first-class development target when you’re about to change your entire architecture.


This is no surprise; it's in Apple's culture to want to own the stack completely. I wonder how they will execute on this, however. Apple's design team is pretty good... but who do they have doing pre- and post-silicon validation? Will they go the extra mile someday and build their own fab? There are lots of intricacies to consider when trying to make your own chip. I guess the biggest question is how far Apple is willing to go to have complete control over their product; microprocessors are probably the most capital-intensive part of their product.


I was just discussing this weekend the strength I was seeing in the ARM market. Looking at the Apple trend lately with the A9-A11 chips, being ARM based, and then the advances Qualcomm has made making workstations and servers based on ARM chips, this just seems like a no brainer. Build cheaper, more efficient chips, custom designed to work with your own ecosystem.

Finally, the ARM and NVIDIA deals could make large strides in the market of getting blockchain ready and making machines purpose built for dealing with the inevitable future of blockchain in everything.


About time. I'm tired of Intel's minimally viable laptop processors (and their security blunders). Apple has the margins and the volume to throw a lot of transistors at a problem, like they do in iOS devices. I want 13" Macbook with more than 2 cores, good GPU, and decent memory bandwidth. Apple can do big/little cores, large caches, wider memory buses (HBM please), inference acceleration, 120hz variable refresh with good battery life, etc. These things have a questionable ROI for Intel but set Apple apart and sell more hardware.


These things are available, Apple chooses not to use them in their quest for thinness.


I disagree. They have certainly sacrificed battery size at the altar of thinness but not transistors. Who else is doing 100mm^2 SoCs in consumer devices? Now take that up to 200-300mm^2 and imagine the possibilities with tailor-made blocks and cutting edge memory. Look at Intel's dual core die shots, >50% GPU and my laptop is still jankier than my iOS device.


How does the performance of the fastest current ARM chips compare to the latest generation of low-power, quad-core Intel i7 chips (~15W) used in contemporary non-workstation laptops?


I wonder what impact this will have on the hackintosh community.


Some of the old x86 clone chip companies got bought out by others. Cyrix, Centaur, WinChip, etc. got bought out by VIA, and I hadn't heard whether VIA is still in business.

Apple once made PowerPC chips with Motorola, IBM, and others.

There is also a rumor of Apple using ARM chips with x86 code support or x86 emulation.

I'd like to see Apple make a version of a modern PowerPC CPU and use an ARM as a co-processor to run iOS stuff. Cut out Intel or do it via emulation like Rosetta did for PowerPC code.


Component manufacturers & companies like newegg should get together and fund a Linux distro that has great ports of all the most popular OSX software :)


It will be interesting to see how Apple handles this. Last time around, users got a compatibility layer in Rosetta to run PowerPC applications on Intel. I would imagine that would be the case this time, but some of the questionable choices in terms of user friendliness in the recent past (imho), removing the headphone jack etc., make me wonder if this will be a cold switch for older software.
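
As a hedged aside, here is a minimal Swift sketch (my own illustration, not anything from the article or this thread) of the kind of runtime architecture query older software may perform, and that any compatibility layer has to account for one way or another:

    import Foundation

    // Hedged sketch: ask the kernel which machine architecture this process
    // appears to be running on via uname(2).
    var info = utsname()
    uname(&info)
    let machine = withUnsafeBytes(of: info.machine) { raw in
        String(cString: raw.bindMemory(to: CChar.self).baseAddress!)
    }
    print("machine:", machine)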


It's trivially credible that Apple will use its own chips in Macs - it already has CPUs faster than those in its low-end ultraportables. They could have done it years ago.

But it's less credible that they will only use their own chips. However, there have been complaints that Apple's high end hasn't been very high recently - maybe they are laying the groundwork, lowering expectations?


I wonder if Apple is maybe looking at AMD for chips? That would be a much less painful switch instead of transitioning to ARM for the Desktop.


It's unlikely. Apple has shown a clear trend towards trying to get rid of absolutely any 3rd party components from their devices. They want the control that full vertical integration gives them.

They got rid of the 3rd party ARM design in their iPhones/iPads, going for their own ARM/AArch64 design. They are currently in the process of getting rid of the PowerVR GPU, to be replaced with their own GPU.

Moving to their own AARCH64 CPU for laptops/desktops fits with that trend. Many people have been wondering 'when' this will happen, not if.

Moving to AMD doesn't fit with this long term trend.


Probably correct. Apple could easily purchase AMD and also get the video card tech from ATI which they seem to favor on their desktop machines.


I sometimes wonder why Apple would do this. I mean: i) the end users frankly don't care. Seriously. ii) 90% (if not 99.99%) of devs would not (be able to) care.

I mean, I doubt that more than perhaps 5% of Apple's internal devs can take advantage of "tight integrat[ion] of new hardware and software." This integration probably takes the form of either some specific app (think Pixel camera phone), some specific library, or compiler optimizations. The first one (a specific app) can be accomplished much more cheaply through add-on chips (guess what, that's what the Pixel does). The 2nd and 3rd can be done much more effectively through generally available chips (like, well, Intel's) since more people, from vendor engineers to researchers to random open source ninjas, would be able to experiment and help out.

In other words, from a purely technical point of view, there is absolutely zero reason to do this. Whatever happens, Intel is among the most capable chip producers, if not the best. And Apple is not "disrupting" (i.e. focusing on an unaddressed aspect), but merely competing directly with Intel's core competencies. It's not Amazon entering retail against Walmart. It's Target competing against Walmart, except they don't have Target's existing competencies. Which, again, makes no technical sense.

On the other hand, if they want to completely lock in users......


I don't think they'd attempt it unless they had something that would disrupt the market. Intel has been rusting on its laurels without credible competition, and if Apple manages to steal a march on them with a new technology, that would be the impetus to bring it in-house.


How they managed not to mention the shrinking economy of scale for PC chips vs. the growing economy of scale for mobile chips is beyond me.


SemiAccurate reported all the way back in 2012 that they had a ten year plan to switch to ARM. I'm not surprised.


Anything to get more competition in that market is greatly appreciated. Intel products have become a massive disappointment.


Is this maybe why they are looking for Linux kernel developers?

Tim Pritlove has been predicting this move for some years now on his freakshow(.fm).

Anyway, the only interesting thing to me is whether they will share a codebase between macOS and iOS, or whether they will create something new that is not based on Darwin. Maybe they will take the step Steve didn't want to and switch to Linux.


I seriously doubt this will happen. Why would they throw away three decades of work that's known to work on both x86 and ARM?


You are probably right. I myself, however, am curious why they would need Linux kernel devs.


I think Tim Cook is laughing out loud in his office right now at how they just tricked Intel into lowering their CPU prices. However, I wonder how this insider info got out to the public, having a direct impact on Intel shares.

Sorry, but transitioning away from Intel doesn't make sense to me. It would burn a huge pile of money for very little benefit.


What I find funny is that in ~2006 I had a net eng working for me that was a die-hard Mac fan.

I claimed at the time that I would bet him that Apple would start making machines with Intel chips. He LOST it and said "THAT WILL NEVER HAPPEN!!"

And thought I was a complete moron for even suggesting as such.

:-)


This was public knowledge in 2005…


I may have got the year wrong. I don’t recall the year.


Interesting thread on the way Steve dealt with the press during the PPC -> X86 transition.

https://twitter.com/nickwingfield/status/980882009156329472


I remember when Atari and Apple used to use Motorola chips. I miss the greater variety of those days, so I see this as positive, even if I'm not an Apple user myself. (And I look forward to trying out Apple-produced chips second-hand, and running Linux on them....)


It'd be really cool if they rolled down the https://riscv.org/ path. They wouldn't have to pay anyone anything and could do whatever they wanted.

Then again, it won't happen.


I am curious as to how Apple can just jump into a new, complex business and instantly be a pro at it without buying out a chip manufacturer that already is. Will there be a learning curve? Will Apple machines circa 2020 have a bumpy road, or not?


There is a difference between designing a chip (from system model down to mask set) and manufacturing it (fabbing). Apple is a fabless company, just like many others. They use TSMC, Samsung Semi, SMIC and other fabs that provide state-of-the-art fab processes.

When it comes to the design part, Apple has gobbled up enormous amounts of talent and resources over the last 10 years. PA Semi was one of the bigger acquisitions:

https://en.wikipedia.org/wiki/P.A._Semi


Don’t they manufacture chips for their iOS line of products?


I can find no evidence that Apple manufactures chips of any kind.


Neither does it plan to do so.


The Apple chips are manufactured by TSMC & Samsung.


I think the first step is to produce a GPU that integrates well with Intel CPUs. The next Metal version will need even deeper hardware integration, while external GPU support will be a solution for those who need AMD or NVIDIA hardware.
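
As a rough, hypothetical illustration (mine, not anything stated in the article), Metal already lets software tell an integrated GPU apart from a removable eGPU, which is the kind of capability split such a strategy would lean on:

    import Metal

    // Hedged sketch: query the default Metal device and two capability flags
    // relevant when mixing an integrated GPU with an external AMD/NVIDIA one.
    if let device = MTLCreateSystemDefaultDevice() {
        print("GPU:", device.name)
        print("Integrated / low power:", device.isLowPower)
        print("Removable (eGPU):", device.isRemovable)  // available on macOS 10.13+
    } else {
        print("No Metal-capable GPU found")
    }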


I sensed this transition from the point that Apple released the new small MacBook.


A non-event and long-forecasted IMHO. As others have pointed out, they've already done this twice before. I do like the idea that they can thus build a very secure platform. Might even make me buy a Mac again.


The only thing that could go wrong with this kind of switch is software compatibility, and as long as these chips can run x86 software properly and have the same or better performance, I don't mind.


Surprised by the comments on performance. Putting a second CPU/SoC on the board, keeping it in a low-power sleep mode, and then spinning it up as needed would be trivial for Apple.


Apple is fantastic with hardware, so this sounds like a great idea.


This will give them even more advantages: because they are a vertically integrated company, they can differentiate themselves with hardware and software integration.


Here's the full Bloomberg piece over audio: https://goo.gl/V999a4


So Apple laptops will become glorified underpowered netbooks. macOS users totally won't start abandoning the platform even more than now.


Oh boy, more universal binaries.

What are the odds that Apple will just internalize x86 production, as opposed to doing an A11-style ARM derivative?
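
For what it's worth, a minimal sketch (my own, purely illustrative) of what the universal-binary era looks like at the source level: one file, with the compiler emitting a slice per target architecture, much as in the PPC/Intel days:

    // Hedged sketch: one source file, multiple architecture slices.
    // A fat/universal binary simply bundles the compiled result for each target.
    func currentSlice() -> String {
        #if arch(x86_64)
        return "x86_64 slice"
        #elseif arch(arm64)
        return "arm64 slice"
        #else
        return "some other slice"
        #endif
    }

    print("Running the", currentSlice())

Historically, lipo is what stitches the per-architecture builds into a single file, so "more universal binaries" is arguably more of a build-system and QA burden than a source-level one.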


> What are the odds that Apple will just internalize x86 production...

Zero, since it's wildly improbable that Intel would grant them a license to use the x86 architecture.


They could just buy AMD, which is also a rights holder to X86_64 (in fact, they basically “invented” it). Super highly doubt it will happen, but technically possible.


It is not unheard of for licenses to have terms which require renegotiation in the event of acquisition. That might make such an acquisition a mortal threat to x86_64, but not a seamless path to using it.


You don't need a license to use x86.

Moreover, the x64 was created by AMD, not Intel.


> Moreover, the x64 was created by AMD, not Intel.

With the assistance of a veritable phone book of cross-licensing of patents with both Intel and VIA.


> You don't need a license to use x86.

Factually wrong. Ask Microsoft. ;)


Does this mean we can have Open Firmware again?


I wonder if they will be ARM only or if there is some sort of x86 solution built in too. This report has basically zero info.


It's hardly a report. It is at best a rumor.

My strategy with any Apple news: ignore it until it's released or said from the horse's mouth. Remember the rumors before the iPhone about iPhones that had click wheels like the iPod? That's what I consider this.


The click wheel iPhone "rumor" might have been true. During the initial design of the original iPhone, both Scott Forstall (from the Mac OS division) and Tony Fadell (from the iPod division) were tasked with the creation of the iPhone. Both groups took their existing software (from the iPod/Mac) and tuned it for a phone. Ultimately the trimmed-down OS X solution won over the iPod's evolved firmware. I was reverse engineering the iPod video/classic firmware around 2005/2006, and I do remember seeing references to "iphone".


Sure, it might have existed, but that isn't what was released. That version of the iPhone might just have been a proof of concept or prototype, but it didn't end up as a product.

Just like this rumor might be.


It will be ARM only.

It's just not feasible for Apple to develop a new microarchitecture from scratch. Kalamata will license ARM technology. A new microarchitecture takes 6-8 years to develop, and it's a huge investment even for companies like Intel, AMD and ARM.

Currently only Intel and AMD have high-end desktop CPUs. ARM is likely to enter as a third option with the ARM architecture that Apple is licensing, but it will have relatively low performance.

Apple will likely take a hit in the workstation market for CAD and image-processing applications, but they are in a position to gain a lot in more common use cases if customers get a better power/performance ratio.


Meanwhile, with the impending containerization and VM efforts, ChromeOS will begin to look more and more attractive.


Steve Jobs mentioned in interviews what a hell of a job his teams did in the initial change to x86 and the immense work it required.

I wonder what he'd say in this scenario. Not saying "Steve Jobs would have NEVER allowed this apple is going to shit", simply wondering if the effort required to move to x86 should be essentially discarded (depending on the implementation)

Edit: I agree with the replies, they are a good reminder to me of the sunk cost fallacy


In the x86 transition, Apple was discarding the previous effort which had been required to move from 680x0 to PowerPC, and that seems to have been the correct choice.


The effort to move to x86 has paid off tenfold already in the ~10 years since the transition. They now need another effort to sustain the Mac, or whatever comes next, for another 10 years.


It's also worth noting that the OS they migrated was not the one that ran on the bulk of the old PPC systems but a relatively new Unix fork that, at least at the time, was pretty POSIX compliant.


Which is to say, they have experience migrating an entire platform to a different architecture. Not many companies have that background.


Honestly I think Jobs would’ve been all for dropping x86. They also derived an immense amount of value from the effort to move to x86, so it’s not like it was wasted.


>>> They also derived an immense amount of value from the effort to move to x86, so it’s not like it was wasted.

That's a good point and a reminder to me of the sunk cost fallacy.


Can someone explain the current and hypothesized future role of Intel, ARM and now Apple chips among Apple products?


All the vestiges of 90's computing are dying! I couldn't be happier actually. 2020 computing here I come.


Is Apple competent enough to build an equally powerful CPU? Or is it a no-brainer given the kind of money they've got?


They are absurdly competent with CPU design. The A series chips are already matching or even surpassing the performance of lower tier Intel chips. Keep in mind these chips were first introduced IIRC in 2011 with the iPhone 4S, are designed to be passively cooled (no fans) and are limited by the battery constraints of iPhones and iPads. They could conceivably 'scale them up' for laptops and desktops and get a significant performance boost.


Keep in mind that the Geekbench benchmark measures unthrottled performance. Whether the phone is passively cooled or has a 200W-TDP fan strapped to it doesn't matter.


Apple's cash ($77B) is about 1/3 of Intel's market cap ($224B); maybe they could just buy Intel.


Good luck getting that by antitrust laws.


Buying AMD is an option if they really need x86... either short term or long term.


Unfortunately AMD's licensing deals with Intel terminate if AMD is purchased.


I'm pretty happy with my Xeon.


What is next, Apple building its own wafer-steppers / lithography machines?


As long as I get a Unix shell, I won't complain about what Apple does to the hardware.


This coupled with LLVM creates a powerful position for Apple in the future.


So what I'm hearing is the 2019 Macbook Pro will be the one to buy?


Assuming that Apple has the good sense to ditch the touchbar, absolutely.


Would they be able to design their chips to avoid Spectre-type attacks?


Bear in mind Apple’s own mobile CPUs were also vulnerable to these attacks. The vulnerability was with commonly used architectural features found in many processor designs from different design houses, including both Apple and Intel among others.
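
To make "commonly used architectural features" concrete, this is roughly the shape of a Spectre v1 (bounds-check bypass) gadget; an illustrative Swift sketch of my own, using unsafe buffers to mirror the usual C example, and not a working exploit:

    // Illustrative only: the bounds check below can be speculatively
    // mispredicted, so the CPU may read table1 out of bounds and leave a
    // secret-dependent footprint in the cache via table2, which a separate
    // timing step could later observe.
    let table1: [UInt8] = [1, 2, 3, 4]
    let table2 = [UInt8](repeating: 0, count: 256 * 512)

    func victim(_ x: Int) -> UInt8 {
        if x < table1.count {                          // mispredictable branch
            return table1.withUnsafeBufferPointer { t1 in
                table2.withUnsafeBufferPointer { t2 in
                    t2[Int(t1[x]) * 512]               // secret-dependent access
                }
            }
        }
        return 0
    }

    print(victim(1))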


One possibly nice fallout of this would be Intel offering Xeons to consumers to leverage its economies of scale on the server-side for the consumer market (since I believe Apple is the biggest customer for high-end consumer CPUs).


IMO if you have never tried KDE Plasma and installed KDE Connect on your mobile devices, you have no idea what you are missing.

KDE connect is irreplaceable 5 minutes after you install it.


If true, it means they already have version 1 chips today. Perhaps this is why there's new support for external GPUs.


Sure Apple, how did it go last time when you tried to implement your own SSL library without any tests or static analysis? goto fail; goto fail;


Will this make the prices go up or down?

Down, right?


I think and hope that Apple will merge macOS and iOS and thus only need to maintain one software and hardware architecture.


So that was not an April 1 joke?


Their own ARM or x86?


Hey Apple, could we get those chips made in germanium? Kthx


who didn't see this coming?


It's going to be a 128-bit chip (it could be).


I just read 1060 comments, 400 of them are about macOS, 200 of them are not even related to the CPU topic. And no one has any more insight to share?

I find it very strange that people think Apple SoCs aren't any good for desktop workloads.

https://browser.geekbench.com/v4/cpu/compare/7795892?baselin...

This is a top-spec MacBook with an Intel chip in a fanless design, against an iPad Pro, also a fanless design. Let's ignore the small difference in TDP for a minute, because both are limited by their fanless designs, and Geekbench does run for a few minutes, so it is an indication of how close they are in these specific workloads.

The Intel Core i7-7Y75 can turbo boost to 3.4GHz if needed, compared to the A10X's max of 2.4GHz. Even if we ignore the frequency, the iPad's single-core performance is within 10% of Intel's best Core processor at this TDP. The A10 core here is a generation older than the A11 used in the iPhone X and iPhone 8, and the A12 coming this September will likely have even better IPC.

JavaScript testing on both Mac and iOS Safari shows similar performance, and the A11 even edges out Intel in some cases.

So no matter how you spin it, in certain workloads the Apple A11 has already matched or exceeded Intel within a fanless TDP design. That is excluding the advantage Apple will have on multicore when it has four cores compared to Intel's two cores / four threads.
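
As a purely illustrative toy of my own (unrelated to Geekbench's actual workloads), the single-core comparison ultimately comes down to how quickly each core gets through tight serial loops of this kind:

    import Foundation

    // Toy single-core integer microbenchmark, illustrative only.
    // Absolute numbers mean little across machines; it just shows the kind of
    // serial integer work that single-core scores tend to reward.
    let start = DispatchTime.now()
    var acc: UInt64 = 0
    for i in 0..<100_000_000 {
        acc = acc &+ UInt64(i) &* 2_654_435_761  // wrapping multiply-add
    }
    let seconds = Double(DispatchTime.now().uptimeNanoseconds - start.uptimeNanoseconds) / 1e9
    print("checksum \(acc), \(String(format: "%.3f", seconds)) s")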

Intel's 14nm++ is also mature and better than TSMC's 10nm, which is merely a testing node for its 7nm.

i.e. assuming Apple wanted to, they could have a 7nm quad-core A12X shipping in 6 months' time that is better than Intel's Core at a fanless TDP or <15W design, at 1/3 of the cost.

The reality is Intel hasn't been executing its plans for a few years. And if this rumour is true, they have only themselves to blame. When Apple was designing its 2015 MacBook, Intel's roadmap was clear: three years later, in 2017, the MacBook was supposed to have a quad-core design. And looking at all the latest roadmaps, it doesn't seem there will be a 10nm quad-core CPU shipping this year either, not in a fanless design. We are looking at March 2019, four years since its introduction, to get that.

Then there is LPDDR4 memory support. You wanted 32GB of laptop memory? Well, Intel doesn't allow you to have it. Not even in 2018. The delay in 10nm, the little to no improvement in IPC, and AVX2 being very much a niche. As a normal customer, I don't really have a reason to buy Intel anymore unless I want absolute price / single-core performance; otherwise AMD is a much better choice. So I don't think we are the only ones who are frustrated. Apple likely is too.

I also wonder what this move means for the modem space, which is the much more important piece for Apple. I have often argued that one reason Apple didn't make the Mac-to-ARM move earlier was that Apple needed Intel's modems to fence off Qualcomm's "double dipping". Now that Broadcom failed to acquire Qualcomm, Intel is being dumped by Apple. Does that mean Apple is going back to Qualcomm modems soon? Likely along with a deal to use Qualcomm Centriq on servers, and likely later in the Pro lineup of Macs?

Previously, Apple's rumoured Project McQueen was to bring all the servers in-house, using fewer resources from AWS, Azure and Google. Apple is a large server customer in its own right. If Qualcomm gets Apple to kick-start its server CPU business, along with Apple's push to help PCs transition to ARM, it may well be worth it for Qualcomm to lower its royalty and modem prices.


Great, we can go back to the MacRumors nonsense of the PPC days about how this chip is "just as fast" as x86.

With their focus on iOS vs MacOS, this doesn't surprise me. MacOS will simply die once they port their tools to iOS.

Terrible choice IMO.


Apple was always its own world. Their "openness" came from almost closing their doors around 1994, and with NeXTSTEP they played the same game as NeXT did, pretending to be open to get UNIX devs onto the platform.

Now that they have the upper hand, they are free to go back to their old ways without much consideration for openness.

With PC sales declining and most OEMs falling back to the old ways of integrated computers, from before the PC components revolution, I guess most regular users will just follow along.

If macOS/iOS developers can still get their share of the cake, most things will hardly change, although they might lose those who only use macOS as a pretty UNIX.

The question is how relevant are those sales currently.


Except Apple has demonstrated they can design CPUs that outperform the (Android) competition. Who is to say they can't make ARM CPUs that genuinely outperform Intel? Intel hasn't executed all that well lately anyway.


Being able to go toe to toe in low-power areas says little about being able to compete in full-power areas, and vice versa. Intel is a laughing stock when trying to compete in low-power areas even though they dominate full power. ARM is the only choice in low power outside of a few Chinese chips and Samsung's Exynos, yet ARM hasn't made a dent in full-power applications. The two areas do not scale into each other as easily as it sounds.

Intel has been iterating their design decades longer and was in stiff competition for much of that time period. They will be hard to catch in the desktop/laptop space for anyone starting with a cell phone CPU.

ARM has not had real competition in their market, most chipmakers are licensing ARM tech, so they should be easier to catch up to, which Apple has done (with a healthy dose of borrowed ideas from ARM).


You do realize Exynos and the Apple A-series are both ARM processors? They didn't "borrow" ideas from ARM; they licensed the IP, and their chips are fundamentally just better implementations of the baseline IP with a few things added on.


Isn't Exynos also ARM?


You're right, for some reason I was thinking Exynos wasn't ARM based because it wasn't Qualcomm- not sure why. That takes the list of ARM competition down to just the Chinese offerings, unless I'm mistaken about that.


The trade offs for a SoC are different than for a desktop machine, and to a lesser extent a laptop. The desktop is where we'll feel the most pain. I wouldn't be surprised if they cancelled the Mac Pro entirely.


It would surprise me, because they made a big public show of recommitting to the Mac Pro last year.

If they had any doubts all they had to do was say nothing -- which is their strong preference -- and it would have remained clear that the Mac Pro was dead.

Making a strong "trust us, we're working on it" statement and then canceling would be a pretty bad unforced error. Not impossible, but with their top-down, forward-looking decision-making process it seems very unlikely to me.


Given the failure of ARM in the datacenter Apple would be pulling something off that no other semi manufacturer has. It's not impossible but just very unlikely.

It's much more reasonable that they'll do a "best effort" in the high end space but it won't be comparable to a Xeon. You can see the iMac Pro as a prelude to that. A beefy machine to be sure but not what you'd expect for top of the line.


It really depends on what exactly Apple is intending to do, which we just don't know yet.

But if they really are committed to the Mac Pro (and if this rumor is essentially true), then they must be planning something pretty extraordinary:

1. They actually have a solid plan on how they are going to make CPUs competitive with Intel Xeons over the next ~2-4 years.

2. They're planning on a bifurcated line, where some Macs have Apple CPUs and some have Intel CPUs... presumably with different architectures. I guess 2b. would be that the Apple CPUs will actually be x64 compatible.

It all seems pretty crazy, but Apple has done this kind of thing before, and done it rather smoothly. I guess time will tell.


The simplest explanation is that they aren't that committed to the high end. They don't make any money there and they've proven with their actions it is not a priority. A few quotes to the contrary don't change the fundamentals of what they've been doing.


Nah... back then single thread performance mattered a lot. Today most users can do most of their work on something with the performance of an iPad.

Apple will not suffer in 2020 from having lower performance than Intel. It won't matter. Form factor, screen quality, memory speed, memory size, touch pad, battery life etc. will matter so much more.

Multithreading is also much more common in pro apps, which are the ones that require performance. They can just match Intel by using more cores. ARM is way cheaper than Intel, so I don't see how they can lose this game. PPC was an entirely different world.


Not everyone wants a mobile PC.

The mac mini is essentially 1700 days old since they downgraded the processors.

I only have macs because I build apps on them. I buy used and as inexpensively as possible. Using an old i5 for development currently.

I wanted to be an Apple fanboy, but having to rebuy critical software for even point releases because of stability killed that off for me long ago.

Best Apple I've ever owned (beyond my //e) was the Dual Processor Quicksilver. Case opened on the side, you had many upgrade options and it even looked nice (irrelevant to me but I'm sure that matters to some).

ARM will simply be the new PPC. You think things like supporting external graphics cards (yay, they just got limited AMD support) and having high-end rigs aren't what people want, especially in the gaming community?


I don't know where all this conjecture is coming from. Have you seen benchmarks of Apple's A11 chips? They have some of the best CPU technology in the world now. They are competitive with low-end Intel chips already while using significantly less power.

Before they went to Intel, they were at the mercy of Motorola, which couldn't keep up with Intel.


The problem Apple had with Motorola was that Motorola couldn't get the chip yields they needed at high clock rates.

The problem Apple had with IBM was that IBM didn't care enough about power efficiency to make the chips viable in laptops. They also weren't that interested in fast iterations of incremental improvements.

When the G4 first came out, it was far, far ahead of x86. Apple offered it at 350, 400 and 450 MHz but shipped maybe a few hundred or a few thousand at 450 MHz and a few tens of thousands at 400 MHz. Those computers blew the Pentium out of the water. Intel eventually caught up and passed them purely on manufacturing ability.

Then the G5 came out, which was a beast of a processor. But it never came out for mobile, and IBM only upgraded the speed once or twice before Apple went for Intel.


Except PPC was Motorola and IBM, not Apple


Aww, young'uns.

https://en.wikipedia.org/wiki/PowerPC

PowerPC (with the backronym Performance Optimization With Enhanced RISC – Performance Computing, sometimes abbreviated as PPC) is a reduced instruction set computing (RISC) instruction set architecture (ISA) created by the 1991 Apple–IBM–Motorola alliance, known as AIM. PowerPC, as an evolving instruction set, has since 2006 been named Power ISA, while the old name lives on as a trademark for some implementations of Power Architecture-based processors.



Apple contributed heavily; they're the ones who basically invented Altivec.


Because that worked so well last time?


To what are you sarcastically referring?

Apple's switch to PowerPC which gave them a performance lead over Intel for five years and kept them competitive for another five or Apple's switch to its own ARM Core designs which have given them a 12-24 month performance lead over the entire Android ecosystem?

Bear in mind that Apple dominates the high-end desktop / laptop market, both in market share and profit share* which could put a huge dent in Intel's economies of scale for high end consumer CPUs.

* I'm assuming these figures omit servers. Also, there was a lot of noise in around 2009 saying that Apple had over 90% of the high-end PC market, but I've not seen more recent figures one way or another. Given that PC prices seem to have, if anything, slipped, I doubt it's gotten worse for Apple (and goodness knows the tech press loves any statistic that makes Apple look bad).


He is referring of course to the PowerPC architecture. It did not have a performance lead over Intel for as long as you suggest. Even when it had a lead, it wasn't that big. (I bought the best G5 when it came out, so it's a painful memory)

The problem with switching to a different architecture is that if it falls behind Intel, people will say "not as fast as a PC." If you stay with Intel and it falls behind, people say nothing. It's risk vs. no reward.


Pretty sure "last time" refers to the transition from Power PC to Intel.


When they decided to make their own ARM chips and went on to build the best mobile implementation by a pretty long shot?


The wonders of vertical integration!


Apple couldn't have started digging their own grave soon enough. Now all that's left is for Microsoft to play its end game: provide a bridge for NT/Win32 apps to run atop Linux, and spend the next couple of years bolting/porting/reinventing their new UI on a rock-solid server OS, Linux, which will soon become a rock-solid desktop OS.

Bye bye Apple. Nice to have known you. Game over-Linux wins, we all win. No more proprietary OS.


Tim Cook is fundamentally an operations person who drives cost out of the supply chain. During his time as CEO, Apple seems to be innovating in the supply chain instead of the products. At Apple's scale, building your own chips and iPhone screens makes sense but doesn't intrinsically add value to customers. I wish Apple would invest as much in their software as they do in reducing cost.

Fortune Magazine in 2008: Think of Cook’s contribution like this. There are two basic ways to get great profit margins: Charge high prices or reduce costs. Apple does both. The marketing and design drive consumers wild with desire—and make them willing to pay a premium; Cook’s operational savvy keeps costs under control. Thus Apple is a cash-generating machine. Cook has called the company a place that is “entrepreneurial in its nature but with the mother of all balance sheets.” At last count that meant $24.5 billion in cash and no debt.



