Ask HN: Why did OS X win out over Linux for so many developers?
33 points by coned88 on Dec 13, 2015 | 53 comments



It's the best possible bridge platform for *nix development.

POSIX enough that tools and environments work pretty well without a mountain of hacks and workarounds (e.g. Cygwin).

Mac enough that the user experience is coherent and consistent across the overwhelming majority of applications. (e.g. drag n drop, key bindings, media interop, etc.)

Popular enough to have native MS Office in orgs where that's still a hard requirement.

I tried to force myself to go full Linux by swapping out my MacBook Air for an X1 Carbon Gen 3 running KDE Plasma 5. The environment was nice and customizable and I was able to get pretty comfortable with it, but the instant I wasn't using a Qt 5 / KDE Frameworks 5 application, the user experience fell apart. I couldn't set my key bindings the way I like in GTK apps because the GTK/GNOME teams apparently gave up entirely on accels files and key-themes. Media interop was pretty much non-existent, and there were lots of annoying little bugs (e.g. resizing a window would drop its focus, leaving me in a state with no active window until I clicked back into it).

I still use Kubuntu 15.10 on a 12-core Dell T5500 w/ 48GB RAM for running larger distributed systems simulations/tests, and it seems about a hundred times more usable than the Windows 7 machine my job originally provided, but when I want to move fluidly between development, making arch diagrams, writing docs, or creating conference decks I can't escape how much better the complete experience is on my Macbook.

Also, LibreOffice Impress somehow managed to make a UX more bewildering, broken, and obtuse than PowerPoint, which I'd previously thought to be impossible. Viva la Keynote!


I used to react negatively to all the Windows + Linux bashing because I've never had issues. I also use Macs. No issues there either. They are tools and they have different personalities. I thought I was just, well, somewhat unique.

Maybe something was wrong with me that I just didn't get it. I didn't understand how bad it supposedly was to use Windows as an engineer (and like it), and to not have one iota of interest in becoming a vim guru who rejects graphical code editors and IDEs (while still using vim as needed). So, yeah, even with over 30 years in computing and engineering, I sometimes thought I was a little nuts for not getting it.

That was until this year, when a contract we won gave me the opportunity to spend a non-trivial amount of time (12- to 16-hour days) inside one of the most highly regarded technology companies in the world. What I see is thousands of engineers doing amazing work and, interestingly enough, every desk has a Windows machine on it. I see IDEs everywhere and not a hint of vim. I see Linux everywhere, running on virtual machines, and no problems at all. I also see all kinds of other applications, and the amazing way the Windows ecosystem just absolutely hums when set up and managed professionally. Not a Mac in sight. Well, actually, just a handful, out of thousands of PCs (I'm guessing >30,000). What's also interesting is that I have never heard a single engineer complain or worry about anything Windows or Linux. Ever. Far more important stuff to focus on.

I don't know what I can conclude from this experience other than, yeah, it works like a dream when set up correctly. In fact, some of what I've seen has caused me to rethink some of our internal setup. The other realization is that the OS largely becomes irrelevant in the context of an organization. What is important is how the web of computers, users and applications is set up and configured in order to create a larger tool-set with which to run a business. I've seen how a very large Windows deployment becomes largely transparent to an organization, to the point where everyone can focus on the job at hand. It's awesome.


> I also see all kinds of other applications and the amazing way the Windows ecosystem just absolutely hums when setup and managed professionally

I think most of what you're seeing comes down to this: Windows works great if you have a healthy budget, a high-quality IT team and, critically, a strong sense of mission about making users productive.

Unfortunately, the average Windows user is lucky to have even one of those three be true. The normal experience is indifference/incompetence, being forced to use enterprise software which has been in development for decades without once having a usability audit, arbitrary impediments being excused as security requirements, etc. In many cases, using a Mac or Linux box was popular because it was the de facto way to opt out of the centralized muddle, and, later, the rise of iOS meant that those IT departments were told to make their systems work with the CEO's new pride and joy rather than trying to customize it into uselessness.


"inside one of the most highly regarded technology companies in the world."

any reason not to name the company? I'm trying to think of a technology company where Windows (and not Mac/Linux) dominates engineering workstations, and failing.


Sorry, I just can't name it.

You have to remember that "engineering" isn't just "software engineering" or "web development". The vast majority of the engineering world does not use Mac/Linux. There are countless major engineering tools that only exist on the Windows platform and this has been the case for decades.

None of what I said is to imply these platforms are inferior in any way. We use both Mac and Linux. I prefer to do web development work on Linux because, well, you are working in exactly the environment you are going to deploy on, and tools like PyCharm work great under Ubuntu. Outside of that, yes, doing web dev on a Mac is the next best thing. On Windows I always have to run an Ubuntu VM; there's no point in jumping through hoops to make believe you have a Linux environment when a VM works great.

Once you shift your focus to circuit design, layout, mechanical engineering, CAM and other high-power commercial tools, Windows is pretty much king. And, once you look at how smoothly Windows, Office, Exchange and other tools integrate at an enterprise level, well, it's hard to ignore how awesome of an environment it turns into.


Yeah, I hadn't used Windows in many, many years, but when I did it was to run Mastercam and SolidWorks.

I used it briefly to do Visual C++ and .NET CF development after that, but relatively quickly moved on to projects with a lot of open source underpinnings.

That was the problem with being originally compelled to use Windows at my current job. Trying to build infrastructure automation pipelines and Erlang software on Windows that will eventually be deployed on Linux is a colossal pain in the neck.


Microsoft?


probably wintel


It seems the migration to laptops as the primary dev box was the biggest driver.

The Linux driver issues were much more severe when running Linux on a laptop (power management, CPU C-states, etc.). I need dependable wifi, sound, video and other driver updates... I got absolutely tired of wondering if X, networking, and/or sound was going to work after each and every minor update.

Apple was/is the only vendor shipping a "working" system in laptop form for a reasonable price.


Seconded – you could see the trend start rolling in the early 2000s at conferences, meet-ups, etc. I knew a number of Linux / BSD users who switched, and every single one of them cited driver issues as the primary motivation – having a coherent UI is _nice_, but not having to choose between an hour of battery life and daily kernel panics, being able to play audio/video easily and stably, etc., was compelling.


- OS/hardware integration "just works" out of the box, with occasional hiccups on major OS updates. Overall, very little wasted time.

- Trackpad and MagicMouse are generally well above the mainstream.

- Good iOS interoperability.

- Excellent screen, battery life, weight and finish.

- Most unix dev tools/apps run well. Homebrew.

- Aesthetics. Yes, it counts.


Over 15 years I have worked on OS X, Windows (up to Windows 7) and Linux (Kubuntu, Mint, Arch). In the end, Arch Linux won.

There is no argument about the fact that OS X / Windows are much easier to use for people who are just starting their journey. However, at some point the cons simply overtake all the pros.

I disagree, though, with some other commenters that Linux is hard to use on laptops (Linux has come a long way - "normal" people can enjoy it just like pros). I also don't agree with the comments about sharp looks or battery life - I personally use a Samsung Ativ 9 and find it far more aesthetic than a MacBook. No problem setting up Arch on it, and no time wasted making things work.

And none of my devs use OS X. They have come a long way themselves and probably know better than I do. My observation is that Macs are much more popular in the US, so since I'm based in London my view may be biased.


For me, Arch won on every machine - desktop, laptop and RPi - and I think it mostly had to do with how Arch stays closest to the Unix philosophy. I used Fedora for a few years, and Ubuntu before that, and I think I only ever rolled my own package or modified an existing one once or twice, because of the sheer complexity of it. With Arch, from day one I was fiddling around, and every time I need something that's not packaged already (and that's rare), it takes me minutes to do it myself. I'd much rather know how to build my way through (and not suffer) than cross my fingers that someone else already has.
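
For anyone who hasn't looked at one, a minimal PKGBUILD is roughly this - the package name and URL here are made up for illustration, but the fields and the build()/package() hooks are the standard ones makepkg looks for:

    # Maintainer: you <you@example.org>
    pkgname=hello-tool        # hypothetical upstream project
    pkgver=1.0
    pkgrel=1
    pkgdesc="Tiny example package"
    arch=('x86_64')
    url="https://example.org/hello-tool"
    license=('MIT')
    source=("$url/$pkgname-$pkgver.tar.gz")
    sha256sums=('SKIP')      # use a real checksum for anything you publish

    build() {
      cd "$pkgname-$pkgver"
      make
    }

    package() {
      cd "$pkgname-$pkgver"
      make DESTDIR="$pkgdir" install
    }

Run `makepkg -si` in that directory and pacman installs the result like any other package, which is why rolling your own rarely takes more than a few minutes.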

Adding to that, I find Arch to be so clean and minimalist and more importantly unsurprising. Because of its rolling release cycle, I never have to plan an upgrade which always stressed me out with Fedora and Ubuntu.

Regarding hardware support, I'm running Arch on a mid-2015 Lenovo X250, everything works out of the box, and I get a solid 8 hours of battery life out of it.


Well, I'm an Arch + Gentoo user. Though I've optimized both to be as energy efficient as possible, I've never been able to get something comparable to Windows in terms of battery life. Once it's fully optimized (and I'm on XMonad, not GNOME, so power consumption should be noticeably lower), I can get about 3 hours or so, as long as my browser (Chrome) isn't running JavaScript. The moment I start some decent browsing, I cannot get more than say 2 hours from my laptop. Video watching? Again, not more than 2 hours. On Windows I could go on to watch movies for more than 4 hours (this might be less because of the smaller battery on my device). Do you do some special magic to get that much juice out of your battery? I'd love to know some cool tips. I wrote some myself on my blog.

P.S. Not to mention, running Emerge on Gentoo with -j9 lands me at a battery life of less than half an hour :P


As boring as it may sound, I didn't do anything special, just the run-of-the-mill practices: lowering brightness, running `powertop`, and setting all tunables to "Good". At idle, this gives me ~3 Watts of usage. With the 46 Wh battery, that's about 15 hours. Of course, when I actually start using the laptop that's more or less halved. With the screen off (when I'm using the external screen), it goes down to ~2 Watts, for a whopping 23 hours of battery life at idle. I remember an Acer TravelMate I purchased back in 2009; the lowest I could get its power consumption was 7 Watts at idle. I'd say Intel, for the most part, has come a long way in power efficiency.

EDIT: I use Mate Desktop BTW.
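
Concretely, the whole routine is a couple of commands (assuming powertop 2.x, whose --auto-tune flag flips every tunable to its "Good" setting), plus the back-of-the-envelope arithmetic above:

    # see what's drawing power and browse the "Tunables" tab
    sudo powertop

    # set all tunables to "Good" in one go
    sudo powertop --auto-tune

    # rough battery-life math from the numbers above:
    #   46 Wh / 3 W  ~= 15 h at idle
    #   46 Wh / 2 W  ~= 23 h with the panel off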


Corporate procurement practices and policies definitely come into play, as well.

With Linux, there isn't a consensus laptop model that everyone will request. One ends up with a lot of one-off business cases, vendors, service contracts, etc.

With Mac, you have a consistent upgrade cycle, one service contract, and no fragmentation of OS distribution usage, etc.

TL;DR: It's easier to say "I want a MacBook Pro w/ Cinema Display..." and "we hired another developer, please re-order a Mac dev setup" than any similar Linux setup.


Hmm, I've been using Gentoo on desktop, laptop and servers for over 10 years.

My Mac devs make workarounds in our code (which is geared towards Debian servers) so it runs in MAMP. Linux all the way through is a win.

As far as having to edit config files to make things work: I think it's good to know what's under the hood.

We get recent CS grads who know very little about how computers work. Linux might force you to learn the fundamentals, but I argue that is a good thing.

And, my Gentoo desktop (primary dev box) has been happy for a decade - through upgrades to hardware and software. Cheap & stable. What's not to love?


The hardware is pretty solid (you could literally beat someone up with an MBP), and with 8+ hours of battery life when coding (in the shell, not PhpStorm/IDEA)... nothing even comes close to it.

And it's unix-y enough (and modern, if you use MacPorts to install current versions of core tools) to allow daily work on OS X instead of Linux.


It gave users unix with a visually consistent desktop environment right out of the box. Also, the hardware integration makes it easy to just get started right away without fiddling for days.


One obvious thing is that OS X runs on top of a BSD-ish architecture. So while using Mac OS 9 or Windows could be painful in some respects, OS X has a shell etc. similar to Linux already.

Prior to Ubuntu, this was a no-brainer - setting up things like a wireless adapter could be a trial. I run Ubuntu on a System76 laptop and have been happy with it. I like being able to get the source for everything I use and being able to patch it.


There are lots of great technical reasons already discussed, but I would like to add one factor to consider: the "Mercedes effect". Apple devices do have a certain status associated with them. Combine that with the fact that many startups like to showcase their developers in videos working on dual-monitor 27" iMacs and MacBook Pros.

Food for thought.


People have Macs for similar reasons as they have iPhones, and for similar reasons as they have Apple Watches.


OS X is a lot more refined, and the GUI programs available for developers are richer. If you're switching from Windows, you're likely to find more of the programs you're familiar with, or alternatives. Also, Mac hardware is often found in universities, which gets students familiar with the environment.


My understanding is that the OS X EULA does not permit OS X to be run on non-Apple hardware or in a virtual machine that is not also running on Apple hardware.

So if you want to test on OS X (including Safari) then you need at least one Mac.

Sharing a single Mac for testing could be enough, but with a team of more than a few devs it may become a bottleneck.

You can run Windows and Linux VMs on a Mac - without breaching any EULAs.

For web devs who care about Safari, Macs become almost mandatory.

Note, I do not own or develop on a Mac. I primarily work on desktop simulation software written in Java. I found the Mac love professed by other devs I know somewhat bewildering for a long time. It was only when I dabbled in some web development with a Rails app that I realised how much pain came from browser differences across platforms. Now the strong preference for Macs made more sense.


Not sure what it is exactly, but the Retina display doesn't play nice with Ubuntu 14.04 LTS.

Webcam drivers also broke when Apple switched from a USB webcam implementation to PCI or something.

Power consumption on OS X is probably half or a third of what it is on 14.04 LTS.

I founded a company and need MS Office (unfortunately), Fusion 360 for CAD work (FreeCAD didn't quite cut it), and once things got rolling the number of Skype calls picked up.

I still dual boot and prefer Ubuntu, but now I am 90+% in OS X.

Before this computer and startup, I had been using Ubuntu for 6 years and loved it. I'm looking forward to going back one day, but for a while I'm going to be on OS X.


I did .NET programming on Windows for 3-4 years and felt the programming environment was pretty neat. The frustrating parts were Windows upgrades, the OS eating up all resources, and the frequent need to upgrade the machine.

I switched to Linux (Red Hat and then Ubuntu) for the next 8 years and loved vim and the programming tools that Linux had to offer. Resource utilization was never a blocker. The frustrating part was wireless drivers and the machine hanging because of them.

I recently shifted to OS X and installed iTerm/vim and all that. There have been no issues with wireless hardware or resource utilization. However, setting up a production-like environment, which runs on Linux, is a huge pain (a common VM-based workaround is sketched after the list below). Running dual-boot Ubuntu is also not as seamless, and there are quite a few display driver issues. My take:

- If you have just started programming, start with Linux (if you haven't fought enough to compile drivers for your machine, you are one bit less of a real programmer)

- If you are doing a lot on the server side, which is largely Linux-driven, then you'd better use Linux to understand systems and deployment.

- If you are using Eclipse, then you'd better shift to OS X, because no other hardware/OS combo at that price can let you code in peace.
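
On the production-like-environment pain: a common workaround on OS X (or Windows) is a disposable Linux VM, e.g. via Vagrant with VirtualBox. A sketch using the public ubuntu/trusty64 box; adjust the box to whatever your servers actually run:

    # one-time: write a Vagrantfile next to your project
    vagrant init ubuntu/trusty64

    # boot the VM and SSH in; the project directory is shared at /vagrant
    vagrant up
    vagrant ssh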


I'd guess that the bar to entry for developing on OS X is a bit lower than on Linux.

While Linux distros like Ubuntu make it really easy to set up a developer system (with a localhost web server, languages and a database), it is even easier on OS X via MAMP: install MAMP, configure with a GUI, ready to roll. On Linux you might be tweaking some config files to get the optimal setup.
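
For comparison, the Ubuntu-side equivalent of that MAMP install has shrunk to a couple of commands (a sketch assuming the stock lamp-server tasksel metapackage; exact package versions depend on the release), though anything past the defaults still means editing config files:

    # Apache + MySQL + PHP in one shot (the trailing ^ selects the whole task)
    sudo apt-get install lamp-server^

    # typical first tweak: clean URLs, then restart Apache
    sudo a2enmod rewrite
    sudo service apache2 restart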

On Linux you are partially a dev-op not only working on your code but also learning and tweaking your OS, services, etc. for one reason or another.

Another factor is that there are some shinier tools on Macs (e.g. the Adobe lineup, and an easily installed Sublime Text editor). And many who went through learning institutions will be more comfortable with Dreamweaver/Photoshop/Illustrator than Eclipse/GIMP/Inkscape.

I took the Linux route even though I already owned a Mac; I felt that on Linux I was closer to the metal, whereas Mac OS X had too many safety rails (both for the user's safety and for many publishers' safety).


I'm not sure why the Adobe suite of products is misconstrued as an "Apple only" set of tools; it's a myth I've run into many times before. They run rather nicely on other platforms, and Sublime isn't Mac-only either. I'm forced to use a Mac at work (my employer wants the same platform to be used company-wide), but I have always used a wide number of graphics and development tools (the Adobe suite, Sublime, Maya, ZBrush, etc.) on Windows over the years and will continue to do so. I haven't played with Linux yet, but I detest Mac OS, as I don't need training wheels.


Saying that OS X has "training wheels" just basically disqualifies you from a comparative discussion of OSes, in my opinion. It's substance-free denigration that adds nothing to the conversation.

If there are parts of OS X that you don't like, fine, post about those, specifically.

Adobe products run fine on Windows; in fact, Premiere runs better on Windows than on the Mac. But the parent was comparing Macs to Linux, and Adobe does not run on Linux at all.


There is nothing denigrating about using "training wheels" as an analogy. Apple's bread and butter is media consumers, not developers, and in my experience it shows in the OS. "Training wheels" are for kids to learn how to ride a bike without hurting themselves; similarly, Mac OS is primarily geared towards providing a "safe" home computer experience for computer-illiterate media consumers, where the user is protected from destroying their system by the OS being passively prohibitive. These "training wheels" are particularly irritating, and although they can be worked around, they also make simple tasks far more complicated than they need to be, which is rather frustrating for some developers. All platforms have their share of problems.


> Apple's bread and butter are media consumers and not developers, in my experience it shows in the OS.

I mean, have you been to a developer conference lately? That wasn't a Microsoft conference? Did you happen to see any Macs there?

> Mac OS is primarily geared towards providing a "safe" home computer experience to the computer illiterate media consumers where user is protected from destroying their system by being passively prohibitive.

OS X has shipped with a complete Unix shell since 2001. Pretty much every dangerous command you can think of on Linux will execute the same way in Terminal.


>I mean, have you been to a developer conference lately? That wasn't a Microsoft conference? Did you happen to see any Macs there?

I haven't been to an MS developer conference, and I saw a mix of platforms at all the other conferences I've been to. Have you actually watched Apple's product launches? Ever notice that the media consumer is whom they are marketing to?

>OS X has shipped with a complete Unix shell since 2001. Pretty much every dangerous command you can think of on Linux will execute the same way in Terminal.

There is so much more to an OS than the terminal, so it's not all that matters. I suppose if one only worked 100% out of the terminal and nothing else, it would be a non-issue.


This Apple product launch was for consumers?

https://www.youtube.com/watch?v=w87fOAG8fjk

Again: if there are aspects of OS X you don't like, that's fine. There are certainly some things that I don't like. But the idea that it's somehow got "training wheels" and is therefore not suitable for developers is just not supported by any evidence.


You're offended by the term, but I assure you I'm being quite objective - the "safety measures"/"training wheels" clearly exist. Why is it that you feel it's not suitable for developers? I'm curious, because I never said that, but you did.


Why? Because far too many developers are distracted by technical baubles instead of prioritizing long-term freedom. We are losing the War on General Purpose Computing, and Apple - having convinced a generation of programmers to develop for its closed platforms - has done a lot of damage to computing freedom.


Absolutely. I would continue to use Linux even if I thought my development environment / user experience were worse than on Windows or OS X. Fortunately I don't though!


Seems to be the case in the US. I work for a small dev shop and everyone else is using OS X, except me. I'm running Linux Mint 17.1 (Mate Desktop) on a Toshiba Satellite w/ 4K screen, SSD, and 16GB RAM. Love it. And I like PC-style keyboards better than Mac keyboards :)


I know the only reason I use OS X for work is because we have an iOS app. If we didn't have that, I - and probably half the other developers - would be rocking some flavor of Linux. You can write software for 99% of users on OS X (with a Windows VM, anyway).


For me at least, I made the switch to OS X simply because of software support. There wasn't a good Linux equivalent of Sequel Pro; there's no git GUI tool on Linux that matches SourceTree (which my team was standardizing on for its gitflow integration); etc.

And damn, the hardware is just nice. If I could run Fedora on a MacBook Pro, that'd be my ideal setup. Or if OS X wasn't so terrible at customization -- the number of sketchy hacks I've had to install to get my setup how I like it is just depressing.


There are still no good Linux laptops* that come with Linux as the main OS. The MacBook Pro blows away all the competition.

*I've seen poor reviews of System76's stuff.


Pretty darn happy with my Dell XPS 13. Use Arch Linux instead of the included Ubuntu LTS, though.


I have worked on Windows, Linux, Mac. Nowadays Mac-only:

- least amount of hassle, 95% of the time it works

- very good piece of hardware

- great number of OS X-only tools

expensive, but it's the tool of my craft so I'm willing to pay for it


On the hardware side: repairability, consistent hardware across machines, and less waiting time for repairs than competitors, which are too fragmented; the hardware is also far less defect-prone.


Because Apple products are very popular. What laptop/PC comes with Linux preinstalled?


Dell currently makes two laptops that officially support Ubuntu and they're great. I had my company buy me one when I started there and I'm very happy with it.


System76. I use a System76 laptop right now.


Well, I'll give them that the options are awesome: 64 GB of RAM and PCIe solid-state drives with top-of-the-line graphics and CPU for about the same as a top-of-the-line MacBook Pro ($3k+). But it's heavy, kind of ugly, and has low battery life. Which to me means I might as well just buy a desktop for less.


Yeah, they're not bad. I have had one of their products for almost 5 years now and it works pretty well. I wish they had a 13" model in their product line, though.


They are nice laptops but unfortunately not a viable option if you don't live in the US.


Works out of the box.

Most dev things work exactly the same as on Linux.

You can use Photoshop.


For the same reason so many developers use JavaScript on the back end - misapplied laziness.


I agree with ya. It takes a lil work to get Linux tuned to your needs, but once set up it rocks hard. I use Ubuntu server with the i3 wm and a bunch of shell scripts. I control my world with just my keyb and I love it.


Ain't nobody got time for that [Linux]!



