I don't think ambition is something you have to earn; more power to him.
However, considering the cat-herding involved in creating great user experiences from the top down, and considering how much Apple pays its engineers and designers, if he pulls this off he's 100 times the leader Steve Jobs is.
Package management is really useful for servers, or other systems where there are lots of dependencies. Desktop apps hardly need it, and OSX apps in particular don't need it. The combination of drag-to-Applications-folder installation and checking for new updates when you launch the app solves the vast majority of this problem for the vast majority of users.
That's a circular argument. OSX apps solve the dependency problem because they have no other way of dealing with it. They have copies of the dependencies inside the app bundle because they have to be self-contained bundles.
But that's not solving the problem: if you have a defective library inside a bundle, upgrading the copy that lives inside another bundle has no effect on the first one. Programs are on their own, and the provider must ship a new release of the bundle for each and every update of each and every library that gets bundled. Each program has to self-update and decide whether the user will be exposed to a vulnerability while it self-updates, or whether the user will have to wait for the upgrade to finish before he or she can read e-mail.
The idea that all shared libraries are in effect shared and updated at the earliest possible time, automatically, makes a lot of sense for desktops.
This is insanity. It only looks sane if you have never seen anything better.
Remember DLLs on Windows? Having one new app upgrade a shared DLL was the source of so much pain in other apps. In a perfect world where interfaces are rock-hard and apps code to them as specified rather than as they actually behave, shared libraries are a fine idea. In the real world of desktop software, though, independently tested, self-contained packages are a much more robust solution. OS X's application installation system works wonderfully, and the libraries Apple provides to let publishers push updates work well enough.
And, disk space usage from redundant code is a non-issue.
Of course I do. That's one problem package management solves. You (the software publisher) don't update libraries you don't own, just your application. Shared functionality, like how to read PNG images, is managed by the system. This also forces library makers not to break interfaces, solving the other problem you mentioned. If you need a specific version of a given shared component, you state that dependency in your package and, if possible, the proper library gets installed with your package.
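Stating that dependency is just a line in the package's metadata. Here's a minimal sketch of a hypothetical debian/control stanza (the package name, versions, and description are all made up for illustration):

    Package: myapp
    Version: 1.0-1
    Depends: libpng12-0 (>= 1.2.42), libc6 (>= 2.11)
    Description: hypothetical app declaring the shared libraries it needs

If a compatible libpng12-0 isn't already installed, the package manager pulls one in; if only an incompatible version is available, the install is refused rather than silently broken.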
Added disk usage was never the issue. Consistency of behaviour is. With shared components you know your machine will decode PNGs in exactly the same way across all the applications installed.
Oops, I just meant if you remembered the havoc they wrought, not whether you actually remembered them or not. Didn't mean that to sound the way it did.
I think the problem comes when software devs code to the actual behavior of a library's interface rather than to the stated behavior (which may not even be completely documented). I think this happens pretty frequently, and always if the library's behavior deviates in a way that breaks the program.
In the context of Mac OS, I would prefer not to rely on the library makers, etc. to do the right thing in not changing interfaces, especially in the event that an earlier version didn't quite meet its spec. I'm OK with the earlier version not meeting spec, as long as the software maker accounted for that (which he did, if the software he built on it is solid).
Also, I think consistency of behavior is more important in unix-style single-purpose apps than with monolithic Mac OS style apps, due to the former's higher likelihood of being used as part of a script or larger program. I agree that package management is a good thing on Linux, and shared libraries probably are, but I'm not convinced that's true on Mac OS.
You're absolutely right, but as a developer, package management makes for a significantly more pleasant experience.
I also really hate how every OSX app has its own way of doing updates. Even if OSX makes desktop apps not need dependencies, it would be really nice if I only had one 'update' button to push. As it is, 5 different things want to update practically every day.
> it would be really nice if I only had one 'update' button to push.
No. If you are asking for updates and fixes, you are doing it wrong. It should be automated, because you shouldn't have to remember to press a button. I set up my mother with a Linux box and I sure as hell don't count on her pressing buttons without some prompting.
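On a Debian-based box, for instance, security updates can be applied with no button-pressing at all (a minimal sketch; assumes the unattended-upgrades package is in your repositories):

    sudo apt-get install unattended-upgrades
    sudo dpkg-reconfigure -plow unattended-upgrades   # turn on automatic security updates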
> As it is, 5 different things want to update practically every day.
If you have a defective library that's bundled in five apps and needs fixing, you will have five upgrades instead of one. And, possibly, an OS update and a reboot.
But only if you are lucky and the publishers of the five programs are paying attention. Most likely, you will end up with five different bugs scattered throughout your system with many different libraries that really should be just one.
"Most likely, you will end up with five different bugs scattered throughout your system with many different libraries that really should be just one."
Actually, I prefer the five. When you have a shared library, and bugs are fixed, applications that use the library might well break. Shared libraries mean that either applications are at risk of completely failing to work, or that the library developers have to flag the bugfix as a new version to avoid that. Either way, incompatible bugfixes require application updates anyway, so you might as well avoid the breakage and use static libraries.
It seems to me that the only good non-mirage reason for shared libraries was disk space usage, and that's just not a problem any more. Down with shared libraries and dependency hell, I say! ;)
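For concreteness, the shared-vs-static choice is made at link time. A quick sketch with gcc, where app.c and libfoo are made up for illustration:

    # shared: the binary loads libfoo.so at run time, so library fixes arrive system-wide
    gcc -o app app.c -lfoo
    # static: the library's code is copied into the binary, so fixes require rebuilding app
    gcc -static -o app app.c -lfoo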
With a proper package manager, applications won't break with the upgrade of a shared library, because their declared dependencies will mark the new shared library as conflicting with the application (or, what usually happens, a new build of the application against that shared library is uploaded to the repository at the same time as the library). Sometimes the packaging of a program has bugs in it, so the situation you describe does happen, but I've only ever seen it when installing random Ubuntu packages in my Debian install.
So, yes, while you're right that shared libraries can sometimes cause the problems you describe, a good package manager will pretty much fix them. But what I love most about Linux is that if something bad like that happens (like the ABI of a shared library changing)... it's a minor inconvenience, but so what?
    apt-get source $package            # fetch and unpack the source package
    cd $package-$version
    sudo apt-get build-dep $package    # install its build dependencies
    dpkg-buildpackage -us -uc          # build binary packages (-us -uc skips signing)
    cd ..
    sudo dpkg -i *.deb                 # install the rebuilt packages
And bam, you've just rebuilt the program against the new shared library with the different ABI, and everything works. Obviously this isn't "user friendly", but then again I never had to do this before I started mixing packages from different OSes together. And isn't it so easy to do! You can do that for any program on your whole computer! "Hmmm, I wonder how this program works", and one command later you have its source code! Another command later and you've built all of that code, into a nice Debian package too. It's so wonderful to have such a powerful package manager.
It used to be really easy to junk a Debian system by installing random apps to try them and then uninstalling them. Even with more modern Debian-based systems like Ubuntu, installing and then uninstalling something (say, kubuntu-desktop, to pick an example that happened to me recently) does not leave your system in the same state it was before you started. Instead, you get (to a GUI user) bizarre random configuration changes, and programs that worked may now not work (or vice versa!).
Basically, as someone who switched back to Linux from Mac recently, Synaptic and the Software Center (and why there are two tools that show different-but-overlapping package sets is another WTF) are full of surprises and why-did-THAT-change?! moments.
> It used to be really easy to junk a Debian system by installing random apps to try them
Oh... the '90s...
Seriously: I moved to Debian-based distros in 2002 and never experienced anything like what you describe. And mind you, I ran testing with packages from sid directly for a couple of years.
Before switching to Ubuntu late last year, the last time I'd used Debian on the desktop was around 2000, so it's true that my experiences with Debian are mostly 90s-era. However, it's still the case with Ubuntu that it's common to install something, try it, and then be unable to get things back to the way they were before (mostly speaking of desktops, WMs, and themes, here). It's not uncommon to allow Update Manager to update things and then find that programs that were working fine suddenly don't work. This happened to me just in the last month with Wine, and happened a while before that with PulseAudio.
I have the patience and time to spend a day changing my config until things work again, but it's certainly a ways behind both Mac and Windows in this area. Additionally (while I'm venting), Ubuntu trains you out of reporting bugs, because if you actually reported a bug whenever things went wrong, it would be a part-time job. I actually had fewer update problems with Gentoo ca. 2003, though at the time I had things to get done and just wanted stuff to work, whereas now that I don't depend on my home system to make money, I find fixing the problems fun. :)
> It seems to me that the only good non-mirage reason for shared libraries was disk space usage.
Shared libraries are shared in memory too. If you have 5 separate copies of libfoo and you start up applications using them, you'll have 5 copies of libfoo in memory too. And memory is a much scarcer resource than disk space.
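You can see the sharing on any Linux box (a quick sketch; paths assume a typical glibc system):

    # list the shared libraries a dynamically linked binary will load
    ldd /bin/ls
    # every dynamically linked process maps the same libc.so.6; the kernel
    # keeps one copy of those read-only code pages, not one per process
    grep libc /proc/self/maps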
True, although code memory space consumption is not the issue it once was. The consumption of most of the big hogs (web browsers, word processors, etc) is data-related.
It's pretty silly to have a serious dev machine without a package manager. What do you do when you migrate to a new machine? I like to copy a list of packages and then run an update/upgrade command. Linux wins by far. Copying a directory of self-contained apps is quick and easy as well... but if you need binaries for a newer arch, say x86-64, you'll have to download a new copy, whereas the package manager solution has you covered.
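On a Debian-based system, for example, that migration is roughly this (a minimal sketch; the file name is made up):

    # on the old machine: dump the list of installed packages
    dpkg --get-selections > packages.txt
    # on the new machine: feed the list back in and install everything
    sudo dpkg --set-selections < packages.txt
    sudo apt-get dselect-upgrade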
That's true. I quite like OS X and use MacPorts as well. These days I use OS X almost exclusively. The only reason Linux wins is OS X still has some installers. Not many, but any at all are too many for me.
Copying app bundles around is easy enough though and they make things Just Work w/ fat binaries, so kudos to Apple for making that easy. Their Migration Assistant is slow, but still eases much of the pain of migration.
In addition to that, there's no restriction on having something like apt-get for packages that really don't make sense in the "drag-to-install" paradigm. Which is why I installed my own version of ruby, say, out of ports.
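Something like this, assuming MacPorts is installed (ruby is just the example package):

    # install a private copy of ruby under /opt/local via MacPorts
    sudo port install ruby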
I like OS X because, yes, there's a paradigm, but there's a rich underside that you can use as much as you like.
This isn't package management for OSX per se, but if you haven't seen it already I highly recommend MacPorts/Porticus, which is pretty effective at managing packages for the unix-y apps that OSX is missing.
You gotta admire the guy's ambition.