Non-Apple's Mistake (loper-os.org)
148 points by zephyrfalcon on April 16, 2010 | 137 comments



Amid the ranting, he's making a good point.

And that is that there is no serious competition for the hard-to-achieve combination of good taste, obsession with details that seem negligible but ultimately matter a lot, and so on, embodied in Apple's recent products.

I know this sounds like raving Apple fanboy stuff, but I do think it's rarely acknowledged explicitly; it's either assumed by those in the know, or ignored by those not.

For example, I would never in a million years want to develop Android software; why? Because it's written in a language I loathe, and the API is workmanlike but completely uninspired (disclaimer: I've never written any code for it). In contrast, the Cocoa Touch APIs are delightful, and MIT-style (vs. New Jersey) at most levels. Perfect? No. Amazingly crafted? Yes.

I claim the secret sauce to their APIs is actually a character at Apple named Ali Ozer who, along with his crew, has maintained a firm hand on the tiller, and learned a huge amount about object-oriented design for interactive systems over the past couple of decades.

iPhone OS is the result of throwing out a lot of the cruft and starting over, but keeping the hard-earned knowledge. (Core Animation is one of the key underlying technologies for iPhone OS that found its way back to the desktop, but only partially, thanks to all the cruft in normal desktop Cocoa.)

And Apple works hard at the silicon-to-UI whole-stack integration and performance tuning, something that other vendors famously haven't been able to do (because they don't control the whole stack), or haven't wanted to do. Microsoft has recently figured this out, according to Ballmer, and intends to do the same.

I could go on, but I'm sure I'll get downmodded to death already. ;-)


The nice thing about Android is you don't have to use Java. Many other languages target the JVM, which is often trivial to convert to Dalvik (it depends on the language). Right now I'm working on an entire app in Duby (a Ruby-like language with static typing and type inference, with no baggage; reaches native Java speed easily), and loving it. All I had to do was change my compile command to dubyc and I was ready to go. JRuby is in the works, and apparently Scala is possible, and I'm sure many others are working on it too.

I've never written an iPhone app, and this is my first Android app, but I find the API and the SDK to be perfectly easy to work with; it was very easy to get started and quickly start prototyping. I'd like to know what parts of the Cocoa APIs you consider delightful though, just out of curiosity.


What makes Cocoa and Cocoa Touch really nice isn't that parts of them are delightful, it's that all of the parts work really well together. They were written to take advantage of Objective-C, and the tools were built for using them. Interface Builder makes UIs really easy, and Core Data makes persistence almost automatic. You should try building an iPhone app and see how it compares.
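
To give a flavor of the "almost automatic" part, inserting and saving an object with Core Data is only a few lines (a rough sketch; assumes a configured NSManagedObjectContext and a "Note" entity in the model):

    // Assumes `context` is an already-configured NSManagedObjectContext
    // and the data model defines a "Note" entity with a "title" attribute.
    NSManagedObject *note =
        [NSEntityDescription insertNewObjectForEntityForName:@"Note"
                                      inManagedObjectContext:context];
    [note setValue:@"Remember the milk" forKey:@"title"];

    NSError *error = nil;
    if (![context save:&error]) {
        NSLog(@"Save failed: %@", error);
    }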


I would try it out, but right now my iPhone is on Craigslist and a Nexus One is in the mail. :) I run Linux so that wouldn't be very easy to do anyway.

Re: Interface Builder, I agree, but I don't think anything like that would work well for Android. It targets all kinds of devices and screen sizes; it really has to be flexible and that'd be harder to do with a WYSIWYG UI tool. That being said, someone made one for Android, but I still prefer coding it by hand: http://droiddraw.com/ (it ain't the prettiest thing but it seems to work; you can even load up your own layouts).

I don't question the rest, and I simply don't know enough about the iPhone APIs to compare it to Android, but I will say this: everything certainly works together in the Android APIs. I'm not 100% sure what the purpose of CoreData is - preferences? generic data storage? - but they are both part of Android as well.

Funny that you mentioned everything working together (albeit in a different context); that's the biggest difference I noticed between how iPhone apps work and how Android apps work. See Activities (Android) vs. Apps (iPhone), Content Providers, Intents -- just about everything about Android apps is designed around working together. For example, in every iPhone app I've used that has a browser in it, it's always essentially a WebKit view with some primitive controls, or it says "Halt app and switch to Safari?". In Android most apps open up the native Browser app's main activity, and hitting Back will close it and go right back to where you were. It's not really based on multitasking or switching apps, it just reuses the Browser's Activity to accomplish the same task. Applications are all on the same level, even built-in ones, and can all work together beautifully by simply calling Activities of other apps for performing specific tasks, or controlling them through Intents, etc. iPhone apps tend to be much more isolated.


You know, Interface Builder existed before the iPhone. Developers used it to build out interfaces for Mac applications which could run on devices with a big variety of screen sizes. Interface Builder is screen-size agnostic for the most part. When Apple first released 3rd-party app support, their iPhone SDK didn't even support Interface Builder.


Interface Builder existed way back in the early 90s, and someone who used it back then will find the modern version fairly familiar. It's been around quite a bit longer than the iPhone.


I never said IB was iPhone-specific or made for it. I'd used it myself before that. Why are iPhone apps upscaled or letterboxed on the iPad, though?


Because those apps are running on the iPad unmodified. Nothing at all to do with IB.


You should try building an iPhone app and see how it compares.

I have, and Android wins for me. Cocoa Touch is a decent API (in most cases; what's with NSImage vs CGImage vs CIImage?), but Objective-C is seriously outdated. It usually needs more boilerplate than even Java, and manual memory management, lack of namespaces, and header files are silly in 2010.
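
To illustrate the boilerplate and memory management points, here's roughly what a trivial class costs you (a sketch of the usual pattern):

    // Person.h -- every class needs a separate header file.
    #import <Foundation/Foundation.h>

    @interface Person : NSObject {
        NSString *name;   // ivars are declared in the header, too
    }
    - (id)initWithName:(NSString *)aName;
    @end

    // Person.m
    @implementation Person
    - (id)initWithName:(NSString *)aName {
        if ((self = [super init])) {
            name = [aName copy];   // manual ownership...
        }
        return self;
    }
    - (void)dealloc {
        [name release];   // ...and manual cleanup; forget this and you leak
        [super dealloc];
    }
    @end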


> more boilerplate than Java

whatever. Objective-C is far closer to the elegant dynamism of Smalltalk than Java.

> manual memory management

Makes sense. This is a resource constrained device.

> lack of namespaces

This is a phone not an enterprise server.

> header files

On a device with an underpowered processor, the focus on C-based languages has been paying serious dividends in the responsiveness of the platform's applications.


Well, ObjC straddles a very interesting place. It's as low-level as C++ but has high-level message-passing method dispatch as well. If it seems a little crufty, it's because it's living in a very peculiar place, trying to serve a variety of masters.

Hopefully 4.0 will start to pave the way for the ObjC 2.0 stuff to come to the iPhone. They are really big improvements to the language.

Also, rule-based syntax translators could give you a ton of stuff as a preprocessing pass to your code. Write something for NSArray literals and you'll be a hero.
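
You don't even need a full translator for the easy cases; a variadic macro already gets you part of the way (a sketch, with the usual nil-terminator gotcha hidden inside):

    // Expands to the ordinary nil-terminated constructor call.
    #define ARRAY(...) [NSArray arrayWithObjects:__VA_ARGS__, nil]

    NSArray *names = ARRAY(@"Alice", @"Bob", @"Carol");
    // ...is exactly [NSArray arrayWithObjects:@"Alice", @"Bob", @"Carol", nil]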


To be pedantic, "Objective-C 2.0" features are things the iPhone always had, like properties and fast enumeration, and/or had first, like the modern runtime (which solves the fragile base class problem, and which Mac apps still only get if they're 64-bit).

Blocks were a Snow Leopard feature, implemented as a C extension, and sprinkled throughout Apple's APIs; no direct relationship to Obj-C.
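
For reference, the features in question look like this (an illustrative sketch only):

    @interface Account : NSObject
    @property (nonatomic, retain) NSString *owner;  // Obj-C 2.0 property
    @end

    @implementation Account
    @synthesize owner;   // the compiler generates the accessors
    @end

    // Obj-C 2.0 fast enumeration (assuming `names` is an NSArray of strings):
    for (NSString *name in names) {
        NSLog(@"%@", name);
    }

    // A block -- the Snow Leopard-era C extension:
    void (^greet)(NSString *) = ^(NSString *s) { NSLog(@"Hello, %@", s); };
    greet(@"world");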


I was thinking more along the lines of the garbage collector. That's sort of the "tent pole" feature of the last cycle of objc revisions.


With the first iPad model still at 256 MB RAM, and no sign of GC in the iPhone OS 4 preview, memory management will be with the platform for a long time. I think we'll see multi-core first.


There are collectors that work very well in space constrained, processor constrained environments. It's not impossible.


Yes, looks like Obj-C blocks made it to 4.0.

GC is probably a ways off for obvious reasons.


Yes, something to replace the super crufty NSArray / NSMutableArray / NSSet / NSDictionary syntax would be really good. Smalltalk message passing was meant to have short, small one-symbol binary operators, but unfortunately that elegance is not there in Objective-C.


Over the life of the code, I'm only gonna type it once; I may read it a hundred times.


To be honest with you, [[stuff objectAtIndex:5] objectForKey:@"name"] is less readable than stuff[5][@"name"]. And you type this kind of code constantly, over and over again; it really stacks up, turning something that would be simple to read in other languages into clutter.


Except that if you're used to Smalltalk, this is just standard fare.

I agree that a Python-like shorthand would be great at the object level, but then you'd have a mixture of shorthand and longhand, and that would be ugly.


Isn't one of the big problems here that "rule-based syntax translators" are kind of disallowed?


Not if they produce straightforward Objective-C. How would one even separate such code generation from the use of regular #define?


You're right, they couldn't tell. But that doesn't mean they're not disallowed -- unless you're writing C, C++, Obj-C, or JavaScript, it's disallowed, period. But that doesn't mean people won't do it.


I'm not sure I agree that code translation is entirely disallowed.

Part of the problem, though, is that the legalese in the iPhone OS 4 revision is maddeningly obscure and unhelpful. We're not even sure if embedded Lua interpreters are really banned. I'm sure Apple is considering that very question.


Assuming we're only talking about the iPhone APIs, then there's no NSImage or CIImage (Mac OS X desktop only).

As for UIImage and CGImageRef, which are on the iPhone, UIImage is an Objective-C class, whereas CGImageRef is a C opaque struct.

CGImageRef comes from the CoreGraphics (aka Quartz) lower-level 2D drawing APIs (implemented using C functions). This framework was around before the iPhone as it is also found in the Mac OS X APIs. UIImage gives you the ability to use a CGImageRef as an object. It doesn't expose all of the functionality found at the CoreGraphics layer, but it can be easier to use with other ObjC classes.
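
Crossing that boundary is one call in each direction (a sketch; the resource name is made up):

    UIImage *photo = [UIImage imageNamed:@"photo.png"];  // hypothetical resource
    CGImageRef cg = photo.CGImage;                       // down to the C level...
    size_t width = CGImageGetWidth(cg);                  // ...for CoreGraphics functions

    UIImage *again = [UIImage imageWithCGImage:cg];      // and back up to an object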


You're right, I was thinking of desktop Mac OS X there, and an especially annoying section of code where I was trying to get those three image APIs to talk to each other. Apple did in fact remove a lot of cruft and duplication in the iPhone API, and that's good. Still, there are a number of cases like that where you have to switch between ObjC method calls and straight C functions. Not a huge deal, but it's clearly not Smalltalk-style pure OO.


Scala and Clojure run like molasses on Dalvik and JRuby will likely be the same. The Dalvik VM that Android targets is not like the JVM and one of its deficiencies is that it supports dynamic languages very poorly. Duby may run fine on JVM but if it's really Ruby-like, it'll be bad on Android.

No. If you want to write professional-quality apps on Dalvik, you'll be writing in Java or another language just as weak in imagination and straitjacketed.

Which is not to say that Objective-C is an improvement. At least iPhone apps used to offer Lua or even Scheme scripting. No longer.


Duby has been absolutely great on Dalvik so far. Like I said, Duby is not a dynamic language; it just provides one of a dynamic language's main benefits through type inference and a drastically less verbose syntax. I translated the same code from Java to Duby (reducing its size and making it a whole lot more fun to work with) and noticed absolutely no drawbacks. That's not to say there weren't any speed decreases, but if there were, they were too small for me or anyone to care about.

Do you have any benchmarks or references for Scala and Clojure's speed on Dalvik? I haven't been able to find much. (That's curiosity, not snootiness.)


Scala is more statically typed than Java. It's not a dynamic language; type inference (what makes it look like Ruby to some) is done by the compiler. Modern (i.e., based on concepts dating back to the late 70s and 80s instead of the 60s and early 70s) statically typed languages don't have to have the verbosity of Java and C++: see F#, OCaml and Haskell.

It does use reflection for certain types of pattern matching (only similarity with the JVM dynamic languages), but I'd expect most of the issues (note: I haven't tried myself) are due to the Scala compiler specifically targeting HotSpot.


> Scala is more statically typed than Java.

Being a contrarian here :)

It doesn't matter how static Scala or any other language is, all that matters is how well its type system fits into the target VM, and Java VMs are optimized for Java.

When the type-system doesn't fit you have to resort to workarounds, like allocating more objects than you should, or using introspection, or generating bytecode at runtime.

The JVM does fine here, but Dalvik is a VM created for a restricted environment.

That said, I don't think static languages should have problems ... as long as they don't stray too far from Java ... like being lazy or having type-classes.

> Modern ... statically typed languages don't have to have the verbosity of Java and C++

They don't have subtype-polymorphism either (basically OOP).

Since you mentioned OCaml and F#, you should take notice that the OOP subset doesn't support type inference when you're dealing with interfaces; that's because it is not possible. OOP was designed for dynamic type-systems, and it mixes with static typing like oil and water.

Also, Scala's type inference is really restrictive and shouldn't be in the same league as any other language with a Hindley-Milner type inference system.

The type inference is really useful when you're dealing with parametric polymorphism. Scala doesn't really have that either ... generics only save you from explicit type-casts, nothing else, and are a far cry from Haskell's type-classes or C++'s templates (which are late-bound). The recently introduced Manifests or the implicit conversions save you in many cases, but those are just ugly hacks that only solve certain use-cases.

I really don't get why Scala is so popular. It's as if people are so tired of Java that they are looking for a way out, willing to compromise just for some syntactic sugar.

It's interesting to note that none of the issues of subtype-polymorphism (tight coupling) or those of parametric polymorphism (hard to get right) are of importance to dynamic languages. In a dynamic language polymorphism is simply implicit.

That's why I hope people will invest in VMs that support dynamic languages in the future ... static typing is a bitch to get right, and it would be easier if the VM wouldn't impose a strict type-system on you.


First, this is an excellent comment and should have a much higher score than my fanboyish original :-)

I agree that Scala is restricted by the JVM (and the aim of full compatibility with Java), and as a result type inference, pattern matching and more suffer heavily. F#, OCaml and Haskell are better examples of modern type systems as they're less encumbered. I am fairly curious about how F# works around the CLR (or rather, how much the CLR accommodates type systems different from C#'s); guess it's up to me to RTFM.

> It's as if people are so tired of Java that they are looking for a way out, willing to compromise just for some syntactic sugar.

I think this hits the nail on the head, but I don't see it as a bad thing. If there's a very strong reason to be on the JVM (e.g., other projects or libraries, operations preferences), it's nice to have an option that lets one have a more productive and enjoyable experience.

There are times where it feels like a big compiler hack (which it is), but first-class functions (even if they're compiled down to anonymous classes), closures, optional lazy evaluation, type inference, limited pattern matching, case classes, traits/mixins, "encouraged" immutability, the collections, etc... add up.

There are occasionally bugs and performance issues with the compiler and the collections library, but overall I don't see what's compromised compared to programming in Java itself. I'd argue the syntactic-sugar classification applies more to Java 7's planned closures and first-class methods.

> OOP was designed for dynamic type-systems

Yes, that's literally true. OOP feels natural in dynamically typed languages, even when it's bolted on (CLOS in Common Lisp, Perl 5).

It's still a mystery to me why Java "won over", given that Smalltalk had (at the time) better performing virtual machines (compared to older JVMs, which were superseded by HotSpot, which was originally used by Strongtalk) and great tooling. There are known examples of teams of inexperienced programmers working under guidance to produce large, well-working projects in Smalltalk.

> That's why I hope people will invest in VMs that support dynamic languages in the future ... static typing is a bitch to get right, and it would be easier if the VM wouldn't impose a strict type-system on you.

Clojure, in my view, suffers the most from the type system that the JVM imposes. It would be interesting to see if Clojure and Scala would eventually target less restrictive VMs (reverting to compiler hacks on the JVM to functionally emulate these VMs).


It's still a mystery to me why Java "won over", given that Smalltalk had (at the time) better performing virtual machines (compared to older JVMs, which were superseded by HotSpot, which was originally used by Strongtalk) and great tooling.

It's very simple. Smalltalk was expensive. Students, hobbyists, people working at cheap corporations, startups, open source projects, and the like couldn't write, share, and distribute working systems on it.

I understand there were some good vendors with steep student discounts and the like. My roommate loved some of them. But he could not hack with friends for free (or even very cheap) on it.

And that little bit of money makes all the difference. Not that it would have taken off otherwise; there are many factors, but no language or VM that charges for access has taken off in a generation except where access to hardware is strictly limited by law (smartphones) or expense (FPGA, large scale microcontroller projects).


It's very simple. Smalltalk was expensive. Students, hobbyists, people working at cheap corporations, startups, open source projects, and the like couldn't write, share, and distribute working systems on it.

In fact, it was the deliberate strategy to become a "boutique" language, a secret weapon of the Fortune 500. The Smalltalk companies missed out on "The Bazaar" and the mindshare benefits of an open community.


The performance problems of Scala, Clojure etc. in Dalvik have to do with the allocator, not runtime type introspection.


I'd be grateful for a citation. This is something I want to understand.


Duby compiles to Java. So the performance is identical to Java.


X compiles to assembly, so the performance is identical to assembly.


I doubt this. Any language could compile to Java, but that doesn't mean the performance will be the same. Technically, every language compiles down to assembly, but usually only apps written in assembly can compare in speed.


When a non-Java language is considered a first-class citizen for Android by the Android development team, I will consider developing for it.

I applaud the external efforts to do other languages, but unless they're "blessed", they have too great a chance of disappearing.


"I would never in a million years want to develop Android software; why? Because it's written in a language I loathe, and the API is workmanlike but completely uninspired (disclaimer: I've never written any code for it). "

i.e. I'm making a judgement on something I'm ignorant about and never made an effort to know.


No, I have made an effort to know it, but haven't actually written code for it.


I've done Android stuff for a year or so and vastly prefer Cocoa Touch. However, the OP is wrong: Android is inspired; they are inspired to use idiotic API names: spinner (rather than picker, chooser, or dropdown), toast, intent, etc.


What is a picker or chooser? I know exactly what a spinner is: it's named for a spinning number control (like a mechanical odometer with a knob), unless they're using "spinner" to describe something completely different from what a spinner is in desktop UI parlance.

My point is that the API decisions are probably based on experience with a different developer culture, in which those words make perfect sense and "picker" and "chooser" are completely vague. Different groups have different terminology.


I agree the odometer-style widget makes sense as a spinner; this [1] doesn't, in my opinion. My first choice would probably be popup menu or possibly dropdown, then picker and chooser.

[1] http://www.designerandroid.com/wp-content/uploads/2008/11/ds...


Okay, I stand enlightened. Yeah, popup, drop-down or combobox makes more sense for a radio button popup list.


I quite like Toast. :)


Which rather confirms his point about polish being more important in attracting people than is typically acknowledged, don't you think?


> I claim the secret sauce to their APIs is actually a character at Apple named Ali Ozer who, along with his crew, has maintained a firm hand on the tiller, and learned a huge amount about object-oriented design for interactive systems over the past couple of decades.

I agree with this. One of the things that makes most Apple APIs great is that they are not part of the Taligent legacy that so infects other languages. They keep class hierarchies relatively flat and lean heavily on the relatively inexpensive delegation that ObjC offers. It's very refreshing and feels far more Smalltalk-ish than Java-ish.
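
For anyone who hasn't seen the style: rather than subclassing a framework class, you hand it a delegate (a sketch with made-up names; Cocoa's real ones look like NSTableViewDelegate):

    // A made-up protocol, in the Cocoa delegation style.
    @protocol DownloaderDelegate
    - (void)downloaderDidFinish:(id)downloader;
    @end

    @interface Downloader : NSObject {
        id<DownloaderDelegate> delegate;   // no inheritance required
    }
    // Delegates are conventionally not retained, to avoid retain cycles.
    @property (nonatomic, assign) id<DownloaderDelegate> delegate;
    @end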


Apple designs a product knowing it will sell millions of copies, so it makes sense for them to spend the time to get it right. PC manufacturers have greater volume, but no individual product is anywhere near as ubiquitous as a Macbook Pro or iMac. They're stuck in a trap where no single model can sell well enough to justify an Apple level of quality even if they had the design sense to produce one.


Close, but Apple's real secret is that they make premium products, with profit margins that allow them to invest in researching the next big thing and spend time doing it right.

Dell, HP, Sony, Acer, Toshiba, Lenovo: these guys live and die by razor-thin margins of the "we'll make up for it in volume" strategy.

Ultimately I think an (un)healthy share of consumers just don't care about the details of the experience enough to spend more money.


Thank you for making me Google "mit style versus new jersey style". I had heard of "worse is better", but had not heard those terms:

http://en.wikipedia.org/wiki/Worse_is_better



Worse is better every time! I hope quality will come back to people's minds =(


I did the same thing :)


Android software; why? Because it's written in a language I loathe

Okay, you despise Java -- and that's fine -- but you prefer a product that can only be programmed in Objective-C? Schizophrenia much?

I've written for both platforms. Cocoa Touch with Xcode is certainly nicer than Android, but Android's language is slightly better than the choices available for the iP[ad|hone]. Either of them would be nicer with Python, Ruby, Lisp, or even JavaScript.


> Okay, you despise Java -- and that's fine -- but you prefer a product that can only be programmed in Objective-C? Schizophrenia much?

I don't see how holding an opinion would constitute schizophrenia. What's inconsistent about this choice?


You loathe Java but you like Objective-C? Madness.


I have the same opinion and I don't see anything weird about it.

They're both OO languages without many features compared to C++ or something, but Obj-C is dynamic and much more flexible (see NSProxy, NSArray class clusters, etc.) whereas Java is not at all.
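
For instance, NSProxy lets you stand in for any other object in a dozen lines -- try this in Java without bytecode tricks (a sketch):

    // Logs every message, then forwards it to the real object.
    @interface LoggingProxy : NSProxy {
        id target;
    }
    - (id)initWithTarget:(id)aTarget;
    @end

    @implementation LoggingProxy
    - (id)initWithTarget:(id)aTarget {
        target = [aTarget retain];   // NSProxy has no -init of its own
        return self;
    }
    - (NSMethodSignature *)methodSignatureForSelector:(SEL)sel {
        return [target methodSignatureForSelector:sel];
    }
    - (void)forwardInvocation:(NSInvocation *)invocation {
        NSLog(@"-> %@", NSStringFromSelector([invocation selector]));
        [invocation invokeWithTarget:target];
    }
    @end

    // Usage: id proxy = [[LoggingProxy alloc] initWithTarget:real];
    // Any message sent to `proxy` is logged and handled by `real`.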

Plus Obj-C is actually C, so you can write "if (!i)" without the compiler deciding you're too stupid to handle implicit type conversions.

By the way, I've noticed that people (who don't seem to use it often) complain about the extremely long method names in Obj-C, but if you count by the number of tokens instead it can be pretty terse. And I think that's how reading natural language is supposed to work, anyway…


That's one of Cocoa (Touch)'s strengths, actually, and part of the 'secret sauce': Ozer and crew have developed a really good set of naming conventions over the years, and use them well.

I like the long names. It's very Smalltalk-like, and Xcode completion mostly takes the pain out of it.
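
A typical example: the name tells you what the arguments are and what comes back (real method, made-up variables):

    // Verbose, but you can't mistake what it does
    // (assuming `title` is an NSString):
    NSString *fixed = [title stringByReplacingOccurrencesOfString:@"teh"
                                                       withString:@"the"];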


how'd you do the upside-down exclamation mark?


The letter i?


Yes, because Obj-C is truly just Smalltalk with C-style control structures. Maybe I'm damaged, because I've used Smalltalk in a previous existence, and loved it.

Java is not a dynamic language.


For what definition of "dynamic language" is that statement true? Both languages do late binding of methods to objects; Java merely happens to smear on a layer of type safety above it.

Frankly, the two languages are semantically all but identical, modulo the ability of Obj-C to fall back to a "bare memory" data model when you want to.


They certainly don't bind methods to objects at the same time - Obj-C does it when you send the message. You can send and catch messages with no methods attached to them, or that weren't even declared in the first place (at the expense of a compiler warning).
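
One concrete mechanism, for the curious: before full forwarding kicks in, the runtime asks the class to resolve the unknown selector, and you can attach an implementation right there (a sketch; the class and handler are made up):

    #import <objc/runtime.h>

    @interface Greeter : NSObject
    @end

    // Called for any message that arrives with no method attached.
    static void dynamicHandler(id self, SEL _cmd) {
        NSLog(@"handled %@ at runtime", NSStringFromSelector(_cmd));
    }

    @implementation Greeter
    + (BOOL)resolveInstanceMethod:(SEL)sel {
        // Bolt an implementation onto the class at the moment of the send.
        class_addMethod(self, sel, (IMP)dynamicHandler, "v@:");
        return YES;
    }
    @end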

Of course there are other differences, but the most important one is just that the class library is much better - UI doesn't rely on inheritance, there's no class named LinkedHashMap (see http://ridiculousfish.com/blog/archives/2005/12/23/array/), and there _is_ a class named NSMutableArray.


You can do mixins fairly easily in ObjC, but in Java, you'd essentially have to resort to runtime bytecode rewriting. Take a look at what popular libraries like Hibernate, Spring, AspectJ use under the covers to perform their magic.
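
The usual ObjC approximation is a category, which bolts methods onto an existing class -- no subclassing, no bytecode rewriting (a sketch; the method is made up):

    @interface NSString (Reversing)
    - (NSString *)reversedString;
    @end

    @implementation NSString (Reversing)
    - (NSString *)reversedString {
        NSMutableString *result = [NSMutableString string];
        for (NSInteger i = [self length] - 1; i >= 0; i--) {
            [result appendFormat:@"%C", [self characterAtIndex:i]];
        }
        return result;
    }
    @end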


If this isn't a sarcastic comment, what does loathing Java have to do with Objective-C?


To pick one particular nit:

The fabled Google Android? It is entirely the piece of junk one ought to expect from a development process driven by committees and steered by non-creative minds. And it appears that many would-be buyers know it.

The actual numbers are around 100k/day (iPhone) vs 60k/day (Android).

Considering that Sony Ericsson and LG are still coming up to speed, meaning those sales are mostly HTC+Motorola, that's pretty impressive.

It's also missing the point that others are free to design their own UI on top of Android and indeed the big players are. Sense (HTC) is really rather nice and definitely doesn't feel like it was designed by a committee.

Edit: Here's a source for the sales numbers http://industry.bnet.com/technology/10005344/android-unit-sa... - I'm afraid the opinion about Sense is just my own personal view.


It's also missing the point that others are free to design their own UI on top of Android and indeed the big players are. Sense (HTC) is really rather nice and definitely doesn't feel like it was designed by a committee.

But this can be a downside as well. You say 'Android' and people associate the one with the other, but there are a multitude of 'Android' devices which are either inherently incompatible (http://news.cnet.com/8301-17938_105-20002508-1.html) or whose basic interfaces are different enough because of that customization that it can be confusing to initial users.

Also, the fragmentation of Android by hardware leads to fragmentation by software. Software designed for too large of a screen might not work on smaller screens. Software designed with a hardware keyboard in mind might not work on the onscreen keyboard.

People are still selling Android phones with version 1.6 on them, which won't run software requiring newer versions (which software won't show up in the marketplace at all, confusing users).

Customization is great, but by not mandating any sort of rules, the market is getting fragmented in a lot of different ways, and people are just going to get more confused, presented with an overwhelming list of options. This one has a hardware keyboard, this one has better GPS, this one has newer software, this one is faster, this one has a higher-resolution screen. The very purpose behind Android - customization, distinguishing yourself from the competition - is what fragments the market.

Apple's approach, however, is very simple. Here's the iPhone. How much storage do you want? 16GB? Ok, here you go. Apps will work. Accessories will work. Cables will work. Cases will fit. There's little to no confusion for the user.

That seems to be what the original article is suggesting. Simplicity and ease of use that starts before you even make your purchase.


I would qualify that: "Software badly designed for too large of a screen might not work on smaller screens. Software badly designed with a hardware keyboard in mind might not work on the onscreen keyboard."

Semi-earnest question: I don't understand why fragmentation on Android is apparently a massive downside, but on iPhone and iPad it's suddenly a great opportunity. Are the different resolutions not fragmentation? Can the iPad not use an external hardware keyboard as well as the touch keyboard? Is compass-less iPhone 3G being sold alongside the 3GS not fragmentation? Will the AR apps work?


People are still selling Android phones with version 1.6 on them, which won't run software requiring newer versions (which software won't show up in the marketplace at all, confusing users).

This is the same for the App Store and firmware upgrades, and is a good deal more sensible than the alternative option of letting people buy/install software that won't work.


It's not remotely the same. All iPhones made to date run the current OS, and are prompted to upgrade to it when users activate their phones or sync their music libraries or back up. Android's OS is upgraded by the carrier; most carriers aren't even a little bit interested in doing it; the few that are interested spend more time dithering than Google spends cooking up each new release.

Meanwhile, iPhone OS 4 will support all phones except the first generation (even if the second-gen phones won't support multitasking, they'll still run the third-party apps). No comparable statement can be made about any Android OS release, referring to handsets of any age, from any vendor.


True, but the better alternative would be to show a "cannot install, please upgrade your software" sign; otherwise, how are people to know that their phone is out of date?


Others are free to design their own UI on top of Android and indeed the big players are. Sense (HTC) is really rather nice and definitely doesn't feel like it was designed by a committee.

But if you're writing an Android app, should you use the Android HIG, the Sense HIG, the Blur HIG, or the Nexus HIG?


That's a very good point, but I'd say that mostly the interface philosophies co-exist reasonably well, even on one device.

Because apps tend to be full screen it's not such a big deal to have them behave a little differently.

The hard buttons tend to have predictable behavior and important stuff like 'how does suspend/multitasking work?' is the same on all devices.

It's also quite nice being able to use good apps designed for another look-and-feel anyway, and decide yourself if an out-of-place interface is worth it. For example, people use iTunes on Windows even though it has its own UI toolkit, because overall it is worth it.


I'm definitely noticing a lot more people carrying Android phones in the last few months. I also know people that originally wanted an iPhone, but couldn't afford it, so they got a cheap android phone. Now they're in the Android ecosystem, and might get a more powerful one the next time around. I can see it taking off in the next year or so.


One also needs to keep in mind that there are currently more iPod Touches sold than iPhones. It's much easier to drop $200 for an iPod Touch if you want to upgrade your current music player and don't want a phone contract to go with it... the iPod still has a monopoly on the portable MP3 player market.


You're comparing Android sales to iPhone sales in the iPhone's worst buying season, though. Expect the iPhone numbers to blow up when it's been one month instead of ten since the last hardware refresh.


The link actually shows that this (100k/day) was the iPhone's best quarter to date. They sold more in Q1 2010 than they did in the quarter when the 3GS was introduced.


Fair enough -- iPhone sales are up despite it being an inopportune time to buy an iPhone. This doesn't exactly refute my point.


Apple's Q1 includes the winter holiday season. I'd wager normal people don't know or look at a manufacturer's release cycles. They know the gift-giving cycles.


It seems that most people's strong reaction to Apple's decision is not just because it's a mistake (which I think it is), but because it directly affects their life and there is no serious alternative to avoid the consequences.

Example: I don't really care what restrictions Microsoft will impose on developers for its next phone OS, because I don't care about their products. It does matter for me, however, that Lotus Notes design is horrible (and getting worse with every version), as that's what I'm forced to use at work and there is no alternative for me.

If HTC/Android were such a strong alternative to iPhone/iPhone OS, most developers would simply ignore Apple's move: "They made a stupid decision?" - "Very well, our community and market share will grow as a result".

Unfortunately, as Stanislav notes above, for many developers and users there are not many alternatives to the iPhone, with its high-quality design and strong ecosystem.

And that's what makes people furious.


Exactly. The real problem with the anti-Apple arguments is that, for the moment, Apple's behavior hasn't hurt them. The iPhone still has more apps, and iPhone owners are far more likely to buy apps. The Android app store is still a wash, despite all the Android handsets sold this year.

I think that in the long run Apple's App Store policies would hurt the iPhone if they weren't willing to make the platform more open, but the walled garden approach probably makes more sense while the public is still getting used to the idea of mobile applications.

Apple deserves some credit for persuading people to actually pay for software. How much does the average person spend on third-party software for their PC? Judging from how much shelf space Best Buy devotes to PC software, I'd bet it's pretty small compared to what people are spending on apps now.


Human creativity can never be bent over a sustained period to the will of one man, however gifted and prodigious he may be, and especially in an area so vast and fruitful as that of digital creativity. That is why Steve Jobs, all-powerful as he may be today, will (if he is not careful) watch developers gleefully flee from his grip as occasion permits at some point in the future, or from the grip of his company if he is no longer around to affect the outcome directly.

I am not a developer and I do not pretend to know how developers think but I have worked with countless developers over the years and still remember them chafing, for example, under Microsoft's sway in the 1990s even as they were forced to conform to its arbitrary and often harmful edicts and dictates that so affected their financial well-being. Well, Microsoft got away with it for a long time but burned its goodwill in the process (Microsoft had goodwill? You bet. In the 1980s, developers were tripping over themselves to get their new apps out in sync with Microsoft's initiatives and a regular amen chorus would sing Microsoft's praises at every such step - with a breathless PC Magazine and many others waiting to review every step and a herd of authors primed to write about them).

Today, Apple is dominating in an analogous vein and getting away with it, but it too is beginning to burn the goodwill it has had over many years with significant groups of loyal developers who now have no choice but to adhere, however grudgingly, to the ever-changing and seemingly capricious guidelines and restrictions being imposed upon them by Apple in hopes of continuing to build their companies or preserving their livelihoods. Perhaps its all-controlling policies will prove benign and wise and will only serve to maintain quality on its various platforms, as the company and its apologists contend. If, however, Apple is instead headed down a more nefarious path, there is nothing that will eventually shield it from the resentment and abhorrence it will face from developers as it burns goodwill with each step it takes in that direction.

It was precisely those loyal developers who carried Apple through its darkest days, when by all accounts it should have failed as a company, and it will be precisely those developers who will in time be nowhere in sight should Apple permanently burn them.


I'm not sure why "Microsoft free" is a product category? I mean, if you start out from such an odd ideological position, there's no telling what else you can convince yourself of.


The tacit claim that you're making here is that ideology is the only possible reason that one might have for wanting to avoid any particular vendor. Maybe you need to support that claim a bit. Surely a company of their magnitude has a healthy share of run-of-the-mill disgruntled former customers.


While there are legitimate reasons to avoid MS products, it's lazy for the author to say "Microsoft-free" without explaining why. Also, it may not be wise to hold grudges when the original reason is gone (e.g. Win7 is a lot better than old Windows).


By saying "Microsoft-free", you're generally saying "I don't want to use this company's products, regardless of the product." That's ideology, period. There are obviously exceptions to this, but in 99% of cases, it's ideology.


Words cannot describe how right he is.

Apple is becoming an 8000 pound gorilla because nobody else can design a user interface! Windows is crap committee-ware, and open source UIs insist on imitating Windows (look at Gnome, KDE) which means they're imitation crap.

There are other aspects to Apple that matter too. Their code performs well and doesn't tend to break. There's a polish that Apple's code has that nobody else puts on their product.

Their hardware is decent (not excellent, but decent) on the quality factor too, but it's also well-designed just like their UIs are. The biggest feature of Apple's hardware is what it doesn't have: loads of useless buttons (stupid blue ThinkVantage button, I'm looking at you!), connectors you never use, etc.

Note that Linux rules the server space for the same reason. Open source UIs are crap, but under the hood the open source ecosystem obviously cares. So in markets where under the hood matters most, Linux and other OSS operating systems are the dominant players.

The moral of the story?

Stop listening to the idiots that tell you that the way to design software is to slap it together and focus exclusively on marketing. This only works if your market is not crowded. This worked for Microsoft in the 80s and 90s because there weren't many affordable alternatives.

People actually do care about quality.


Apple's aesthetics are important, no doubt. But equally important was its decision to standardize its computers on BSD and Intel. Before OS X, Apple was a niche product, and its niche didn't include most developers. After OS X, the dev community started to trickle in, but it really took off after the Intel switch. Partly because the Core Duo was much faster than the PowerPC, but also because the PowerPC was a form of lock-in. It kept good software out, and made it hard to develop for other platforms.

Apple owes a lot to standardization and opening up. It's not all App Store lockdown. The question is, is this App Store crap a general trend? Are they going to bring that to the desktop?


Others could have done the same. BeOS didn't succeed. You don't see Linux everywhere and it was (and still is) based on openness.


But you do see Linux everywhere. Servers and tons of embedded devices (including Android ones) run Linux. The desktop is just a sliver.


I'm not sure there are that many Mac apps that have been built since the Intel switch that are used by most of the types of folks that are new to the Mac. Lots of Mac users don't run much at all on their Macs besides Apple software other than perhaps Firefox and/or a 3rd party IM client.

As I see it, the Intel switch was mostly important to get Macs being used (and thus recommended) by geeks for the same reasons it's been popular among non-geeks: the ability to run Windows (and, of course, to a lesser extent, other Intel operating systems).

This, along with the increasing importance of the web (and thus lessened importance of the specific OS platform), the spread of Apple retail stores, and the iPod/iPhone halo effect I think are the main reasons Mac sales have done so well.

Because of that mix, it's hard to tell how much the platform's openness has been important to it. It certainly hasn't hurt any, though, and it's a very likable trait in a "desktop" OS!


The fact that most Mac users don't use many apps on their laptops beyond what's built in, and that what they do use is probably free, is the one thing that makes me think the app store bubble will blow over.


That doesn't seem to be the story on the iPhone OS devices, though. Everybody I've ever seen with one, regardless of tech savvy, has it loaded down with apps and is excited to show off their favorites.

Mobile apps tend to be "bite size" (approachable), centrally discoverable through the App Store, very low cost, and built around rich media, and the user feels a control over their management that makes the experience more like the "lightness" of visiting a web page than the "heaviness" of traditional desktop app management, which is marred by unpredictability, multistep install processes, reboots, conflicts, and a general feeling of lack of control. I think users really like software... they just hate all the bullshit overhead that has usually gone with it.


People care about aesthetics, and quality is, I think, one form of it.

Can't agree more, and I think there could have been many more Apples if more companies were run by technical people with a good sense of aesthetics (in a broader sense), rather than by business people.


I disagree; Apple's offering in terms of servers (both hardware and software) is downright ridiculous: no ability to virtualize the servers, and the software doesn't scale well (they are very far behind Microsoft in that regard).

Disclaimer: All the desktop hardware we use is Apple, but we switched the OSes to Windows 7; it's more suitable for an organization.


I completely disagree that Windows 7 is crap design. I'm more efficient with Windows 7's window management, for example, so I find their design much better than OS X's.


Windows 7 is an improvement, but I still don't like it. I suppose some of it is personal taste.

I also still consider it ugly from an aesthetic point of view. It's not clean. It's cluttered and counterintuitive.

There's something else too... even with 7, Windows doesn't perform well. NTFS uses a "pessimum" allocation algorithm that seems to achieve maximum conceivable file fragmentation after even moderate use, and I'm still not sure why Windows disk thrashes more than any other OS. With equivalent workloads on Linux and OSX, the disk occasionally blinks. On Windows it thrashes wildly.

"Performance doesn't matter" is a relative of UIs as an afterthought. Performance is a UI issue, and it does matter. It's even more apparently on mobile devices... the iPlatform performs so much better than Windows Mobile. (Android isn't bad either.)

http://www.acm.org/ubiquity/views/v7i24_fallacy.html


Perhaps I'm less sensitive, but I've noticed no more slowdowns than on any other OS I've used before.

Not that I'm loyal to any particular platform but I feel a need to voice that I like many of the things MS has done with 7, and that arguments that OS X (at least the UI, I still prefer the *nix environment for development) is far better are just not true anymore. I also enjoy the other desktops like Gnome or IceWM, but 7 is my favorite so far in terms of UI experience.


Their hardware is decent (not excellent, but decent) on the quality factor

Who builds better quality hardware than Apple?


After working in Mac-centric development environments for the past 12 years, I can't say I'm enamored with Apple's hardware track record. The desktops have been for the most part reliable, but the laptops and iMacs have not fared so well. All 12 iMac G5s we purchased were serviced under AppleCare at least once; some went twice. And the survivors all have obnoxious fan noise problems. I remember a generation of PowerBook G4s that had a problem with the display losing connection. Plenty of those went back as well.

Externally, they are designed very nicely, and the internals of the desktops have always impressed me. I'd love to see something else similar come along. The closest I can remember is the old MIPS-based SGIs.


G5s? G4s? It's been 4 years since they switched to Intel. All that says is that at least 4 years ago they weren't making great laptops.

"To argue against an idea honestly, you should argue against the best arguments of the strongest advocates." - http://lesswrong.com/lw/lw/reversed_stupidity_is_not_intelli...

Rephrase the question, then, who makes better laptop hardware than the Aluminium Unibody MBP?


Does it count if someone makes something that does the same job at a quarter of the price, so that if you flex it too much, or run it over with your car, or leave it on the bus then you can replace it and still be saving half the price?

Or are we forced to accept Apple's business model as a given, just like Microsoft fought against the netbook pricing model because it began to make their OS pricing look ridiculous?


No, because it's not about the initial purchase price. It's about having your stuff crammed into a piece of hardware, and for most people being dependent on that stuff being readily accessible.

That's why the "it just works" attitude is very powerful amongst almost the entire population, for whom replacing hardware is a really huge deal, because they

a) have to be without their stuff for the time it takes for the computer repair guy to fix it, and b) are scared pantless that their stuff will have been destroyed.

If you're not in the know, broken computer _might_ mean loss of data. That thought means loss of sleep.


The MacBook assumes you keep things on your Mac; the netbook...


That counts as a valid preference, but it isn't an answer to "who makes better quality hardware?" unless you are implicitly answering "nobody you can think of".


So there's no such thing as a quality bicycle, because some companies make motorcycles?


This is the place where a discussion turns into a pointless unending argument. We are certainly not talking about the same thing in the same context.

Are you

a) defining or redefining "quality" so that a cheap item can be considered high quality because it can be replaced cheaply?

b) avoiding answering the question of who makes better laptop hardware than the unibody MBP because you can't suggest any company and don't want to admit that

c) thinking I am refusing to see some obvious and relevant point because I'm an Apple fanboy, or that I am just being obtuse

d) other ( insert answer here )


My point, which I didn't think was too obtuse, was that if you squint at a bicycle from the point of view of a motorcycle, then you could easily conclude that it was just a crappy motorcycle, as it doesn't meet the applicable quality criteria.

But, and this is where it gets interesting, you can do the same in the opposite direction and conclude that a motorbike is just a really crappy bicycle because it's heavy, expensive etc.

You could say the same about most web apps versus their desktop equivalent.

So, to a lesser degree, a laptop designed within the constraints of a factor-of-4 price difference, perhaps with mobility (and all the many non-obvious things that mobility entails, like the iBooks being designed to be put in student backpacks and so not having little breakable flaps covering ports) being paramount, is going to face a very different set of engineering trade-offs. I think it's simplistic to say that the Apple one has greater quality if you don't take that into account. (Especially when their expensive material choices seem to regularly kill wireless signals; that's a quality and engineering factor too.)

Apple themselves have said that they don't know how to build a cheap netbook that isn't crappy (I think that's almost a direct Jobs quote) so I don't see why they should be getting plaudits for sitting out a market that they've decided is difficult, when you're calling out the folk who actually make products for their lack of quality. I think it's a good business move for them, and I'd continue to buy their products as I have for years, if I couldn't get something much, much cheaper, that does the same thing (and often a few more things) and doesn't lock me in to a series of interlocking, expensive purchases within their ecosystem. What locks-in their users, also locks out people like me.


Have you ever owned an IBM Thinkpad?


IBM's ThinkPads seem really sturdy. But has the quality been maintained since Lenovo bought the ThinkPad brand?


Yes, it has.


Yes, and it didn’t pass the creak test.


Yeah, but it does pass the "accidentally spilled coffee on my keyboard test".


Mine passed the "getting dropped from 3-5 feet on about 700 different occasions" test.


Mine passed the got-run-over-by-an-SUV test, modulo a new external keyboard and monitor.


Mine had a cracked motherboard because I was apparently picking it up off the table wrong, by the corner vs. gently holding it on both sides, as designed. IBM initially claimed that it was water damage (from 6 months prior); searching online revealed a small army with the same problem as me.

I still use that ThinkPad (well, minus the new motherboard) on my electronics bench, but to say that it's on par with the iPhone or Mac Pro in terms of quality is laughable. The whole thing looks and feels like it's made out of recycled plastic. It's so ugly it looks rugged.

And I still love the damn thing... go figure.


> Mine had a cracked motherboard because I was apparently picking it up off the table wrong, by the corner vs gently holding it on both sides

Same as mine. Replaced with MacBook Air, without regrets.


>The whole thing looks and feels like its made out of recycled plastic.

Wait, I thought that Apple was the eco-conscious company.


What vintage? I remember seeing that Lenovo added a "roll cage" to the Thinkpad line that supposedly allows it to be run over by a car.

I found a 1990s ThinkPad in my desk at work. It makes for an excellent step ladder.


My sister's didn't. More RAM for me (and my next laptop will also be a thinkpad)!


No, but I have coworkers that do. They're nice machines, but they're still a collection of plastic components, and don't have the tight tolerances and clever case design that one sees in recent MacBooks.


I can't get past the blue "ThinkVantage" key. It's like a big pimple that says "a douchebag designed this." It also has no purpose once you install a real operating system, since the software that it invokes only runs on that crummy OS they ship it with.


You need to get over yourselves then. Thinkpads are really well made because they are not made to please some designers - they are made to be used.


Ok, that came off way too snarky. What I was attempting to say was that ThinkPads were (I haven't used any that were made by the Chinese manufacturer, so I can't answer for those) made to be used, and were designed to handle the hits and spills that would happen naturally over the course of such work (the keyboard had drainage, the hinges were extra reinforced, and it could take quite a few drops).

On the other hand, my iPod has already got three scratches in it. One was because my keys were in my pocket (which I do all the time with my cell, without any issues), another when I dropped it two feet (off my bed, and not on purpose).

Sure the iPod stuff is nice, but I wouldn't rely on it for actual work.


I agree. I actually prefer Toughbooks for that reason. The biggest thing I dislike about Apple's hardware is that you have to treat it like a family heirloom.

It is more durable than cheap PC hardware though, which is why I called it middle-of-the-road in terms of durability.

But I still massively prefer the aesthetics and designs of Apple's hardware. I do not see why you couldn't design a Toughbook-type machine with sparse Apple-like aesthetics. In fact, it almost seems like it would make it easier... fewer connectors and buttons means fewer points for liquid entry, eliminating extra drives means fewer moving parts, etc.

If I ditched Apple for Linux, I'd probably take a look at the Sony Vaios... they're pretty sparse and seem well made.


My friend's IBM Thinkpad went through 3 video cards and a motherboard before she replaced it with a MacBook Pro.


Oops, we're all spambots now: http://www.loper-os.org/?p=91


No one else makes a portable where every hardware component simply works, including suspend mode, while entirely freeing me from Microsoft.

Uh, everyone makes this? All decent laptops are Centrino these days, and guess what, all that hardware works fine under Linux. I haven't owned a machine in years that wasn't perfectly supported by Linux.

Suspend / hibernate also works fine with modern BIOSes.


> Uh, everyone makes this? All decent laptops are Centrino these days

Which laptop maker will sell a machine where every single hardware component is guaranteed to work perfectly under Linux? Without endless driver-hunting, Dependency Hell, crashes, and "small" but intolerable imperfections.

My Centrino-based Thinkpad (circa 2007) failed to properly suspend the video chip under a perfectly-configured Linux, draining a freshly charged battery in only twelve hours. Note that I do not care if this (documented) bug has since been fixed. Neither do I care just who was at fault (the chip vendor, for failing to release docs; the slacking developers; the devil - I simply don't care.) Such cases are rampant, the culprits are unrepentant, and there is no vendor which will offer a money-back guarantee of nothing of this kind happening.

> Suspend / hibernate also works fine with modern BIOSes.

Which ones? Is this authoritatively documented anywhere, in real time? What is "fine"? Apple-fine, Microshit-fine, or sea-of-hacks-Gentoo-fine?


He's right about how Apple got here but now that it's #1 they are using contract law to maintain a monopoly. No less evil than Microsoft at its worst.


Amazing article, raises crucial questions.

What is to be gained from corporate sainthood? From a refusal to fleece eagerly willing suckers for all they’re worth?

Long-term Loyalty of those who matter? It's a question worth pondering.

The point that Apple rests on good protocols, from GUIs to APIs, is also key. I don't think that only Steve-Jobs-like management produces such things, but the need for a protocol to spring forth from a consistent idea is absolutely crucial. It has happened in the past; the Ruby language is a cool example (and it's not a coincidence that Ruby is more or less the idea of Matz, but it can be the idea of Matz without us suffering the dictatorship of Matz). Especially for open source development, we should be asking how we can duplicate that process at a high level, and not copying friggin' awful GUI elements from Aero.


This seems a rather trollish article. Sadly, HN appears to have taken the bait.


This article is simply obnoxious. Pathetic ramblings about how great and unique Apple is just because he doesn't like the software offered by Microsoft or the hardware offered by its competitors. Then just cheap, inaccurate, unarticulated attacks on Android.


That's par for the course for this guy. No surprise here.


It continues to amaze me how many, and how adamant, Apple's apologists are. Stockholm syndrome?


This is silly. I could take a dump and then say I have a monopoly on things floating in my toilet. It's true, but meaningless.

Most people don't desire to be "freed from Microsoft". They don't give much of a damn one way or the other. They want a $600 laptop so they can check their Facebook page. If having a monopoly on something only some tiny % of users (not even all Mac users, which is 5%) care about ever becomes illegal, I'm expatriating.

Legally speaking Apple is doing nothing wrong. And ethically. Whether their walled garden approach is a solid business decision remains to be seen but it's not monopolistic.

Other companies do make great CE devices. Ignore phones (a topic about as fruitful to argue over as religion) and you can probably think of 10.



