You could make the same argument for other languages - I've never used Java out of preference, but I've written hundreds of thousands of lines of it because it was what the job required in that context.
Perhaps for J2ME development, but apart from that I'd say thousands of companies have individually decided to use Java. Apple is just one company that has decided to use Objective-C.
Edit: I forgot about Android, but Java was "popular" before that - in fact Android probably uses Java because Java was popular...
My favorite jab here is that thousands of IT managers, after some multi-day infomercial paid for by IBM, Sun, BEA, or Oracle, decided they'd buy something that limited their developers to Java.
But really, Java is a fine language, not much worse than C++, and, as far as static typing goes, fairly OK. What bothers me most is the insane proliferation of bloated frameworks and tools that Java seems to generate around itself. To use several popular tools, you need Eclipse. It's not a bad IDE, but needing an IDE for even basic stuff like building a web app is clearly a symptom of something that went very wrong. Some wrong decisions are really hard to undo.
Languages aren't used in a vacuum, and the economics of them involve positive network externalities and switching costs, so 'individually' needs some qualification.
What kind of thing do you mean? I was (to my utter dismay) a Java developer for more than 10 years. Most of the time I worked on business internal projects or J2ME. Seems to me that for internal projects, the businesses could have chosen anything?
Of course there was the hype, the hope to find more developers, the idea that there should be only one language in the company and so on. (All wrong, imo...)
Maybe all those factors could be combined into a single one, "popularity" :-)
I guess once Apple has created thousands of Objective-C developers (already done), the language might become interesting for other things too. It's a self-reinforcing loop.
Because I suck... I just haven't managed to jump ship yet. I'd take time off for months to work on my own projects but usually failed to finish what I started. Then money would get low and I'd accept the next Java contract to make money again.
These days I tell recruiters that I'm leaning towards Rails and JavaScript, but I don't have nearly as much experience with either as I do with Java. I did some small things with Rails, but I also don't have the killer combination of design sense and coding skill that the younger Rails devs seem to have. Still, I have rejected most Java projects in the last three years. I might grudgingly look into Android, though (I like Android, but I hate Java, as has probably become clear...).
Well, the entire stack is open source, so no one needs to port it. I'm pretty sure people like it because it, along with Cocoa, makes iOS a great platform to develop apps for. But as for spreading objc: most of its features have been in dynamic languages like Perl for many years longer. Perl even has objc lib bindings and much better integration with Unix, albeit not necessarily for making apps.
I think the popularity of objc is hindered more by the fact that people would be taking a step back if they used it for something besides what it's designed for (Cocoa apps).
My point is that the existence of ports is irrelevant. A language is popular if it is used by a lot of people. The choice to use it involves a lot more than language preference.
To claim that Objective-C doesn't count as popular because anyone who chooses to write for iOS must use it, whether they like it or not, is like saying that Java doesn't count as popular because anyone who chooses to develop on an enterprise stack that a corporation has built on Java must use it.
I still think it's popular.