The way you state it is a little negative, but it's an interesting question nonetheless. I think Objective-C would not have been on many developers' radar if it weren't for the fact that Apple frameworks use it extensively (they don't 'force it on you', by the way, as you can easily use C or C++ for anything except interfacing with Obj-C frameworks).
That said, Obj-C for me is very much an acquired taste. I had seen it a few times before I started developing iOS applications, and on the surface, it never appealed to me at all. The idea of using it for anything never even crossed my mind, and looking at the sad state of GNUstep usage and the many modern Obj-C features GCC doesn't implement, I'd guess it's not really popular in the FOSS scene either.
Now that I have used Objective-C for some time, it would actually be a serious option anywhere I used to automatically pick C++ before. The more you learn about Obj-C and the more you use it, the more you will like it. It basically has most of the good things of C and C++ without many of their downsides, and it adds some almost Pythonesque dynamic features that are very useful and elegant.
IMO Objective-C deserves a little more credit than 'just the language Apple uses'.
"they don't 'force it on you' by the way, as you can easily use C or C++ for anything except interfacing with Obj-C frameworks"
Unfortunately, since most of the popularity is due to iOS applications, developers have to touch these frameworks at some point or another, and that means touching at least a tiny bit of ObjC.
I believe Go will eat Objective-C's dessert ("general programming") in the "good things of C and [C related languages]" language category.
I would guess (but don't know the compiler) that Objective-C is a better choice than Go if you want more precise control over memory layout and especially memory management. But that feature is a PITA when you don't need such control and would much rather have the language runtime take care of memory and GC.
Objective-C the language has always been extremely portable. Even pre-NeXT there were compilers/runtimes for Atari, Amiga, Mac, DOS/Windows, Unix, QNX, and probably a bunch of other systems I am not aware of.
When NeXT shared their compiler patches and runtime (after some GPL prodding), Objective-C became part of GCC and available on pretty much any supported platform, especially after the GNU runtime delivered a portable messenger ( objc_msg_lookup() vs. objc_msgSend() ).
That's the language itself, but you also need to consider the frameworks implemented in it.
NeXTSTEP itself was, of course, ported to Intel, HP's Precision Architecture (PA-RISC) and SPARC. Compiling for other architectures was as simple as checking a box in Project Builder. Later there were OPENSTEP-compatible implementations on Windows and Solaris.
The grand-daddy of the open-source implementations is, of course, GNUstep, though there are a bunch more that have been around for a while, such as libFoundation and mgstep. My current favorite is Cocotron, which allows cross-compilation for Windows and Linux right out of Xcode.
There is a significant difference between C++ and Objective-C in this matter. Implementations of C++ that emit C are almost always full compilers that target C as the output language, whereas Objective-C can be implemented as a fairly simple preprocessor on top of C that needs almost no understanding of C's syntax (traditional Objective-C, i.e. without blocks/closures and such features).
We never ran any NeXT machines. There was no customer requirement on language (and frequently there still isn't, thankfully - we've got some good negotiators who manage to convince customers that picking the language themselves is often not smart). As I recall, a number of candidate languages were examined as a replacement for C (which was what we were mostly using at the time), and the lightness of Obj-C, its Smalltalk-like object model and the ease of training newcomers to go from C to Obj-C were amongst the reasons.
Could you give some background on how that decision was made? I've only seen Obj-C used in a few situations, usually academia and businesses that bought into NeXT computers. I've met a few neckbeards who used it on Linux for GNUstep projects.
Outside of Apple and those few examples, I haven't met anyone who has decided that Obj-C was an obvious choice.
You could make the same argument for other languages - I've never used Java out of preference, but have written 100,000s of lines of it because it was what was required to get the job done in the context.
Perhaps for J2ME development, but apart from that I'd say thousands of companies have individually decided to use Java. Apple is just one company that has decided to use Objective-C.
Edit: I forgot about Android, but Java was "popular" before that - in fact Android probably uses Java because Java was popular...
My favorite jab here is that thousands of IT managers, after some multi-day infomercial paid for by IBM, Sun, BEA or Oracle, decided they'd buy something that limited their developers to Java.
But really, Java is a fine language, not much worse than C++, and, as far as static typing goes, fairly OK. What bothers me most is the insane proliferation of bloated frameworks and tools that Java seems to generate around itself. In order to use several popular tools, you need Eclipse. It's not a bad IDE, but needing an IDE for even basic stuff like building a web app clearly is a symptom of something that went very wrong. Some wrong decisions are really hard to undo.
Languages aren't used in a vacuum, and the economics of them involve positive network externalities and switching costs, so 'individually' needs some qualification.
What kind of thing do you mean? I was (to my utter dismay) a Java developer for more than 10 years. Most of the time I worked on business internal projects or J2ME. Seems to me that for internal projects, the businesses could have chosen anything?
Of course there was the hype, the hope to find more developers, the idea that there should be only one language in the company and so on. (All wrong, imo...)
Maybe all those factors could be combined into a single one, "popularity" :-)
I guess once Apple has created thousands of Objective-C developers (already done), it might become interesting for other things, too. It is a self-feeding loop.
Because I suck... I just didn't manage to jump ship yet. I'd take off time for months to work on my own projects but usually failed to finish what I started. Then money would get low and I would accept the next Java contract to make money again.
These days I tell the recruiters that I am leaning towards Rails and JavaScript, but in both I don't really have as much experience as with Java. I did some small things with Rails, but I also don't have the killer combination of design sense and coding skill that the younger Rails devs seem to have. Still, I have rejected most Java projects in the last three years. Might grudgingly look into Android, though (I like Android, but I hate Java, as has probably become clear...).
Well, the entire stack is open source, so no one needs to port it. I'm pretty sure that people like it because it, along with Cocoa, makes iOS a great platform to develop apps for. But as for spreading ObjC, most of the features it has have been in dynamic languages like Perl for many years longer. Perl even has ObjC lib bindings and much better integration with Unix, albeit not necessarily for making apps.
I think the popularity of ObjC is hindered more by the fact that people would be taking a step back if they used it for something besides what it's designed for (Cocoa apps).
My point is that the existence of ports is irrelevant. A language is popular if it is used by a lot of people. The choice to use it involves a lot more than language preference.
To claim that Objective-C doesn't count as popular because anyone who chooses to write for iOS must use it whether they like it or not is like saying that Java doesn't count as popular because anyone who chooses to develop on an enterprise stack that a corporation has built on Java must use it.
Also, can you assume that even the apps that are in the Apple domain use Objective-C enough to warrant calling the language "popular"? Or are we seeing apps written in another language with a bit of Objective-C acting as glue between the app and Cocoa/iOS?
I'm just looking at the slew of cross-platform games (eg. the GTA series) that have recently been put on the iDevices and thinking "They wouldn't have rewritten those from scratch, would they?".
> They wouldn't have rewritten those from scratch, would they?
I do a lot of cross-platform integration. This is based on my experience. If you know you're going to cover multiple platforms from the beginning, you will do a lot of things to make that a lot easier.
Most of those would be written to target a graphics framework (like OpenGL) for the visual elements. Then, for each platform, there will be a host layer that handles OS-specific things like file & network access. You can also use libraries to abstract that for you.
So the amount that needs to be rewritten depends on how similar the new platform is to the old ones. Going from PC to iOS would be about 40%. Going to Nintendo Wii, maybe 70%.
Unless the platform is particularly unusual, most of the hard problems were solved in the first release, and adding additional platforms is fairly mechanical. So even if you need to replace >50% of the code, you know roughly how you're going to do it before you start.
I think that Objective-C’s pros, like the way message passing can be used for UI event handling, come to life only with a framework that makes good use of the language’s strengths. People drawn to the language would be likely to stick with the framework.
I fail to see how the article you link to supports your opinion in any way, and neither you nor Siracusa offers any arguments why Objective-C needs replacing. At all.
You can summarize Siracusa's piece as 'Objective-C needs automatic memory management, it needs automatic memory management, and all the other popular languages have automatic memory management, so yeah, Objective-C needs automatic memory management'. He then rejects the fact that Objective-C on OS X has had garbage collection for years because 'it is not a no-brainer because not every developer uses it' and because iOS doesn't support it. Guess what, ARC is now the default way of 'managing' memory on iOS and OS X, and it has some very clear advantages over garbage collection. I wrote 'managing' in quotes, because with ARC you don't actually have to do anything as a developer, and if you never pay attention to it, it is just like writing code in a language that has GC.
So unless you have some actual arguments why Objective-C needs replacing, or what important features it is missing that other languages do have, it is very hard to take your 'No and No' very seriously.
Just out of curiosity: did you ever actually develop anything non-trivial with Objective-C, and if so, what part of that experience makes you think people would not use it voluntarily?
I'm an iOS developer at my 9-5, so it certainly could be a case of disliking the thing I am most familiar with.
You seem to be disputing my claim that "people don't like obj-c and wouldn't use it if they didn't have to" on the grounds that it is capable and getting better. I don't think that is enough. It's all just my opinion, I don't think C# has a bright future either.
As for my "No and No".
## My First No - "No it is not popular" - Purely anecdotal
I interpreted popular as "well liked". Those I have met and worked with all seem to view Objective-C as "what we use", not a "language of choice". I know people who have grown to appreciate and even like it, but not a soul who says "Obj-C is my favorite" outside of the context that it makes them the most money. I'm sure Objective-C fans exist, but to qualify as "popular" to me, they would have to be the overwhelming majority. I've never met an Objective-C person who talks about it with the same grin that a Ruby person wears when they tell you about "unless".
## Second No - "No, I don't think anyone would use it by choice" (stupid internet assumption, anyone == people who think exactly like me)
The reason for this, I think, has more to do with its lack of popularity than anything else (yup, some circular logic, I know). Every Mac developer I know already knows at least one other language that they can use everywhere else, which I think is why they aren't putting any effort into building a community around using Obj-C for other jobs. Why bother? There's already overwhelming support for other languages of similar merit, so why opt for the unpopular one?
## My own experience
ARC has been wonderful, and I think it greatly improves the process of getting things from idea to application. However, I still feel like I am writing - or rather tab-completing - too much code to accomplish trivial tasks. There is so much ritual that could be done away with completely. If the most common case is to @synthesize an @property, why am I forced to @type @it @out every time? I understand there are sound, logical, historical origins for many of my frustrations (like separate .h and .m files and having to wear out my 'n' and 's' keys). To be honest, much of my complaining is nothing more than trivial syntax whining, but to me, that is all friction.

Here I must also admit that I am probably seeing Obj-C in a negative light because I have to interact with it through Xcode. I've tried JetBrains AppCode, but there is so much switching back and forth to Xcode for Core Data and interface stuff that I now just keep it around for a few specific tasks. With Obj-C it's really hard for me to separate the language from the Apple APIs, Xcode and the iOS development community, all of which I would consider "take it or leave it" quality. Every once in a while Apple throws you a bone like NSOperationQueue, which is a joy to work with, but again, that's conceptual, not syntactic - it's still a million characters to do anything with it.
I can't think of a reason I would pick it over C for low-level serious business, or over $yourScriptingLanguageOfChoice / (((your-functional-language-of-choice))) for everything else. It's got some cool features, but nothing that overpowers its awkwardness for me.
I was very shocked when I heard old Siracusa bashing ObjC in his rant about the dreadful HFS+ last month (in his terrific podcast, Hypercritical). I didn't know why he didn't like ObjC (I really like it), and I still don't understand why exactly. I've written iOS/Mac apps and am familiar with it. I don't think his critiques are on point this time.
Well, yes, it is popular, and yes, it is popular today because Apple uses it, but still, people are using it in projects that don't have much to do with iOS/OS X, such as Web Frameworks (Frothkit, Bombax) and client-side MVC (Cappuccino).
In this respect it's similar to C# and Mono, I think, but on a much smaller scale.
Another datapoint for my theory for how Slashdot worked: the usual mob made enough actual experts mad enough to comment and other annoyed clueful people moderated those comments up.
From what I have seen, Apple succeeds in part because they have a corporate culture that fosters good OO and good software architecture. Most companies through the early 2000s didn't have enough BS-filtering capability to accomplish this. (Only the top 10% to 25% of shops had it, I would guess.)
Add to that technical capability, the actual ability to market and an understanding of UX/design, and you start to understand how they can execute so well.
I heard about Objective-C before C++. The big complaint I remember about Objective-C was that it had a very different syntax for the O-O parts. When C++ came along, it was seen as having O-O more integrated into the C language. I also think Borland's Turbo C++ was a big boost for the language, which probably led to Microsoft adopting C++ rather than Objective-C. C++'s strong type checking and fast polymorphic dispatch were also seen as attractive features.
Another thing to remember is that compilers were expensive back then. You couldn't just download a language from the web and try it out. Trying a new language meant an investment of several hundred dollars, and so you tended to just stick with your first choice. Turbo C++ was probably the first O-O language available for less than $100, so it became many people's introduction to O-O.
I found C++ to be complex and difficult, but I thought that was just the price to pay for O-O. Later, when I learned Smalltalk, I realized that was not necessarily the case.
> are there projects using Obj-C out of an Apple context?
It's the same as 12 years ago: just GNUstep and things in the GNUstep halo like Étoilé. Sony started a project to put GNUstep on mobile (SNAP) a few years ago and promptly dropped it.
> I always wondered if that popularity is tied to Apple devices.
SNAP was a very strange story. I tried to google what happened: on 25 Nov 2010 everyone reported "big news: Sony is using Obj-C with GNUstep!", and two days later, on the 27th, "Sony killed SNAP!"
I keep hoping someone will buy the thing from me. It's been fun to run it, and I think the metrics are a lot better than TIOBE's, but I just don't have the time any more.
I think the non-success of GNUstep is rooted in another problem: it is a beast to get running. I tried to get a GNUstep toolchain on my Mac when I started ObjC. No dice. It's really _hard_ to get started. If GNUstep were as easy to start with as, say, Qt, it would be much better.
You can look at that from another angle: nobody cared enough to make decent tools for GNUstep, while many care about other languages (C, C++, Java, Python, Ruby, etc.). Linux programmers are keen to copy UI effects from Apple, but so far they seem to keep away from its programming language.
Considering the work that went into GNUstep, its window management, the libraries, etc., I think it's the typical programmer's oversight: it runs for me, so why should I care to make it run elsewhere?
I don't think it's that. Rather a) on OSX, you might as well use real Cocoa and b) on Linux, if you want to integrate with anything else, then you might as well use GNOME/KDE.
Bah, I've used Obj-C for a couple of multi-month stints just because I wanted to like it. I tried my hardest to get into the groove of it, but I just find it to be a really weak way to add objects on top of C.
Contrary to what most of you Hacker News readers may think, Apple devices are not common out there (yes, I know, there are iPhones - yet they're too expensive, and most of their applications are NOT being coded in Obj-C), so Obj-C hasn't caught on yet.
Six months ago, 250 million iOS devices had been sold - and that's iOS only, no Macs. I make that one for every 28 human beings on this planet. What number would tip this so that you considered Apple devices to be common?
"…and most of their applications are NOT being coded in Obj-C…"
Got any sources to back that up? I've often looked for numbers comparing ObjC vs. PhoneGap vs. Titanium etc., but I've never managed to come up with anything conclusive.
I think that sohn's point is not that PhoneGap/Titanium/etc. are more popular than they really are, but that most iOS apps are really written in C/C++ with only a thin shim to interface with iOS where necessary (think games portable to other platforms).
Apple devices may not be the most common devices out there, but they are a very popular target for developers.
Many application developers will target iOS before they target Android etc., even though Android's sales numbers may be higher, simply because the Apple store has a reputation for converting paid apps better.
Therefore, any serious mobile developer probably knows Objective-C.
Not sure what you think the applications are being coded in?
Sure, maybe some use cross-platform toolkits, but I bet the best-selling apps are written in Obj-C, or at least by developers familiar with it.
For what it's worth, CNBC's 'All-America Economic Survey' (http://www.cnbc.com/id/46857053) concluded that 51% of all homes in America have at least one Apple product.
I would say that's starting to get pretty common (though of course this is probably skewed by the demographic that reads CNBC).
One indicator would be that there would be ports of Objective-C to many platforms, is that the case?