
The Mac App Store is the end of an era. For a while the Mac was the most user-centered AND developer-centered platform that existed. When I bought my first Mac, Apple was actively recruiting developers by holding free conferences around the US.

Apple has decided to go more in the user-centered direction than the developer-centered direction. This is definitely the right move for them. However, it makes me a little sad; it's the end of an error. Now I see Linux as the developer-centered OS and OS X as the most user-centered OS. It's a hard call to make.

Mostly I think I am harboring some resentment. "Back in my day we had to write our own licensing code, host our own web payments, and host our own autoupdates. All the kids these days do is launch the darn app." I think I almost believe it's not fair.
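(To make the "host our own autoupdates" chore concrete, here is a minimal sketch of the kind of hand-rolled update check that workflow implied: the app fetches a small version manifest from the developer's own server and compares it against the running version. The URL and JSON shape below are made up for illustration; in practice many Mac developers leaned on frameworks like Sparkle rather than writing even this much from scratch.)

    import Foundation

    // Hypothetical manifest hosted on the developer's own server, e.g.
    // https://example.com/myapp/latest.json
    // -> {"latestVersion": "1.2.0", "downloadURL": "https://example.com/myapp/MyApp-1.2.0.zip"}
    struct UpdateManifest: Decodable {
        let latestVersion: String
        let downloadURL: URL
    }

    // Fetch the manifest and call back with it only if a newer version exists.
    func checkForUpdate(currentVersion: String,
                        manifestURL: URL,
                        completion: @escaping (UpdateManifest?) -> Void) {
        URLSession.shared.dataTask(with: manifestURL) { data, _, _ in
            guard let data = data,
                  let manifest = try? JSONDecoder().decode(UpdateManifest.self, from: data),
                  manifest.latestVersion.compare(currentVersion, options: .numeric) == .orderedDescending
            else {
                completion(nil)      // already up to date, or the check failed
                return
            }
            completion(manifest)     // a newer build is available at manifest.downloadURL
        }.resume()
    }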

I guess at the age of 24, I am running into one of the first big changes that I have to accept if I want to stay relevant. I knew this market changed quickly; I just always assumed it would change in the way that I wanted it to.




In what way is the presence of a Mac App Store making the Mac less developer-friendly?

If anything, the creation of yet another possible revenue channel only makes the OS more friendly to me, not less.

"Back in my day, we had to roll our own blitter if we wanted to do anything serious" could also be said, but that doesn't mean modern graphics libraries and hardware are less developer friendly. Eliminating the need to do repetitive grunt-work is always a good thing.


Because of these things:

- Apple decides which apps go and which don't.

- I have to read a list of rules to know if I can actually sell my application.

- Apple takes 30% of my money.

- Apple decides what programming language I can use.

I thought it was generally agreed that the App Store model is not, on the whole, good for developers.


Agreed upon by whom?

An App Store isn't that great for developers (in the "developers being free to do whatever they want" sense of "good for developers") when it's the only option available.

That is not the case with the Mac App Store. If you don't find any benefit in the tradeoffs that particular option provides, don't pursue it. Write your "own licensing code, host your own web payments, and host your own autoupdates" like in the "good old days", if that's what floats your boat. But it's still better to have another revenue option at your disposal than not to have one, even if you don't choose to use it.


>An App Store isn't that great for developers ... when it's the only option available. That is not the case with the Mac App Store.

I keep hearing this, but I've yet to be convinced.

Over the last 5-10 years, Apple has pushed developers harder and harder to keep up with its platform transitions. They put giant efforts into things like Classic and Rosetta, only to strip them from the OS entirely a few years later. They repeatedly trumpeted the full equivalence of Carbon and Java with Cocoa, and now it's infeasible to use either as the basis for a full-featured application. They developed all kinds of new APIs for QuickTime in 10.5, then in 10.6 introduced "QuickTime X" as the only 64-bit native solution, effectively deprecating everything else. And most recently there's this unusual attack on the Flash plugin, of all things. It's as if they now revel in actively destroying backwards compatibility.

All of these decisions had the effect of reducing Apple's support and maintenance overhead while strengthening their control over the direction of their platform.

I would be surprised if Apple's very clearly demonstrated zeal for taking control and eliminating developer options did not extend to the new Mac App Store.


Seems to me that the giant efforts Apple put into things like Classic and Rosetta gave developers a longer window to move over before legacy support was dropped. The other options would be:

a) support the legacy systems forever, or
b) don't support them at all.

Option A leads to a Windows-esque environment where support for older platforms actively holds back development and innovation for newer ones. Option B kills everyone's existing apps. Neither of those sounds like a very developer-friendly or even user-friendly option.

Frankly, the fact that they built those at all shows that they're willing to go the extra mile to give their developers a heads-up and time to update their applications before breaking them entirely.


Developers are mostly irrelevant in Apple's calculus: it's what the user sees that matters. They put out Classic and Rosetta not to help developers (with both, the message was very clear: the train is pulling out of the station, and you'd better get on board with the new thing immediately or you're dead) but to give users something to run on the new systems. Get users buying new systems and developers will follow, especially when they're essentially forced to.


It seems to me that you've started with a conclusion and found facts that fit your thesis. That's the wrong way to go about reasoning.

What we actually know about the Mac App Store is that it will only be an additional way to obtain software. It doesn't make sense to condemn it as "Bad For Developers" based on groundless speculation and presumed irrationality on Apple's part.


>It seems to me that you've started with a conclusion and found facts that fit your thesis. That's the wrong way to go about reasoning.

So let's look at it logically, then. In nearly every example I gave, did Apple or did it not make a promise to developers which it then broke, costing those developers time and revenue?

And did Apple or did it not recently make a promise to developers regarding the continued availability of an existing, long-standing technological alternative, namely direct downloads of application binaries?

Would a reasonable person extrapolate from those past observations that Apple would behave in a similar manner when circumstances similar to those that I mentioned arose?

If not, then either you don't believe that people's future actions have anything to do with their past actions, or you disagree with a series of easily verifiable facts.


There is a cost to maintaining features. Ten years ago, making Java a first-class desktop citizen was a good thing; today circumstances have changed and it's not worth the effort. So you could say that they have broken a promise, but do you expect that promise to hold for evermore, or only for as long as is reasonable (for some definition of reasonable)?

Apple has always been that way, making life harder for developers when it suits what they perceive the consumer's needs to be.


> All of these decisions had the effect of reducing Apple's support and maintenance overhead while strengthening their control over the direction of their platform.

Yup, plus the effect of improving user experience.

The most developer-friendly thing a platform vendor can do is attract users. Apple's done that in spades.


I would argue that bringing a massive number of new customers to their developers is actually rather developer-friendly. Maybe not "programmer-friendly", as Linux is, but how many developers do you see making a living off Linux applications?


More than are making a living off iOS applications.

And there's probably a lot of overlap in fact. If iOS is the client, I think it's a pretty good bet Linux is the server.


Only because it's semantically important - did you mean 'end of an era' or 'end of an error'?


Thanks, I replaced it with what I intended, 'era'. I think my fingers are much more used to typing 'error' than 'era'. :)


There's still another instance of "end of an error" you forgot to edit.

Are you British? :P



