
Even before I ever read GoF, I found myself using their design patterns in my object-oriented code (i.e. "reinventing" what already existed). At the time I didn't think it was any magic feat, it's just the way I organized the code. It's nothing surprising, all OOP developers end up doing this -- the abstractions provided by an object-oriented language naturally lead to the design patterns we see in the GoF book. In fact, in the introduction of the book, the authors describe how the patterns came from existing, real-life software which they found worked well in practice. Reading GoF is useful to an OO programmer because whenever they encounter a difficult problem, there's usually a "pattern for that". Oh, I need to manage complex states and state transitions - use a "state pattern". I need to route messages to any object that desires the message feed - use an "observer pattern".
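
To make the "pattern for that" point concrete, here is a minimal sketch of the observer pattern (written in OCaml with made-up names -- a toy illustration, not the GoF formulation):

  (* A subject keeps a mutable list of callbacks and notifies each one
     when an event occurs. *)
  type 'a subject = { mutable observers : ('a -> unit) list }

  let make_subject () = { observers = [] }
  let subscribe s f = s.observers <- f :: s.observers
  let notify s event = List.iter (fun f -> f event) s.observers

  (* Usage: route a message feed to whichever parties asked for it. *)
  let () =
    let feed = make_subject () in
    subscribe feed (fun msg -> print_endline ("logger: " ^ msg));
    subscribe feed (fun msg -> print_endline ("ui: " ^ msg));
    notify feed "new message"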

As the article states, GoF patterns are a tool and shouldn't be treated as anything more or less than that. I've worked with developers who treat design patterns with religious sanctity and their designs are convoluted with forced GoF patterns. The result is bloated code that is difficult to follow. Likewise I've worked with developers who think they're a scam and prefer to "cowboy" their own solution, which either ends up reinventing a design pattern or just being plain ugly.

I saw some comments in this thread implying how language paradigms other than OOP (primarily functional) eliminate the need for many of the GoF design patterns. This is true, but it doesn't mean that functional (or whatever paradigm) languages do not have design patterns -- they have their own design patterns as well. For example, the concept of a "monad" can be considered a design pattern to implement side effects in pure functional languages which lack that feature. Some design patterns transcend language paradigms; for example, Model-View-Controller (MVC) is universal to procedural/imperative, functional, and object-oriented code. Design patterns also exist for concurrent computation -- a great example is "The Little Book of Semaphores" (http://www.greenteapress.com/semaphores/). The book details multiple "design patterns" for solving common problems with concurrent computation in languages that provide semaphores and threads as the primitive concurrent computing abstractions. However, to help make the point of the other commenters -- using a language designed for concurrency (e.g. Erlang) eliminates the need for a lot of these design patterns. On the flip side, using something like Erlang introduces a whole new set of problems, meaning you need a whole new set of design patterns!
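
As a small, hedged illustration of the monad-as-pattern idea (using failure handling rather than I/O, and with names of my own choosing), here is OCaml's option type with a bind operator:

  (* bind threads a possibly-missing value through a chain of steps,
     short-circuiting to None on the first failure. *)
  let bind opt f = match opt with
    | None -> None
    | Some x -> f x

  let ( >>= ) = bind

  let safe_div x y = if y = 0 then None else Some (x / y)

  (* Usage: both divisions succeed here, so result is Some 10;
     a division by zero anywhere in the chain would yield None. *)
  let result = safe_div 100 5 >>= fun q -> safe_div q 2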

What I think was revolutionary about GoF was the coining of the term "design pattern". It's analogous to the concept of "algorithm" but not quite...it's more general than that. What's beautiful about the term is that it's applicable to more than just programming, but rather any sort of engineering, design, or problem solving. As the author of the article demonstrated, there are even architectural "design patterns". In any field, you'll have design patterns (they may not call them that) and books on them that predate GoF. However, the way GoF documented them, as wonderfully and clearly as they did in their book...it's just beautiful.


As the author of the article demonstrated, there are even architectural "design patterns"

Even? Alexander invented them. He also disowned their use in software - something you'd think software pattern advocates would pause to consider, but never do.


It's not anti-Unix or anti-programmer...it's all about being anti-freedom.

All these companies want to do is turn your computing experience into a locked down, controlled, monitored experience. They want to turn your limitless and powerful computer into a home appliance...like a blender or electric razor. No way to customize, no way to modify, no way to organize thoughts your way. No hacking or "jailbreaking" into your own device to give yourself features like "tethering". That would violate the ALL HOLY EULA (unless you paid $5.99)!!!

Also no need to understand how the technology works -- technology is now magic. And when technology becomes magic, they have control over you. Google manages all your e-mail, Facebook and iPhone manage your social life. Google and FB make their money through selling your information to marketing companies. Apple makes it by selling you hardware. Both of them make money by exploiting the lack of will by the masses to understand their technology and how it affects their rights and privacy. They make money when they are in control of you and your data. They make money by making you think you need a new computer when all you need is a better operating system.

Getting rid of files is like Newspeak from Orwell's 1984. They want you to be dumber. They'll take care of savin that file for ya, you just worry about eating nachos and 'batin. Besides, they need to take a looksie at it first to infer information about your shopping habits...


"All these companies want to do is turn your computing experience into a locked down, controlled, monitored experience."

Oh, bullshit. Do you really think there is someone sitting around at Apple dreaming up new ways to take away your freedom?

No. There fucking isn't.

I'm a huge believer in small, composable components. And I know that developing against closed platforms sucks. Being able to dig down into the source code of every layer of your stack is critical to the understanding necessary to build high quality software. Libre, Gratis, and Open are all key elements of the software I choose to use to do many mission critical jobs, every day of my professional career going forward.

But you know what?

The people at Apple just want to make damn good products. They are proud as hell of those products. They work very hard to make them that damn good.

Freedom isn't free. There are costs associated with development, complexity, opportunity. Free is a strategy that only few can afford to execute. Google has to vary the gratis, libre, and open dials with their products every single day to paint the benevolent picture they count on to keep their recruiting pipelines full, while still building high quality products on time and budget.

My girlfriend's Android phone (a highly rated model) is a horrible piece of crap next to the iPhone. It lags, crashes, has UX issues, flimsy hardware. The iPhone is a glorious, crowning achievement of engineering that still, years later, Microsoft and Google are struggling to replicate.

I'm no dumber for owning an iPhone. It gets the job done: It makes phone calls. It settles arguments at the bar. It lets me cut in line at Chipotle. It wakes me up in the morning. It keeps me entertained. It makes me smile every time that little orange plane animates by as it goes into airplane mode.

It just fucking works.

You want to talk about freedom? I'm free from thinking about memory management. And processor utilization. And data loss. If it stops working, I'm free from worry because the Apple store makes everything better. I am free from all of the horrible things that can go wrong on my production servers.

Unless something does go wrong on my production servers, in which case I'm free to be away from my desk when it happens. And I'm free to drive aimlessly without worrying about getting lost. I'm free to call a cab, when I just don't feel like walking home.

In context, my iPhone represents every last bit as much freedom as my blinking cursor in an empty vim window.


"Oh, bullshit. Do you really think there is someone siting around at Apple dreaming up new ways to take away your freedom?"

> No, but I honestly think there are people sitting around at Apple dreaming up new ways to take away your money, even if it entails taking away your freedom.

"The people at Apple just want to make damn good products. They are proud as hell of those products. They work very hard to make them that damn good."

> Yeah... As an aside I think Apple is overrated. Its interfaces are nifty but as seen with Lion they're not above a misstep. Plus while they offer a very nice first-time ("store") experience, with time its edge over the other OS slowly fades. You have to learn all the keyboard shortcuts or you'll be a bit helpless in your shiny OS.

Even hardware which tends to be quality can fail, as witnessed with graphics card overheating problems, for instance.

"My Girlfriend's Android phone (a highly rated model) is a horrible piece of crap next to the iPhone. It lags, crashes, has UX issues, flimsy hardware. The iPhone is a glorious, crowning achievement of engineering that still, years later, Microsoft and Google are struggle to replicate."

> Anecdotal evidence does not make a general truth. I for one have had the exact opposite experience. I had an iPhone 3G and with the new firmware the thing was a trainwreck of usability, while my new Samsung Galaxy S2 is snappy as hell. This is essentially irrelevant, as it has to do with computing power more than product quality (though I get more bang for my computing power in Android; the iPhone firmware updates have brought me copy/paste and... what?).


Side note: Please do not use "> " for your reply. That is generally reserved for quoting what you are replying to.

> No, but I honestly think there are people sitting around at Apple dreaming up new ways to take away your money, even if it entails taking away your freedom.

Sure, there is some dude somewhere who only cares about driving up the quarterly numbers. But generally Apple has a pretty long record of shipping premium products at a premium price and selling them with a "buy it or don't" attitude. With the possible exception of those damn video adapters, I've never felt like Apple was trying to squeeze money out of me. Sure, they had to play the DRM game with iTunes and let's not even get into the bullshit that the telcos get away with. But, as a composite entity, Apple has been pretty damn respectful of its customers.

> Yeah... As an aside I think Apple is overrated.

Different tastes for different people. Go ahead, buy whatever product you like. Just don't accuse a team of hard working, talented people of being out to take your freedom.

> Its interfaces are nifty but as seen with Lion they're not above a misstep.

I'm quite happy with the improvements in Lion.

> Plus while they offer a very nice first-time ("store") experience, with time its edge over the other OS slowly fades. You have to learn all the keyboard shortcuts or you'll be a bit helpless in your shiny OS.

I've used a Windows machine since before I could speak. I worked for Microsoft for several years. I now cringe every time I have to touch the one Windows box at our office.

> Even hardware which tends to be quality can fail, as witnessed with graphics card overheating problems, for instance.

And you've never had a component on your other machines fail? shrug When the machine I built had a power supply failure, I had to go fucking replace it. I was free to use whatever power supply I wanted, but I wasn't free to use my damn computer for the four days I waited for my new component. When my iMac had a buzzing noise, they replaced it in the store in under an hour.

> I had an iPhone 3G and with the new firmware the thing was a trainwreck of usability

Freedom isn't free. Making that new software compatible with that old hardware costs time and money. They did a decent job to appease the tiny cross-section of people who upgrade software, but don't upgrade hardware. Most people don't even know what a software upgrade is, so their old OS version is running just fine on the hardware it was tested against. In general, those who do care about having the latest and greatest can afford a phone upgrade.

> though I get more bang for my computing power in Android

Who cares how much bang you get? It does the same stuff. Again, it's clear you value different things.

Look, I'm not gonna respond further because as far as I'm concerned, I've made my point: It's not right to personify corporations as evil simply because they don't match your personal taste. It's insulting to the people who work really fucking hard and take great pride in the (not so) small impact they leave on the world.


When the machine I built had a power supply failure, I had to go fucking replace it. I was free to use whatever power supply I wanted, but I wasn't free to use my damn computer for the four days I waited for my new component.

vs

When my iMac had a buzzing noise, they replaced it in the store in under an hour.

So "sitting on my arse doing nothing, I had to wait four days for a replacement" versus "I physically took my machine to the Apple store and they gave me a replacement at the same time".

These are purposefully selective anecdotes. With your first repair, you yourself could have gone to a store - just like you did with the iMac, only not having to carry the damn thing - and got a power supply in five minutes, taken it back to the office, and had it installed and back up in half an hour. Computer stores and power supplies are as common as muck - unless your desktop PC is really weird, you'll find something suitable by simply throwing a brick.

Total time spent in process for each 'go to shop' scenario? Less than an hour plus travel time. And with the non-imac one, you don't even have to ferry the computer around. All in all, a pretty similar experience.

Unless of course your anecdote is even less fair and the 'four day wait' was for a server part.

It's insulting to the people who work really fucking hard and take great pride in the (not so) small impact they leave on the world.

People who work hard and take pride in the (not so) small impact they leave can still be doing a bad thing, even though they're full of good intentions.

http://en.wikipedia.org/wiki/Stolen_Generations (This link is just a strong example, I'm not trying to draw a parallel)


Both repairs were free (err...gratis) warranty replacements by the manufacturer. Of course I could have paid for a new component or repair at a local computer shop.

> can still be doing a bad thing, even though they're full of good intentions.

Totally agree. I just don't think they are doing a bad thing either. No one is forcing you to buy a locked down device. No one is forcing you to make a particular freedom tradeoff. Buy whichever product you like for whichever reasons you like. But just be cognizant of the tradeoffs and their costs without making value judgements about those who make different tradeoffs, both as consumers and as producers.


But just be cognizant of the tradeoffs and their costs without making value judgements about those who make different tradeoffs, both as consumers and as producers.

I agree about the consumers bit - what's good for me may not be good for you - but don't agree about the producers. We should be able to raise constructive criticism if we see a producer as damaging - and I think the exhortation to 'leave them alone, they work hard' isn't right.


Also: On the repair front, how long would you have been waiting for Apple to come to your office and replace the faulty part? :)


Just don't accuse a team of hard working, talented people of being out to take your freedom.

Are you saying the people working on Android, Metro and platforms not owned by Apple are not talented or hard working? If not, why is the manner in which Apple people are working really relevant?

You are simply evading the point. It seems like to you, your ownership of Apple devices is a personal obligation to defend Apple from its "enemies". It isn't. Again I am forced to ask why you are doing this. It taints the rest of your argument and makes everything you do sound awfully much like zeal, and not like a rational opinion.


> Are you saying the people working on Android, Metro and platforms not owned by Apple are not talented or hard working?

That's not what I'm saying at all. I was simply taking Apple as an example. The original article was about iOS and I think it's fair to say that from the original commenter's perspective, Apple is out to get your freedom.

I, in fact, was a contributor to Windows Phone 7. I worked damn hard to get the XNA deployment and debugging to work super smoothly. I'm super proud of my small contribution to that product.

However, the Win Phone platform offers just a tiny bit more freedom of hardware choice. And that comes with a cost. I can tell you that having a dozen potential devices floating around the office, with varying graphics cards and other specs, was very time consuming for development. The first Win Phone 7 release would have been much more timely if there had been a single hardware platform locked down much earlier in the development schedule.

As for Android, it's further down the spectrum. You hear about fragmentation and whatnot. There are very real costs associated with the flexibility that platform offers. Freedom isn't free. Sometimes it is worth it. For some people, like myself, I choose a different type of freedom for my phone.


The people at Apple just want to make damn good products. They are proud as hell of those products. They work very hard to make them that damn good.

It's very hard to read the rest of your comment and not have you mentally pre-positioned as a fanboy. I'm not saying this to be a troll, but because comments like this really do stand out. Would you give any other company this sort of slack you are now giving Apple? If so, why not?

And while that may not be such an interesting discussion in itself, I am not willing to support and hand over my money to someone acting against freedom of choice, because someone else on the internet says "Don't worry: These people are good guys. Really!"

Freedom isn't free.

Again. I would love to see anyone make this sort of defense for Oracle or Microsoft.

I'm no dumber for owning an iPhone

Maybe not dumber, but you have locked your mind to the most restrictive of the mobile OS platforms out there and the limited workflows it allows.

I would be very surprised if this didn't also limit your ways of thinking about how problems can be solved.

The iPhone is a glorious, crowning achievement of engineering that still, years later, Microsoft and Google are struggling to replicate.

Absolutely revolting fanboy talk. I tend to find the iPhone a glorified piece of needlessly heavy electronics running visually polished but annoyingly limited software, which fails at the most basic of tasks, like sending data from one app to another via something called "files".

You want to talk about freedom? I'm free from thinking about memory management. And processor utilization. And data loss.

So am I. On my Android phone. While my iPhone 3G had constant memory-problems because it wasn't built to multi-task. Oh well.

Why is it that iPhone owners seem to default to Android being an immature platform, citing Android 1.6 problems, while any factual representation of the limitations found in current iOS releases is answered with "iOS next", and that is supposed to be a valid answer, free from hypocrisy?

Seriously. You guys need to get out of the Apple store more often. It may actually be starting to dumb you down.


> Would you give any other company this sort of slack you are now giving Apple? If so, why not?

I used to work for Microsoft. Although you'd have to dig back a couple years in my comment history, you'll find many posts where I defended Microsoft and explained some of the intricacies and complexities of building products for the customers that Microsoft really cares about.

I also used to work for Google, and somewhere in my comment history, I also defend the fact that Google isn't out to track your every move.

In retrospect, I shouldn't have chosen to defend only Apple in my original post. It weakened my argument because of the perception of being a fanboy. I think that all the same points apply to Microsoft and Xbox.

PC gamers shout about how consoles are trying to kill your freedom. You can't even use your own choice of team voice chat utility? OMG! OPPRESSIVE. But really, the locked down platform was easier and cheaper for Microsoft to develop than the wild west of PCs / Direct3D. I quit being a PC gamer, gave up some of my freedom to play mods, so that I'd have the freedom to install whatever new game came out without having to think about the specs of my PC.

Different value tradeoffs for different consumers, or even for the same consumers with different needs at different times! Different value tradeoffs for companies producing those products for those who make different value tradeoffs as consumers.


Thought experiment: Is it possible to think Apple makes the best technology products without being a fanboy?


Oh absolutely. I have no doubt about that. In fact, saying anything else would be absurd.

However I think it's fair to recognize the difference between someone merely (very) happy about their Apple stuff and someone who seemingly is personally insulted when it is suggested that Apple is (shock!) a normal corporation following a normal corporation's need and desire to profit, making some ethical compromises along the way.

And looking at snprbob86's post here, it's full of seemingly personal feelings when discussing this topic. It's almost just short of saying "Dear sir. You have defiled my lover's honour and I challenge you to a duel".

I have to say he/she seems like the glorious, crowning achievement of the Apple PR-Department which is hellbent on making Apple-Products a matter of personal honour and identity. And it's very freaky to observe from the outside.


I think the key difference is the implication that Apple's sole intent is to make good products. "The people at Apple just want to make damn good products" is very different from "Apple makes damn good products."


Spot on. When people throw "freedom" into the discussion they often fail to understand just how big the concept is. Every choice made in engineering restricts some kinds of freedom. What about the freedom to press fewer buttons? Some people prefer that over the freedom of file management. As long as people can make these kinds of trade-offs, they are still free to do what they want.

I find it much nicer to use the term autonomy, which can be seen as "meaningful freedom". To some people having a commandline interface (which they do not know how to operate) is not a meaningful freedom to have at all.


"If it stops working, I'm free from worry because the Apple store makes everything better."

You should've just said 'I love Big Brother.'


Yes, I do think that there are a number of people at Apple spending a large part of their days figuring out how to prevent people from using their own purchased items in the way that they wish to. Lawyers, programmers, and PR people.


The hyperbolic indignation is not needed. This is not Reddit.


I agree with the OP:

> Oh, bullshit. Do you really think there is someone sitting around at Apple dreaming up new ways to take away your freedom?

Yes - more precisely they are working out new ways to charge you for it.

We're slowly being led into paying incrementally for utility computing on a device we actually paid up front for.

Only a fool buys something then pays to use it.


> Only a fool buys something then pays to use it.

You mean like buying any of the following:

  * A car (fuel, tires, a bunch of other stuff)  
  * _Any_ cellphone (either a monthly fee or pre paid)  
  * Clothes (I sure hope you wash them?)  
  * _Any_ electrical appliance (electric bill you know...)  
(and the list goes on...)

Yeah. A fool indeed...


What you describe are "appliances". Single purpose items.

Computers are, by nature, interchangeable with respect to their use at the whim of the user, without cost. They serve many purposes interchangeably.

What I am concerned about is that a cost for changing purpose is being programmed (excuse the pun) into the users, converting them from users into consumers.

Consider the case of a Leatherman tool. Would you buy one if it had a couple of useless tools supplied with it and you found out after you bought it that you have to buy all the good ones? Also, the screwdrivers can only be used with Apple-licensed screws and you have to turn them the wrong way.


The iPad is computer-as-appliance.


Maybe true, but the fact that iOS devices are selling so well shows that people actually want their computers to be as simple as appliances. I think that to many "regular" people (i.e. non-HN-reading types of people), the power in computers is when they provide a simple and idiot-proof way to automate many common day-to-day tasks. If given the choice, many people would rather go out for a picnic than sit down at a shell to figure out the difference between | and > so that maybe, one day, they could become command-line gurus and appreciate the power like we do.

The market for complex and powerful computer interfaces will always exist--it's people like you and me. But the market for simple, easy, and foolproof computing will always be much bigger. Apple is owning that market, and if it's worse for the consumer, well, they seem to be pretty happy about the situation so far.


I've seen this point brought up. Making something easy (which I'm all for btw) does not mean it has to lock down aspects that average users are not meant to look at (e.g. DRM). It's fine if they hide them, but when there are software and hardware mechanisms that prevent power users from manipulating the device - that is unnecessary. I should be able to replace the OS as well as install any app that I please on the device. They are actually taking EXTRA steps to prevent me from seeing the internals by using things like hardware-level OS image hashes (i.e. tivoization).

Now people say, I just want to use my device and get work/play done. That's fine, I'm all for it. However, having control benefits everybody. For example, if you want to tether your smartphone and use it as a wireless access point, writing the software to do this is trivial (and a power user can do it for you). However, smartphone companies don't want you doing that...unless you pay them more money for that feature. That's kind of unfair; the smartphone company is depending on PROPRIETARY LOCKIN rather than real competition. That's my problem with it. They want to control your experience and make you think you can't get a PREMIUM EXPERIENCE without paying them ludicrous amounts. I mean, all tethering does is update a few iptables rules here and there -- I have to pay 10 bucks a month for that??

Yeah, sure, Apple did a lot of great things, as did Microsoft and Google, for bringing computing to the masses. However, they all did it based on the open innovation of others (kernels, compilers, text editors, web servers, etc). They've used those wonderful open things to create a wonderful easy-to-use experience. At the same time, they've added ARTIFICIAL CONSTRAINTS to it (e.g. again tethering) to create PROPRIETARY LOCKIN and make you think your device has less capability than it really has. If these companies stay on these same paths and people maintain the same level of compliance...we'll get to a point where ALL computers are useless and innovation can't happen. Again, easiness and openness are not mutually exclusive.


And why exactly should my tablet be different from a razor or a blender? Or my TV or my car? For many, this is just a tool, not a goal.

  lack of will by the masses to understand their technology
There are many many many more fields besides IT. If everyone, be it a waitress or an accountant, is required to understand IT, it will not end well.

  They want you to be dumber.
No. I don't feel any dumber because I don't know how to fix my TV. I don't feel any dumber because my knowledge about car engines is very basic and most of it was learnt in the age of the carburetor. These are just means to get some news and entertainment, to get from point A to point B. Likewise, I am not calling my car mechanic dumb just because he has no idea how to build a web page or write an iOS application.

If you are a programmer and want to create something useful, you should let that crazy idea that everyone must know and worship your craft go.


>There are many many many more fields besides IT. If everyone, be it a waitress or an accountant, is required to understand IT, it will not end well.

I don't think that's the issue at hand. I think the issue is that they're putting an upper limit on what is possible, and what lies beyond that limit happens to require more knowledge.

I don't know a whole lot about cars either; changing brake pads is probably my upper limit right now. But if I wanted to do some more work, I could pull into a friend's garage, grab a copy of the maintenance book specific to my car's model and get to work. Why? Because the car didn't come from the factory with the hood welded shut.


Do not say "this is just a tool", it never helps. Besides, computers happen to be different in kind: they can do anything you tell them to. This is as close to a genie in a bottle as you will ever get. It wouldn't feel right to deprive most people of this little genie, don't you think?


I think you overestimate people's desire (let alone aptitude) to wrestle with the genie in order to get their wishes. I assure you that a great many people are only able to use computers by memorizing a series of steps. They have no mental model for how it works.

Given the options of "limitless but practically useless" versus "limited but quite useful practically," I'll opt for the second option. I'd rather have more people using computers than not. It's not fair to deprive people the power of even a relatively limited slice of computing just because they don't want to make a hobby out of it.


Once upon a time, most people couldn't read. No one thought it would be useful to them, including those who could read.

I think programming is the same. Just teach it in school, and watch. How many people wouldn't wrestle with the genie if their grades depended on it? Not many more than those who can't read, I think. (Yes, I'm aware that changing the school system isn't a piece of cake. But it's easier than requiring everyone to do non-mandatory work.)

Also, don't use the word "practically" when you actually mean "in the short term". Investing in the future can be worthwhile, despite hyperbolic discounting.

Finally, you forgot the option "limitless and immediately useful". Really, there's no reason at all why they should be mutually exclusive. If they are in our world, that's only because other interests are at work. (Open platforms and Free Software tend to be harder to monetize.)


How did tin-foil-hat-esque nut-jobbery get voted so high? I wanted an iPad precisely because it isn't a "limitless and powerful computer". I wanted a multipurpose appliance that requires minimal (ideally zero) maintenance.

Newsflash for you: the whole world is not computer programmers (I am and I still don't want to have to think about this stuff when I'm just consuming). They never wanted to know. Which is why something that finally lets them not bother learning this field is so incredibly popular.

For those of us who do want to know Mac still covers that market. I can (and do) install anything I want on my MBP.


I'm the same. I strongly prefer to only have to maintain one computer. I don't want or need more computers (in fact I got rid of a bunch). Augmenting it with appliance-like devices for additional needs is a pleasure.


Additional thought, because your post is just so wrong.

> limitless and powerful computer

More like confusing and impairing computer

> no way to organize thoughts your way

More ways to do so with my iDevice than with any other computer I ever owned!

> no need to understand how the technology works

Yay! Finally!

(Not to bring up the car analogy again, but do I really need to know how my motor works to drive to work?)

> technology is now magic

True, it's now accessible to (almost) all!

Etc. ad nauseam


Most comments in this thread strike me as being anti-"normal users". This is the elite, talking about how they can maintain their elite position.


That's not what the parent, or the article is saying. Don't turn this into some sort of class warfare for the sake of trolling.

The ability of an OS or application to expose data in order to allow you to manipulate it as you see fit does not preclude it being user-friendly and straightforward to use. That functionality might very well be hidden away, but available if you're inclined to use it.

The point that is being made is that this level of openness and flexibility is directly at odds with the need for control of some companies over their OS/applications and the content within, usually for the sake of profits.


Sorry, but the article says nothing about companies maintaining control for the sake of profits — only the parent developed this point.

The article made the point about the lack of composability of iOS, with the Grail of composability being the command-line. And I stand by what I said. Design is the result of trade-offs: you can't have ease of use and composability.

Steve Jobs understood the necessity of trade-offs better than anyone else.


But it does. Or at least, it does in every way that we've yet to imagine.

The things the author of the article suggests would not be "anti-programmer", but they are necessarily "anti-user". Breaking everything down into composable components and forcing the user to compose even simple, everyday procedures before being able to use them? How terribly difficult and exclusive to techies only!

Similarly, he suggests that the only way for meaningful interaction to occur between programs is through the unfettered access to a filesystem - as users have demonstrated time and time again the past decade or so, they don't want that. They will pile their music and photos and movies happily into one big dump so long as a much more proficient application handles the intricacies for them. They don't want a filesystem - it's a toy for people like us.

The ability of an OS to expose data for manipulation does not necessarily preclude it from being user friendly, but that's only true in a very ephemeral, theoretical way. The only ways we know how to expose said data, up till now, is entirely anti-user, and part of why the popularity of computers with the general population has not exploded until so late. For too long we've kept the keys to the kingdom, and you know what, fuck all of that noise.

And the claim that iOS (and other similarly restrictive OSes) have been developed primarily to protect the company's interest is patently absurd. The iPhone has been by far the best thing to happen to mainstream computer users in the past two decades. They have power at their fingertips that they can actually comprehend now, and use without navigating a tome of quickly outdated knowledge. This is the age of the device you don't have to take a class to use! The empowerment here is incredible, and people have been throwing gobs of money at Apple, Google, and everyone else who has realized this. The main reason these OSes are restrictive is because it guarantees usability and consistency in a way that completely open platforms do not (Gimp, anyone?).

This whole article, and resulting thread, really disappoints me. I'd have thought that what we've seen in the last 4-5 years would finally wake up the tech community and make them realize that computational resources are pointless unless it benefits the rest of humanity somehow, and chief among these is mainstream usability. But no, it looks like a large contingent of HN is happy to view users as stupid plebes - if only they knew the value of free OSes and would take the time to learn the command line!

Screw that train of thought, and damn the people who continue to reinforce the false assumption that technology need be complicated and obtuse. I'm sick and tired of looking at all the cool things we can do - things that are of real benefit to people - and having it locked away behind a glass wall of techno-wizardry when it does so much more good out in the hands of the masses.


I wonder how many of the people here decrying iOS as a radical loss of freedom would be willing to learn how the engine of their car works in order to drive it, or how all the machinery in a dentist's office works in order to get a checkup.

We live in an age of specialization and iOS just represents a far less leaky abstraction for the average user than the operating systems of the past.


I'm not anti-iOS, I'm anti-all current mobile OSes. I think you present a false dichotomy. I shouldn't have to know how the engine in my car works, but I should have the option of tinkering with it if I wish.

And I do expect my car to be componentized. I expect it to have a transmission, an alternator, a radiator, etc. While I have no preferences for how those components are designed, I expect them to work, on the whole, as one would expect those types of components to work. That is they have an input and output. I wouldn't buy a car where all of these components were replaced with 1 proprietary, untinkerable, thing.

What I want is Canonical or Mozilla, or whoever, to be able to write an OS that can be installed on just about any phone hardware, the way it can be done on any x86 computer. Whether it's the crazy patent situation or harder engineering challenges that prevent that today, I'm not sure.


In the theory that comes from a cursory glance, someone could make an OS that's portable, composable, open, free, changeable and tinkerable, etc. and that's easier to use than any system that's available today. That'd be great.

But in practice, as systems exhibit more of those traits more strongly, they, as a general rule, become less and less of a coherent, unified whole and more of the internal workings become exposed to the user. This has a direct impact on usability.

Until somebody resolves this seemingly-fundamental engineering constraint, we'll have to settle for different systems that varyingly trade-off flexibility and usability. iOS has been so successful because Apple has deliberately chosen to fall more on the usability side and a tremendous number of users have found that the tradeoff is worth it: that decreased flexibility doesn't harm them remotely as much as poorer usability would. Happily, there are also extremely flexible systems available.

Looks to me like this setup allows everybody to win as much as we know how to in this "imperfect world".


I disagree, and I would use desktop linux as the counter example. A distribution like Ubuntu is nowadays opinionated on user-experience questions while still maintaining the ability to easily swap out the DE for something else if the user so wishes.

That this hasn't happened on mobile just means we haven't reached the point of interchangeable hardware. I just worry that patents are the primary reason for this, and that we may never enjoy the same level of freedom in phones that we have on PCs.


The poster you replied to noted that they become much less of a coherent whole due to those factors. Bringing up desktop Linux is not helping your case against that assertion in the least.


I wonder if the rise of Chrome at the expense of Firefox is a counter-counter example. FF has grown bloated and more chock-full of features. Chrome, on the other hand, has retained a unified and coherent structure.


There's no free lunch. Flexible and easy are dichotomous. In the past the average user has paid the price for the greater choice we hackers have enjoyed. Who are we to say that this tradeoff was fair? Modern cars are not hackable but you'll be hard pressed to find a typical driver that doesn't consider this a fair bargain for increased reliability and ease of operation.

You can talk all you want about how things should be but the facts on the ground are that Apple has done more for the experience of the average user than any of the hacker idealists at the FSF/GNU etc.


Could you tell us how the App Store exclusivity significantly enhances usability? Apple already controls the native API and the core apps; how can they need more? Do they really need to ban poor apps? Do they really need the censorship?

Usability can't explain all the control. There are other reasons.


The control is there, but it's far from the primary or even a major driver, I would claim.

The control and banning of poor apps reflects something not only Apple is doing - see Google and their "banning" of content farms and ad farms.

Junk breaks discoverability, and discoverability is one of the more fundamentally important notions that users have started to get in the last few years.

A few years ago you'd get panicked calls from users who are afraid to click on anything, because the slightest misstep means an app crash, an OS crash, or just plain data loss. Now we've finally developed software that's intuitive enough and appeals enough to people's natural assumptions about software, that this is no longer the case. People are happy to tap away and discover new features of their devices and software, and that's great.

By allowing crapware and malware into the system, we degrade user trust, and we go back to the days where users are apprehensive about clicking on anything, for fear of losing their data, for fear of getting a virus, etc etc ad infinitum.

tl;dr: The "control" angle, for the most part (except some clauses such as App Store exclusivity) does enhance usability.


> (except some clauses such as App Store exclusivity)

To me, this exception is the most part. I wouldn't complain if they had relinquished it.

As for viruses, I am beginning to think that an OS that crashes is just unacceptable. (Which may rule out even OpenBSD, but I'll bite that bullet. http://www.vpri.org/ gives me hope).


> Breaking everything down into composable components and forcing the user to compose even simple, everyday procedures before being able to use them? How terribly difficult and exclusive to techies only!

Who said anything about "forcing" users? There's nothing wrong with exposing _both_ a set of composable components (for those who know how to use them, and perhaps hidden by default) _and_ a set of pre-composed apps (for the majority of users). Sure, it's not easy from a design standpoint, but where there is a will, there will be a way. Forcing users to rely exclusively on pre-composed apps is not the only alternative.


I get your point (and I don't think that the vast majority of iOS users care one jot about pretty much anything that the article is talking about), but there are ways of making things "programmer-friendly" without sacrificing usability.

Take MacOS and AppleScript. I doubt more than a tiny fraction of MacOS users even know that AppleScript exists. Its existence certainly doesn't get in the way of usability. But it's there for people who want to use it, and they can use it to chain GUI apps together, or to automate regular tasks.


Nonsense. It's about the loss of functionality, not the point that there's a simplified UI available.


Well observed :)


change 'television' to 'computing experience'

"When you're young, you look at computing experience and think, There's a conspiracy. The companies have conspired to dumb us down. But when you get a little older, you realize that's not true. The companies are in business to give people exactly what they want. That's a far more depressing thought. Conspiracy is optimistic! You can shoot the bastards! We can have a revolution! But the companies are really in business to give people what they want. It's the truth."

--- original quote ironically enough came from Steve Jobs


The old convenient argument "we give people what they want"..

It's the same with TV and newspapers. People get what they can get. People get what's in the market. People get what other people already got, because of a warm fuzzy feeling of familiarity and a need for identification with others. Any creator has a responsibility to inform users that things can be done in better ways, so that they can choose better things. "We give people what they want" is often an easy excuse to ignore such responsibility.


So build something better and let the market decide. I can teach my mother-in-law to use an iPhone in five minutes. Giving her all of this "freedom" would make the devices much harder to use and maintain, not to mention far more insecure. Not much of a malware danger on iOS either. There are very few apps that need a jailbroken device; the few that are unique are generally dealing with features the average user wouldn't want or need. For those Ivory Tower types, there's no law against jailbreaking.


All technology must strike some balance between catering to the power user and catering to the novice. For the last thirty years tech has skewed pretty heavily towards the power user. iOS pushes the balance radically toward the novice and has been richly rewarded in the marketplace for doing so.

I think iOS is a little more locked down than it has to be to accomplish this, but there's no question it represents a decisive break with the past. The money made in the future in technology will depend on humanizing technology. Over time I expect the market will find a paradigm that serves both ends of the spectrum well.


One more thing-- no one forces anyone to use a particular technology (unless it's being forced to use Windows at work.) The tin foil hat crowd might have some grand fears, but it's groundless -- you only give up the privacy you choose to give up. Google has never knocked on my door forcing me to give up private information, although the US government does! I would be more worried about intrusive and over regulatory whims of the government-- they can enforce their schemes at the point of a gun. Worry about the government and let the market take care of everything else.


The tin foil hat crowd might have some grand fears, but it's groundless ... Worry about the government and let the market take care of everything else.

Um... it's not really irony if you do it intentionally...


Wow, I had no idea I would stir up the beehive so much! Perhaps I was a bit dramatic in my presentation. The point I'm trying to make isn't that Apple, Google, etc are evil, nor am I saying their products aren't good. The problem is things like DRM, net neutrality, software patents, privacy, developer and user rights, taking away control of information from users and throwing it in the cloud. THEY WANT TO CONTROL YOU SO THEY CAN CONTROL THEIR REVENUE STREAM AND ENACT PROPRIETARY LOCKIN! They would rather do that than have real competition! It's the Microsoft business model with sexier marketing. What happens when files go away -- how do I back up my data? Is my data even physically on my device? Is it located on some server in one of Apple's data centers? Are they securely storing it? Will I have access to it if they go out of business? Similarly, as a developer, how come there is no option for me to get my app onto the iPhone without being subject to Apple's App Store submission process? Sure, users may not notice, but they've created a major barrier to market. The excuse is that "oh we need to make sure it's user friendly and high quality", but we've often seen that they pull apps because they compete with something they're doing, or they suddenly don't like the UI because it messes with Steve's vision of how your computer should behave.

I don't think open technologies and easy-to-use technologies are mutually exclusive. Android is a great counterexample to iOS. For DIYers on Hacker News, another great example is Arduino. Also, if you haven't taken a look at a Linux distribution in years (or ever), like Ubuntu, you really should try it out. Not only is it free and open, but it provides (in my opinion) an easier and aesthetically superior experience to Windows/Mac and is a way better option for the 90% of users that do simple things like listen to music, watch videos, take pictures, and surf the web.

As one other person posted, they aren't trying to be evil, but they want to nickel and dime you at every step. They don't want you installing a more open OS on your device because they cannot control the experience (I gave the example of tethering in my original post). What scares them is giving you options that allow you to migrate away from their platform. A lot of people said "hey that's just business", but there are other business models than "proprietary lockin" that gives users flexibility and choice as well as generate profits for the company.


All technology is magic at some level. Go deep enough and no matter what the technology is, we don't understand why it really works on some basic physical level.

Every electrical engineer or computer scientist out there might as well be a wizard. Why pretend that one piece of technology is inferior just b/c it operates on a different level of abstraction?


The point he's trying to make is that Lisp lacks static typing. In Lisp, code is data and data is code. This is what makes it so powerful. At the same time, the lack of certain safeguards found in ML can make it dangerous. ML was meant to give the flexibility of a language like Lisp but with stronger guarantees for safety.

However, I think it's a bit extreme to say a Lisp programmer could never be sure that their program was going to work...that's just disingenuous. Lisp is a dynamically typed language, but it is a safe language.

To contrast this with C++: in C++, if I call some third party mystery function, I'm not sure if it'll cause some side effect and corrupt my process memory. I can arbitrarily create a pointer, cast things to whatever I want, do some memcpys and memsets and bring the whole thing down. Not to say they're bad languages, I'm a C++ guy myself...but a lot of the security vulnerabilities in operating systems and major libraries/apps are due to the fact that they're written in C or C++. Languages like ML and Lisp prevent that.


-Good points about OCaml/Standard ML. One big omission though (at least I did not see it) was the lack of emphasis on the concept of "pattern matching". I'm not sure about SML, but with OCaml, everything is implemented using pattern matching. Pattern matching is an important concept in all functional languages to begin with. What makes ML datatypes so powerful is the ability to pattern match on typed expressions. Unlike dynamically-typed functional languages like Lisp, your expression and all of its subexpressions have a type. More importantly, a type that can be statically resolved. When you write code, it's guaranteed to be safe, meaning that the code cannot corrupt memory or perform operations that do not make sense (e.g. try to take the square root of the string "banana"). Unlike other clunkier statically typed languages, type inference automatically resolves the types of most values -- so the code ends up looking compact and clean like a dynamic language but is at the same time type safe. Type safety is a big deal...ML makes it impossible to violate the type system. No recasting allowed, no way to corrupt the process memory, no way to cause runtime type errors :)
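
A tiny, illustrative example of what I mean by typed pattern matching (the type and names here are my own, not from any library):

  (* A variant type plus pattern matching. The compiler infers
     area : shape -> float, and a missing case triggers a compile-time
     warning rather than a runtime error -- no casts anywhere. *)
  type shape =
    | Circle of float          (* radius *)
    | Rect of float * float    (* width, height *)

  let area s = match s with
    | Circle r -> 3.14159 *. r *. r
    | Rect (w, h) -> w *. h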

-Where it still falls short though...and this is a big deal in this day and age: kernel-level thread support. User-level threads can solve a lot of problems, but when you're dealing with heavy number crunching and the algorithms of the future (computer vision, AI, highly parallel search, etc)...you need to use all the cores! Intel is talking about having thousands of cores on a single die by the end of the decade. Do we have to run 1000 OCaml processes and do process-level message passing?? A lot of people are now looking to Microsoft's F# (virtually the same as OCaml sans OOP) as it targets .NET and supports true parallelism and a thread-safe garbage collector.

-One thing about the original post which mentioned the lack of OOP. Perhaps not in SML, but OOP is well supported in OCaml (hence the O). People haven't given it a chance. It's really nifty...the biggest complaint I have is that type information from the compiler is hard to read due to the notation used for object types. Also, all of the standard libraries and most of the community only use the functional subset. There's good reason too, functional programming is very flexible.
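
For anyone curious what OCaml's object notation looks like, here is a throwaway sketch using an immediate object (made-up example, nothing from a real library):

  (* A counter object; the inferred type lists every method, which is
     why compiler messages involving objects can get long. *)
  let counter () = object
    val mutable n = 0
    method incr = n <- n + 1
    method value = n
  end

  let () =
    let c = counter () in
    c#incr;
    c#incr;
    print_int c#value   (* prints 2 *)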

-There are a lot of libraries out there for OCaml created by the community. However they vary in level of documentation. For the most part though, I've found 3rd party OCaml libraries to be of high quality due to the elegance of OCaml. Also, it's pretty easy to take a C or C++ library and write an OCaml binding for it. This is true of most languages, but it's annoying when the thing is already written for C++/Java/Python/etc
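
To give a rough idea of what a binding looks like, the OCaml side is often just an external declaration; the matching C stub (the name below is hypothetical, made up for illustration) has to be written and linked separately:

  (* Declares an OCaml function implemented by the hypothetical C stub
     "caml_fast_sqrt". *)
  external fast_sqrt : float -> float = "caml_fast_sqrt"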

-Oh, one really, really cool feature of OCaml that nobody ever talks about -- you can actually read the code for the standard library! Have you ever looked at files like "iostream" for C++ or "stdio.h" in C? There are macros and templates and all sorts of ugly craziness that nobody can read. I was able to open the standard library in OCaml and actually read it. I could see how they implemented standard modules like List, Thread, and Array. What's interesting is that most of the code would be considered inefficient by imperative programmers due to heavy use of recursion. However, simple tail call optimizations by the compiler save the day!
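
As a flavor of that style (my own sketch, not the actual List source), here is a length function written with an accumulator, so the recursion is a tail call and compiles down to a loop:

  let length lst =
    let rec go acc = function
      | [] -> acc
      | _ :: rest -> go (acc + 1) rest   (* tail call: no stack growth *)
    in
    go 0 lst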


This is a fascinating topic that is always debated. There are always arguments about how young to start, which language (BASIC, Logo, Scratch, Python, Haskell (yikes!)), and which paradigm (functional, OOP, imperative/procedural, logic, etc). I think the point that Eric Schmidt made in the article is that the only wrong move is not to teach anything :) As long as something is being taught about how computer hardware and software works, it's a move in the right direction! His concern is that students are taught only oversimplified computer skills like using a word processor and a spreadsheet rather than understanding some basics about computer science and engineering.

I'm currently teaching introductory programming to non-engineering majors at a major university in C (yes I know, uggh, I would have gone with Python). I've noticed what seems to be the biggest problem is that the computer education that was given to these kids earlier in school didn't teach any independent, critical thought. They struggle when things aren't EXACTLY as they were taught to them (e.g. if the Microsoft Word icon is on a different part of the desktop). I've been trying to teach them to use their brains and to try to understand how to solve a problem without asking me for help. I try to make it clear that I do not have the answers to every question regarding computers...neither does any other person on the planet. It requires deduction, inference, experimenting, lots of Google searches, forum posts, and infinite patience. Turns out they can handle it! You just have to let them believe in themselves and let them struggle with it for a bit. In the end, they appreciate it and enjoy it more! I even show them a lot of things that people feel might be too advanced, such as the command line and interfacing with external libraries. They can get it if you can break it down for them. I think often the problem is that the teachers themselves are not experts in programming and have difficulty conveying info to students.


It's fascinating to see FP being used in the "real world" rather than academia! I absolutely love OCaml and use it all the time to write major projects. I've always loved the type inferencing and super strict static typing of the ML family of languages. I wish people would embrace statically typed languages more...there's a lot to be said for code correctness and OCaml does it in a very natural way. That and the compiler can usually generate significantly better performing code.

I always notice those at Jane Street rip on the OO aspects of OCaml. The place I found classes/objects to come in handy are when you write extensible software frameworks...though I agree, usually you can do without. However, it is worth taking a look at how OCaml approaches objects...quite different from what you would find in your typical C++/Java/C# style language.

The lack of a concurrent garbage collector is really becoming a limitation with the trend towards higher numbers of cores on a single die. However, because the GC and compiler are so simple, OCaml is one of those languages where you can look at object code and understand what the compiler is doing. What's even more interesting is that even though the compiler is really simple, it does a fantastic job at optimization. The creator of the language, Xavier Leroy, is often quoted as saying it delivers "worst case 50% performance of C code".

Oh, one other thing...for those who have not tried a functional language, I think OCaml gives the best introduction. You can code imperatively with it if you want, but you'll wean yourself off of that in time in exchange for more declarative, functional techniques. Oh, and you won't look back! ;)
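To illustrate, here's the same trivial sum written both ways (a toy sketch, obviously):

    (* you can write it imperatively with a ref and a for loop... *)
    let sum_imperative arr =
      let total = ref 0 in
      for i = 0 to Array.length arr - 1 do
        total := !total + arr.(i)
      done;
      !total

    (* ...but before long you find yourself reaching for a fold instead *)
    let sum_functional lst = List.fold_left ( + ) 0 lst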


> even though the compiler is really simple, it does a fantastic job at optimization.

The paper also refers to OCaml's simple compiler and optimizer. This is surprising, considering OCaml's fast benchmark showings and functional languages' reputation for slow, memory-intensive code. Is the OCaml team researching more advanced optimizations, or is "worst case 50% performance of C code" considered good enough (compared to the tradeoff of a more complex implementation)?


You know how it is with benchmarks...results can vary so much. I'm not sure how much work is being done these days on the OCaml compiler. I recall a big debate a few years back about it; a lot of people cried foul, claiming OCaml benchmarks only got close to C/C++ speeds because of heavy use of OCaml's imperative constructs (rather than functional-style implementations of the code).

In my experience with OCaml, you can write efficient code that approaches C/C++ speeds if you know how functional language compilers work. The biggest performance killer in functional languages is efficiently implementing closures on a stack-based machine (i.e. the funarg problem). OCaml is smart in that it keeps activation records on the stack when it can tell a function won't return a closure. If you write in a heavily functional style, you'll inherently create a lot of closures and lots of heap-allocated activation records (i.e. less efficient). At the same time, the compiler can do a lot of interesting optimizations that would be difficult in an imperative language. Pretty much every OCaml book, including the official manual, describes the various places where you can incur performance penalties and what OCaml is doing under the hood. Also, like any half-decent functional compiler, OCaml performs a lot of inlining, which yields huge performance gains.
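Roughly speaking (and glossing over what the compiler actually does internally), the distinction I'm describing is between these two:

    (* nothing escapes here, so the locals can live on the stack or in
       registers *)
    let sum_to n =
      let rec go acc i = if i > n then acc else go (acc + i) (i + 1) in
      go 0 1

    (* make_adder returns a closure that captures n, so the captured
       environment has to be allocated on the heap *)
    let make_adder n = fun x -> x + n

    let add5 = make_adder 5
    let _ = add5 37   (* 42 *)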

One thing to note: a big part of the reason why OCaml, as well as the other ML languages and Haskell, has such great performance compared to other functional languages is very strict, static typing. For example, one of the interesting restrictions of the OCaml type system when dealing with objects is that you cannot downcast an object once it has been upcast (i.e. there is nothing like a C++ dynamic_cast). Though this restriction is annoying, the generated code never has to do any runtime type checking - everything is statically verified!
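A tiny sketch of what that restriction looks like in practice:

    class animal = object method name = "animal" end
    class dog = object inherit animal method fetch = "ball" end

    let d = new dog
    let a = (d :> animal)   (* upcasting is fine, checked statically *)
    (* ...but there is no way to cast a back down to dog, so the runtime
       never needs the type information a C++ dynamic_cast would require *)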


I always felt the problem with web programming was not so much the expressiveness of existing languages, but rather the hacked-together nature of the web and, more generally, the Internet. We try to patch the stateless nature of the original HTTP protocol with cookies, AJAX, Flash player, Java applets, ActiveX, etc. We scale the net with ugly technologies like NAT and BGP. We hack some security on top with things like SSL. Heck, we can't even get simple web pages to render identically in two different web browsers. Though not practical, there's a lot to be said for a clean slate. I don't think any programming language out there is going to make web programming easier...you gotta rethink the infrastructure on which your web apps run. Once that happens, writing web apps in a language should be no more difficult than writing a native app in said language.


I agree a new infrastructure is needed, and a clean slate might not be such a bad idea, especially if it would handle some of the infrastructure-related security and spanning issues for programmers. Such an infrastructure could make it practical to start from a clean slate, especially if it allowed us to unravel and discard much of the tangle of complexities that makes web programming so different from native programming. We need to focus on how we use the Internet and what we need from it, and then revise the current web architecture to see what's really needed, what's fluff, and what's just plain idiocy. Right now, I think we'd find too much fluff and idiocy and not much of what's really needed. This could pave the way for a simplified, more unified architecture that does what we want and makes it easier to deliver applications across the web. Simplifying that delivery would make it possible to simplify web programming, making it more intuitive and less burdened by security considerations. An altogether different infrastructure would spark new innovations in programming (as others on HN have posted, such innovation has 'stalled').


What's ugly about BGP? It's very elegant. Yes, the regex thing is a bit odd but just consider it a DSL.


I had my fingers crossed that Unity would be removed. I've been trying hard to give it a fair go, but I still find it awkward. Also, the big claim that it saves screen real estate is bogus.


Not a bad idea...like Paul said, it's a start. Here's a comment that I read from some user on Slashdot regarding the Apple vs Samsung/Motorola patent dispute that summarizes my feelings:

"Look, you pack of fucking navel-gazing fucktards. Put down the fucking guns, agree to pool your resources to buy sufficient hookers and Caribbean vacations for Congresscritters to have the existing patent system tossed out the door. We get it that you all sort of started out accruing vast numbers of patents, some good, some bad, some absolutely fucking moronic, in no small part to fend off attacks from each other and from evil little patent trolls, but look at how it's complicating your lives. You couldn't roll out a steaming turd without someone somewhere trying to claim you infringed on a patent they own.

Apple, you're now one of the biggest companies around. If anyone can afford the required number of prostitutes, golf club memberships, or whatever it is those corrupted evil bastards in Congress have an appetite for. Google, come on, you could help out here, same with Samsung. Then you can, you know, compete on the quality of your products, rather than trying to stuff newspaper down each others throats in what can only be described as the bonfire of the idiots."


As others have stated, it happens to all of us. I think one has to learn to take breaks often. The world is not gonna fall apart if you take a week off every couple of months to let go of your computer and snooze on a beach somewhere. Also, as one other person posted, sometimes what you perceive as burnout may be clinical depression...but from the way the poster described it, it sounded more like job dissatisfaction than depression.

Also, it depends on where you work. You have to enjoy the team you're working with and you have to enjoy the work you're doing. It sounds like a big part of it for the submitter had to do with the language they were programming in. That's a big deal to a lot of programmers. Programming is an art! If an employer or project requirements force tools upon you that don't jibe with how your thoughts flow, it's not surprising you're burnt out. Do what you love -- there are tons of companies that use Haskell...even ones that work with embedded computers. It also sounded like the submitter was enjoying their academic career as a Masters student; perhaps going down the academic route is a better option. In summary, do what you love!

