Wonderful writing, I loved the way it circled several times around the central issue - the value of work, and I particularly liked this conclusion - much validation and reward in our society is driven by how much people are willing to pay you for your chosen work, and it's very hard to separate your self-worth and confidence from that. It's hard to reconcile when your values don't meet those of the people around you, as expressed in the salaries for various jobs, which vary wildly without much sign of reason or relation to what society ostensibly values. I think what he's trying to get at is why we overvalue these jobs, which on the face of it are not particularly rewarding either to society or the individuals doing them (apart from monetarily). If you ask people in the street whether we need another Facebook, most would say no, and yet we have hundreds of inchoate and uninspiring replacements being worked on and funded right now, so it's hard to see where the demand is coming from, or why this work is valued so highly, and whether it is in fact a bubble which will burst.
Going back to 17th-century Holland, there was probably a huge demand for market traders able to distinguish fine differences in tulip bulbs and trade on them, until all of a sudden there wasn't - this is the kind of illusory value the writer posits for today's fêted startup web workers. I'm not sure I entirely agree, but it shouldn't be dismissed out of hand, because he's not just saying it's unfair, but that it may be unjustified.
The price of a word is being bid to zero.
This sentence near the end cuts to the heart of the matter for me - for writers or other producers of original content like photographers there is a cruel and dismal comparison to be drawn between the wages of those paid to frame content and present it to the world, and the wages of those who produce the content. The creative content (writing, photography, art, travel guides etc) is all in demand, but no-one wants to pay for it, perhaps because it's so easy to produce something yourself, and so hard to distinguish the fine differences in quality which separate a remarkable piece of writing or photography from the mediocre.
>> If you ask people in the street whether we need another Facebook, most would say no
Don't take this as a disagreement with your whole comment (it isn't), but one trap that people fall into is equating what people say they want with what they actually want - at least as expressed by what they are willing to pay for, which is what shareholders and VCs care about.
There was an article on HN a day or two ago about how asking people to bet on political propositions affects their stated beliefs - when there's no cost, people will happily spout whatever their favoured party's line is, but if you ask them to put their money where their mouth is, they will often skew away.
VCs are investing based on where they think people will actually put their money, not on where they say they will. If evidence is that people spend a lot of time on social networks, AND evidence is that advertisers will pay money to show web adverts to those people, then it might follow that a new social network is a good investment.
The main issue as I see it is that as we use money as a proxy for value, we disproportionately weight the desires of the rich over those of the poor. A service for rich people (e.g. an online photo sharing website) doesn't need to produce much utility to be worth a lot of money, whereas a service for poor people (e.g. clean water for poor villages) can produce a lot of utility and still be worth little money.
> VCs are investing based on where they think people will actually put their money, not on where they say they will. If evidence is that people spend a lot of time on social networks, AND evidence is that advertisers will pay money to show web adverts to those people, then it might follow that a new social network is a good investment.
If VCs were funding companies which made solid profits from customers and were therefore able to pay the VCs a multiple out of that profit I'd find this argument that money talks more convincing. As it is it seems that value for internet companies is not based on revenue or even potential revenue.
This disconnect between the money invested and the money earned completely breaks the connection you talk of - customers willing to put their money where their mouth is. Someone in the chain is paying lots of money, but it certainly isn't the actual end users or customers of most startups - often they run at a loss for a substantial period, and then they have no way to turn their free readers/users into paying ones in sufficient numbers to support their valuation or the money put in. VCs really don't care about that though, as long as they are paid more than they put in - that can easily happen if the companies are bought by larger corporations like IBM or Yahoo in the mistaken belief that the customers can be converted - the question of revenue then becomes academic as the companies are folded into a larger parent and the actual value impossible to discern.
It's an interesting time to be alive and an interesting market to be working in, but I do find the distortions created by massively inflated valuations and companies built just to be sold on worrying. Forgive me though, I must stop wittering on HN and get back to work now :)
While I think you are right that there is a difference between what people say they want and what they actually want --
I would suggest that there is also a difference between what people actually want, and what people demonstrate with their behavior, what they actually do.
What people do is not necessarily what they want either! For all sorts of reasons. Poor self-control, costs/risks (financial/social/psychological) to doing what you really want, etc.
Just because someone spends money on something doesn't actually mean that, in an ideal world, they want to be paying for it, or want it in their lives.
Very interesting. However, I think by differentiating what people "actually want" from what they do, you have made it completely inaccessible. How can we possibly find out what people actually want if it's different from both (1) what they say and (2) what they do? In other words, is there any way you can validate your thesis? I suspect you can't. You will have to fall back to what they say they want.
That's a good point too, I'll have to think about that.
The sorts of examples I was thinking of though, would be:
* Someone who wants/says they want everyone to be paid a living wage and is willing to pay somewhat more for products made by people earning one -- but still consistently buys the cheapest products, made in sweatshops.
(That one might be, in part, about a lack of trustworthy information?).
* Someone who wants/says they'd rather there be fewer McDonald's around and more healthy restaurants, but still spends a lot of money at McDonald's, and maybe none at an available healthy restaurant.
(lack of willpower?)
I think you are right about the, er, epistemological problems, but I think there are still some examples that make it pretty obvious that what people actually want is not always represented by their actions.
Or at least not always by their _purchase actions_. Maybe the larger point is that, contrary to certain religions, the market is not in fact a perfect aggregator of people's true desires. What is successful in the market is NOT always representative of people's true desires, for all sorts of reasons (including the obvious one that some people have more market power than others, so are better represented in 'the market' -- but that's just the very beginning of reasons).
This is insightful. Lack of information, lack of willpower (e.g. laziness, addiction) and bad decision-making (for reasons such as cognitive biases, poor reasoning under uncertainty, etc.) can all be reasons why actions may not be consistent with real wants. I agree with your larger point.
Given this, as a business, one has a choice between focusing only on what you can get people to do (e.g. cigarettes, Farmville, tabloid journalism - in fact any business that exploits the reasons above) as opposed to focusing on what they truly want - even though we'd like people to act in a certain way. The first kind are the businesses which are generally considered "evil".
Your idea of wants differentiated from actions seems to be quite fruitful. I used to tell people "don't ask your customers, instead, observe their actions". Clearly, something more is required to make a non-evil business. Thanks for teaching me something new. :)
One possible way to validate your point is by asking people if they ever regret their actions. However, I think this isn't robust. Regretting my past action now doesn't mean that I didn't want it then.
>The main issue as I see it is that as we use money as a proxy for value, we disproportionately weight the desires of the rich over those of the poor. A service for rich people (e.g. an online photo sharing website) doesn't need to produce much utility to be worth a lot of money, whereas a service for poor people (e.g. clean water for poor villages) can produce a lot of utility and still be worth little money.
That is the essence of free markets. If you look up the "welfare theorems" you will see that the free market promises to do precisely what you describe: maximize the weighted sum of individual utilities, where the wealthier individuals generally have higher weights.
Ultimately, it is in human nature to care more about oneself and one's family (and possibly other people who you consider to be part of your in-group) than others. The free market just happens to make this fact very stark. Each time I buy a coffee for myself instead of donating to a poor village, I cannot avoid the conclusion that I care more about my momentary happiness than whether someone in that village gets some avoidable disease.
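To make that weighting concrete, here's a toy sketch (entirely my own construction, with made-up names and numbers - not taken from the welfare-theorem literature): a market hands a scarce good to the highest bidder, and bids are bounded by wealth, so a rich buyer's mild preference can outbid a poor buyer's urgent need.

```python
def market_allocates(buyers):
    """Return the name of the buyer whose effective bid is highest.

    The bid is modelled crudely as utility scaled by wealth -- i.e.
    money acting as the proxy for value described above.
    """
    return max(buyers, key=lambda b: b["utility"] * b["wealth"])["name"]

buyers = [
    # utility: how much the good actually matters to them, on a 0-1 scale
    {"name": "rich hobbyist", "utility": 0.1, "wealth": 1_000_000},
    {"name": "poor villager", "utility": 1.0, "wealth": 100},
]

# The low-utility, high-wealth bid wins the allocation.
print(market_allocates(buyers))
```

The point of the sketch is just that nothing in the allocation rule looks at utility alone; it only ever sees utility already multiplied by purchasing power.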
People are willing to pay for Heroin because it scratches a chemical itch in our brains. People don't say they want heroin because they intellectually know it ruins their lives and does nothing positive for them.
There's a disconnect between what people want, what they want to want, and what they say they want. This seems to be universal to humans; the connection with capitalism is tenuous, at best.
> One trap that people fall into is equating what people say they want with what they actually want - at least as expressed by what they are willing to pay for
Another trap people fall into is equating what people are willing to pay for something, with the amount of value that's actually able to be captured in the market for that item.
The finance industry, say, is able to capture a lot of the value they create (some might say more than they create). Some other industries, less so.
The price of an already-taken photograph, or that of pre-packaged software or a web service, tends towards zero; however, if you want original content, or to build software, the cost of hiring the talent to do it is going in the opposite direction (e.g. good wedding photographers charge a lot of money, as photos may be cheap and plentiful, but good photographers are neither cheap nor plentiful).
> this is the kind of illusory value the writer posits for today's fêted startup web workers
The real value of a "web worker" is not in their ability to write PHP to serve HTML, but rather in their ability to find and solve problems. Good engineers who solve problems will always be in demand. Example: I would happily put an engineer on marketing, because a good engineer would know how to do measurement, statistics and A/B testing.
>so hard to distinguish the fine differences in quality which separate a remarkable piece of writing or photography from the mediocre
I just paid several thousand dollars to a wedding photographer for my wedding. That is pretty standard from what people tell me. I wouldn't say that no one is willing to pay for content. It depends on the content.
I would argue that the premium you are paying for the wedding photos is attributed to the service rather than the content. You are paying several thousand dollars to a professional photographer rather than a hundred bucks to your nephew so that they do not mess up your special day. While I'm sure the photos are exquisite, no one else would pay several thousand dollars for the exact same photos you received.
No, but what matters to the photographer (our stand-in for software developer) is that there are many people willing to pay them several thousand dollars for those kinds of photos :)
>The creative content (writing, photography, art, travel guides etc) is all in demand, but no-one wants to pay for it, perhaps because it's so easy to produce something yourself, and so hard to distinguish the fine differences in quality which separate a remarkable piece of writing or photography from the mediocre.
The real reason is not really that people don't want to pay for it, but rather it's available for free, so there's no need for most people to pay for those things. If there were no free travel guides etc., people would still pay for them.
A comparable situation is happening with newspapers. People will pay for news if they cannot get it for free. Since these days you can get news for free, a lot of people are getting rid of their newspaper subscriptions.
Unfortunately I feel we're in a negative spiral, where good content (e.g. quality reporting and analysis, as in the example in the article) competes with free content which is just good enough to draw attention and pay for itself with advertising. Something is lost there, and in the rush to make everything free we're also losing the incentive for quality journalism or investigative writing, which are just too expensive to sustain on advertising alone.
If that results in a gradual erosion of quality as less and less effort is put in to researching and writing articles, and more and more of the content is churnalism or opinion pieces, then it's a net loss for everyone - eventually the adverts will be more interesting than the real content which lives in the interstices of advertising - I think that would be a shame. I've seen that happening in major newspapers in the UK like the Independent or the Guardian. We've seen the same in television with the rise of Reality TV etc - cheap to produce, controversial, and the perfect filler to squeeze between advertising segments.
The thing is, people never bought newspapers because they cared about democracy. They wanted to be entertained, and have something to talk about with colleagues.
At least with regards to written content, most people have access to much more interesting content. Heck, HN is more interesting than my newspaper. And we manage to find stuff to talk about.
Investigative journalism is just a case of the tragedy of the commons, one that has now run out of the huge luck it had in the past. And we as a society need to start looking at it as such, and not fantasize that it is a standard business.
In response to you and grey-area about news: yes I get news for free, and it is normally good enough. But... Some of the recent reporting from the NYT has really made me reconsider paying for a news subscription, just to be able to fund great investigative reporting. If I wasn't in a low income country making low income wages it would be a no-brainer. But I really do think that there is a market for high quality reporting, and people willing to pay for it.
Basically, the majority of people cannot appreciate art. Abstract ideas and brilliant commentary are simply not valued by the majority as much as a working system, even if it's as obvious a concept as Facebook. When you hire someone to write something, it's almost impossible to predict the audience's reception to it. When you make a website, a logical machine, you at least know you have something less abstract. A website is more tangible than a piece of prose. Words indeed have no value in themselves.
You would expect the people who make the software that facilitates web publishing to be paid more on average, no? Developers get paid more because they make the stuff that people want -- stuff that does something or solves a problem. It's that simple.
The value of a lot of these apparently lightweight B2C apps is derived primarily from the attention that they get from their users. If you can build something that gets attention, then that's worth a lot of money.
In a world where people are spending less and less time watching TV there is now a huge imbalance between corporations who have lots of money and want attention, and huge numbers of people with smartphones and limited attention spans. It's like a thunderstorm, all these electrons trying to get to Earth, then suddenly, Bang! Instagram. These valuations, and the developer salaries they fuel are just a by-product of all that money trying to get from point A to point B.
Google got huge by inventing one new channel for that money to flow through. And a pretty brute force one at that: You searched for "laptop" here are some ads for laptops... It's still ahead of everything else though, which is basically a re-implementation of the old TV campaigns but on the web. If someone could only figure out how to show you laptops just before you thought of searching for laptops, then they would be even bigger than Google.
"In a world where people are spending less and less time watching TV there is now a huge imbalance between corporations who have lots of money and want attention, and huge numbers of people with smartphones and limited attention spans. It's like a thunderstorm, all these electrons trying to get to Earth, then suddenly, Bang! Instagram."
Wow. That's the most vivid explanation for it all that I've ever seen. Thanks for that.
"If someone could only figure out how to show you laptops just before you thought of searching for laptops, then they would be even bigger than Google."
Yes, new advertising innovations have power. Partially positive, partially negative.
But most new startups don't invent new advertising techniques. They don't grow the amount of attention. They just fight for a piece of the current attention pie.
And in similar fashion to reality TV relative to Game of Thrones, they offer crappy but addictive products.
Am I the only one that wants coders to stop feeling guilty and devaluing themselves?
The plumber analogy is off base. Web development is incredibly more complex than plumbing in a home. Importantly, web development changes extremely fast while plumbing is largely the same as it has been for decades.
The web, and the developers who have created all of the sites and apps on top of it, have added tremendous value to our economy and world. Web developers have streamlined almost the entirety of our lives and created enormous productivity gains.
The things that we create may seem trivial to us, but are fantastically valuable to society.
Jet packs and flying cars were always a terrible benchmark to measure human technological progress against. Iterative improvement has created a world of fantastic possibilities.
Good development is hard, and requires a lot of knowledge. Value yourself and feel good about what you are doing.
As a developer who also knows plumbing (and other trades), I'd beg to differ. When I talk to neighbors and friends, most of whom are highly "technical", they look at me in disbelief when I suggest they resolve a plumbing (or carpentry or mechanical) issue on their own. I get the "I have no idea where to start", or "I don't have any tools", and so on. It's not a lack of available information, it's the fear, in many cases extreme fear, of the ramifications of not "doing it right": a flood in the house, a wall that collapses because you hacked through a supporting structure, an electrical short that burns the house down, and so on.
You build a website and maybe it doesn't work right (has bugs). In most cases, no one dies or gets hurt. You mess up plumbing, electrical, etc and that's not the case at all. It's easy to look down on the non-technical trades since we live in a time that glorifies (like the OP says) the art of programming. But next time you have a major plumbing issue, your heat pump goes on the fritz, you car dies on the highway, consider how helpful it is to know RoR, Java, C#, etc. Not too much...
I'd like to point out that a good plumber can make almost as much as a web developer. I know a plumber who makes $90k. People gladly pay premium money for a good plumber for the same reason they will pay for a good developer: the costs and headache they save by making sure the job is done right are well worth a high upfront cost.
Is this really the case? Growing up, my family did pretty much all of their own plumbing/carpentry/electrical work, so I'm kind of surprised that that's not the norm. It's honestly not that hard to figure out, and it's also pretty easy to make sure that messing up doesn't do any long-term damage.
The extreme of this was when we spent a year or so finishing our basement; everything from framing to wiring/plumbing to painting was done on weekends by us. I wouldn't say that I "know" any of those trades, but I know how to look up instructions on the internet or ask someone at Home Depot.
I think that you are not the norm. I feel that I am a pretty smart guy, and I know how to do good, in-depth research on the Internet. But I have to say, I got a lot of cringing looks from my girlfriend's family when they first saw me using a chainsaw with an iPad propped up next to me, displaying instructions on proper chainsaw operation techniques.
There is data, there is information, there is knowledge, there is experience. And then there is wisdom.
Just because there are more ramifications doesn't make it more complex. Proving a mathematical conjecture is more complex than driving a forklift, even though driving a forklift is dangerous to yourself and your surroundings if you aren't careful, while solving a mathematical conjecture might give you a paper cut at worst.
Additionally, I'm not one to invoke the class war most of the time, but am I meant to feel bad that value is going to me rather than to suits and investors? Don't make me laugh.
Good development is hard, but so is just about every job.
I've been attempting to farm, a job that popular media will have you believe any slack-jawed yokel can do, for the past few years. It has been quite an eye-opening experience, making programming seem like child's play by comparison. If any slack-jawed yokel can do a job that is more difficult than development, where does that leave developers?
You are right that we should value ourselves and feel good about our accomplishments. Being able to develop software is a pretty significant one. But we should not feel the need to minimize other professions to make ourselves feel that way. They are doing things that are just as complicated and important, and they should feel equally good about those accomplishments.
A lot of people would agree that empathy is one of the large components of what makes us "human", and that's rather vague, but it does make evolutionary sense for why empathy is useful for societal progress. And guess what, perspective is crucial for empathy.

If we assume for a minute that almost nobody really lacks empathy, then what could be the only possible factor that produces an outcome like the one described in that link above? I'm sure most of us tech people aren't narcissists/sociopaths, so the only thing that could explain it is a skewed perspective that makes it permissible to rationalize a lot of the nonsense around us as being 'okay', or 'not relevant' to us. Empathy is a function that takes perspective as an argument, and if we assume that our empathy functions are correct (no reason to think they aren't, usually), shouldn't it be good when we realize it's giving us the 'wrong outputs', and then do something actionable about it, like trying to change our perspective (or input)?

I've battled with a lot of self-criticism and depression myself, but I see nothing wrong with this. On the contrary, it seems like the healthiest thing we could do as we approach a possible 'bubble' scenario.
I think the problem is that the OP is labeling what he's arguing with as "web development" when it's really more of a subset of startup culture that he's referring to, not all web developers (he lauded the 'adults' that created Rails remember? I'd say they're 'web developers').
Also, it isn't that things are trivial to us, it's that they can be trivial, period. It just doesn't look that way from the outside, so we benefit (and are congratulated) for "daring" to look into the black art that is computer programming and actually producing something, anything... imagine if it were socially deemed adventurous to look into plumbing, it would have a similar outcome (albeit without all the VC nonsense probably).
Also:
> Jet packs and flying cars were always a terrible benchmark to measure human technological progress against.
Not really, because we do have jetpacks (albeit not exactly efficient/affordable ones), and let's not forget that Google is developing self-driving cars! I'd say the notion that any vehicle could drive itself is an even crazier idea than something redundant with a small glider/aircraft at this point, and I'm sure people from the 1930s would probably agree.
So then why wouldn't it be good to use existing ground-breaking projects as benchmarks for human progress and value? A lot of web startups sure do fall short of that benchmark too.
> I think the problem is that the OP is labeling what he's arguing with as "web development" when it's really more of a subset of startup culture that he's referring to, not all web developers
You hit the nail on the head. I'm a web developer, and what I do is build websites and other online solutions for non-profits. They range from the very small, to the very large (many millions of dollars in budget). I'm paid less because most of our clients pay less, but I'm happy with my contribution to society, and that's what matters to me.
It's not the job, it's the context in which that job is performed.
Off-topic: Can you contact me (email in profile) as I am trying to rebuild my portfolio of work from scratch (as I've been out of the tech world for a decade) and rather than put up files on github scratching my itches, I have been trying to find non-profits or charities to re-do their websites (for free). My skills are very rusty, but I am pushing ahead to get up to speed with the modern tech world. (I programmed in the 80s and 90s everything from AI to geocities, but spent the last 15 years living life and trying different things)
What you consider virtuous may produce no economic value. Or maybe it does. There's no rigid formula that connects them, and the Just World Hypothesis is an illusion that traps fools and wise men alike.
Attempts to browbeat the world out of "is" and into "ought" have been universal failures and I retain every confidence that it will be ever thus.
James Somers might find Hayek's writing to be a bit repetitive and languid; but he will find the ideas illuminating. It seems like his father tried, but failed, to make the insight really stick.
There has never been a Just World where Virtue = Value. No matter how badly we want it, there can't be. And trying to forcefully make one generally just makes it much worse.
I didn't feel like the author was trying to prove anything, or was trying to get people to pursue more "virtuous" pursuits. It is simply a personal reflection on the state of the startup ecosystem and what it means to be a computer programmer.
Trying to reconcile the virtues of what he does with his economic value is pointless. They're just not connected.
Demand for Ruby on Rails plumbing is ultimately connected with the current wishes and desires of most of the planet's population and has nothing whatsoever to do with how he, or anyone else, feels about it.
most of the planet's population != most of the planet's purchasing power
Demand for Ruby on Rails plumbing is connected with where that money is going, and it's alright to think that where that money is going may not be equivalent to the wishes, desires (and needs) of most of the population, and to be pensive about that.
In moral terms: he's wondering if the concentration of spending on matters that he's a subject expert in is a negative thing in terms of the world he wants to see.
I think the actions or desires of most of the planet's population, who don't have computers or internet-enabled phones, have virtually no impact on the market for Rails developers.
That was my thought. Really it's the actions or desires of a small number of VCs with copycat investment strategies hoping to outperform VC as an asset class by backing a slightly different set of San Francisco startups working on a slightly different spin on social sharing that drives demand for developers in that particular corner of the world into the stratosphere.
(The market for Rails developers in many other parts of the world is not that different from the market for other locally-based professionals with a skillset that can solve defined business problems, which is probably what their code is actually doing)
That said, the billion plus people that joined Facebook and click on Google ads have had a bit of an influence on why LPs from all over continue to pump money into Sand Hill Road VC funds...
You've misread me. I didn't say "everyone in sub-Saharan Africa is hankering for a RoR coder of their own".
My point is that everyone who is connected to the global economy is indirectly connected to everyone else in it. The overall system pushes prices, goods and money around in an impossibly large web of interactions. We can see the local factors, but we can't see the whole picture.
Nevertheless, the price of lettuce at the supermarket, the price of a new laptop, the hourly rate I bill, the cost of bullets for an AK-47 being used in a civil war: these are all, ultimately, connected together.
Ah you guys. This is precisely the attitude that the article is talking about. The fact that you conflate most of the world with the tiny proportion of the world that has both money and internet and also the even tinier proportion that cares about whatever cool thing you've made in rails screams of the kind of self involved perspective that the author of the post is complaining about. I don't mean this to sound like a personal attack, apologies if it comes off that way.
Well, sure, but there's a difference between understanding this on an intellectual level (which he already does) and coming to terms with it on an emotional level.
Granted. But if the existentialists taught us anything, it's that if you stop and look, I mean really look at your life, you'll find how thoroughly inconsequential it is.
Which is no comfort at all. Quite the reverse. Ignorance really is bliss.
I find the intellectual argument comforting because I can focus on the things I can control, the things that are within my sphere of power.
I used to be very interested in politics. These days I either ignore it or treat it as entertainment. Because I have no control over it, why drag myself down with it?
If nothing is worth anything then by definition everything is worth whatever you want it to be worth. My life is inconsequential from the perspective of the universe, but from my point of view it's pretty important.
Also, you certainly do have the ability to actively engage with your world, and you have an (extremely limited) ability to change how certain aspects of that world work. Of course this has to be based on the idea that doing so is inherently worth it, which you seem to have abandoned.
Nonsense. Just because "is" will never fully attain to "ought" does not make vain the pursuit of a better, more just world.
> Attempts to browbeat the world out of "is" and into "ought" have been universal failures
There are so many examples (a single one of which disproves your "universal" claim) of idealists bringing "is" more into alignment with "ought" that it would be almost impertinent to even begin enumerating them. Here's a small handful: the Civil Rights movement, Women's Suffrage, and (from an economic perspective, since that's what you seem to insist can never be touched by notions of "ought") the rise of Organized Labor, which made it possible for former wage slaves to earn a decent living wage.
If Hayek and Rand make you feel better about the fact that most of us over-earn relative to the "virtue" of our labors, fine. But IMHO it is best to be honest about the inequities and fortuitous circumstances which make our success possible.
I wasn't speaking of all cases of is/ought. I was referring to the idea that virtuous jobs should be valuable jobs and vice versa -- that the system of production can be turned to produce according to a ranking function of virtue. Civil rights and suffrage aren't jobs or economic production; they're matters of custom and law, which Hayek also discusses as similar but distinct forms of spontaneous order.
You correctly pointed out that the value of labour has been affected by simple economic considerations more completely than any theory of the native goodness of honest toil.
Rand's cardboard aliens all seem very smug and self-assured and basically I think her books have done more harm than good.
Hayek's tone is that of an exasperated old schoolmaster. He just wants people to understand that you can't create -- as in design or ordain ab initio -- a working economic system that aligns to virtuous ends. It will break.
I would certainly agree that the methods for bringing virtuous and valuable jobs more into alignment are quite limited, and it is a vain pursuit to hope to bring them into perfect harmony. However, I would disagree with my libertarian friends and say that such limited methods as we have are worth employing, and new ones are worth seeking out, as the endeavor has merit. The difficult part is finding a balance between seeking to control and throwing our hands up and saying "nothing can be done."
As an aside, I agree that this is a very difficult proposition for incomes, but gains are to be made with a strong public sector which provides numerous basic services such as healthcare, education, and affordable subsidized housing, so that the unfortunate man with a virtuous job of little economic value is not left out in the cold. Or dying of long-undiagnosed cancer due to lack of access to prohibitively expensive preventive care, like my wife's cousin is currently.
I don't know. I do think that without virtue there can be no value, if these are properly defined. (Virtue corresponding to filling human needs, value being economic value.)
Is entertaining people at least potentially virtuous? If so, then there is nothing wrong with selling video games, movies, music, or whatever. If not, however, then you have a problem, because while such a world may seem nice on paper (no money to Hollywood, since that would take away from cancer research), it isn't a world any of us would like to live in.
The problem, from where I sit, is that the expected value (what investors will put into things) is wildly out of proportion to the actual value delivered to customers. I think this is true even of giants like Facebook, so what we get are speculative bubbles, because investors are assuming there is more value than there is.
> I do think that without virtue there can be no value, if these are properly defined.
This is an unproductive line of argument because either of us can just keep moving the goalposts until we're exhausted.
Economic value clearly doesn't follow any "gut" virtue. Here's how an early moral theorist put it in a notoriously influential book:
I returned, and saw under the sun,
that the race is not to the swift,
nor the battle to the strong,
neither yet bread to the wise,
nor yet riches to men of understanding,
nor yet favour to men of skill;
but time and chance happeneth to them all.
Generally speaking, attempts to create ranking functions for virtue lead to mismatches with the observed state of the world. You line up the list of virtuous things (curing AIDS, feeding poor people) and the list of valuable things (Ruby on Rails programmers, inanimate chunks of metal) and discover that they just don't line up. At all.
But if you try to impose a virtuous order, it all goes kerflooie. Because the whole system of production and allocation relies entirely on value. When you take value away, it stops working. Partial enforcement leads to partial derangement and in general creates new evils, requiring still further ranking functions ... it spirals out of control thereafter.
We read this:
Dear friend, it is not possible for man to avert that
which God has decreed shall happen. No one believes
warnings, however true. Many of us Persians know our
danger, but we are constrained by necessity to do
as our leader bids us.
Verily 'tis the sorest of all human ills, to abound in
knowledge and yet have no power over action.
And imagine ourselves to be men and women both of knowledge and power. But it was always an illusion.
I'm doing a pretty poor job of explaining all this. Hayek's Social or 'Distributive' Justice does a much more convincing, thorough and erudite job. It was published in the second volume of Law, Legislation and Liberty, also in The Essence of Hayek.
edit: by the way, your blog about LedgerSMB and PostgreSQL is bloody marvellous.
You say this because we practically live in a post-scarcity society. 500+ years ago people valued hard labor that brought food, because food prevented starvation, a very real and horrible thing. Now, food is an evil that brings heart disease and obesity. Our desires for food and our well-being are out of alignment. Value is now a function of our desires, with little connection to our needs.
Stealing from people was never thought of as a productive economic activity, yet things like patent trolling, which act like stealing, are legal. So I suspect you could create a society where virtue and value are at least aligned, and historically they may have been closer. However, we don't live in such a place.
I think the evidence is that value and virtue have never correlated very well, scarcity or not. Hence quotations from two of the oldest books on the planet. Value and virtue are not orthogonal either, but it's messy.
In post scarcity society things would be completely different, as the market value pretty much loses its meaning when everything you want to buy is free.
> You line up the list of virtuous things (curing AIDS, feeding poor people) and the list of valuable things (Ruby on Rails programmers, inanimate chunks of metal) and discover that they just don't line up. At all.
Ruby on Rails developers are scarce and valuable just because there still aren't a whole lot of them, and because they can crank out CRUD apps (which are just data collection apparatuses) more quickly and efficiently than anyone has been able to before. Once someone invents an even faster/more automated way to design and deploy a data-collection tool than RoR (which might be just a future version of RoR), then that will be the next big thing for a while.
Hmmm, this is a very difficult topic to really do justice to. I don't think that Hayek is necessarily contrary to anything I am saying. The problem, though, is the definition of virtue.
> You line up the list of virtuous things (curing AIDS, feeding poor people) and the list of valuable things (Ruby on Rails programmers, inanimate chunks of metal) and discover that they just don't line up. At all.
But the problem here is that without a framework for deciding what is virtuous, all you are doing is assigning your own measure of value. In that regard your argument boils down to "what you think is valuable is not necessarily a solution that lots of people will pay money for." But I do think that virtuous solutions are valuable to the extent they are virtuous. A cure for cancer would be worth, I would expect, thousands or millions of times what a Hollywood blockbuster would be.
However there are limits to this analysis. While I think valuable = virtuous with regard to solutions it doesn't necessarily follow that the coal miner is less virtuous than the Ruby programmer. Wages are determined in ways that both do and don't resemble market economics as even Adam Smith noted (he saw wage levels as being indicative of negotiating power differences between various professions).
So my caution is to avoid looking at wages (ruby programmers vs coal miners) in the same way you look at solutions (cure for cancer vs the next big MTV hit).
From memory, Hayek also mentions the problem of coming up with the ranking function for virtue. Basically, "virtue" here is standing in as an alliterative reference to systems of ethics, morality, justice, justness, etc.: huge fields of human thought in their own right. That sometimes value and virtue align is inevitable, simply because there are so many combinations available to test.
His general point is that trying to make complex, emergent systems fit into neat theories tends to break the systems.
That book I mentioned -- Essence of Hayek -- is worth getting. I began to review it on my blog and never finished writing my follow up ditties.
FWIW what I understand Hayek as discussing is something different, which is that centrally imposed definitions of "good" have very little to do with local decisions that people make. Hayek is arguing, as I read him, against the idea that "social good" as we culturally construct it through centralized structures (church and state) is the goal of the economy.
There is a way out of this, and that is a 19th century Catholic idea (I am not even a Christian, but Catholicism is interesting to me to the extent the Catholic Church is a torchbearer for pagan Greek and Roman philosophy) called "subsidiarity."
The idea of subsidiarity is that it is theft for a group to accomplish what an individual could accomplish by him or herself, and for the same reason it is theft for a larger, more centralized group to do what a smaller group could do. The goal of larger groups should be to support, not supplant, smaller organizations. This idea of subsidiarity thus seeks to reformulate society and the economy on Aristotelian grounds, with the family household in the center (rather than the multinational corporation of liberal capitalism, or the state). From this view, virtue and economic value are the same to the extent that people seek to live just lives, and to the extent that central authorities only seek to accomplish on their own what is fundamentally impossible for the smaller entities.
I think this is one reason why libertarians have associated the Catholic idea of subsidiarity with the private sector which only makes sense in terms of Hayek (I don't think that equivalence quite works but one can easily see the similarity).
I don't know if Hayek would be on board with it. The main problem is that it introduces a new criterion for arranging production which doesn't emerge from the existing order.
Hayek isn't trying to build the Ideal World. He's taking the world he saw and said "here's how it works", and then, "here's why it wouldn't work if we decided to design a system instead of having it emerge".
So for example, if subsidiarity is imposed, how does that play out? It immediately creates a requirement to deduce what the smallest group capable of producing a thing is. But we already have something like that in the market. It's lumpy and fuzzy, but it does eventually kill off the too large and the too small (modulo endless tinkering by legislators).
Coase explained that firms emerge because of transaction costs. Sometimes it's easier and cheaper to do a thing in existing groups. Sometimes it's not. The tension between these allows firms to emerge from the social order. I'd add that the rise of IT has enabled coordination on a vastly greater scale, which has helped to create the modern corporation.
The point of subsidiarity though is that it puts central authorities in the position of midwifery for having such a world emerge rather than in the position of architecting, designing, and building it.
The problem, though, is that these are huge fields of thought and also that they are very contentious. Ethics, for example, is usually defined as asking the question "what is good?" Obviously, defining "what is good" is an undertaking which leads different people in different directions.
I take the approach of saying "what is good is what is conducive to human flourishing." This is largely an Aristotelian approach (and so you can't really accuse me of deriving terms to meet economics). Therefore I would say there is some virtue in entertaining people or rather that such is at least potentially virtuous.
The problem is that if you accept that human flourishing is the goal and thus the definition of virtue, and if you assume that to a large extent this is also the goal of the economy (something I think that both most economists and the Classical philosophers shared), then virtue and value can't be seen as separate at least in terms of solutions (as I say, wages are different).
Or are you saying that Hayek does not see distributive human flourishing as the ends to which our economic systems work?
People aged 20-30 have neither virtue, nor know the meaning of value. They structure their lives around "fun", which is only definable post hoc.
edit: by post hoc I mean that Facebook was never meant for memes, but that's what it's for now. So you say "Facebook is about fun memes", "XXXX was fun". Neither was part of the original idea, but you express it as if it were.
The fact is, people aged 15-30 doze off in a third-rate movie and have fun by poking and squeaking.
And I have to watch PPTs about my company's future.
> And now, like my dad I look for "virtuous" people/business to trust.
This relates to the virtue/value discussion elsewhere in this thread. As a consultant I'm learning that some companies are price-averse and some are risk-averse. The latter are willing to pay more to lower their risks. In other words they are willing to pay more to people they trust, which they base on perceived virtue. So here is at least one case where virtue really does have value precisely as virtue (leaving aside the difference between appearance vs. reality).
Very well-written article; I always enjoy reading stuff from the intersection of writers and coders. What I found startling was how different my experience has been from the author's at the start of the article.
I've put up with some horrific jobs paying miserable money in some of the most uninspiring industrial parks you could imagine. I've taken work in dingy Victorian offices so cold that I've had to program through thick gloves. I've accepted commutes taking up to three hours and incorporating four different kinds of transport. More than once I've spent 18 hours on Saturday and Sunday circulating my CV to every job opening and recruiter even slightly relevant to web development.
Reverse interviews? People clamouring to have coffee with me? Beer constantly close to hand?? These are unheard of things. Conditions are definitely better for me now but my skills certainly don't mark me out as a celebrity. Maybe the financial bubble of the dot-com era has been replaced by a cultural bubble, one that I'm definitely not part of.
It's particular to the New York startup scene. There is a lot of money sloshing around and startups have a different hiring strategy than in Silicon Valley. It's quick to hire, quick to fire. I've had six figure offers without even going through a real interview. This leads to a lot of churn. One place I worked at ran through three almost entirely different engineering teams in 18 months. It will be interesting to read if the author of this article still has the same job in the fall.
Silicon Valley is a bit different. There is usually a gauntlet of multi-day interviews even for the lowliest position at a startup nobody has heard of. It's even more of a gauntlet at the large, established companies. I've heard of someone doing 10 days of interviews at amazon.com, and they are a company most want to avoid. Companies are very afraid of hiring the wrong person.
As you have noted, most everywhere else the developer is made to feel lucky to work at a folding table in an unheated area next to the men's room.
Not necessarily true. I think if you aren't in those tech places (beyond SV and NYC, there are other hotbeds too), you have to get known amongst a tech community and this is where social shines. I've been open sourcing software fairly frequently lately and was contacted by potential future bosses at three very well-known companies for software engineer roles, two in California and one smack-dab in Manhattan. I've also been leveraging things like Show HN and putting stuff in appropriate subreddits to get an initial jump. Once you break the trending barrier in your language of choice on Github, it's practically smooth sailing. Getting on the most-starred today (just on Objective-C) has resulted in two extra days of blog posts, tweets and mentions from all over the internet that I didn't even solicit at all. Two of my open-sourced repositories ended up trending number one overall on Github for a day each. I think if you make stuff that helps coders, and position it correctly on sites that care about that, then it's easier to get noticed and technical directors are more willing to consider you for a job (why wouldn't they hire someone that makes their team more productive).
I have a BA in Art with a minor in Advertising. If I can get noticed using these tactics, then surely people with CS or EE degrees can too.
Given your educational background, I think it could be argued that you are better prepared to do this type of personal promotion than others who have a purely technical background. In most cases, raw technical talent isn't as important (though technical folks usually think it's all that matters) as being able to do the job at hand and being able to convince others that you can do that job.
Good for you for realizing that "marketing" doesn't have to be a dirty word and hopefully others can learn from your example.
True. People shouldn't shy away from personal promotion, just do it at appropriate times and in appropriate places. I hated advertising in school, but thought it was the only way I could make money with a graphic design degree. Turns out coding with a graphic design degree is a lot more rewarding (for me at least, I'm a builder by nature).
I did 10 interviews at Amazon, for a freaking sales job: 3 on the phone, and then they flew me from DC to Seattle for 7 more interviews in one day. After all that I didn't get an offer.
If you work in finance (City or Canary Wharf) you might get that treatment from recruiters. See, as a contractor, you are going to make a lot of money and a part of it is going directly to the recruiting agency so you're definitely going to get some free coffees.
I imagine you could have that too in Shoreditch if you're like the special one.
The problem is that it's hard to show your real skill level (especially on back-end work) so getting access to these better jobs is 90% how you market yourself and where you are (New York and Silicon Valley rents are horrid, but worth every penny in comparison to 95% of places) and only 10% based on your skill level (much less unrealized potential).
You don't have to be a celebrity, but you have to be some combination of accessible and validated.
The good news is that marketing yourself is actually a lot easier than programming. You don't need a natural social acumen to do it. I know this because I'm pretty good at it and certainly do not have natural social talents.
If you want to experience "reverse interviews", you have to be in a technology hub.
That said, the dynamic is not as engineer-favorable as people make it out to be. Sure, a good engineer can get 5 in-bound contacts from recruiters per week -- again, that has more to do with self-marketing than skill level -- but most of those don't mean anything. Most often, it's just an invitation to send a resume into the normal process.
Here's how I got people contacting me for jobs in the Valley and NYC. And not recruiters either, potential bosses.
1.) Have a Github account, and release open-source software that helps coders be better at what they do. Basically, release tools of sorts.
2.) Make the documentation for those tools absolutely stellar. Look at the raw source of the READMEs of the various Github repositories you like and love. Emulate some of the stylistic choices to improve your documentation.
3.) Make a Show HN. Or, whenever your software can help a situation you notice in the comments here, promote it. Don't be afraid to do this, because this promotion is absolutely necessary.
4.) Post to the subreddit of the language that your software is for, but don't be commercial about it. You have to be honest. So write like you're talking to a friend. This is easier if you're talking out loud as you write.
==============
Doing these things will add eyes to your projects. People can see your nicely documented code, see your screenshots, and can imagine how this will help their coding. This is GOOD. If you can make the front page of HN or the frontpage of the subreddit for your language, you're in business. Basically you want to be trending on Github for the language of your choice (top 5 starred today is what you're aiming for). That is the hardest part, but from that point, it's a piece of cake. If you're trending on Github, there WILL be blog posts written about your repo, there WILL be unsolicited tweets about it, and you WILL get good feedback about your coding. The eyes on your repo have now exponentially gone up - and your coding is in front of a lot of people.
The thing is, technical directors like productive coders. If your repo/tool makes ALL coders more productive, then they will love you. It's as simple as that. You know the whole 10x engineer hoopla that's thrown around on here? You'd be doing that for other people. A multiplier of multipliers. If you do that, people will be in contact with you. Just stick with it.
I think the real issue is that there are people with lots of money who are willing to fund a lot of different companies in the hope of stumbling across the next Facebook. The result? A whole host of well-funded but ultimately useless apps and websites. Meanwhile, the people who can make these websites are making lots of money.
It's probably a bubble to some extent. Eventually, if the return from all these web apps is less than the money spent on them, the money will dry up. (I'd imagine that this will be the case: most of the value of the most recent crop of high-value tech startups seems to be based on hype and even bigger companies buying them, rather than profitability or revenues.) In the meantime, there are still people queuing up to fund things they don't understand. The biggest thing to remember about bubbles is that they always seem to keep growing far longer than any sensible person would expect. Will the money run out eventually? Probably.
People with lots of money have been disappointed by the performance of traditional investment vehicles for the past 5 years; meanwhile a lot of extra money has been put into the financial sector.
If you're a fund with $100 billion under management, dropping $2 billion into various venture funds is a reasonable part of the mix. Multiply that by the fact that several hundred funds, banks, trusts and companies collectively control trillions of dollars and these little 1 or 2 percent investments pump billions of dollars into tens of thousands of companies with collectively perhaps a few hundred thousand employees.
When every buyer is cashed up, prices for sellers rise.
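As a rough illustration, the allocation arithmetic above can be sketched in a few lines. All figures here are illustrative assumptions chosen to match the comment's orders of magnitude, not real fund data:

```python
# Back-of-envelope sketch of the allocation arithmetic above.
# Every figure is an illustrative assumption, not real fund data.

fund_size = 100e9          # one large fund: $100 billion under management
venture_slice = 0.02       # a ~2% allocation to venture
per_fund_venture = fund_size * venture_slice   # $2 billion per fund

num_funds = 300            # "several hundred" funds, banks, and trusts
total_venture = num_funds * per_fund_venture   # total dollars flowing into VC

num_startups = 30_000      # "tens of thousands" of funded companies
avg_per_startup = total_venture / num_startups

print(f"Total venture inflow: ${total_venture / 1e9:.0f}B")
print(f"Average per startup:  ${avg_per_startup / 1e6:.0f}M")
```

Even with these made-up inputs, the point survives: small percentage allocations from very large pools add up to hundreds of billions chasing a comparatively small population of companies.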
(I don't know much about this, but) I kind of believe it is a good thing that the traditional investment vehicles are not that good anymore. That was like betting on the outcome of a roulette spin, right? Money against money, against something that may make money. This last bit, investing in something that might make money, brings the world new ideas. It's probably a completely incorrect picture I'm drawing here, though.
Investing almost works on a loss-leader model: while there are still decisions made about whom to invest in (some VC firms/investors/angels making wiser decisions than others), investors are still banking on the small percentage of those investments actually making a return. This return covers the failed investments that they eventually write off.
The general rule is that 1/3 of investments will fail, 1/3 will underperform, and 1/3 will meet expectations. To put that into better perspective, "expectations" should be read as a 5x to 10x return on investment. A better firm might be able to skew that more to the successful side, while another firm might land more on the failure side.
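As a quick sketch of that rule of thirds: the comment only pegs successes at 5x-10x, so the 0x multiple for failures follows naturally, but the 0.5x multiple for underperformers below is my own assumption.

```python
# Sketch of the rule-of-thirds portfolio math above.
# The underperformer multiple (0.5x) is an illustrative assumption;
# the success multiple takes the midpoint of the quoted 5x-10x range.

third = 1 / 3
outcomes = [
    (third, 0.0),   # 1/3 fail outright: total loss
    (third, 0.5),   # 1/3 underperform: say, half the money back
    (third, 7.5),   # 1/3 meet expectations: midpoint of 5x-10x
]

expected_multiple = sum(p * m for p, m in outcomes)
print(f"Expected return per dollar invested: {expected_multiple:.2f}x")
```

Under these assumed multiples the blended expectation is well above 1x, which is why the model tolerates two-thirds of the portfolio going sideways or to zero.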
I don't think the money will "run out", but the better firms (think USV, Sequoia, Accel, etc.) will continue to have the expertise to find successful startups, and the smaller firms might just not see the return on investment and exit the game.
The money will definitely slow down if the return from the industry as a whole (the industry being SaaS and web apps) is less than the investment levels put in. It's worth remembering that while the initial investors in Facebook have seen amazing returns which will subsidise other investments for decades to come, those returns only exist because the hype in the industry as a whole has led other people to buy into Facebook. Facebook itself has fairly low revenues and is struggling to grow them. Equally, while anyone who'd invested in Instagram would have walked away with a nice sum, the money to fund that acquisition came from the same Facebook investment.
There's a lot of hype about "tech" (in this context really only web apps), which has led to a lot of money pouring in from people desperate to grab a slice of the pie. The reality is that while web apps and SaaS form a large industry, the size of the investment funds targeting it is due to hype and investors saying "me too", rather than a reflection of the available revenues. You have to look at even the success stories in this context.
The money won't run out completely, but if the money from investors late to the party slows down then the value of the successful web apps will fall to levels which reflect their revenues.
Not sure I understand why we shouldn't have another Facebook. Sure, it's not disruptive anymore, but Facebook needs competition, and social is a massive market.
I don't hear anyone saying - don't bother making a smartphone unless you are Apple or Samsung (of course this might have been Motorola or Nokia in the past :-) )
Not all web developers are "plumbers". If you are working on websites with heavy traffic, then you need to use complex algorithms to scale. That needs someone who is comfortable with algorithms, and that takes quite a lot of thought.
The author describes simple CRUD websites and then equates all of web development to trivial tasks like that. Even scaling simple CRUD needs a lot of skill, intelligence and ingenuity.
And talking of design, there is a reason some pieces of art are valued so much. A designer who designs a great website is an artist too, and deserves as much compensation.
Engineering the plumbing systems for the Burj requires a different set of skills, but it's not done by plumbers. The actual plumbing part is pretty much the same as in any structure.
I agree. There are some that are worth a lot of money. But to be worth that kind of money you need deep domain knowledge. I won't pay $120/hr for someone to tinker with jQuery. I will pay that for a machine learning specialist.
The fact is that 'coders' are today's version of auto mechanics in the 70s. The barrier to entry is low: an old box, a Linux distro, Google, and off you go. So everybody who can't do something else can give it a try. Some will fail; some will succeed at a low level; and a few may find their calling. And a lot of people will pay for it because they don't know the difference.
But the ones who are truly good, who obtain that domain knowledge, are valuable. In their domain.
Perhaps you've been very lucky or found some especially naive people (right out of grad school, perhaps). Your rates immediately struck me as the kind of rate you would quote if you'd never, in fact, tried to hire somebody skilled. I allow the possibility you have been lucky and successful.
In general you'll be paying upwards of $250/hr for a machine learning specialist (sometimes upwards of $500/hr if they have a modicum of talent and experience) and more than $120/hr for a front-end webdev who can claim anything more than "I've heard of jQuery." Those are contracting rates of course; hiring somebody at salary has its own costs.
You caught me: I've not hired a domain expert coder. But I've also not paid $120/hr for front-end coders. I over-paid for one guy who knew far less than I did- horrible coding practice, always thought he was farther along on the project than he really was (and wanted money), and, in fact, never got far enough to even understand the problem before I terminated the agreement.
For $120/hr the person isn't fiddling- they've done it before, probably have a personal library they can pull, and a basic shell of a website can be ready pretty quickly. Their time is almost entirely spent on the custom part of the site.
Still it doesn't necessarily pay. There is a very large productivity gap between different people, for different tasks in software development (in the 1-50x range).
The fact that you think price per hour is directly correlated with the total cost of achieving a good result is telling.
I don't believe that. My point is that some web/coding skills are not that difficult to achieve, and those skills don't fetch seasoned defense-lawyer rates. At least, not from me.
I'm not sure using increasingly complex algorithms makes a programmer move into some kind of new title bracket. You are still a "plumber", you just have to use more complicated distribution systems to deal with the increased building size (to stick with the analogy).
I stopped reading about 1/2 way into the article. Brace yourself, the rant is coming...
<rant>
Too many red flags springing from a delusional, narrow-minded view which looked no further than the bloated world of Valley startups. I hope the author could cast his eye to some other corners of the world (no, even other parts of the States) and witness the bleak reality of the career called web development. Startups are just a small part, albeit the highlighted part, of the system. What would you then call those developers who do back-breaking work, for a fraction of your starting salary, in a dead-end career path surrounded by an equally depressing environment? Low-class web developers? They are just like any other programmers out there, people who chose coding as a profession. Before all the perks, freedom and fun, this is a job. You work for your pay, period. I am sick and tired of seeing another article with a viewpoint through rose-tinted glasses about the values of coders. Enough is enough. </rant>
<conclusion>
There are overpaid programmers and there are underpaid programmers, everywhere in the world. I probably fall into the underpaid camp; that's why I feel annoyed seeing way too many articles with BS about cool $100k IT jobs for fresh graduates. Where I am, it probably takes an above-average fresh grad 5-6 years of doing good work while hopping around (startup/MNC/whatever) to come somewhere close to $100k. The media just never covers the average Joe programmer's career path, but this is the reality.
</conclusion>
Do you realize there is actual back-breaking work, and coding isn't it? And that making $100k after 5 years is still about twice the median income for a family, and goes a lot farther in places with a low cost of living than it does in New York or the Bay Area? And that if you want a higher-paying job, it is on you to find it and get hired?
Of course I meant within the industry itself. Project-based programmers working until midnight for $40k per annum with no extras: that's back-breaking all right. Working 9 to 6 for a comparative $80k isn't.
Read the whole thing, and my opinion is now slightly more favourable, yet I maintain my point of "see more, hear more". I get the author's standpoint, yet it irks me that he seems to think the majority of web developers out there are overpaid for doing less-than-meaningful work. What about those who curse the job day in, day out but stick with it for the paycheck while looking around? Do they question the same things? Yes, but getting paid this month is more important, and no, other parts of the world don't usually get severance packages.
The article is someone's opinion and I'm merely expressing my own. Everything is debatable because the industry differs so much from one city to another, let alone between countries. Unless it is dedicated entirely to one part of the States, it is a good but flawed argument.
I don't think he's suggesting they're overpaid. He's just wondering how much he's contributing to the world, despite his high pay and societal praise. If anything, I think it points out that our pay and our personal sense of our "worth" are in actuality totally disconnected, although we may feel otherwise. He feels, I would imagine, some angst over the fact that his contribution to the world may not be worthy of the money he receives. It's an interesting point, and a tough one to wrestle with. Lots going on there.
Well, I didn't even read through the whole article. Your comment prompted me to go back and read everything, and I failed to see where the agreement lies.
My point was that the author questions the value of his work from a rather high vantage point, which may be valid but annoyed me. For a well-informed opinion, I'd think it takes an amalgamation of different viewpoints, not just those from the high end. Besides, one must see how everyone else in the same profession has it in other areas. $80k for a first job? Not many places in the world offer that, seriously. How underpaid or overpaid you are relative to your responsibilities, plus where you live, forms an entirely different set of values to question and a different set of objectives to pursue. That is why I found the article well-intentioned but not well-informed and, in a way, misleading, like most articles about startup programmers.
"In today's world, web developers have it all: money, perks, freedom, respect."
This might be true only by a VERY narrow definition of "world", specifically Silicon Valley. Or maybe I live in a very peculiar place, but the developers (Web or otherwise) here are in the low to mid income range, regardless of their skills and experience. And don't even get me started on freedom and respect...
Not sure where you are located, but I'm in middle-of-nowhere Ohio and my experience is fairly close to the author's. Salary is scaled down, but not by a ton. I have had three emails about jobs this week, and it's been a bit slow. Being involved in the community and active on sites like LinkedIn helps, though.
Interesting. I'm a decently skilled web developer in Ohio (but not the middle of nowhere) and I'd say decent jobs are pretty scarce. I've recently gotten more into the local tech scene and started on LinkedIn, but it hasn't revealed any SV-like opportunities. I keep wondering if moving to a large city is the answer, but I feel bad for the area, which has suffered through a massive brain drain over the past few decades.
I will say that the closer I get to Cleveland the better the jobs are. Also I target startups pretty hard when looking for jobs. It's just my preferred environment.
I got a call from a poor recruiter in Chattanooga, TN trying to fill a Rails job. Seemed to be getting a little desperate. Being in Atlanta, I have no trouble finding stuff right here. When he tried to punk me by saying, "Well, if you have roots I can understand," with a tone like I'm hurting my career prospects by not being willing to move, I asked, "Well, just for kicks, what kind of compensation is being offered here?" When he said $60-70K I almost laughed in his face. "Yeah that's not nearly enough to get me to move. Oh well, good luck!"
You don't have to be in a startup hub to find work, just willing to move somewhere. That makes you much more marketable.
Yeah, I feel a bit bad for the recruiters trying to get senior/lead level guys for "fresh from GT CS program" wages. Or trying to hire iOS devs for $50 an hour.
Yeah, I worded my question poorly, but the problem here is that simply moving is sometimes so difficult that it barely remains an option.
One important reason is that a lot of people who have the skills and experience to find a much better job elsewhere simply are not allowed to move to that elsewhere, or face extreme obstacles in doing so. Visas, work permits, relocation expenses and other issues pile up quickly.
Add to that other factors like age, family, and the lack of a network in the new place (which is crucial when finding a job), and you will see quite a different picture of relocation than you might be used to.
As someone who's worked through the dot-com boom and crash of the 1990s, I can offer some perspective.
- Everything is relative: My salary is just now what it was at the top of the dot-com bubble about 14 years ago, and I've been promoted twice. That $150,000 offer the author mentions would be $250,000 if we used 1999 wages, adjusted for inflation.
- Like all things, this is cyclical. In 1999, my email was jammed with messages from recruiters. In 2002, I could hardly get one to return my call. The same will happen with this latest enthusiasm for coders. In a couple of years, the startup scene may cool off considerably. You could end up grateful for a job that pays 1/2 the salary at a bank. Enjoy the free Red Bull and flexible hours while they last.
- You will be replaced. By someone younger, someone offshore, someone with newer skills. You can't change your age or nation of birth, so keep your skills sharp. I've gone from Pascal to Python in my career and will keep going. If you want money, look for the skills people will pay for. Lisp is a great language, but Java has had this great feature of paying my bills and then some.
- Make something people want. If they want you to make a micro social network for an apartment building, do it. Some of my best jobs (and money) have come from the mundane, like software to help track toilet paper production costs. One of the wealthiest people I know started as an electrician. He now runs a company of electricians and has the kind of money we startup folks dream of, because he runs wires in houses in a city that went on a building boom.
- Realize that how much money people make doesn't relate to the value their job provides to society. Athletes can make incredible money, while people disarming bombs, fighting fires and repairing power lines make little. It's just a fact of life and it isn't fair. Left-handed baseball pitchers have inherent value simply because they are left-handed, on top of their ability to throw a ball. Coders make money right now because of the internet and the information age. Make the most of it and make the impact you want to make in the world.
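As an aside on the inflation point in the list above, the adjustment can be sketched in a few lines. Everything here is an illustrative assumption: the CPI-U figures are approximate annual averages, and the $179k input is simply a 1999 salary that happens to map to roughly $250k in 2013 dollars; neither number comes from the comment itself.

```python
# Hypothetical sketch of a CPI-based inflation adjustment.
# Assumed CPI-U annual averages: ~166.6 for 1999, ~233.0 for 2013.
CPI_1999 = 166.6
CPI_2013 = 233.0

def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Convert a dollar amount from one year's price level to another's."""
    return amount * cpi_now / cpi_then

# An illustrative top-of-bubble 1999 salary:
salary_1999 = 179_000
print(round(adjust_for_inflation(salary_1999, CPI_1999, CPI_2013)))
```

Under these assumed CPI values, a $179k salary in 1999 lands around $250k in 2013 dollars, which is the scale of gap the commenter describes against the $150k offer.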
Holy moses that was a long article. I couldn't get to the end to see if the punchline was different than, "Yes, we get paid for hard work, but we also get paid for having rare skills. Some think it's fair, some don't. The folks doing the paying seem to think it's reasonable, otherwise they wouldn't be doing it."
Is it just me, or is the photo a rather unlucky pick?
I'm seeing roughly 20 people, sitting in what is essentially a dark cellar. It would be completely colourless, if it wasn't for a handful of bizarre, childish gadgets here and there. I couldn't imagine working in such a crowded, depressing environment, let alone feel relaxed.
And from what I've seen, that's one reality for many programmers: The offices are often quite bad.
Open plan offices are popular (it's not like developers need to phone or anything, right?), developers are put in basements, under the roof, or into an unattractive second building. And home office is generally more difficult to pull off for programmers than for other knowledge workers, which I think is mostly caused by a general lack of trust.
Sure, the salaries are pretty good lately (although it's pretty much at the level of other professions requiring degrees, isn't it?), but I don't see much appreciation of programmers in most settings. The startup world is different (from my experience), but it's a very small part of our industry.
It's a valid observation, but does anyone have nice offices?
My parents are medical professionals; their offices were terrible. My sister runs a government org in Australia and has a horrible office. One of my old bosses was the head of a department at Harvard and also had a disgusting office. I personally am not a fan of the open-space bullpen, but almost every other professional I have met had an office that was worse. The only people I know with legitimately cool offices are interior designers.
Ours certainly isn't a beautiful masterpiece of architecture and modern art, but it definitely qualifies as nice.
A little bland and "airport-lounge-y" despite the logos on the walls and brightly coloured chairs/sofas in the casual meeting areas (imagine an off-white and less saturated take on the palette used for Mirror's Edge), but we have perfectly reasonable desks and chairs, plenty of space (I've seen more than twice as many people crammed into similar spaces), a good amount of natural light (though in the middle of the office the artificial lights still need to be on all day), and a useful kitchen area with plenty of room should lots of us want lunch in at the same time.
It's not really about the office, more about appreciation and respect.
I've been in only one big company so far, but anecdotal evidence suggests many are similar.
It was a software company. Software was its core business. Yet the programmers, about 10% of the employees, were kind of the lowest in the food chain. Sales got all the money and a lot of attention. Professional services and product management got a lot of attention, shiny offices and regular praise. The developers were all but ignored.
I stuck to smaller software companies (mostly startups) again ever since. I always felt valued there, and that's important for my overall job satisfaction.
I think the point about code being cool is rather shortsighted: code has only been cool to anything like mainstream youth for maybe five or ten years. When I was an exchange student at a US high school in 1999, the computer science class/team/whatever most certainly wasn't "cool", and this was on the tail of the .com boom. In high school in Denmark from '99 to '02, I saw the first general recognition that while "being able to make websites" was probably a useful skill, it certainly wasn't cool. Of course, the view changed at university, but then we're pretty far down the selection-bias rabbit hole.
Given those 15 years of experience, I am not very surprised that I am in short supply. But now that coding is beginning to be cool, I'd also not be surprised if not too many years from now, basic computer literacy at the end of secondary education (or even earlier) includes the ability to crank out basic RoR-style apps.
I think you may be overestimating the capacity of the education system to change. I'm now a year removed from high school, but in my class of 350 there were maybe 10 who could program proficiently. Only two of us (myself and a friend) knew what Linux was. I would say it wasn't treated as "cool", but it was certainly regarded with a weird respect.
I don't accept that anyone can know how familiar everyone else is with a thing. It's possible that you're perfectly correct, but you arrived at this by assumption.
What if, out of the 348 other students you claim were unaware of Linux, more than 0 of them were happy to play with Linux on their own without broadcasting it to you?
My point is that hobbies may seem unique, and it's great that we can build our identity around them, but to take it to the extent of claiming that you and your friend were the only kids who knew Linux is blissful ignorance. No offense to you; it's just your assumption that I feel compelled to respond to.
Yes, there are some assumptions that play into my assertion. Were there others who may have toyed with Linux on their own time, quietly? Quite possibly. However, I doubt it. When Lockheed Martin visited my school to mentor/start a cyber-security (hacking) competition team, we were only able to get ~18 people (from multiple grades) to show up. I also know for a fact that there were only ever about 60 kids (across multiple grades) enrolled in the school's CS offerings during the year. I knew all of these kids.
I think you'll find that 300 is a much smaller number of people than it sounds. While I respect your challenge to my assertion, I feel it's mostly valid. The point I intended is that only a very small fraction of kids in my class were technically inclined.
I'm not talking about education (admittedly clouded by my anchoring my post in school experiences), I'm talking about kids picking these things up on their own because it's easy, fun and useful. Nobody would learn Word in school for the sake of learning Word, they learn it because it's a useful tool - the fact that there's a class is helpful, but mostly tangential.
This article seems to be missing a parallel that was drawn previously many times. PG has written on the subject, and so have others.
The closest parallel we can draw between "coders" and the distant past is with the artisans of the Middle Ages. Aside from the landowning nobles, the artisan class was probably the closest thing to a true middle class. Many advanced to upper-middle and later became the industrial nouveau riche.
The reason this parallel works is that the artisans were able to make something coveted out of what looked like a pile of junk (or less usable material) to others. What a software developer creates with a few keystrokes looks as "magical" to the lay person now as creating a sword out of a pile of rocks looked to a lay person a thousand years ago. They get the general concept and see the value, but they have no idea how it actually happened and cannot repeat it themselves.
By today's standards, most of what artisans of old made is not very desirable if made today (antiques notwithstanding), hence only very few "artists" are actually paid well. Most people engaged in artisanal work are now in what we call third-world countries and make zilch. The reason is that technology has progressed and most of this stuff can be made faster, cheaper and more consistently by a manufacturing process. Today, we can take the best artisan of old and copy him or her a million times over to produce the same trinket, with little extra expense.
Software engineering has not yet gone through a true manufacturing disruption. We cannot put software on the kind of assembly line/stamping process we can with say door hinges or other metal works that used to require a skilled blacksmith. That doesn't mean it won't happen some day.
We are constantly trying to commoditize complex problems. Looking back 30-40 years, the efficiency of the tools we use has gone way up. Today, a developer doesn't have to write many components from scratch (unless he or she wants to). Most of what makes up an application can be taken off the shelf and modified to fit. For consumer apps, design is also where a lot of the work goes.
This modification or molding process is a major barrier to having a true manufacturing process in place, because the quality of the modifications depends on the individual skill of the person making them. A mistake made early on can bring a future company to its knees and derail projects years after it was made; for example, a faulty data model that is allowed to grow to terabytes of information. Many companies choose to throw money and computing resources at the problem rather than risk a costly migration.
When we have sufficiently advanced to the point where such mistakes do not depend on an individual and the automation process can take care of it, the value of the software engineer as we know the profession today will diminish greatly. But then again, I'm sure the profession will evolve as well.
Software engineering has not yet gone through a true manufacturing disruption. We cannot put software on the kind of assembly line/stamping process we can with say door hinges or other metal works that used to require a skilled blacksmith. That doesn't mean it won't happen some day.
We also can't do this with writing. Is software more like writing prose, or more like manufacturing? That's the heart of the issue as I see it, and to me it looks more like "writing."
It is possible that AI software will eventually produce better writing than human writers, but that seems unlikely without human-like AI and also seems far enough into the future to be highly speculative.
Logical processes are pretty much ripe for exploitation via AI, but you're right. Lateral thinking, pretty much one of the hallmarks of humankind (that and absurdly good communication skills), is going to be extremely hard to emulate. How does a computer go from examining pipes in houses to drawing a parallel with half-pipes in skateboarding, understanding water flow better from the way boarders move through the pipe? Maybe through human-guided modeling something like this could occur, but unforced, this is the holy grail.
That may be a silly example but that's just one avenue (wordplay) of lateral thinking, a subset of creative thought and processes.
We cannot put software on the kind of assembly line/stamping process we can with say door hinges or other metal works that used to require a skilled blacksmith. That doesn't mean it won't happen some day.
Stamping out software is what your Windows/OSX/linux installation CD/DVD is. Microsoft is making a lot of money stamping out (figuratively nowadays) copies of Windows.
Software engineering is the equivalent of the mechanical engineer combined with mold maker for plastic or stamped parts. There are a lot of tool and die shops around me that are making a lot of money[1] making molds.
[1] Well, they were before the auto market tanked, but they are doing OK again.
This is a great point. Because software has near-zero replication costs, the "manufacturing" process has no good analog for software industry.
The manufacturing industry is supported by an army of engineers in a wide variety of specialties. In fact, manufacturing, in some ways, is starting to look a little more similar to software, because the non-material costs of production are falling (automation replaces direct labor) so replication costs are falling. Manufacturing will never look exactly like software due to the intrinsic role of raw material costs and the capital costs of production equipment, but there are increasing similarities.
As manufacturing has become more automated, a greater emphasis is placed on the role of engineer: in an automated factory, there is greater need for skilled engineers and lesser need for direct operating labor.
While better software has also helped these engineers become more productive, the net effect has been to improve quality of engineering output rather than to reduce engineering demand. (The exception is people working with engineers, but at a lower skill level. For example, there is little need for draftspeople.) Examples of this improved engineering output are flow simulation software that allows engineers to better push the envelope in part design to reduce part weight and improve production cycle times.
The net lesson from manufacturing engineering is that software engineers will continue to be in demand -- likely even higher demand -- so long as they continue to evolve their skills to use the more powerful software as it becomes available. The more powerful software will lessen the cost of software, which will increase demand for it, and increase demand for the skilled developer. For example, abstraction platforms such as Rails increased demand for those who evolved their skill set, because cost of web development fell due to higher productivity, thereby increasing demand for it.
However, like a draftsperson or an engineer who refuses to learn Solidworks, the more powerful software will be a negative for those who cannot, or choose not to, evolve their skill set above a more basic level.
However, like a draftsperson or an engineer who refuses to learn Solidworks, the more powerful software will be a negative for those who cannot, or choose not to, evolve their skill set above a more basic level.
That line jogged loose a memory that might cause some thought in other so I'll share:
I used to be an aerospace engineer. Typically, what we called *designers* were technical people who were very good at working with fancy 3D parametric, version-controlled CAD systems. They could come up with mechanical designs and drawings at a rapid pace, with good results. They rarely had engineering degrees and leaned a bit on the technical knowledge of the engineers when they needed to.
But we had one guy who refused to learn the new systems and instead insisted on using the old (some might say obsolete) 2D CAD that he was familiar with. He happened to have an engineering degree, but most wouldn't guess it, because 9 out of 10 people picked him out as a dead ringer for Charlie Manson. He was loud, sometimes dirty, dressed more like a biker than an engineer, and he was rude and stubborn. I liked the guy. I really did. But I think I held a rare view, as most people found him offensive and crude. Some refused to work with him. "I'm not changing that" was almost a calling card of his.
Yet in spite of all of that, he designed a frighteningly high percentage of the space hardware that actually flew. His time was in demand and he produced easily as much as the designers with more modern tools. I never counted, but I'd bet he produced more than they did. And it was good because he was brilliantly skilled at the real task at hand.
So in the end, the technology did not matter - the man did.
I don't bring that up to argue; I just think there's something worth thinking about.
You bring up a good point, in that knowing the software is no substitute for intellectual comprehension of the problem. An extremely capable engineer using outdated technology will often outperform less capable engineers using more modern tools, especially when the problem or task is complex.
Even though the average engineer may be more productive with modern tools than the average engineer without them, there is danger in assuming that any engineer with modern tools will outperform any engineer with less modern tools. Just like software, tools can be an enabler, but they do not make the engineer a commodity.
But doesn't it stand to reason that if he'd been just a little more personable and just a little less resistant to change, he could have done even better?
I'll avoid the personability side of the question, but focusing on the "change" aspect: Let's not confuse "physical" tools - like a hammer, press, CAD program, or text editor - with "thought" tools - like a paradigm of programming, say, Functional or Object-oriented.
I'm still a rather young programmer (only about 6 years into my career), but I've found myself working in a reverse flow of technology, in terms of what tools I use on a daily basis to do my job. I'm primarily a Microsoft-stack web applications developer. I spend most of my day writing C# and T-SQL, often using Visual Studio and MS Management Studio to interact with my code and the database. Those are the tools I used at the start of my career, when really getting the hang of basic programming.
As I've moved forward career-wise, I now find myself using rather "older" tools for my day-to-day tasks. I hate to touch a Windows machine without Cygwin installed, as Bash is my primary way of interacting with the computer now. And I find myself generating code at a faster pace in Vim than I can in VS. As I've started using more Postgres databases, I've grown more comfortable with psql and terminal-based interaction with the db.
So the actual tools I'm using to generate my work are, in a sense, "old". I've found myself almost shunning new tools. Or at the very least, not embracing new tools _just_ for the sake of them being new. A new tool has to have some value aside from "newness."
Use of these "old" tools is not to say that my skills as a developer are lagging behind, however. The same even goes for the languages I've been using for personal projects. I find myself working out ideas more quickly in Common Lisp (how's that for old?) than I can in c#.
The point I'm trying to make is: don't focus on whether or not the person is using the newest tools to do the job. What's going to matter more is how well they understand the fundamentals of their discipline. Granted, I'm still young, but I feel like the craftspeople that really understand the core principles of their craft, and keep an eye on the horizons of their craft, are going to remain in demand (so long as the craft is needed). And if that craft is software development, it won't so much matter whether the coder is embracing Rails or not; what will matter is that s/he understands the new ideas that Rails might embody, and can incorporate those ideas into their daily flow.
Not necessarily. We don't know if what made him so incredibly productive sprung from those same qualities. In fact, refusing to learn new processes when the old ones still worked might have aided him in gaining expertise.
I know what you mean, but I would say no. This is getting a little philosophical, but bear with me. It's like saying, "What if Steve Jobs were a little more down to earth?" or "What if Nick Saban were just a little easier to get along with?" It's tough to separate out what makes a man who he is. The same unreasonableness that made people not want to work with him is what saw his designs through to production, unmolested by the compromises of what he would likely have deemed lesser minds (or "fucking idiots", as he'd have said).
Depends what he wanted out of life really. He may just have run into the point where having a little extra leverage would have been meaningless, or at least more bother than it was worth.
Because software has near-zero replication costs, the "manufacturing" process has no good analog for software industry.
I wonder how many "business people" have a hard time understanding this concept, resulting in the typical shortsightedness that results in Build It Cheap And Fix It Later, where "later" almost never finds its time.
Stamping out software is what your Windows/OSX/linux installation CD/DVD is. Microsoft is making a lot of money stamping out (figuratively nowadays) copies of Windows.
I think you misunderstood. The product a software engineer makes is not a CD that lands in a person's hand. It's the arrangement of bits that eventually ends up on that CD. The OP is saying there is no way for a machine to arrange those bits.
He understands you. He just disagrees and thinks your metaphor is wrong.
First of all, let him speak for himself. Or do you claim to read his thoughts?
Secondly, no, he doesn't seem to understand. The statement he disagreed with was:
We cannot put software on the kind of assembly line/stamping process
which is obviously a point about the fact that we have no assembly line process for writing new software.
He replied with:
Stamping out software is what your Windows/OSX/linux installation CD/DVD is
which is obviously missing the entire point. We are talking about writing the software, not putting the software on the disk. It's a distinction you've clearly also missed.
Agree with @abraininavat. There is a big difference between mass distribution and mass design.
Another way to look at it is coach builders. 100 years ago, there were many more coach builders and custom car manufacturers than there are today. The car was also a luxury item. That started to be disrupted by mass-produced and mass-distributed cars, like Ford's.
Today, most companies that hire "web developers" are either building things using packaged libraries or modifying things to fit. They are not taking off-the-shelf components and connecting them without modification. Even tools like Salesforce.com default to people writing code to solve complex configuration issues. The UI is catching up, but it's not there yet.
>Stamping out software is what your Windows/OSX/linux installation CD/DVD is. Microsoft is making a lot of money stamping out (figuratively nowadays) copies of Windows.
Software engineering is the equivalent of the mechanical engineer combined with mold maker for plastic or stamped parts.
The whole quote shows he disagrees and thinks the analogy is wrong.
You excerpted only the part of the quote that contradicted the other interpretation of what the product was, not the explanation of what he thought it was.
I agree that the quote suggests that gvb disagrees and thinks the analogy is wrong, or at least inferior to his analogy.
I mean, gvb's alternate analogy is also a good one. Yes, writing software is like configuring the assembly line, and yes both of those things are expensive work for highly-trained professionals. And yes the interesting difference is that 99.999% of the cost of making software is this design, whereas in car manufacture it's maybe 5% or 20% or something (and the actual stamping is the rest). This is a good analogy, sure! It's also hopefully old news for basically all of us. Isn't it?
But sologoub's analogy is a different one, and it's also a good analogy, and in my experience a less-overused one. And insofar as it's more-novel (to me), it's more instructive, and thus "correcting" it with an overused analogy suggests lack of understanding.
Sologoub's point, to attempt to restate and simplify, is this: the value and scarcity of modern engineering is comparable to the value and scarcity of pre-industrial artisans. Those artisans became less valuable when industrialization commoditized their work. And sologoub proposes that this could one day happen to software development (and other engineering), and if/when it does, (software) engineering will cease to be lucrative. I think it's an interesting model; I have no idea if it will be possible to industrialize software development, short of god-AI taking over all human labour.
I think it will be more like a new literacy. Everyone will learn to program quite a bit, and professional writers will be relatively rare.
Artisans were always tied into the production as well as the design. Mass writing ability, however, is more like what the internet did to journalism. I anticipate something like THAT happening, not something related to artisans, really.
Software engineering has not yet gone through a true manufacturing disruption. We cannot put software on the kind of assembly line/stamping process...
Such a naive view. The whole Java ecosystem was expressly marketed as an "industry standard" way of creating "coding sweatshops". All you need is coding and testing teams, plus a bureaucratic process (such as Scrum) on top.
With very rare exceptions of some OSS or in-house projects (Google, FB) most of software industry consists of such sweatshops which are almost an assembly lines which produce bloatware that mere passes most of tests.
Have you ever seen so-called Enterprise Software? ERPs? Web-portals?) Or take a look at the bloatware MS produces - these all products of such software-manufacturing processes.
Now, about the artisan analogy. True (gifted) artisans are extremely rare nowadays, as they were in ancient times. Just take a look at a few examples:
- The requests module for Python. It is such a rare example of clean, idiomatic, human-readable programming (not mere coding) that it could be considered a work of art. How many other modules programmed with such craftsmanship are around?
- nginx and redis are examples of the art of software engineering. They are works of art because of the approach and personal involvement of their authors.
- The Plan9 team, FreeBSD's and OpenBSD's core teams, the author of Gambit-C, the authors of MIT Scheme, Symbolics Inc., and the authors of Emacs are examples of teams of gifted programmers.
If you want to see examples of PG's "painters", these are the people. They are the artisans of our craft. OK, there might be a few hundred more such craftsmen in the past (who created Lisps and other marvels), but that is a mere 1% of the whole field.
Sure some parts of software will become manufactured and mass produced - I mean that's the case today. How many libraries do you use?
However, software is still an intellectual, out-of-thin-air endeavor. It requires thought to create something new and to make it work within existing circumstances. You aren't melting metal into a mold to make a tangible object. You are literally fusing thoughts and intellectual processes together to do it with software. Until thinking becomes mass-produced and optimized to the point of max efficiency, I don't see this happening.
But how long will this idea of 'fusing thoughts and intellectual processes' remain the sole domain of humans? Aside from some creative leaps, a lot of this has to do with logic: knowing what works where and when.
I'm working on an idea tied to literate programming where someone can tell a story of what they want a program to do, put it in a wiki, and then programmers can edit the wiki to add in the actual software to make the program work. How hard would it be to use NLP parsing, feed it into something using the principles from BDD, take something that turns that into UML, then run it through a code generator, and you have an MVP? Software engineers may then end up like manufacturing engineers, in that their job is to tweak and fine-tune the output. But even then, that part of the job may fall to ML which recognizes similar patterns between software implementation and optimum utilization (there are only so many ways to write a user login page on a web app, for example).
I think thinking will never become mass-produced, but mass-producing the turning of thoughts into software may happen within my lifetime.
Edit: missed out the main point from my last sentence.
Turning thoughts into software is what software engineers do every day. At every point along the technology trajectory, technologists have created more and more powerful abstractions. When transistors were first created, print "Hello World" would have seemed a magical melding of the minds of machine and man, whereas today we can create customized eCommerce social networks with a few keystrokes.
However, as powerful as the tools get, you still have to tell the machine exactly what you want it to do. Depending on the tool, sometimes that is easy, sometimes that is hard. I feel like there should be a fundamental law of "ability to easily express" vs. "flexibility of expression." I'm not sure how much machine learning can serve as our savior in this regard, even if we can eliminate "make an Amazon clone" posts on cheap-outsourced-developer.com by just using ML to classify it and spit out the boilerplate.
In order for something to be profitable it has to be customized, and when this happens the abstractions inevitably seem to break down. Based on this I think turning thoughts into software will happen around the same time thinking becomes mass-produced. How can you create, "eBay, but for car buying, with a twitter messaging component built for the Brazilian market" without first beating the Turing test as a sub-problem?
So, I want to check my diary and see what I need to do today. Anything really important should flash, so I know I need to take care of that. But, I know that my boss is waiting for me to finish that puff piece on our new acquisition, so double flash that. Also let me know I need to buy cat food. Stat.
Make a program that can parse that piece of text and you're onto a winner. (I'm trying)
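A crude first pass at that parsing problem can be made with plain keyword heuristics, no real NLP. The cue phrases and urgency levels below are my own invented assumptions, not anything from the comment's actual project:

```python
import re

# Cue phrases mapped to urgency levels. These cues and levels are
# invented for illustration; a real system would need actual NLP.
PRIORITY_CUES = [
    ("stat", 2),
    ("double flash", 2),
    ("really important", 1),
    ("flash", 1),
]

def extract_priority(sentence):
    """Return the highest urgency level whose cue phrase appears."""
    s = sentence.lower()
    return max((level for cue, level in PRIORITY_CUES if cue in s), default=0)

def parse_todos(text):
    """Split free text into sentences, each tagged with a crude priority."""
    sentences = [s.strip() for s in re.split(r"[.!?]+", text) if s.strip()]
    return [(s, extract_priority(s)) for s in sentences]

note = ("My boss is waiting for me to finish that puff piece, so double "
        "flash that. Also let me know I need to buy cat food. Stat.")

for task, level in parse_todos(note):
    print(level, task)
```

The gap between this and the diary example above is exactly why it's a hard problem: "Stat." modifies the previous sentence, which substring matching cannot know.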
It's a neat idea, but I don't think it's particularly useful. Natural language is an unnecessary abstraction that will not help you solve the problem, unless the problem in itself is about grasping the syntactic peculiarities of your language of choice. From your examples, it seems like it would just be a very verbose way of defining a solution to a problem in basically the same way you do in any imperative language.
Syntax should be the least of concerns, and even if it wasn't, involving the subtleties and ambiguities of english language wouldn't be the solution.
EDIT: I don't mean to say that a natural language programming language couldn't be useful, but I'm arguing that it won't be unless you can use natural language to move closer to the problem domain, rather than abstracting the solution description further.
Thinking will eventually become mass produced. To a pre-industrial revolution artisan, having their work automated probably seemed just as difficult and "far out into the future" as a modern day programmer feels about his work being automated.
Depends on your metaphor. If you think of software distribution like manufacturing, then writing the library is more like re-tooling an assembly line for production. This metaphor makes sense to me because both are significant one-time investments that (potentially) keep paying dividends far into the future.
+1 ""Software engineering has not yet gone through a true manufacturing disruption."" I keep expecting this to happen: the disruption. How much more productive is an engineer using something like Meteor compared to an equally talented engineer several years ago for building web apps where users interact (data wise) in some way. Same comment for Rails, even better IDEs, PAAS providers, libraries, etc.
When I started programming in 1973, my boss (who had a PhD in computer science) said that there was a factor of 100 difference in the productivity of different programmers he had worked with.
Even if this differential productivity remains, as tooling gets better I have to wonder how many software engineers will be needed in 20 years if all programmers become much more efficient?
I think it's not that software engineering has not yet gone through a true manufacturing disruption, it's that each new disruption increases the scope of problems that software can be applied to rather than reducing the skill it takes to create it.
The invention of compilers was a huge watershed moment for programming. All the manual labor you had to do, keeping track of memory layouts and registers and stack frames, was handled automatically for you. Yet the result is not that programming became easier - it's that programs became more complex. A journeyman programmer today can write Tetris in a few hours, while it took how many months for the original? But modern games are things like Starcraft 2 and Halo, not Tetris and Space Invaders.
Same with databases. It used to be that you had to think in terms of bytes in sectors on a disk. Now we have filesystems and databases and query languages to abstract that all away from us. But that doesn't mean programming the systems becomes easier, it means that the volume and variety of data stored increases.
It used to be that domains like speech recognition, natural language processing, text mining, computer vision, OCR, etc. were completely off-limits for computational approaches. Now we have standard libraries for them, and standardized tools for generating those standard libraries. But that just means the tools get applied to progressively more problems, not that solving those problems becomes easier.
I think this is what Marc Andreesen meant when he said that "Software is eating the world". There will eventually come a time when no more software engineers will be needed, when they have automated themselves out of a job. But that time will come when everything we do is computerized, when every aspect of modern life is ruled by algorithms and data. Until then, advances in technology just mean computation can be applied to more and more problem domains.
How many computer programmers really work on the complex problems (speech recognition, computer vision, etc.), though? Even games: how many programmers are writing really top-notch engines? How does their share of the pie break down compared to the rapid explosion on the assets side of things?
It seems to me, and I may be wrong on this, that the complexity of the problems that most programmers are approaching doesn't scale on a 1:1 basis with the power of their tools.
There's complex technical work, and there's complex social work. The programmers who use a speech recognition API to solve a human problem - say, hands-free in-car navigation - are doing work that's every bit as difficult as building the speech recognition API in the first place. And they're often more highly compensated for it, if they succeed - it's just the risk structure of trying to match known technology to unknown problems is a lot more vague than trying to match unknown technology to known problems.
You have basically summed up my worldview and outlook for the future in a few paragraphs. I look forward to a time when we automate everyone out of a job, and it will be a better future for all of us. I have to admit there may be industries on the horizon that may be beyond our comprehension, but it will no doubt require a symbiosis between man and machine.
> Even if this differential productivity remains, as tooling gets better I have to wonder how many software engineers will be needed in 20 years if all programmers become much more efficient?
Greater labor efficiency in the production of a good only decreases the quantity of labor demanded for producing that good when one of two things holds: either there is some limited non-labor (or different kind of labor) input to production (in agricultural productivity, land would be the non-labor input), or the quantity demanded of the output good has passed the point where cost declines are met 1:1 by increased quantity demanded.
Given the number of existing and potential markets for software, and the relative transferability of programmer skills and effort between those markets, I would be surprised if the latter were true, and I don't see an obvious non-programming constraint on the production of software that would be limiting.
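The condition can be made concrete with a bit of hedged arithmetic. All numbers below are invented; the point is only that when demand responds more than 1:1 to cost declines, a productivity doubling increases total programmer-hours demanded rather than decreasing them:

```python
# Invented numbers to illustrate the elasticity argument above, not real data.
productivity_gain = 2.0    # assume each programmer now produces 2x as much
demand_elasticity = 1.5    # assumed: demand grows faster than cost falls

# Under this toy constant-elasticity model, if cost per unit of software
# halves, quantity demanded scales as productivity_gain ** demand_elasticity.
quantity_ratio = productivity_gain ** demand_elasticity  # ~2.83x more software
labor_ratio = quantity_ratio / productivity_gain         # ~1.41x programmer-hours

print(round(labor_ratio, 2))  # > 1.0 means more programmers demanded, not fewer
```

With an elasticity below 1.0 the same arithmetic gives a labor ratio below 1.0, which is the "latter" case the comment argues is unlikely for software.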
> I have to wonder how many software engineers will be needed in 20 years if all programmers become much more efficient?
The trend so far has been that when you improve programmer productivity, you increase the value of what we do, which increases the number of things that we can be useful for. This increases demand.
I think software development won't be done by machines any time soon. The reason is that the work done by programmers is almost always unique, tailored to the specifics of the situation. All but the crappiest programmers can automate repetitive tasks. Therefore, programmers - by definition - always make something new or slightly different. If it was repetitive, they would automate it.
I think an analogy we can use is this: making the machine that makes the products requires a team of engineers. The products can be physical goods, a scalable web app, a game, or a library, which we can then use to make some other machine that makes other stuff.
I think we already made that advancement in our industry; it's just that we are making lots of new products that require lots of engineers to make the machine that makes the products.
About the issue of the molding process: even Intel, with its FPU bug, had problems, so it's not only software that has issues with errors early in the process.
Oh boy, I have a lot of comments here, most of them in agreement with the article but not all.
According to all the anecdotal knowledge I have, my conclusion has been that currently software engineering and chemical engineering are tied for the locally optimal career choice for those with a four-year degree or less. While some college graduates (or dropouts) will go on to become billionaires, as a general rule it is quite hard to do better in a career without further education than you can with these two types of engineering.
Software and chemical engineering also both have their own "meccas": SV for software, Houston for chemical. Google/Facebook/etc. for software, Exxon/Chevron/etc. for chemical.
That said, anyone who thinks software developers are on the verge of being overpaid should consider how low developers (and most workers) are on the overall wage ladder. Maybe $150k sounds high to the typical person born and raised in the middle class, but that's pennies compared to the typical salaries in the true upper class. Moreover, the majority of developers making salaries around that number live in the most expensive cities in the country, not the cheaper ones.
As a general rule, software developers are in the middle class or upper middle class, but not the upper class. Of course, it depends on your definition of the various social classes. If you define it as the one percent, you'd need to make about $350k. If you define it as those whose primary income derives from investments rather than salaries, then even that is not nearly enough.
What's an example of something close to an "upper class career"? Some but certainly not all doctors would make it. Family physicians wouldn't be close (they're about the same as engineers in the end), but specialized academic surgeons will make $500k-$1M/year. At many universities with medical schools, the top surgical faculty will make about double the president of the university.
A successful trader of financial assets can make seven figures. This is almost common on Wall Street, but it happens elsewhere. The youngest billionaire in Houston right now made his fortune from Enron: he was a trader of energy derivatives and after making the company some $75 million one year he was awarded about an $8M bonus. He then started his own energy trading firm and went on to make a few billion.
A partner at McKinsey, the management consulting firm, will make seven figures in total compensation. A partner at any major law firm will make the same.
These aren't extraordinarily difficult careers to get into. Becoming a surgeon is mostly about planning ahead and dedication. Becoming a partner at a firm is just about putting in the hours for years. Literally anybody on Wall Street will make $100k minimum unless they're like a receptionist, and from there you can work your way onto the trading floor.
And let's not even talk about top business executives, who put all those salaries to shame. And contrary to popular belief there is a reasonably effective method towards becoming such an executive: Harvard/Wharton MBA, consultant at McKinsey, eventually hired by a company you consulted for. That's how Skilling got on as CEO of Enron. That's how my friend's dad, a CEO making $4M/year, got his job. Etc. There are tons of examples of it.
It happens all the time. Most people just don't know about it. Many people often think it's all luck getting into the upper ranks of the executives, but that's because they think it's done by choosing a company at a young age and working there for 20+ years. That's not how it's done in general. It happens occasionally, but you have to network your ass off for decades to make that work. But if you're a consultant, then you'll be hired by the top executives to help them solve problems. You just skipped all the lower ranks thanks to just a few years of business school. (This doesn't work with an MBA from anywhere but the most prestigious schools.)
In short, there are a number of careers where at your peak you'll make not a six figure salary but a seven figure salary. If you think back to how long the phrase "six figure salary" has been in use, you'll realize that, with inflation, a seven figure salary is quite accurately the new six figure salary in expensive cities like SF and NYC. People in these cities who aren't making such a salary are often willfully ignorant of this fact. They want the prestige of a "six figure salary," even though their buying power is an order of magnitude less than what that phrase originally referred to.
Just because there's a clearly defined path to making a lot of money in a high prestige job doesn't mean it's easy or that everyone can pull it off.
The best way to start an "upper class career" is to be born into the upper class. Good luck getting into finance or consulting if you didn't go to a top tier university. And it's not small matter -- even if you have the right background -- to get that job at McKinsey or into a Harvard MBA. You have to be extremely smart and strategic in your career choices, and not slacking off in college, either. Then, to advance and do well, you have to put in long hours and do very good work.
In this day and age, I think it is hard to argue that there is an easier, faster, or more forgiving path to economic success than software engineering and programming. The author of this article wouldn't even have a chance to get started in any of the other lucrative professions you mention with his 2.9 GPA.
And in addition, you have to have the mental fortitude for this kind of career. Despite the path not being "well-known", there is an intense amount of competition from a group of the most ambitious people in the world. Being able to compete at that level is insanely hard, much harder than being a software engineer.
I have had the fortune in my life to attempt both careers. I went to a top-tier undergrad, and got a job as an associate consultant at a strategy firm directly out of college. And guess what? It was terrible. Working every day from 9am-2am 6 days a week, no time for friends, family, hobbies, fun of any kind. After a few years of that life, I went right back to doing software engineering. Would I be richer now had I stayed in it? Probably. But happier? no.
I think it's important to remember in the end that very very few people in the world do work that matters, whether they make "something" or not. I think about this a lot while listening to stories about 14 year old Bangladeshi factory workers making $3 a month to support their families and being padlocked inside factories that are on fire so our clothes are cheaper.
We are all very lucky to be born into a country that gives us the opportunity to do comfortable jobs and complain about them freely. Use work to make enough money do the things you care about outside of work, and for god sakes, travel.
I like the way you think, especially with regards to happiness, and I would love to hear more about this part:
> and for god sakes, travel
I'm very fortunate to have a position where I work exactly 40 hours (some of which is programming). This has allowed me the freedom to pursue other activities, such as meeting and developing an extremely deep relationship with my girlfriend, being an active participant in my church, completing Financial Peace University, joining a local Toastmasters club, and joining CrossFit and getting in the best shape of my life.*
However, I have not yet done any major traveling. Why do you put such high importance on it?
I'm not asking why traveling is generally good, but why _you_ specifically are recommending it so highly.
* [Edit: After reading this, it sounds like bragging but sincerely that was not my goal. I just wanted to say I totally understand and agree about what you said with regards to work and happiness.]
In context it was meant as reference to the story of Bangladesh, meaning you don't know how lucky you are until you travel a bit of the world.
My story, however, is pretty typical: I worked through my 20s and didn't take time to travel, even though I had disposable income and, more importantly, was in charge of only myself. When I got married, got a dog, and had kids, traveling increased in complexity at O(an^k), where k is the number of people traveling together and a is the number of people you have to arrange babysitting (or dogsitting) for.
Now travel seems like a giant luxury, and I doubt I will truly enjoy it again until the kids are in college. By then we'll be so bogged down with college costs, there will be no money.
I had the opportunity to go to China for work and it really opened my eyes about how little of the world I have seen.
If I could upvote this twice, I would. Software engineering is a great profession, but it's certainly not overpaid (well, the bad ones are overpaid, but the good ones are often underpaid).
There are many relatively straightforward routes to more $$; most people just don't know about them.
Like I wrote in the post above, one path is to get a Harvard MBA, then work at McKinsey as a management consultant, and eventually use the massive and extremely powerful alumni network of McKinsey to move to a company in a senior executive role. It's also fairly common for consultants to just get offers directly from clients they've helped.
You have to be an absolute workaholic to do it, but it's reliable if followed closely. You can substitute Wharton or a select few other places for Harvard, but overall the name of the business school you went to is by far the most important factor of your entire life-long education when it comes to this path. It doesn't matter even remotely whether you learned anything; you just need the brand name on the resume and you need to practice case interviews. It's one of the most superficial selection processes in the world. It's so extraordinarily superficial that most people wouldn't believe it's true.
It's true that there's always someone richer, younger, and better looking (and if not, wait a few years). But if that's what you pay attention to, then you'll always be unhappy. Get a more diverse set of friends and you might realize what you already have is pretty good.
I don't contest any of that (in fact I agree completely), but that's all a different topic. This is about whether software developers are overpaid, not about happiness or social circles.
All I'm saying is a developer getting $150k/year in one of the most expensive cities in the world working for a company with tens, hundreds, or thousands of millions of dollars in funding or revenue should not feel guilty or overpaid.
I think it is relevant, because "overpaid" or "underpaid" don't mean anything until you decide who you're going to compare to. It's fundamentally subjective. Compare yourself to teachers or average U.S. salary and you could make a different argument.
(The trick of only comparing to people who are paid more than you is used by execs all the time in corporate compensation committees.)
There is a lot to this piece. I do think good coders are worth it, but I also think the current social/mobile bubble will burst (not that there aren't important gains to be had in those spaces, but the gains do not live up to investors' expectations). It seems to me so many businesses are chasing playthings because solving the real problems of the day is hard. To be sure, no one of us can solve any of them alone, but if we build our businesses with the problems in mind (rather than the hype), then we are all doing our part. That isn't a lot to ask, but it seems like too much.
For my money, the businesses I am starting (including the new cloud ERP start-up Efficito, http://www.efficito.com) are intended to continue to chip away at real problems. They may not seem like much, but they will hopefully contribute in one way or another to helping support small businesses, the self-employed, and such against the leviathan multinational companies. Some problems (like too much corporate control) are bad matches for venture-capital backing. Other problems may not be.
Too much corporate control over our lives as individuals is a problem a small family business can monetize a small piece of a solution for, but is not a problem a VC can, because the problem goes contrary to any VC exit strategy.
Now the second point though is that corporations may end up doing some of that anyway. The movement towards open source/free software and open data is helping establish that in at least some cases this problem is monetizable even on a corporate level if one realizes that the corporation is subject to the same restrictions as an individual. If we look at software or data as potential means of production, then this is a welcome way in which the industry is moving towards spreading around control over means of production.
But what I am advocating is not solving the important problems but building businesses with the important problems in mind, seeing what one can do to help with them in one way or another, either through operations or through product development. They don't need to be the primary focus (and maybe shouldn't be the primary focus), but the idea that we are living in a new world where virtue is not a prerequisite to value suggests two words to me: "speculative bubble."
Are any of us worth it? In the developed world we all, from bus drivers and waitresses to software developers, earn 10x what anyone in, say, China would earn for the same work. First world problem.
As for writing as a career, most writers have earned very little for a very long time. It's always been an incredibly tough profession to crack. Yes it's being disrupted and is in a transitional phase, but life is change, it's not the end of the world or a crisis in social values. Plenty of people are still making a living out of it.
A bit off-topic here but I am not sure where folks find these mystical $100,000+ web development positions. I am paid $40,000 a year with virtually no benefits to be a web developer.
I've had luck at places like careers.stackexchange.com, but that's me coming into the picture with four years of intensive real-world object-oriented programming experience in the likes of Ruby and Perl, and a smattering of exposure to Big Data. Also, you need to live in New York or Silicon Valley or a close equivalent.
Toronto, New York, San Francisco, Amsterdam, Montreal, San Diego...these are the cities where you can handily make that kind of money. If you are getting paid 40k per year, it is time to move my friend.
Two things about this article (which are really more prominent in the comments and on twitter) bother me.
First, and perhaps this is because I am of lower working-class roots: what's actually wrong with being a plumber? I believe it's fairly common to ask good programmers how they view the task, generally framed as a carpenter, a writer, or a mathematician.
Second is the ending: "Am I paid too much to code? Am I paid too little to write? No: in each case, I’m paid exactly what I should be." I find this important in tone but redundant. Generally speaking, in America the idea is that you take the pay you agree to, which to me inherently means "I’m paid exactly what I should be".
Nothing is "wrong" with it, but the skill required to be one is much lower than it is for other vocations. This means that there are more of them and so they are lower paid. The perception in society is one that because of this they have a lower value.
From my perspective, people who choose low-skill, low-pay work aren't pushing themselves hard enough. It's great if you're good at it and you are low on money, but if it's your life goal to be a plumber you may be setting the bar too low.
But what if being a plumber gets you all of the things you want in life? What is this bar you are talking about? Not everyone wants to be or has to be at the absolute top of their potential.
There's a huge disconnect between economic value and, well, humanity, frankly. Somehow we've come to worship the economic return of everything over the actual experience of living.
This is a great reminder that the "just world" fallacy [successful people deserve their success because they worked hard and that the less fortunate/unemployed somehow deserve their failure] is just that, a fallacy.
> This is a great reminder that the "just world" fallacy is just that, a fallacy.
Where do you see that? He specifically concludes that he's making the right (ie "just") amount of money for writing and coding, and that it's "just" that the CFO makes more money than the coal-miner because his skills are more specialised and in higher demand.
He points out that he got lucky in his choice of professions (or one of his professions) and that it's easy for people to conflate that sort of luck (being in the right profession at the right time) with somehow being deserving of all the rewards associated with it. I know I don't work any harder than say a mechanical or chemical engineer, I'm just better paid.
The irony of James' point – that engineers are in such high demand because of how cheap it is to start a startup – is that engineering salaries are actually being prevented from going up as a result.
There's too little supply of qualified engineers, but startups can't afford to compete in salary with the big guys (basically Facebook, Google, Amazon and the financial sector). So we compete on cheaper things like nice offices, gym memberships, etc. Basically, feel good stuff that doesn't cost much.
Facebook's budget for engineers is a good 20-50% higher than the salary a startup can pay.
Thus, it's not actually crazy that a law school student can become an engineer worth $80,000/year in only eight months. It's the startup's only option.
And that's a good thing: Companies like ours (http://www.thinkful.com/) are finding a way to build products in a way that doesn't break our budget. We're also providing great opportunities to people who want to become engineers. The only caveat is for companies greater than 50 or so people – they're struggling most of all: Too small to attract great talent that wants stability, too large to attract hungry new talent or accept junior talent that needs to learn on the job.
There's an upper limit, but our research indicates that there's a durable demand for engineering. This demand will look decidedly less sexy once startups become less cool, but software engineering will be an in-demand skill well beyond the current hype cycle.
I'm sorry, but a law school student cannot "become an engineer" in only eight months. That's an insult to people who have either studied for an engineering degree (as I have) or earned enough experience over the years to call themselves engineers, experience that can't be gained in eight months. That's ridiculous.
Sorry – should be more clear: What I mean to say is that there isn't a fixed definition of becoming "an engineer." One's level of expertise and proficiency is defined by the hirer... It's not for me to judge, especially in the abstract, what it takes. If an employer thinks you're qualified, then you're qualified.
What's happening in the market right now is that people who may not have previously been considered qualified are now considered qualified. This is because there's so much demand.
too large to... accept junior talent that needs to learn on the job.
Huh? Being larger should make it easier to do this, not harder. The company I currently work for has done this for its last 3 developer hires, and it's about the same size as yours.
Yeah – that was too quickly worded. What I'm saying is that these middle-size companies are too large to be seen as sexy startups that can make you millions, yet still too small to provide the mentorship junior talent needs. One or two junior hires can be mentored, but that won't meet the demand, and it's super expensive.
Definitely agree with this -- the cost of living in NYC/SF is absurd, and there's not really much of a market elsewhere. A few choice cities (Austin, Chicago, maybe Atlanta?), but the entire market everywhere else is dead in the water by comparison. There are certainly jobs in those places, but it's going to be very scary if this bubble bursts again because a lot of people are going to be out of work, and I don't know if the smaller markets are going to grow enough to compensate for everyone going back home.
Personally, I found it did go as far as I thought it would, in NYC. It might even seem like a bargain if you're coming from Australia. You finally get to live large, banking the same sums as an apprentice electrician from Perth.
A CFO is paid more than the coal miner because the skills required to be CFO of a Fortune 500 company are scarcer, and more wanted, than the skills required to be a coal miner. It's the combination of scarcity and wantedness that drives up a salary. Coders (read: web developers, in the author's case) are paid more because of a demand for them, which the author tries to understand.
One reason software developers are valued so highly is how big a part of modern business software has become. From the Steve Jobs Lost Tapes in the mid-90s:
So software is infiltrating everything we do these days. In businesses, software is one of the most potent competitive weapons. One of the biggest business wars of the last ten years was "Friends and Family." And what was that? It was a brilliant idea and custom billing software. AT&T didn't respond for 18 months, yielding billions of dollars of market share to MCI -- not because they were stupid but because they couldn't get the billing software done.
At the moment there are about three times as many programming job openings in the US as there are programmers graduating from universities. No wonder developers get paid well.
I often worry that my job is, well, not that hard. The thing is, if it were as easy as it seems, more people would be doing it - or it would be automated - and either way the price would drop pretty soon.
So either what we do really is that hard, and worth that much, or there's a crash coming.
I think this is a good observation, that the interest in this market as a whole creates a dearth of really poor business ideas.
I think this is natural. But if you're in a startup incubator especially, you have to remember that those aren't the cream of the crop. They're people who will most likely fail, and the world will be better if they do (if the world wanted their product, and they're the right ones to build it, they would surely succeed).
I've sat in tech incubators and overheard direction discussions and software discussions, and just laughed. My overwhelming thought was simply "That is the worst idea I have ever heard in my life."
There were one or two companies that had good ideas, good people, and did well. They quickly moved out of startup kindergarten.
"Dearth" means scarcity. I had to read your post over a few times to understand that you (probably) meant an abundance of poor business ideas. An enjoyable word to use, though!
Bad programming (coding) is easy. Idiots can learn it in 21 days, even if they are Dummies.
-- HtDP2
btw, 5 opening paragraphs with so many Is and numbers hardly could be considered a good writing. I guess that the word for this (writing style and the content) is hipsterism.)
I'm not sure if English is your first language, but the writing in this article is actually very good. I don't have any problem believing that people are willing to pay Somers for his writing. (And $10K is, by magazine standards, a princely sum.)
We are not the shovel; we are the ditch diggers, just slightly better paid. Either way, who wants to be a commodity? Your analogy doesn't make me feel lucky, but like I need to get out of the ditch.
Once you've learned a complex skill, it can seem very easy to you - and you no longer see that it's not-easy to the point of impossible for many others. Most people don't have the talent or patience to learn it at all.
Besides software, I play guitar. I've played guitar for over 30 years now. I long ago passed a level of mastery on the instrument, and my attention is focused on expression rather than technique. When working with other guitarists, it can be difficult to even comprehend that they haven't mastered their instruments technically, that things I do without thinking are hard for them to even understand, much less do. And frankly, most of them will never, ever master their instruments.
Software's the same way. It takes, at least, several years to really master it to the point where the act of coding itself no longer gets in your way and dominates your thinking, and you can focus on problems of expression rather than technique. The value of tools like Rails is that they further reduce the technical friction, getting you closer to expressiveness.
And THIS is why coders are worth it. There's work that needs to be done, work that has real market value, and a very limited number of people with the mastery needed to perform the work. And yeah, most startups are crap, but the big wins are big enough to justify risk on a bunch of small losers, so the market is there.
I just wish I could make a living as a guitarist. It's more interesting and challenging.
Some people can play chess every day for years and never get past being an advanced beginner. That's over 90% of the population, whether it's chess or programming. It's a very intense mental exercise... even if the author of the article thinks it's just "playing."
To get a job programming you have to be better than a mid-level chess player and demonstrate it by 'winning' projects that are relatively difficult. Even then, you still have to jump through tons of hoops to get a decent job. The types of perks and jobs the author is talking about sound like a fantasy to me, when I've worked nothing but soul-sucking .NET corporate jobs and web contract work where I'm always worried about getting paid on time.
For me it's come to the point where I'd rather work a part-time job (I'm basically a Janitor) and do programming/startups on the side... than have more soul-sucking code-slave work. So it's almost like this guy lives in an alternate universe to me.
I'm trying to make it easier for me though... lately I'm taking a bunch of Nootropics (racetams, fish oil, various amino acids) and I'm getting back on track. I have a friend who takes Adderall every day to keep up with his job. It's great that the author does not have to do these things and everything just falls into place for him... but at least have some perspective that for many people this shit is not easy.
The web is a clusterfuck of pages users can only read and leave comments on. You won't really do anything relevant with the web. It's useless compared to the speed of current internet connections. The only thing people use is the browser, which uses HTTP as its basis, which is basically telnet. The web is still using old tech, and no engineers are really trying to make new things, apart from to-the-metal tech like BitTorrent.
The internet is not just a web of websites; the internet is also connections between peers, which is so much more flexible.
Right now, what do applications do? They select data from a database, with nice data structures, then they write out this huge text file and send it. Your browser then parses this stuff to make some sense of it, you put data in it, and it's sent back. Google then parses aaaaall this and tries to make some sense out of it. And somehow they manage. The web is like people communicating by mail instead of holding real-time telephone conversations. It's very inefficient. People try to make real-time stuff with it, like Facebook and their huge cluster, but it's really a sad show. JS has now almost caught up with C and C++ in execution speed. But it's still sent by mail.
I thought WebRTC sockets would fix that, but in the end it's a huge fail: you can't connect to an IP address directly, it's still that centralized clusterfuck.
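The round-trip the comment above complains about can be sketched in a few lines. This is a toy illustration only, using Python's stdlib; the table and field names are made up:

```python
import json
import sqlite3

# "Server" side: structured data lives in a database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE posts (id INTEGER, body TEXT)")
conn.execute("INSERT INTO posts VALUES (1, 'hello'), (2, 'world')")

# Select rows, then flatten them into one big text payload (JSON here,
# standing in for the "huge text file" the comment describes).
rows = conn.execute("SELECT id, body FROM posts").fetchall()
payload = json.dumps([{"id": r[0], "body": r[1]} for r in rows])

# "Client" side: parse the text back into data structures to make
# sense of it again -- structure is thrown away and reconstructed.
parsed = json.loads(payload)
print(parsed)  # [{'id': 1, 'body': 'hello'}, {'id': 2, 'body': 'world'}]
```

Whether or not one agrees that this is inefficient, the sketch shows the point being made: the same structure is serialized to text and re-parsed at every hop.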
I've been thinking about this text for some time, and there is one thing that's been bugging me. This is a sentiment that could be applied to most jobs out there:
> We call ourselves web developers, software engineers, builders, entrepreneurs, innovators. We’re celebrated, we capture a lot of wealth and attention and talent. We’ve become a vortex on a par with Wall Street for precocious college grads. But we’re not making the self-driving car. We’re not making a smarter pill bottle. Most of what we’re doing, in fact, is putting boxes on a page. Users put words and pictures into one box; we store that stuff in a database; and then out it comes into another box.
You can compare this to dentistry. It's a highly regarded profession and the wages are really good. But most dentists just pull a tooth out, or drill it and put in a filling. It's not rocket science, and when was the last time a dentist came up with a new type of filling? The profession itself hasn't made really huge leaps in the last hundred years. My hour as a programmer in Croatia is worth around 10 USD. A dentist gets 20 USD just to look at your teeth and charges 50 USD for a white filling, which they can do in under an hour.
One more thing. Lawyers, dentists and some other professions have associations that determine minimum prices "to assure high quality of work". Something to think about.
The question is not whether it's worth it. I'm a programmer, and I believe that construction workers and cleaning ladies should earn more than me, since they are forced to do their jobs and generally don't like them.
I, on the other hand, like my job, yet I earn more for doing easier work (easier for me), which seems unfair; this struck me some time ago. But what we have to accept is that our income, or the income of a profession in general, is based purely on supply and demand, and little more.
Companies in Silicon Valley seem to approach interviews with the idea that the candidate is a blank slate. There isn't a lot of discussion about things you've done in the past; the general approach is to ask you to design for hypothetical scenarios, or to write code on the whiteboard for algorithmic problems.
The Silicon Valley approach seems to throw out a lot of good information that could indicate a candidate's quality by focusing only on the now, rather than any historical data. One of the reasons given for this is that a lot of people are good at talking about stuff when they haven't actually implemented anything, but it seems to me that an in-depth conversation about someone's background followed by coding exercises would provide a better indication of their quality than abstract questions in a vacuum.
I think another reason why interviewers don't ask a lot of questions about candidates' backgrounds is to avoid potentially hearing something that the candidate's previous employer might consider confidential information. This is a fair concern, but it does lead to what seems to me to be a sub-optimal evaluation process.
There also doesn't seem to be as much urgency to hire someone now as the author indicates -- companies can and do take time to schedule interviews and make up their mind. Perhaps it's different in NY.
While it is important to keep our egos in check, I think this article leans a bit toward discounting modern web computing. Not all of us twiddle around with CRUD apps all day. Running a site with millions of concurrent users is challenging, and useful. Sites like Twitter or Reddit enable social change and transparency. Running a site with one hundred users doing real work is challenging too, allowing one or two people to do a job that would have taken 10 people just a few years ago (e.g. sales-force automation, accounting systems, GIS systems, big-data analytics, etc.).
Also, he talks about how web devs are like plumbers with toilet-fixing robots. This would be true if the plumbers happened to build the robots themselves. Who does he think built Rails, HTTP, Express, Sinatra, Cake, Mongo, MySQL, and every other open-source web technology out there? People like you and me. The web is an amazing thing, quite possibly humanity's greatest achievement so far, and we are still working on making it better.
One last thing. The author seems to think web devs just think about colors all day. Is he not hacking, exploring, and building things? Someone who talks about colors all day is a web designer imo. Nothing wrong with that, but entirely different focus.
‘What about somebody in a coal mine — wouldn’t you say he works as hard as you? Why should you get paid so much more than that guy?’
If this is a question to yourself, then my response is: start sending part of your paycheck to the guy in the coal mine. Don't blame someone else for society's ills while being unwilling to fix such problems yourself. Find a miner, adopt him, send him half your paycheck, keep tabs on him: how the job's going, how the family's doing, etc.
If this is a question for me, a creator of software, then my response is: I build things that enable new activities, I don't perform a task that maintains someone else's status quo. I (as a freelancer) or my employer (when I'm an employee) have found someone willing to pay what we charge. Period. How am I able to do that? By learning. Continuously. By adapting to changes in technology and the market. Why do I get paid more than the coal miner? Because he has settled for that life. If he needs something more, he needs to learn how to attain it.
But maybe, just maybe, our coal mining friend is doing what he loves. If he's happy ... leave him alone.
>start sending part of your paycheck to the guy in the coal mine
...that sounds incredibly insulting. It's not like he's homeless.
>Why do I get paid more than the coal miner? Because he has settled for that life.
bullshit. This is exactly the kind of attitude that the article is criticizing.
>maybe, our coal mining friend is doing what he loves
Maybe, and that's great. But for most people, "doing what you love" is simply not viable, at least career-wise. Don't forget how incredibly fortunate you are to love doing a job that also happens to pay extremely well.
I would be cautious with this callous mentality towards the miner. I wouldn't jump so quickly to the conclusion that he or she is not "learning. Continuously," as you say you are. The miner may very well be. The issue the article raises is a good one: what validity is there in someone rating the value they create so much higher than another's?
What happens one day when your profession is no longer well paid? Imagine for a minute that you did not have the mental capacity to compete in the new 'tech world' 20 years from now. Would you expect to be paid in accordance with your lack of ability, or would you hope you'd still be given an equal shot?
Personally, this article hits at the heart of the somewhat libertarian style of commentary on Hacker News, one that I myself am all too engulfed in as well. Perhaps we aren't as valuable as we may think?
I actually think that the current trend of "chasing eyeballs" is really valuable on its own terms.
Anyone who wants to start a business to support themselves, or their family will ultimately need customers, and as we continue to move away from the post-war era of people working principally for large companies which make things towards a services economy, a lot of people will need to find ways to communicate with their potential customers.
Lowering the cost of acquiring customers makes it much, much easier to start a new business, and to make it successful. Products and platforms that do so are generating real value on human terms.
My wife, for instance, is a photographer whose business is driven almost entirely by word of mouth, and twitter and facebook are both important components of enabling her network to share her work with potential clients. It would be enormously more expensive and difficult for her to run her business via traditional advertising, and would probably preclude her from running her own photography business.
tl;dr Chasing eyeballs makes small businesses easier to start and that's a real, human good.
First of all, the plumbing analogy can be applied to many professions. From a certain point of view, engineers, doctors and lawyers are all plumbers too - it's just silly. That said, I think the author bought the propaganda that wants to make us (web devs) feel special and valuable. But hey: that in no way means the whole world - or even our bosses - thinks we are very special and intelligent people. It means they know we produce more and get more creative in certain environments and when well treated: so they provide. It seems to me the author climbed out of the propaganda and now thinks the whole industry is pointless. He is at the other extreme from the very motivated hipster developer. Don't buy that either: we are needed (now) and our jobs are valuable. So are teachers, cops and many other jobs.
You are wrong to generalize about all coders.
You may be a "coder" designing web sites, doing repetitive work. But there are many out there working to improve workflows in healthcare, manufacturing, entertainment/gaming, etc.
And as for comparing a coal miner to a coder (personally I find that the go-to comparison when mocking someone who undervalues coders): there is no special added value the miner brings to his workplace, compared with a "coder", aka software engineer, whose one hour's work can make processes efficient and increase productivity for 1,000 more people.
Overall, I like your style of writing from your own perspective, and there's a dose of reality in it. But as for generalizing about coders: don't. Especially here on HN.
Yeah, but even in poorer countries there's always a demand for programmers, because there's a global market for software. A lot of my friends in other professions have been jobless or taking shitty part-time jobs. Even the ones that have a job are scared of losing it. Me? I had the audacity to quit my job just because I didn't like it enough. Other people in my country can only dream about it. And every month now I get one or two job offers, usually when a friend or former colleague recommends me.
Oh, how I despise the sentiment that if some group with an important function gets low pay, it can only mean that other groups with a "less important" function are overpaid. The only people who are overpaid are executives. The rest of us are underpaid (with some exceptions). Yes, even developers. Look how much money a trader makes if they do well. Now compare that with a quant, who will do consistently better and make the company much more money.
If you feel like pulling others down into your pit, reach out for executives, not other working class people. FFS.
This article made me laugh. As with the dot-com bubble, many people jumped aboard the comp-sci ship and abandoned it when it sank. The people who really loved it stayed, and now the same story is repeating itself - this time with the article's author, who is reaping the benefits of learning RoR and being patient enough to read manuals and Stack Overflow.
P.S. I'm also quite sick of people building things aimed at a huge sale later. Build software, a website, or an app that solves some kind of problem you are passionate about. I'm also sick of people who are into programming just because it pays well.
This was a depressing article. Sometimes, I often think of this myself. And then I think about investment bankers and what they do. That makes me feel better.
And all lawyers do is translate legal language into the common tongue (speaking from experience). I looked up the law regarding my lawsuit and pointed out to my lawyer what he should say, and he did so. The judge took it and I won my case. However, the judge would not have done so without my lawyer making a lengthy speech. The point being: we are the only ones who know how to put boxes on a page, so we get paid for it.
Yes, coders are worth it. I've personally seen medicine up close, and have friends in various industries ranging from law to business. Everything is shit compared to the value software engineers create. Anyways, I don't think people should get their egos wrapped up into their careers, life is more than that. But I wouldn't undervalue what you are doing.
> Web development is more like plumbing than any of us, perched in front of two slick monitors, would care to admit.
Not only web development, but programming jobs in general. To be honest, after going through a couple of 'real' jobs, I am at the point where I'd rather do some plumbing rather than write another rpm spec or see why a bash script is misbehaving.
One reaction: Routine Web site "plumbing" is not nearly all there is to the Web, now or in its future potential. If the OP wants work that is more advanced, significant, powerful, valuable, meaningful, etc., then he should try to think of a Web site that needs more than just routine plumbing.
The IT/developer scene in big NYC banks is even worse. People with no experience and faked-up resumes get business analyst/QA positions by cramming some basic skills in a few weeks, and they are paid $55/hr for doing nothing. This cannot last, can it?
People with general skills like web development may not be worth much - you need advanced knowledge and the ability to build large-scale systems to be worth something as a technologist. Or you need business acumen to make it big.
Prices are set by supply and demand. Right now, coders are worth it, if they aren't the price will go down. If you want to take a longer term view and question the impact on the world, well, the galaxy will be gone in a trillion years.
I'm not sure if my case is very unusual, but it took me a good 5 months to find a job, despite having designed a website and having a master's in math with many comp-sci classes. I think it's not that easy for everyone to find a job.
When one becomes obsessed with the tool of the trade, one loses sight of the greater end.
If the ultimate goal is not worthwhile, it does not matter whether you code it Ruby or Scala. It's busy work and a waste of human energy.
Err... CEOs/CTOs/CFOs have that. You should add "reasonably minimal competitive" in front of each word when talking about web devs at an average company.
Kind of a link-bait title. The real question the author is digging at is the value of work, what others are willing to pay, and how it affects your self-esteem. But it worked, because I read it.
Are executives worth it? "In today's world, executives & managers have it all: money, perks, freedom, authority & respect. But is there value in what we do?"
As for me, making smarter pill boxes is not much better than designing a new deo-spray box or making a new web page or app for a company. Not a big difference.
Interestingly, in my experience in the north of England this culture is much more applicable to designers - web development doesn't pay very many standard deviations above the average wage, whereas designers can command high freelancing rates and comfortable working conditions.
There does seem to be the same sort of frenzy in Canada. You can hear it in normal conversations all the time: 'Wow I can't believe you do that, it's so confusing to me. I could never write code.' It seems this mystique is what is paying us so well.
Depends on where you are. Toronto? Definitely. I get recruiters every day asking if I want a job (always onsite though, never contracts). Anywhere else I've found salary and perks drop quickly. Case in point: in my last location (PEI) the salary was barely above minimum wage (35-50k/year), even though they keep complaining about shortages of skilled IT folks.
Well I put it this way to friends of mine out in the Maritimes. Back in 2012 I was laid off due to government cutbacks (the company I worked for did the Tourism site), and in the three months afterwards I had three interviews across the maritimes. In a week visit to Toronto, I had twenty five interviews.
And salary is like night and day. The highest PHP salary I heard of out there in my interviews and travels was 55k. Toronto PHP jobs start at 60k and many are in the 70s and 80s. I've heard of an even larger gulf for Java devs, especially as you get up to the senior ranks. I don't know about salaries for other languages, but if you don't know Java, .NET, or PHP in the Maritimes, you'll be waiting a long time to be employed. There's the occasional C++ or Python gig, but you'd make twice as much in Toronto.
And remote jobs in Canada, in my experience, do not exist. I've seen maybe five in the last five years. Meanwhile I see remote gigs in the States daily. Basically, if you're looking for remote work, you're looking in the US. Canadian employers, I've found, range from leery to openly hostile about folks working remotely.
Now, things could have changed in the last year, and that's just my own experience, but if you're looking for solid work and good pay in Canada, it's effectively Toronto or bust.
Yeah, it really is too bad that there are so few jobs in the GTA that allow for more flexible working conditions. Guess things haven't changed much since I left. I hope employers' attitudes will change as more and more startups mature - or do they not even allow a flexible schedule?
this guy's a moron. the price of content hasn't gone to zero -- the price of quality content has gone to the moon, and the price of the bs he writes is going to zero as it should.
the world’s most celebrated sushi chef turns to his son, who is leaving to start his own restaurant, and says: ‘You have no home to come back to.’ Which, when you think about it, isn’t harsh or discouraging but is in fact the very best thing you could say to someone setting out on an adventure.
Only if the only type of adventure you consider is one which aims toward financial success. There are other types of adventures in this world, and other types of goals. There's a place in this world for experimentation, playfulness, and boldness of the type that only is possible when you have a home to come back to.
I know that may not ring true in this forum, where the entrepreneurial spirit reigns, or in Japan, where (for many) katagaki (social rank through achievement) brings honor to your whole family, but from my point of view Jiro was sort of a jerk.
"All you can do in life is a lot of work" -- Ira Glass
Most people simply fear us because of the way the media portrays us as Zuckerberg, and our most persistent topic of discussion outside the GitHub world is Zuckerberg, or surveillance, which is just another route to Zuckerberg.