What unemployment? Nothing like that in United States of Software (college2startup.tumblr.com)
44 points by safarimong10 on Sept 6, 2011 | 70 comments



To become a software developer (especially the kind with a CS related degree), you need:

-A highly analytical mindset

-The ability to handle tough math

-The ability to read dense material and process it quickly

-The mindset to persevere, test, investigate, and find obscure problems in complex systems

-The ability to see a project through to its finish

-The ability to explain complex logic and design, verbally and in writing

It also helps immensely to have:

- The ability to work well in a team, and good general social skills (the notion that this isn't important in software development is a silly myth; it's incredibly important)

- The ability to complete lengthy and difficult academic programs, without veering off into less demanding majors that give you more time to party.

- A family or other benefactor that can fund and support you through these academic programs (I knew a dude who tried to major in CS while working 25hrs a week in retail. Extraordinary people can do this, but it's very difficult to carry physics, math, cs, and a humanities elective under these circumstances. Many smart people fail even when school is their only "job").

It makes absolutely no sense to compare a person like this to the national average. The general unemployment rate has nothing to do with the unemployment rate for people like this.

So is the unemployment rate actually low for software developers, or is it simply low for all people with the traits listed above, regardless of field? You could reasonably argue that jobs go begging in software largely because the field is not competitive with the other professions/trades that people with these traits have available to them.


Disagree on "tough math". Strongly disagree.

Be analytical, practical and tenacious. Know how to abstract, design, program. Know tools, know platforms. Know how to triage and actually ship stuff. Be good to work with. Keep learning, constantly. Read tons of code.

But "tough math"? Honestly, haven't needed anything more than simple statistics for most problems.

"Tough math" seems more like an ego thing. I'd watch out for that.


You're correct for general problem solving in computer science. However, in order to get a degree from most universities you have to have taken 3 courses in calculus, a course in linear algebra and a course in differential equations. To most people those courses are "tough math".


If you are seeking a masters in CS most universities are only looking for programming languages, computer architecture, data structures and one algorithms course. These are the prerequisites for most courses.


That's true; I'm just speaking from the experience of looking at many undergraduate computer science programs in the United States. They all generally require some "tough math", and I think they should, because that sort of analytic thinking in solving complex mathematical problems applies directly to computer science.


An ABET-accredited CS major requires calculus, probably somewhat beyond the AP BC level, plus a course in discrete math. And maybe a bit more; you can check out their web site at http://www.abet.org/


I agree. Even as a math major myself, I have only needed formal math when math was part of the domain knowledge. Math is a great major, but I think you can be an absolutely top programmer with a degree in a different subject or no degree at all.

I mentioned this because if you want to be the kind of programmer with a CS or related degree, you'll need to be very good at math.


You do not need to be very good at math to get a CS degree. The last role I had was in a company heavily loaded up with UMich CS grad students, none of whom could (for instance) deploy a discrete cosine transform.

There's a level of "street math" that good programmers tend to have (statistics, maybe some trig, maybe basic matrix arithmetic), but you'd be surprised how many excellent programmers don't even use algebra. A friend of mine interviewed at a very famous quant hedge fund (a decade or so ago) and shocked his interviewer by answering one of the questions with a system of equations in two variables.


I'd expect UMich CS to be very strong. I haven't looked at the undergrad curriculum, but I'd be surprised if it doesn't require calculus through differential equations, along with some more advanced electives.

I should have defined "tough math," because a lot of people on HN have gone far enough with math that what I described above wouldn't be considered "tough". But it actually does serve as a gatekeeper for most engineering and hard science curricula, including CS.


One can get through the required classes without being any good at them. Wannabe doctors take 2 years of chemistry as premeds and promptly forget it all when finals are over.

That said, I've never seen a CS degree that required any more math than a year of calc and maybe a semester of linear algebra. That doesn't seem so hard, one just has to keep up with the homework. Maybe you are thinking of computer engineering?


Semester of statistics required for mine.

Of course they have since dropped the stats requirement. And the calc 2 requirement. So now it's a semester of calculus and a semester of linear algebra. I guess too many people failed calculus 2 and stats.


People fail stat and pass linear algebra?!


There's basic "stat" for the social sciences (nothing but arithmetic for means, medians, standard deviation, and so forth). Then there's calculus-based probability, which can be very rigorous. The CS curriculum at rigorous programs tends to require the harder, calculus-based probability. I was a math major, but there were a lot of CS and engineering students in my stats class.

BTW, even with the calculus prereq, I'd agree that linear algebra would typically be a harder class, so I definitely understand why you're surprised someone would pass Linear Algebra and fail Stats.


I don't think that math is universally necessary - it's actually rare that I need to sit down and write equations and think about something mathematically. (Unless I'm a genius who does it in his head, and trust me, I'm not.)

And a 25-hour-a-week job doesn't seem that overwhelming. I think I worked 20 hours a week with a 30 minute commute each way - unless you're having a truly difficult time with the core concepts, a 15 hour course load and a 25 hour work load isn't much harder than starting any new job.


Mathematical literacy is more than just writing down complex equations.

    for (a = 0; a < n; a++) {
        for (b = 0; b < n; b++) {
            do_something(a, b);
        }
    }

How many times is do_something called? If you can say n^2 in under 15 seconds you are now better at math than the average American.
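For the curious, the count can be checked mechanically. A quick Python sketch of the C-style loop above:

```python
# Count how many times do_something would be called by the nested loop.
def count_calls(n):
    calls = 0
    for a in range(n):
        for b in range(n):
            calls += 1  # stand-in for do_something(a, b)
    return calls

# The inner body runs n times for each of the n outer iterations: n * n.
assert count_calls(3) == 9
assert count_calls(10) == 100
```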


That isn't considered "tough math" though. Sure, basic math is needed, but most developers won't actually use high-level math (except to get the degree).


There's an argument to be made that learning tough math is necessary to make basic math effortless.


That wouldn't be a very good argument. Basic math gets better by doing a lot of basic math. Your trigonometry will not improve by learning topology. I had more than one math prof who could not seem to balance a checkbook.


I knew a very smart guy who worked over 30 hours a week on compilers alone. He got an A-. Maybe he could have gotten away with less work.

This was a notoriously difficult prof at a notoriously tough program, though. So maybe not typical.


I suppose it depends on the specifics of your program. BS in CS over here, never had to take a compiler class as far as I remember.


I agree. Just like I'm not going to tell someone who is more analytical to get into graphic design. Creative people go to school for design to sharpen their skill, not to learn how to think creatively.


Don’t mention this in the software industry, especially startups; the situation is almost the complete opposite, with demand for qualified engineers more than twice the supply.

At the price startups are willing to pay, naturally. I imagine demand exceeds supply for doctors willing to work for $30k + equity + copious free beer, too.

I know a lot of people feel that $80k or $120k for developers is quite generous. It may well be generous, but if you can't hire someone for it, it is by definition below market. You know all those articles we've read about structural change? Here's another structural change: the market clearing price for engineers may soon durably transition to that of e.g. management consultants or lawyers rather than that of e.g. HR clerks or marketing directors.


I hope so, but the problem I see is that new grads (I include myself here) are willing to work for 80k, while it's not that much of a stretch to say that the best graduates will be almost as productive in a few years as they will be in 7 or 8 years. That doesn't, however, automatically mean a commensurate salary increase, so developer salaries are kept perpetually lower than they should be. How can employers and employees adapt to this bifurcated market?


If X is repeatedly cheaper than what it should be, you should buy it instead of selling it.

Anyhow, if I'm reading you right, you think that the best college grad in 2019 will be about as good as you are in 2019, causing your salary to decrease? If I'm reading that right, you will have a very interesting eight years in industry and not worry about new college graduates ever again.

Generic career advice for every engineer: broaden skill set, get reassigned to a profit center, negotiate aggressively on comp.


Ah, sorry, poor wording. What I mean is that a developer's skill increases non-linearly with experience: there is a sharp spike after only a few years of working in industry. I won't argue that skill doesn't continue to increase with experience, but I think the gap between my skill 2 years from now and 9 years from now may be proportionally smaller than the gap between now and 2 years from now. However, employers generally seem to work on linear experience models, so if I start at $80k now, in 2 years I may get a simple cost-of-living + (hand-wavy) 10% salary increase, whereas if your assertion is correct I should probably be getting paid closer to the controlling market salary of >$120k. Do you think the proper way to handle this is for new developers to quit after 2 years?
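As a back-of-envelope check, using the illustrative numbers from the comment above ($80k start, a 10% bump, >$120k market) and an assumed 3% annual cost-of-living raise:

```python
# Illustrative only: the 3% cost-of-living figure is an assumption,
# the rest are the numbers quoted in the comment.
start = 80_000
col = 1.03                                  # assumed annual COL raise
after_two_years = start * col * col * 1.10  # two COL raises + a 10% bump
market = 120_000
gap = market - after_two_years

print(round(after_two_years))  # 93359
print(round(gap))              # 26641 left on the table per year
```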


Oh, if that's what you're worried about, a) you're right, employers routinely exploit wage anchoring of employees already on the payroll, b) I don't know if I agree that devs slow down in growth after two years, and c) regardless of whether they do or not, quit and demand wages commensurate with the value you bring to the company.

I used to work for $30k prior to discovering that there exist better options. Now I try to be the little Internet fairy whispering to young devs "There are better options!" ;)


In 1994, when I first entered the job market, $80k/year was a ridiculous number, reserved for people with unattainable-sounding skills like "PowerBuilder". Point being: look at the trend line, not the spot price. New grads today will accept $80k. Five years from now, if the market retains this level of heat, will they continue to do so?


To be clear, $80k in 1994 is roughly $116k today with inflation - probably still on the very high end of entry level.
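The adjustment is just a multiplication by cumulative inflation. A sketch, assuming the roughly 45% cumulative CPI inflation from 1994 to 2011 implied by the comment (check BLS CPI data for the exact figure):

```python
# Back-of-envelope inflation adjustment for the $80k (1994) figure.
nominal_1994 = 80_000
cumulative_inflation = 0.45  # assumed ~45% from 1994 to 2011
real_today = nominal_1994 * (1 + cumulative_inflation)
print(round(real_today))  # 116000
```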


Entry level? $80k was for a spectacularly senior position. $35k sounds about right for degreed entry level dev jobs in '94.


It's not quite equivalent, but in '98, I was thrilled when I got a job as a network engineer for 40k (in Boston). At the time it seemed like more money than I knew what to do with (I very quickly became much more imaginative).


Boston area tends to have relatively low salaries, both compared to competing areas (like the Silicon Valley, NYC, and Washington-Baltimore corridor) and relative to the cost of living (real estate, primarily). The difference is not trivial.


My experience hasn't been that cut-and-dried:

I moved from Boston to Silicon Valley in '99, and found it was significantly more expensive with only marginally better salary (which I was happy to accept as we would leave at lunch and go surfing, which for me was a worthwhile tradeoff).

I moved from SV to Washington DC in '01, and found the cost of living slightly less, but salaries approximately the same.

I'm now in the process of moving to NYC, and the cost of living and salaries both are a little higher (I don't really care about money though, so I'll admit that it hasn't been a factor in my moving decision).


Software is more important (and valuable) to the world now than in 1994.


In Germany there is the word "Schweinezyklus": pig cycle. It's mostly prevalent in engineering. It works like this:

1. Companies complain about not having enough qualified engineers

2. Everybody gets told to get into engineering in college

3. Boom crashes. Companies lay off engineers. The new engineering graduates don't get hired and are unemployed

4. Everybody flees engineering

5. Back to step 1.

I think we are at the end of steps 1 and 2.


I don't think this quite captures the situation. Companies aren't complaining about a lack of credentialed developers; they're complaining about a lack of competent developers.

In 2000 there were far more people graduating with computing degrees than in 1995, but the average quality was far, far lower. The people who suddenly found that they couldn't get jobs when the dot-com bubble burst? Most of them didn't belong in the industry in the first place.


I finished undergrad in 2002, and no one was hiring then, even for Stanford CS grads. No one wanted people fresh out of school because there were so many unemployed engineers with 5+ years of experience that there was no incentive to take someone straight out of school.


I knew plenty of people who were top-notch developers who didn't find jobs back then.


Anything with delayed results will have cycles. You express cynicism, but there's no great evil behind it. Predicting the number and type of people that should study engineering today for work ten years from now is no easy task. It will always be a little bit wrong.

And at least you end up with educated people and the fruits of engineering work. The alternative is giving up: "I might as well not learn anything very useful, because there might be a period in the future where it's not so useful for a while."


I don't think this cycle is evil. I just want people to be careful when they read stuff like this in the article:

"Are you in college and bothered about the job market ahead of your graduation, should you consider switching majors? maybe. The industry is booming and doesn’t look like it will be slowing down anytime soon. In the middle of all the unemployment, maybe we can take solace in the never ending demand for Engineers."

After a boom there is always a bust. And the bigger the boom, the bigger the bust.


I'd go further and say this is a big problem for any supply-demand system where there is significant latency: farming, degrees, etc. Wave effects seem inevitable in that case.
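The latency-driven oscillation can be sketched with a toy cobweb model. All parameters here are invented for illustration: constant demand, enrollment that chases today's salary signal, and graduates arriving a few years later.

```python
# Toy cobweb model: enrollment responds to today's salary, but graduates
# arrive `lag` years later, so supply chronically over- and under-shoots.
def simulate(years=20, lag=4):
    demand = 100.0              # assumed constant demand for engineers
    supply = 60.0               # start in a shortage (step 1: complaints)
    pipeline = [60.0] * lag     # students already in school
    history = []
    for _ in range(years):
        salary = demand / supply        # shortage -> high salary signal
        pipeline.append(60.0 * salary)  # enrollment chases current salary
        supply = pipeline.pop(0)        # today's grads enrolled `lag` ago
        history.append(round(supply))
    return history

print(simulate())  # supply oscillates instead of settling at demand
```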


It's pretty region and skill specific.

If you live somewhere outside of about 4 or 5 locations in the USA, getting a programming job can be hard. Also, skills that are hot in a place like NYC or Silicon Valley are often next to useless in other areas. (learning this the hard way)


Would you mind expanding on your last sentence? What was your personal experience? I'm guessing you have Rails/NoSQL experience while the jobs in your current market tend to be the Java/Oracle type?


Yes, that is pretty much exactly what I meant.


It's ok to move, many will pay for relocation!


A demand for workers is not the same thing as no unemployment. Plenty of people out there with outdated skillsets.


Luckily for folks with outdated skillsets, you can learn and practice almost anything in software these days for free from the net. Back in the day you might have had to go back to school, now it's a matter of buckling down and brushing up on your Rails.


You need an incredibly high level of computer literacy first though.


I don't think we're talking about people who have never programmed before; we're talking about maybe COBOL and Fortran experts, or people who write low-level C but can only find Python and Ruby jobs.


Are we really at the point where engineers can't find jobs because they only know C++ instead of Python and Ruby web stacks?

This is a bit concerning, if so.


It happens. I used to work with a guy whose expertise was Java GUI development and who knew NOTHING about web stacks at all. He's been looking for work, and kept getting told that it was a big problem that he didn't know anything about web development. Someone even told him in an interview that he was getting dangerously close to the "point of no return" where his skills were so stale that he'd be unemployable anywhere else.

I spent a lot of late nights helping him come up to speed because he's not a bad guy. Even though he probably should have been doing this himself, the fact of the matter is that he chose poorly and now knows a lot about something that is growing less and less valuable every day. You can say that he's not an "engineer", but that's really just the No True Scotsman fallacy talking. It's really hard to compete on the job market when hiring managers just cut out huge swathes of your knowledge a priori, and the fact that this happens is what concerns me.


Web stack replacing "Java GUI" is no big surprise, and totally dissimilar to the great-grandparent comment that implied that there might be a shortage of low-level C jobs. Python and Ruby have not eaten C/C++'s lunch, as far as I know (well, maybe snacked on it a bit). OSes, browsers, games, embedded... still seems to be lots of call for C/C++. Java GUI was never in that class, so it's not at all surprising to see it obsolesce.


True, but the best engineers are the ones that can adapt to new skillsets quickly.

For example, I'm willing to bet that 99% of the most successful iOS developers had not written a line of Objective-C until 3 years ago. Apple simply provided an opportunity and they embraced it, learning the skills they need to adapt.


It varies based on location. The unrealistic "we require 5 years experience in a technology that's 3 years old" HR people still exist. Even more common is N years of desktop app programming experience counting as 0 years of web programming experience (regardless of platform).


I hope not. I have worked quite a bit with Python/Django, but I am considering diving into C++ due to my recent exposure to the wonderful world of graphics programming (not game development, but simulation and visualization).


Nobody is going to penalize you for being the Django guy who can also hack complex C++ and knows his way around OpenGL/DirectX. Far from it, you'll look like a guy with a truly broad skill-set who can handle damn near anything.

The guys who are having trouble are the guys who decided that the knowledge they needed for their 9-5 C job was "good enough" so they stopped learning. That's what kills careers.


Stopping learning is an issue, but employment reality is more complicated. In much of the country the guy who can hack Django, C++, and OpenGL is unemployable, and the guy who turned his brain off after getting his MCSE can easily find a job. For instance, this is a pretty typical snapshot of who's hiring in my part of the country:

http://cl.ly/2T01382O3v3L3a1x1n2r


Individual locales can have particular niche demands or simply end up as technical backwaters where employability depends entirely on your familiarity with, say, VB 6.

That's mostly, I think, an argument for moving out of technical backwaters if you're in a technical career. If you let yourself be limited to skills that are more valuable in Minneapolis than the major tech hubs, you're eventually going to be stuck with skills that aren't even valuable in Minneapolis, as the "hot" tech hub technologies become mainstream and finally migrate their way out to the backwaters.


No, and COBOL and Fortran experts are being handed blank cheques if they truly are expert (and sometimes even if they're not... how fast can you read? ;))


No, there are still a lot of C & C++ jobs out there. However, COBOL might be another matter...


That was just an example I pulled out of my a^Hhat.


But that's their point. When you have people with no jobs, that counts toward unemployment. Yet I know firsthand how difficult it is to find people to hire for our company, as I do HR for a technology firm. There is high demand, just a shortage of people to fill the positions.


I see an increasing unwillingness of companies to train people who don't have EXACTLY the right skillset. This used to be different in the 90s.


Which, as far as I can see, has led to a vicious circle where developers will often refuse/bitch/complain about working on something that doesn't increase their immediate employability prospects. Not to mention CVs that quite often have half a page or so of solid acronyms to increase the chances of being found by simplistic recruiting-company search engines.


I agree. I think the logic is: "if we train them, they'll go find better paying jobs, and besides, as soon as this project is done, everyone will be laid off anyway." So everyone only wants to hire exact fit.


My experience differs - I think it was in fact the rule of thumb in the 90s... until the dotcom gold rush got sufficiently out of hand that hiring people and training them became the only reasonable solution.


This can also be a function of the density of capable talent where you are. I know in my area it can take a while to get a technology position (it took me 8 months) due to there being very intense competition. However, I did receive more than one offer in an area that I could not afford to relocate to due to other issues (mostly taxes). It seems like there are some great disparities in where the talent is and where the jobs are. The solution would seem to be for someone (people or the company) to move, or to find a way for people to be able to work from a larger area around the business.


I'm sure if the economy continues its downward spiral, these software jobs will evaporate just like many of the others.


That reminds me of how people used to think that games were recession-proof before 2008 or so. Then the numbers came in. Quite a reality check when your sales drop by a third in a very short amount of time. All the big publishers took note.


Depends on the industry it services. I expect the Angry Birds/Farmville -style software shops to collapse at some point.

However, shops that turn out embedded system components & other infrastructural support will keep chugging along.


I'm observing some people trying to break into technology. Yes, there are online tools available. Yes, many of them are free. One of the complaints I hear is that the target platform you choose moves so fast.

Generalized, it means software -- the technology and the economy -- runs at a higher tempo. In order to step into the stream, you have to first match the new tempo.

I suspect that the rest of the economy stays in a slump because they are out of phase, tempo-wise, with software technology ... or that the tools that drive up the tempo of innovation in software have not made it out to the mainstream yet. (And even if they did, they will leave people behind, simply because there are people who will refuse to impedance match.)



