Responding to the Explosion of Student Interest in Computer Science [pdf] (washington.edu)
156 points by tchalla on May 24, 2014 | 194 comments



The problem is that 'computational thinking' has become expected of a university education, regardless of major, but universities haven't shifted to meet the need.

All the other basic requirements of a university education are filled through entry level classes - credit-fillers that students have to take in order to graduate. This structure works out well for everyone: the university ensures that students are getting a well-rounded education (as opposed to a trade-school degree) and maintains its reputation and the value of its product.

The only classes that students can take to demonstrate an ability in computational thinking are CS classes. Entry level CS classes, like those in the other engineering disciplines, are not like entry level classes in non-engineering disciplines. Most students who take ARCH 101 and LIT 101 are not taking them because they are interested in pursuing the major; they take them because they are the best-sounding classes that help them fill their credit requirements. In doing so, they end up with a well-rounded education. In contrast, most students who take EE 101 are trying out for EE as a major.

CS entry level classes traditionally fell into the latter category. Demand has resulted in students treating them as part of the former.

The best thing a university could do is to figure out what skills employers are demanding and offer entry-level classes to fill the need. Structure them to work with large numbers of students, just like all the other 100 and 200 level classes that everyone takes.

Here's the key part: advertise these classes relentlessly, to students and employers. Talk about how the university has seen the demand for these skills and has shifted to focus on them. Become known as a university where every student can get these skills as a part of their education, regardless of major, and most choose to.

Potential students will be thrilled at the prospect of a university where they can get a bankable education without having to commit to CS as a major. Strain on the CS major will go down and prestige will increase - if every student is getting the kinds of CS skills that employers require, then the ones that enter the major must be the hardest of the hardcore.


>CS entry level classes traditionally fell into the latter category. Demand has resulted in students treating them as part of the former.

I was in a funny position with this. In university, I had taken the engineering "Intro to C" class as a first year student, but when I switched to a math major, I had to take the intro level CS classes.

Now, the first year CS classes are required for math, physics and chemistry majors at my school so you'd expect the syllabus to take that into account. The class was almost entirely built around working with the Java API and object hierarchies and writing bloated Java programs to perform simple tasks. It was a huge waste of time, and I didn't have much time to spend on a freshman level general ed requirement. I like to think my whining was one of the reasons why the math department finally made the push to offer a scientific programming course for the rest of the science departments.

I don't mean to bash my school's CS department, either. I took some higher level Systems Programming and Algorithms courses and thought they were fantastic. They just treated their intro courses like software engineering weed-out courses, when other departments required them as general ed courses.


Look at the chart for non-major interest in upper-level CS classes at Harvard: operating systems has no interest; "data science" is surging. Andrew Ng's machine learning class has more than twice the enrollment of computer architecture or software engineering (which is a particularly pathological case, since by definition, nobody else can teach Andrew Ng's class).

This is not general-interest demand. Students are hopping on the bandwagon, and picking the classes that sound like they're the shortest path to a high-paying job. About the only thing colleges can do right now is offer more of the hot classes via lecturers, and wait for the crash. Universities don't hire (lifetime) tenured faculty based on short-term enrollment surges.


"Shortest path to a high-paying job."

Big student loans need to get paid somehow. Too many desperate programmers in the world is a bad scenario, IMO.


Better to have programmers who have learnt something about programming at college. There are too many programmers without aptitude in IT shops who transferred into programming from some other non-technical job in the same company.


Another (possibly complementary) approach is to make "programming for X" classes, the same way they make "statistics for X."

That way it can be a later-year course and use tools and examples relevant to those people. Degrees with a lot of statistics might have a "working with data" class which could teach SQL, R & such. Architects can learn Lisp for CAD. University students tend to learn writing by taking classes that incidentally require writing. Maybe programming could be similar.

"Introductory" classes are always in danger of leaving students with an expanded vocabulary but very little else for exactly the reason you describe whether they are "filler" courses or "major" courses.


I TA'd an entry-level CS 100 class; it counted as a core science credit and covered a good variety of the field of CS in the time it had. The curriculum was something along the lines of: hardware basics, binary math and logic, basic programming, algorithms, computer graphics, crypto, AI. Granted, they didn't get very deep into any one topic, but a few students came out of that class with the intention of switching to a CS major, and the others got more than they had anticipated from the class. Most of the students enjoyed programming in Scratch, except for those that had programmed before. Incidentally, I was blown away by what some people managed to do with Scratch in a couple of weeks.

The professor was recently written up in the NYT for her efforts to get a more solid CS curriculum into high schools, so perhaps this was an unusual course offering.


CSE 142 at UW is not this; it is geared toward CS and ENGR majors. An "entry-level" 100 class geared towards non-majors is quite different.


>credit-fillers that students have to take in order to graduate. This structure works out well for everyone: the university ensures that students are getting a well-rounded education (as opposed to a trade-school degree)

Do you really think the irrelevant classes are necessary at this level though? The student has had all of elementary through high school to be exposed to the broad subject range. By the time you enter university you should know what you're doing. None of my credit filler requirements have been especially enlightening or broadening; mostly just annoying. A Greek and Roman history class I took was interesting to me at least, but I don't see much value forcing another CS student to take it. If I have to take one more English class I might just drop out and start my own company.

Do you really think universities believe in the "rounded education" goal, or do they just know that making you take more classes gets them more money?


> Do you really think universities believe in the "rounded education" goal

This was the traditional role of Universities. It's only recently that they have become pressured to become trade schools and remove anything deemed 'unnecessary' or 'non-profitable.'

Of course, when Universities filled that role, you didn't have a choice about which, as you term them, irrelevant classes to take. Everyone had to have a broader education, with classes drawn from many different disciplines.


Yes, yes, yes. I think good universities (i.e, most of their faculty and administration) still believe strongly in the "rounded education" goal. As you say, historically universities sought to provide a liberal education, regarding subjects that "were considered essential for a free person (a citizen) to know in order to take an active part in civic life. . . . The aim of these studies was to produce a virtuous, knowledgeable, and articulate person." [ http://en.wikipedia.org/wiki/Liberal_arts ]

It's mostly the students now who don't believe in a well-rounded education, who just want to prepare themselves for a job. I'm fifty years old and I've observed the trend towards higher education becoming more and more like vocational education. It began before I was born, but it surely seems to be accelerating. Frequent articles on whether college can be "economically justified" are representative of this. Formerly, there was no question of whether a liberal arts (i.e., "well rounded") education would somehow "pay off". It was something desirable for its own sake, part of becoming a better person.


>It's only recently that they have become pressured to become trade schools and remove anything deemed 'unnecessary' or 'non-profitable.'

They've come under this pressure because of the changed realities of the contemporary job market, combined with the rising cost of higher education.

College is largely an economic decision for the vast majority of people who attend. Unfortunately, some only realize this soon after graduating, unemployable with a mountain of debt.

So, this notion of churning out "well-rounded" students is an anachronism. Companies care about the hard skillset for which they are hiring. Period. In fact, they increasingly want people who are specialized beyond even a degree, let alone caring about what, say, a dev hire knows about biology.

In short, a trade school approach is exactly what we need, as it most accurately reflects the realities of the new job market.


I think this is laziness on the part of employers. Nobody wants to offer training in a new job so they pressure for cookie-cutter grads that they can slot into junior roles.


Also a mark of generally severe weakness in front-line managers of programmers. Training and mentoring are ideals that only work if those responsible for them are capable, vs. e.g. the standard-issue failed programmer who's also generally not very good at management.

As for those able to learn on the job pretty much by themselves: the above type isn't able to judge much about the real work they do, let alone tolerate their not visibly accomplishing much for weeks while they come up to speed.


How is that lazy? Why would a company train you to program?


"In short, a trade school approach is exactly what we need, as it most accurately reflects the realities of the new job market."

Not sure who that "we" you're referring to is. Certainly the trade-school mentality helps the economy grow. Helping populate our country with thoughtful, well-informed citizens leading desirable lives, not so much.


>Certainly the trade-school mentality helps the economy grow.

It helps people to eat.

>Helping populate our country with thoughtful, well-informed citizens leading desirable lives, not so much.

Is a college education at an average cost of $19K/year ($33K for private colleges) and rising the only way to become a "thoughtful, well-informed citizen"? Is it even an efficient way to achieve that goal?

And it's difficult to have any kind of life, let alone a desirable one, when one cannot feed himself or provide for his basic needs.

Higher education itself sends an ambiguous message. "Give us $100K+ and we'll prepare you to participate in the economy and we'll make you a well-rounded citizen". So, which is it? Because these two goals are increasingly at odds. And, how many takers do you think they'd have at $100K for the "well-rounded" bit alone? How many for the "learn an employable skill" bit?

Right. So let's unbundle them and give people a choice.

Look, I'm the first to agree that the manner by which our current economy assigns value and subsequently allocates rewards is terribly skewed. If it wasn't, then college would be free for everyone and we wouldn't be having this discussion. But, unless and until the glorious day comes when we are provided with basic incomes and/or we are financially rewarded for being "thoughtful and well-informed", we need to stop doing our kids (and their parents) a disservice and prepare them for the real world that they'll actually be facing on graduation day+1. This, instead of breaking their backs with debt, only to have them unable to sustain themselves.


"Let's unbundle them and give people a choice."

They're already unbundled, and people already have a choice -- for the trade school thing you want, there are trade schools, 2 year colleges, etc., that require none (or far fewer) of those useless classes for creating an informed citizenry; and a burgeoning explosion of online offerings that are quite good, especially for the price. And there's nothing stopping you from getting into a small liberal arts school and majoring in feminist theory, if you don't want to be tainted by practical skills of any kind. And you can find any position along that gradient, for any dollar figure.

I'd say the issue isn't lack of choice; the issue is that people want contradictory things, and that the 'ambiguous message' is less a message universities are sending than a misunderstanding of what universities are really providing, which is the signaling power of their credential. The practical, no-fat skills acquisition you describe is there for the taking, if that's really what you want. If you're willing to spend the money you can have the best of both worlds at the cost of more time, which is the same as everything else in life.

I don't feel that bad for people who buy a Honda Civic and wish it were a Tesla, or vice-versa. There are certainly plenty of options to get either one, and plenty of information to know which one you're getting. But if you want the Tesla, but only want to pay for the Honda? Too bad.


I had a feeling someone would key on that "unbundle" phrase!

Two year schools and a few online courses aren't yet respected with regard to training in certain fields. In such fields (including software dev, for the most part), employers are generally looking for a bachelor's and, frequently, one in a related field. So, it's not just the university name, but the actual degree.

>the issue is that people want contradictory things

Not really. College is increasingly almost purely an economic choice, and that's the change that people looking from the outside fail to recognize. There was once a time when a person could obtain a bachelor's degree--in virtually any field--and be employable. Any additional personal growth or well-roundedness gained during their stint was a bonus that came at little-to-no cost. They'd get a job with their Women's Studies degree and thus recoup their investment. Not so much now. The wrong degree hurts and can sideline a prospect indefinitely. Likewise, even with the right degree, paying money for ancillary courses that prolong one's tenure hurts. There is, of course, the opportunity cost of an additional 1.5 years or so studying ancillary topics vs. being employed, as well as the actual cost of spending additional time at ever more expensive universities. Meantime, employers increasingly just don't care.

So, if there is any contradiction, it's between universities and employers. Universities continue to require significant amounts of irrelevant coursework to obtain the coveted bachelor's degree, over which they have a "cornered market". Meanwhile, employers just want the core skills for which they are hiring. Stuck in the middle are the students, who ultimately need to take their place in the economy, irrespective of whatever other personal growth they may desire.

Edit: It's also possible (likely?) that employers will eventually look to other sources (beyond a bachelor's) at some point. This might include 2-year schools that you mentioned and other innovative (yet to be popularized) models. However, what I am describing above is the current reality.

But, if and when this happens, you can bet your last money that universities will "adapt" fast enough to make your head spin, as they will recognize it as the existential threat to them that it is. Offerings of 2.5 year bachelor's degrees may then become commonplace.


"There was once a time when a person could obtain a bachelor's degree--in virtually any field--and be employable."

As you make a nod to later, there was also once a time when a student could work their way through a university or college degree (and that includes not piling up significant debt). This was once true of MIT, let alone public schools.

That is now only a nostalgic memory, and with a rate of inflation that comfortably outstrips the CPI, the primacy of economic concerns cannot be avoided.


Exactly. The pressures are dual. On one hand, the costs of a degree are skyrocketing to the point where the value is really being scrutinized by that measure alone.

On the other hand, the increasing disparity between what colleges provide and what employers desire is being factored into that value equation.

So, at a time when universities need to be showing more value for their ballooning price tags, they are actually providing less (in purely economic terms).


The student can still do that at a community college. You just have to take advantage of federal and state aid and the debt won't be that bad.


Having sat through some courses designed to instill "thoughtfulness", and seen people leave being just as stupid as they were before, I don't think it can be taught, at least not in a class. You have to want to challenge your own thinking, and if you already do then you probably don't need a class.

Worse yet if they stay stupid but gain enough vocabulary to make their stupidity sound wise. Regardless, vocabulary is the best that can be expected from these courses.


You'd probably be surprised by the number of non-engineering disciplines that can leverage moderate programming skill these days.

A very easy example is biology. Many biologists may need to develop models or crunch data, and programming comes in handy. An even easier example is physics, one of the original uses for computers, along with finance.

It goes beyond that too. I can't think of a good example at the moment, but any time you see somebody using a hundred different Excel spreadsheets, that's someone who might be able to benefit from some basic programming skill.


Domain specific expertise combined with non-zero programming skills is a potent combination. But it's easier said than done for many, since the ability to solve problems with computers is a particular skill many just can't grasp (difficult both for programmers and non-programmers alike). Depending on your frame of reference this is either unfortunate or a godsend.


It's unfair to highly skilled people, because it would bore the hell out of them. That's how you weed out the top 1% and breed a normalized group of students. I truly understand why some of the super-skilled people I know didn't even consider going to any higher school, but still beat the shit out of the PhDs that I know when it comes to a) knowledge, b) theoretical and practical skill, c) solid maths + formal verification. It doesn't mean that universities suck, but they polarize skills towards industrial demand and forget about the diversity of culture, which they are about to eliminate for the greater good of the industrial market, and which manifests in the number of graduates. But beware if too many graduate, as if that would mean that the university is cheap and will lose its prestige. Professors tend to hate it when too many pass their exams, and oftentimes help to make some percentage of the students fail on purpose, to keep up their "prestige". If every single professor explained things like on Reddit's ELI10 and used interactive media to underline the structure behind a problem, then almost everybody would pass. So, does a high number of graduations automatically mean that the university only teaches low-skill courses, resulting in low-quality students? I'd say no, if they taught students more efficiently.


Georgia Tech considers "computational thinking" to be an important component of a liberal arts education, so much so that it used to require all majors to take the same intro CS class that CS majors took. It was a legit intro CS course taught using SICP with Scheme.

As far as I can tell, all majors still must take an intro CS course, but the school now offers a wider variety of intro CS courses, because many people studying things like international affairs and even many engineers didn't want/need or couldn't handle the rigor/difficulty. So it seems at least some universities have indeed shifted to meet the need (and a long time ago at that).


Well, you can definitely tailor even introductory computational thinking to different disciplines without necessarily dumbing it down. For example, literary majors would probably rather learn in Perl, and "hard science" majors might benefit from spending their time in Fortran or Matlab, etc. Spend more time on data analysis for biologists, perhaps...

Of course it won't be a course ABOUT data analysis or physics simulations, but if you make the projects more germane to the students, they will get more out of it.


An example of a university that is doing what you suggest is Princeton: see Brian Kernighan's course "computers in our world" and the accompanying book D is for Digital. http://www.cs.princeton.edu/~bwk/


How about "Possibly, they see people making ridiculous amounts of money in tech and want a piece of it?"


When I started college in 1982 as a CS/EE double major, the big-money major was petroleum engineering. If I remember correctly, they lost a lot of people who had done well in high school when they hit the calculus-for-engineers classes.


It's just the same now with CS. We lost 75% of CS prospects in the first semester, almost all of them due to Calculus (although I think some of them came in with the malformed idea that programming was just like video games).

I kept in contact with some of the folks who dropped out or switched majors - they're working at convenience stores and the like.


Similarly, when I started in '83, we had a bunch of gonna-be computer scientists at my school. I think about half of those people were gone a year later.

What I don't know is if the proportion of people who stick with it is about the same from year to year, or if the years with massive enrollment increases have much higher attrition rates.

Alas, that graph shows a big peak of CS graduates around 86/87 or so. I didn't get out until 88, after the wave - and the 87 market crash-let. I had to work too much to do my upper division work in 2 years, so it took me 3 - but I got to learn "dBASE" in my intern-like job, which was largely obsolete by the time I finished school. Glad we learned some C in school (even if I liked Pascal better).


Could you please forward me the job ads that have a real possibility of "ridiculous amounts of money"? All the programming jobs I see are about $90k.


Move to NYC or SF. Engineers fresh out of college are getting $90k. Startups are paying $100k - $150k to engineers with 5+ years experience. I don't know if you can call that "ridiculous amounts of money", but it's definitely better than average.

It's worth noting that engineering salaries rise a lot quicker than traditional non-engineering salaries, but they also plateau much earlier. The engineers making $150k with 8 years of experience aren't far from, or are already at, the ceiling of their earning potential (without going the management or architect route).

It will be interesting to see if the continued increases will be sustainable over the long haul. From my non-scientific back of the napkin math it seems like NYC tech wages are up about 30% from where they were 4 years ago.


> Move to NYC or SF

Factor in cost of living and you're really not making much more.


Yes, but factor in all the activity and opportunities and life of the city, and it's worth moving there. Besides, it's not really higher cost if the wage of the industry you're working in scales nicely with the cost of living (true for programming).

I live in Manhattan. Within some 2-digit amount of minutes, I can experience the Tribeca Film Festival, I can visit some of the best museums and galleries in the world, I can get involved in the thriving nightlife, dozens of top restaurants, hundreds of huge retailers, and a massive tech scene. Whether we're talking conferences or weekly meetups or user groups or opportunities like Hacker School, there's just a higher chance it's going on in NYC than in a smaller city. Obviously when it comes specifically to the tech scene SF+SV+Bay Area is the juggernaut, but comparing NYC and SF to some small town is like comparing a top college football player and some NFL players to your local middle school's quarterback. Yes, if I lived in the Midwest, I could get a McMansion and slightly larger numbers in the bank account if my industry's salary doesn't scale well between city sizes (again, programming does), but is it really worth it to give up all the other opportunities and stuff in bigger cities?


Compared to the median income for most of the US ($40-$50k/year), $90k for a new grad is pretty high.


Not to mention that $50K is the median household income. Individuals earn even less, on average.


This is a perfect illustration of why people protest the Google Shuttles in SF.


What? Google has the right to rent a bus as much as its employees have the right to pay for taxis. Or their own cars.


and the Google Shuttle is an illustration of what a CS education is good for


Perhaps if those folks spent their time learning CS instead of protesting, they could actually ride the shuttle.


I thought the further-up comment--where the poster apparently didn't realize that many would see 90K as quite a lot of money--was a better example of this.


This number is very deceiving, so be careful comparing your salary with it. The problem is that the vast majority of the population does not live in the SF, NYC, or Seattle areas, so the median of the entire US population is not comparable for people who live in these areas. The median metric is especially notorious for eliminating outliers like these areas.

Even within these areas, you want to compare your income with the median in your closest neighbourhood of about 10,000 people. In most of these neighbourhoods with a tech population, $90k would probably place you in the bottom 30% bucket. In essence, you will be relatively "poor" and get pushed out to far outer areas of the community where there are either no good schools and facilities, or the commute is ridiculous.

Always look at the median of your target neighbourhood to compare your income, not the national median.


See business school graduate placement numbers too: declines in finance and a huge upswing in tech (particularly the huge web software companies).

Follow the money!

(I think it's wise for students to be doing this.)


The big money is still in finance, unless you manage to build a successful business.


There now exist options in engineering which are comparable with those in finance, and importantly, there is now a credible case that engineering is one of the best median outcome career paths for a bright student who is willing to push whatever lever and turn whatever knob required to get to professional success. Stanford undergrads with their degrees still wet are getting $1X0,000 offers from AmaGooFaceSoft. That's roughly comparable to management consulting or the anticipated returns to early-career finance majors.

When I graduated, back in the mists of history in 2004, you'd be crazy to assume that a 5th year engineer salary package would be approaching $300k. There was a huge salary gap in finance's favor on day one, it got bigger every year, and it exploded once the financiers started to hit their 30s. That gap has narrowed considerably, at least for folks on that trajectory in their 20s. (I'll confess to not having a great understanding of what it looks like to be a 32 year old at Google. I mean, any 32 year old at Google who was with them in 2004 is now a multimillionaire several times over, but I don't know what the reasonable modeling is for expectations for being 32 in 2024.)

The folks who think they're going to be in the top tenth of the engineers AmaGooFaceSoft hires this year, in terms of career success 10 years out, now have a reasonable expectation that it will be as lucrative as a top-tenth finance career.


The gap is closing for the first few years of engineering vs finance (first 3 years of comp in finance is definitely a lot lower now vs 2007, whereas first 3 years of comp at GOOG post 2009 has exploded), but the gap grows as you approach your mid 30's. When you're getting to $300k at GOOG, your buddy is over $500k at GS as a VP (VP is middle management in high finance and iirc the high fliers get there in 6 years). Fast forward a few more years, and you're making $500k at GOOG and the buddy is well over $1MM. The gap will continue to grow in size exponentially thereafter.

I guess $300k vs $500k is much improved compared to $1X0k vs $500k that we would have been looking at a decade ago though :P

In addition, perhaps the parity is better when you compare median results rather than upper quintiles.

Btw, first year management consultants make nowhere close to $100k (though they can make up for it with Starwood points and frequent flier miles on United -- no joke). You'll only top $100k once you're an associate, which is after your 2+2 (2 years as an analyst, 2 years of bschool). Only 3-5% of the analyst class goes straight to associate. Students who enter as an analyst are doing it primarily for the great education and training McK/BCG/Bain give you, and definitely aren't making the choice based on monetary incentives.


What jobs make $300-500k at Google, I assume upper management positions?


Plenty of software engineers make this as well. (you must include equity which is often not reported in surveys)


Mid-career median at Google is $141,000.


Average at Goldman Sachs is $400k iirc.


I keep hearing all these high numbers getting thrown around a lot. Is $300K really the typical pay among, say, 5th year Google engineers who joined out of college? If so that's pretty amazing.


I really wonder that too. Places like salary.com indicate a much lower median.


Does it include equity?


Why should it? Most software engineers don't have significant equity, i.e. equity that creates a cash flow equal to 100% of their salary annually. I only make this comparison since $141k is the median at Google and you are saying lots of software engineers make $300k.


Standard new grad package at Google gives you over $120k worth of equity over 4 years, and I think an additional $30k a year is definitely worth mentioning. For more experienced people it's obviously even more, and if you manage to stay for 4 years, the next stock grant will probably be significantly higher to keep you and your acquired inside knowledge inside the company.


30k != 300k

Do you get approximately 150k/yr of equity in the 5th year?

I think a lot of us are trying to understand the "300k/yr" part.

I wouldn't doubt too much that maybe a few people are getting paid that, but I doubt it's the median.


At companies like Google or Facebook, you expect to get 10-15% salary raise every year for the first few years, and around 10-15% annual bonus. Assuming that after 4 years, your equity doubles, the $300k figure seems to be pretty accurate for the fifth-sixth year of working there.
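A quick back-of-the-envelope check, with made-up but plausible inputs (illustrative only, not actual Google comp data):

  # toy numbers just to show the compounding - not real comp data
  base         = 100_000.0  # hypothetical starting salary
  equity       =  30_000.0  # e.g. a ~$120k initial grant vesting over 4 years
  bonus_rate   = 0.15       # annual bonus as a fraction of base
  annual_raise = 0.12       # somewhere in the 10-15% range mentioned above

  5.times { base *= (1 + annual_raise) }  # five years of raises
  equity *= 2                             # assume the refresh roughly doubles the grant

  total = base + base * bonus_rate + equity
  puts total.round  # ~262,669 with these inputs; 15% raises or a higher base gets it near $300k

So the arithmetic can get there, but it's quite sensitive to the raise and refresh assumptions.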


Yes, so $300k puts you in the top 3%, and $380k puts you in the top 1% of earners nationwide. The median salary of presidents of private Universities is $400k. If you start at a salary of $150k and get 15% raises annually for 5 years, yes, you will make $300k a year, but I'd be very surprised if this is the experience of general employees. Google has 50,000 employees. If you paid them all $100k, that would be $5 billion, roughly equivalent to Google's costs.


Early in my career in software I was making six times what a high end, new Camaro cost. That might be $300 K now.


Early in his career, Bill Gates was making millions of dollars.

I never saw any convincing data on HN to back up the claims - it's always an anecdotal data point here and there. My guess is that a small fraction of the engineers actually make the stated figures, and confirmation bias leads HNers to believe that's the median pay.


Gates was making millions when he licensed DOS -- he was quickly worth $300 million because he owned a big fraction of Microsoft.

When I was making six times what a new Camaro cost, I was just a 'worker bee' on a salary working on mostly military applied math and software around DC.

I didn't think that the income was so great: A new, two story, traditional, center hall floor plan house on a 1/4 acre lot cost three or four times what I was making. So, I didn't really have money enough to buy a house and support a family.

For $300 K in Silicon Valley, there is considerable question if that is enough to buy a nice house, say, 1/4 acre lot, two car garage, full basement, good insulation, central HVAC, four bedrooms, two baths, powder room, walk in closets, LR, DR, eat in kitchen, family room, deck and support a family.


I don't think the "median outcome" for a CS grad is to work at AmaGooFaceSoft, though.


Nor is the median outcome of someone entering finance to be a VP at Goldman. But bottom 50% programmers can still take well-paying jobs with good hours.


I've seen a few surveys claiming that money is the #1 motivator.

People are attracted to high paying jobs not only for a financial gain. It's a signal that employer cares and other good people will be there.


I think it's a lousy signal if you don't at least control for location, job function, and working hours.


It's not a first-pass filter, but once you remove the obviously bad options it's pretty good.


If that's the signal, it's not a very good signal. Finance jobs can be horrible, they don't care for you (they literally let you die), they only care for the top performers who then get promoted in a few years.


True. However, as a graduating CS major, almost every non-CS major I know here and at other schools has taken some programming-focused class or done Codecademy or something. Few if any are looking for what we might think of as "tech" jobs, or are motivated primarily by any promise of money. I can anecdotally attest to the fact that all the reasons listed by the author are playing a significant role.

(As a side note, a fair number of the people I know who are doing straight up startup-style "tech" jobs after graduation aren't even CS majors.)


This is evidence that programming has become a skill akin to literacy. The migration seems similar to the hypothetical situation in which universities and society had primarily been based on an oral tradition of communication, and then this fancy new technology called "writing" appeared (at first only in "Departments of orthography science"). Some of those who majored in "orthography science" would go on to study it, comparing and contrasting the features of phonetic or symbolic writing methods. Many others would learn orthography science because it enabled them to learn and communicate much more efficiently in general.

Who wants to place bets on when country development metrics include "population % who know how to code" ... ? I'll wager no later than 2030 :)


I completely disagree.

Having everyone knowing how to code is the opposite of specialisation. It's inefficient and pointless. If a certain profession needs a piece of software, it's much more efficient for specialist developers to consult with domain experts, write that software, and sell it into the industry. Having every (for example) doctor being able to code adds negligible value, and I would argue the opportunity cost is awful. If my doctor has a choice to become a very mediocre coder or a better doctor, I would hope they choose the latter.

As for the comparison to literacy, there's simply no comparison. Illiterate societies are unable to progress past agrarian economies because illiteracy precludes specialisation, prevents the dissemination of knowledge, and makes it almost impossible to learn new things except by imitation or direct instruction. Literacy is a necessary skill in order to do much beyond dirt farming. Illiterate people can't, for example, even follow a moderately complex checklist. In contrast, I see no negative impact on society if most people can't code.

I don't spend my time studying anatomy; my micro-surgeon shouldn't spend his time studying coding, and I don't see how it would do anything but make him less effective if he did.

A society of part-time coders would do nothing but produce a lot of badly-written toy programs. What's the point?


Someone a thousand years ago would have dictated to his scribe the following:

"Having everyone knowing how to read and write is the opposite of specialisation. It's inefficient and pointless. If a certain profession needs a document, it's much more efficient for specialist scribes to consult with domain experts, write that document, and sell it into the industry. Having every (for example) doctor being able to read and write adds negligible value, and I would argue the opportunity cost is awful. If my doctor has a choice to become a very mediocre writer or a better doctor, I would hope they choose the latter.

A society of part-time readers and writers would do nothing but produce a lot of badly-written texts. What's the point?"

The comparison with literacy is incredibly apt. Even though very few of us moderns are professional writers, we all benefit by knowing the basic abstractions of letters and words. In the future, everyone will benefit from knowing basics of computation. Probably in ways we can't imagine now, just as an ancient Egyptian scribe could not have conceived of "commoners" writing YouTube comments or texting "whatup u?" to a friend.


I explained in my post why comparing coding to literacy is a false equivalence. It's nice to see that you completely ignored that part.


This is an alluring retort, but I have to agree with kimdouglasmason that it has no actual weight. kimdouglasmason explains and gives examples of why literacy has had such a huge impact on the world. You give no examples of how code literacy will have similar impacts. You just make an analogy and say it will affect the world "in ways we can't imagine".

Your argument could just as easily apply to electrical engineering. The whole world runs on electricity, how is it that no one knows Maxwell's equations! Everyone should obtain basic proficiency in applying electromagnetic theory: it will help the world in ways we can't imagine now.


The electrical equivalent of everyone knowing the basics of computation isn't 'understanding maxwells equations' it is being able to wire up a battery and a light bulb.


Many people who have successfully understood Maxwell's equations but were unable to grasp coding would beg to differ.

Over and over again on this board I see people underestimating how demonstrably difficult the concepts are for many people. As my evidence, I again cite the dropout rate of CS1 classes.


You misunderstand me. I don't mean that understanding coding is not as difficult as understanding electricity, I mean that most people should have an understanding of basic coding at about the same level as they understand basic electricity (ie. they know how to do some things like plug things together, can generally avoid starting fires, being electrocuted, blowing fuses, and so on). IOW, I am lowering the bar.


I think people also underestimate just how soul-crushingly dull coding might be for many people. It's easy to assume that everyone will enjoy what you enjoy, but for some people this stuff is just extremely difficult, extremely boring, or both.


Perhaps we shouldn't teach anything at all? Let's just cancel school. Does teaching everyone a foreign language have a huge impact on the world? How about English? Or history? Or Art? Or PE?

How many high school students change the world by studying the periodic table or learning trigonometry?

By your criteria, how can we justify teaching any subject?


I think there's a very strong case to be made for programming as a prominent elective. But the context of this sub-thread is the claim that knowing programming is akin to basic literacy, which is a much, much stronger claim.


That rhetorical device only works if there is a clear parallel. Simply replacing the words does not make for a convincing counter-point, in itself. For example, "If a certain profession needs a document, it's much more efficient for specialist scribes to consult with domain experts, write that document, and sell it into the industry", but most in this industry can't/don't need to read: how will 'the industry' benefit from documents if they can't read? That goes back to kim's point about dissemination of knowledge.


> A society of part-time coders would do nothing but produce a lot of badly-written toy programs. What's the point?

Maybe you are embarrassed by the content of github, but many of us feel it is changing the world for the better.

An Excel jockey writing macros -> better spreadsheets.

A writer who can analyze and restructure their story, without laboriously poring over and cutting and pasting -> better stories.

There was a time when 'writing' was a specialist activity best left to professionals. Imagine if that never changed?

We don't need everyone to be an anatomist, but certainly we should all know enough about our own bodies to feed and exercise them properly. I hope we never live in a world where we only eat what The Nutritionist hands to us, and we only move our bodies in the way The Physiologist instructs us.


I didn't mention github, and don't see it as especially relevant.

If you are an 'excel jockey', then your job involves coding. Of course knowing how to code to some degree will make you better. I don't see that as a good example. You should have used one of the many professions that currently involve no coding at all as an example.

There was also a time when civil engineering was a specialist activity best left to professionals, and it still is. You seem to believe that coding is akin to literacy. I believe it is not, and have provided some reasons in my previous post.


>>You should have used one of the many professions that currently involve no coding at all as an example.

Okay, I'll bite.

My dad is an ophthalmologist in Turkey. He runs his own private practice. When he opened the clinic in the early 90s, he quickly realized that keeping track of patient records in paper format was not efficient. So he taught himself Microsoft Access (which came out in 1992!) and wrote his own patient record management software, which greatly increased his productivity and therefore income/leisure time.

Fast-forward to 2011. I was organizing an international conference and needed to send out some email blasts. The problem was, all the email addresses I needed were on a several hundred page PDF document with no clear format. So I had a choice: I could spend days manually copy-pasting and cleaning stuff up, or I could try to automate it. I ended up writing a Ruby script that grabbed the embedded text and pattern-matched email addresses and spat them out onto a text file. Still not perfect (my regex was rudimentary) but it got the job done a lot faster. Learning basic Ruby took me several hours, and I ran into some (silly) issues [1], but that's still better than the several days I would have spent doing the task manually.
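From memory, the core of it was only a handful of lines - something like this (a loose reconstruction rather than the actual script, with made-up file names, and yes, the regex really was crude):

  # assumes the PDF's text has already been dumped into a plain text file
  text = File.read("attendees.txt")

  # crude email pattern - good enough for a one-off cleanup job
  emails = text.scan(/[\w.+-]+@[\w-]+\.[\w.-]+/).uniq

  File.open("emails.txt", "w") { |f| emails.each { |e| f.puts e } }

Nothing clever, but it beat days of manual copy-pasting.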

You may be thinking, "well, we have specialists for that sort of thing today." Sure. But just like there are many rural areas in the US where it doesn't make economic sense for ISPs to lay out fiber, there are many personal tasks and errands where it doesn't make sense for professional developers to write software for. So there is a lot to be said for the need for individual empowerment through programming.

[1]http://stackoverflow.com/questions/6566884/rubys-file-open-g...


I'm still not convinced by those examples. My reasoning is opportunity cost.

If you have to learn to code to write some software, and then write it yourself, you will generally be less efficient at it than contracting someone else to do so. Furthermore, the contractor has a chance of potentially reselling the software to other clients (thus increasing efficiency even more), which you do not.

If it was cheaper for your father to write his own software in this case, then to me this indicates one of several things: 1. That he had already learned to code for some other reason (possibly a hobby). 2. That there was no opportunity cost because his business was operating way below capacity. If this is the case, his time is best spent fixing that problem first. 3. That there is an under-developed market for software services.

Your father's example also ignores some potential problems. A novice rolling their own DB code is likely to make novice mistakes, increasing the chance of (expensive) data loss or corruption.

I remain unconvinced that for a busy non-software specialist (such as a physical therapist, a welder, and many other professions) writing any software more complicated than a spreadsheet formula or macro is worth the opportunity cost. That doesn't mean that these people will never configure existing software, or write small scripts in DSLs, but I don't consider that 'knowing how to code' any more than I consider being able to remove one of my own teeth 'knowing how to dentist'.


Things you're not taking into account:

  1. Basic knowledge of programming gives you better odds of successfully interacting with specialists.

  2. Specialists are not always available.

  3. Specialists are not always affordable.
Just because there's somebody who's out there that can do it better doesn't mean that's the one true answer--I can damn well try to stop the bleeding before I seek a trauma doctor, to use an example.

In day-to-day life, learning just enough Ruby or Python or Perl to scratch an itch or Access to make a (terrible) DB is quite acceptable.


Well, I agree that non-professionals probably should avoid writing anything complicated, as there is a high risk that they will make bad mistakes.

However, that IMO does not mean that it's not good for essentially everyone to have at least some basic knowledge of programming and CS basics. For one, communicating your needs to a professional is also pretty costly, especially when the task at hand also needs specialized knowledge from your own field: either communicating the requirements is a lot of work upfront, or the project in the first (few) round(s) doesn't end up doing what you expected it to do - not exactly unheard of in the software world. And secondly, even if the problem you need to solve is too difficult to solve with your basic knowledge, that basic knowledge might still be very helpful in communicating the problem to the professional, as you might be able to see what kind of information the professional needs in order to help you. Plus, it might allow you to better select a good professional - the risk of hiring someone who isn't actually all that much of a professional is pretty real, after all, which adds to the costs of not learning and doing it yourself.


You do realize what you did in Ruby that took you a couple of hours to learn could have been done with grep in one line?


I do, yes. I was using Windows at the time, unfortunately. So I wasn't aware of a tool like Grep (or that it was available for Windows as a downloadable tool). Besides, I was looking for an excuse to learn a bit of programming, some sort of practical, small-scale project, so it worked out. ;)


I really doubt that he spent a few hours fiddling with regexps. 'Those darn @ signs! Just how do I match them?!' I suspect the few hours had to do more with the part he mentions about extracting the embedded text from a PDF. (I've seen a lot of PDFs where a standard tool like pdf2text spits out garbage.)


A writer who can analyze and restructure their story, without laboriously poring over and cutting and pasting -> better stories.

How is this related to knowing how to code?


> There was a time when 'writing' was a specialist activity best left to professionals. Imagine if that never changed?

Writing = thoughts put on paper. When writing was a "specialist" activity it was due mostly to technical difficulties rather than people needing a whole new breed of skills. The anatomist analogy is a good one. But if we extend it to IT, it doesn't mean everybody should be able to program. Rather, everybody should know basic computer operation and how to use the most popular programs.


Programming = putting thoughts into computers in executable form.

I think all people should learn how to make computers do their bidding at a rate of more than one click at a time.


More importantly, writing is an important mode of communication in a way programming will never be (and programming shouldn't be expected to be that).

Penpals don't mail source code back and forth to each other.

Thinking knowing how to code is as important as literacy is just another version of people thinking their specific profession is super-important because it's what they spend their time doing. I imagine some especially pompous auto mechanics think knowing the internals of a car is as important as reading and writing. (Though I imagine there are far fewer pompous auto mechanics than pompous hackers.)


Don't stop there. Architecture = putting thoughts in a "virtual building" form. Therefore everyone should learn how to design their own house. Seriously, you're making quite the jump putting writing on the same level as programming.


Instead of looking for the negative impact on society, have you tried looking for the possible positive impact _if_ everyone knew basic CS and programming? I see that those should logically be isomorphic, but I think the way you frame the question makes a big difference as to what you will discover.

I think literacy is a very good comparison. Society certainly did work without everyone being literate, with just a few specialists who knew how to read and write, and there was no obvious need for everyone to learn it, and it is a huge investment of time and effort for everyone who has to go through it, so it would seem like a natural candidate for specialization, wouldn't it? Yet, as you note, essentially universal literacy has transformed the way we live massively for the better, and few would argue that this massive investment of every single person is not worth it and we should return to specialists taking care of our reading and writing.

Now, the question is: What would a society with universal knowledge of CS and programming basics look like? What would future kimdouglasmasons say when they claim that some new skill is not worth teaching to everybody because there simply is no comparison to computationality? How were pre-computational societies fundamentally held back in their progress, what kind of fundamental transformations of society would never have been possible if people had stayed uncomputational?

Your example isn't particularly helpful. Anatomy is not a widely applicable set of skills. There never is any question of whether you could use the help of an anatomist to help you solve an anatomy problem with your new motor design or your new soup recipe. Automated computation, much like literacy, essentially can be applied everywhere. That's why you write "it's much more efficient for specialist developers to consult with domain experts". That sentence only makes sense for universally applicable skills - I doubt many anatomists often consult with domain experts because they need to apply their knowledge in a field they so far have no clue of.

Also, the grocery lists a literate society writes aren't of particular literary value. That does not mean that society isn't better off for people being able to write them. And to appreciate more the works of those who still are "literacy professionals", who obviously still have their place in society.


Implicit in what you've written is the idea that humanity has reached some kind of end state in contemporary modernity. Yet given the accelerating pace of change, when the humans of 200 years from now look back on us, they'll undoubtedly see us as even more primitive and different than humans from the 1800s seem to us today. Future-humans might even lament how poor and stuck we all were, perhaps in large part because so few of us could program.

Illiterate societies indeed tend to not progress, and humanity at the moment has massive problems that it shows little evidence of being able to contend with. You're completely right that existing professions, as currently practiced, benefit little from adding to them an ability to program. Yet we're seeing a massive trend away from many current professions, as currently practiced. Many jobs are getting automated out of existence[1]. The chief culprit? Software.

[1] http://www.economist.com/news/leaders/21594298-effect-todays...


>A society of part-time coders would do nothing but produce a lot of badly-written toy programs. What's the point?

If I may extend the previous metaphor: society already produces fanfiction.net, so I think we'll survive this too.


> write that software, and sell it into the industry

I hope by this you mean start a collective open standards FOSS project to fill the need, rather than write proprietary software that every single firm will redundantly reproduce because nobody wants to embrace the information age. That is the same kind of backwards redundancy that highlights why "everyone coding" is a bad idea.

> I would hope they choose the latter

Medicine is rapidly being transformed by AI, robotics, and automation. In decades we should see the complete replacement of human labor in many surgical procedures, since a programmed machine can be much more precise and interpret the body's signals and vitals with much greater clarity. But do you want doctors, the experts on performing these procedures, writing the code, or do you want programmers who know how to ironclad it against unexpected behavior to write the code? (Trick question: we want both, but the doctors need to be computer literate and the programmers medically literate, though usually the latter is more true than the former.)

I think there is a misconception equating computer literacy with writing software. The latter can give you the former, but it doesn't always happen - I know many SEs who can't point out a hard drive from RAM, and couldn't make heads or tails of problems outside their niche domain. That isn't really computer literacy either, even if they live in a deep crevasse within it.

I think computer literacy, as others in this thread have pointed out, should be its own course - just like English literature. You want to convey the macroscopic concepts, motivations, and overarching conventions, rather than the minute details. The analogy would be: do you care about sentence structure, word density, and subliminal messaging in Lit 101? Well, no, you care about overarching plot and meaning.

Any computer literacy class should have a curriculum teaching its students what code is, but that doesn't mean they need to know how to write it. They should just understand how the machines in their day-to-day lives work, and how to control them if they want to from there. Because so many people are misinformed or just ignorant about what these magical glowy screens of letters and pictures actually are to begin with. It is the difference between being able to appreciate literature and being able to write world-class novels.


> a lot of badly-written toy programs.

That's not necessarily true.

On some panel about Python, one of the participants, who teaches Python to science students, said that after a relatively short amount of training, scientists can do useful stuff with Python, and that resonated well within the panel.


This is another example of a field where coding is relevant to the job. Many current scientific methods involve heavy use of computers for things like simulation, complex calculation, or control of custom built equipment.

I still maintain there are many professions where coding is almost completely irrelevant. Some scientists use general purpose programming languages as part of their work. No (for example) physical therapist does this as part of their work.

Many people on this board also seem to underestimate just how difficult it can be to learn to code for many people who have skills in other areas. People who are studying science are probably the type who are good at coding. If you want to see how difficult coding is for many, just look at the drop-out rates in Computer Science 1.

The general argument that I am responding to is that coding is comparable to literacy. I argue that both in need and applicability, it simply is not. If you are illiterate, the number one thing you can generally do to improve your life is learn to read. I argue that unlike literacy, there are many professions where learning how to code provides less benefit than spending that effort learning or doing something else.


Much of your thesis relies on the following:

> If you are illiterate, the number one thing you can generally do to improve your life is learn to read.

I agree that this certainly holds true today, since our culture now uses reading through and through. But in the history of our species, literacy is the weird exception, not the norm.

For your argument to hold, it needs to have been true when only, say, 3% of the population could read and write (basically, a proportion comparable to today's proportion of people who "know how to code"). This graph (http://en.wikipedia.org/wiki/File:Illiteracy_france.png) suggests that the comparable era may be sometime around the medieval period, and it's not at all clear that your thesis would obviously hold true then, when other skills (hunting, barrel making, blacksmithing, etc.) would be far more immediately remunerative. When books are scarce, and few of one's peers are literate, being literate will do far less to improve one's life.


I agree, but the same claims are true for many subjects taught in school - math, for example.


This is an interesting thought, and I really like how you've expressed it. The metaphor of mediaeval monks to your canonical computer scientist and of the modern university to monasteries is so rich, but you only hint at it.

Considering how relatively recent our cultural conversion from orality to literacy was, and how quickly society shifted, your metaphor becomes even more rich and meaningful.

Shakespeare stands at the border between the worlds of orality and literacy. Kind of makes me wonder who the Shakespeare of today will be in five-hundred years.


Yes. I would love to see programming become as much of a commodity skill as writing, so that real programmers can be like writers who pursue the medium as art and human expression rather than entrepreneurship; and for entrepreneurial programming to be standardized and professionalized the way accounting or communications have been.

And also Theory of Computation moved to mathematics, Bioinformatics moved to the life sciences, and Digital Humanities into the humanities. Only then could CS departments focus purely on the aesthetics of software architecture and on the interpretation of software code as a testament to its creators' socioeconomic influences of the time and to the academic's own present-day critical theories.


"Who wants to place bets on when country development metrics include "population % who know how to code" ... ? "

Oh, an example of how CS people overestimate their importance, how funny.

Dentists, doctors don't need to know how to code.

Teachers don't need to know how to code. Maybe it's better if some of them do, still. The same goes for other professions.

Humanity survived up to the 1970s/80s without computers being widespread. Its problems were mostly solved without them.

Literacy is something different: you can read hundred-year-old English without many issues. It's much more universal than the code being used today.

People should learn how to cook for example, before learning how to code.


Well said. Other things that strike me as more important than knowing how to code, which aren't common knowledge:

1. How our political system works.

2. How the economy works: specifically, what money is, where it comes from, how the banking sector works, and the relationships between money creation, interest rates, asset prices, and incomes.

Most people don't understand these things, and that holds our society back far more than not knowing how to code.


I didn't understand this:

"10% of Princeton’s students are computer science majors

Far more at MIT

10% of Princeton’s faculty are unlikely to ever be in computer science!

Ditto"

It's the last two lines I don't understand. 10% of currently existing faculty are not likely to work in the CS department? Or it's unlikely that in the future 10% of Princeton faculty will work in the CS department?

And the 'ditto'? Far more than 10% at MIT are unlikely to be in computer science?


I think he's saying that "it's unlikely that Princeton will ever have 10% of its faculty dedicated to computer science". Despite the demand, universities want to remain balanced and avoid becoming trade schools.

"Ditto" means "same for UW"; Ed Lazowska is the department chair at UW.


Despite the demand, universities want to remain balanced and avoid becoming trade schools.

Take a look at Matthew Reed's Confessions of a Community College Dean as well. There are other factors at work:

1. Tenured faculty can remain until they die. Almost every school still has tenured faculty from popular areas in the 70s.

2. It can be dangerous to chase trends, thanks to tenure: if the school tenures a bunch of CS professors today, and then something else comes along tomorrow, those profs can remain for 50 years.

3. Mandatory retirement ages were eliminated by the Supreme Court in the late 80s and early 90s (again, see Reed's discussion), which lends extra importance to points 1 and 2.


> Ed Lazowska is the department chair at UW.

He was the department chair at UW when I was a CSE student in 1995 (and through when I graduated in 1998). He definitely isn't anymore (I hope!).

He does hold the Bill and Melinda Gates Chair in Computer Science, but that is something else entirely. The current chair of the department seems to be Hank Levy.


Thanks, you are absolutely correct.


Hard to parse this even as an alum. Maybe it's that faculty position numbers are a very political thing that doesn't increase or decrease too rapidly, whereas student numbers can fluctuate much more quickly?

BTW there is this student/alumni/faculty questionnaire floating around to "think about and foster the role of entrepreneurship at the university", and it seems like the administration is utterly clueless about how to approach things or what they even want to do: http://www.princeton.edu/entrepreneurship/

It reeks of "me too"-ism, but I guess it might be better than doing nothing and sending the same old huge swath of kids into finance and consulting when they aren't gung-ho about those fields in the first place.


Before the dot.com bust, for decades MIT's EECS department had 40% of the undergraduates (there are several degrees there, all requiring some CS and EE; the two relevant ones would be the CS-focused one, 6-3, which few take nowadays, and the one that's sort of split between both, 6-2, which if I remember correctly is now the most common one).

That fell by more than 1/2 after the crash, but as of late, like other schools such as Stanford, it's gone back up significantly, fairly close to the old numbers.

However the Institute has always kept the department's size proportionately much smaller, like 6% if I remember the '80s figures correctly. That's because history is littered with formerly extremely popular fields like aeronautics which at some point crashed and never returned (aero/astro did so in the early '70s). So even for MIT, far less than 10% of its faculty will "ever be in computer science" given your latter interpretation of the phrase and the fact that many EECS faculty are EE types.


We have almost recovered to the number of CS degrees issued in 1985.


I never understood that factoid, and always assumed it was an illusion - we must have manyfold more computer-related degrees now than back in 1985.


In 1982 Time's "Person" of the Year was The Computer ( http://content.time.com/time/interactive/0,31813,1681791,00....) and I can remember an MIT CS professor, I think it was Michael Dertouzos, bemoaning how this would further skew enrollment. (Note also that at that time most of the department's majors were on an EE track.)

Computers were red hot in a way I'm not sure even the dot.com boom reached, and that boom and bust cycle was perhaps longer.


Totally unscientific opinion, but wouldn't increase in CS graduates and programmers mean lower salary for those newly minted graduates?

Or will we never have enough programmers?


Increase in supply, yes, but the question is if the demand will grow more quickly than the supply. If it does then salaries will keep rising (though probably more slowly). If supply growth outpaces demand growth then salaries will drop.

I don't think we really know enough about programming demand growth right now to make a call either way, and we also don't know the scale of the CS graduate expansion.

In any case, this is a terribly simplistic view of things. Not all graduates are created equal, and not all jobs demand the same programmers. Neither programmers nor programming jobs are fully interchangeable widgets.

In reality there are multiple buckets to programmer supply, and multiple buckets to programmer demand. This is sort of the nice way of saying that there is a gradient between "has a CS degree but can't code out of a wet paper bag and probably never will" and "has a CS degree and knows how to solve truly hard problems and has the right foundations to be an excellent engineer".

IMO when people are entering a field purely for the cash, the latter pool won't expand by that much, but the former pool will expand greatly. This will depress salaries on one end of the industry but the impact on the other end won't be extreme. A dramatic expansion in CS enrollment IMO will cause a crunch in low-mid-end programming salaries while leaving high-end programming salaries mostly intact.

My main concern is if these college students are being sold a lie. Schools are holding up Google and Facebook engineers (and their salaries) as templates, but in reality most of them will end up writing enterprise code for a megacorp somewhere for $60K a year, and in fact the bulk of programming jobs in this country are much closer to that than they are to your archetypical Silicon Valley $200K job. This concern goes double for the Hacker School phenomenon, since they smell more opportunistic and get-rich-quick-y than traditional 4-year programs.


Exactly on point, in my humble opinion: people who are in it solely for the money will probably end up in these enterprise (Java comes to mind) jobs where it is not looked down upon if you don't code in your spare time.

And I don't mean to judge; the latter group may have a healthier job-hobby-life relationship than me.


If we keep building software the way we do today then we'll definitely never have enough programmers.


We will never have enough good programmers


Totally unscientific opinion, but wouldn't increase in CS graduates and programmers mean lower salary for those newly minted graduates?

Yes. Unions and associations address this by lobbying and legislating such that someone who isn't part of the union or association can not legally perform the job. This is how they control the labor market to the benefit of those who are already in the organization.


One of the most interesting (and to me) significant points about this is the amount of non-CS majors taking CS classes.

My father, in his 70's, has run several large corporations over his career. We discussed this and he sees no need for "managers to learn to program". His view is that they can just hire computer people. I disagree. My experience is that if you don't know the basics of programming it's hard to know what to even ask for or what is possible. To say nothing of all the time saved by personally writing little half-hour scripts to do this and that.

My opinion is that anyone who is serious about a professional career involving information is very well served to know a bit of programming and handicapped if they refuse. I think maybe this realization is finally hitting home with others as well and this is partially responsible for the trend we are seeing (to say nothing of the perception of vast fortunes possible in startups).


I see this all the time. We have plenty of people that work with computers, but don't program them. Their inability to write a 5-line script, or even realize they could walk into a dev's office, explain what they want done, and have it in their email inbox before they can walk back to their desk, has them poking away in File Manager, dragging and dropping, "oops, didn't mean to do that, where did that go", just wasting their lives away. Well, 'waste' is an exaggeration, but you know what I mean. They are doing something mundane, tedious, and utterly unnecessary, and they always express surprise and gratitude when I offer to knock off a script for them.
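For anyone wondering what those little scripts look like, here's a sketch of a typical one (the folder and layout are invented for illustration):

    # Sort a dumping-ground folder into per-extension subfolders,
    # instead of dragging and dropping files one by one.
    import os
    import shutil

    src = os.path.expanduser("~/Downloads")  # hypothetical folder
    for name in os.listdir(src):
        path = os.path.join(src, name)
        if os.path.isfile(path):
            ext = os.path.splitext(name)[1].lstrip(".").lower() or "misc"
            dest_dir = os.path.join(src, ext)
            os.makedirs(dest_dir, exist_ok=True)
            shutil.move(path, os.path.join(dest_dir, name))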


In relation to "coding becoming the next literacy" and having demographics on people who know how to code...

What use does a mere "learning to code" actually have to most people? How do we measure fluency in "coding"?

Think about it. A crash course on basic procedural constructs like loops, variables, functions and maybe some simple exercises where one makes use of a language's standard library is not particularly useful. Even the conceptual knowledge that one gains is relatively minimal.

It's not like the huge percentages of people "knowing how to code" will do anything meaningful with it. Why? Because they wouldn't have the skill.

Many discussions revolving around compulsory code education seem to completely ignore the tons of domain-specific knowledge one requires to put coding skills to any real use. It's likely that few of the "code literates" will ever contribute to an open source project, for instance.

This is because besides code, you have (potentially): a build system, continuous integration, documentation generator, actual protocols that must be understood (HTTP, DNS, FTP, etc.), use of OS-specific system calls and constructs, general knowledge of memory layout, knowledge of system administration, etc.

And that's merely the condensed, in-a-nutshell version. Code by itself is of little use without actually knowing how computers work in general, which is no small task to comprehend, and which requires tons of domain-specific knowledge about the many abstractions we have created around software and hardware. It's a perpetual learning experience. A person who has learned rudimentary C is of little use in any serious project that requires knowledge of the POSIX and/or glibc API, for instance.

It's a completely meaningless metric in general. Most people will just do some basic imperative coding in a highly abstracted language, with no real conceptual understanding of what actually goes on, and promptly forget about it. Maybe they'll remember how to write a recursive Fibonacci sequence procedure.
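(For reference, that canonical exercise is literally a handful of lines - here in Python, though a class might use anything:)

    # The classic toy exercise: naive recursive Fibonacci.
    def fib(n):
        if n < 2:
            return n
        return fib(n - 1) + fib(n - 2)

    print([fib(i) for i in range(10)])  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]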

Software is more than code. It is complex and multi-faceted. I can see blind coding having some uses for things like VBA macros, but those can be picked up by individuals on a case-by-case basis, when and if they need them.


Sure the interest is high, but what's the dropout rate for people, and what is the rate of people switching to other majors?


Exactly this. At my university there were 2,750 students in the first year. After the first semester only 323 remained, in second year 107, and in the last year 82.

Let's be honest, money cannot motivate you enough to learn for a job for which you have no passion.


What do the graduation rates look like, or is it too soon to tell?


What I'd be interested in, if anyone has a perspective to share, is whether we should be worried? Will the job market change? Will our salaries stagnate or fall?


Experienced engineers will still have an advantage in a job search. Few things to consider:

- First job: Getting the first one is always the hardest - the whole "needs experience, but where do I get experience if no one gives me a chance" scenario. Luckily programmers can have highly visible projects.

- Job markets do change: It's about supply and demand. US programmers make more, especially in the SF Bay Area, than anywhere else. I just read recently that we've hit the point where we're graduating more nurses than there are job openings for them. I think that's a very similar case, where there was a huge push for the profession and now supply exceeds demand.

- The best: I see people complain about visas in a lot of articles. The best in the profession all demand high pay no matter where they come from. I haven't met many highly skilled/sought-after candidates from outside the country who demanded less than their American counterparts.


"First job"

That's an outstanding point, especially in today's economy.

It sucks that a conventional programming career generally ends at age 35-40, but for someone not wedded to a programming career, and who has a modicum of talent and interest in the field, yeah, that'll get them started.

They will then have plenty of time to arrange their first transition into another career, and they will have learned a lot of generally useful stuff in the meantime.


> It sucks that a conventional programming career generally ends at age 35-40,

That's an illusion, created by two factors:

* Many people choose to leave the "coder" job as they age, and mentally it's a difficult field to enter if you never coded before age 35.

* The explosion in the size of the industry means that the number of young novices dwarfs the number of older veterans. You can see this now, as the young novices of the 2000s bust are still in demand now that they are veterans.


So you've not heard of, or do not believe, the claims of rampant age discrimination?

I have my own telling experience there, based on being able to hide my age in my resume and in my looks, and doing the former in the middle of a job search as well as in subsequent ones.


Not sure that age discrimination is rampant, but I do believe (and have witnessed in interviews) that a lot of people don't keep up with technology and then later use age discrimination as an excuse. Experience in relevant technologies trumps almost everything else. Disclaimer: I am in my late 40s.


Does "keep up" require experience that can be cited on a resume?

"Experience in relevant technologies trumps almost everything else."

Hmmm, in my experience, unless it involves a big paradigm shift, such as moving to OO or functional programming, new "relevant technologies" can be quickly learned on the job, unlike being able to design systems, write good code, debug quickly if that's possible, build and debug systems, etc. The sort of stuff that takes more than a few years to learn how to do well.


What? Oh shit... I better start packing...

Maybe it's that the explosion in programming careers coincided with a bunch of kids that left school in the run-up to the last tech bubble? They are mostly in that range. Those are the ones who were growing into the web world (and were more likely to stay technically engaged).

I expect we'll continue to see that age slide a bit, though due to the senior programmer income plateau (and supervisory experience accrual), you also have the move to management effect. Many good programmers will become leads, architects, managers etc.

I have no idea why you would think a programming career ever ends at a particular age. It may also be that people 35-40 don't want to work at the companies, or with the lifestyle, you see around you?


What do you mean by transition into another career? Using your programming experience in a tangential field? Getting into management?


Anything, really (well, aside from areas requiring additional formal education and/or credentials expensive in time and/or money; this is not a path to becoming a doctor, lawyer, scientist, etc.). There are very basic "work skills" that any job requires, e.g. showing up; the object here is to get a "real", career type of job before, say, a couple of years have passed unemployed, after which I've read it's very, very hard to get such a job.

I would also imagine it's easier to transition to something else desirable from a programming job than from retail (which isn't doing well anyway), food service, etc. - jobs for which many of us are mentally/temperamentally unsuited.

When I read/skimmed it, What Color is Your Parachute?(https://en.wikipedia.org/wiki/What_Color_is_Your_Parachute%3...) spent a lot of ink on this. And as has been noted by many, most Americans change careers several times, it's definitely not out of the ordinary.


That sounds like an awful waste of 20 years' worth of skills and experience. I never saw this "check out at 35-40 and do something else" advice given in other specialized and relevant professions, but I can see how it may fit in the context of a jobs bubble where the industry can't sustain all the gold-rushers for their entire working life.


"That sounds like an awful waste of 20 years' worth of skills and experience."

Indeed. But then again, what can you say about a field where "senior" is commonly added to titles after 5 or so years?

If you're a programmer in the US and below the age of 40, I sincerely hope you investigate this before you find yourself only able to get consulting work. Or perhaps embedded, there are those who respect grey hairs in that field. Or government work; that's likely to be only attractive if you get a job requiring a serious security clearance, e.g. TS/SCI or Q, from an organization that's willing to have to mark time or whatever while you get it, then of course stay in jobs requiring a high clearance.

But this has been going on since at least the '90s; it has nothing to do with jobs bubbles - heck, it was strong at the height of the dot.com bubble, when I had my personal epiphany on it (I was 35 in 1996). It's age discrimination. Or partly wage discrimination: young people, and/or those on H-1B and L-1 visas, are cheaper, more malleable, etc. Google has provided one of the more notorious examples, https://en.wikipedia.org/wiki/Brian_Reid_(computer_scientist...


> It sucks that a conventional programming career generally ends at age 35-40

I think this is a myth. I'm 39. I've yet to see any sign that experienced engineers who keep their skills up to date have any issue at all. We just hired someone I estimate is in their late 50's.


Keep an eye on graduation rates. Enrollment numbers mean nothing for your concern. How many people do you think can meet a min GPA after completing core physics, math, and CS classes?


Personally, I think we are at a peak for outrageous programmer salaries. The market is currently correcting itself.


Programmer salaries aren't outrageous. Most people's real incomes have declined substantially, making programmer salaries look high. US GDP is $15.68 trillion (2012), with 146 million people employed (April 2014). That's $107,000 per worker. US median wage is around $27,000.
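Back-of-the-envelope, using just the figures quoted above (not fresh data):

    gdp = 15.68e12        # 2012 US GDP, as quoted above
    workers = 146e6       # employed persons, April 2014, as quoted above
    print(gdp / workers)  # ~107,400 -- roughly $107,000 per worker, vs. ~$27,000 median wage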

Programmers are one of the few occupations that have managed to retain bargaining power.

Compare the lifestyle (e.g. house, vehicle, the ability to have a single-income household with kids without getting into massive debt) a programmer in the Bay Area can buy right now with the lifestyle a plumber could buy forty years ago. The plumber of 1974 comes out ahead.

The only reason everyone hasn't noticed this huge decline is because the inflation numbers are rubbish and everyone has been kept distracted by wedge issues and wars.


> The only reason everyone hasn't noticed this huge decline is because the inflation numbers are rubbish

and a large part is that progress has raised productivity and the standard of living quite a bit, but the top x% have captured the profits from that progress.


Are you implying that GDP per capita and median wage should be the same?


By no means. I believe that having GDP per worker be 4x median wage is socially and politically dangerous, and results in persistent high unemployment because of lack of demand.

The sweet spot is somewhere in the middle. Capital needs a return, but we've seen current levels of wealth inequality in the past and it means bad things for the average person.

Edit: Another way of stating my position is that I'm against extreme income inequality. That does not mean that I am in favour of perfect income equality.


The bosses (figurehead managers) would still make more than programmers. How is that any correction?


Before people start preaching 'coding is the new reading/writing', let's get to high school CS requirements first (like any other common science/math class).

Also, I think there's a sweet spot between a full-on academic CS degree and a level of software proficiency to make someone useful on the job. Whether it be data analytics or basic web development, you don't NEED to know the ins and outs of automata theory or compilers to be a 'coder' with hands-on skills.


The point of 'coding is the new literacy' is that we also need grade school CS standards.


If I was going into college this fall, instead of 15+ years ago, I'd focus on EE, ME, or Physics and just cherry pick advanced CS courses on the side.


If I was going to college this fall instead of 20 years ago I would get a tattoo on my arm reminding me that extra classes and networking pay off better long term than drinking beer and getting laid.

But... I'd probably do the same thing all over again anyway.


Conferences are better for that than college, all things considered, and the networking is better too.


Good point. And, the beer drinking is often at conferences too. The getting laid part...eh...well... Go to college kids! It's important and you will learn all kinds of great things.


In an interview just a couple of days ago, Knuth said something like 2% of the human population finds this new "computational thinking" native. Now we have 10%-plus of graduates coming out of Princeton and MIT in CS!

But what caught my eye is this line: Students are figuring out that all of the STEM jobs are in computer science

That pie chart is probably the most striking manifestation of "software is eating the world".


It is so important to foster and support the growth of the CS curriculum, at all levels but at the university level especially. We need these classes to be well taught, and well funded. If 1000 students all want to learn the same thing, that's a lot of funding to do some really interesting and amazing things. This is not about provisioning more lecture halls!

The way students pick classes is a lot different from the way a consumer spends money. You're already committed to spending a certain amount, and especially if it's not impacting your major, class selection involves a different set of variables. But I'm sure the surge in CS enrollment is at least partially mirrored in a broad increase of CS-ed spending overall.

What leaves me nervous is, I wonder if the tool set is really ready for mass adoption. The most direct effect of massively increased enrollment, and massively increased class size, will be overall more people who know how to code, which would be great, but also significantly lower median coder ability. Which will have a large impact over time.


Do these trends hold in universities outside of the US? Because from where I come from at least, computer science is still as un-sexy as ever.


Didn't this also happen in the mid-to-late '90s, with the first dot-com boom? I mean, I don't think that the proportion of CS majors dropped precipitously after the crash, but the rise certainly seems to correlate with demand for programmers.

It makes a whole lot of sense, really, except that if you started your degree in 1998, you were going to graduate right into a crash.


"[...] I don't think that the proportion of CS majors dropped precipitously after the crash..."

MIT's EECS enrollment more than halved, after being 40% of the undergraduate body for more than 2 decades.


man, I wonder what happened to those guys? I was able to remain employed through the dot-com crash, but I know many people who were obviously better than I was that did not. It seems like it would have been brutal for a fresh graduate with no experience in '01.


A lot of them that were strong in math, getting the mostly EE or enough of both degrees, would have gone into finance as quants (https://en.wikipedia.org/wiki/Quantitative_analyst), which was a desired initial destination for many of them anyway.

Don't remember too much about the mostly-CS-track ones; I think it was fairly grim. I can remember at least one major (Fortune 50 or so) company rescinding offers it had made one year (which obviously poisons the well for some time; one reason Kodak failed so badly in Japan). I couldn't remain employed during that period (I had the particularly bad luck of joining Lucent at the beginning of 2001, just as it began its descent from 106,000 employees to 35,000).


What happened in late 2009? I saw the same inflection point in those redmonk github usage graphs.


There has been a cultural shift lately from "Ha! Those people are nerds!" to "Man, those guys are going to make so much money." I would argue it has a lot to do with this movie:

http://en.wikipedia.org/wiki/The_Social_Network

Also, as a student at the University of Washington, I would like to note that, anecdotally, the quality of CS students has dropped through the floor. I'm a mechanical engineering major, but I often find myself helping my roommate with his CS homework, even though he's probably in the top five percent of CS students here. These people don't even understand basic things like what bytecode is.
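(If "what bytecode is" sounds exotic: Python's standard dis module will show you some in a few lines - a quick illustration, not anything UW-specific:)

    # Disassemble a tiny function to see the bytecode the interpreter actually runs.
    import dis

    def add(a, b):
        return a + b

    dis.dis(add)  # prints LOAD_FAST for a and b, a binary-add instruction, then a return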


Programming is not going to be as lucrative a few years from now. As soon as interest rates rise in a few years the startup boom will cool down a bit and the influx of new programmers will really depress the earning potential of the field.


Probably. But longer term software is only going to become increasingly important. Some refugees from the start up implosion (if there is one) will move on to other careers in other fields. Some will adjust and keep right on developing whatever comes next. There may be a bubble right now. But, it has a grain of truth to it. And that is... computing is changing the world.


Here in southern Europe it's totally different. $90K for a new grad? That's incredible. Here, the majority of new grads earn about €15-25K. Take a look: https://www.infojobs.net/madrid/analista-programador-j2ee/of...

(Says: J2EE Analyst with 3-5 years of experience and a lot of another requirements => €27-30K)


This slide deck is misleading. CS is supply constrained, not demand.

I went to UW and wanted to major in computer science. At that time (1997), they accepted 80 students per year into the program.

80.

Guess how many students applied to the program every year? More than 1500. The supply was/is so constrained they do not consider anybody with less than a 3.9 GPA.

Even today, as per the slide deck, these schools accept 400 or 800 students. How many do you think are applying?


That could be an "it depends". I can believe a lot of public universities have quotas on CS undergrads (and other majors, especially ones that require labs - and until recently, as these things go, that was a big issue for CS), but it's the mark of a good private school that if an undergraduate wants to major in X, they'll find a way. CMU is the only exception I know of; it had a quota of about 135 CS students the last time I checked.

According to this page http://www.statista.com/statistics/183995/us-college-enrollm... , as of 2011 61% of students enrolled in public colleges. That's for students at all levels, however (you can use this page to subtract those, but its breakdown is by gender: http://www.statista.com/statistics/236654/us-post-baccalaure... ).


What is the biggest reason for this growth?


I can speak to a few anecdotal factors having recently graduated from Stanford.

1) The current tech boom. There's the perception of being able to easily create a startup or to get a programming job at a tech company. Some also major in computer science for the technical skills it gives, even though they don't plan to pursue programming as a job. They may want to become a PM, or take a related role without day-to-day programming for which understanding technical concepts is important.

2) Financial downturn of 08. Many students who would have previously majored in economics or finance have chosen computer science due to the folding of many investment banks as well as the increasing negative perception of working on Wall Street.

3) Increase in diversity of the major. The major has become more interdisciplinary. You used to be required to take very "hardcore" classes like operating systems and compilers. Now though with the rise of fields like HCI the major has become more diverse.

4) Social tipping-point. There is a common joke at Stanford that everyone is a CS major. More people majoring in CS means more help from peers who have been through the same classes and can help out. This social support leads to the positive feedback loop of more people entering the major.


#4 is interesting; I've never heard that before. I recently graduated as well, but from a heavily liberal-arts-oriented uni. Most of my friends didn't even know a CS major existed.

> There's the perception of being able to easily create a startup or to get a programming job at a tech company

Yeah, a lot of people believe simply having a CS degree is a shoo-in for any job. However, if you're studying CS you probably want to work at a top-tier corporation or startup, and neither is easy for most grads.


A lot of commenters referred to money being the biggest driver, but I think that's only a half truth. Rather, job security seems to be the most logical reason to study computer science. No matter who you talk to, in any field, there's a huge consensus that having knowledge of and experience with computers increases your employment prospects (I'm sure this could be proved by scanning job boards). Combined with the current job market and the popular opinion that college may not be worth it (especially for non-STEM majors), I'm not the least bit surprised that CS has seen such an enrollment surge.


Anecdotally, my experiences confirm this. My brother, who had never even tried to program before, is currently a CS major. Most of his friends are too. They don't care about startups or startup jobs; they want secure jobs at big companies.


Yeah I think that's an important distinction. A lot of kids are picking up CS as a second major or as a minor because it's believed to make you more marketable for a non-tech field.


Money and prestige.


Love the "have a beer while the students use Coursera" approach. As a student this would be amazing.


A note to poor kids, or kids who don't want to rack up huge student loans listening to some blowhard who didn't stay current and spends a month talking about the history of CS: many of you can learn the employable CS skills on your own, with the help of the Internet, of course.


This type of thing is very unsettling to me as a second-year CS major. I might be one of the highest achievers in my college, but how am I supposed to compete with the thousands of kids graduating from MIT, Stanford, and Harvard? It really scares me.


One of the concepts I have been thinking about for an education system is based on this triad:

- Symbol (mathematical languages / abstract interfaces)

- Human (natural languages / human-human interfaces)

- Machine (programming languages / human-computer interfaces)

Any particular subject or area of study is taught and learned using all 3 approaches taken together. Used as a foundational educational framework, this system could have the potential to help cross-discipline study through the assimilation of fundamental aspects of computer science, cognitive science, linguistics, and math into other fields. It also allows people to skip learning these topics as unpleasant prerequisites that may not be well integrated with their field's particular goals.

Existing universities and schools will not be able to transition to these sorts of new holistic systems. Entirely new systems based on modern first principles must be created for the future of education.


The obvious answer is specialization. Computer science needs to be split, at the very least, into theoretical CS and Software Engineering. From there, both CS and Software engineering will need to contain specializations as well. Software Engineering will splinter into Networking, Javascript Performance optimization, Compiler optimization, etc.


Professionals who talk about this stuff never mention that learning computer science is a lot like learning a foreign language. The people who learn something in their early years tend to have an easier time learning new computer skills as they age. I wish more parents knew this.


I learned programming & CS at age 18 and I'm on par with people I met who learned it at say 11.

I learned English starting at age 5 (I read almost as fluently in English as in my mother tongue, Russian); I learned Hebrew starting at age 10 (I have a hideous accent and prefer to read in other languages); and I tried French in university at age 20 (it was the only time I had to take private lessons to pass the exam).

I'm not sure CS/programming are that much like foreign languages.


I said it's a lot like learning a foreign language. So learning a foreign language when young would likely help you learn to program when older. Your case only helps to prove my point.


> I learned programming & CS at age 18 and I'm on par with people I met who learned it at say 11.

These people have 7 whole years more experience than you, plus the advantage of having gained most of it during a time when their brains were most able to learn and embrace new ways of thinking.

Even though you are making an unlikely statement, it might be true in your case. But I'd bet a litecoin it doesn't hold true if you average it out over a bigger sample set.


I'm similar to the person you're responding to. I think what really matters is whether you've had exposure to critical thinking and logic in your life. For people who haven't had much exposure to math or logic, it is probably much harder to learn CS at an older age.


Programming and graphic design perhaps (personally, I've yet to meet someone who programmed as a child who was better than a good programmer who learned later), but I really doubt there is much of an advantage to teaching your child Depth-First Search.

I don't really think CS is like learning a foreign language at all. The advantage for languages supposedly comes from the fact that we are hard-wired to acquire language at a specific age range ("Critical period hypothesis"). The biggest advantage is for infants, which steadily tapers off until puberty. I doubt we're hard-wired to acquire CS the same way - programming or theory.


> but I really doubt there is much of an advantage to teaching your child Depth-First Search

There is a huge advantage if you can teach it to him in a way that he can understand it, not just memorize the algorithm - which is a big IF - unlike the unfortunate way kids learn to divide and multiply without understanding wtf they are actually doing. It's a basic reasoning skill, even more basic than "basic logic" or arithmetic, and if you can wire it into a very young mind you've changed that mind forever. He will understand much better anything from philosophy to the theory of evolution, because you will have primed the mind for algorithms and logic. It doesn't matter if he ends up an investment bank manager or a politician, he will have learned how to understand algorithmic thinking.

90% of the population are, imho, what I call "mud-minds": they are incapable of deep understanding of logic and algorithmic knowledge; they can just learn instructions and apply them, sometimes to great results, but they don't "grok" them. And when it comes to truly abstract concepts that have no direct equivalent in the real world, they can only recite definitions from books; they have no "intuitive vision" of abstractions. And I think they are like this because their parents and teachers fed them a strict diet of 100% practical knowledge and scientific facts and never exposed them to abstract algorithms, patterns, and processes. (As a different line of thinking, I think "separating the mind from reality" also helps young minds get a better grip on abstraction - like "running" an algorithm or visualizing a pattern as pieces of a fantasy world or the advice of an imaginary companion - though child psychologists would obviously not be happy with such ideas of education...)

By teaching your kid something like depth-first-search (A* search on a game maze map would be even more awesome if you can grab a kid's attention though...), you give him a chance of not becoming such a "mud mind".

True, if a kid is technically oriented, you'll probably have more luck teaching him/her general programming before a bunch of algorithms. But the algorithms are the stuff that really "sharpens the mind", and you don't even need programming to get them; they can be explained in more mathematical or more graphical terms. You don't even need a computer to teach a kid depth-first search, and you don't even have to tell him what it's called or what it is - just try to imprint a deep intuitive understanding of the process and algorithm on his mind.
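(For concreteness, the whole of depth-first search on a toy maze fits in a dozen lines - a sketch with a made-up maze, not a lesson plan:)

    # Depth-first search over a tiny hand-drawn "maze": which rooms can be
    # reached from the entrance?
    maze = {
        "entrance": ["hall"],
        "hall": ["library", "kitchen"],
        "library": ["secret room"],
        "kitchen": [],
        "secret room": [],
    }

    def dfs(room, seen=None):
        seen = set() if seen is None else seen
        seen.add(room)
        for nxt in maze[room]:
            if nxt not in seen:
                dfs(nxt, seen)
        return seen

    print(sorted(dfs("entrance")))  # every room is reachable in this maze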


> personally, I've yet to meet someone who programmed as a child who was better than a good programmer who learned later

True, but then you're already filtering out all the ones that are worse because you preselect for 'good programmers who learned later'. You should compare those who programmed as a child as one group versus those that learned how to program later.


The 'critical period' hypothesis is not the only theory, and I don't believe it's a useful one.

I think you're misinformed if you think brain development can be said to taper off in any way over 18 years. There is pretty clear evidence that it happens in distinct stages.

I think the real mechanism is that learning symbolic logic early creates a type of confidence that adults without it find hard to acquire. Of course, my opinion doesn't really count here since my karma is negative one now.


I think you're misinformed if you think brain development can be said to taper off in any way over 18 years.

That's not what I said, and the advantage you talked about supposedly does taper off until the child is around 12.

(edited for clarity)


[Citation needed]



