Post number 7,819,394 talking about how useful or not useful a CS degree is to a programming job.
Here are some lessons I learned by going to school that I find incredibly useful:
1. How to shut up and get some work done, even if I don't see the point of it.
2. How to shut up and get some work done, even if it isn't directly applicable or valuable to a job or something I want in the future.
3. How to take feedback and criticism from people who know better than me.
4. How to deal with someone I don't like but who has power over me.
5. Lots of other fascinating things that have nothing to do with my job, but that made me happy and appreciative of life to learn about.
6. A handful of things that are pertinent to my job, but I see these as added extras because I didn't go to school for the sole purpose of getting a job.
7. That a lot of people seem to think that the only things in life worth learning or paying for are those that will be useful at a job. This is a sad one.
These are about going to school, though, not getting a CS degree. And 100% of these things could, and probably should, be learned in the real world and not through paying tens of thousands of dollars under the guise of education.
I did not go to school for a CS degree, and that has not hampered me in any way when searching for a job in software.
All I learned in college was how to game the system of the institution itself, and that institutions like that are a societal racket and waste of time and money.
If you mean 'school' as in 'secondary education' then no, they apply to tertiary education too.
And this is going to sound weird if you're an engineer, but the 'real world' of commercial employment is 90% about gaming the system of an institution which is most likely also a social racket.
That's why kickass engineers get paid less than mediocre managers: they spend all their time solving fun technical problems instead of gaming the system.
I thought this was the case but wasn't 100% sure which this poster meant. So in the last sentence where they say 'college' do you think they're talking about secondary or tertiary? It might just be a confusing-to-parse post.
> that institutions like that are a societal racket and waste of time and money.
This is completely wrong. Universities are tremendously useful and do many great things for society and humanity. I'm not going to provide evidence because I think that any reasonable person will agree with me.
While it's clear many university institutes and projects have benefited the wider public, I think he was specifically referring to the value of university to an individual. For most people (excepting those who are studying sciences to enter a specific niche field where they wouldn't be able to pick up the necessary skills anywhere else), university is indeed a waste of time and money. I'm a 22 year old developer based in Australia, I've been working in the web industry full time since I was 18 and been earning money all along. Yet I have multiple friends who did CS etc. at uni, and despite having the formal qualifications I lack, they're earning less than me and have a good 10 years of paying off their education debt before they can actually use their earnings beyond getting by.
> I'm a 22 year old developer based in Australia, I've been working in the web industry full time since I was 18 and been earning money all along.
You're conflating Comp Sci with programming. I think it's pretty clear you don't need CS to be a productive developer, especially if the kind of work you're interested in doing involves building small to medium size websites and CRUD apps using other people's libraries and frameworks. Try to go beyond that however and you will quickly find out why so many people value a formal education.
Well yeah, I'm certainly only doing high-level stuff. If I wanted to do low-level systems programming, OS/language design, or anything with hardware at a lower level than whatever APIs existing drivers provide, then a CS degree would certainly come in handy.
I was more getting at the fact that the majority of people who go to university are pretty much wasting their time and money: unless they're looking to do something that can't be self-taught or learned on the job, they could be gaining the same skills (and maybe even getting paid for it) without putting themselves in debt that they'll spend a decade or more paying off.
For example, all the lessons capote claims to have learned doing CS at uni, I learned on the job while getting paid for it.
I have no doubt that a uni education would benefit anyone. I'm just saying that for the vast majority of people going to uni, the benefits they gain don't outweigh the costs they're ultimately paying, unless they're going to uni to enter a high-paying field which requires a degree to enter the industry at all, such as law or medicine. Unless the career you want absolutely requires a degree, you probably don't need one.
You say "niche field" but in reality we could argue that there are no niche fields as all the information necessary to "self-teach" is probably out there on the internet. What's vastly different is the experience of being taught by someone who has a deep knowledge of the subject, and is willing to share his understanding. While this carries with it the nuances of that particular professor's experiences, it will be much more structured information (and tested, to an extent), and this is a big part of why education at a university is valued so much.
But certainly, university is not the only way to learn. The issue is that there are often too many applicants to jobs already, and recruiters use a university degree as a filter. I think this may be unfair to some people, but the companies also value their time.
Once again, you've completely missed the point. I am trying to say that the value of university has nothing to do with your ability to get a job. I'm trying to plead with people to stop trying to compare university with a job or measure the value of a university in terms of a job.
If you can't see the value in university, good for you; nobody's forcing you to go.
I still think it's worth talking about this though, as so many people DO think it's the only way to go and take on a tremendous financial risk thinking they do it for the job.
I wouldn't contest it's useful, but I'd definitely feel that for most it's probably not worth the price of a US university.
But what we have is an unholy chimera of academia and occupational training where _everybody_ is pushed to go through it, not just the academics. That's where this two-faced mess of inapplicability comes from.
I actually think you should provide evidence, because it's not so obvious. Universities are a business first and foremost. The societal benefit I can see is that of providing scholarships to people in need, and that is a relatively small percentage. You might also talk about the research that comes out of universities, but much of that is largely funded by the taxpayer.
This is a very pessimistic and conspiracy-theorist-ish way to look at it. Yeah, most entities in the world don't operate at maximum efficiency or make perfect use of their money.
Funding to universities in the US has increased a ton by increasing student loan amounts and availability, to try to get more people degrees.
Studies have shown that the effect on universities is that "administrative expenses" have increased to absorb around 90% of the additional tuition funds coming in from loan-bearing students.
The parent is not pessimistic nor conspiratorial. This is observed behavior on the part of the universities.
Yes, 90% is me being a little facetious. I don't know the exact number. But I saw the waste first hand. My college ran large parts of its operations for the benefit of the employees, to the active detriment of the students and everyone else. For example, dining employees were easily paid 2x market wage, probably 4x counting fringe-benefits, while students were forced to pay for a wildly overpriced meal plan that robbed all of the local restaurants (which were all within a quarter-mile of the three dining halls) and supermarkets of business.
They also ran a fancy hotel that charged many hundreds per night, right in the heart of campus. It consistently failed to make a profit for years, and then they totally bungled recent renovations, going 3x over budget and delivering behind schedule. Here's an interesting (and admittedly biased) blog post documenting that sorry saga: http://www.dartblog.com/data/2012/01/009957.php
Throughout the country, tuition has risen many times over in the past few decades, but the money isn't going to hiring more professors. It's going to hiring more and more college administrators and staff, and on loss-making sports programs that go well beyond recreational athletics. Every single type of minority possible had at least one administrative department dedicated to them, each with multiple staff members, and often with physical plants. Many (most?) of these were straight-up indoctrination outfits, pushing critical-theory Marxism on students, teaching them that they're the victims of the white establishment, and fanning the flames of campus protests.
Don't take my word, ask the American Association of University Professors:
>The increase in spending on administrative functions, coupled with a decline in state funding relative to institutional operating expenses, is clearly connected to the continuing increases in tuition prices on many campuses. As we have noted in this report on several occasions in recent years, faculty pay is not driving up tuition costs. In fact, the stagnant salaries paid to full-time faculty members combined with the increasing use of lower-paid part-time and non-tenure-track faculty appointments have been reflected in the lowered relative spending on instruction documented earlier in this section. But don’t just take our word for it. The most recent report from the Delta Cost Project concluded that “faculty salaries were not the leading cause of rising college tuitions during the past decade. Increased benefits costs, nonfaculty positions added elsewhere on campus, declines in state and institutional subsidies, and other factors all played a role.”
DDS is and was a complete fucking scam. Seriously, nobody should make $20/hr for being a lunch lady, not even Ray - and I love the guy, since he does such a great job with Pigstick. Not to mention OPAL and the various other Collis directorates - I'll be honest, I benefited from all of them, since I worked for the A/V tech staff on work-study, but the amount of money that was lit on fire to administrate all of that is sickening.
And before Galileo, any reasonable person agreed that the sun revolved around the earth... And precisely this is the reason why you need evidence to back up your arguments, and not just the easily influenced opinions of other people.
Also, while people like to argue that it's possible to get programming jobs without a CS degree, it is guaranteed to be some value of harder to get those jobs. Especially if you dream of working in less common and more specialized areas, like (for example) working on compilers or doing HPC stuff, etc.
Yeah, in theory someone can get a job without a CS degree. But it'll be a lot harder, and it's very likely that doors will be closed to you that you would prefer be open. It might be unfair, it might be an injustice, but it's still part of the equation. You might overcome it, but it is still a thing to be overcome that was taken on as an additional burden in exchange for losing a different burden (college).
I have no problem with people with passion and talent skipping college and going straight into the workforce. And it's very clear up front what they get out of it: they get an additional 4 years of professional life, and they never rack up student loans. I just object to people claiming that there's no cost to it.
There is a cost: maybe a negligible one, maybe just a little, or maybe a lot if it means you don't get to do the things you really want to do. Or if you never even find out about the stuff that you would have been incredibly passionate about because you were never exposed to it.
By not going to school, there is guaranteed to be a cost in missed opportunities. Maybe those shut doors and lost opportunities are all for things that the person didn't want to do anyway. The part that bites people is that when they are making the decision to skip school, they're probably in the 18-20 age range and most likely do not have the capacity to evaluate what those lost opportunities even are, and whether or not each one is OK to discard as a long term life decision.
I'm not sure. So I'm going to ramble on about myself a bit...
I have a mediocre degree in an unrelated subject (History and Philosophy of Science) from a good school (Cambridge). I learned to program as a child by messing with 8-bit computers.
My first job was very much an apprenticeship: I found a small company that was willing to employ me based on some level of aptitude, a modicum of demonstrable programming ability, and a willingness to work for very little (£12k in '94).
Three years later, I was able to move to a more interesting, still not brilliantly paid (£23k in 97), job at a startup, which was acquired by MS a couple of years later. I spent 14 years at MS, and was a Principal Engineer by the time I left. My experience since then is that there are no doors closed to me.
There are two places where I got lucky, 1) finding a company that was willing to make a bet on me. 2) joining a top tier company through an acquisition.
But with regards to 1: over a 9-month period in '93-'94, I applied to over 100 companies and had three job offers in the end - so it was good luck to find a job based on my resumé at the time, but the jobs were out there.
And with regards to 2, not everyone who joined through that acquisition was successful at MS, regardless of education. Some hated the idea of working for MS, others weren't suited to a large corporation, and many other reasons.
Comparing myself with peers, I find that their CS degrees possibly gave them a 3-5 year head start on me; they could go straight to better jobs. But the value of the apprenticeship-type job is that I was already pushing 10,000 hours (or equivalent) by the time I was ready for the better job.
That's true, and I completely understand, but it's beside my point. My point is to implore people to stop looking at a CS degree as just a means to get a job. If you look at it that way, of course you're not going to find 100% value in it.
Getting a job is simply not what a CS degree is for. It was never declared to be for this purpose—nobody claims it's for this purpose (except maybe some schools for marketing purposes).
Yes, we all know that you can do a good job programming without a CS degree. Good job. Whoopdeedoo. Anyone who keeps repeating this point is a broken record. Anyone who tries to scientifically prove this is wasting their time. That's not what a CS degree tries to be. This isn't why I went to school, and it's not why many, many people go to school.
That's totally fair. But what you need to understand is that the vast, vast majority of young people don't give a shit about anything other than the getting-a-job part of the equation.
Young people need money to buy food, pay rent, and have fun. That's what they care about. I went to a top tech school, but I know so many people at the school who DIDN'T go into tech and are now working as consultants doing 80 hours a week for half of my pay. But they hate their career choices.
And those are the people with the GOOD jobs, because they majored in business or something. I know others who are doing even worse than that, doing part-time tutoring, etc.
All of those people would have been better off just going to a coding bootcamp or something. It's OK if programming wasn't their "passion". Because it's not like those people are currently following their passion working as consultants and part-time tutors.
At the end of the day, getting a 6-figure salary and working normal hours puts you way ahead of the vast majority of Americans. It gives you the freedom to do what you want, when you want.
And it turns out that young people, after they have been in the workforce failing at their dreams for a couple of years, start to realize that they care a lot more about the freedom that an awesome, high-paying job offers them, and not really about anything that college supposedly delivers.
You're right. Which is why I have been pushing to entirely separate the concepts of university education and of a job in the minds of Hacker News readers.
If you want a job, go to dev bootcamp. Or online school. Or watch YouTube videos. Or just get a job. Or go to school then get a job.
If you want to go to school, go to school.
Let's all get along and stop trying to dismiss the other path as invalid, worthless, or dumb. It's everyone's prerogative to go to school or not. Stop comparing the two or drawing associations.
Most developers already have a well-paying job (likely including you); that's why they don't prioritise job-focused education as much as a student who enrolls this year.
Most parents insist on kids going to uni specifically to get a job. If you can get the same knowledge for free (assuming you have to pay for uni), then what's the value?
If college were about knowledge transfer, we'd all stay home and read.
The value is in guided discussions, meetings with your professors, comments on your papers and code, making friends as you finally begin to understand difficult material together in the library in the middle of the night, and 4 years of living in an environment with an anomalously high signal to noise ratio of smart people throwing 100% effort into their collective intellectual growth.
Except most colleges outside the elite (and probably a lot of students at those too) expend more like 30% of effort growing intellectually. There is also significant social growth, generally, but a lot of time at college (probably most) isn't spent on intellectual endeavors.
Yep I agree they have, but that is just a fraction of the purpose they serve today - right now they're just filling the gap until something more effective comes along, like a better version of the technical college.
Uni never should have catered to the massive demand of skilled workers, leave it to the eternal academics I say...
I'm trying to say that I found value in it, and many people do. Like I said in other posts, I'm not going to fully describe this value, because it's up to you to find it.
Just know that it's not only for a job. Universities have been around for a long time and have served many great and noble purposes.
While I agree that 'get a job' should not be the focus of university, that's what all of modern society focuses on, from government mandates, to parents pushing their kids, to employers screening candidates.
The dream of what academia should be is wildly removed from the reality of the modern (from an American viewpoint) university system.
If you can't see the point, then maybe university isn't for you. I'm not going to write an essay on what point I saw in school because I don't have time or interest.
The point is that I saw value in it. I liked learning about computer science. Why can't people stop shitting on this concept?
Good for you for being lucky enough to be able to follow your dreams.
But you have to understand that other people mostly just care about being able to put food on the table, pay rent, and save up enough to support a family and buy a house.
> people mostly just care about being able to put food on the table, pay rent, and save up enough to support a family and buy a house.
I think that's what capote is frustrated with.
College (traditionally, at least in the U.S.) is one of the only spaces that encourages young people to have dreams at all. It provides an environment where becoming radicalized is accepted as a norm, where young people can dream of a life and of a society other than the status quo.
I went to university, and you're right, it wasn't for me. It also wasn't for the vast majority of people I knew that went.
It's great that you saw value in it. I do think you are in the minority. Most people don't like learning about stuff at the same time as working a job, all while throwing their money at the university just to teach them that stuff.
Almost every person that goes to a university is doing it for job prospects. Unquestionably. They think it's the new high school diploma, and in a lot of ways and industries they are correct.
You are an outlier, and that's why people are shitting on this concept.
I have a degree in Mathematics, and I think my educational experience has made me a better developer than if I'd pursued a pure CS degree alone. Does that mean I think everyone who wants to develop software should get a math degree? Not even a little bit. But I also don't think they should necessarily pursue a CS degree either. In the end the answer is "it's complicated, and there isn't one good answer yet", but we don't get from where we are now to a better state by pretending nothing is wrong with the status quo.
Moreover, I question whether each of your seven points is worth about $10k, because a lot of students are paying that much.
You're still, in this post, trying to extract the value of a degree in terms of job power. That's what I'm arguing against.
I paid $180,000 to go to university and I think it was 100% worth it, regardless of my job or how much it helped my job. My selection of 7 points was mainly to make a point, that point being that I see a lot of value in university, many people do, and we should all stop trying to fit all of this into employment, because they have very little to do with each other.
In what world do we live where a university education has nothing to do with employment? Not all of us are lucky enough to spend $180,000 on an education and not care about its impact on our job prospects.
Interesting. I was with you until this post--I'm curious if you haven't wondered, even a bit, if 180k was too steep of a price tag? I'm a fan of university in general, but find the sky-high prices of private institutions hard to stomach compared to similarly-good in-state options.
Yeah... it's definitely a high price tag. I should've probably gone to a state school to save that money. But I still find it worth it because I see what I got from school as priceless, as silly and dramatic as that sounds.
As an aside, I am of the opinion that university should be free. But in the US, it's not. It's super expensive. That's a bummer, but like I said, it's worth it to me, money or not, job or not.
> But I still find it worth it because I see what I got from school as priceless, as silly and dramatic as that sounds.
Okay, that means the answer to your question of why young people get college degrees for jobs can be described in one word: economics. (And that should be obvious given your level of education.)
>because they have very little to do with each other.
Not everyone has $180,000 lying around. Someone has to pay for it and if it's not your parents then you have to do it yourself so obviously people prioritise degrees with well paying jobs and that's how it's supposed to work in an efficient economy. Otherwise if you can spend $180,000 on something without intending to get financial value out of it, you could just as well do something random such as buying a lot of bananas.
I remember specifically taking a few math or stats classes that counted as equivalent credit to Comp Sci classes, and realizing I gained a much deeper and more transferable set of mathematical and conceptualization skills in addition to the traditional comp sci experience I was receiving.
And to think... I took those more gruelling math and stats courses to get out of doing comp sci labs that didn't interest me, so I could work on problems I enjoyed.
#5 and #7 are great; I too dislike the anti-intellectualism a lot of degree seekers seem to have. But the typical tuition these days is a large price to pay for intellectualism. I paid a high price for my CE degree, which I entered in large part for the intellectual curiosity of understanding hardware, knowing full well I'd probably end up in some web dev software engineering job afterwards (which pays better anyway) since I already had prior experience with that; in any case, as long as I had some job in the tech industry I'd be making enough that student debt wouldn't cripple me.
I would only encourage people to make the choice based on a sober calculation of expected costs (including opportunity costs) and gains, rather than because "it's the thing to do". Before I went to my school of choice I planned out the course sequence for my entire time there; of course it ended up being different, but not hugely so. Unexpected things will come up, but that in itself is a useful thing to learn. (#1 and #2 should have been learned by high school. What I didn't expect was that continuing the related path of "shut up and get BS work done" through college would burn out my tolerance for BS work in general, and it's taken a long time to build that back up enough that I can perform OK in enterprisey roles.) If past me had future me as a mentor, I'd encourage past me to skip school and satiate his curiosity more directly, but not everyone has access to a mentor.
I can confirm from direct/life experience that all 7 of your points can be learned/reinforced both on the job and outside/beyond job/college/school.
Folks, just use your brain. Observe, read, study, converse, imagine, think, analyze, hypothesize, apply, try, test, revise, repeat, carry on. It's a good general process regardless/before/after/outside school/college/job. You can spend either $100k and 4-6 years, or $0 and N years -- you get to decide.
People without CS miss out, and also, some CS programs come up short: learning soft skills is as important as any technical chops, as is learning how to learn and realizing you're never really done learning how to solve problems.
This is one of the reasons some self-taught people often have more people skills than those who self-select to hide behind a keyboard for most of their university years instead of getting out there, becoming more well-rounded, and solving problems along the way.
The ability to work in a group, as you've mentioned, is a critical and transferable life skill that many people simply don't grasp. If you can't solve university group problems, guess what kind of co-worker you'll be in your 20s.
Having some structure in the start is important and valuable especially for those who don't have the discipline to do things that need to be done and instead jump from one shiny toy or problem to the next.
At the end of the day it's not just what your education (or lack of education) makes of you, but what you make of your ability to learn every day. Discipline is the master skill that most people, at any age, in or out of school, are missing.
It's funny that you mention working in a group, because that's one area where I've found academia isn't really preparing CS grads. It seems that the most our new grads have been exposed to is an ad-hoc group that was supposed to work together and threw something together which they made work, but with complete disregard for having to maintain the code 1+ years in the future.
They then enter an environment where the group isn't ad-hoc, meaning there's a hierarchy and people, especially new CS grads, get overruled. They're expected to comply with policies and processes that they had no input into. And they're expected to write code that doesn't just work, but is understandable and maintainable by every member of the team.
I find that most new grads are very able to come up to speed on the code and the structure of our project. This likely comes from having a very malleable mind and never having seen a well-organized codebase. But it takes a good two months of almost constant corrections before they're contributing to the team in the right way, and many more years before they understand why that type of contribution is necessary. Teamwork is more than just not writing the entire project yourself, and it feels like CS grads are never taught any of the skills necessary to collaborate in the real world.
I didn't mean to imply CS was intentionally set up to prepare me for group work; it only exposed me to it and forced me to figure out how to get better at it once I recognized it was a core skill. Having extracurricular group activities based around tech helped me a lot. I do think group work and social skills should probably be a course for CS majors, as the field attracts many introverts.
Many people don't have experience with a codebase that's more than a few months old, let alone a few years old... so it always seems the best way to start is fresh and from scratch, because it seems easier to understand that way.
Refactoring, or working with the reality of a codebase, teaches that reading and understanding code is as important a skill as writing it.
The things you listed are pretty much everything you learn on the way to adulthood, which for a lot of people occurs when they move out and go to college, so you might want to consider that.
I worked as a programmer long before the formal education, and the only thing I really learned from it is the one thing that's really needed to make it a worthwhile endeavor: a shallow but wide exposure to the body of knowledge out there. I accidentally reinvented Poisson regression; I didn't realize I had wasted a month of my life until going to college in my late 20s and finding out that somebody had already done the work (and a much better job).
>7. That a lot of people seem to think that the only things in life worth learning or paying for are those that will be useful at a job. This is a sad one.
Where is the money supposed to come from? If you're paying 50k or more for your education you might want to at least break even. If there is no ROI then it's just regular consumer debt which is both bad for you and the economy as a whole.
I may have been lucky, but I don't recognize this statement when I think of my own CS curriculum:
The vast majority of what a programmer does, forming good abstractions and avoiding complexity, is almost completely untouched by computer science curriculums.
That's the majority of what my CS degree covered, in several different programming paradigms. Even the idea that there are "programming paradigms" is something I learned at university, from a great course called Comparative Programming Languages. Granted, it's one thing to study it and another thing to gain years of experience doing it, but it was certainly a big focus. I took a whole course on design and structure of OO programs, which mainly at the time focused on C++, but also covered some alternative OO models and what pros/cons various people have claimed for this style of abstraction. And another one on design and structure of functional programs, which in the case of the one I took was mainly from the ML family perspective. These had some more "theoretical" content as well, especially the functional-programming one had quite a bit on type systems, but the overarching goal of the courses was abstraction and structuring (and even the type-theory stuff was introduced as a means to the end of useful programming abstractions).
When I read these kinds of summaries of computer-science degrees, I get the impression people went to some kind of program that was 100% algorithms courses, proving big-O running times and termination properties and that kind of thing. I thought I went to a pretty theoretical one (vs. one with more of a software-engineering bent), but it was maybe 20-25% algorithms. Where are people going where they are taking nothing but Algorithms 101 through Algorithms 1601 for four years?
I do agree that the standard tech company interview process seems to mainly test the algorithms part of the CS curriculum for some reason. I think tech interviewers love big-O notation tests more than actual CS academics do.
I went to an Ivy League college with a notable connection to BASIC, and I didn't write a line of code for my algorithms and data structures classes, or for my AI class, for that matter. I wrote a metric shit-ton of proofs, though, and I can say with 100% confidence that that was a complete waste of time.
I also wrote a metric shit-ton of proofs in my CS graduate education, and have that to thank for a greatly increased capability for logical thought over my undergraduate self.
I really view that algorithms class as a massive, massive wasted opportunity. There are really two aspects to that kind of subject: the theoretical underpinnings, and the practical applications. We beat the ever-living piss out of the theory, but I doubt anyone who didn't already know how to do it came out of that class knowing how you would actually code a linked list or a graph, and what the trade-offs of various implementations were. That kind of subject would be such a rich field for a course in low-level C or Rust or [insert manual-memory-management language].
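For what it's worth, here's a minimal sketch of the kind of thing I mean (in Python for brevity rather than a manual-memory language, so the allocation story doesn't show, but the structure and the trade-off against arrays do):

    # Minimal singly linked list: O(1) insertion at the head, O(n) search.
    # Trade-off vs. a dynamic array: no shifting on insert, but no O(1)
    # indexing and poor cache locality when traversing.
    class Node:
        def __init__(self, value, next_node=None):
            self.value = value
            self.next = next_node

    class LinkedList:
        def __init__(self):
            self.head = None

        def push_front(self, value):
            self.head = Node(value, self.head)  # O(1), unlike list.insert(0, x)

        def find(self, value):
            node = self.head
            while node is not None:             # O(n) pointer chase
                if node.value == value:
                    return node
                node = node.next
            return None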
I see complaints about big-O notation being a useless interview torture device all the time, and yet I'd be hard-pressed to find something else in my very theoretical CS education that (a) was easier to learn (it's the most hand-wavy math I've ever encountered), (b) I retained as well, or (c) is as useful to me as a full-stack web developer.
I can't speak for others, but knowing when to use an array vs. when to use a set feels pretty fundamental to me being able to do my job well.
>I can't speak for others, but knowing when to use an array vs. when to use a set feels pretty fundamental to me being able to do my job well.
I'm not sure why you assume that the only way to learn this is through a CS degree. These are things that can be quickly learned by reading a few articles online.
I also don't see why one would necessarily need to reason from complexity arguments, rather than more intuitive "I'm using a set because I want to use set operations like union and membership". If performance is crucial, reasoning by complexity analysis is not sufficient and often not even necessary.
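For example, a minimal sketch of what I mean (Python; the scenario and names are invented):

    seen_ids = set()   # membership test is O(1) on average
    id_list = []       # membership test is O(n): scans the whole list

    def is_new(user_id):
        if user_id in seen_ids:    # stays fast no matter how many ids accumulate
            return False
        seen_ids.add(user_id)
        return True

    # The same logic with `user_id in id_list` still works, and reads just as
    # naturally, but degrades quadratically over many calls. Here the intuitive
    # "I want set semantics" reasoning and the complexity reasoning agree.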
This becomes tricky when you're using a language like Ruby where arrays also have built-in union and membership operations.
And I don't see why, if performance is critical, knowing which operations/techniques/algorithms are orders of magnitude slower than others wouldn't be useful (it has been for me).
With higher level languages it's even less important to know theoretic complexities, just represent the problem in the most natural representation and optimize later. (You're already giving up orders of magnitude in exchange for dev time by using a higher level language.) Good engineering practices like being able to swap out structures and algorithms later matter a lot more.
Operations/techniques/algorithms are great to know when performance is critical; the subset knowledge of worst-case time and space complexities, not really. (And in interviews, do you ask for best case or average-under-certain-probability-distribution cases? How expansive is the trivia matrix you get candidates to fill in?) They can sometimes help guide, but they'll just as often mislead.
Bubble sort and insertion sort are both n^2 algorithms, but you should never use bubble sort, and if you never profile you'll be confused why my quicker quicksort uses insertion sort inside when n is small, since quicksort is n * log(n) so it should always be better, right? (Conveniently ignoring that the worst case for quicksort is n^2, because people usually just memorize the n * log(n) factoid.) Or you'll be confused why I don't use quicksort at all if I have multiple CPU cores to build a parallel algorithm on.
Throw in other aspects of modern hardware like caches (try swapping the loop order in code that needs to loop over all cells in an NxM matrix and note how much performance can suffer if you do it wrong), branch prediction, and special instructions your CPU architecture supports; those are very good to know too if you care a lot about performance. For complexity analysis in the real world, "constants matter", but even that doesn't capture all of the caveats.
Edit: one interview problem I like to give (given I have to give one and can't do things my preferred way) is the jump game: https://leetcode.com/problems/jump-game/ Rot13 commentary: Bar fbyhgvba vf whfg qrcgu-svefg frnepu. Va Clguba zl pbqr sbe gung vf 12 yvarf, vg hfrf n fgnpx (jvgu n abezny clguba neenl; vg pna or n dhrhr gbb vs lbh whfg punatr gur nethzrag gb .cbc()) naq n frg, ohg lbh pbhyq whfg hfr n cynva neenl sbe rirelguvat. V yvxr guvf ceboyrz fvapr vg yrgf zr frr vs gur crefba xabjf ubj gb nccyl bgure fgehpgherf orfvqrf cynva neenlf. N ybg bs crbcyr pna erpnyy gur rkvfgrapr bs frgf naq dhrhrf naq znlor svyy va gurve pbzcyrkvgvrf bs PEHQ bcrengvbaf ba n zngevk ohg pna'g npghnyyl nccyl gurz be nal nytbevguzf orlbaq ovanel frnepu naq gur onfvp PEHQ vzcyrzragngvbaf sbe rnpu onfvp fgehpgher.
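To make the quicksort point above concrete, here's a minimal sketch of the hybrid I'm describing (Python; the cutoff of 16 is a made-up placeholder, real implementations tune it by profiling):

    CUTOFF = 16  # hypothetical; the right value comes from profiling

    def insertion_sort(a, lo, hi):
        # O(n^2) worst case, but tiny constants on short runs
        for i in range(lo + 1, hi + 1):
            x, j = a[i], i - 1
            while j >= lo and a[j] > x:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = x

    def quicksort(a, lo=0, hi=None):
        if hi is None:
            hi = len(a) - 1
        if hi - lo < CUTOFF:
            insertion_sort(a, lo, hi)  # the "slower" algorithm wins here
            return
        pivot = a[(lo + hi) // 2]
        i, j = lo, hi
        while i <= j:                  # Hoare-style partition
            while a[i] < pivot:
                i += 1
            while a[j] > pivot:
                j -= 1
            if i <= j:
                a[i], a[j] = a[j], a[i]
                i, j = i + 1, j - 1
        quicksort(a, lo, j)
        quicksort(a, i, hi)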
I'm not assuming this at all. By all means, learn via non-traditional routes! I only meant to push back on the parent comment's framing of big-O complexity as useless theory.
I got my degree in Computer Engineering rather than CompSci or Software Engineering at Waterloo, which is sort of a hardware/software hybrid program. In hindsight that was probably the wrong choice for me. I quickly found out I really didn't enjoy working with hardware, and that the software side of the program was woefully inadequate compared to some of the better CS programs out there, precisely because we had zero exposure to paradigms that were not imperative/OO programming.
My friends in CS at Waterloo got to work with Scheme in the first year for their introductory programming course, when we were using C# and Java and the like. I was young, stupid and arrogant at the time, and made fun of them because I dismissed Scheme as some weird language nobody would ever use, but now I really envy the CS grads for their clearly much more well-rounded and better thought-out software curriculum.
The saving grace of the Computer Engineering degree at Waterloo is of course the six 4-month co-op terms (internships) sprinkled throughout the 5-year program, where I gained valuable real-world work experience and finally acquired an appreciation for functional programming. But there's a large degree of variance in students' experience of the co-op program depending on the jobs they can find, so not everyone is so lucky.
Well, on the other hand you hopefully have a deeper appreciation of all the assumptions and approximations that make up the implementation of boolean logic underlying our computers.
Sounds like the kind of course that I would have wanted (I dropped out of mine). The sad thing is that they get squashed under the weight of students and industry railing against learning 'theoretical' stuff because it seems like it isn't 'real world' enough. On the contrary I've found my self-education in type theory, programming languages, and discrete mathematics hugely beneficial in helping me pick up new technology fast, and finding the right abstractions for the job at hand.
I probably covered more on networking than algorithms. I only did 2 papers on algorithms at university, as well as AI, which covered some more algorithms.
I did 2 papers on networking, a paper on computer security, and a paper on web development, and I implemented an HTTP client for Operating Systems.
I had 2 classes of data structures + algorithms, a couple more of algorithms and complexity analysis, and algorithms were heavy components of the AI and robotics electives that I took. I had one networking class, also an elective. I had a similar programming language comparison class as the poster you responded to, along with OOP design and a course on software engineering practices.
There were some other math and logic-related courses with some proofs and such, but almost all of that was backed by writing software projects. I didn't write many papers in my computer science classes, but I did a lot of projects, both individually and in groups.
It's interesting to read how different other people's university experiences were.
So I did a 4-year computer science degree about 10 years ago. It's really hard to understand why anyone would bash a degree. What I still find "relevant to practice" from then is:
- Any good CS curriculum is basically a calculus, statistics, linear algebra, real analysis, combinatorics, discrete mathematics course with a bunch of CS modules thrown in on the side
- All the computational intelligence stuff we did (particle swarm optimizers, neural networks, classification algorithms, data mining, ant colony optimization, genetic algorithms, etc.) was cool but not really mainstream back then. It's all the rage right now. I'm really glad I have a good theoretical background in all of it; it makes it so much easier to jump back into machine learning and analytics. The math I did makes it easier still.
- I use a lot of the "full-stack theoretical understanding" from my operating systems, networking, compilers, and distributed systems courses to reason about why things could possibly go wrong in production performance problems on large distributed enterprise applications, and how to isolate where the problem is. I do this on a daily basis and am amazed how many professional programmers don't understand what's going on "under the hood". It's almost a Dunning-Kruger situation where people don't even know what they don't know.
Then this doesn't even begin to mention things like formal methods, software architecture, computer graphics, database theory, etc. Oh, and the philosophy courses, which really went into logic and critical thinking.
A CS degree is completely mind-expanding if you get it from a good institution.
Huh. I'm a 2nd year undergrad atm, and came across an instance of the nurse scheduling problem at my on-campus job. I did some research and delved into evolutionary computation, eventually deciding on genetic algorithms to solve my schedule problem. (I'm using a self-adapted version of NSGA II)
Anyways, all my research has gotten me very interested in all the topics you mentioned, also neuroevolution. I actually brought up neuroevolution and optimization problems to my professor to get his insight, but he wasn't able to help me out much.
I keep finding that people state computational intelligence was all the rage about 10 years ago, and can't find much work on what is going on with it now. Can you provide me with some more info in the field?
I can only point you to the book we used at the time: Computational Intelligence: An Introduction http://jasss.soc.surrey.ac.uk/7/1/reviews/ramanath.html Although there were many class handouts and journal papers we had to read too.
Thanks - at this point I have looked through as much material on the subject as I need, since my project really only involves genetic evolution and multi-objective optimization. Time to dive in!
People don't understand what computer science is. Don't claim you did CS and it didn't teach you to become a developer; that's not what CS is for, and I'm surprised that after a bachelor's and a master's the author doesn't acknowledge that.
> well, computer science should be MUCH more maths heavy than it is, and software development should be taught as a trade.
Up until recently, the vast majority of Computer Science programs were under the Mathematics department. At many colleges, even to this day, they are still under the Electrical Engineering department.
Interestingly enough, I think the misguided constant push to make CS degrees more and more industry-relevant makes them more and more worthless. We have academics going to industry conferences asking what kind of developers they should churn out, and students whining that they aren't 100% ready to drop in at a company when they graduate. So schools are pushing the teaching of CRUD apps in Java, or "a class on the newest hottest mobile app development platform", or "version control" -- all stuff that should be fairly easy to pick up in a day or on the job for someone who knows theory (version control), or that has no place in a university (the rest) -- when they should really focus on the hard stuff, like CS theory (PL, theory of computation, distributed systems, compilers, algorithms, concurrency, mathematics), which is MUCH more difficult to learn outside an academic environment, is universal, and translates to every language and development paradigm, whether it's now or 50 years from now.
>students whining that they aren't 100% ready to drop in at a company when they graduate.
I think the problem is not with the students. The problem is that the companies expect you to be 100% ready to drop in and the only education you can put on your resume comes from a CS degree or previously having a job in the industry.
As someone that had a highly theoretical computer science education, worked in the industry for three years, then returned to academic CS for summer research, I have a few thoughts on this.
1. A highly theoretical CS foundation is still very useful. One of the hardest classes I've ever taken in all my studies was an upper-level proof-based algorithms course. Nowadays I can glance at an algorithm and give you its runtime complexity. This is actually an important skill, because it influences how you construct your own methods and such. There were quite a few times at my old employer where I'd be like, "If we just switched this out to a hashmap, we'd have a constant-time solution instead of this polynomial one," or something of the sort (see the sketch below). Furthermore, that algorithms course changed how I thought about computing (and life) forever. Computational complexity theory is simply wondrous. I'm in med school now, and I will tell you, learning about human biology through the lens of a computer scientist is a truly enriching experience. Computation is everywhere!
2. Your everyday programming jobs probably don't require that deep-cut CS coursework BUT there are plenty of specialized jobs that do. I can think of many sectors where you need to be fluent in something like graph theory and ways to traverse graphs, or statistical methods + machine learning. Not all jobs are full-stack dev jobs. Undergraduate CS is supposed to lay a solid foundation that you can build your career on. It's not vocational training. You have no idea where you'll end up! And honestly, I think there's plenty of hindsight bias going on here.
3. Man, oh, man, people in academia suck at software engineering (not to mention product design). This isn't their fault! They haven't worked on a large codebase before. They don't understand git-flow or continuous integration or test-driven development. Their products need to get A's, not sell to a consumer. I will say, coming from industry back into an academic environment has been a bit of a culture shock. Tools used, architecture, even ways of speaking about the software are often completely outmoded.
All in all, I'd say both are very desirable. I'd recommend self-taught programmers take some classes on the more theoretical/mathy side of computer science. And I'd recommend that the academically-minded CS seniors FOR THE LOVE OF ALL THINGS GOOD take a year off and work in the industry before pursuing grad school.
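To make point 1 concrete, here's a minimal sketch of the kind of hashmap swap I mean (Python; the scenario is invented):

    orders = [("alice", 30), ("bob", 25), ("alice", 10)]
    names = ["alice", "carol"]

    # Polynomial: rescan every order for every name -> O(len(names) * len(orders))
    totals_slow = {n: sum(amt for who, amt in orders if who == n) for n in names}

    # Linear: one pass builds a hashmap (dict), then each lookup is O(1)
    by_name = {}
    for who, amt in orders:
        by_name[who] = by_name.get(who, 0) + amt
    totals_fast = {n: by_name.get(n, 0) for n in names}

    assert totals_slow == totals_fast == {"alice": 40, "carol": 0}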
I wonder if part of the problem is some universities watering down core, foundational, beautiful CS to try and make it 'practical'. They end up with a course labelled 'CS' that is actually a poorly implemented, behind-the-times, irrelevant software engineering course: the worst of both worlds...
Oh, I agree. My school tried very hard to NOT do this. I remember I was concerned about the lack of "practical" CS education, but alumni reassured me that whatever I needed to know after graduation would be picked up very easily. They were right.
A degree in literature doesn't teach you how to be a best-selling author or get a job writing TV scripts. On the other hand, becoming a good programmer is much more difficult when you don't understand the fundamental inventions upon which everything else is built. You're limited by what you see in front of you in Stack Overflow answers and blog posts, with no insight into the whys, only the hows. Self-taught programmers who progress beyond simple web pages and CRUD applications will undoubtedly come across a lot of computer science on their journey. Maybe you don't have to study Milton to be the next Stephen King, but your life will be a lot richer for having done so.
"I hope to see the industry improve in this respect, but in the meantime I'm happy to exploit this imbalance as a competitive advantage."
I wonder how much of requiring a computer science degree is simply because the people hiring have computer science degrees. Seems like a self-perpetuating insiders club.
I have a CS degree and would value it not at all when judging a candidate for a web/mobile dev job.
It's interesting that this story appears at the same time as an article about how memory games designed to improve your intelligence only improve your memory within that memory game - not in broader contexts.
Chess masters are no better than beginners at memorizing chess boards with a more random arrangement of the same pieces.[0]
Expertise in humans seems to be highly specialized.
So then why do we expect algorithmic knowledge of a CS degree from people writing modern crud web apps?
Conversely, I don't have a CS degree and I take a rather cynical view of candidates who do. Perhaps I've created a self-perpetuating outsiders club at my company. :)
In all seriousness, though, it's because the CS majors I've interviewed have (generally) been great at theory, crap at application. I once had a guy hold forth for 10 minutes about cardinality and then break into a flop sweat when I asked him to explain the difference between a "GROUP BY" and an "ORDER BY".
I am also not impressed by a lot of CS majors I interview. I often ask them to come up with a simple sort algorithm where performance doesn't matter or to reverse an array. A lot of them have problems with that. I am very tempted to try fizz-buzz...
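For reference, the sort of answers I'm hoping for (a minimal sketch, assuming Python; any equivalent would do):

    def reverse_in_place(a):
        i, j = 0, len(a) - 1
        while i < j:              # swap the ends and walk inward
            a[i], a[j] = a[j], a[i]
            i, j = i + 1, j - 1

    def selection_sort(a):
        # O(n^2), which is fine when "performance doesn't matter"
        for i in range(len(a)):
            m = min(range(i, len(a)), key=a.__getitem__)
            a[i], a[m] = a[m], a[i]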
As someone with a computer science degree, I can say I've never questioned it being on my company's job postings. But when I think about it, I'd rather see a candidate with an excellent GitHub account than a degree.
But that means you're heavily biased against programmers that may be very capable and successful but don't work on open source projects. That reduces the catchment size considerably.
I also agree with the article's author about giving candidates a take-home project to work on. I'm just saying that a computer science degree and a 10-minute chalkboard test are a pretty terrible way to make new hires.
How long of a take-home? In my current job search, I think I've finally hit my limit of overly burdensome take-home exams that significantly disrupt the childcare and exercise routines that I depend on. I think any take home that takes more than 2 hours of my time should require the company to compensate me at a fair hourly rate for completing the test for them, otherwise, given how many companies do not place this asymmetric burden on candidates, it's just not worth it.
I also think many of the same biases and inefficiencies that haunt terrible whiteboard hazing interviews still apply to take-home tests. If you say something like, "write this as if it's going into production" it's a disaster -- whatever qualifies as good enough for production is a subjective opinion that varies from organization to organization -- and often varies considerably within an organization too.
A great programmer pressed for time might slap something together and add some notes in a write-up about the extensively different way they would work on it if they were being paid a wage, given quiet working conditions, and tasked with it in the regular setting of a job. And would likely get rejected, despite being a great programmer.
Meanwhile, a candidate coming from a situation with hardly any time constraints, perhaps taking time off between jobs or during a university break if still in school, might spend 20 hours on something that should only take 4 hours, polish the entire thing, write a full test suite, and implement extra features. They might be more likely to be hired even if they are an inferior programmer, simply because they had access to significantly more free personal time to fit it in.
It can even be prohibitively hard just fitting in round after round of technical screens and 1-hour phone calls that probe your experience or ask you to solve short form problems, especially if you're interviewing with many places and they all want you to do that.
Making it take-home sounds better in theory, but there still are a ton of failure modes that people don't adequately account for.
"However, I also have to acknowledge that the scale and difficulty of the problems I've worked on is unusual compared to the vast majority of programmers"
It's likely that the vast majority of programmers wish they were working on these interesting and difficult problems, but instead they are writing yet another ETL transform for an ERP system (not that there's no value in that, but that was my job, and it was repetitive and dull).
So even if a Computer Science degree isn't necessary for the vast majority of programming jobs, without that degree, you can only do one of those vast majority of programming jobs, leaving the really interesting problems to those with advanced degrees (or special domain knowledge)
I'll take domain knowledge over any degree every day of the week. In quite a few cases, we're still figuring out what the domains are or they're domains that academia doesn't care about.
More than that, it's all about how much of a self-learner someone is. This industry is so new and changes so rapidly that if you aren't feeling that itch to understand some new aspect of programming, you're going to be left behind. The really good programmers I've seen have all been full-stack, not in frameworks, but in a complete understanding of how a system functions top to bottom.
Also quite a few CS people would do well to examine EE or a proper engineering track. I've always laughed at the title "Software Engineer". Very few are, most of what we do is just throwing something against a wall to see what sticks.
The value of a computer science degree does not lie in the number of direct applications you will encounter but rather in the school of thought and way of thinking it molds in you.
Often, people take for granted how differently you approach problem solving coming from a CS perspective. If anything, I have mostly gained the ability to break down problems into the smallest pieces possible that work together, thus reducing complexity and shaping the problem case. I can isolate core problems and, through algorithm design and implementation, evaluate the best solutions. I therefore wholeheartedly disagree with the notion that CS does not teach how to reduce complexity. It does.
Above all, I have rid my decisions of emotion and solely focus on the problem at hand. It's brutal, rough and inconsiderate at times, but it produces the desired results.
I agree. The problem is the intangibility of that value, and the difficulty of estimating it by an economic metric.
Several careers have a more direct and tangible degree-to-value link; no one can be a cardiologist or a civil engineer without the degree.
I have a Software Engineering degree, but I don't really know if it was worth it. I learned some valuable things, but how can I say with any degree of certainty that I made a good choice?
Can we all agree that college is about getting a knowledge foundation, a nice degree that says you can get things done, and interpersonal relationships, and move on already?
This topic comes up regularly. Yes, a CS education is not enough. You also need to learn programming. You also need to build systems and learn to think about how to build others even more. But for me it has been very useful almost daily and it can make the difference between being on solid ground vs. being out of your depth and sometimes not even knowing it.
>I highly doubt most programmers need to know how to do formal proofs of algorithms, determine asymptotic complexity of algorithms, or even know that much about data structures.
Indeed, but I believe that having a solid foundation in theory is important. People have spent a lot of time coming up with ways to reason about common problems, and you don't need to reinvent the wheel.
Additionally, while many programmers can get by with skimping on theory, your mathematical knowledge tends to limit the problems you can tackle. While you might be able to get away with writing a CRUD application without applying any complicated math, you will have a hard time trying to write code to localize your robot with SLAM. Knowing theory opens doors to more interesting problems.
Sure, but the maths involved in building a robot is not the same as the maths for user analytics etc. It's better to assume no knowledge when entering those fields and learn on the job; otherwise you learn a whole load of stuff you'll never need. You can't be a senior in a field you know nothing about, so these jobs will always have someone who knows that stuff and can pass on their knowledge.
I can say with fair certainty that 90% of the engineering maths I did at uni will not be applicable to my career, and there's plenty of maths that would be applicable that I wasn't taught. Considering that was a six-month unit (and ignoring the fact that it was easily the one that burned me out fastest), that's a lot of wasted time.
The engineering math curriculum has issues, yes, I agree. I thought a lot of topics were taught in the wrong order, and their usefulness was often not made clear until much later. At this point, though, I think I've used techniques from pretty much every class I took (even ones I thought were irrelevant at the time).
But really, the engineering math curriculum has (or at least should have) three purposes. The first is to teach you logical reasoning and problem solving -- critical skills. The next is to provide you with a toolbox of techniques you can call upon to solve common problems in your field. Finally, the curriculum should provide you with the mathematical maturity to acquire the new mathematical tools that you need.
The last point directly relates to your second paragraph. I don't know very much about data analytics. If I (somehow) got hired into a big data role, the senior engineers will lack both the time and the desire to sit me down and give me the background on statistics, machine learning, etc. that I would need. However, because I have spent a lot of time studying other mathematical topics and have a background in linear algebra, probability, etc., they can point me at good sources and I can learn myself. If I came in just knowing single variable calculus from high school I would have a rough time.
Except that I think formal mathematics and proofs are attainable by informally-trained programmers... just perhaps not computer scientists.
To quote Friedrich Bauer:
"Software engineering is the part of computer science which is too difficult for the computer scientist."
I'm not sure how true that is. I don't have a computer science degree; I just have some interesting anecdotes from a colleague who is pursuing his Ph.D. in computer science with a focus on developing a formal proof system. As a teaching assistant he gets to run some classes. Most students tend to snore through formal methods, I'm told. To wake them up, he asked his students, who are working on their master's, to implement binary search. He gave them a specification for the algorithm and off they went in pairs. He used the PlusCal language from TLA+ to model around eight submissions in roughly two hours (I'm not sure of the exact number). He found errors in seven of them and, thanks to the model checker in the TLA+ toolbox, was able to demonstrate the inputs and the steps the algorithm would take to reach the error state. He got a few students hooked from that point.
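I don't know exactly which bugs his students wrote, but a classic one looks like this (a minimal Python sketch of my own, not his PlusCal models): the loop can fail to shrink its interval and spin forever, which is exactly the kind of error a model checker surfaces mechanically.

    def binary_search_buggy(xs, target):
        lo, hi = 0, len(xs) - 1
        while lo < hi:
            mid = (lo + hi) // 2
            if xs[mid] < target:
                lo = mid   # BUG: when hi == lo + 1 and xs[mid] < target,
            else:          # lo never advances, so the loop spins forever
                hi = mid
        return lo if xs and xs[lo] == target else -1

    def binary_search_fixed(xs, target):
        lo, hi = 0, len(xs)    # half-open interval [lo, hi)
        while lo < hi:
            mid = (lo + hi) // 2
            if xs[mid] < target:
                lo = mid + 1   # mid is excluded, so the interval always shrinks
            else:
                hi = mid
        return lo if lo < len(xs) and xs[lo] == target else -1

The fixed version maintains a half-open-interval invariant, which is the sort of property you would actually state in a formal spec.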
In my case maths is exactly what I needed. I'd been working in distributed systems for years without an iota of training in formal methods. I'd read Leslie Lamport's papers on time, Paxos Made Simple, Raft, Bloom^L, etc. I have several common textbooks on distributed systems. I'd worked my way through many subjects in graph theory, "concrete" mathematics, discrete maths, and so on. But, as Nathan says, what saved me most was experience: learning which abstractions to use and how to maintain a handle on complexity. If I just followed the code I could make things work. Except when the code wasn't a good enough abstraction (in the mathematical sense)... then things got hard (i.e. race conditions, scheduling problems, etc.). I used to just follow the code and suss out the problem using standard debugging tools and time-honored techniques, but what really started to click for me was modelling my problem in TLA+. When I started leaning on maths I felt like I could achieve more and solve problems before they became bigger problems.
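To make that concrete, here's a toy, brute-force version of what a model checker does, in Python rather than TLA+ (my sketch, and hopelessly naive next to TLC, but it shows the idea): two "threads" increment a shared counter non-atomically, and enumerating every interleaving exposes the lost update without ever running real threads.

    def interleavings(a, b):
        # all ways to merge two per-thread step sequences, preserving each order
        if not a:
            yield b
            return
        if not b:
            yield a
            return
        for rest in interleavings(a[1:], b):
            yield [a[0]] + rest
        for rest in interleavings(a, b[1:]):
            yield [b[0]] + rest

    def run(schedule):
        counter, local = 0, {}
        for tid, op in schedule:
            if op == "read":
                local[tid] = counter      # non-atomic increment, step 1
            else:
                counter = local[tid] + 1  # non-atomic increment, step 2
        return counter

    thread = lambda tid: [(tid, "read"), (tid, "write")]
    print({run(s) for s in interleavings(thread("A"), thread("B"))})
    # {1, 2}; the 1 is the lost update a model checker would flag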
I often talk about programming as a force-amplifier. If you're an actuary and you know how to program you will be a better actuary than your peers. Maths is the force-amplifier for programmers. You can learn formal mathematics and create a language for modelling computations that you can prove. Anything you can do from there will be better than what your peers can do.
... and yes I realize that not every piece of software needs to be accompanied by a proof. But for the hard problems it's a useful tool to have. And there are many other subjects where it's becoming necessary to understand different branches: graphics, AI, "big data" (ie: applied statistics), distributed systems (even your modern processor is a tiny distributed system), etc.
I have very mixed feelings about this article. For one thing, there are a couple of different kinds of CS programs. The basic division is, or used to be, whether it was more Math or Engineering heavy. In fact, some schools attach the program to a BS in the Math department and others in the engineering with more exposure to EE and Computer Engineering. Neither of these are necessarily helpful for becoming a programmer in today's market, but they do generate very different experiences.
The biggest benefit for me was the complete understanding and model you get of the entire computer. How it works, what theories it is based on, why things are the way they are (in terms of the Turing model, the way microprocessors work, etc).
I don't believe this knowledge is helpful from a software engineering point, and I may have been a better programmer if I had spent those years learning more about how to work as a team and how to solve real-world problems. However, I do feel that it has allowed me to switch between different fields of programming without completely being thrown for a loop. It has allowed me to understand the issues that crop up when something goes wrong on a much deeper level, or why problems crop up on one part of "the stack" that I may not have understood had I focused merely on learning how to program. These are personally valuable to me, whether or not they may have been more valuable to my career.
I do think it makes you a better engineer, but I also think that the first year or two of working in a group with other developers taught me a lot more about becoming a good software developer than much of what I learned in college. You learn more about how computers work at uni, but you learn more about how software development groups work by doing software development.
Isn't the tricky bit that a computer science education is in fact an education in the science of computers (no surprises here), yet what most people study it for is an education in their usage?
Maybe what we actually need is a separate field, e.g. "practical computing", which, I accept, would lend itself less well to the university education we have come to expect as a prerequisite for better-paying jobs.
"Software engineering" is a degree program in itself at many universities, and it is a curriculum more focused on pragmatic, often specifically business-focused, project delivery: from coding in a wide range of business-heavy languages to theories of software lifecycles and team management.
That said, anytime I've met one of the hard-working students who opted for a more math and theory heavy CS program instead of an SE program, that person has been able to software engineer and project manage circles around the people who specifically studied software engineering and project management.
You start with a subjective criterion? Ouch. It reminds me of the time I had to do some work for a factory... every department was telling me how theirs was the most important work, with arguments.
I can make an argument for even a "write to log" line as being extremely important when something goes wrong; I have no idea how I would pick three lines out of a hundred.
"Whether someone can or cannot solve some cute algorithm problem in a high-pressure situation tells you nothing about that person's ability to write solid, clean, well-structured programs in normal working conditions."
It sure tells SOMETHING. Of course there are better ways to evaluate a person as a programmer, but I hate it when people claim something without basis.
Annoyingly, there isn't much in the way of change on the horizon for this problem; people seem to like things the way they are.
The problem is that there is no good formalized educational standard for software development training, so Computer Science gets shoehorned in as a substitute. This leads to sub-standard Computer Science degree programs from a pure CS perspective (because they are primarily trade-school educations in disguise) that are even worse as training for Software Engineering. CS is very algorithm- and data-structure-focused, which is good in moderation for dev training, but programs usually delve into realms that are not useful even in the "well, it's good to have a broad knowledge base" sense. Moreover, CS typically doesn't touch at all on the many practical skills, and their fundamentals, needed to be a good dev: development cycles and styles, QA and testing fundamentals, source control use and management, proper composition and componentization of software, refactoring and dealing with legacy code, deployment strategies and techniques, and familiarity with the actual operational systems related to all of these. Each of those topics has sufficient depth to be an entire course or series of courses, yet they are frequently only barely touched on in CS programs, and sometimes ignored entirely.
I've been in hiring loops numerous times, and one of the quickest things I learned about hiring in tech is that a CS degree on a resume means basically nothing: there is almost no correlation between having studied CS in college and being able to code at even the most basic level. Unfortunately, I don't foresee much change on this problem in the future. CS is still a prestige degree (a 4-year degree in one of the coveted STEM fields), and I have a hard time seeing a legitimate Software Development education fitting the mold of either an arts or sciences degree program at a liberal arts college.
One interpretation of CS (that I personally subscribe to) is as a subject about the theoretical and mathematical understanding of process and abstraction. Computers and software are just the medium. Taken that way, it's an important subject that should be supported and protected. But students and employers seem to expect CS courses to be an industry training program about specific technologies and practical programming. This leads to a confused mess where nobody is speaking the same language, and limited satisfaction for all parties.
I think the author has accidentally replaced the word "useful" with "being a good programmer".
Sure, most computer scientists become programmers, and CS degrees don't necessarily translate into programming skills. However, getting a "computer science" degree is not the same as getting a "programming" degree and I don't think that its "usefulness" should be determined by the skill someone has as a programmer after getting the degree.
The reason getting a CS degree does not make someone a good programmer is simple:
* A large percentage of the courses you will take will be general education (about 20% in my college's case)
* Another large chunk will be in mathematics (say another 20%)
* One more sizable chunk will be "lost" to CS theory and "rote" information (remember writing round-robin schedules by hand, or verbally explaining parts of TCP in class? A sketch of that round-robin exercise follows below.)
Leaving us with maybe 40% of classes having some sort of programming lab.
And if you're like me, you'll specialize in machine learning, which will allow for even less exposure to things like the Gang of Four patterns.
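(Since round-robin came up: here, hypothetically, is roughly what that by-hand exercise amounts to in Python, simulating round-robin CPU scheduling with a fixed time quantum. The point of doing it by hand in class was the theory around it, not the code.)

    from collections import deque

    def round_robin(burst_times, quantum):
        # simulate round-robin scheduling; return each job's completion time
        queue = deque(burst_times.items())
        clock, finish = 0, {}
        while queue:
            name, remaining = queue.popleft()
            step = min(quantum, remaining)
            clock += step
            if remaining > step:
                queue.append((name, remaining - step))  # preempted, requeued
            else:
                finish[name] = clock
        return finish

    print(round_robin({"A": 5, "B": 3, "C": 1}, quantum=2))
    # {'C': 5, 'B': 8, 'A': 9}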
However, the flexibility of a CS education is much, much greater than if I had just been programming for those 4 years. If I decide I'm more interested in a career in IT security or data science, my courses will have been useful. I think that someone who had instead just been programming Java and Rails web apps for 4 years (like I do now) would have a much harder time transitioning into those (potentially higher-paying) fields as the market evolves.
Also, it's important not to discredit general education. Being a strong writer and speaker will always help you. Knowing some history and geography will keep you from sounding like an idiot when you talk to people from other countries.
A college education is not the same thing as a trade school and shouldn't be compared to one.
After working with Storm and reading Nathan Marz's master's thesis, I've got to say, he really is one hell of a developer. He was also very responsive to the questions I asked him on Storm's forum. I take Nathan's advice at face value because he is the most well-rounded computer scientist you could ever find.
If any young person is reading this, I would caution them not to take such advice blindly. A good computer science education beats no formal education any day for the average person, for a few simple reasons:
1. Success in the tech industry depends as much on your peers as on your IQ. Knowing bright, well-established people in the tech industry is invaluable. Such connections are often built when you go to schools like Stanford or MIT, or even a reasonably good college.
2. Having a good college name on your resume is going to open many doors for you. As an MIT grad, you will be far more credible when you talk to investors about cutting-edge research in a specific field.
3. A CS degree will not teach you a lot about how to code or build great products - not more than a passionate coder learns writing code in his basement. I admit that.
> The vast majority of what a programmer does, forming good abstractions and avoiding complexity, is almost completely untouched by computer science curriculums.
I feel like in the course of working on a large project such as a parser, lexer, and compiler, a student inevitably touches on these topics. And certainly when working with template code (such as in an intro CS class), they can see how nicely it was structured for them.
However, I agree with the sentiment that it seems like these topics are not directly addressed by most curriculums. At the same time, from my experience it seems like these ideas are applied very differently in different contexts (such as different platforms and languages), so a college class would be more abstract in teaching them. Perhaps it would be effective if they did some examples while teaching these ideas.
Some very valid points made in this post. Is CS valuable? I believe it is. I've witnessed the difference first hand in our startup but then again it's "hard tech."
The part I question is this: the percentage of programmers who will work at a "hard tech" company is so small that, unless the company you're working at is one, there is little benefit to onboarding someone with CS skills. It does not diversify the team, and diversification in a team is of high value. Most engineers want to hire other engineers so that they have something to talk about. But that's a comfort thing.
Long story short, soft tech companies should have only one or two hard tech engineers on their team, especially in the early days, and should test only at the skill level required when onboarding.
Another thing that bothered me about the article, although it did make some good points, is that it just assumes the choice is either four years of employment as a programmer at some level or four years of uni. If you're trying to get a job as a developer as an 18-year-old with a fresh H.S. diploma and maybe some tech school credits, your competition isn't other H.S.-diploma-carrying developers; it's a 22-year-old graduate with a CS degree.
If you live somewhere that you can get a job in the field directly out of High School, that's all well and good, but it's a lot harder, for a variety of reasons, than it is for someone who has finished a degree. Which is one reason why so many people go that route.
> If you live somewhere that you can get a job in the field directly out of High School, that's all well and good, but it's a lot harder, for a variety of reasons, than it is for someone who has finished a degree. Which is one reason why so many people go that route.
That is key; there is a strong argument in favor of jumping right into a programming job if you have the opportunity... it's what I did. But I got lucky on a bunch of fronts (parents in the industry, timing, etc.). I mean, if you have a programming job right there, it's hard to argue that you should go to school instead. And in my case, the job was available in large part due to timing; it would have been way harder to get a job in 2001 than it was in 1997, even if I had gotten a degree in the meantime.
I guess the other thing to consider is how hard it is for you to go to school later. I mean, if your parents are willing to foot the bill for your education now and won't be a few years from now, that's a pretty big thing. I'm slowly working on getting myself into college now, and it's harder in some ways, because the typical application process was not designed for me, but easier in others, because things like money aren't a huge problem anymore, and I am so much more self-disciplined.
I'm too early in the process to tell you if it's easier or not (I've got either 9 credits, or 0, depending on how you count) - but it's certainly different. It is... difficult to pay people to
But, in general, if you have the opportunity to get a coding job right out of high school, and nobody is paying for your college, take the job. If someone is paying for your college and you don't have the opportunity to get a coding job out of high school, go to college. The optimal course of action gets foggy if you have both opportunities or neither.
I didn't see university as an occupational training facility, and I was torn between majoring in CS or history.
What I got out of my CS program was... discovering that figuring out how systems and things worked was interesting to me. I also found that something closer to engineering was a better path for me than academia, teaching, law, or some other route.
In terms of the relevance of specific coursework, I can't say that I spend a lot of time with advanced math or some of the other topics, but I learned my way around a UNIX system, got my first taste of large scale networking, and a bunch of other things.
Doing well in a STEM degree like CS or Math is valuable for two reasons:
(1) it usually, but not always, implies that the candidate has at least some coding knowledge; and,
(2) [most important] it is a good proxy for logical/mathematical intelligence.
As long as IQ tests for employment remain a legal grey area, employers will continue using STEM degrees as a filter.
The claim that (2) is the most important is anecdotal, being based on my own experience: I have a degree in mathematics, which doesn't really imply (1) to nearly the same extent as CS, and I have never felt it made it more difficult to get job interviews.
That sounds like a roundabout way to train and prepare someone for a profession. The CS program is an artifact of how slow universities are to adapt to the needs of a growing industry. I would much rather see software development move into an apprenticeship model than stay rigged up to a program that was never intended to train engineers.
> Whether someone can or cannot solve some cute algorithm problem in a high-pressure situation tells you nothing about that person's ability to write solid, clean, well-structured programs in normal working conditions. This practice is akin to hiring pilots based on how they do on an aeronautical engineering exam. That knowledge is relevant and useful but tells you little about the skill that person has for the job.
There's a lot of talk and debate about whiteboard interviews, and this, for me, succinctly sums it up.
Is the analogy really that good? A pilot will never be called upon to engineer a plane. Lots of good "hackers" will get stuck if they have to design an algorithm.
I value my CS degree, but I also agree with the author. I don't think recruiters should ask for a CS degree, or focus on applicants having a formal CS education. There needs to be a simultaneous shift in the industry and in academia that allows software engineering and computer science to be distinct disciplines with distinct goals: SE gets you into industry, CS gets you into research. Most other sciences have their engineering counterparts, and technology should be no different.
> The vast majority of what a programmer does, forming good abstractions and avoiding complexity, is almost completely untouched by computer science curriculums.
It's also almost completely ignored in the typical programming job interview. Which is why, when I interview people now, I'm most interested in looking at code they've already written and asking them to break down an application description into a set of data & interface abstractions.
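As a made-up illustration of what I mean by that breakdown (the app and every name here are invented for the example): asked to sketch a small link-sharing site, a candidate might separate the data abstraction from the interface the rest of the code is allowed to depend on, e.g. in Python:

    from dataclasses import dataclass
    from typing import Protocol

    @dataclass(frozen=True)
    class Post:
        id: int
        author: str
        url: str
        score: int = 0

    class PostStore(Protocol):
        # the interface abstraction: callers depend on this, not on storage
        def add(self, post: Post) -> None: ...
        def top(self, n: int) -> list[Post]: ...

    class InMemoryPostStore:
        def __init__(self) -> None:
            self._posts: list[Post] = []

        def add(self, post: Post) -> None:
            self._posts.append(post)

        def top(self, n: int) -> list[Post]:
            return sorted(self._posts, key=lambda p: p.score, reverse=True)[:n]

The code itself is throwaway; what I'm looking for is that the storage decision stays swappable behind the interface.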
The students you really want working for you are those who went through the rigor of a theoretical CS curriculum while also indulging a more practical skill set. Such a skill set can be honed at hackathons, for example.
It mustn't be forgotten that a theoretical curriculum teaches you the discipline to learn just about anything: if you can learn complexity theory and algorithmic proofs, chances are you can learn to create a CRUD API.
I was just thinking about this today, after I ran out of time in an online interview where I had to write code to correct the formatting of various strings (add spaces, capitalize, etc.). I was given 20 minutes.
I thought: I would have nailed this problem when I was looking for my first developer role, when I knew precisely how Python regexps worked. Now, of course, I just look them up when I need them, which is never.
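The exact problem isn't worth reconstructing, but a typical 20-minute version of it looks something like this (my guess at the flavor, written with the regexps I'd have to look up today):

    import re

    def tidy(text):
        # collapse runs of whitespace
        text = re.sub(r"\s+", " ", text.strip())
        # exactly one space after . , ! ? and none before
        text = re.sub(r"\s*([.,!?])\s*", r"\1 ", text)
        # capitalize the first letter of each sentence
        return re.sub(r"(^|[.!?] )([a-z])",
                      lambda m: m.group(1) + m.group(2).upper(),
                      text).strip()

    print(tidy("hello   world.this is a test ,ok? yes"))
    # Hello world. This is a test, ok? Yes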
Seems silly to pay thousands of dollars to read through some books at a slow pace.
My employer wants to pay me to get a bachelor's in EE or CS (I have half of one, but dropped out when it was clear I couldn't afford to finish), followed by a master's/Ph.D., but it's only tangentially about the "education". It's all about networking and being able to wave around credentials.
A CS education is not useless, but less used. And it is also used-less by the same people who teach it. There are exceptions, but their number is way smaller than it should be.
If someone wants to learn coding or programming: screw the CS degree, there are better options. I know CS degree holders who can't code more than a few lines or debug a large program, and there are better programmers who don't have a CS degree. YouTube will teach programming better than a CS degree class will.
Apart from programming, if one "also" wants to learn the mathematical side of CS and work on core stuff, go for a CS degree. Those skills do not come easily from tutorials; academic rigor is probably the best way to earn them. But if you use O-notation at some jobs, you could become an outcast on the team.
When I got into a top college for a CS degree, I had high hopes for it. For the first 1.5 years, I wanted to quit, quit, and just quit. It felt hopeless - they were teaching a few subjects that I was already way better at, but for some reason they would not allow me to just sit the exams and be promoted. I already knew programming, patterns, and electronics, and had even dealt with basic robotics before I joined college. One example: when they started teaching operating systems, my expectation was that they would at least give us hands-on practice writing a pretty basic OS. No - they were just doing theory. Ultimately, I ended up writing a pretty simple custom OS without any mentoring (using Google), because the professors had never even written a boot-loader to start with. I also ended up helping weaker students get up to the mark. In one of the semester assignment evaluations, a senior professor told me, "You copied programs from the internet - how come they are so well written?" I lost trust in them.
I did learn some stuff in the later years and made my peace with it. While I knew programming, I did not have a strong math background - stuff like probability, numerical methods, theoretical computer science, and a bit of advanced algorithms were the things I learnt and explored further. The most important thing was getting introduced to AI, which turned out to be a huge area of interest for me.
However, all of that bad experience has turned me sour on academia in general, and I do not recommend spending a lot of money and, more importantly, time learning "used-less" things, unless you are expecting more than programming from CS. And when it comes to CS, the learning never stops.
In my years working in the computing field, I have found that individuals with a Computer Science (or related) degree to be much more adaptable to emerging technologies. Individuals without a degree have proved to be excellent at yesterday's problems, but they have a difficult time moving past their initial expertise.
So basically the guy is saying there IS value, but limited. And by taking a fairly neutral stance on a rather sensitive topic, he gets to advertise his book as well. I guess with a degree from Stanford, he may have written a program to generate the article with a target sentiment and then insert the plug?
If you are not a DBA and did not get a BSCS, ask yourself this question: when people speak of relational databases, what is a relation? Then look up the answer, which is rather simple (don't reply with it here, unless you rot13 it or something). Maybe ask engineers with a few years' experience. Were they right?
Where does the concept of regular expressions come from? Can you explain why you can't use regular expressions as an all-purpose HTML parser?
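For anyone who hasn't seen the second question answered: regular expressions recognize regular languages, and regular languages can't count arbitrary nesting depth, which HTML requires. A quick Python sketch (mine, and deliberately naive) shows both the failure and what a real parser does instead:

    import re
    from html.parser import HTMLParser

    html = "<div>outer <div>inner</div> tail</div>"

    # A regex can't track nesting, so the non-greedy match stops at the
    # FIRST closing tag and splits the outer element in half:
    print(re.search(r"<div>(.*?)</div>", html).group(1))  # 'outer <div>inner'

    # A real parser keeps a stack: exactly the extra power (a pushdown
    # automaton rather than a finite automaton) that regexes lack.
    class DepthTracker(HTMLParser):
        def __init__(self):
            super().__init__()
            self.depth = 0
        def handle_starttag(self, tag, attrs):
            self.depth += 1
            print(f"<{tag}> opens at depth {self.depth}")
        def handle_endtag(self, tag):
            self.depth -= 1

    DepthTracker().feed(html)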
I took 1 1/2 semesters on critical sections, race conditions, and such. I deal with these problems all the time, and have seen much code written by people who either never learned about them or didn't learn them properly.
I once had the idea to hash a short list of small numbers as a Goedel number. Looking around, I saw the idea was not original (in other places I use triangular numbers, etc.). I would never have known about those without CS.
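In case the trick is unfamiliar, here is a sketch of the encoding (mine, and perhaps not exactly the variant I used back then): raise the i-th prime to the i-th value and multiply. Unique prime factorization makes the encoding injective, and even decodable.

    PRIMES = [2, 3, 5, 7, 11, 13, 17, 19]

    def goedel(xs):
        # encode x + 1 so a trailing zero differs from a shorter list
        n = 1
        for p, x in zip(PRIMES, xs):
            n *= p ** (x + 1)
        return n

    print(goedel([3, 1, 2]))  # 2**4 * 3**2 * 5**3 = 18000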
Yes you have to keep learning more. It is a good foundation to start from though.
I've heard the value of comp science is being exposed to various algorithms, but I'm surprised that knowledge isn't considered an advantageous starting point. I think in the long run it wouldn't matter, since if you don't have the exposure you will eventually pick it up.