What happens to tech workers when their skills become obsolete? (qz.com)
138 points by seagullz on Sept 11, 2019 | 230 comments



Flash was one of the first tools I ever started programming for. Today I work in AI.

This article is a thinly veiled advertisement for bootcamps, spouting this nonsense:

The “learning-by-doing” approach in the Flash workers’ scenario highlights the value of coding schools and boot camps designed to teach students in-demand skills in as little as eight weeks. And whereas traditional educational environments tend to view their curriculum as having an end date (generally aligned with their students’ graduation), Lambda School co-founder Austen Allred envisions students coming back to his coding school every eight years or so to learn new skills.

No. What will give you longevity is a degree in CS. Frameworks and technologies change, but CS concepts stay consistent.

If you have a good grounding in CS, you have longevity. Couple that with a good math background, and then you have even more possibilities.

Bootcamps are rubbish in the long term. They try to lure you in with quick results from little time investment by teaching you how to hack with the hip framework of the week.

I have been there, done that, having gone to a polytechnic school where I learned Flash + web dev. I later made the switch to a university to study CS. One of the best decisions I ever made.

Don't fall for the trap of instant gratification. Learn CS (and math) with a formal education, and you will thank yourself years from now.


This is a weird post. A CS degree and programming bootcamps aren't mutually exclusive. I'd say they're not even really comparable.

There are many reasons why a CS degree would not be a good fit for someone. Some people just don't perform well in an academic environment. Some people simply don't have the intelligence required for that kind of education; not all tech workers are "software engineers", and there are "tech workers" stretching from PhD level all the way to near minimum wage. Some people might not have 3 or 4 years of time to spend on a CS degree. Some people might not be able to afford a CS degree, depending on where they live.

Basically, I think your post just mentions a lot of the positives of a CS degree (without mentioning any of the negatives), and then somehow uses this to conclude that "bootcamps are rubbish" without actually saying anything about bootcamps themselves, as though there's a binary choice between the two where both options are always equally viable.


I don't see how you get that conclusion.

The OP seems to be saying bootcamps alone are insufficient and a poor choice for someone who will not be able to get a degree and wants career longevity. I've worked with a lot of people from the technical-skills-but-no-tech-academics portion of the tech field. If they want to work in tech support or IT then it is not such a bad option, but that isn't really what bootcamps are trying to prepare them for.


How much good do you think that my CS degree from an unknown state college in 1996 has actually done me compared to keeping up with the latest tech, doing Resume Driven Development and staying buzzword compliant?


Having any STEM degree from anywhere has probably lowered your risk of a gap (or level drop) and of a potentially permanent drop in salary considerably, more so for it being in CS. The degree also gives you options like embedded or government positions that are pretty much unavailable without a CS (or EE) degree and that see less change, making for better per-hour compensation if you don't like working more than 40 hours. (32-hour weeks, or 20 years to a pension, anyone?)

You certainly can compete with all the people without degrees, who can make as much total compensation as someone with a degree by going after the most unstable parts of the field. But that competition will sound exhausting, and the effort you put in will be exhausting your other options in life, while still probably being less energy than the bootstrap-only graduate spends.


> Having any STEM degree from anywhere has probably lowered your risk of a gap (or level drop) and of a potentially permanent drop in salary considerably, more so for it being in CS.

I’ve been working in the industry in Atlanta for 20+ years. I’ve been on the market 6 times (I stayed at one company almost a decade) and it has never taken me more than three weeks to have multiple offers. I can guarantee you that none of those jobs or interviews cared about my CS degree. The quickest was when my contract was over: I called a recruiter, had an interview four days later, and got an offer the same day. I’m no special snowflake; any halfway connected developer with a buzzword-compliant resume could tell a similar story.

My second job out of college my manager cared a lot more about my then encyclopedic knowledge of C from spending way too much time on comp.lang.c, my (self taught) experience with x86 assembly and that I was a big enough geek to be able to talk about some of the 65C02 assembly language hobby projects I did in middle and high school than my courses in Pascal, Basic, FORTRAN, and COBOL or even my one simple data structures class in college.

Do you think consulting companies are recruiting me now based on my 20+ year old degree, or on my more recent experience as a team lead and AWS architect, and my being able to talk about “The Well Architected Framework”, Domain Driven Design, and the Cloud Maturity Model? Yes, these are all buzzwords to a certain extent, but consultants get paid a lot for them.

Whether I had a degree or not, if I couldn’t negotiate a salary commensurate with my experience - I would be doing it wrong.

> But that competition will sound exhausting, and the effort you put in will be exhausting your other options in life, while still probably being less energy than the bootstrap-only graduate spends.

Considering how useless what I learned in college was compared to what I taught myself (even at the time I graduated), and more importantly compared to the experience and networking I’ve done in the past twenty years, I really don’t see my career trajectory being that much different.

The only useful thing college did for me was getting my first job, based on an internship the year before. In today’s world, if a boot camp (which didn’t exist back then) could have gotten my foot in the door, it wouldn’t have made any difference.

As far as the pension, it’s nothing special. A pension is just worth the present value of all the payments you hope to receive after you retire. Considering the difference in pay - and the flexibility - in private industry, even in Atlanta, you can easily save/invest enough over your career to create a greater annuitized income in retirement.
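
To put rough numbers on that present-value point, here is a sketch with assumed figures (a $40k/year payout for 25 years and a 4% discount rate; none of these numbers come from the thread):

    # Present value of an annuity-style pension: each year's payment is
    # discounted back to today at rate r. Figures are illustrative only.
    def annuity_pv(payment, years, r=0.04):
        return sum(payment / (1 + r) ** t for t in range(1, years + 1))

    print(round(annuity_pv(40_000, 25)))  # ~625k: the lump sum that
                                          # "replaces" a 40k/yr pension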


> In today’s world, if a boot camp (which didn’t exist back then) could have gotten my foot in the door, it wouldn’t have made any difference.

I delayed my degree while the first dotcom boom was in swing. I did the only relevant classes before I delayed it, and not having the paper did make a difference. I could have argued my way into the same position I transferred to after getting the degree, but they would have delayed things, delayed upping compensation, maybe given me lesser tasks or been less convinced of my work in some cases. They would also know which other companies couldn't be competing for me: again, a loss of compensation.

I would be very surprised if that 1996 degree isn't on your resume, equally surprised if anyone checked whether the school is accredited, and surprised if many of your choices weren't slightly fewer/different, leading to accrued path-dependent losses on average. I.e. a first team lead role isn't suggested, leading to no path to "team lead" on your resume, or a significantly delayed one.

I used university as a bootstrap back then, and then had the company pay for the rest of the degree. This could probably work for bootstraps and a degree today. But the only reason I see for that path, over accredited time toward the resume item, is if you needed to convince yourself of whether you want a tech job.


> I could have argued my way into the same position I transferred to after getting the degree, but they would have delayed things, delayed upping compensation, maybe given me lesser tasks or been less convinced of my work in some cases. They would also know which other companies couldn't be competing for me: again, a loss of compensation.

The dot com boom was fairly isolated, unlike what happened in 2008. In 1999, when I was looking for my second job, enterprises were so desperate for qualified developers that jobs were easy to come by. I negotiated a relatively decent raise (not Silicon Valley type money) after the dot com bust because profitable companies still needed developers.

Even today, most corporate enterprise type developer jobs - where most developers are - wouldn’t know CS from a hole in the wall. They just want people who can turn business requirements into shipping products. The degree has never been what gave me leverage and optionality; keeping my resume in line with what the market wants has been.

When I stayed at one job for 10 years until 2008 and was woefully behind the state of the art, my degree did me little good.

> I would be very surprised if that 1996 degree isn't on your resume, equally surprised if anyone checked whether the school is accredited, and surprised if many of your choices weren't slightly fewer/different, leading to accrued path-dependent losses on average. I.e. a first team lead role isn't suggested, leading to no path to "team lead" on your resume, or a significantly delayed one.

Yes the degree is on my resume, but not the year I graduated. Neither is anything before 2008.

My very delayed path to being a team lead was a function of me taking my eye off of the ball for close to a decade and not gaining the skillset to be one. Being a team lead has nothing to do with how well you can do algorithms in most companies and is usually a combination of your interpersonal skills, your emotional intelligence, and experience with translating business problems into working systems. In fact, more often than not, it’s knowing what not to build and focusing your team/company on writing software that is within their core competency and outsourcing the rest or using third party systems.

That being said, I purposefully self-demoted (in title, not pay) from being the dev lead at a medium ($1 billion in revenue) non-software company to being a senior engineer at a much smaller software company that needed someone to modernize their software architecture. They were basically using AWS as a glorified, overpriced colo. I discovered as a team lead that the real money locally was in consultancy, and I had a few gaps to fill in.

My CS degree from 1996 definitely didn’t prepare me for my roles over the past three or four years dealing with enterprise architecture.


> A CS degree and programming bootcamps aren't mutually exclusive

Of course not, and the GP never said that they were, but bootcamps are often marketed as an alternative to a CS degree.


I agree with your statement: "If you have a good grounding in CS, you have longevity. Couple that with a good math background, and then you have even more possibilities."

But I disagree with your other statement: "Learn CS (and math) with a formal education".

You can learn CS from the internet.

https://github.com/ossu/computer-science https://teachyourselfcs.com/

The first link says you need roughly 2 years to master core CS.

Actually, I have a wild fantasy. I build a startup like Lambda School, but it is not supervised. So I give money to motivated students. This money covers their living costs for 2 or 3 years, internet, and a decent laptop. They have to spend 10 hours per day, 6 days per week learning CS. After finishing the study, they repay my money with some interest (capped at a number) only if they find a proper job. If they cannot find a proper job for 5 years, the debt is forgiven. But this is just my wild fantasy. :)


>I agree with your statement: "If you have a good grounding in CS, you have longevity. Couple that with a good math background, and then you have even more possibilities."

Maybe in the US or SV, but in Europe nobody outside research, academia, and archaic engineering companies (where a degree is a form of signaling) cares about your degrees.

In tech here, unless you have x years of experience in the latest tech (.NET, TypeScript, Node, etc.) you ain't getting any job, regardless of your CS degree.

Edit: not sure why I'm being downvoted. I'm only expressing my experience of the market here in Europe as an engineer with 6 years of C/C++ experience, a BS in CS, and an MS in Embedded Systems; web shops have declined to interview me, citing that I don't have enough experience with the latest web languages and frameworks. Perhaps people feel that ageism and obsolescence won't happen to them.


I guess I only worked for archaic engineering companies then.

Outside of a couple of startups with a dubious prognosis of success, I never saw a CV get through HR for an IT department without a higher-education degree, with the exception of technical students.

And in most European countries one doesn't get to sign "Engineer" on formal documents (from a legal point of view) without the proper title certification.


I've worked in lots of places where having a degree was not a requirement to get hired, but certainly a lot of (especially bigger) companies value an education. I would say that the more accurate statement is that enough valuable work experience can alleviate the need for a degree, but degrees are far from worthless.


Some of the best as well as some of the worst developers I have worked with had no formal CS education. (The worst ones got in through biology -> bioinformatics, the best ones through gaming and setting up networks and general enthusiasm for the subject).


> Maybe in the US or SV, but in Europe nobody outside research, academia, and archaic engineering companies (where a degree is a form of signaling) cares about your degrees.

I regularly hire people into a commercial software company in Europe, and I don't inherently care about your degree, but I do care about you understanding a broad range of the fundamentals, and in the absence of a CS degree (which does happen) I have to spend extra time assessing an applicant's understanding of the fundamentals. Recently hired a fresh grad without a CS degree for a C++ programming position, and his interview was a bit longer than usual while we ran over those basics.

The degree IS a form of signalling; it signals "I probably have a grasp of the basics".


We can argue about "fundamentals", but I'm happy hiring non-CS grads. Because for us, "fundamentals" are things like "can you work in a team?", "can you write/communicate well?", "can you grasp real-world workflows/issues/bottlenecks real people have using products?", and "will you try the simplest thing first?". Other degrees offer the same or more opportunities to hone these skills.

It's easy for senior engineers on the team to address gaps in CS knowledge, but much harder to address a lack of soft skills, writing, or over-designing/too much abstraction (because you were taught the patterns, but not when to use them). I could go on, but IMO, this is why interviews are still so broken, and why the field of software development still faces the same issues it always has. Granted, I'm not building a DB, but compared to maintenance and over-engineering, performance is rarely an issue, and easily solved via profiling.


My fundamentals include a basic knowledge of algorithms, data structures and essentially just how to think about problems in a way that solutions can be programmatically represented.

Basically, can you write simple software? Everything you mentioned is also needed, but I really, first and foremost, need them to be able to write some simple software. If they can't do that, I really can't afford to teach them. A CS degree often makes this check simpler than for the candidates with other educations, such as the history degree holder we have.

If you don't need that then sure, it's a different hiring game for you.


My guess is that C/C++ experience is really good when you apply for a C or C++ job. If you apply for a web developer job mostly consisting of HTML+CSS+JS it might not be seen as a plus. Even for C# and Java jobs, C++ experience is somewhat good but not perfect.

My view has been that the experience is good but not enough, you must show a willingness to learn other things and work in other languages and any experience in the languages actually used trumps a lot of other things.


How do you show that when you're not even invited to the interviews in the first place? HR looks at your resume first when screening, not your willingness to learn.


> Maybe in the US or SV, but in Europe nobody outside research, academia, and archaic engineering companies (where a degree is a form of signaling) cares about your degrees.

That’s also true in the US outside of the HN/Silicon Valley bubble...


Is .Net better regarded than Java in Europe?


Europe is not a single entity in tech; you have to check what prevails in your area (country, city).


> Actually, I have a wild fantasy. I build a startup like Lambda School, but it is not supervised. So I give money to motivated students. This money covers their living costs for 2 or 3 years, internet, and a decent laptop. They have to spend 10 hours per day, 6 days per week learning CS. After finishing the study, they repay my money with some interest (capped at a number) only if they find a proper job.

That was exactly the idea behind Modern Labor, in Y Combinator last year. It lasted six months, until they learned that teaching yourself CS is much more difficult than it sounds and almost everyone had given up; then they pivoted.


Modern Labor in March 2019: "We Pay You to Learn to Code. Modern Labor is a revolutionary platform that pays you $2000 per month for 5 months to learn in-demand tech skills and then finds you your new job." http://web.archive.org/web/20190320144627/https://modernlabo...

Modern Labor in April 2019: "We Grow Tech Talent. Modern Labor is a revolutionary platform that grows talent for technical roles at your company." http://web.archive.org/web/20190412130518/https://modernlabo...

Modern Labor in May 2019: "Hire Tech Talent On-Demand. Modern Labor is a highly-selective talent network of US-based tech professionals to help you build tech teams when you need them." http://web.archive.org/web/20190501133947/https://modernlabo...

Modern Labor in August 2019: "Hire technical talent fast. Modern Labor partners with companies to help them find and hire technical talent quickly and easily." http://web.archive.org/web/20190831055322/https://modernlabo...

Fascinating to watch their transition from revolutionary platform to ordinary recruiting agency.


Revolution and disruption sometimes need more than words; they need evolution, or fitting in. Increasingly it looks like finding small new interactions that can repeat and spread.


You grounded my fantasy. :(

I thought self-learning CS could be a good way to democratize wealth. You only need a laptop (even a Chromebook is fine), an internet connection, self-determination, raw intelligence, and your living costs covered to be able to master CS. But apparently most people are not able to do that. Maybe less than 1% of the population can do this kind of thing. Or maybe not.


Maybe they did something wrong and it's still possible. Hard to say. The difficult piece to solve is how to get people to power through when they're frustrated and confused. That probably takes more than money.


I don't understand why your fantasy includes working people far more hours than is necessary in a year.


Basically it is my own reflection. If I were to get this kind of chance, I would work myself to the bone (hence 60 hours per week), and 60 hours * 50 weeks * 3 years = 9,000 hours. So after 3 years, I would have a really strong CS foundation. But you have a point. What is suitable for me may not be suitable for other people.


Everyone's learning curve and trajectory is different, often with similar potential outcomes.


Going through ossu at the moment, highly recommended. Coming from a Physics background and working as an SE, it gives me the tools I need to "catch up" on my own terms.

Edit: Typo


> You can learn CS from the internet.

I strongly disagree. You need real professors, TAs, project work, deadlines, exams, etc.


Depends on what you already know; for example, a math grad should have little trouble self-teaching CS with the internet.


Lambda School is 9 months long full-time (~2000 programming hours) and includes CS fundamentals.

The assertion that those things must be taught/learned within the confines of a CS degree is a little silly.


Sorry, but it is IMPOSSIBLE to learn the CS fundamentals in 9 months. Period. What you may be able to do is to teach people how to game interviews within 9 months, sure. Good luck once that performance review comes around...

I was very interested in math and theoretical CS. It took the whole batch of students, including the ones in the honors math bachelor's, a good 2 years of daily intense study just to get the basic feeling for math straight (I was part of the IDEA League program in Europe and I would consider my CS studies among the best in Europe, especially when I see what students come out of Berkeley and Toronto). Then it took at least another year or two of algorithm studies to properly deal with algorithms and data structures. Also note the exam failure rate of about 90%.

Sure, you can hack some knowledge together in 9 months. But it is biologically impossible for the brain to properly learn these things in 9 months. If you can do it, you are a genius and your talents are completely wasted in this bootcamp. You should instead apply for a PhD in astrophysics at MIT or something and start contributing to mankind...


What do you consider the CS fundamentals? Data structures & algorithms, and what else?

Also, what is the baseline you're starting at, before you start the timer?

Maybe I'm just arguing about what you call a genius, but I think it's totally biologically possible for many students to learn CS fundamentals beyond the capabilities of the average CS major, in 9 months.


9 months is 3 quarters. A dedicated student could easily fit 12 courses into that. 12 courses is the bulk of a CS degree (barring a few electives). See for example this one:

https://undergrad.soe.ucsc.edu/sites/default/files/curriculu...

It'd be intensive, but doable.


Forgetting the fact that most schools aren't on quarters, the 12 courses all have prerequisites for a reason. You can't just take them all at the same time.

Even if you could take them all at the same time, maybe 2 or 3% of students would have been capable of completing those courses in that time frame at my University.

You're also cutting out 5 classes because they are called electives. In most programs electives are structured so that you're going to get exposure to certain topics no matter which electives you take, so randomly cutting out 5 classes just because there is some choice doesn't make sense.


I mean "Data structures & Algorithms" is a very broad topic.

The baseline is high-school.

Well "the average CS major" is also a very broad statement :D. I have no good overview of what an average CS major is, to be honest.

I think we are talking about different things. Let's say you learn how to build a hashmap, or how to solve TSP with dynamic programming. Conceptually, this is possible to learn in 9 months if you are a good student (and therefore, your talents are already wasted in this bootcamp). But what if I ask you some follow-up questions? Modify the problem. Will you be able to explain how perfect hashing works, or what the runtime of a hashmap is if the hashing function is not O(1)? How different hashing procedures lead to different qualities for different implementations of hash maps?

We are not talking about knowing that hashmaps give you O(1) lookup if you are lucky. If that is the skill you want to learn, sure, 9 months will do. But truly understanding what you are talking about and being able to explain, augment, and modify/improve data structures and suit algorithms to your needs... Proving that they still work correctly after your modification. Understanding how that damn distributed consensus algorithm works that seems to have a bug that fucks up your database every now and then.
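
(A minimal Python sketch of that hashing point, mine rather than the parent's: a degenerate hash function silently turns dict lookups from O(1) average into linear scans over a collision chain.)

    import timeit

    class BadKey:
        """All instances hash to one bucket, so every lookup degrades
        into a linear scan over the collision chain."""
        def __init__(self, n):
            self.n = n
        def __hash__(self):
            return 42  # constant hash: every key collides
        def __eq__(self, other):
            return self.n == other.n

    good = {i: i for i in range(1000)}
    bad = {BadKey(i): i for i in range(1000)}  # building this is O(n^2)

    print(timeit.timeit(lambda: 999 in good, number=1000))         # O(1) avg
    print(timeit.timeit(lambda: BadKey(999) in bad, number=1000))  # O(n)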

Yes agreed. You don't need to know that for 99% of CS jobs. But we are not talking about whether bootcamps can prepare you for work in average code mills, we are talking about whether they can replace CS education.

As a matter of fact I DO believe that bootcamps solve a critical purpose in filling the vast amount of gaps in our IT market. But I am always surprised again and again why "the new kid on the block" always needs to attack other completely valid paths (like CS major). They are completely different things made for a different purpose.

I think people need to realize that CS major is NOT the right choice for most coding jobs. But that does not mean that a bootcamp can replace a CS major, it just means that a bootcamp can be an efficient shortcut to hit the job market running.


> Yes agreed. You don't need to know that for 99% of CS jobs. But we are not talking about whether bootcamps can prepare you for work in average code mills, we are talking about whether they can replace CS education.

We're discussing "what happens to tech workers when their skills become obsolete." If being able to contribute to a scaled system as a great software engineer at Google or Amazon is not a sufficient measure of "Knowing CS well enough," I'm not sure we're talking about the same thing.

Is it enough CS to do fundamental AI research? Eh probably not. Is it enough CS to do pretty much any other job out there? Yes.


It probably doesn't even matter for AI research. I don't know any professional AI researchers, but I bet reading arxiv, reproducing others' work, and publishing your own work is how you go about that. Why would you do coursework when there's papers to read?


_Maybe_ you don't need that much CS per se (debatable, but an argument can be made); however, you do need a lot of math coursework.

Also, it's far easier to grasp the basics of AI from a course than to reconstruct the basic body of knowledge just from reading arxiv. As a matter of practical considerations, I just don't believe anybody can become an expert by reading only the research, without going through the basic training first. It's too damned difficult, and too much work (and pointless, too; people did that work for you already and built great courses with the summaries, so why not take advantage of that?).


>how to solve TSP with dynamic programming.

Impressive. (TSP is NP-hard. Not even NP-complete because you can't verify the solution in polynomial time... or at least, I don't know how to do it and would love to see a solution with dynamic programming)


It's an approximate solution with bounded suboptimality; I don't remember the exact bound off the top of my head.

Edit: it's 3/2 for Christofides' algorithm. Naive dynamic programming is slow but gives the exact solution; it's no good for more than 24 nodes or so.

Usually it is good enough; other, similar and better attempts try to tighten that bound with more admissible heuristics. E.g. for metric spaces the bound can be tightened a lot. (Like shortest travel without weights.)
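
For reference, the "naive dynamic" approach here is, I assume, Held-Karp: exact, but O(n^2 * 2^n) time and O(n * 2^n) memory, which is exactly why it falls over past roughly 24 cities. A minimal sketch:

    from itertools import combinations

    def held_karp(dist):
        """Exact TSP tour cost via Held-Karp DP: O(n^2 * 2^n) time."""
        n = len(dist)
        # dp[(S, j)]: cheapest path from city 0 through set S, ending at j
        dp = {(frozenset([j]), j): dist[0][j] for j in range(1, n)}
        for size in range(2, n):
            for subset in combinations(range(1, n), size):
                S = frozenset(subset)
                for j in S:
                    dp[(S, j)] = min(dp[(S - {j}, k)] + dist[k][j]
                                     for k in S if k != j)
        full = frozenset(range(1, n))
        return min(dp[(full, j)] + dist[j][0] for j in range(1, n))

    # Tiny 4-city example; the optimal tour costs 80.
    dist = [[0, 10, 15, 20],
            [10, 0, 35, 25],
            [15, 35, 0, 30],
            [20, 25, 30, 0]]
    print(held_karp(dist))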


Oh, I see what you mean; I would still be seriously impressed with any bootcamp that teaches approximation algorithms. And not in a good way TBH - it kinda' defeats the purpose of the bootcamp. It's hard enough to go through Knuth in a few months...


The article is talking about Flash developers, who most likely became JS front-end developers when the wheels fell off the cart. No need to even know a single algorithm to be a successful front-end developer. Any algorithm you can think of that would possibly be relevant to their job is already implemented in some NPM package, and probably far better than any of us would implement it.

I value CS degrees and value mine, but the reality is, not everyone who writes code needs a CS degree from MIT or Stanford. Bootcamps serve their purpose, and I have worked with many self-taught and bootcamp-grad developers who went on to be exceptional developers. I myself was self-taught before I entered a CS program, and honestly I wish I had gone into applied mathematics. Everything I learned in school I could have taught myself, since I already had a base in programming. I also never use anything I learned in school at my day job since I left AI and 3D dev.

More relevant to the story, aging techs can actually be quite lucrative once you pass the point where no new people are learning them and all the skilled workers have left for greener pastures and are not looking back. I have taken some gigs for IBM Universe, VB6, and COBOL/JCL that have been quite lucrative because no one wants to touch them and most of the talent has retired out of the field.


>>You should instead apply for a PhD in astrophysics at MIT or something and start contributing to mankind...

I think that is horrible advice. They will pay you peanuts and abuse you. Instead go found a startup and make lots of money. Then you can fund interesting research that contributes to humanity, if you want. Or you could get a PhD once you've gotten your money.

This is all assuming that said person is a genius, of course. Otherwise disregard this advice.


I think you could teach decent, but not stellar, CS in 9 months. Most of the crap I had to take in college was not related at all to CS, or only conditionally or infrequently related. Calc 2? Wonder when I'll need to pull that out again; probably never. Western Civ? Totally irrelevant, even if it was interesting and enjoyable and made me a "more well rounded" student. There's a lot of fat to cut out of traditional CS programs at traditional colleges.


Every CS topic requires math much more advanced than calc 2 to properly understand it. For example:

* Machine learning is nothing but multivariate calculus.

* Analyzing network traffic is all queueing theory which is based on calculus.

* Even the simple data structures proof that no comparison-based sorting algorithm can run faster than n log n requires calculus.
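The last bullet is easy to sanity-check numerically: a comparison sort is a binary decision tree with at least n! leaves, so its worst case needs at least log2(n!) comparisons, and Stirling's approximation (which is where the calculus comes in) shows that grows like n log n. A quick check:

    from math import factorial, log2

    # Decision-tree lower bound: any comparison sort needs >= log2(n!)
    # comparisons in the worst case; Stirling gives log2(n!) ~ n*log2(n).
    for n in (8, 64, 512):
        print(n, round(log2(factorial(n))), round(n * log2(n)))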


But the vast vast majority of programming jobs don’t and will not require this type of knowledge. Yes it’s useful in some very important career paths but not the average.


And yet you see plenty of graphics programmers learning advanced 3D shit without a CS degree.

And all those older programmers doing machine learning when they haven't had any CS courses on the topic.

Obviously a degree helps, but it doesn't stop someone suitably motivated and capable.


You can do quite a bit without understanding it properly. As tons of cool kids have shown none of that shit actually matters if you are trying to create the next big thing that's going to make billions.


You missed statistics on the second bullet. It's not just calculus, though statistics uses it.


Calc 2 usually covers sequences and series. That's very relevant to CS.
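
One concrete place they show up: the amortized analysis of a doubling dynamic array is a geometric series (n/2 + n/4 + ... < n), which is why append is O(1) amortized. A small sketch, assuming the usual double-on-full growth policy:

    # Appending n items to a doubling array copies at most
    # 1 + 2 + 4 + ... < 2n elements in total (a geometric series),
    # so each append is O(1) amortized.
    def total_copies(n):
        copies, capacity = 0, 1
        for size in range(1, n + 1):
            if size > capacity:
                copies += capacity  # reallocate: copy the existing elements
                capacity *= 2
        return copies

    print(total_copies(10**6))  # 1048575, comfortably below 2 * 10**6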


I can't think of any time it has helped me. I have never needed nor benefited from that effort, time and money.


We have vastly different experiences then.


You could also learn sequences and series outside of Calc 2 in a fraction of the time


Probably--assuming you already have the prerequisite knowledge required. But will you? Probably not.


What prerequisites would you need? Notation and arithmetic?


Have you ever tutored or taught a Calc 2 class?


Have you ever taught yourself something complicated? Books can be read, comprehended, and applied without paying tuition, and yes, sometimes you will find you need to read another book first before continuing. There are plenty of people who can do this just fine; maybe you have a different learning style, but don't assert that on everyone else.


Reading a book and thinking you understand the material is not the same thing as understanding the material. If you randomly picked a bunch of people who claim to have self-taught college-level mathematics and gave them the corresponding MIT exams, I bet most of them would fail. If it is a higher-level course, I'd bet not a single one of them would pass, since the things they misunderstood compound as they get deeper down into abstractions.

Of course it is possible, a few people throughout history have done it, but it is rare enough that I would claim that anyone who thinks they did are deluding themselves unless they can come up with further proof.


> “read, comprehended, and applied”

To quote my prior comment, this basically says it takes more than reading. But reading is the root of it. These days “reading” encompasses everything from books to the internet and even YouTube-style videos. Basically, individual learning content.

It’s very possible if you’re motivated and put in the time. College is forced motivation.


Maths doesn't test itself for you; you need someone who can tell you when you are wrong. It is not like a program crashing or an API returning an error: all errors when doing maths are entirely silent.

Let's take an example from the subject at hand, series in calculus: we want to prove that the sum of a series converges. How do you verify the proof without an instructor? Check that it is the same as the book's? Most likely it won't be the same; proofs can come in many different forms. So either students start discarding their correct solutions thinking they are wrong, or they fail to discard wrong solutions. Either way they fail to fully grasp the material.

You need to be a genius to properly root out all the errors in your head on your own; it doesn't matter how many videos or books or tutorials you go through, they can't evaluate your creative solutions like a real person can. Of course the need for instructors mostly disappears as you reach mathematical maturity, but getting there without help is extremely hard.


If you randomly pick a group of students who passed these exams more than 1, 3, and 5+ years ago, I bet nearly all of them would fail.

Why? Because learning to take tests and learning are two different things. And nothing is permanently learned. Except maybe how to ride a bike.


I 100% disagree. College moves at a glacial pace once you're used to learning on your own. It's definitely possible to have a stronger grasp of CS concepts after a 9 month intensive study than 4 years in traditional education. Corollary, it's also common to not have any practical skills coming out of a CS degree.


You can get a piece of paper saying you have a degree with no work. Conversely, you can work really hard in college and learn orders of magnitude more than at any bootcamp.


It sounds like you have very high standards for CS fundamentals, so high that the average software engineer definitely doesn’t know them and very few companies have a workforce that does.


>> biologically impossible for the brain to properly learn these things in 9 months

[citation needed]


I mean is that really a controversial statement?

I've been doing self-studying for the past few years and there's no way I could possibly truly learn the bulk of CS in only nine months. Every skill needs many hours of practice and the brain needs time to process information. Unless you're part of the 1% of people who are extremely quick learners and have profound memory retention skills, most people need a lot of time to comprehend complex topics. It's not pure coincidence that some of the best performers of a lot of subjects started when they were young: by the time they were in college/adulthood they already had thousands of hours of practice.


>> I mean is that really a controversial statement?

Yes. "Biologically impossible" is quite the high standard to meet.


Could you learn all of computer science in nine months? No.

Could you learn the necessary CS fundamentals that are included in a CS degree in nine months?

Nine months of Lambda School full-time is virtually equivalent to the amount of time you'd spend in the CS portion of a four-year CS degree.


Lambda school has 8 weeks of CS and the rest is just learning frameworks. At least according to this:

https://lambdaschool.com/courses/full-stack-web-development

Also notable is that the CS block is last, right before interviews. It would have made more sense to put that first and then refer to that knowledge in the other parts; as it is, it looks like Lambda School treats CS fundamentals as interview prep instead of as necessary building blocks.


It may be just a difference of bottom up vs top down teaching methods.


CS is interwoven throughout the entire curriculum, not just the final eight weeks.


Generally you need about ten years to reach mastery in just about anything. But we're not talking about mastery. Nine months until you start being useful seems reasonable.


> Sorry, but it is IMPOSSIBLE to learn the CS fundamentals in 9 months. Period. What you may be able to do is to teach people how to game interviews within 9 months, sure. Good luck once that performance review comes around...

How many jobs require knowledge of “CS fundamentals”? Typical development jobs are basically “dark matter developers” writing yet another software-as-a-service CRUD app or a bespoke internal app that will never see the light of day outside the company.


That sounds depressing.

Why would you want to confine yourself to working on CRUD apps instead of working on cool tech?


Because after programming either professionally or as a hobby for over 30 years, no “tech” is “cool”. It’s simply a means to have enough money deposited into my bank account to enjoy an upper middle class lifestyle.

In given week, I’m up and down the stack from the web to the infrastructure (AWS). I found all of the things that AWS enabled that took literally months to provision in the old world “cool” for about a year and then it became just another means to an end.


What a ridiculous assertion, it’s impossible to learn fundamentals in 9 months??

2000 hours is a LONG time to do something, you will certainly have a grasp of the fundamentals after 2000 hours. Will you be a master? No, but did you stop learning when you got your first engineering job?

Hell, most traditional CS degrees (at least their core credit requirements) can be completed within 9 months if you crunch.


Also, you can learn fundamental, intermediate, and even advanced topics in CS without a formal degree via MOOCs, like the ones offered by Coursera, some of them taught by professors from universities that you may not even have the money/time/grades to get into for a formal degree.

I was thinking about attending a CS undergrad course some time ago, but since I'm full-time employed at a very big ("unicorn") startup, an opportunity too good to be wasted, I opted to take some classes on advanced topics such as Discrete Optimization and Automata Theory on Coursera. It has been the best use of my spare time ever.


True, but the assertion that 9 months of a bootcamp that includes "CS fundamentals" is the same depth as 4 years of CS study is also a little silly. They are different beasts, for different purposes.


What exactly is the purpose of a CS degree?

If you go into your CS degree as a young hacker, adept at *nix with a penchant for assembly language, with enough experience to appreciate the CS concepts, you're going to be disappointed. There's a strong chance that you'll know more than the professors.

Alternatively, if you go in knowing nothing and really work hard to sponge up everything the degree program wants to teach you, you're going to think you've learned something but you won't actually leave with any useful knowledge. You'll have dated and often half-baked understandings about how computer systems should theoretically be designed. You might have an idea how a basic OS kernel works from your OS class, or how to write a sorting algo from your data structures class, or maybe a parser from a language design class. None of that is useful at all, other than maybe to get you interacting with your machine, in the hopes that you'll learn how to use a computer (which is a completely different set of knowledge) by the time you graduate. And if you did ever want to write your own programming language, you can throw out everything you learned in class and start over by Googling it and following the best practices of today, like we do for everything else.

The only point I can see is to be able to say you have a CS degree, and hopefully that's losing value.


> you can throw out everything you learned in class and start over by Googling it and following the best practices of today, like we do for everything else

Yep, everything newer is automatically better. No need to have any grounding in objective utility as long as you're up-to-date on the latest web framework!

> You might have an idea how a basic OS kernel works from your OS class, or how...or maybe...None of that is useful at all, other than maybe to get you interacting with your machine, in the hopes that you'll learn how to use a computer (which is a completely different set of knowledge)

Any decent computer science program is both theoretical and hands-on. Projects where you get hands-on experience with the concepts you just learned. Not all of us can read Data Structures and Algorithms and implement a search algorithm as an 18 year-old. No, not everyone needs to know this. Yet, some people do. For those people who do need to know, their work wouldn't be possible without it.

> If you go into your CS degree as a young hacker, adept at *nix with a penchant for assembly language, with enough experience to appreciate the CS concepts, you're going to be disappointed. There's a strong chance that you'll know more than the professors.

True, but (especially now) I'd be surprised if this is <2-3% of the CS undergrad freshmen population. I personally switched from pre-med to CS as a junior in college with only a single QBasic high school class under my belt.

Finally, as others have pointed out, (and as much as I hate admitting to this phrase that was so often spouted out by my college professors, perhaps I have drank the Kool-Aid), you're learning how to learn. That is, you're practicing juggling abstract concepts in your head, making connections and weighing solutions. I found my CS classes were 10x better at making me a general problem solver than my pre-med classes (mostly my discrete math/logic classes).


> Yep, everything newer is automatically better.

I didn't say newer is better, I'm saying that your recollection of how a compiler worked in university will not be of any use to you if you need to write your own language for production today. I've also never heard anyone say "oh yeah, I did this once in college, here's how you do that".

> For those people who do need to know, their work wouldn't be possible without it.

Who? Seriously, who would ever be employed to author a search algorithm using the knowledge they obtained in a 4-year computer science program? Extremely few people are involved in the work of implementing anything that low-level, and thinking you understand what's going on under the hood is just tricking yourself. For example, your 4-year CS degree holder would surely understand the trade-offs between a linked list and an array, right? Then you find out none of that is true in practice: https://dzone.com/articles/performance-of-array-vs-linked-li...
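
You can even watch the textbook picture wobble from a high-level language. A quick, unscientific CPython micro-benchmark (my sketch, not from that article): traversing a contiguous list versus a hand-rolled linked list of the same values, where the pointer-chasing structure loses badly despite identical big-O.

    import timeit

    class Node:
        __slots__ = ("val", "next")
        def __init__(self, val, next=None):
            self.val, self.next = val, next

    N = 100_000
    arr = list(range(N))
    head = None
    for i in reversed(range(N)):  # build the linked list 0 -> 1 -> ... -> N-1
        head = Node(i, head)

    def sum_ll(node):
        total = 0
        while node:  # O(n) traversal, same as the array in big-O terms
            total += node.val
            node = node.next
        return total

    print(timeit.timeit(lambda: sum(arr), number=10))
    print(timeit.timeit(lambda: sum_ll(head), number=10))  # much slower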

> I personally switched from pre-med to CS as a junior in college with only a single QBasic high school class under my belt.

I'm not saying only the l33t can do CS, I'm just giving the example that if you actually do know what's going on and you have to take all those classes, it's very obvious that the professors have nothing of value to add that you can't learn faster by yourself online. I grew up cracking windows software for fun and I had to watch a professor stumble through the basics of x86 assembly. It was an obvious waste of time for all parties. Educational resources today are vast, and if people want to learn something, they can just go learn it.

Also, have you looked at professor salaries vs engineer salaries? I know there are some people who teach at night or do it for the passion of it, but the reality is that it's not going to attract the most ambitious minds in our society. Meanwhile, so many major pieces of software are available for free online, and you can actually talk to the teams doing the work, and they'll let you contribute and give you feedback... for free!

> Finally, as others have pointed out, (and as much as I hate admitting to this phrase that was so often spouted out by my college professors, perhaps I have drank the Kool-Aid), you're learning how to learn. That is, you're practicing juggling abstract concepts in your head, making connections and weighing solutions.

This is my primary objection, and perhaps you have drank the Kool-Aid. If you get a CS degree at any major institution, you are not learning how to learn. You're both preventing learning and picking up bad learning habits. Doing a pre-built lab in a CS course is so much worse than contributing to literally anything on github. Studying how algorithms used to work in the 70s is useless compared to diving into any modern piece of code and benchmarking and learning how to optimize.

Look at what is actually in a CS program from a respected school: https://cse.engin.umich.edu/wp-content/uploads/sites/3/2019/...

There's a 4th-year course just called "Algorithms", check it out: http://www.eecs.umich.edu/courses/eecs477/f02/syl.html

They might as well have called it "Inefficient implementations of already solved problems".

I can't imagine why there's a "Databases" class and "Web Databases" class, but I think you see my point. Nobody is implementing a database with anything they learned in uni, and nobody is doing a better job of learning how to use a database in school than they would with experimentation and online explanation.

Maybe the fact that you're graded on learning these things, coupled with the enormous price, drives a student to independently research each of the topics presented throughout your 4-years? Then through research and play, they would gain understanding of each topic. But if the school is simply serving as an extremely expensive prompt for self-learning, then what value is the school really adding?

> I found my CS classes were 10x better at making me a general problem solver than my pre-med classes (mostly my discrete math/logic classes)

Every time I hear that reasoning, I think it's a rationalization for spending huge amounts of time and money doing something pointless. I also like "the college experience" as a good reason it was valuable.

Maybe general problem solving is being taught and is valuable, but that's not what these CS degrees are being sold as. To pick on umich again, look at the "Student outcomes": https://cse.engin.umich.edu/academics/undergraduate/computer...

So if you graduate with this program, you'll be able to "Analyze a complex computing problem and to apply principles of computing and other relevant disciplines to identify solutions" and "Apply computer science theory and software development fundamentals to produce computing-based solutions"? Maybe if you study independently while also stressing about passing your exams and doing your homework, then you'll be able to do those things, but it'll happen at a slower rate than if you just started building software and researching as you went. I just looked at a few of the syllabuses for this program as an example, and there's no way it's going to deliver what they're promising.


> If you get a CS degree at any major institution, you are not learning how to learn. You're both preventing learning and picking up bad learning habits.

What planet am I on right now?


Exactly!


Did you mean to write ">2-3%"?


Getting the fundamentals to work with most problems related to CS? I'm sure the average hacker:

* can design complex algorithms and prove them, and knows most of the algorithmics space

* has enough algebra and statistics knowledge to develop/understand e.g. ML methods

* can write computational software without numerical methods knowledge

* has a broad understanding of programming langs and computational concepts

* can easily read more mathy books related to CS, and has no trouble following whitepapers

So I would say a solid math base is what you get at university that allows you to work on more challenging/interesting projects.

Unfortunately most unis, except the top ones, are a waste of time, at least in my country.


Worse still, you'll be tens of thousands of dollars in debt, while anyone from a third-world country with access to a computer can connect to the internet and do your job for a fraction of the cost. CS degrees are a hazing ritual that keeps CS professors and fat-cat administrators employed.


Not all countries have the profit oriented model of US universities.


Which ones don't?


Across the EU most universities have tuition costs of <$100 per month. The bulk of that usually goes to a public transport ticket.

For people who can't afford that, there are usually some forms of getting it for free.

Higher education really shouldn't be a matter of personal finances.


If there is any tuition.


Most state universities across European countries, where students get partially supported via our taxes, to the benefit of all.


If someone with no working experience knows more than the professors then they go to a poor university.


Coming from a strong hobbyist background, I did a "conversion" master's in computer science at a decent university, and definitely knew more than the professors on the topics that I already knew about. I didn't know anything about SDLC or requirements engineering, for example, though, so from that perspective a good third of it at least was very useful (and at the end of it I had a lovely qualification and had spent a generally enjoyable year, so I can't really complain).

(This was slightly longer ago than I'd care to admit, so maybe things have changed... I would expect it's still something of a case of "YMMV", as we used to say.)


Heh, I don't think I've ever heard SDLC mentioned in industry outside of bad consultants.


Students don't study CS for 4 years full-time. They study perhaps 18 months' worth of CS modules; the rest are things like chemistry and physics, pretty unrelated to CS (though valuable things to learn generally), and each module takes only part of a day. You're not actually putting 8,000 hours into CS - nowhere close to it.

If you looked at how much time students spend (in and out of class) on CS and math topics, you'd find it's way less than you'd assume.


There is a saturation point for learning. A 1 year program where you spend 20 hours a week studying a subject is not equivalent to a 3 month program where you spend 80 hours a week studying.


That's not what the research shows, given a properly structured learning environment. Research actually shows the losses are much greater due to context switching than to saturation.


You still need spaced repetition for actual retention of the skill, though it is easier to recover a forgotten but well-learned skill.

It's also known in layman's terms as practice. 9 months of cramming is not good enough. Neither is 4 years of tests without actually using the taught things in practice and laboratory exercises (and preferably homework too).

Source: SuperMemo research, Piotr Woźniak and his citations. No reason why it wouldn't apply to CS or programming.

Retaining a useful amount of information requires essentially lifelong, interleaved, focused learning, at least an hour a day. You can also maybe pick some things up on the job, but that has stricter limits.


> You still need spaced repetition for actual retention of the skill, though it is easier to recover a forgotten but well-learned skill.

This isn’t true. People forget the most advanced skills they learned unless they use them regularly but they don’t forget the ones that are necessary for that last skill. If you ever knew calculus you’ll remember algebra after decades without using it. I haven’t spoken German in most of a decade but I can still read it fine. Spaced repetition is the most efficient way to durably learn something but it’s not necessary.


Do you have any sources for your statement? Especially about context switching. While it is terrible on very short timescales (because cohesion is low), it is beneficial for introducing spacing while running the maximum allowable workload.

People tend to forget unpracticed skills first, not the oldest ones, and the most complex parts (least compressible, biggest) first, not necessarily the hardest. And the forgetting is exponential, depending on how well the material is presented (cohesion, specifically).

Exponentials flatten a whole lot.

You do forget a little still over time. Ask a 40 year old who knew algebra and calculus really well and does not use it often if at all. (I do sometimes, so I don't count.)

Stability and accuracy are also separate variables.

Algebra is not one thing; it's thousands of memory chunks. As a probe, think about whether you remember the Fundamental Theorem of Algebra, which is a keystone but slightly tricky. Compare that to whether you can solve second-order ordinary differential equations, and whether you can solve an equation involving logarithms of rational non-negative numbers. (I picked random non-absolute-basics from high school, 101, and 202. Bonus points if you spot the stinker.)

--

I always thought of even the latest versions of SuperMemo as an Antikythera mechanism for learning. It's a great model even if the principle is not yet properly researched.
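
As a toy illustration of that model (my sketch, using the common R = exp(-t/S) retrievability form and an assumed per-review stability multiplier; this is not SuperMemo's actual SM-17 algorithm):

    # Toy model: retrievability R = exp(-t / S). Each well-timed review is
    # assumed to multiply stability S by a constant factor, so the review
    # intervals stretch out. A sketch only, not SuperMemo's SM-17 algorithm.
    stability, day = 2.0, 0.0
    for review in range(1, 8):
        day += stability  # review when R has decayed to exp(-1), about 0.37
        stability *= 2.5  # assumed spacing-effect multiplier
        print(f"review {review}: day {day:6.1f}, next interval {stability:6.1f} days")

After roughly seven well-spaced reviews the next interval is measured in years, which lines up with the "pretty much cemented" point above.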


Lifetime maintenance of high school mathematics content. Harry P. Bahrick and Lynda K. Hall, Journal of Experimental Psychology: General, 120(1), 20, 1991.

An analysis of life span memory identifies those variables that affect losses in recall and recognition of the content of high school algebra and geometry courses. Even in the absence of further rehearsal activities, individuals who take college-level mathematics courses at or above the level of calculus have minimal losses of high school algebra for half a century. Individuals who performed equally well in the high school course but took no college math courses reduce performance to near-chance levels during the same period. In contrast, the best predictors of test performance (e.g., Scholastic Aptitude Test scores, grades) have trivial effects on the rate of performance decline. Pedagogical implications for life span maintenance of knowledge are derived and discussed.


Correct, later courses force you to practice earlier skills. This is the mechanism these researchers stumbled upon. Plain spaced repetition explains a lot. The SAT is definitely not meant to test learning performance, though some people with very high SAT scores may be a bit more efficient or know tricks that are not widely taught; the SAT is too easy to differentiate them.

Related knowledge strengthens already-known things, and once critical maximum stability/retrievability is reached (optimally after 7 or so precisely spaced rehearsals) it is pretty much cemented. Without optimal spacing, it probably takes quite a few more repeats. At maximum stability/retrievability the exponential keeps a "flat" form for a very long time. A refresher may be needed to get facile again; otherwise it may take a short while to remember and there may be mistakes. Even a trivial refresher will work, though, and chances to use basic algebra are many.

These are the principles SuperMemo's SM-17 algorithm puts into quantified form. Here's a link with all the history and references to some other research: https://www.supermemo.com/en/articles/history

Thanks for the paper; it's slightly wishy-washy, but still useful.


I only learned math up to maybe the level of a decent BA student but I feel like when you take a math course you learn how to “do a trick” (pass the exam) and then actually learn the material when you take the next course, where you need to figure out which parts of your last class you apply to “do the trick” in the current class.

So my question is what level of calculus is retained, and if students who take calculus then advanced calculus or introductory analysis or whatever you’d call it retain more.


Spaced repetition does not necessitate spending a few hours learning per day. That's not what the research says at all.


Even one to two hours is enough if an optimal schedule is used. It takes longer if the material is badly presented or not practiced.


Please show me the research that says there is no saturation point to learning.


The burden of proof is on you for showing that there is one.


That's not how science or epistemology works. The answer to any question is "I don't know" until you have some kind of research backing an answer. There is no default answer that something does or does not exist or is or is not true.


I'm not the one who said "The research shows". The status quo is the 4 year degree. If you think that your 9 month degree is better, the burden of proof is on you to prove that it is.


Looking at the discussion here, I think there might be a problem with people's frames of reference and vocabulary. What you describe applies to the US; in Europe your bachelor's degree consists of 180-240 ECTS (European Credit Transfer and Accumulation System) credits, with 60 ECTS being equivalent to a year of full-time study. The 180/240 ECTS are almost exclusively from your field of study. Unless you have a very specialized degree, you won't have any chemistry in your CS degree. The only exception is non-CS material covering fundamentals you need for your degree, for example the mathematical background or scientific writing.


That amazes me. What do Europeans think of their counterparts that have studied half as much professional material?


I can't really answer that, as I never had any contact with a US curriculum.

The introduction of the ECTS was, however, highly controversial, at least in Germany. Before the Bologna Process, German universities had the "Diplom", taking no less than 4.5 years for CS. The introduction of the bachelor's with as little as 180 ECTS, so only 3 years, was seen as severely cheapening the education. Universities generally don't see themselves as preparing you for the job market; instead they provide you with the prerequisites to enable you to contribute novel ideas to your field in the form of a doctoral degree. That almost every CS job requires at least a bachelor's degree is just a side effect from the point of view of the university. An often-mentioned criticism was that the government was selling out the education system so companies could get new employees more quickly. Quantity over quality. As a result, you still find a high percentage of students automatically adding a master's to get the equivalent of a "real degree", with some universities not changing much in the structure of their Diplom curriculum, awarding you a bachelor's degree after 3 years but expecting you to finish the rest.

There is, however, also the opposite, in the form of a (paid, but badly paid) 3-year apprenticeship after school. You can become an "IT specialist" without going to university, and even without the prerequisite school years to start university (normally, successfully finishing grade 13 and thus getting your "Abitur"). You can start the training once you have successfully finished your 10th school year and thus gotten your "Realschulabschluss". The 3 years are 50/50 professional school and working in a company. Looking at the curriculum of one of the first schools on Google, they have in total between 880 and 960 school hours. Depending on your focus: 300-400 hours of "information and telecommunication systems", 200-300 hours of application development, 200 hours of econ and business processes, and, due to accepting people who finished with 10th grade, 60-100 hours of English lessons. "IT specialist" here means either becoming a sysadmin or a coder.


Well now I'm curious. What are you assuming that I'd assume?

FWIW, I assume little. I have a degree and know the effort that went into it, and am well versed in how courses are distributed in and out of your major over 4 years of study. I'm not engaging in this discussion to say one is better than the other... I'm saying they are different. Different schools, for different purposes, serving different people.


Fair enough. I'd guess on average 500-800 hours spent actually studying and writing code in CS-related topics & math.


An undergrad degree is only 18 months (2 years minus summers) in a major, not 4 years. And some of that is not-particularly-relevant electives. And if you already have a college degree, you might have some overlap with the major.


Well, all those courses that aren't comp sci are still part of a degree, still require learning skills, and still promote university-level thinking and production.

I think bootcamps actually demand a far higher rate of output and often at a very high level within specializations. The problem is I don't want to hire someone who can crank out bleeding edge framework code 20 hrs/day for 6 months.


From what I've seen, most people treat the university as a prerequisite to getting a decent job. Only a very small percentage actually treat all classes in a serious way. Most just skate by in a majority of classes that they deem to be filler.


As someone who dropped out of college after the first two years (before specializing in a degree), I'm not sure "university-level thinking and production" is actually meaningful.


It isn’t. It’s like transfer learning generally, despite decades of research trying to find it there’s no real evidence for it. Learning Latin barely makes you better at learning Italian, never mind reasoning. People learn what they’ve been taught and overwhelmingly don’t generalize.

University-level work in physics, literature, and chemistry is so different as to have basically no overlap. "University level" is as meaningful as "high school level" in a world where "Calculus II" tells you the course covered calculus and is otherwise uninformative.


Physics and chemistry have plenty of overlap; you even have physical chemistry and chemical physics.


I can’t imagine why I would need or want to go back to something like this every 8 years. You don’t see people going back to university and relearning other trades every 8 years.

Hopefully changing technology will be learned on the job as needed.


9 months is short, and far less than that is spent on teaching CS. So yes, a CS degree would be far more comprehensive.

A bootcamp is not a substitute for a CS degree.


A CS degree is going to require you to cover a lot of math, which is required for CS proper. A bit of an apples-and-oranges comparison.


If the purpose of the degree is getting your foot in the door at your average corporate dev job - it’s good enough for most people.


Yes - I was super disappointed by the last two to three paragraphs. Was there any research there? Some evidence that these coding schools and boot camps actually provide value?

I didn't finish my computer science degree, I've never done a single boot camp or coding school, and I used to write ColdFusion before moving to VBScript / Classic ASP. Talk about obsolete! But I learned VB.NET and C#. I migrated to HTML 5 and CSS 3 and through jQuery and newer JavaScript libraries. Now I'm a Node.js developer. Go figure! My "learning by doing" has nothing at all to do with any formal education, least of all the aforementioned boot camps!


> If you have a good grounding in CS, you have longevity. Couple that with a good math background, and then you have even more possibilities.

I think you've just described the 1% of programmers out there. For the rest, it's bootcamps and stackoverflow. Unless we're still deluded that any software developer is as good as any other just because they happen to have the same title on their contract papers. And yes, I agree with what you're saying.


Most dev jobs in my experience rarely need much CS knowledge past data structures and the odd basic algorithm. This may be different at the top-tier companies, but they make up a small part of the working world.

Being a good person to work with, ability/desire to learn stuff, cool head when things go wrong, and being reliable are far more valuable once you've got that basic CS knowledge.


I’m not sure where you are, but in Denmark this is absolutely not true as most companies and organisations don’t even go over your resume if you don’t have a degree.

I know this isn’t popular on HN, and maybe things are very different in places where education isn’t free, but I’m fairly certain we aren’t the only country to do this.

The jobs themselves don’t usually need much CS knowledge, as you point out, but the degree is an entry point because it gives employers a certain form of safety. I know, I know the HN mantra of “weeeell you can get a CS degree and suck”, but the reason practices are like this in my region is because the companies who followed it did better than those who didn’t.


Can you show me a double-blind controlled trial of companies that interviewed only CS grads vs. anyone? I'm pretty sure such a study does not exist, so I'm pretty sure you can't make claims like that with any confidence.

There are MOUNTAINS of research at Google, which cannot be shared externally, showing that despite all of our efforts we are hardly better at selecting candidates through interviews than flipping a coin.

There's also a ton of research by Amos Tversky at the Israeli Air Force that suggests similarly. That's where Google got the idea to try to measure this, and how we got our "strange" interview process.


Guess what? Not every company needs smart people(tm). The level of complexity of software at Google is completely different from that of your standard enterprise developer, and those are the jobs most people have.


"but the reason practices are like this in my region is because the companies who followed it did better than those who didn’t"

Except that this practice will be dropped in the next few decades in many parts of the world, particularly in the tech industry.


I understand YMMV for all statements, but I'd love to understand why CS is needed for.. 80% of programming jobs.

We mostly write CRUD apps to fulfill some sort of business or project function. If we're not doing that, we're trying to make two or twenty different systems work together. If we're not doing that, we're developing CRUD interfaces for our databases with some business- or project-related logic that can be executed programmatically.

None of this requires a CS degree. Design and architecture doesn't. Testing doesn't. Implementation doesn't.

I don't write embedded firmware for IoT blockchain roundhouse kicks.. I write Java/.NET/PHP/JavaScript/Python code to make CRUD happen faster.


I can't really point out what exactly it is, but it did make a difference for me.

I did a 4-year apprenticeship in IT and programming before deciding to go to college. It was all fine; I could do real coding for a good company. I knew the syntax of a few languages, database design, UML, some basics of how webservers work... But it didn't feel proper. It was more like a lot of things, but only superficially.

College, on the other hand, felt much more in-depth. I was really given the time to properly understand the things I was doing. Even the "maths" bits helped. E.g., vector maths changed the way I reason about little unrelated things sometimes.


I don't get why the justification for not needing to understand CS is that 80% of the programming jobs are CRUD apps.

That's such a weak justification in my opinion. Why would you want to work on CRUD apps over more interesting stuff?


One of my first senior engineering roles was making what was at first a CRUD app to support user studies and data collection. At some point the project blew up, and requirements grew many fold in complexity. My CS background let me view the problem space from a broader perspective, and I ended up building what was essentially a spreadsheet based DSL so that user study designers could define what were, under the hood, state machines that covered pretty much any scenario they could come up with.

The workload increased by 50x, but the system I built only needed a couple engineers to maintain and add features to.
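
For illustration, a hypothetical Python sketch of what a table-driven state machine like that might look like (all state and event names invented; the real system was presumably far richer):

    # Each spreadsheet row becomes a (state, event) -> next_state rule.
    rows = [
        ("welcome", "next",    "consent"),
        ("consent", "agree",   "survey"),
        ("consent", "decline", "goodbye"),
        ("survey",  "submit",  "goodbye"),
    ]
    transitions = {(s, e): t for s, e, t in rows}

    def run(events, state="welcome"):
        for event in events:
            # unknown (state, event) pairs leave the state unchanged
            state = transitions.get((state, event), state)
        return state

    print(run(["next", "agree", "submit"]))  # -> goodbye

The point of the design is that study designers only edit the rows; the tiny interpreter never changes.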

I hate to sound cocky, but I think any engineer without a formal CS background would have attempted to solve each case on an ad hoc basis, either greatly delaying the project at best, or fucking it up irremediably at worst.

I have many examples like that.

Perhaps in all of the examples you cited, you’ve inadvertently cost the company 10 times the time or the money that a more methodical engineer with a theoretical background would have. You don’t know what you don’t know, and that’s the difference between an engineer and a technician (a distinction that the US doesn’t tend to make, but some other countries do).


Bootcamps get a bad rap on here, but for experienced developers I could actually see them being fairly useful. There is quite a difference between training an experienced developer in a new language / framework than trying to bring someone from zero knowledge to an employable level of knowledge.

As for CS fundamentals, the only time I have needed to implement a sort algorithm was at university and in some interviews. Not saying that it's all bad, it is useful to understand the underlying principles but there seems to be a focus on stuff that is becoming less and less relevant.


the only time I have needed to implement a sort algorithm was at university

Sure, but at some point you've probably had to implement some sort of novel algorithm. The reason they make you implement a sort algorithm isn't so that you learn to write sort algorithms, it's so that you learn how to implement an algorithm in general.


In most jobs - you won’t have to implement any type of complex algorithm. The complexity is usually in the business requirements.


you won’t have to implement any type of complex algorithm.

Even designing and implementing fairly simple algorithms is easy to screw up if you're not sure how to approach it.


Frameworks and programming languages are useful for anyone that uses them.

A CS degree, on the other hand, teaches you a lot more than you really need to know as a software engineer (compiler design, theory of automata, etc.), so it's kind of wasteful. Sure, it's interesting and fun to learn as an academic, but it's not actually useful.

The nice thing about bootcamps is they teach you just what you need to learn and nothing more, nothing less.

There are few software engineering jobs that utilize the full scope (or even half) of the CS knowledge you gain from a CS degree.


Bootcamps alone are insufficient to make a person a programmer. Learning CS/math in a formal setting is expensive and out of reach for many, and it may not prepare you for a programming job either. These are not mutually exclusive.

One can learn CS at one's own pace. Being interested and driven to learn is the difference between a bootcamp coder with only a few years of shelf life and one with a long career. The same goes for that CS degree.


I see value in learning CS, but not the degree/university part. University has become a business and is causing long term strain on society through debt.

I think bootcamps attempt to fill this gap, although maybe less so for CS/math fundamentals. Online coursework seems good, but the social aspect of having a physical space to go to, meet, and collaborate in is what's missing.

Any ideas for how to solve that?


What will give you longevity is a degree in CS. Frameworks and technologies change, but CS concepts stay consistent.

Try getting a job outside of Silicon Valley by just knowing “CS concepts” when your standard Corp job wants someone who can hit the ground running on their tech stack.


> Bootcamps are rubbish in the long term.

That statement is rubbish. An education in CS will get you certain things, but your career is what you make of it. If you (hypothetical person, not OP) go to a bootcamp expecting to be handed a career on a silver platter, then you're a moron. Yes, there are plenty of those morons. I knew some of them when I went to Dev Bootcamp. And they were the types who didn't take charge of their careers, work on their people skills, go to meetups, interview like mad, etc. They got exactly what they put into it and either floundered or ended up in a different career, and I don't begrudge them for figuring out that the field wasn't for them.

Going to a bootcamp was a turning point in my life, and I'm making six figures working on a high profile app 6 years later as of this month. I know others who went to my bootcamp, didn't have a degree in CS, and are way more successful than I am. But it doesn't even matter if someone goes to bootcamp; they can learn entirely using free resources, and their inner drive is still the most important variable in their future success.

Is having a CS degree better? Maybe it is. But your post is telling people, who may be the perfect type of person to go to a bootcamp, that bootcamps are junk and are short term gratification. That's asinine.

By the way, what makes you think that your knowledge of CS and AI is going to be applicable in another decade? You have a good shot, for sure, but if there is a revolution in AI that changes the field fundamentally, or if (when) computers begin programming themselves, then you are just as screwed as someone whose framework or language of choice has become obsolete and dead.

Tech stack might go out of vogue? Then fucking learn another one. Your language is going the way of COBOL? Then fucking learn another language. The end. You don't need a formal education to do that. The only reason anyone ends up in a bad situation because their skills became obsolete is that their ability to predict the future sucks and they couldn't or wouldn't learn new things fast enough. Most languages aren't even that different, and there's a TON of replication between frameworks, libraries, compilers, VMs, etc.

I'm not trying to disparage your background in CS. I commend you for it. No matter what, there will always be people who are too inept to adapt. CS will only help those who have a calling for it, but no matter what, if someone doesn't have that fire within them that's going to take them where they want to go, they're an accident waiting to happen.


I'm making six figures working on a high profile app 6 years later as of this month

6 years is not really long enough to measure longevity.

Is having a CS degree better?

That's exactly what I'm saying. Your time and money would be better spent on getting a CS degree if you are concerned about career longevity and opportunities.

By the way, what makes you think that your knowledge in CS and AI are going to be applicable in another decade?

Because general CS knowledge stays consistent.

This is almost like asking whether broad subjects like math, chemistry, physics etc. will be relevant years from now. Yes, they will be. If you are asking this question, then perhaps you should reevaluate what you think CS is.

If AI reaches the point of becoming self aware and writing code, then we're all out of work. I'm not too worried about that happening.


OP may have other skills that they are now complementing with coding knowledge.

Specific domain knowledge, outstanding interpersonal skills and customer empathy, design/UX/sales etc - these will likely be far more valuable than CS fundamentals if AI starts writing all code.


That's exactly what I'm saying. Your time and money would be better spent on getting a CS degree if you are concerned about career longevity and opportunities.

I am 45 years old and got my degree from a no-name school that taught one simple data structures class, while the rest of the classes taught outdated programming languages (except for C). I can honestly say that nothing I learned in school served me better than the time I spent hacking around in assembly in the 80s in middle and high school.

How much more "longevity" should I be on the lookout for, since my degree was useless? I've already been doing this professionally for almost 25 years.


I am self-taught. I never went to a bootcamp or had any schooling past high school. I fell into it starting out doing design work... What has it meant over the past couple of decades? A lot of constant reading and learning. It never stops. I spend an average of 2 hours a day reading and learning, every day. Some of it I tuck away, some I explore. In the end, I've seen things shift multiple times.

When Flash was at its peak, I worked in eLearning writing simulations in Flash and supporting backends in VB/VB.NET, some C#, and ASP/ASP.NET with SQL. From there, much more web/JS, and various other database backends (Cassandra, Mongo, Redis, etc.). Currently still working with C# and Node.js in web apps. Learning Rust.

In the end, progress doesn't stop and wait for you. I tend to push for things faster than my workplaces want to adopt them. It's a struggle, and it doesn't end. I will continue to push until I die. I have absolutely no plans to retire.

Formal CS knowledge and education can help. Understanding and learning multiple platforms and languages helps more. If you want to settle in and rest on your laurels, you won't last forever. You're best off understanding various ideas, workflows, and patterns, and how to recognize when one is a better fit. I have my preferences, but I am under no illusions that things will stay the same.

In the end, you have to commit to spending time each month/year learning and working on new things. It's the only way to keep up or get ahead.


> Lambda School co-founder Austen Allred envisions students coming back to his coding school every eight years or so to learn new skills.

Eh... While that might be a good story to sell to his investors, most anybody worth their salt won't need to go back to a "bootcamp coding school" every 8 years to learn a new language.


You're right. It's kind of funny that boot camps are promoted because "university is a terrible way to learn programming", but they then try to sell boot camps as an effective way to learn the framework du jour.

Maybe if you finished an undergrad you'd be well equipped to stay on top of an ever-changing tech landscape?


What is magical about undergrad that makes it exactly the correct amount of time to spend in school?


It's exactly difficult enough and requires enough work that for many people it's an effective sorting tool for general intelligence and work ethic.

If bootcamps take off it will be because they replace this function of university.


It’s the standard way to do things so doing it signals conformity and employers don’t generally want non-conformists because they’re boat rockers.


Why even have school at all if everyone can just learn everything on their own, without a teacher or lab mates?


I'll take the "no true scotsman".

If you know how to program, that skill does not simply "go obsolete".

The sort of "skill" that this article is talking about seems to be on par with like, the time I thought I knew how to write C as a teenager.

The local market going to shit, you developing medical issues, ageism, etc are all far more of a worry than "oops, I've been doing QBASIC for 10 yea.....[NO CARRIER]"


Flash is hardly the only tech to have become obsolete. Anyone who has been around this industry more than a decade can list off multiple technologies we don't work with anymore.

The article was right that ongoing learning is the key. And while formal higher education is not for everyone, it does well at teaching people new techniques for learning and research. It isn't the only path, especially in today's reality. Its current failings are one reason that bootcamps exist...

But the idea that people are going to let their skills stagnate for 8 years, then return to a bootcamp to get the latest tech... Sorry, but that is simply absurd.


I'll be the first in this thread to start listing off technologies other than Flash. We're at the point now where if you include jQuery on your resume, recruiters are less likely to hire you. Ruby used to be trendy for startups; Python 2.7 used to be popular. Angular 1.x beat out BackboneJS, MeteorJS, and EmberJS, but even Angular was completely rewritten as Angular 2.x and superseded by React.


Completely agree... I don't even remember enough Perl or Ruby to be effective with them at all. It all changes over time. I do miss parts of Flash/Flex though.


I still pull Perl out when I get a chance.

It's nice for small scripts that people might use bash for. I do Python most of the time, but Perl's syntactic sugar makes it so much nicer for something like looking up a set of files matching a regex and moving and renaming them.
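
For comparison, a minimal sketch of that kind of task in Python (directory names and the pattern here are made up):

    import os, re, shutil

    # Find files whose names match a regex, then move and rename them,
    # e.g. incoming/report_2019.csv -> archive/2019_report.csv
    pattern = re.compile(r"report_(\d{4})\.csv$")

    for name in os.listdir("incoming"):
        m = pattern.search(name)
        if m:
            shutil.move(os.path.join("incoming", name),
                        os.path.join("archive", m.group(1) + "_report.csv"))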


I was a Flash developer from 2006-2010. Flash used ActionScript, which is/was based on the ECMAScript standard.

Since then, I have developed primarily in JavaScript, also based on the ECMAScript standard.

Looking back, I don't even think anyone could find the seam in my career.


This makes the article somewhat silly, because the demand shifted over to JS, HTML5, Canvas, etc., and because ActionScript is in the same family as JavaScript, people were already halfway there. It was very evolutionary.


To be polite, the article shows how little they understand the topic they are writing about.


Same here. I switched from ActionScript to JavaScript fairly easily, and now, almost 10 years later, I'm well-rounded in both front-end and back-end.


Thank you. The article is really silly because it fails to understand that Flash was not a one-of-a-kind line of computer work; it was just a tool for doing web animation. It is not as if web animation has become obsolete.


If your skills become obsolete, you are not a tech worker. The only thing you should have accepted is that this is a fast moving industry and you should learn new technologies all the time. It's just the nature of it.


Reminds me of the HiSOFT guy who used to build assemblers, compilers, etc. for the ZX Spectrum back in 1980 and is still programming. I thought it was interesting to contrast that older work with his current line of work.

http://www.hisoft.co.uk/

"HiSOFT has been in existence since 1980, founded by David Link and Dave Nutkins.

Originally we created software for the NASCOM 1 kit-built microcomputer but swiftly moved on to the ZX Spectrum, for which we created many esoteric items such as HiSOFT Devpac, HiSOFT C, HiSOFT BASIC, HiSOFT Pascal, UltraKit, Colt and much more.

After great success with the various incarnations of the Spectrum we ported our core titles (Devpac, C++ and Pascal) to many other Z80-based computers; Tatung Einstein, Newbrain, Memotech 512, Amstrad CPC & PCW, Elan Enterprise and more!

'Twas a lot of fun and, undoubtedly, this list will stir as much excitement in some people as David's favourite band since 1971, Genesis, do in him!

After the Z80 processor began to flag (shame!), we moved on to the 68000 which meant moving stuff over to the Atari ST and Commodore Amiga. This, along with many hardware projects (such as Megalosound, Replay 16, Clarity 16, Squirrel SCSI, VideoMaster etc.) kept us going through the 90s until, reluctantly, we were forced to take the PC seriously.

Having forged a close relationship with MAXON Computer in Germany throughout the Amiga and Atari years, it was natural for us to take on the UK mantle for their flagship product, CINEMA 4D, an exciting and now rather important 3D product.

HiSOFT promoted, distributed and sold CINEMA 4D from 1997 until 2001, at which point David Link formed MAXON Computer Ltd and moved all things CINEMA 4D under the MAXON umbrella. David worked at MAXON UK as CEO until resigning for personal reasons in early 2003.

David Link continues to work at HiSOFT, as you will see from this website, while also trying to earn some money running the odd pub, café and seaside bar/restaurant/guest house!"


For an older person who was a programmer long ago, it’s completely normal for that person to still be programming. It’s just that for a programmer, it’s unusual for that person to be old. Don’t confuse the two concepts.


I'm not confusing anything. I am 41 and have been programming for a long time myself. I don't think it's unusual for him to still be programming.

I just thought it was a good example of how people evolve. It's also interesting that they used to make development tools and now do websites. The rest of the career is an interesting evolution as well.


> I don't think it's unusual for him to still be programming.

But essentially all other descriptions of older programmers do make that point, and it's the conventional (however incorrect) wisdom that older programmers stop programming, similar to sports. So when you simply describe an older programmer without making any other point explicit, one must assume that's what you meant to say.


Flash developers were fine since the tech market was still expanding rapidly, so they could learn new skills. What happens when the market is no longer growing, and workers become obsolete since their job is automated away?


If a person truly becomes unemployable in their career, for whatever reason, then they do something else.

I'm driving a semi truck.


Truck driving is a dying industry. Jobs are scarcer every day.


4,500 US Truckers Lose Their Jobs In August Amid Freight Recession

https://www.zerohedge.com/economics/4500-us-truckers-lose-th...

Loadsmart And Starsky Make First Start-to-Finish Autonomous Truck Delivery

https://www.zerohedge.com/news/2019-08-12/loadsmart-and-star...

UPS Quietly Using Self-Driving Trucks For Months

https://www.zerohedge.com/news/2019-08-19/ups-quietly-using-...


That may or may not be true. There are two threats: a possible recession, which is near, somewhat predictable, and temporary by definition; and automation, which is "real soon now" and permanent whenever it happens.

In the mean time, trucking is hiring.

Not that I would recommend it to most people, it's just what I did when I realized I wouldn't be working in software anymore. I ended up liking it.


That's a pretty interesting transition. Got a link to your story?


Sorry, no. Overly simple summary: I was a B player, I failed to stay relevant, and I got old.


I think there are always new skills to learn... software development, especially if you're good with UI/UX, will always be needed, some areas more or less than others. After 9/11 (Phoenix) it was a rough year... others in California were far worse off, as the dot-com bust was worse there.

Software is a field that works on how other jobs get automated. We'll have work until it's all automated, which is unlikely in our lifetimes.


What do you think Soylent is made of? Obsolete tech workers are a valuable source of protein! :P


I'm one of the authors of the original paper the article is based on. Here's a link to the actual research: https://www.john-joseph-horton.com/papers/schumpeter.pdf


I know one Flash programmer who suddenly found herself without a job and in a tough situation. She had to learn a completely new skillset: HTML5, JavaScript, and the front-end web. It took a whole year for her to get up to speed and get a new job.

To prevent this from happening to me, I like to keep an eye on which technologies and programming languages are in highest demand, and which direction they're moving in terms of job demand. So I created this app that measures programming language demand based on job postings and analyzes it on a city-by-city basis, cross-referencing with posted salaries: https://skilldime.com/
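
The core of the idea is simple enough to sketch in Python (a toy version with invented postings and language list; the real app presumably does much more):

    import re
    from collections import Counter

    LANGUAGES = ["python", "java", "javascript", "ruby", "go"]
    postings = [
        "Senior Java developer, Spring, SQL",
        "Node.js / JavaScript engineer",
        "Python data engineer (Spark a plus)",
    ]

    counts = Counter()
    for post in postings:
        for lang in LANGUAGES:
            # word boundaries keep "java" from matching "javascript"
            if re.search(r"\b%s\b" % re.escape(lang), post, re.IGNORECASE):
                counts[lang] += 1

    print(counts.most_common())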


>HTML5 and javascript and the Front end web. It took a whole year for her to get up to speed and get a new job.

It took her a year to learn HTML and Javascript?


Let's see: Commodore BASIC, Borland Pascal, Borland C++, MS Access, Delphi/Oracle, Java/MUMPS (don't ask), C#/.NET/MSSQL, C#/.NET Core, Postgres. Those are the big chunks, Postgres being the newest. Across DOS, Windows, Solaris, HP-UX, and Linux. I get bored with stuff after a few years and just want to build systems with something different. I personally don't go for the new hotness; I go for the hotness that has been around a while.

This was the same discussion around 2001. You had people who got into tech for the money and people who just liked it. Many people who did it for the money didn't survive past the dot-bomb in 2001.


One thing I've asked just about everyone I've managed lately is whether they have a job or a career. Folks with jobs normally have a few skills that they apply 9-5, Mon-Fri. They don't learn new skills or take on different work; they just come in, do what's asked, then leave. These are the folks who have the most problems when there is a shift in tech and some stack becomes obsolete. Folks with a career know they have to stay current in their skills and anticipate what's coming down the pipe in their field.

This is more of an observation than a criticism, btw.


The article itself is a bit weak and, as suggested by others, looks like an ad for a bootcamp. One of the reasons tech workers had little to no problem changing course when they started seeing that their skillset was dated is that it takes years to happen. For instance, someone comes up with some new language called SnakeScript and it starts taking over the world, while at the same time reducing the number of Python jobs. Well, those Python devs would simply start learning SnakeScript and eventually transition to it completely.


Primer (2004) said it best:

You know what they do with engineers when they turn 40? They take them out and shoot them.


It's not true, though. It's based on a misunderstanding of the mathematics of an exponentially growing industry.


There is truth to it.


As someone who is 44, I think older devs just get sick of the industry.

The constant churn, while the industry values little of what I feel I am actually skilled at: good code design, keeping things as simple as possible but no simpler. Instead we need experience in the trendy framework of the month, and interviews on algorithms that you will never need on the job. Oh, and please spend the weekend completing our pointless technical test before you get to speak to anyone technical about the job we want to interview you for.


Cry me a river.

I'm 45 and have had to abandon frameworks, languages, and operating systems for close to 35 years.

We get paid an above average wage and all we have to do is watch a few videos and read a little bit.

I'm not sitting here crying that my knowledge of 65C02 assembly is obsolete, or that I had to learn the intricacies of the DEC VAX and Stratus VOS over 20 years ago.


If that's what floats your boat, go for it. I am bored of the pointless churn. If it were actually useful I would be far more enthusiastic. I am currently working on a React frontend for our app. It was supposed to make things faster and easier, but the reality is it just adds more layers of unnecessary complexity.

I want to solve problems efficiently, not chase fads.


There is a real lack of mentoring and apprenticeship in software, so new developers relearn the lessons of the past.


I wouldn't even say that new developers do that. We just keep encouraging fad-chasing. The database query is slow? Switch to NoSQL instead of learning how to add an index.
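
And "adding an index" is usually a one-liner. A toy Python/sqlite3 demo of what it does to the query plan (table and column names invented):

    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER)")

    query = "EXPLAIN QUERY PLAN SELECT * FROM orders WHERE customer_id = 7"
    print(db.execute(query).fetchall())   # full table SCAN

    db.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
    print(db.execute(query).fetchall())   # SEARCH ... USING INDEX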


When I was in college I lamented the fact that my school didn't teach any "modern" technologies. Forget mobile or web; we even used Java Swing at one point (this was 2013!). Now I appreciate how much focus they placed on fundamentals instead of literally anything else. I couldn't sign up for a class on iOS development like you can at some other institutions, but the things I learned were the things that won't go away.


Two things:

Flash is not the first tech to die and it won't be the last. Hell, I spent almost a decade of my career focused on Flash dev... and even while I was learning Flash and ActionScript, there were 3 or 4 main platforms I used to work with that I had already abandoned. This is a constant.

Also: even long after Flash was already on the way out, the demand for good Flash devs was absurd, made worse because it wasn't a "hot" platform anymore. I remember we had a Flash project that needed to be maintained at the company I was working at, and we needed to hire an external developer to do it. A lucky guy with the skills ended up being paid top dollar for a super easy job for years, because it seemed no one else could do it. He was smart and kept his skills up to date, so he had no problem getting out of that scene later, but he surely used this "outdated" knowledge to his advantage. This also happens all the time.


Programming language knowledge overlaps with skills in that different languages affect the ways you think about problems, but knowledge, skills and intelligence are different things.

Only having knowledge is probably the hardest place to be, and those are the least desired staff ("paper $CERTIFICATION"), but if you have the others then knowledge is (arguably) the easiest thing to add.

If you have programming skills then you can extend from that base into front end, back end, etc. It's going to be harder to jump to sales, because you don't have the skills or the knowledge. Similarly if you've only ever done sales, or accounting, or cooking, or whatever then it'll be harder to develop the skills and knowledge for programming.


This reinforces what I tell new devs: "Don't rest on your laurels." You have to always be learning new languages and technologies, developing new hobbies and interests. A friend of mine constantly trots out her master's degree; she works at a greenhouse and spends most of her time planting and weeding. She loves it and seems happy, but the fact that she earned the master's and then stopped learning marketable skills scares me.


Started with web, was impressed by what Flash could do, and learned that. Worked in advertising, and we really pushed Flash to the limit (FWA, etc.). I mean actual Flash apps, close to the structure of today's React apps. Software 3D back then, but Stage3D arrived too late. Adobe AIR had potential, but Adobe didn't push the tech enough. Then I went back to web app development, and today I work at an investment bank making trading apps.


In the context of this article choosing to focus on Flash..

Flash uses ActionScript, which is ECMAScript, which today we know as JavaScript.

It's not a stretch to imagine that while having Flash skills is one thing, building digital experiences in JS, or in JS-friendly/inspired syntaxes, involves plenty of transferable skills.

In another way of looking at it, today's JS developers share some kinship with ActionScript developers.


I learned new skills (or more accurately improved my minor skills to become major ones) and make more money now. I wouldn't even say my skills became obsolete, I just needed to add other skills to them that I wasn't eager to add - so I found another way. I've done that twice now - hopefully I can do it a third.


Good experience and perspective + a curiosity to learn == lots of fun + lots of money.

I came into CS expecting a cushy job which was well paid. What I found was a challenging job that placed extraordinary premium on agility and the capacity to build useful products. I love being a "tech worker".


They spend too much time on HN.


I can't understand which part of this article was upvoted. There is very little information in it and it doesn't answer the question it is asking in the title.


Does anybody know the artist who made the "slamm" phone poster in the article illustration? Or where I can find it? I really like it.


The artist is Deborah Azzopardi (https://deborahazzopardi.co.uk/)


Thank you very much


What happens to tech workers when their skills become obsolete?

1. Learn new skills

2. Move into management

3. Leave the industry


I used to be quite good with 026 and 029. Probably not much demand now.


Try being a Java developer in a modern world.

In my non-Bay Area city, the majority of jobs are Node or RoR. Been struggling to find a job, especially since Java is synonymous with big data, which I have no experience in. No Spark? Sorry. No ML? Next! Ageism in tech is real.


> Java is synonymous with big data

Most Java jobs are not big data. There may be a skew in the openings, as people with Java/Scala big data skills are hard to get, but there are probably millions of people doing non-big-data Java development right now, and that's not going to change.


Sadly, that is not the case where I live. Mostly new companies, where the Java stack is not prevalent. No jobs to apply to, no one willing to even look at someone with experience in stacks other than their own. Not everyone lives in SF.


That's not an SF-only situation. Probably 90%+ of boring big corps have millions and millions of lines of Java code to maintain and develop further. You just need to be in a city big enough to have some of those big companies. At least in Poland, it's hard to hire a generic senior Java dev.

One possible explanation for what you're seeing (besides not being in the right kind of city) is that maybe offshoring has hit the US hard enough to affect the Java market? There are tens or hundreds of thousands of Java developers working in Poland doing offshore development for US/global corporations. The whole of Central/Eastern Europe is like that.


Well, the easy answer is to go work for IT in a non-tech industry shop


They go work for a mid-sized insurance company in the midwest.


Or government! Uncle Sam is still hiring people to write MUMPS and VMS.


They become managers?


That could use a 9-month boot camp. Just because you're good at code, and have even managed a project with some tech parts, doesn't mean you'll be a great manager. Not usually for lack of aptitude, but because the thinking skillset is often different. It becomes less about how to solve a problem and more about whether you should solve the problem.


Realistically, for companies rolling their own managers, it should be a gradual multi-year process.

Initially mentoring less experienced developers, then moving into running a small team, scaling up the managerial aspects, learning to let go of the code, and hiring great people and trusting them to deliver, all over time.

The modern industry moves faster than that, new roles are machine gunned into our inboxes, we are told you can't stand still or you're hurting your own career.

Those aren't mutually exclusive situations, but it's certainly made more difficult by their orthogonality.


They evolve into birds and fly away outside


Nothing is impossible.


Not on zombocom


The only limit is yourself


They learn new skills. Or suffer and switch political parties.


tldr: highly educated and well-off engineers are able to adapt.


The article alludes to education, but doesn't connect any dots. I'm not sure where "well-off" is coming from, though.


If you are living paycheck to paycheck, the immediacy of needing work can drive you to do something else. Then it just gets harder and harder to get back into the industry.


That's almost certainly a valid point. I just don't believe the article really addressed it, so it's an inaccurate summary above.



