Americans don’t need more degrees, they need training (venturebeat.com)
264 points by sylvainkalache on June 28, 2017 | 201 comments



This is from the operator of a coding school. They have an interesting payment plan. "There is no upfront cost to join Holberton school. We only charge 17% of your internship earnings and 17% of your salary over 3 years once you find a job." That's more interesting than anything in the article. The school takes the financial risk on students not getting a job. This is the reverse of most for-profit education, which relies on loans and doesn't guarantee a job.

I wonder how this is working out for Holberton School.
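
For a rough sense of the dollar amounts involved, here's a back-of-the-envelope sketch of the income-share math. The 17% rates are from the quote above; the internship earnings and salary below are purely hypothetical:

    # Rough sketch of the Holberton-style income-share agreement described above.
    # The 17% rates come from the quoted terms; every earnings figure here is a
    # made-up assumption, not data from the school.

    RATE = 0.17

    internship_earnings = 20_000   # hypothetical total internship pay
    annual_salary = 85_000         # hypothetical starting salary
    years = 3                      # salary share applies for 3 years after hiring

    total_paid = RATE * internship_earnings + RATE * annual_salary * years
    print(f"Total paid to the school: ${total_paid:,.0f}")
    # -> Total paid to the school: $46,750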


[I am the cofounder of Holberton] It is working great for us. As you said, we do take the financial risk of students not getting jobs. Education is so broken today: students get into debt and don't find jobs because too many universities are in the business of signing up students and not finding them a job :( Hopefully we can show the way and we'll make this change. Purdue University is now offering this too, which is a good sign that things are changing. Honestly I think that this is going to become the norm simply because the current system is broken and unsustainable (US student debt is in the trillions of USD, and the cost of studies is getting higher and higher -> more debt and more people who can't afford education anymore). 10 years from now people will simply look back and laugh at today's system.

There are two other important things that this model solves too: 0. There is no more financial barrier to high quality education. Anyone can afford Holberton*. 1. Colleges need to make sure that you are a fit for their system/culture/field. Today this is not the case, so students are wasting time and money on programs they are likely to fail. By having this tuition model they will need to be transparent and make sure everyone succeeds.

(*) We are in SF, so the cost of living is high. This is a problem for a lot of candidates. We are actively working on a solution.


> too many universities are in the business of signing in students and not finding them a job

If you are a university in the "business of finding students a job" you are university'ing wrong.


> If you are a university in the "business of finding students a job" you are university'ing wrong.

Then everyone is university'ing wrong!

Universities spend an enormous amount of time and money getting students jobs: maintaining relationships with recruiters, developing alumni networks, and staffing entire departments of full-time "career services" employees.

There's a solid critique of education programs that are overly sensitive to industry fads [1], but IMO universities and educators should definitely include "finding students a job" as an explicit curricular goal -- balanced, of course, with "ensuring students have a long and healthy career beyond that first job".

[1] https://www.joelonsoftware.com/2005/12/29/the-perils-of-java...


> Then everyone is university'ing wrong!

Hey hey! I tend to agree!

Finding students a job should be a secondary byproduct of first giving them a world class education, one that prepares and enables the student for society by teaching them how to learn. Universities exist to teach students how to learn, so they can keep learning and be more aware of and active about what is happening around them in society. Universities give you exposure to a wide array of subjects for a purpose. It isn't random, and if you are attending a university where it does feel random, then find a better university.

"The goal of university education is to help build a fairer, more just society" - Steven Schwartz. [0]

Indeed this aligns with Plato's view on education -- "Plato regards education as a means to achieve justice, both individual justice and social justice." [1]

[0]: https://www.timeshighereducation.com/comment/columnists/the-...

[1]: http://epublications.marquette.edu/dissertations/AAI9517932/


I agree with you, philosophically.

But when I apply this philosophy to concrete curriculum design questions in CS, I end up caring about placing students at internships and jobs.

First, internships are a form of education, and I find that students who complete internships come back the next Fall as much more mature programmers. I can then leverage that maturity in programming to dig deeper into interesting theory: because I'm not helping debug for loops, I can help students debug proofs or design more complex algorithms. So I consider internship placement a major goal for the first two years of a CS curriculum, even when my goal is to teach pure theory.

Second, I have a hard time justifying the situation where students are debt slaves to banks. How does that achieve individual or social justice?

Third, your work output is an enormous part of your contribution to society. Someone who can build a software platform that helps the rural poor in Nigeria get access to micro-loans under fair terms is making a much greater impact on the world than a philosopher with a perfect understanding of what it means for the world to be just, but without the means to act on that knowledge.


The problem is that it is awfully hard to be a "life long learner" if you are unemployed, desperate, or working a shit job just trying to make ends meet.

Imagine all that extra time college graduates could spend learning if they were easily able to provide for themselves because college prepared them for the real world and they were able to get a good job.


Okay, nice goals. But we face a problem today with a shortage of low skilled jobs. The few low skilled jobs that remain often still require university training. The rest offer terrible working conditions and pay.

It's nice to have a society of people who are more aware of and active about what is happening around them. But what about the average worker who's been unemployed for six months and just wants to be able to afford to raise a family?

We need to meet our basic needs first. Universities used to be for the upper few, but as they are required more and more for the majority of jobs, either the universities need to change or the job requirements need to change.


Well, the problem is that universities are being co-opted into job training programs, when they shouldn't be. The cost of training people for jobs is being passed on to them through the university, from corporations. It shouldn't be that way. Finding a job should never be the main, or even a large part of attending college. Now, naturally that's not what is currently happening, but we need to work to avoid this as a long-term solution to job training.


That seems unrealistic for the most part. No, most universities aren't optimized as trade schools focused on near-term employment prospects. And, you can certainly choose a university and major that, shall we say, sub-optimizes employment prospects.

However, it's certainly the expectation of most people that going to university for four years is going to improve their employment prospects--often in a field directly related to their undergraduate study. And as someone else noted, schools absolutely put effort into relationships and programs that improve their graduates' job prospects.


> And as someone else noted, schools absolutely put effort into relationships and programs that improve their graduates' job prospects.

They also put in efforts in a bunch of other places as well.

I will admit that, coming from a lower middle class background, university education to 17-year-old me was mostly a way to optimize my career prospects. But in hindsight, I would say it is a lot more than that. The two most important skills I learned were probably critical thinking and teamwork. There were also many side benefits that you get merely from being in close quarters with so many different kinds of people... learning that diversity is important, that even people from different cultures are awesome, etc. These soft skills are very important, and I don't see a way of acquiring them otherwise.

However, I think a way to remedy this would be to be very clear about what a university provides instead of being vague about it, e.g. if you pursue CS at U of Something, you have an x% chance of getting a job here right out of college. I agree universities have been co-opted into being preparatory schools for the job market, but that is too narrow a purpose for them.


I don't really disagree with any of that. There are (mostly good) reasons why most universities look and act differently than they would were they solely constructed for learning some trade whether white collar or otherwise. Part of this is that the training part is mixed in, for many people, with moving out of the house for the first time, being self-directed to a greater degree, etc.

That said, there's also a general expectation that, barring graduate school, university grads will go on to earn their own living. Some do so with just a generalized university education--which is more or less what you have if you majored in medieval German literature. But it's easier with an engineering degree or something else that ties directly to what a company is specifically hiring for.


That's a good point. While they're definitely not the same thing, I wonder if the goal posts are just changing. So at first, it was: complete high school and you will def get a job! And then it was: get a univ education and then you will def get a job!

Maybe the economy is just changing so much that we simply don't have the capacity to absorb so many workers. Or maybe we need to direct more resources to create employment in those areas (i.e. more funding for arts, history, archaeology etc.). I don't know what the solution is.


You want it to be one way but it's the other way.


Right. We're doing things the wrong way, and I'd like to see that changed.


Effort doesn't always equal results, and we shouldn't be praising them for not attaining their stated goals. (Or perhaps they are attaining them, and their goals don't matter to their customers -- students.)

I'm just a single point of data, but none of my friends ever found the career services people to be helpful in any meaningful way. In my world, their success rate is 0%.


> and we shouldn't be praising them not attaining their stated goals

My post wasn't intended as praise for career services... it was intended as praise for departments and universities that prepare their students for internships and jobs.

> and their goals don't matter to their customers -- students

I don't know this, but I think a lot of the dedicated, professionalized career services people are mostly there to placate clueless parents. It's marketing, and it works.

> none of my friends ever found the career services people to be helpful in any meaningful way

Absolutely. The scare quotes in my original post are there for a reason...

That said, good career services people usually work closely with or are even embedded in the department -- not a separate service in the admin building. They ensure that there are regular talks from the alumni network or local community, that companies are invited to come in for an evening and talk about what they're working on / what skills they need, and that those companies are invited back to job fairs etc. in the Fall, favoring companies who take it as a serious opportunity and who you think are most likely to fast track your students' resumes.

In fact, at smaller universities/departments, the best career services employees are typically professors who keep in contact with graduated students and former colleagues.


University is sold to kids as the thing they'll do after high school and before starting their careers if they want to be successful. There's a grain of truth to that too; I have heard that it's hard to get a tolerable job out of college these days but even harder without a degree.

If that's how universities are positioning themselves, if they're opening their doors to students who are clearly not the best of the best and destined for academia and/or highly technical careers that demand high-level theoretical knowledge, then you bet your ass they should be teaching practical skills and helping students find jobs. Otherwise they're just taking (a lot of) money from kids who clearly don't know any better, and leaving them high and dry after graduation.


That is not exactly what I said :)


With the name treehau5 I expect there's some affiliation to Treehouse here


Unrelated question, but curious: why do your BART ads that say "I am a software engineer" have a fork bomb in faint text in the bottom-left corner?


But 17% of an entry-level job is HUUUUUGE for the student. If I paid that in my internships and entry-level jobs, I'd be in trouble.

Granted, over the long run, it's a great deal! Just short-term, it's extremely expensive. Personally, I'd opt for a smaller payment that lasts longer, even if it means more money for you.


Not sure if this is it but some guy did an AMA on reddit with a similar payment plan but it was really scummy. They took some percentage of your salary for a given number of years when you eventually landed a job. But there was no stipulation that the job needed to be in tech or anything. So the whole thing might be a total waste of time and you would still have to fork away a cut of your salary when you find a job as a kitchen hand or whatever.


Tell me how this is different from the rest of the student loan industry. My student loan is until I've paid it all back, or I reach 65, or I die. It adds a top up to my income tax, taking 9% of everything over £17kpa earnt.

It doesn't care that I've got a really good job, or that it's relevant to the courses I took. Or that I even finished my degree. It's just the same in that respect.

Holberton takes a higher percentage, but earlier on and only for three years, when you're earning less. You might think 17% is shitty, but coming from low or no income, you're probably not going to notice it. Three years will fly by, and by the fourth you get a massive pay jump.

I prefer Holberton's take to my up-to-48y lien.
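
To put rough numbers on that comparison: here's a sketch at a hypothetical 30k salary, ignoring the currency mismatch (Holberton charges in dollars), purely to compare the shape of the two schemes:

    # Side-by-side sketch of the two repayment schemes discussed above, at a
    # hypothetical 30k salary. The 9%-over-17k plan is the UK loan described
    # by the parent; the flat 17%-for-3-years plan is Holberton's.

    salary = 30_000                      # hypothetical annual salary
    uk_yearly = 0.09 * max(0, salary - 17_000)
    holberton_yearly = 0.17 * salary     # only for the first 3 years

    print(f"UK-style plan:  {uk_yearly:,.0f}/year until repaid or written off")
    print(f"Holberton plan: {holberton_yearly:,.0f}/year for 3 years, then nothing")
    # -> UK-style plan:  1,170/year until repaid or written off
    # -> Holberton plan: 5,100/year for 3 years, then nothing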


> Tell me how this is different from the rest of the student loan industry. My student loan is until I've paid it all back, or I reach 65, or I die. It adds a top up to my income tax, taking 9% of everything over £17kpa earnt.

That's the student loan industry in the UK. In the US and most other places, income-based repayment is not a thing; the payments are due no matter how much or little you earn.


At least for now the US does have several income based repayment plans. The best one for most circumstances is PAYE (pay as you earn). It requires payments of 10% of disposable income, i.e. dollars earned after 150% of the poverty line, and the debt is discharged after 20 years of payments or 10 years while working for certain government or non-profit entities.

This program is partly the result of regulatory rule making and so could be sharply curtailed should the President or his underlings wish to do so.
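
As a quick illustration of the PAYE formula described above (the poverty guideline below is roughly the 2017 figure for a household of one; the income is hypothetical):

    # PAYE-style payment: 10% of income above 150% of the poverty line.
    # The poverty guideline is approximately the 2017 figure for a single person;
    # the income is a hypothetical example.

    poverty_line = 12_060    # ~2017 federal poverty guideline, household of one
    income = 40_000          # hypothetical adjusted gross income
    rate = 0.10

    discretionary = max(0, income - 1.5 * poverty_line)
    monthly_payment = rate * discretionary / 12
    print(f"Monthly PAYE payment: ${monthly_payment:,.2f}")
    # -> Monthly PAYE payment: $182.58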


I don't think that's accurate. When I graduated I got a zillion papers in the mail for applying for various hardship deferments and payment options. And then there's this https://obamawhitehouse.archives.gov/blog/2012/06/07/income-... which is explicitly income-based.


This occurs mostly in private student loans vs. federal ones. Private offers even fewer options than the federally mandated ones provide.


Income-based repayments are definitely 'a thing' in the US.


> But there was no stipulation that the job needed to be in tech or anything. So the whole thing might be a total waste of time and you would still have to fork away a cut of your salary when you find a job as a kitchen hand or whatever.

But isn't it objectively less "scummy" than charging money up front, which most universities do? Then if you fail to find a good job after graduation, you are still responsible for the full amount regardless of your income.

Some minimum threshold before the percentage is taken off would make sense - the "tax" would be insignificant income to the university but a high cost to someone working for minimum wage as a kitchen hand.

But restricting it to tech is not that obvious. Every office job in the modern world benefits from a) having some sort of degree paper, and b) having some tech skills.


FYI we (Holberton) have a minimum threshold: if you don't find a job, or if it is under 40k, you don't pay us back, because we failed at either training you or selecting you as a student.


> ...or selecting you as a student

IMO this is primarily where many universities are failing, which is an enormous disservice to their students.

Any reasonably intelligent and dedicated student can go to university and come out the other end with a well-paying job.


> Any reasonably intelligent and dedicated student can go to university and come out the other end with a well-paying job.

That's a pretty restrictive filter. Currently you have to be at least two of smart, goal-directed, and curious to make it through the university system with a decent education and good job prospects. Many other students make it through with good job prospects but no real education because they were able to bull their way through an in-demand major program. But a lot of them slip through the cracks and end up with neither job nor education.


> That's a pretty restrictive filter

I agree. Ideally, tuition would be extremely subsidized and higher ed wouldn't constitute an enormous financial gamble. But in the current US system, the restrictive filter really is in the students' best interest.


That's a rather self-serving criterion, isn't it? Oh, you failed to get a job? Must be because you weren't reasonably intelligent enough!


It's not self-serving to tell someone "really, no, you shouldn't take out a big loan to pay for this because you're probably unprepared and will fail".

On the contrary, taking literally every student who can get a federal student loan -- regardless of your faith in their ability to get something meaningful out of the degree -- is extremely selfish!


90%! of my childhood friends succeeded at getting a degree but failed at that "well paying job" part.

Graduating is the easy part.


Interestingly enough, that's around the same number I came up with as a minimum salary for students with big loans to pay back to a university. People would tell me that $15 an hour or so for IT work like we get out here was decent pay for new people. I said a lot of these people have Bachelor's degrees with minimum payments of $2,000 a month. Lowest cost-of-living out here (poor people life) + that payment = you need at least $40,000 a year just to scrape by. People just don't get how the traditional model of education plus companies not paying people anything puts such a financial burden on graduates that they're damn-near better off taking a no-degree approach.


Did you go to a private school? Federal loans (~90% of all student loans are Federal) have income based repayment plans.

Under the income based repayment plan, a single person making $15/hour 40 hours/week would only need to pay about $100 per month.
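
A quick check of that figure, assuming the same 150%-of-poverty-line exclusion and a single-person guideline of roughly $12,000:

    # Sanity check of the "$100 per month" claim above, under stated assumptions.
    hourly = 15
    annual_income = hourly * 40 * 52      # $31,200
    protected = 1.5 * 12_000              # ~150% of the single-person poverty line
    monthly = 0.10 * max(0, annual_income - protected) / 12
    print(f"~${monthly:,.0f}/month")      # -> ~$110/month, in the same ballpark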


Yep. A choice I regret now.


> But isn't it objectively less "scummy" than charging money up front, which most universities do?

It entirely depends on the price and what you're getting for that price. I generally agree with you that the model is 1) interesting and 2) not at all scummy.

But the cost-benefit on these coding schools always confuses me.

The total cost of one of these certificates is probably something in the 30k-60k+ range, assuming you're successful (the location is SF -- and the marketing print specifically mentions Google et al). That's just tuition, not cost of living. And that's for two 9 month project-based sessions.

For reference, the average student loan debt is 37k for four years and an accredited diploma. Which, okay, might not be the full cost. But to be fair, you can DEFINITELY pay for college in 100% debt + weekend jobs/internships, and come out in the 20k-25k range. And that's total cost, with no scholarships, assuming you go to a second-tier state school or can live at home (ok, that skews total cost, but it's a very real option for many, many students!).

IMO universities are the clear winner in this cost-benefit calculation. Hands down, no questions asked. You can double or triple major and have one or two very tenable backup careers if a tech bubble bursts. In addition to that, if you want, you'll have everything you would learn at a coding school. Plus some extra theory/systems knowledge that opens up the non-web-dev job market. Graduate school is an option if you want to go the research route. And, very likely, you'll have a much larger network of close-knit peers that includes not only coders but also managers, engineers, business types, doctors, lawyers, etc. Immigration becomes easier. You'll have some physics/chemistry/calculus knowledge as well as some econ knowledge if you choose the right courses. And so on.

The problem is that a lot of people don't take full advantage of university while they're there, or choose universities that don't align with their goals. I think coding schools are weird in this respect -- IMO they tend to be pretty expensive for what you're getting, but maybe the target clientele wouldn't thrive in a university environment or don't want to take 4 years to get a holistic education.

So again, not at all scummy! But also IMO definitely not a better deal than the traditional route.


In terms of scumminess, I think it is relevant that the student loans (mostly) come from the federal government. Neither the for-profit nor the nominally non-profit universities bear any risk of default. They make sure to get all their cash up front. This means that their incentives aren't as well aligned as they might be.

One interesting proposal I've seen is to create a 'skin in the game' rule akin to the one that was supposed to go into place for mortgage originators under Dodd-Frank (but mostly didn't). That is, require educational institutions themselves to finance and hold part of the student loans used to pay for their programs.


Agreed, 100%!

> That is, require educational institutions themselves to finance and hold part of the student loans used to pay for their programs.

Wouldn't this requirement make "17%" institutions illegal? I think the 17% approach is reasonable. IMO coding schools should be allowed to offer bona fide diplomas. Maybe not Bachelor's Degrees, but they should be able to get accredited and be eligible for GI bill etc.

The last thing higher ed needs is even larger moats.

So, skin in the game -- absolutely! But not at the expense of making it even harder for new players to enter the market.


I would think that the skin in the game requirement would be tied into eligibility for federal student loans. Since a 17% institution wouldn't be using those it wouldn't impact them. I agree that model is even better at aligning incentives and shouldn't be forbidden.


What would a bona fide diploma be? A diploma is just a certificate that you have completed a course. It's not legally protected in any way is it?


It's not the certification of the student that matters; it's the accreditation of the company providing the education! You can't use your GI bill benefits or subsidized student loans at most coding schools. That's a shame in cases where the coding school is high quality!

If the coding school is an accredited degree-granting institution, then federal/state assistance can be used to pay for the education.


Your $37k debt figure sounds low...really low. I graduated 20 years ago with a (then average) debt load of ~$20k. Tuition rates and debt loads have skyrocketed since then. $37k could represent average debt load including those who have been paying for years already but I'd love to see your source for that figure.


Here's where I got the number; no citation but they do stipulate 2016 at time of graduation: https://studentloanhero.com/student-loan-debt-statistics/

Here's a breakdown by state for 2015 from a more legit looking source and a report I don't have time to read; they actually put it at 30k: http://ticas.org/posd/map-state-data

> Tuition rates and debt loads have skyrocketed since then

Total debt load across the entire population has skyrocketed in part because there are a lot more people attending college and during the recession a lot of people weren't paying off their loans, so that number could be misleading for this reason. Could be that debt load at graduation per pupil has not increased, but total debt load has skyrocketed.

Tuition rates have certainly increased, but it's really hard to find data on what people are actually paying outside of borrowing rates. I graduated high school with high ACTs but an extremely low GPA and still had >66% "scholarship"... Tuition != amount paid. If you have "actual amount paid" data I'd love to see it.


Sounds like what happens in New Zealand in regards to student loans. Basically, 8% of my pay goes to paying off my student loan (but only on earnings above ~$20,000). I can't pay less than this, it's taken from my payroll like tax.

Not really scummy at all in my opinion.


The US has basically this exact same program. For a single person, it's currently 10% of your income over $18k.


Tbf 17% of a kitchen hand wage vs 17% of a software wage is a huge difference. The school is incentivised to find their students good work and therefore train them all in order to build their own reputation. That doesn't sound scummy at all.


I'm still not sure that's bad. With this plan, the school would much rather you land a job getting $50/hour instead of $7.50 per hour because they make more money. I can think of a few ways that could make this better (such as a sliding scale that goes lower if you are making less), but overall the incentives seem to be aligned with them landing you a good job, and getting you a good pay scale early on (since they only get paid from your first 3 years).


I'm 1.5 years out of a coding bootcamp I paid about $12,000 for.

At a 17% rate, I would have paid $11,500 already, and would be on track to pay $22,950 more if I stay at my current job for 1.5 more years with no salary increases (I expect salary increases).

There were some students in my cohort who reverted back to their previous careers after bootcamp, but most became developers. I would bet the Holberton plan is far, far, far more lucrative than an upfront fee, as long as they are discerning in who they accept.
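
Roughly what those numbers imply, treating each quoted payment as 17% of earnings over a 1.5-year stretch (the salaries are inferred, not stated):

    # Back-of-the-envelope comparison of the parent's flat $12,000 bootcamp fee
    # against a 17%-for-3-years share, using the payment figures quoted above.
    # The salaries are inferred from those figures and are approximations.

    rate, flat_fee = 0.17, 12_000
    paid_so_far = 11_500    # 17% of the first 1.5 years of earnings (quoted)
    remaining = 22_950      # 17% of the next 1.5 years at current salary (quoted)

    implied_current_salary = remaining / (rate * 1.5)
    total_under_share = paid_so_far + remaining

    print(f"Implied current salary: ${implied_current_salary:,.0f}")   # ~$90,000
    print(f"17% model total: ${total_under_share:,.0f} vs flat ${flat_fee:,}")
    # -> 17% model total: $34,450 vs flat $12,000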


"At a 17% rate, I would have paid $11,500 already, and would be on track to pay $22,950 more if I stay at my current job for 1.5 more years with no salary increases (I expect salary increases)."

If the bootcamp is going to shoulder that much risk, expect them to get paid for it. If you can afford to just pay for it up front, expect to pay less net money, but then, you're shouldering more risk that it won't be worthwhile. If you consider the distribution of both money and risk it looks more balanced. Whether it's precisely balanced or something doesn't have a unique answer because people have different valuations on the risk vs. the money. In particular, someone with no money to speak of at all is obviously going to prefer the single option that is available to them for the chance to come out wildly better off than they started, even at a 17% salary cut for 3 years.

(Also, I note you were not making wild accusations or anything, so I'm not being accusatory either, just elaborating.)


> I'm 1.5 years out of a coding bootcamp I paid about $12,000 for. At a 17% rate, I would have paid $11,500 already, and would be on track to pay $22,950 more if I stay at my current job for 1.5 more years

But you needed that $12,000 upfront, which means you spent time in crappy jobs to save up for it. Time that could have been spent in your new job making more money, so you're not accounting for that wage difference, ie. if it took you 6 months to save that $12,000, how much faster could you have made it at the new job had you started code camp earlier?


> But you needed that $12,000 upfront, which means you spent time in crappy jobs to save up for it.

That's what loans are for. The question is, do you want to take the risk or do you want to offload that risk?

Maybe the factor is more than just risk. By giving someone a vested interest in your success, they might also be there with other support after the class is done, such as job hunting.


I know it's not a fair comparison anymore since I realize that Holberton is 2 years and not a "bootcamp" but most bootcamps offer a certain period of time for you to repay the tuition fee.

For example, at the bootcamp I did, I think there was a 1000 fee due before 1st day of class, a couple more during the bootcamp, then 5k due 6 months after the last day of class.


Hey pharrlax, current Holberton student here. So happy to see that a bootcamp worked for you! I have a few friends who are going through or just completed Galvanize's gSchool and Hack Reactor. About your comment, you're totally right! 17% for 3 years seems like a lot! However the program is for 2 years. I view it as a college replacement rather than a bootcamp. Therefore I feel like what I will be paying is 150% worth it!


That was my exact thought as well. I paid $8000 for my bootcamp. I'm 18 months into an average of 65k over that time. So in that 18 months I'd already have paid in the area of $16k, and assuming I don't get a raise double that over the 3 years.

Also, I'm curious the arrangement for contract work that the developer does on his own time. If they charge a percentage of that, I'd have paid them more than a downpayment on a house over the course of 3 years.


My first reaction was that 17%x3 seems like it's on the high side. But then I read Sylvain's (who I've met a few times over the years back when he was at LNKD) profile and saw that Holberton is a 2 year program, which is something like 8x as long as any other "coding school" (I refrain from using "bootcamp" since a 2 year school is clearly not a bootcamp) that I can think of.

That puts an even more interesting twist on how their financials work out.


That works out to $54,000 for two years of schooling. Community colleges offer the same for under $10,000 total.


There should hopefully be a significant difference in scope and quality between the material offered at a coding school and a community college's CS program.


A CC programming degree was enough to get me a professional job at 19. It was taught by the same professors that taught CS at the 4 year affiliate, they just focused on application over theory.

I can't imagine boot camps are any better. We built applications using technology stacks that were popular at the time (.NET/MSSQL, LAMP) and our capstone project involved writing actual production code for a company. Everyone who graduated the program got a job or went on to a four-year university.


> they just focused on application over theory.

That would be exactly the criterion I had in mind. Colleges and universities should teach computer science, not coding. Dijkstra was an accomplished computer scientist and never owned a computer; there is a whole academic discipline there which should be taught in its own right. Whereas a coding school should probably lead off with source control and the command line, the university should start with lambda calculus.

It's not that these things can't be just as useful, it's that they are distinct topics and neither should be used as a replacement for the other. (The unfortunate corollary to this is that it takes 6-10 years to become competent as a programmer, and a number of those years must be in the industry. We should really get a guild structure together.)


Would have been $28k for me, working mostly in the Midwest.

For someone who graduated this without an internship, then worked 3 years full time at minimum wage (I used $7/hr for this), it would be around $7k.

Still seems high, but variably so.


You should be comparing them to top universities, not community college.

Top universities can cost twice as much per year.


Why would I compare a two-year program nobody's ever heard of to a four-year program at a top university? Community college is a much more accurate match, as another two-year program nobody's ever heard of.

If Holberton wants to publish standardized test scores for their students, we'd have better information to go on.


The reason is that the starting salaries of people coming out of the boot camps are higher than the average starting salary of people coming out of top colleges.

Tech bootcamps and tech college replacements have a starting salary around 80-100K, whereas the average graduate of a top school (including ALL graduates, not just CS majors) starts at less than 80K.

Test scores are dumb. They don't matter. What matters is OUTCOME. And the best way to measure outcome is via salary.


> Tech Bootcamps and tech college replacements have a starting salary around 80-100K

Not according to the comments in this very thread from satisfied bootcamp customers.

But even if it were true, the appropriate comparison would indeed be starting salary of top school graduates who took a job similar to what the bootcamps are targeting, not starting salary of all graduates of top schools.


MissionU does something very similar (charge nothing up front and take X% of your salary for X years after you graduate): http://www.missionu.com/ It is really a great model; the "school" doesn't get paid if you don't get a job after graduating.

Higher education in its current form has limited time left. Universities will have to evolve or die. Student loan debt is completely out of control and universities aren't preparing kids for real jobs.


in some ways this is similar to the australian university funding system where loans to cover education are managed by government - not the private sector - there is no interest beyond inflation, and repayments are taken automatically out of salary after your income exceeds a certain threshold.

i would imagine that Holberton School is much more focused on only accepting candidates for training where there is a good probability that the person will end up in the workforce afterwards. unlike perhaps some things one can study at university, or people who are studying but aren't thinking about a career in the workforce.


They're also incentivized to provide a high quality program and have their grads land high paying jobs. Essentially they're invested in their students' success in a way most educational programs aren't.


The US loan system works almost the exact same way. Federal loans (90% of student loans) are paid directly by the government, and come with income based repayment plans capped at 10% of disposable income.


For reference, I plugged in my own early career (I graduated fall 2006).

I went to school in the Midwest, worked 2 full years as an intern, then two years as a full-salary employee, then moved to Maryland. So 17% of all my earnings in the Midwest + my first year in Maryland comes out to about $28k.

If I cut the internship down from 2 years to 1 summer, it's around $24k.

That's roughly the same as my 4-year degrees at the state university I attended from 2002-2006 (double major: BS in Computer Information Science, BA in German).

edit: As cableshaft mentioned, I did not account for the cost of interest -- my ~$28k was just for tuition + books/fees.


> That's roughly the same as my 4-year degrees at the state university

Tuition or total cost of attendance? Notice that the 17% for 3 years is equivalent to tuition, not full cost.

> double major: BS in Computer Information Science, BA in German

The way I think about this is that you got a free BA in German, some free theory/systems CS courses, and some free gen eds out of the deal.


Tuition + books/fees. No student loan interest, housing, meals, etc, accounted for.

And that's probably a good way to think about it. I also came out with fewer math courses -- the specific CIS track I took had a lot of electives, as did the German major, so I just married them together. A pure CIS track would have included a math minor instead.

edit: Also, assuming the 2 years for the course in the article is roughly equal to the computer science education I got, I could see being willing to pay the same amount to get out 2 years earlier and start earning graduate salaries more quickly. I certainly don't use the gen eds or German knowledge in any money-making way that I have noticed.


Assuming you took out student loans, are you factoring in the interest accumulated into your total costs? There's no interest in the "17% over 3 years" model.

I had about $25k of student loan debt when I graduated, but by the time I've paid it off I think I will have paid closer to $35k-40k, factoring in interest over several years.
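
For reference, a simple amortization sketch reproduces that range (6.8% was a common unsubsidized federal rate at the time, used here as an assumption; the 10-year term is the standard plan):

    # How interest turns a $25k balance into ~$35k repaid: standard amortization
    # at an assumed 6.8% rate over the standard 10-year term.

    principal = 25_000
    annual_rate = 0.068          # assumed; a common unsubsidized Stafford rate
    months = 10 * 12

    r = annual_rate / 12
    payment = principal * r / (1 - (1 + r) ** -months)
    print(f"Monthly payment: ${payment:,.2f}")            # ~$287.70
    print(f"Total repaid:    ${payment * months:,.0f}")   # ~$34,524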


This is an excellent point and IMO a major benefit of the 17% model.

Entirely too much money spent on higher ed in the US goes to banks.


Federal loans (90% of all student loan disbursements) are paid from the treasury.

There's an argument to be made that the government shouldn't be charging students interest. But there's no banks involved.


> There's no interest in the "17% over 3 years" model.

There isn't but it's insignificant and the potential salary increases would overshadow charging interest anyway (at least in the current low interest environment in the US)


Yeah, good point. I didn't think to factor in interest. Noted in the OP now.


If you don't find a good job and end up working in fast food, do they still take 17% of your salary?


In Australia I think we have a really good system, although politicians are trying to destroy it now.

You get an interest-free loan for your education, the only thing it accrues is inflation. The loan is repaid as a percentage of your taxable income if and only if that taxable income is greater than 54k or so. It means that you will potentially never be asked to repay your student loans if you never find professional work.


54K seems like a really good deal (at least, an incentive for anyone to take on a university course). In the UK, the threshold is much, much lower: https://www.gov.uk/repaying-your-student-loan/what-you-pay


The UK threshold is still sensible. 9% for everything above £17k and forgiving all debt after 30 years means that paying back your debt won't significantly affect your lifestyle and that lower earners will never have to pay everything.


Probably speaking in australian dollars.


Which is still ~32k GBP, so almost double the UK threshold.


That seems pretty sensible to me. When the educational funding bubble pops in the U.S., I could imagine something like that taking its place.


We have nearly the exact same system in the US.

Federal loans (90% of student loans) are paid directly by the government, and come with income based repayment plans capped at 10% of disposable income.

The debt is cancelled after 20 years, so if you don't find a decent job, you'll pay very little.

There is interest though.


Minor correction: there is no real interest on HECS debt but it is indexed by inflation every year.


> This is the reverse of most for-profit education, which relies on loans and doesn't guarantee a job.

I've wondered for _years_ why the "equity" (vs debt) model of funding isn't available to private citizens, at least in narrow circumstances like wanting to mitigate the risk of massive amounts of student loans and un- or underemployment. It's nice to see people actually putting that into practice. Presumably if it gets too widespread, some hack will write an article about how it's exploitative and we'll start seeing pressure for a crackdown.


Are they selective? Kinda important to know whether they are able to pick and choose who they think is worth spending their time on.


Yes, we are: the admission process is algorithm-led. The admission process is totally blind and removes human bias.

During the admission process (please try it :)) we try to replicate as much as we can what students will experience at Holberton. Since we are project-based and students also collaborate a lot with each other, the admission process is a project where you have to collaborate with other candidates. Basically, if you have fun doing the admission process you will love Holberton, and if you don't, then you will drop out during the admission process rather than dropping out from school. Most of the applicants don't finish it. It takes anywhere from 10 to 60 hours to finish, depending on your current level (it is designed for people with zero background in CS/programming).


Probably no more selective than a top university.


No. Taking financial risk means sharing the downside too - that is, paying their students if they do not find a job in the industry. They only share the upside - that is called risk transfer, not risk taking.


Risk transfer is charging tuition. They are sharing the risk - not finding employment means the school doesn't get paid for the education it provided. The only way this isn't taking on some risk is if the marginal cost of the additional student is zero (or de minimis).


There is/was a popular scam in Eastern Europe: a middleman promises to help you with bureaucratic hardships for a bribe and returns your bribe money if he can't help. The thing is, the scammer does exactly nothing - he just waits to see if his victim solves the problem himself, and if not, returns the money saying "sorry, I could not help with the situation."


But in this case they are doing something, right? There's an education. And money isn't changing hands prior to employment.

I know nothing about the particulars of this school, I'm just speaking to the concept, but it is entirely distinguishable from what you describe.


The job of the scammer is to make his victim think he does some very important things, which always include spending on some bells and whistles.


It would be neat to also have an option to pay up front and then get a refund if you can't find a job. Would that be cheaper for people who have the money?


"There was a time when most people could make a career out of a skill, or stay within the same type of job, but workers today constantly need to adapt. They must become lifelong learners: Teach a student one skill and you got him one job; teach a student to learn and you got him lifetime employment."

It's far from clear to me that more than a small percentage of people generally, but perhaps Americans in particular, have the capacity to become lifelong learners. Not saying folks are stupid -- the problem isn't a lack of raw cognitive capacity, IMHO.

What actually makes someone a lifelong learner? Curiosity. If you don't _want_ to learn, it's going to be very difficult to do so, except perhaps while under the gun for short periods of time. And, unfortunately, our education system seems ideally tailored for stamping out children's natural curiosity. If that sounds overly pessimistic, I'd welcome a demonstration to the contrary.

I feel that the answer is basically "let's fix our education system," which I -- again, perhaps overly pessimistically -- believe to be politically impossible.


I kinda disagree. It is a long story. I will skip a lot and make it as brief as I can.

I used to own my own company. It has long since been sold and I am retired. We modeled traffic and I helped bring traffic modeling to the computer age.

This was way back in the dark ages of the early 1990s, when you actually had to work to find people who could write good code. You also had to pay them a goodly amount of money.

One of the things I offered was to help with continued education. For example, we had one lady who was a secretary and she wanted to learn to code. So, we sent her to school part time, on our dime, and she eventually found a home in QA.

When we began to provision services in other areas, besides vehicular modeling for municipalities, I sent people to get the training they felt they needed. I'd even go with them, if I had time and it would benefit me. I love Defcon, and have gone even after retiring.

We'd get a local university to do some research and then poach the folks who stood out - and help them complete their education.

Quite a few acted on this and I have to say we benefited greatly from this. I don't have exact numbers but it was well over a third.

So, in my experience, they want the education. In my experience, they can be educated. Maybe try offering to pay for it, if you're in a position to make that choice? Cover their total costs - including ensuring they can afford the time.

In my case, there weren't exactly a whole lot of traffic engineers. We had to make them, and we did.


I think the fact that your company had to send its employees off to formal education in a classroom (or similar) setting is indicative of the problem highlighted by the person you are replying to. Your employees would not have been anywhere near as successful with some combination of a textbook, lectures on video (e.g. on VHS), and a study group made up of other employees. Lifelong learning means being able to acquire new knowledge and skills outside of the traditional educational setting.

Often, a University will claim that its students are not there to learn specific subjects but to learn how to learn. A lot of that learning is studying and solving problems outside of lecture time. These are the skills which are lacking because of a cultural perception that learning happens in a class.


I am addressing the part about not being lifelong learners. My experience suggests this is not quite accurate and that people will do so, if they can.


>I am addressing the part about not being lifelong learners.

I'm not disagreeing with you but I do want to point out that your semantics of "lifelong learner" is not the same as msluyter's:

msluyter's "lifelong learner": self-directed, self-motivated learning driven by inner curiosity. Perhaps another word would be auto-didact[1].

KGIII's "lifelong learner": the learner's brains can continue to receive new knowledge well into old age especially if an outside factor (e.g. you the employer) provide structured education

Your definition isn't wrong... but it differs from msluyter's so you're talking past each other.

[1] https://en.wikipedia.org/wiki/Autodidacticism


The "motivated by curiousity" part should be irelevant imo and so should be preference for self direction as opposed preference for classroom setting.

A more meaningful definition is the one about what the person does: continuously learning their whole life. Whether it stems from curiosity, a feeling of duty, an attempt to mimic some role model from the past, or a pragmatic strategy should not matter.


Maybe? In this case, it was they who took the initiative - I just provided the means. If that makes any sense?


I think the resources of the employee are more the constraint than the manner of education - especially on the low end of the salary scale, both money and available and/or offsettable time. Without the support of the employer, hourly employees especially would have a much more difficult time re-training.


In our case, they were still given adequate salary.

"I want to learn to..." "What do you need from me in order to...?"

I've covered everything from salary to child care.

By the way, the costs for salary and benefits were a fairly small percentage of overhead. A decent print room can be more expensive than even the highest paid employee.


That is wonderful, but I feel that employers with the wisdom to implement policies like yours have been shrinking in number. I think this is something of a consequence of making every person individually carry their own financial weight, resulting in bad public and private policies in a number of areas.

But it's also the pressure on businesses to extract maximum profit in their 'core competitive area'... it leads naturally to the thought that employees should pay for their own training with their own time and money. I think everyone suffers a little loss of efficiency that way.


I have no control group, but I am pretty positive that doing this sort of stuff resulted in great profits and value for the company. We had a trivial amount of turnover. I opened in 1991, but really started hiring in 1995. I sold for a decent nine digit sum in 2007.

In that amount of time, I can recall fewer people leaving than I have fingers and toes. In the end, we had a bit over 225 people and offices in three States.

I am lacking a control group, and I haven't the ethics to make one, but I'm convinced this was what enabled us to do what we did, as well as we did it.

I'm not very articulate, so I hope that helps. It bothers me to hear that people don't have their company invest in them. It wasn't me who made it all work, it was them who enabled me to go much further.


> In my case, there weren't exactly a whole lot of traffic engineers. We had to make them, and we did.

i think that's the model in a nutshell. all companies should be doing this. no companies should be bellyaching about a shortage of qualified workers.


Want to hear something scary?

Lead traffic engineers make less, according to the web, than I used to start them at.

Programmers, on average, also make less than I used to start them at. In 1998, depending on experience, I'd bring them in at about $120k, plus bonuses and benefits.

Suffice to say, I see serious issues.

One of the best things about training is that you get the exact skills you need. I had no moral issues with pushing research to a local university and then poaching them.

It's a damned shame, I think, that businesses often refuse to invest in their people. I hired people because they were able to do things that I could not. If I could have done them, I'd have not had to hire them. Frankly, I'd like them to be able to do their best. That means ensuring they have the tools they need. One of those tools is an education.


Honestly, it sounds like you were a good businessman with a decent moral compass.


Thank you. Though, there is a profit motive. Happy employees do good work. Educated employees helped keep me in front of the competition that was coming online.

We'd eventually expand to offering pedestrian traffic modeling. We'd sell appliances that enabled them to keep their data in-house, as well.

Curiously enough, I'd expected to remain in academia. My dissertation (my Ph.D. is in Applied Mathematics) was based on traffic throughput, optimization and prediction, which led to a contract being offered. I was able to be in the right place, at the right time, and able to take advantage of it.

The rest is, shall we say, history. I ended up in the field largely because the data was free and DEC was able to throw compute resources at me. It is largely luck, even though I put lots of work into it.

Ah well... I don't have words of wisdom. It was largely due to just timing and being able to take the risks. Even selling was largely luck. The economy crashed and the government wanted to build/expand highways as 'shovel ready jobs.'


I've worked across a variety of industries, from warehouses to factories to big government bureaucracies to startups. I'd argue that the problem isn't that people don't have the curiosity to become lifelong learners. Almost everyone is enthusiastic about something (most people have hobbies they spend their own time on). The problem is that their workplace (and to some extent, society) don't reward curiosity.

If you're a programmer who works at startups, it's easy to think of learning as part of the job -- your tools and conditions are constantly shifting, which makes learning part of the work culture.

Unfortunately, most industries don't work that way. At a warehouse, you do the same thing from the first day you start to the day you leave. It's much the same in the government and at other large bureaucracies.

The problem here isn't with our educational system, it's with the modes in which we work. And to be honest, that's fine. Most people who work in industries that have static work conditions optimize for other things, like family time. It's also very hard to see outside of your silo and imagine a future where you're doing different work.

Most people are optimizing their life for right now, not their theoretical career 10 years out. Unfortunately, there are fewer stable lifelong factory jobs every year. This forces people to move outside their comfort zone, which makes lifelong learning necessary.

I don't have an answer to the problem, but some potential solutions are:

* Reward employees for taking time outside of work to learn skills

* Show people examples of others like them who have learned for motivation

* Find something a person is really interested in (not because it's a "valuable skill", but just because they think it's cool), and help them take small steps to start learning it

Basically, I think it's possible to help people transition to new careers further into their lives than you think. I don't think it's ever too late for someone to learn a new skill, if they have the right incentives.


> It's far from clear to me that any beyond a small percentage of people generally, but perhaps Americans in particular, have the capacity to become lifelong learners.

This is very much incorrect, although I imagine that there is an abundance of anecdata that supports your hypothesis.

First, most people are learning and are developing cognitively for their entire lives. To get some simple data points, ask anyone who has had kids or who has been/stayed married -- these are fairly intensive ongoing learning experiences. But maybe it's different for formal learning...

For simple data points on that, I encourage you to volunteer your time at a retirement community to teach basic tech to the residents. Teach them how to use apps like Facebook, FaceTime, Skype, or even Snapchat with their family members (esp. grandkids). They will LOVE it, and they will definitely learn. Of course, some will be more curious than others, and some will learn more quickly than others, but the capacity and motivation is there.

I agree with your point that motivation is sometimes an issue, but there are a variety of reasons for that, and some of those reasons don't reside with the potential learner.

> I feel that the answer is basically "let's fix our education system," which I -- again, perhaps overly pessimistically -- believe to be politically impossible.

This is a really good point, but I think that there is a lot of potential for disruption in the education sector.


> What actually makes someone a lifelong learner? Curiosity.

A lot of people, maybe most, can't learn from written text; they need someone to explain things to them in person. This tends to come off as a lack of curiosity, but often there is some underlying learning issue that people are disguising by pretending they don't care or whatever.


I've never heard that before. Do you have any sources for that?


Looking at the results of the National Adult Literacy Survey would be a good place to start.


My belief is that the US freedom for curiosity is a side effect of being wealthy post WW2 relative to the other countries. The extraneous buffer allowed people to party through college and still make a living but is non-competitive when everyone else is actually learning. (at an accelerated rate due to the internet)

The globalization theoretically works both ways but most Americans aren't willing to live like kings in some other country due to cultural differences, perhaps even a disdain for other cultures. Safety is also often a concern, but overblown as the US is actually very unsafe due to its gun culture. The true value of a dollar on the international scene isn't revealed as most Americans are limited in how they can use their dollar.


That would be gang culture, not gun culture. If gun culture were a problem, then gun shows would be the most dangerous places.


Yes, this is very true. Most companies aren't going to keep you long enough to spend lots of money to train you. In the old days you expected to be at the same company until you retired. These days you're lucky if you are there for five. Training should really start at the high school level. It's a disgrace that most students graduate from 12+ years of school without the training to get a good job but they can attend a codecamp for 1 year and get a starting job as a coder. It's no wonder we have a large number of workers that feel the american dream is slipping away. They never learned the skills they need to survive and thrive in our current economic system.

What's even sadder is that leaders are not focusing on this part of the problem, but instead on silliness like who uses what bathroom. Things need to change.


> In the old days you expected to be at the same company until you retired. These days you're lucky if you are there for five.

It sounds like you want to shift the onus of training from the organization that benefits from the training to taxpayers and workers. Why is that? Why not try to shift the employer/employee leverage balance to incentivize employers to again train its labor force, provide long-term employment, and give an improving quality of life to its workers? It's not like people stopped being long-term employees for no reason. Employers stopped showing loyalty because they didn't need to anymore. Given the fact that wages have been stagnant since the 70's when trickle-down economics were enacted, yet large corporations have seen massive gains, wouldn't you say it's time for a different approach to the direction of our economy?

This is all also assuming that Americans need a combination of experience and education to get ahead, which is just false. Education does help, but only pockets of highly populated areas of the country are experiencing growth at the moment. The rest is stagnating or on the decline... But that is a different issue that involves the dissolution of anti-trust laws more than an employer/employee leverage shift[1].

[1] http://washingtonmonthly.com/magazine/novdec-2015/bloom-and-...


Well, one of the potential pitfalls of practical skills alone is that you end up with a lot of skill-based workers without a strong understanding of what it is they're actually doing. Most tech work is a healthy mixture of theory and practical skills, often changing on a day-to-day basis for any given tech job.

Dealing with clients who have little theoretical knowledge about the infrastructure they're maintaining is incredibly difficult, as they just don't have the background to understand why there are performance issues, or why an error happens, and so on, simply because they weren't really trained to understand it; they were trained "You click this to do X, this to do Y". I am not suggesting everyone needs to be an expert in everything, but a lot of modern IT infrastructure teeters on the precipice of complete disaster; the people maintaining it are the IT equivalent of "knows enough to do damage" users in an end-user setting, and the software is specifically set up to accommodate such users. This isn't necessarily bad; easy-to-use software for running mission-critical systems is great for everyone, but there are a lot of skills where you really need to know more than "click this to do X" to be successful and tenable.

As an aside regarding your comment on leadership focus, people can handle multiple problems at once. It's a bit strange to write off someone else's problem as silliness just because it doesn't directly affect you.


We should apply this thinking to things like personal finance and budgeting. You can take Accounting at the high school level and walk away with very little practical knowledge that applies directly to getting by in real life.


This is a problem that many people are trying to solve. The current US education system is geared towards producing mostly blue collar workers, and promoting a few geniuses to white collar by funneling them to universities. We definitely need to change it to put more emphasis on practical skills, or at least on skills that encourage further learning.

I imagine a kid who loves learning would be capable of picking up skills like mathematics reasonably well, while a trained but burned-out PhD mathematician might have lost all love for his chosen subject. I've actually seen this happen firsthand as a college student: fellow students who had worked intensely through prep schools were absolutely burnt out by the effort and had no interest in academics for the first couple of years.


I cringe when I hear about plans for "tuition-free" 4-year college. I went to a fancy private high school, with a bunch of kids who graduated and went to fancy private universities. From what I can tell, it's done very little for them. The people who aren't working in tech or finance are literally on food stamps, extending their time in college, doing something like Teach for America, or working retail jobs. With 4-year degrees.

As if that weren't agitating enough, universities seem to relish their role in "not providing job training", being "not a vocational school", and "producing well-rounded students". The government shouldn't be footing the bill to send people to these schools (through subsidized loans or other mechanisms), which for most students are glorified daycare centers for young adults who don't want to join the workforce yet. It's a huge waste of money, as the current level of student loan debt attests.

I don't know all the details here, but an apprenticeship program seems like a step in the right direction. The federal government should forgive student loans (and lay the cost on the universities).


Problem is, the university lobby is now massive. Tons of free money, backed by the only non-dischargeable debt there is. The flaws in education are the flaws in politics at large: too much corruption that is too detached from the suffering of the commons. All major structural social issues trace back to a broken incentive structure and an absence of transparency and accountability at all levels of policymaking.

Of course, that means changing the framework of that decision making, which is constitutional. And the established structure is wholly intentional, it is not just some mistake the people just need to "wake up and realize is wrong, and then we can all fix it". There are antagonists who prosper under the establishment, who now wield controlling influence in all spheres.

The question for humanity in the 21st century looks to be how to fix that kind of structural misalignment of incentives towards the will of power.


You could make the same argument from the worker's perspective. Companies don't need to hire graduates, they need to offer training for their employees! Why are we focused on what people should do when companies are the ones setting the bar?


The skills need to be non-transferable to other places, otherwise the company is just paying to supply their competitors with a better-trained workforce.


The age old management dilemma: "CFO: What happens if we train them and they leave? CEO: What happens if we don’t and they stay?"


This is a very dim, short-sighted view. Most large companies do have training programs and don't artificially limit the skills to ones that are not transferable. Investing in the skills of your current employees can be far cheaper than trying to hire for those same skills.

Even in your worst case scenario you've presumably gotten some benefit from the employee before they depart and spent less than you would on a bad hire who has the desired skills (and therefore makes more).

If the result of your investments really is competitors with better trained employees then you have a higher skilled pool to recruit from when you need it. Plus, you've saved money in the meantime.


The employer has ways of making it more attractive to work at their company: more pay, a nicer workplace, more freedom... But beyond that, many people tend to develop loyalty when they start as an apprentice and the employer gives the impression of offering a stable job for life. That system works reasonably well here in Germany and, from what I hear, in Japan.


To be fair, in Japan it tends to be more about people being discouraged to leave, lest they be branded for life as disloyal, and therefore not employable by any other big company.


If people are leaving your company right after being trained, your company has some major problems.


Well, they can also HIRE better trained staff from their competitors.

If companies really wanted to preserve talent for the long term, however, they should look long and hard at aggressive outsourcing programs that evacuate all technical and operations expertise to offshore companies. This has been "training competitors" for decades. There will be a comeuppance for this someday.


No, all the training has to do is increase the corporation's profits by more than the cost.

People are getting paid to provide labor, not to be serfs.


In theory the company can instead hire trained employees whose training is paid by the other companies, and not invest anything to train their own workforce. And that ultimately leads to no company investing into training, other than the bare minimum.


In theory after training you can demand a salary higher by at least the value of training.


This. It is much cheaper for me to hire a fresh masters in telecommunications and spend 6 months in training than to hire a 15+ year industry veteran with no college education.


The company I was at previously had this exact dilemma. In the end, they are still ahead of the competition, in part because of their trained workforce, and in part because training people also makes the hierarchy more inclined to listen to them. The people who leave for companies that don't train are, for that very reason, not as effective in their new roles.


There aren't so many of those jobs, but there are solutions that make this not so much of a problem.

One pharmacy I worked at trained their own pharmacy technicians. You can go to school for it and get a license; if you train at the pharmacy, however, it is not transferable to a competitor.

Some companies have contracts, stating that you must work at the company for x years after completing training. Occasionally there will be money to pay back. This is exactly so the company feels it can get some of its training investment back. I've seen this quite a bit with education reimbursement and management training programs.

You can also simply avoid making your workplace a horrible place to work. Treat employees well, don't take advantage of employees-in-training, give vacations, and be reasonable. You know, normal stuff.


It's absolutely true. Here's one example: the Los Angeles Department of Water and Power spends millions on its apprenticeship programs, yet many apprentices never finish or end up working for some other company.

Here's the story:

http://www.latimes.com/local/lanow/la-me-dwp-audit-20170328-...


There are some companies that take this approach, though I am not sure it is intentional. Goldman Sachs is one of the most famous examples, using lots of proprietary in-house technology, which can make it difficult for leavers to get new jobs elsewhere.


The flip side is that I'm paying through the nose to get trained to do a job for you, but with no actual job, and maybe never an actual job.


Puritanical work ethic America can't shake off combined with an arrogant expectation of rugged individualism.


Which, it turns out, emerged from ... a political speech by Herbert Hoover in 1928.

https://books.google.com/ngrams/graph?content=rugged+individ...


Jobs are for machines; life is for people. How many of these 'jobs' really contribute to a better world? You can't blame people for wanting them regardless, since we've tied their whole right to live to 'employment'. I'd say Americans especially could do with a broad, Renaissance-style scientific and cultural education.


That's sweet and all but the rest of us have to pay rent


There are ~23 empty homes for every homeless person in the US[1].

For those of us living in the US, have you ever wondered why we have "housing crises" and such high rents when we have such an absurdly high surplus of empty homes?

[1]https://skeptics.stackexchange.com/a/22336


The problem is those empty homes aren't located where you want to live. Among cities in the US, Flint, Michigan has the highest home vacancy rate. Coinciding with that surplus, the median rent is more than a couple of hundred dollars per month below the national median.

If someone is feeling like there is a housing crisis in some part of the country, the obvious solution is to move to Flint, or another community with a similar surplus of housing and cheap rent. However, there is probably a good reason why they haven't already done so.

They've done the math and have realized that while rent may be higher, that higher rent has much greater value (access to better jobs, lifestyle, environment, etc.) and is actually worth every penny.


I'm sure plenty of people would be up for living in the ghost towns around the country if it weren't for the issue of needing access to food and services (utilities, medical care, etc).

On one hand, you want to think it's a great idea to take all the poor and homeless and disenfranchised and just pay to let them live in these houses otherwise decaying for nothing.

But then, at the same time, you realize the real cost isn't in the houses anyway. It makes way more financial sense to keep them where they are, ramp up the residential density, and let the abandoned houses and towns degrade. Maintaining their infrastructure and supply lines, especially across vast low-density areas, is an insane expense, on the order of hundreds of billions of dollars wasted annually, and trying to artificially maintain that circumstance to use up the poorly planned real estate of yesteryear is asking for financial trouble.

The real impairment holding back American prosperity is structural, but it is more in how we don't promote sustainable and affordable living. We artificially constrain population density for NIMBYisms, build cities around cars instead of people, and run infrastructure deficits in almost every county in the US that leads to the degradation of society as the bridges between us, in all their forms (digital or physical) crumble with age and disuse.


>that higher rent has much greater value (access to better jobs

Indeed, and this is getting to the point that the parent and I are making. The system that ideologically mandates maximum employment is the same one maintaining the artificially inflated housing prices.


But the economy always seeks equilibrium eventually. Rents are rising in places like San Francisco because it is trying to reach equilibrium with Flint. While it may take time to get there, eventually we should see rents reach a point where it is a better deal, or at least no worse of a deal, to live in Flint, even if the career opportunities are not as good.

For instance, as a hypothetical: if you make $70k post-tax as a developer in SF, or $15k working part time at McDonald's in Flint, then once your SF rent reaches about $5,000/month you may actually come out ahead in Flint, at least on the income side. And in fact, the median household income in SF is only $88k/year (pre-tax), which means there are probably people already living in SF who aren't far off being better off moving to Flint on the income side alone (not necessarily for the other reasons listed).
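
To make the arithmetic explicit, here is a minimal back-of-the-envelope sketch in Python; the $500/month Flint rent is my own assumption, not a figure from this thread:

    # Rough annual leftover after rent, using the post-tax figures above.
    # The Flint rent ($500/month) is an assumed, illustrative number.
    def leftover(annual_income, monthly_rent):
        return annual_income - 12 * monthly_rent

    sf = leftover(70_000, 5_000)   # SF developer: 70000 - 60000 = 10000
    flint = leftover(15_000, 500)  # Flint part-time: 15000 - 6000 = 9000
    print(sf, flint)               # roughly break-even at ~$5k/month SF rent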


How does the economy know what equilibrium is?


This seems like kind of an absurd oversimplification. Maybe what you're saying is true over geologic timescales but if SF suddenly had a large amount of vacant housing, rents wouldn't continue to increase regardless of what's happening economically in Flint.


Why would SF suddenly have a large amount of vacant housing that isn't there now, unless people lost interest in the city? A large surplus like that suggests that things have swung too far in the other direction. At that point, rents will decline to bring equilibrium back the other way, exactly as you see in Flint today.


I don't know, overoptimistic real estate developers? My point is just that asserting that rents in SF are increasing because they are trying to reach some kind of equilibrium with Flint, MI obviously involves glossing over tons of factors that affect rents more in the short term. You might as well argue that rents in SF are rising because they are trying to reach equilibrium with the rents in rural villages in Mali.


> I don't know, overoptimistic real estate developers?

If developers in SF become too optimistic then those homes become a drag on the economy, and SF soon loses the career advantage it currently has, still leading to equilibrium eventually.

> tons of factors that affect rents more in the short term.

I felt I was quite explicit that finding equilibrium takes time. I am not sure why you are focusing on short term movements in price.

> You might as well argue that rents in SF are rising because they are trying to reach equilibrium with the rents in rural villages in Mali

In some ways that is true. China's rags to riches story happened in large part due to costs in the US rising to the point where it became cheaper to move (some) operations over there. Reaching equilibrium at the international scale is more difficult due to many artificial barriers found along the way, however. There is really nothing stopping an American citizen living in SF from moving to Flint.


From your source:

So the answer is NO; 24 is a rather gross exaggeration of the number of homes available for sale/rent.

Gotta read the source, not just cite it.


Ditto my friend, which is why I used the word "empty" (meaning vacant) rather than "available for sale/rent".

13,785,000 / 610,042 ≈ 22.6

24 empty houses for each homeless American (assuming only the USA) seems like a reasonable estimate, especially since the meme is older than 3 Feb 2012.


I agree to some extent, but for a lot of jobs the degree is your skill. I employ a lot of low-level coders who are trained to do what we want them to do. I haven't had trouble finding those.

What I do have trouble finding is someone with the theoretical knowledge of how to apply machine learning, of which design patterns, paradigms, technologies, and libraries we should use and why, and so on.


There are people who have CS degrees but can't code. After learning the syntax of a language, anyone can write simple programs, and with some experience and dedication some can even write well-crafted code that others can use. What matters is one's approach to problem solving; this is learnable, and you get better at it as you work on things like machine learning. Adaptability is an added advantage.


A CS degree isn't intended to teach students how to code. What did you expect?


While I definitely agree to some extent, I'd just like to note that those things can be learned without the need for a degree.

In fact, many people with degrees in those areas skirt by and never actually develop a deep grasp or understanding of the material. Not saying this is true for everyone, but I'm just pointing out that a degree is no guarantee.


Wouldn't the low level people given enough time and perhaps mentorship eventually become the second type? It sounds like it could be the same person at different times in their career, or are you saying they'll never get there?


In my experience, once people are trained to think about the hammer (some programming paradigm/language/library), rather than the table (the underlying theoretical problem to solve), it's very hard to unlearn. They become superb at wielding a hammer, hitting the nails at a perfect 90 degree angle every time, and are very proud of that fact. But start asking them about the statics of the table, whether we could maybe make one with half the nails that can bear the same forces, and they'll simply get annoyed about your utterly impractical attitude.


This has not been my experience at all. I have watched many developers progress from being coders ("hammers") to being thoughtful architects who have no trouble seeing the forest for the trees and re-imagining problems from first principles.

Of course people will try to lean on the skills they already have when tasked with solving a problem. This is completely orthogonal to whether they have any trouble adding to their skills. The former is often the correct response to solving an immediate problem as quickly as possible, while the latter is something that happens over a longer period of time and many projects.

Your assertion that people get stuck in this "hammer" mindset and have a harder time learning once there does not match my observations at all. From my experience the people who see a brick wall between these two skills fall into two categories. I don't know if you fall into either of these, but it would not surprise me.

1. People on the "architect" side of the wall who want to justify their feeling of superiority, their higher salaries, or their enterprise's highly regimented hierarchy that separates the two.

2. People on the "hammer" side of the wall who suffer from imposter syndrome and seek to justify their sense of inadequacy.

Both are incorrect. There is no wall. In fact, between those two skills is a well worn and wide road that most developers move along throughout their career. Many don't complete the journey to your (or my) ideal of "software enlightenment", but most make progress, and being trained as a "hammer" doesn't stall the progress, it helps.

Also worth reflecting that software is still relatively new and we really don't know what we're doing. We're all making it up as we go along for the most part, so let's not be so quick to prejudge the capabilities of our peers. All of our enlightened architecture may seem pretty foolish a century from now, or even a year from now.


> Both are incorrect. There is no wall.

There is no wall if you define a wall as something that can't be tunnelled through. So I'd agree that a wall is not a good analogy (and I also never used it). But I'm reasonably sure that there is a hill in between, namely the hill you have to climb to go from one local optimum to another, deeper one. Whether anyone can climb that hill depends on her/his drive to learn. All that (good) academic degrees do is basically filter for people with that drive. That doesn't mean that people without degrees can't have it, of course; it just means that some effort needs to be spent finding them.

> Also worth reflecting that software is still relatively new and we really don't know what we're doing.

I agree wholeheartedly, and that's exactly why I'm very skeptical of people falling in love with some hammer-du-jour. It can be an OS, it can be a paradigm, it can be an editor, a library or a programming language - one should always be aware of its downsides, unknowns, alternatives and predecessors (and why/if they failed).

Edit: One more thing:

> I have watched many developers progress from being coders ("hammers") to being thoughtful architects

I don't doubt that one bit. To continue with the analogy, there are two ways to approach a 'hammer'. One is becoming proficient with it while recognising that it is a tool and only a tool, and that its use should be questioned as often as possible. To be able to do that it is often necessary to learn a few other tools at the same time, just to see the other side of the coin; learning the right tools most certainly makes you a better architect as well. The other way is to fall in love with the hammer and then start seeing everything as nails. I often see examples of people stuck in such a mindset, and it usually happens with things that have a large community that serves as an echo chamber for people's fandom.


This is all fair and tempers my interpretation of your original post. Some people do get stuck, and members of echo chambers are a good example. I think this observation is likely a snapshot of people who usually do move forward, but take a bit of a pause as they indulge. Another example would be people who just don't care, who found a reasonable 9-5 "job" and aren't very interested in seeing it as more than that. They often have to be pushed hard to advance their skills. Thanks for the thoughtful reply.


Hmmmm

I have a degree, but not in a closely related field. My first job out of college was very much an apprenticeship for me, making the transition from having been a kid who could program, to being an employee who could write software that existed in a wider context. I left that job 20 years ago, mostly because my opportunity for growth was limited, and am now a Principal Engineer at Amazon, and am a little past the hammer/nail stage.

Having presented myself as a counter example, I do have some observations. One of the "leadership principles" at Amazon is "Learn and be curious". It feels like this has been lacking in your experience. While it is everyone's responsibility to look out for themselves, it is a particular responsibility of more senior folks to provide guidance and mentoring to help people avoid these traps.


Maybe my point wasn't entirely clear. I actually think that an out-of-field degree may help rather than hurt, because you have an additional perspective, especially if it's a degree grounded in math.

I definitely see your point about leadership though - university is just one thing, the right guidance at the first longer term job is certainly essential, and something I didn't yet have the luck to see.

To sum up, I didn't mean to say that a degree is what you need, but rather that training people only in tool usage often backfires later on and doesn't necessarily help them get at the underlying fundamentals.


> What I do have trouble finding is someone with the theoretical knowledge on how to utilize machine learning patterns, which design patterns, paradigms, techs and libraries we should use and why and so on.

As someone who is much more comfortable with learning the principles and ideas behind tech and considering its pros and cons, that's surprising to me. Being fluent and confident in the details, minutiae, syntax, very specific tasks within those topics feels much harder to achieve.


> What I do have trouble finding is someone with the theoretical knowledge on how to utilize machine learning patterns, which design patterns, paradigms, techs and libraries we should use and why and so on.

All that is learnable. I have seen people with good background in math and basic coding ability pull that off reasonably fast.


Education in this country needs to be totally reformed. High school is basically day care in most places. No reason the average 4 year degree cannot be completed at ages 14-18. Then if people want to get a masters they can go ahead, but it shouldn't be expected like going off to college is.


Idk about you, maybe you won the Putnam, but when I was 15-20 my brain was still working on getting fiber down for the last mile of the prefrontal cortex. The bandwidth required to really grok calculus (or even write a coherent essay on the metaphors of The Great Gatsby) would have been a bit untenable.

Don't get me wrong, I would have loved to have been given these types of challenges at a younger age, if only to be forced to learn time management skills sooner. However, the intuitive understanding of something like the mechanism by which integrals work would have been lacking, and would have resulted in an employee who lacked the ability to use the lesson taught by that particular type of abstraction in a diverse and creative way.


> I was 15-20 my brain was still working on getting fiber down for the last mile of the prefrontal cortex.

From what research I've read, that is happening later than 15-20. It's more in the early-to-mid 20s.


That was the implication I was trying to make, I apparently failed in that regard.


While I am sure that some exceptional people can complete a 4 year university degree by age 18, there is no basis to say that's even remotely possible for the norm.

For many complex and intertwined reasons (not all of which have to do with school) high schools are, in many places, failing to teach many students the fundamentals in mathematics, reading and writing.

I think that this new emphasis on training programs misses the fact that many students will simply be unprepared unless they first learn what they were supposed to learn in high school.


No evidence to support the thesis. No special personal perspective to share the secret sauce.

The unfilled job number of 6 million is an interesting data point. I'll give them that. But a job does not equal having a good life.

In short, the only takeaway from this is that it's a waste of time. Am I being harsh? Not as harsh as an unempathetic, presumptuous declaration of what people in America need (by the way, you need to at least provide evidence and support to make claims like this).


Job training has been universally popular among US politicians for a long time, but the evidence that it helps people find work is weak. See

https://en.wikipedia.org/wiki/ITT_Technical_Institute


A lot of this is because nobody wants to be honest.

A huge portion of "vacant" jobs are companies maintaining listings for 10-year full-stack veterans while offering a $60k-a-year salary. Unreasonable to anyone, but the point is to dangle the carrot in case you get super lucky and nab talent at well below market rate. It doesn't cost anything to keep the offer on the market. Such listings are a great excuse to complain about vacancies, though.

If there were real demand, truly authentic, honest demand for work that wasn't a known fleeting thing, humans would adapt. Human history is nothing if not the demonstration that people migrate toward opportunity. It might happen on the order of days or generations, but people will do it. So don't sit around complaining that nobody wants to move somewhere the salary is "competitive" when you have rigged the externalities, like taxes and property ownership, so far against ordinary people; the bafflement that nobody wants to go where the "jobs" are is dastardly.


A useful point and one I make in this essay on boosting apprenticeships: http://seliger.com/2017/06/16/rare-good-political-news-boost.... The "College for all, all the time and everywhere" mantra needs to end. College is great but is not a panacea.


I heard that most American degrees are, in terms of knowledge and whatnot, worth a high school diploma from some European countries. Any truth to that at all?


The German Realschulabschluss is similar to an American high school diploma. Students on the track toward university add the Abitur on top of this, which is similar in depth and difficulty to an undergraduate degree at an American community college. That is why Germany doesn't have pre-med, pre-law, etc.


Nope. This is nonsense.


No.


>Companies, especially in the science, technology, engineering, and mathematics (STEM) industries, are shifting their recruiting process from “where did you study?” to “what can you do?”.

But are they really? Go look at job openings. You will find more often than not 'X degree required' (or at least recommended). If you don't have a degree, your application will often go straight into the trash, before the question "what can you do?" is asked.


Two recruiters from Switzerland recently told me that the only things that matter are your study programme (it should fit the one given in the job opening) and your degree (B.Sc./M.Sc.). Salary is non-negotiable for entry-level jobs. "What can you do", or whatever you did while in college, only increases your chances of getting hired. But having basic programming skills is usually already enough to get an offer.


> Salary is non negotiable for entry level jobs

I can assure you that our students (Holberton) do negotiate their entry-level jobs. The majority successfully do so :)

> the only things that matters is your study programme and your degree

When we started we had no track record whatsoever. Our first students found internships and also jobs everywhere, including both completely unknown but super cool startups and very high-profile companies such as NASA, Apple, etc. They were all selected on what they could do, not on a degree (we are not accredited) nor on the school itself (we had no track record).

Today we are starting to have a very good reputation, so the problem is different: our students don't have to fight anymore, companies come to them. But for the first students it was very clear that people didn't care about degrees or "where do you come from".


More like "have you used hyped-new-tech-v-0.6.1" because there is no way you can learn it on the job.


That's for people who have to look at job openings. Without a degree, on experience alone built with open source tools and freely available resources, I get more recruiters spamming my LinkedIn than I care to sort through or respond to about 90 percent of the time.


There is a New York Times article today (6/28/17) about the idea of skills vs. degrees.

> https://www.nytimes.com/2017/06/28/technology/tech-jobs-skil...


When I was in High School, I went to 2 years of Vocational School at the same time and took advanced studies for college so I'd have something to fall back on one way or the other.


Ditto. I had two years of VoTech (1/2 day) electronics, which I'd say was the equivalent of an AA degree. It made EE easier as I could focus on the math & physics in the first year instead of electronics.

Unfortunately they scrapped that program in the early 90's as they shifted everyone into an academic track.


What vocational trade?


Industrial and Residential Electricity. We learned mainly residential but our instructor wanted us to be prepared for industrial as well. In the end, I was accepted to a small college but didn't want to work a FT job and attend college so I went into the Navy instead, into the AEF (Advanced Electronics Field).


These articles for the most part live off of an imagined past. It wasn't hard to train someone to work an assembly line, which used to be all you needed to do.

Skilled trades such as plumbing and electrical had, and still have, apprenticeships...


Can critical thinking and debate skills be taught? Throw in some geopolitics and I can support the idea...


[flagged]


Please don't post like this on HN.

https://news.ycombinator.com/newsguidelines.html



