Business Can Pay to Train Its Own Work Force (chronicle.com)
165 points by petethomas on June 22, 2015 | 145 comments



The worst class I had in college was Software Engineering. It was the university's attempt to prepare us for the work force, and it was taught by an adjunct who had plenty of industry experience, but it was already 10 years out of date.

Industry processes are mostly fads that change with the wind. CS fundamentals, however, are much more stable. 20 years from now, knowledge of automata, graph theory, and complexity analysis will still have immense value--a scrum master certification won't.


I also had a Software Engineering class (actually two of them) focused on how to build software in real life. This was in '03 and we covered things like the waterfall methodology, requirements gathering, functional specs, etc. If taught the exact same way today it would be woefully out of date, but the time we spent on requirements gathering (where the teacher or TA pretended to be a product owner and purposely gave really crappy answers, and we had to extract useful information bit by bit) was one of the best pieces of prep I ever received.

All in all, it was boring and tedious but it certainly wasn't the worst class I ever had in regards to preparing me for a career in technology. I use those lessons to some degree all the time, I rarely directly use all the work I had to do to create my own OS...


> I use those lessons to some degree all the time, I rarely directly use all the work I had to do to create my own OS...

I think you probably would have learned those lessons after a few months on the job. You probably wouldn't have picked up the knowledge to build your own OS on the job, however.

>All in all, it was boring and tedious but it certainly wasn't the worst class I ever had in regards to preparing me for a career in technology.

Did you learn about billing clients, and effectively advertising your services in your CS degree? What about equity versus salary tradeoffs?

If at least 50% of working in technology is soft skills, why doesn't a CS degree spend 50% of the time teaching you those? The answer is that a CS degree isn't supposed to be vocational training.

Vocational training, including learning to talk to clients, should be done on the job, during an internship/apprenticeship, or in a specialized vocational training program. By including it in a college degree, employers have successfully pushed employee training costs onto workers and society (as the article argues).


The article isn't complaining about CS majors not getting jobs, it's about soft liberal arts majors not getting jobs.

When companies hired people for 30 year careers, they could afford to invest a tremendous amount in training. When they hire people for 2-3 years, they have less time to amortize the costs. And it's up to the employee to convince the companies that they can learn quickly, and on their own if need be.

Any CS major with decent grades and a positive attitude can learn anything in most any job. (Certainly CS, consulting, finance, marketing, even some kinds of sales) I can't say the same for liberal arts majors. There are great ones out there, but also a lot of folks who goofed off for 4 years and didn't learn anything.


You could take that one step further. CS has, in its modern incarnation, only been around for ~90 years. CS, as it is taught at the undergraduate level, has changed dramatically in recent years, and will continue to change into the future.

Math, on the other hand, has been around for thousands of years, is relatively stable, and unlikely to become obsolete in the way a scrum certification, or even a machine learning algorithm, will.

Of course, 'CS fundamentals' usually end up at a very close intersection with math. I'm just suggesting that mathematics has an even deeper level of the 'stability' you referenced.


>Of course, 'CS fundamentals' usually end up at a very close intersection with math. I'm just suggesting that mathematics has an even deeper level of the 'stability' you referenced.

I agree with you completely. The parts of CS that are stable are the parts that are based on rigorous mathematical foundations. I think that teaching things like Object Oriented Programming strays too far from a rigorous foundation--away from math and even engineering into craft (which belongs in vocational training).

When I look back, the classes that I learned the most from were Discrete Math, Automata, Design and Analysis of Algorithms, and Programming Language Concepts (which went into the academic side of programming language research more than what was currently in use in industry).


I mostly agree, but for me, the line gets blurry around the applied areas that have a lot of depth: computer architecture, operating systems, networking protocols, compilers, and databases. In all of those, I learned a lot about theory, practice, and engineering trade-offs, all of which was worthwhile. I didn't study it myself, but I would imagine distributed systems is (or should be) a similarly rich subject. I also learned a ton from studying the history of computing, which I wish would be more of a focus for those entering the industry.


Good point. Many of the areas you mentioned do have a lot of formal underpinnings, and there are large bodies of research to look to for guidance.

Computer architecture is big-E Engineering, done by Computer Engineers, for example. The networks class I took was also one of the most math-heavy, and most of the book was supported by proofs. In addition, we spent the first half of databases working with only relational algebra.

If you look through a textbook on any of the subjects you mentioned, and compare it to say a book on design patterns, the distinction between math/engineering and craft is pretty clear.

>I also learned a ton from studying the history of computing, which I wish would be more of a focus for those entering the industry.

We went over the history in depth in my program--from Turing to Konrad Zuse to Backus. I also found it immensely useful.


My software engineering class spent most of the semester going over design patterns, which are in fact quite useful to learn in school. And then maybe 1/4 of the class was spent going over the various development methodologies. I agree that a class devoted entirely to methodology would be complete overkill. However, I think there is room for getting some exposure to it in school, ideally before taking higher-level classes, where having knowledge of existing ways to structure your group work will be beneficial.


The problem is that design patterns are subjective; they are craft, not science or engineering.

There hasn't been enough serious research done on "Software Engineering" to call it Engineering with a straight face. You can't point to a whole stack of serious research to say that design pattern A is objectively better in situation X because of Y and Z.

What you can say is that design pattern A is currently in vogue so you should probably use it, while design pattern B has fallen out of favor in industry, so you should avoid it.

That is something that belongs in a vocational training program or an internship/apprenticeship not in a university Computer Science education.


You're going too far when you dismiss design patterns as being merely fashion. Just because something is not objectively proven doesn't make it false. Things in the real world are not binary, where they are either objectively proven (hard sciences) or completely false ("The earth is flat"). In reality, a lot of things are gray. Design patterns fall in that bucket — many of them help, as long as you remember that there are exceptions.

If many people have learnt over and over again that global state, for example, leads to more bugs, you'd be wrong in completely dismissing that just because it isn't objectively proven. Because then you'd be arguing that a program that uses only global variables is just as good as one that's properly encapsulated and abstracted. Do you think anyone would take you seriously if you said that?


I don't have a problem with design patterns as a concept. But you need to recognize them for what they are--folk wisdom. Some of it is useful, much of it isn't.

If there is no theory we have to fall back to empirical analysis, and unfortunately our industry hasn't done much of that. The only thing we have to go on is the general "consensus" of the industry, which is cyclical, transient and mostly fashion.

Some of the industry folk wisdom is beneficial and withstands the test of time. Most of us agree that encapsulation is nice. However, we don't agree on what form that encapsulation should take.

OO programmers argue that state should be hidden away inside objects; functional programmers believe that state should be explicit and that we should always strive for pure functions and immutable data when possible. There's very little objective data to support either side (except that functional programming languages tend to have more formal underpinnings). Mostly it falls back to personal preference, which programmer sages you trust, and what the current industry fashion is.
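
A minimal sketch (Python, hypothetical example, not an endorsement of either camp) of the contrast:

    # OO style: state is hidden inside the object and mutated in place.
    class Account:
        def __init__(self, balance):
            self._balance = balance          # encapsulated, mutable state

        def deposit(self, amount):
            self._balance += amount          # callers never touch the raw state

        def balance(self):
            return self._balance

    # Functional style: state is explicit and the function is pure --
    # it returns a new value instead of mutating anything.
    def deposit(balance, amount):
        return balance + amount

    acct = Account(100)
    acct.deposit(50)
    print(acct.balance())        # 150

    print(deposit(100, 50))      # 150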

Our industry reinvents the wheel time and again because we are slaves to the cyclical nature of the industry fashion.

Relational algebra/calculus has been formalized for decades, yet people who don't understand relational theory cried out for something "simpler" and thus NoSQL was born. Fast forward 5 years and you'll find many of the people championing NoSQL had to re-solve most of the tough problems that CS had solved decades ago.

Again, there is nothing inherently wrong with craft and folk wisdom--just like there's nothing wrong with learning salary negotiation, but they don't belong in an academic CS environment. These things are best taught in an internship/apprenticeship after you've learned the underlying theory.


Design patterns are models, something very much within the wheelhouse of engineering. Engineering doesn't always deal with absolute facts as in science. When an underlying system is too complex to fully describe, simplified modeling can be appropriate.


Models are used to get a better understanding of a complex system, or to test a complex system when you can't test the real thing. Design patterns could be used this way, but they aren't. They are used as a design methodology that you are encouraged to follow.

They also aren't based on any formal theory. They are based only on the experience of the people who create them. They are almost the definition of a craft (as opposed to engineering).

In an actual Engineering discipline you would need evidence to prove that your model fits reality as opposed to just trusting the experience of a few guys who wrote a book.


From my experience, design patterns are both taught and used as a starting place for solving a problem which is recognized as similar to a problem which was previously solved effectively in the manner of the pattern. Rarely will the design pattern fit perfectly as the solution, but by recognizing and using the correct one as a starting point for many architecture-related problems, you can greatly reduce the work involved in creating the solution.

They keep people from re-inventing nearly identical solutions over and over, and allow for a vocabulary which can express rather complex ideas because fellow practitioners of software engineering generally know many of the same design patterns. This saves time and reduces the opportunity for miscommunication when expressing a more complex idea (assuming the person you are talking to doesn't lie and for instance say they are familiar with the facade pattern, when they are not).

To me this is using them to both get and express a better understanding of a complex system. I am also able to understand code for new projects I am to work on much quicker when I understand the underlying design of the elements. When the elements are designed with a structure that resembles things I am familiar with this process becomes fairly easy.

I see them as things like definitions of 'Suspension Bridge', 'Victorian House', 'Tunnel', or 'Dog House'. Sure we might see a book consisting of very rudimentary definitions of structures like these as being absurd, for the corresponding type of engineer, but that is because of our familiarity with the form these structures take. That familiarity with the form is the point of design patterns in my opinion. And I believe their utility is indispensable should we want to continue building more complex structures in code.
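
For readers who haven't run into it, here's roughly what the facade pattern mentioned above looks like (a minimal Python sketch with made-up subsystem names):

    # Hypothetical subsystems, each with its own fiddly interface.
    class Inventory:
        def reserve(self, sku, qty):
            print("reserved %d x %s" % (qty, sku))

    class Payments:
        def charge(self, card, amount):
            print("charged %s to %s" % (amount, card))

    class Shipping:
        def schedule(self, sku, address):
            print("shipping %s to %s" % (sku, address))

    # The facade: one simple entry point that hides the choreography.
    class OrderFacade:
        def __init__(self):
            self._inventory = Inventory()
            self._payments = Payments()
            self._shipping = Shipping()

        def place_order(self, sku, qty, card, amount, address):
            self._inventory.reserve(sku, qty)
            self._payments.charge(card, amount)
            self._shipping.schedule(sku, address)

    OrderFacade().place_order("widget-42", 1, "4111-0000", 19.99, "123 Main St")

The value is mostly in the shared name: saying "put a facade over those three services" communicates all of the above in one sentence.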


I have no problem with design patterns as a concept. And I have no problem with design patterns being taught as the collective folk wisdom of wise sages of industry.

However, design patterns are craft, not engineering, and should be taught as vocational training--not as an academic subject.

Design patterns have no rigorous underpinnings. They have very little academic research to back up their efficacy. The only "proof" we have that the design patterns being taught are beneficial is the word of a few guys who wrote a book and the collective folk wisdom of industry.

There is nothing wrong with this, but it isn't a firm foundation for an academic subject.


Again we see another Dijkstra soldier pushing the corrosive "software programming is complex" ideology. Notice how they are the same group as Martin Fowler followers.

> "They keep people from re-inventing nearly identical solutions over and over"

No, they force users to rewrite the patterns over and over again, since a pattern is an idea that a given language is not powerful enough to abstract over. Cf. opinions on the C2 wiki.

Maybe that's why you think what we do is "too complex to fully describe" - you haven't used a language that allows you to abstract at will.


OK, just cool your jets mister, everyone here is having a pretty civil conversation without you attacking people for 'pushing corrosive ideology'.

And in order to formulate that attack you pieced together quotes from two completely different messages and people to make your 'point'.

I said they keep people from re-inventing solutions; however, it was the comment you actually replied to which claimed they were too complex.

I agree they are patterns which languages do not yet abstract over, but until languages do I think they are very useful patterns for programmers to be able to correctly recognize as they occur frequently enough that knowing the pattern saves a lot of time and work. When languages can successfully abstract over those patterns, then knowing them will be niche knowledge assuming new patterns cease to be detected by humans prior to their ability to be abstracted by a language.


> OK, just cool your jets mister, everyone here is having a pretty civil conversation without you attacking people for 'pushing corrosive ideology'.

Please.

Rhetoric = Ethos + Logos + Pathos

Unless you want to believe I chose my words while extremely angry at you personally, and unless you want everyone to never use pathos and talk like robots in perfect clauses and rebuttals connected in a directed acyclic graph, I suggest you get used to this very common rhetorical necessity.


If a college education is too specific to one company then you are locking yourself into that company, and you will lose any leverage over salary or even the ability to leave.


Oh god yes I went through the same pain. Our university decided that every Engineering student had to take two CS courses during the first year. This applied regardless of whether you were studying Civil Engineering, Electrical, Mechanical etc. This was alongside all the other first year courses we were required to take such as Chemistry, Physics, Calculus, Linear Algebra, Statics, Dynamics and all that.

First semester we took Intro to Programming and Algorithms (CS 1100 or something like that). We learnt everything from binary notation and logic gates through to floating point. I went from not knowing any programming language to having a fair idea of how topics like recursion worked; our last project was to write some code to walk a tree using recursion. I also learnt how to use Unix and received my lifelong love of Emacs (which we used in our tutorials) from this course.

Second semester we had to take Software Engineering (CS 1110 I think). It covered the waterfall model, version control, specifications, unit tests and more esoteric stuff like loop invariants and formal correctness. Our major project was to write an essay about the Ariane 5 rocket explosion. I really enjoyed the CS 1100 class but CS 1110 effectively succeeded in boring me to tears. It was somehow supposed to give us a taste of real world software design. All it really did was encourage myself and many others not to take CS electives in later years.

My major was Materials Engineering; I took mostly physics electives in the last year of my degree, which is kind of ironic now because I work pretty heavily with programs - my day job involves writing computer simulations. I learned most of the skills I need on the job (including the C programming language, databases/SQL, etc.). I probably would have benefited from more formal training, but my experience of university CS was so miserable I actively avoided it.


I think it's a fallacy to assume that anything not based in CS fundamentals is a fad or has only short-term value.

In college, I didn't learn version control, continuous integration (continuously submitting your work in small changelists or patches), unit testing, making sure you're building the right product before building it, delivering the simplest possible code and design that meets the requirements, code quality, working in teams, untangling dependencies and making as much progress as possible today without waiting for all your dependencies to be resolved, and so on.

I expect that all these skills will be very much relevant 20 years from now. So, don't confuse long-term value with "grounded in CS fundamentals". Programming isn't a hard science like physics.


>I think it's a fallacy to assume that anything not based in CS fundamentals is a fad or has only short-term value.

I didn't say that. I said most industry processes are fads. I also didn't say that nothing that isn't grounded in CS fundamentals has long term value.

There are plenty of other skills that have long term value. Office politics, salary negotiation, and self-promotion are much more valuable than knowing how to run a few git commands. But none of those things should be taught in a Computer Science program.

They are fundamentally vocational skills. Just like version control, unit testing, and continuous integration are vocational skills. Sure they're useful but they should be taught in an internship/apprenticeship or on the job.

>In college, I didn't learn version control...

I learned to use Subversion, and other than the fact that they are both version control systems, what I learned didn't really carry over to distributed version control like Git.

In a CS program you should be learning things like how to implement a version control system, not how to use Git. I would have been pissed paying thousands of dollars per semester for a professor to walk me through a Git tutorial.

I don't have a problem if a professor wants you to use github to submit your assignments or something like that. And sure some of the vocational skills you listed are going to be useful for years to come. But these skills should be ancillary. They should be just a happy side effect--like learning teamwork during a group project.
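
To make the "implement a version control system" point concrete, the kind of exercise I have in mind is something like this content-addressable object store (a toy Python sketch of the idea underneath most modern VCSs, nothing more):

    import hashlib

    # Objects are stored and retrieved by the hash of their contents,
    # so identical content always gets the same id.
    class ObjectStore:
        def __init__(self):
            self._objects = {}               # hash -> bytes; a real VCS uses disk

        def put(self, data):
            key = hashlib.sha1(data).hexdigest()
            self._objects[key] = data
            return key

        def get(self, key):
            return self._objects[key]

    store = ObjectStore()
    blob_id = store.put(b"first version of the file")
    print(blob_id)                           # the content hash names the object
    print(store.get(blob_id))

Building commits, trees, and diffs on top of that is where the real learning happens--and none of it depends on knowing Git's command line.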


I'm a practitioner, not a theorist or an academic. I couldn't care less whether something is based in CS theory. I care whether something will be useful to me over the course of my career. If it is, I'd like my education to train me for that.

In fact, some of the academic stuff like compilers and automata have been useless in real life. That's a failing of academia from my point of view.


That's perfectly fine. What you're looking for is vocational training, not a liberal college education. Non-professional college programs are explicitly not vocational training. If they were, they wouldn't require spending nearly half your time on general education requirements (assuming we're talking about the US here). I doubt art history, physics, or psychology has been of much direct use to you in your career.

>In fact, some of the academic stuff like compilers and automata have been useless in real life. That's a failing of academia from my point of view.

Finite state machines and pushdown automata are an incredibly common pattern, and I can't see how you can work as a professional software developer without running into that pattern time and again. Have you never used regular expressions?

Automata (usually taught along with theory of computation) teaches you all kinds of useful real world knowledge, like why you can't parse HTML with regular expressions, and why you can't write a program to tell if another program will eventually halt.
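
As a rough illustration (Python, hypothetical example): flat patterns are regular languages and regexes handle them fine, but checking that tags are properly nested requires matching to arbitrary depth, which a finite automaton can't do--a stack (i.e. a pushdown automaton) can:

    import re

    # Flat, non-nested extraction: a regular language, fine for a regex.
    print(re.findall(r"<(\w+)>", "<b><i>hi</i></b>"))       # ['b', 'i']

    # "Are the tags balanced?" is not a regular language, so no regex
    # decides it in general. A stack handles it easily.
    def tags_balanced(html):
        stack = []
        for closing, name in re.findall(r"<(/?)(\w+)>", html):
            if not closing:
                stack.append(name)
            elif not stack or stack.pop() != name:
                return False
        return not stack

    print(tags_balanced("<b><i>hi</i></b>"))    # True
    print(tags_balanced("<b><i>hi</b></i>"))    # False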


My idea of education is one that teaches you skills that are broadly used throughout your career. I don't a priori reject things that meet this criterion just because they're not based in theory (because theory is not an end in itself), or by applying arbitrary labels like "vocational" (whatever that means), "liberal" or "professional".

As for art history and psychology, that's a different debate to be had about education — whether these should be part of education and how much time they should take.

As for your question, I've used regexes, but you don't need to understand the details of the regex engine in order to use them. Neither do I, in my day-to-day work, write programs that try to tell if other programs halt.


>but you don't need to understand the details of the regex engine in order to use them.

Yes, at some point you do. Without understanding how regular expressions actually work, you can't know when it is appropriate to use them. Many things aren't possible with regular expressions and many grammars aren't parsable with regular expressions. You can either waste time trying to write an impossible regex (or write one that works on your tests, but blows up in the wild) or you can study automata theory and understand what actually goes on underneath.

As for the halting problem, I'll leave you this stack overflow explanation for why it is beneficial to understand.

http://cs.stackexchange.com/a/32853

Many problems in CS have already been solved, some are impossible to solve. You can either waste time on trial and error trying to reinvent the wheel or you can study the theoretical underpinnings.

Do you want to spend a week trying to model a problem as a finite state machine, only to determine that a finite state machine isn't powerful enough to solve your problem?

Do you want to spend a month banging your head against a wall trying to solve a problem that you could have solved in 5 minutes had you realized it was just a well known graph theory problem all along? A problem that was solved decades ago. The only way to know these things is to study the theory behind what you do.
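
A concrete (hypothetical) example of the "it was a well-known problem all along" situation: figuring out what order to run a set of dependent tasks in is just topological sorting, solved decades ago and sitting in Python's standard library (graphlib, 3.9+):

    from graphlib import TopologicalSorter

    # task -> the set of tasks it depends on
    deps = {
        "deploy":  {"test", "package"},
        "package": {"compile"},
        "test":    {"compile"},
        "compile": set(),
    }

    # Any valid ordering, e.g. ['compile', 'test', 'package', 'deploy']
    print(list(TopologicalSorter(deps).static_order()))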

Why do you think Civil Engineers are required to take physics? The difference between an Engineer and an artisan is a rigorous understanding of the formal system underpinning his work. Artisans build through trial and error and experiences, and they leave many failed projects in their wake while they gain this experience. Engineers use theory and modeling to limit the number of failed projects to the net benefit of everyone involved.


> My idea of education is one that teaches you skills that are broadly used throughout your career. ... applying arbitrary labels like "vocational" (whatever that means)

vocational - adj. 2. (of education or training) directed at a particular occupation and its skills.

Which is exactly your idea of education. The labels are descriptive, not arbitrary.


Then do not go for a degree in Computer Science if that is all you want. I am sure there are cheaper and better ways to become a practitioner than going through college.

Becoming a software developer is just one of the possible career options after a CS degree.


I doubt that there are cheaper and better ways to become a practitioner than going through college. In India, where I live, most companies are not interested in you unless you have a degree. The normal path to a career in programming is to get a CS degree. And the normal outcome of a CS degree is a career in tech. So, they are much more closely related than you acknowledge, at least in my part of the world (things may be different in yours).


Yes, it happens also in Spain: companies want college degrees for their software factories, and at the same time complain that colleges do not teach anything useful.

It's not the colleges' fault, in my opinion.


I did learn a lot of those things in college...


I guess it depends on the purpose of the education. If you're already getting the CS fundamentals, maybe it doesn't hurt to get up to speed on some of the industry fads at the moment, since most of the students could be looking for an industry job in two-three years.


Everything is a tradeoff: if you spend time on industry fads, you're spending less time on CS fundamentals.

Industry fads aren't useful outside of industry, so learning them should be paid for by industry. As the article argues, 20 years ago they were. Now employers are trying to push the cost of vocational training onto workers and the public.


Well, the whole point of the article is that those students looking for industry jobs in two-three years should just be hired and trained in the fads of the moment, instead of wasting higher-ed time teaching trivial and likely irrelevant-by-then knowledge.


For the most part, bonding agreements ("you can't leave for X years without repaying us for your training") are considered exploitative and usually not legally enforceable.

As a result, a business can't pay to train its own work force - if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete.

If an investment can't be protected it's pointless to make it. Having employees pay for (and be compensated for) their own training is the most reasonable workaround.


For a mathematician you sure like fallacious arguments.

As a result, a business can't pay to train its own work force

Well, they could offer more pay after the training is completed, or focus on being a nice place to work such that employees would choose to stay put with a company that treated them well. Your argument rests on the implicit assumption that employees will leave the moment they receive a competitive offer and are only interested in maximizing the take-home pay aspect of their economic advantage. Employees are motivated by a combination of monetary compensation, benefits, and good will towards a firm in the same way that a firm is valued by both its book assets and its good will in the marketplace. You assume, without foundation, that employee loyalty towards an employer that provides training will be nil, and ignore the possibility that employees might look forward to receiving future training (and promotions) which would have significant economic value, whereas a firm that poaches employees and offers no training of its own presumably won't be offering any in future either.

If an investment can't be protected it's pointless to make it. Having employees pay for (and be compensated for) their own training is the most reasonable workaround.

You could equally argue that it's pointless for employees to run up debt buying an expensive education which an employer might then say doesn't quite meet their requirements and therefore shouldn't be rewarded by additional compensation. Considering the non-dischargeability of student loans in bankruptcy and the financial weakness of employees relative to employers when it comes to negotiating prices for training/education, I would say it's rather irrational for employees to take on all the financial risk involved.


The other company without training could also be a nice place to work, and pay more as well. You are right that some employees might give "goodwill" to their employer. If such goodwill effects become sufficiently large, maybe employers will start offering training again. I guess you'll be encouraging employees to hold irrational feelings of loyalty towards their employers?

As for the costs to employees, that is indeed a potential concern. I encourage anyone considering investing money in education to consider the marketability of same.


I don't think it's at all irrational to have a feeling of loyalty to one's employer if that employer actually supports one's economic interests to some degree. Even if one views people as pure utility maximizers, it doesn't follow that salary differential is the only measure of utility. There's a cost to quitting a job in terms of the simple disruption and effort to form new social relationships in the new workplace, not to mention the non-zero opportunity cost at the previous employer in terms of foregone promotions.

I'm all in favor of using economic analysis to work out how much a firm should invest in its labor force, but what you are doing (and do in almost every conversation about economics) is arguing that we should consider only those factors which affect your preferred metric. As someone else pointed out, you ignored things like the probability of increased productivity to the training employer during the training period and so forth.

I've actually worked freelance most of my life, and so I'm responsible for my own skills, pay both the employer and employee portions of payroll tax and so on, so the ideas you're advancing are far from being foreign concepts to me.


Provided the transaction cost of shifting jobs (which can include feelings of loyalty) is lower than the cost of training, any employee has the incentive to jump ship. I.e., if transaction costs are $10k but training costs $20k, then a poacher can steal your trained employees by offering $15k more.

I don't know why you think I'm disputing that transaction costs can cause markets to be inefficient - I agree completely. As a limiting case, if we punished quitting your job with death, shifting jobs would only make sense for the suicidal.

What, exactly, do you disagree with me about?


Your basic contention that business can't train its own work force.

Your original analysis neglected transaction costs entirely.


What incentive would an employee have to leave company A just because company B also offers the same "nice" salary and work environment. The advantage of offering the training in the first place is you GET the employees to come to you, then as long as they're compensated they will have no reason to leave after training is complete. It's a win-win for the employee and employer.

You for some reason seem to hold the idea that after training, some other company can just come in and offer more money. Why?


Scroll up for the arithmetic of how company B can pay more money to the employees.


The arithmetic is wrong, unless it's just not fully explained.

"there is nothing stopping another employer from offering $90k in salary after training is complete"

Sure there is, and many people have pointed out the flaws in this comment. If $90k is the market rate, then there is nothing stopping the company that provided the training from paying that too, and there would be no more advantage to offering $90k. In addition, as others have pointed out, there will still be some ramp up time and training at any new company.

If your point is that the first company can't expect to under pay their employees after providing training, well, that's obvious and nobody suggested it.


So to keep the same set of employees, a company must pay $20k for training + $90k for employees = $110k. Alternately, a company can just pay $91k and let someone else pay for training.

Do you not see any reason why a company might prefer to be in the second category?


Of course companies will offload training costs on taxpayers or individuals whenever possible. It's in the "nature" of capitalism to maximize profits and minimize costs. It used to be the government and the state that provided balance, but now it's implicitly, if not explicitly, accepted that business has "won" and calls the shots. The state's role now is to further the interests of business.

This is great if you're an employer, but what are the consequences of shifting from a capitalist economy to a capitalist society? There is more to a successful society than increasingly desperate and ruthless competition for a shrinking pool of jobs. How many people want to spend their entire working life in a cage match for survival, hoping that they are one of the lucky few who make it to the billionaire's club? If government isn't going to represent the people, what's the point of having one?

The standard retort from market fundamentalists (and it is a faith) is to reframe the debate as capitalism vs. socialism, or argue that the market is as natural as breathing and we have to let it do what it must. The first one is a straw man argument; not many people are arguing in favor of a Marxist paradise. And the market as a system of nature is simply delusional. A market economy is fine, most people agree on that. The problem is when the market model invades every last aspect of society and opposition is either non-existent or rendered impotent. The end result is a dictatorship of market fundamentalism. Kind of a capitalist version of the Soviet Union where the Communist Party is replaced by oligarchs and the lowly worker is replaced by the lowly worker version 2.0. IOW, post-industrial feudalism. Is that what "the people" want? Something tells me the answer is "no, they absolutely do not."


$90k is the market rate for a trained employee, not an untrained employee, in this scenario. It's entirely reasonable for the company to say to an untrained employee: we'll pay you $70k + $20k worth of training until you become trained, and then we'll pay you $90k. The added value the company gets is being able to focus the candidate's training on their needs (and perhaps a little light indoctrination).


Companies in the second category are going to be in the market for lemons. Company A has inside knowledge of how each employee is performing and can easily match a price increase for employees that are worth it while letting go of those that aren't. Company A is paying a bit more to purchase an option on skilled employees; Company B is hoping that they don't just get lemons.


No, I don't see any reason why a company would prefer to be in the second category. If you're in that category you'll be trying to hire employees for the same salary they're already being paid by an (intelligently run) company, in addition to having to retrain them for your company.


>I don't see any reason why a company would prefer to be in the second category.

The 2nd company doesn't have to bear the costs for training.

>If you're in that category you'll be trying to hire employees for the same salary they're already being paid by an (intelligently run) company,

For the first company, it doesn't matter how much they (intelligently) increase the salary, because they still have to add in the expense of the training.

>, in addition to having to retrain them for your company.

Not necessarily. A common example would be IT consultants. It is very common for Oracle and SAP to have their consultants spend 4 to 8 weeks in training, and then boutique firms poach them with higher salaries. The boutique firms didn't have to pay the $20k to $40k for the 8 weeks of classes, and they also don't have to retrain them. The Oracle DBAs' skills are ready to be put to use on day 1.

If one thinks Oracle/SAP can simply increase the salary to what the boutique firm was offering, the math doesn't always work out because the total higher compensation has to include the training they already paid.

If we use following placeholders:

s=salary, t=training, i=increase to market salary rate

Category 1 company total expense = s+t+i

Category 2 company total expense = s+i

Because "t" was a non-zero amount that can't be magically erased, it means that for all values of "i", "s+t+i > s+i"

In other words, for Category 1 company to "match" a higher Company 2 salary, they must always pay more than that salary because the "pay more" includes the training $$$ they paid.

Hopefully, for Company 1, they have other non-monetary advantages that outweigh the absolute mathematical disadvantage the above equation shows.
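
A quick sketch of that arithmetic with the thread's example numbers (Python, purely illustrative):

    # s = base salary, t = training cost, i = raise needed to match the
    # market rate for trained staff (thread's example numbers).
    s, t, i = 80_000, 20_000, 10_000

    category_1 = s + t + i    # trains in-house, then matches the market raise
    category_2 = s + i        # hires already-trained staff, pays no training

    print(category_1)                 # 110000
    print(category_2)                 # 90000
    print(category_1 > category_2)    # True for any positive t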


https://en.m.wikipedia.org/wiki/Golden_handcuffs

How would training followed by a bonding period be considered differently than other mechanisms inducing employees to stay? Could these other mechanisms also be legally questionable?

E.g. some San Francisco Bay Area technology companies offer large (~$20k) signing bonuses to new uni graduate hires that the employee must return if she leaves within her first year at the company. Similarly, companies offer five-year equity packages that deliver no equity until the twelfth month.


Vesting schedules on employer 401(k) contributions are common. For example, my company does it like this:

First year - no company 401(k) contributions.

Second year - full 401(k) contributions but 25% vested. That means you get the money in your account (and it compounds) but if you leave before you hit your third year you only get to ultimately keep 25% of the money they contributed.

Third year - 50% vested.

Fourth year - 75% vested.

fifth year - 100% vested. Money is all yours.

(I think vested is the right word, correct me if I'm wrong)

Also, I worked for a company (in a non-technical position) where they initially hired at a low rate. You learned your job as you went and each job milestone had a test (written and practical). Passing the test netted a large increase in salary. You could learn other jobs other than the one you were hired for and increase your salary even more. It was a great system.


No, "vested" is exactly the right word for this.


Yeah, I heard about companies who pay for college - as long as you stay one year after your last class. If not, you have to pay them back. Actually, you just pay back the tuition you spent in the last year.


When I started my first job out of college in 2001, the way the company did it for my master's program was to give interest-free loans to cover the cost, then forgive that loan balance over the course of a few years (I forget how long).

I got my masters degree and then stayed at the company for a year and a half before leaving. At which point I had to pay back the remaining loan balance.

Worked pretty well for the situation, but that would only work if the cost of training wasn't exorbitant and the worker pay was relatively high. I believe I was making $58k when I left and had to pay $8k or so to repay the loan. Doable for me, but not necessarily so for a lot of people.


Seeing as Army ROTC will want 4 years, it sounds like a great deal.


If you take classes for 4 years then you'll have to stay 5 to get your education fully paid off. If you took 4 years of classes and worked for 4 years then you'd owe them 1 year tuition (the previous year)

That being said, this deal was to further your education for your job. So you couldn't get your masters in finance unless you worked in finance. A software developer couldn't get their finance degree paid for.

This was for a company my friend worked for.


Hard to say, but if banned, it seems like that sort of signing bonus could simply be replaced by issuing a signing bonus that only vests after one year. And of course, the hire could probably finagle a loan against it to have it basically be the same as the original intent, except with a middle man taking out interest.


And yet it was the norm only a few decades ago. What's new is the emphasis on speed and flexibility. If you're starting a new initiative, it's better to hire skilled people now than to start a training program and get them 2 years from now. And if the new initiative fails, laying off people you've trained is throwing away the money you spent on their training.


It's not just starting and stopping projects faster, it's also social norms about changing jobs. We have a much more liquid job market, and this gives folks a lot more chances to arbitrage opportunities.

This is overall a good thing, but there are side effects.


How much of the liquidity is caused by changing social norms among workers and workers taking advantage of better work opportunities, and how much is caused by an unstable job market (with unions losing power) caused by CEOs? Mass layoffs were not always a "thing." They are traumatic to those who get laid off and to those who witness layoffs of their friends, family, co-workers, and neighbors without getting laid off themselves.


I think it has as much to do with a corporate shift away from pensions and other considerations for long-term employees. For my entire professional career I've been warned that there is no one else looking out for me but me. There will be no pension when I retire, and no social security, I'm told. So I have to maximize what I earn and what I save now in order to survive later.

Is it any wonder then that my generation has absolutely no sense of loyalty to their employer and show a pathological willingness to jump at almost anything that pays more or has better growth potential?

What benefit is there to an employee that would encourage them to stay with one employer? (To be fair...I've been with my employer for about 10 years, so YMMV.)


The shift away from pensions is caused mainly by laws (e.g. ERISA) requiring companies to properly account for them. Defined benefit pensions are both extremely expensive and very risky, but companies were allowed to hide these losses and risks off the balance sheets in the past.

http://www.thedailybeast.com/articles/2013/03/15/sorry-folks...


And, as the author of the piece says, if you did have an overstuffed pension fund, that caused problems too, both with the IRS and because of LBOs.

Of course, a lot of benefits choices are driven by laws/accounting rules/taxes. Ideally companies want to give benefits that cost them less (however they look at cost) than their value to the employee. Once that balance flips, that benefit makes a lot less sense.

That said, a benefit structure that assumes long-term employment with a single company doesn't make a lot of sense for most people or most companies today. (Though I'm probably unusual among this readership in that I actually will have a modest pension from a past employer.)


But most employers in the developed world shifted to defined contribution benefits ages ago. I was part of that first generation of workers to be told that my pension was going to be a moveable feast and not something that I could have specific expectations about, and that was back in the early 90s. Defined benefits pensions are indeed still expensive for companies and local governments that owe them to older workers, but they've been moving away from taking on such obligations for a long time now.


The shift to 401K is more about the financial industry rigging the system so that they have more of other people's money to play with. Need a bigger slush fund? Lobby Congress to make pensions unfeasible through carefully constructed legislation and wait 10 years for the payoff to start rolling in.


Clearly, employers underfunding pensions and then going bankrupt, causing retired employees to go broke, was not a real problem.

Pensions were always unfeasible. ERISA simply requires employers to properly account for that.


Yeah, I'd say that employee loyalty's dropoff kind of trails the dropoff in employer loyalty. Like when companies started using backdoors to raid pension funds and pay out executive bonuses with them that was not the result of some newfound job-hopping tendency in retired workers.


People change jobs more because nowadays, it's the only way they can get a raise. This is also the fault of employers. The middle class hasn't gotten a raise in years, while executive bonuses continue to rise. Pay people more, and you'll retain them longer.


Companies have to do stuff like this since employee loyalty was lost when companies started making layoffs commonplace.

Training from a company says that they have a vested interest in the person, but they don't. Companies don't care about you, they only care about the bottom line and do not have any interest in spending effort to teach you something you could use somewhere else.


"if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete"

Doesn't this assume a lifestyle from several generations ago where you get "the training" then you're set for the rest of your working life? I don't think the real world has worked like that since at least the 70s.

If I took that $90K job I'd be OK while I'm there... that's what, on average, just a couple of years? Then I'd be dead in the water and have some explaining to do at the interview for the next $80K job that provides the entry level training I'd need. Why would any employer hire an untrained employee if there's a perfectly well trained employee from the $80K job? Meanwhile the guy who stayed at the $80K job likely got a promotion to $100K, and here I am trying to get in at the ground floor yet again.

How this fits in with stereotypical OTJ training is a mystery. Most formal training is a way to shovel money to middlemen who provide it in an accelerated format for an extremely large fee. Most real world training is "here's a PC with a web browser and your boss's boss's boss declared the due date for the project is next month, now figure it out yourself".

Think of the last time you worked a project with a junk spec that has no relationship with reality. Those people who can't spec a fizzbuzz if their life depended on it are the ones specifying the required training. I wonder if that training spec will be worth the paper it's printed on?


If I've got a specific skill or qualification that's increased my market value to $90k, I'm not really sure why I'd be reduced to applying for entry level jobs and requiring training again in two years time?

And there's a huge spectrum of on the job training, from pointless classroom exercises to tick company compliance boxes to highly expensive professional qualifications that might pay for themselves in two years if the firm can hang onto the employee and bill for their time, but also make it much easier for the employee to get a better offer. It's the latter type that costs firms the most money as well as benefits candidates the most.


> If I've got a specific skill or qualification that's increased my market value to $90k, I'm not really sure why I'd be reduced to applying for entry level jobs and requiring training again in two years time?

In other news, the huge demand for people who could efficiently work a loom has slacked greatly since the industrial revolution.

(A snarky way of saying, training and the benefits and value it confers is a function over time. Especially in the current world of web-based software development, two years can be a very large amount of time.)


Sure, but in anything resembling a competitive market I can't see any reason to believe a newly-acquired skill will hold its value less well in future if I move to a new company offering a higher salary because of it, as in the scenario proposed above.


> if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete.

And nothing stops the first business from raising the employee's pay to $91k. Why should they expect to keep paying the same salary to someone they've made more valuable? They already amortized the costs of training over the value added by the employee in the first year, right?


Then the total cost to the first business for this employee is $111k. The cost to the second business (who provides no training) is $92k to beat the first company's offer. It's a losing proposition to provide training - you get the same labor at a higher cost.

In contrast, if you simply pay $100k to trained employees, and employees can pay for training out of pocket, you preserve the cost structure (big caveat: I'm ignoring taxes) but without the risk of losing your investment.


> if you simply pay $100k to trained employees

Assuming 1) you can find trained employees, and 2) they are willing to work for $100k. If the value of the trained employee is $111k, then why would they work for $100k except for ignorance of their value?

This is my big problem with STEM shortage parrots and with labor as a market in general: ideal candidates are rare, and, in general, willingness to spend more or train up is pretty low. On the other hand, anecdotally businesses seem willing to wait months or years for that ideal candidate to come along and fail to balance the gigantic opportunity cost lost with the cost to train.

It's economically not as simple as what I'm describing, but that's sorta my point. I rarely see anyone in labor market discussions (especially around STEM or highly skilled workers) include the opportunity cost of an unfilled position.


Exactly right. That opportunity cost seems perpetually ignored. Also, when we talk about "value in the market of an employee's skillset", we're obviously talking about how much total compensation they'll receive. What ALSO seems to get ignored is the fact that the people you're hiring (particularly in software!) are or should be generating a multiple of that salary, so worrying about "90k" vs "110k" is laughable when they may be generating value of 500k-1M, even after you factor in health care and all the other benefits. Does anybody actually believe that those few tens of thousands of dollars (should) make or break anything? We're not talking about the cost of eggs or something here; we're talking about software, a huge force and value multiplier.

The (IMO very weak) counterargument here is "well, budgets are set to blah blah blah" - but in that case, you're allowing a difference of a few tens of thousands of dollars to dictate whether or not you get someone, which can be the difference between shipping, getting out a new version, or whatever and not doing so. If your budgeting is that inflexible, maybe your organization has other problems it should address first.

When it comes to software, cost-based pricing should not be anywhere near the discussion. Software developers and their output should be judged on value. It's crazy that anyone is still in this mindset after nearly two decades of software eating the world. We may be getting paid a lot relative to the median, but it's still just a fraction of the value it creates - otherwise, why would anyone bother? Hire a dev for 135k for example (includes benefits), and have them only generate 150k worth of value? Come on.


Their market value probably isn't $111k, hence their existing employer being unwilling to match competitors' salary offering for a trained member of staff and pick up the tab for the training.


If their market value isn't $111k, why would the competitor pay that?


Person currently earns $80k. Training costs $20k. Rival company will be willing to pay $90k salary after training to poach the developer, which means the current employer may end up forced to counter offer with $91k salary after paying for the training. 91k + $20k in training fees = $111k

Competitor does not pay $111k because competitor does not pay for training. Which was the entire point of the exchanges above

You could of course question why a company would train their employee at all if the market rate for an already trained member of staff was about $90k. Because retaining an existing member of staff at an $80k salary after paying $20k for their training happens to pay for itself after 2 years. It's quite sad I feel I have to post the calculation to avoid further downvotes, but it's: (($80k * 2) + 20k)/2 = $90k

I honestly thought this was high school stuff....


Nobody is asking how to add numbers together. I just think you're missing the point.

Whatever a competitor is willing to pay, the company that provided the training can just pay that and not risk their employees being poached. Additionally, it isn't free to poach employees, they will have training costs no matter what. In other words, after training you raise their salary.


> Whatever a competitor is willing to pay, the company that provided the training can just pay that and not risk their employees being poached.

Yes they can, if they want to go out of business.

If I spend $20k to train a new employee, I can afford to pay them their value to me - $20k (less than that if I want to make a profit, but this is the maximum that allows me to stay in business). Meanwhile, my competitor waits 4 weeks for me to do the training and offers their full value. Now I can either let them go, or match the offer - but either way I'm down $20k, while my competitor comes out even.

So you can imagine why I might not want to spend money on training.

Given your other points, this is only a problem if the cost of the portion of the training that is transferable is more than the extra cost of poaching (over that of hiring a fresh employee). As the job market becomes more liquid (decreasing that poaching cost), this becomes the case more and more.


It isn't free to poach and onboard employees, but it's less than the cost of training in a huge number of industries, especially if the training comes with a recognised professional qualification attached to it. If I spend a large fraction of their annual salary on supporting them through a qualification I might well give them a pay rise afterwards, but a competitor able to extract similar revenue per head from trained staff can always afford to offer them a bigger pay rise after I've picked up the tab for their training. I'm only in profit whilst providing training if I can pay my staff less than their marginal revenue product for long enough to cover that overhead; ideally it would be before they've finished and got the certificate but that's not always possible.

And hence, in the original example, the company providing the training ending up spending $111k on their member of staff over the course of the year to head off a competitor's $90k offer, even if the employee isn't worth $111k (Considering your original contribution to the thread was to argue the competitor wouldn't pay $111k, I think accusing me of missing the point is a bit rich...)


Exactly. Sometimes the market "decides" that you're going to have to stop under-paying your employees.


You misunderstood the premise. The first company pays $91k after training. So it's not $111k, it's $100k during training, then $91k after that. More competitively, it would just always be $100k and the second company would have no advantage. In fact, the first company has the advantage because it will be easier to initially find the people that need training.


On top of that: The second company presumably still has domain-specific training of its own to perform, and will have to un-train the hire in other ways. Makes more sense to just pick up someone new.


Your argument rests on an assumption that an employee produces nothing of value while learning. If they are performing useful work while learning, then the salary paid to them isn't being wasted.

How true this is depends on the field and subject matter, but surely any entity will tend to see their own specifics as basic things anyone should already know even though the field is actually much larger.


The investment could produce a net gain, even in the immediate term, and there would still be a problem. Leaking value into the competitive environment in which you're trying to survive will kill you. As alluded to by the GP, training as compensation produces employees that are worth more to competitors. Once trained, an employee experiences a loss in pay equal to the cost of their training. This effectively subsidizes their departure.


> Leaking value into the competitive environment in which you're trying to survive will kill you

Taken to the extreme, sure. But every voluntary interaction necessarily "leaks value", being non-zero-sum.

This word "training" keeps getting used, and I think it's at the heart of this disagreement - viewing learning as a discrete top-down event as opposed to a continuous process.

GP's example is akin to an employer paying for a bachelor's degree. My comment is easiest thought of in terms of company-specific processes. For any specific 'skill', the truth lies somewhere in the middle.

Software is a bit of a rare case in that most tools are highly complex, but also freely (as in beer) accessible. Tool choice is heavily driven by fashion, yet because fashion is foreign to our worldview, we're blind to this. Is $software_package a fundamental that every college graduate should know, or basically equivalent to an in-house framework? Who knows...


If you don't make the worker want to take off then it doesn't have to come to that.


That's not true at all. Business doesn't want to have to actually compete for its employees, that's it.


I think this logic breaks down when you run out of people who already have the skills du jour.

If you can hire people who already have the specific skills you need, sure, let others pay for the cost of acquiring those skills. But, when you run out of those people, then the choice becomes paying to train them, or suffer the cost of leaving the work they'd do undone. If the value of the work they perform is more than the cost of training, it's logical to train them.

In this example, the market rate for someone already trained in a certain set of skills is $90k, and the cost of training is $20k. Nothing prevents a company from offering $70k to people who are smart and capable but not yet trained in whatever the company happens to need today, with a promise (in the employment agreement, not orally) to raise it to $90k one year from now.

Everyone benefits from this arrangement: the candidate, who now has a better job than they otherwise would have (which is presumably why they're taking the offer in the first place), and the company, which gets a capable employee without spending more on training overall.

So, the bottom line is that you can always structure your incentives in a way that makes the training worthwhile for all parties involved, without asking the employee to pay for it.
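
A quick sketch of that pay structure, using the $90k market rate, $20k training cost, and $70k first-year salary from the comment (treating the trainee as otherwise costless in year one is a simplifying assumption):

    # Sketch of the "train at a discount, then pay market" structure above.
    market_rate_trained = 90_000   # market salary once trained (from the comment)
    training_cost = 20_000         # cost of training (from the comment)
    year_one_salary = 70_000       # discounted first-year salary (from the comment)

    # Year one: the salary discount funds the training.
    employer_year_one_outlay = year_one_salary + training_cost   # 90,000
    # Year two onward: the contractually promised market salary.
    employer_later_outlay = market_rate_trained                  # 90,000

    # The employer never pays more than the market rate for a trained hire,
    # and the candidate never pays out of pocket for the training.
    print(employer_year_one_outlay, employer_later_outlay)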


"As a result, a business can't pay to train it's own work force."

I believe in Germany consortiums of businesses in the same industry subsidize training. Seems reasonable.

In any case, your observation would seem to be refuted by the fact that businesses in general used to do precisely that, and many businesses still do do precisely that.

"Having employees pay for (and be compensated for) their own training is the most reasonable workaround."

For the business, maybe.


"As a result, a business can't pay to train it's own work force - if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete."

Except for the massive information asymmetries in the market.

Even ignoring the information asymmetry, hiring a new employee is expensive. You can easily end up paying $40-50k to hire a good developer (between the recruitment process and ramp-up time).

So really, company B is paying salary + $40-50k in recruitment costs. This means company A can easily match company B's salary as well as provide training.
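
Roughly, using the thread's $90k trained salary and $20k training cost together with the recruitment estimate above (the midpoint is an assumption):

    # Sketch of the hiring-friction point above. Salary and training figures
    # are the thread's examples; the recruitment cost is this comment's range.
    salary = 90_000
    training_cost = 20_000
    recruitment_cost = 45_000   # midpoint of the $40-50k estimate

    # Company A trains a hire; company B poaches a trained one.
    cost_company_a = salary + training_cost      # 110,000
    cost_company_b = salary + recruitment_cost   # 135,000

    # With friction this large, A can match B's salary, keep paying for
    # training, and still come out ahead.
    print(cost_company_a, cost_company_b)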


> For the most part, bonding agreements ("you can't leave for X years without repaying us for your training") are considered exploitative and usually not legally enforceable.

Extremely common in the UK :(


They're common in the US too; they just aren't enforceable. However, I assume a lot of people aren't aware of that or else just feel grateful enough to an employer that offered them training of some significant value that they stick around for a few years.


I remember some motivational poster from my old tech recruiter's FB feed that seems appropriate here:

"What if we invest in training and our employees leave?"

"What if we don't, and they stay?"


There's no reason why they would pay $80k in salary if they're not getting enough expected value.

Unless a new programmer is already making minimum wage, there's always a point at which companies can employ them for less, and hope their value (plus the expected value from the ones who stay on at a higher wage) will exceed the cost of training (a rough sketch of the arithmetic follows below).

Also, training is often domain specific. Employees are generally worth more to their employer than a competitor (though there are exceptions).
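
One way to make that expected-value point concrete; every number here, including the retention rate and first-year productivity, is an assumption for illustration:

    # Illustrative expected-value calculation for training below-market hires.
    trained_value = 100_000       # annual value of a fully trained employee
    training_cost = 20_000
    trainee_salary = 50_000       # below-market wage while training
    retained_salary = 90_000      # market wage for those who stay on
    retention_rate = 0.6          # fraction of trainees who stay a second year
    first_year_productivity = 0.75

    # Year-one margin per trainee (reduced productivity while learning).
    year_one_margin = first_year_productivity * trained_value - trainee_salary - training_cost   # 5,000
    # Year-two margin, weighted by the chance the trainee is still around.
    year_two_margin = retention_rate * (trained_value - retained_salary)   # 6,000

    expected_two_year_margin = year_one_margin + year_two_margin
    print(expected_two_year_margin)   # 11,000: positive, so at these numbers it pencils out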


As a result, a business can't pay to train it's own work force - if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete.

If the training has increased her market salary, then the simple solution is to raise her salary to meet the market.


> if a business invests $20k in training and $80k in salary, there is nothing stopping another employer from offering $90k in salary after training is complete.

Then that business will have to pay better.

You want me to stick around? Fuck you, pay me[1] (or give me some other reason that is compelling--to me, not to you--to stay).

[1]: https://www.youtube.com/watch?v=jVkLVRt6c1U


The whole for-profit code school thing has been giving me the creeps since I started hearing about it years ago, precisely because it's the potential employees paying for hyper-specific training. I wonder if there would be fewer outcries about a talent shortage if companies were somehow incentivized to hire these more junior people with the explicit goal of training them up to a productive level.


It's a tough situation. It's hard to identify people who have no background in computer science, but still have tons of potential if given the opportunity.

Coding bootcamps exist to bridge that gap - in a sense, you can think of them as recruitment agencies and not as learning institutions. The goal isn't to replace a 4 year degree with 3 months of intense learning. That's simply not possible. The goal is to find smart, motivated people who can learn quickly, give them a skill which will let them hit the ground running, and present them to companies.

Companies benefit because they get a low cost hire who has the potential to grow tremendously if given the right environment. Employees benefit because they can typically enter the workforce at a higher salary than they could command from self-study alone. Win-win all around.

The key takeaway is that for-profit code schools are not schools. Code schools are primarily recruitment agencies designed to find high quality, non traditional talent. In that regard, I think the best ones are an absolute success.


My opinion is divided about the "pay-to-play" model of workers paying for their own training. On one hand, workers assume all the risk and financial hardship of training. On the other hand, compared to other developed economies (Britain, Australia), American workers have more opportunities to change careers precisely because the risk of training has already been assumed by the worker. I've found that American hiring is far more flexible than the Commonwealth tradition of jockeying for an apprenticeship. And older workers have far less opportunity for apprenticeships.

It's hard to defend the code schools, though. There are a lot of horror stories about them on HN, from both the students' and hiring managers' perspectives.


> There are a lot of horror stories about them on HN, from both the students' and hiring managers' perspectives.

Can you link to them? I'm trying to do due diligence on some of them, but haven't been able to find much at all of substance. I'm specifically interested in Maker's Academy in London.


Just going from what I remembered reading, here are the ones that stood out the most. Comments vary from supporters to detractors.

Ask HN threads I found interesting:

https://news.ycombinator.com/item?id=8844848

https://news.ycombinator.com/item?id=7147664

https://news.ycombinator.com/item?id=9616691

The one I really wanted to find, which accompanied an article about some students who felt scammed by their bootcamp, is eluding me right now. It was an interesting article about how the students were required to post misleading stories on social media that made the school look better than it was.

Edit: I found out why it was hard to find; it was taken down. Here's the link to the comments. https://news.ycombinator.com/item?id=9492381


Today's code schools may be new, but not code schools in general. Riding on the T in Boston 2000-2005 I saw plenty of advertisements for courses in Java, XML, etc. I always assumed those were low-quality classes for low-quality programmers. Nowadays the code schools teach Node.js and TDD, and their branding is more hip. I hope the quality has increased (of both instruction and students), but I still see it as a continuation of the training classes that have always been out there. Not the same people (I think), but the same market need.


This should be less risky for large corps like MS, IBM etc. I believe this may already be done?


Depends on the company. I think it's done much less now than it was in the past. I know when I was hired into my first job in 2007 I barely knew their default programming language and definitely didn't know their in-house framework, so I spent the first two months in organized training and the next six or so in "on-the-job" training on a low priority production system.


The Uber robotics talent raid of CMU took piratization to a logical extreme, http://www.theverge.com/transportation/2015/5/19/8622831/ube...

"They took all the guys that were working on vehicle autonomy — basically whole groups, whole teams of developers, commercialization specialists, all the guys that find grants and who were bringing the intellectual property," recalls a person who was there during the departures" ... Uber snatched up about 50 people from Carnegie Mellon, including many from its highest ranks.

... the deal includes a "transition period" that keeps some of the departed staffers around ... "The work of these employees is very incestuous and loose," says the same NREC insider. "They are given free rein of the facilities as part-time CMU employees, but there are absolutely no checks on the work that they are doing or what [intellectual property] they are taking. Is it for CMU? Is it for Uber? None of us here know."

Edit: could CMU have gotten better IP licensing terms and ROI for the University, if they had spun out the entire team (with private financing) and had an open auction of RoboticsResearchCo to the many companies investing in this field?


I think this is a better example of dysfunction in academia than retaining/training a workforce. In particular, previous articles suggest that most folks saw their salaries doubled, with six-figure incentive checks to lure them away. From what I hear, it's not so much that the new salary is unusually high for industry researchers and engineers, but that the old salaries were unusually low (except in academia).

CMU could probably have gotten better return if they hadn't made it a point to underpay them so much compared to their market value.


Historically, university researchers doing fundamental research with mass-market industrial applications have been spun out into a company, so that both the university and the researchers have an upside opportunity for founding equity stakes.

A signing bonus or normalized salary is in an entirely different (smaller) class of compensation, especially since there was already an international track record of investment into the commercialization of self-driving vehicles.


I would guess that the NREC engineers were paid locally competitive engineering salaries in Pittsburgh, which would enable them to live a very decent lifestyle. Doubling their salaries was probably a change from $80k a year to $160k or so - from Pittsburgh pay to Silicon Valley pay.

For reference, $160k can buy a nice house in a nice neighborhood or suburb in Pittsburgh.


CMU was doing development for Cadillac[1], underpaying their employees while charging more to the customer, General Motors. That backfired on CMU.

[1] http://www.cmu.edu/homepage/environment/2014/fall/from-0-70-...


Perhaps this can serve as a cautionary tale to the public about the perils of private-public partnering. Especially when the "partnership" is so slanted in favor of the "private" side of things that such a situation can occur.


Football coaches are often the highest paid employees at big schools. If I was a world class researcher, I'd be pissed as hell about that.

That said, Uber pretty much showed that they are unable to be trusted (if anybody trusts them at all still).


It's quantifiable to the administration how much money a winning sports team can bring to the school. What is the average return on a world-class researcher?


> The trick is to relabel it as education, then complain that your prospective employees aren’t getting the right kind.

Well, I guess if already-educated workers are the norm in your industry, companies are going to change their hiring practices accordingly. It's okay not to like it, but arguing against it on the basis that 'it didn't used to be like this' just makes you look entitled and whiny.

> Bemoaning the unpreparedness of undergraduates isn’t new. Today, however, those complaints are getting a more sympathetic hearing from the policy makers who govern public higher education.

Yeah, well, when policy makers are responsible for all the cheap money flowing into the system (without which the system would probably collapse at this point), I guess that'll happen.


> ...if already-educated workers are the norm...

That's the point -- there's a difference between education and training. It is only an issue because so much of the education looks very similar to the desired training.

It is not the purpose of an education to produce new employees, but to provide a broad basis for experiencing and making sense of the world. It just so happens that you also learn transferable skills like a basic familiarity with a particular field, along with some tools and techniques that help you organize and solve problems. A proper education is not an extended code bootcamp, nor should it be.


> That's the point -- there's a difference between education and training.

Excellent point. Two problems:

(1) Over the past ~2.5 generations, the narrative about education has been informed by the conventional wisdom of {grades -> university -> job -> pension}. Hence, education is now only nominally about producing 'citizens of the republic' (so to speak), and de facto about producing well-trained workers. Disambiguating these goals is important, but hampered by...

(2) The perverse incentives created by cheap government money encourage malinvestment (i.e. an English degree [I say this with love — my wife is an English major]), and the consequent propensity of industry to not only see it as a jobs program, but also as a government policy problem.


One possible solution is cognitive screening - using tests such as the Wonderlic, SAT/ACT, or Wechsler to find prospective employees who can learn quickly and have good critical-thinking skills, and who would therefore benefit the most from on-site training for technical tasks (training obviously costs money). Unfortunately, something called 'disparate impact' makes this difficult to implement, so employers instead have to let colleges do the screening, turning an advanced degree into a very overpriced, time-consuming 'IQ test'. Some people are more concerned about hurt feelings than about providing equal opportunities. The 'logic' is that if a test exposes a reality that isn't politically correct, we must do away with the test; the result is more student loan debt, a worse labor market, and more credentialism.


> The 'logic' is if the tests expose a reality that isn't politically correct, we must do away with the test

Untrue. The logic is this: if a test has a disparate impact against a legally protected class, then allowing hiring decisions based on the test without any demonstration of relevance makes it an easy, obvious, and effective cover for discrimination on an illegal basis. To avoid that, the logic goes, you simply require those who wish to use the test to be able to demonstrate that the test is meaningful to the job, and that it is applied as a selection factor in a manner consistent with the way it is meaningful to the job.

If they've actually done the kind of analysis that would let them know that the test really is useful, this is trivial; it does, however, prevent adopting a test with discriminatory effect against a protected class merely based on intuition or conventional wisdom.

The type of analysis involved may be costly, but if those wishing to use such tests really are highly confident they would be valuable for their business, that analysis would be worth paying for. The reason it's difficult is that none of the people who like to talk about how useful these tests would be, when talk is cheap, want to put their money where their mouth is.


Strangely, we are unwilling to apply that same logic to traditional hiring processes. I.e., few companies have ever done a study (sufficient to win in court) to prove that their subjective human opinion-based tests do not have a disparate impact. Yet processes like this are somehow allowed.

I.e., if my subjective human hiring technique is biased, you need to prove I discriminated on purpose. If my objective, IQ-based technique is biased, I need to prove I didn't. Why this disparity?


>Why this disparity?

For the simple fact that we already have evidence that some protected classes perform worse on IQ tests. Therefore, simply by using an IQ test you are discriminating against a protected class. The burden is on you to prove that the discrimination is necessary. No one needs to prove that discrimination is happening, because you are using a test that has already been shown to be discriminatory.

Interview based hiring techniques are much more varied than IQ tests, and they have not been shown to be near universally discriminatory. Therefore the burden is first to prove that discrimination is happening in the particular situation.


>they have not been shown to be near universally discriminatory.

They actually have...

I am not going to get fully into it at the moment (there's a ton of research on the topic), but we know that resumes that say Lakisha are significantly less likely to get a callback than resumes that say Karen.

Another example is blinding in orchestras. When applicants began playing behind a curtain (so the judges didn't know what the applicant looked like), the number of women in orchestras increased.


I'm with you on this, but there are differences.

Subjective, interview-based hiring across all companies has been shown to be discriminatory towards protected classes. But that's different from showing that a specific company's hiring process is discriminatory.

If company A uses IQ test X, and IQ test X has been shown to be discriminatory, then you can say definitively that the hiring process of company A is discriminatory.

If company B uses hiring process Y, and you can show that hiring process Y is discriminatory in 50% of the companies that use it, you can't make a definitive conclusion about company B in the same way you can about company A.


Yea, I agree, thanks for clarifying.

There is a huge difference between explicit bias (giving applicants an IQ test even though we know it will weed out black candidates, or precisely in order to weed them out) and subconscious bias, which is an incredibly complex problem.

Most companies are trying to eliminate (unintended) bias in the hiring process.


> Strangely, we are unwilling to apply that same logic to traditional hiring processes.

Untrue.

> I.e., few companies have ever done a study (sufficient to win in court) to prove that their subjective human opinion-based tests do not have a disparate impact.

When an employment practice -- including a subjective, human-based test -- does have a disproportionate impact against a protected class, and actions under it are challenged under anti-discrimination law, companies do have to prove that the practice is sufficiently related to the specific job being hired for that the disproportionate impact is not unjustified. IQ tests are not different in this regard.

They are different in that:

(1) the evidence of disproportionate impact is well-established and ready to use, and

(2) unlike most companies' other hiring practices, there is very little on the surface to show a trier-of-fact that it is related to the specific job duties, so tying it to the specific job duties takes a lot of work -- and, in fact, the places where they have been used and challenged are largely the kinds of places where studies have shown them least relevant to job performance.


How many examples are there of jobs where mental ability is irrelevant to job performance AND where there are more readily available and reliable signals of future job performance?


Trying to disprove disparate impact can be time-consuming and expensive, especially once litigation starts, which is why only large companies and municipalities can use these tests. One problem, according to business surveys, is the skills mismatch: employers can't find enough qualified employees to fill open positions. Maybe the labor market would be better if these tests were more readily available, so employers could find the talent that would benefit most from on-site training, but most businesses have neither the resources to disprove disparate impact should a lawsuit arise nor the compiled data to show there isn't discriminatory hiring. Even if the plaintiff's case is without merit, it still costs the employer time and money to defend against it, and it's not like the defendant can recover those costs from a frivolous lawsuit.


I'm not sure why employers must prove the relevance of IQ tests but not work sample tests, given that the former are better predictors of job success (Hunter & Schmidt 1998).


> I'm not sure why employers must prove the relevance of IQ tests but not work sample tests

If challenged in court, and if the work sample test is shown to have a disparate impact such that its use would be illegal discrimination if it were not tailored to the job, they do have to show this for work sample tests. Your premise is simply false.

(OTOH, the disparate impact of IQ tests is more readily established by a plaintiff since there are numerous, readily available studies. So the work of establishing the threshold issue which requires the employer to prove relevance has largely been done by the plaintiff for IQ tests.)

> given that the former are better predictors of job success

The legal standard addresses the specific job for which hiring is being done, not a generic job. Interestingly, your source indicates that the types of jobs where IQ test challenges have notably occurred are those where the test is least relevant per the source you cite (unskilled/semiskilled jobs).


Let's imagine that tomorrow I come up with a magical, super-reliable method of interviewing, which works better for most jobs than existing specific methods. Then I'm fucked! My method will certainly have disparate impact (because it has nonzero correlation with IQ), it won't be very tailored to any particular job (because it's magical), and there will be numerous studies available (making it easier to sue me, as you explain). So tell me, what incentive do I have to invent such a method?


> Let's imagine that tomorrow I come up with a magical super-reliable method of interviewing, which works better for most jobs than existing specific methods.

If you can demonstrate this is the case for the specific jobs where you want to apply it, you will then have no problem, even if it has a disproportionate negative impact on a protected class.


> You will have no problem

Only in the same sense that an unjustly accused person will "have no problem" as long as they're actually innocent. In practice, they do have a problem. Firms are afraid of using IQ tests and being the first against the wall.


Actually, there seems to not really be any legal issue with IQ testing job candidates (in the US), and plenty of companies do some kind of cognitive testing: http://econlog.econlib.org/archives/2013/07/three_big_facts....


It isn't about political correctness (that phrase has really lost its meaning lately); it's about eliminating tools that companies use to illegally discriminate against a protected class. It depends on intent and effect. The company must demonstrate that a job requirement with a disparate impact is job-related and consistent with business needs. A strength test might have a disparate impact on women and those with disabilities, but that's OK as long as the job actually requires heavy lifting.

Tests have been used as a tool to weed out "undesirables"; this is a fact. Look at literacy requirements for voting, for example. If this weren't the case, we wouldn't be facing this issue.

That being said, it's a huge can of worms that needs more clarification. It has a spotty judicial history, to put it lightly.

It is also questionable whether requiring a college degree for many jobs is even legal; precedent suggests such a practice may actually be illegal.

Griggs v. Duke Power was a big Supreme Court case in this area. The Duke Power Company had explicitly segregated its workforce when it was legal to do so reserving the low paying jobs for blacks and the high paying jobs for whites. When it became illegal to do so they changed "you have to be white" to "you have to have a high school diploma or pass an IQ test."

Duke lost the case.

See here: http://www.popecenter.org/commentaries/article.html?id=3118

>Furthermore, the company’s lawyers argued, the legislative history of the Civil Rights Act clearly showed that it was not intended to interfere with bona fide aptitude testing, widely used in business at that time. During Senate debate on the bill, opponents argued that it could be used to attack employment testing, which had in fact occurred in a case in Illinois involving Motorola. A state official had ruled the company’s testing illegal under state law because it was “unfair to disadvantaged groups.”

>Bill sponsors, including Senator Hubert Humphrey, replied that nothing in the language of the statute could be construed that way, but to head off objections, they included a new section, 703(h). That makes it legal for an employer to use a “professionally designed ability test” if it is not “designed, intended or used to discriminate….”

>You might think the Court would have ruled in Duke Power’s favor. Wrong—it ruled unanimously against it. The justices ignored the legislative history and gave deference to the federal agency charged with enforcing the law, the Equal Employment Opportunity Commission (EEOC).

>But there was also a delayed consequence. With actual intelligence testing now an invitation to costly litigation, “many employers made the college degree a de facto intelligence test and focused only on hiring applicants who possessed it.”

>O’Keefe and Vedder raise that question: “If challenged, could employers who have set the college degree as a requirement show that it has anything at all to do with ‘business necessity’ or are ‘job related’? That is very doubtful.”

>Suppose that someone who’d been turned away from a sales job for lack of college degree took the company to court, claiming that its educational requirement had a disparate impact, screening out people who could succeed in the job. That would appear to be a strong case.

>I have never heard of such a case, attacking an employer’s college degree requirement on disparate impact grounds. But nothing would more rapidly deflate the college bubble than if the Court were to hear such a case and rule consistently with Griggs.

Intention doesn't match reality. Honestly, we need to revisit the issue and do something differently. However, political correctness has nothing to do with it.

(I am also not convinced such tests actually have much to do with intelligence anyways...)


"Political correctness" has maintained its primary meaning over the years, which is an indication of terrible opinions coming from the user.


As a lib arts graduate, I might be biased, but I believe it's still the best education for the type of decision-making that's most useful in real-world business: weighing ambiguous and incomplete information from a variety of sources with competing interests.

That being said, if you're not in a leadership position, that kind of decision-making isn't what you're doing: you're most likely just optimizing your own little anthill, so a "profit-centered" education of the kind the article argues against might make sense for the worker bees.

Another thing that I think is usually missing from the "train your workers" debate is how much variance there is in productivity, and how actual productivity is usually unknowable unless you have 2-3 months of project data for a given worker (these are assumptions). So hire unskilled contract workers, fire 80% of them, and then train the remaining 20%. They won't have the credentials to work elsewhere, and you've been able to identify the true all-stars using data on their actual work product.


It seems that expecting schools to do all the training your employees will ever need wouldn't be a good bet from a business point of view either. If your employees have exactly the same skills as the competition's employees, how do you expect the business to differentiate itself?


It's true that business should pay to train its own work force, but I'm not convinced it can.

Businesses that would like to but that operate in a sector where a competitor can successfully externalize that cost will be at a competitive disadvantage.

And businesses that subscribe to managerialism -- the idea that it's primarily management/leadership skills that differentiate a business, rather than domain knowledge -- may not know how to train employees at all, as it takes someone with domain knowledge to know how that can be done...


Companies are often eating the cost of training employees without admitting to it.

Pick some company at random, look around, and there will often be piles of people who have no official training or certifications to speak of, and no relevant degree, but who are perfectly competent at their jobs. Somehow, magic happened, and they were trained, despite no training money in the budget.

A lot of it is just done semi-officially. The boss points at some guy he trusts and tells him to fill them in, and lower productivity is accepted for some period.


I thought tech internships were more about recruiting than about training.


Students aren't dealing with the issues that actually arise in a regular workplace. Many professors have little knowledge of what has happened in industry between the time they left it and today. Because of this, students learn principles that aren't fully applicable to many business jobs today, instead of learning how to deal with corporate incompetence. If companies trained employees in how to deal with people, they would pick up more of the technicalities on the job under a hopefully competent supervisor.


Why should they? That's the whole argument of this piece -- that they shouldn't.



