99 Out Of 100 Programmers Can’t Program – I Call BS (skorks.com)
132 points by duck on Oct 16, 2010 | 161 comments



[Interviewers] ask simple coding questions like the fizzbuzz, but [their] applicants still fail – even ones that shouldn't. [...] [Y]ou're using the wrong medium for what you're trying to achieve (e.g. coding over the phone).

Clap clap clap.

The same people who insist that the minimum development workstation is a tricked-out computer with two (three?) big monitors, SSDs, the best IDE money can buy, etc. don't understand why someone is uncomfortable writing a program on a whiteboard or, even worse, over the phone.

Unfortunately, I don't have the solution. Perhaps pg can add "Interviewing that doesn't suck" to his list http://ycombinator.com/ideas.html - obviously this is included in "7. Something your company needs that doesn't exist." and is the company version of "8. Dating."

Personally, I think Steve Newcomb has a 90% (maybe better) solution. http://blognewcomb.squarespace.com/essays/2010/10/14/cult-cr... (discussed here http://news.ycombinator.com/item?id=1793095).


My company does whiteboard interviews. The important thing is not getting the right answer. We don't bother to attempt to compile the applicants' code, so we don't even know if it works. The idea is to get a glimpse of how they reason through a problem. After they've roughed out a solution we ask them follow-up questions like, "How would you test this? What corner cases would you check?" If it suddenly dawns on them that they forgot about certain corner cases, that's considered positively; if they blunder confidently on through without checking to see if what they'd written would really work, that's considered negatively.

We also ask them a design question, typically something like "design an automatic retrieval system for one of our robots". Again, not being mechanical engineers we have no way of telling whether their ideas would actually work. It's the thought process we're interested in looking at: how detailed they can get, how they anticipate and solve or work around problems with their design, etc.

Given that we have a pretty top-notch software team, I'd say our system has worked out rather well. :)


The problem with the Newcomb solution is that it is aimed at the high end of tech companies, which are ultimately willing to pay a very high portion of the company's worth for top talent.

Most companies employing programmers today are not such companies -- they cannot possibly hope to hire A+ employees, but they are trying to build a solid team of B- or C-level programmers, just hoping to keep the F-level out. As far as I know, this is a total crapshoot and the ones succeeding in it do so mostly by luck.


Understood and agreed. I thought about editing my comment to add a disclaimer about "if you can implement it" because many companies cannot.

On the other hand, the same companies that are decrying "99 out of 100 cannot program" also claim to be in the same high-end tech company space as Newcomb.


This doesn't solve the problem for everyone else, but when interviewing, solve the problem for yourself: Bring your own laptop. (Or borrow one if you need to!)

I can't believe in 2010 this isn't "conventional wisdom", but heck, take advantage of the competitive advantage while you can. No matter how impressive your linked list reversal is on the whiteboard, running code is even more impressive.


Given last.fm's database of (user, (artist, track), datetime) plays, how would you quickly recommend me something to listen to? Now you have two problems. First, your code, in order to run, needs to load in and process this list, and you need to decide if it's a flat text file (the format of which you then need to define and parse) or if it is in a database (whose connection and schema you will then have to deal with). That's boilerplate that's (a) easy to get around on the whiteboard and (b) utterly irrelevant to the question asked. Second, you don't actually have such a list, so running your code isn't actually impressive. Mocking up a sample list takes time, and engineering it to create useful results takes even more.

Next, I ask you to parallelise it, and you don't recall the pragma that OpenMP uses, so you just need to pull up the manpage, or you don't have the Actors library installed, so, uh, what's the password for wireless? Oh, and you still don't have that large file, so even if you get your parallel code running before I run out of patience waiting for the answer "well, access to this counter would have to be synchronised, so that would create a bottleneck", your program still won't have enough data to munch on to give any insight into its performance.

tl;dr: You will have to be very sure of yourself to pull this off. I want to know how you think, not how you handle your tools.


Yes, much better to just handwave; that'll show you have it all together.

You're seriously arguing against having somebody actually code in their interview? I'm not saying it has to be the entire interview, but how incredibly crazy is it to actively argue against the idea that maybe you, as the interviewee, might want to actually show you can do what you say you can do?

Besides, I find myself questioning if you've been on the interviewer side of very many interviews. Finding people who can slurp a file in at all is a damn good start, unfortunately.

Then again, I know I do this really wacky thing where my interview isn't set in stone; I actually react to the skills demonstrated on the other end of the table. Believe me, if somebody walks in sporting a laptop I'll customize something out of my base set of questions for them to show me. It would hardly be any different from what I do anyhow.


Lay off the strawman, whiteboard programming is not handwaving.

And I can do without the ad hominem as well. No, I haven't done a lot of interviews (about 6-7 I think), but my post still has substance that can be argued against.

> You're seriously arguing against having somebody actually code in their interview?

No, I have no problem with coding in an interview (I think it should be part of it, just not the part where you do something else). I have a problem with generalizing the advice "When asked to do whiteboard programming, whip out your laptop."


(blush) That didn't occur to me. Good thing I haven't had any "program on a whiteboard" interviews. Thanks for giving up your competitive advantage to all the HNers. :-)

Personally, I hate to program on someone else's computer. I run pretty bog-standard linux with minimal plugins and aliases (in part, so that I'm comfortable in a bog-standard environment typical of servers). The problem is, lots of people have customized the crap out of their system. The result is added discomfort when I try to do something on their computer.


Someone once brought in his laptop. He showed me a running program and gave me a tour of his code. It was very impressive.


That's very much like what I would ask from a candidate developer: "Bring your laptop and show me some stuff you made in the past year(s), and expect to implement a small programming problem I give you. You may use your favourite development environment for that."


What happens if they don't have a laptop?


Wouldn't that be some kind of a contra-indication? Not a definite "don't hire" signal, but pretty close.


Now, sorry to say, but that's just a stupid presumption. It amounts to either economic discrimination against people who can't afford a laptop (for a whole host of possible reasons other than "I guess they never had a job that paid well because they are crap programmers"), or at the very least, prejudice against people who work on desktops and don't like laptops. Not every programmer is a laptop-wielding RoR nomad who writes code wherever, whenever. Some people need their special cave and their fixed rig.

Don't assume everyone is like you.


The person being interviewed should be interviewed for at least a full day. They should be presented with a large set of problems: "fizzbuzz"-type questions, design problems, and debugging problems. They should talk to the top technical people in the company, to those they will be working with, and to as many other people as possible.

This is even more important in a small company. A small company cannot afford to hire dead weight. A couple of those hires will kill the company -- they waste resources, don't get work done, and, perhaps most dangerously, drive away top technical talent that does not tolerate mediocrity.


This sounds great in theory, but can also be a big time waster for your top dev talent. Especially so considering you usually start hiring when you're especially crunched for developer resources.

The interview process should be extensive, yes, but it should also be spread out in a way that allows you to fail candidates early. Blocking out a day of interview time at once can be inefficient - rarely will that person be sent home early (for various reasons), and they are now just wasting a bunch of people's time, and their own.


Think of all the time they will be wasting with this person if she turns out not to be a good fit.

You presumably want people who will enhance and help your top talent.


That's exactly the point, and it is inversely related to the size of the company. Imagine if IBM hires a dead weight: the company is so big and there are so many other positions that the company as a whole will not be terribly hurt (unless it's the CEO, perhaps).

But with a small company, when you have 5 people and you hire the 6th and it turns out to be a bad choice, it could kill the company.


The critical people interview first; after the interview they talk to one of the owners. At that point the interview could stop. So it might actually not take more than half an hour.

Hiring many people at one time is, I think, a sign of a problem. We try not to hire more than 3-5 people / year. It is not that bad for each top dev to spend 10 hours / year to find the right candidate.


I don't understand this idea that it is a good idea to waste such vast quantities of time for both the potential employee and the employer. Whenever I've looked for a job in the past, I'm typically still employed. Using PTO to go to day long interviews means I can't give a company a shot unless I already know I want to work there.


Because when a small company is hiring an employee they are risking quite a bit. It is worth spending more time finding out who they are. Hiring and firing people immediately because they turned out to be the wrong person is very counter-productive and demoralizing for everyone involved.

Before the day-long interview we would have a phone interview. At that point you would decide whether you want to take a day out of your life to find out more about a company where you will potentially spend years working. I don't think that is that unreasonable?


It depends. What is the likelihood that that person will get hired if they go on the day-long excursion? Will this person be just one of many people you will take on these marathon interviews? Or will you be completely focused on this one person, and if they seem like a good fit they get hired immediately?

The point is, if you expect me to invest so much in an "interview" with you, you had better be laying some big chips on the table as well.


You will pretty much be hired at the end of the day. We hire about 10% of people that we interview this way. So we already pre-screen people before spending all that time with them.

Remember you can always leave and we can always stop the interview.

Well, this way of interviewing has worked well for 15+ years ;-). We like it, and we'll probably stick with it.


A 10% shot at most jobs is not worth 1 day of PTO.


It is not that random. When I went for the interview I was 80% sure I would get it. But I wanted to spend the day there to confirm that. I was interviewing them as much as they interviewed me. I had multiple job offers and meeting the people I would work with helped me decide.

After being there for 4 years, I would say it was the best 1 day of PTO I have ever spent in my life.


One of the most eye-opening lessons of my professional life:

I had a set of coding questions to give final round candidates to complete in person. By paper, whiteboard, or laptop -- their pick.

The success rate for candidates was about 25%.

I then started giving the same questions as a time-limited take-home test, for first-round candidates.

The success rate jumped to almost 100%.

Same questions. Less-filtered candidates. The difference? The primary determinant of success was interview environment, not candidate ability.


Or possibly the ability to cheat...

It's becoming more of a problem, especially with university level recruitment, that candidates are either getting others to take the test for them or downloading solutions from somewhere.

Often candidates, once brought in for interview, fail to be able to explain how the code "they wrote" works.


I once interviewed someone who said they wrote a web browser in Qt. After inquiring a bit it turns out they had really forked the Arora browser I wrote. I had fun for several minutes asking him questions about the code that he clearly had no clue about.


That's exactly what happened to me in my recent interviews.

Some company wanted me to write simple string reversal and prime number finder programs on paper; I failed because I'm not used to coding on paper.

Some other company wanted me to write a server client program which takes a file from client and saves it to server. I completed it on my laptop on the same day when I got home.

If the interviewer is going to make the interviewee take a written test, it must be about concepts, designs and stuff, not implementation.


Why would anyone ask to find primes? Most people don't know the best algorithm and anyone who needs to code efficient ways of determining primality has intense number theory training.

The "naive" solution typically is a sieve method and checks more than the required number of factors (goes beyond sqrt(n)). It's like asking someone to sort an array and expecting bubble sort because you never use sorting algorithms in what you actually code.

The best way I know (and there are actually better) to find primes is a probabilistic algorithm that no one would ever come up with independently without months of research. The Miller-Rabin primality test decides whether a candidate n is prime with some probability by looking for witnesses a < n that fail a condition derived from Fermat's little theorem. Almost every composite number violates Fermat's little theorem for most bases (Carmichael numbers are the only exceptions to the plain Fermat test, and Miller-Rabin catches even those).

Once you hit the desired probability that n is prime (testing more witnesses a raises this probability) you use a deterministic method to determine with certainty whether n is prime. If it isn't, you throw it out and use Miller-Rabin to find another candidate. Even though you might throw some candidates out and have to pick others to check for primality, this method is far faster than a purely deterministic solution such as naive trial division.
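
(For reference, a rough Python sketch of that probabilistic step, assuming random witnesses and skipping the final deterministic confirmation; a sketch, not anyone's production code:)

  import random

  def is_probable_prime(n, rounds=20):
      # Miller-Rabin: declare n "probably prime" if it survives `rounds`
      # random witnesses; a single failing witness proves n is composite.
      if n < 4:
          return n in (2, 3)
      if n % 2 == 0:
          return False
      d, r = n - 1, 0          # write n - 1 as 2^r * d with d odd
      while d % 2 == 0:
          d //= 2
          r += 1
      for _ in range(rounds):
          a = random.randrange(2, n - 1)
          x = pow(a, d, n)
          if x in (1, n - 1):
              continue
          for _ in range(r - 1):
              x = pow(x, 2, n)
              if x == n - 1:
                  break
          else:
              return False     # a witnesses that n is composite
      return True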

This is a moronic test of ability. It exposes the arrogance of the company in assuming they can evaluate candidates in 15 minutes by asking the most challenging number theory problems out there when they have no understanding of the current methods in the discipline.


The goal is not to illustrate the best algorithm known to researchers, but simply to demonstrate that the candidate can explain the obvious algorithm and get reasonably close to correctly implementing it. Primes is an easy problem to explain and to solve (the naive way), but I wouldn't ask such a commonplace problem because a sub-par candidate may have faced it before and partly memorized the answer.


It's not the best algorithm known to researchers. It's the best algorithm known to people who took any course in number theory or a higher-level computer science course that covered probabilistic algorithms. It is used frequently in actual applications like crypto (RSA).

It is exactly like asking an applicant to sort an array and expecting the naive bubble sort response to check that they can develop and implement a naive algorithm for searching. It says you can brute force a solution with a for loop. It says nothing about how well you understand advanced data structures, recursion, algorithmic analysis, language features, domain-specific knowledge, program design and structure, or knowing when you're out of your league and should google the best approach. It quizzes chapter 3 of any intro programming book and basic division in a situation where you would never use it.

Ask them about event handling or something useful like hash tables or linked lists or version control systems. Any math major can answer this question easily without knowing much beyond for loops and arrays. It tells you almost nothing about their programming skills. I was math/cs with grad school in math and every math major I know could do this. Most of them have no idea how to code. I know some great developers who would struggle and fail because they typically don't use this math in their job and have forgotten it. They work at some of the top companies out there (which are frequently discussed on this webpage).


Out of curiosity, was the failure at an algorithm level or at a semantic level? (i.e. could you not put the algorithm down on paper, or was it just a case of not remembering the correct syntax or libraries)


Personally (visual thinker, diagnosed ADD), it's very difficult for me to think through algorithms under pressure, or in environments with distractions. Stuff that takes me five minutes in a quiet, private, relaxed setting can take me an hour to think through in an environment with more people. Lab courses in college were particularly frustrating for me for this reason.


The algorithm was fine in both cases; I had the problem solved in my head but couldn't put it onto paper effectively. I had trouble remembering syntax details, because I use the autocomplete functions of IDEs a lot.


>> The same people ... don't understand why someone is uncomfortable writing a program on a whiteboard or, even worse, over the phone

Exactly. I loathe whiteboard interview questions. In five years of programming, how many times have I ever written or looked at code on a whiteboard to solve a real, work problem?

Zero.

Yet some people think the best way to learn about a candidate is to put them in a situation far removed from a real, daily work situation and see how they perform.


Full working code on a whiteboard is very uncommon. However, a lot of the time useful code snippets go on a whiteboard, particularly in informal meetings. More than that, design diagrams go up there as well. There are some interview questions I like which force people to put up the sort of pictures we actually draw in meetings - for instance, come up with the tables backing a particular application. At the place where we asked that, you'd find yourself doing that after you got hired as well.

If you don't use whiteboards in these ways, then I'm going to suggest putting whiteboards near developers and consciously trying it. You'll be amazed how well it works.


The general principle, as I understand it, is that if you can't write a very simple program like FizzBuzz under pressure, you will also do poorly when you're doing harder tasks with less pressure. This aligns pretty well with my experiences, both in school and industry.

Really, if you can't write a few lines "on the spot", what use are you going to be to an employer? In the real world there is always some deadline. I don't always expect perfection in a whiteboard scenario, but there's a world of difference between a blank stare and a 90% correct solution with an off-by-one.
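
(For anyone who hasn't seen it: FizzBuzz really is only a few lines. A plausible Python version, just to show the scale of what's being asked:)

  for i in range(1, 101):
      if i % 15 == 0:
          print("FizzBuzz")
      elif i % 3 == 0:
          print("Fizz")
      elif i % 5 == 0:
          print("Buzz")
      else:
          print(i)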


I disagree. Give me a hard problem and a short deadline, leave me in front of my computer with my environment set up (so I can do some quick trial and error), don't look over my shoulder to evaluate me (but you can help me if we're pair programming), and I'll succeed. Give me an easy problem over the phone (I'm a visual person) or on a whiteboard (my handwriting is not good) while timing me, and I will fail.

But in the end, I like this kind of interview, because I don't want to work for anyone who would evaluate me like that.


The other thing is that these interviews almost always enforce no Internet access. If I was in a position to hire, I'm looking for people who can solve problems, not those who can remember the Java API in their head. If their first instinct is to open Javadoc, then my reaction should be "good, they know where to look to make sure it's correct", not "God, why can't they just do this [arbitrary task]?"

I'm hoping to finish my PhD and become a professor in Computer Science. My plan, given university constraints, is to never once offer a written final. I've not once taken an exam where I was confident just writing code down. It doesn't make sense. Offer them a final on a computer with Internet access, but with a sufficiently difficult problem that it can't just be Googled for a complete answer. Their web access logs are turned in with their exam, so I can see if they copied something if it looks suspect, or if they're borderline I can give extra credit for going through a mental process that shows promise.


"If I was in a position to hire, I'm looking for people who can solve problems, not those who can remember the Java API in their head"

That is what makes the whiteboard such a good tool: when working on a whiteboard, you can focus on the problem, not on pesky details. A whiteboard will never complain about API issues, function naming, etc. Not even by using squiggly underlines.


I agree and think people are getting caught up with writing complete running code on a whiteboard which should not be what the whiteboard is for. A whiteboard is for writing pseudocode to quickly build algorithms, work on design or solve other 'thinking' type problems. It's not for testing if someone knows obscure syntax issues (there are actual tests for that!).

In school we were taught to work through algorithms and other design using pseudocode and pencil and paper. Doing this also helped reinforce that languages are very interchangeable using the same algorithms and designs. I wonder if that is simply not done anymore in school?


I think another way to impress an interviewer is to come to your interview prepared to discuss (with technical detail) past exploits in which you were able to do just what you described. Most people just flow with the interview rather than try to (subtly) direct it in a way that is more favorable to them. This is a workaround for sure, but aren't programmers used to finding workarounds to impossible situations?

Another option is to tackle the interview itself as a challenge and practice interview skills, such as whiteboard coding. I think Steve Yegge suggested this in one of his blog posts about how to interview at Google. Personally, I find this to be very impressive: you knew there was going to be whiteboard coding, so instead of complaining about it, you practiced it. If you (casually) mentioned this during one of my interviews, I would probably give you some extra points.


Deadline pressure != interview pressure.

Also, stressful deadlines are an indicator that something else is wrong in a company. The company should fix that issue instead of hiring people who can "program under pressure".


Plenty of code changes need to be made under pressure. Some recent examples from stories posted on HN:

1) Bug in Google authentication which allows websites running on App Engine to grab users email addresses without permission

2) Crunchbase making all historical searches public

3) Facebook downtime bringing down "like" buttons

All three of these issues were, for the companies in question, bugs that had to be fixed in real time. Does it indicate that something is wrong with these companies? No.

Any changing production system is going to suffer from bugs that on occasion will require programmers to be woken up in the middle of the night to urgently fix a bug.


"Does it indicate that something is wrong with these companies"

It means that their testing wasn't very good.

If programmers are being woken up to fix something then something is very wrong with an environment.


You're never going to have 100% test coverage. Any complex system is going to have bugs. It's pretty much impossible to write even a small library without bugs, let alone huge websites with the complexity of Facebook or Google.

Look at cryptographic libraries. Often they've been gone over line-by-line by hundreds of experts in the field, have extensive testing both unit and field. Yet bugs still get found in them.


My point isn't that bugs happen - of course they do and of course they get missed by tests. It is that if you pull someone out of bed to fix a problem I would suspect there is a high likelihood that you will be introducing more problems when you deploy this fix unless you run a full QA cycle after the change.


I don't know about 1 and 2. The 3rd - Facebook - from what I read about it (a bug in cache invalidation/update herding the DB), as well as the recent problem at Foursquare (uneven sharding on MongoDB overloading a shard), as well as the production problems I have faced and resolved personally, have nothing in common with the interview questions I've been asked so far in my life. Not even close, not even in the same universe.


The problem is that real-world bugs are often too complex to use in interview questions. From my experience, solving production bugs often involves having to piece together bits of information from logs, databases, etc. and making educated guesses to track down the source of the bug. It's pretty hard to replicate in an interview environment.

Spotting bugs in code, operational questions, and writing code samples are probably the closest you'll be able to come within the time restrictions of an interview.


At my company (we make AUVs), stressful deadlines mean you're out on a boat and you've got ops personnel looking over your shoulder and if you can't push out that code fix or configuration change NOW then they may not be able to run missions for the rest of the day because the vehicle would be dead in the water and the sun is going down.

Needless to say, actual software development in the office is pretty relaxed. But when you go out on ops, you will be in the pressure cooker and you will need to handle it.


Occasionally I have to troubleshoot our production code while we lose several thousand dollars per hour in revenue and jeopardize business partnerships. Sometimes it's just pointing out which servers need a kick in the head, but sometimes it's a recent code change that didn't handle all cases and we need to fix forward on the spot (because rolling back all interdependencies would take longer).


You should be able to reason regardless of how big or interactive the surface you're messing around with is.

The reason that big monitors are useful isn't because it makes you smarter or more able to reason about trivial problems. It's because it provides more space to cross reference code, more space to look up documentation, and so on.

For the sorts of questions that are being asked in an interview, you should be able to sketch out a solution on a whiteboard no problem.


Our field is such a weird place to be. I live in a city where most people are looking for .NET, WCF, WWF, WPF, WebServices, SQL-Server (T-SQL), and some Windows Administration, either that set or this one: JSF, Struts2, Spring, Hibernate, Maven/Ant, XML, XSLT, XSL, XQuery, XPath with Oracle/DB2.

Most programmers in my area no longer care about algorithms, data structures, operating system principles, database principles, or basic math.

Most companies that I have worked for, or heard about from their employees, don't have a good codebase, don't have good automated tests, and don't have a good culture.

Sometimes I do wonder what it means to "be able to program".

Can deciding to use MVC (Spring, ASP.NET, Rails) to spit out XML/JSON so that it can be consumed by AIR, pure JS/jQuery, or even your iPhone app be counted as "be able to program"?

Can writing good code that can be easily testable be counted as "be able to program"?

Can producing code that consumes web services from a payment provider be counted as "be able to program"?

Or is solving tricky algorithm problems the de facto "can program" filter?

Some of the "can..." produce business values. Some others, might not necessarily produce business values, but instead, just to show that "yeah, I can sling code the way you want me to".

Tricky field...


Ugh!

Please don't enumerate buzzwords like that. Or at least ROT13 it to be kind to some people's refined slacker sensibilities.

Right now I am digging my nails into the desk and biting my teeth: please, don't give the scumbags any of your attention. Hack outside what you see on corporate job descriptions, even if you have to wait tables to support your console habit.

Every day you spend learning something to meet a job requirement, you're being someone's #&@! -- and someone who doesn't know you or give a crap about you. Do this shit for yourself. Do what *you* want.

Forty years from now, at the end of your career, what do you want to remember? What do you want to be remembered by? Surely, not that pile of inventory reporting macros you wrote in Vector SQL-PowerG# 2020.

I see these ghosts at bookstores, discussing how they're certified in Java, Oracle, Cisco, and A+, and now need to add Ruby and Scala because they saw it in an ad. I am sure a great many of them can code and code "well", as much as their tools allow them. But at what cost to themselves, their ego, karma, mental health, personal growth, and fucking COJONES .. do you even have any?

Don't choose to remain adequate and acceptable. Aim to excel, exceed, and if necessary, offend. Today's challenges and good times are tomorrow's fond memories. Take a shot at something great; even if you fail, do it, and enjoy yourself.


Those skills are not specifically for corporate jobs, by the way. A typical MS shop (regardless of what you build) will require at least .NET 2.0/3.5 with ASP.NET (not even MVC), WPF, WCF, WWF, Silverlight, and SQL Server. A typical Java shop would require you to know J2EE, Servlets, JSP, Tomcat/Websphere/JBoss, Maven/Ant, JUnit, and XML (and some would ask for GWT experience).

But a couple days ago I told myself to ignore the rat race. I then bought a Linux book specializing in command-line, a Vim book, and a Rails 3 book. (Why Rails? um, why not? it's better than those corporate tools). I don't care if I have to learn PHP. As ugly as it is, it's still better than using .NET tools (regardless if it's ASP.NET MVC or not).

C# could be a better language, but I just want to use light-weight tools (even if it means I have to learn many-many tools). No more heavy weight tools.

I noticed that the people who choose the lightweight path (C/C++, or LAMP/RoR) seem to be able to focus on what matters the most: learning skills that can last longer than .NET version 2.0.


On the first day of my first job out of college, the sales engineer of the company took me out to lunch to give me some career advice. He said the best thing he ever did for his career was not learning how to code. Apparently he had graduated as a CS major and went straight for a software company, but really just could not program. Of course, as he put it he chose not to program, but I ultimately saw some of his work and it was beyond terrible.

Instead, he made sure never to actually be useful to the programming or hardware teams, but talked to them just enough to know the basics and stay updated. Since he wasn't actually ever working on anything, he used his free time to mingle with management and used his inside knowledge from the programming teams to announce potential problems to management before the actual programming teams could communicate this info.

Sure enough, he slowly moved into management and higher salary brackets by never actually learning how to program. Of course, the programmers and hardware teams caught on to his game and cut him off from any useful information, and without any actual knowledge he hit a ceiling within the company pretty quickly. Still, it was an important business lesson for me, and ever since watching this all take place I've made sure to carve out some time from my heads-down work to maintain visibility and directly communicate with management.


The really sad thing is that he had something incredibly valuable - the ability to translate between geek-speak and manager-speak - and chose the dark side. If he worked as a collaborating liaison instead of a back-stabbing asshat, he probably would have climbed faster and definitely farther.


Honestly, as a CS major, I can really believe that, just looking at the rest of the people in my class.

College only teaches you theory, for the most part. Some programs will not teach you any programming besides some generic C++. You have to learn that on your own, and I believe the vast majority don't have the drive to spend years learning on their own to perfect their skills.

I really agree with your points on the problem being with the interview process though.


As a professional I can believe it too. Most of my colleagues are fine if they have some existing code and need to modify it, or add a feature following the pattern of the existing code, and that is 90% of what a developer does in the real world. A completely blank screen tho' and even I hesitate. Almost every new project I start actually starts as a skeleton from another, here's my Makefile, here's my usual basic imports, command-line parsing block, etc etc. Once I get some momentum going it's fine...
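
(Something like this hypothetical Python skeleton is the kind of thing I mean -- the names are made up, it's just the usual imports and command-line parsing block carried forward from project to project:)

  #!/usr/bin/env python
  """Skeleton copied into new projects before any real code exists."""
  import argparse
  import logging
  import sys

  def main(argv=None):
      parser = argparse.ArgumentParser(description="New tool; does nothing useful yet.")
      parser.add_argument("input", help="input file to process")
      parser.add_argument("-v", "--verbose", action="store_true")
      args = parser.parse_args(argv)

      logging.basicConfig(level=logging.DEBUG if args.verbose else logging.INFO)
      logging.info("processing %s", args.input)
      # Real work goes here once the project gets some momentum.
      return 0

  if __name__ == "__main__":
      sys.exit(main())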

The fact is that computer science is not software development, any more than you could take an astronomer, drop him on a ship and say "navigate!".


There is a simple solution to this. Start with a working program that doesn't do what you want, and incrementally make it into the one you want.

It doesn't matter what the original program does. I cannot count how many times I started with a program that printed Hello, world and wound up with something useful and unrelated.


That is how most programs are written, but it really helps to have some existing structure, such as an app framework. Also I suspect there has only ever been one Makefile, that has been copied and tweaked millions of times by millions of people over the years...


Why do you think that?

Personally I write all my makefiles from scratch. I'm always baffled by other makefiles though. They always look so huge and for some unknown reason just overly complex.


Now you know the reason ;)


Really? I've been using the same one for the last 15 years, and before me it was owned by an old geezer. Some people say I should try some newfangled nonsense like Ant, but make has never let me down.


Ant, waf, and the rest of them are usually overkill IMHO. With makefiles the basics are simple and straightforward, and I'd argue that most uses only require the basics. It's like what I've been hearing about autotools these days: even autotools seems overkill, especially on Linux where we've had proper pkg-config support everywhere for years. The same is probably true for the BSDs as well. I can't comment on Windows or OS X as I have no experience there, but I imagine it must be easier to write a couple of basic makefiles for 2 different platforms if you need to. Most cross-platform projects don't need to, as they use Gtk, glib, Qt, etc., which already pull in all their dependencies.


I have a co-worker. This co-worker can't solve problems. He just lacks the ability completely. Give him an existing code base and an error, and he has NO IDEA how to proceed.

Contrariwise, if he has to write something from scratch he can... as long as it's entirely stand alone and not too complex.


This might not be a valid excuse, but look around at our industry: overtime everywhere. If you've been working 10-12 hours a day, you don't necessarily have the energy to learn on your own at the end of the day. That's why sometimes I'm not surprised to see any type of programmer wanting to move to management, unless he/she is a real uber geek; you know, the type who has social issues and only focuses on one thing.

Let's face it: our industry is as bad as the quality it produces.


I don't know how good or bad my courses are, but really, just how shitty are most CS degrees? In the 4 years (and counting) that I've spent studying CS, my assignments have included writing:

- a kernel driver in Linux
- graph metaheuristic algorithms, with analysis (and the test case generators in Python)
- nontrivial data structures in C, C++, and assembly
- graphics and matrix math functions in assembly
- simple OS schedulers in Java
- recursive functions in Haskell
- software contracts in academic verification languages
- 95% of all reports in LaTeX

And this is basically for the equivalent of an undergraduate's degree. I'm not saying all degrees should have this level of variety, but are people really coming out of Uni without at least having written a linked list in a low-level language?


My degree was pretty hardcore as well. Did it prepare me for work? Yes, but only in the abstract, and that's because I was lucky enough to have a natural talent that uni enhanced through practice (That of rapidly picking new stuff up).

Not once did I use an MVC beyond a crappy Java implementation. We didn't do maintenance practice, or log file spelunking, or customer interaction.

I learned how to create multi-user, networked, stateless applications editing one source. I created a compiler and language, learned assembly, studied algorithms and Z-schemas. I switched between languages every day and solved some annoyingly abstract and difficult problems.

Number of bugs I solved in other people's code outside of an exam? 0. Number of incredibly obscure projects I took from start to finish? 0.

CS Degrees only teach people to be potentially good programmers IF those people are able to learn how to fix shit by themselves. Otherwise you're teaching greek to a deaf person. Sure, they know what they're asked, but they can't talk back.


"College only teaches you theory...You have to learn...on your own."

Agreed. My experience was that I got into programming as a kid. By the time I enrolled in college, I had written hundreds of thousands of lines of BASIC code and had written a ton of Pascal programs. College enhanced my programming abilities with concepts like big O notation but for the most part did not give me any hard technical skills that were hirable. My first job out of college was programming in Java which I picked up on my own and via an internship.


Did you know that "I" is the most used English word?


I've had the opportunity to interview/hire a lot of people over my career. The candidates who completely learned how to program from their college courses and never pushed themselves to learn, try or do more are really easy to spot. They always get a firm "no" from me. I'm looking for the candidate who programs because they truly enjoy it, almost regardless of having a degree or not. If your resume doesn't have at least a few side projects you've done I'm almost guaranteed to not even bother interviewing you.


Agreed. It's funny: if I take a look at Facebook and view classmates' profiles, I appear to be the only one who still programs, despite us all taking software engineering in college. Maybe that's the effect of the current economic climate, but at the time everyone else seemed to just view it as a high-paying career path and couldn't tell you the difference between a variable and a constant.


It's still high paying, and still has a lot of job openings, for the A, B, and C level programmers (as it were, keep the "F-level" out).

I see two likely possibilities here. One, your classmates were all terrible -- I doubt that is true, but maybe one or two never 'got it' in industry. Two, most of them discovered that they didn't love programming, and if you don't love this line of work, it's really easy to hate it.

Maybe they just found their passion elsewhere?


I agree, but I think that at times there is more talent about in peers than it seems; you just need to give them the benefit of the doubt and see how they go.

I was thinking about it recently: if I had spent my 4 years of uni (just finishing up now) focusing solely on the course content and not spending time reading things like books and Hacker News and hacking on my own projects / freelancing / startup, where would I be as a programmer? I think I would be so far behind where I am now, especially on the business side of things but also the programming. I know for a fact the thesis I'm finishing up would be nowhere near the level I have got it to.

It's interesting that in uni in terms of ability I would be close to the top, but when I've seen some of the amazing talent in Melbourne through coworking I'm just average.


I've seen people say this a lot these days: "a degree is not that important". But I kinda disagree with it.

I would rather say, "Grades" are not that important. You don't go to Engineering school just to learn programming.

Programming is something you learn in order to apply what you are learning at school.


You need it all. A degree can get you over the "you don't know what you don't know" hump that many self-taught programmers never get past. My lightbulb moment was linear programming. Up 'til then I was a competent programmer, mind, but suddenly a whole new world opened up, I had no idea you could do this stuff by any means other than brute force...


I don't think it's quite the theory/practice split here. A large percentage of CS majors would probably fail the theory version of fizzbuzz too; say, ask them to analyze a simple sort algorithm for asymptotic complexity on the whiteboard.


It depends on the department. Some have a heavy focus on implementation projects in courses.


Blame Your Interview Process First

Maybe it's the dogmatic zealotry, but it seems like 9 out of 10 programmers just repeat what they hear to be common wisdom so that they don't stick out like a sore thumb. Who would want to be one of these mythical programmers that cannot program? Who wants to be called "dead weight" and cast out of the group? Who would ever admit to being one of these fabled creatures?

We programmers need to stop being so nasty and elitist. Programming is hard, and I'd be impressed if you could write down, on a napkin in 10 minutes, a working solution to a toy computer science problem you probably slaved over for months in university. You should go on Jeopardy. You'd make a killing. I can't do that; I don't have a mind for trivia. We need to stop entertaining our egos and thinking of everyone as "dead weight." We can all stand to improve the code we write. We should be encouraging one another and learning together.

We don't "interview" people anymore. We "screen" them. I suspect it has something to do with the elitism in the industry as well as the inherent anxiousness and dread of interviewing people you don't know. It's become a terribly broken process, yet it seems precious few people such as the OP realize this.

We could do something about this. Here are some tips I have from my experiences:

1. Know the person's name before they walk into the room. Don't shake their hand as you look down at their resume to figure out their name. At least make it look like you put thought and care into who you invite into your office for an interview.

2. Don't use a whiteboard or a pad of paper. Only a select few eccentric programmers have ever actually sat down and written out a program on paper. The tools of the trade are debuggers, compilers/interpreters, and text editors. If you want to see how they approach problems, tell them to come prepared with code samples to share or something. Make sure they're aware that they'll be talking about their solution, the tools they used, etc.

One last thing (and a bit of a shameless plug), I think software can be used to ease some of the pain of the hiring process. I'm working on a project to amalgamate meta-data from repository analysis tools and social networks into a single candidate profile that should give you a pretty clear indication of their skills and abilities. The idea is that you should only invite people you are interested in hiring into the interview. To be able to do that you need to know as much about this person as you can and be comfortable that you have an accurate picture of their abilities and work history. The volume of resumes can make this hard... why not automate it?

Just sayin'


Don't use a whiteboard or a pad of paper. Only a select few eccentric programmers have ever actually sat down and written out a program on paper.

Every physicist can solve first-year physics problems on the back of a napkin.

Every mathematician can solve first-year math problems on the back of a napkin.

Every machinist can sketch her approach to making a widget on the back of a napkin. Electrical engineers can sketch schematics. Chemical engineers can sketch reactions. Architects do nothing but sketch.

Why? Because practice makes perfect: In most technical fields, thinking on paper is an important part of basic education.

Why? Because even at the highest level so much of the most important creative work is done on whiteboards or notebooks or lunch-counter napkins, often in the middle of an impromptu jam session with the smartest peers you can find. My Ph.D. work was mostly involved with machines and materials, but the essential plans and conclusions were sketched out on a set of whiteboards in a series of evening discussions after eating pizza with some folks from my research group. This was completely typical.

If you can't think about programming without electronic help, does that mean you can't think about programming while walking down the street, or showering, or shaving, or doing dishes? If you can't discuss software, even at the oh-so-basic level of an interview question, without a screen and keyboard in front of you, does that mean that you can't design software while eating at a nice restaurant with your collaborators?

If you can't communicate technically in person in real time -- and we're talking really basic stuff, not formal proofs of Euclid's Algorithm or anything -- why should you be hired onto a team? The team can just outsource you.


I'm talking specifically about interviews here.

(In short, yes one should be able to think about programming problems while away from the computer. However, one doesn't arrive at a solution away from the computer. Napkins don't compile.)

Writing code in an interview will tell you one of two things:

1. This person has encountered the problem before and is simply regurgitating the solution they came to when they had four months to work on it and the assistance of a TA and other students.

2. This person has not encountered the problem before.

While the latter case is considered ideal, it only works if you know how to ask the proper questions. I suggest avoiding it because there are more direct ways to get what you want to know.

My suggestion? Code reviews. Have some pre-made examples or ask the candidate to bring their own code. No one has to be put on the spot. It's much easier to control the dialog.


There is no such thing as a first-year programming problem. At best you provide a simple problem with minimal constraints, but those have nothing to do with professional programming.

PS: What would you do if someone said write a function that will sort these three numbers? Hint: there is no safe answer.


> Hint: there is no safe answer.

I'd be interested in hearing what you're thinking of -- but if you're right that actually makes it a perfect interview question. The baseline implementation is simple, but you can keep pointing out deficiencies and observe how the candidate attacks the revised problem.

The point of interview coding problems isn't to produce production-quality code with no preparation, it's to gain some insight into the way the candidate attacks a problem, particularly when approaching the edge of his abilities.

At one of the interviews I went through for my current job, I completely tanked on a question that fished for a particular design pattern. But while I didn't figure out I needed to use that particular pattern, I was able to participate actively in the discussion on how and why the solution I chose wasn't optimal.


Simply put, based on the type of job you're applying for, there are particular kinds of solutions they are going to want to see. Without knowing more about the job you may be thinking frameworks while they want ASM.


Then ask for elaboration. It's a discussion, not a monologue.


Of course, but that's not a year-one problem. If I ask how much energy a 10 kg rock gains falling 50 m in a 1 g gravity field, you can just solve it. However, when handed a simple programming problem the natural instinct is to ask for just a little more information.

I find many programmers have issues quickly context switching between thinking about the problem, discovering what the constraints are, coding a solution, and then debugging that solution. Ask them to do it on a blackboard under a little pressure and they can't do it. The problem is the better programmers often do worse in these situations because they think about more things at each step so they need to keep purging more ideas.


I'm not sure I follow. I seem to remember being given various algorithmic and data structure related problems to solve in 1st year of University - both the implementation and appropriate application of algorithms - that I tend to find myself using as fundamental building blocks in my work. Granted, I haven't had to write a sort function since 1st year, but knowing when to use various approaches to representing and processing data and their limitations is pretty much fundamental stuff.


Problems without safe answers are great. They are infinite fodder for discussion.

P.S. Let's assume, oh interviewer, that your answer to my question "what kind of numbers?" is "integers, for now".

    sort(a,b,c) {
      exchange(a,b) if a > b
      exchange(b,c) if b > c
      exchange(a,b) if a > b
      return (a,b,c)
    }
I think that's probably the stupidest thing that could possibly work, not counting:

    <?php
       $sorted = sort($unsorted);
Assuming it does work, of course. It may have a bug. Probably does, in fact.

Now you could ask me to find the bug and I'd write out some test cases in the margin. Or you could ask me what happens if a string gets in there, or how I'd generalize this algorithm to N numbers, or what kind of language I'm using with this exchange primitive in it (answer: my own highly arbitrary mental pseudocode!) or how I'd implement the exchange primitive, or how fast this algorithm will run on N numbers (not great; generalized it's essentially bubble sort, so O(N^2), and that's not counting the fact that my exchange primitive is probably wicked slow).
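
(For what it's worth, the same exchange pattern as runnable Python -- a sketch only, showing it's just a three-element compare-and-swap network:)

  def sort3(a, b, c):
      # Three compare-and-swap steps; generalizing this to N items
      # is essentially bubble sort.
      if a > b:
          a, b = b, a
      if b > c:
          b, c = c, b
      if a > b:
          a, b = b, a
      return (a, b, c)

  print(sort3(3, 1, 2))  # (1, 2, 3)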

Or, depending on the job, you might head in some other direction.


My point is there is no default correct answer to that type of problem. In physics there are default approaches based on ignoring wind resistance etc. However, in computing knowing which questions to ask is the real test and they don't teach you that in your first year.

EX1: You wrote code before asking for more details; you fail.

EX2: Then again some people view asking questions as stalling.

Perhaps they are thinking about sorting a vector and want you to use the language library of your choice, or write something. Or perhaps they want a few lines of ASM that include at most 2 swaps and:

  x = x xor y
  y = x xor y
  x = x xor y
But if it's a high level ASM then you may have XCHG etc.


EX1: You wrote code before asking for more details; you fail.

EX2: Then again some people view asking questions as stalling.

This is interviewing, not a mathematics competition or a judicial proceeding. It's not a test, it's a first date.

If your interviewer applies either of these rules so strictly that you "fail", you probably don't want to work for them...

...unless you really are the type of person who wants a work environment where such rules apply strictly, and you happen to have the same ruleset as the person who is interviewing you. In which case the two of you may work together very well, and the interview will be a big success.

The first rule of interviewing is: Although one of you is leading, and one of you is following, you are interviewing each other. It's a conversation. If I start out answering your question on a track that you don't want to follow -- integers and not vectors, Lisp but not ASM -- you need to lead me in the right direction, just as you would do if we were discussing a real problem. Your ability to guide the conversation, and my ability to follow, and our resultant ability to establish a rapport and keep the conversation moving, is what is under test here. The problem itself is far less important.

P.S. I have seen that XOR trick before, once, but it still blows my mind. I can never understand it without spending half an hour in a corner muttering to myself. Is there a field where that trick is so idiomatic that it's a reasonable thing to expect a candidate to come up with it in an interview?


Yes, ultra-low-power embedded computers. 2 years ago a friend was working on a 32 kHz 8-bit CPU with 384 bytes of RAM and 2,000 bytes of ROM. The problem may have been simple, but he set up multitasking so he could do some networking while still keeping track of some other things, plus manual error correction because the memory was getting bits flipped in the field. Basically, it needed to operate for years in a remote location on the power output of a watch battery, and when you're putting a million of the things in the field, better code wins out over better hardware.

PS: In the end he still had 12 bits of unused memory, and in his words "you can do a lot with 12 bits of RAM".


You don't have to use xor; this will work and is perhaps easier to follow:

  /* assume x = 5, y = 10; */
  x = x + y;  /* x = 5 + 10 = 15*/
  y = x - y;  /* y = 15 - 10 = 5 */
  x = x - y;  /* x = 15 - 5 = 10 */


Until overflow occurs. Xor is safer.


  r[0] = Min( a, Min( b, c ) )
  r[2] = Max( a, Max( b, c ) )
  r[1] = a + b + c - r[0] - r[2]
I don't think that can suffer from integer overflow. But it might suffer from floating-point rounding problems.

Of course you could always go with the boring solution:

  if a > b:    swap( a, b )
  if b > c:    swap( b, c )
  if a > b:    swap( a, b )


In my experience thinking is nigh impossible with your hands on the keyboard. Thinking happens in the shower, or with a pen in your hand and a sheet of paper in front of you, or in a conversation.

Here's a homework question that I got last week:

Two numbers x and y in 0..50 with x <= y get selected. Bob gets told x^2+y^2, Alice gets told x+y. Alice and Bob have the following conversation:

- Bob: I don't know which numbers x and y are.

- Alice: I don't know which numbers x and y are.

- Bob: I don't know which numbers x and y are.

- Alice: I don't know which numbers x and y are.

- Bob: I don't know which numbers x and y are.

- Alice: I don't know which numbers x and y are.

- Bob: Now I know which numbers x and y are.

Question: which numbers can x and y be?

If you're able to solve this problem by firing up your editor and starting coding the 10 lines of code required to solve it I'd be very surprised. Hard problems require real thinking.


Maybe my thinking is a bit slow here, but can't you even only begin to hope to answer this question if you assume Alice and Bob are cooperating in some fashion?


It's a very awkwardly phrased recursion problem. The point is to find a unique x, y where x <= y. The algorithm starts with the set of all possible (x, y) pairs, and switches between the Bob and Alice stages.

In Bob's stage, eliminate pairs that give a unique value for x^2 + y^2. If such a unique value matches the value Bob was given, then obviously you know x, y for sure. Otherwise, switch to Alice's stage, which operates on the remaining pairs from Bob's stage, and eliminates all pairs that give a unique value for x + y.

The weirdness in the question is that both Alice and Bob are performing the exact same algorithm, but (to use CSP parlance) there is a channel from Bob to Alice in Bob's stage, and vice versa, due to the different information given to the two parties. Alice must wait for Bob to declare whether the algorithm has terminated or not.
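
(A rough Python sketch of that elimination procedure, assuming both players are perfect logicians and always speak the truth -- roughly the "10 lines" mentioned above:)

  from collections import Counter

  def solve(limit=50, dont_knows=6):
      bob = lambda p: p[0] ** 2 + p[1] ** 2    # what Bob is told
      alice = lambda p: p[0] + p[1]            # what Alice is told
      # All candidate pairs with x <= y.
      pairs = [(x, y) for x in range(limit + 1) for y in range(x, limit + 1)]
      # Each "I don't know" removes the pairs whose value is unique among
      # the remaining candidates (for those, the speaker would have known).
      for i in range(dont_knows):
          key = bob if i % 2 == 0 else alice   # Bob speaks first
          counts = Counter(map(key, pairs))
          pairs = [p for p in pairs if counts[key(p)] > 1]
      # "Now I know": Bob's value is unique among what is left.
      counts = Counter(map(bob, pairs))
      return [p for p in pairs if counts[bob(p)] == 1]

  print(solve())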


I don't think the algorithm you're proposing works, although that was my first idea as well. And I don't a priori see why the pair (x,y) would have to be unique (but it does appear to be, even if instead of choosing x and y in 0..50 you choose from all natural numbers).


If they're unique it means you can determine x, y from the result of x^2 + y^2.


Right. I meant that the answer to the question 'Question: which numbers can x and y be?' is not necessarily a single unique (x,y) pair.


Well, it is assumed they are speaking the truth, and it is assumed that they are good mathematicians (i.e. if they could know what x and y are at some point, they would know).

The idea is that if Bob says that he doesn't know what x and y are, then Alice can deduce some things from that, because she knows for which x and y he would know what they are. For example, after the first time Bob says he doesn't know Alice can deduce that x=1 y=1 is not the solution, because then Bob would know (because Bob gets told 1^2 + 1^2 = 2, and 2 = x^2 + y^2 implies x=1 y=1).


> it is assumed that they are good mathematicians (i.e. if they could know what x and y are at some point, they would know).

Ok that's the part I was missing. The problem makes significantly more sense now.


Reminds me of the blue-eyed islanders puzzle.

http://www.xkcd.com/blue_eyes.html


That's a similar puzzle. Maybe it's because I solved the homework puzzle first, but the blue eyes puzzle is much much easier. You might want to try to solve that first ;)

The cool thing about the blue eyes puzzle is that you can solve it without a computer or pen and paper.


If I ask a physicist to sketch out a solution to a problem, I'll be willing to accept that they may need to look up some constants or formulas along the way, because knowing how to attack the problem and memorizing a bunch of numbers are two very different things; the latter is useful from time to time, but it's the former that I care about if I'm interviewing you.

Why, then, do so many interview processes assume that the ability to, essentially, vomit rote-memorized bits of syntax, argument orders, etc. onto a whiteboard is so important that any "real programmer" should be able to do it on command?


Perfect analogy right there.

See: Tim Bray doesn't know operator precedence rules. It's perfectly possible to live and work and thrive without memorizing everything. This is what being an engineer is all about. Yet the FUD that goes into interview and recruitment processes (as well as other aspects of a programmer's work, but that's a longer story) is astonishing.


If you're talking pseudo-code, I'd agree. And I think all these various sketches you're talking about are comparable to pseudo-code. Those sketches aren't going to tell you exactly what PSI some escape valve needs to blow at, or what temperature a reaction is going to run at, etc.

If you're expecting me to write out code that can be OCR'd and compiled, which is basically what most whiteboard tests expect, I'd disagree.

I failed one whiteboard test (C#) because:

(a) I didn't know the formatting for Console.WriteLine off the top of my head, because who the hell uses that when writing an app anyway?

(b) Because I didn't check the List<int> parameter to see if it was null, even though I did handle an empty list correctly.

With the whole interview thing going on, I didn't even argue point (b); there was too much going on. On the way home I was thinking about it, and as a design decision I actually would've just let the function throw a NullReferenceException. No specification was provided to the contrary. Should I have wrapped it in a FooAppException, Java-style? Or just silently done nothing even though the inputs were invalid?


I would expect any programmer to be able to write a program on paper.

I wouldn't expect it to be perfect or even compilable, just the same as a widget or schematic. Physicists and mathematicians have an advantage in producing 'perfect' things on paper, because they 'solve' where the other examples you gave 'develop'. They work in the other direction; the mathematical pursuits analyze systems, whereas you have asked the other pursuits to create systems, which is much more difficult on a piece of paper.

Now, if you had the designers working forwards -- analyzing code or schematics or widgets -- then I would expect an accurate answer (with a little fuzz for basic mistakes, the same as I'd give a mathematician slack for accidentally dropping a minus sign).


While you're absolutely correct, it's worth noting that for most of the disciplines you mention, the actual work takes place on paper too. A mathematician, physicist or engineer will spend most of his working life scribbling on paper, so it makes a lot of sense to interview that way.

There are plenty of other fields that don't work quite that way.

You wouldn't interview a baseball player by asking him to describe hitting a curve ball on paper, or a nurse by asking him to draw stitches on a piece of paper. Every field has its own medium of expression. For many fields, that's paper. For programming, it's a text editor.


2. I mostly agree. However, if you are applying for a position at a company that specializes in programming languages or program analysis, then I disagree. These applicants should be sophisticated enough on the topic of programming languages to be able to code on a whiteboard.

College undergraduates: If you want to do well in interviews, especially ones that involve whiteboard coding, take or audit a graduate course on programming languages and pay more-than-average attention. Your code will be better, especially in rarely encountered (and thus poorly understood) corner cases. And you will interview better.


Rather than simply speaking broadly, here's a very simple example of when I would expect someone who is sophisticated on the topic of programming languages to excel over applicants who are merely practitioners. (In practice, I actually do see this fairly frequently.)

Both C++ and Java provide "dynamic dispatch" as a language feature. However, C++ does not use it by default; it uses "ad hoc polymorphism" on the static types of the arguments, including the argument passed as the implicit this pointer. In order to enable dynamic dispatch, the member function you want it applied to must be declared with the "virtual" keyword.

If someone is able to reason at the level of "dynamic dispatch" and "ad hoc polymorphism" and how to activate these features, s/he should be able to write correct, OO code on a whiteboard. However, if s/he is not formally trained in programming languages, while this person may understand this on some level in practice, s/he may under pressure lose this distinction in whiteboard code, particularly if there is no test environment available.
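
A minimal sketch of the distinction (hypothetical Base/Derived names, not from any particular codebase):

  #include <iostream>

  struct Base {
      void greet()          { std::cout << "Base::greet\n"; }     // non-virtual: resolved on the static type
      virtual void vgreet() { std::cout << "Base::vgreet\n"; }    // virtual: resolved on the dynamic type
  };

  struct Derived : Base {
      void greet()           { std::cout << "Derived::greet\n"; } // hides Base::greet, does not override it
      void vgreet() override { std::cout << "Derived::vgreet\n"; }
  };

  int main() {
      Derived d;
      Base* p = &d;    // static type Base*, dynamic type Derived
      p->greet();      // prints "Base::greet"     -- ad hoc polymorphism on the static type
      p->vgreet();     // prints "Derived::vgreet" -- dynamic dispatch via the virtual keyword
  }

Someone fluent in these terms reaches for virtual without needing a compiler or test run to remind them.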

I don't honestly believe that anyone who has done OOP in C++ has never run into this, so I chalk it up to lack of formal training on programming languages. And while this person may not be appropriate for a company that requires sophistication on the very specific topic of programming languages, this person may be appropriate for another.

I have plenty more examples, but this is a simple one that I think illustrates my point for most people.

On a side note, becoming somewhat formally trained in programming languages, whether or not this is the field of computer science you are interested in, allows you to easily pick up powerful languages, such as Scala, which some people are apparently unable to pick up.

EDIT: Fixed a typo.


I would consider this to be a pretty fine-grained differentiator in who you want to hire, however. The formal training aspect gives you the names of things, but it doesn't make you better at reasoning in general. Given the choice between the faster, smarter thinker and the guy with the more specific domain and terminology knowledge, which would you hire?


Must I choose? May I choose the smarter, faster thinker with domain knowledge? :P

I do not think domain expertise should be marginalized, especially when the domain is deep. Someone who is smart and fast but who has only ever thought in Spring and Hibernate isn't going to be able to start collapsing strongly connected components in call graphs. There is far too much distance between a blank slate and being able to do that effectively.

And how can a person who is able to achieve mastery of a particularly difficult domain be unintelligent? Truly becoming a master at something is a rare trait. I read in _Coders at Work_ that Ken Thompson, the inventor of Unix, looks specifically for the ability to acquire mastery: He picks an item on the candidate's resume, whether or not he himself knows anything about it, and proceeds to grill the interviewee about it. He reasons that, if the candidate is unable to convincingly convey mastery to someone who is entirely uninitiated in the topic, then how can this person be an effective programmer and teammate (at Google)? On the other hand, if this very same tactic actually sets the candidate off -- that is, the candidate starts waxing romantic about the topic while deftly dispatching any technical concerns -- then not only will this person be effective, but this is someone who will be a pleasure to work with.

BTW, I believe that knowing the terminology is of only secondary importance. Its purpose is to effectively verbalize fine distinctions to teammates, and it often serves as a marker for greater sophistication. (Note that plenty of people throw terms around _incorrectly_, often very much so; I am specifically not referring to these people.)


Excellent article. And I think the parenthetical comment here cuts to the chase:

> And as much as I want to just accept it (for reasons of self-aggrandisement)

For people who identify as "great programmers", there's a huge amount of ego and status that comes with this view. Don't get me wrong, I think that computer science education in the US is pretty bad, and lots of people come out of school without having great programming skills, but there's a huge difference between the elitism of "almost everybody but me is a bozo" and the more complex reality.


The thing is, I agree with most of this post (when it comes to interviewing and assessing ability)... except I don't buy that that explains it.

Take this part for example: 'Somehow though I don't reckon there are 99 working programmers sitting there reading those posts thinking "…yeah I am a bit of a failure, I wish I was one of those 1 in a 100 dudes"'.

Well, 99 out of 100 programmers don't read those posts. That's how they get by. They never confront themselves with what they don't know, don't read tech blogs, don't visit HN, and never go to conferences, user groups, etcetera.

I've been working in this business for over 20 years, I have no illusions about my own limited abilities, but yeah, 99 out of 100 doesn't seem that unreal to me. The vast majority of programmers I worked with are utterly incompetent.

BTW, in interviews I prefer to let the candidate talk about code rather than write it. Conceptual understanding, and the ability to express it, already tells me more than enough. Only when it comes to juniors fresh out of college do I want some confirmation that they can do more than talk a good game.


I've definitely come across candidates that simply couldn't code well. They couldn't handle basic things, and they didn't have a strong grasp of the fundamentals of their preferred languages.

Then again, I've been in interviews that were so brutally hard I stormed off rather than keep dealing with the issue. No exaggeration: I've been plopped down at an unfamiliar machine with only one PuTTY window to a Unix box (that didn't have Emacs!) and told to write an ls clone in 30 minutes, without being allowed to open a web browser (and that was only one of three tasks to do in 1.5 hours).

But I also can't help but wonder if the author is in his own self-selected world. Now that I'm in startup-land, I see fewer incompetent people. Even being in Silicon Valley strongly ups your odds of meeting people who are competent. When I was interviewing outside of the Valley, things sometimes were confusingly bad.


I tend to think that if you're going to insist that someone write code without access to documentation and in an unfamiliar environment then you shouldn't ask for anything more than pseudocode. Otherwise you're just testing whether or not someone has memorized the standard library of whatever language you want.

You can argue that someone who has memorized the [Foo] standard library might be better at some things than someone who hasn't, of course. But I think that what you're testing for is at best tangential to the ability to design/write good code.


I was hoping for something more concrete. For instance, consider that terrible programmers stay in the job market for a long time, so you may encounter them more often. Also, if you're not doing your own recruitment, non-technical, keyword-seeking recruiters can be a hindrance to finding good people.

This post just seems to be emphatically reasking the question "Really, guys? Really??"


But "Really?" is actually a very good question; it's an appeal to common sense. Do we all work at places where only 1% of the programmers can actually program? I know I haven't... even when I just started and was clearly the least experienced person on the team, I could still get things done...


+1 for the concrete thing. Spolsky was using real numbers from real experience with real applicants, but this article seems to be pure speculation and anecdote. The author is saying, "Well, my personal experience doesn't match up with this so it must be inaccurate."


Citation? I'm serious - I looked but could not find an article with real numbers. The one linked in the OP http://www.joelonsoftware.com/items/2005/01/27.html is using made up numbers.

Joel actually has a pretty good theory - the same 199 incompetents plus one good programmer apply to all job openings. The good programmer gets hired and the other 199 flock to the next job opening. Occam's razor selects that explanation.


hm, actually I might be wrong. Shame on me for trusting my memory. :)

Double flame then! Someone give us some real statistics!


Recently we started interviewing 4th-5th year CS students for a C++/database programming job/internship. One of our questions is: if you have 10 computers and all are connected to each other, how many connections do you have in the network (45)? Even after drawing the n=3,4 cases on the whiteboard as graphs, most CS majors cannot answer the question. As we spend more time on these interviews, we're considering making this our first question, and if somebody cannot answer it, however long it takes, the interview is over -- let's stop wasting each other's time. Similar no-go questions that people have problems with: what's a JOIN, what does ACID actually mean, the difference between TCP and UDP.


Are you specifying that you want students with database and networking experience? If not, your expectations may be off. I did both my BS and MS in CS (both at schools you've heard of), but concentrated on theory and AI, and I'm not sure I'd get the last two questions right. TCP guarantees packet delivery and UDP doesn't, though I only happened to learn that in passing - for all I know there are other differences too. And I don't actually know what ACID stands for, though if I had to guess I'd say it describes a set of properties that a database may or may not have. But that guess, even if vaguely right, also comes from happening to have seen it in context somewhere, not from my education. Hmm...a friend of mine from grad school just started at Google and I don't think she'd be able to answer those either.


Our ad is "Database startup looking for C++ programmers". Also, here in Hungary CS majors take at least one semester of database, where they learn ACID.

Also, we're not looking for exhaustive answers; e.g. in the TCP/UDP case all we want to hear is guaranteed delivery, packet reordering, stream-oriented vs. packet-oriented, UDP is good for audio -- that's about it.

You should know we're a startup writing a distributed database (hence the n(n-1)/2 question), so virtually all code involves fairly low-level networking, and ACID is an everyday issue. This all of course is obvious from our website, which the interviewee presumably checks. The code is open-source, so they could actually look at the code to see what kind of questions to expect.


Okay, fair enough. I probably wouldn't apply for a job like that, but if I were I'd make sure to learn a bit about DBs first, if not networking.


I would have said 90, as each node is connected to 9 other nodes if they really are all connected. I wouldn't have considered each connection bidirectional, so no divide by 2. It's possible I would have said 100, because in practice it's likely that a program will also have a connection to itself. That's it for me I guess.


I know I used to get nervous during interviews, so we're really nice. It's OK if somebody says 90 or 100 or 10^9 at first, but I think it's a fair expectation that you should be able to figure it out. Btw. the formula for the number of edges of a total graph is n(n-1)/2, which here in Hungary CS majors learn in school. Also, I try to help out by reminding: you have n nodes, each has a connection to n-1 others, but you count each of them twice.


I think you mean complete graph. They are quite different:

http://mathworld.wolfram.com/TotalGraph.html http://mathworld.wolfram.com/CompleteGraph.html

Thanks, btw, I hadn't actually come across total graph before; you learn something every day.


I'm still not completely clear on how filtering on such easily looked-up, tactical information is an indicator of whether or not they will deliver software. This is like recipe stuff when cooking. I would be frustrated, because you obviously have your favorite answer, which might be less important than how they would actually solve a problem in a team environment given requirements and a schedule.


Assuming of course that each connection is bidirectional, and if you are modelling the network like a graph then you may not have bidirectional links in order to capture some asymmetric edge weighting for routing purposes :)


The historic reason I give this as an interview question is:

My first distributed networking framework (in our Keyspace product) used unidirectional connections, eg. I had two connections per node-pair (n(n-1) total). This was OK because Keyspace was meant to run on 3 nodes, and it's very easy to handle in terms of code.

However, in ScalienDB, which is a generic sharded database meant to be run on 10-100s of nodes, I wanted to get that /2 in the formula; after all, "easy to handle in terms of code" is not a good excuse for having 2x as many TCP connections on the switch! It was surprisingly error-prone to get this right. The basic problem is both sides initiating connections, and then figuring out which one to drop.
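
For the curious, one deterministic tie-break (a sketch of the general idea, not necessarily what ScalienDB actually does): when both ends have dialed each other, keep only the connection initiated by the lower node ID, so both sides agree on which duplicate to drop.

  // Sketch of a symmetric tie-break; the Connection struct and node IDs are illustrative.
  #include <cstdint>

  struct Connection {
      uint64_t localNodeID;
      uint64_t remoteNodeID;
      bool     weInitiated;   // true if the local node dialed out
  };

  // Both sides evaluate the same rule on their copy of the duplicate pair,
  // so exactly one of the two TCP connections survives.
  bool KeepOnDuplicate(const Connection& c) {
      uint64_t initiator = c.weInitiated ? c.localNodeID : c.remoteNodeID;
      uint64_t acceptor  = c.weInitiated ? c.remoteNodeID : c.localNodeID;
      return initiator < acceptor;   // keep the connection dialed by the lower-ID node
  }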


Maybe just ask the question you want to ask? Modern stacks can handle hundreds of thousands of connections these days, so this concern may be outdated. Then you could have a conversation about whether there's enough CPU, whether there's enough RAM to handle the connections, network bandwidth, how you handle replication, failover, structuring software to handle lots of connections, etc. -- all of which indicate someone may actually know something, but you'll never get to that point because of the magic formula. Often these problems are used as a proxy for something else when you could actually talk about that something else. Sometimes I think it's just because most people are covering for not knowing about that something else.


CS students that haven't taken graph theory yet probably don't know the number of links in a mesh grid. Have you considered that?

And honestly, are you telling me that nobody answered in "however long it takes", i.e. nobody just counted them on paper?


Despite my other comment, I actually think anyone smart should be able to answer the complete graph question, especially if you walk through a couple smaller examples. You don't need a class on graph theory to recognize the pattern 1+2+...+(n-1).


You're right, and that's why I asked whether nobody really "brute-forced" it; but one probably never knew about meshes, and never wondered about the # of links in them, until someone described them. My objection, if you will, is that the stimulus for such subjects comes mainly from a class on graph theory (sure, there are exceptions).


By the time the interviewee would be desperate enough to brute-force it [were they in a room by themselves], they get so embarrassed that they say they don't know and they'd rather move on.


That's a failure of the interviewer in my opinion. Means he managed to intimidate (not pressure) the guy/gal enough to back off.


You shouldn't need graph theory to simply reason your way to an answer using a bit of thought and some basic arithmetic. Anybody who looks at a problem, goes "oh, that looks like X and I don't know X", and then shuts off their brain is probably not a great hire.


1. That's what I said. Are you sure you read the whole post?

2. That's exactly what the underlying problem is: oversimplifications of interview processes, which are a complex enough human interaction as it is.


EDIT: Since a number of people asked, the ad was "Database startup looking for C++ programmers".

Btw., our pre-screening email test is to write the in-place remove_char() function which was posted here on HN a couple of months ago [10 LOC], and to write an intrusive stack [50 LOC]. People usually get them wrong but get them right the second time, once I explain what the point is.
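
(The exact spec isn't repeated here, but presumably something like "remove every occurrence of a character from a C string, in place." A minimal sketch under that assumption:)

  // Sketch, assuming "remove every occurrence of c from the C string s, in place".
  void remove_char(char* s, char c) {
      char* out = s;              // write cursor trails the read cursor
      for (; *s; ++s)
          if (*s != c)
              *out++ = *s;        // keep everything that isn't c
      *out = '\0';                // re-terminate the shortened string
  }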


This is why I don't give programming problems as questions. I've written numerous database applications and I really couldn't tell you off of the top of my head all of the parameters to make a proper database connection. When a mathematician writes a formula, it's a communication from one mathematician to another of intent. When a programmer writes code, it's primarily instructing the machine how to perform a given calculation; as such, it's an exercise in tedium and not very appropriate for person-to-person communication.

So, that's why I ask design problems. I ask candidates to draw diagrams of databases that meet a certain need. I ask them to draw flow charts and state diagrams. These are much more conducive to human communication. I need to know that they can think before they can program. I'd hire someone with no experience in Python to start a brand new project in Python if they showed they had excellent problem-solving skills, because the hows of writing Python code are just going to be one more problem for them to solve. They'll figure it out and do what they need to do.

When was the last time you knew everything about how to program something before you started on the project? If the answer is "ever", did you stick around for the end, or quit because you were bored?


This is utter bullshit strawman trolling hyperbole.

I remember reading Joel's post alluding to this back in the day, then Jeff's a couple of years ago.

Hmmm. The Raganwald post that Jeff's is based on says this at the top:

If you think that I just claimed that 199 out of 200 working programmers cannot code, stop immediately and either read my follow-up explaining the math, or read Joel’s article on why companies aren’t as picky as they think they are. Thank you.

Wait, so what did Joel's post say back in the day?

That means, in this horribly simplified universe, that the entire world could consist of 1,000,000 programmers, of whom the worst 199 keep applying for every job and never getting them, but the best 999,801 always get jobs as soon as they apply for one. So every time a job is listed the 199 losers apply, as usual, and one guy from the pool of 999,801 applies, and he gets the job, of course, because he's the best, and now, in this contrived example, every employer thinks they're getting the top 0.5% when they're actually getting the top 99.9801%.

Ok, let's see how low you can go in your quest for faux controversy:

One question immediately springs to mind: Is that x out of y applicants, or x out of y working developers? There is a massive distinction.

O RLY?

All of a sudden the headline is: X out of Y Applicants for Positions Advertised By My Company Can't Program. That's a whole lot less impressive/sensational and whole lot closer to reality.

Actually, Joel's impressive/sensational headline was: News. I think you might be projecting.


As someone who does a lot of interviewing at RethinkDB, I agree that these numbers are in many ways bullshit. Firstly, they're pretty clearly hyperbolic (I think we all knew that already). However, I think they also tend to be the result of a short memory span. Whatever the actual numbers may be (we should keep track), they are bad enough that every once in a while you get a run of ~5 interviews that really make you scratch your head. Which is about the time these posts get written.

The linked list question is designed as a smoke test. Admittedly, it's a piece of code that one can do loads of coding without ever having to write. But bear in mind that in an interview we're looking for some code indicative of one's ability in the space of an hour; code that you actually write for production takes weeks, months, or years, and we simply don't have that kind of time. Questions like this have a high signal-to-noise ratio: they're just esoteric enough that few people actually have a full implementation committed to memory, but simple enough that anyone who gets the concepts of pointers and recursion should know just what to do.

When runs of bad candidates pop up we do indeed re-inspect our process. One major improvement to the process is to ask people questions their resume suggests they should be more comfortable with: ask questions in the more esoteric languages they know, ask questions in the field they did their research on. Obviously this greatly increases the chance of correct answers, thus these types of questions have a smaller correlation between a correct answer and a good hire. But what these questions can really do well is convince you that when this person understands something, he really understands it; he gets it well enough that he can very quickly explain it to me. This is a very good sign in a potential hire.


The end of the article explains this all quite well: Programming is very easy to get into professionally. I know we all like to think of ourselves as super smart and that our knowledge is somehow hard to come by, but in reality, it can be as easy as picking up a book. You just can't do that in most fields, even "easy" ones.

As a result, you get a wide variance of commitment, and therefore talent.


It is also perceived to be easy to get into: walk into a bookstore and see all the "whatever for dummies" books. So there is a huge army of low-grade "programmers", many of whom are now unemployed. That is true across the entire planet, mind. So while there is a shortage of true talent, that won't be solved by importing the "...for dummies" crowd from overseas.


Strange article. The author deliberately misquotes a bunch of sources to get his outrageous title, then calls bullshit on it. Then he goes in and actually reads the articles he's calling out, using quotes from them to support his point, which is also the point of the articles he's calling out.

Everybody knows that this statistic refers to people applying for jobs, not working programmers. Everybody who's ever tried to hire in this industry has observed it first hand. Every article about it goes to pains to explain that it's a problem involving unemployable programmers applying for jobs they don't deserve.

I'm not sure who this author thinks he's arguing with.


Assuming that by singly linked list he means that each node only has a pointer to the next node and not to the previous one: for each node, I'd save a pointer to it and to the next node, then replace its next pointer with the previously saved pointer to the previous node, move to the next node, and do the same. Done in T(n) = n. If the list were doubly linked, you could do it in half the time by switching 1 and n, 2 and n-1, etc... Would I get the job?

EDIT: also change the list to point to the new first node.
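
Roughly, that description in code (a sketch with a hypothetical Node type):

  // Iterative reversal of a singly linked list, as described above (hypothetical Node type).
  struct Node {
      int   value;
      Node* next;
  };

  Node* reverse(Node* head) {
      Node* prev = nullptr;           // the already-reversed prefix
      while (head) {
          Node* next = head->next;    // save the rest of the list
          head->next = prev;          // point the current node backwards
          prev = head;                // grow the reversed prefix
          head = next;                // advance
      }
      return prev;                    // the new first node (the old tail)
  }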


>All of a sudden the headline is: X out of Y Applicants for Positions Advertised By My Company Can't Program

There's an important point here that he didn't make. The ones that aren't remotely qualified to even apply for the position? They applied to all the positions at every company they could find. Of course they seem to outnumber the remotely competent programmers; they sent out far more resumes, and by all the non-scientific metrics they get counted that many more times.

In particular, this line bothers me: 'How much time did you spend making sure your ad was sufficiently attractive to the star developers and a sufficient deterrent to the crappy ones?' Making the ad attractive to 'star developers' is not the issue here, making it less attractive to resume-spammers is.


I think the fizzbuzz thing is a myth reinforced by confirmation bias. No one takes to the Internet to talk about how an interview went as expected, and rarely to mention a better than expected outcome.

We're only hearing from the subset of interview processes that attracted subpar candidates or had particularly bad luck.


Agreed -- the interview process needs to be questioned too. How relevant is writing a linked list to the position at hand? Maybe it is relevant. Mostly it isn't. Canned interview processes end up in disappointment for both sides.


X% of programmers can program. But they do so in such a way that when they've added N thousands of lines of code, it starts to suck.

An easy metric for suck doesn't immediately pop into my head.



Good in the aesthetic sense. Bad in that this can be easily gamed.


You aren't something unless you do it. First, define programming. I think I program, but I might not be programming if someone's sense of the word is 100% test complete.


That might get into a distinction between programming (let's make the computer do what we tell it) and developing (let's follow these processes while we program).

But it's a good point. That said, if I were interviewing someone who was clearly a good coder but who said they didn't test their code, I would still hire them as long as they would be willing to start writing some tests.


Great post, good points (had a good laugh about the loop and arithmetic assumption ;-)) and I mostly agree -- although probably 1/3 of [academic/scientific] programmers [at least] really can't program (though that is much less than the claim that prompted this post) [see Nature, Oct 14, 2010]. However, that is definitely less than the "majority".


Interview != Quiz


I have a theory on this. Let me make an analogy to advice I gave to a naive female friend (personal theory, I'm not a psychologist).

Consider psychopaths. They are not many (as a percentage of the male population), but if you go out and meet guys socially, you will meet a much larger fraction of them. That is because they use people and have to find new social environments quite often...

I think it is the same when you interview programmers; the bad ones tend to apply for jobs much more. The interview process has a built in selection bias, so people doing interviews go crazy because most people they see are ass hats.

And about whiteboards -- there are lots of subjects where I have good experience, but not recent enough to do a whiteboard test. OTOH, one of the smartest people I studied with aced all the math courses but could never learn to program. [Edit: What I was trying to say here is that an interviewer couldn't tell the difference between the two of us by asking about, e.g., a language I haven't used in a few years.]

Disclaimer: I don't know how good I am, but I only look for work at most every few years... :-)



