I interviewed at Twilio. My resume and phone chats made it abundantly clear that my professional experience was 95% backend in languages like Java and Go.
I looked at their Glassdoor reviews, where people wrote that the process was heavily biased toward algorithm-type questions, for better or worse.
In reality I was given only a single coding interview my whole onsite, and it was to build a JavaScript SPA, including routing/linking, without the use of any framework, just vanilla JS.
I was allowed to Google whatever but 50 minutes is still a crazy time crunch to figure out a lot of the stuff that needed to be figured out.
It felt like a massively unfair way to evaluate me. I have honestly never felt more screwed over and time wasted by an interview than at Twilio and I would highly recommend avoiding interviewing there.
My overall point being: there's no silver bullet. It's not that this magical process will work well and that one will not. It's about the small details, especially things like making sure all of your interviewers actually read the candidate's résumé, that everyone knows what type of role they are interviewing for, and that the interview is designed to discover both the candidate's strengths and weaknesses. I've been on both sides of the process, and I know too often shortcuts are taken and churning the pipeline forward takes priority over doing a diligent job and respecting the time of every candidate you engage with.
The very fact that the foundation of tech recruiting is inexperienced recruiters, with minimal tech knowledge, who are incentivized by sales type numbers, and churned through themselves at a comical rate, speaks to how the industry views recruiting. College athletes get recruited, top law school graduates get recruited, engineers aren’t recruited but relentlessly spammed in hopes of finding more bodies to feed into a process straight from a Kafka short story, the problem starts there and no a-ha process improvement will fix that.
> In reality I was given only a single coding interview my whole onsite, and it was to build a JavaScript SPA, including routing/linking, without the use of any framework, just vanilla JS.
> I was allowed to Google whatever but 50 minutes is still a crazy time crunch to figure out a lot of the stuff that needed to be figured out.
This actually might not be so nuts if it were clearly a way to evaluate how you handle unfamiliar problems and they made it clear they didn't really expect you to finish the whole thing in 50 minutes, I guess.
Part of the problem is that most companies seem to be complete dogshit at communicating:
1) what they're going to quiz you over—they seem to think it must be a pop-quiz with absolutely no way for you to know what might be on it or specifically prepare for it, sourced from literally all of your past experience, all of CS, all of software engineering practices, and maybe even stuff you never claimed to know, which is plainly, to be blunt, fucking bullshit and goddamn insulting, and that's not even getting into how many of these questions/challenges rely on recalling (rarely on formulating or discovering, that's just impractical and unlikely) some very specific "ah-ha" insight to not look like a complete dumbass while trying to solve them—and,
2) why they're giving you certain challenges or asking certain lines of questions, what they're trying to evaluate, or how you'll be judged, leading to stressful, pointless guessing games like "should I risk not completing this challenge to write very complete tests, because they'd rather see good tests than a working solution, or am I just completely fucked and a getting a definite 'no' if I don't have a working solution anyway, so I should not write any tests that don't increase the likelihood of my finishing in the time allotted?"
[EDIT] "why don't you just ask questions to resolve point #2?" Yes that can work, but a lot of this is so damn vague that then you've got the meta-guessing-game where there's a real chance you'll "lose some points" if you ask valid-in-context but "wrong" questions, because any "real" developer should know the answer ("of course you must write extremely complete tests for all code ever! Ugh, this guy must suck.")
> This actually might not be so nuts if it were clearly a way to evaluate how you handle unfamiliar problems and they made it clear they didn't really expect you to finish the whole thing in 50 minutes, I guess.
This happened to me. I once interviewed at a place where they presented me with a language I had never seen. Very different concepts from your typical C/Algol-flavored languages. They went over the basic concepts, with a handy reference sheet for me, then gave me a few tasks to complete in that language.
Towards the end, after completing the tasks, I congratulated them for inventing a fictitious language in order to test candidates on a more equal footing. Cute, clever, creative, even though it's probably far removed from the real world...
Their response? Oh, this is not a fictitious language, this is what we use daily. It's our proprietary language and we're quite proud of it.
Me: ...
(I ended up getting the job and learned a lot in the domain of wheel reinvention.)
Man, this reminds me of the story about the person that gets a position at a new company and it turns out they have a ridiculous tech stack where it's a custom language. The new person comes in and adds comments which breaks everything and somehow files have like a version history which affects how they run? Anyone have the link to that? There's some individual that developed all of it and gets the new person fired for breaking everything in production (because of course it all runs from head) and was portrayed very negatively.
I don't know. I'm very comfortable in JS: vanilla, jQuery, React, and Vue. And apparently my customers think I do decent freelance work, since they keep hiring me back.
And yet, an entire SPA in vanilla JS in 50 minutes seems a lot to me.
I guess it depends on the size of the SPA, but user input + rendering + routing + whatever logic they ask for in the app is a lot of work.
I guess it depends a lot on the requirements: you could skip the routing, do a dirty innerHTML rendering, deal with one browser only for the input events, etc. Still...
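For scale, even the routing piece alone is a fair chunk of code to produce cold. A minimal sketch of a hash-based router in vanilla JS, roughly the sort of thing such an exercise might expect (the route table and render functions here are invented for illustration):

```javascript
// Minimal hash-based SPA routing, vanilla JS.
// Pure matcher: returns extracted params, or null if the path doesn't match.
function matchRoute(pattern, path) {
  const p = pattern.split("/").filter(Boolean);
  const u = path.split("/").filter(Boolean);
  if (p.length !== u.length) return null;
  const params = {};
  for (let i = 0; i < p.length; i++) {
    if (p[i].startsWith(":")) params[p[i].slice(1)] = decodeURIComponent(u[i]);
    else if (p[i] !== u[i]) return null;
  }
  return params;
}

// Route table: pattern -> render function returning an HTML string.
const routes = {
  "/": () => "<h1>Home</h1>",
  "/users/:id": ({ id }) => `<h1>User ${id}</h1>`,
};

// Resolve a path against the table; fall through to a 404 view.
function resolve(path) {
  for (const [pattern, render] of Object.entries(routes)) {
    const params = matchRoute(pattern, path);
    if (params) return render(params);
  }
  return "<h1>Not found</h1>";
}

// Browser wiring (needs a DOM, so guarded for non-browser environments):
if (typeof window !== "undefined") {
  const rerender = () => {
    document.getElementById("app").innerHTML =
      resolve(window.location.hash.slice(1) || "/");
  };
  window.addEventListener("hashchange", rerender);
  rerender();
}
```

Even this toy version ignores input handling, state, and escaping — which is the point: 50 minutes barely covers the plumbing, let alone "an app."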
Besides, I hate being rushed when I code. I never speed code IRL, the deadlines are in days/months, hours for crisis, never minutes.
I've done a similar test. I didn't finish it in the time allotted, but a few other candidates did and they hired one of them. The fact companies can do these kinds of demanding frontend tests and still hire people is what makes me think there's a glut of web developers and not a shortage like people keep saying.
I had an onsite recently where they told me we were going to be building something that had to interact with a Postgres database.
I built out the entire backend prior and took copious notes because getting stuck looking up things like “how do I initialize a postgresdb” (something I do maybe once a year, if ever) would eat up time.
I think I did pretty well at that onsite because of the fact that I prepped before hand and they didn’t seem to mind that.
It’s weird to me because, in general, most engineers are rarely if ever creating an entire stack from scratch, and it can be very easy to want to do such a thing methodically and with best practices in mind.
How would I enable hot code reloading? How do I set up Postgres so I’m not just connecting as superuser, which is obviously a security risk? Etc., etc.
Being able to do these tasks under pressure with an engineer’s mindset is tough: you have to really drill down, focus, and stress that in a real setting you wouldn’t be connecting as superuser, but for the sake of time in this particular toy exercise you will be.
Being able to prep was essential, and even so, I still stumbled here and there with unfamiliar syntax.
There's a glut of resumes that have something web dev on them, and a scarcity of reliable ways to filter for the candidates who legitimately have the skillset(s) needed for the job.
I've done a lot of these tests and interview projects, passed plenty, didn't pass a few, didn't bother finishing a few, but - weirdly - every place I've been hired didn't use them in the first place.
I've started to view them as a sign of a company you probably don't want to work for.
> I've done a lot of these tests and interview projects, passed plenty, didn't pass a few, didn't bother finishing a few, but - weirdly - every place I've been hired didn't use them in the first place.
Similar here. Strangely, the places that don't do them pay about the same as the ones who do (in the same market—not talking FAANG versus Bob's Printer Service & PC Repair in Madison, WI).
After my latest search my new policy for future searches (at least until the downturn or it otherwise stops being super easy to find jobs) is at least not to do any kind of project or evaluation before a real interview, that is, without a real person from the company taking the same time I am, at the same time. The kind where they send you to some "coding challenge" website or give you some "take-home" project before even talking to you. If they're asking me to burn a bunch of my time to save some of theirs, it means I'm too far down the slush pile and/or they're too bad at interviewing for it to be worth my time.
We use a third party for the initial code screen because we feel like it’s more useful and more respectful to the candidates. We worked with them to pick questions that were representative of the type of “real” coding someone might do at our company, i.e. not algorithms or brain teasers. The company also includes a section assessing how the candidate communicates about bugs they’ve investigated. The screen allows 75 minutes and can be done any time the candidate wants, without someone sitting over their shoulder. When it’s done, the candidate gets the same feedback that we do about their performance. The feedback we get includes the candidate’s solution, a general “score,” and a series of bullets of things the candidate did and did not do well.
Compare this to what we did previously, which was to get one of our engineers on the phone and on a pair coding website to ask brain teasers. The candidate got no followup feedback if they didn’t do well. On top of that, because the people doing the interviews knew that if they said “no hire” it would stop the process in its tracks and the person would never get an on-site, we had something like a 95% pass rate. We evaluated several third parties for this, and felt the one we chose was the most useful and the most respectful of candidates’ time.
I understand that the process isn’t for everyone (no process is), but we’ve gotten generally good feedback from applicants so far about the company we chose (Woven, for the record).
Edit: also just noting that we don’t do any sort of hard cutoff for the scores they send us. We try to evaluate each candidate’s performance in light of their experience, looking at what they were and weren’t able to get done, etc. We had someone recently do amazingly on the first half but not even finish the second: presumably they ran out of time, but we brought them on anyway because we felt like their performance on the first half earned them some further consideration, even though their overall score was quite low.
I figure if my initial contact & a work history that fits very closely with what the posting company's asking for (as it usually does when I personally apply somewhere, rather than letting recruiters bring me stuff) wasn't interesting enough to get someone on the phone and instead gets me directed to a screener robot, they're just Not That Into Me. If/when (next downturn, probably, when the number of applicants vastly outstrips the available jobs) everyone's doing it I'll just have to live with it, but for now, not worth an hour plus for a lottery ticket to maybe talk to someone. Others are excited to talk to me on seeing what I've done and what I might do for them. No need to buy lottery tickets with hours of my time.
Plus, I mean, I'm not as good at coding challenges as I am at actual development work (which is a lot of stuff, some of which is tappy-tappying code into an editor) so if that's the first gate I have to pass before anything else, that lottery ticket I'm buying with my hour typing at a robot ain't likely to pay out anyway, even if I'm actually the best candidate for the job itself, which isn't guaranteed. Luckily there are so many fish in the sea (job offers paying roughly the same) that it's not an issue for now. Hopefully I've fully "leveled up" into titles for which coding challenges aren't the norm in hiring before the next downturn, otherwise I guess I'll be spending some time drilling to preen those peacock feathers, which is what I'd have had to do this time if it weren't such a seller's market for devs.
You might have seen the same thing, but any title that sounds managerial in some way - even if in the day to day work, it's completely meaningless - goes a long way toward jumping the line and convincing non-technical people involved in the process that you're a good bet.
Some indication that you have the soft skills and social skills to get along well with non-technical staff helps short circuit the whole thing, as well as convince the type of technical staff that put too much emphasis on coding challenge questions, etc., that they need to cool it and consider the whole candidate, not just a score on a quiz.
Yeah, my curse and blessing is that I'm (apparently—I'm just going off feedback and where I've observed myself contributing strongly to projects) really good at parts of software development that don't really come through in coding challenges, even most of the ones that try to be like "real work", and that are also kinda hard to talk about directly without seeming like a bit of a prick. Good taste, good intuition, broad knowledge, a tendency to think in and about whole systems, and (frankly) sufficient general intelligence that connections among all those things just pop into my head. I also seem to be above-average at explaining concepts to others, at writing generally, and other developers on my teams are usually really happy with library interfaces, APIs, and similar things I design & build (see again: good taste).
Meanwhile, during the act of writing code, I crib off existing code to remember how the hell method invocation looks in the current language I'm writing and similar important details. If I haven't touched a language in 3 months I very likely can't FizzBuzz without syntax errors or using the wrong function somewhere or whatever. I put Go on my résumé because I have written quite a bit of it, and can talk to someone fluently about the parts that are likely to seem odd to a newcomer and some of its strengths and weaknesses, but without some cramming ahead of time or a cheatsheet I probably can't demonstrate a bit of that in code. I'll likely mess up keyword order and all kinds of basic things just trying to "hello world".
In short: I've realized, later than I probably should have, I need to get the fuck out of development per se and into something that's still basically developing software since I'm actually pretty good at that, but doesn't judge me on whether I can, in the moment, recall what a print statement looks like in a language I was writing literally yesterday (it's entirely possible I'll blank on stuff like that!). In my defense I've been easily landing jobs doing this stuff since I was 15 so my blinders were, I feel, well justified—it'd just not occurred to me previously that I might shine better, at least in interviews, which is kinda important, by seeking roles a bit adjacent to development rather than in the thick of it, until I got mumble mumble years in the field and started to think about what I'll be doing when I'm, you know, not even sort of young anymore.
[EDIT] and yeah, of course I could put together Anki decks and do daily drills to get better at the parts I'm bad at and it'd certainly help, but another part of this series of mid-career revelations I've had is that if you find yourself wondering why other people are having trouble with something that's easy to you, that's what you should put your effort into, and conversely, if you can find some way to ignore the parts you find harder than most people seem to (not always possible and sometimes you just have to work on things you suck at, sure, talking about ideally) then you should do so, to free up more time for the easy-to-you-but-not-others stuff.
> "to evaluate how you handle unfamiliar problems"
The only problem here is figuring out why anyone would work for a company with these practices. Unless I am missing something and it is common practice to build a product without any thinking, research, or design.
Almost no companies come out and say "we will ask you some dumb bullshit in the interview and not tell you what we're evaluating or how we're evaluating it," and yet a huge percentage of the industry interviews like this, at least to some degree. The best I've ever gotten when asking "is there anything I should prep for or expect for the on-site, and how's the interview going to run?" is "bring a laptop, here's a schedule of who you'll talk to and what their titles are, oh, and consider dressing [some way]".
I think next time I'm just going to ask them if they merge a lot of PRs that look like what they're asking me to write, and if they say yes, thank them for their time and leave.
I think the answer is often 'no' and they've just never thought about it that way. My favorite interview question has a credible business story I can give for why we're doing it this way.
The goal is to see how you approach thinking about it and conducting your initial research. The solution is irrelevant, and falling back to "well we won't do that at work!!" is missing the point.
If you're the type of person who freaks out when we discuss the possibility of using a new tech you're unfamiliar with, I don't want to work with you. There's a certain level of pressure in saying "I know we've never worked with $TECH before, but we're going to use it for $IMPORTANT_PROJECT_WITH_A_DEADLINE" and if 50 minutes of fiddling around with JavaScript is enough to psych you out, you're probably not going to handle the new tech well either when you barely even know what to search for when looking for tutorials.
From my perspective, a big part of the problem is that companies don't let you know they'll be doing something like this (a simple "the 1:00 section of the interview will involve giving you a task using some tech you're not familiar with"), so it's a surprise (interviews are already stressful in ways that real work pretty much never is, "surprises" are deeply unhelpful for evaluating candidates outside very rare and particular kinds of work), and then also often forget to provide extremely useful information like "we know this isn't a familiar language or environment for you—we specifically want to see how you work in something you're not familiar with, so don't stress out if you aren't close to a complete solution by the end, we don't really expect you to be."
So what would you like to be interviewed on? If it's not based on your CV/experience, general CS, best practices, known or unknown technologies, or general problem-solving questions...
People ask all of these for good reasons. You can write literally anything on the CV, so people ask about your experience to check you're not lying (you'd be surprised how many people do lie and at least misrepresent). People ask coding challenges because they expect you to code (and we've all heard horror stories about like the head of IT security not actually knowing how computers work). They ask you general problem-solving questions to test your intelligence, mental models, and behavior under stress/adversity (again, you'd be surprised how many people just suck here, or worse, get insulting etc. ALSO - note that I'm not saying that all such questions are good, they're mostly bad, but with a little bit of thought and planning you can design a good question). People test for basic CS knowledge because it's important in almost all situations (would you hire a brain surgeon that didn't know what a spleen was?).
(Not discounting all the actually terrible interviewing practices, e.g. people asking impossible questions to feel good about how smart they are, etc.)
I don't think I've directly ruled out much in the way of interview questions in my comments on this thread; I basically just think that not telling people what sorts of questions to expect and in which sections of the day, and not making it very clear what you're evaluating and why you're asking the questions, is harmful and inhumane.
A simple "this part of the day will be algo questions, we do expect actual solutions for at least some of the questions, but we're happy to talk through your process with you as you solve them, and none will be more involved/complex/deep-lore than [a couple examples]" would go a long way. For one thing it'd let people just "nope" out of interviews they know they aren't prepped for ("well they should just know they can't do the job in the first place and not apply" that'd work if interview practices/questions and actual work-on-the-job—let alone what's asked for in job postings—mapped even somewhat closely to one another, across the industry, but they very much do not).
I don't think mystery-interviews and having no clue whatsoever what might be covered in the oft-featuring pop-quizzes probably does much for improving the quality of passing candidates, but it does waste a bunch of time (for everyone) and make the whole thing way more stressful.
[EDIT] tangentially related, I find it really weird that most every company says they want people who can learn new things and get shit done more than ones who just know lots of stuff, but act like if they provide enough information that a candidate might conceivably be able to even kind of study or prep for their interview process, that'll ruin it somehow. WTF? If the candidate is not capable of doing much with red-black trees in an off-the-cuff kind of situation because they're rusty or whatever, but then you tell them a week out that you'll be asking some questions about tree and graph structures more complex than simple binary trees, so they brush up over the weekend then ace the interview including some red-black tree stuff which they couldn't have done under those conditions a week earlier, isn't that precisely what you claimed you wanted in a candidate? How is that harmful?
The company has a given technology stack and, rather than asking for anything which would directly benefit the company, asks you to do some task that might be similar to their tech stack, but instead for the public domain / some open source project.
Programming: We've picked 10 problem tickets from various open source projects that are similar to things that might happen at work, show us what you can do in an hour.
DevOps: There's a local charity that needs a small version of infrastructure similar to what we have in house. Here's some old hardware we're donating and the things we'd like to see on it.
Whenever I've worked with a new $TECH for an $IMPORTANT_PROJECT_WITH_A_DEADLINE the timeframe for getting up to speed with the new tech has been days, not minutes.
I was on a python/javascript team that transitioned to kotlin -- everything went well, better than I would have expected tbh given only 1 of us even had any production java experience, but 50 minutes would not have been enough time to figure out how to usefully set up the IDE, let alone build anything beyond FizzBuzz.
There's a big difference between spending 50 minutes fiddling with that stuff when you're employed and evaluating it as part of the research phase for a project vs. in an interview with a gun to your head.
We only got half of the story here, but if the workplace has a history of switching technology and trying new things often, interviewing a specialist was still their mistake, not the specialist's, and that wastes everyone's time.
> This actually might not be so nuts if it were clearly a way to evaluate how you handle unfamiliar problems
Why not seek applicants who are familiar with the problems they’ll be regularly encountering for the position you’re trying to fill? And as an employee I’d rather interview for positions that will require me to solve problems I’m familiar with.
Because even if you're familiar with all the tech we use, you're not going to be familiar with the business domain, our cobbled-together solution on top of the tech stack, and the tens or hundreds of thousands of lines of code we nurtured over the past N years.
How you handle unfamiliar stuff is critical when evaluating an engineer. That's like the whole job.
Unless your goal is to churn out copycat CRUD apps and marketing pages your whole life. Then carry on :)
Everyone is looking for their flavour of 'full stack' because, in the minds of middle managers, human resources need to be fluid across organisational projects.
Sorry to hear about your experience. I'm at Twilio and we recently started a hiring guild to tackle this exact problem. Most of the members of the guild on the technical side are there specifically to address bad interview experiences like yours. We are focusing on improving process, questions, and candidate experience. We are moving the technical portion towards work-sample-like questions and ensuring the interview focuses on what you bring to the table as opposed to someone's pet question in their favorite stack.
> I interviewed at Twilio. My resume and phone chats made it abundantly clear that my professional experience was 95% backend in languages like Java and Go.
> I was given only a single coding interview my whole onsite, and it was to build a JavaScript SPA, including routing/linking, without the use of any framework, just vanilla JS.
This is of course both completely ludicrous, and incredibly disrespectful. Thanks for pointing this out so that we know to add that company (Twilio) to our list of companies to pass on, the next time they come our way.
Do you really take random comments on the internet into consideration when you're looking for your next gig? If you do, in my experience, every company is off the table.
What else am I supposed to take into consideration? The companies marketing material? Maybe the signal in those comments is a bit biased, but it seems far less biased than the alternatives.
At this point, unless I talk to someone I trust within a company, I try to reserve all judgement. In fact, I do that with everything nowadays. The internet and modern media skews so negative, according to them nobody is happy anywhere and everything sucks. That hasn't been my experience.
The last two times I went through an interview cycle (one on each side) it struck me how dependent your success was on your chosen tools and whether the coding question dovetailed with those choices or not.
For instance if I'm working on a hard problem I usually need tests (whether I initially admit that to myself or not), and despite the fact that I'm often the one who sets up and/or defines our testing strategies, every time I set things up is an adventure, because I set it up once and run that way for years. I have zero muscle memory for the installation process and anyway the decisions might be a little different in 3 years. And more to the point, I want people to copy what I've done, so I do the same thing (which gets me an experience of what others are having to put up with/enjoying about my solution).
For the app server the situation isn't much better.
In an interview taking the time to set any of that up would be crippling, even though I'd come out even on a 2-3 day story and ahead on anything longer.
I have thought many times about setting up a blank application with testing, logging, and production-reasonable overrides for all the defaults for the app server, etc. Just so I have an easy starting point for quick prototyping, which is essentially what an interview often is.
Generators probably work for an interview, but for real projects, re-applying the generator with each release is kind of a bear. I propose that it would be much easier (possibly trivial) to do on an empty shell project and then merge downstream.
I kind of think one of the things we are missing about DVCS is that the ability to maintain permanent forks gives us other options for arranging cross-cutting concerns. Maybe one more generation of merge tooling is necessary for that to be A Thing.
Most interviews seem to weigh heavily in favor of "sprinters" over "marathoners". That is, they select for people who can go from nothing to working code in a short period of time.
I am one of those people who takes a while to get rolling, especially when starting from scratch. Interviews generally give you 30 minutes or an hour to produce something. I will easily take an hour to think about the problem, do some research, and then create some notes before writing any code.
This is, of course, not what most companies want. Unless you are asking me to do something trivial or something that I have claimed to do many times in the past, you can't expect me to come up with the "right" answer instantaneously.
I never jump right into editing code, unless I am already intimately familiar with it.
> Most interviews seem to weigh heavily in favor of "sprinters" over "marathoners". That is, they select for people who can go from nothing to working code in a short period of time.
This is a great way of putting it. As a "marathoner", I think the only way to pass these things is to condition yourself for the sprint. Which means being familiar with the types of questions that might be asked and practicing coding solutions until you can code them off the top of your head (and even on a whiteboard if necessary!). The investment required to achieve this is ridiculous, and there's no guarantee you'll be asked a familiar question.
It's absurd because in some software engineering jobs, a marathoner might be preferable to a sprinter but they'll most likely hire the sprinter.
Hiring remains an unsolved problem. The company who can truly solve the hiring problem will be a unicorn.
Exactly my feeling. I've been pondering a couple of recent negative interview experiences and felt this was my main issue. For any normal issue/work/feature/whatever, I'll have a few hours/days to ponder it and think of possible approaches and any issues that I might encounter down the line.
When put on the spot with a time crisis I tend to go too deep down a rabbit hole before realising the shortcoming of the approach I chose to take.
This leads to me usually doing great in take-homes, compared to these recent in-person assessments. It's also confounded by having to work on an unfamiliar machine without any of my usual tooling, which I'm not a fan of, especially as a keen fan of having a decent debugger set up.
> It felt like a massively unfair way to evaluate me.
My advice to someone in a similar situation is to start talking and ask about what we want to really achieve here. Not what the task description literally says (SPA or whatever - that's "how" but not "what for"), but what's the real business purpose for it. For the interview purposes - what do they want to see during or after those 50 minutes.
Then make a guess as to whether this objective can be realistically delivered within the provided constraints, and communicate how you feel about it.
If they provide their real expectations (e.g. "we want to see how you tackle a problem outside of your immediate expertise domain, we recognize the harsh time constraints and it is okay if you don't complete it, although we'd appreciate you trying your best"), it's all fair game. If they don't, and just insist they want this specific JS SPA done in 50 minutes, well, that says something about their project management habits. In that case, I'd start asking questions about the role's responsibilities.
I had a 2-hour coding project in a language I'm familiar with - entirely based around a submodule that I'd never used before.
Maybe 2 hours should have been enough? I don't know - with the stress of writing under time pressure (I had a new sympathy for contestants in cooking shows) and complete unfamiliarity with the module, I was pretty impressed that I submitted something that passed the unit tests.
Then I got rejected with the implication that I wasn't using good OOP principles. OK, my decision to store JSON data as a byte array was unorthodox - but it worked without a lot of coding overhead. I thought it was pretty clever frankly.
I'd be more impressed if they had a mock code review as a discussion afterwards where you could explain your decisions and they could question/challenge parts of it. Not only would that give you time to settle, it'd give both parties some real insights into expectations, skills and work practices.
Something like: "Why did you store it as a byte array?" and then "How can you reconcile it with OOP principles?" as follow up questions would be far more productive.
"Which principles do you mean? If I write an object then it only matters to the caller what the API is, not the underlying storage, right?" and before you know it you're in an interesting conversation where you both have the opportunity to learn something.
It'd probably be quite enjoyable, regardless of whether you got the job in the end.
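For what it's worth, the byte-array argument above can be sketched in a few lines. This is a hypothetical example (the thread doesn't say what language or class names the original project used): the caller only sees `get()`, so whether the JSON lives as a parsed object or as raw bytes is an internal detail.

```javascript
// Hypothetical sketch of the encapsulation argument above: the caller
// only interacts with get(), so the unorthodox choice of storing the
// JSON as a byte array never leaks through the API.
class Config {
  constructor(json) {
    // Internal detail: keep the JSON as raw bytes.
    this.bytes = new TextEncoder().encode(json);
  }
  get(key) {
    // Decode and parse on demand; callers never see the bytes.
    return JSON.parse(new TextDecoder().decode(this.bytes))[key];
  }
}
```

Swapping the storage for a plain parsed object later wouldn't change a single call site - which is roughly the point being made in the imagined code-review conversation.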
There are three types of developers in this scenario. Those who "do it right from the start" - they are slow. Those who do the quick and dirty, and create an issue in the issue tracker about what to change and why. Then there are those who do the quick and dirty with a mental note about what to change, and never return to it.
If you have advertised a job and I've given you my resume, it is to give you some information you can use to avoid wasting your time and my time.
Just like I would hope your advertisement is accurate and useful enough to let me decide if it is a good use of time for me to send you a resume (e.g., saw an ad for "react developer" -- got to the interview and they were looking for a "server side java dev" -- I mean -- wtf?).
Not reading a resume before an interview feels... rude. It feels like the path that leads to you asking a Java/Go programmer to do a Javascript coding challenge in 50 minutes, which is a waste of everyone's time.
It is rude. People put lots of effort into resumes. But it's ruder to reject a candidate based on age, gender, marital status, education or lack thereof, or for having done non-software jobs in the past.
I always talk on the phone before and that's where we both make sure the interview is of relevance.
If you can't read someone's resume without rejecting them based on age, gender, marital status, education or lack thereof, or for their non-sw work history, then it sounds like you're not a very good person to be interviewing candidates. The concept of "unconscious bias" might be true in an academic sense, but in a very real way you should be treating people with respect regardless of their characteristics, especially during an interview process. Not reading someone's resume isn't just rude, it's supremely disrespectful.
I take a completely different approach, because I have been on the receiving end of disrespectful interviews and I won't stand for it and I don't expect the candidates I interview to accept it either. I carefully read the resume and research the candidate far in advance to the interview. I only ask questions during the interview which are open-ended, not trivia questions, and are directly related to either the content of the candidate's resume or things we're actually doing day-to-day on my team.
The dog and pony show interview style is intensely disrespectful and so is the idea that you won't even read a candidate's resume. I've walked out of interviews where both have happened, and I hope everyone on HN gets the personal confidence to do the same. I'm a professional, I expect to be treated like a professional, and I return that by treating those I interview like professionals. End of story.
>it sounds like you're not a very good person to be interviewing candidates
We agree on that one, but the alternatives are worse. Would you like to take my place interviewing? It's a time consuming task I'd love to delegate to someone more experienced.
I would honestly love to take a job that was 100% focused on interviewing and hiring technical people. I have been through so many bad interviews (on both sides of the desk) that I think it really should be a dedicated position staffed by people who actually understand the major defects in the industry.
You can always apply to Triplebyte and get exactly such a position :)
<disclaimer: my affiliation with Triplebyte is that they've rejected my application twice - possibly cementing the parent's suggestion that I'm underqualified to interview developers>
I hate interviewing (on both sides of the table), but I’d much rather do it than have someone else do it because I hope I can make it marginally less harrowing for the candidates.
I guess I’m a terrible interviewer, but so far the candidates we hired have worked out.
There's a difference between being cognizant of unconscious bias and being disrespectful to your candidate. I am cognizant of unconscious bias, so cognizant that I know it plays a much smaller role in biased outcomes than conscious intended bias. I read candidate resumes and I treat candidates with respect.
The parent is disrespectful to the candidates they interview and I am being charitable in taking their explanation about bias at face value.
Not reading resumes can be part of a process to efficiently discriminate by age, because it's precisely older people who have an interest in you reading a resume, because it's they who have important stuff there.
Yeah, age I can understand. People usually list their college degree along with the date of graduation, so you can usually make a good guess about their age.
But, I have no idea how you gather gender or marital status from a resume.
Most names are gendered. Can I tell 100% that Jane is female from the name? Of course not, but then I can’t tell with certainty that someone who graduated in 2011 is ~30 either.
It’s fairly common in my experience for CVs from European candidates to have marital status/other family information (and often a photo!) on them. I don’t know why and I just ignore it.
Very true. I suppose I worded my earlier comment poorly. You can generally infer age and gender from one's name and education dates (of course, not always). But the point I was trying to make is that this would not be explicitly on there, particularly marital status. But it looks like even there I'm wrong.
Yeah. It was super weird the first time I saw it on >80% of resumes I was screening for a director position in Europe. I asked my HR contact over there and she didn’t seem to think it was unusual at all.
> It feels like the path that leads to you asking a Java/Go programmer to do a Javascript coding challenge in 50 minutes, which is a waste of everyone's time.
Depends, we always give someone a super small coding challenge (literally write 4 lines if you know how to, and the longest I’ve seen is something like 15), and any decent Java/Go dev could do that even if they had to learn JS on the spot (probably not necessary, but...).
But yeah, I agree with your main point. Not reading a resume is rude.
You're being downvoted but I've had the same experience as well. They bias interviews that are supposed to be performance based.
Resumes are a carefully crafted signal paper targeted at getting past an algorithm, then a recruiter, and into an interview. They're chock full of brand names (both education and company) and keywords. There's also unconscious bias based on the person's last name, locations they've lived in the past, and so on.
I've found the best thing to do is skim their past job roles, to set expectations for where you can start with the level of difficulty in questions.
What? Unbiased mind after reading a resume? I'm expecting you to read my resume when I apply for a job. I hate it when recruiters call and ask how many years I've done X. It's right there on the resume. If you don't read it, you're wasting my time, as well as your own, if you're doing hiring.
Less information means more bias. You may not have the bias that comes from misconceptions about the information on the resume, but now you have the bias that comes from assuming a "typical" resume whatever that is in your context.
Great anecdote, thanks for sharing. I myself have some stories about poor tech hiring all over the place.
As a counterpoint/to play devil's advocate - it doesn't actually seem like these companies are any worse off for this, are they? In other words, it doesn't seem like poor interviewing practices (Google is offender #1) are negatively affecting these companies.
I had an interview like that. I’m a marketer and my background is clearly B2C, and a hiring manager reached out about a B2B role. We chatted, had a fine conversation, then I had an hour “test” on lead scoring. I was passed on, and told it was because I had no prior experience with B2B lead scoring... duh?
> In reality I was given only a single coding interview my whole onsite, and it was to build a JavaScript SPA, including routing/linking, without the use of any framework, just vanilla JS.
OK but that's like 4 lines of code:
Line 1: Get the URL of the current page
Line 2: Get the path part of the URL
Line 3: Using the path as a key, retrieve some chunk of text from an object
Line 4: Write that chunk of text to the DOM
I wouldn't know the syntax for any of that without googling, but that doesn't seem excessively crazy even as a question for someone who doesn't know the language.
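The four steps above might look roughly like this minimal hash-based sketch (the route table and function names here are illustrative, not from the interview):

```javascript
// Minimal vanilla-JS "router" sketch along the lines described above.
// The routes and names are illustrative assumptions.
const routes = {
  '/': 'Home page',
  '/about': 'About page',
};

// Steps 2-3: use the path as a key to retrieve a chunk of text,
// with a fallback for unknown paths.
function render(path) {
  return routes[path] ?? 'Not found';
}

// Steps 1 and 4 need a browser: read the URL, write to the DOM.
// Guarded so the pure lookup above also runs outside the DOM.
if (typeof document !== 'undefined') {
  const update = () => {
    document.body.textContent = render(location.hash.slice(1) || '/');
  };
  window.addEventListener('hashchange', update);
  update();
}
```

Of course, "including routing/linking" in a real interview probably implies more than this (history handling, link interception), which is where the 50 minutes evaporate.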
Yes they are? Gift baskets, personal emails/calls from the CEO, offers of travel just to meet the team (not an interview), offers to fly out and meet with them. The whole nine yards. The number of people who get this treatment is small compared to the number of engineers, but it's not zero.
> was allowed to Google whatever but 50 minutes is still a crazy time crunch to figure out a lot of the stuff that needed to be figured out.
The point of such interviews is to see how you handle a situation with limited time and - as in this example - tech and paradigms that are mostly new to you. Usually you are not actually expected to produce a finished result.
Seems unnecessarily cruel and divorced from reality, even if that is the point. In real software jobs, you don't have to learn a new language, framework, and especially a new paradigm in 50 minutes. Those are things that happen over weeks, months, and years. I don't know of any useful skill that adding even more stress to an interview will surface.
An engineering mindset and systems thinking—or simple awareness of things like experimental design principles—are badly under-applied to software hiring processes, IMO. "What are we trying to select for with this section of the interview? Does this accomplish that, or does it do something else, or does it accomplish that but also do something else? Are we being any less humane and supportive in this section than necessary to evaluate what we have decided we need to evaluate? If not, how can we fix that?"
What exactly would you be hoping to see the candidate do in such an interview? I’m moderately experienced on the front end (mostly jQuery and React, a bit of AngularJS) but if the task is making a SPA in vanilla JS I’d barely know where to begin. It’s pretty likely that I’d do hours of reading and researching before writing a single line of code. It would be one thing to talk about how you’d approach the task, but the GP specified their interview was apparently expecting workable code to be produced during the interview.
The single best interview experience I've seen was given by Oracle when I worked there a few years ago.
The onsite was a panel interview where candidates were given a laptop with a configured IDE and a codebase with a specific set of bugs. Their task was to debug the various issues and complete as many as they could, with the understanding that they weren't expected to solve them all or even most of them. There were also tests to help them. And a different section of the interview asked them to write some tests and some code for, again, brain-dead-easy algorithms like reversing a string.
I thought it was great because it closely resembled what being a dev is like and actually tested for the skills we cared about. No, you didn't get to use your own custom vim configuration, but I'm not sure how that would have worked even if they'd wanted it to.
This reminds me of the story of the young intern that went around a few years ago. They sheepishly admitted to their boss at the end of their summer internship that they had just googled all the Excel things that were required, that they hadn't actually known any of it.
"But your generation KNOWS to look for help, and knows where to find it! That's where you were doing it right," was the boss's response. And that's heartening.
One of the biggest differences between school and business is that in business no one really gives a shit how you got something done, within reason. You just knew it? Cool. You had to do the whole thing with the manual open on the desk next to you? Is it done? Fine, don't care, that's great, not even really sure why you told me that. You enlisted someone else to help? Is it done? Good, want to be a team lead?
Of course there are constraints imposed all over, but fundamentally how you work is way less important and much more open-ended than all of school.
Heard of a guy who was interviewing for a position and had an in person section where he had to use software he had never even heard of.
Guy googled enough to muddle through and got hired. Management explicitly said that he was the only person to look up information about the software, and that was why they hired him.
This is such a valuable skill. For our work we've found this ability to google or find answers without incredibly specific step by step instructions to be lacking from many younger workers. I've wondered if others have found this for many smart people under 30 as well?
I really believe there are too many important things to know at this point, and so I instead concentrate on memorizing the bounds of the Realm of the Possible. I know what my tools can do, I know where the pitfalls are, but I don't always recall the exact details of those boundaries. I often have to look up why, and I'm perfectly okay with that.
Where that doesn't work well is in interviews, and in places where people think it's perfectly reasonable to have technical discussion meetings where laptops are banned. We're really going to make architectural decisions without looking at any code? Who thought that was a good idea?
I'm old enough to be covered by age discrimination laws, and I think I'm better than a lot of people at finding things. I believe a key reason is that Google deliberately - and perhaps many other kinds of search accidentally - tries to turn every search into a few results from a single cluster. Often the only way to find a different cluster is by already knowing it exists, or by recognizing that the cluster you got is garbage, so you can modify your search terms or read additional pages of results.
What experience provides is not knowing everything about everything, but the meta-knowledge whether there is an answer out there, and whether you've found it yet. Inexperience results in stopping too soon in your search, either giving up or settling for something flawed.
Really, who can expect people to solve the halting problem fresh out of college?
At my previous company, the technical interview was 2 hours long. The candidate was given a choice of about 10 very small, simple projects. Then we sat at a pairing station and paired on the problem. The candidate can use google, their favorite editor, write tests, anything they want. The goal is not to finish the project, but just code with them for 2 hours. It's the best technical assessment I've yet seen in an interview setting.
One caveat is this company really believed in pair programming, and virtually all code was created in pairs. So that's part of why this interview approach worked.
I did an interview once at a company that is religious about pair programming. I found the exercise unsettling: would they evaluate technical ability or culture fit more? If they were focusing on the former, I should take the wheel and try to demonstrate my chops (perhaps making me appear arrogant); if the latter, I should try to demonstrate how I can collaborate with the other developer to make both of us better (perhaps at the expense of me not demonstrating as many skills within that brief window of time).
It turns out that they were really focusing on how I interacted with the fake "client" and whether I had the proper "consultant mindset". Didn't see that coming (perhaps that's my fault). Though I feel I ended up dodging a bullet in not getting an offer from them.
My suggestion is to try to give candidates specific times to demonstrate various competencies (technical, colleague interaction, customer-facing), and tell them which ones are the focus of each exercise. Yes, that's not how the real world works (you need to utilize them all at the same time, of course), but an interview setting is hardly the real world. And you'll still see glimpses of their overall capabilities in each stage.
Chiming in to agree with you - where I work there are different coding competencies that are assigned per interviewer (e.g. good understanding of data structures and algorithms, good problem solving when dealing with ambiguity, knows how to write code that's readable/maintainable/extensible for future use cases without overengineering, etc).
I explicitly tell the candidate what I'm looking for at the very start of the technical portion of the interview. I'll say something like "this is a difficult problem and you might not finish, but that's intended since I want to see your problem solving skills", or "this problem is meant to be a little simpler so don't overthink it, I'm focusing on how you structure your code to be readable and easily modifiable if we want to change behavior or add a feature". I suspect that knowing exactly what I'm evaluating, instead of thinking they need to excel at everything, takes a bit of pressure off and allows the candidate to perform better.
That's a really good point. I honestly don't remember if we gave the candidate a heads up in that regard. In general interviewing is extremely imprecise. It's an unsolved problem unfortunately, but I feel most companies "fix" the problem by just over-interviewing and being ok with false negatives.
> would they evaluate technical ability or culture fit more?
Why not ask them that? "What are the roles in this roleplaying game? Am I a driver and you are evaluating me? Are you a client? Are we colleagues?" Or even more directly what you asked: "Are you evaluating technical ability or culture fit more? "
I don't think there would be a point in keeping that a secret.
Yeah, from a candidate perspective that's a good idea. From an interviewer perspective, though, it's important to remember that many very good candidates will feel uncomfortable doing so.
I don't know if it should go without saying, but don't have two halves of your interview process run by different people/groups with decoupled ideas of what to look for.
Like, if you have a take-home test, but you don't trust the results because people can cheat, then you're going to bring in people who are unsuited for a programming test in-person. Besides any cheaters, I mean.
Some of the tests in my current company work like this. We don't expect you to have everything memorized. We give you a problem and then say, "feel free to use any reference you want, help, man pages, google, stack overflow, whatever."
Some people still stubbornly refuse to look for help.
Some people act on the first google search result even though it's obviously wrong.
Some people don't know how to read a man page.
I find it very illuminating, but I don't really have enough data points to say how successful it is.
> Some people still stubbornly refuse to look for help.
The thing is that some people might feel that they will get assessed lower if they look for help rather than come up or even luck into a solution.
Despite making it abundantly clear to candidates that they should treat the test(s) as pair programming sessions, we had candidates struggle with some basic stuff and not ask questions. They just kept trying things while getting more nervous.
A lot of people would feel that it might be held against them. The thing is a lot of people under pressure will forget the basics
One thing I’ve had help melt the pressure on both sides of the table is a laugh. I’m fortunate enough to be able to sort of slice through tension and get a rise out of most people.
Stressing exactly what you’re looking for, that you’re not perfect either, that there’s no one solution, and letting them know things can be collaborative has been helpful too.
Let’s ping pong off each other and see where we get, I mean that’s what we’d be doing in the day to day right?
To this end I’ve sat down and added new questions to the interview guides at my previous position because I was concerned that any person asking the same question over and over again would really start to be biased towards some sort of “ideal” solution.
But there’s really nothing like a good laugh to put people at ease.
After one interview I mentioned that I felt I did a good job of interviewing because I could usually help relax the tensions which generally gets the best out of people and someone said that they didn’t think they’d ever gotten a laugh out of someone during an interview.
Thank you for doing this!
I'm doing several processes now and it's so unpleasant and stressful just before the interviews, that taking some weight off of it is really helpful.
We did a bulk interviewing round a couple of years ago, and the person I fought hardest for thought she'd flunked the interview because she had to drop into the debugger to figure out why her code was broken.
Going into the debugger is precisely why I picked her over several other people. People struggle with crap there is no value in struggling with and it doesn't make you a stronger coder to keep banging your head against the wall. It makes you stupid or a masochist. And I hate dealing with code written by masochists. Stupid people usually write code that is broken in obvious ways. Masochists prefer torture.
I've seen the opposite issue. I'll tell a candidate "I don't care if you know the exact function name and parameter order for the library function to get a sub-array; just make up something reasonable and focus on solving the bigger problem." And they still look up the actual function definition.
> I'd tell them that at the start, obviously. This isn't The Secret Rules For Getting Hired.
> "I know you've never used Blender," I'd say, "And this job doesn't require it. But we want to see how quickly and accurately you can learn to use something unfamiliar while under pressure."
> Or whatever.
The main problem is that the level of pressure people feel under interview conditions varies dramatically. Personally, I would find this a very bad test, as I get horrendously stressed at interviews yet somehow manage to portray a facade of calmness (to be fair, I've been told once that I looked remarkably calm).
To the usual refrain of:
If you can't handle the pressure of an interview then maybe you are not a good fit for our company
I normally reply:
If day to day working at your company generates interview like levels of stress, then please do not offer me a job
If someone says there was a lack of culture fit, it stings and certainly could be some type of illegal discrimination. But at the same time, it's necessarily true! Unless your culture is discriminating unfairly...
Well this is exactly like a test I went through at one time. It even used the same reasoning pretty much.
The job was fullstack Node and React. I went in and they said: we want you to look at this game written in Python, which is really slow, and tell us what is wrong with it (I hadn't done Python in years and wasn't very good at it back then). We will sit with you. I said I was game; they said lots of people weren't - one guy evidently got offended and said he didn't do that language. (I can definitely think of one friend of mine, totally competent, who would get offended at being asked to work in a language that had nothing to do with the job he was applying for.)
The game used a big dictionary of words in an array, and you had to look up words in it, and it was slow (about 7 minutes to do the searches). So I thought to myself: damn, I cannot remember the name of that algorithm for this (binary search) - OK, let's stall a bit, maybe I'll remember the algorithm's name.
So what's the first thing you would do?
Uhm how about checking a profiler.
So we checked and found some simple things we could optimize.
And then I looked through the code and I saw there were parts I could cache etc.
So after all those things were done, the search time was a bit more than halved, and I still couldn't remember "binary search". I stared dumbfounded at the screen for about 3-4 seconds and started to say "uhm", and then the guy sitting with me, I guess, ran out of the time he wanted to spend and said "OK, well, there's an algorithm for this, blah blah blah" and showed how to do it in Python. Thanks for your time.
I actually didn't want the job but still felt somewhat let down that I couldn't remember.
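For the record, the fix being fished for in that exercise is only a few lines once the word list is sorted. Here's a sketch (in JavaScript here, though the exercise was in Python; the function name is mine):

```javascript
// Binary search over a sorted array of words: O(log n) lookups
// instead of scanning the whole list on every search.
function contains(sortedWords, word) {
  let lo = 0, hi = sortedWords.length - 1;
  while (lo <= hi) {
    const mid = (lo + hi) >> 1;                 // midpoint of current range
    if (sortedWords[mid] === word) return true;
    if (sortedWords[mid] < word) lo = mid + 1;  // search upper half
    else hi = mid - 1;                          // search lower half
  }
  return false;
}
```

Which illustrates the frustration: the idea is simple and the code is short, but blanking on the name under time pressure can sink you anyway.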
Of course there was pressure, but why not just say that you know an algorithm that can do this faster, but you can't remember the name of it, and then just provide the intuition behind it?
because then I had to go through explaining the thing the algorithm did and I didn't necessarily want to do that, I just wanted to remember the name so I could say there is an algorithm named X for this case.
At any rate that was why I said the last uhm, because I saw no other way and I was thus ready to go the long route, but then I guess I was out of time.
In my experience on both sides none of that matters, also none of the “how do they look stuff up” matters, what matters is did you just know what to do and do the right dance.
You're not alone. At this point I'm pretty sure most reports of people lying on résumés - convincing interviewers in conversation that they can write software, then failing [basic programming task] or [basic programming trivia question] - are really just cases of interview-stress-induced stupidity in otherwise competent devs, mistaken for savvy grifters who are insanely competent at everything but programming. I don't believe such ultra-convincing liars are both common and trying for software jobs (why? go into sales!) in numbers significant enough to explain the widespread reports of seeing them in the wild.
I think it's the stress of having no idea what you might be asked about, over an unreasonably wide field of topics, or how you'll be judged on your answers, that makes people forget things or act dumber than they actually are. Those conditions are unheard of in real work - "server's on fire at midnight" stress isn't even half as bad, and certainly doesn't feel as adversarial.
Well, there also might be a selection effect in that most people who are competent and bad at that style of interview find other ways to get hired, so grifters could be at least overrepresented somewhat in those who bang their head against the wall.
I have never been hired via a process that involves a technical quiz or in person programming, and the last time around, one such farce was my limit. If I was competent and not able to pass that sort of filter, then I kind of had to find another way.
I'm a mechanical engineer by training and still occasionally do some entry level hiring in it.
My favorite question in recent years has been posing an advanced project - not unreasonably difficult, but one I know they won't have covered in school. Make it clear I don't expect them to have the answer, but have them talk through their approach to this project with me. "How would you get started?" "What would you need to know?" "What resources would you use?", etc.
For us it's been a great filter - both a gauge of their current knowledge and ability to tackle an unknown.
I'm an electrical engineer, and I have a similar question that I ask my entry level candidates - "How would you measure the voltage coming out of a wall socket?"
It's familiar enough to be accessible, but technically just far enough out of their reach that they don't know exactly how to do it. We talk about approaches, tools they could use, some potential problems or issues they'd need to overcome.
If they get flustered about where to start, I laugh, and say "Don't worry - I don't know the right answer. I'm actually afraid of AC mains electricity." Never fails to calm them down, because, well, it's kind of funny to hear an electrical engineer admit that they're afraid of electricity!
It's a question that I've gotten a lot of mileage on.
I passed an analogous interview to this once when I was much younger, and then subjected several applicants to it thinking it was a useful filter.
It's not, it's abusive psychological torture that rewards sadistic amateur psychologists and interview administrators. The only way to succeed is to submit to the interviewer by asking for help. It filters for people who know to implicate others in failure while taking credit for small successes, and bullies people into submission.
The interview I passed was a variation (I learned after) on the Bridges of Königsberg, where instead of asking for help, I asked if it was impossible. The person administering it hired me on the spot. But even this was wrong, because the place was full of people who thought they were there because they were intelligent - something they had no control over - and they acted as pettily and psychotically as you would expect.
The problem on teams isn't individuals not asking for help, it's managers who don't have the confidence of their teams to ask.
If you are reading this and I put you through that exercise 15+ years ago, I apologise.
> I'd tell them that at the start, obviously. This isn't The Secret Rules For Getting Hired.
> "I know you've never used Blender*," I'd say, "And this job doesn't require it. But we want to see how quickly and accurately you can learn to use something unfamiliar while under pressure."
I don't think I've ever had to learn something very complex under the same short time constraints and do-or-die pressure as an interview. Tiny things (think: new-to-me unixy commands), yeah, but nothing very complex. Given how many unknowns that introduces, if it ever did come up in a real job with a very short window, my expert advice (I'm quite serious using that term here) would be to count on it taking quite a while, and to immediately begin whatever mitigation that requires - rather than gambling that the complex thing won't surprise us and take longer than we'd hoped (likelihood that it will: extremely high).
I end up doing a lot of technology selection and so I have to have these skills and the ability to ask hard questions.
I don't for a second believe this should be the skillset of every member of a team. A team where everyone is adding new tools all the time is chaos. Learning new tools isn't really a goal. Getting better at your job is the goal, and learning new tools is either a means to that end or just moving the goalposts over and over so nobody knows what's going on.
Yeah, sure, I do evaluative work just fine, but I don't think "build [non-trivial thing] with [fairly complex software or tools you've never used] while we watch you, you have 50 minutes" resembles that very much. The closest I've seen is situations along the lines of "meeting's in 50 minutes, could you find out about [thing similar enough to other things you've done or worked with that you are qualified to evaluate it on short notice]? I just learned we'll be talking about it in there and your take on [a couple specific aspects of it] would be useful." Not "you've never used this, I want a demo of [more than hello-world] in 50 minutes, or it's your ass", ever.
We do this at retriever.co for our virtual assistants, in two areas:
1. we have applicants do a data transform with an esoteric Excel function that they've /almost surely/ never used, and prompt them to find videos / tutorials on how to use it if they don't know it (which, we assume, they won't)
2. we have people answer fake customer support tickets that require them to do some on-the-job learning about the Internet Service Provider industry -- to a level of detail that requires that they read through and grok a page on wikipedia explaining up-time calculation.
While neither of these pieces of knowledge is necessary for their day-to-day at Retriever, they allow us to stratify candidates' ability to learn new skills and grok new information.
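For what it's worth, the uptime math behind tickets like that is simple once you've found it. A minimal sketch (the function name and numbers here are mine for illustration, not Retriever's actual material):

```python
# Availability ("uptime") is the fraction of a period during which a
# service was reachable, usually expressed as a percentage.

def availability(total_minutes: float, downtime_minutes: float) -> float:
    """Return uptime as a percentage over the given period."""
    return 100.0 * (total_minutes - downtime_minutes) / total_minutes

# A 30-day month has 30 * 24 * 60 = 43,200 minutes.
MONTH = 30 * 24 * 60

# "Three nines" (99.9%) allows roughly 43.2 minutes of downtime a month.
print(round(availability(MONTH, 43.2), 1))
```

The hard part of the exercise isn't this arithmetic; it's that the candidate has to discover, unprompted, that this is the arithmetic.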
I've been thinking through how best to screen personal assistants, and these are both very helpful ideas. I particularly like the first idea to screen applicants on esoteric utilities - thanks for your comment!
I partially agree, and it depends on what you are interviewing for. In the end, getting a job done, on time, without error/bugs is generally most important to the people you are going to be working for.
As for algorithm questions at some FAANG or wannabe startup, it can be rough. Personally I tend to blank on a lot of terminology, and similarly on names a lot of the time. Some trivia games kill me: I can see the answer's face, the movies they've been in, but just blank on the name.
All the same, in my life and career, I've managed to learn, adapt and create both simple and complex systems in new languages and platforms.
On the other side of the table, I tend to ask maybe 3-5 trivia like questions only to gauge where a person is at, and let the conversational parts determine if the person would be a good fit. I think you need some of both. I feel drive accounts for more than even skill, knowledge and experience.
Maybe you just can't figure a person out in an interview. I got my current job as a contractor/temp, and thinking back to a previous company, one of the people that was the best and stayed the longest was a contractor that "graduated" to permanent. The simplest way to prevent people from keeping up a facade is just time.
It completely depends on the manual. If this is the first time I have ever seen it, and it's about something I have never used, it very well might take me a while to figure out how to look up information quickly and effectively.
Does the manual have working code examples? Then it is probably a good manual. If it does not have working examples that you can copy & paste (and that actually work), then it is probably terrible.
There is enormous variation in the quality of different indexes (indices?). The whole concept of an index, at least since computers have been around, should be that it's better than a full text search, but that's not something you can automatically generate.
After a couple of years working as a web developer, I decided to go to College. I took a "Web Development" class. I failed the final. The assignment was to code a website by hand. I used BBEdit to pretty print my hand coded HTML and it inserted a <meta name="editor"> tag that I didn't catch. I dropped out and went back to getting paid. Been doing it for 20 years now, though I use Prettier to do my formatting these days.
This is very similar to what I do for roles that are supposed to be independent and to work with customer software. I ask the interviewee to do something with a technology or software they don't know, but give them no other constraints. They can use whatever programming language, whatever OS, whatever IDE, whatever existing libraries online and even IRC/Slack if time permits - the goal is to see that they can solve a problem that is new to them without panicking. In real life they'll have a lifeline or two in the form of colleagues, but those colleagues might also not have the answers.
Obviously a terrible solution for a standard product engineering position, it is only relevant for special roles.
I've been to group interviews where they pretty much do that. Randomly divide all the candidates into groups, give them a task, see how they work together.
I can't say I enjoyed the experience - and I've no idea how good it is as a tool - but it was illuminating to see how people treated each other.
15 years of software engineering experience, and I don't know how to ask a question on SO without having it shut down as non-constructive, or that I should have used one of the 300 odd ultra niche sister sites.
Most vendors and libraries have IRC / Slack channels which are much more welcoming.
I'll fail, because I will check the actual documentation and the source code, then make the quickest check on SO before abandoning it for source code. SO has given me obviously wrong answers enough times that I prefer the documentation first.
And the ROI on learning to game Stack Overflow isn't worth it for me. :)
When I'm stuck I look for answers on SO; however, if I don't find answers I open a book or read documentation. I've never actually had to ask a question on SO.
I've never found anything of actual use on StackOverflow in my ~decade of doing this professionally. The only time StackOverflow provided useful answers to my questions was during my first year of programming courses in university.
There's some useful information there. It sometimes takes a lot of digging, especially if you're dealing in things that have existed for a long time, but change a lot (UI stuff tends to be pretty bad, questions from 5 years ago rank highly but are nearly useless because it's three frameworks ago).
Also, I was able to cargo-cult an AppleScript-accessible Objective-C utility (it returned the color of a specified pixel from the screen buffer) from Stack Overflow without having to actually learn anything about Objective-C. That was mostly because the online sources for getting started with Objective-C weren't geared towards writing a short one-off program; Apple's documentation in general is hard for me to process.
That's why I put in the "know how to ask questions" part. Sure, a lot of the time you don't get an answer, and you end up figuring out stuff the hard way yourself. But then I always force myself to go back and answer my own question. It might help the next guy, but more importantly, if I have to answer the question myself I need to put what I learned in writing, and in an understandable way as well. Often the exercise of explaining it will make me understand even better.
It's pretty nice for "is this thing I've encountered in this (API/library/SDK/tool) with no public issue or bugtracker a bug, or am I screwing up?" sorts of questions.
I had that same automated Microsoft Office test at a temp agency in the 1990s.
Funny thing, it had a bug. As long as you didn't let go of the mouse button, it would let you open and navigate all of the menus. So I simply browsed the menu UI until I found the correct choice. 100% score.
I've always wanted to run one where we bring them to a room to interview. Maybe take a break. Move their stuff to a slightly different room. Pick back up right where we left off. Maybe change out the furniture and people outside during the interview. Introduce them to numerous employees, who will respond to names swapped amongst each other. Another break - switch interviewers. Pick up right where the first one left off - insist that this employee has conducted the entire thing. If they drove there, it would be best to move their car.
I'm not sure what position I would be hiring for, but this is the interview I would like to conduct.
Other possible ideas would be a background check that is either insanely deep (so your 3rd grade teacher was unavailable for a recommendation) or raises issues that they have no part of: "we see you've spent time in North Korea."
I hear a lot (and believe myself I suppose) that I would rather hire someone that knows how to learn than someone who currently knows thing x but can't pick up and learn things on their own. But I feel like it's VERY hard to figure this out about someone. It's pretty easy to test how well someone knows a particular technology or tool (given some variance for operating in a high-pressure scenario etc.) but it seems pretty hard to tell if someone can learn new things quickly, and learn the right things.
I feel like this is what algorithm tests are trying to do, but I also feel like a lot of algorithms are based on having a key insight into a problem that is, at least in my experience, pretty hard to have when you're in a room with strangers on a timer and being judged, so if you don't know the particular algorithm to apply, good luck. And also it's pretty easy to go study algorithms and hope you remember how to implement DFS if that's what they ask you in the interview.
And there is some non-zero value to just knowing about different data structures and algorithms for certain types of work for sure. I don't think many of us web developer types will be implementing DFS ourselves, but I run into issues pretty often where having knowledge about different data structures and algorithms comes in handy.
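For the curious: the DFS everyone crams for is only about a dozen lines once you know the trick. A minimal sketch in Python (the graph shape and names are just illustrative, not any particular interview's expected answer):

```python
# Iterative depth-first traversal of an adjacency-list graph.

def dfs(graph: dict, start):
    """Return the nodes reachable from `start`, in visit order."""
    visited, stack = [], [start]
    while stack:
        node = stack.pop()
        if node in visited:
            continue
        visited.append(node)
        # Push neighbors in reverse so they're visited in listed order.
        stack.extend(reversed(graph.get(node, [])))
    return visited

graph = {"a": ["b", "c"], "b": ["d"], "c": ["d"], "d": []}
print(dfs(graph, "a"))  # → ['a', 'b', 'd', 'c']
```

Which rather proves the point above: the code is trivial to memorize, so passing this question mostly measures whether you studied it recently, not whether you can learn.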
So I can see the appeal to doing a test like this that has nothing to do with a particular skill, where you're trying to evaluate someone's judgement and how quickly they can absorb new ideas or figure something out on the fly etc. BUT, I just think there are too many factors that can really muddy this up, and I still don't feel like there's an objective way to measure "smart and learns quickly". Even "learns quickly" is somewhat vague; I personally wouldn't expect someone to pick up, say, functional programming in an hour just by saying "go read about it." Some things take longer to learn than others.
Anyway, there's no silver bullet. I still feel like we end up making pretty subjective decisions about other people when interviewing them. I don't think we're even all that aware of them ourselves. I would say though that if you're interviewing somewhere, you're getting useful information about that place by how they interview you, and so I wouldn't feel too disheartened if you feel like the interview is unfair or doesn't work for you. That's probably a good sign that you wouldn't enjoy working there anyway.
I totally agree. The best solution I have come up with for trying to assess if someone can learn new things quickly is simply to ask a candidate to describe the last new thing that they learned. It doesn't have to be job-related at all. Just the last thing they learned. And then ask some follow up questions about how they learned, why they learned it, and how have they used the knowledge.
If nothing else, it will give you a general idea of how often they learn new things.
I think the primary issue with interviews is that the interviewer doesn’t share expectations. I think it’s totally fine to evaluate how a person asks for help, as long as it’s clear they aren’t expected to solve the problem.
Further, you’d need some standardized way to evaluate how well someone asks for help. Otherwise you’ll have interviewer bias.
Which is why designing interviews is very difficult.
Interviews should be personalized. That they are conducted en masse with little consideration for applicant background is an artifact of the difficulty of adequate personalization. Targeted ads are valued precisely because they are targeted, presumably increasing the chance of a match. Has anyone heard of any startup companies in the personalized interview space?
I was pretty happy with the interviews I went through for my current position, although that might be selection bias because I passed them. They all shared some important attributes:
1: Fairly open-ended, there was no "bzzt-wrong" moment like the Office 97 case in TFA, so I didn't feel a lot of pressure. It was definitely enough rope to hang oneself, so the interviewers could form an actionable and objective judgment, but as a candidate, I didn't feel that I was on trial, more just chilling and talking tech with some other nerds.
2: The underlying skills (understanding one's audience, breaking a large problem into pieces, making reasonable assumptions) were obviously directly applicable to the position. I never felt that I was doing senseless work, even when the questions were clearly contrived, because the connections were self-evident.
3: Despite the above, the actual questions didn't require a lot of domain-specific knowledge. Even the one that sorta did, could be answered by assuming that it was similar to a more common system, and that's how I approached it.
4: In every case, there were "plan B" options and accommodations for having a gap in one's knowledge. "I don't know this off the top of my head, but here's where I'd go to find out and here's the guess I'm going to use in the meantime" was a valid answer. Perhaps the best answer in some cases.
The result is that I work with a team of people who are super adaptable, can think on their feet, but have a keen interest in pushing towards correctness as soon as real information is available. Whenever I think about possibly going elsewhere, I remember that most HR departments don't select for any of those attributes, often quite the opposite. (And those places tend to be the clients whose problems we get called in to help solve, as a result.)
I experienced an extremely simple (and IMO effective) version of this.
For a dba-adjacent job, I sat down with a dba and was shown how to connect to a sql server instance and change some particular setting. At the end of our hour of discussing other technical matters and my resume, he asked me to demonstrate recall of this simple command.
There must be a serious dearth of candidates in my area, because the last time I was subjected to a time-constrained test was almost three years ago, and it was a Codility challenge, so I had some time to prepare for it.
I quit my job recently, and in order to find another I went full-auto with the applications - I've sent close to thirty of them I think, so a long series of recruitment processes followed.
They were all mostly the same - I either received some homework to do or had a 1-2h technical interview during which I was asked a set of questions which were easy for the most part. And that was it.
My point being: I've discovered that I can't really relate to the stories other people posted in this thread. Does this mean I wasn't ambitious enough regarding my applications, or are the others subjected to a difficult job market? Maybe both? Or none? I really don't know.
I keep hearing and reading horror stories like this one on HN and other websites. I never did interviews at big companies, nor did I do any in the US. In 15 years of freelancing, I only once had a technical interview, with a C++ guru; I failed most of the questions but still got hired because "you still did good, kept calm and asked good questions".
In my last gig, it was for work on a Java application. I was upfront that I had only done a bit of Java in school, and that I was more of a C++ guy. The Java guru hiring me was like "Yeah, if you know C++, you'll get Java quickly" - they were more interested in my experience with video and image processing.
I basically avoid doing HR-based interviews. Short-circuit them when you can.
That's how Holberton School (we train people to become software engineers) selects and trains students. We are a college alternative where students learn by working on projects, in groups.
The whole education is flipped compared to a regular one. We give students problem sets and they have to find the answer. We never give lectures (we have no teachers), just guidance so that they can get started on a topic, but never enough that they can actually achieve the tasks we give them.
> Are they able to read a manual?
Candidates (no prior experience in coding needed) need to answer questions on a specific command that can be found by reading the manpage. At first we provide the manpage, then we show them how to find a manpage, and finally we don't provide any guidance, because at this point they should be able to do it on their own. In the curriculum, once they are students, we have a quiz to make sure they can answer basic questions on the topic before jumping into the coding part.
> Can they formulate a search query?
We ask trivial questions that can be answered just by Googling. Might sound like a no-brainer to the developer community, but not everybody is good at this.
> How do they assess whether the tutorial they found is suitable or reliable?
That's the tricky part, but basically, because the school does not provide material that contains the answer, candidates and students have to navigate the ocean of information and tutorials that can be wrong, incomplete, or correct. By pressing a button, students' work (code at this point) is automatically and instantly checked, so they can easily figure out whether what they built is correct or not.
> What steps do they take to make sure they're finding - and learning - the right information?
On this one, we as a school are the ones providing this guidance. Basically, when students are doing their projects, we define learning goals. Getting the coding part done is great, but we also want to make sure that students have been able to grasp concepts that are key to becoming great software engineers. So students look at these "learning points" beforehand, and by the end of the project, they must be able to discuss all the points.
Haha, that's funny, and an awful idea. At a previous employer, as a hiring manager, I offered candidates an option of bringing in their fully-configured IDE in their own laptop with the language of their choice and I'd give them the problem to solve. Essentially, act in the environment that you've already mastered.
Only one guy did (I had maybe 10 a month) and for some inexplicable reason he couldn't figure out how to create his `main` equivalent in his language in his IDE. He must have been incredibly nervous because we tried "perhaps you could sketch out your thoughts in broad strokes and then we can return to the IDE" etc.
I just got interested in the way the school 42 works. Especially the month-long "test" that they use to accept tuition-free students.
The idea is to make you complete a lot of programming projects in languages and tools you don't know at all when you start. The evaluation takes into account not only whether you finished the project but also whether you helped others and knew how to look for help (basically: Google or other students). Its goal is to select students who would work well in a peer-learning, project-based-learning paradigm. This, in my opinion, is much closer to real-life conditions than any academic test.
Our team uses HackerRank or something similar (I forget) to weed out bad profiles (most of them). We combed through their question base to remove things that felt too specific. But I’m happy with the result, because there’s nothing you can’t google, and googling is allowed.
We’re a python and R shop. You can choose either test. As a non R user, I can get a decent score on the R test, because I can read docs. And that’s great.
We have also dabbled in game based assessments, and while I have qualms with it, the same kinds of skill sets seem to be the primary differentiator of success.
My 2c on two types of assessments people probably normally disregard
Another take on what the author is proposing is to simply not care about the output of the task, but focus on how the person approaches the problem (even if they don't manage to complete the task at all).
Honestly, I wish more companies would just tell you in advance of the interview what the specific prompt is going to be (and perhaps the evaluation criteria too).
Sure, you could "cheat" by Googling the answer in advance or asking someone for help. But if the question's sufficiently open-ended, it's not that hard to suss out in follow up questions whether you have a deep first principles understanding of the subject area or you're just repeating something you read.
I'm pretty sure if you give me a subject I know nothing about and a couple days to study I can teach myself about it, at least to the level of someone who has real experience in the subject which may be a couple of years out of use. So then how does a hiring manager know to filter me out?
Presumably, at some point, you'll be interviewed by someone with more up to date experience. And if you study enough to fool that person, that seems fine? It's a good sign you're capable of the job, prior experience be damned.
I don't know about this. I don't think I should spend most of my time learning completely new tools.
If people spend most of their time learning tools, that encourages tools which are easy to learn but lack deep and powerful functionality. I think it's fine, and even good, for tools to require focused learning to understand, and in exchange have a worldview and tool use that is really powerful. Like Emacs or Vim or Excel, for example.
I do hiring interviews at my company, and one of the tests we like to do is to give the interviewee a problem outside their area of expertise, open a browser and an editor for them, and tell them they are allowed to search for things. Most people I interviewed do not use Google; maybe they are afraid you will judge them negatively for it, even if you say you won't.
This reminds me so much of Vernor Vinge's story "Fast Times at Fairmont High". It's set in the near future, and one plot point is that high school tests cover this skill--to read through the manuals quickly and get things done with new tools. I recommend it, and the related book "Rainbows End".
If the test doesn't reflect the job you'll be doing then it's not a good test of whether or not you'll be able to do the job. If you're applying for a role that expects you to know an application then looking in the help is cheating. If using the help is acceptable in the job then it's not.
It's also worth noting that people don't like "proxy tests" for the skills they'll need to use on the job. If the test only tangentially assesses a skill then it's going to be influenced by other skills or flaws and won't give an interviewer a good idea of how good you are at something. This is why people complain about whiteboard exercises in developer hiring. Unless remembering and explaining an algorithm on a board is something you'll actually be doing in the role, it's not a good test - it sort of assesses how good you are at explaining an algorithm, but it really assesses memory, presentation skills, etc. Someone with brilliant whiteboard skills isn't necessarily a good developer, so hiring them on because you were impressed by how well they explained something in front of a board feels wrong, especially to people who aren't good at the tangential stuff.
In the case of the suggested hiring test in the article, it doesn't test someone's job skills. It tests how well they look for help, but it also tests how well someone copes in unfamiliar applications, or with esoteric UIs (in the case of Blender), or if they've actually used the app that's being used as a proxy. None of those things are necessarily a good assessment of whether the person can do the role they're applying for, so the test fails.
> If you're applying for a role that expects you to know an application then looking in the help is cheating.
If you discourage your employees from learning you are going to have a lot of dumb employees. Even the developers who made excel do not know how to do everything that excel can do.
Some businesses need to hire people to do a job using an application in a very standardized way from day 1; there is no expectation that the person doing it will learn anything in the role, and when the software changes, training is provided. The employer is in complete control of what is done and how the software is used. Figuring something out yourself is considered a bad thing because it's not the standardized way of doing the thing.
Please don't go through life imagining every job is like a developer job. We get far more freedom, creativity, and responsibility than a lot of other workers.
That is all well and good for those types of jobs. But no one can remember how to use all of Excel, and they should be able to search for help and understand that help.
If you are required to demonstrate that you can create pivot tables, and you not only know what they are and how they work but also figured out how to do it in a brand new software suite within 10 minutes under stressful conditions, you have succeeded. Committing obscure UX patterns to memory is far less important than your ability to understand concepts and find ways to apply them.
I understand a manager may prefer to supply someone with Microsoft Office 2001 Alpha Deluxe Edition (& Knuckles) and see them hit the ground sprinting but the time spent looking for such a person might be greater than what it would take for a motivated person to reach that level.
This really nailed it on at least one particular job I was interviewing for. I'm freelance and really happy with it, but was referred by a friend to a (at least regionally highly regarded) company, so I thought I'd go.
I was first presented with an IQ-test-looking set of pictures where you choose what comes next in the sequence. Seeing as I love puzzles, I scored great on it, but I'm also aware that it doesn't really say anything about my skills on the job.
They were very positive and wanted to move forward. So I got a take-home test in Java. There wasn't a time estimate, but it was fairly convoluted, and it was meant to show what I would turn in had it been a real task, meaning they wanted build configs, testing, deployment, etc.
The problem was that I hadn't worked with Java for a decade, nor did the position have any connection to Java. I didn't even have a jdk installed at the time, but made an effort at setting it all up, figuring out maven or gradle, making my classes and data models, getting a skeleton set up, and after three hours figured I wasn't even half way done.
I wasn't interested enough in the job, would have gotten a pay cut, and had a 3 month old at home I was much more keen to hang out with on my free time, so I reported what I had done and that I was unlikely to do more. They stated that they would set up a meeting with the CTO, but never heard back.
I wish they had done a (sort of) exit interview or just sent out a minimal-effort feedback questionnaire. It would be one of the first ones I'd actually want to fill out.
I don't think I'm top tier in my field or region, or that I would've gotten/taken the job anyways, but they really put me off for the wrong reasons.
And I bet this is more common than it has any right to be.