>Created a new CMS for the company in Python that resulted in an ~70% reduction in website update times, from 1 hour to 18 minutes
Personally, I don't put too much emphasis on these numbers because most of them are probably BS. Don't get me wrong, it's still probably in a candidate's best interest to quantify their accomplishments since it's recommended everywhere, but don't overdo it.
They are absolutely BS because 99% of programming is unquantifiable. There are too many variables. I added 5 features this quarter and sales went up 5%. Is it because of the features, or better sales techniques, or dumb luck?
"Redesigned the web pages and user satisfaction went up 10%" - but also you hired 5 more customer support agents and a backend engineer improved the slow pages by 30%.
"Created a new CMS in python (Based on the old CMS) (With a team of 2 others and one guru who did the design but I wrote his code) resulting in 70% reduction in update times (after we ran it in a cluster with twice the resources but I never could understand clustering so guess it was my coding)" "And didn't have time to write tests, and that project was canned because it took too long actually"
As engineers we're measuring everything, all the time, right? The point is 'wrote a web api' is worse than saying 'wrote a web api that handled 1M tx/hour' or some such. Even if the number in this case was exaggerated, it stands out and we have something to discuss in the interview phase.
And if the software you write works with money at all, put the amounts in there. Moving around large amounts of money shows trust from your existing company, and attention to detail.
> As engineers we're measuring everything, all the time right?
Is this mostly a joke? I've written plenty of APIs, but I have never load-tested them in isolation, never had to respond to underperforming latency or throughput metrics, and never faced any feedback on software performance at all. In most cases, I don't know who uses the code I wrote, and no metrics from them ever make it back to me personally. For exactly 95.2% of what I've delivered, I couldn't tell you that I improved Foo by Bar units even if you held a gun to my head.
This is after years of delivering LOB software to clients in banking, manufacturing, and oil & gas industries.
I think it depends on work culture. We recently merged with an American corporation and their engineers are obsessed with measuring, pilots, A/B testing, RFCs, etc. It's so slow to work with them, even on the smallest features. Poor 10% of users who miss out on awesome features for 4 months because we "need" a control group.
I think the parent poster's point is that even when the work culture is obsessed with measuring, the people doing the A/B testing and carefully studying the benefit to the company are often quite separate from the people implementing the change. The information will be used in some decisions and will flow to various layers of management, but the developer who built the feature won't necessarily even get a message when it is eventually chosen for widespread deployment or abandoned after 4 months of being shown to a control group, much less see the data on estimated financial impact.
I'm not so sure about this. I support things and I've supported them over absurd growth periods for years. I've lost track of how many zeros are on the number of things per thing; it doesn't matter.
The process is: You're in charge of a thing, it grows, it breaks, you fix it, it grows, it breaks, you refactor it, it grows, it breaks, you fix it, it grows, you replace it and shard it into N things, they grow, they break, etc. The metrics matter, but after a while it's just bigger and bigger but it's gotta work, so you just make it work.
Oh -- obviously I'm not an engineer -- I don't wear a striped hat and drive a train or sign blueprints.
They might be, but they are a good jumping-off point for a discussion during an interview. And if you say something interesting, it will pique my interest enough that I'll at least want to talk to you about it.
Was going to say this. You need your resume to cover both types of people who will be looking at it. HR/Managers for initial screening and then developers for the real interview. For a startup where I know the dev or founder are seeing my resume right away, I wouldn't include those types of quantifications. If I'm applying at ${bigcorp} then it helps get you through round 1.
The value is as a hook in a conversation. e.g. if you have:
- moved from Kafka to Flume
- installed Kubernetes and Dockerized our code
- binary serialized our data
Then this is going to happen:
1. I will assume you don't care about the outcomes of what you do, only the task
2. I will assume you don't know how to communicate why something matters
3. I am going to have to pick at random and hope it's interesting
On the other hand, if each one has the outcomes listed, not only is it clear that you know why, but perhaps more importantly, it is a conversation hook that I can use to enter the discussion. Well, why was it important that the size of the data be small? Couldn't you just zstd your JSON instead of binary serializing some processed version? Etc., etc., and then you get to show off why and what that thing you made does, I get to enjoy that, and we're both happy.
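(For what it's worth, that size question is cheap to sanity-check yourself. Here's a minimal sketch; the record shape, the numbers, and the use of the pip-installable zstandard package are all my own assumptions, nothing from the resume bullets above. It just compares zstd-compressed JSON against a fixed-width struct packing of the same synthetic records.)

    import json, struct
    import zstandard  # pip install zstandard; assumed available

    # Purely synthetic records -- the (id, ts, value) shape is just an example.
    records = [{"id": i, "ts": 1700000000 + i, "value": i * 0.5} for i in range(10_000)]

    json_bytes = json.dumps(records).encode()
    json_zstd = zstandard.ZstdCompressor(level=3).compress(json_bytes)

    # Fixed-width packing of the same fields: int64 id, int64 ts, float64 value.
    binary = b"".join(struct.pack("<qqd", r["id"], r["ts"], r["value"]) for r in records)

    print(f"raw JSON   : {len(json_bytes):>9,} bytes")
    print(f"zstd JSON  : {len(json_zstd):>9,} bytes")
    print(f"packed bin : {len(binary):>9,} bytes")

Whichever way it comes out on your real data, now you have a number and a reason, which is exactly what I want to hear about in the interview.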
Most developers don't have the agency to choose what they work on within a company. Expecting the developer to have any influence on the outcome of what they produce, other than the implementation quality, shows a lack of understanding of how actual software development works.
And to be fair to you, almost all management two steps removed from actual coding does not understand how actual software development works. If you are interviewing a PM you should care about the outcome; for engineers, focus on quality, timeliness, and skill.
Have you considered... asking? Asking why you are doing something? All of these changes are going to have a purpose behind them, and at least in these examples those reasons are likely to be things a dev is in a position to measure the results of themselves. How much throughput did you add by switching to Flume? How much did you reduce bandwidth with that binary format? Even if you don't have access to the production metrics (and most places you will), you could estimate based on synthetic data; see the rough sketch at the end of this comment.
Even for product work I've yet to meet the product person who isn't eager to talk about how a new feature was received when a dev asks.
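To illustrate the synthetic-data point (everything here is hypothetical: the message shape, the loop count, and the assumption that the JSON encode/decode path is what you care about), a throughput ballpark really can be a handful of lines:

    import json, time

    # Hypothetical message; real ones would come from your own pipeline.
    msg = {"user_id": 42, "event": "page_view", "url": "/pricing", "ms": 133}

    n = 200_000
    start = time.perf_counter()
    for _ in range(n):
        json.loads(json.dumps(msg))  # encode + decode round trip
    elapsed = time.perf_counter() - start

    print(f"~{n / elapsed:,.0f} synthetic messages/sec on this machine")

It proves nothing about production, but it gives you a defensible number and a story about how you got it, which is all the bullet point needs.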
Does your company put you head to head with another developer to see who can develop the same feature the fastest?
Or maybe you secretly keep tabs on all of your teammates and their delivery speed, so that you can calculate how much faster than your teammates you are at delivering and how many fewer bugs you ship?
As a manager, that's half true. We have business needs, but I am hiring for the ability to think past "what is being asked of me" and into "What do we really need, and how can we deliver it?"
That isn't rewarded in all jobs, mind you, but it is something to look for when you have a choice.
I've been a software engineer at a few highly regarded companies for a decade now, and at none of them did I have the ability to make large product or design decisions as a software engineer. I could suggest things, call out issues in the design, and propose ideas, but I was never able to unilaterally make any decisions, and a lot of the time my ideas were put on the backlog because the company was in focus mode on one specific goal (and I'm not criticizing the idea of having focus).
So my point is, if leadership and product are bad and ask engineers to produce turds, the engineers don't really have control over the fact that they are producing turds, but they do have control over the quality, the bugginess, and the shininess of the turds.
There are three questions for every product: "Why?", "What?", and "How?"
> I've been a software engineer at a few highly regarded companies for a decade now, and at none of them did I have the ability to make large product or design decisions as a software engineer.
IME, even the highest engineer at the company has little to no ability to make even small product design decisions.
I'm not saying whether I think it's a good thing or a bad thing, but the truth is there's a role at every organisation whose job it is to decide what the product should look like and what it should do. That role decides all the "What?" questions.
Engineering is all about "How?"
Even the product design role doesn't have all the power - the "Why?" is decided by someone else.
Key point: quantify the shininess of the turds so that the HR screen likes the look of the CV, and be prepared to talk about turd shininess and why it would matter.
How about we start with you not making two massive assumptions off a single sentence in a document that summarizes anywhere from 1 to 30+ years of someone's work.
Then how about we move away from you thinking that interviews are for your entertainment and the interviewee is a circus animal where 'you get to enjoy that' as they 'show off'.
This is a weirdly aggressive response to what is literally the core purpose of collecting resumes. Like it or not, a hiring manager likely has hundreds of resumes they are looking at and nowhere near enough time to review them all. Making judgement calls on limited information is necessary. Maybe the hypothetical engineer with this resume does understand the goals and outcomes of these tasks, but they have failed to demonstrate it in this resume. Given another engineer who gives me details about the business need they were meeting and the outcome, why would I interview the first instead of the second?
The core purpose of collecting resumes is gauging the likelihood that a candidate may have the skillsets to align with what the job is, and then bringing them further into the process to dive deeper into their experiences. It is not to make wild assumptions about how someone is probably incompetent because they put 'GIT' or 'JAVA', or omitted the business need and outcome on every project they have worked on (good luck getting all that on one page). It is almost a certainty that someone making snap judgements on what capitalization was used for a specific tech is also making those judgements on things like nationality, origin, sex, etc...
To be honest, why do you even care what the business need was? Maybe it is classified? Maybe it just landed on their desk and they crushed it? You honestly want to read about the business needs of things that are completely irrelevant to you and in the past? Go download a white paper.
Your whole paragraph is a contradiction. You are saying that hiring managers are strapped for time to thoroughly review all the resumes (I agree), but then say that a resume should have the business needs and outcomes of any number of projects that a dev has worked on in their career (and we haven't even touched on what they actually DID for those projects).
The core purpose of collecting resumes is to determine which candidates are most worth bringing in for an interview. A candidate who understands the business purpose of what they are doing and is capable of calibrating their task to best meet that need and gauging the outcomes of how well it did so is infinitely more valuable than one who just blindly does whatever task with no knowledge or understanding. Skillsets are more than just a list of technologies you've worked with.
"Deployed our core application to Kubernetes": Cool you once saw a Dockerfile and Kubernetes YAML.Like every single other applicant. I don't even know from this if you have a basic understanding of Kubernetes. "Deployed our application to Kubernetes for better server utilization and resiliency increasing our uptime by 20% while reducing cutting our server costs by 10%". This person at minimum has some idea of how to configure Kubernetes for application resiliency, knows how to measure and maximize hardware utilization, knows how to measure and minimize downtime, and probably knows enough about Kubernetes to provide advice about when to use it and when not to. Maybe the first person does too, but they did nothing to show it.
> It is not to make wild assumptions about how someone is probably incompetent because they put 'GIT' or 'JAVA'
No one said wildly incompetent.
> every project they have worked on (good luck getting all that on one page)
Don't put every project on your resume. Pick the most interesting ones for the job you're applying for. If you are sufficiently senior that this still doesn't fit on one page, use two.
> It is almost a certainty that someone making snap judgements on what capitalization was used for a specific tech is also making those judgements on things like nationality, origin, sex, etc...
No.
> To be honest, why do you even care what the business need was?
Most hiring managers do. Someone who understands WHY they are doing something will make better choices than someone who sees "Move our application to Kubernetes" in the backlog, grabs a YAML template off the internet, and calls it a day.
> Maybe it is classified?
Government resumes tend to be EVEN MORE outcome focused than private industry. You can write "increased pipeline throughput by 50%" without writing the national secrets in that pipeline.
> Maybe it just landed on their desk and they crushed it?
How can they possibly know they crushed it without understanding why they were doing it in the first place, or anything about what happened after they released it?
> You honestly want to read about the business needs of things that are completely irrelevant to you and in the past? Go download a white paper.
A white paper tells me nothing about the candidate.
> Your whole paragraph is a contradiction. You are saying that hiring managers are strapped for time to thoroughly review all the resumes (I agree), but then say that a resume should have the business needs and outcomes of any number of projects that a dev has worked on in their career (and we haven't even touched on what they actually DID for those projects).
You know what takes longer than reading a resume? Interviewing someone. You're right, resumes are going to get a skim first. Only the most relevant ones are going to actually get read thoroughly. This is exactly why you want numbers and metrics as much as possible. Numbers catch the eye far more easily than sentences. If I see "saved $100,000" on a page, that's going to get caught on my first skim. It's an effective hook. I want to know more. Without it I've just got "Kubernetes, C#, Javascript"; I might as well have a computer read the resume and build me a checklist.
I appreciate the thorough response. I guess we disagree about the purpose of a resume; I just think it is wrong to assess someone's worth off a one-pager (two-pager max). There is obviously a lot of grey area between what a perfect resume and an atrocious one look like, and that will vary even between job applications. Matching the expectations of candidates with those of hiring managers is extremely difficult; dating can be thought of as a similar problem that has not found a great solution (and in no way am I saying dating is like interviewing).
By 'classified' I meant that, generally, a company you have worked at will not be keen on you going around talking about its internal business needs, especially if they are close to the product.
Lastly, as some have mentioned, statements like 'increased this by X' are usually either fudged or only marginally related to what the candidate worked on, with confounding factors in the mix.
Isn't it preferable to you that I continue to be like this so overtly? Since you dislike how I am and you may well dislike working with me, you can instantly reject me and my org. This is good for both of us.
I don't think people are "circus animals". But I do like to work with people who have done things that they are proud of and who do like talking about those things. And it is important to me that we both enjoy the conversation, yes.
I agree. I know it's the conventional wisdom to phrase things in that way. Having hired a lot of people and read a lot of resumes, I personally either ignore these statements or have a chuckle over them when they are silly. In no world would someone capturing the "impact" of what they did make me more likely to hire them.
I'd actually say I'm much more interested in understanding what concretely a person actually did, so I know what kind of experience and capabilities they have, vs what it led to, which has so many external variables and is probably mostly BS anyway.