I was explaining this to a friend who's a top-shelf cabinetmaker.
He was telling me how he sells high-quality cabinets to homeowners: basically, he builds a "dream kitchen" that far exceeds their budget, then backs down, removing features, until they have something that still exceeds their original budget but is quite good, and that they want.
He was saying that I should use his methodology.
I explained that his sales are to the people that would actually use the cabinets, so they have a vested interest in high quality.
In my case, I would be working for a business that absolutely doesn't care about quality. They want the cheapest, shittiest garbage they can possibly get away with pushing. They don't use it, and many of them won't even be responsible for supporting it, afterwards, so there's no incentive for quality.
The same goes for corporations that don't try to retain talent. If someone is only going to be around for eighteen months, is paid well, and is pressured to produce a lot of "stuff," then you can expect them to produce a lot of ... stuff. They don't give a damn about supporting it, because they are already practicing LeetCode for their next gig.
I have found that giving people a vested interest in Quality is essential. That often comes from being responsible for taking care of it, after it is released, using it themselves, or having their reputation staked on it.
I don't feel the current tech industry meets any of these bars.
Most of the software I write is stuff that I use, and I am intimately involved with the users of my work. I want it to be good, because I see them several times a week. Also, I have a fairly high personal bar that I won't compromise. I have to feel good about my work, and that doesn't seem to sell.
When I started at Oracle yonks ago, there was a bizarre bug management system.
When a bug was found, it was assigned to the next available developer in the team. It didn't matter who wrote the code and created the bug - there was no feedback to them unless they happened to be the one who picked up the bug report.
The bug reports were printed out and stood in a tall pile on a manager's desk.
Quality was, as one might imagine, terrible. Junior developers had no idea what a bad job they were doing - senior developers spent their days fixing stupid bugs they would never have caused themselves.
The solution, blindingly obvious, was to start assigning bugs to the developer who caused them. The improvement was instant, because most people actually want to do a good job, and to be seen doing a good job. The pile of bug reports literally shrank before people's eyes.
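A minimal sketch of the same feedback loop with today's tooling, in case it's useful: use git blame to find whoever last touched the offending lines and route the ticket to them. The repo path, file, and line range below are hypothetical placeholders, and a real setup would want something smarter than "most frequent author wins":

    # Hedged sketch: route a bug to whoever last modified the offending lines.
    # The repo path, file, and line range are hypothetical placeholders.
    import subprocess

    def likely_author(repo, path, start, end):
        """Return the author email that appears most often in the blame output for the range."""
        out = subprocess.run(
            ["git", "-C", repo, "blame", "--porcelain", "-L", f"{start},{end}", path],
            capture_output=True, text=True, check=True,
        ).stdout
        # Porcelain output carries an "author-mail <...>" header per commit block;
        # picking the most frequent one is a crude but serviceable heuristic.
        emails = [line.split(" ", 1)[1].strip("<>")
                  for line in out.splitlines()
                  if line.startswith("author-mail ")]
        return max(set(emails), key=emails.count) if emails else None

    print(likely_author(".", "src/billing.py", 120, 140))  # assign the ticket to this person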
The current industry seems to have moved back to these bad old days but on a longer timescale.
Resume-driven development abounds. Developers move on to the next gig before the impact of their decisions becomes obvious and quality plummets accordingly.
It doesn't help that most of the career advice out there now is to move to a new role every 2 years if you don't want to be underpaid. Developers don't have the opportunities (or don't give them to themselves) to see how well what they build stands the test of time.
I've been at my current role for almost 7 years. Many of the best lessons I've learned have come from the pain of maintaining and fixing my own mistakes.
Sometimes it can take 2 years just to tell if a decision you made was good or not.
Devs leave in 18 months because their market value increases faster than their pay at their current company.
You can give devs a vested interest in their work by making sure compensation tracks or exceeds what they could get outside, because that gives them a vested interest in remaining employed with the company.
This is true, but it is also only one axis. Once we have been at it for a while, respect, work-life balance, and job satisfaction are also very important variables.
I will bet that the top companies are able to provide all of these, and keep people, but I suspect that many companies have terrible managers that can’t keep a decent work environment going, so compensation is the only variable that matters.
The company I worked for didn’t pay especially well, and I worked very hard to treat my employees well. It seemed to work. It wasn’t a case of greedy managers making money off the backs of peons. None of us made that much, but the work was deeply satisfying. We worked with the top engineers and scientists in the world (in our field). Our business cards opened a lot of doors.
Seems like building and selling your own software (aka solopreneur) would help you to build capital at the same time as it also forces you to write good code.
Yes. That’s pretty much what I do. There are a couple of issues with this, though:
1) It’s difficult to do this unless you already have a lot of experience writing good code.
2) There’s a ceiling to what we can do, as individuals. I have heard tales of Linus, The Miracle Coder, who can write an entire operating system, as well as an industry-changing VCS, on his own; but I only know of one cranky Finn that seems to fit that description. Otherwise, most of us mortals need to work in teams, to achieve ambitious goals, and that brings its own set of challenges.
Actually, he understood perfectly. He was a businessman, as well as a craftsman, and understood that this is par for the course, in many successful businesses.
I believe that it's vital to have a stake in Quality, or it won't happen.
I don’t know. If I have learned anything in the last decade about software engineering and quality, it is this: businesses only care about revenue and speed, while engineers don’t have an objective way to define quality (10 engineers would give you 10 different opinions about the same piece of code).
The only time I treat quality as the top priority is when I work on side projects (because there’s only one opinion about what quality means, there’s no time pressure, and no one is asking me “are we profitable now?”).
I agree, and take it a little further. 10 engineers couldn't agree on the _point_ of quality code to begin with, let alone define how to get there. Consider two programs:
1. The spaghetti mess: half-done abstractions, inconsistent use of everything everywhere. But it fulfills the user's expectations perfectly.
2. A beautiful codebase: clean abstractions, tests and documentation everywhere. But the user hates it. It's slow and requires domain knowledge just to drive it and get a result.
Two very contrived, but not unrealistic, examples.
Intuitively, better cabinet quality leads to a better cabinet experience. Does better code quality lead to a better product? It should, that's what quality is about. And if not, is "quality" even the right word?
Whenever I hear an engineer talk about quality, I clarify. What kind of quality are we talking about?
Right, and they are, that's my point. Quality isn't a single attribute of a system, it's a judgement call based on objectives.
The objectives of a business selling software are one thing, and those of engineers are something else: sometimes maintenance, sometimes extensibility, sometimes exploration, sometimes just seeing if something is possible. Quality correlates with the objective, and in my experience, many software engineers have a hard time seeing their code through other perspectives.
There is a lot to be said for getting outside the four walls of a business (or org) to evaluate things. If it's not visible outside those walls (software buggy enough to lose customers) and doesn't introduce significant future risk to the business (competition can move faster than you) it's probably good enough. The real trick of course is predicting and communicating why you think one of these is true. It's an essential problem of commercial software dev.
> 10 engineers would give you 10 different opinions about the same piece of code
This plays out in code review in a way that drives me insane. So much back and forth and time spent/wasted because there's always that one person or small group of people who insist their way is the one true one.
Trying to get 10 people to agree which is the best chocolate ice cream is closer to the quality problem.
I appreciate the scatological approach, though. And most of the time when I hear the word "quality," it's said in anger that could easily be communicated by flinging poo . . .
Different ideas are of course important. But companies that take every idea seriously usually have as many ridiculous problems as there are ideas. So not every idea may be really good. Accepting this liberates a person.
'Eating your own dog food' is the best path to quality software in my opinion. Too many people working for a software company (developers, salespeople, product managers, etc.) never bother to use the software to do the kinds of things they expect their customers to use it for on a regular basis. Write the code. Make sure it passes some tests. Move on to the next project. This is common.
No wonder so many bugs never get reported until many customers run into them much later. I have a project I work on regularly. I use it regularly to do productive things, and I find most of the bugs just doing that. I had a couple of different 'business partners' who talked a good game, but I could not get them to actually use the software and give me feedback on how to improve it. Neither one added much value to it, and both quickly moved on to other things.
> Write the code. Make sure it passes some tests. Move on to the next project.
Let's mention the missing step: don't even bother to run the code.
I'm simply embarrassed to admit how often I've been in teams that not only "don't use the software" (i.e. no dogfooding) but even "don't run the program". It's embarrassing. These types of teams miss bugs that get shipped because not one of the people involved in making that software has ever even actually run the damn app, let alone actually used it for any length of time.
This is shameful and embarrassing. Our profession is a joke. How can we even call ourselves professionals?
I agree with this so much. Only yesterday someone picked a ticket off the backlog to change a minor detail on a feature I've been working on. Rather than getting me to do it (a ten second job) this person made the change and also made unrelated changes. Now the user may proceed to the next step and bypass all the validation errors in fields they forgot to put values in.
If they'd run it, they'd have seen how broken it is. But they didn't run it, and just merged the branch.
Another underappreciated effect of dogfooding may be its reduction in bloated functionality.
If you're not dogfooding, you rely more heavily on a mental user model. Just conjecturing -- not only does that model diverge across your organization, but it could result in more top-down decisions about what a user wants, which probably creates more politics and friction all around your teams.
An issue I've encountered is that all of those non-developer people you mentioned generally don't eat the dogfood, even if they push the idea of dogfooding themselves.
They assume (or don't even pause to think about it) that developers eating the dogfood is enough.
At larger companies, "eating your own dogfood" only works well if people with power to make roadmap and time-allocation decisions also eat it.
Dogfooding was popularised by Microsoft, which AFAIK is still doing it, but it seems these days it's more like they're just being force-fed the dogfood without having any actual power to change it.
One of the main problems with Facebook's metaverse was precisely that none of the developers working on it actually wanted to use it, to the point where it had to be mandated by management.
Like, guys, when even the people who are making your product don't want to use it, this counts as data about how successful the product will be. If you have to force paid employees to use the product, then you'd have to pay users to use it too: there's no profit margin here, there's no user base.
Dogfooding is good, but I wonder to what extent the problems that programmers have (which get solved by software that programmers use themselves and can easily evaluate) are already solved pretty well by open source programs. I mean, imagine trying to sell a compiler. Good luck.
If you want to sell software, maybe one of the biggest markets to play in is software that programmers don’t find interesting to write and use?
There’s clearly not a 100% overlap between problems that programmers find interesting and open source projects. But it is applying a not-so-favorable filter, right?
It's maybe no coincidence that some of the biggest companies that actually sell software (as opposed to user data) that I can think of are Epic, Salesforce, and Intuit. Imagine spending your evenings on volunteer work for a CRM system, bleh.
Sounds obvious in theory, but the majority of applications are targeted at a very specific audience, e.g. banking, freight forwarding, even CRM. Not to mention that if you work at a mid+ size company, you'll be working on just a piece of the application. Good luck trying to use that in your day-to-day life.
If you're writing software to automate a business process, you're well advised to spend some time learning how to do the job you're automating, so you have some idea of whether what you're making is shit or not.
> For example the telephone system and the Internet are both fundamentally grounded on software developed using a waterfall methodology
Is this true? I can’t speak for telco, but I thought the internet in particular was developed incrementally, with a lot of experimentation. I mean, yes, the experimentation resulted in RFCs and STDs. But I thought these generally came after the software was working. And as someone who has implemented a few RFCs, I would not say my approach was remotely waterfall.
Indeed my perhaps incorrect version of events is that the waterfall approach is represented by the big loser in telco, the ISO OSI.
> Here’s a little-known secret: most six-nines reliability software projects are developed using a waterfall methodology.
I've designed and deployed Tier 1 services for a Big Tech company, and here's a little-known secret: when nothing changes, our reliability is higher than six-nines.
Last year I measured our uptime during Black Friday for fun. Our error rate was measured in scientific notation because the number was so small. We didn't do any deployments or changes during that period.
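For scale, here's a back-of-the-envelope sketch of what those nines translate to in allowed downtime per year (plain arithmetic, not tied to any particular monitoring setup):

    # Rough yearly downtime budget for N nines of availability.
    SECONDS_PER_YEAR = 365 * 24 * 3600

    for nines in range(3, 7):
        unavailability = 10 ** -nines                 # e.g. six nines -> 1e-6
        budget = SECONDS_PER_YEAR * unavailability    # allowed downtime in seconds
        print(f"{nines} nines: ~{budget:,.0f} s/year of allowed downtime")

At six nines, the whole year's budget is roughly half a minute of downtime.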
When you operate in a steady state it's easy to achieve zero errors, and most downtime comes from random failures in hardware, i.e. servers crashing or network blips (which, operating at scale, are relatively common).
So my and others' personal experience is that most outages are due to changes in the software, dependency outages, or the rare large-scale event that completely kills your SLA (e.g. a whole AWS region is down). Taming these is the essence of reliable software.
Whoever tells you that the best software is made using waterfall methodologies, from a fixed and never-changing set of specifications, lives in a fantasyland alien to the vast majority of developers.