
I recently read Atul Gawande's "The Checklist Manifesto", which covers applying aviation-style checklists to surgery. The outcome was a 30% reduction in post-surgical complications.

From reading this, I see how bad most software process checklists are. They tend to be laundry lists mixing obvious minutiae with vague, uncheckable goals. They are not practical, executable tools but well-meaning documents that are ignored at precisely the error-prone moments when a checklist has the most potential impact.

A checklist for a good checklist:

1. Plan for a specific trigger that gives a single pause point of 30-90 seconds, at which the checklist is opened and executed

2. Decide whether it is a DO-CONFIRM or READ-DO list

3. Design to engage the brain rather than switch it off

4. 5-9 items at most

5. Don't include any action that is rarely forgotten, i.e. the criterion for inclusion is the probability and cost of error. This is very different from listing the most important steps or attempting to fully document a process.

6. Regularly test and measure the checklist for effectiveness

I would say Fog Creek's article falls in the "read once" training-material category and is not a practical checklist. For a real checklist, I would interview the team, examine recent commits, and look for the 5 most serious real issues you are experiencing that you need to eliminate from future code. That is now a review checklist you can mandate, measure, and iterate on.
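As a rough illustration (the items below are hypothetical, not drawn from the book or the article), a reviewer-facing DO-CONFIRM list built that way is small enough to encode and track directly, which also gives you the measurement from point 6:

    # Hypothetical DO-CONFIRM review checklist: the review is done from
    # memory first, then each item is confirmed at the pause point.
    REVIEW_CHECKLIST = [
        "User input reaching SQL is parameterized",
        "New error paths log enough context to debug",
        "Public API changes are reflected in the docs",
        "Timeouts and retries set on new external calls",
        "Schema migration is reversible",
    ]

    def confirm(checklist):
        """Walk the list at the pause point; return what wasn't confirmed."""
        missed = []
        for item in checklist:
            if input(f"CONFIRM: {item}? [y/n] ").strip().lower() != "y":
                missed.append(item)
        return missed

Items that are never answered "n" over a stretch of reviews become candidates for removal, per point 5.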




For those who don't want to read the whole book, or want inspiration to read it, here's the New Yorker article on which the book is based. It's long, but worth reading: http://www.newyorker.com/magazine/2007/12/10/the-checklist


I also came across Gawande's Checklist Manifesto when creating a usability checklist for websites: https://userium.com/ As Freakonomics said: "The book’s main point is simple: no matter how expert you may be, well-designed check lists can improve outcomes".


Userium seems useful, but philosophically different from Gawande's work: more educational and complete, less aimed at someone who is already an expert at this stuff.

I think both have their place, but people usually think of the former when they discuss introducing checklists into a process, and that's usually inappropriate unless you have people performing tasks outside their domain.


Practically speaking, I think the parts of code review that could actually be covered properly by a checklist could also just be done by software. The parts that are "fuzzy" shouldn't really be in a checklist at all.


How familiar are you with the book? I would recommend reading it, if you haven't.

The point of a checklist is to complement expertise. Anything that can be automated should of course be automated and shouldn't be on the checklist. What goes on the checklist are things that humans have to do, that are bad when overlooked, and that sometimes get overlooked. It strikes me as likely that, in most situations, there are some such things for code reviews - although what they are may differ from context to context. Trying to find patterns in them would be a fantastic project.


Perhaps automated tasks could be reduced to a single checkbox: "Has this task been performed?"

For example, instead of having a check for whether braces are in the correct spot, have a check for whether 'astyle' or a similar code formatter has been run.
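As a sketch of that kind of gate (assuming astyle's single-file mode, where it reads stdin and writes the formatted result to stdout; the source tree layout here is hypothetical):

    # If the formatter's output matches the file on disk, the single
    # "has the formatter been run?" box can be ticked automatically.
    import pathlib
    import subprocess
    import sys

    def is_formatted(path):
        source = path.read_text()
        result = subprocess.run(["astyle"], input=source,
                                capture_output=True, text=True, check=True)
        return result.stdout == source

    unformatted = [p for p in pathlib.Path("src").rglob("*.cpp")
                   if not is_formatted(p)]
    if unformatted:
        print("astyle has not been run on:")
        for path in unformatted:
            print(" ", path)
        sys.exit(1)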


That can work, but ideally, a CI server would actually run all of those, leaving only a handful of things that require human judgment in the checklist.


Yeah, from that perspective this isn't a very good checklist.


I find that most of the defects I come across are things which have simply been forgotten. So whilst a short list of 5 things might stop common mistakes, it wouldn't help with catching omissions.


A checklist for the reviewer only needs to cover what reviewers are overlooking, not everything that authors are overlooking.

For example, authors regularly forget to remove commented-out code, but a reviewer gets slapped in the face by it, so "no commented-out code" belongs in a policy document; there's no need to add it to a checklist you expect a reviewer to consult while reviewing code.

If you are designing for professionals already trained in your policies, you can concentrate on the shorter list of high-impact things reviewers are found to overlook, which are probably project-specific subtleties like how argument sanitization or error handling needs to be done.


I wonder how well a phased checklist would work for training on policies. Start out with everything, gradually drop items in order of "least likely to be forgotten", and reintroduce them if you find that they are.
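A minimal sketch of that phasing rule (the data model and threshold are made up for illustration):

    # Every policy item starts on the training checklist. An item is
    # retired once people stop forgetting it, and comes back if review
    # data shows it being missed again.
    from dataclasses import dataclass

    @dataclass
    class Item:
        text: str
        recent_misses: int = 0  # times forgotten in the last N reviews

    def active_checklist(items, threshold=1):
        return [i for i in items if i.recent_misses >= threshold]

    policies = [
        Item("No commented-out code", recent_misses=0),  # retired
        Item("Sanitize arguments at the API boundary", recent_misses=3),
    ]
    for item in active_checklist(policies):
        print("[ ]", item.text)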



