I think Clean Code is one of those things that is a reasonably good place to start, but as a text it is a little infantilising and overly prescriptive for people with significant experience. It ignores the trade-offs its recommendations necessarily make, in favour of presenting something that appears self-consistently opinionated. At a certain level of critical thinking it becomes clear that the recommendations in Clean Code are in fact opinions resting on broadly subjective criteria: it is something of an influencer tome more than a book about engineering or developer psychology. I would prefer a methodology more rooted in objectivity, with clear, honest evaluations of the trade-offs involved. For instance, Clean Code broadly shies away from the negative aspects of atomising your code base, which Codin' Dirty touches on.
In this sense I think there are criteria we could talk about more concretely: code density and code sparsity, particularly where coherent functionality ends up split across many code files and over many more hundreds of lines than it would have occupied in a few larger methods. This is not to say that what Codin' Dirty argues is necessarily right in an absolute sense; I would not be so prescriptive. But when organisations are thinking about how many developers might work on a project, these kinds of metrics should probably be taken into consideration, because I think small teams are, generally speaking, going to work better on denser code bases with larger methods, while larger teams may work better where the code base is more spread out, so that they are less likely to tread on each other's toes.
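To make the density/sparsity contrast concrete, here is a small sketch of the same validation rule written both ways. The function names and the rule itself are my own invention for illustration, not anything from either book:

```c
#include <string.h>
#include <stddef.h>

/* Dense style: the whole rule lives in one function, readable
 * top-to-bottom without jumping between definitions or files. */
static int validate_username_dense(const char *name) {
    size_t len = strlen(name);
    if (len < 3 || len > 20) return 0;           /* length check   */
    for (size_t i = 0; i < len; i++) {           /* character check */
        char c = name[i];
        if (!((c >= 'a' && c <= 'z') ||
              (c >= '0' && c <= '9') || c == '_'))
            return 0;
    }
    return 1;
}

/* Sparse style: the same logic atomised into small helpers, as a
 * Clean-Code-style refactor might prescribe. Each piece is tiny,
 * but the reader now traverses three definitions to reconstruct
 * one rule; spread across files, the cost grows. */
static int has_valid_length(const char *name) {
    size_t len = strlen(name);
    return len >= 3 && len <= 20;
}

static int is_allowed_char(char c) {
    return (c >= 'a' && c <= 'z') || (c >= '0' && c <= '9') || c == '_';
}

static int validate_username_sparse(const char *name) {
    if (!has_valid_length(name)) return 0;
    for (size_t i = 0; name[i]; i++)
        if (!is_allowed_char(name[i])) return 0;
    return 1;
}
```

Both versions accept and reject the same inputs; the difference is purely where the logic sits, which is exactly the axis a team-size decision would weigh.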
In general I get a sense of dishonesty in the world of Clean Code and its ilk, where we like to pretend that "clean" code is not only more maintainable but also more performant and just better. In fact these claims are, in general, not true. The most optimised code is pretty much the complete antithesis of Clean Code's recommendations: it usually has as little indirection as possible and demands a high level of expertise to understand.
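A small sketch of that antithesis, using the well-known bit-parallel population count (the names are mine; the SWAR constants are the standard Hacker's Delight trick):

```c
#include <stdint.h>

/* "Clean" version: a named counter, one obvious step per iteration. */
static int popcount_clean(uint32_t x) {
    int count = 0;
    while (x) {
        count += x & 1u;   /* add the lowest bit */
        x >>= 1;           /* move to the next one */
    }
    return count;
}

/* Optimised SWAR version: no loop, no branches, no helpers, no
 * indirection -- just magic constants that sum bits in parallel
 * and demand familiarity with bit-twiddling to read at all. */
static int popcount_swar(uint32_t x) {
    x = x - ((x >> 1) & 0x55555555u);
    x = (x & 0x33333333u) + ((x >> 2) & 0x33333333u);
    x = (x + (x >> 4)) & 0x0F0F0F0Fu;
    return (int)((x * 0x01010101u) >> 24);
}
```

The fast version is shorter, branch-free, and far quicker, yet it violates nearly every Clean Code readability prescription at once; maintainability and performance are traded off here, not aligned.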