RIP TDD (facebook.com)
122 points by platz on April 29, 2014 | 91 comments



This... is not Kent's finest moment. I get what he's saying, but the snark doesn't really help, and doesn't actually rebut any of the points that DHH was making against TDD.

Testing does tend to have a negative impact on API design. If we were to make a list of positives and negatives, this one would go in the "negative" column. Listing out TDD's positives does not invalidate that negative.


> This... is not Kent's finest moment.

It's really not. Objectively, this wouldn't even be within artillery range of the front page if it hadn't been written by a Big Name(tm).

I even have what borders on a fundamental disagreement with his last point:

> Anxiety. Perhaps what I'll miss most is the way TDD gives me an instantaneous "Is Everything Okay?" button.

This point of Kent's feels particularly weird given that DHH had specifically criticized the TDD artifact that is unit test worship & mock abuse. That workflow practically embodies the notion of a "false sense of security". It might do wonders for personal anxiety, but I have doubts about its ability to rigorously test a system :/


> It's really not. Objectively, this wouldn't even be within artillery range of the front page if it hadn't been written by a Big Name(tm).

This is doubly true of the DHH article Kent is rebutting.


No, it was a well written article that made its point articulately. This post is snarky and only briefly touches on some positives of TDD without addressing the points raised by DHH. I don't know either of these guys by name, but I upvoted DHH and not this post because it is simply better written.


"Testing does tend to have a negative impact on API design"

I think this sentiment is the single root cause of the current back and forth. In my experience this is completely untrue. For DHH and others there appears to be some sort of culture of dogmatically polluting APIs for testing which I'm not familiar with.

Everything else seems to be talking around the real issue and finger pointing.


Let's take as an example dependency injection. The genesis for DI was to facilitate the testing of objects that are dependent on objects that are not easily constructed, such as DAOs. And the whole reason this was done was so that these classes, the ones that have a DAO dependency, could be more easily and quickly tested.

So, in order to use (constructor) DI you would have a constructor that looks like this:

    public InvoiceController(InvoiceDao invoiceDao) {
        this.invoiceDao = invoiceDao;
    }
The key point to remember here is that this has been done so that in our unit tests we can inject a fake InvoiceDao. Now, let's say that our InvoiceDao class has one and only one constructor:

    public InvoiceDao() {
        // sets up connection parameters, etc.
    }
Then the constructor for InvoiceController could be simplified to:

    public InvoiceController() {
        this.invoiceDao = new InvoiceDao();
    }
This is quite a bit cleaner from an API perspective, and that is the entire point. This is only a simple example. For more complex classes, with multiple dependencies, it really becomes cumbersome. What if InvoiceController also needed access to ReporterDao? Well, then you need to add that as a parameter to the constructor as well. Your API is made more complicated, all in an effort to make testing possible.
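As a concrete sketch of that trade-off (FakeInvoiceDao and the two-argument constructor below are hypothetical illustrations, not code from either article): every dependency that exists mainly so tests can substitute a fake widens the public constructor.

```java
// Hypothetical sketch: stub types stand in for the real DAOs.
class InvoiceDao {
    public java.util.List<String> findAll() { return java.util.List.of("real invoice"); }
}

class ReporterDao {}

// A test fake is only possible because the dependency is injectable.
class FakeInvoiceDao extends InvoiceDao {
    @Override
    public java.util.List<String> findAll() { return java.util.List.of("fake invoice"); }
}

class InvoiceController {
    private final InvoiceDao invoiceDao;
    private final ReporterDao reporterDao;

    // Each new dependency widens the public constructor signature.
    public InvoiceController(InvoiceDao invoiceDao, ReporterDao reporterDao) {
        this.invoiceDao = invoiceDao;
        this.reporterDao = reporterDao;
    }

    public java.util.List<String> listInvoices() { return invoiceDao.findAll(); }
}
```

A unit test can now do `new InvoiceController(new FakeInvoiceDao(), new ReporterDao())`, which is exactly the testing benefit being paid for with the wider API.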

This does not, of course, invalidate the benefits of unit testing, which are many. But it does expose a negative that is not frequently acknowledged, and that's what DHH is talking about, and what Kent has failed to address.


"The genesis for DI was to facilitate the testing of objects that are dependent on objects that are not easily constructed, such as DAOs. And the whole reason this was done was so that these classes, the ones that have a DAO dependency, could be more easily and quickly tested."

I disagree with this central premise. If you have introduced DI simply for testing reasons then you have missed the point. The reason for DI is that a very common point of change with software is at the interaction between clients and services. Tight coupling at this point of change is usually a design smell. That DI enables automated testing is a by-product of the loose coupling, not the point of it. That TDD drives you to this sort of solution, all other things being equal, is one of the advantages of TDD.

If DI has become cumbersome and is making your code hard to reason about/read/maintain etc then blame DI, not the testing. You also might want to consider the higher level architecture of your solution if you are finding yourself with lots of complicated dependency chains.


"The reason for DI is that a very common point of change with software is at the interaction between clients and services. Tight coupling at this point of change is usually a design smell."

The problem with TDD is that if it is NOT a common point of change, TDD adherents often say that you should introduce DI anyway to make testing easier. "One day you might want to change out the whole database, so you should abstract out the DB into a blah blah blah pattern, with a layer of the rah rah rah pattern between to ..."


> The problem with TDD is that if it is NOT a common point of change, TDD adherents often say that you should introduce DI anyway to make testing easier.

To the extent that's a problem, it's not with TDD, nor with DI, but with the particular people giving the advice. Why do people think that "some of the people who follow X give bad advice Y" is a criticism of X when Y is not something that X requires?



This is a very insidious comment. You have actually committed a logical fallacy yourself, since your position is "All TDD proponents do X". When someone points out, no, "Some TDD proponents do X, and many of us TDD proponents think X is bad practice", you accuse them of the No True Scotsman fallacy, when in reality your position is the one claiming a universal property about TDD.


Tell me who is a good representative of TDD that has written a lot about it. Where is the canon of TDD, so to speak?


Not sure if I'm qualified to appoint a canon, but I can tell you of the books that were very influential to me with regards to TDD:

Test-Driven Development by Example, Kent Beck. This came out in 2002 and I haven't read it since then but at the time it really influenced a lot of my TDD thinking. I expect if I read it now I'd have some complaints based on my decade of experience with the process.

Growing Object-Oriented Software, Guided by Tests, Freeman & Pryce. Much more recent and includes more modern thinking about TDD with acceptance/integration tests.

Working Effectively with Legacy Code, Michael Feathers. A great book for dealing with TDD when you aren't doing green field development. He is a bigger fan of Mock Objects than I am, but he illustrates some examples when mocking is the most appropriate response to the current requirements.

All that said, TDD is like any other software methodology. It is a set of patterns and principles that each practitioner interprets for themselves. At its core, though, it's pretty simple: automated verification of specifications is as important as implementation of the specification for any sufficiently long-lived, complex software system. Writing those verifications first provides design, process, and management advantages over the historical process of writing them last, and has a high correlation with well factored code. That's it: no doctrines about unit vs integration tests, mocking out DB access, or 100% code coverage.


"and management advantages over the historical process of writing them last and has a high correlation with well factored code."

Where is the data supporting this claim? I don't believe it is true.

"That's it, no doctrines about unit vs integration tests, mocking out DB access or 100% code coverage"

Why do you think so many of us feel that it is an ideology? Are we seeing ghosts and imagining the whole thing?


"Where is the data supporting this claim? I don't believe it is true."

This is one central premise of TDD and has been neither proven nor disproven yet. What we can say is that previous software methodologies were lacking, and we cannot prove or disprove their superiority over TDD with regard to this statement. If you distill all debates about TDD down to their essence, they revolve around this premise.

I am fine with someone being skeptical of this claim, but I would prefer if they offered A) a measure we can use to test this hypothesis and B) a contrasting methodology that performs better with regard to the measurements provided in A. The single biggest outstanding problem in software engineering is finding a metric by which we can judge software quality objectively. Because that is such an elusive goal, other less precise proxies for software quality have been proposed to stand in for the more complex metric. TDD hangs its shingle on "testability". Even though this has obvious defects, I've seen no evidence for any other objective measure as a more precise indicator of software quality, and quite a few advantages to "testability". Namely, it is measurable.

"Why do you think so many of us feel that it is an ideology? Are we seeing ghosts and imagining the whole thing?"

No, I'm sure that you have encountered well meaning but flawed practitioners of TDD. Your skepticism of their process doesn't bother me. What bothers me is your (and DHH's) painting of all TDD folks as cultists. I don't believe this stuff because Uncle Bob told me to believe it. I believe it because my experience shows that rigorous use of TDD practices trend toward better software than a lack of rigor outside of an objective measure for software quality.

If you have an alternate rigorous approach that you believe trends (or better yet is provably) better in software quality, by all means outline it. I know that TDD is flawed and am happy to find its successor.

Let me ask you this, prior to the rise of TDD, how prevalent do you think automated testing was? If it was very prevalent why is it only after the rise of TDD that automated testing became a central part of any build workflow and the entire concept of Continuous Integration/Deployment developed?

In my experience, automated testing prior to the rise of TDD was not widespread. But maybe I was seeing ghosts and imagining that.


Neither of us ever said all TDD folks are cultists. We are both tired of being told that we are absolutely doing it wrong if we don't practice TDD. For an example of people telling us this, please see this talk by Bob Martin and jump to the 58 minute mark. http://m.youtube.com/watch?v=WpkDN78P884

Bob is not an obscure figure by any stretch of the imagination. You have admitted to having no real data to back up the claim that writing tests first leads to better code, yet we have Bob here telling us we are absolutely wrong if we don't follow his religion.


> Why do you think so many of us feel that it is an ideology?

Probably because you've run into cargo cult practitioners that treat it that way -- the same as every methodology that's become known outside of a narrow initiating group has -- and human perceptual biases tend to overemphasize and overgeneralize the most extreme examples.


Is Bob Martin a cargo cult practitioner?

jump to the 58 minute mark. http://m.youtube.com/watch?v=WpkDN78P884


That wasn't the central premise. The central premise is that there is a trade-off in class design when doing unit testing.


And I say that if you are making design decisions BECAUSE of testing, you are doing it wrong.

The premise of TDD (and one I've found true) is that there is a natural correlation between easily tested code and well factored code, not that you should be compromising your design for testing. Thus the original comment about bad api design as a result of TDD rings false with me.

You have apparently encountered cases where people have introduced unnecessary DI in the name of ease of testing when in reality they should have been fixing their central design problems. Software with long dependency chains and complex IoC containers are not easy to test, though looking through the comments it seems that many people have fooled themselves into thinking they are.


> Let's take as an example dependency injection. The genesis for DI was to facilitate the testing of objects that are dependent on objects that are not easily constructed, such as DAOs.

Really? Because I've always seen DI advocated for loose coupling, with testability just coming along for the ride the way it does in general with loose coupling. I've rarely seen it advocated for testability independently of the broader architectural motivation for loose coupling.


But that's not true at all. The genesis for DI was nothing close to that (as I learned it). DI's main purpose isn't testing, it's flexibility. DI isn't there solely for testing, or if it is, you're probably missing the point of it.


Right. However, in many DI-heavy apps I've seen, nobody ever makes use of that flexibility. Otherwise it's just extra, unnecessary complexity, sitting around in verbose XML files and annotations, causing code clutter.

These are "enterprise" Java apps with Spring, etc.


I personally don't see how a couple parameters in the constructor are adding so much complexity that you'd be better off without DI.

A good DI container can make it much easier to follow a pretty good amount of general best practices. Single Responsibility, DRY, and Open-Closed Principle to name a few.


This has been my experience as well. The core of Spring -- IoC/DI -- solves a problem that is largely of its own making.

I've had better luck and cleaner code without it than with it.


You miss the point of DI.

DI is solving a problem common to non-trivial OO apps - where do you create the objects? Every sane approach to this tends to lead to a form of DI.

Spring is a bad example of DI because it encourages an implicit, DI-everything style. Take a look at Guice or other frameworks where bindings are explicit.


"Testability" tends to add unnecessary (by KISS) configurability and indirection to code composed out of sub-units.

Because a testable unit of code must have its inputs and collaborators controlled, all the inputs and collaborators need to be configurable or replaceable, even when such configuration is unnecessary to the business value aimed at.

In statically typed languages like Java, this tends to exhibit as a proliferation of interfaces that only have a single implementation, namespaces with too many identifiers publicly exposed, and lots of open extension points - which hurts versioning. It tends toward requiring complex IoC containers with associated config. Code ends up filled with boilerplate glue and over-abstraction (factory-factories etc.), and much harder to read, because IDEs can no longer follow the link between a method call and its implementation - the method call is all too often dynamically resolved via an interface implementation.
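A minimal sketch of that single-implementation-interface pattern (all names here are hypothetical): the interface below exists only so tests can supply a second implementation, and every call site now resolves through it.

```java
// Hypothetical sketch: an interface with exactly one production
// implementation, introduced purely so that tests can substitute a mock.
interface InvoiceRepository {
    int count();
}

class SqlInvoiceRepository implements InvoiceRepository {
    @Override
    public int count() { return 0; } // stands in for a real DB query
}

class InvoiceService {
    private final InvoiceRepository repository;

    // "Jump to implementation" in the IDE now lands on the interface,
    // not on SqlInvoiceRepository.
    InvoiceService(InvoiceRepository repository) {
        this.repository = repository;
    }

    boolean hasInvoices() { return repository.count() > 0; }
}
```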


> "Testability" tends to add unnecessary (by KISS) configurability and indirection to code composed out of sub-units.

You mean "White Box Testability". Don't forget Black Box testing is agnostic to the implementation.


To be sure, "negative" is the weak point in the statement/sentiment. It's a value judgement subject to where the person is drawing a line between the goods and the bads (methodology always unstated), so the argument wallows in indeterminacy.

As long as people throw out a simple "the real issue" without explaining what they mean by it, it's inevitable. Pretty soon, every player is talking past one another, someone writes a blog post purporting to lay out the landscape of all sides, someone counters rap-battle style, someone else posts to HN "Ask HN: your favorite reasons to TDD," which provides the foundation for an ebook, after which three people post "Show HN: site to plan an optimal TDD strategy." After that, we wait for someone to posit a successor and the cycle starts over again.


This is exactly an argument I am having at work right now, from someone that hates dependency injection.

If I have a controller that calls a service, and a service that calls a dao, then the service needs the dao injected:

  public Service(Dao dao) {
    this.dao = dao;
  }
Which means (in the absence of autowiring) that the controller needs to be "aware" of the dao:

  ...
  service = new Service(new Dao());
  result = service.serviceMethod(param1, param2);
  ...
And that is objectionable because of putting too much of a burden on people writing controller logic, because of separation of concerns, etc. I am finding it difficult to argue against that point.

I get that this goes away if using a proper inversion of control container so one can autowire, but that is not always practical.


> And that is objectionable because of putting too much of a burden on people writing controller logic, because of separation of concerns, etc.

No, the controller needs to be aware of a method of creating new services. Depending on the specific language and other constraints, that might be a factory function passed into the controller's constructor, a factory object passed into the controller's constructor, a reference to a service locator object, or some other mechanism. Specific direct knowledge of the DAO class is generally neither necessary nor desirable.
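One way to sketch the factory-function variant in Java (using `java.util.function.Supplier`; the class names mirror the hypothetical example above): the controller knows how to obtain a Service, but nothing about the Dao.

```java
import java.util.function.Supplier;

// Hypothetical sketch of factory injection: the controller depends on a
// way to obtain a Service, not on the Dao that the Service needs.
class Dao {}

class Service {
    private final Dao dao;
    Service(Dao dao) { this.dao = dao; }
    String serviceMethod() { return "result"; }
}

class Controller {
    private final Supplier<Service> serviceFactory;

    Controller(Supplier<Service> serviceFactory) {
        this.serviceFactory = serviceFactory;
    }

    String handle() {
        // No mention of Dao here; the wiring lives at the composition root.
        return serviceFactory.get().serviceMethod();
    }
}
```

The wiring `new Controller(() -> new Service(new Dao()))` then happens once, at application start-up, rather than inside controller logic.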


How is that a burden? The controller has to know about the methods of the DAO no matter what. The debate you're having seems to be "what is the best way to get a concrete instance of the DAO into the controller?" The argument being made for constructor/accessor injection is that it makes testing easier. The argument against it is that it makes the API more complicated than it would otherwise be.


If you don't think that the AngularJS API is Vietnam then I don't think I'd be able to change your mind regarding this.


You're implying that AngularJS's API is Vietnam? I get that the DI system is complicated, but in my mind, it's a very well thought out approach which relies minimally on global objects. I'd take Angular's approach over jQuery's any day.


Yes, I am.

It might be clever but it's unintuitive and as a consequence developers (even otherwise experienced ones) who have not yet mastered AngularJS will write the kind of mess that is painful to clean up later. This is more or less true for any technology, but as a general rule the more complicated or different the concepts are, the less likely it is that people will get things right the first few times. And this gets worse as the application gets larger and / or more people get added to the team. A simpler approach with less cleverness would help.


I don't know the first thing about AngularJS.


> Testing does tend to have a negative impact on API design.

Not all testing has a negative impact. IMO, certain forms of testing have a positive impact, because they provide feedback and allow fearless refactoring.

White Box testing with many mocks and other affordances has a negative impact.

Black box testing does not require many affordances.


Is this a joke, or is he just unable to separate "tests" from "test driven development"? Most of these points have nothing to do with "tests first".


>Is this a joke or he is he just unable to separate "tests" from "test driven development"?

I'm not sure I've ever seen a debate about TDD without at least one TDD advocate conflating testing in general with TDD…


Well, you see, we've wandered into a dogmatic war. This is like the early days of Vim vs Emacs. There will be thousands upon thousands of words expended, because being verbose means you're right. Nobody will listen to the other side because they're convinced they're right. We'll also see strawman effigies burned to the ground, and one side claiming victory because of it. I feel that this is long from over.


He ignores the points DHH makes, and then replies with his own snide talking points like a politician on a Sunday morning talk show.


DHH largely creates a straw man: the fanatical, religious TDD devotee who blindly worships at the Altar of Unit Tests. For an enormous majority of the wider development community (who do no testing whatsoever outside of click-the-thing-and-see-if-it-works), TDD is an easy to understand, easy to get started way to improve the quality of your code. I haven't met any of these supposed religious zealots DHH is railing about, and I worked for an "Agile Consultancy".

Also, DHH argues against plenty of other best practices / design patterns and is considered incorrect in many of these cases (his "only use the Rails classes" objections to having a service layer most famously). It just so happens that when he bashes something that a lot of developers don't like to do already (write unit tests), the pitchforks rally behind him.

It's the same sort of thing when people post articles on HN bashing pair programming, or open workspaces. We get it. Some of you really don't like change, or anything "agile". Now please stop beating the dead horse already.


> DHH largely creates a straw man: the fanatical, religious TDD devotee who blindly worships at the Altar of Unit Tests.

That's not a straw man: that's real life. I've worked with people like this. They're fairly common in TDD shops. You even see them on Hacker News making spurious claims like how TDD would have prevented the <security hole> in <framework> (never mind the fact that <framework>'s dev team is heavily invested in TDD).


I've seen code bases designed from the tests first and they were pleasant to work with.

I've worked jobs where it was my responsibility to salvage undocumented, untested legacy code and extend it with new features while keeping it running.

The truth is rarely in the extremes being characterized in these blog posts and discussion threads.

The only kind of test code I've seen that required far too many mocks and dependency injections tends to result from poor design choices and from writing the tests after the functionality.

If TDD as a practice doesn't work for you, that's fine -- but I think it's disingenuous to offer no alternatives. Integration testing isn't an alternative; I use them in concert with a well tested code base. And I use whole-system tests too.

DHH works in a small corner of the field. TDD may not work well for Rails applications is all I've been able to glean from his posts. I think that's a battle for the Ruby/Rails community to have. It doesn't discredit TDD one bit.


> TDD may not work well for Rails applications is all I've been able to glean from his posts.

And even this is not the case - the Ruby/Rails community is heavily involved with TDD/testing frameworks/etc, because it is so easy to do and because unit tests can help bring some of the assurances you miss from static typing back into your systems.

The issue DHH has with TDD is, imo, all wrapped up in the issues he has with design patterns, service layers, etc. Doing TDD on a business domain object that subclasses ActiveRecord::Base forces you to mock/stub all sorts of DB calls (which is irritating), which in turn highlights the fact that, contrary to what DHH says, it's not always a very good idea to stick all of your business logic in the model layer.

DHH isn't having any of it, but by no means is he representative of the Ruby/Rails community here. (see: https://www.destroyallsoftware.com/screencasts/catalog/what-... for just one of many examples)


I really like Gary's videos. I wish I had stopped to meet him at Pycon this year (but I didn't know who he was until after his talk! Derp).

Thanks for pointing that out though; I didn't intend to assert

  dhh's opinion <=> rails community's opinion


Out of curiosity, where could/should business logic go in MVC-ish architectures?


Normally in the model; but often in Rails the "model" gets conflated with the "persistence layer." Many of us who develop in Rails feel that the persistence layer should have the single responsibility of persistence, and would like to use something like a DAO pattern there, keeping business objects / logic in pure Ruby classes.

Note that this is not necessary for small, simple Rails applications: by all means, if you're writing a blog app, stick everything in AR::Base and don't over-design. But once you start working on large-scale, non-trivial Rails apps, it becomes very painful to have all of your tests running against your database, for example.

This is probably something of an idiosyncrasy of Rails, and may have little relevance to other frameworks / languages.


I've worked for one of the biggest agile consultancies around, and while most of the developers write unit tests, I never met anyone who started frothing at the mouth if we wrote code before tests. I just don't buy the pervasiveness of these TDD-devotees to the degree that justifies all of this backlash, DHH articles, HN comments, etc.


You probably got lucky.

But yeah, they exist


And I've worked in TDD shops for years and never seen the fanatical TDD devotee. I have seen lots of developers that will use any excuse they can find not to provide automated tests for their code.


I've met plenty of these devotees. I'm not sure that they practice what they preach, but I have felt the wrath of their judgment, to the point that I hate "admitting" that I don't believe in TDD. I, for one, am pleased that DHH has made my world a little brighter.


> I have felt the wrath of their judgment, to the point that I hate "admitting" that I don't believe in TDD

I'm not sure if you're exaggerating a little, but I think that having colleagues whose wrath makes you hesitant to express yourself is an entirely different issue from that of test-driven development or not. Development is a fairly easy field in which to demonstrate that you're at least as correct in your methodologies as another developer: just produce quality code. This feels (as these arguments very often feel to me) like a personality issue more than a methodological one.


I think you mis-characterize people's concern for pair programming and open workspaces. That alone makes me wonder how objectively you can look at this topic. TDD has become a foundational tenet of agile, so any criticisms of it naturally get seen as a criticism of agile (similar to open spaces and pair programming).

There's more than one way to skin a cat, and it isn't always fear of change that keeps people from changing.


> TDD has become a foundational tenet of agile

When I see "agile" and "tenet" in the same sentence, I think "government shop". An explicitly flexible premise (Agile) does not have "one true way" to do anything.


> TDD has become a foundational tenet of agile

Dear god, please take the straw man out back and shoot him already.

There really is no way to have an intellectually honest discussion about these subjects, is there?


"For an enormous majority of the wider development community" => noise generated does not equal the size of the community.


Agreed. It is interesting how "up in arms" so many people are getting. I've never used TDD and have been very successful both in the startup world and the enterprise world. I used to think I was "inferior" and had to keep quiet about my lack of TDD. At least DHH has "put a voice" to our kind.

I span that weird space between product and engineering though. So most of the time I'm hacking in something like "prototype" mode, where TDD just makes no sense. I approach it more like sculpting: I'm slowly hacking away at this block of marble to produce an elegant, "beautiful" product. Once it starts to take shape I can create tests and specs to capture the learnings and creations that I've made.

Tests first? It just doesn't work for me.


Indeed. However, this doesn't mean that TDD is dead, which is what a lot of people who do not like TDD seem to believe. I know you might not have said this, but this whole movement is misguided.

If anything, if the adherents of TDD were to properly abandon it, then that would be TDD dying because they would have come up with something better or moved on.


I'm not 100% sure that this post is supposed to be sarcastic (although I am about 60% sure.) The issues are certainly legitimate, and TDD does define one approach (of many possible approaches) for mitigating them.


Look at Kent's eight bullet points: half of them are unrelated to the DD part of TDD. Coming from a man who should know the difference between having automated tests for your software and partaking in test-driven design, I must assume that he had a good reason for conflating the two.

Hacker News is not the audience for this post.

Kent Beck is doing damage control.

DHH lives in a developer community that has adopted unit tests to the point of feeling ashamed about untested code: testing is more than just a practice, it's a culture.

Kent Beck lives in a world where the mere existence of unit tests is a champagne-worthy surprise.

There are people out there who are not yet convinced of the benefits of tests, let alone test-first or test-driven design. Kent Beck is a missionary, bringing them the gospel of automated testing. Can you imagine the impact that a piece like DHH's would have on his efforts?

A developer with an incomplete understanding of software tests, reading a post by a big-shot recognizable name that one would expect (based on the Rails community's love for testing) to be a major proponent of tests, would take it as "Tests are actually a bad idea!"

This is not what DHH said. There is probably no one among us here who would understand it this way, and most experienced folks would just shake their heads at the ongoing back-and-forth and resume their position of "TDD is a tool in my toolbox, and I use it whenever it helps me."

Kent Beck wrote a piece for the lost soul who doesn't know the difference between TDD and automated testing, and who might become confused after reading DHH's opinion. This is no time for subtlety, for paying notice to the differences between testing methodologies, or for polite agreement with at least some parts of the "opposing" piece.

Let David work to bring the Rails community back from the "test all the things!" extremities it might have reached, let Kent work to bring the unenlightened masses out of the "tests are useless!" darkness wherein they have dwelt for so long, and let us accept that if we truly have the capacity to criticize what those two are saying, then they probably weren't talking to us in the first place.


Kent Beck is normally intelligent, concise and persuasive. This post is reactionary, and disappointing. I hope he follows it up with something a little more considered.


He has books and lectures on the subject so I'm not sure he has much more to say.


Jessica Kerr was discussing property-based testing (i.e. quickcheck, scalacheck) on a podcast recently, and I thought it was interesting she noted that the red-green-refactor cycle didn't make as much sense with property-based testing.

It's more "contemplative" - thinking about invariants of your software rather than designing for mocks and various injected components.

(although property-based testing is a bit less effective in languages that don't make type information available, to generate the input values)
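As a hand-rolled illustration of the idea (this is not quickcheck itself, just a sketch in plain Java with hypothetical names): instead of asserting one example, you generate many random inputs and assert an invariant over all of them.

```java
import java.util.Random;

// Sketch of a property-based check: the property "reversing a string twice
// yields the original string" is tested against many generated inputs.
class ReverseProperty {
    static boolean holdsFor(String s) {
        String twice = new StringBuilder(s).reverse().reverse().toString();
        return twice.equals(s);
    }

    static boolean check(int trials, long seed) {
        Random random = new Random(seed);
        for (int i = 0; i < trials; i++) {
            // Generate a random lowercase string of length 0..19.
            StringBuilder sb = new StringBuilder();
            int length = random.nextInt(20);
            for (int j = 0; j < length; j++) {
                sb.append((char) ('a' + random.nextInt(26)));
            }
            if (!holdsFor(sb.toString())) {
                return false; // a real framework would also shrink the failing input
            }
        }
        return true;
    }
}
```

A framework like quickcheck or scalacheck adds input generation from type information and shrinking of failing cases; the "contemplative" part is choosing the invariant rather than the examples.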



Yep! Although there's kind of a weird dynamic going on - the host gets defensive at times; I got the sense these two were grating on each other a little bit. Always enjoy the things Jessica has to say though.


I'm guessing this doesn't disappear anytime soon https://github.com/rails/rails/tree/master/activerecord/test


An indirectly related fixation that bugs me every time I see a Facebook link is this: why are notes, events, etc. appended with a string of digits?

Shouldn't the URL here be simply: https://www.facebook.com/notes/kent-beck/rip-tdd/ rather than https://www.facebook.com/notes/kent-beck/rip-tdd/75084019494...?

Does every element that exists on facebook.com get assigned a unique string like the one above?


Basically, the "/kent-beck/rip-tdd" portion is for humans (and/or SEO), so that we can look at the URL and have some idea of what it is, and the "750840194948847" is for their server to figure out what to actually show. You can change the for-humans part to whatever you want and still see the same note: https://www.facebook.com/notes/foo/bar/750840194948847

The advantage of only looking at the number is that if Kent Beck's name changes or the title of the note changes, they can just start linking to a new URL with a changed human-part (but the same numeric identifier) and the old URL will still work. The "correct" way to do it would be to track and validate every version of the human-part for every URL, redirect old versions to the newest version, and 404 made-up ones. But that would slightly inconvenience human users following outdated links by forcing their browsers through a redirect, when really only search engine spiders care whether there's a single canonical URL per page. So if Facebook has given it any thought, they probably decided not to bother.

And at their scale, keeping track of one or more string URL versions for each page (that is, each piece of URL-addressable content that makes sense to have some kind of title slug included) would be non-trivial. Billions and billions of URLs to track, dubious benefit.
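The scheme described above can be sketched in a few lines; `resolve_note` and the `NOTES` lookup table are hypothetical stand-ins (Facebook's actual routing obviously isn't public):

```ruby
# Hypothetical resolver for URLs like /notes/kent-beck/rip-tdd/750840194948847:
# only the trailing numeric ID is used for the lookup; the slug is decorative.
NOTES = { "750840194948847" => "RIP TDD" } # stand-in for a database

def resolve_note(path)
  id = path.split("/").last
  return nil unless id =~ /\A\d+\z/   # the ID segment must be all digits
  NOTES[id]                           # the slug segments are ignored entirely
end

# Any slug resolves to the same note, exactly as observed above:
resolve_note("/notes/kent-beck/rip-tdd/750840194948847") # => "RIP TDD"
resolve_note("/notes/foo/bar/750840194948847")           # => "RIP TDD"
```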


This is simply crazy talk. He creates a list of genuine issues, e.g. over-engineering, and slaps them under the moniker of "because TDD."

I don't know how many times production was saved because there was an automated test written via TDD that prevented bad code from going out. Too many times to count.

What the author is suggesting is to throw this out and go back to the days when broken production apps were okay. This has never bothered Facebook, but it will definitely bother engineers who want to take pride in their work.


It's sarcasm.


Which is funny

"Over-engineering. I have a tendency to "throw in" functionality I "know" I'm "going to need". Making one red test green (along with the list of future tests) helps me implement just enough. I need to find a new way to stay focused."

So creating a test for a function that sums numbers, passing it 1 and 1, and returning a hardcoded 2 is "just enough" and "a great way to stay focused"? Funny.

I kind of agree with the other points, not so much the "API design" stuff; while it's true that TDD facilitates good API design, the final word rests with the API's main consumer.

Well, I'm glad it's dead, now I don't have to spend time mocking all other parts of my program.
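For context, the hardcoded-return move being mocked above is the "fake it" step from Beck's TDD cycle; it only stays honest because a second, triangulating test forces the real implementation. A plain-Ruby sketch, no test framework assumed:

```ruby
# Step 1 (red -> green): the first test lets you "fake it" with a constant:
#   def sum(a, b) = 2
# Step 2 (triangulate): a second example with a different expected value
# forces the obvious general implementation:
def sum(a, b)
  a + b
end

raise unless sum(1, 1) == 2  # the original test still passes
raise unless sum(2, 3) == 5  # the triangulating test rules out the hardcode
```

Whether that two-step dance is discipline or ceremony is, of course, exactly what this thread is arguing about.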


Could testing the preconditions and expectations of OpenSSL have rooted out Heartbleed and the Safari goto gaffe? Those bugs were process-oriented failures, not cryptographic algorithm failures.

I think now is as important a time as ever for TDD. (Test your expectations! You might be surprised or even wrong. Do not underestimate your hubris when it comes to introducing bugs.)


A knee-jerk reaction to an article that could cause many to have knee-jerk reactions against TDD? How appropriate.


Who is stopping Kent from continuing to use TDD techniques? Since when did IT become a gulag?


Another month, another Agile war. Is it co-location? Nope. Is it SAFe? Nope. Looks like TDD. Again.

1) People make money on these wars. Do not waste your time on them.

2) Out of all the Agile stuff I've ever come across, TDD is the thing that drives most people nuts. I think because it's something that has to do with your minute-to-minute work. I'm a fan -- when it's appropriate (not in startups). But dang, people go crazy about it.

3) TDD is more about the way we approach doing a thorough job than it is about programming. At least in my mind. It's double-entry bookkeeping for coding. Not only did you do it, but it's cross-checked. Therefore, just like good accounting practices, it makes sense to a lot of people. It's also annoying as hell to a lot of people. This set probably overlaps.


TDD advocates: build a new framework that extends Rails, one that makes it easy to add service layers and fast tests.

IMHO fat models are the new PHP, and giant classes suck. It's time for new conventions to emerge.

Prove DHH wrong with working code, not blog posts or books or conference talks.


You don't need a new framework to easily add service layers and fast tests. That comes with proper planning before writing the code. And the conventions have been around for decades as well; developers just need to be a bit more aware of them.


> Prove DHH wrong with working code, not blog posts or books or conference talks.

I completely agree; I will only say that it actually is quite easy to add service layers and fast tests to Rails applications - you just don't stick everything inside ActiveRecord like DHH would like you to :)


I did this about a year ago, though it doesn't really extend or care about Rails at all.

http://obvious.retromocha.com

The only "framework" part that you need outside of some reasonable OOP design principles is some amount of type checking to enforce some notion of boundaries. Ruby is super weak in this area, so I added some bits to do that.

For the record, until you experience the trouble that the MVC Big Ball of Fat Models or Controllers gets you into, you probably won't have much appreciation for the service/command style or the hexagonal ports/adapters style. At least, that's been my experience with other developers.
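A bare-bones illustration of the service/command style mentioned above; `CreateOrder` and its collaborators are hypothetical, and a real ports/adapters design would make the boundaries more explicit:

```ruby
# The command object owns the business logic; the controller shrinks to glue.
# Dependencies come in through the constructor, so the test below needs no
# Rails, no database, and no mocking library.
class CreateOrder
  def initialize(repo:, notifier:)
    @repo = repo
    @notifier = notifier
  end

  def call(item:, qty:)
    raise ArgumentError, "qty must be positive" unless qty > 0
    order = { item: item, qty: qty }
    @repo << order        # persistence port: anything that responds to <<
    @notifier.call(order) # notification port: anything callable
    order
  end
end

# Fast, framework-free test doubles: an array and a lambda.
saved    = []
notified = []
service  = CreateOrder.new(repo: saved, notifier: ->(o) { notified << o })
service.call(item: "book", qty: 2)
raise unless saved == [{ item: "book", qty: 2 }]
raise unless notified.size == 1
```

This is the kind of sub-millisecond test DHH's critics point to; DHH's counter is that the indirection itself is the design damage.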


What-the-fuck-ever

To the whole article. I read the DHH article; Kent did not. The people practicing, umm, "agility" did a better job handling some of the root issues DHH brought up.


Strange to see you being downvoted. I think many commenting did not read the article and did not watch the video (http://www.justin.tv/confreaks/b/522089408, fast-forward to 10:30). DHH does not argue against testing; his argument is against cargo-culting and hurting the design in pursuit of faster tests and more coverage, even when it is useless. DHH even quotes Kent himself in the talk: "I get paid for code that works, not for tests, so my philosophy is to test as little as possible to reach a given level of confidence".


Indeed. I think DHH was very specific in his criticism: that for controllers specifically, TDD and unit tests aren't the wisest choice, because they introduce complexity and potential problems (really, are you sure all those mocks catch all your edge cases?).

The critics, including Kent Beck here, have by and large reductio-ad-absurdum'd it to "DHH HATES TESTING AND EATS BABIES", which isn't his point at all. Sure, DHH could have been more narrowly focused and less confrontational in his language, but that doesn't mean he's wrong.

Look at it this way... are controllers hard to test in Rails, or are controllers hard to test in general? This is an argument for lightweight controllers that are little more than routing glue (don't put business logic in controllers), and then giving them a pass on unit testing in order to keep them clean.

This gets to something I've been chewing on and should write about, which I think of as Tautology Testing. It's endemic in excessive unit testing. You create a mock so you can create a test so a test exists. The test doesn't really test anything except its own existence. How is that useful? You're just adding complexity, not functionality.
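A deliberately silly sketch of such a tautology test; `MockGateway` and `checkout` are invented for illustration:

```ruby
# A "tautology test": the mock is stubbed to return a canned answer, and the
# test then asserts it got that answer back. Nothing about the real
# collaborator's behavior is exercised.
class MockGateway
  def charge(_amount)
    :success # stubbed answer, unconditionally
  end
end

def checkout(gateway, amount)
  gateway.charge(amount) # the code under "test" just delegates
end

result = checkout(MockGateway.new, 100)
raise unless result == :success # passes by construction: the assertion only
                                # restates what the stub was told to return
```

Swap the mock for the real payment gateway and the test tells you nothing about whether checkout actually works; it only tested the stub's own existence.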


Perhaps DHH could have chosen a better title for his post than "TDD is dead", then? Perhaps "unit testing for controllers: not the wisest choice"?


DHH's article looks to be titled 'Test-induced design damage', not 'TDD is dead'. 'TDD is dead' is referenced in the article as a current uprising (which I hadn't heard of, fwiw), not necessarily as something DHH personally believes. I thought his take on the issue was nuanced and thoughtful, so it's surprising to see Kent's overreaction here.



Why aren't these quips written as comments in the main DHH article submission? Everybody has to whore themselves out these days - embarrassing.


It's possible that, since this was written on FB and not Kent's blog ( http://www.threeriversinstitute.org/blog/ ), that this wasn't intended for as public an audience as HN and was not an attempt to "whore himself out."



Ah, missed that. Well, fair points then :)


I rest my case.



