Saturday, January 31, 2009

An Old Debate

I don't think I've ever written a post like this one. I've tended to shy away. But I thought this was a very interesting discussion, and one that ties in well with things I've written in the past.

Background Reading:
http://www.joelonsoftware.com/items/2009/01/31.html Joel Spolsky and Jeff Atwood did a podcast; you can find a partial transcript here. In it, they say that TDD, taken to an extreme, is a waste of time, and that the "SOLID" principles are bureaucratic. But their point seems to be that all this stuff shouldn't get in the way of delivering software your customers need and want.

http://blog.objectmentor.com/articles/2009/01/31/quality-doesnt-matter-that-much-jeff-and-joel Robert Martin, who's big in TDD and SOLID, responds to Joel and Jeff. In it, he says they've completely misunderstood TDD, SOLID, and Agile, and basically claims they're unqualified to talk about them.

Moving On:
It's impossible to argue that you should spend lots of time doing anything in code if that time doesn't pay off in the long run.

That's where this argument gets interesting. Joel and Jeff are taking their understanding of unit tests, SOLID, etc., applying it to their experience, and deciding it requires extra time in code that doesn't pay off in the long run.

Martin, on the other hand, is defending his baby by pointing out that Joel and Jeff don't understand unit tests, SOLID, etc. the way he understands them. So his claim is that those techniques, when done correctly, do pay off in the long run.

It's pretty much impossible to resolve this. No two people will approach the code the same way, even if they think they're both being Agile and using the same principles.

So what I got out of all of this is:
  1. Think. You have to think. You have to pay attention. You have to play the balancing game and weigh the pros and cons.
  2. TDD is hard. Code design is hard. They're both even harder to teach than they are to do. And in the end, these are just techniques. They are a means to an end, and are not the only means.
  3. Situation makes all the difference in the world. You really can't talk about coding practices without talking about what you're building. A small website requires different coding than the .NET Framework. It's very important to recognize this, especially when you're reading recommendations from other people.

3 comments:

  1. Sweet, a post I have an opinion on.

    I think Uncle Bob and (Joel and Jeff) are both wrong in some ways and right in others.

    In the past 4 years I've been on projects that tried to do TDD and failed, projects that completely ignored it, and projects that did it well. Overall I see benefit in the concept of test-driven development. When it is used as a tool and used effectively, it does indeed result in lower defect rates, better design, and quicker recognition of problems.

    That said, I think trying to enforce 100% code coverage is just silly. Given that mandate, unless your team is made up entirely of superstar programmers who love writing tests, you end up with sloppy tests that don't do much other than execute code. I know this because that first project I mentioned, the one that failed at TDD, did just that. For the longest time our build process had a coverage check that required 100%. It was ridiculous. You ended up writing tests for simple properties and framework features.
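
    To make that concrete, here's a minimal sketch (Python with unittest; the Order class and its price property are invented for illustration) of the kind of test a 100% coverage gate tends to produce. It executes the getter, but it asserts nothing beyond what the constructor just set:

        import unittest

        class Order:
            """Hypothetical domain class with a trivial property."""
            def __init__(self, price):
                self._price = price

            @property
            def price(self):
                return self._price

        class TestOrderProperties(unittest.TestCase):
            def test_price_roundtrip(self):
                # Exists only to satisfy the coverage gate: it
                # exercises the getter but checks no real behavior.
                order = Order(10.0)
                self.assertEqual(order.price, 10.0)

        if __name__ == "__main__":
            unittest.main()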

    One movement is starting to emerge that might be a bit of a compromise in this debate (and it is a debate; these guys aren't the first to argue about it). Behavior Driven Development (BDD; we CPS people love acronyms) is the application of tests to a project in a way that documents and tests the behaviors of that project. So instead of writing a test that checks a particular function, you are checking a particular behavior. It should be noted that behaviors and requirements are not synonymous; in fact, each requirement generally consists of multiple behaviors. For example, it's unlikely that a requirement will list every error condition that could arise, but each error condition is a behavior and should be tested.

    Now, I haven't done this, so I can't claim to be an expert, just an interested party. It strikes me as a good way to approach this debate, though. If writing a test doesn't further your goal of covering all of the behaviors, and you would simply be testing some ancillary or framework-specific feature, don't write it.
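
    As a rough sketch of what that behavior-first framing might look like (again Python with unittest; the Account class and its names are invented), each test names and checks one behavior, including the error condition the requirement never spelled out:

        import unittest

        class InsufficientFunds(Exception):
            pass

        class Account:
            """Hypothetical account used to illustrate behaviors."""
            def __init__(self, balance=0):
                self.balance = balance

            def withdraw(self, amount):
                if amount > self.balance:
                    raise InsufficientFunds("withdrawal exceeds balance")
                self.balance -= amount

        class WithdrawalBehavior(unittest.TestCase):
            def test_withdrawing_within_balance_reduces_it(self):
                account = Account(balance=100)
                account.withdraw(40)
                self.assertEqual(account.balance, 60)

            def test_overdrawing_is_rejected(self):
                # The requirement wouldn't list this case, but the
                # error condition is a behavior and gets its own test.
                account = Account(balance=100)
                with self.assertRaises(InsufficientFunds):
                    account.withdraw(150)

        if __name__ == "__main__":
            unittest.main()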

    Overall, though, I don't want to ignore TDD. There does seem to be a constant argument between management and programmers about it. It takes time to learn, which slows people down at first. Eventually, though, it pays off in reduced maintenance costs, which gets back to that old chart of costs in a software project: maintenance is almost always the largest cost but the least considered when building the application.

  2. Welp, I probably should have read Uncle Bob's article before commenting, because he says the same thing about code coverage that I did: 100% tends to be useless.

  3. Like Josh, I didn't read the background material either. Also, I have nothing to contribute to the discussion at hand. Hi Kevin!
