Monday, June 8, 2009

Thinking like a TDDer

Welcome to my 100th post! I'm going to return to a topic that I've touched on many times in the past, namely, unit testing.

Unit testing is one of those things that is very attractive in principle but gets ugly real fast in practice. I've tried it countless times but never managed to follow through. But in principle, I'm a fan. Having tests to fall back on to help prevent regression and to make refactoring easier and less stressful is a big win. And following the TDD process is very rewarding. Every couple minutes you get another green bubble, encouraging you, telling you you're making progress and doing a good job. Plus, unit testing reinforces the benefits of following good design practices like loose coupling, DRY, and Do One Thing (which are all the same...).

For me personally, unit testing still has 2 primary stumbling blocks:
  1. Mocking
  2. "Maintenance" development
I've talked about many other "pitfalls," including mocking, in an earlier post called Unit Testing Pitfalls. I still think those are true, but I'm coming to terms with the ways around some of them. However, mocking still drives me crazy. I hate having to specify every method and/or property that my object under test is likely to call on the dependency. On the other hand, I hate writing stub objects, because they take time and effort and you frequently make mistakes that cause tests to fail. Maybe I just need to learn all the features of RhinoMocks (if only the documentation were better), or learn Moq, or something. If I find a way to resolve this, I'll let you know.
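To make the complaint concrete, here's a rough sketch of the two options side by side: a hand-rolled stub and the equivalent Moq setup. The Person, IPersonRepository, and PersonService types are invented just for this example, and I'm going from memory on the Moq syntax, so treat it as a sketch rather than gospel.

    using Moq;
    using NUnit.Framework;

    // All of these types are made up purely for illustration.
    public class Person
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public interface IPersonRepository
    {
        Person GetById(int id);
    }

    // The object under test: all it does is call GetById on its dependency.
    public class PersonService
    {
        private readonly IPersonRepository repository;
        public PersonService(IPersonRepository repository) { this.repository = repository; }
        public Person Lookup(int id) { return repository.GetById(id); }
    }

    // Option 1: a hand-rolled stub -- extra code to write and keep in sync with the interface.
    public class StubPersonRepository : IPersonRepository
    {
        public Person PersonToReturn;
        public Person GetById(int id) { return PersonToReturn; }
    }

    [TestFixture]
    public class PersonServiceTests
    {
        [Test]
        public void Lookup_returns_the_person_the_repository_finds()
        {
            // Option 2: the same dependency faked with Moq. Less code than the stub,
            // but the test now spells out each call it expects the service to make.
            var repository = new Mock<IPersonRepository>();
            repository.Setup(r => r.GetById(42)).Returns(new Person { Id = 42, Name = "Bob" });

            var service = new PersonService(repository.Object);

            Assert.AreEqual("Bob", service.Lookup(42).Name);
        }
    }

Neither option thrills me: the stub is more typing and another class to maintain, and the mock ties the test to the exact calls the service happens to make today.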

The second issue is "maintenance development," by which I mean going back to code you've already written and changing it. When I'm first working on some code I find TDD very easy, but when I have to go back to it to fix bugs or add features I always go straight for the code and let my unit tests fall out of date. And therefore become worthless.

A Continuous Integration server that ran the unit tests would force me to keep them up to date, but I don't like being forced to do things. In fact, I'd probably just start commenting out the failing tests so I could get my bug fix checked in. Yeah, I know! Hideous behavior, right? Should be punishable by death. But I'm lazy, and so are you.

My problem is that I'm still not thinking like a TDDer. If a TDDer needed to fix a bug, they would start by adding a failing test, then they would work until the test passed, THEN they'd try to repro the bug and make sure it was gone. That's the process they would follow, but what's really interesting is the mindset that process requires.

You've probably heard people say that tests are as good as, if not better than, documentation. I've even heard people say tests can replace specifications! These people are completely insane! Lock them up in a padded room, throw away the key, and please please please don't let them write any more blog posts!

Ok, I know, they aren't REALLY crazy, they're just using the words to mean something different than what the words REALLY mean. Welcome to software development, we love overrides so much we can't resist overriding words in the English language!

So if you fight through the crazy and think about the documentation argument for a moment, that unit tests are documentation, you can see where this might make sense. First, assume we're talking about API documentation and not end user documentation, obviously. I'd like to see an end user figure out which button they need to click on the screen by reading a bunch of unit tests. Don't worry, I'm sure they'll re-up your contract next year!

Anyway, you have a bunch of tests like Apply_is_enabled_when_required_fields_are_filled() and so forth. A smart enough and patient enough person could read through tests like this and figure out what the object under test does and how to code against it. In fact, because of the volume of code in the tests, they'd be able to understand and extract much more meaning than they could from some MSDN-style documentation.
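Just to picture it, a couple of tests in that style might look something like this. The ApplyForm class and its properties are invented purely for the sake of the example.

    using NUnit.Framework;

    // ApplyForm is a made-up class standing in for whatever drives the Apply button.
    public class ApplyForm
    {
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public string Email { get; set; }

        public bool ApplyIsEnabled
        {
            get
            {
                return !string.IsNullOrEmpty(FirstName)
                    && !string.IsNullOrEmpty(LastName)
                    && !string.IsNullOrEmpty(Email);
            }
        }
    }

    [TestFixture]
    public class ApplyFormTests
    {
        [Test]
        public void Apply_is_enabled_when_required_fields_are_filled()
        {
            var form = new ApplyForm { FirstName = "Jane", LastName = "Doe", Email = "jane@example.com" };
            Assert.IsTrue(form.ApplyIsEnabled);
        }

        [Test]
        public void Apply_is_disabled_when_a_required_field_is_missing()
        {
            var form = new ApplyForm { FirstName = "Jane", LastName = "Doe" };
            Assert.IsFalse(form.ApplyIsEnabled);
        }
    }

Read a handful of those and you have a pretty decent picture of the rules around the Apply button, which is the documentation argument in a nutshell.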

But can you imagine if the only documentation Microsoft published for the .NET framework was its unit tests? The MSDN-style docs are hard enough most of the time! Unit tests ARE NOT documentation! But they are incredibly documentative (made up words are fun!), and they are very very useful for someone who intends to work on the object under test. So, it makes a lot of sense to think about them this way. That is, it's a useful mindset.

What about the argument that they are specifications? This one is even more off base. A specification is what you write up front to describe what the software should do, how it should work, and basically what it should look like. Yeah, Agile people jump up and down and say you don't need specs. "Just write a story card!" they squeak. "Pin it to your wall!" Sorry, little agile buddy, your story card is not a spec. It's not even the replacement for a spec. At some point you're still sitting down and figuring out what the software needs to do, and deciding how it should work. Maybe you're meeting with your "customer representative" and you show him your little story card and he starts rambling on about the details of that story while you furiously scribble notes on a legal pad. Then you sit down with your programming buddy (read: "pair") and you start writing code. Well, the scribbles on your note pad are the spec.

Despite all the condescending language in that paragraph, this is actually fine by me. As long as you manage to capture enough detail in that meeting and are able to think through the consequences of your implementation decisions, as well as the issues of integrating this part with the other parts of your software, this is a perfectly acceptable way of producing specifications. You might want to be a bit more diligent if you are writing the software that flies the space shuttle, but for most business and banking apps, I think you'll be just fine.

But, back to my point, unit tests are not Specifications. However, they do specify how the object under test should behave. In fact, they *should* specify absolutely everything about how that object should behave, if they're going to be truly effective. And this again indicates something about the mindset of the TDDer.

If unit tests are really going to work, they have to be more than regression tests. And they have to be more than a crutch you lean on during "initial development." They have to be both documentation and specification. This mindset leads you to all kinds of realizations.

For example, to be effective documentation and specification, they have to focus on behavior and they have to have good names. So, "Add_returns_false_null_person" is not a good test name. Maybe you should have gone with "Add_fails_when_person_is_missing."
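To see what I mean, here are both versions side by side. The PersonList class is made up for the example, and the assertion is identical in each test, so the name is doing all the documentation work.

    using NUnit.Framework;

    // PersonList and Person are invented for the example; only the null rule matters here.
    public class Person { }

    public class PersonList
    {
        public bool Add(Person person)
        {
            return person != null;   // real storage omitted; Add rejects a missing person
        }
    }

    [TestFixture]
    public class PersonListTests
    {
        // Named after inputs and outputs: tells you what the method returns, not what the rule is.
        [Test]
        public void Add_returns_false_null_person()
        {
            Assert.IsFalse(new PersonList().Add(null));
        }

        // Named after the behavior: reads like a line out of the spec.
        [Test]
        public void Add_fails_when_person_is_missing()
        {
            Assert.IsFalse(new PersonList().Add(null));
        }
    }

Same code, same assertion, but only the second name tells the next person what the object is supposed to do.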

If you're thinking about your tests this way and someone finds a bug, what's the best way to approach it? Not diving into the code and looking at stack traces trying to find the line of code that's in error, no sir! If there's a bug, you must have missed something in your specification of the problem. So what you're going to do is go look through your spec, find the missing piece that explains the bug, then update your spec.

What this means is that your goal when you're writing your tests shouldn't really be code coverage. And it shouldn't be to test every input/output combination. It should be to fully describe and specify all the required and expected behavior of the object you are testing. These should more or less turn out to be the same thing, but because the mindset is so completely different the details will be different. The details will be better. And you might actually stand a chance at keeping those tests up to date. Heck, you might even enjoy keeping them up to date instead of feeling like it's a chore.

Well, I don't know, that's asking a lot.

Is changing the way you approach your tests going to keep the real life details from getting ugly? Nope. You're still going to end up with test code that is ridiculously repetitive because you have to test the same method with these inputs, and those inputs, and those other inputs. And you'll still have to update all this repetitive code when you decide to change the type of one of the method parameters. And you'll still struggle with different ways to refactor your tests to cut down on the repetitiveness, which adds more abstraction and occasionally doesn't work out so well. And don't forget, you'll still be mocking out the dependencies. You know how I feel about mocking.
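For what it's worth, one way to attack that repetition without piling on too much abstraction is a parameterized test; if I remember right, NUnit 2.5 added a [TestCase] attribute for roughly this purpose. The DiscountCalculator class and its rates below are invented for the example.

    using NUnit.Framework;

    // DiscountCalculator and its thresholds are made up for the example.
    public class DiscountCalculator
    {
        public double RateFor(double orderTotal)
        {
            if (orderTotal >= 1000) return 0.10;
            if (orderTotal >= 500) return 0.05;
            return 0.00;
        }
    }

    [TestFixture]
    public class DiscountCalculatorTests
    {
        // One test body, many inputs: the repetition collapses into the attribute list
        // instead of a pile of copy-pasted test methods.
        [TestCase(0.0, 0.00)]
        [TestCase(100.0, 0.00)]
        [TestCase(500.0, 0.05)]
        [TestCase(1000.0, 0.10)]
        public void Discount_rate_depends_on_order_total(double total, double expectedRate)
        {
            Assert.AreEqual(expectedRate, new DiscountCalculator().RateFor(total), 0.0001);
        }
    }

It doesn't make the change-a-parameter-type problem disappear, but at least there's only one method signature to fix when it happens.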

But now that your tests are more valuable, at least in your mind, this might all be worth it. I guess we'll just have to try it and see.
