Practical TDD

It took me several years to drink the TDD kool-aid, but now that I have, I'm addicted. It's not that I didn't want to automate my testing; it just wasn't particularly practical for me to do so. Having worked at startups over the past several years, I had never been able to find the balance between producing new code and maintaining appropriate test coverage for that code.

The problem has typically been that the testable API changed frequently enough that I spent as much time or more updating tests as I did writing new code. This was especially true for unit tests, and as a result I never seemed to get around to writing them or adopting continuous integration tools like CruiseControl. At my current job, however, we have managed to create a TDD methodology that works particularly well for us. It essentially works like this:

  1. Agree upon the web service API.
  2. Write unit tests that cover the new service API.
  3. Iterate on the code until all service tests pass.
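Step 2 can be sketched in code. The example below is a hypothetical illustration, not our actual test suite: the `/users/1` endpoint and its JSON shape are made-up stand-ins for an agreed-upon API, and a stub `http.server` plays the part of the service so the test is self-contained. The point is that the tests encode the service contract, so they can be written before the real implementation exists.

```python
# Sketch of a service-level test written against an agreed-upon API.
# StubService is a hypothetical stand-in for the real service; in
# practice the tests would point at a dev deployment of the service.
import json
import threading
import unittest
import urllib.error
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class StubService(BaseHTTPRequestHandler):
    """Serves the agreed-upon (hypothetical) /users/<id> contract."""

    def do_GET(self):
        if self.path == "/users/1":
            body = json.dumps({"id": 1, "name": "alice"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_response(404)
            self.end_headers()

    def log_message(self, *args):
        pass  # keep test output quiet

class UserServiceTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        # Bind to port 0 so the OS picks a free port.
        cls.server = HTTPServer(("127.0.0.1", 0), StubService)
        cls.port = cls.server.server_address[1]
        threading.Thread(target=cls.server.serve_forever,
                         daemon=True).start()

    @classmethod
    def tearDownClass(cls):
        cls.server.shutdown()

    def test_get_user_returns_agreed_shape(self):
        # The test asserts the API contract, not implementation details.
        url = f"http://127.0.0.1:{self.port}/users/1"
        with urllib.request.urlopen(url) as resp:
            self.assertEqual(resp.status, 200)
            data = json.loads(resp.read())
            self.assertEqual(data["id"], 1)
            self.assertIn("name", data)

    def test_unknown_user_is_404(self):
        url = f"http://127.0.0.1:{self.port}/users/999"
        with self.assertRaises(urllib.error.HTTPError) as ctx:
            urllib.request.urlopen(url)
        self.assertEqual(ctx.exception.code, 404)
```

Because the tests talk to the service over HTTP rather than calling internal code directly, they stay valid as long as the service contract does, regardless of how the internals are refactored.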

The primary difference between this and other testing methodologies is that we focus on testing our web services rather than the underlying APIs. This gives us a few very concrete benefits:

  • The front-end team can begin coding against the service API immediately.
  • Increased test coverage with fewer tests, because each service call exercises its underlying dependencies.
  • Immediate feedback on work in progress.
  • Breaking API changes caught immediately, reducing impact on customers.

The introduction of continuous integration via CruiseControl, along with 100% service coverage, let us see the benefits immediately. The number of bugs introduced into our production environment has dropped measurably since we created the test framework.
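For illustration, a CruiseControl setup along these lines polls source control and runs the service tests on a schedule. This is a minimal sketch, not our actual configuration: the project name, paths, intervals, and build target are all placeholders.

```xml
<cruisecontrol>
  <project name="services" buildafterfailed="false">
    <!-- Placeholder: poll Subversion for new commits. -->
    <modificationset quietperiod="60">
      <svn localWorkingCopy="checkout/services"/>
    </modificationset>
    <!-- Every 5 minutes, run the Ant target that executes the
         service test suite (target name is hypothetical). -->
    <schedule interval="300">
      <ant buildfile="checkout/services/build.xml" target="service-tests"/>
    </schedule>
    <log/>
  </project>
</cruisecontrol>
```

With something like this in place, a breaking change to a service contract fails the build within minutes of being checked in, rather than surfacing in production.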
