November 12, 2007

Learn TDD with Codemanship

Example Design Quality Review

Many of the project team engagements I do these days begin with a review of the team's code to see what's what, who's where, and the whys and wherefores.

Some people - well, mainly the voices in my head - keep harassing me for a copy of one of these code quality reviews, perhaps to get some pointers for their own. (Which might be the blind leading the blind, but, hey, this is the software industry and that's how things work around these here parts.)

So I've anonymised and uploaded a copy of a report I did this weekend for one client, so you can see what kinds of things I might be looking for and what kind of feedback I give.

First thing to note is that this particular team has done some pretty snappy work, so there wasn't a great deal to complain about. But even the best teams gradually let a certain amount of nastiness creep into their code, and that's why it's vitally important to stay on top of design quality and get frequent, meaningful, objective feedback. Preferably from someone who charges by the day.

I like to think my kind of feedback is especially objective, since I tend these days to steer clear of issues like naming conventions and architectural style, and to focus on the structural aspects of the code that I'm quite confident are important regardless of style - like size and complexity, coupling and cohesion, generality and so on.

The process I go through is fairly straightforward. I have a list of things I would usually test for - like long/complex methods and classes, highly coupled classes and packages, lack of cohesion at various levels, and so on. I also increasingly test for a variety of less scientific code "smells" - like use of switch statements instead of polymorphism, failure to observe the single responsibility principle, failure to isolate concerns so that they only need to be changed in one place (e.g., knowledge of how to connect to a database), data classes, parallel inheritance hierarchies, etc.
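To make one of those smells concrete, here's a minimal Java sketch of the switch-instead-of-polymorphism problem. The payroll example is invented purely for illustration - it's not from any client's code.

    // Smell: a type-code switch that has to be edited every time a
    // new kind of employee is added.
    class PayrollWithSwitch {
        static final int ENGINEER = 0, MANAGER = 1;

        static double pay(int employeeType, double base) {
            switch (employeeType) {
                case ENGINEER: return base;
                case MANAGER:  return base * 1.5;
                default: throw new IllegalArgumentException("unknown type");
            }
        }
    }

    // Preferred: each subtype knows its own pay rule, so adding a new
    // kind of employee means adding a class, not editing a switch.
    interface Employee {
        double pay(double base);
    }

    class Engineer implements Employee {
        public double pay(double base) { return base; }
    }

    class Manager implements Employee {
        public double pay(double base) { return base * 1.5; }
    }

The switch version works, but every new employee type means another case somebody has to remember to add; the polymorphic version pushes that knowledge into the type system, where the compiler can keep an eye on it.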

As I go through these tests, I list the results in a Word document, which is then formatted nice and neat so that managers and pixies and pussycats can understand the information. And that's what you are looking at if you've downloaded the example I posted today.

Increasingly, I use automated code analysis tools - like NDepend and FxCop - to perform these tests. You can cover a lot of code very quickly with automated inspections. And I look at the reports I'm generating and wonder whether I could automate them completely at some point in the future.
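To give a flavour of what one of these automated inspections might look like under the hood, here's a toy Java sketch that flags suspiciously large classes by raw line count. The 400-line threshold is an arbitrary assumption of mine, and real tools like NDepend work from a proper model of the code rather than counting lines - this is just the general idea.

    import java.io.*;

    // Toy inspection: walk a source tree and report any .java file
    // longer than a (crude, arbitrary) threshold as a "large class"
    // candidate for human review.
    public class LargeClassCheck {
        private static final int MAX_LINES = 400; // assumed threshold

        public static void main(String[] args) throws IOException {
            scan(new File(args.length > 0 ? args[0] : "."));
        }

        private static void scan(File f) throws IOException {
            File[] children = f.listFiles();
            if (f.isDirectory() && children != null) {
                for (File child : children) scan(child);
            } else if (f.getName().endsWith(".java")) {
                int lines = countLines(f);
                if (lines > MAX_LINES) {
                    System.out.println(f.getPath() + ": " + lines + " lines");
                }
            }
        }

        private static int countLines(File f) throws IOException {
            BufferedReader reader = new BufferedReader(new FileReader(f));
            try {
                int count = 0;
                while (reader.readLine() != null) count++;
                return count;
            } finally {
                reader.close();
            }
        }
    }

Point it at a project root and you get a shortlist of files worth eyeballing - which is most of what the "size" section of one of these reports boils down to anyway.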

Then teams could get my feedback whenever they liked, while I lounge by the pool with my Harry Potter and a Gin & Tonic.
Posted on November 12, 2007