January 14, 2006

Learn TDD with Codemanship

Sticking To Our Guns

As a consultant, moving from client to client and team to team, you get a different insight from the one you'd get working on the same project for months or years at a time.

In the last 5 years, I've come across dozens of teams who claim to be doing eXtreme Programming. To this day, I can think of two that actually were. It was just the same with the Unified Process, Catalysis, Fusion, and SSADM, as I recall. Most teams wore them as badges, but when you got down to the nitty gritty of what people did on a day-to-day basis, few if any were actually using the methods.

Typically, teams cherry pick the parts of the method they like - and there's nought wrong with that, unless you choose them for the wrong reasons. In RUP, for example, people tended to gravitate towards use cases. I've seen teams doing "iterations" of up to 6 months, doing virtually no analysis and design, and leaving out such little luxuries as testing and configuration management. One investment bank in particular was actually a case study for RUP, but they really weren't doing it at all.

One problem is that many methods don't stipulate a clear bottom line. They should have a big warning on the box saying "if you're not doing X, Y and Z then you're not doing this method". RUP should have a big warning on the documentation stating clearly that, if you're not doing the 6 best practices RUP is built on, then you're not doing RUP.

XP is a little clearer. There are (well, there used to be - I haven't really kept up to date with the latest documentation) 12 practices and 4 core values that define the method. If you're not doing developer testing, then you're not doing XP. If you're not working from user stories, you're not doing XP. If you're not doing small releases then you're not doing XP. If your customer isn't part of the team, then you're not doing XP. If you don't favour face-to-face communication over documentation, then you're not doing XP. And so on.

The three core practices most closely associated with XP are also the most widely adopted: test-driven development, refactoring and continuous integration. Since these tend to require nothing more than freely available tools, and don't require changes to the way the project is managed, developers are free to adopt them without asking for permission from some higher authority (though why some of them continue to ask anyway is still a bit of a worry).

Once they get past the basic programming practices, developers have to start involving other people in their adoption of XP, and this is where it gets tricky. Iterative, evolutionary development has mainly taken root among programmers. Many project managers, requirements analysts, testers and certainly business stakeholders have probably never heard of XP, or if they have, they know very little about it. They're still stuck in a waterfall mentality, and often refuse to co-operate in any move to iterative development.

Which is why the norm - from personal experience, I must stress - has been teams doing TDD, refactoring and continuous integration that are locked inside projects that are doing Big Plan Up Front. Don't get me wrong - using these core practices will help things on any project, but not as much as continuous customer involvement and adaptive planning. Feedback from unit tests only tells us that we built it right. It can't tell us if we built the right thing. I don't know about you, but I'd rather get a buggy implementation of something useful than a bug-free implementation of something I don't need.
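To make that distinction concrete, here's a minimal sketch - in Python, with an entirely made-up function and requirement, so treat it as an illustration rather than anyone's real code. The test passes because it encodes the developers' assumption, not the customer's actual need:

import unittest

# Hypothetical feature: calculate an order total.
# The developers assumed delivery charges should be excluded;
# the customer actually wanted them included.
def order_total(item_prices, delivery_charge):
    return sum(item_prices)  # delivery_charge is (wrongly, it turns out) ignored

class OrderTotalTests(unittest.TestCase):
    def test_total_is_sum_of_item_prices(self):
        # Passes, because it tests the assumption rather than the real requirement.
        self.assertEqual(order_total([10.0, 5.0], delivery_charge=3.0), 15.0)
        # The customer expected 18.0 - no amount of green bars would have told us that.

if __name__ == "__main__":
    unittest.main()

Every bar stays green, and we still shipped the wrong thing. Only feedback from the customer could have caught it.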

It's not the developers' fault, of course. Most of them would give their right arms to work on a project where everybody was playing their role effectively. But such projects are few and far between, and it's usually not within the developer's power to demand one.

It's symptomatic of the power structures that have grown within the software and IT industry. The experts - the people who make software - are at the bottom of the pyramid of authority, and are usually expected to just do as they're told - often by people who've never written a line of code in their lives (or more often by people whose last line of code was written in COBOL some 25 years ago).

We're not a very proactive bunch, I've discovered. In other highly skilled professions, the people with the skills tend to have a say in how the job is done. Hospital administrators can't set clinical standards. Building project managers have to be qualified builders. Lawyers define the standards within which law must be practised.

But in software development, if the project manager wants to fix the deadline, the scope and the cost, the developers have to suck it up or walk. Indeed, walking seems to be the preferred mode by which we enforce higher standards of practice, since digging in your heels and just saying "no" tends to lead to enforced walking anyway.

If we got organised, and all the better developers got together, drew up a very basic charter based on the things that we all agree on (like close customer involvement, that you can't fix all the project variables, and the need for iterative development regardless of whether or not the requirements are set in stone), and then made a brave (and, admittedly, risky) commitment to never stray from that charter...

Of course, people would still stray, just as doctors still occasionally break their Hippocratic oath, but I suspect far fewer would. I also suspect having the moral backing of several thousand experts would bolster our confidence and make it easier to put our foot down when the situation really calls for it. So instead of the "play ball or walk" choice many developers currently face, it might become a "no, YOU play ball, or we ALL walk" choice for the errant customer or manager.

Now what are the chances of that happening?

There's also a danger inherent in methods. Many methods have a significant element of fashion about them, so insisting on doing XP, or RUP, or DSDM, might be seen as unrealistically dogmatic. (The "D" word comes up a lot if you try to stick to your guns.) Perhaps a more usable charter would simply stipulate some absolute bare minimum best practices, and then leave it to the developers on any project to decide exactly what methods to use. I would be happy with that. Agree on what we can all agree on, and make that the gospel for all projects. And then agree that the developers make the decisions about how the software should be developed around those fundamental principles.

But what are those fundamental principles? Here are my suggestions:

1. Incremental development. When is less feedback less often ever better than more feedback more often?
2. Continual stakeholder involvement - in planning, requirements definition, testing, UI design, deployment, user documentation and so on.
3. Adaptive planning. Can we finally all agree that you can't predict time, scope, cost AND quality? In even the shortest and simplest projects, at least one of these is GOING TO CHANGE.
4. Completed = tested. How can you know a feature has been delivered if you don't know if it actually works? This, I find, is absolutely key to effective planning, among other things.
5. Automate repetition. Whether it's deploying software over and over again, or running system tests over and over again - if you want or need to do something repeatedly, and it's programmable, then automate it (see the sketch after this list).
6. Maximise returns. Do the least you can do to satisfy the requirements.
7. Testable project goals. Putting aside the lovely widgets and wotnots the customer's asking for, what are we actually trying to achieve? How will we know when we've achieved it? When is the work really done?
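
And since point 5 promised a sketch: here's roughly what "automate repetition" can look like in practice. It's Python, and the commands, file names and "staging" host are assumptions made purely for illustration - the point is simply that a repetitive build-test-deploy ritual becomes a single command, and a broken build never goes out by accident.

#!/usr/bin/env python3
# Hypothetical automation of a repetitive build-test-deploy ritual.
# The commands, paths and "staging" host are illustrative assumptions.
import subprocess
import sys

STEPS = [
    ["python", "-m", "unittest", "discover", "tests"],  # run the test suite
    ["tar", "czf", "myapp.tar.gz", "src"],              # package the build (assumed layout)
    ["scp", "myapp.tar.gz", "staging:/deploy/"],        # copy it to a staging box (assumed host)
]

def main():
    for step in STEPS:
        print("Running:", " ".join(step))
        if subprocess.run(step).returncode != 0:
            sys.exit(1)  # stop at the first failure, so nothing broken gets deployed

if __name__ == "__main__":
    main()

Run that instead of typing the steps by hand, and the sequence is the same every time - which is half the point of automating it.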

I don't know about you (no, really, I don't), but I'd be content to work with just these basic (but deceptively difficult to achieve - entirely for political or cultural reasons) things in place on most projects, whether we're doing true XP or not.