September 13, 2014
What Is Continuous Inspection, Anyway?

Following on from my braindump about Continuous Inspection (ContInsp, as I've recently started to abbreviate it), a few people have asked "What is Continuous Inspection?"
I may have been getting ahead of myself, and overestimated how far this idea has reached. So here's a quick lowdown on Continuous Inspection for the uninitiated.
Just as we discovered that it works out better if we test our software frequently throughout development, instead of having one big testing phase near the end, teams have also learned that instead of having occasional Big Code Inspections - and we all know how useful they are - it's better to inspect the code frequently throughout development.
Catching code quality problems earlier can have similar benefits to catching bugs earlier. Firstly, they're cheaper to rectify when there's less code surrounding them (the equivalent of weeding before a thick jungle grows around the weeds that we'll have to hack our way through). And secondly, when we talk about "code quality problems", we're as often as not talking about design issues that can hinder progress later on. So, the earlier we tackle code quality problems, the cheaper they tend to be to fix, and the faster we tend to go because of lower impedance in our code.
There's also the question of focus. It's said that if you show a programmer a line of code and ask "What's wrong with this code?", they'll tell you. If you show them 1,000 lines of code, they'll shrug their shoulders and say "Looks okay to me."
Better to deal with one design issue at a time, and to fix them as soon as they appear.
For this reason, we see a need - well, those of us who give a damn - to be continuously monitoring the code for the appearance of code quality problems, and to be eliminating them as soon as we spot them.
On XP teams, there are a number of techniques that can help in this respect:
1. Pair Programming - when done well (i.e., with sufficient attention paid to code quality), having a second pair of eyes on the code as it's being written can help spot problems earlier. Pair Programming can also help raise awareness of what code quality problems look like among team members.
2. Test-driven Development - for two reasons: a. because it focuses us on making one or two design decisions at a time, and b. because it explicitly provides a reminder after passing each test that tells us to LOOK CAREFULLY AT THE CODE NOW and refactor it until we're happy with the quality before we move on to the next failing test.
3. Automated Code Inspections - this final technique is still relatively rare among teams, but is growing in popularity. It has the same advantage as automated tests, in that it can help us to continually re-inspect the quality of our code, asking many questions about many lines of code, cheaply and quickly. Arguably, ContInsp doesn't really scale until you start using automation for at least some of it.
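To make this more concrete, here's a toy sketch of what one such automated inspection might look like in Python, using the standard library's ast module. This is not a real tool, and it only roughly approximates cyclomatic complexity (one point per branching or looping node); real static analysis tools do considerably more.

```python
import ast

# Node types we count as decision points. This is a rough approximation
# of cyclomatic complexity, not the full McCabe definition.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)


def complexity(func_node):
    """1 plus one point for each branching/looping node in the function."""
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(func_node))


def inspect(source, limit=2):
    """Return the names of functions whose complexity exceeds the limit."""
    tree = ast.parse(source)
    return [node.name
            for node in ast.walk(tree)
            if isinstance(node, ast.FunctionDef) and complexity(node) > limit]
```

Run over a codebase on every check-in, a report like this is the "many questions about many lines of code, cheaply and quickly" part: the same inspection, re-asked continuously, for free.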
Teams are increasingly seeing automated code inspections as the best way to guard against code rot. At the very least, it can provide a clear line that developers must not cross when checking in their code, with alarm bells ringing when someone does.
To give you an example, let's say we agree as a team that methods shouldn't make more than one decision or contain more than one loop. We duly program our static code analysis tool to flag up any methods with a cyclomatic complexity (a measure of the number of paths through the code) of more than 2. Bill writes a method with an IF statement inside a FOR loop, and tries to check his code in. The alarm goes off, the team goes to DEFCON ONE, and the problem has to be fixed - by, say, extracting the IF statement into its own self-describing method - before anyone else can check their code in.
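Sketched in Python (the scenario above is language-agnostic; the Loan class and method names here are invented purely for illustration), Bill's before-and-after might look something like this:

```python
class Loan:
    """Minimal stand-in class for the sake of the example."""
    def __init__(self, fee, overdue):
        self.fee = fee
        self._overdue = overdue

    def is_overdue(self):
        return self._overdue


# Before: an IF inside a FOR gives a cyclomatic complexity of 3,
# which trips our agreed limit of 2 and sets off the alarm.
def total_overdue_fees_v1(loans):
    total = 0
    for loan in loans:            # decision point 1
        if loan.is_overdue():     # decision point 2
            total += loan.fee
    return total


# After: the IF is extracted into its own self-describing method,
# leaving each method with a complexity of no more than 2.
def overdue_fee(loan):
    return loan.fee if loan.is_overdue() else 0


def total_overdue_fees(loans):
    total = 0
    for loan in loans:            # now the only decision point here
        total += overdue_fee(loan)
    return total
```

Behaviour is identical; the gate is satisfied because no single method makes more than one decision.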
This may seem very strict to you, but at the very least, teams should consider having the early warning, even if it's just to trigger a discussion and a chance to consciously make a decision as to whether to let it through.
The alternative is what we currently have, where developers can merrily check in code riddled with new smells and nobody finds out until the cement has set, so to speak.
On new code, it's a good idea to set the quality bar as high as you comfortably can, because as the code grows and becomes more complex, it's better to knowingly make small concessions and compromises on quality than to realise too late that the quality stinks and it's going to take you a year to make the code anything like as maintainable as you need for adding new features.
Legacy code, though, is a different matter. Likely, it's already crammed full of quality problems. Applying your usual high quality bar will just reveal a yawning chasm that will demoralise the team. (Unless your goal is simply to benchmark, of course.)
I recommend in that instance using inspections in an exploratory way: the goal being to discover what the internal quality of the code is. Once you've established the current reality, draw a line there and set up your ContInsp to enforce it not getting any worse. Then, as you begin to refactor the code, and as the quality improves, raise the bar to meet it, always enforcing the policy that it should get no worse.
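That "no worse, then tighter" policy is sometimes called a ratchet. A minimal sketch of one, assuming a single number (say, the worst cyclomatic complexity in the codebase) and a baseline file whose name is invented here, might be:

```python
import json
import os

def ratchet(current, baseline_file="quality_baseline.json"):
    """Fail the build if 'current' is worse than the recorded baseline;
    tighten the baseline whenever the code improves."""
    if os.path.exists(baseline_file):
        with open(baseline_file) as f:
            baseline = json.load(f)["worst_complexity"]
    else:
        baseline = current  # first run: establish the current reality

    if current > baseline:
        # Higher complexity than the baseline: the quality got worse.
        raise SystemExit("Quality gate failed: %d > baseline %d"
                         % (current, baseline))

    # The code is no worse (and maybe better): raise the bar to meet it.
    with open(baseline_file, "w") as f:
        json.dump({"worst_complexity": min(current, baseline)}, f)
```

Each green build lowers the recorded worst-case, so the team can never silently slide back below a level they've already reached.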
Gradually, as your code is rehabilitated back into a condition that can accommodate change, you tighten the screws on the vice of quality metrics that are holding it where it is, until all the dials are back in the green.
And do not, under any circumstances, make these code quality reports available to managers. Let them focus on what matters to the business, while you focus on sustaining the pace of frequent, reliable software deliveries.
Which brings me to my final point about Continuous Inspection: just as we see continuous testing and continuous integration as keys to achieving Continuous Delivery, so too should we consider adding ContInsp to that mix (along with Continuous Improvement), because if our code gets harder and harder to change, then we're going to end up Continuously Delivering software that has no added value.