September 11, 2015


What Not To Do In a TDD Pair Programming Interview

A few quick thoughts this morning, after a fairly concentrated run of pair programming interviews for several clients, focused particularly on Test-Driven Development (TDD).

If, as a candidate, you're lined up to pair with someone like me, and "TDD" is being requested as a key skill, here are some things you probably shouldn't do when we pair:

1. Start by writing implementation code

The Golden Rule of TDD is "Don't write any implementation code until there's a failing test that requires it". So if we're doing, say, a program to convert degrees Celsius into degrees Fahrenheit, and we're supposed to be doing TDD "by the book", then I'm going to be disappointed if you start by declaring your TemperatureConverter class first, and then start writing a test for it.

Yes, it's true: the test code needs a TemperatureConverter class to compile and run, but the whole idea of test-driven approaches to design is that the test comes first, and the declaration follows. Simply imagine you have such a class, write a test that needs it, and as soon as your IDE gives you that red squiggly line because it's not compiling... that's your permission to declare it.
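For example - just a sketch, assuming JUnit and hypothetical class and method names - the very first code written might look like this:

import static org.junit.Assert.assertEquals;
import org.junit.Test;

public class TemperatureConverterTests {

    @Test
    public void freezingPointOfWaterConverts() {
        // TemperatureConverter doesn't exist yet - the red squiggly
        // line under it is our permission to declare it
        TemperatureConverter converter = new TemperatureConverter();
        assertEquals(32.0, converter.toFahrenheit(0), 0.0001);
    }
}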

2. Introduce speculative generality

The second most common gotcha in a TDD pairing interview is creating code that isn't needed to pass the tests. For example, surprisingly often candidates will start by declaring an interface to be implemented by the class under test. I'll ask "what do we need the interface for?", and typically the answer will be about some possible unspoken need in the future, e.g., "So we can mock it" or "In case we need to use this dependency injection framework".

And, just so you know, adding a dependency injection framework - in most TDD exercises - is a Big Red Flag. Just create the implementation code you'll need to pass the tests. Everything else falls under Y.A.G.N.I. (You Ain't Gonna Need It).
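To illustrate - hypothetical code again - here's the speculative version next to all the failing test actually asked for:

// Speculative generality - no test requires this interface:
public interface ITemperatureConverter {
    double toFahrenheit(double celsius);
}

// All the failing test actually needs:
public class TemperatureConverter {
    public double toFahrenheit(double celsius) {
        return celsius * 9.0 / 5.0 + 32;
    }
}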

3. Write weak or meaningless tests

Some candidates have read somewhere that you need to run the test and see it fail before making it pass. Quite right: it helps us to check that our test is valid - i.e., that it would fail if the result was wrong.

But writing fail() to see the test fail is just testing fail(). Enough said.

Likewise with assertions that leave the range of potential solutions wide open. A classic example is checking the length of a list when what we really should be asking is whether the item we added can be found in the list where we say it should be.
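
For instance - a sketch with a hypothetical shopping basket:

// Weak: passes even if the wrong item was added
assertEquals(1, basket.getItems().size());

// Stronger: pins down exactly what was added, and where
assertEquals(addedItem, basket.getItems().get(0));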

4. Writing redundant tests

Harking back to the first point, we don't declare classes until our failing tests require them. But that doesn't mean we specifically write tests in order to declare classes in our implementation. I see this too often: a candidate thinks "I'm going to need a FizzBuzzer class", and so they write test code:

FizzBuzzer fizzBuzzer = new FizzBuzzer();
assertNotNull(fizzBuzzer);


If fizzBuzzer were null, then attempting to use its methods in a test would fail. There's no need to specifically test that it's not null. That's just testing runtime object creation.

Tests - including unit tests - should be about behaviour. Don't test that a Car has Wheels, or that a Customer has a Name. Tests should be about the work that our code does, not the structure it has in order to do it.
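
For instance - hypothetical names - instead of asserting structure, ask a question about the work:

// Structural - tells us nothing about what the code does:
assertNotNull(customer.getName());

// Behavioural - exercises an actual outcome:
assertEquals("Dear Jane Smith,", letterWriter.greetingFor(customer));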

5. Not running the tests

Yes, it actually needs saying. I constantly have to remind candidates to run their tests after making changes to the code. That's what automated unit tests are for. This makes me worry about the candidate's habits.

Of course, they may have become used to letting a continuous testing tool like JUnitMax or Infinitest run the tests for them in the background. But such tools aren't always available, and I find we need to work with them switched off on a regular basis to keep reinforcing the habit, for when we find ourselves working in a technology that doesn't have them. And for when we're doing pair programming interviews, of course, because I will ask you to turn the tool off so I can see you swimming without the armbands.

6. Not refactoring when it's obviously needed

TDD has three key activities: we're either writing a test that fails, or writing the simplest code to pass that test, or refactoring the code to keep it clean and maintainable. The vast majority of developers have to be reminded about that third - vitally important - activity.

Alarm bells go off when candidates don't refactor. There could be a number of reasons why they don't, all of which are troubling:

a. They're not in the habit of refactoring
b. They don't recognise code smells when they see them
c. They don't know how to refactor

Typically, it's all three.

7. Hacking away at the code when you're "refactoring"

Refactoring is a discipline. You make one small, atomic change to the code (e.g., extract a method), and then you run the tests. Then, if necessary, you do another refactoring. Watching someone just start typing, deleting, cutting, pasting and generally hacking away at the code - without running the tests frequently - is like watching someone doing their makeup or shaving while they're driving a car. If pair programming interviews were driving tests, we'd fail you on the spot. Indeed, maybe we should.
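
As a sketch of what the discipline looks like - hypothetical code, one atomic change at a time:

// Before: duplicated conversion logic
double boiling = 100 * 9.0 / 5.0 + 32;
double freezing = 0 * 9.0 / 5.0 + 32;

// After one Extract Method refactoring... then we run the tests:
double boiling = toFahrenheit(100);
double freezing = toFahrenheit(0);

private double toFahrenheit(double celsius) {
    return celsius * 9.0 / 5.0 + 32;
}

// Tests still green? Good. Now - and only now - the next small change.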

Refactoring is the best tell-tale I've seen for distinguishing good developers from the rest. That is to say, I've noticed how developers who refactor well tend to do lots of stuff well.

Get some practice at refactoring: learn the common code smells and the toolset of refactorings you'll need to fix them. Get to know the refactoring menu in your IDE. If it doesn't have one, or you're working in a language that makes many refactorings difficult to automate, then get practice at doing them safely by hand.

And don't forget to keep running those tests!

8. Writing one test that asks all the questions

So, here's my first test for a FizzBuzzer:

assertTrue(fizzBuzzer.fizzBuzz().startsWith("1,2,Fizz,4,Buzz,Fizz,7,8,Fizz,Buzz,11,Fizz,13,14,FizzBuzz"));


That's actually a whole bunch of tests in one go. All the rules of FizzBuzz are tested in this one assertion. To pass this test, we can either hardcode that entire return string, or we can write a complete FizzBuzzer algorithm.

If we hardcode it, then move on to the next test:

assertTrue(fizzBuzzer.fizzBuzz().endsWith("Fizz,Buzz"));


We can end up painting ourselves into a corner: either hardcoding the entire output from 1 to 100, or having to make a big leap from a hardcoded response to the full algorithm. Triangulation it ain't!

Ask one question at a time, and generalise your solution with each new test case:

assertEquals("1", fizzBuzzer().fizzBuzz().split(',')[0]);


Remember: BABY STEPS


Of course, there are other things candidates do in pair programming interviews that might suggest they're not as familiar or experienced at TDD as their CV suggests. But these are the ones I see most often.


Shameless Plug: join us in London on Saturday October 10th to brush up your TDD discipline on the last Codemanship Intensive TDD workshop of 2015. Amazingly good value at just £49.





