November 20, 2013


Retro Programming

After being reminded of some key information sources that pre-date the year I was born (1971, if you please, and not 10,000 BC as some have suggested), I've set myself a little challenge. Well, it's good to have a hobby, right?

I've long known that many of the "new" practices I use as a software developer are actually not as new as people think. In fact, some date back - in one form or another - to the 1950s.

Take Test-driven Development, for example. Kent Beck spoke of how he "rediscovered" TDD, referencing a book - whose name he appears to have forgotten - that spelled out how programmers could define a program by examples of inputs and outputs (in this case recorded on magnetic tape) and then write code to map the inputs onto the outputs. That's TDD.
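
To make the idea concrete, here's a minimal sketch of that example-first style - in Python, and with names that are purely my own illustration, not anything from Beck's forgotten book. We state an example of input and expected output as a test, then write just enough code to map one onto the other:

import unittest

# The example comes first: for this input, we expect this output.
class TestDiscount(unittest.TestCase):
    def test_orders_over_100_get_ten_percent_off(self):
        self.assertEqual(discounted_total(200.0), 180.0)

# Only then do we write just enough code to map input onto output.
def discounted_total(order_total):
    if order_total > 100.0:
        return order_total * 0.9
    return order_total

if __name__ == "__main__":
    unittest.main()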

Other TDD-like references date as far back as 1957 (and let's assume that the actual application of those practices goes back even further). In his blog post "You won't believe how old TDD is", Arialdo Martini quotes from the book "Digital Computer Programming" by Daniel D. McCracken:

"The first attack on the checkout problem may be made before coding is begun. In order to fully ascertain the accuracy of the answers, it is necessary to have a hand-calculated. check case with which to compare the answers which will later be calculated by the machine. This means that stored program machines are never used for a true one-shot problem. There must always be an element of iteration to make it pay. The hand calculations can be done at any point during programming. Frequently, however, computers are operated by computing experts to prepare the problems as a service for engineers or scientists. In these cases it is highly desirable that the “customer” prepare the check case, largely because logical errors and misunderstandings between the programmer and customer may be pointed out by such procedure. If the customer is to prepare the test solution is best for him to start well in advance of actual checkout, since for any sizable problem it will take several days or weeks to and calculate the test."

What jumps out at me from this paragraph is the allusion to an iterative process of programming, driven by tests that are written by the customer (a domain expert).

Similar references to TDD-like practices can be found in interviews with Jerry Weinberg, where he talks about his experiences on NASA's Project Mercury in the early 1960s. Craig Larman has also stated that development on Project Mercury was done "top-down" (what we now call "outside-in") using test doubles.
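
For anyone unfamiliar with the term, a test double is a stand-in for a collaborator that doesn't exist yet, which is what makes top-down development possible: you can test the outermost component before the parts beneath it have been written. Here's a minimal sketch in Python - the telemetry names are my own invention for illustration, not anything documented from Project Mercury:

import unittest

# A hand-rolled test double, standing in for a downlink that
# hasn't been built yet.
class FakeDownlink:
    def __init__(self):
        self.sent = []

    def send(self, reading):
        # Record what would have been transmitted, instead of
        # talking to real (and as yet unwritten) code.
        self.sent.append(reading)

# The outermost component, developed and tested first.
class TelemetryReporter:
    def __init__(self, downlink):
        self.downlink = downlink

    def report(self, reading):
        self.downlink.send(reading)

class TestTelemetryReporter(unittest.TestCase):
    def test_readings_are_sent_to_the_downlink(self):
        downlink = FakeDownlink()
        TelemetryReporter(downlink).report(42)
        self.assertEqual(downlink.sent, [42])

if __name__ == "__main__":
    unittest.main()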

So we can conclude that some form of iterative, test-driven software development was being practiced on real projects before 1971.

Years of my own digging have revealed that many other practices go back to the 1960s and earlier. It's believed, for example, that teams working on IBM's OS/360 in the mid-1960s were doing integration builds.

The report from the 1968 NATO Conference on Software Engineering describes strikingly familiar problems - ones that programmers faced then and still face today - and includes advice that modern teams would do well to heed.

And then there's the technology itself. Advancements in programming languages had reached a plateau by the time I was born. By 1971 we had progressed from programming in binary with punchcards, via Assembler and 3GLs like FORTRAN, to typing early object oriented programs in languages like Simula and Smalltalk into text editors (maybe even using a Graphical User Interface, if you happened to work at Xerox PARC). LISP was invented in the 1950s, and folk who think functional programming is all trendy and new might like to reflect on the fact that FP was old news by the year of my birth. The programming tools of today, while far more powerful in their scope, are essentially little changed from the programming tools of 1971. Programmers then, as now, typed programs into a text editor and used a compiler to read the source code and generate machine-executable code.

And last night it struck me: would it be possible to synthesize a recognisably "modern" approach to software development using only the principles, practices and types of tools that were available in 1971?

If I shopped around the various texts written before 1971, could I piece together the requirements, design, programming, testing, configuration management and other disciplines I might need to produce valuable software in 2013?

Would it simply be, as some suggest, a matter of finding out what we called these things then and mapping that onto what we do today? Would there be big gaps? Would there be no gaps? Would it cover all 11 essential disciplines in my Back To Basics paper?

Just for jolly, I intend to try to create a methodology synthesized out of these vintage disciplines that might be fit for 21st century purposes.

Wish me well!
