April 11, 2018

Learn TDD with Codemanship

The Foundation of a Dev Profession Should Be Mentoring

What makes something like engineering or law or medicine a "profession"? Ask me 20 years ago, I'd have said it was standards and ethics, policed by some kind of professional body and/or the law. There are certain things, say, an electronic engineer isn't supposed to do, certain things you can't ask your doctor for, certain things a lawyer would end up in jail for doing.

Ask me today, and my answer would be this: a profession is a community of people following a vocation - like writing software or teaching children - that professes how it works to people who want to learn how to do it.

Experienced school teachers show people learning to be school teachers how to teach. They pass on the benefit of their experience, including all the stuff an even more experienced teacher passed on to them.

I still very much believe that standards and ethics must be part of a profession of software development. But I'm increasingly convinced that the bedrock of any such profession would be mentoring. I think of all the time I wasted in my early years of programming, and all the things that would have helped enormously to know back then. Even programming for fun in my teenage bedroom would have been made easier with some basic code craft like unit testing and rudimentary version control.

I was very lucky to be exposed to much more experienced "software engineers" who nudged me firmly in the direction of rigorous user-centred iterative software development, mentioning books I should read, newsgroups I should visit, courses I should go on, and showing me with their day-to-day examples techniques I still apply - and teach - today.

I make it my business today to pass on the benefits of the mentoring I received. And that, to my mind, should be the basis for a profession of software development.

For that to work, though, it's necessary that developers stay developers. "Use it or lose it" has never been more true than in software. I see developers I coached 10 years ago get promoted into management roles - sheesh, I know a lot of CTOs, according to LinkedIn - and quickly lose their coding abilities and fall behind with the technology. Their experience might be invaluable to someone starting out, but it's hard to lead by example if the last programming you did was in Visual C++ 6.0 and your junior devs are working in F#.

So, another pillar of this professional foundation must necessarily be parallel career progression - up to CTO equivalent - for developers. Looking for work for the first time in a decade has left me in little doubt that - with a handful of glorious exceptions that I'm exploring - many employers don't want older (i.e., more expensive) developers, and even the most senior dev roles typically pay a lot less than management equivalents. I meet a lot of senior managers who are reluctantly in these roles because they have big mortgages and school fees to pay. They'd much rather have stayed hands-on. If the best potential mentors are disappearing into meeting rooms all day, it will always be impossible to square this circle.

The idea's been floated before - including by me - but I think it's finally time to start a software developers' guild, with the specific purpose of championing long-term mentoring and parallel career progression for devs who want to stay devs.

Who's with me?




March 24, 2018

Learn TDD with Codemanship

Code Craft: What Is It, And Why Do You Need It?

One of my missions at the moment is to spread the word about the importance of code craft to organisations of all shapes and sizes.

The software craftsmanship (now "software crafters") movement may have left some observers with the impression that a bunch of prima donna programmers were throwing their toys out of the pram over "beautiful code".

For me, nothing could be further from the truth. It's always been clear in my mind - and I've tried to be clear when talking about craft - that it's not about "beautiful code", or about "masters and apprentices". It has always been about delivering software that works - does what end users need - and that can be easily changed to solve new problems.

I learned early on that iterating our designs was the ultimate requirements discipline. Any solution of any appreciable complexity is something we're unlikely to get right first time. That would be the proverbial "hole in one". We should expect to need multiple passes at it, each pass getting it less wrong.

Iterating software designs requires us to be able to keep changing the code over and over. If the code's difficult to change, then we get fewer throws of the dice. So there's a simple business truth here: the harder our code is to change, the less likely we are to deliver a good working solution. And, as time goes on, the less able we are to keep our working solution working, as the problem itself changes.

For me, code craft's about delivering the right thing in the short-to-medium term, and about sustaining the pace of innovation to keep our solution working in the long term.

The factors involved here are well-understood.

1. The longer it takes us to re-test our software, the bigger the cost of fixing anything we broke. This is supported by a mountain of evidence collected from thousands of projects over several decades. The cost of fixing bugs rises exponentially the longer they go undetected. So a comprehensive suite of good fast-running automated tests is an essential ingredient in minimising the cost of changing code. I see slow re-testing being a major bottleneck for many organisations, and I've seen the devastating effect long testing feedback loops can have on a business.

2. The harder it is to understand the code, the more likely it is we'll break it if we change it.

3. The more complex our code is, the harder it is to understand and the easier it is to break. More ways for it to be wrong, basically.

4. Duplication in our code multiplies the cost of changing common logic.

5. The more the different units* in our software depend on each other, the wider the potential impact of changing one unit on other units. (The "ripple effect").

6. When units aren't easily swappable, changing one unit can break other units that interact with it. (There's a sketch just below this list illustrating this factor and the previous one.)

* Where a "unit" could be a function, a module, a component, or a service. A unit of reusable code, essentially.

So, six key factors determine the cost of changing code:

* Test Assurance & Execution Time
* Readability
* Complexity
* Duplication
* Coupling
* Abstraction of Dependencies

On top of these, a few other factors can make a big difference.

Firstly, the amount of "friction" in the delivery pipeline. I'd classify "friction" here as "steps in releasing or deploying working software into production that take a long time and/or have a high cost". Manually testing the software before a release would be one example of high friction. Manually deploying the executable files would be another.

The longer it takes, the more it costs and the more error-prone the delivery process is, the less often we can deliver. When we deliver less often, we're iterating more slowly. When we iterate more slowly, we're back to my "fewer throws of the dice" metaphor.

Frequency of releases is also directly related to the size of each release. Releasing changes in big batches has other drawbacks, too. Most importantly - because software either works as a whole or it doesn't - big releases incorporating many changes present us with an all-or-nothing choice. If change X is wrong, we now have to carefully rework that one thing with all the other changes still in place. So much easier to do a single release for change X by itself, and if it doesn't work, roll it back.

Another factor to consider, as an aside, is how easy it is to undo mistakes if necessary. If my big refactoring goes awry, can I easily get back to the last good state of the code? If a release goes pear-shaped, can we easily roll it back to a working version, with minimal disruption to our end customer?

Small releases help a lot in this respect, as do Version Control and Continuous Integration. VCS and CI are like seatbelts for programmers. They can significantly reduce lost time if we have a little accident.

So, I add:

* Small & Frequent Releases
* Frictionless Delivery Processes (build-test-deploy automation)
* Version Control
* Continuous Integration

to my working definition of "code craft".

Note that there's more to delivering software than these things. There's requirements, there's UX, there's InfoSec, there's data management, and a heap of other considerations. Which is why I'm careful to distinguish between code craft and software development.

Organisations who depend on software need code that works and that can change and stay working. My belief is that anyone writing software for a living needs to get to grips with code craft.

As software continues to "eat the world", this need will grow. I've watched multi-billion dollar businesses brought to their knees because their software and systems couldn't change fast enough. As the influence of code spreads into every facet of life, our ability to change code becomes more and more a limiting factor on what we can achieve.

To borrow from Peter McBreen's original book on software craftsmanship, there's a code craft imperative.



March 20, 2018

Learn TDD with Codemanship

Why I Won't Take Automated "Hacker Tests" To Get Job Interviews

I'm back on the contract market - give me a shout if you're in the London area (or looking for remote-workers) and could use a very experienced Java and/or C# bod - and it's been a looong time since I looked for regular work.

Much seems to be the same as it was when I was last contracting: the junior recruiters randomly filtering out candidates because their CV doesn't specifically mention that version of Spring, the dispiriting job ads that effectively say "It's a shitstorm here, but you get foosball and free breakfast!", the banks who make us wait 6 weeks for an interview date, the ever-growing lists of languages, tools and frameworks we're expected to have 500 years' experience of. Yep. It's all as I left it.

But there's something new among all this. More and more of us are apparently being asked to take some kind of automated online coding test before the employer will even consider speaking to us. I was asked this week to take a hackerrank test that lasted 90 minutes. The recruiter said the client was "very positive" about my CV. But, it turns out, this step in the recruitment process was non-negotiable.

I have no problem with being asked to demonstrate technical competence. I kind of do it for a living. I code in front of other developers on training workshops, at conferences, on YouTube, and via my Github account. I'm not hiding anything. If you want to pair program with me on a problem to see the cut of my jib, I'm okay with that.

But I draw the line at these online timed tests. Their focus is necessarily very narrow, for a start. Maths puzzles and algorithms and "stuff about programming languages". That sort of thing. Is this the new "whiteboard interview"? (Flashbacks to interviews where someone wrote some Java on a board and asked "Will that compile?" I'm sorry, I wasn't aware we'd be compiling this software in our heads.)

I think the focus has to be narrow, because there's a limit to what can be scored automatically. Basically, "this is what we know how to measure". I understand that a lot of these tests focus on algorithms. You're asked to solve a problem, and then scored on passing acceptance tests (easy to automate) and execution time (again, easy to automate).

While I agree that passing acceptance tests is kind of important, I worry about the next biggest factor being Big O-style algorithmic efficiency. Maybe my solution is slower, but easier to understand, for example. And it's sneaky, too. If there are performance criteria, we should be told what they are up-front. I'm not in the business of making code faster than it needs to be just as a matter of course.
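
To make that concrete, here's a contrived Java example - not taken from any real test - of two ways to check whether two strings are anagrams. An automated scorer would favour the second on execution time, even though the first is easier to read and fast enough for most purposes:

    import java.util.Arrays;

    // Contrived example: two implementations of the same check
    class Anagrams {
        // Straightforward: sort both strings and compare. O(n log n),
        // but the intent is obvious at a glance.
        static boolean isAnagramSimple(String a, String b) {
            char[] x = a.toCharArray();
            char[] y = b.toCharArray();
            Arrays.sort(x);
            Arrays.sort(y);
            return Arrays.equals(x, y);
        }

        // Optimised: count character frequencies. O(n), but the reader
        // has to work out what the counts array is doing.
        static boolean isAnagramFast(String a, String b) {
            if (a.length() != b.length()) return false;
            int[] counts = new int[Character.MAX_VALUE + 1];
            for (int i = 0; i < a.length(); i++) {
                counts[a.charAt(i)]++;
                counts[b.charAt(i)]--;
            }
            for (int count : counts) {
                if (count != 0) return false;
            }
            return true;
        }
    }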

I also worry about the competition element of some of these tests, especially given the narrow focus. I wouldn't rank "hackers" by their ability to create efficient algorithms alone, or by their in-depth knowledge of Java syntax. To illustrate what I mean, consider measuring something else: in my Team Dojo, developers have to work together to solve a set of non-trivial problems. They also score points for passing acceptance tests. And what I've learned from watching hundreds of teams take this test is that individual technical ability is a poor predictor of team performance. Teams of coding ninjas are routinely outperformed by teams of average devs who just worked together better. It's quite inspiring to watch.

My other objection to taking these tests is the time candidates are expected to invest speculatively, just to be considered for interview. If you're on the market for work, you may be making multiple applications every week. What if they all ask you to take one of these tests, just to be considered? This creates a big overhead for candidates. If you're coming to the end of a contract, have young kids at home, or are caring for a relative, or have other time commitments, where are you going to find 90 minutes in your day just to prove that you know LINQ. Every. Single. Time. You. Apply?

I would be in favour of a website where devs can once-and-for-all demonstrate their competence in something. Not every time an employer says "dance for me!" I thought this site was called "Github", but that shows what I know.

But I'm not in favour of this cookie-cutter-one-size-fits-all approach to filtering. I guess my real gripe about being asked by a well-known Agile consultancy to take a hackerrank test is not that they asked, but that there was simply no other way of demonstrating my coding chops that they'd consider.

In discussing this with other developers, it seems as if there's a "horses for courses" situation here. Not everyone codes in their spare time, not everyone has a portfolio of stuff (e.g., on Github) they can point to. Not everyone shines when they're put on the spot. Not everyone likes to take stuff away and work alone. There's no one single way that will give every developer a chance to show us what they can really do.

Perhaps what I'm saying here is that we should let the candidate choose how they demonstrate technical competence. I might say "take a look at my screencasts" or "let me fire up Zoom.us and pair with one of your devs" or "how about I come in and run a little hands-on workshop?" Someone else might want to do the hackerrank test, perhaps because they lack job experience and need to demonstrate some raw ability, or maybe they just get nervous with new people. Someone else might want to do a - gulp - whiteboard interview because they worry they'll mess up coding in front of other people, but can demonstrate how much they've learned.

The point is that I can tell shit from shinola any of these ways. If you suck, and you have a portfolio, I'll know it from looking at that. If you suck and we pair, I'll know soon enough. If you suck and take a hackerrank test... I'll still want to see the code. But eventually, I'll know. (So might as well look at your Github.) And if you suck and we get around a whiteboard, I'll get it from that, too.

It seems to me that these automated coding tests are an attempt to remove the "it takes one to know one" element from filtering candidates. My contention is that you can't. That kind of machine intelligence is still decades away. Meanwhile, we're stuck with people assessing other people. And it helps enormously if those people know what they're looking at (and what they're looking for.) That's what needs fixing here.

There's no economy of scale in software development. Why would we believe there's economy of scale in software developer recruitment? That's the problem these online tests claim to solve, but - evidently - they haven't. They just filter out experienced candidates like the many developers I've spoken to.

So we might as well let candidates put their best foot forward and let them decide which foot that is.

Otherwise the end result is you filter out a lot of good people who'd be great additions to your team, but who just don't fit in your recruiting process.

Perhaps we need a Dev Recruitment Manifesto?





February 1, 2018

Learn TDD with Codemanship

BDD & Specification By Example - Where Did We Go Wrong?

I've been saving this post up for a while, but with a bit of pre-dinner free time I wanted to put it out there now.

I meet a lot of teams, and one thing many of them tell me is that the "customer tests" they've been driving their designs from are actually written by the developers, not the customer.



Sure, they're written using a "Behaviour-Driven Development" or "Acceptance Testing" tool like Cucumber or Fitnesse. But just because you've built a "granny annex" on your house, if there's no granny living in it, it's just an "annex".

We've dropped the ball on this. The CHAOS report, published every year by the Standish Group, consistently cites lack of customer involvement as the number one factor in project failure. A tool won't fix that.

Especially when that tool wasn't designed with customer collaboration in mind. When your "Getting Started" guide begins "First, install Visual Studio..." or requires your customer to learn a mark-up language or to use version control, arguably you're bound to have a hard time getting them to engage in the process.

Increasingly, I work with teams who want to somehow connect the way their customer actually prefers to capture examples with the way devs like to automate tests. 90% of the time, that means pulling data out of Excel spreadsheets - still the most widely used tool in both communities - into unit tests. Some unit testing frameworks even have that facility built in (e.g., MSTest for .NET). But reading data from spreadsheets is child's play for most developers. With OLE DB or JDBC, for example, a spreadsheet's just a database.
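
For illustration, here's a minimal Java sketch of pulling customer examples out of a spreadsheet using Apache POI - the file name and column layout here are assumptions, not a recipe:

    import java.io.File;
    import org.apache.poi.ss.usermodel.Row;
    import org.apache.poi.ss.usermodel.Sheet;
    import org.apache.poi.ss.usermodel.Workbook;
    import org.apache.poi.ss.usermodel.WorkbookFactory;

    public class SpreadsheetExamples {
        public static void main(String[] args) throws Exception {
            // Assumed layout: column A = input x, column B = input y,
            // column C = the result the customer expects
            try (Workbook workbook = WorkbookFactory.create(new File("examples.xlsx"))) {
                Sheet examples = workbook.getSheetAt(0);
                for (Row row : examples) {
                    if (row.getRowNum() == 0) continue; // skip the header row
                    double x = row.getCell(0).getNumericCellValue();
                    double y = row.getCell(1).getNumericCellValue();
                    double expected = row.getCell(2).getNumericCellValue();
                    // Feed (x, y, expected) into the tests - e.g., as parameters
                    // to a JUnit parameterised test
                    System.out.printf("add(%.2f, %.2f) should equal %.2f%n", x, y, expected);
                }
            }
        }
    }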

But, regardless of the tools, the problem most teams need to solve is a people problem. I've found that close customer involvement is so critical to the chances of a team succeeding at solving the customer's problems that I actually stop development until they engage at the level we need them to. No play? No code.

The mistake many of us make is to give them a choice. "Would you like to spend a lot of time with us discussing requirements and playing with candidate releases and giving us feedback?" "No thanks, ta very much. See you in a year's time."

We made a rod for our own backs by allowing them to be absentee partners, leaving us to figure out what they want and need on their behalf. Specification By Example presents us with an opportunity to make the relationship clearer. The customer has to be "trained" to understand that if they haven't agreed a test for it, they ain't gonna get it.



January 21, 2018

Learn TDD with Codemanship

Delegating "Junior" Development Tasks. (SPOILER ALERT: It doesn't work)

When I first took on a leadership role on a software development team 20 years ago, from the reading I did, I learned that the key to managing successfully was apparently delegation.

I would break the work down - GUI, core logic, persistence, etc - and assign it to the people I believed had the necessary skills. The hard stuff I delegated to the most experienced and knowledgeable developers. The "easy" stuff I left to the juniors.

It only took me a few months to realise that this model of team management simply doesn't work for software development. In code, the devil is in the detail. To delegate a task, I had to explain precisely what I wanted that code to do, and how I wanted it to be (in terms of coding standards, our architecture, and so on).

If the task was trivial enough to give to a "junior" dev, it was usually quicker to do it myself. I spent a lot more time cleaning up after them than I thought I was saving by delegating.

So I changed my focus. I delegated work in big enough chunks to make it worthwhile, which meant it was no longer "junior" work.

Looking back with the benefit of 20 years of hindsight, I realise now that delegating "junior" dev tasks is absurd. It's like a lead screenwriter delegating the easy words to a junior screenwriter. It would also probably be a very frustrating learning experience for them. I'm very glad I never went through a phase in my early career of doing "junior" work (although I probably wrote plenty of "junior" code!)

The value in bringing inexperienced developers into a team is to give them the opportunity to learn from more seasoned developers. I got that chance, and it was invaluable. Now, I recommend to managers that their noobs pair up with the old hands on proper actual software development, and allow for the fact that it will take them longer.

This necessitates - if you want the team to be productive as a whole - that the experienced developers outnumber the juniors. Actually, let's not call them that. The trainees.

Over time - months and years - the level of mentoring required will fall, until eventually they can be left to get on with it. And to mentor new developers coming in.

But I still see and hear from many, many people who are stuck in the hell of a Thousand Junior Programmers, where senior people - often called "architects" - are greatly outnumbered by people still wet behind the ears, to whom all the "painting by numbers" is delegated. This mindset is deeply embedded in the cultures of some major software companies. The result is invariably software that's much worse, and costs much more.

It also leads to some pretty demoralised developers. This is not the movie industry. We don't need runners to fetch our coffee.


ADDENDUM: It also just occurred to me, while I'm recalling, that whenever I examined those "junior" dev tasks more closely, their entire existence was caused by a problem in the way we were doing things (e.g., bugginess, lack of separation of presentation and logic, duplication in data access code, etc). These days, when it feels like "grunt" work - repetitive grind - I stop and ask myself why.

January 3, 2018

Learn TDD with Codemanship

Professionalism & the "Customer"

Just a few words to add to a post I wrote a few days ago about TDD & "professionalism". I scribbled a quick Venn diagram to illustrate my ideas about stuff software development "professionals" should aim for.



A few good folk have understandably raised objections, which is the natural consequence of saying stuff on the Internet. In particular, some folk object to the idea that a "professional" doesn't write code the customer didn't ask for.

What if the customer doesn't know what they want? Should we build something and see if they like it? Call it an "experiment". We could do that. But before we do that, we could discuss it with the customer and seek their input before we build what we're planning to build. A mock-up, a storyboard, or other lo-fi prototype could clue them in as to what exactly it is we're planning to try for them.

And what if we're building software for the general public? How do we seek permission to try ideas?

This is the problem with words.

What exactly is a "customer"? Different teams will be working in different situations with different kinds of "customer". And there are many understandings of what that word means.

To me, the "customer" is whoever decides what the money gets spent on. In relation to professionalism, we can look at our relationship with our "customer" in many ways.

Think of doctors and patients: the doctor doesn't ask the patient "What medicine would you like me to prescribe?" Instead, she examines the patient, diagnoses the illness, and proposes a treatment. But she still seeks permission from the patient to try it. (Unless the patient is unable to give consent.) Arguably, it would be "unprofessional" of a doctor to administer a treatment without telling the patient what it is, what it's supposed to do, and what side effects it might have. There is a dialogue, then there is consent. The patient decides yay or nay, usually.

Or think of it as gambling. In the casino of software development, decisions are made to bet sums of money on features and changes. Some bets will be bigger than others. Some features will have a potentially larger pay-out than others. In that scenario, where we don't know what the outcome is going to be (which is - let's be honest - how it really is in software development anyway), who are we? Are we the gambler? Or are we the croupier? Do we take their money and tell them to go to the bar while we place bets on their behalf? Or do we ask them to sit at the table, and at least seek consent for every bet before it's placed?

And when it's us deciding what features to try, aren't we the "customer"? In this situation, it's our money we're gambling with. Do we randomly write code and see how it turns out? Or do we take aim before we fire? I've found it to be a bad idea to start writing code without a clear idea of what that code's supposed to do, regardless of whether this is decided in a conversation with a "customer", or in a conversation with myself.

One thing is clear to me (and feel free to disagree): all software development is an experiment. So, personally, I don't distinguish between a "spike" and a "finished solution". They're all spikes. I've found I'm genuinely no quicker producing working code when I cut corners. So my spikes have automated tests, and the code's maintainable. (I rarely even write sample code (e.g., for blog posts) without tests any more.) And they're preceded by a conversation in which the purpose of the spike is explicitly agreed, and consent - even if it's my own consent - is given to do it.

Now, like I said in the original post: I don't find discussions about professionalism very helpful. Words are difficult. However I spin it, some folk will object. And that's fine. Don't wanna do it my way? Don't do it. I'm not in charge of anyone except myself.

And isn't that, after all is said and done, the real definition of a "professional"?


December 30, 2017

Learn TDD with Codemanship

TDD & "Professionalism"

Much talk (and gnashing of teeth) about the link between Test-Driven Development and "professionalism". It probably won't surprise you to learn that I've given this a bit of thought.

To be clear, I'm not in the business of selling TDD to developers and teams. If you don't want to do TDD, don't do it. (If you do want to do TDD, then maybe I can help.)

But let's talk about "professionalism"...

I believe it's "unprofessional" to ship untested code. Let me qualify that: it's not a good thing to ship code that has been added or changed that hasn't been tested since you added or changed it. At the very least, it's a courtesy to your customers. And, at times, their businesses or even their lives may depend on it.

So, maybe my definition of "professionalism" would include the need to test (and re-test) the software every time I want to ship it. That's a start.

Another courtesy we can do for our customers is to not make them wait a long time for important changes to the software. I've seen many, many businesses brought to their knees by long delivery cycle times caused by Big Bang release processes. So, perhaps it's "unprofessional" to have long release cycles.

When I draw my imaginary Venn diagram of "Doesn't ship untested code" and "Doesn't make the customer wait for changes", I see that the intersection of those two sets implies "Doesn't take long to test the software". If sufficiently good testing takes weeks, then we're going to have to make the customer wait. If we skimp on the testing, we're going to have to ship untrustworthy code.

There's no magic bullet for rapidly testing (and re-testing) code. The only technique we've found after 70-odd years of writing software is to write programs that automate test execution. And for those tests - of which there could be tens of thousands - to run genuinely fast enough to ensure customers aren't left waiting for too long, they need to be written to run fast. That typically means our tests should mostly have no external dependencies that would slow them down. Sometimes referred to as "unit tests".

So, to avoid shipping broken code, we test it every time. To avoid making the customer wait too long, we test it automatically. And to avoid our automated tests being slow, we write mostly "unit tests" (tests without external dependencies).
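
Here's a minimal sketch - all the names invented - of what I mean by a fast-running "unit test": the real database sits behind an abstraction, and the test swaps in an in-memory stub, so nothing slow gets touched:

    import static org.junit.jupiter.api.Assertions.assertEquals;

    import java.util.List;
    import org.junit.jupiter.api.Test;

    // Invented names, for illustration
    interface InvoiceRepository {
        List<Double> invoiceTotalsFor(String customerId);
    }

    class BillingService {
        private final InvoiceRepository repository;

        BillingService(InvoiceRepository repository) {
            this.repository = repository;
        }

        double totalOwedBy(String customerId) {
            return repository.invoiceTotalsFor(customerId)
                             .stream()
                             .mapToDouble(Double::doubleValue)
                             .sum();
        }
    }

    class BillingServiceTest {
        @Test
        void totalsAllInvoicesForACustomer() {
            // The stub stands in for the real database, so the test runs
            // entirely in memory, in a fraction of a millisecond
            InvoiceRepository stub = customerId -> List.of(100.0, 250.0);
            BillingService billing = new BillingService(stub);
            assertEquals(350.0, billing.totalOwedBy("C123"), 0.001);
        }
    }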

None of this mandates TDD. There are other ways. But my line in the sand is that these outcomes are mandated. I will not ship untested code. I will not make my customer wait too long. Therefore I will write many fast-running automated "unit tests".

And this is not a complete picture, of course. Time taken to test (and re-test) the code is one factor in how long my customer might have to wait. And it's a big factor. But there are other factors.

For example, how difficult it becomes to make the changes the customer wants. As the code grows, complexity and entropy can overwhelm us. It's basic physics. As it expands, code can become complicated, difficult to understand, highly interconnected and easy to break.

So I add a third set to my imaginary Venn diagram, "Minimises entropy in the code". In the intersection of all three sets, we have a sweet spot that I might still call "professionalism"; never shipping untested code, not making our customers wait too long, and sustaining that pace of delivery for as long as our customer needs changes by keeping the code "clean".

I achieve those goals by writing fast-running automated "unit tests", and continually refactoring my code to minimise entropy.
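
To make "minimises entropy" a bit more concrete, here's a small hypothetical Java example of the kind of refactoring I mean - extracting intention-revealing methods without changing the behaviour (the domain types are invented, just enough to compile):

    import java.util.List;

    // Hypothetical domain types, just enough to make the example compile
    record Item(double price, int quantity) {}
    record Customer(boolean isLoyaltyMember) {}
    record Order(List<Item> items, Customer customer) {}

    class Checkout {
        // Before: one method doing two jobs - totalling and discounting -
        // with the logic tangled together
        double totalTangled(Order order) {
            double total = 0;
            for (Item item : order.items()) {
                total += item.price() * item.quantity();
            }
            if (order.customer().isLoyaltyMember()) {
                total = total * 0.9;
            }
            return total;
        }

        // After: the same behaviour, broken into intention-revealing steps
        double total(Order order) {
            return applyDiscount(subtotal(order), order.customer());
        }

        private double subtotal(Order order) {
            double total = 0;
            for (Item item : order.items()) {
                total += item.price() * item.quantity();
            }
            return total;
        }

        private double applyDiscount(double subtotal, Customer customer) {
            return customer.isLoyaltyMember() ? subtotal * 0.9 : subtotal;
        }
    }

The fast-running tests are what make this kind of continual reshaping safe.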

Lastly - but by no means leastly - I believe it's "unprofessional" to ship code the customer didn't ask for. Software is expensive to produce. Even very simple features can rack up a total cost of thousands of dollars to deliver in a working end product. I don't make my customers pay for stuff they didn't ask for.

So, a "professional" developer clearly, unambiguously establishes what the customer requires from the code before they write it.

Now my Venn diagram is complete.



ASIDE: In reality, these are fuzzy sets. Some teams ship better-tested code than others. Some teams release more frequently than others, and so have shorter lead times. Some teams write cleaner code than others. Some teams waste less time on unwanted features than others.

So there are degrees of "professionalism" in these respects. And this is before I add the other sets relating to things like ethics and environmental responsibility. It's not a simple binary choice of "professional" or "unprofessional". It's complicated. Personally, I don't think discussions about "professionalism" are very helpful.


Like I said at the start, TDD isn't mandatory. But I do have to wonder, when teams aren't doing TDD, what are they doing to keep themselves in that sweet spot?



December 17, 2017

Learn TDD with Codemanship

Dev Teams As Assets

One of the biggest giveaways about how a lot of employers view software developers is the way they handle their high-performing teams.

I've seen it happen many, many times; a dev team does a great job delivering something of high value to the business, and at the end the business splits them up without a second thought.


A high-performing team, yesterday


In other lines of work, this would be considered very ill-judged. I struggle to imagine the execs at Apple Records saying "Thanks, John, Paul, George, Ringo. That first album sold gangbusters. Time to split you up."

Some managers misguidedly do it in the hope of "spreading the love", moving developers from their most successful teams on to other teams that may be struggling, hoping some of the magic will rub off.

But development teams are holistic. They succeed or fail as a single unit. I've seen high-performing teams of mediocre developers, and I've seen teams made of 100% code ninjas fail pitifully.

The "magic" we're seeking to replicate exists between team members.

Faced with a lack of control over who they get to work with, some teams take the bold step to move the whole kit and caboodle out of that organisation to retain the magic and apply it somewhere else, for some other client's benefit.

But most developers, with mortgages and school fees and wotnot to pay, can't justify the risks, so they accept their fate and go back to the seemingly much more popular choice of Not Succeeding™ for the remainder of their careers.

In these instances, you lose your high-performing dev team and gain a bunch of newly miserable and demotivated devs. High five!

A rare few organisations recognise the value of dev teams, see them as assets, and invest in them as whole units. They work to ensure the team retains cohesion, even after individual members leave, and as new members join. They know that the better way to spread the love is not to scatter high-performing team members to the winds, but to embed trainees in the team. What you want to rub off on them is likely to be found there, not carried by individual team members.





December 7, 2017

Learn TDD with Codemanship

"This would never have happened if we'd written it in Haskell" - Bah Humbug!

Spurred on by a spate of social media activity of the "We replaced a system written in X with one written in Y, and it was way better" kind, I just wanted to throw my hat into the ring on this topic.

As someone with practical insights into high-integrity software development, I can confidently attest that this is bunk of the highest order. There is no programming language that assures reliability.

Sure, there are languages with built-in features that can help, but you actually have to do stuff to make sure your code works 99.999999999% of the time. Y'know? Like testing and wotnot.

For example, you can inflict all kinds of damage in C, thanks to direct manipulation of memory, but you don't have to abuse those features of the language. A Bugatti Veyron has a top speed of 254 mph, but you don't have to drive it at 254 mph.

"We would never have had that crash if we'd been driving a Volvo" really means "We'd never have had that crash if we'd been driving slower".

If you want to avoid dangling pointers in a C program, you can. It just takes a bit of know-how and a bit of discipline. Don't blame the language for any shortcomings you might have in either. The difference the language makes is small compared to the difference you make.


ADDENDUM: Just to clarify, I'm not saying better languages and tools don't help. What I'm saying is that the difference they make can be very small compared to other factors. How do I know this? Well, I've been programming for 35 years and have worked in a dozen or more languages on real projects. So there's that. But also, I've worked with a lot of teams, and noticed how the same team using different tools gets similar results, while different teams using identical tools can get strikingly different results. So I conclude that the team makes the bigger difference, by orders of magnitude. So I choose to focus more on teams and how they work than on the tools, by orders of magnitude. And it's not as if tools and technology don't get enough focus within the industry :)



October 18, 2017

Learn TDD with Codemanship

12 Things a Professional Computer Programmer Needs to Learn

The last few years have seen an explosion of great learning resources for people interested in getting into computer programming.

But alongside that, I've noticed a growing number of people, who have ambitions to work in the industry as programmers, being bamboozled into believing all it takes is a few weeks of self-paced JavaScript tutorials to reach a professional level.

Nothing could be further from the truth, though. Programming languages are just one small aspect of writing software as a professional (albeit a crucial one).

When learners ask me "What else do I need to know how to do?", I'm typically unprepared to answer. Unhelpfully, I might just say "Loads!"

Here, I'm going to attempt to structure some thoughts on this.

1. Learn to code. Well, obviously. This is your starter for 10. You need to be able to make computers do stuff to order. There's no getting around that, I'm afraid. If you want to do it for a living, you're probably best off learning programming languages that are in demand. As unhip and uncool as they may be these days, languages like Java and C# are still very much in demand. And JavaScript is at the top of the list. To become an in-demand "full-stack" software developer, you're going to need to learn several languages, including JavaScript. Research the kinds of applications you want to work on. Find out what technologies are used to create them. Those are the languages you need to learn.

2. Learn to use Version Control. Version Control Systems (VCSs) are seatbelts for programmers. If your code has a nasty accident, you want to be able to easily go back to a version of it that worked. And most professional developers collaborate with other developers on the same source code, so to do it for a living you'll want to know how to use VCSs like Git and Mercurial to effectively manage collaborating without tripping over each other.

3. Learn to work with customers. Typically, when we're learning to code, we tackle our own projects, so - in essence - we are the customer. It gets a bit more complicated when we're creating software for someone else. We need to get a clear understanding of their requirements, and so it's important to learn some simple techniques for exploring and capturing those requirements. Look into use cases and user stories to get you started. Then learn about Specification by Example.

4. Learn to test software. There's more to making sure our code works than running the application and randomly clicking buttons. You'll need to understand how to turn requirement specifications into structured test scripts that really give the code a proper, in-depth workout. How do we make sure every requirement is satisfied? How do we make sure every line of code is put through its paces? How do we identify combinations of inputs that the code can't handle?

5. Learn to write automated tests. Automated tests are commonly used in Specification by Example to really nail down exactly what the customer wants. They are also crucial to maintaining our code as it grows. Without a decent set of fast-running automated tests, changing code becomes a very risky and expensive business. We're likely to break it and not find out for a long time. Learn how to write automated unit tests for your code, and how to automate other kinds of tests (like system tests that check the whole thing through the user interface or API, and integration tests that check system components work together).

6. Learn to write code that's easy to change. On average, software costs 7-10x as much to maintain over its lifetime as it did to write in the first place. And if there's one thing we've learned from 70 years of writing software, it's that it'll need to change. But, even though we call it "software" - as opposed to "hardware" - because it's easier to change than the design of, say, a printed circuit board, it can still be pretty hard to change code without breaking it. You'll need to learn what kinds of things we do in code tend to make it harder to change and easier to break, and how to avoid doing them. Learn about writing code that's easy to read. Learn about simple design. Learn how to avoid writing "spaghetti code", where the logic gets complicated and tangled. Learn how to shield modules in your code from knowing too much about each other, creating a dense web of dependencies in which even the smallest changes can have catastrophic impact. Learn how to use abstractions to make it easier to swap out different parts of the code when they need to be replaced or extended.

7. Learn to improve the code without breaking it. We call this skill "refactoring", and it's really, really important. Good programmers can write code that works. Great programmers can improve the code - to make it easier to understand and easier to change - in ways that ensure it still works. A function getting too complicated to understand? Refactor it into smaller functions. A module doing too much? Refactor it into multiple modules that do one job. This skill is very closely connected to #5 and #6. You need to know bad code when you see it, and know how to make it better. And you need to be able to re-test the code quickly to make sure you haven't broken anything. Automated Tests + Design Smarts + Refactoring form a Golden Circle for code that works today and can be easily changed tomorrow to meet new customer requirements.

8. Learn to automate donkeywork like building and deploying the software. Good software developers don't manually copy and paste files to production servers, run database scripts, and all of that repetitive stuff, when they want to create test or production builds of their systems and deploy them to a live environment. They program computers to do it. Learn how to automate builds, to do Continuous Integration, and automate your deployments, so that the whole delivery process can become as easy and as frictionless as possible.

9. Learn about software architecture. Is your application a mobile app? A website? A Cloud service? Does it need huge amounts of data to be stored? Does it need to be super-secure? Will some features be used by millions of users every day? Will it have a GUI? An API? Is the data really sensitive (e.g., medical records)? We have 7 decades of knowledge - accumulated through trial and error - about how to design software and systems. We have principles for software architecture and the different qualities we might need our software to have: availability, speed, scalability, security, and many more. And there are hundreds of architectural patterns we can learn about that encapsulate much of this knowledge.

10. Learn to manage your time (and yourself). You might enjoy the occasional late night working on your own projects as a beginner, but a professional programmer's in this for the long haul. So you need to learn to work at a sustainable pace, and to prioritise effectively so that the important stuff gets done. You need to learn what kinds of environments you work best in, and how to change your working environment to maximise your productive time. For example, I tend to work best in the morning, so I like to get an early start. And I rarely spend more than 7-8 hours in a day programming. Learn to manage your time and get the best out of yourself, and to avoid burning out. Work smarter, not harder, and pace yourself. Writing software's a marathon, not a sprint.

11. Learn to collaborate effectively. Typically, writing software is a team sport. Teams that work well together get more done. I've seen teams made up of programmers who are all individually great, but who couldn't work together. They couldn't make decisions, or reach a consensus, and stuff didn't get done because they were too busy arguing and treading on each other's toes. And I've seen teams where everyone was individually technically average, but as a single unit they absolutely shone. Arguably, this is the hardest skill, and the one that takes the longest to master. You may think code's hard. But people are way harder. Much has been written about managing software teams over the decades, but one author I highly recommend is Tom DeMarco (co-author of "Peopleware"). In practice, this is something you can really only learn from lots and lots of experience. And increasingly important is your ability to work well with diverse teams. The days when computer programming was predominantly a pursuit for western, white, middle class heterosexual men are thankfully changing. If you're one of those people who thinks "girls can't code", or that people from third-world countries are probably not as educated as you, or that people with disabilities probably aren't as smart, then I heartily recommend a different career.

12. Learn to learn. For computer programmers, there's 70 years of learning to catch up on, and a technology landscape that's constantly evolving. This is not a profession where you can let the grass grow under your feet. People with busy lives and limited time have to be good at making the most of their learning opportunities. If you thought you'd learned to learn at college... oh boy, are you in for a shock? So many professional programmers I know said they learned more in the first 6 months doing it for a living than they did in 3-4 years of full-time study. But this is one of the reasons I love this job. It never gets boring, and there's always something more to learn. But I've had to work hard to improve how I learn over the years. So will you. Hopefully, the more you learn, the clearer the gaps that need filling will become.

So, those are the twelve things I think you need to learn to be a professional computer programmer. Agree? What would be on your list? You can tweet your thoughts to @jasongorman.