May 15, 2013
How Can You Attract And Retain Great Developers?
Companies often wonder how they can attract and retain great software developers.
Well, here's the thing about great software developers. They don't approach what they do as just a job. To them, it's a passion, a calling. They do it because they love to do it.
To make sense of this, let's change the context of the question.
You have started a band. How can you attract and retain great musicians for your band?
Here's how you might not do it:
1. Constantly remind them that this is your band and that they must do as you tell them
2. Get them playing awful, tacky music (e.g., that song from Four Weddings & A Funeral, anything by Meatloaf) at weddings and school proms
3. Force them to play with crappy musicians who make tonnes of mistakes, and don't give them time and space to help those musicians improve. And then blame them if it sounds crap.
4. Consistently approach the band's musical output with a "that'll do" attitude. "Yeah, the vocal's off-key in the chorus, but we're on a deadline so let's just print the CDs already"
5. Make unrealistic demands of them. "We're going to be playing 2 shows a day for the next 6 months", "We've got 5 days to rehearse this 90-minute set"
6. When they do something amazing, ignore it. Focus on you. You're the band leader, after all.
7. Routinely remind them that they are dispensable. Great musicians grow on trees, remember?
8. Discourage a musician-led culture in your band, and restrict time to practice, learn and grow as musicians. You're there to make money. Who gives a shit about day trips to NAMM or time off to attend guitar clinics?
9. Most important of all, remember: when the band's a success, it's because you're a great band leader. When it's a failure, it's because your musicians suck.
Now, ask me again: how can you attract and retain great software developers?
December 23, 2012
Lowering The Bar Is Not The Answer
Just time for a pre-Xmas rant before I am overcome for the holidays by gin and marzipan.
I keep seeing more and more adverts for Learn To Code-style training courses. While I'm delighted at this new found interest among people to learn how to write computer programs - something I wish everyone would have a go at at some point in their lives - I'm also very concerned about some of the unrealistic expectations these courses seem to be setting.
One advert I saw claimed to teach complete beginners three programming languages in a single day. Putting aside that two of those "programming languages" were HTML and CSS, the implication was that people who attended this course now understood web development. Well, hmmm.
Other courses charging thousands of dollars or pounds claim they can teach you to be a software developer in 10-12 weeks. Again, hmmmm. I haven't met a competent software developer who hasn't been programming for at least a few years.
We've been here before, of course. During the last dotcom boom, employers were desperate to find web programmers so they could cash in on the bubble before it inevitably burst. We'd interview a hundred developers and find maybe one or two who really could walk the walk. If you need 10, then the solution is to interview a thousand. But this takes time; too much time for the "must have it now" dotcom fanatics.
A competent software developer is a million miles from "Hello, world!" More accurately, a competent software developer is several years and a bunch of non-trivial projects beyond "Hello, world!" Taking a cooked meal and adding a bit of salt to it does not make one a chef.
Anyway, the bubble burst, but thousands of these non-competent programmers remained. And remain to this day; clogging up the profession, inundating employers with their CVs and drowning out the competent developers (who are often, sadly, less inclined to make the kinds of naive boasts that win jobs.)
Claiming that you can teach a room full of total newbies to program - even to a basic level - in three languages in a day is also a very naive boast. As is claiming you can turn someone into a software developer - even an entry-level one - in 3 months.
There is no shortage of software developers. Consider that not all developers are equal, and some developers achieve more than others. In reality, 80% of the working code in operation today can probably be attributed to a small proportion of us. The rest just get in the way. If anything, if we thinned the herd down to just the stronger programmers, more might get done.
What we need, as a profession and for the sake of our economy, is better software developers, doing better work.
Lowering the bar is likely to be counterproductive.
November 21, 2012
The Budgetary Autonomy Coefficient & Why You Should Treat Developers As Partners, Not Employees
In complex professional work, it's very important that the people doing the work - the ones who understand it best - are able to make decisions when they need to be made.
So important, in fact, that when I'm working with a team, I always look for signs that might help me gauge how things are in that respect.
In some organisations, management put a lot of trust in software developers. If the team says they need new dev servers, they get new dev servers. It may be a £10,000 decision, but a medium-sized team will burn through that in a few days. Their time is worth more than the decision.
In other organisations, if developers say they need some more index cards or some Blu-Tack, they have to go through a standard internal purchasing procedure and may have to wait a week or two to get them.
I've seen teams wait 6+ months to get software licenses worth less than £200, and I've seen teams order and pay for £20,000 of training in under a week.
So I think it may be useful to have a simple measure to help visualise the extent of a developer or a team's budgetary autonomy - or lack thereof.
How big a purchase can you make without asking anyone's permission? And how much is your time worth?
Let's assume that the value of a developer or a team outweighs its cost (yes, this is not always true, but let's average it out.) So you're worth at least what it costs to employ you - and that includes factoring in fixed overheads like the office space you use and the cost of hiring you in the first place. For an average UK software developer, that's close to £100,000 a year, or £2,000 a week.
Granted, a big purchasing decision might take more time to process, but this is all about weighing things up against each other.
If the lead time on buying a £200 monitor is 6 weeks, for example, divide that £200 by your cost for 6 weeks (£12,000). That would give you a Budgetary Autonomy Coefficient (as I'm now calling it) of 1/60. If you can get the monitor the very next day (e.g., just go online and order it for next-day delivery), it would be 1/2. If you could pop down the road to PC World and pick one up that afternoon, it would be 1.
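For the sake of illustration, that arithmetic can be sketched as a tiny function. The £2,000-a-week figure and the monitor examples are the ones from this post; the function name (and the assumption of a 5-day working week) are my own:

```python
def autonomy_coefficient(purchase_cost, lead_time_weeks, weekly_cost=2000):
    """Budgetary Autonomy Coefficient: the cost of a purchase divided by
    what your time costs over that purchase's lead time (using the rough
    figure of ~2,000 GBP a week for an average UK developer)."""
    return purchase_cost / (lead_time_weeks * weekly_cost)

# The 200 GBP monitor examples, assuming a 5-day working week:
print(autonomy_coefficient(200, 6))       # 6-week purchasing process -> 1/60
print(autonomy_coefficient(200, 1 / 5))   # next-day delivery -> 1/2
print(autonomy_coefficient(200, 1 / 10))  # picked up the same afternoon -> 1
```

The smaller the number, the more your organisation is spending on your time to save it the bother of trusting you with a decision worth less than that time.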
Now, remember, this is not an absolute, scientific measure based on the cost of making that decision. It's just a finger-in-the-air indicator of how easy it is for you to make and execute those kinds of decisions.
And, I find, that can be a wider indicator of how much decision-making power you and your team have generally.
So, yes, monitors and index cards and Blu-Tack are fairly trivial things. You'll get by, I'm sure. But the same lack of autonomy that makes them less accessible to you has a tendency to translate into a lack of autonomy over more pressing technology decisions, and decisions over the way the team works, and even who works on the team.
As a business owner, I have a default Autonomy Coefficient of 1 at all times. If the business can afford it, it's entirely my decision.
This is why I believe the best approach to managing teams is to treat them as businesses. In fact, from an accounting perspective, many organisations do. Well, sort of. It's common practice to manage departments as profit & loss centres. But that's only as far as budgetary accountability goes. They generally don't give them the same level of autonomy that a real profit & loss centre would need to function.
Treat team members as partners, not employees.
September 16, 2012
Are Woolly Definitions Of "Success" At The Heart Of Software Development's Thrall To Untested Ideas?
In the ongoing debate about what works and what doesn't in software development, we need to be especially careful to define what we mean by "it worked".
In my Back To Basics paper, I made the point that teams need to have a clear, shared and testable understanding of what is to be achieved.
Without this, we're a ship on a course to who-knows-where, and I've observed all manner of ills stemming from this.
Firstly, when we don't know where we're supposed to be headed, steering becomes a fruitless exercise.
It also becomes nigh-on impossible to gauge progress in any meaningful way. It's like trying to score an archery contest with an invisible target.
To add to our worries, teams that lack clear goals have a tendency to eat themselves from the inside. We programmers will happily invent our own goals and pursue our own agendas in the absence of a clear vision of what we're all meant to be aiming for.
This can lead to excess internal conflict as team members vie to stamp their own vision on a product or project. Hence an HR system can turn into a project to implement an "Enterprise Service Bus" or to "adopt Agile".
Since nobody can articulate what the real goals are, any goal becomes more justifiable, and success becomes much easier to claim. I've met a lot of teams who rated their product or project as a "big success", much to the bemusement of the end users, project sponsors and other stakeholders, who can take a very different view.
There are times when we can display all the misplaced confidence and self-delusion of an X Factor contestant who genuinely seems to have no idea that they're singing out of tune and dancing like their Dad at a wedding.
Much of the wisdom we find on software development comes from people, and teams, who are basing their insights on a self-endowed sense of success. "We did X and we succeeded, therefore it is good to X" sort of thing.
Here's my beef with that: first off, it's bad science.
It's bad science for three reasons: one, a single data point doesn't make a trend; two, perhaps you have incorrectly attributed your success to X rather than one of the myriad other factors in software development; and three, can we really be sure that you genuinely succeeded?
If I claim that rubbing frogspawn into your eyes cures blindness, we can test that by rubbing frogspawn into the eyes of blind people and then measuring the acuity of their eyesight afterwards.
If, on the other hand, I claim that rubbing frogspawn into your eyes is "a good thing to do", and that after I rubbed frogspawn into my eyes, I got "better" - well, how can we test that? What is "better"? Maybe I rubbed frogspawn into my eyes and my vocabulary improved.
My sense is that a worrying proportion of what we read and hear about "things that are good to do" in software development is based on little more than "how good (or how right) it felt" to do them. Who knows; maybe rubbing fresh frogspawn in your eyes feels great. But that has little bearing on its efficacy as a treatment.
Without clear goals, it's not easy to objectively determine if what we're doing is working, and this - I suspect - is the underlying reason why so much of what we know, or we think we know, about software development is so darned subjective.
Teams who've claimed to me that they're "winning" (perhaps because of all the tiger blood) have turned out to be so wide of the mark that, in reality, the exact opposite was true. These days, when I hear proclamations of great success, it's usually a precursor to the whole project getting canned.
The irony is that those few teams who knew exactly what they were aiming for often measure themselves more brutally against their goals, and are more pessimistic, despite in real terms being more "winning" than teams who were prematurely doing their victory lap.
This, I suspect, has also contributed to the dominance of subjective ideas in software development. Ideas backed up by objective successes seem to be expressed more tentatively and with more caveats than ideas backed up by little more than feelgood and tiger blood, which are expressed more confidently and in more absolute terms.
The naked ape in all of us seems to respond more favourably to people who present their ideas with confidence and a greater sense of authority. In reality, many of these ideas have never really been put to the test.
Once an idea's gained traction, there can be benefits within the software development community to being its originator or a perceived expert in it. Quickly, vested interests build up and the prospect of having their ideas thoroughly tested and potentially debunked becomes very unattractive. The more popular the idea, and the deeper the vested interests, the more resistance to testing it. We do not question whether a burning bush really could talk when we're in the middle of a fundraising drive for the church roof...
It's saddening to see, then, that in the typical lifecycle of an idea, publicising it often precedes testing it. More fool us, though. We probably need to be much more skeptical and demanding of hard evidence to back these ideas up.
Will that happen? I'd like to think it could, but the pessimist in me wonders if we'll always opt for the shiny-and-new and leave our skeptical hats at home when sexy new ideas - with sexy new acronyms - come along.
But a good start would be to make the edges of our definition of "success" crisper and less forgiving.
September 13, 2012
We Can Learn A Lot About Collaborative Design From Aardman
Scientists have learned a great deal about humans by studying other animals and looking for similar attributes (and differences) that mark out what it means to be "human".
In particular, we've learned an enormous amount from studying our closest cousins, the Great Apes.
I've been pondering what software development's closest cousins might be, and what we could learn from them.
While watching Aardman's The Pirates In An Adventure With Scientists, it suddenly struck me that perhaps the endeavour that most closely resembles software development is animation.
We face strikingly similar problems to animators.
Firstly, we're both trying to tell compelling stories. Software, when it's done well, has a clear narrative, just like an animated movie. This narrative can be expressed in many ways, and - just as it is with animation - the process of producing working software can be thought of as telling and re-telling the story, adding more detail and refining it until the story's told in executable code.
The second similarity is that we both have to overcome the extreme difficulty of taking care of millions of tiny details without losing sight of the big picture.
Programming is inherently fiddly; far too fiddly for most people to be bothered with. What other kind of person would devote the lion's share of their lives to the kind of minutiae we do? Well, animators for one.
A single animation unit working on a film like "Pirates" might produce 4 seconds of usable action in a week. Each second of film is made up of 24 frames, each of which has to be painstakingly manipulated, with dozens of details changing from frame to frame that they have to keep track of.
And yet, working one frame at a time, tracking myriad interconnected elements, Aardman are able to produce something miraculous; something that many live action films fail to capture - comic timing.
Fight scenes, chase scenes, comedy - all of this is hard enough to get right shooting at 24 frames a second. To execute it so perfectly working one individual frame at a time requires something that, sadly, too many software teams lack - a clear vision.
The split-second timing and the exquisite dynamics of an Aardman animation are no accident. The mechanics of the overall narrative, every scene and every shot are carefully choreographed with storyboards, animatics (more animation) and with people performing the action to match the voice recordings of the actors, so that the animators can see how it should look and work towards realising that vision.
And with as many as 40 units working on different shots at any given time, this vision not only needs to be clear but it also must be a shared vision.
The rules that apply to each character - including non-living characters like the ocean and the wind - have to be clearly established so that no matter which team is animating those characters, they behave in a way that's consistent to their character. It would do little for the movie if the Pirate Captain inexplicably moved and behaved in 40 different ways through the movie, depending on who was animating him.
The objects in our software - however you choose to interpret that word - are the characters in our stories. As the design evolves and grows, it is extremely important to maintain a clear, shared vision of those objects and how they behave, as well as the narratives in which those objects play a part.
Watching "Pirates", something else jumps out at me; the extraordinary consistency of quality. Aardman have very high standards, and these standards seem to have been applied across the board.
I don't doubt that there were animators working on that film with less experience than some of the others. I don't doubt that some animators were probably learning this craft on the job. Where else do they get their great animators from? That scope and quality is not evident in art and film schools. I suspect you can only really learn to make films of Aardman quality working for someone like Aardman.
But there's not a scrap of evidence for less experienced animators in the movie. Every scene and every shot is sublime. If someone was screwing up, then it must have ended up on the cutting room floor or at the back of shot where nobody noticed.
The greatest animators are masters of collaborative design. I believe there's much we could learn from companies like Aardman about telling compelling stories, about establishing a clear shared vision, about getting the tiniest details right while not losing sight of our "comic timing", and about committing to consistently high standards of quality.
August 17, 2012
Software Apprenticeships Summit, Sept 20th
On Sept 20th I'll be chairing a summit for people interested in long-term mentoring of aspiring software developers.
I'll explain a bit of the background. For the last year, I've been looking into this whole question of apprenticeships for software developers, talking to employers, universities and professional bodies who might be interested in getting involved. And guess what? They aren't.
With very few exceptions, it seems, the traditional alliance between employers, higher education and professional institutions hasn't got legs when we're talking about real and genuinely meaningful apprenticeships for developers.
This leaves two main groups still in the game. There are young people out there who are interested in learning how to be software developers and who've contacted me asking about apprenticeships. And there are practitioners who've expressed willingness to take on apprentices in some form.
The good news is that, in theory, that's all we need to get started.
I plan to take on two apprentices in the next year. Alas, I'm not in a position to offer them employment. Doubtless, many of us won't be. But what I am able to offer is ongoing guidance and mentoring, as well as opportunities that they might not otherwise have found.
As a mentor, I'll enter into a contract with my apprentices that stipulates a roadmap for what I want them to learn and to do, and will work with them on a regular basis - e.g., a couple of hours a week - to offer guidance and to pair program with them.
Once a year - probably during summer recess - I'll ask my apprentices to undertake a significant challenge. They'll be tasked with creating working software of the order of a dozen or so use cases for some good cause. I'll be acting as the "customer" and monitoring their progress, keeping an eye on the quality of the software they create.
Year on year, the challenges will get more sophisticated and the quality bar will be set higher. My aim is that after a few years, the projects will be not just like real-world software development, but a whole heap better than that. Being fiendish, I plan to make them build on the code they wrote in the previous year, and improve it year on year. Yes, that much better!
Outside of development skills, I'll also be helping them out by paying for them to attend a couple of conferences each year, so they can meet real developers and see what the zeitgeist is like.
I'll be asking them to blog throughout, and eventually to teach and mentor other developers, as I feel that can be a hugely valuable experience.
And, if they do well, I'll be promoting them as professionals as they become fully rounded developers. My hope is that when they apply for their first development job, they'll not just have solid development skills, people skills and experience of writing software under similar constraints to industry, but they'll be known quantities in our community, with a body of work people can look at, blogs, talks at conferences and other public-facing stuff people can judge them on. And judge me on, as their mentor.
Perhaps in 5-6 years' time, Codemanship might be in a position to take them on full-time. But that is not the be-all and end-all. I'm fully prepared that this will cost me time and money and that I personally won't gain (in those terms) from doing it.
For those among you who feel that anyone who does all this and gets nothing in return is a fool, I'd like to introduce you to this thing called society. Software development as a whole could benefit, and that's plenty benefit for me. I'll also get a kick out of doing it. I'm funny like that.
What I'd really like is to see a bunch of us take on apprentices, and then we can share this experience and amplify the benefits. If we can agree on a basic foundation - a shared vision of what we think it means to be a software developer that any apprentice mentored by us would have to live up to - and co-ordinate and collaborate, I believe a lot more could be achieved at a national and maybe even an international level.
So I'm organising this little get-together at Bletchley Park on Sept 20th to set out my stall, so to speak, and explain what I'm going to be doing, and then no doubt have a lively discussion with others like me, kicking ideas around in an informal setting, to see if we can begin to point ourselves in roughly the same direction.
My proposal is that we form a loose alliance beneath a recognisable banner - e.g., a guild, or an institute, or something else that wouldn't look out of place on an apprentice's CV - establish a foundation for skills and knowledge (without smearing marketing hype all over it, I hope) and also decide where/how we set the bar for mentors. Because not every developer's necessarily going to be a great role model, let's face it.
This alliance might do little more than promote a shared vision, act as a gatekeeper to filter out ne'er-do-wells, and maybe organise a conference where applicants can meet mentors once a year (in the spring?), and possibly even graduation challenges where apprentices prove their mettle on a bigger project.
Strength in numbers, basically.
If you're thinking of mentoring a software developer, and would like to talk with others like you, I really hope you can join us on Sept 20th.
August 14, 2012
I Was Worried About Apprenticeships. Now I'm Resolved.
I'm worried.
No, not about whether New Girl will get another series.
I'm worried about apprenticeships. Apprenticeships for software developers, specifically.
Over the last year my focus has been shifting inexorably towards apprenticeships. Whichever angle I approach it from, I seem to always arrive at apprenticeships as the best potential answer to the question "where will the next generation of great software developers come from?"
I've been talking to employers, to aspiring apprentices (one of whom I have decided to take on as my own "apprentice" when he begins his degree studies - and I'm looking for one more, if anyone out there's interested) and to various august institutions of learning and professionalism, as well as having a whole bunch of drunken conversations with my fellow practitioners.
And what I'm hearing - the general themes that are emerging - worry me.
Theme Number 1 - sing along if you know the words - has emerged from employers. We are at odds. Most practitioners I give any credence to believe that an apprenticeship of 5-7 years might be sufficient time to "grow" a proper software developer. Most employers don't think beyond 3 years. If companies were to take on apprentices, I fear they would be looking to "speed up" this process, taking many shortcuts and ultimately lowering the bar. The evidence corroborates this. I've seen a lot of companies offering inadequately short apprenticeships of a few months, maybe a year. The longest I've seen is 18 months.
Theme Number 2 is differing expectations about where the bar should be set. I, as you probably know, have little interest in cultivating anything short of excellence. Maybe if you're hiring developers out by the hour to clients who can't tell the difference, then mediocrity is worth money to you, but I've worked with those teams and I would be vehemently opposed to real apprenticeships becoming part of that scam. Let's start as we mean to go on, shall we; honestly and with noble intentions. But how many employers have such high standards? Indeed, how often have you worked in a place that rewarded striving for excellence over vulgar political pragmatism? I am not interested in apprenticeships for greasy pole climbers. They belong in business schools.
Theme Number 3 is most troubling of all. Institutions that could support and co-ordinate apprenticeship schemes at national and international levels have given me strong hints that they're viewing apprenticeships as a source of income or as a source of greater influence. None seem all that interested in the apprentices themselves. My fear is that, seeking the largest audience possible, apprenticeships under their governance might be designed to fit the lowest common denominator.
Theme Number 4 is a common sentiment among those who fear they might lose out to apprenticeships; in particular, institutions of higher education. I make no bones about it - education for software developers isn't working. Kids are spending 3-4 years studying computing or software engineering and emerging blinking into the harsh light of the real world effectively still at square one as software developers.
This could be because universities are preparing them for a world that doesn't exist; a world where we generate code from UML models and use mathematical proofs to test our shopping cart code. Most computing graduates have never written a unit test. Most computing graduates have never refactored legacy code. Most computing graduates have never worked alongside other developers on a shared code base at the same time.
I've spent 12 years trying to collaborate with universities on developing courses that offer real hands-on experience of actual software development - and not just a week's worth - and it always falls down at the same hurdle. Universities teach what they teach because that's what their teachers know how to teach. Inevitably our partnerships evolve from enthusiastic lunches with department heads who are "100% with me all the way" to frustrating meetings with senior lecturers with beards and sandals who 100% insist that the course must contain a module on Z and on compiler design.
The fact is that more than half of computer science graduates who work in software took a CS degree because they wanted to work in software. Nobody's denying that the theory's useful. And nobody's suggesting that they don't teach them the theory. But the brick wall I hit time and again is this insistence that the real world has got it wrong, and students have nothing to learn from those of us who debase ourselves by working in it. And so theory's all most computing graduates get. And, especially in software engineering, a lot of that theory is demonstrably wrong.
My ultimate goal is that apprenticeships should work. And they'll have to work in the real world, where employers are short-termists, where excellence isn't valued, where companies and institutions have their own agendas, and where the academic institutions won't help you because they can't.
In my mind, that just leaves you and me.
An employer's unlikely to commit to 5+ years during which time a considerable amount of learning's going on (though, it's going on all the time under their noses no matter how experienced their developers are - but don't tell them, or they'll assume you're not busy enough and relieve you of some of that "slack"). But I can. I know I can (sudden death or unexpected eloping to Fiji with Julia Sawalha permitting.)
What I can't do is pay someone for 5+ years.
And so we come to the compromise where our plucky apprentice gets to have her cake and eat it.
The world carries on as normal. Kids seeking careers as great software developers take their A-Levels, apply to university and do their computing degrees along with all the consultancy fodder. They study hard. They graduate. They apply for jobs as software developers. Just like they were probably going to do anyway.
The change I'm proposing happens alongside all of that. A person with considerable proven knowledge and experience working as a software developer takes them on as an apprentice.
They spend time with their apprentice every week (maybe a few hours on Skype of a weekend, maybe a face-to-face) guiding them, mentoring them and helping them to develop as fully-rounded software developers. This commitment - this bond - between the apprentice and their mentor (let's not call them "masters", eh?) will endure for years. Certainly well into the apprentice's career, with the amount of guidance needed gradually diminishing until this becomes a relationship of equals.
As well as guiding them to become better developers, we would also nurture them as professionals - gradually introducing them into the software development community and encouraging them to actively engage with their peers and do more than just write code for money.
Eventually, I would hope these apprentices will become mentors themselves, and perpetuate the relationship from one generation to the next. And, as mentors, we would be as much defined by the achievements and the conduct of our apprentices as we are by our own.
To avoid saddling them with an experience that means little to the rest of the world, I'd also seek to engage with other mentors and apprentices to build a consensus that means that my apprentice can command the same respect for her achievements from another mentor as that mentor might give to their own apprentices. Yes, I'm afraid this is going to mean that we'll need to agree on some things. That, in itself, will make for an interesting experiment.
So this is the end of my journey of research on apprenticeships, and the beginning of my journey doing it for real.
July 27, 2012
Great Software Ideas #4751 - Eat Your Own Dog Food
Here's a random Friday thought to end the week before the behemoth we call "The Olympics" shuts London down for 2 weeks. (Imagine what £11 billion could have done for, say, science! But, hey, running and jumping's important, too. Right?)
Anyhoo, moan moan moan and so on.
One thing that often strikes me on software projects is how unaware developers can sometimes seem of what it is like to use their software.
It's a bit like customer service. Here in the UK, we're famed throughout the world for our truly awful customer service. We complain endlessly about the poor service we get from companies, while failing to see the irony that this poor customer service is being dished out by - well, not to put too fine a point on it - us.
A lot of businesses have no idea that their products and services suck. When you watch these TV shows where the boss goes "back to the floor", they always seem genuinely surprised to discover that all is not well in their company.
This obliviousness may be commonplace in software. Our reputation as an industry for quality is by no means enviable. And I'm sure we've all had experiences with tech support that suggest that, just maybe, software companies are also blissfully unaware that their products suck to one degree or another.
Rather than bury our heads in the sand, or, worse, get angry and defensive about it ("I mean, obviously, if you want to send a blind carbon copy of the document you press the button with the picture of an Elf on it! Duh!"), perhaps matters could be improved if more of us tasted our own dog food.
I led a team on a job seekers web site many moons ago, and the most damning verdict I can give on it today is that, when seeking the exact kind of work this site specialises in, I've never used it. I did try once, months afterwards, and quickly decided it wasn't working for me.
Looking back, I should have tried it while we were iterating the design. I might then have noticed how cumbersome and clunky it was, or how off-the-mark the search results were, and how out of date the job postings were.
The site was designed entirely from the advertiser's point of view, it transpires, with barely lip service paid to job seekers.
Many times since, I've made a point, if I can, of becoming a user of the software I'm working on - though that's not always possible (e.g., a private banking web site that requires a minimum investment of £100,000). But it should almost always be possible to simulate that experience, at least. In the case of the bank, for example, we could create a mirror version that uses simulated money against real financial instruments and play a Monopoly Money version of being a real user. This relates back to the Model Office idea I talked about in the last blog post.
If you make a promise to yourself today to eat your own dog food, I would expect it to have quite a profound effect on your attitude to design and development. There should be at least some part of us that's aligned with the users, and wants what they want. Or is at least capable of understanding why they want it and why it's important to them.
I've found no better way of understanding our users than walking a mile in their shoes.
July 10, 2012
Software Apprentices Will Need Insights, Not BuzzwordsA lot of the debate that goes on in the world of software development about the processes, practices and techniques that we should be applying seems to hinge on what we choose to call what we do.
This has the effect of creating the illusion that nobody really agrees on anything. And as my thoughts turn more exclusively to apprenticeships for software developers, this presents something of a problem.
If John Q Apprentice learns how to write software at Company X, there's no guarantee that he'll come away from that with knowledge and skills that Company Y would agree are important. My mind borks at the prospect of "Agile apprenticeships" or "Scrum apprenticeships" or "Extreme Programming apprenticeships", because I fear for apprentices being sold such narrow perspectives on what is, in fact, a very wide discipline.
I know I bang on about the potential for "evidence-based" approaches, but this is the real reason why. I believe we have a responsibility to young, impressionable minds to find a way for them to learn their craft (there, I've said it) without bamboozling them with buzzwords and brand names.
I'm planning to take on 2 "apprentices" in the near future (two undergraduates who I'll mentor throughout their degree studies and beyond), and this problem's weighed heavily on my mind.
What I really want to do, apart from giving them an opportunity to get a thousand or more hours of good, focused hands-on practice before they hit the job market, is give them a thorough grounding in the underlying principles of software development - free from fashions and fads - and use this practice time to help them internalise those principles until they become part of their developer DNA, so to speak.
Most importantly, I don't want to present a picture of software development that's personal, subjective and founded on little more than anecdotes. It's the height of arrogance, in my (arrogant) opinion, to tell people "you'll just have to take my word for it". I don't want to saddle two bright and enthusiastic young developers with the "Jason Gorman way" of writing software. That's a burden I wouldn't wish on anyone. Except maybe myself. Well, even then...
Fortunately, it's not necessary. In all the areas that count, much work has been done in the last few decades to establish principles upon which one could base a perfectly workable discipline of software development.
We know, for example, that more feedback more often, and more meaningful feedback, helps us solve complex problems more effectively and more economically than trying to get it right first time.
And we know that close collaboration with our customers is a major factor in project success rates.
Just as we know that testing earlier and more frequently throughout the development process catches problems sooner, making them so much easier to fix that the time saved later often outweighs the extra time spent testing.
There are a bunch of underlying principles upon which I feel I could build a good apprenticeship without lazily resorting to throwing buzzwords out there and saying "trust me, it works". My ambition is that apprentices will not only be fluent in the practices they apply, but will also know why they're applying them, and can find ways to apply these underlying principles regardless of the specific environments they find themselves in.
Buzzwords may last them a few productive years before they fall out of favour, but I'm hoping some key fundamental insights will serve them throughout their careers.
July 8, 2012
Testing The Testers - A Vague Hiring Process.Over Sunday lunch with a tester friend today, I got to thinking about testing interviews.
There's been quite a lot of good ideas floating around recently about interviewing developers (e.g., Hibri Marzook's Pair Programming Interviews workshop at SC2012), and I've seen testers put through their paces with what are essentially developer interviews, too.
But testing is not programming - though programming may well be involved. On the principle that if you want to see if a juggler can juggle, ask to see them juggle, what kind of practical techniques could we use to put a tester through his or her paces?
What occurred to me over lunch is that there'd be three distinct areas I'd look into.
The most obvious is the tester's ability to find bugs. Bring them in (after some basic vetting to weed out the testers who, let's face it, just aren't - still too many of those about, sadly) and sit them down with a copy of some software in which there are known bugs. Then give them a fixed amount of time to find those bugs, and document them in a useful way (i.e., how to reproduce them.)
This is sort of a human variant of mutation testing. We test the tester by introducing known defects into the code and then see if they can find them.
We could make it more meaningful by introducing the bugs in places where bugs would be more likely to lurk (long/complex methods, multithreaded code accessing global variables etc) so that they could use their understanding of the relationship between code and quality to make educated guesses. You could also include an incomplete automated test suite so they could look for parts of the software that aren't being tested, where bugs are more likely to lurk. You could even be really cheeky and leave a test failing, to see if they even bother to check. You might also like to leave them a pile of user stories with points assigned to them by the customer for relative value, or feature usage statistics, to test their ability to not only find bugs, but find the most important ones first.
There's more to being a tester than finding bugs, of course. So the second thing I'd want to look into is the tester's ability to drive out the details of what a customer wants and "bridge the communication gap", as Gojko Adzic puts it.
One way I thought of might be to get a "customer" - a non-technical domain/application expert - to describe features of an existing piece of software to our candidate. The candidate can ask questions and use examples and test cases to firm up their understanding of what it should actually be like, eventually agreeing a set of acceptance test scripts for each feature with the "customer". Because this software actually exists, we can execute these tests against a running version of it, and test the tests, effectively.
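To make that concrete, the agreed examples might end up as something executable along these lines. The "free delivery over £50" rule and all the names are invented purely for illustration - the point is the shape: customer examples, including the awkward boundary cases a good tester teases out, become the tests.

```python
# A sketch of turning examples agreed with the "customer" into
# executable acceptance tests. The delivery-charge rule is invented.

def delivery_charge(order_total):
    """System under test: delivery is free for orders over £50,
    otherwise a flat £4.99."""
    return 0.0 if order_total > 50.0 else 4.99

# Examples agreed with the customer: (description, input, expected).
agreed_examples = [
    ("order well under the threshold", 10.00, 4.99),
    ("order exactly at the threshold", 50.00, 4.99),  # boundary case a good tester asks about
    ("order just over the threshold", 50.01, 0.00),
]

for description, total, expected in agreed_examples:
    actual = delivery_charge(total)
    assert actual == expected, f"FAIL: {description}: got {actual}"
    print(f"PASS: {description}")
```

Run against the real, existing software, a failing example tells you either the candidate misunderstood the customer or uncovered a genuine discrepancy - both of which are exactly what you want to see them handle.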
Finally, these days, a tester often needs to be a programmer - and a pretty handy one at that. So my third focus would be on programming skills, probably with an emphasis on automating tests. I might ask them to write Selenium scripts for the acceptance tests they agreed for this existing piece of software, looking not only for test automation abilities, but also clean code and generally good dev instincts.
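What "clean code and generally good dev instincts" might look like in a test automation exercise is, for example, the page object pattern: tests that read at the level of the agreed acceptance script, with element IDs and browser mechanics hidden behind a page class. The sketch below uses a trivial stand-in for the browser driver so it runs anywhere; a real version would drive Selenium WebDriver instead, and every URL, ID and class name here is made up.

```python
# A sketch of the page object pattern for automated acceptance tests.
# FakeDriver stands in for a real Selenium WebDriver; all names and
# IDs are invented for illustration.

class FakeDriver:
    """Trivial browser stand-in: records navigation and form input,
    and returns a canned result page on submit."""
    def __init__(self):
        self.url = None
        self.fields = {}
    def get(self, url):
        self.url = url
    def type_into(self, field_id, text):
        self.fields[field_id] = text
    def submit(self):
        # A real driver would click the search button and load results.
        return [f"Result for '{self.fields.get('search-box', '')}'"]

class SearchPage:
    """Page object: one class per page, hiding element IDs so the
    tests themselves stay readable and robust to UI changes."""
    def __init__(self, driver):
        self.driver = driver
        self.driver.get("https://example.com/search")
    def search_for(self, term):
        self.driver.type_into("search-box", term)
        return self.driver.submit()

# The test itself reads at the level of the agreed acceptance script.
driver = FakeDriver()
results = SearchPage(driver).search_for("tester jobs in London")
assert results == ["Result for 'tester jobs in London'"]
print("acceptance test passed")
```

A candidate who instinctively structures their Selenium scripts this way, rather than scattering element lookups through every test, is showing exactly the dev instincts being interviewed for.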
Realistically, you might be looking at a whole day to put a tester through their paces, but this could be a progression. If they can't find bugs, probably not much point moving on to the next stage, so it might only be a whole day if you're actually any good.
And then there's the whole question of team fit. Sure, they may have the technical chops, but can this person actually work well with us? Maybe round the day off, if they get through all the previous stages, with a Team Dojo with the candidate fulfilling tester duties.
So, in practice, how might I do it? I think I might run it as elimination rounds. Invite sixteen of the best candidates in to do the bug-finding exercise, select the best eight from that to do the "customer" exercise, and the best four from that to do a pair programming interview to check their dev skills, and the two remaining after that participate in a Team Dojo to determine which one will be a better fit. (Those numbers are pretty arbitrary - you may be looking for several testers, for example - but that's the general idea. Whittle them down over the course of a day.)
Of course, I'm just thinking out loud. Again.