April 17, 2017

Learn TDD with Codemanship

Do It Anyway

In my previous post, I suggested that developers take the necessary technical authority that managers are often unwilling to give.

If I could summarise my approach to software development - my "method", if you could call it that - it would be Do It AnywayTM.

An example that recently came up in a Twitter conversation about out-of-hours support - something I have mixed feelings about - was how temporary fixes applied at 2am to patch a broken system up often end up being permanent, because the temptation's too great for managers to assign the fixer to seemingly more valuable work the next day.

Think of out-of-hours fixes as being like battlefield surgery. The immediate aim is to stop the patient from dying. The patient is then transported to a proper hospital for proper treatment. If the patient's intestines are being held in with sticky tape, that is not a lasting fix.

We've all seen what happens when the code fills up with quick'n'dirty fixes. It becomes less reliable and less maintainable. The cumulative effect increases failures in production, creates a barrier to useful change, and hastens the system's demise considerably.

My pithy - but entirely serious - advice in that situation is Do It AnywayTM.

There are, of course, obligations implied when we Do It AnywayTM. What we're doing must be in the best interests of stakeholders. Do It AnywayTM is not a Get Out Of Jail Free card for any decision you might want to justify. We are making informed decisions on their behalf. And then doing what needs to be done. Y'know. Like professionals.

And this is where the rubber meets the road; if managers and customers don't trust us to make good decisions on their behalf, they'll try to stop us making decisions. It's a perfectly natural response. And to some extent we brought it on ourselves - as a profession - through decades of being a bit, well, difficult. We don't always do things in the customer's interests. Sometimes - gasp - we do stuff because it benefits us (e.g., adopting a fashionable technology), or because it's all we know to do (e.g., not writing automated tests because we don't know how), or just because it's the new cool thing (and we love new cool things!)

If we can learn to curb our excesses and focus more on the customer's problems instead of our solutions, then maybe managers and customers can learn to trust us to make technical decisions like whether or not a hot fix needs to be done properly later.




April 15, 2017

Learn TDD with Codemanship

IT's Original Sin: Separating Authority & Expertise

I'm recalling a contract I did a few years ago, where I was leading a new team on an XML publishing sort of thing. The company I was working for had been started by two entrepreneurs who had zero experience of the software industry. Which is fine, if you can hire experts who have the necessary experience.

On our little team, things were generally good, except for one fly in the ointment. The guy they'd trained up in this specific database technology we were using had his own ideas about building the whole thing in XQuery, and refused to go along with any technical decision made by the team.

Without his specific skillset, we couldn't deliver anything, and so he had the team over a barrel. It should have been easily fixed, though. I just had to go to the bosses, and get an executive decision. Case closed.

Except the bosses didn't know whether to believe me or believe him, because they knew nothing about what we were trying to do. And they were unwilling to concede authority to the team to make the decisions.

This is a common anti-pattern in the way organisations are run, and it seems to be getting worse. Our whole society now seems to be organised in a way that separates decision-making authority from expertise. Indeed, the more authority someone has, the less they tend to understand the ramifications of the decisions they're making.

In Britain, this is partly - I think - down to an historical belief in a "leadership class". Leaders go to certain good (fee-paying) schools, they get classical educations that focus on ancient languages, history, literature etc, and they study purely academic non-technical subjects at university like Politics or History. They're prepared for the burden of command by their upbringing - "character building" - rather than their actual education or training. They're not actually qualified to do anything. They're not scientists, or doctors, or nurses, or engineers, or designers, or programmers, or builders, or anything drearily practical like that. They're leaders.

These are the people who tend to end up in the boardrooms of Britain's biggest companies. These are the people who end up in charge of government departments. These are the people who are in charge of TV & radio.

And hence we get someone making the big decisions about healthcare who knows nothing about medicine or about running hospitals or ambulance services. And we get someone in charge of all the schools who knows nothing about teaching or running a school. And we get someone in charge of a major software company whose last job was being in charge of a soft drinks company. And so on.

Again, this is fine, if they leave the technical decisions to the technical experts. And that's where it all falls down, of course. They don't.

The guy in charge of the NHS insists on telling doctors and nurses how they should do their jobs. The woman in charge of UK schools insists on overriding the expertise of teachers. The guy in charge of a major software company refuses to listen to the engineers about the need for automated testing. And so on.

This is the Dunning-Kruger effect writ large. CEOs and government ministers are brimming with the overconfidence of someone who doesn't know that they don't know.

Recently, I got into a Twitter conversation with someone attending the launch of the TechNation report here in the UK. It was all about digital strategy, and I had naively asked if there were any software developers at the event. His response was "Why would we want software developers involved in digital strategy?" As far as he was concerned, it was none of our goddamned business.

I joked that it was like having an event called "HealthNation" and not inviting any doctors, or "SchoolNation" and not inviting any teachers. At which point, doctors and teachers started telling me that this has in actual fact happened, and happens often.

The consequences for everyone when you separate authority and expertise can be severe. If leaders can't learn humility and leave the technical decisions to the technical experts - if that authority won't be willingly given - then authority must be taken. The easiest way - for the people who do the actual work and make the wheels turn - is to simply not ask for permission. Asking was our first mistake.





March 22, 2017

Learn TDD with Codemanship

Digital Strategy: Why Are Software Developers Excluded From The Conversation?

I'm running another Twitter poll - yes, me and my polls! - spurred on by a conversation I had with a non-technical digital expert (not sure how you can be a non-technical digital expert, but bear with me) in which I asked if there were any software developers attending the government's TechNation report launch today.




This has been a running bugbear for me: the apparent unwillingness to include the people who create the tech in discussions about creating tech. Imagine a 'HealthNation' event where doctors weren't invited...

The response spoke volumes about how our profession is viewed by the managers, marketers, recruiters, politicians and other folk sticking their oars into our industry. Basically, strategy is none of our beeswax. We just write the code. Other geniuses come up with the ideas and make all the important stuff happen.

I don't think we're alone in this predicament. Teachers tell me that there are many events about education strategy where you'll struggle to find an actual educator. Doctors tell me likewise that they're often excluded from the discussion about healthcare. We professionals are just there to do as we're told, it seems. Often by people with no in-depth understanding of what it is they're telling us to do.

This particular person asserted that software development isn't really that important to the economy. This flies in the face of newspaper story after newspaper story about just how critical it is to the functioning of our modern economy. The very same people who see no reason to include developers in the debate also tell us that in decades to come, all businesses will be digital businesses.

So which is it? Is digital technology - that's computers and software to you and me - now so central to modern life that our profession is as critical today as masons were in medieval times? Or can any fool 'code' software, and what's really needed is better PR people?

What's clear is that highly skilled professions like software development - and teaching, and medicine - have had their status undermined to the point where nobody cares what the practitioner thinks any more. Except other practitioners, perhaps.

This runs hand-in-hand with a gradual erosion in real terms of earnings. A developer today, on average, earns 25% less than she did 10 years ago. This trend goes against the grain of the "skills crisis" narrative, and more accurately portrays the real picture, in which developers are seen as cogs in the machine.

If we're not careful, and this trend continues, software development could become such a low-status profession in the UK that the smartest people will avoid it. This, coupled with a likely brain drain after Brexit, will have the reverse effect to what initiatives like TechNation are aiming for. Despite managers and marketers seeing no great value in software developers, where the rubber meets the road, and some software has to be created, we need good ones. Banks need them. Supermarkets need them. The NHS needs them. The Ministry of Defence needs them. Society needs them.

But I'm not sure they're going the right way about getting them. Certainly, ignoring the 400,000 people in the UK who currently do it doesn't send the best message to smart young people who might be considering it. They could be more attracted to working in tech marketing, or for a VC firm, or in the Dept of Trade & Industry formulating tech policy.

But if we lack the people to actually make the technology work, what will they be marketing, funding and formulating policy about? They'd be a record industry without any music.

Now, that's not to say that marketing and funding and government policy don't matter. It's all important. But if it's all important, why are the people who really make the technology excluded from the conversation?






March 12, 2017

Learn TDD with Codemanship

Why Products Don't Matter To Tech Investors, But Hype Does



They say 'you get what you measure', and in the world of tech start-ups this is especially true.

Having lived and worked through several tech bubbles, I've seen how the promise of winning mind-bogglingly big ("boggly"?) has totally skewed business models beyond the point where many tech start-ups are even recognisable as businesses.

The old notion of a business - where the aim is to make more money than you spend - seems hopelessly old-fashioned now. Get with it Granddad! It's not about profit any more, it's about share prices and exit strategies.

If your tech start-up looks valuable, then it's valuable. It matters not a jot that you might be losing money hand-over-fist. As long as investors believe it's worth a gazillion dollars, then it's worth a gazillion dollars.

For those of us who create the technologies that these start-ups are built on, this translates into a very distorted idea of what constitutes a 'successful product'. I have, alas, heard many founders state quite openly that it doesn't much matter if the technology really works, or if the underlying code has any long-term future. What matters is that, when it comes time to sell, someone buys it.

Ethically, this is very problematic. It's like a property developer saying 'it doesn't matter if the house is still standing 10 years from now, just as long as it's standing when someone buys it'.

Incredibly, very few tech start-up investors do any technical due diligence at all. Again, I suspect this is because you get what you measure. They're not buying it because it works, or because the technology has a future. They're buying it so that they, too, can sell it on for an inflated price. It could be a brick with "tech" painted on it, and - as long as the valuation keeps going up - they'll buy it.

You see, investors want returns. And that's pretty much all they want. They want returns from tech businesses. They don't care if the tech works, or if the business is viable in the long term. Just like they want returns from luxury waterside apartments that no ordinary person could afford to live in. They don't care if nobody lives in them, just as long as they keep going up in value.

Just like the property bubble, the tech bubble runs chiefly on hype. A £1,000,000 2-bed apartment is worth £1,000,000 because an estate agent or someone in a glossy colour supplement said so. An expectation of value is set. Similarly, Facebook is worth twelvety gazillions because the financial papers say it is. That is the expectation for tech start-ups. They can be valued at billions of dollars with a turnover of peanuts by comparison.

What makes tech businesses worth so much is their potential for expansion. Expanding your chain of coffeeshops requires you to overcome real physical barriers like finding premises and staff and buying more coffee machines and so on. Cost-wise, for a software business, there's not a huge difference in outlay between 1,000 users and 1,000,000 users. If only Costa could host their shops on the cloud!

Tech start-ups are just lottery tickets, where the jackpot can be truly astronomical. And what matters most in this equation is that the technology sounds good.

Exhibit A: the recent Artificial Intelligence bubble. News journalists have latched on to this idea - carefully crafted and lovingly packaged by tech industry PR - that real AI, the kind we saw in the movies, is just around the corner. What, again?

The true promise of AI has been 30 years away for the last 50 years. In reality, chatbot sales support sucks, websites designed by AI are crummy, and recommendations made by ad targeting engines are as asinine as they always were ("we see you bought a lawnmower; would you like to buy another lawnmower?") And as for self-driving cars...

But it doesn't matter that AI isn't as far advanced as they'd have us believe, just as long as we do believe it - for long enough to dip our hands in our pockets and buy some shares in the companies hyping it. And we'll do it, because there's a good chance those shares will go up in value and we'll be able to sell them for a profit. This is why few people are willing to say that the Emperor has no clothes. Too many own shares in the company that manufactured those clothes.

As a software developer, though, I struggle to find job satisfaction or to find any professional pride in making invisible clothes for silly Emperors. Yes, there are those of us - sadly, too plentiful - who'll apparently do anything as long as the money's right. But I'd like to think that's a different profession to the one I'm in.




January 18, 2017

Learn TDD with Codemanship

How Long Would Your Organisation Last Without Programmers?

A little straw poll I did recently on Twitter has proved to be especially timely after a visit to the accident & emergency ward at my local hospital (don't worry - I'm fine).



It struck me just how reliant hospitals have become on largely bespoke IT systems (that's "software" to me and you). From the moment you walk in to see the triage nurse, there's software - you're surrounded by it. The workflow of A&E is carefully controlled via a system they all access. There are computerised machines that take your blood pressure, monitor your heart, peer into your brain and build detailed 3D models, and access your patient records so they don't accidentally cut off the wrong leg.

From printing out prescriptions to writing notes to your family doctor, it all involves computers now.

What happens if all the software developers mysteriously disappeared overnight? There'd be nobody to fix urgent bugs. Would the show-stoppers literally stop the show?

I can certainly see how that would happen in, say, a bank. And I've worked in places where - without 24/7 bug-fixing support - they'd be completely incapable of processing their overnight orders, creating a massive and potentially un-shiftable backlog that could crush their business in a few weeks.

And I'm aware that big organisations have "disaster recovery" plans. I've been privy to quite a few in my lofty position as Chief Technical Arguer in some very large businesses. But none of the DR plans I've seen has ever asked "what happens if there's nobody to fix it?"

Ultimately, DR is all about coping in the short term, and getting business-as-usual (or some semblance of it) up and running as quickly as possible. It can delay, but not avoid, the effects of having nobody who can write or fix code.

Smart deployers, of course, can just roll back a bad release to the last one that worked, ensuring continuity... for a while. But I know business code: even when it's working, it's often riddled with unknown bugs, waiting to go off, like little business-killing landmines. I've fixed bugs in COBOL that were written in the 1960s.

Realistically, rolling back your systems by potentially decades is not an option. You probably don't even have access to that old code. Or, if you do, someone will have to re-type it all in from code listings kept in cupboards.

And even if you could revert your way back to a reliable system with potential longevity, the inability to change and adapt those systems to meet future needs would soon start to eat away at your organisation from the inside.

It's food for thought.


December 28, 2016

Learn TDD with Codemanship

The Best Software Developers Know It's Not All 'Hacking'

A social media debate that appears to have been raging over the Xmas break was triggered by some tech CEO claiming that the "best developers" would be hacking over the holidays.

Putting aside just how laden with cultural assumptions that tweet was (and, to be fair, many of the angry responses to it), there is a wider question of what makes a software developer more effective.

Consider the same tweet but in a different context: the "best screenwriters" will spend their holiday "hacking" screenplays. There's an assumption there that writing is all there is to it. Look at what happens, though, when screenwriters become very successful. Their life experiences change. They tend to socialise with other movie people. They move out of their poky little downtown apartments and move into more luxurious surroundings. They exchange their beat-up old Nissan for a brand new Mercedes. They become totally immersed in the world of the movie business. And then we wonder why there are so many movies about screenwriters...

The best screenwriters start with great stories, and tell their stories in interesting and authentic voices. To write a compelling movie about firefighters, they need to spend a lot of time with firefighters, listening to their stories and internalising their way of telling them.

First and foremost, great software developers solve real problems. They spend time with people, listen to their stories, and create working solutions to their problems told in the end user's authentic voice.

What happens when developers withdraw from the outside world, and devote all of their time to "hacking" solutions, is the equivalent of the slew of unoriginal and unimaginative blockbuster special effects movies coming out of Hollywood in recent years. They're not really about anything except making money. They're cinema for cinema's sake. And rehashes of movies we've already seen, in one form or another, because the people who make them are immersed in their own world: movie making.

Ironically, if a screenwriter actually switched off their laptop and devoted Xmas to spending time with the folks, helping out with Xmas dinner, taking Grandma for a walk in her wheelchair, etc, they would probably get more good material out of not writing for a few days.

Being immersed in the real world of people and their problems is essential for a software developer. It's where the ideas come from, and if there's one thing that's in short supply in our industry, it's good ideas.

I've worked with VCs and sat through pitches, and they all have a Decline Of The Hollywood Machine feel to them. "It's like Uber meets Moonpig" is our equivalent of the kind of "It's like Iron-Man meets When Harry Met Sally" pitches studio executives have to endure.

As a coach, when I meet organisations and see how they create software, as well as giving the usual technical advice about shortening feedback cycles and automating builds and deployment, I increasingly find myself urging teams to spend much more time with end users, seeing how the business works, listening to their stories, and internalising their voices.

As a result, many teams realise they've been solving the wrong problems and ultimately building the wrong software. The effect can be profound.

I'll end with a quick illustration: my apprentice and I are working on a simple exercise to build a community video library. As soon as we saw those words "video library", we immediately envisaged features for borrowing and returning videos, and that sort of library-esque thing.

But hang on a moment! When I articulated the goal of the video library - to allow people to watch a movie a day for just 25p per movie - and we broke that down into a business model where 40+ people in the same local area who like the same genres of movies club together - it became apparent that what was really needed was a way for these people to find each other and form clubs.
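To make the numbers concrete, here's a rough back-of-envelope sketch in Python. The £10-per-DVD figure and the assumption that each member watches a title roughly once are mine, purely for illustration - the exercise itself only fixed the 25p price and the 40+ members.

    # Back-of-envelope sketch of the community video club model.
    # Assumed figures (not from the original exercise): a DVD costs about
    # £10 to buy, and each member watches a given title roughly once.

    PRICE_PER_WATCH = 0.25   # 25p per movie, per member
    DVD_COST = 10.00         # assumed purchase price of one DVD
    MEMBERS = 40             # minimum club size from the business model

    # How many 25p watches it takes to pay for one DVD
    watches_to_break_even = DVD_COST / PRICE_PER_WATCH

    print(f"Watches needed to cover one DVD: {watches_to_break_even:.0f}")
    print(f"Covered by one club of 40? {MEMBERS >= watches_to_break_even}")

    # What "a movie a day" costs a member over a month
    print(f"Cost per member, per month: £{PRICE_PER_WATCH * 30:.2f}")

So - if that assumed £10 price is anywhere near right - a club of 40 or more members covers the cost of a title once everyone has watched it, and a movie a day costs each member about £7.50 a month.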

Our business model (25p to watch one movie) precluded the possibility of using the mail to send and return DVDs. So these people would have to be local to wherever the movies were kept. That could just be a shed in someone's garden. No real need for a sophisticated computer system to manage titles. They would just need some shelves, and maybe a book to log loans and returns.

So the features we finally came up with had nothing to do with lending and returning DVDs, but were instead about forming local movie clubs.

I doubt we'd have come to that realisation without first articulating the problem. It's not rocket science. But still, too few teams approach development in this way.

So, instead of the "It's like GitHub, but for cats" ideas that start from a solution, take some time out to live in the real world, and keep your eyes and ears open for when someone says "You know what really annoys me?", because an idea for the Next Big Thing might follow.





December 8, 2016

Learn TDD with Codemanship

What Do I Think of "Scaled Agile"?

People are increasingly asking me for my thoughts on "scaled agile", so I thought I'd take a quiet moment to collect my thoughts in one place.

Ever since that fateful meeting in Snowbird, Utah in 2001, some commercially-minded folk have sought to "scale up" the Agile brand so it can be applied to large organisations.

I'll give you an example of the kind of organisation we're talking about: a couple of years ago I was invited into a business that had about 150 teams of developers all effectively working on the same system (or different versions of the same system). I was asked to put together a report and some recommendations on how TDD could be adopted across the organisation.

The business in question was peppered throughout with Agile consultants, Scrum Masters, Lean experts, Kanban experts, and all manner of Agile flora and fauna.

Teams all used user stories, all had Scrum or Kanban boards, all did daily stand-ups, and used all the other paraphernalia we associate with Agile Software Development.

But if there was one thing they most definitely were not, it was agile. Change was slow and expensive. There was absolutely no sense of overall direction or control, or of an overall picture of progress. And the layers of "scaled Agile" the managers had piled on top of all that mess were just making things worse.

It's just a fact of life. Software development doesn't scale. Once software projects go above a certain size (~$1 million), chaos is inevitable, and the best you can hope for is an illusion of control.

And that, in my considerable experience of organisations of all sizes attempting to apply agile principles and practices, is all that the "scaled agile" methods can offer.

I've seen with my own eyes some of the organisations behind quite well-known case studies of "doing agile at scale", and they just aren't. 150 teams doing scaled agile, it turns out, is just 150 teams doing their own thing, while on a surface level making it look like they're all following a common process. But they're still dogged by all the same problems that any organisation trying to do software development at scale is dogged by. You can't fix nature.

Instead, you have to acknowledge the true nature of development at scale; that these are highly complex systems, not conducive to overall top-down control, from which outcomes organically emerge, planned or not.

Insect colonies do not follow top-down processes. A beehive isn't command and control, even if it might look to the casual observer that there's a co-ordinated plan they're all following. What's actually happening is that individual bees are responding to broadcast messages ("goals") about where the pollen, or the threat, can be found, each according to a set of simple internalised rules, co-operating at a local level so as not to bump into each other.
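Purely as an illustrative sketch - the rules and numbers here are made up by me, not a model of real bee behaviour - this is roughly what "broadcast goal plus simple local rules" looks like in code. No agent is given an overall plan, yet the swarm ends up clustered around the pollen:

    import random

    POLLEN = (8, 3)   # the broadcast "goal" every bee can hear

    def step(bee, occupied):
        x, y = bee
        # Local rule 1: move one step towards the broadcast goal
        nx = x + (1 if POLLEN[0] > x else -1 if POLLEN[0] < x else 0)
        ny = y + (1 if POLLEN[1] > y else -1 if POLLEN[1] < y else 0)
        # Local rule 2: if a neighbour is already there, sidestep at random
        if (nx, ny) in occupied:
            nx, ny = x + random.choice([-1, 0, 1]), y + random.choice([-1, 0, 1])
        return (nx, ny)

    bees = [(random.randint(0, 5), random.randint(0, 5)) for _ in range(10)]
    for _ in range(20):
        moved = []
        for bee in bees:
            moved.append(step(bee, set(moved)))
        bees = moved

    # The clustering around (8, 3) "emerges" without any central controller.
    print(sorted(bees))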

In software development teams, the internalised rules - often unspoken and frequently at odds with the spoken or written rules - and the interactions at a local level determine what outcomes the system will produce. We call these internalised rules "culture". Culture is often simple, but buried so deep that changing the culture can take a long time.

In particular, culture round the way we communicate and collaborate tends to steer the ship in particular directions, regardless of which direction you point the rudder. This is a property of complex adaptive systems called "strange attractors".

Complex systems have a property called "homeostasis" - a tendency, when disturbed, to iteratively revert back to their original dynamic state, as determined by their strange attractors. Hence, a heartbeat can rise to more than 150 bpm, but will eventually return to a resting heart rate of about 70-80 bpm.

We can apply external stimuli to a system to try and change the way it performs, but the intrinsic properties of the agents within that system, and particularly their interactions, will ultimately determine the outcome.

Methods like SAFe, LeSS and DAD are attempts to exert top-down control on highly complex adaptive organisations. As such, in my opinion and in the examples I've witnessed, they - at best - create the illusion of control. And illusions of control aren't to be sniffed at. They've been keeping the management consulting industry in clover for decades.

The promise of scaled agile lies in telling managers what they want to hear: you can have greater control. You can have greater predictability. You can achieve economies of scale. Acknowledging the real risks puts you at a disadvantage when you're bidding for business.

But if you really want to make a practical difference in a large software development organisation, the best results I've seen have come from focusing on the culture: what do people really value? What do people really believe? What are people's real habits? What do they really do under pressure?

You build big, complex products out of small, simple parts. The key is not in trying to exert control over the internal workings of each part, but to focus on how the parts - and the small, simple teams who make them - interact. Each part does a job. Each part will depend on some of the other parts. An overall architecture can emerge by instilling a set of good, practical organising principles across the teams - a design culture, like we have in the architecture of buildings, for example. The teams negotiate with each other to resolve potential conflicts, like motorists on our complex road systems trying to get where they need to go without bumping into each other.

Another word for this is "anarchy". I advise you not to use it in client meetings. But that is what it is.

I think it's very telling that so many of the original signatories of the Agile Manifesto have voiced scepticism - indeed, in some cases been very scathing - of "scaled agile". The way I see it, it's the precise opposite of what they were trying to tell us at Snowbird.

This is why, as a professional, I've invested so much time in training and coaching developers and teams, rather than in management consulting. I certainly engage with bosses, but when they ask about "scaled agile" I tell them what I personally think, which is that it's a mirage.





October 29, 2016

Learn TDD with Codemanship

How Do I Know a "Software Developer" When I See One?

More thoughts on what a software developer is; this time on how we know one when we see one.

Everyone has their own story, and software developers come by many routes. My personal journey, like so many, started with the home computing boom of the early 1980s, but didn't really flourish into what I'd now call "software development" until more than a decade later, when I was introduced to practices with names like "requirements engineering", "software configuration management", "architecture", "modeling", "unit testing", "usability", "formal specification", and all the rest.

20 years ago, when I was relatively new to a lot of these things, I believed in The One True WayTM. Today, I've experienced first-hand several One True Ways - Fusion, Catalysis, the Unified Process, Select Perspective (and a myriad other UML-y, component-y methods, even designing a few myself), and Extreme Programming. The way I practice XP draws on previous experience with Formal Methods, model-driven development, plus ideas pinched from books on Cleanroom Software Engineering, the Personal and Team Software Processes, Feature-Driven Development, business strategy, complexity theory, and anywhere else I stumbled across good ideas that could be applied to writing software.

Having absorbed so many ideas - or, more accurately, so many different interpretations of many of the same good ideas (e.g., iterate towards testable goals, test early & often, build software out of simple swappable parts, etc) - I feel I have a decent handle on what software development is, and have practical experience across a spectrum of usually cosmetically different approaches to it.
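By way of a small, hedged illustration of two of those shared ideas - "simple swappable parts" and "test early & often" - here's the sort of thing I mean, in Python (the example and names are mine, not taken from any of the methods above):

    # A simple swappable part: anything that can price a movie rental.
    class FlatRatePricer:
        def price(self, nights):
            return 0.25 * nights

    class WeekendDealPricer:
        def price(self, nights):
            return 0.25 * min(nights, 2)   # third night onwards is free

    # The rest of the code depends only on the *shape* of a pricer,
    # so either implementation can be swapped in without changing it.
    def rental_total(pricer, rentals):
        return sum(pricer.price(nights) for nights in rentals)

    # Test early and often: a small, fast check we can run on every change.
    def test_rental_total():
        assert rental_total(FlatRatePricer(), [1, 2, 3]) == 1.50
        assert rental_total(WeekendDealPricer(), [1, 2, 3]) == 1.25

    test_rental_total()
    print("tests pass")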

And I know a lot of software developers. Their journeys may be different to mine, but they all seem to have one thing in common with me: diversity of experience and ideas. They're not people who just learned Extreme Programming, just work in Java, just do TDD, just use Git, just write web applications, and so on.

They've typically been around long enough to have had careers that span multiple "movements" in software development: the OO/patterns movement of the early-mid-nineties, the UML/UP/component-based development movement of the late nineties. The early agile/XP movement of the early noughties. The post-agile/craftsmanship movement of the late noughties. And, yes, I am saying that if you're my idea of a "software developer", chances are you've been in this game - remaining at the code face to some degree - for 20+ years.

But not necessarily. Someone who reads a diverse range of literature on software development, and tries the ideas for real, could develop the breadth and depth of knowledge and experience faster. But we're still talking a decade or more, realistically. Nobody learns that fast!

And the experience is important. So much of what we do relies on tacit knowledge. It's not written down anywhere. You can't cram that.

So - to recap and summarise - if you asked me "is that person a 'software developer'?", I'd look for experience of all the key disciplines of software development (including some team management, hiring, finance/admin, etc), and I'd look for at least 2 examples of each:

  • 2+ approaches to development (e.g., Unified Process and DSDM)

  • 2+ programming languages & paradigms (e.g., Java and F#)

  • 2+ technology stacks (e.g., LAMP and Android)

  • 2+ VCS technologies & workflows (e.g., Git and SVN)

  • 2+ build & deployment technologies & approaches (e.g., Maven+Jenkins, Make+Bash)

  • 2+ kinds of visual modeling/sketching of architecture & design (e.g., UML and ERD)

  • 2+ database technologies & paradigms (e.g, MySQL and Neo4j)

  • 2+ kinds of testing (e.g., Web GUI with Selenium, and customer tests with FitNesse)

  • 2+ roles they've played (e.g., programmer and architect, or tester and business analyst) - though, really, a "software developer" could wear any hat on a small project

  • 2+ business domains, 2+ different scales of project/programme



You get the general idea, I'm sure.

On top of all this, I'd expect a software developer to have contributed to the wider developer community: blogging (at a minimum), teaching, mentoring, writing, speaking/presenting, creating or contributing to useful development tools and standards. You know... paying it forward. I would not be what I consider a software developer if other people hadn't done these things.




October 28, 2016

Learn TDD with Codemanship

What is a "Software Developer"?

Over the last few months, I've been thinking a lot about what a software developer is, and about how employers might know one when they see one.

I've long had my own idea: that a software developer - as opposed to a programmer, or a web developer, or a tester, or user experience designer, or any other specialised role - is a bit like a building contractor.

My Dad was a builder for many years, and worked in and then took over the family business. He studied at college and worked on building sites, but he wasn't a tradesman. Not a bricklayer, or a carpenter, or an electrician, or a plumber, or a steelworker, or an architect, or a quantity surveyor, or a structural engineer.

As a building contractor - the person in charge of delivering working buildings - he had to be a bit of all those things, and a bit more. He could do bricklaying, he could do carpentry, he could do plumbing. And he could design buildings, as well as read blueprints, cost the work, calculate how deep the foundations would need to be, how big the boiler would need to be, how thick the loft insulation would need to be and so on.

To me, a software developer is a bit like that. We're responsible for delivering working software, and as such we need to have a practical appreciation of everything that's required to make that happen. We're not bricklayers, but we can lay bricks. We're not plumbers, but we can plumb if we need to. We're not electricians, but we could do wiring to a safe standard if necessary.

A software developer isn't a database expert, but can design, implement and administer databases if needed. A software developer isn't a Java programmer, or a C# programmer, or Swift programmer, but could do the coding to a high enough standard if needed. A software developer isn't a UX expert, but could design a usable GUI if required. A software developer isn't a software architect, but could create and maintain a workable architecture, along with any necessary visual models, if called for.

And a software developer isn't a project manager, but they could manage a project without tying themselves in knots, keeping track of time and resources, hiring teams, getting all the space and equipment the team needs, liaising with project stakeholders, and being a bit of a sales person for the team when selling is needed.

Nor is a software developer a requirements or a business analyst, but they're capable of working directly with customers and other stakeholders, identifying real requirements, and articulating requirements in a way that can lead to working software that meets the customer's needs.

For me, a software developer can be trusted to work alone - on projects of a certain size, and on certain technology stacks, in certain problem domains - to produce working software that's fit for purpose. Just as my dad, perhaps with the help of one labourer, could build a kitchen extension or a conservatory.

And on more ambitious projects he'd know who to hire. He knows a good bricklayer when he sees one. He knows a good chippy when he sees one. He knows what he wants from a plumber, from an electrician, from a roofer, from a landscape gardener.

He could build you a kitchen extension by himself. And he could hire and manage a team that could build you a factory unit, or a housing estate.

In my mind, that's what a software developer is. One of Scott Ambler's "generalising specialists", who has a hands-on, practical understanding of all the key disciplines in software development - management, strategy, requirements, security, infrastructure, architecture, design, programming, testing, configuration & release management, support, maintenance, decommissioning and transition - and knows who to hire when the job is too much for one developer.

The main difference between my vision and what a building contractor does is that, in my vision, everyone on the development team is a developer, just with particular specialisms. So, going back to the building metaphor, it would be like having a building contractor who specialises in brick and block work, a building contractor who specialises in plumbing, and so on.

This gives great flexibility. Specialists who only know their specialism can quickly become bottlenecks. I've lost count of the days I've wasted waiting for the UX designer to come and help me on a user story. And then there was that time the DBA went on a month-long holiday and had locked everyone else out of making changes to the database...

So, not so much "collective code ownership" as collective everything ownership.

Well, that's my idea of what a software developer is. You may now throw the furniture around.




September 29, 2016

Learn TDD with Codemanship

The Afternoon I Learned to Fly a Plane

When I was younger, my Dad was an amateur pilot. One time, he took us up for a flight, and when he'd got us up in the air and on our way - on a lovely calm and clear day - he let me take the controls for a little while. I remember after we landed being cock-a-hoop that I'd learned how to "fly a plane".

Imagine, now, that I harboured ambitions to start my own airline. The only barrier to this plan is that I'm not a pilot, and hiring pilots is expensive. But, fear not, because - remember - on that calm and clear summer afternoon I learned how to "fly a plane".

Sure, I didn't learn how to take off. Or how to land. Or how to navigate. Or what to do when visibility is poor, or when the skies are choppy. And I didn't learn what all the knobs and dials and switches do, or what it means when that particular red light is blinking and what I should do in response. I didn't learn about basic aerodynamics, or how aeroplanes work or which bits of the plane do what, and how to check that everything's working before I take off. I didn't learn about the fuel the plane uses, where to get some, and how to calculate how much fuel I might need for my journey. I didn't learn how to land in an emergency. I didn't learn about speaking to air traffic control, nor how to use the radio for any purpose. I didn't learn about flight plans, or how to read aviation charts. I didn't learn about "air corridors", or about CAA rules.

But I had flown a plane, and was therefore surely only a couple more lessons away from being a pilot. I mean, it's easy. You just hold the steering wheel and keep the nose of the plane pointing to where you want to go. Right? What's that? It's called a "yoke"? Okay, I didn't learn that either.

This is all absurd, of course. It takes many hours of experience, and much study, before you're safe to be in charge of an aeroplane. There's so much that can go wrong, and you can't just pull over on to the hard shoulder when it does. When I had control, there was a qualified, experienced pilot sitting right next to me, and - in reality - he was in control. He talked me through it all the way, and if necessary could take over at a second's notice. I just did exactly what he told me to do.

But if I were a cynical opportunist, I might exploit that initial moment of euphoria to convince people like me that they really can fly a plane. I can see the ads now: "Thinking of starting an airline? Why pay for expensive so-called 'pilots' when anyone can learn to fly on our 1-day course." This would be a textbook commercialisation of the Dunning-Kruger effect: they don't know that they haven't learned to fly a plane, and we ain't about to tell them.

What that short flight with Dad did do, though, was get us excited about learning to fly. So much so, that for a good few years it was my brother's primary ambition to be a pilot.

A course claiming to teach you to fly in a day - or even an hour - would end up in court, of course. Grown-ups know it just isn't as simple as holding the stick and going in a straight line for 5 minutes.

So, why don't grown-ups know this about writing software?