March 22, 2017
Digital Strategy: Why Are Software Developers Excluded From The Conversation?

I'm running another Twitter poll - yes, me and my polls! - spurred on by a conversation I had with a non-technical digital expert (not sure how you can be a non-technical digital expert, but bear with me), in which I asked if there were any software developers attending the government's TechNation report launch today.
As a software developer, do you feel your work and status as a professional is valued?— Codemanship (@codemanship) March 22, 2017
This has been a running bugbear for me; the apparent unwillingness to include the people who create the tech in discussions about creating tech. Imagine a 'HealthNation' event where doctors weren't invited...
The response spoke volumes about how our profession is viewed by the managers, marketers, recruiters, politicians and other folk sticking their oars into our industry. Basically, strategy is none of our beeswax. We just write the code. Other geniuses come up with the ideas and make all the important stuff happen.
I don't think we're alone in this predicament. Teachers tell me that there are many events about education strategy where you'll struggle to find an actual educator. Doctors tell me likewise that they're often excluded from the discussion about healthcare. Professionals are just there to do as we're told, it seems. Often by people with no in-depth understanding of what it is they're telling us to do.
This particular person asserted that software development isn't really that important to the economy. This flies in the face of newspaper story after newspaper story about just how critical it is to the functioning of our modern economy. The very same people who see no reason to include developers in the debate also tell us that in decades to come, all businesses will be digital businesses.
So which is it? Is digital technology - that's computers and software to you and me - now so central to modern life that our profession is as critical today as masons were in medieval times? Or can any fool 'code' software, and what's really needed is better PR people?
What's clear is that highly skilled professions like software development - and teaching, and medicine - have had their status undermined to the point where nobody cares what the practitioner thinks any more. Except other practitioners, perhaps.
This runs hand-in-hand with a gradual erosion in real terms of earnings. A developer today, on average, earns 25% less than she did 10 years ago. This trend goes against the grain of the "skills crisis" narrative, and paints a truer picture: one in which developers are seen as cogs in the machine.
If we're not careful, and this trend continues, software development could become such a low-status profession in the UK that the smartest people will avoid it. This, coupled with a likely brain drain after Brexit, will have the reverse effect to what initiatives like TechNation are aiming for. Despite managers and marketers seeing no great value in software developers, where the rubber meets the road, and some software has to be created, we need good ones. Banks need them. Supermarkets need them. The NHS needs them. The Ministry of Defence needs them. Society needs them.
But I'm not sure they're going the right way about getting them. Certainly, ignoring the 400,000 people in the UK who currently do it doesn't send the best message to smart young people who might be considering it. They could be more attracted to working in tech marketing, or for a VC firm, or in the Dept of Trade & Industry formulating tech policy.
But if we lack the people to actually make the technology work, what will they be marketing, funding and formulating policy about? They'd be a record industry without any music.
Now, that's not to say that marketing and funding and government policy don't matter. It's all important. But if it's all important, why are the people who really make the technology excluded from the conversation?
March 12, 2017
Why Products Don't Matter To Tech Investors, But Hype Does
They say 'you get what you measure', and in the world of tech start-ups this is especially true.
Having lived and worked through several tech bubbles, I've seen how the promise of winning mind-bogglingly big ("boggly"?) has totally skewed business models beyond the point where many tech start-ups are even recognisable as businesses.
The old notion of a business - where the aim is to make more money than you spend - seems hopelessly old-fashioned now. Get with it Granddad! It's not about profit any more, it's about share prices and exit strategies.
If your tech start-up looks valuable, then it's valuable. It matters not a jot that you might be losing money hand-over-fist. As long as investors believe it's worth a gazillion dollars, then it's worth a gazillion dollars.
For those of us who create the technologies that these start-ups are built on, this translates into a very distorted idea of what constitutes a 'successful product'. I have, alas, heard many founders state quite openly that it doesn't much matter if the technology really works, or if the underlying code has any long-term future. What matters is that, when it comes time to sell, someone buys it.
Ethically, this is very problematic. It's like a property developer saying 'it doesn't matter if the house is still standing 10 years from now, just as long as it's standing when someone buys it'.
Incredibly, very few tech start-up investors do any technical due diligence at all. Again, I suspect this is because you get what you measure. They're not buying it because it works, or because the technology has a future. They're buying it so that they, too, can sell it on for an inflated price. It could be a brick with "tech" painted on it, and - as long as the valuation keeps going up - they'll buy it.
You see, investors want returns. And that's pretty much all they want. They want returns from tech businesses. They don't care if the tech works, or if the business is viable in the long term. Just like they want returns from luxury waterside apartments that no ordinary person could afford to live in. They don't care if nobody lives in them, just as long as they keep going up in value.
Just like the property bubble, the tech bubble runs chiefly on hype. A £1,000,000 2-bed apartment is worth £1,000,000 because an estate agent or someone in a glossy colour supplement said so. An expectation of value is set. Similarly, Facebook is worth twelvety gazillions because the financial papers say it is. That is the expectation for tech start-ups. They can be valued at billions of dollars with a turnover of peanuts by comparison.
What makes tech businesses worth so much is their potential for expansion. Expanding your chain of coffeeshops requires you to overcome real physical barriers like finding premises and staff and buying more coffee machines and so on. Cost-wise, for a software business, there's not a huge difference in outlay between 1,000 users and 1,000,000 users. If only Costa could host their shops on the cloud!
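The contrast can be made concrete with a toy cost model. All the figures below are invented for illustration - the shop capacity, per-shop outlay, server capacity and hosting costs are assumptions, not real numbers - but the shape of the two curves is the point:

```python
# Toy marginal-cost comparison: physical expansion vs software expansion.
# Every number here is an illustrative assumption.

def coffeeshop_cost(customers_per_day):
    """Physical business: each new tranche of customers needs a whole new shop."""
    shops = -(-customers_per_day // 500)   # ceiling division: ~500 customers/shop
    return shops * 300_000                 # assumed premises + staff + machines per shop

def saas_cost(users):
    """Software business: one big fixed cost, then tiny marginal hosting costs."""
    servers = -(-users // 50_000)          # assumed ~50k users per server
    return 100_000 + servers * 2_000       # assumed dev cost + per-server hosting

for n in (1_000, 1_000_000):
    print(f"{n:>9} users: shops £{coffeeshop_cost(n):,} vs software £{saas_cost(n):,}")
```

A thousandfold increase in users multiplies the coffeeshop's costs a thousandfold, but barely moves the software business's costs - which is the "no huge difference in outlay" claim above.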
Tech start-ups are just lottery tickets, where the jackpot can be truly astronomical. And what matters most in this equation is that the technology sounds good.
Exhibit A: the recent Artificial Intelligence bubble. News journalists have latched on to this idea - carefully crafted and lovingly packaged by tech industry PR - that real AI, the kind we saw in the movies, is just around the corner. What, again?
The true promise of AI has been 30 years away for the last 50 years. In reality, chatbot sales support sucks, websites designed by AI are crummy, and recommendations made by ad targeting engines are as asinine as they always were ("we see you bought a lawnmower; would you like to buy another lawnmower?"). And as for self-driving cars...
But it doesn't matter that AI isn't as far advanced as they'd have us believe, just as long as we do believe it - for long enough to dip our hands in our pockets and buy some shares in the companies hyping it. And we'll do it, because there's a good chance those shares will go up in value and we'll be able to sell them for a profit. This is why few people are willing to say that the Emperor has no clothes. Too many own shares in the company that manufactured those clothes.
As a software developer, though, I struggle to find job satisfaction or to find any professional pride in making invisible clothes for silly Emperors. Yes, there are those of us - sadly, too plentiful - who'll apparently do anything as long as the money's right. But I'd like to think that's a different profession to the one I'm in.
January 18, 2017
How Long Would Your Organisation Last Without Programmers?

A little straw poll I did recently on Twitter has proved to be especially timely after a visit to the accident & emergency ward at my local hospital (don't worry - I'm fine).
It struck me just how reliant hospitals have become on largely bespoke IT systems (that's "software" to me and you). From the moment you walk in to see the triage nurse, there's software - you're surrounded by it. The workflow of A&E is carefully controlled via a system they all access. There are computerised machines that take your blood pressure, monitor your heart, peer into your brain and build detailed 3D models, and access your patient records so they don't accidentally cut off the wrong leg.
From printing out prescriptions to writing notes to your family doctor, it all involves computers now.
What happens if all the software developers mysteriously disappeared overnight? There'd be nobody to fix urgent bugs. Would the show-stoppers literally stop the show?
I can certainly see how that would happen in, say, a bank. And I've worked in places where - without 24/7 bug-fixing support - they'd be completely incapable of processing their overnight orders, creating a massive and potentially un-shiftable backlog that could crush their business in a few weeks.
And I'm aware that big organisations have "disaster recovery" plans. I've been privy to quite a few in my lofty position as Chief Technical Arguer in some very large businesses. But none of the DR plans I've seen has ever asked "what happens if there's nobody to fix it?"

Ultimately, DR is all about coping in the short term, and getting business-as-usual (or some semblance of it) up and running as quickly as possible. It can delay, but not avoid, the effects of having nobody who can write or fix code.
Smart deployers, of course, can just roll back a bad release to the last one that worked, ensuring continuity... for a while. But I know business code: even when it's working, it's often riddled with unknown bugs, waiting to go off, like little business-killing landmines. I've fixed bugs in COBOL that were written in the 1960s.
Realistically, rolling back your systems by potentially decades is not an option. You probably don't even have access to that old code. Or, if you do, someone will have to re-type it all in from code listings kept in cupboards.
And even if you could revert your way back to a reliable system with potential longevity, the inability to change and adapt those systems to meet future needs would soon start to eat away at your organisation from the inside.
It's food for thought.
December 28, 2016
The Best Software Developers Know It's Not All 'Hacking'

A social media debate that appears to have been raging over the Xmas break was triggered by some tech CEO claiming that the "best developers" would be hacking over the holidays.
Putting aside just how laden with cultural assumptions that tweet was (and, to be fair, many of the angry responses to it), there is a wider question of what makes a software developer more effective.
Consider the same tweet but in a different context: the "best screenwriters" will spend their holiday "hacking" screenplays. There's an assumption there that writing is all there is to it. Look at what happens, though, when screenwriters become very successful. Their life experiences change. They tend to socialise with other movie people. They move out of their poky little downtown apartments and move into more luxurious surroundings. They exchange their beat-up old Nissan for a brand new Mercedes. They become totally immersed in the world of the movie business. And then we wonder why there are so many movies about screenwriters...
The best screenwriters start with great stories, and tell their stories in interesting and authentic voices. To write a compelling movie about firefighters, they need to spend a lot of time with firefighters, listening to their stories and internalising their way of telling them.
First and foremost, great software developers solve real problems. They spend time with people, listen to their stories, and create working solutions to their problems told in the end user's authentic voice.
What happens when developers withdraw from the outside world, and devote all of their time to "hacking" solutions, is the equivalent of the slew of unoriginal and unimaginative blockbuster special effects movies coming out of Hollywood in recent years. They're not really about anything except making money. They're cinema for cinema's sake, and rehashes of movies we've already seen, in one form or another, because the people who make them are immersed in their own world: movie making.
Ironically, if a screenwriter actually switched off their laptop and devoted Xmas to spending time with the folks, helping out with Xmas dinner, taking Grandma for a walk in her wheelchair, etc, they would probably get more good material out of not writing for a few days.
Being immersed in the real world of people and their problems is essential for a software developer. It's where the ideas come from, and if there's one thing that's in short supply in our industry, it's good ideas.
I've worked with VCs and sat through pitches, and they all have a Decline Of The Hollywood Machine feel to them. "It's like Uber meets Moonpig" is our equivalent of the kind of "It's like Iron-Man meets When Harry Met Sally" pitches studio executives have to endure.
As a coach, when I meet organisations and see how they create software, as well as the usual technical advice about shortening feedback cycles and automating builds and deployment etc, increasingly I find myself urging teams to spend much more time with end users, seeing how the business works, listening to their stories, and internalising their voices.
As a result, many teams realise they've been solving the wrong problems and ultimately building the wrong software. The effect can be profound.
I'll end with a quick illustration: my apprentice and I are working on a simple exercise to build a community video library. As soon as we saw those words "video library", we immediately envisaged features for borrowing and returning videos, and that sort of library-esque thing.
But hang on a moment! When I articulated the goal of the video library - to allow people to watch a movie a day for just 25p per movie - and we broke that down into a business model where 40+ people in the same local area who like the same genres of movies club together - it became apparent that what was really needed was a way for these people to find each other and form clubs.
Our business model (25p to watch one movie) precluded the possibility of using the mail to send and return DVDs. So these people would have to be local to wherever the movies were kept. That could just be a shed in someone's garden. No real need for a sophisticated computer system to manage titles. They would just need some shelves, and maybe a book to log loans and returns.
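The back-of-envelope arithmetic behind that model is worth making explicit. The club size and 25p price come from the exercise above; the DVD price and the 30-day month are assumptions I've added for illustration:

```python
# Back-of-envelope check of the movie club business model.
# MEMBERS and PRICE_PER_MOVIE are from the exercise; DVD_COST is an assumption.

MEMBERS = 40            # minimum club size
PRICE_PER_MOVIE = 0.25  # pounds - "a movie a day for just 25p"
DVD_COST = 5.00         # assumed cost of buying one DVD

daily_income = MEMBERS * PRICE_PER_MOVIE       # if everyone watches daily
monthly_income = daily_income * 30
new_dvds_per_month = monthly_income / DVD_COST

print(f"Daily income:   £{daily_income:.2f}")      # £10.00
print(f"Monthly income: £{monthly_income:.2f}")    # £300.00
print(f"New DVDs/month: {new_dvds_per_month:.0f}") # 60
```

Even on these rough assumptions, a 40-member club generates enough to keep the shelves stocked - but only if the members are local and the overheads stay close to zero, which is exactly why postage was ruled out.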
So the features we finally came up with had nothing to do with lending and returning DVDs, but were instead about forming local movie clubs.
I doubt we'd have come to that realisation without first articulating the problem. It's not rocket science. But still, too few teams approach development in this way.
So, instead of the "It's like GitHub, but for cats" ideas that start from a solution, take some time out to live in the real world, and keep your eyes and ears open for when someone says "You know what really annoys me?", because an idea for the Next Big Thing might follow.
December 8, 2016
What Do I Think of "Scaled Agile"?

People are increasingly asking me for my thoughts on "scaled agile", so I thought I'd take a quiet moment to collect my thoughts in one place.
Ever since that fateful meeting in Snowbird, Utah in 2001, some commercially-minded folk have sought to "scale up" the Agile brand so it can be applied to large organisations.
I'll give you an example of the kind of organisation we're talking about: a couple of years ago I was invited into a business that had about 150 teams of developers all effectively working on the same system (or different versions of the same system). I was asked to put together a report and some recommendations on how TDD could be adopted across the organisation.
The business in question was peppered throughout with Agile consultants, Scrum Masters, Lean experts, Kanban experts, and all manner of Agile flora and fauna.
Teams all used user stories, all had Scrum or Kanban boards, all did daily stand-ups, and all the paraphernalia we associate with Agile Software Development.
But if there was one thing they most definitely were not, it was agile. Change was slow and expensive. There was absolutely no sense of overall direction or control, or of an overall picture of progress. And the layers of "scaled Agile" the managers had piled on top of all that mess was just making things worse.
It's just a fact of life. Software development doesn't scale. Once software projects go above a certain size (~$1 million), chaos is inevitable, and the best you can hope for is an illusion of control.
And that, in my considerably wide experience of organisations of all sizes attempting to apply agile principles and practices, is all that the "scaled agile" methods can offer.
I've seen some quite well-known case studies of organisations that claim to be doing agile at scale with my own eyes, and they just aren't. 150 teams doing scaled agile, it turns out, is just 150 teams doing their own thing, and on a surface level making it look like they're all following a common process. But they're still dogged by all the same problems that any organisation trying to do software development at scale is dogged by. You can't fix nature.
Instead, you have to acknowledge the true nature of development at scale; that these are highly complex systems, not conducive to overall top-down control, from which outcomes organically emerge, planned or not.
Insect colonies do not follow top-down processes. A beehive isn't command and control, even if it might look to the casual observer that there's a co-ordinated plan they're all following. What's actually happening is that individual bees are responding to broadcast messages ("goals") about where the pollen, or the threat, can be found, and then they respond according to a set of simple internalised rules to that message, co-operating at a local level so as not to bump into each other.
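That mechanism - a broadcast goal plus simple local rules, with no central controller - can be sketched as a tiny simulation. The model below is entirely hypothetical (the grid, the goal and the rules are mine, for illustration); what matters is that the cluster around the goal emerges without anyone commanding it:

```python
import random

random.seed(1)

GOAL = (10, 10)  # the broadcast message: "the pollen is over here"

def step(agent, occupied):
    """Local rule: move one cell towards the goal, unless that cell is
    already occupied by another agent (i.e. don't bump into each other)."""
    x, y = agent
    nx = x + (1 if GOAL[0] > x else -1 if GOAL[0] < x else 0)
    ny = y + (1 if GOAL[1] > y else -1 if GOAL[1] < y else 0)
    return (nx, ny) if (nx, ny) not in occupied else (x, y)

# Eight agents start scattered; nobody plans the outcome.
agents = [(random.randint(0, 5), random.randint(0, 5)) for _ in range(8)]
for _ in range(20):
    positions = set(agents)
    agents = [step(a, positions - {a}) for a in agents]

# The swarm ends up clustered around the goal - emergent, not commanded.
print(sorted(agents))
```

Run it and every agent finishes at or near the goal, co-operating only through the "don't collide" rule - which is the point about beehives: what looks like a co-ordinated plan is just local rules plus a shared message.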
In software development teams, the internalised rules - often unspoken and frequently at odds with the spoken or written rules - and the interactions at a local level determine what the outcomes the system will produce. We call these internalised rules "culture". Culture is often simple, but buried so deep that changing the culture can take a long time.
In particular, culture round the way we communicate and collaborate tends to steer the ship in particular directions, regardless of which direction you point the rudder. This is a property of complex adaptive systems called "strange attractors".
Complex systems have a property called "homeostasis" - a tendency, when disturbed, to iteratively revert back to their original dynamic state, as determined by their strange attractors. Hence, a heartbeat can rise to more than 150 bpm, but will eventually return to a resting heart rate of about 70-80 bpm.
We can apply external stimuli to a system to try and change the way it performs, but the intrinsic properties of the agents within that system, and particularly their interactions, will ultimately determine the outcome.
Methods like SAFe, LeSS and DAD are attempts to exert top-down control on highly complex adaptive organisations. As such, in my opinion and in the examples I've witnessed, they - at best - create the illusion of control. And illusions of control aren't to be sniffed at. They've been keeping the management consulting industry in clover for decades.
The promise of scaled agile lies in telling managers what they want to hear: you can have greater control. You can have greater predictability. You can achieve economies of scale. Acknowledging the real risks puts you at a disadvantage when you're bidding for business.
But if you really want to make a practical difference in a large software development organisation, the best results I've seen have come from focusing on the culture: what do people really value? What do people really believe? What are people's real habits? What do they really do under pressure?
You build big, complex products out of small, simple parts. The key is not in trying to exert control over the internal workings of each part, but to focus on how the parts - and the small, simple teams who make them - interact. Each part does a job. Each part will depend on some of the other parts. An overall architecture can emerge by instilling a set of good, practical organising principles across the teams - a design culture, like we have in the architecture of buildings, for example. The teams negotiate with each other to resolve potential conflicts, like motorists on our complex road systems trying to get where they need to go without bumping into each other.
Another word for this is "anarchy". I advise you not to use it in client meetings. But that is what it is.
I think it's very telling that so many of the original signatories of the Agile Manifesto have voiced scepticism - indeed, in some cases been very scathing - of "scaled agile". The way I see it, it's the precise opposite of what they were trying to tell us at Snowbird.
This is why, as a professional, I've invested so much time in training and coaching developers and teams, rather than in management consulting. I certainly engage with bosses, but when they ask about "scaled agile" I tell them what I personally think, which is that it's a mirage.
October 29, 2016
How Do I Know a "Software Developer" When I See One?

More thoughts on what a software developer is; this time on how we know one when we see one.
Everyone has their own story, and software developers come by many routes. My personal journey, like so many, started with the home computing boom of the early 1980s, but didn't really flourish into what I'd now call "software development" until more than a decade later, when I was introduced to practices with names like "requirements engineering", "software configuration management", "architecture", "modeling", "unit testing", "usability", "formal specification", and all the rest.
20 years ago, when I was relatively new to a lot of these things, I believed in The One True Way™. Today, I've experienced first-hand several One True Ways - Fusion, Catalysis, the Unified Process, Select Perspective (and a myriad other UML-y, component-y methods, even designing a few myself), and Extreme Programming. The way I practice XP draws on previous experience with Formal Methods, model-driven development, plus ideas pinched from books on Cleanroom Software Engineering, the Personal and Team Software Processes, Feature-Driven Development, business strategy, complexity theory, and anywhere else I stumbled across good ideas that could be applied to writing software.
Having absorbed so many ideas - or, more accurately, so many different interpretations of many of the same good ideas (e.g., iterate towards testable goals, test early & often, build software out of simple swappable parts, etc) - I feel I have a decent handle on what software development is, and have practical experience across a spectrum of usually cosmetically different approaches to it.
And I know a lot of software developers. Their journeys may be different to mine, but they all seem to have one thing in common with me: diversity of experience and ideas. They're not people who just learned Extreme Programming, just work in Java, just do TDD, just use Git, just write web applications, and so on.
They've typically been around long enough to have had careers that span multiple "movements" in software development: the OO/patterns movement of the early-mid-nineties, the UML/UP/component-based development movement of the late 90's. The early agile/XP movement of the early noughties. The post-agile/craftsmanship movement of the late noughties. And, yes, I am saying that if you're my idea of a "software developer", chances are you've been in this game - remaining at the code face to some degree - for 20+ years.
But not necessarily. Someone who reads a diverse range of literature on software development, and tries the ideas for real, could develop the breadth and depth of knowledge and experience faster. But we're still talking a decade or more, realistically. Nobody learns that fast!
And the experience is important. So much of what we do relies on tacit knowledge. It's not written down anywhere. You can't cram that.
So - to recap and summarise - if you asked me "is that person a 'software developer'?", I'd look for experience of all the key disciplines of software development (including some team management, hiring, finance/admin, etc), and I'd look for at least 2 examples of each:
- 2+ approaches to development (e.g., Unified Process and DSDM)
- 2+ programming languages & paradigms (e.g., Java and F#)
- 2+ technology stacks (e.g., LAMP and Android)
- 2+ VCS technologies & workflows (e.g., Git and SVN)
- 2+ build & deployment technologies & approaches (e.g., Maven+Jenkins, Make+Bash)
- 2+ kinds of visual modeling/sketching of architecture & design (e.g., UML and ERD)
- 2+ database technologies & paradigms (e.g., MySQL and Neo4j)
- 2+ kinds of testing (e.g., Web GUI with Selenium, and customer tests with FitNesse)
- 2+ roles they've played (e.g., programmer and architect, or tester and business analyst) - though, really, a "software developer" could wear any hat on a small project
- 2+ business domains, 2+ different scales of project/programme
You get the general idea, I'm sure.
On top of all this, I'd expect a software developer to have contributed to the wider developer community: blogging (at a minimum), teaching, mentoring, writing, speaking/presenting, creating or contributing to useful development tools and standards. You know... paying it forward. I would not be what I consider a software developer if other people hadn't done these things.
October 28, 2016
What is a "Software Developer"?

Over the last few months, I've been thinking a lot about what a software developer is, and about how employers might know one when they see one.
I've long had my own idea: that a software developer - as opposed to a programmer, or a web developer, or a tester, or user experience designer, or any other specialised role - is a bit like a building contractor.
My Dad was a builder for many years, and worked in and then took over the family business. He studied at college and worked on building sites, but he wasn't a tradesman. Not a bricklayer, or a carpenter, or an electrician, or a plumber, or a steelworker, or an architect, or a quantity surveyor, or a structural engineer.
As a building contractor - the person in charge of delivering working buildings - he had to be a bit of all those things, and a bit more. He could do bricklaying, he could do carpentry, he could do plumbing. And he could design buildings, as well as read blueprints, cost the work, calculate how deep the foundations would need to be, how big the boiler would need to be, how thick the loft insulation would need to be and so on.
To me, a software developer is a bit like that. We're responsible for delivering working software, and as such we need to have a practical appreciation of everything that's required to make that happen. We're not bricklayers, but we can lay bricks. We're not plumbers, but we can plumb if we need to. We're not electricians, but we could do wiring to a safe standard if necessary.
A software developer isn't a database expert, but can design, implement and administer databases if needed. A software developer isn't a Java programmer, or a C# programmer, or a Swift programmer, but could do the coding to a high enough standard if needed. A software developer isn't a UX expert, but could design a usable GUI if required. A software developer isn't a software architect, but could create and maintain a workable architecture, along with any necessary visual models, if called for.
And a software developer isn't a project manager, but they could manage a project without tying themselves in knots, keeping track of time and resources, hiring teams, getting all the space and equipment the team needs, liaising with project stakeholders, and being a bit of a sales person for the team when selling is needed.
Nor is a software developer a requirements or a business analyst, but they're capable of working directly with customers and other stakeholders, identifying real requirements, and articulating requirements in a way that can lead to working software that meets the customer's needs.
For me, a software developer can be trusted to work alone - on projects of a certain size, and on certain technology stacks, in certain problem domains - to produce working software that's fit for purpose. Just as my dad, perhaps with the help of one labourer, could build a kitchen extension or a conservatory.
And on more ambitious projects he'd know who to hire. He knows a good bricklayer when he sees one. He knows a good chippy when he sees one. He knows what he wants from a plumber, from an electrician, from a roofer, from a landscape gardener.
He could build you a kitchen extension by himself. And he could hire and manage a team that could build you a factory unit, or a housing estate.
In my mind, that's what a software developer is. One of Scott Ambler's "generalising specialists", who has a hands-on, practical understanding of all the key disciplines in software development - management, strategy, requirements, security, infrastructure, architecture, design, programming, testing, configuration & release management, support, maintenance, decommissioning and transition - and knows who to hire when the job is too much for one developer.
The main difference between my vision and what a building contractor does is that, in my vision, everyone on the development team is a developer, just with particular specialisms. So, going back to the building metaphor, a building contractor who specialises in brick and block work, a building contractor who specialises in plumbing, and so on.
This gives great flexibility. Specialists who only know their specialism can quickly become bottlenecks. I've lost count of the days I've wasted waiting for the UX designer to come and help me on a user story. And then there was that time the DBA went on a month-long holiday and had locked everyone else out of making changes to the database...
So, not so much "collective code ownership" as collective everything ownership.
Well, that's my idea of what a software developer is. You may now throw the furniture around.
September 29, 2016
The Afternoon I Learned to Fly a Plane

When I was younger, my Dad was an amateur pilot. One time, he took us up for a flight, and when he'd got us up in the air and on our way - on a lovely calm and clear day - he let me take the controls for a little while. I remember after we landed being cock-a-hoop that I'd learned how to "fly a plane".
Imagine, now, that I harboured ambitions to start my own airline. The only barrier to this plan is that I'm not a pilot, and hiring pilots is expensive. But, fear not, because - remember - on that calm and clear summer afternoon I learned how to "fly a plane".
Sure, I didn't learn how to take off. Or how to land. Or how to navigate. Or what to do when visibility is poor, or when the skies are choppy. And I didn't learn what all the knobs and dials and switches do, or what it means when that particular red light is blinking and what I should do in response. I didn't learn about basic aerodynamics, or how aeroplanes work or which bits of the plane do what, and how to check that everything's working before I take off. I didn't learn about the fuel the plane uses, where to get some, and how to calculate how much fuel I might need for my journey. I didn't learn how to land in an emergency. I didn't learn about speaking to air traffic control, nor how to use the radio for any purpose. I didn't learn about flight plans, or how to read aviation charts. I didn't learn about "air corridors", or about CAA rules.
But I had flown a plane, and was therefore surely only a couple more lessons away from being a pilot. I mean, it's easy. You just hold the steering wheel and keep the nose of the plane pointing to where you want to go. Right? What's that? It's called a "yoke"? Okay, I didn't learn that either.
This is all absurd, of course. It takes many hours of experience, and much study, before you're safe to be in charge of an aeroplane. There's so much that can go wrong, and you can't just pull over on to the hard shoulder when it does. When I had control, there was a qualified, experienced pilot sitting right next to me, and - in reality - he was in control. He talked me through it all the way, and if necessary could take over at a second's notice. I just did exactly what he told me to do.
But if I were a cynical opportunist, I might exploit this initial moment of euphoria that I experienced to make me believe that I can fly a plane. I can see the ads now: "Thinking of starting an airline? Why pay for expensive so-called 'pilots' when anyone can learn to fly on our 1-day course." This would be an exemplary commercialisation of the Dunning-Kruger effect: they don't know that they haven't learned to fly a plane, and we ain't about to tell them.
What that short flight with Dad did do, though, was get us excited about learning to fly. So much so, that for a good few years it was my brother's primary ambition to be a pilot.
A course claiming to teach you to fly in a day - or even an hour - would end up in court, of course. Grown-ups know it just isn't as simple as holding the stick and going in a straight line for 5 minutes.
So, why don't grown-ups know this about writing software?
September 13, 2016
4 Things You SHOULDN'T Do When The Schedule's Slipping
It takes real nerve to do the right thing when your delivery date's looming and you're behind on your plan.
Here are four things you should really probably avoid when the schedule's slipping:
1. Hire more developers
It's been over 40 years since the publication of Fred Brooks' 'The Mythical Man-Month'. This means that our industry has known for almost my entire life that adding developers to a late project makes it later.
Not only is this borne out by data on team size vs. productivity, but we also have a pretty good idea what the causal mechanism is.
Like climate change, people who reject this advice should not be called "skeptics" any more. In the face of the overwhelming evidence, they're Small Team Deniers.
Hiring more devs when the schedule's slipping is like prescribing cigarettes, boxed sets and bacon for a patient with high blood pressure.
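The causal mechanism mentioned above is usually attributed to communication overhead: the number of potential pairwise communication channels in a team grows quadratically with headcount. A minimal sketch of that standard argument (the figures are illustrative, not from the original post):

```python
def channels(team_size: int) -> int:
    """Number of distinct pairs of people who may need to communicate:
    n * (n - 1) / 2 for a team of n."""
    return team_size * (team_size - 1) // 2

# Doubling the team roughly quadruples the coordination burden.
for n in (3, 5, 10, 20):
    print(f"{n:>2} developers -> {channels(n):>3} communication channels")
```

Running this shows 3 developers have 3 channels, while 20 developers have 190 - which is why onboarding extra people mid-project soaks up the time of the very developers who were supposed to go faster.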
2. Cut corners
Counterintuitive as it may seem to most software managers, the relationship between software quality and the time and cost of delivery is not what most of us think it is.
Common sense might lead us to believe that more reliable software takes longer, but the mountain of industry data on this clearly shows the opposite in the vast majority of cases.
To a point - and it's a point 99% of teams are in no danger of crossing - it actually takes less effort to deliver more reliable software.
Again, the causal mechanism for this is well understood. And, again, anyone who rejects the evidence is not a "skeptic"; they're a Defect Prevention Denier.
The way to go faster on 99% of projects is to slow down, and take more care.
3. Work longer hours
Another management myth that's been roundly debunked by the evidence is that, when a software delivery schedule's slipping significantly, teams can get back on track by working longer hours.
The data very clearly shows that - for most kinds of work - longer hours is a false economy. But it's especially true for writing software, which requires a level of concentration and focus that most jobs don't.
Short spurts of extra effort - maybe the odd weekend or late night - can make a small difference in the short term, but day after day, week after week overtime will burn your developers out faster than you can say "get a life". They'll make stupid, easily avoidable mistakes. And, as we've seen, mistakes cost exponentially more to fix than to avoid. This is why teams who routinely work overtime tend to have lower overall productivity: they're too busy fighting their own self-inflicted fires.
You can't "cram" software development. Like your physics final exams, if you're nowhere near ready a week before, then you're not going to be ready, and no amount of midnight oil and caffeine is going to fix that.
You'll get more done with teams who are rested, energised, feeling positive, and focused.
4. Bribe the team to hit the deadline
Given the first three points we've covered here, promising to shower the team with money and other rewards to hit a deadline is just going to encourage them to make those mistakes for you.
Rewarding teams for hitting deadlines fosters a very 1-dimensional view of software development success. It places extra pressure on developers to do the wrong things: to grow the size of their teams, to cut corners, and to work silly hours. It therefore has a tendency to make things worse.
The standard wheeze, of course, is for teams to pretend that they hit the deadline by delivering something that looks like finished software. The rot under the bonnet quickly becomes apparent when the business then expects a second release. Now the team are bogged down in all the technical debt they took on for the first release, often to the extent that new features and change requests become out of the question.
Yes, we hit the deadline. No, we can't make it any better. You want changes? Then you'll have to pay us to do it all over again.
Granted, it takes real nerve, when the schedule's slipping and the customer is baying for blood, to keep the team small, to slow down and take more care, and to leave the office at 5pm.
Ultimately, the fate of teams rests with the company cultures that encourage and reward doing the wrong thing. Managers get rewarded for managing bigger teams. Developers get rewarded for being at their desk after everyone else has gone home, and appearing to hit deadlines. Perversely, as an industry, it's easier to rise to the top by doing the wrong thing in these situations. Until we stop rewarding that behaviour, little will change.
September 6, 2016
Empowered Teams Can Make Decisions The Boss Disagrees With
Coming into contact, as I do, with many software development teams across a wide range of industries, you begin to recognise patterns.
One such pattern that - once I noticed it - I realised is very prevalent in Agile Software Development is what I call the Empowered Straightjacket. Teams are "empowered" to make their own decisions, but when the boss doesn't like a decision they've made, he or she overrules it.
Those who remember their set theory will know that if the set of all possible decisions a team is allowed to make can only include decisions the boss agrees with, then the team's decision space is a subset of the boss's - they are effectively still working to the boss's rules.
That is not empowerment. Just in case you were wondering.
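The set-theory point can be sketched in a couple of lines. The decision sets here are invented purely for illustration; they're not from the original post:

```python
# Decisions the boss would sign off on (hypothetical examples).
boss_approves = {"pair programming", "weekly releases", "use PostgreSQL"}

# Under the "Empowered Straightjacket", any team decision the boss
# disagrees with gets overruled - so only these survive:
team_decisions = {"pair programming", "weekly releases"}

# Genuine empowerment would mean the team can make at least one
# decision outside the boss's approved set.
genuinely_empowered = not team_decisions <= boss_approves

print(genuinely_empowered)  # False - every surviving decision is one
                            # the boss would have made anyway
```

However the team's set is dressed up, if it's guaranteed to be a subset of the boss's approved set, the boss is still the one deciding.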
To have truly empowered development teams, bosses have to recognise that just being the boss doesn't necessarily make them right, and disagreeing with a decision doesn't necessarily make it a bad decision.
Unfortunately, the notion that decisions are made from above by more enlightened individuals has an iron grip on corporate culture.
Moving away from that means that managers have to reshape their role from someone who makes decisions to someone who facilitates the decision-making process (and then accepts the outcome graciously and with energy and enthusiasm.)
Once we recognise that there are other - more democratic and more objective - ways of making decisions, and that the decisions are just as likely to be right (if not more likely) than our own, then we have a golden opportunity to scale decision-making far beyond what traditional command-and-control hierarchies are capable of.
To scale Agile, you must learn to let go and let teams be the masters of their own destinies. You have no superhuman ability to make better decisions than a team full of highly-educated professionals.
The flipside of this is that developers and teams have to allow themselves to be empowered. With great empowerment comes great responsibility. And developers who've been cushioned from the responsibility of making decisions for a long time can run a mile when it comes a'knocking. Like prisoners who can't cope on the outside after a long stretch of regimented days doing exactly what they're told, exactly when they're told, devs who are used to management "taking care of all that" can panic when someone hands them a company credit card and says "if you need anything, use this". It reminds me of how my grandmother's hands used to shake when she had to write a cheque. Granddad took care of all that sort of thing.
This can lead to developers lacking confidence, which leads to them being afraid to take initiative. They may have learned their craft in environments where failure is not tolerated, and learned a survival strategy of not being responsible for anything that matters.
In this situation, developers rise up through the ranks - usually by length of service - and perpetuate the cycle by micromanaging their teams.
Based on my own subjective experiences leading dev teams (and being led): ultimately, developers empower themselves. The maxim "it's easier to ask for forgiveness than permission" applies here.
Now, t'was ever thus. But the rise of Agile Software Development has forced many managers to at least pretend to empower their teams. (And, let's face it, the majority are just pretending. Scrum Masters are not project managers, BTW.)
That's your cue to seize the day. They may not like it. But they'll have to pretend they do.