June 10, 2017

Learn TDD with Codemanship

Could You Be A Mentor To An Aspiring Software Developer?

I've been beavering away these last few weeks putting together the basis for an initiative that will enable experienced software developers to mentor new programmers looking to become developers one day.

It'll take the form of a Software Developers' Guild - a sort of clearing house that helps talented new programmers find old hands who can provide "light-touch" guidance over the long-term (4-6 years).

I see it working along similar lines to what I've been doing with my "apprentice" Will Price (who's just finished his final exams for his CS degree, and has turned out pretty spiffy as a developer, too). I've been pairing with Will regularly for a couple of hours every fortnight or so, working on the skills formal education tends to leave out (using version control, test automation, TDD, refactoring, design principles and other practical aspects of code craft).

I've also been nudging him towards certain sources of information: books, blogs, conferences, and so forth, and generally giving him a steer on what he would find most useful to know as a software developer.

Reflecting on how it's gone, both Will and I feel it's been of immense value - and not just for Will. Mentoring someone new to this field has spurred me to learn new things, too (like Python, for example) and reinvigorated my enthusiasm for learning. So, after twelvety-stupid years as a developer, I feel renewed. And looking forward to doing it again.

The industry is also up on the deal by one potentially great developer.

My thinking of late has been that this could be a workable route to avoiding the Groundhog Day that our profession seems stuck in, where new developers have to go through the same long process of rediscovery, with all the false leads and dead ends I wasted years on.

And so, this year, I tentatively begin the process of trying to scale this approach up. You can find out a bit more by visiting the Software Developers' Guild holding page. And, maybe, you'd be interested in becoming a mentor?

I'm looking for experienced developers who've "been around the block at least twice" (call it my Rule of Two), and who'd be willing and able to provide a similar kind of light-touch guidance to someone at university, or from a code club, or returning to work after raising children or caring for a relative, or retraining for a career change, etc.

Could that be you?




May 16, 2017

Learn TDD with Codemanship

My Obligatory Annual Rant About People Who Warn That You Can Take Quality Too Far Like It's An Actual Thing That Happens To Dev Teams

If you teach developers TDD, you're guaranteed to bump into people who'll warn you of the dangers of taking quality too far (dun-dun-duuuuuuun!)

"We don't write the tests first because it protects us from over-testing our code", said one person recently. Ah, yes. Over-testing. A common problem in software.

"You need to be careful not to refactor your code too much", said another. And many's the time I've looked at code and thought "This program is just too easy to understand!"

I can't help recalling the time a UK software company, whose main product had literally thousands of open bugs, hired a VP of Quality and sent him around the dev teams warning them that "perfection is the enemy of good enough". Because that was their problem; the software was just too good.

It seems to still pervade our industry's culture, this idea that quality is the enemy of getting things done, despite mountains of very credible evidence that - in the vast majority of cases - the reverse is true. Most dev teams would deliver sooner if they delivered better software. "Not aiming for perfection is the enemy of getting shit done" more accurately sums up the relationship between quality and productivity in our line of work.

That's not to say that no team has ever taken it too far. In safety-critical software, the costs ramp up very quickly for very modest improvements in reliability. But the fact is that 99.9% of teams are so far away from this asymptote that, from where they're standing, good enough and too good are essentially the same destination.

Worry about wasting time on silly misunderstandings about the requirements. Worry about wasting time fixing avoidable coding errors. Worry about wasting time trying to hack your way through incomprehensible spaghetti code to make changes. Worry about wasting your time doing the same repeatable tasks manually over and over again.

But you very probably needn't worry about over-testing your code. Or about doing too much refactoring. Or about making the software too good. You're almost certainly not in any immediate danger of that.







May 9, 2017

Learn TDD with Codemanship

What Makes a Software Developer? The Rule of Two

I've been thinking a lot recently about what might qualify someone as a "software developer". For sure, it's not just someone who can code. (Any more than a builder is just someone who can lay bricks.)

One rule of thumb I've used with some success over the years is a rule of two: a software developer, in my experience, is someone who has practical hands-on skills in at least two of everything.

* 2 different programming paradigms (e.g., structured & OO)
* 2 different technology stacks (e.g., LAMP and .NET)
* 2 different kinds of application (e.g., desktop and mobile)
* 2 different problem domains (e.g., banking and medicine)
* 2 different approaches to development (e.g., Extreme Programming & RUP)

Essentially, someone who's been around the block at least twice in their career.

The reason why I think this matters is that folk tend to need to see multiple examples of something before they can start to draw some key underlying insights. If you've only ever done BDD, you may not be aware that almost all approaches to requirements specification are example- or scenario-driven. If you've only ever worked in, say, Java, you may miss the fact that encapsulation isn't exclusively an OO concept. (Yes, you can have loosely coupled modules in Pascal, C, etc, too).
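
To make that last point concrete, here's a minimal sketch of information hiding without objects - in C, with made-up names, purely for illustration. The module exposes a small public interface and keeps its representation private behind an opaque type:

/* counter.h - the public interface; clients see only these declarations */
typedef struct Counter Counter;          /* opaque type - internals are hidden */

Counter *counter_create(void);
void     counter_increment(Counter *c);
int      counter_value(const Counter *c);
void     counter_destroy(Counter *c);

/* counter.c - the private implementation; free to change without breaking clients */
#include <stdlib.h>

struct Counter {
    int count;                           /* encapsulated state - no classes required */
};

Counter *counter_create(void)            { return calloc(1, sizeof(Counter)); }
void     counter_increment(Counter *c)   { c->count++; }
int      counter_value(const Counter *c) { return c->count; }
void     counter_destroy(Counter *c)     { free(c); }

The same information-hiding discipline is what OO languages formalise with private members; seeing it in a second paradigm is exactly the kind of insight the Rule of Two is meant to encourage.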

I'd also be interested in the responsibilities a developer has taken on in their careers. I've been a programmer, a tech lead, an architect, a requirements analyst, a methodologist, a strategist, a coach, a conference speaker, a conference organiser, an author, a business owner, and a trainer in my twelvety five years in this career. Wearing multiple hats - like working in multiple languages - can bring insights that decades working exclusively at the code face would probably miss.

So now, as my attentions turn to focus on the whole question of long-term mentoring for would-be software developers, the Rule of Two may well be a key part of the process of identifying who might make good mentors, as well as potentially providing a roadmap for mentoring itself. Essentially, we'd be looking to guide rookies at least twice around the block, allowing them a chance to build those insights for themselves.


April 17, 2017

Learn TDD with Codemanship

Do It Anyway

In my previous post, I suggested that developers take the necessary technical authority that managers are often unwilling to give.

If I could summarise my approach to software development - my "method", if you could call it that - it would be Do It Anyway™.

An example that recently came up in a Twitter conversation about out-of-hours support - something I have mixed feelings about - was how temporary fixes applied at 2am to patch a broken system up often end up being permanent, because the temptation's too great for managers to assign the fixer to seemingly more valuable work the next day.

Think of out-of-hours fixes as being like battlefield surgery. The immediate aim is to stop the patient from dying. The patient is then transported to a proper hospital for proper treatment. If the patient's intestines are being held in with sticky tape, that is not a lasting fix.

We've all seen what happens when the code fills up with quick'n'dirty fixes. It becomes less reliable and less maintainable. The cumulative effect increases failures in production, creates a barrier to useful change, and hastens the system's demise considerably.

My pithy - but entirely serious - advice in that situation is Do It Anyway™.

There are, of course, obligations implied when we Do It Anyway™. What we're doing must be in the best interests of stakeholders. Do It Anyway™ is not a Get Out of Jail Free card for any decision you might want to justify. We are making informed decisions on their behalf. And then doing what needs to be done. Y'know. Like professionals.

And this is where the rubber meets the road; if managers and customers don't trust us to make good decisions on their behalf, they'll try to stop us making decisions. It's a perfectly natural response. And to some extent we brought it on ourselves - as a profession - through decades of being a bit, well, difficult. We don't always do things in the customer's interests. Sometimes - gasp - we do stuff because it benefits us (e.g., adopting a fashionable technology), or because it's all we know to do (e.g., not writing automated tests because we don't know how), or just because it's the new cool thing (and we love new cool things!)

If we can learn to curb our excesses and focus more on the customer's problems instead of our solutions, then maybe managers and customers can learn to trust us to make technical decisions like whether or not a hot fix needs to be done properly later.




April 15, 2017

Learn TDD with Codemanship

IT's Original Sin: Separating Authority & Expertise

I'm recalling a contract I did a few years ago, where I was leading a new team on an XML publishing sort of thing. The company I was working for had been started by two entrepreneurs who had zero experience of the software industry. Which is fine, if you can hire experts who have the necessary experience.

On our little team, things were generally good, except for one fly in the ointment. The guy they'd trained up in this specific database technology we were using had his own ideas about building the whole thing in XQuery, and refused to go along with any technical decision made by the team.

Without his specific skillset, we couldn't deliver anything, and so he had the team over a barrel. It should have been easily fixed, though. I just had to go to the bosses, and get an executive decision. Case closed.

Except the bosses didn't know whether to believe me or believe him, because they knew nothing about what we were trying to do. And they were unwilling to concede authority to the team to make the decisions.

This is a common anti-pattern in the way organisations are run, and it seems to be getting worse. Our whole society now seems to be organised in a way that separates decision-making authority from expertise. Indeed, the more authority someone has, the less they tend to understand the ramifications of the decisions they're making.

In Britain, this is partly - I think - down to an historical belief in a "leadership class". Leaders go to certain good (fee-paying) schools, they get classical educations that focus on ancient languages, history, literature etc, and they study purely academic non-technical subjects at university like Politics or History. They're prepared for the burden of command by their upbringing - "character building" - rather than their actual education or training. They're not actually qualified to do anything. They're not scientists, or doctors, or nurses, or engineers, or designers, or programmers, or builders, or anything drearily practical like that. They're leaders.

These are the people who tend to end up in the boardrooms of Britain's biggest companies. These are the people who end up in charge of government departments. These are the people who are in charge of TV & radio.

And hence we get someone making the big decisions about healthcare who knows nothing about medicine or about running hospitals or ambulance services. And we get someone in charge of all the schools who knows nothing about teaching or running a school. And we get someone in charge of a major software company whose last job was being in charge of a soft drinks company. And so on.

Again, this is fine, if they leave the technical decisions to the technical experts. And that's where it all falls down, of course. They don't.

The guy in charge of the NHS insists on telling doctors and nurses how they should do their jobs. The woman in charge of UK schools insists on overriding the expertise of teachers. The guy in charge of a major software company refuses to listen to the engineers about the need for automated testing. And so on.

This is the Dunning-Kruger effect writ large. CEOs and government ministers are brimming with the overconfidence of someone who doesn't know that they don't know.

Recently, I got into a Twitter conversation with someone attending the launch of the TechNation report here in the UK. It was all about digital strategy, and I had naively asked if there were any software developers at the event. His response was "Why would we want software developers involved in digital strategy?" As far as he was concerned, it was none of our goddamned business.

I joked that it was like having an event called "HealthNation" and not inviting any doctors, or "SchoolNation" and not inviting any teachers. At which point, doctors and teachers started telling me that this has in actual fact happened, and happens often.

The consequences for everyone when you separate authority and expertise can be severe. If leaders can't learn humility and leave the technical decisions to the technical experts - if that authority won't be willingly given - then authority must be taken. The easiest way - for the people who do the actual work and make the wheels turn - is to simply not ask for permission. That was our first mistake.





March 22, 2017

Learn TDD with Codemanship

Digital Strategy: Why Are Software Developers Excluded From The Conversation?

I'm running another Twitter poll - yes, me and my polls! - spurred on by a conversation I had with a non-technical digital expert (not sure how you can be a non-technical digital expert, but bear with me) in which I asked if there were any software developers attending the government's TechNation report launch today.




This has been a running bugbear for me; the apparent unwillingness to include the people who create the tech in discussions about creating tech. Imagine a 'HealthNation' event where doctors weren't invited...

The response spoke volumes about how our profession is viewed by the managers, marketers, recruiters, politicians and other folk sticking their oars into our industry. Basically, strategy is none of our beeswax. We just write the code. Other geniuses come up with the ideas and make all the important stuff happen.

I don't think we're alone in this predicament. Teachers tell me that there are many events about education strategy where you'll struggle to find an actual educator. Doctors tell me likewise that they're often excluded from the discussion about healthcare. Professionals are just there to do as we're told, it seems. Often by people with no in-depth understanding of what it is they're telling us to do.

This particular person asserted that software development isn't really that important to the economy. This flies in the face of newspaper story after newspaper story about just how critical it is to the functioning of our modern economy. The very same people who see no reason to include developers in the debate also tell us that in decades to come, all businesses will be digital businesses.

So which is it? Is digital technology - that's computers and software to you and me - now so central to modern life that our profession is as critical today as masons were in medieval times? Or can any fool 'code' software, and what's really needed is better PR people?

What's clear is that highly skilled professions like software development - and teaching, and medicine - have had their status undermined to the point where nobody cares what the practitioner thinks any more. Except other practitioners, perhaps.

This runs hand-in-hand with a gradual erosion in real terms of earnings. A developer today, on average, earns 25% less than she did 10 years ago. This trend goes against the grain of the "skills crisis" narrative, and more accurately portrays the real picture, where developers are seen as cogs in the machine.

If we're not careful, and this trend continues, software development could become such a low-status profession in the UK that the smartest people will avoid it. This, coupled with a likely brain drain after Brexit, will have the reverse effect to what initiatives like TechNation are aiming for. Despite managers and marketers seeing no great value in software developers, where the rubber meets the road, and some software has to be created, we need good ones. Banks need them. Supermarkets need them. The NHS needs them. The Ministry of Defence needs them. Society needs them.

But I'm not sure they're going about the right way of getting them. Certainly, ignoring the 400,000 people in the UK who currently do it doesn't send the best message to smart young people who might be considering it. They could be more attracted to working in tech marketing, or for a VC firm, or in the Dept of Trade & Industry formulating tech policy.

But if we lack the people to actually make the technology work, what will they be marketing, funding and formulating policy about? They'd be a record industry without any music.

Now, that's not to say that marketing and funding and government policy don't matter. It's all important. But if it's all important, why are the people who really make the technology excluded from the conversation?






March 12, 2017

Learn TDD with Codemanship

Why Products Don't Matter To Tech Investors, But Hype Does



They say 'you get what you measure', and in the world of tech start-ups this is especially true.

Having lived and worked through several tech bubbles, I've seen how the promise of winning mind-bogglingly big ("boggly"?) has totally skewed business models beyond the point where many tech start-ups are even recognisable as businesses.

The old notion of a business - where the aim is to make more money than you spend - seems hopelessly old-fashioned now. Get with it Granddad! It's not about profit any more, it's about share prices and exit strategies.

If your tech start-up looks valuable, then it's valuable. It matters not a jot that you might be losing money hand-over-fist. As long as investors believe it's worth a gazillion dollars, then it's worth a gazillion dollars.

For those of us who create the technologies that these start-ups are built on, this translates into a very distorted idea of what constitutes a 'successful product'. I have, alas, heard many founders state quite openly that it doesn't much matter if the technology really works, or if the underlying code has any long-term future. What matters is that, when it comes time to sell, someone buys it.

Ethically, this is very problematic. It's like a property developer saying 'it doesn't matter if the house is still standing 10 years from now, just as long as it's standing when someone buys it'.

Incredibly, very few tech start-up investors do any technical due diligence at all. Again, I suspect this is because you get what you measure. They're not buying it because it works, or because the technology has a future. They're buying it so that they, too, can sell it on for an inflated price. It could be a brick with "tech" painted on it, and - as long as the valuation keeps going up - they'll buy it.

You see, investors want returns. And that's pretty much all they want. They want returns from tech businesses. They don't care if the tech works, or if the business is viable in the long term. Just like they want returns from luxury waterside apartments that no ordinary person could afford to live in. They don't care if nobody lives in them, just as long as they keep going up in value.

Just like the property bubble, the tech bubble runs chiefly on hype. A £1,000,000 2-bed apartment is worth £1,000,000 because an estate agent or someone in a glossy colour supplement said so. An expectation of value is set. Similarly, Facebook is worth twelvety gazillions because the financial papers say it is. That is the expectation for tech start-ups. They can be valued at billions of dollars with a turnover of peanuts by comparison.

What makes tech businesses worth so much is their potential for expansion. Expanding your chain of coffeeshops requires you to overcome real physical barriers like finding premises and staff and buying more coffee machines and so on. Cost-wise, for a software business, there's not a huge difference in outlay between 1,000 users and 1,000,000 users. If only Costa could host their shops on the cloud!

Tech start-ups are just lottery tickets, where the jackpot can be truly astronomical. And what matters most in this equation is that the technology sounds good.

Exhibit A: the recent Artificial Intelligence bubble. News journalists have latched on to this idea - carefully crafted and lovingly packaged by tech industry PR - that real AI, the kind we saw in the movies, is just around the corner. What, again?

The true promise of AI has been 30 years away for the last 50 years. In reality, chatbot sales support sucks, websites designed by AI are crummy, and recommendations made by ad targeting engines are as asinine as they always were ("we see you bought a lawnmower; would you like to buy another lawnmower?") And as for self-driving cars...

But it doesn't matter that AI isn't as far advanced as they'd have us believe, just as long as we do believe it - for long enough to dip our hands in our pockets and buy some shares in the companies hyping it. And we'll do it, because there's a good chance those shares will go up in value and we'll be able to sell them for a profit. This is why few people are willing to say that the Emperor has no clothes. Too many own shares in the company that manufactured those clothes.

As a software developer, though, I struggle to find job satisfaction or to find any professional pride in making invisible clothes for silly Emperors. Yes, there are those of us - sadly, too plentiful - who'll apparently do anything as long as the money's right. But I'd like to think that's a different profession to the one I'm in.




January 18, 2017

Learn TDD with Codemanship

How Long Would Your Organisation Last Without Programmers?

A little straw poll I did recently on Twitter has proved to be especially timely after a visit to the accident & emergency ward at my local hospital (don't worry - I'm fine).



It struck me just how reliant hospitals have become on largely bespoke IT systems (that's "software" to me and you). From the moment you walk in to see the triage nurse, there's software - you're surrounded by it. The workflow of A&E is carefully controlled via a system they all access. There are computerised machines that take your blood pressure, monitor your heart, peer into your brain and build detailed 3D models, and access your patient records so they don't accidentally cut off the wrong leg.

From printing out prescriptions to writing notes to your family doctor, it all involves computers now.

What happens if all the software developers mysteriously disappeared overnight? There'd be nobody to fix urgent bugs. Would the show-stoppers literally stop the show?

I can certainly see how that would happen in, say, a bank. And I've worked in places where - without 24/7 bug-fixing support - they'd be completely incapable of processing their overnight orders, creating a massive and potentially un-shiftable backlog that could crush their business in a few weeks.

And I'm aware that big organisations have "disaster recovery" plans. I've been privy to quite a few in my lofty position as Chief Technical Arguer in some very large businesses. But none of the DR plans I've seen has ever asked "what happens if there's nobody to fix it?"

Ultimately, DR is all about coping in the short term, and getting business-as-usual (or some semblance of it) up and running as quickly as possible. It can delay, but not avoid, the effects of having nobody who can write or fix code.

Smart deployers, of course, can just roll back a bad release to the last one that worked, ensuring continuity... for a while. But I know business code: even when it's working, it's often riddled with unknown bugs, waiting to go off, like little business-killing landmines. I've fixed bugs in COBOL that were written in the 1960s.

Realistically, rolling back your systems by potentially decades is not an option. You probably don't even have access to that old code. Or, if you do, someone will have to re-type it all in from code listings kept in cupboards.

And even if you could revert your way back to a reliable system with potential longevity, the inability to change and adapt those systems to meet future needs would soon start to eat away at your organisation from the inside.

It's food for thought.


December 28, 2016

Learn TDD with Codemanship

The Best Software Developers Know It's Not All 'Hacking'

A social media debate that appears to have been raging over the Xmas break was triggered by some tech CEO claiming that the "best developers" would be hacking over the holidays.

Putting aside just how laden with cultural assumptions that tweet was (and, to be fair, many of the angry responses to it), there is a wider question of what makes a software developer more effective.

Consider the same tweet but in a different context: the "best screenwriters" will spend their holiday "hacking" screenplays. There's an assumption there that writing is all there is to it. Look at what happens, though, when screenwriters become very successful. Their life experiences change. They tend to socialise with other movie people. They move out of their poky little downtown apartments and move into more luxurious surroundings. They exchange their beat-up old Nissan for a brand new Mercedes. They become totally immersed in the world of the movie business. And then we wonder why there are so many movies about screenwriters...

The best screenwriters start with great stories, and tell their stories in interesting and authentic voices. To write a compelling movie about firefighters, they need to spend a lot of time with firefighters, listening to their stories and internalising their way of telling them.

First and foremost, great software developers solve real problems. They spend time with people, listen to their stories, and create working solutions to their problems told in the end user's authentic voice.

What happens when developers withdraw from the outside world, and devote all of their time to "hacking" solutions, is the equivalent of the slew of unoriginal and unimaginative blockbuster special effects movies coming out of Hollywood in recent years. They're not really about anything except making money. They're cinema for cinema's sake. And rehashes of movies we've already seen, in one form or another, because the people who make them are immersed in their own world: movie making.

Ironically, if a screenwriter actually switched off their laptop and devoted Xmas to spending time with the folks, helping out with Xmas dinner, taking Grandma for a walk in her wheelchair, etc, they would probably get more good material out of not writing for a few days.

Being immersed in the real world of people and their problems is essential for a software developer. It's where the ideas come from, and if there's one thing that's in short supply in our industry, it's good ideas.

I've worked with VCs and sat through pitches, and they all have a Decline Of The Hollywood Machine feel to them. "It's like Uber meets Moonpig" is our equivalent of the kind of "It's like Iron-Man meets When Harry Met Sally" pitches studio executives have to endure.

As a coach, when I meet organisations and see how they create software, as well as giving the usual technical advice about shortening feedback cycles and automating builds and deployment, I increasingly find myself urging teams to spend much more time with end users, seeing how the business works, listening to their stories, and internalising their voices.

As a result, many teams realise they've been solving the wrong problems and ultimately building the wrong software. The effect can be profound.

I'll end with a quick illustration: my apprentice and I are working on a simple exercise to build a community video library. As soon as we saw those words "video library", we immediately envisaged features for borrowing and returning videos, and that sort of library-esque thing.

But hang on a moment! When I articulated the goal of the video library - to allow people to watch a movie a day for just 25p per movie - and we broke that down into a business model where 40+ people in the same local area who like the same genres of movies club together - it became apparent that what was really needed was a way for these people to find each other and form clubs.

Our business model (25p to watch one movie) precluded the possibility of using the mail to send and return DVDs. So these people would have to be local to wherever the movies were kept. That could just be a shed in someone's garden. No real need for a sophisticated computer system to manage titles. They would just need some shelves, and maybe a book to log loans and returns.

So the features we finally came up with had nothing to do with lending and returning DVDs, but were instead about forming local movie clubs.

I doubt we'd have come to that realisation without first articulating the problem. It's not rocket science. But still, too few teams approach development in this way.

So, instead of the "It's like GitHub, but for cats" ideas that start from a solution, take some time out to live in the real world, and keep your eyes and ears open for when someone says "You know what really annoys me?", because an idea for the Next Big Thing might follow.





December 8, 2016

Learn TDD with Codemanship

What Do I Think of "Scaled Agile"?

People are increasingly asking me for my thoughts on "scaled agile", so I thought I'd take a quiet moment to collect my thoughts in one place.

Ever since that fateful meeting in Snowbird, Utah in 2001, some commercially-minded folk have sought to "scale up" the Agile brand so it can be applied to large organisations.

I'll give you an example of the kind of organisation we're talking about: a couple of years ago I was invited into a business that had about 150 teams of developers all effectively working on the same system (or different versions of the same system). I was asked to put together a report and some recommendations on how TDD could be adopted across the organisation.

The business in question was peppered throughout with Agile consultants, Scrum Masters, Lean experts, Kanban experts, and all manner of Agile flora and fauna.

Teams all used user stories, all had Scrum or Kanban boards, all did daily stand-ups, and all the paraphernalia we associate with Agile Software Development.

But if there was one thing they most definitely were not, it was agile. Change was slow and expensive. There was absolutely no sense of overall direction or control, or of an overall picture of progress. And the layers of "scaled Agile" the managers had piled on top of all that mess were just making things worse.

It's just a fact of life. Software development doesn't scale. Once software projects go above a certain size (~$1 million), chaos is inevitable, and the best you can hope for is an illusion of control.

And that, in my considerable experience of organisations of all sizes attempting to apply agile principles and practices, is all that the "scaled agile" methods can offer.

I've seen with my own eyes some quite well-known case studies of organisations that claim to be doing agile at scale, and they just aren't. 150 teams doing scaled agile, it turns out, is just 150 teams doing their own thing, while on a surface level making it look like they're all following a common process. But they're still dogged by all the same problems that any organisation trying to do software development at scale is dogged by. You can't fix nature.

Instead, you have to acknowledge the true nature of development at scale; that these are highly complex systems, not conducive to overall top-down control, from which outcomes organically emerge, planned or not.

Insect colonies do not follow top-down processes. A beehive isn't command and control, even if it might look to the casual observer that there's a co-ordinated plan they're all following. What's actually happening is that individual bees are responding to broadcast messages ("goals") about where the pollen, or the threat, can be found, and then they respond according to a set of simple internalised rules to that message, co-operating at a local level so as not to bump into each other.

In software development teams, the internalised rules - often unspoken and frequently at odds with the spoken or written rules - and the interactions at a local level determine the outcomes the system will produce. We call these internalised rules "culture". Culture is often simple, but buried so deep that changing the culture can take a long time.

In particular, culture round the way we communicate and collaborate tends to steer the ship in particular directions, regardless of which direction you point the rudder. This is a property of complex adaptive systems called "strange attractors".

Complex systems have a property called "homeostasis" - a tendency, when disturbed, to iteratively revert back to their original dynamic state, as determined by their strange attractors. Hence, a heartbeat can rise to more than 150 bpm, but will eventually return to a resting heart rate of about 70-80 bpm.

We can apply external stimuli to a system to try and change the way it performs, but the intrinsic properties of the agents within that system, and particularly their interactions, will ultimately determine the outcome.

Methods like SAFe, LeSS and DAD are attempts to exert top-down control on highly complex adaptive organisations. As such, in my opinion and in the examples I've witnessed, they - at best - create the illusion of control. And illusions of control aren't to be sniffed at. They've been keeping the management consulting industry in clover for decades.

The promise of scaled agile lies in telling managers what they want to hear: you can have greater control. You can have greater predictability. You can achieve economies of scale. Acknowledging the real risks puts you at a disadvantage when you're bidding for business.

But if you really want to make a practical difference in a large software development organisation, the best results I've seen have come from focusing on the culture: what do people really value? What do people really believe? What are people's real habits? What do they really do under pressure?

You build big, complex products out of small, simple parts. The key is not in trying to exert control over the internal workings of each part, but to focus on how the parts - and the small, simple teams who make them - interact. Each part does a job. Each part will depend on some of the other parts. An overall architecture can emerge by instilling a set of good, practical organising principles across the teams - a design culture, like we have in the architecture of buildings, for example. The teams negotiate with each other to resolve potential conflicts, like motorists on our complex road systems trying to get where they need to go without bumping into each other.

Another word for this is "anarchy". I advise you not to use it in client meetings. But that is what it is.

I think it's very telling that so many of the original signatories of the Agile Manifesto have voiced scepticism - indeed, in some cases been very scathing - of "scaled agile". The way I see it, it's the precise opposite of what they were trying to tell us at Snowbird.

This is why, as a professional, I've invested so much time in training and coaching developers and teams, rather than in management consulting. I certainly engage with bosses, but when they ask about "scaled agile" I tell them what I personally think, which is that it's a mirage.