July 5, 2018
The Grand Follies of Software Development

Just time for a few thoughts on software development's grand follies - things many teams chase that tend to make things worse.
Scale - on and on and on we go about scaling up or scaling out our software systems to handle millions of users and tens of thousands of requests every second. By optimising our architectures for Facebook scale, or Netflix scale, we potentially waste a lot of time and money - and miss the opportunity to get a product out there by doing something much simpler. The bottom line is that almost all software will never need to work at that scale, just as almost every person will never need a place to moor their $120 million yacht. If you're ever lucky enough to have that problem, good for you! Facebook and the others solved their scaling problems when they needed to, and they had the resources to do it precisely because of their enormous scale.
Likewise the trend for scaling up software development itself. Organisations that set out to build large products - millions or tens of millions of lines of code - are going about it fundamentally arse-backwards. If you look at big software products today, they typically started out as small software products. Sure, MS Word today is over 10M LOC, but Word 1.0 was tens of thousands of lines of code. That original small team created something useful that became very popular, and it grew incrementally over time. Nature handles complexity very well, where design is concerned. It doesn't arrive at something like the human brain in a single step. Like Facebook and their scaling problems, Microsoft crossed that bridge when they got to it, by which time they had the money to crack it. And it takes a lot of money to create a new version of Word. There's no economy of scale, and at the scale they do it now, very little latitude for genuine innovation. Microsoft's big experiments these days are relatively small, like they always had to be. Focus on solving the problems you have now.
That folly can be underpinned by a belief that some software systems are irreducibly complex - that a word processor would be unusable without the hundreds of features of MS Word. In reality, big complex software starts as small simple software and grows. Unless, of course, we set out to reproduce software that has already become big and complex. Which is fine, if that's your business model. But you're going to need a tonne of cash, and there are no guarantees yours will fare better in the market. So it's one heck of a gamble. Typically, such efforts are funded by businesses (or governments) with enormous resources, and they usually fail spectacularly. Occasionally we hear about them, but a keenness to manage their brand means most get swept under the carpet - which might explain why organisations continue to attempt them.
Reuse - oh, this was a big deal in the 90s and early noughties. I came across project after project attempting to build reusable components and services that the rest of the organisation could stitch together to create working business solutions. Such efforts suffered from spectacular levels of speculative generality, trying to solve ALL THE PROBLEMS and satisfy such a wide range of use cases that the resulting complexity simply ran away from them. We eventually - well, some of us, anyway - learned that it's better to start by building something useful. Reuse happens organically and opportunistically. The best libraries and frameworks are discovered lurking in the duplication inside and across code bases.
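A minimal sketch of that organic kind of reuse, using hypothetical feature code (the function names and fields are invented for illustration): two features each grew their own near-identical validation logic, and the shared helper is extracted only after the duplication is noticed - not designed speculatively up front.

```python
def _require_fields(record, fields):
    # Shared helper, extracted once the same validation pattern
    # appeared in two places - reuse discovered, not designed.
    missing = [f for f in fields if not record.get(f)]
    if missing:
        raise ValueError("missing fields: " + ", ".join(missing))


def register_user(record):
    # Originally contained its own copy of the field-checking loop.
    _require_fields(record, ["email", "password"])
    return {"status": "registered", "email": record["email"]}


def place_order(record):
    # Ditto - the second copy that revealed the duplication.
    _require_fields(record, ["email", "sku"])
    return {"status": "ordered", "sku": record["sku"]}
```

The point is the direction of travel: both features shipped first with duplicated code, and the "reusable component" was harvested afterwards from what they genuinely had in common.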
"Waste" - certain fashionable management practices focus on reducing or eliminating waste from the software development process. Which is fine if we're talking about building every developer their own office complex, but potentially damaging f we're talking abut eliminating the "waste" of failed experiments. That can stifle innovation and lead - ironically - to the much greater waste of missed opportunities. Software's a gamble. You're gonna burn a lot of pancakes. Get used to it, and embrace throwing those burned pancakes away.
Predictability - alongside the management trend for "scaling up" the process of innovation comes the desire to eliminate the risks from it. This, too, is an oxymoron: innovation is inherently risky. The bigger the innovation, the greater the risk. But it's always been hard to get funding for risky ventures, which is why the ideas that end up being greenlit by businesses are typically not very innovative. We're still placing big bets at the craps table of software development, where losing is not an option. Instead of trying to reduce or eliminate risk, businesses should be reducing the size of their bets and placing more of them - a lot more. This is intimately tied to our mad desire to do everything at "enterprise scale". It's much easier to innovate with lots of small, independent teams trying lots of small-scale experiments and rapidly iterating their ideas. Iterating is the key to this process. So much of management theory in software development is about trying to get it right first time, even today. It's actually much easier and quicker and cheaper to get it progressively less wrong. And, yes, like natural evolution, there will be dead ends. The trick is to avoid falling for the Sunk Cost fallacy - having invested so much time and money in a dead end that you feel compelled to persist.
"Quick'n'dirty" - I shouldn't need to elaborate on this. It's one of the few facts we can rely on in software development. In the vast majority of cases, development teams would deliver sooner if they took more care. and yet, still, we fall for it. Start-ups especially have this mindset ("move fast and break things"). Noted that over time, the most successful tech start-ups tend to abandon this mentality. And, yes, I am suggesting that this way of thinking is a sign of a dev organisation's immaturity. There. I've said it.