December 18, 2011
Leadership Without Leaders
Distracted today by interesting discussions on That Twitter. One in particular about the folly of giving decision-making power to one group of people ("leaders") and tasking another group with understanding the implications of those decisions ("workers", if you like).
Why do we have dedicated leaders on teams - people who have to seek expert advice so they can make "informed decisions"? Why aren't those decisions being made by the experts themselves?
I don't have a problem with leadership, I should stress. At some point we all need to take the lead so the team can move forward in a specific direction.
My concern is that we appoint someone as leader. Leader as job title is what I have a problem with.
I also don't dispute that a single point of contact between the team and the business is often a good idea. But let's not confuse the role of "team representative" with "leader".
In politics, and management, the distinction is often blurred. We elect "representatives" who then say they are "in power" and start telling us what to do, rather than asking us what we think should be done. Many of society's ills can be traced back to this confusion between people who speak for us and people who decide for us.
A team can find ways to make decisions that don't require a dedicated leader. I've done this many times in the past. When a decision needed to be taken, the team put it to the vote. Should we use Ibatis or Hibernate? Show of hands. Hibernate it is, then?
At this juncture in the debate, there are a couple of objections that tend to come up:
1. What if your team tends to make uninformed decisions?
Well, that's probably because you hired the wrong people. If you don't trust the majority opinion of your team, then you don't trust your team.
2. Who is accountable for the ultimate outcome of decisions, then?
The team. We stand or fall as a coherent unit.
There may be someone on the team the customer gets on best with. Y'know, one of those "people persons" people tend to bang on about. Dandy. It's like having an expert in talking to the customer. If you have someone on the team who excels at something the rest of you don't, it makes perfect sense to let them lead in that area. Empower them to act as the "face of the team", by all means. But be very clear as a team where that power ends.
Similarly, if someone on the team is acknowledged as the expert in, say, code quality and is good at spotting code smells and SOLID and all that good, healthy stuff, then empower them to lead in that area.
Generally speaking, empower people who excel in key areas to lead in those areas, and take your lead when you're not the best person to be making those decisions.
When the team can't agree, have a simple constitution that kicks in - a basic democratic decision-making process, with checks and balances for changing course when the team discovers they made a wrong call as early as possible.
If we're unwilling to do this, and, as dedicated leaders, unwilling to relinquish decision-making power to experts and to the team as a whole, we are essentially saying "I know better than this team". And maybe you do, but the fact that they're your team suggests perhaps you didn't. Either you hired an inadequate team, or you joined an inadequate team, or you just stood back and let someone else build the team for you, and didn't insist on getting the right people. None of these makes you look good, frankly.
One very interesting Twitter response suggested that leaders "carry the can". To quote: "Do-ers don't go down if the org goes down". This is patently not true. If the org goes down, we all go down. In fact, in many businesses, it's the workers who go to the wall first. Middle management, when ordered to cut costs, are not in the business of firing themselves.
Why can't the whole team carry the whole can? Teams succeed or fail as a whole. They should share the risks, and share the rewards equally. We are grown-ups, after all.
July 31, 2011
Clean Code Helps Prevent Bugs
Another quick thought-dump for the weekend.
In his papers on OO design principles, Uncle Bob Martin uses the terms "rigid" and "brittle" to describe code that's hard to change and easy to break.
It probably needs rearticulating that there's a direct relationship between those two things: that is, code can be easy to break for the same reasons that it is hard to change.
When we look at factors that make code harder to change, we can see how this might be so.
The readability, or comprehensibility, of code is one obvious common factor. Modifying code we don't understand is a bit like performing keyhole surgery wearing dark glasses - the capacity for unwitting mistakes is significantly higher.
Complexity is another. There's no doubt that code that is more complex tends to be buggier, and the reason for this is simple: more complex code is more likely to be wrong because there are more ways for it to be wrong. Just as we're more likely to throw a seven when rolling a pair of dice, we're more likely to make a mistake when rolling complex code. The exponential explosion of logical combinations of wrong code very quickly overpowers the slow, linear growth of correct combinations. It's basic probability.
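To make that probability argument concrete, here's a minimal sketch (the class and method names are mine, purely for illustration) of how execution paths multiply: each independent branch in a method doubles the number of paths through it, so the number of distinct behaviours we have to get right grows exponentially while the code itself only grows linearly.

```java
// Minimal illustration: k independent boolean conditions in a method
// give 2^k distinct execution paths, each a separate chance to be wrong.
public class PathCounter {
    // Number of distinct paths through code with k independent branches.
    public static long paths(int independentBranches) {
        return 1L << independentBranches; // 2^k
    }
}
```

Five branches already mean 32 paths to reason about; twenty mean over a million. The slow, linear growth of correct code really is swamped by the combinatorial growth of ways to be wrong.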
Duplication also adds scope for more errors. In particular, it adds a risk that when we need to change common, duplicated logic, we overlook one of the duplicates, leading to inconsistencies and contradictions in our code.
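A hypothetical example of that risk: when the same rule is written out twice, a change applied to one copy can silently miss the other. Pulling the rule into one definition removes that failure mode entirely, because there is only one place to change.

```java
// Hypothetical example: a discount rule that must stay consistent
// everywhere it's applied. With duplication ("amount * 0.9" pasted into
// two methods), a rule change can be applied to one copy and missed in
// the other. With one shared definition, that inconsistency can't happen.
public class Pricing {
    // The single definition of the rule: change it here, it changes everywhere.
    public static final double DISCOUNT_RATE = 0.10;

    public static double invoiceTotal(double amount) {
        return amount * (1 - DISCOUNT_RATE);
    }

    public static double quoteTotal(double amount) {
        // Reuses the same rule instead of duplicating the arithmetic.
        return amount * (1 - DISCOUNT_RATE);
    }
}
```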
Unmanaged dependencies can also make code more likely to be defective. The risk that a change to one part of the software will inadvertently cause another part (or parts) to break is magnified significantly when there are more dependencies along which effects can propagate.
If our code is simpler, more comprehensible, low in duplication and organised to minimise the "ripple effect" of changes, it will tend to be less buggy. So we should view Clean Code as not just an enabler of change, but as an aid to defect prevention.
April 4, 2011
On The Subject Of "Value"
I really need to get up from my desk and get ready to go out. (Some software craftsmanship thing going on this evening, so will lurk at the back and see what folk are saying.)
Meanwhile, among other distractions, a debate on Twitter about the value of software/features has inspired me to jot down a couple of thoughts before I get busy.
There are people out there who believe it's possible to measure and estimate the value of a feature in purely financial terms - what it's worth in pounds sterling or dollars.
I have two thoughts on this: first of all, if this were really true, success in our industry would be much easier to predict, and we find that it isn't. So, going purely on the evidence of how software successes are distributed, I'd say nobody's cracked this one.
More importantly, I think this is a naive and one-dimensional view of "value". In my experience, value is usually a very complex, nuanced and subjective thing. For hundreds of years businesses have faltered when they focus entirely on the mighty dollar (or euro, or yen) at the expense of other kinds of value.
Profit doesn't exist in a vacuum. It's a product of many interrelated factors, like customer satisfaction, employee morale, skills, environmental factors, human rights records and a myriad of other, sometimes competing, concerns.
One perspective in particular can be very hard to pin down and explain in terms of mere numbers: the user's point of view.
What I've seen happen all too often is customers and product owners/managers defining and prioritising features based on what the business wants, not on what the users might want. Indeed, it's actually rare to find real target end users getting involved in the design of the software.
Now, we have to be very careful here; what the people paying for the software, and what the people using the software want are not necessarily the same thing.
For example, a Facebook user might not think advertising is very important to them. Indeed, they may find some forms of FB advertising intrusive and annoying. But to Facebook's owners, advertising is very, very important. So important, in fact, that in recent years the users' needs seem to have taken a back seat while the developers forge ahead with all manner of features designed to make the site more bankable.
Of course, advertising keeps Facebook, and many other web sites, free to use. So a balance has to be struck. I've felt of late that the balance has tipped too far in favour of what Facebook wants.
You see, you may think your new car is going to be worth $50,000, but that's only true if someone's willing to pay $50,000 to drive it. If the end user doesn't see the value, then it's worth nothing at all.
Car manufacturers usually put the driver at the centre of their design process. Making a car desirable is what makes it valuable.
It seems we have yet to learn this lesson in software development.
April 2, 2011
The Dependable Dependencies Principle - Draft Paper
For over 15 years we have known of design principles to help us manage dependencies in software to limit the impact of making changes. Another consequence of dependencies has been overlooked, namely the relationship between dependencies and the risk of failure in our code. The more depended-upon code is, the greater the potential impact of its failure. We should therefore desire that code that is more depended-upon be more reliable. This paper explores the relationship between dependencies and reliability, and proposes a new design principle, with a first attempt at an accompanying set of metrics, to help us limit the impact of failure.
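The core measurement behind this idea can be sketched in a few lines. (This is my own toy model for illustration, not the metrics proposed in the paper.) Represent dependencies as an edge list and count each module's fan-in - how many other modules depend on it. The higher a module's fan-in, the more of the system is riding on its reliability.

```java
import java.util.HashMap;
import java.util.Map;

// Toy model (not the paper's actual metric): count how depended-upon
// each module is. An edge {a, b} means module a depends on module b.
public class FanIn {
    public static Map<String, Integer> fanIn(String[][] dependencies) {
        Map<String, Integer> counts = new HashMap<>();
        for (String[] edge : dependencies) {
            // edge[1] is the depended-upon module; one more dependent found.
            counts.merge(edge[1], 1, Integer::sum);
        }
        return counts;
    }
}
```

Given edges like ui -> service, service -> model, reports -> model, the model module has a fan-in of 2, so a failure there can ripple further than a failure in ui - which is exactly why we'd want its reliability to be higher.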
Download the full draft paper here.
January 23, 2011
What The Architect Saw
Just because you can't see a hole in the ground from space, that doesn't mean you won't fall into it when you're down there.
January 15, 2011
The Illusion Of Control - Beware Consultants Selling Lucky Pants
If you've read the book or seen the movie "The Secret" (or watched a lot of Oprah), you will be familiar with an idea called "The Law Of Attraction". The authors claim that by thinking positively, we can influence outcomes in our lives and become healthier, wealthier and happier.
Critics point out that the evidence is overwhelmingly against this being true, often citing notorious genocides throughout history as the ultimate statistical disproof that thinking positively attracts positive outcomes in a large and diverse population.
It is actually a form of mental delusion to believe that we can influence events that are physically beyond our control. It is often where people get ideas about "lucky charms" (e.g., a sportsman's lucky pants to influence the outcome of a race) and how elaborate rituals can evolve (e.g., rain dances to influence the weather).
The reality is, I'm afraid, that the Law Of Attraction is demonstrably bunk. As are lucky pants and rain dances (beyond a mild but observable placebo effect). It can be easily disproved with a selection of positive- and negative-thinking test subjects and a pair of dice.
But that hasn't stopped millions of otherwise sane and rational people buying into the illusion of control. It's a huge and growing industry, selling "lucky pants" to people in all walks of life to cure all manner of ills ranging from baldness to poor business performance.
Of course, in the logical, dispassionate world of computer programming, we don't succumb to such delusions. Or do we?
Well, there possibly is a software equivalent of "The Secret"; namely, the secret of delivering value through software. Is it possible that there are people and organisations out there who know whether one set of software features will have more value than another? Certainly there are many who claim they know, and claim they can teach you their secret.
Let's examine the evidence. Take one of the most successful software companies in the market today: Facebook. One can assume that a company worth billions from software must know the secret, right?
Wrong. What distinguishes Facebook from, say, FriendsReunited? Why is one company worth $50 billion and another worth a tiny fraction of that? What did Facebook get so right, and how did the founders know it was right? The answer is: nothing and they didn't. How do I know that? Because I am on Facebook. (No, we can't be friends, by the way - I just use it to occasionally see what old school chums are up to now they're not on FriendsReunited.) Why am I on Facebook? Because my school friends are on Facebook. Why are they on Facebook? Because their friends are on Facebook. What is it about Facebook they like so much? They like that their friends are on it. And that's it. The Unique Selling Point of Facebook is that it is so successful. One may as well ask "why are planets so big, but micro-meteorites are so small? What is the secret of Jupiter's success?" The answer, of course, is luck, and then gravity does the rest.
Of course, Facebook needs a basic feature set - updates, groups, events, photos, and wotnot. But these are very easily replicated, and many far less successful social networking sites actually do these things much, much better. The only feature that marks Facebook out for greatness is that everyone's on it. And that just happened. One guy invited his friends to join, some of whom invited their friends to join, and it just snowballed from there. It could have happened to any number of other similar sites.
Here's how success works. It spreads like a forest fire. Most forest fires burn out quickly and don't spread far. But every so often, completely at random, one tiny little fire spreads and spreads and before you know it the whole forest is ablaze and it's time to call out International Rescue (or Scooby Doo, if the Thunderbirds are busy and there may be a ghost involved).
We can no more predict which software will succeed than we can predict earthquakes or stock market crashes. Skill, knowledge, wisdom, application and good judgement only deal you into the game. They can't guarantee a winning hand. Nothing can.
Of course, with the benefit of hindsight, it's easy to see why the winners won and the losers lost. Just as it's possible to understand what caused an earthquake after it's hit. But we still can't see them coming. That's why the smart venture capitalists understand that they are gamblers, not prophets.
It is possible to analyse the need for a software product and test that it solves the problem it's designed to solve. That's a different kettle of fish altogether. Functional and operational goals are things we have control over. We can verify that a microprocessor-controlled microwave oven heats soup to the right temperature when we select "Soup" from the control menu. What we cannot know up-front is that this will mean the microwave will sell 1,000,000 units. Because we are unable to write software that instructs the buyer's brain to walk into an electrical goods store and take out their cheque book. Not yet, at least.
The success or failure of a software product in the marketplace (including internal markets of business users) is largely beyond our control.
Anyone who says they can teach you how to "attract" that kind of success is essentially just selling you lucky pants.
November 30, 2010
Codemanship Training Early Bird Offer Ends In 10 Days
Roll up! Roll up!
Just ten days left to take advantage of the best offer going in developer training anywhere in the world.
In Jan-Feb I'll be running my three weekend workshops aimed at software craftsmen.
TDD. And I Do Mean It!
The first workshop (Jan 8-9) is an immersive, totally hands-on 2 days focusing on Test-driven Development. It covers both the "Chicago school" of TDD (state-based behaviour testing and triangulation) and the "London school", which focuses more on interaction testing, mocking and end-to-end TDD, with particular emphasis on Responsibility-driven Design and the Tell, Don't Ask approach to OO recently re-popularized by Steve Freeman and Nat Pryce's excellent Growing Object Oriented Software, Guided By Tests book. The workshop ends with practical and pragmatic advice on building your own TDD practice regime, and on how you can objectively assess your TDD effectiveness. A new addition to the workshop is an extra section entitled "High Integrity TDD", which explores several techniques that can complement the TDD approach to produce very reliable code. As well as covering TDD and the basics of test-driven OO design, you will also do a fair amount of refactoring, getting to grips not only with the basic discipline of refactoring within TDD, but also with the automated refactorings available in your IDE. If you only ever go on one developer course in 2011, this is the one I'd recommend. (Even though I say so myself!)
Refactoring. No seriously. A whole course just on refactoring
Jan 29-30. I've been digging around the Interweb, and as far as I could find, this is the only publicly-scheduled refactoring course anywhere in the world. During the two days, you'll get your hands dirty dealing with a dozen of the most common and pernicious code smells, learning refactoring strategies to eliminate them, as well as getting to know the discipline of safe refactoring. I can say from my own experience, and the experiences of hundreds of developers I've coached, that nobody really understands software design until they understand refactoring. This workshop gives you a language for expressing code quality problems and the tools for dealing with them. Refactoring is the oil that stops the gears of your development process from grinding to a halt, but so few developers really understand it and even fewer can actually do it. Mastering refactoring will put you in the top ten percent of the top ten percent of Java and .NET developers. At the very least, being able to recognise most code smells could make you the King (or Queen) of your team code reviews. That'll give your architect something to chew on!
OO design. The last word. Well, maybe.
The final workshop (Feb 19-20) will give you a grounding in OO design principles, including the famous S.O.L.I.D. principles (Single Responsibility, Open-Closed, Liskov Substitution, Interface Segregation, Dependency Inversion), as well as covering fundamental design basics (Simple Design, Don't Repeat Yourself, "Tell, Don't Ask" etc) and principles for packaging code for larger software systems and products. Rooted firmly in the provable mechanics of code - complexity, duplication and dependencies - this hands-on workshop approaches code quality from a more scientific angle, but is every bit as practical and hands-on as the TDD and Refactoring workshops. Not only will you learn how to recognise OO design problems and root them out in code bases of any size, but you'll get practical experience in dealing with those problems through refactoring and other techniques. You will also never lose an argument with an architect again! Well, not a rational one, at any rate.
I'm keen that folk cover all three of these disciplines, as they all fit together and complement each other very closely. Which is why I offer a very special price of just 800 GBP when people book places on all three.
That's the best deal you're ever going to get on developer training that I'm aware of. To put that into perspective, a leading UK developer training company currently charges 1,095 GBP for a 2-day TDD course with a reputable course leader. Now, you can debate amongst yourselves whether I am a reputable course leader (check out my free TDD, refactoring and OO design screencasts if you need some evidence to help you make that call, or just Google me), but with the Early Bird Offer you can take my 2-day TDD workshop for just 267 GBP - a quarter of the price. And take no time off work, either. This is good news for freelancers and self-starters. A 75% saving AND no loss of income. If you lived in New York, the savings you'd make on the courses would probably pay your air fare to London and for 2 nights in a 3-star local hotel.
How can I offer the same high quality training at 25% of the price? Easy. I don't have the overheads or the cost of sales of these much larger training companies. You'd be amazed just how much of your training dollar goes on stuff other than the actual training itself. And I have been lucky enough to find a great training venue that charges very reasonable prices. All the mod cons - free tea/coffee, wi-fi, catering - but not silly London venue prices. And it's spitting distance from rail and tube links. And a very decent boozer (not that proximity of pubs in any way swayed my choice of venue.)
The other reason is because I really believe in these workshops and I really want folk to come to them. That's why I've gone to great lengths to make them as affordable and accessible as possible. If I won the lottery tomorrow, I'd run them for free.
The final day for Early Bird bookings is Dec 11th. At the risk of making a terrible winter weather pun - time to get your skates on. (Boh!)
October 15, 2010
Life Is The 5th State Of Matter
Life is the fifth state of matter. Arrange atoms in a certain way and they start wars and write symphonies.
July 14, 2010
Codemanship's Code Smell Of The Week - Data Classes
A key goal of OO design is to minimise dependencies between classes by packaging data and behaviour as close together as we can. In practice, a good rule of thumb for class design is to put fields and the methods that use those fields in the same classes. Data classes are classes which just have fields and no behaviour (besides simple getters and setters), and they break this rule of thumb, creating serious dependency issues in your code.
Here, Jason Gorman demonstrates how to refactor data classes by moving the methods (or parts of methods) that use fields of data classes into the classes that contain those fields.
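As a rough sketch of the shape of that refactoring (this is my own illustrative example, not the downloadable source itself): behaviour that reads and writes a data class's fields through getters and setters gets moved into the class that owns those fields.

```java
// Illustrative sketch of the "move method" refactoring for a data class.

// Before: AccountData is a data class - nothing but state and accessors -
// and the behaviour lives elsewhere, reaching in through getters/setters.
class AccountData {
    private double balance;
    public double getBalance() { return balance; }
    public void setBalance(double balance) { this.balance = balance; }
}

class TransferService {
    public void credit(AccountData account, double amount) {
        account.setBalance(account.getBalance() + amount);
    }
}

// After: the behaviour moves into the class that owns the data. The setter
// - and every dependency on it - disappears, and callers now "tell" the
// object what to do instead of "asking" for its state.
class Account {
    private double balance;
    public void credit(double amount) { balance += amount; }
    public double balance() { return balance; }
}
```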
Download the source code from http://bit.ly/czsOHP
For training and coaching in refactoring, TDD and OO design, visit http://www.codemanship.com
April 24, 2010
Software Is Both Art & Science. Can We Move On Now?
Bonjour, mes enfants.
SEMAT has gotten me thinking. Any mention of "engineering" and "science" in software development seems to polarise opinion.
Undoubtedly, there's a large section of the software development community who believe those words simply do not apply. Software development is not a science. It's an art. Or a craft. Like basket weaving. It's not engineering. No sir!
And there's an equally large section of the community who believe the exact opposite. Software development is a science. It is engineering. We can apply scientific principles to shape predictable, well-engineered end products.
Of course, they're both wrong.
Anyone who dismisses the notion of any kind of scientific basis for software development is running away from reality. Everything that exists has a scientific basis. Even American Idol. You just have to understand it. That we don't fully understand the science of software does not mean that no such science exists or that we'll never understand. I just can't help being reminded of UFOlogists who claim that "science cannot be applied to the study of UFOs". What they mean, of course, is "I've got a good thing going here selling my unscientific ideas to these schmucks, and I'd like to keep it that way, thanks".
And anyone who believes that software development can be completely tamed by science is equally deluded. There are sciences - emerging in recent decades - that teach us that there are many things in the world that, while it's possible that we can fully understand them, we will never be able to control them. Chaos is science. And software development is mostly chaos. Any vision of "software engineering", where pulling lever A causes X and pulling lever B causes Y with certainty, is the product of naivety at the macro, project scale. Life just isn't like that. Clockwork might be, but life isn't. Software development is intractably complex and unpredictable.
In the real world, biology has come up with processes that deal effectively - but not predictably - with intractably complex problems. Evolution is one such process. Evolution solves complex, multi-dimensional problems by iterating through generations of potential solutions. That is science. We can understand it. But we cannot even begin to predict what solution a process of evolution will reach, or how long it will take to reach it.
The clockwork, Newtonian paradigm of "software process engineering" is fundamentally flawed. And anyone who believes that it's possible to attain "value" deterministically is deeply mistaken. "If we pull lever A, we'll ship an extra 10,000 units". Give me a break!
Flawed, too, is any notion that this means that there's no engineering at all to be done in creating good software. I doubt anyone would claim that making rock music is "engineering" - well, anyone sane, at least. But there is science that can be applied within this process.
It's possible mathematically to predict what effect the choice of software compressor and the compression settings used will have on the amplitude of a recording across a certain frequency range. Indeed, it is helpful in getting the best-sounding mix. There is such a thing as "audio engineering" within the music production process. Granted, it's chiefly a creative process. But there is useful science we can apply within it to help tweak the results closer to perfection.
Similarly, while software design is chiefly a creative process, it can be useful to know if the code we're writing "smells" in any significant way. Some code smells can be detected just by looking at the code and using our judgement. Others are more subtle and harder to spot, but just as damaging to maintainability. Code analysis tools, as they grow more sophisticated, can complement our "eye for good design" every bit as much as audio engineering tools complement our "ear for a good mix" and music composition aids can complement our "ear for a good tune".
The trick, I believe, is to find the balance and work within the limitations of science and engineering and creative disciplines. Trying to figure out the formula for "valuable software" is every bit as futile as looking for a formula for making "hit records". And relying entirely on your eyes and ears to refine an end product has severe limitations. For millennia, we've used tools and theory to tweak and refine all manner of creative end products. We even have a word for it: "machine-tooled". The fact that you can fit a computer more powerful than all the computers in the world 30 years ago in your breast pocket, and it looks good, is testament to this symbiosis of science and art.
We can continue to refine and extend our scientific understanding of code and coding through research and exploration, just like any science. And new and useful tools will emerge from that understanding that will make it possible for us to produce better quality code, for sure.
And we can also continue to develop and refine the art of software development. Through reflection, practice and sharing.
There is such a thing as "software engineering", but it has limited scope, just like "audio engineering". Specifically, it's limited to things we can predict and can control, like what happens to coupling and cohesion if we move a class from one package to another.
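Package-level coupling really is predictable arithmetic of this kind. As a minimal sketch (using Robert Martin's well-known instability metric, with my own class and method names): a package's instability I = Ce / (Ca + Ce), where Ce counts its outgoing (efferent) dependencies and Ca its incoming (afferent) ones. Move a class between packages and these counts - and the metric - change in a way we can compute exactly, before we do it.

```java
// Robert Martin's instability metric for a package:
// I = Ce / (Ca + Ce), ranging from 0 (maximally stable, heavily
// depended-upon) to 1 (maximally unstable, depends on everything).
public class Instability {
    public static double instability(int afferent, int efferent) {
        if (afferent + efferent == 0) return 0.0; // no couplings at all
        return (double) efferent / (afferent + efferent);
    }
}
```

A package with 3 incoming and 1 outgoing dependency scores 0.25 - fairly stable, so a good home for classes other code relies on. That's the modest, tractable scope of "software engineering": not predicting hits, just predicting coupling.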
The bigger picture of "delivering value" is a complex human endeavour, and creativity, judgement and more than a sprinkling of luck are all we have to bring to bear in any meaningful way at this level. We may be capable of understanding, with the benefit of hindsight, why Feature X was used more than Feature Y when the software was released. But then, with hindsight, we understand quite a lot about volcanos and hurricanes, too. These are things that can only be understood with hindsight, really. We don't see them coming until they're almost upon us, and we have two choices - stay and risk everything or get out of the way and live to fight another day.
In years to come, I'll probably notice more and more a difference between "hand-rolled" software and software that has been written with some help from "software engineering" tools - the more grown-up descendants of tools like XDepend and Jester.
But I sincerely doubt I will ever be able to tell at the start of a project whether the resulting software will enjoy success or not. Sure, I'll be able to look back on projects and say "hey, y'know what we got wrong?" But far in advance, the outcome is every bit as unknowable as a hurricane, volcanic eruption or hit record. So where things like requirements and processes and "enterprise architecture" are concerned, I'll stick with arts and crafts.