This post has nothing to do with gaming. Probably.
I've been continuing to work my way through The Transformation of War, and hit something interesting last night. Van Creveld commented on the traditional division between strategy and tactics: tactics is the art of winning battles, while strategy is the groundwork, making sure that the battle will be fought under advantageous circumstances, that men and equipment will be in good order on arrival, and that victory can be capitalized on to achieve a campaign's objectives. Further, he argued that much of successful strategy is working around three sorts of difficulties: uncertainty and friction (the tendency for things to go wrong and impede progress) were recognized by Clausewitz, while van Creveld adds inflexibility.
This is probably not news to those familiar with Western strategy, but it did make an interesting connection with software engineering for me. You see, software engineering isn't really engineering as one might use the term for, say, civil or mechanical engineering. A mechanical engineer can look up the strength of various alloys of steel in reasonably reputable sources, crunch the numbers, and figure out what he needs in order to make the thing he's designing meet the requirements. When he's done, he can be reasonably certain that the thing he's built will more-or-less work as intended if he has done his job properly. Software engineers cannot do this. The performance benchmarks available are gamed all to hell, the mathematics taught in school are primarily concerned with asymptotic behavior and completely neglect very substantial performance concerns, nobody outside of Intel and maybe some of the agencies actually knows the full instruction set most general-purpose computers are running, and next year that library you're using is going to break its interface out from under you with no warning. Not only do we not know the 'laws of physics' of the area we're working in, but those laws are constantly changing, often adversarially.
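To make the asymptotics point concrete, here's a minimal sketch (the `Node`, `sum_list`, and `sum_linked` names are mine, purely for illustration): two traversals with identical O(n) complexity whose real-world performance can nonetheless differ substantially, because one walks contiguous memory and the other chases pointers around the heap. The exact gap depends on the machine and interpreter, which is precisely the kind of thing the classroom analysis doesn't capture.

```python
class Node:
    """A cell in a singly linked list."""
    __slots__ = ("value", "next")

    def __init__(self, value, next=None):
        self.value = value
        self.next = next


def sum_list(xs):
    # O(n) traversal over a contiguous, cache-friendly sequence.
    total = 0
    for x in xs:
        total += x
    return total


def sum_linked(head):
    # Also O(n), but each step is a pointer dereference to a node
    # that may live anywhere on the heap.
    total = 0
    node = head
    while node is not None:
        total += node.value
        node = node.next
    return total


n = 100_000
xs = list(range(n))

# Build the equivalent linked list.
head = None
for x in reversed(xs):
    head = Node(x, head)

# Same answer, same big-O, potentially very different wall-clock time.
assert sum_list(xs) == sum_linked(head) == n * (n - 1) // 2
```

Big-O treats these as interchangeable; a profiler usually does not.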
(And of course, the client wants it done yesterday with a completely different set of requirements than they asked for six months ago, but I figure that's probably normal for most engineering disciplines.)
At the end of the day, software engineering has more in common with art and alchemy than with traditional mathematically rigorous engineering approaches. The proof of this, I think, is that we see reflections of strategy's aims and obstacles in modern software engineering practices. Waterfall has fallen by the wayside because its timetables failed to account for friction, and its fixed requirements led to inflexibility: they may have been wrong or useless by the time the project was brought to completion. Test-driven development makes a tradeoff between flexibility and friction, as codified tests somewhat reduce your ability to adapt to changing requirements but also reduce the impact of random failures. The Agile family of methodologies seems focused on flexibility toward changing requirements and on reducing uncertainty about customer needs through communication, which is good, but may come at the cost of a flavor of friction, as developer-time is lost playing Meetings & Metrics (have you heard of my new game? Best-selling the world over...).
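The TDD tradeoff is easy to see in miniature. This is a hypothetical example (the `shipping_cost` function and its rules are invented for illustration): the codified test guards against someone accidentally breaking the behavior, but when the requirements themselves change, the test is one more thing that has to be rewritten.

```python
def shipping_cost(weight_kg):
    # Hypothetical current requirement: flat rate under 1 kg,
    # then a per-kilogram surcharge above that.
    if weight_kg < 1:
        return 5.0
    return 5.0 + 2.0 * (weight_kg - 1)


def test_shipping_cost():
    # Reduces friction: a regression here fails loudly and immediately.
    assert shipping_cost(0.5) == 5.0
    assert shipping_cost(3) == 9.0
    # Reduces flexibility: if the client switches to zone-based pricing
    # next quarter, these assertions must change along with the code.


test_shipping_cost()
```

The same lines that catch random failures also pin the code to yesterday's requirements; which effect dominates depends on how fast the requirements move.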
(Exercise for the reader: commission a civil engineering firm to design a highway suspension bridge using Agile. Be sure to change the requirements regularly to keep them on their toes. Record the stand-up meetings and put them on YouTube.)
Anyway: at the end of the day, software engineering is about establishing a context for programming, one in which the unpleasant, messy work of one's developers can actually achieve the objective despite uncertainty and both human and machine failures. In this, it is like strategy.