No Silver Bullet


In No Silver Bullet - Essence and Accidents of Software Engineering, Fred Brooks concluded that "There is no single development, in either technology or management technique, which by itself promises even one order-of-magnitude improvement within a decade in productivity, in reliability, in simplicity", and went on to write "How much of what software engineers now do is still devoted to the accidental, as opposed to the essential? Unless it is more than 9/10 of all effort, shrinking all the accidental activities to zero time will not give an order of magnitude improvement."

I think Brooks has a point, and was mostly correct at the time of writing, but what's happened over the past few decades is a massive increase in software complexity which has yielded only marginal improvements in functionality. In other words, the accidental complexity now exceeds the essential complexity by a wide margin, so a silver bullet is now at least theoretically possible. You can remove most of this complexity, provided you recognize it for what it is, and provided those who added it in the first place no longer have a stake in the software. I have done this several times, reducing the code size of the parts I simplified by up to 90%. In most cases, the code did exactly the same thing as before, but ran faster and used less memory.

In a commercial environment, you're unlikely to be thanked for doing this, as you're being paid to add more functionality, not to reduce complexity. And if you then proceed to simplify the "business logic", another major source of complexity, you will make yourself so unpopular that you'll probably get fired.

However, this accidental complexity will generally only become obvious after a rewrite of the original system without adding any new features, and perhaps removing features of marginal utility (taking advantage of the Pareto Principle), and even then only to those who care about simplicity.

Of course, any rewrite along these lines adds to rather than subtracts from the required effort, at least in the short term (though it would reduce the effort required to maintain the system), so what is needed instead is to avoid the accidental complexity in the first place, by recognizing it before it is inadvertently included in the system.

This will only happen when keeping things simple is actively encouraged, and those capable of it are recognized and given key roles in the design and development of systems. I don't see that happening very often in the foreseeable future.

Sometimes massive simplifications do catch on, though: Unix is a much stripped-down MULTICS. However, most of today's Unices dwarf the MULTICS upon which they were originally based. (A better simplification, though, would be to dispense with the operating system altogether, and run languages and applications on the bare metal.)

A smaller-scale example, and one where the simplification happened before the overly complex code was implemented, is the abandonment of Lisp's original syntax, with its M-expressions. These were much more Algol-like, and more complex to parse, than S-expressions, which were originally intended only for data. (The S-expression parser is just the read function.) However, once the first interpreter was up and running, early Lisp programmers immediately adopted the S-expression syntax, and M-expressions were never used outside of textbooks. Non-Lisp programmers often look aghast at S-expressions, but never consider that the problem might lie with themselves. As confirmation, all subsequent attempts to give Lisp a more Algol-like syntax have foundered.
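To see why the S-expression parser is so trivial, compare the two notations: the M-expression cons[A; B] becomes the S-expression (CONS A B), so the whole grammar collapses to atoms plus parenthesized lists. Here is a minimal sketch of such a reader, written in Python for illustration; the function names and the crude whitespace tokenizer are my own, not anything from an actual Lisp implementation.

```python
def tokenize(text):
    # Pad parentheses with spaces, then split on whitespace.
    return text.replace("(", " ( ").replace(")", " ) ").split()

def read(tokens):
    # Consume one expression from the front of the token list.
    # An S-expression is either an atom or a parenthesized list,
    # so one recursive function covers the entire grammar.
    token = tokens.pop(0)
    if token == "(":
        lst = []
        while tokens[0] != ")":
            lst.append(read(tokens))
        tokens.pop(0)  # discard the closing ")"
        return lst
    return token  # an atom

print(read(tokenize("(car (cons A B))")))  # → ['car', ['cons', 'A', 'B']]
```

A comparable reader for M-expressions would need to distinguish square brackets, semicolons, function position, and data quoting, which is exactly the extra parsing complexity the early Lisp programmers walked away from.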


© Copyright Donald Fisk 2015