“The guy who wrote X is no longer here and no one knows what it does anymore”

That should sound awfully familiar to anyone who has been in the industry for any length of time. The story is repeated so often that the instances of it wreaking havoc are countless, and sadly it keeps happening.
Programs within programs

The inability to control complexity is one of the major causes of failure in large software projects. Controlling and dealing with complexity is one of the central tenets of software management, espoused in many seminal works on the topic –

“The goal of “conceptual integrity” or unity of design: The best programs and software systems exhibit a sense of consistency and elegance, as if they are shaped by a single creative mind, even when they are in fact the product of many. The fiendish problem is less how to create that sense in the first place than in how to preserve it against all the other pressures that come to bear throughout the process of writing software.”

The ability to hire more people should not be the yardstick for judging whether a task can be added to the system. Even Linus Torvalds, who might theoretically have an infinite army of quality labor at his disposal, chooses to break kernel interfaces every once in a while and kill the cruft, preferring to break compatibility rather than deal with the additional complexity of maintaining extra layers or binary interfaces in the core kernel.

The most efficient software team I ever saw was able to use a SINGLE top-notch programmer to maintain reams of code at the heart of its system, largely because they were able to maintain a flowing consistency in design and coding styles. Understanding another module was as easy as taking a look at the function definitions exposed by the module. Since every piece of code followed the same naming conventions, file conventions, parameter and coding style conventions, understandability was as good as it could ever be. The rewards were tremendous from a manageability point of view.
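As a minimal sketch of what that interface-level consistency might look like (the module and function names below are hypothetical, not from the actual system), two unrelated C++ modules become instantly readable when they expose the same verbs, the same error model and the same parameter conventions:

#include <string>

// config_store.h -- hypothetical module: init/get/set/shutdown verbs,
// a Status return code, outputs passed by reference.
namespace config_store {
    enum class Status { Ok, NotFound, Error };

    Status init(const std::string& path);
    Status get(const std::string& key, std::string& value_out);
    Status set(const std::string& key, const std::string& value);
    void   shutdown();
}

// event_log.h -- a different module, yet a reader who knows config_store
// already knows how to call it, because the conventions are identical.
namespace event_log {
    enum class Status { Ok, NotFound, Error };

    Status init(const std::string& path);
    Status append(const std::string& entry);
    void   shutdown();
}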

What value does reduction in complexity create?

“Complexity kills. It sucks the life out of developers, it makes products difficult to plan, build and test. … Each of us should … explore and embrace techniques to reduce complexity.” – Ray Ozzie

UI design principles are extremely well developed compared to other areas of software design, for obvious reasons. Consistency is one of the major tenets of good UI design: it promotes simplicity and reduces learning effort. But more than being an artifact of what a UI is, this reflects the human element.

Anything that is simple and consistent is simply easier to use. Anything that's easier to use makes a helluva lot more business sense than otherwise. The same principle is behind the major innovations in software design, be it programming languages, structured programming, objects or objects with garbage collection.

The human factor involved in creating and maintaining code bases automatically makes the same rules applicable to software design too. It is easier to maintain and develop a code base that is uniformly consistent than one that requires considerable effort to learn each new piece of functionality.

Better Comprehension

The way human beings cope with big systems is through abstraction, or theory building. Each time an abstraction breaks or fails to hold true, details have to be worried about. The more details one has to juggle while dealing with a code base, the more likely something is to be dropped. Architectural consistency avoids having to carry around too many details about the code base, making things easier to understand overall.

Better comprehension results in better decisions made elsewhere, because fewer mistakes are made in the assumptions, resulting in fewer disasters overall. Better still, it means not having to make the same kinds of decisions over and over elsewhere in the system. This streamlines the whole effort and allows the team to be more productive rather than worrying about system-related minutiae. After all, the code is only the tool and not the end result we are after. Any time not spent creating or learning code is better for everyone involved.

Increase efficiency & avoid duplication

In an SNMP-based framework we were developing, it so happened that one of the modules ended up using NetSnmp while the others used snmp_pp.

During the course of development, two separate wrappers were developed over these different libraries that did the same kind of work. However, one was essentially C++ based and the other C based. The code that derived from them also became essentially different.

They had different ways of handling errors and subtle differences in how each library handled the underlying complexities. Value-adds created in one of the wrappers (e.g. working out whether a hex string is an IP address) had to be duplicated in the other. Bugs found in one module were particular to it, and the testing and debugging effort they consumed could have been put to better use elsewhere. Users and maintainers of each module had to carry different facts and assumptions about the underlying libraries, which was simply more overhead for the team overall. Team members could not easily be switched back and forth between the two modules, creating additional rigidity for program management.

Further, we had two entire bodies of code in the system handling the same functionality. With a little foresight all of this could have been avoided very easily. The complexity described arose in a small system developed by ten developers. Imagine, therefore, the complexity that can result from inconsistencies in systems developed by scores of developers.
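As a minimal sketch of the kind of common layer that foresight could have produced (the names here are hypothetical, not the framework's actual API), a single abstract session interface, with one NetSnmp-backed and one snmp_pp-backed implementation hidden behind a factory, would have kept the error model and the shared value-adds in exactly one place:

#include <memory>
#include <string>

// Hypothetical common wrapper: both underlying libraries live behind the
// same interface, so every module sees one error model and one API.
class SnmpSession {
public:
    enum class Status { Ok, Timeout, NoSuchName, Error };

    virtual ~SnmpSession() = default;
    virtual Status get(const std::string& oid, std::string& value_out) = 0;
    virtual Status set(const std::string& oid, const std::string& value) = 0;
};

// Shared value-adds (such as the hex-string / IP-address check) are written
// once and reused by every module, whichever backend it runs on.
bool is_hex_ip_address(const std::string& hex);

// The factory decides which library actually backs the session; callers
// never mention NetSnmp or snmp_pp directly.
std::unique_ptr<SnmpSession> make_snmp_session(const std::string& agent_address);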

Consistency is never accidental – Code Reviews?

Creating a work that is consistent in its design is never easy. Keeping that consistency from derailing is a major task in itself. Small inconsistencies happen all the time in all systems; parallel code development always results in small effort duplications that go unnoticed. Bad projects have huge duplication of effort, whereas better ones have less code and consistently cause common code to bubble up as shared infrastructure to be reused again and again.

However, finding and identifying common code and reviewing modules is almost a full-time job, and it is most often neglected. Wayward designs therefore go undetected for a long time, until the problem becomes big enough to appear on someone's radar. By that time the system is too big to control and is a beast of its own. Tearing it down and rebuilding it around common code becomes a gargantuan exercise that no one is willing to undertake. That is usually how code development in big organizations proceeds.

The only way to control this is to enforce code reviews, with consistency and reuse emphasized as key outcomes of the review effort. Having a single team, or a key set of people, familiar with the entire code base is an advantage here.

At times, architectural consistency does seem to lend projects that always-elusive elegance which many of us seek and rarely find*

*kind words lifted from a popular samurai film ending
