RLSP

Tags: KC IT

Process complexity

On the one hand there are software processes such as the Unified Process, and on the other there are processes such as Extreme Programming. The former seem too complex and heavy, while the latter seem too chaotic and empty.

The above is a common impression our clients report to us when they are confronted with the subject of setting up their software development organisation. Not really knowing what to do about their software development process, they either do nothing or attempt to go somewhere in between. In both cases this often fails for lack of insight into the subject.

Our analysis is that this happens because, generally speaking, available processes, and in particular the Unified Process family, lack a pragmatic vision that would adapt them to real-life projects (while agile processes lack sufficient guidance to be usable).

The Real Life Software Process

The Real Life Software Process is an extension, both technical and conceptual, of the well-known Unified Process.

It can range seamlessly between agile development and predictive development. The Unified Process was chosen as the foundation because it provides the open framework the RLSP needs (and studies have already shown that Extreme Programming can in fact be seen as an instance of the Unified Process).

This can be viewed as a scale ranging between predictive and agile processes, with the waterfall (fully predictive) process at one extreme and Extreme Programming at the other. With the RLSP this scale is truly progressive: the process framework varies continuously between the two extremes. On this scale the 'standard' Unified Process sits somewhere in the middle, for most people somewhat on the predictive side.

One of the criteria for establishing the required agility (or predictivity) of a process, and therefore its position on the scale between predictive and agile, is requirement stability.

Requirement stability

Requirement stability is an important concept, as it alone goes a long way towards determining how predictive your software process should be. Simply put, the more stable the requirements, the more predictive the process can be.

This sounds quite straightforward, but in fact it isn't so simple, mainly because unknown requirements do not mean unstable requirements. Most software development projects start off with little more than a vision and very few requirements or specifications.

Requirement stability refers to the ability to specify a sufficient proportion of the requirements early enough in the process, and at an acceptable cost. This is a rather vague description, but it is difficult to be more precise: what counts as sufficient, early enough and acceptable depends very much on the project at hand, and that is the whole difficulty of the concept.

Economic aspects

The cost of stabilising requirements is the main barrier to always adopting a highly predictive process. In theory full stability can always be achieved, and many methods have been developed to that end. But the price of achieving it (the time spent on analysis, prototyping, reviews, ...) soon becomes excessively high, cancelling out the economic advantages of predictive methods.

On the other hand, assuming the requirements are totally unstable when this is generally not the case leads to inefficient software processes that over-emphasise agility and support for change. Many projects suffered from this symptom with the very popular Extreme Programming method, adopting the fashionable XP practices and their costly support for extreme agility (pair programming, NBUFD, refactoring, on-site customer, ...) in cases where this was not necessary at all.

The economically optimal process is one that provides the required amount of agility, no more and no less.