Extreme Programming - At What Cost?

V.V.S.Raveendra

The waterfall model assumes that the requirements do not change until the system goes live. Iterative models such as the Rational Unified Process can absorb changes at the beginning of an iteration. The Extreme Programming (XP) model, on the other hand, is expected to take changes at any time. These days many teams show interest in XP for software development. It is a fact that in some businesses the rate of change of requirements is quite high, and a development process is needed that can still deliver software under that churn. We need to know the cost of a model that incorporates changes continuously and makes frequent releases.

In waterfall, the testing effort is spent only once, towards the end of the project. In an iterative model, at the end of each iteration the new code developed in that iteration is tested, and the code built in earlier iterations is regression tested. Iteration k therefore tests k iterations' worth of code, so with n iterations the total testing effort is 1 + 2 + ... + n = n(n+1)/2. In an iterative development model, normally n < 10. If we treat every release in XP as an iteration, n becomes quite large, and hence the effort spent in testing grows as O(n^2). (A small code sketch of this arithmetic appears at the end of this note.)

One way to contain this is to minimize the extent of testing by designing the application as a collection of components such that regression testing is localized; the effective n for each component then stays small. The catch is that the application should lend itself to such componentization, and dividing it into components takes time. Another way is to automate testing. We then need to check whether the cost of the tool fits within the project budget. The test scripts are specific to the tool we employ, so along with the code we must maintain the scripts, and the versions of the scripts and the code must be kept in sync.

I will now turn to parallels that come to my mind. Modeling techniques in civil and mechanical engineering in the 19th century and the first half of the 20th century were dominated by continuum techniques (differential and integral equations). The availability of computers made it possible to model discretely and solve problems with computational techniques (the Finite Difference Method, the Finite Element Method, and so on). In other words, these branches of engineering started with continuum mechanics and progressed towards discrete modeling techniques. Software life cycle models, on the contrary, are trying to move from discrete change to continuous change.

After binary systems came fuzzy systems, in which the two states 0 and 1 are extended to accommodate intermediate states. Then came neural networks, which can deal with a continuum of states. However, fuzzy and neural systems could not replace binary digital systems; neural networks have their applications, but in a limited context. The lesson to learn from history is that dealing with 'continuous change' is not easy and may not be applicable across the board. We need to analyze whether the context of the project needs XP before making use of that model.

14 August 2003; updated 29 August 2003
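To make the growth rate above concrete, here is a minimal Python sketch of the testing-effort arithmetic. The unit of effort (testing one iteration's worth of code costs one unit) and the assumption that changes spread evenly across equally sized components are illustrative choices added here, not part of the original argument.

    # A minimal sketch of the testing-effort arithmetic in the note above.
    # Assumptions (illustrative, not from the article): testing one
    # iteration's worth of code costs one unit of effort, and changes
    # spread evenly across equally sized components.

    def monolithic_effort(n: int) -> int:
        """Iteration k tests its new code plus regression of iterations
        1..k-1, i.e. k units; n iterations cost 1 + 2 + ... + n = n(n+1)/2."""
        return n * (n + 1) // 2

    def componentized_effort(n: int, components: int) -> float:
        """With regression testing localized to one of `components` equally
        sized components, each component effectively sees only n/components
        iterations, so the quadratic term shrinks by that factor."""
        per_component = n / components
        return components * per_component * (per_component + 1) / 2

    if __name__ == "__main__":
        for n in (10, 50, 200):  # small iterative project vs. frequent XP releases
            print(f"n={n:4d}  monolithic={monolithic_effort(n):6d}  "
                  f"ten components={componentized_effort(n, 10):8.1f}")

Running this gives, for example, 1275 units versus 150 for n = 50, and 20100 versus 2100 for n = 200: the quadratic term is divided by the number of components, which is exactly why localizing regression testing helps.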