Friday, December 15, 2006

The Cost of Code Quality

One of the most controversial events at the Agile 2006 conference was a paper by Yuri Khramov from Apple on the cost of code quality. It is too bad I did not have a chance to see the presentation :( . Let's see what he writes:

"It is shown that a 'quick and dirty' approach is actually preferable in some situations"

There are quite a few statements like this in the article; it is clear the author is quite emotional about the issue. Let's see what he says further on:

"Code quality is not the ultimate goal of the development team, but rather one of the options considered for achievement of the real goal of the project, be it commercial success, timely solution of an important problem, or something else"

Agreed

"... there is no positive correlation between the quality of the code and the success of the product..." and also "If there is any correlation between code quality and the success, it is a negative one"

He bases his research on the 80 projects he participated in. My impression is that these were mostly small projects, or that the systems he built were not very big. But I can say that this matches my experience too. The worst system I ever built (in Perl) was very successful. It had no design whatsoever, not even a procedural one. And in fact it did not have too many bugs and was in production for at least two years. One of the best systems we built never made it to production. Usually the most successful systems I built had moderate code quality, although some of them had excellent code quality.

So anyway, the author says: "... most of the time the 'quick and dirty' approach delivers better return on investments"

Probably the reasoning behind this statement is that code quality is expensive but does not bring much benefit. But what about all the project failures due to poor code quality? The author argues that volatility of requirements "is the reason for most project failures". This is probably true, although he of course does not count the projects that never even started because the existing code was too expensive and risky to modify. Anyway, what does the author suggest to fight the volatility of requirements? I suspect he recommends building a low-quality system and rewriting it whenever the requirements change significantly.

"High quality code performs correctly even on edge cases; quick and dirty solutions may work well only on the most frequent case". That's true. But the author argues that if the system will only be used in the most frequent case the quick and dirty solution is acceptable. I think this is more a design defect: why don't we design a system that will only have one or two use cases and remove all unnecessary alternative flows by not designing them into the system? For example, we don't need a breadcrumbs on the data entry applications. And what about all these "Home" buttons on the "wizard" pages? Do you really want the user to go back to the home page in the middle of transaction?

"...high post-release defect density is often related to the high degree of product success". That's true, if the system was never used it has no bugs.

"Despite the fact that several projects had failed, none of the failures was doe to high bug count" It depends how you define failure of the project. Some people define a project failure if it runs over budget or delivered with significant delays. Usually high bug count affects both cost and schedule.

Now the author argues against the theory that it costs more to fix bugs in the later stages of a project. I always thought that theory was incorrect, but then I have been using agile principles for 7 years already. He says that "relative costs of fixing a single defect after release decreased substantially since 1994" and that "In our data, the cost of addressing an issue is not higher in the projects with low code quality". This may be true, but the issues are usually different in systems with high and low code quality.

Now he says that many systems are only in production for a few months, so why bother with code quality? I can agree, with one exception: sometimes the business decides to keep the system longer than originally expected. We have an example where a quick and dirty solution has already been in production four times longer than originally anticipated.

Very beautiful:
"... a good project manager analyzes the project and designs the development process almost the same way as an architect designs the software."

Very evil:
"It is very natural to perceive the code written by another person as inferior..." But the point is clear: the developers don't enjoy working on the system with low code quality. For me low developer motivation is a project risk.

Good observation about xUnit:
"...xUnit approach means writing more code, and every line of code is a liability" Instead he suggests to use Design By Contract methodology where xUnit is too expensive. I think this is a good suggestion.

Conclusion:
"... experience demonstrates that the quest for the best code is not always justified."

Monday, December 11, 2006

KANO and Agile requirements

One thing the Agile methodology gets wrong is insisting that all requirements must be explicit. From the KANO methodology we know that customers are explicit only about satisfiers; they are implicit about dissatisfiers (the things they take for granted, whose absence causes dissatisfaction), and of course they do not even suspect the delighters exist. Agile methodology is wrong in requiring the dissatisfiers to be stated explicitly. This is quite irritating for the customers, and if they miss a dissatisfier and it does not get implemented, customer satisfaction swings into negative territory. A good Agile methodology should always make sure the dissatisfiers are implemented as part of the solution. Correspondingly, the customer should be aware that some of his money goes toward maintaining good standards of software, but he should be confident that everything he takes for granted will be implemented in the system.
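To illustrate the asymmetry with a toy model (my own sketch; the impact numbers are made up and are not part of KANO):

    # Satisfaction impact per category: (implemented, missing).
    KANO_IMPACT = {
        "satisfier":    (+1, -1),  # explicit: more is better, less is worse
        "dissatisfier": ( 0, -3),  # taken for granted: only hurts when missing
        "delighter":    (+3,  0),  # unexpected: only helps when present
    }

    def satisfaction(requirements):
        # requirements: list of (category, implemented) pairs
        total = 0
        for category, implemented in requirements:
            gain, loss = KANO_IMPACT[category]
            total += gain if implemented else loss
        return total

    # One missed dissatisfier outweighs two implemented satisfiers.
    print(satisfaction([("satisfier", True),
                        ("satisfier", True),
                        ("dissatisfier", False)]))  # prints -1

This is why the dissatisfiers should be implemented silently as part of any solution: implementing them earns no visible credit, but missing even one drives satisfaction negative.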