Friday, December 15, 2006

The Cost of Code Quality

One of the most controversial events at the Agile 2006 conference was a paper by Yuri Khramov from Apple on the cost of code quality. It is too bad I did not have a chance to see the presentation :(. Let's see what he writes:

"It is shown that a 'quick and dirty' approach is actually preferable in some situations"

There are quite a few statements like this in the article; it is clear the guy feels very strongly about the issue. Let's see what he says further on:

"Code quality is not the ultimate goal of the development team, but rather one of the options considered for achievement of the real goal of the project, be it commercial success, timely solution of an important problem, or something else"

Agreed

"... there is no positive correlation between the quality of the code and the success of the product..." and also "If there is any correlation between code quality and the success, it is a negative one"

He bases his research on the 80 projects he participated in. My impression is that these were mostly small projects, or at least that the systems he built were not very big. But I can say this matches my own experience too. The worst system I ever built (in Perl) was very successful. It had no design whatsoever, not even a procedural one, and in fact it did not have too many bugs and stayed in production for at least two years. One of the best systems we built never made it to production. The most successful systems I built usually had moderate code quality, although some of them had excellent code quality.

So anyway, the author says: "... most of the time the 'quick and dirty' approach delivers better return on investments"

The reason for this statement is probably that code quality is expensive but does not bring much benefit. But what about all the project failures caused by poor code quality? The author argues that volatility of requirements "is the reason for most project failures". This is probably true, although of course he does not count the projects that never even started because the existing code was too expensive and risky to modify. Anyway, what does the author suggest to fight the volatility of requirements? I suspect he recommends building a low-quality system and rewriting it on any significant change of requirements.

"High quality code performs correctly even on edge cases; quick and dirty solutions may work well only on the most frequent case". That's true. But the author argues that if the system will only be used in the most frequent case the quick and dirty solution is acceptable. I think this is more a design defect: why don't we design a system that will only have one or two use cases and remove all unnecessary alternative flows by not designing them into the system? For example, we don't need a breadcrumbs on the data entry applications. And what about all these "Home" buttons on the "wizard" pages? Do you really want the user to go back to the home page in the middle of transaction?

"...high post-release defect density is often related to the high degree of product success". That's true, if the system was never used it has no bugs.

"Despite the fact that several projects had failed, none of the failures was doe to high bug count" It depends how you define failure of the project. Some people define a project failure if it runs over budget or delivered with significant delays. Usually high bug count affects both cost and schedule.

Now the author disputes the theory that it costs more to fix bugs in the later stages of a project. I always thought that theory was questionable, but then I have been using agile principles for seven years already. He says that "relative costs of fixing a single defect after release decreased substantially since 1994" and that "in our data, the cost of addressing an issue is not higher in the projects with low code quality". This may be true, but the issues are usually different for systems with high and low code quality.

Now he says that many systems are only in production for a few months, so why bother with code quality? I can agree, with one exception: sometimes the business decides to keep the system longer than originally expected. We have an example where a quick and dirty solution has already been in production four times longer than originally anticipated.

Very beautiful:
"... a good project manager analyzes the project and designs the development process almost the same way as an architect designs the software."

Very evil:
"It is very natural to perceive the code written by another person as inferior..." But the point is clear: the developers don't enjoy working on the system with low code quality. For me low developer motivation is a project risk.

Good observation about xUnit:
"...xUnit approach means writing more code, and every line of code is a liability" Instead he suggests to use Design By Contract methodology where xUnit is too expensive. I think this is a good suggestion.

Conclusion:
"... experience demonstrates that the quest for the best code is not always justified."

Monday, December 11, 2006

KANO and Agile requirements

One thing the Agile methodology gets wrong is insisting that all requirements must be explicit. From the KANO methodology we know that customers are explicit only about satisfiers and are implicit about dissatisfiers; of course, they do not even suspect the delighters. Agile methodology is wrong to require that the dissatisfiers be stated explicitly. This is quite irritating for the customers, and if they miss a dissatisfier and it does not get implemented, customer satisfaction swings into negative territory. A good Agile methodology should always make sure the dissatisfiers are implemented as part of the solution. Correspondingly, the customer should be aware that some of his money goes toward maintaining good standards of software, but he should be confident that everything he takes for granted will be implemented in the system.

Tuesday, November 14, 2006

Test-driven writing of standards

We all know about test-driven development, and even test-driven management (which turns out to be just setting goals and checking them at the end of the year).

Now I introduce Test-Driven Writing of Standards. You can start by writing coding standards up front, but you will likely miss something. So when you see another ugliness in your code, first go and check whether your standards cover the problem. If not, add it to the standards, and then raise a defect based on the newly added coding standard. A sketch of how such a check could be automated follows below.
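For illustration, here is one way a newly added standard could be made executable so the defect surfaces on every build. This is only a sketch under my own assumptions: the banned pattern (empty catch blocks), the source path, and the NUnit test class are my examples, not anything from the post.

using System.IO;
using System.Text.RegularExpressions;
using NUnit.Framework;

[TestFixture]
public class CodingStandardsTests
{
    [Test]
    public void NoEmptyCatchBlocks()
    {
        // Hypothetical standard, added after spotting the ugliness once:
        // an empty catch block silently swallows exceptions.
        Regex emptyCatch = new Regex(@"catch\s*(\([^)]*\))?\s*\{\s*\}");

        // The source root is an assumption about the project layout.
        foreach (string file in Directory.GetFiles(@"..\..\src", "*.cs", SearchOption.AllDirectories))
        {
            string source = File.ReadAllText(file);
            Assert.IsFalse(emptyCatch.IsMatch(source),
                "Empty catch block violates the coding standard: " + file);
        }
    }
}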

Monday, October 02, 2006

Wiki and Blog in WSS 3.0!

This video shows how to use WSS to create a Wiki web or a blog. WSS also supports RSS. Very cool.

Tuesday, September 19, 2006

DBCC MEMORYSTATUS

Whenever you suspect memory problems in SQL Server 2000, use DBCC MEMORYSTATUS. Here is the KB article on how to interpret the results.

Thursday, September 07, 2006

TDD vs RAD

I think the incompatibility between TDD and RAD is artificial. RAD is very useful for interfacing with external systems, like databases, web services, and the user (the UI is also part of the external interface). TDD is very useful for developing business logic.
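Here is a minimal sketch of the split I have in mind; the repository and calculator names are my own illustration, not an established API.

public interface ITaxRateRepository
{
    // The concrete implementation talks to the database and can be
    // thrown together RAD-style with designers and wizards.
    decimal GetRate(string region);
}

public class InvoiceCalculator
{
    private readonly ITaxRateRepository rates;

    public InvoiceCalculator(ITaxRateRepository rates)
    {
        this.rates = rates;
    }

    // Pure business logic: easy to grow test-first against a fake
    // ITaxRateRepository that returns a fixed rate.
    public decimal Total(decimal netAmount, string region)
    {
        if (netAmount < 0)
            throw new System.ArgumentOutOfRangeException("netAmount");
        return netAmount * (1 + rates.GetRate(region));
    }
}

In a unit test, a hand-written fake repository returning a fixed rate drives InvoiceCalculator test-first, while the real database-facing class never needs that kind of coverage.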

Friday, September 01, 2006

First thing to do when the execution plan changes

I need to run this for every database:

-- Rebuild all indexes on every table
EXEC sp_MSforeachtable "DBCC DBREINDEX ('?')"
-- Recompute distribution statistics with a full scan of every table
EXEC sp_MSforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN'

Wednesday, August 30, 2006

So why does SQL server change execution plan?

Well, some hypotheses:
  1. SQL Server has artificial intelligence built in and wants to annoy developers and DBAs by randomly changing execution plans to make queries slow, so that developers and DBAs spend extra hours trying to optimize them.
  2. SQL Server has artificial intelligence built in and has a bias toward SQL Server performance consulting. Since the rates dropped somewhat, it decided to torture full-time employees until they hire a consultant at a decent rate.
  3. This behaviour was part of SP4 and was designed to annoy developers and DBAs so they upgrade to SQL Server 2005 as soon as possible, thus generating much-needed revenue for Microsoft.

Friday, August 25, 2006

Why does SQL server always change execution plan?

I just don't get why SQL Server keeps changing the execution plan.

I run

EXEC sp_MSforeachtable 'UPDATE STATISTICS ? WITH FULLSCAN'

and still cannot get the same execution plan as last time. I don't think any data has changed in the last few weeks. And why does SQL Server all of a sudden start using hash or merge joins and try to use parallelism? A couple of days ago I worked on optimizing the query and got it down to 71 milliseconds. Today it is back to 600 ms. Is the solution to use a real database, for example Oracle?

Tuesday, June 06, 2006

Evil empire...

Just found in my code:
using System.Globalization;
I did not know we were using an evil system of globalization. Guess what, the globalists are sitting in ambush in all the unlikely places...

Wednesday, May 24, 2006

Architecture Journal useless?

I found the Microsoft Architecture Journal utterly useless: lengthy articles full of general statements and no strategic vision.

Friday, May 19, 2006

Can't spend 25 hours a week blogging?

Steve Rubel spends 21-25 hours a week blogging while also holding a full-time job as a senior VP at Edelman. I realize I am just too lazy.

Wednesday, February 22, 2006

Luck-driven development

I found the term "opportunistic development" (Research Directions of Microsoft, November 2005) referring to writing a VB program in the debugger and tweaking it to get the desired results. The Pragmatic Programmers called this Programming by Coincidence. But in line with the XYZ-Driven Development pattern, I decided to introduce a new term: Luck-Driven Development (LDD).

Tuesday, January 24, 2006

FreeMind

Mind Mapping (a.k.a. Thought Mapping) is such a great idea. I have only just started using it, perhaps because I am backwards. I am using FreeMind, which seems to be an OK tool, but I have not yet figured out how to print from it.

Friday, January 20, 2006

Windows Messenger

Why in the world do I still need Windows Messenger on my machine when I already have MSN Messenger and Microsoft Office Communicator? Has anyone actually designed Windows XP?