When is a defect not a defect?
When the user never experiences it.
I once worked with a great QA manager called Will, who used to say “If we find 200 bugs in test, and you guys fix them, then they never happened”, which always tickled me. It was a way of agreeing, amongst our IT-selves, that we wouldn’t necessarily advertise that the software was of poor quality until we’d squeeeeeeeezed it through the QA/fix machine and it emerged squeaky clean on the other side.
We’re currently going through a clean-up of a large backlog of defects logged in our system. Note the use of the term ‘logged’, which doesn’t necessarily mean experienced. During our development projects it’s not uncommon for us to stumble upon defects that are already live in our production software. When this happens we fix them if we can; if timescales and deadlines are just too tight, we raise them as production defects instead. It seems like the right thing to do: we found ’em, so we hold our hands up and put them on the defect log, where they are visible and can be fixed… eventually.
Is this a backward view of bugs? If our users don’t experience these defects, do they even care that they’re there? Are we being over-analytical in our efforts to capture and document every last defect? You could argue that data-integrity defects might unravel the system in ways not obvious to users. I would agree, but I struggle to come up with a real-life example of that kind of hidden defect in our applications; what’s more, we’d never leave those bad boys in there.
Is fixing a defect that never manifests a needless waste of effort? Is the defect itself a symptom of software waste in the first place? After all, it suggests there is underlying code present that is never executed.
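To make that “never executed” point concrete, here’s a minimal hypothetical sketch. None of this is from our system; the function, the rates, and the unused express flag are all invented for illustration. The branch carries a genuine defect, but no caller ever reaches it, so no user ever experiences it:

```python
# Hypothetical illustration: a latent defect hiding in a branch
# that no current caller ever reaches.

def shipping_cost(weight_kg: float, express: bool = False) -> float:
    """Return the shipping cost for a parcel."""
    if express:
        # Defect: the express surcharge was never added
        # (should arguably be weight_kg * 2.5 + 10). Nobody notices,
        # because the express option has never been exposed to users.
        return weight_kg * 2.5
    return weight_kg * 1.2

# Every production call today looks like this, so the bug never manifests:
print(shipping_cost(4.0))  # 4.8
```

If the express option never ships, fixing that branch arguably buys our users nothing; the real waste is the dead branch itself.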
I’m torn between the desire to have perfect software and the urge to resist software waste.