The most interesting number for me is the 0.1% bug score.
These are rough numbers, but as of last week we had about 125K lines of code and 103 bugs from a system that has been in production for half of the 18-month project duration. Another view of this number is about 0.7 bugs per developer-month.
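For anyone who wants to check the arithmetic, here is the back-of-the-envelope calculation behind that score, using only the approximate figures quoted above:

```python
# Rough defect-density arithmetic from the numbers in this post.
# These are the approximate figures quoted above, not output from any tool.
lines_of_code = 125_000
bugs = 103

density_pct = bugs / lines_of_code * 100
print(f"{density_pct:.2f}% bugs per line of code")   # ~0.08%, i.e. roughly 0.1%

bugs_per_kloc = bugs / (lines_of_code / 1000)
print(f"{bugs_per_kloc:.2f} bugs per KLOC")          # ~0.82 bugs per thousand lines
```

Strictly it comes out at about 0.08%, which rounds up to the 0.1% headline figure.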
The application was built with .NET 3.5 and SQL, with only a couple of interfaces to external systems. Data defects resulting from migrations from legacy systems were included in the bug count. Visual Studio 2008 and TFS were our IDE and source control tools.
The things that gave us this number include a clear up-front articulation that quality was our primary driver, a fanatical focus by team members on testing at all stages of development, continuous integration, pushing the system into production as early as possible, and relentless regression testing.
We (maybe it was just me) had a thing going that we should never find a bug in UAT. It was an aspirational goal, but it was almost true.
Of course there are some caveats.
- We never automated our testing, so we may have missed some regression bugs
- We may not have found all the bugs for outlier scenarios
- We wrote UAT test cases with a view to testing business capability, not to continuously test whether all outlier scenarios were catered for
But we have had it in production for 9 months.
Is 0.1% something to brag about? Yes, it probably is. Well done team.
Picture by PMSTW via CC @ Deviantart