This post was migrated from Justin’s personal blog, 'Codethinked.com.' Views, opinions, and colorful expressions should be taken in context, and do not necessarily represent those of Simple Thread (and were written under the influence of dangerous levels of caffeination).
I will refer to this from now on as the Law Of Crappy Code, or simply LOCC. The LOCC is all about you and the code you write. You know what I am getting at: you write crap code. We all do. Have you ever wondered why you write something and then come back to it a year later and think "this is crap"? Well, that is because it is. Now, I hope that you aren’t taking offense at any of this, because it isn’t your fault; you are human, and humans make mistakes (lots of them). The tack to take is to look at the problem and ask, "how can we mitigate this?"
You may be surprised to hear that the "industry average" for flaws in code is all over the map and varies widely depending on who is reporting the numbers (sarcasm). From what I can tell, the industry average is somewhere between 0.5 and 25 errors per thousand lines of code (some numbers I saw went as high as 50, but that is just plain ridiculous in most modern languages; I cannot imagine introducing an error every 20 lines of code!). You are going to be closer to 0.5 if you are writing something like a database or a web server, and probably closer to 25 if you are writing general business software. But even somewhere in the middle, say 12 errors per thousand lines, still means that you are introducing a bug roughly every 83 lines of code that you produce. Is that number acceptable? Well, I hope not. In a 250,000-line program that would mean 3,000 errors. Now, some of these errors may be trivial, and hard numbers on the severity of these errors just aren’t out there (because severity is grossly subjective), but you can still see that the more code you have, the more likely you are to have bugs.
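The arithmetic is simple enough to sketch. This is a toy back-of-the-envelope calculation using the rough industry figures above, not measured data from any real codebase:

```python
# Expected defect counts at a few defect densities (defects per thousand
# lines of code, or KLOC). The densities are the rough industry figures
# cited above, not measurements.

def expected_bugs(lines_of_code: int, defects_per_kloc: float) -> float:
    """Expected number of defects for a codebase of the given size."""
    return lines_of_code / 1000 * defects_per_kloc

for density in (0.5, 12, 25):
    bugs = expected_bugs(250_000, density)
    print(f"{density:>4} defects/KLOC -> {bugs:,.0f} expected bugs")
```

At 12 defects per KLOC, a 250,000-line program works out to exactly the 3,000 bugs mentioned above.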
And yes, the statement that the number of bugs increases as the number of lines of code increases is fairly obvious; it is a bit like saying "the more you drive your car, the more likely you are to have an accident." Which is completely true, assuming that you remove all other variables. You obviously can’t compare one guy who drives 500 miles per week to a guy who only drives 20, but does so ridiculously drunk. So, let’s just agree that even with very, very low bug rates, doubling your codebase will probably give you a larger number of bugs: you may still have few bugs, but you have more code in which a bug can be introduced. And studies have shown that the number of errors per line of code increases as an application grows in size. Anyone who has had to work on a very large, poorly designed system can attest to this. Can we say feature interaction?
So, how can we mitigate this issue? The obvious way is to write less code! Now, I’m not saying that you should be lazy and start programming less; I am saying that every time you add code to your application you should do so consciously. While you are adding code, be conscious of whether you are repeating functionality, because any time you can reduce the amount of code, you are better off. Just slow down and think about what you are doing. In the software world our schedules can be crazy, and in the haste to get the code written and the application out the door, we can cut short the parts of software development that actually lead to good applications.
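To make "repeating functionality" concrete, here is a contrived before-and-after sketch (every name in it is made up for illustration). The same validation rule written twice is two places for the same bug to live:

```python
# Before: the validation rule is duplicated in every caller, so fixing a
# bug in it means finding and fixing every copy.
def create_user_before(email: str) -> dict:
    if "@" not in email or email != email.strip():
        raise ValueError(f"invalid email: {email}")
    return {"action": "create", "email": email}

def update_user_before(email: str) -> dict:
    if "@" not in email or email != email.strip():
        raise ValueError(f"invalid email: {email}")
    return {"action": "update", "email": email}

# After: one copy of the rule, so one fix covers every caller, and there
# is simply less code for a bug to hide in.
def validate_email(email: str) -> str:
    if "@" not in email or email != email.strip():
        raise ValueError(f"invalid email: {email}")
    return email

def create_user(email: str) -> dict:
    return {"action": "create", "email": validate_email(email)}

def update_user(email: str) -> dict:
    return {"action": "update", "email": validate_email(email)}
```

The "after" version is not just shorter; when the rule inevitably turns out to be wrong, there is exactly one place to change it.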
Another thing you can do is constantly look for code to refactor in order to shorten, simplify, or completely remove it. Constantly ask yourself whether a piece of code needs to be there at all. Refactoring can be troublesome, though, and if you do not have sufficient tests covering your refactorings, you may end up introducing more bugs than you are avoiding.
One last thing you can do is avoid large or complex methods and classes. You may think that bugs are distributed evenly around an application, but that just isn’t true. According to Steve McConnell, one study found that "Eighty percent of the errors are found in 20 percent of a project’s classes or routines" and another found that "Fifty percent of the errors are found in 5 percent of a project’s classes". The larger and more complex a method or routine is, the more likely it is to contain errors, and the harder those errors will be to fix.
So, while introducing better development techniques such as TDD and pair programming to reduce bug counts in software is the first step, you next need to consider the amount of code you have and whether all of it is truly needed. And before you argue that, since TDD and other techniques reduce bug levels, adding more code is less of a concern, just remember that bugs are just as likely in test code as they are in application code.