Which is more important in Software Development?
A colleague recently broached this topic, phrasing it as
"we should all focus our priorities on fixing bugs vs. adding features".
This is wrong enough as a generalization that entire tomes could (and probably have) been
devoted to the subject.
Fixing bugs, or adding features?
The ONLY possible answer is that it depends, and that balance is required.
There are good and valid reasons to focus on either, and these largely depend upon both
the context and circumstances. Either extreme is just plain wrong.
So let's get to the caveats and disclaimers:
Many of these views come from a family tradition of having my last name on the company.
In that situation, it truly is my professional reputation on the line (which affects the
bottom line) if I screw up badly. See the rest of the top-level web site - that was
the family company for many years.
Another significant contributor to these views is based upon the requirements of my
software over the years. Compilers, binary translators, dynamic binary optimizers,
in-kernel drivers, system-wide [libc] interception and interposition. Errors in the first
couple are painful, as user-space applications may misbehave or crash in the hands
of other developers. A user-grade DBO has even higher correctness requirements, in
that novice users may be the consumers. For the last two, an error can easily lead
to complete system failure, up to and including data corruption and requiring a complete
OS reinstall with complete loss of data. On a client machine, that is annoying. It
is completely unacceptable on a mission-critical server.
There will always be a market for high quality products, even if they are a bit more
expensive than the commonly accepted level of quality for that product segment.
The author of the software has the first line of responsibility
for quality. That is, if I wrote it, I have primary responsibility for it. There will be times
when this is not possible due to other resource considerations.
The "last touched" rule generally overrides the above. If I was the last to modify some code,
and a new bug occurs in the area I was modifying or enhancing, it is my responsibility.
Pertinent comments in the code, architectural-level documents, and even
after-the-fact-of-implementation design documents are all useful; my memory isn't
perfect, so recording details of the "tricky bits" is important. Such documents
may save my successors a whole pile of grief.
People who are too lazy (yes, it is merely a matter of discipline unless one is
incompetent) to attempt to keep their code relatively bug-free really need
to be urged by management to seek employment with a competitor. Perpetually
buggy code costs an incredible amount of time and energy to correct.
Not to mention frustration on the part of the janitors who "get" to clean up
after lazy types.
Quality is not something that can ever be "tested in". It can be tested
for, but just running tests does not improve the quality. Testing can only
measure it. If there are problems in the code that do not get fixed, it does
not improve the quality to test it even more.
Quality needs to be a strategic priority once a software product is past V1.0.
Some of the issues to consider (in no specific order) when deciding whether
to focus on fixing bugs or adding features:
An example of a situation where quality was given a very low priority is
this description of one of the
As a counter argument, the blame for the described behavior falls squarely upon the
manager of "return 12".
I have met only one manager (my current manager) who actually promoted someone for
fixing bugs. And in some cases (personal experience, both as an observer and as the
"recipient"), fixing a lot of other people's bugs can get you completely ignored during
financial review cycles.
Adjunct to the above: I have never heard of someone
with the job title of Fellow attaining their position by virtue of fixing bugs.
Unfortunately, it is seldom the case that engineers who write non-trivial bodies of useful,
stable code get recognized for it. They usually get sucked into finding, root-causing,
and/or fixing the bugs in other people's code. And consequently they end up getting
categorized as one of the sloppy programmers merely by association.
I have yet to meet an engineer who rose particularly far up the ranks who had a reputation
for perpetually writing garbage code.
Unless the code base for a product stays small, there will come a time when no single
person understands all of the code, or could even possibly understand all of the code.
If the group culture is too heavily focused on just fixing bugs/doing maintenance work,
it tends to attract "B" and "C" grade engineers, along with
Wally types. You know, people who "managed to get in"
to a company with the sole intent of retiring from there, rather than being
interested in contributing to the current and future success of that company.
The level of importance of fixing bugs does depend upon the type of software involved.
A bug in a shipping device driver that causes a system to crash ('BSOD') is really bad news.
A bug which introduces a security vulnerability ranges from bad to catastrophic, mostly depending
upon how hard it turns out to be to actually exploit it.
A compiler bug which commonly causes incorrect code to be generated is bad.
A seldom encountered bug with an easy workaround, which results in an inconvenience for a very
small percentage of the user base is irrelevant. Except when it is an important customer.
At one point in my career, I worked at an ISV compiler firm. One of our customers required that
we remain bug compatible. In spite of the fact that our implementation did not
behave correctly or conform to the language specification, that customer required that we
did not fix that bug.
The level of importance of fixing bugs depends upon goals and job function.
For a researcher, building a repeatable experiment or a proof of concept is important. Making
the code they used valid as product grade code is not important.
For product grade code, it is important to try to eliminate rough edges.
The enterprise market is much more stringent. Often, applications will be running as
part of a set of business-critical applications, on servers where downtime is not
acceptable. If I am a Fortune 100 corporation looking to deploy a 3rd party application
on my corporate root DNS servers, the amount of qualification of that application is going to
be enormous, ranging from many, many documents to stipulations about service levels, etc.
The consumer market is difficult in other ways. If there is a bug which causes ambiguous or
misunderstood behavior, consumers are going to call the manufacturer. In droves.
And the volume of support calls can actually turn an otherwise good product into a financial loss.
A typo in the user guide is utterly irrelevant to engineers, but is important to tech writers.
An awkward web-site is of critical import to a web designer, but of minimal importance to
a compiler developer.
The level of importance of fixing bugs depends upon what stage a product is at.
In an early-stage startup, if a bug does not prevent a prototype from successfully
functioning as a demonstrator, it is largely irrelevant.
In a V1.0 release for a new type of good or service, rough edges (bugs) are acceptable.
Possibly even up to severe limitations. Early adopters, and all that.
At the other end of the spectrum, it is reasonable to expect that there will be only
a very few bugs in a V9.0 release of a product, and that none of those will be much more
than minor annoyances.
Within a development/release cycle, it can be counter-productive to have bugs take
automatic, instant priority. Bugs are interruptive beasts, so balance is needed.
If there is a chunk of code undergoing a complete rewrite for the current release,
is there a need to fix a bug found in that section of code in the current code base?
Bugs are interrupts if priority is given to fixing every bug immediately.
Always fixing all bugs first is a sure way to have a zero defect product with
a stagnant or irrelevant feature set. If a product has reached the point where
no more new features are being added, this might be acceptable. But if it has
already reached this point, it is likely to have been passed on to another set
of people than those who originally wrote it.
Ignoring bugs until the very end of a release cycle is extraordinarily risky.
It can be indicative of a broken development culture, where writing garbage code
is actually being rewarded. Which in turn may have been driven by someone
creating unrealistic goals and/or having unrealistic expectations.
Or it could be the case that there simply is not any interest in creating a quality product.
Not all bugs are equal in difficulty to solve. A high priority bug may be simple
to fix, but a lower priority bug may require reworking an entire subsystem.
It may be better for the product and company to add a new feature and ignore a
low priority bug.
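The trade-off above can be made concrete as "pain removed per unit effort". A minimal sketch follows; the field names, weights, and sample numbers are all hypothetical assumptions for illustration, not a standard triage formula:

```python
# Hypothetical triage sketch: rank work items (bugs or features) by
# impact per unit of effort. All fields and values are illustrative.
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    users_affected: float   # fraction of the user base hit (0.0 - 1.0)
    severity: float         # 1 = cosmetic ... 10 = data loss / crash
    effort_days: float      # estimated cost to fix or implement

    def score(self) -> float:
        # Expected pain removed (or value added) per day of work.
        return (self.users_affected * self.severity) / self.effort_days

items = [
    WorkItem("high-priority bug, easy fix", 0.40, 7, 1),
    WorkItem("low-priority bug, subsystem rework", 0.02, 3, 30),
    WorkItem("new feature", 0.25, 5, 10),
]

# Highest score first: the easy high-impact fix wins, and the new
# feature outranks the low-priority bug that demands a rework.
for item in sorted(items, key=WorkItem.score, reverse=True):
    print(f"{item.score():6.3f}  {item.name}")
```

A real triage decision has inputs (customer commitments, security exposure, release stage) that no single score captures, but even a crude ratio like this makes it obvious why the subsystem rework can reasonably lose to a new feature.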
The mindset for bug-fixing is somewhat different than that required when a
developer is figuring out how to add a new feature. There is even a case to
be made that they require significantly different skill sets.
At a certain level of complexity and size of the code base, it becomes
impossible to fix all of the bugs in a product. Even attempting to do so
would consume 100% of the developers' time.