Mar 29, 2011

Excuse #1 - Writing tests takes too much time

This is the most common excuse in the "Testing: Why Bother?" series, so I'm going to address it first.
It turned out to be a pretty long post (and only the second of several to come), but I really think that if you read it all the way through, it will give you some new insights.

So... are you claiming testing doesn't take time?
Well... Of course I'm not. Testing is not free, and you don't HAVE to write automated tests in order to write code that eventually works. But is it really not cost-effective?

When I demonstrate unit-testing or TDD to developers new to the field, I frequently get reactions such as "well, that's nice, but I could have implemented the same code much faster without tests."

This statement might be true, but there are several reasons why I think testing actually saves you time:

  • You don't have to start thinking about how to verify that your code works (you already did this)
  • You are already past the debugging phase
  • You end up with automated tests that can be run again later on

There's also the concept of improving the design and usability of your code by testing it, which may save you even more time in the future, but that's something I'm going to talk about more in the next posts of this series.

So let's dig in and elaborate on each of these points.


Your code isn't finished until you have verified it works!
I have seen many reckless developers forget this rule. It's sometimes so easy to figure out exactly where the piece of code you have to change is, and how you have to change it. You simply change the code quickly and commit it, pretty confident that your fix doesn't affect anything else and works beautifully.

So many times I've seen this happen, and so many times I've found myself covering up for these nasty untested fixes of my irresponsible teammates. Dude! Test your fix before you "mark as done"! Why do I need to rethink what you did 4 months ago, now that you're on the beach in Thailand?

When you write tests along with your code, you make sure you get the verification you need to gain confidence in your code and save yourself and your teammates the pain of dealing with this later.


Don't forget debugging as a part of the equation. 
I have worked in teams that did not test the code during development, and the story was always the same: we allocated way too little slack time for the next release, added more and more content with little time to test and debug it, and always ended up either blowing the deadline or working impossible hours to make it on time.

Continuous testing during the development period, preferably TDD, will significantly shorten the debugging time, lower the cost of fixes, and help you make your deadlines.
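
To make "continuous testing, preferably TDD" a bit more concrete, here's a minimal sketch of one red-green cycle. The ShippingCost class and the free-shipping rule are made up purely for illustration, and the test uses JUnit 4; this isn't from any real project, just the general shape of the workflow.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Step 1 (red): state the next small requirement as a failing test.
public class ShippingCostTest {

    @Test
    public void ordersAtOrAboveThresholdShipForFree() {
        assertEquals(0.0, ShippingCost.forOrderTotal(150.0), 0.001);
    }

    @Test
    public void smallOrdersPayFlatShippingFee() {
        assertEquals(9.90, ShippingCost.forOrderTotal(30.0), 0.001);
    }
}

// Step 2 (green): write just enough code to make the tests pass, then refactor.
class ShippingCost {
    static double forOrderTotal(double total) {
        return total >= 100.0 ? 0.0 : 9.90;
    }
}

The point isn't the toy logic, of course - it's that every few minutes you get a cheap, immediate answer to "does this still work?", instead of saving all the debugging for the end.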

You've probably heard already that the earlier a bug is discovered, the easier and cheaper it is to fix. Take a look at this excellent post (and chart) as an example.

Well, when you test your code immediately as you write it, you discover a lot of nasty bugs early that would otherwise result in much higher fixing costs.

So can't I just test the code manually? Isn't that good enough?
Developers who are less reckless than the ones I mentioned earlier will at least run a manual test to make sure they got the bugfix right. They'll write a main that uses their new code, or at least run the entire system and make sure the cool new feature actually works.

Well... If you took the effort to write that main or run the entire system as a manual test, why not make better use of this time?
Instead - without putting in a much greater amount of time and effort - write an automated, reusable test that can be used for regression!
This way, you will save the future maintainer (mostly, the future you) the pain of testing manually again next time, and make sure the bug you fixed is not only fixed now, but stays fixed for as long as the regression tests keep running.
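
To show how small the gap really is, here's a minimal sketch of that conversion, assuming JUnit 4 and a hypothetical PriceCalculator class (both the class and the scenario are made up for illustration). The throwaway main in the comment is the typical manual check; the test below captures the same check so it can be re-run forever.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

// The throwaway manual check:
//
//     public static void main(String[] args) {
//         PriceCalculator calc = new PriceCalculator();
//         System.out.println(calc.totalWithDiscount(100.0, 0.1)); // "yep, looks right"
//     }
//
// Roughly the same effort, captured as a reusable regression test:
public class PriceCalculatorTest {

    @Test
    public void appliesDiscountToTotal() {
        PriceCalculator calc = new PriceCalculator();
        // The expectation that used to live only in the developer's head
        assertEquals(90.0, calc.totalWithDiscount(100.0, 0.1), 0.001);
    }
}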

An automated regression test suite will help you gain confidence in the code, allow you to take responsibility for it, and eliminate the fear factor.

Fear Factor? What do you mean by that?
Consider the following story from a close friend of mine:
When he was first assigned to work on a legacy, untested project, he was browsing the code to understand how it works and came across the following piece of code:

for (int i = 0; i < size; j++) {
    // do stuff
}

Wow! Noticed that sneaky j there? It's obvious that something is really wrong here. This is 99% NOT the original intention of the developer.
But what to do? This was code from a production system that seemed to do its job just fine. How could he possibly know whether this was a bug that needed to be fixed, or somehow essential to why the code actually works?

If he had a test suite he could run after changing that j to an i, he would happily make the change! But without one, it's plain scary!
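
As a rough idea of what such a safety net looks like, here's a minimal sketch assuming (purely for illustration) that the loop lives inside a hypothetical ReportBuilder.buildLines() method. A handful of JUnit tests like this one, pinning down the behavior visible from the outside, would let him flip that j to an i and know within seconds whether anything actually depended on the old code.

import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class ReportBuilderTest {

    @Test
    public void buildsOneLinePerInputRecord() {
        ReportBuilder builder = new ReportBuilder();
        // Pin the externally visible behavior of the code around the loop,
        // so a change inside it (j -> i) is verified just by re-running the suite.
        assertEquals(3, builder.buildLines(new String[] {"a", "b", "c"}).size());
    }
}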

The X lines of code misconception
Another common misconception about testing is a statement such as: "If a developer writes X lines of code a day, and you want him to spend half of his time and code on tests, he'll go twice as slow writing production code."
Sounds like plain common sense, right? Absolutely not!
Well, I won't write too much about this one, because much has been said on the subject before me. Here's a great post by GeePawHill about TDD + Pairing, explaining why typing code is NOT the bottleneck.

OK, I'm convinced... But my manager isn't :(
When your manager says "Testing is nice, but you have better things to do with the little time you have", it can be a real bummer - I would know. But not all is lost!

Try not to present "adding tests to our untested code" as a mission in itself. This will almost always get negative reactions from management, and won't be seen as a reasonable justification for postponing development of the next feature by a few days or weeks.

Instead, just write tests as part of the time frame allocated to you to work on your regular tasks! No one can tell you not to write tests along with your code, as long as you still make it to the deadline. Heck, when they see how much faster you respond to changes when you have tests - your managers will probably change their negative view and start encouraging you!

A good example of this is how I justified increasing the code coverage of a legacy project I inherited (let's call it Project A).
One of my first tasks in this project was updating the version of a dependency project (let's call it Project B). Project B had a bunch of important bugfixes that were relevant for A's next maintenance release, but it also had some major API and behavior changes. I couldn't possibly know what was broken without testing the code heavily, so we allocated time for a task called "upgrading Project B to version 1.5", about 90% of which was actually spent adding tests to Project A to see what was broken. I ended up writing about 100 new tests (both unit and integration), catching some nasty bugs we might otherwise have found only after the release reached the customers.



That's it for today... What's next?
Whew! That was long...
I hope you learned some new answers to why tests are NOT a waste of time. I can think of more points to make here, but this post is already way too long :)
In the next post (which will hopefully be shorter), we'll tackle excuse #2, "writing tests is too hard".
The "too hard" excuse is actually closely related to the "takes too much time" one we discussed here, for obvious reasons: the harder it is to test the code, the more time it will take - eventually to the point where it's no longer cost-effective.



Want to know why I think differently? Subscribe to this blog, follow me on twitter, and you'll know as soon as I publish the next one :)

12 comments:

  1. I hope for some tutorial on writing tests for beginners: what to look for, what should be tested, and how the code needs to look for testing to be most efficient (object-oriented programming, like pulling some part of the code out of a function, because if there is too much code you cannot find a bug easily since the function is too complex; in PHP, for example, I write an action in an MVC model that is a pretty long function - you could point me somewhere where I can see how to break such an action into functions that can be tested efficiently). Another thing is how to test functions that use a database, or just process database results (in PHP most of the functions I write use a database).
    I just would like to read about how to test before I actually start doing it.

  2. "Your code isn't finished until you have verified it works!"
    -> This is true, but unit tests are not the answer for that; formal proof is. Unit testing is OK when done correctly, but it often gives a false sense of security to managers and other programmers. Unfortunately, writing unit tests 'takes too much time'; formal proofs are outside the capabilities of the people who call themselves programmers (the copy/paste generation), and they do take time. As they should; writing good code takes a long time; actually, most people are not even capable of writing good code. They shouldn't be writing code at all, with or without tests.

  3. TDD does take too much time. You should probably look into BDD if you think TDD is fast. Really, TDD isn't fast, and it doesn't "save" time except in regression - but then you would have "saved" even more if you had done BDD from the start.

  4. Thanks for the comments.

    @tom610, I'm not sure all of this is going to be in the next post, because a deep testing tutorial is a bit out of scope, but I'll try to give references for good tutorials.

    About BDD and other methods of testing your code: something to notice is that, though my examples are mostly from the world of unit tests and TDD, I do not claim it's the only way to go (read the introduction post). The important thing is to make sure your code works as you write it, and to have an automated regression suite to make sure it stays that way.

  5. I think you've missed what is, in my opinion, the worst developer: the one who sees unit testing as just a chore that gets in the way, and so writes a few useless tests that don't actually test anything - or even worse (and I have seen this), writes tests that always pass by having one assertion statement that will always, no matter what, assert to true.

    These kinds of people are worse than the guys who don't test, as you are left with unreliable tests that can't be used when you come to fix the bugs left by the developer. At least when there are no tests you don't waste half the day working with crap tests. Bad tests are just as bad, if not worse, than no tests, and they are produced by someone with the attitude that unit testing is a chore they have to complete before they are allowed to do the exciting development.

    I view the unit testing phase as development - the best way to write code in my opinion. It has saved me many a sleepless night when my code first goes live.

  6. As a developer who has been on both sides of the fence here: I think the key point about testing that is often missed is the fact that writing clean, maintainable production code is the goal. Depending on the situation, a unit test can be an extremely easy, effective solution to maintaining critical code. And sometimes, the complexity of the automated test outweighs the purpose of writing it.

    The correct answer is usually to write tests where it makes sense to write tests, not to write tests for everything. If you're writing a critical accounting feature or some other piece of logic used generically, by all means go to town with the tests and do it well. If you're writing a user interface that is more easily (and correctly) tested by a user testing it - write up the documentation on how to test it instead of the automated tests.

    Too often I see people so firmly entrenched on either side of the fence that they fail to see we all have the same goal: solid products that don't make suffering the primary part of maintaining them.

    Here's where you got it right:
    Your code isn't finished until you have verified it works!

  7. If you can unit test something quickly enough to warrant using TDD - judged from a normal attention-deficit hyperactive managerial point of view - it is most of the time such trivial code that an experienced developer wouldn't need the tests in the first place :)

  8. good post johnny.. :-)
