Welcome to the 5th excuse in the Testing: Why Bother? series:
"We refactor our code so frequently, that the time we invest in tests just isn't worth it - they are going to change and be irrelevant anyhow"
This excuse is one of my favorites, because it's so ironic, yet surprisingly quite common.
You are constantly refactoring your code? Without tests?? How can you be certain you didn't break anything???
As I mentioned in previous posts, the number one reason for automated tests, as I see it, is REGRESSION. They give the programmer the confidence to change a piece of code without being afraid of possible side-effects, because if a change introduces new bugs - good regression tests will immediately find them.
I totally agree that constantly improving your code's quality and design is a very important part of developing dynamic software that responds well to changes in requirements. But it's that simple - you should NEVER refactor without tests to back you up.
Quoting from Martin Fowler & Kent Beck's great book Refactoring: Improving the Design of Existing Code: "If you want to refactor, the essential precondition is having solid tests."
Furthermore, if you use a refactoring-friendly IDE, performing simple refactoring operations such as rename, change method signature, etc. will be almost painless, as the IDE will make the necessary changes in your tests' code as well.
"Yeah but... Still... Every time I change my code I feel like I have to re-write all my tests, and then the ROI of testing vs. debugging seems to not be worthwhile any more".
Well... If you are talking from actual experience, and these are more than just fears, you may be "doing it wrong", and should ask yourself the following questions:
1) Are my tests too low-level?
Tests that you feel you have to maintain heavily and change for every little change in production code often indicate a code smell.
Did a test pass while your implementation used a for loop, but fail when you replaced the loop with recursion? That means you were testing at too low a level.
If a refactoring step caused a test to fail even though that test shouldn't care about the inner implementation, but only about the final outcome, try re-writing the test to use your code at a higher level (e.g. test the public method and not the "helper" that should only be used internally), for example:
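A rough illustration (the names here are made up for the example): a test that pins down an internal helper breaks as soon as the helper disappears, while a test against the public behavior survives the change from a loop to recursion.

```python
# Hypothetical example: sum_digits() is the public behavior we care about;
# how it iterates over the digits is an implementation detail.
def sum_digits(n: int) -> int:
    """Return the sum of the decimal digits of a non-negative integer."""
    total = 0
    while n > 0:
        total += n % 10
        n //= 10
    return total

# Too low-level: a test like this knows about an internal helper that may
# simply disappear when the loop is replaced with recursion.
#   def test_digits_helper():
#       assert _split_into_digits(123) == [3, 2, 1]

# Higher level: this test only checks the public outcome, so it keeps
# passing whether the implementation uses a loop, recursion, or str(n).
def test_sum_digits():
    assert sum_digits(0) == 0
    assert sum_digits(123) == 6
    assert sum_digits(999) == 27
```

The commented-out helper test is the kind you end up rewriting on every refactoring; the public-level one you don't.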
2) Do I have irrelevant data in my tests?
This is not limited to refactoring: a very common cause of high-maintenance test code is tests that fail because you changed the value of data that should be irrelevant to the functionality you are checking.
I highly recommend the following post by @jbrains for further information about what exactly the problem is and how to deal with it.
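The gist, as a rough sketch (all names here are hypothetical): keep only the data that matters inside the test, and make everything else visibly irrelevant, so changing an unrelated default never breaks tests that don't care about it.

```python
from dataclasses import dataclass

@dataclass
class Customer:
    name: str
    email: str
    price: float

def apply_discount(customer: Customer, percent: float) -> float:
    """Return the customer's price after a percentage discount."""
    return customer.price * (1 - percent / 100)

# Shared helper: only the data a test cares about is passed in; the rest is
# explicitly marked as irrelevant.
def make_customer(price: float) -> Customer:
    return Customer(name="IRRELEVANT", email="irrelevant@example.com", price=price)

# Fragile style (avoid): every field is spelled out inline, so if the
# "standard" test name or email ever changes, tests that never cared about
# those values have to be edited too.
def test_discount_fragile():
    c = Customer(name="John Smith", email="john@example.com", price=200.0)
    assert apply_discount(c, 50) == 100.0

# Focused style: only the relevant data (the price) appears in the test.
def test_discount_focused():
    c = make_customer(price=200.0)
    assert apply_discount(c, 50) == 100.0
```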
3) Are my refactoring steps too big? (Is it actually a "rewrite"?)
I have often heard the frequent-refactoring excuse refer to frequent "rewrites". Rewrites, as opposed to refactoring, are something I suffered a lot of pain from in the past.
Instead of working your way one small change at a time, you decide to "throw all this subsystem's code away and start over".
This usually turns out to be an adventure you regret every moment of getting into: all the old bugs (and many new ones) you encountered while developing the subsystem in the first place come back, and slowly but steadily, the spaghetti situation in your code reoccurs, until you decide to have another rewrite - which starts the same cycle all over again.
I don't believe in rewriting working software. I believe in refactoring in small steps, making sure nothing was broken after each step by running all the tests. This way, the pain of maintaining the tests becomes a piece of cake, as you know exactly what was changed in the latest step and can deal with it.
If you don't have tests - well... I would start by adding them, making the minimal changes in production code needed to make it testable, and only then getting into the whole refactoring process (well... that's the point of this whole post). A minimal "make it testable first" change might look like the sketch below.
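A toy sketch, with hypothetical names, of one such minimal change: turning a hidden dependency (the clock) into a parameter so the existing behavior can be pinned down by tests before the real refactoring starts.

```python
import datetime

# Legacy shape: the current date is read inside the function, so the expiry
# rule can't be tested deterministically.
def is_expired_legacy(expiry):
    return datetime.date.today() > expiry

# Minimal change: the current date becomes an optional parameter, so existing
# production callers don't have to change at all.
def is_expired(expiry, today=None):
    today = today or datetime.date.today()
    return today > expiry

# With the behavior pinned down by tests, bigger refactoring steps can follow.
def test_is_expired():
    assert is_expired(datetime.date(2010, 1, 1), today=datetime.date(2010, 1, 2))
    assert not is_expired(datetime.date(2010, 1, 2), today=datetime.date(2010, 1, 1))
```

The default argument is what keeps the change minimal: production callers stay untouched while the tests get a deterministic seam.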
EDIT: actually, this is a bit harsh; in some cases a rewrite may be the right thing to do. Read Nimi's comment and my response to it.
In conclusion
Tests should be a prerequisite for any refactoring adventure, so this excuse is full of irony. And if you really feel the pain of maintaining your tests while refactoring, not writing tests is definitely not the solution.
Another question this excuse raises is why you feel you are frequently changing everything in your code in the first place. Should you develop a better mechanism for responding to changes in requirements? Or maybe you don't invest enough time in ensuring code quality the first time around? Maybe TDD would help?
Think about it...
In the meantime, make sure you read all the previous excuses in the series, stay tuned for more, and follow me on Twitter.
Would be great if there was a way to subscribe to RSS. I might be blind.
http://codesheriff.blogspot.com//feeds/posts/default
Works - I guess it's a Blogspot thing. I'm used to http://codesheriff.blogspot.com/rss/
http://blog.thecodewhisperer.com/2010/01/14/what-your-tests-dont-need-to-know-will-hurt-you%20
is a broken link.
Just remove the trailing URL-encoded space character and the link works:
like this
Thanks, fixed the link.
Great post, but I disagree with your last argument, about "never" doing a rewrite.
Specifically, think of an (already written) component that can redefine the meaning of spaghetti code for you: 10-20k LOC, written in C, doing low-level thingies, with no corresponding component tests and certainly no unit tests.
The pain of making the component testable is probably tenfold the pain of re-writing it TDD-style, and if you re-write you also get the new clean code.
I guess that advice is only valid for either
a) code that already has (some) tests
b) situations where it's easy to add tests (maybe managed languages, like python with the mock package, enable testing even in such situations)
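(Roughly the kind of thing I mean - a toy sketch with made-up names, using what is now unittest.mock, to test the logic around a hard-wired low-level call without touching it:)

```python
from unittest import mock  # the standalone "mock" package later became unittest.mock

# Imagine this call sits deep inside an untested legacy module and talks to
# hardware (or C code) in real life.
def read_sensor():
    raise RuntimeError("talks to hardware in real life")

def sensor_status():
    value = read_sensor()
    return "OK" if value < 100 else "OVERHEAT"

# The surrounding logic can still be tested by patching the low-level call.
def test_sensor_status_overheat():
    with mock.patch(__name__ + ".read_sensor", return_value=150):
        assert sensor_status() == "OVERHEAT"
```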
Hey Nimi, thanks for the comment!
I have to agree with you - I feel I was too harsh about NEVER doing a rewrite.
In situations like you mentioned, rewriting the component might actually be the right thing to do. However, it's something you should do only ONCE.
In other words, rewrites cannot really be an excuse for not writing tests, only a measure to get your code to a state you can refactor it step-by-step in future changes.
Programming is about solving problems. Solving problems is linked to cognitive performance.
Where do you think cognitive performance comes from? I can tell you for sure that it doesn't come from loads of safety arrangements.
As a programmer you need to always challenge yourself in order to, A) keep your current skill level and B) improve and acquire new skills.
Would you learn to walk if someone gave you a wheelchair as a kid? Sure you would have the confidence to move around and not be afraid to fall and hurt yourself, like that would be what life is about....
Sure, some safety equipment is good, like a helmet when riding a motorcycle. But I'm sorry to see that you're another victim of the crowd that just doesn't see the difference.
Tomas, if I understand correctly, you suggest that relying on tests hurts your ability to write good code without them.
Why should that be the goal? Isn't the goal to deliver high quality software and respond to change quickly?
Indeed.
But in order to deliver high quality software you have to have the right tools, and for me highly skilled developers are a better tool than loads of safety arrangements.
My next post in the series will talk about this subject, so stay tuned.
Anyhow, I really don't agree that programmers who rely on tests are less skilled than those who don't write them.
Usually, the opposite is true. You can't really write effective and well crafted unit tests without being a skilled programmer, and if you are not good enough - tests aren't gonna help you anyhow.
Generally agree. But if you are changing tests when refactoring, then you are kidding yourself that you are using the tests to safeguard the refactoring. You can't state that the tests passed before and after the refactoring and thus that the refactoring is good.
Yoni, I agree that writing good tests can be challenging in itself. But it's a narrow challenge, and nowhere near what you face elsewhere in programming.
The idea of getting confidence from tests is just plain wrong. Programming has been around for ages. Sure, there have been lots of bugs as well, but imo these bugs are not introduced through the lack of tests, but through the lack of great leadership.
With the help of tests you're going to be able to deliver a solid product on time. Yes, you read it correctly... a solid product on time.
However, you will most likely fail in developing ground-breaking technology that can compete with technology coming from an excellent team not applying rigorous test principles.
But if the edge of the business doesn't come from technology I agree with you.
I wouldn't want to work there though.
Tomas, thanks for your comments - I think they are very interesting and present an insightful view and not simply a "tests are bad" view I get from other people.
Maybe your experience is different, but from my humble experience in the field:
1) The best programmers I know, and the ones I look up to the most, are the ones who strongly believe in unit tests. I doubt you wouldn't hire someone like Kent Beck for any programming task.
2) 99% of the time software development is about delivering solid products on time and not about developing ground breaking technology.
3) In many of my previous projects, the test code was the most interesting part of the code, applying many complex design decisions and nifty tricks. I don't see a real difference between the challenge of writing test code and the challenge of writing production code. Heck, sometimes your product IS something that helps you write test code (e.g. products like Typemock's Isolator).
- First of all, Chuck Norris doesn't need tests, he writes code with zero bugs in the first place.
- A very good post. Rather than a fanatic 'you must always fix your tests and never complain about it', I liked one of your messages - 'maybe your test code should be improved to allow easier refactoring'. Raises some thoughts.
- Yes, refactoring rather than re-writing! Nimi's comment is very rational, yet I suspect that in most cases people have an exaggerated tendency for rewrites, so your tip DOES make sense usually.
- As for Allanis - http://www.youtube.com/watch?v=nT1TVSTkAXg :)
Oh! Oh! I've certainly heard this excuse at work. And goodness knows, I've experienced some of the pain because I've "done it wrong" myself, in every way you describe. I think it shows that writing simple, effective tests is harder than most people think — especially when you only look at good tests as examples. They're so simple, it's easy to say, "Gosh, that's dumb." But it can take careful thought (and experience from previous bad tests!) to learn the nuances of what to express, and how. Simple is hard.