Sorry about the delay since my last post.
Now that I've freed up some spare time, I hope to get back to the "Testing: Why Bother?" series. So stay tuned.
Anyhow, I wanted to share, through an example, a quick insight about what to do when you need fast feedback on a feature that you find hard to write a good test for.
Some Background
I've been pairing with a buddy of mine on a cool feature of our project for some time now.
The feature is basically a kind of "Device Prettifier": it receives a local device path, performs some inquiries about it, and when its __repr__ (the toString() equivalent) is called, displays all the data it gathered from the inquiries in a pretty way.
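To make that concrete, here's a minimal sketch of the kind of class I mean (the names and the inquirer parameter are illustrative, not our real implementation):

class DevicePrettifier(object):
    def __init__(self, device_path, inquirer):
        self.device_path = device_path
        # the inquirer wraps whatever low-level queries we run against the path
        self.info = inquirer.inquire(device_path)

    def __repr__(self):
        # a pretty, human-readable dump of everything the inquiries returned
        lines = ["Device: %s" % self.device_path]
        lines += ["  %s: %s" % (k, v) for k, v in sorted(self.info.items())]
        return "\n".join(lines)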
So we actually test-drove the DevicePrettifier, unit-testing everything by creating mock devices that return whatever we want the DevicePrettifier to handle. Basic TDD + mocking; it went very smoothly and was a lot of fun.
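Roughly, the kind of unit test I mean looks like this (a sketch using the mock library; the module name and inquiry attributes are assumptions, not our real test):

import unittest
from unittest import mock  # the standalone "mock" package on older Pythons

from device_prettifier import DevicePrettifier  # hypothetical module name


class DevicePrettifierUnitTest(unittest.TestCase):
    def test_repr_contains_inquiry_data(self):
        # tailor-made inquiry response, just like the mocked devices mentioned above
        fake_inquirer = mock.Mock()
        fake_inquirer.inquire.return_value = {"vendor": "ACME", "size_gb": 73}

        prettifier = DevicePrettifier("/dev/sda", fake_inquirer)

        output = repr(prettifier)
        self.assertIn("/dev/sda", output)
        self.assertIn("ACME", output)
        fake_inquirer.inquire.assert_called_once_with("/dev/sda")


if __name__ == "__main__":
    unittest.main()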
The Tricky Part
Unit testing here was not enough for us. We mocked what the inquiries return according to examples we already knew about, and we pretty much tailor-made the responses to the logic we wanted to implement. That's a perfectly reasonable way to implement a feature using TDD. However, we never tested how the DevicePrettifier behaves on a REAL device, on various operating systems and environments.
So... The correct way to go about this is to write an integration test: one that uses real devices and tests how the DevicePrettifier works on them.
The problem: without a proper testing framework that would allow us to automatically allocate hosts and real devices to test on, we would have to put a lot of effort into such an integration test - effort we couldn't really afford, as we needed to release this feature on a tight deadline.
What Could We Do?
We could have made a manual test... But then we would lose the important regression capabilities we gain when making our tests (including integration tests) automatic.
Or, we could have done something a little bit in between: An integration test with no asserts.
Are You Crazy?
"No asserts?? What do you mean? What do you test when you do that? If a human needs to go over the results of the test run - it's not automatic, and it's of no use."
OK, I knew you guys would say that. But actually there are 2 things to notice here:
1. For the purpose of exercising your code for the first time - it's better to write an automatic test with no asserts than a manual main(). This way you have the structure of a test that you can later add asserts to more easily. Also - when you run it for the first time and manually check the output, you can quickly find the problems and write a more specific test for them - or even go back to the unit tests and alter them for that purpose.
2. Even a test with no asserts automatically tests something. It tests that no exceptions are thrown. This is a VERY important thing to test for, and -- let me tell you -- it finds a decent number of bugs!
So, here's the test we made (code almost untouched before uploading):
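(If the embedded gist doesn't show up for you, the link is in the comments below. As a stand-in, here's a rough sketch of what such an assert-free test looks like; the device discovery and the RealInquirer name are illustrative assumptions, not the real code.)

import glob
import logging
import unittest

# hypothetical imports -- RealInquirer stands in for our real inquiry code
from device_prettifier import DevicePrettifier, RealInquirer


class DevicePrettifierIntegrationTest(unittest.TestCase):
    def test_prettify_all_local_devices(self):
        # No asserts on purpose: the test passes as long as no device on this
        # host makes the prettifier raise an exception.
        for device_path in glob.glob("/dev/sd*"):  # Linux-style paths assumed
            prettifier = DevicePrettifier(device_path, RealInquirer())
            logging.info("%s:\n%r", device_path, prettifier)


if __name__ == "__main__":
    unittest.main()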
We simply logged on to a few hosts of different types, ran this test, and it found the most important bugs we had. It may be surprising, but it was really cost-effective!
Now we can add this test to all of our continuous integration slaves, and if, for instance, a new type of device is suddenly connected to one of the slaves -- we'll immediately know if our prettifier couldn't parse the inquiry responses it returned. Coolness!
In Conclusion
The next time you feel like writing a complicated integration test but give up because you just don't have time, think about the "assert free" approach. It might be easier, save you a lot of time, and still find most of the bugs!
Stay tuned for more posts in the "Testing: Why Bother?" series, and follow me on Twitter :).
Thank you. It is a good read about the difficulties beginners experience in their work. Publishing it in blogs like this one probably helps others at similar experience levels ramp up their skills.
Why would you say it's for beginners? Not many people would even use their heads when writing code; they would just "write it the way they are used to doing it", which means:
1. Writing unit tests or not
2. Writing integration tests or not
3. Testing it manually or not
This guy actually used his head and for this specific scenario decided to use a cost-effective way of testing. Testing without asserts is not something I've seen for a long time.
I have to say that I do believe that in Python it's more effective than in Java. You will catch syntax errors and type errors, things that you usually catch at compile time in Java.
Anyway, thanks for the post!
Oh, me likes!
I like your line of thought, and would like to highlight something else.
One thing I found teams doing is using the automation and (sometimes) unit tests for their sprint review.
An additional bonus I spot in your approach is this: come the end of the sprint, show the product owner the output of the assert-free tests for things that cannot be tested against a live environment - well, are hard to test at least. The result of the test is then validated manually.
The product owner gets the comforting notion that the test can be repeated with minimal scope for human error, although it is not fully automatic.
Do you see code after the line:
>> So, here's the test we made (code almost untouched before uploading):
I don't... This is a helpful series. I agree that it's often awkward not to have asserts, but it is valuable.
No small number of my tests have some setup and then:
// crux: don't explode given this setup
doSomething();
You should be able to see the code, but if you can't, it's available at:
https://gist.github.com/900694#file_device_prettifier_test.py