Driven by unit testing

By Ghost on Monday 27 April 2009 22:27 - Comments (21)
Category: Personal, Views: 4.506

For a while now I have been toying with the Test Driven Development methodology. I have to admit I kind of like the idea and find it fun to work that way. Last weekend, I decided to look up more information and books about the subject. To my surprise I found TDD to be a widely discussed and controversial subject.

I think one thing both camps agree on is that automatic unit testing is a good thing. Unit tests provide a way to automatically test most aspects of the code in a short time, offering a form of documentation and some form of safety net for programmers.

Things start getting messy when TDD comes into play. The focus of this method is on writing unit tests first and going from there. In Twelve Benefits of Writing Unit Tests First the blogger describes several benefits of writing tests first. Although the main article does not receive much criticism, its spin-off I Pity The Fool Who Doesn't Write Unit Tests unleashed a torrent of comments that furiously agree or disagree on the subject. Searching a bit further, I found more blogs from both happy users and refusers.

Now I am faced with a problem. Although I am a bit biased towards TDD because I see merit in the suggested benefits, I am not looking forward to investing time in something that will harm me in the future. I think I will continue trying to employ TDD in my current work and the occasional school project until I hit a snag. At least it will train me in writing useful unit tests.


Comments


By Tweakers user PhoenixT, Monday 27 April 2009 23:29

I really like unit tests and try to write them as much as possible. However, TDD (first write unit tests, and after that write the actual code) is something I don't agree with. I just write code as I see fit and then write tests to verify the code works and stays in that state. I don't think such a radical change in development style as writing tests first is worth any possible benefit.

By Tweakers user JeRa, Monday 27 April 2009 23:35

I personally only write unit tests when bugs emerge. I find the idea that I would have to implement my ideas twice (both in the executing class and the unit test) appalling and a waste of time.

Implement once, and if it goes wrong, write a unit test, fix it and use the unit test to confirm the fix and to check every future release. The main problem with TDD is that unit tests written first are foremost useful when writing difficult or very important code, but the unit test is subject to the same coding mistakes by the developer as the actual code itself.

Creating unit tests afterwards is useful because they check for a problem you encountered in the past. When in doubt, you can write multiple unit tests for different classes which could exhibit the same problem.
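As a sketch of this test-afterwards workflow (the class, the method and the bug itself are all invented for illustration): the exact input from a past bug report is pinned down in a test, together with neighbouring cases that could exhibit the same problem.

```java
// Hypothetical regression test for a bug fixed in the past: leap-day
// dates were once rejected by this validator. The test pins the fix so
// every future release re-checks it. All names here are invented.
public class DateValidatorRegressionTest {
    // Code under test: true if day/month/year form a real calendar date.
    static boolean isValidDate(int day, int month, int year) {
        try {
            java.time.LocalDate.of(year, month, day);
            return true;
        } catch (java.time.DateTimeException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // The exact input that triggered the original bug report:
        if (!isValidDate(29, 2, 2000)) throw new AssertionError("leap-day bug regressed");
        // Neighbouring cases that could exhibit the same problem:
        if (isValidDate(29, 2, 1900)) throw new AssertionError("1900 is not a leap year");
        if (isValidDate(31, 4, 2009)) throw new AssertionError("April has 30 days");
        System.out.println("regression checks passed");
    }
}
```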

Creating unit tests before the actual development starts can be useful when developing in a team - they can assure you that the code the other members have written does what it's supposed to do.

By Tweakers user ahbruinsma, Monday 27 April 2009 23:36

I find unit tests really useful when you're refactoring. Before refactoring a piece of code you know what the result has to be. With that knowledge you can write tests, refactor your code and make sure the result stays the same, using those same unit tests.
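A minimal sketch of that safety net, with all names invented for illustration: the expected results are recorded in assertions before refactoring, and the same assertions must still pass after the loop is replaced by a multiplication.

```java
public class PriceCalculator {
    // Pre-refactoring implementation: sums in a loop. The safety-net
    // assertions below capture its behaviour, so the loop can later be
    // replaced by unitCents * qty while the checks must stay green.
    static int totalCents(int unitCents, int qty) {
        int total = 0;
        for (int i = 0; i < qty; i++) {
            total += unitCents;
        }
        return total;
    }

    public static void main(String[] args) {
        // Expected results written down BEFORE refactoring:
        if (totalCents(250, 4) != 1000) throw new AssertionError("4 items");
        if (totalCents(250, 0) != 0) throw new AssertionError("empty order");
        System.out.println("behaviour preserved");
    }
}
```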

By Tweakers user JeRa, Monday 27 April 2009 23:39

And another thing: writing unit tests up front is very useful when developing an important application and you have lots of time to do so. But when you're developing commercial applications, time is often very limited and fixing bugs afterwards is the way to go. Release schedules are often more important for companies than a few bugs that may arise.

By Tweakers user RobIII, Tuesday 28 April 2009 00:05

And another thing: writing unit tests up front is very useful when developing an important application and you have lots of time to do so.
You're partially or even completely responsible for that yourself. When a project manager approaches you and asks for your estimate of how much time you're going to need to complete the code, you need to factor that in. If you don't, you have only yourself to blame.
But when you're developing commercial applications, time is often very limited and fixing bugs afterwards is the way to go. Release schedules are often more important for companies than a few bugs that may arise.
And again you, as a developer, are responsible for this yourself. The company can make up all the 'release schedules' they want, but when the developer(s) stand up and tell them it's not possible to meet those deadlines, they will be forced to push the deadline(s) back. All too often I see developers go "oh; okay" out of fear, so they work their asses off only to release buggy software; after release nobody wants to support it, and the plethora of bugs that were introduced because of the insane deadlines will punish you even harder.

Fixing bugs afterwards, as you are suggesting, is clearly not the way to go. Fixing a simple bug may be just correcting a spelling error or some misaligned control on a form, but a bug can just as easily be (and in practice will be) the one that bites you. It will screw up financial data, delete customers, open the application to 1001 exploits and generally leave the customer with a bad taste in his mouth. This will affect you, the developer, because your boss comes storming in (or calls you up in the middle of the night); it will cause "quick fixes" that introduce new bugs and worsen the matter; it will affect your boss and the company, who have to explain everything to the customer; and it will affect the customer, who won't be willing to do another project with your company, etc.
Sure, bugs are inevitable, but you should always strive for the highest quality possible. Even when budgets are low or deadlines are tight, ensure you inform all involved parties and work something out that does work in the long run (be it assigning more developers, pushing back deadlines or dropping requirements in favor of the quality of the project, etc.).

Whatever you do: don't turn your company into a sweatshop of developers working overtime day and night; this will ultimately lower the overall quality of produced code and products. Sure, nobody ever got hurt by a final sprint or a week (or 2, hell, even 3 or more) of working like crazy. Just don't turn it into a habit.

I just want to add my $0.02 and note that I think what you said is completely backward (though, sad as it may be, reality shows masses of developers live this way).

[Comment edited on Tuesday 28 April 2009 09:10]


By Tweakers user RobIII, Tuesday 28 April 2009 00:17

Oh, forgot to go on-topic :P
Anyhow, on the matter of which methodology to use: I think this is a subjective matter, and to me it only unleashes the same kind of discussions as Mac vs. PC, Windows vs. Linux, Ajax vs. PSV. Who gives a rat's ass :? :+
And that's a shame, because any tests are better than zero tests.
If you're in a company that uses either methodology, adapt and don't be such a whiner (or convince them your way is better). If you're developing on your own, use whatever works for you. And if you're in a company that uses neither methodology, quit the job or get them to pick one and enforce it :)

By Tweakers user JeRa, Tuesday 28 April 2009 00:33

You're partially or even completely responsible for that yourself. When a project manager approaches you and asks for your estimate of how much time you're going to need to complete the code, you need to factor that in. If you don't, you have only yourself to blame.
If I were to offer you a lot of money to develop an application which must be, for a number of external reasons, completed in 3 months, you can go ahead and create your unit tests. I would choose the company which can promise me a working application in 3 months, not one which will spend a month producing unit tests based on a design, which will reduce the number of bugs by 50% but leave the other 50% nonetheless.

This is, of course, highly dependent on the nature of the application and the company. But theory and reality differ greatly in this respect.
Sure, bugs are inevitable, but you should always strive for the highest quality possible. Even when budgets are low or deadlines are tight, ensure you inform all involved parties and work something out that does work in the long run (be it assigning more developers, pushing back deadlines or dropping requirements in favor of the quality of the project, etc.).
There are a lot of ways you can improve the quality of your application; I just think TDD is not the way to go. When we develop applications, we strive to never make the same coding mistake again by generalizing the problem and implementing safeguards in our code framework, or by adjusting our modeling and generating tools to recognize and eliminate the problem. This way, both the fastest and slowest developers can work at their own speed without being impeded by writing unit tests. Again, this works best for us. Take a large developer like Cap Gemini and you'll quickly end up TDD'ing.
Whatever you do: don't turn your company into a sweatshop of developers working overtime day and night; this will ultimately lower the overall quality of produced code and products. Sure, nobody ever got hurt by a final sprint or a week (or 2, hell, even 3 or more) of working like crazy. Just don't turn it into a habit.
That's exactly the behaviour I observed when overseeing a TDD-based project a few years ago. The quality of the unit tests is generally comparable to the quality of the actual code, which means that the developers were devoting a significant portion of their time to fixing the tests. The overhead of learning how to handle TDD properly, plus the extra time invested by all developers to develop, fix and adhere to the unit tests, will always be too large to be useful to us. We just write great, self-testing code :)

By Tweakers user Kama, Tuesday 28 April 2009 00:51

JeRa, I'm a bit surprised by your reaction. Especially in a commercial environment it's extremely important to find defects before the customer does. Solving a defect that the customer found is exponentially more costly than solving it before even compiling (e.g. by code inspection). When you write a test for a defect already found, you're too late. You will prevent regression, but that's it.

Writing unit tests beforehand is kind of nice... But only when done right: unit tests (and other kinds of tests) tend to be technical tests and not necessarily functional ones. So even though a test may technically pass, there is no way to make sure that the application really does what it is supposed to do. You may be able to verify the code, but you will not validate it.
The key thing with testing is that you want to know what the component is supposed to do, without having to look at the delivered code first. In other words: you should not change the expected result based on the way the functionality is implemented. TDD supports this in a way. It should help you focus on "What's the code supposed to do?" instead of "Is my code technically sound?".
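A small sketch of that focus (the spec and all names are invented for illustration): the expected results below come from a written requirement - 10% off orders above 100.00 - and not from reading the implementation first.

```java
public class DiscountSpecTest {
    // Implementation under test. The spec here is invented: orders
    // strictly above 100.00 (10 000 cents) get a 10% discount;
    // everything else is unchanged. Amounts are in cents to keep the
    // arithmetic exact.
    static long priceAfterDiscountCents(long cents) {
        return cents > 10_000 ? cents * 90 / 100 : cents;
    }

    public static void main(String[] args) {
        // Expected results are taken from the spec, not from the code:
        if (priceAfterDiscountCents(20_000) != 18_000) throw new AssertionError("10% off expected");
        // Boundary case the spec defines: exactly 100.00 gets no discount.
        if (priceAfterDiscountCents(10_000) != 10_000) throw new AssertionError("no discount at boundary");
        System.out.println("matches the specification");
    }
}
```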

Although that sounds kind of nice, I personally believe that you should not use tests to find out what the application should be doing. You think about that beforehand and base your design and tests on it (at the same time!). This way you can already validate your design before writing a line of code.

This way of testing is a common methodology; it's called requirements-based testing. You basically write down what the application should (functionally) do and then start designing and implementing the code. In that way it kind of supports TDD... It even helps it: it's easier to create good unit tests to test your written code with. It's just not really "test-driven" development... It's "requirement-driven" development.

By Tweakers user JeRa, Tuesday 28 April 2009 01:06

JeRa, I'm a bit surprised by your reaction. Especially in a commercial environment it's extremely important to find defects before the customer does. Solving a defect that the customer found is exponentially more costly than solving it before even compiling (e.g. by code inspection). When you write a test for a defect already found, you're too late. You will prevent regression, but that's it.
It is important to remember that a 'commercial environment' does not equal 'bad developers'. In my opinion and experience, unit tests are slowing down fast developers and provide a safe haven for bad developers. We want fast, but not bad... you should be able to fill in the blanks. This gets rid of a major reason for us to write unit tests first.

Secondly, I agree with you entirely on the 'finding defects before the customer does' thing. It's extremely important, but what's even more important is that you actually have a working application on time. We take measures (which cost us far less time than writing unit tests) to ensure that whatever happens, we can find and solve the problem within minutes - e.g. fully automatic error reports, a mutation log of all the data, et cetera.

But to say anything really useful on this subject, you would have to get some statistics about bugs in applications and compare applications built with and without TDD. Some key issues would be bug severity, development time and whether or not a bug would have been found without TDD (which would indicate wasted development time). Applications have bugs, there's no denying that, but I think there are better ways of preventing bugs than time-expensive TDD.

By Tweakers user RobIII, Tuesday 28 April 2009 01:23

I tend to work by this motto when on my own... :)
Secondly, I agree with you entirely on the 'finding defects before the customer does' thing. It's extremely important, but what's even more important is that you actually have a working application on time.
Again, this is backwards. You should have enough time in the first place to create your working application. If you don't (have enough time), then you have a flaw in your project management and/or planning. It is not more important that you have a "working application on time", because you can't have "a working application on time" if you're rushing the job. It's important that you have a "working application", period. And it will be on time when planning/management was done correctly. What you are preaching is "whatever happens, never miss a deadline and/or beat all competitors to it, and screw quality; we'll fix bugs when we see them or the customer reports them".

[Comment edited on Tuesday 28 April 2009 09:11]


By Tweakers user Kama, Tuesday 28 April 2009 01:33

JeRa: First: testing is not there to make projects take longer or to show your customer that you're doing quality control. Testing is there to improve the quality of your product. In other words: Without testing you will never be able to deliver your product at the desired quality level in time!

You call it "Deliver "working" software in time", but what's "working"? Do you think the same about that as your customer? Do you know what your customer thinks? That's what testing is also for!

The need for testing is not related to bad programmers. All developers make mistakes, even the good ones. Mistakes can vary from problems in the code to simply implementing the wrong functionality. Writing "sound code" is one thing (your compiler does that test for you); delivering the correct functionality is a whole different thing.

I agree that TDD probably isn't the most effective way to accomplish that, but at least it's a way to force you to think about what your code should do before writing a line of code. You won't find those problems by running code analysis or doing inspection.

My detection rate estimates for functional defects:
- Requirements-based: 90%+ catch rate
- TDD: 40% catch rate
- Code inspection / regular unit tests: 0-5% catch rate

Regression defect detection rates won't differ too much here... But I wanted to point out the advantage of TDD. And true, there are better ways. Requirements-based development and testing is the way to go.

Finally: fixing a defect quickly isn't good enough. The costs of fixing, retesting, releasing, testing again and redoing a roll-out are all way higher than doing the test beforehand... And it doesn't even matter whether the fix took 1 minute or 24 hours.

By Tweakers user Kama, Tuesday 28 April 2009 01:36

I agree with Rob btw on the project management thingy. First you make your estimates, then your planning. Not the other way round.

By Tweakers user JeRa, Tuesday 28 April 2009 01:41

What you are preaching is "whatever happens, never miss a deadline and/or beat all competitors to it, and screw quality; we'll fix bugs when we see them or the customer reports them".
That's not true. I'm telling my side of the story, and incidentally the story that works best for our company. I've said this in my previous posts. I'm not saying that our method works for everyone; your programmers have to have a certain level of coding skill to work this way. Also, I'm not saying that every company wants us to meet a deadline; I'm just saying that the companies which do are in good hands with us. They simply cannot get the same level of software quality from other parties which use techniques such as TDD.

To summarize my opinion, the advantages of TDD are:
- finding bugs before releasing or testing an application
- enabling a team of programmers to work with each other's code, regardless of their capabilities or skill
- your design is reviewed before it is implemented, which can save a lot of time

And the disadvantages:
- there's project overhead on creating and maintaining the unit tests
- all developers are slowed down, regardless of their skill
- the level of bug finding is dependent on the skill of the developer, not the technique (TDD)

By Tweakers user JeRa, Tuesday 28 April 2009 01:52

Without testing you will never be able to deliver your product at the desired quality level in time!
Exactly. But this isn't about testing, it's about TDD. And while TDD may be great for old-school developers, we use different methods which allow us to do more in less time (compared to writing unit tests).
You call it "Deliver "working" software in time", but what's "working"? Do you think the same about that as your customer? Do you know what your customer thinks? That's what testing is also for!
No, that's what specifications are for. TDD does not eliminate all bugs, so you still have the same problem.
The need for testing is not related to bad programmers. All developers make mistakes, even the good ones. Mistakes can vary from problems in the code to simply implementing the wrong functionality. Writing "sound code" is one thing (your compiler does that test for you); delivering the correct functionality is a whole different thing.
Yes, that's the purpose of TDD. But that's not what I said: too often I have seen a team of developers levelled out in skill by the use of unit tests, while a few of them could have programmed the entire application in only half the time. TDD enabled this, so it only adds to my opinion.
Finally: fixing a defect quickly isn't good enough. The costs of fixing, retesting, releasing, testing again and redoing a roll-out are all way higher than doing the test beforehand... And it doesn't even matter whether the fix took 1 minute or 24 hours.
That depends on the type of application, bug, development, et cetera. In our case, setting up solid TDD would take significantly more time than improving on our existing methods, which already catch more bugs beforehand than TDD ever would have. It's not like we don't test or anything; it's just that we don't do TDD (it's not necessary).

By Tweakers user Ghost, Tuesday 28 April 2009 08:08

In my opinion and experience, unit tests are slowing down fast developers and provide a safe haven for bad developers.
This is exactly the kind of comment that makes me scratch my head. I want to hone my programming skills to become a better programmer, not adhere to a method that prevents me from accomplishing just that.

I imagine that using TDD makes me better and faster at writing unit tests and better able to assert my own code. As a result I deliver code to QA that is better tested and produced with decreasing overhead (still with a bit of overhead caused by writing the tests, of course). I am also forced to decouple my code and employ sane design patterns, which makes possible not only testing but also reuse of both the software components and the unit tests. These appear to me as benefits that will show more and more strongly over time.
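For instance (a made-up example, all names invented): a component that takes its dependencies through an interface can be tested with a substitute implementation, and both the component and the interface become reusable.

```java
// An invented example of the decoupling I mean: SessionChecker depends
// on a Clock interface rather than on System.currentTimeMillis()
// directly, so a test can substitute a fixed clock and assert behaviour
// instantly and deterministically.
interface Clock {
    long nowMillis();
}

class SessionChecker {
    private final Clock clock;
    private final long timeoutMillis;

    SessionChecker(Clock clock, long timeoutMillis) { // dependency injected
        this.clock = clock;
        this.timeoutMillis = timeoutMillis;
    }

    boolean isExpired(long startedAtMillis) {
        return clock.nowMillis() - startedAtMillis > timeoutMillis;
    }
}

public class DecouplingDemo {
    public static void main(String[] args) {
        Clock fixedAtTenSeconds = () -> 10_000L; // no waiting, no flaky timing
        SessionChecker checker = new SessionChecker(fixedAtTenSeconds, 5_000L);
        if (!checker.isExpired(1_000L)) throw new AssertionError("9s elapsed > 5s timeout");
        if (checker.isExpired(8_000L)) throw new AssertionError("2s elapsed < 5s timeout");
        System.out.println("tested in isolation");
    }
}
```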

Do you guys agree on that or am I missing a point here?

[Comment edited on Tuesday 28 April 2009 08:12]


By Tweakers user bat266, Tuesday 28 April 2009 08:24

The main advantage for me of writing unit tests is that I can refactor at will, while guaranteeing the code works as planned. TDD is a method like any other. It's not my favourite, but whether it's needed depends on the project.

But unit tests are definitely needed in many, many projects if you want to make them maintainable.

By Tweakers user Punkie, Tuesday 28 April 2009 10:56

In my opinion and experience, unit tests are slowing down fast developers and provide a safe haven for bad developers
The opposite of fast developers is slow developers, not bad developers. And even if you meant to say exactly "bad developers", then the sentence is without meaning; that is, it implies nothing.


Aren't there two different parameters involved in assessing whether TDD merits the effort? Time and bugginess. Although the latter may influence time (because you have to spend time on support), bugginess is still a factor in its own right. Depending on the weight of importance of those two factors, you may choose to put more effort into TDD compliance or not. I see a lot of people talking about either bugs or time, but seldom a statement that takes the weights of both into account.
(Ok, ok, there are more factors, like reuse possibilities and risk management, but taking the prime two into account is difficult enough if you want formal statements.)

By Tweakers user JeRa, Tuesday 28 April 2009 14:08

The opposite of fast developers is slow developers, not bad developers. And even if you meant to say exactly "bad developers", then the sentence is without meaning; that is, it implies nothing.
No, what I said was what I meant. Why would I use opposites? I'm trying to describe two different disadvantages :p Unit testing slows down fast developers because of the overhead (there are other ways to 'check' the code of these developers without hindering them), and it enables bad developers to run unit tests before thinking about what they're actually implementing. So how is this without meaning?

By Tweakers user Punkie, Tuesday 28 April 2009 14:51

Hmm, never mind. It does all make sense. What was I thinking at first?

Nevertheless, I do dispute that unit tests slow down developers when you need to deliver a product not riddled with bugs. Unless, of course, you replace unit tests with some other form of testing that covers a good share of it.

By Tweakers user Finwe, Tuesday 28 April 2009 17:12

In my experience the kinds of bugs you find with unit tests (and TDD for that matter) are often the easy ones - in the sense that once you do discover them, even without unit tests, they are easy to fix. The real time-killers in my opinion are bugs related to multi-threading (race conditions), pointer mistakes (when using "old" languages), bugs in the OS (Windows CE still has its quirks) and bugs related to hardware (if you write embedded software like we do). These kinds of bugs you *cannot* find with unit tests, and they easily cost us 2 weeks each. Therefore I cannot conclude that unit tests are a real benefit in our environment. I also think that in such an environment they slow the good developers down, because good developers don't make many "silly" mistakes.

In our project, writing and maintaining unit tests costs twice as much time as writing the code itself. That is huge - there is no way good developers make bugs that cost twice as long to fix as the time it took to develop!

I also tend to agree with most of JeRa's comments here. In a commercial environment, you cannot think in black and white. In fact, releasing a product with some bugs 2 months before your competitor releases his product might mean a big increase in your sales, despite the fact that some people will return the product/software. It is a trade-off, and not as simple as "unit tests improve quality, so we must use unit tests".

So yeah, unit tests can be nice, but whether the benefits outweigh the costs is still a very open question in my opinion. You must look within your own organization to see if the time spent is in the end worth it.

By Tweakers user YopY, Tuesday 28 April 2009 17:31

I would only consider test-driven development to be beneficial if the project's structure was pretty much settled before I start programming. In most of my projects so far, I just got the assignment without any design requirements or whatever to implement.

Knowing myself, writing a unit test before I'd even know what I was going to program isn't something that would work. I mean, I flip my code around several times before I'm satisfied, and I have no clue what the end-product of a class will be (in design / contract terms) until I'm actually finished with it.

I'm sure TDD would work for your projects if you have the design worked out for a large part beforehand, but I doubt it'd work for smaller projects that people have full control over.

Unit tests themselves are made of win though. Even though in practice I wind up writing more test code than working code - blame JMock and friends for that. :@. There should be a much less code-intensive way of writing unit tests, but I haven't found any yet.

I mean, to test a simple setter / getter, you have to write at least three lines of code, preferably more:


Java:
@Test
public void testSomeSetter() {
    String someValue = "some value";
    SomeObject someObject = new SomeObject();
    someObject.setSomeValue(someValue);
    assertEquals(someValue, someObject.getSomeValue());
}



Note that the above unit test is hardly worth writing, considering a setter / getter is very simple in 99% of cases. But it does show that to properly test a setter / getter pair (each containing just one line of code), you need twice as much code in tests.

And it gets worse in bits of code that have branches, where you end up copy / pasting JMock Expectations to get the method tested. Now this might also indicate a possible code smell (for which I love unit tests, btw), but it seems like a lot of work.

But now I'm starting to rant myself, it's your blog etc.

I'd like to see research done on the benefits of unit testing, both before and after writing code, and especially in the long run:

* Do unit tests reduce the time it takes to maintain software later on?
* Do unit tests reduce the amount of bugs and the time it takes to find them?
* Do unit tests increase the reliability and maintainability of a (large) code base?

etc.

Some responses to earlier commenters:
quote: bat266
The main advantage for me of writing unit tests is that I can refactor at will, while guaranteeing the code works as planned.
This will work, but when you refactor your code, you also have to refactor your unit tests, certainly when it involves moving methods around or changing your project's structure. Refactoring in the sense of altering method names or their locations is easy enough with the right refactoring tools (I'm looking at you, Eclipse), but when changing behaviour or APIs, you end up doing the change twice - once in your code, once in your tests. And you have to make sure your unit tests still cover the same use cases and ground, and test for all situations.
quote: JeRa
In my opinion and experience, unit tests are slowing down fast developers and provide a safe haven for bad developers. We want fast, but not bad... you should be able to fill in the blanks. This gets rid of a major reason for us to write unit tests first.
Just adding that fast developers can also write unit tests quickly, and bad developers also create bad unit tests. Just because you have 100% test coverage on a file doesn't mean you have 100% 'case' coverage. The speed or skill of a developer has little to do with the applicability of unit tests, I think.

Speaking of which, are there any tools out there that can help with writing unit tests? In particular, tools that could create a load of cases for certain types of methods would be nice.

[Comment edited on Tuesday 28 April 2009 17:36]


Comments are closed