I once joined a company that had 90% code coverage. After a while it became clear that they were all vanity tests: I could delete huge swathes of code with zero test failures. We let the contractors who wrote them move on, and we formed a solid team in house. We don't run code coverage any more because it makes the build run four times slower. Instead, I trust our teams to write good tests. Sometimes that means <100% coverage, and the teams are able to justify it.
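To illustrate what I mean by a vanity test, here's a hypothetical sketch (the function name and values are made up): a test that executes code, and therefore counts toward coverage, but asserts nothing, versus one that actually pins down behavior.

```python
def apply_discount(price, rate):
    return price * (1 - rate)

def test_vanity():
    # Runs the code (coverage goes up) but checks nothing,
    # so breaking or deleting apply_discount never fails this.
    apply_discount(100, 0.1)

def test_meaningful():
    # Pins down the behavior: any regression fails the build.
    assert apply_discount(100, 0.1) == 90.0

test_vanity()
test_meaningful()
```

Both tests light up the same lines in a coverage report; only the second one protects anything.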
Some feedback on the article:
>Test-driven development, or as it used to be called: test-first approach
Test-first is not the same as test-driven. The test-first approach includes situations where a QA dev writes 20 tests and then hands them to an engineer who implements against them. That's not TDD.
>"But my boss expects me to write test for all classes," he replied.
That's very unlikely to be TDD. "Writing tests because I've been told to" is never likely to be "I'm writing the tests that I know to be necessary", and that's all TDD is: writing necessary tests. If the test isn't necessary, then neither is the code.
>Look, if that imaginary evil/clueless developer comes and breaks that simple code, what do you think he will do if a related unit test breaks? He will just delete it.
Sure. But then their name is on that act in the commit log. The test is a warning. I've been lucky not to have worked with evil developers, but I have worked with some clueless ones, and indeed some have just deleted tests. That's an opportunity for education, and quality has steadily improved.
>The tragedy is that once a "good practice" becomes mainstream we seem to forget how it came to be, what its benefits are, and most importantly, what the cost of using it is.
Totally agree. So many programmers and teams practice cargo cult behaviors. Unfortunately, this article is one of them: making claims about TDD, and unit tests in general, without understanding "why" TDD is effective.