
"One assertion per test"

Posted: Sat Jul 31, 2010 7:26 am
by josh
Do you follow the mantra? Would it be better to say one spec per test? For example, if I'm testing something that returns a collection of stuff, I want to assert that the collection has the right number of items, and I'll also want to assert that each item is the one I expect. So if it were a two-item collection, that's three assertions in one test. Am I doing something wrong here? My gut tells me it's fine as long as it's all part of the same feature, but I wanted to get different views on this. Discuss.

Re: "One assertion per test"

Posted: Sat Jul 31, 2010 1:02 pm
by Weirdan
Implement your own assertion (like collectionHasItems()) and you're back to one assertion per test. Don't limit yourself to the assertions offered by your testing framework.
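A minimal sketch of what such a custom assertion might look like in Python (the function name and the sample data are hypothetical, just illustrating the idea of collapsing several framework asserts into one domain-level assertion):

```python
def assert_collection_has_items(collection, expected_items):
    """One conceptual assertion: the collection holds exactly these items, in order."""
    actual = list(collection)
    expected = list(expected_items)
    # The size check and the per-item checks live here, not in the test body.
    assert len(actual) == len(expected), (
        f"expected {len(expected)} items but got {len(actual)}: {actual!r}")
    for i, (got, want) in enumerate(zip(actual, expected)):
        assert got == want, f"item {i}: expected {want!r} but got {got!r}"

# In the test body this reads as a single assertion:
assert_collection_has_items(["spanner", "wrench"], ["spanner", "wrench"])
```

The test body stays at one call, while the helper carries the mechanical detail and produces a specific failure message.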

Re: "One assertion per test"

Posted: Sun Aug 01, 2010 10:26 am
by josh
Right, I'm aware, but that can be dogmatic if there's no real justification for doing so. If it doesn't make the test easier to read in a given situation (say, when you're testing test-last and the test "wants" to be procedural to be understood), is it OK to break the rule sometimes, or do you think I'm fooling myself and that abstraction in the test is always desirable?

So would your characterization of good testing be "one assertion per test, achieved by creating custom assertions, always maintaining one conceptual assertion in the test body"? Would your mantra read something like that?

What about verifying pre- and post-conditions, like a guard assertion? I guess you'd say there's no such thing as a guard assertion, and that a guard assertion is actually describing a feature in its own way?

I may have just written a test case class that should really be its own suite then, lol.

Also, is there such a thing as a custom assertion that verifies too much? I mean, if you just moved 100 assertions into another method, aren't you just hiding the smell?

Also, do you ever find that breaking the rule at first helps when writing test-last tests for someone else's code, as long as you refactor them before you check in?

Re: "One assertion per test"

Posted: Sun Aug 01, 2010 1:07 pm
by Weirdan
I'm no fanatic, so I'd say do what works for you and your team. To me, multiple assertions are just another code smell. It could indicate I need to split the test into several, or that I need a custom assertion, or it could even mean I need to add another method to the class under test (like adding hasItems() to your collection class). It could even mean I need to leave it as is. Do whatever makes it easier to work with the code and the test suite.

Re: "One assertion per test"

Posted: Sun Aug 01, 2010 2:21 pm
by josh
Weirdan wrote: It could even mean I need to leave it as is.
Got ya. OK, good advice. I was testing XML just now and thinking, how could defect localization provide better documentation in this circumstance? I don't think it can. In that case it's more readable to look at a "chunk" of the XML than to go line by line, with additional tests asserting the ordering of lines, and so on. So much indirection.

So I guess a mantra that consolidates both of our viewpoints is "one test per specification" (specification being a subjective unit that should, in general, be kept as small as possible).


But until you have a year or more of TDD experience, I don't think you can differentiate between a feature and a test effectively, so this mantra would only apply to that subset of programmers ;-)

Re: "One assertion per test"

Posted: Mon Aug 16, 2010 5:19 pm
by Jenk
That "rule" is often misinterpreted. It's not saying limit yourself to one assertion, but rather expand your tests so that more than one is unnecessary. For your example of a collection, you'd have multiple tests for that behaviour, with names such as ReturnedCollectionShouldHave5Items(), FirstItemInCollectionShouldBeX(), SecondItemShouldBeY(), etc.

Please excuse the typos; posting from my iPhone. :)
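That split might be sketched like this in Python (get_items() is a hypothetical stand-in for whatever returns the collection; the test names mirror the ones above):

```python
def get_items():
    # Hypothetical system under test returning an ordered two-item collection.
    return ["x", "y"]

# One focused test, and one assertion, per observable fact:
def test_returned_collection_should_have_2_items():
    assert len(get_items()) == 2

def test_first_item_in_collection_should_be_x():
    assert get_items()[0] == "x"

def test_second_item_should_be_y():
    assert get_items()[1] == "y"
```

When one of these fails, the test name itself tells you which fact about the collection broke.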

Re: "One assertion per test"

Posted: Mon Aug 16, 2010 5:24 pm
by Jenk
Also, and probably more importantly, it can be an indication that you need to break down the functionality/requirements further. Hopefully this can be seen in my example above, where instead of considering the entire returned collection, the tests focus on the individual items and how many there are. :)

Re: "One assertion per test"

Posted: Tue Aug 17, 2010 6:20 pm
by josh
I think there are opposing forces at play. Maybe? Do you think the test is more readable with one big XML assertion, or with the more isolated assertions? To me the big assertion gains readability but loses defect localization, while multiple smaller tests do the opposite: they (sometimes) hinder readability but raise the quality of the test suite. Would you agree or disagree?

Also, would you rather see "item two should be ID 3, but was ID 4" as a failure message, à la multiple tests?
Or would you rather see "items were 3, 4 but expected 4, 3"?

I guess what I'm getting at is that I need a custom assertion designed to verify ordering: something where I can pass an array of IDs in their expected order, plus the collection itself, and have the assertion pass or fail.
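A sketch of such an ordering assertion in Python (the Item type and the assertion name are hypothetical; the failure message follows the "items were 3, 4 but expected 4, 3" style mentioned above):

```python
from collections import namedtuple

# Hypothetical item type; only the id matters for the ordering check.
Item = namedtuple("Item", "id")

def assert_ids_in_order(collection, expected_ids):
    """Fail with the whole picture: the actual ID order versus the expected one."""
    actual_ids = [item.id for item in collection]
    assert actual_ids == list(expected_ids), (
        f"items were {actual_ids} but expected {list(expected_ids)}")

# Usage: passes because the IDs appear as 4 then 3.
assert_ids_in_order([Item(4), Item(3)], [4, 3])
```

On failure you get both sequences side by side, which trades some defect localization for a readable, whole-collection message.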

I guess my apprehension comes from a commitment I made not to test the test code. When I started, I couldn't wrap my head around that. It's hard enough testing production code. I read about people testing tests and thought, are you kidding? If you don't draw the line at one level of tests, where do you draw it? So because of this hang-up / personal apprehension about testing test code, I guess I avoid writing custom assertions where I should. I don't know.

Re: "One assertion per test"

Posted: Wed Aug 18, 2010 3:53 pm
by Jenk
I'm very much in the mantra of one assertion per test when going test-first. I'll even write just the assertion first. If I'm writing retrofitted tests, I tend to use multiple asserts, because I have lost a lot of enthusiasm for retrofitting tests. :)

The best example I can think of is writing a suite of tests for a Fibonacci sequence generator. Let's say the overall requirement is: "a Fibonacci generator that creates a sequence no greater than the first 50 numbers".

So I start with FirstNumberShouldBe0(), then after writing the simplest code to pass that test (i.e. return 0;) I move on to the next interesting test, the second number, and so on. At some point it becomes apparent when the 'real' logic should be implemented; I won't end up with 50 tests of NthNumberShouldBeX(). But it is important to remember that your tests should be governing the implementation, not you. Having simple tests like those allows this.
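The end point of that progression might look like this in Python: a generalized implementation that each of the small, one-assertion tests helped drive out (test names mirror the ones above; the implementation shown is just one sketch the tests could have forced, not the only one):

```python
def fibonacci(n):
    """Return the nth Fibonacci number, 0-indexed: 0, 1, 1, 2, 3, 5, ..."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

# The test-first progression: each test forced a little more implementation,
# until hard-coding return values became more work than the real logic.
def test_first_number_should_be_0():
    assert fibonacci(0) == 0

def test_second_number_should_be_1():
    assert fibonacci(1) == 1

def test_tenth_number_should_be_34():
    assert fibonacci(9) == 34
```

The first test passes with a hard-coded `return 0`; it's the later tests that make the loop the simplest thing that works.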


EDIT: Another example is shown in one of the videos for the TDD Kata I posted in the thread of the same name: http://www.21apps.com/agile/tdd-kata-by-example-video/

I'm never going to explain it as well as simply showing you a video can :lol:

Re: "One assertion per test"

Posted: Thu Aug 19, 2010 2:56 am
by josh
Right. I've been doing TDD exclusively for one year, so I'm very familiar with the benefits. However, the specific territory I haven't ventured into is writing tests that test custom assertion methods. I guess there's no better way than to do it and learn hands-on. In the past I avoided it, like I said.

Re: "One assertion per test"

Posted: Fri Sep 24, 2010 6:50 am
by Jenk
I didn't mean to teach you to suck eggs; my apologies. I wanted to point out what is, at least, my interpretation of the mantra "one assertion per test", which is in line with the video I posted. I do feel that testing a collection with an assertCollection() potentially masks several behaviours in one bundle and could definitely be broken down further, but that is also personal preference. :)

Re: "One assertion per test"

Posted: Fri Sep 24, 2010 2:23 pm
by Christopher
I am not strict about one assert per test, but I keep them to a minimum. I tend to be less strict during TDD (to keep things moving) than when I'm done and cleaning them up into unit tests.

Often you have gone to all the trouble of creating a test condition and want to explicitly check several things. I think adding assertion methods that are only used for testing is not the best idea, unless that is the only way to test something. I like it to be clear what is the test (and therefore part of the spec) and what is the code. You just end up with more potential side effects if you are ANDing test conditions together inside an assert method, whereas multiple asserts in the test body make the spec clearer. And it is trivial to separate very specific asserts into multiple tests if that seems better.
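A sketch of that style in Python, with everything hypothetical (the fixture, the field names, the expected values), just to show several explicit asserts sharing one expensive setup while staying visible in the test body:

```python
def create_order():
    # Hypothetical, expensive-to-arrange fixture shared by several checks.
    return {"id": 7, "lines": ["widget", "gadget"], "total": 12}

def test_new_order_is_well_formed():
    order = create_order()
    # Several explicit asserts against one setup, each readable as part of the spec:
    assert order["id"] == 7
    assert len(order["lines"]) == 2
    assert order["total"] == 12
```

Because each assert is spelled out in the test, a reader sees the whole specification at a glance instead of chasing it into a helper.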

Re: "One assertion per test"

Posted: Fri Sep 24, 2010 10:45 pm
by josh
That's sort of what I'm doing now. After reading this thread, I've refined my practice: first I write a sloppy test, and then the next time I have to work on that part of the code, or add onto it, I first review the tests and see whether they could use any refactoring. Certain things, like the XML, I am keeping all in one test method, but I'm finding most of it could use even more cleaning than I had in place.

I'm trying to get the test methods down to 2-3 lines where possible (I'm not putting any hard limit on it; it's just a goal).

I also find the same as Christopher: I'd rather have the logic in production code than in some test utility method; once I need it in a test, it's a sure-fire sign I need it in production too. For example, sometimes I'll write a method that goes through some "back doors" to inspect the state of some related objects. Almost always, the SUT just needs a way to return a reference to those related objects (a simplistic example). I guess this smell is "back-door verification"? xUnit Test Patterns has a smell called "indirect verification" (under "Obscure Test", under "Test Smells").