What is the relationship between requirements and test cases?

logic_chopper · Oct 14, 2010 · Viewed 12.9k times

I see that there are many systems out there for requirements-to-test-cases traceability, and I started to ask myself what the relationship is between these two artefacts. For example, why have the notion of test cases as opposed to just calling them detailed requirements? Are test cases in fact a refinement of the requirements set? If test cases are not requirements and demand more than the documented requirements (e.g. testing more error conditions), then surely the requirements set is incomplete? Are requirements just abstract test cases?

Answer

Mark Irvine · Oct 15, 2010

I see that there are many systems out there for requirements-to-test-cases traceability, and I started to ask myself what the relationship is between these two artefacts. For example, why have the notion of test cases as opposed to just calling them detailed requirements? Are test cases in fact a refinement of the requirements set?

I think the distinction mainly signifies when they are produced, and for what purpose. Requirements are produced very early, before we know many implementation-specific details. We try to keep them implementation-neutral, so they tend to be more abstract.

The purpose of test cases is somewhat different. Requirements tell developers what the system should do, not how to do it. Test cases, however (as they are often written), specify exactly how to do something, and they will often reference actual implementation details.

If test cases are not requirements and require more than the documented requirements (e.g. testing more errors, etc) then surely the requirements set is incomplete?

Yes, the requirements set is incomplete. It always is, because you can never completely document all the expectations of all users or stakeholders, no matter how long you work at it.

But then the test cases are also incomplete. Complete testing is impossible: any set of tests is a sample of all potential tests. However, tests are typically written at a later stage, when we know much more about the requirements, so they can be more specific, more detailed, and more complete, though never fully complete.

Take a look at: http://www.ibm.com/developerworks/rational/library/04/r-3217/

In this article, the author explains how to get from use cases to test cases. The point the author makes is that while the use case contains all the flows and sub-flows, the test cases pin down the specific data and the specific path through the system.
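To make that distinction concrete, here is a minimal sketch (all names and data are hypothetical, not taken from the article): the use case describes a flow abstractly, while a test case binds concrete input data to one specific walk through that flow.

```python
# Abstract use-case flow: no concrete data, just the steps.
# (Hypothetical example: a basic login flow.)
use_case_flow = ["enter credentials", "submit", "see dashboard"]

# Concrete test case: specific data plus the specific path exercised.
test_case = {
    "id": "TC-LOGIN-001",
    "covers": "UC-LOGIN basic flow",
    "steps": [
        ("enter credentials", {"user": "alice", "password": "s3cret"}),
        ("submit", {}),
        ("see dashboard", {"expected_title": "Dashboard"}),
    ],
}

def flow_of(test_case):
    """Project the concrete test back onto the abstract flow it exercises."""
    return [step for step, _data in test_case["steps"]]

# The test case is a refinement: stripping its data recovers the use-case flow.
assert flow_of(test_case) == use_case_flow
```

Many different test cases (invalid password, locked account, and so on) can refine the same use case, which is one reason a 1:1 requirement-to-test mapping is rarely enough.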

Are requirements just abstract test cases?

I would say yes, they can be viewed that way. Some people go as far as not writing test cases at all, and just use the requirements as a 'checklist' on which to base their testing.

Traceability from test cases to requirements is a nice idea and a very popular approach, and tools implement the feature because it sells. But there are limitations and traps. There is often a false sense of completeness when the tool happily reports 100% coverage because you happen to have one test for every requirement. It doesn't address the fact that some requirements need far more than one test, and it does not factor in the content of the tests or whether they actually cover what they should.
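The coverage trap is easy to see in code. This is a hypothetical sketch of the naive metric such tools compute: a requirement counts as "covered" as soon as at least one test is linked to it, regardless of how shallow that test is.

```python
# Hypothetical requirement IDs and trace links, for illustration only.
requirements = ["REQ-1 login", "REQ-2 password reset", "REQ-3 audit log"]

# One shallow test per requirement is enough to satisfy the metric,
# even though e.g. password reset may need dozens of error-case tests.
trace_links = {
    "REQ-1 login": ["TC-1"],
    "REQ-2 password reset": ["TC-7"],
    "REQ-3 audit log": ["TC-9"],
}

def coverage(requirements, trace_links):
    """Fraction of requirements with at least one linked test case."""
    covered = [r for r in requirements if trace_links.get(r)]
    return len(covered) / len(requirements)

print(f"coverage: {coverage(requirements, trace_links):.0%}")  # prints "coverage: 100%"
```

The metric says nothing about what TC-7 actually checks, which is exactly the false sense of completeness described above.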

If you are interested in Requirements->Test traceability, be aware of the limitations described above, and use it carefully in combination with other techniques to make your testing more comprehensive.