I want to know the cyclomatic complexity of two sections of code.

IF ((A>B) AND (C>D))
{ a=a+b; c=c+d; }

As far as I know, the cyclomatic complexity of the above code = 2+1 = 3.

Another piece of code:

IF ((A>B) OR (C>D))
{ a=a+b; c=c+d; }

The complexity of the above code = 4+1 = 5.

Are the above complexities correct or not?
Both complexities are the same and equal 3, and this can be counted in four ways. I agree with Neil that De Morgan's laws prove they are the same; I think the same can be seen from the graphs, which are what matters for complexity counting.
Let's start with the graphs for both code pieces.

[control-flow graphs for the AND and OR versions]

A word of explanation: I modeled the compound conditions as "if and if" and as "if or if" respectively, i.e. as two ifs each, which is why the second condition gets its own node. As you can see, there is no difference in the numbers between the two code pieces: nodes, edges, and regions are all the same. The difference is which node connects to which node, and that comes from how short-circuiting works. Obviously, for languages without short-circuit evaluation the graphs would need to be different. A sketch of what the two conditions effectively expand to is below.
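To make the short-circuiting argument concrete, here is a minimal sketch in C of what both compound conditions effectively expand to (the variable values and the duplicated body are just for illustration):

```c
#include <stdio.h>

int main(void) {
    int a = 1, b = 2, c = 3, d = 4;

    /* IF ((A>B) AND (C>D)): with short-circuiting, C>D is
       only tested once A>B has already held -- two decisions. */
    if (a > b) {
        if (c > d) {
            a = a + b;
            c = c + d;
        }
    }

    /* IF ((A>B) OR (C>D)): with short-circuiting, C>D is
       only tested once A>B has failed -- again two decisions;
       only the edges between the nodes differ. */
    if (a > b) {
        a = a + b;
        c = c + d;
    } else if (c > d) {
        a = a + b;
        c = c + d;
    }

    printf("a=%d, c=%d\n", a, c);
    return 0;
}
```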
There is more than one formula for complexity.

Complexity = Edges - Nodes + 2*P

Edges = 5; Nodes = 4; P (connected components) = 1;
Complexity = 5 - 4 + 2*1 = 3 in both cases. This definition doesn't need a strongly connected graph, so we drop the added exit-to-entry edge.
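As a sketch, here is the first formula worked out in C for the AND version's graph (the node numbering is mine; the OR version has the same counts):

```c
#include <stdio.h>

/* Nodes: 0 = test A>B, 1 = test C>D, 2 = body, 3 = exit. */
int main(void) {
    int edges[][2] = {
        {0, 1},  /* A>B true  -> test C>D */
        {0, 3},  /* A>B false -> exit     */
        {1, 2},  /* C>D true  -> body     */
        {1, 3},  /* C>D false -> exit     */
        {2, 3},  /* body      -> exit     */
    };
    int E = (int)(sizeof edges / sizeof edges[0]);  /* 5 */
    int N = 4;  /* nodes */
    int P = 1;  /* connected components */
    printf("Complexity = E - N + 2P = %d\n", E - N + 2 * P);  /* 3 */
    return 0;
}
```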
Complexity = Edges - Nodes + P, with an extra edge added from exit to entry:

Edges = 6; Nodes = 4; P (connected components) = 1;
Complexity = 6 - 4 + 1 = 3 in both cases. This definition arose because it makes more sense topologically and is simpler to think of in terms of counting cycles in the graph. It is more useful when you count the complexity of many routines or functions at once, like all the methods of a class; it then makes sense to assume that each function may be called in a loop. But I digress.
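And the same example graph under the second definition, with the extra exit-to-entry edge added (again just a sketch):

```c
#include <stdio.h>

int main(void) {
    int E = 5 + 1;  /* the five CFG edges plus the added exit->entry edge */
    int N = 4;      /* nodes are unchanged */
    int P = 1;      /* one connected component */
    printf("Complexity = E - N + P = %d\n", E - N + P);  /* 3 */
    return 0;
}
```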
Regions: 3.

This comes from Euler's formula: Regions + Nodes - Edges = 2. Rewording it: Regions = Edges - Nodes + 2.

So the number of regions is the same as the complexity (assuming one exit point). This was meant to simplify counting complexity from the graph of a one-exit subroutine.
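Worked out for the example graph (E = 5, N = 4), with the outer region counted as Euler's formula requires:

```latex
R + N - E = 2 \;\Longrightarrow\; R = E - N + 2 = 5 - 4 + 2 = 3
```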
McCabe himself noted that:
in practice compound predicates such as IF "c1 AND c2" THEN are treated as contributing two to complexity since without the connective AND we would have IF c1 THEN IF c2 THEN which has two predicates. For this reason and for testing purposes it has been found to be more convenient to count conditions instead of predicates when calculating complexity
In both code pieces we have one compound conditional made of two simple conditions, so Decisions = 2;
Complexity = 2 + 1 = 3.
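A toy sketch of condition counting in C (count_conditions is my own illustrative helper, not a real tool; it just counts && and || connectives, and k connectives mean k+1 simple conditions):

```c
#include <stdio.h>

/* Count simple conditions in a C-style predicate: each && or ||
   connective separates two conditions, so k connectives => k+1 conditions. */
static int count_conditions(const char *pred) {
    int connectives = 0;
    for (const char *p = pred; p[0] != '\0' && p[1] != '\0'; p++) {
        if ((p[0] == '&' && p[1] == '&') || (p[0] == '|' && p[1] == '|')) {
            connectives++;
            p++;  /* skip the second character of the connective */
        }
    }
    return connectives + 1;
}

int main(void) {
    /* Complexity = decisions + 1 for a single-entry, single-exit piece. */
    printf("AND: %d\n", count_conditions("(A>B) && (C>D)") + 1);  /* 3 */
    printf("OR:  %d\n", count_conditions("(A>B) || (C>D)") + 1);  /* 3 */
    return 0;
}
```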
It's worth noting that cyclomatic complexity started out as counting cycles, but for practical purposes it ended up as counting conditionals.
First try McCabe's paper itself: http://www.literateprogramming.com/mccabe.pdf
Wikipedia has a good article based on the paper, though I found it insufficient without also following its links on basic blocks and connected components:
I found a concise yet good summary at Chambers' page: http://www.chambers.com.au/glossary/mc_cabe_cyclomatic_complexity.php
The book "Integrated Approach to Software Engineering" in chapter 8 has an example illustrating complexity calculation (though I think they ate one edge on a graph, figure 8.7).
http://books.google.pl/books?id=M-mhFtxaaskC&lpg=PA385&ots=jB8P0avJU7&d&hl=pl&pg=PR1#v=onepage