Cyclomatic complexity of IF((A>B) AND (C>D)) and IF((A>B) OR (C>D))

Imran · Sep 26, 2012 · Viewed 10.2k times

I want to know the cyclomatic complexity of two sections of code:

IF ((A>B) AND (C>D))
{ a = a + b; c = c + d; }

As far as I know, the cyclomatic complexity of the above code is 2 + 1 = 3.

The other piece of code:

IF ((A>B) OR (C>D))
{ a = a + b; c = c + d; }

The complexity of the above code is 4 + 1 = 5.

Are the above complexities correct or not?

Answer

Both complexities are the same and equal 3, counted here in four ways. I agree with Neil on using De Morgan's laws to show that the two conditions mirror each other; I think the same can be seen from the graphs, which is where it matters for complexity counting.
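A quick way to convince yourself of the De Morgan point is to enumerate all truth assignments; this is just a minimal Python check of the identity, not anything from the original question:

    from itertools import product

    # De Morgan: p AND q == NOT((NOT p) OR (NOT q)), and symmetrically for OR.
    # In control-flow terms the OR test is the AND test with its branches
    # mirrored, so both conditions produce graphs of the same shape.
    for p, q in product((False, True), repeat=2):
        assert (p and q) == (not ((not p) or (not q)))
        assert (p or q) == (not ((not p) and (not q)))
    print("De Morgan holds for every truth assignment")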

Graphs

Let's start with graphs for both code pieces.

[graph: if + or]

[graph: if + and]

Word of explanation:

  1. A McCabe graph consists of basic blocks, which means I can collapse many statements into one node as long as control passes between them linearly.
  2. I treated your code as a simple procedure: one entry point, one exit point.
  3. The exit point is added as a sink where it all ends. I note this because, of the few examples I have that build graphs from code for McCabe computations, none that I recall does this, but it feels natural given what basic blocks are and what the nodes and edges are meant to represent.
  4. The edge between the exit and entry points is only relevant for simplifying the complexity computation, hence the note and the different marking (color, arrow).
  5. Basic blocks are delimited by instructions that may pass control non-linearly: while, for, if, etc.
  6. Citing McCabe himself, AND and OR each add +1 to complexity, since c1 AND c2 is essentially if c1 then if c2, and c1 OR c2 is essentially if c1 else if c2, so two ifs either way. That is why the second condition gets its own node in my graphs.

As you can see, there is no difference in the numbers between the two code pieces: nodes, edges, and regions are all the same. The difference is which node connects to which, and that follows from how short-circuit evaluation works. Obviously, for languages without short-circuiting the graphs would have to be different.
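As a rough sketch of those graphs in code (the node labels and the short-circuit wiring below are my own reading of the pictures), the two edge lists differ only in how the nodes are connected, not in their counts:

    # Edge lists for the two control-flow graphs described above.
    # Node labels are mine: "c1" = A>B, "c2" = C>D,
    # "body" = { a = a + b; c = c + d; }, "exit" = the added sink.
    # The extra exit->entry edge is left out here (see the first definition below).

    cfg_and = {                    # IF ((A>B) AND (C>D))
        ("c1", "c2"),              # A>B true  -> evaluate C>D
        ("c1", "exit"),            # A>B false -> skip body (short circuit)
        ("c2", "body"),            # C>D true  -> run body
        ("c2", "exit"),            # C>D false -> skip body
        ("body", "exit"),
    }

    cfg_or = {                     # IF ((A>B) OR (C>D))
        ("c1", "body"),            # A>B true  -> run body (short circuit)
        ("c1", "c2"),              # A>B false -> evaluate C>D
        ("c2", "body"),            # C>D true  -> run body
        ("c2", "exit"),            # C>D false -> skip body
        ("body", "exit"),
    }

    for name, cfg in (("AND", cfg_and), ("OR", cfg_or)):
        nodes = {n for edge in cfg for n in edge}
        print(name, "nodes:", len(nodes), "edges:", len(cfg))
    # Both print 4 nodes and 5 edges; only the wiring differs.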

Complexity definitions

There is more than one definition. Complexity equals:

Edges - Nodes + 2*(exits)

Edges = 5; Nodes = 4; exits = 1;

Complexity = 5 - 4 + 2*(1) = 3 in both cases. This definition does not require a strongly connected graph, so we drop the added edge.
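Plugging in the counts read off the graphs above (a trivial check, identical for both code pieces):

    edges, nodes, exits = 5, 4, 1          # without the added exit->entry edge
    complexity = edges - nodes + 2 * exits
    print(complexity)                      # 3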

Edges - Nodes + exits, provided the graph is strongly connected

Edges = 6; Nodes = 4; exits = 1;

Complexity = 6 - 4 + 1 = 3 in both cases. This definition exists because it makes more sense topologically and is simpler to think about in terms of counting cycles (graph-wise). It is more useful when you count complexity for many routines/functions, like all the methods in a class; it then makes sense to think of each function as possibly being called in a loop. But I digress.
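The same check with the exit-to-entry edge added back, which is what makes the graph strongly connected:

    edges, nodes, exits = 6, 4, 1          # 5 edges plus the added exit->entry edge
    complexity = edges - nodes + exits
    print(complexity)                      # 3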

Regions

Regions: 3.

This comes from Euler's formula: Regions + Nodes - Edges = 2. Rearranging: Regions = Edges - Nodes + 2.

So the number of regions is the same as the complexity (assuming one exit point). This was meant to simplify counting the complexity straight from the graph of a one-exit subroutine.
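The region count worked out the same way (a sketch; note that the formula counts the outer, unbounded region as well):

    edges, nodes = 5, 4
    regions = edges - nodes + 2    # Euler: regions + nodes - edges = 2
    print(regions)                 # 3, which again equals the complexity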

Decision points + 1

McCabe himself noted that

in practice compound predicates such as IF "c1 AND c2" THEN are treated as contributing two to complexity since without the connective AND we would have IF c1 THEN IF c2 THEN which has two predicates. For this reason and for testing purposes it has been found to be more convenient to count conditions instead of predicates when calculating complexity

In both code pieces we have one compound conditional made of two simple conditions, so Decisions = 2.

Complexity = 2+1 = 3.
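This last definition is also the easiest to automate: count the simple conditions (each if/while/for predicate plus each AND/OR connective) and add 1. Below is a naive sketch that works on raw text; a real tool would walk the parsed syntax tree instead, so treat the regular expression and the function name as illustrative only:

    import re

    def cyclomatic_complexity(source):
        # Count decision keywords and boolean connectives, then add 1.
        # Naive: this would also match keywords inside strings or comments.
        decisions = len(re.findall(r"\b(if|while|for|and|or)\b", source, re.IGNORECASE))
        return decisions + 1

    print(cyclomatic_complexity("IF ((A>B) AND (C>D)) { a = a + b; c = c + d; }"))  # 3
    print(cyclomatic_complexity("IF ((A>B) OR (C>D)) { a = a + b; c = c + d; }"))   # 3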

It's worth noting that cyclomatic complexity started out as counting cycles, but for practical purposes ended up as counting conditionals.

Further reading

First try McCabe's paper itself: http://www.literateprogramming.com/mccabe.pdf

Wikipedia has a good article relying on the paper, though I found it insufficient without also following the articles on basic blocks and connected components:

  1. http://en.wikipedia.org/wiki/Cyclomatic_complexity
  2. http://en.wikipedia.org/wiki/Basic_block
  3. http://en.wikipedia.org/wiki/Connected_component_(graph_theory)

I found a concise yet good summary at Chambers' page: http://www.chambers.com.au/glossary/mc_cabe_cyclomatic_complexity.php

The book "Integrated Approach to Software Engineering" in chapter 8 has an example illustrating complexity calculation (though I think they ate one edge on a graph, figure 8.7).

http://books.google.pl/books?id=M-mhFtxaaskC&lpg=PA385&ots=jB8P0avJU7&d&hl=pl&pg=PR1#v=onepage