Measuring time taken by a function: clock_gettime

Jary · Oct 16, 2010 · Viewed 72.2k times

I am trying to measure how long a function takes.

I have a little issue: although I am trying to be precise and use floating point, every time I print the result with %lf I get one of two answers: 1.000... or 0.000.... This makes me wonder whether my code is correct:

#define BILLION  1000000000L;

// Calculate time taken by a request
struct timespec requestStart, requestEnd;
clock_gettime(CLOCK_REALTIME, &requestStart);
function_call();
clock_gettime(CLOCK_REALTIME, &requestEnd);

// Calculate time it took
double accum = ( requestEnd.tv_sec - requestStart.tv_sec )
  + ( requestEnd.tv_nsec - requestStart.tv_nsec )
  / BILLION;
printf( "%lf\n", accum );

Most of this code is not my own; it came from this example page illustrating the use of clock_gettime: http://www.users.pjwstk.edu.pl/~jms/qnx/help/watcom/clibref/qnx/clock_gettime.html

Could anyone please let me know what is incorrect, or why I am only getting integer values?

Thank you very much,

Jary

Answer

Marcelo Cantos · Oct 16, 2010

Dividing an integer by an integer yields an integer. Try this:

#define BILLION 1E9

And don't put a semicolon at the end of the line. #define is a preprocessor directive, not a statement, so the semicolon became part of the macro: BILLION expanded to 1000000000L;, which would break in most contexts. You only got away with it because the macro appeared at the very end of the expression and outside any parentheses.
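
For reference, here is a minimal, self-contained sketch of the corrected timing code, assuming a POSIX system (older glibc may need -lrt at link time). The function_call below is just a placeholder workload standing in for whatever you want to measure:

#include <stdio.h>
#include <time.h>

#define BILLION 1E9   /* floating-point literal, no trailing semicolon */

/* Placeholder workload; substitute the function you want to time. */
static void function_call(void)
{
    volatile unsigned long sum = 0;
    for (unsigned long i = 0; i < 10000000UL; ++i)
        sum += i;
}

int main(void)
{
    struct timespec requestStart, requestEnd;

    clock_gettime(CLOCK_REALTIME, &requestStart);
    function_call();
    clock_gettime(CLOCK_REALTIME, &requestEnd);

    /* The nanosecond difference divided by a double yields a double */
    double accum = (requestEnd.tv_sec - requestStart.tv_sec)
                 + (requestEnd.tv_nsec - requestStart.tv_nsec) / BILLION;
    printf("%lf\n", accum);
    return 0;
}

With the semicolon gone and BILLION defined as a floating-point literal, the nanosecond term is promoted to double, so the printed value now carries the fractional seconds instead of truncating to 0 or 1.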