I have just written this short C++ program to approximate the actual number of clock ticks per second.
#include <iostream>
#include <time.h>
using namespace std;

int main() {
    for (int i = 0; i < 10; i++) {
        int first_clock = clock();
        int first_time = time(NULL);
        while (time(NULL) <= first_time) {}
        int second_time = time(NULL);
        int second_clock = clock();
        cout << "Actual clocks per second = " << (second_clock - first_clock) / (second_time - first_time) << "\n";
        cout << "CLOCKS_PER_SEC = " << CLOCKS_PER_SEC << "\n";
    }
    return 0;
}
When I run the program, I get output that looks like this:
Actual clocks per second = 199139
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 638164
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 610735
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 614835
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 642327
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 562068
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 605767
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 619543
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 650243
CLOCKS_PER_SEC = 1000000
Actual clocks per second = 639128
CLOCKS_PER_SEC = 1000000
Why doesn't the actual number of clock ticks per second match up with CLOCKS_PER_SEC? They're not even approximately equal. What's going on here?
clock returns the amount of processor time your program has used, not the wall-clock time that has elapsed. There are nominally 1,000,000 clock ticks per second*, and over each one-second interval it appears your program received roughly 60% of them; the scheduler gave the other 40% of the CPU to something else.
*Strictly speaking, there are virtually 1,000,000 clock ticks per second: whatever the underlying timer's real resolution, the value clock returns is scaled so that your program perceives 1,000,000 ticks per second of CPU time.
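If you want to see this for yourself, here is a minimal sketch (it assumes a C++11 compiler, since it uses std::chrono and std::this_thread) that compares what clock reports across one second of busy-waiting against one second of sleeping. The exact tick counts depend on what else the machine is doing, but the sleeping interval should show almost no CPU ticks, while the busy interval should come close to CLOCKS_PER_SEC.
#include <chrono>
#include <ctime>
#include <iostream>
#include <thread>

int main() {
    using namespace std::chrono;

    // Busy-wait for one second of wall-clock time: the process keeps the CPU
    // busy, so clock() should advance by roughly CLOCKS_PER_SEC.
    std::clock_t busy_start = std::clock();
    auto deadline = steady_clock::now() + seconds(1);
    while (steady_clock::now() < deadline) {}
    std::clock_t busy_end = std::clock();

    // Sleep for one second of wall-clock time: the process uses almost no CPU,
    // so clock() should barely move at all.
    std::clock_t sleep_start = std::clock();
    std::this_thread::sleep_for(seconds(1));
    std::clock_t sleep_end = std::clock();

    std::cout << "CPU ticks during 1 s of busy-waiting: " << (busy_end - busy_start) << "\n";
    std::cout << "CPU ticks during 1 s of sleeping:     " << (sleep_end - sleep_start) << "\n";
    std::cout << "CLOCKS_PER_SEC:                       " << CLOCKS_PER_SEC << "\n";
}
Your original program busy-waits too, which is why it gets a large fraction of CLOCKS_PER_SEC each second rather than nearly all of it only when nothing else is competing for the CPU.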