Why write 1,000,000,000 as 1000*1000*1000 in C?

Duck · Nov 16, 2016 · Viewed 13.5k times

In Apple's code, there is this line:

CMTimeMakeWithSeconds( newDurationSeconds, 1000*1000*1000 )

Is there any reason to express 1,000,000,000 as 1000*1000*1000?

Why not 1000^3 for that matter?

Answer

Piotr Falkowski · Nov 16, 2016

One reason to write constants in a multiplicative way is readability: the compiler folds the product into a single constant at compile time, so run-time performance is unaffected, and the expression shows that the author was thinking about the number multiplicatively.

Consider this:

double memoryBytes = 1024 * 1024 * 1024;

It's clearly better than:

double memoryBytes = 1073741824;

as the latter doesn't look, at first glance, like the third power of 1024.
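
If you want to convince yourself that the two spellings are identical, here is a minimal sketch (not part of the original answer, and assuming a C11 compiler for static_assert) showing that the check happens entirely at compile time:

#include <assert.h>

/* The compiler folds the product into a single constant,
   so this assertion is evaluated at compile time, not run time. */
static_assert(1024 * 1024 * 1024 == 1073741824, "not the third power of 1024");

int main(void) { return 0; }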

As Amin Negm-Awad mentioned, in C the ^ operator is bitwise XOR, not exponentiation. C, like many languages, lacks a built-in compile-time exponentiation operator, hence the multiplication.
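
For instance, a quick sketch of what ^ actually does to these operands in C:

#include <stdio.h>

int main(void) {
    printf("%d\n", 1000 ^ 3);            /* bitwise XOR: prints 1003, not 1000 cubed */
    printf("%d\n", 1000 * 1000 * 1000);  /* prints 1000000000 */
    return 0;
}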