I was looking at some of the solutions in Google Code Jam and some people used things that I had never seen before. For example,
2LL*r+1LL
What do 2LL and 1LL mean?
Their includes look like this:
#include <math.h>
#include <algorithm>
#define _USE_MATH_DEFINES
or
#include <cmath>
The LL suffix makes the integer literal of type long long. So 2LL is a 2 of type long long. Without the LL, the literal would only be of type int.
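As a minimal sketch (assuming a C++11 compiler for static_assert), you can verify the literal types at compile time:

#include <type_traits>

int main() {
    // Without a suffix, an integer literal that fits is of type int.
    static_assert(std::is_same<decltype(2), int>::value, "2 is int");
    // The LL suffix makes the literal long long.
    static_assert(std::is_same<decltype(2LL), long long>::value, "2LL is long long");
}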
This matters when you're doing stuff like this:
1 << 40
1LL << 40
With just the literal 1 (assuming int is 32 bits), you shift beyond the width of the integer type, which is undefined behavior. With 1LL, the operand is long long beforehand, so the shift is well-defined and properly evaluates to 2^40.