I've seen this code several times:
long lastTime = System.nanoTime();
final double ticks = 60D;
double ns = 1000000000 / ticks;
double delta = 0;
The code above takes the system time and stores it in lastTime. The 60 ticks should equate to the number of updates per second.
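Working that out with concrete numbers (my own arithmetic, not part of the original snippet), ns is the length of one tick expressed in nanoseconds:

    // 1 second = 1,000,000,000 ns, so at 60 ticks per second:
    double ns = 1000000000 / 60D; // = 16,666,666.67 ns, about 16.67 ms per tick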
while(running){
    long now = System.nanoTime();
    delta += (now - lastTime) / ns;
    lastTime = now;
    if(delta >= 1){
        tick();
        delta--;
    }
}
It takes now and subtracts lastTime, then divides the difference by ns (one sixtieth of a second in nanoseconds). Is there some guarantee that accumulating (now - lastTime) / ns into delta will push delta past 1 around 60 times per second? I can't understand why tick() will run around 60 times per second. From my calculation, delta only increases by about 0.0025 each time the loop runs.
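To sanity-check my reading, here is a minimal standalone version I put together. The TickRateDemo class name, the tickCount counter, and the once-per-second reporting timer are all my own additions, there only to count calls; the loop itself is copied unchanged. If the pattern works the way it's supposed to, this should print a number near 60 once per second:

    public class TickRateDemo {
        public static void main(String[] args) {
            long lastTime = System.nanoTime();
            final double ticks = 60D;
            double ns = 1000000000 / ticks;  // nanoseconds per tick
            double delta = 0;

            int tickCount = 0;                        // my addition: counts tick() calls
            long timer = System.currentTimeMillis();  // my addition: 1-second report timer

            while (true) {
                long now = System.nanoTime();
                delta += (now - lastTime) / ns;  // elapsed time, measured in tick-lengths
                lastTime = now;

                if (delta >= 1) {
                    tickCount++;  // stands in for tick()
                    delta--;
                }

                // once per second, report how many ticks actually ran
                if (System.currentTimeMillis() - timer >= 1000) {
                    timer += 1000;
                    System.out.println("ticks this second: " + tickCount);
                    tickCount = 0;
                }
            }
        }
    }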
I've commented the code a bit to show you more clearly what is happening.
//Get the current system time in nanoseconds
long lastTime = System.nanoTime();
//Specify how many ticks (updates) should run per second;
//stored as a double so the division below isn't integer division,
//and final so it can't be changed
final double ticks = 60D;
//Compute how many nanoseconds one tick should take
//(1,000,000,000 ns = 1 second, divided across 60 ticks)
double ns = 1000000000 / ticks;
double delta = 0;
while(running){
    //Update the current time
    long now = System.nanoTime();
    //Accumulate the time elapsed since the last iteration,
    //measured in tick-lengths
    delta += (now - lastTime) / ns;
    //Update the last known time
    lastTime = now;
    //If at least one full tick's worth of time has elapsed...
    if(delta >= 1){
        //...run one tick
        tick();
        //...and remove one tick's worth of time from the accumulator
        delta--;
    }
}
Now I'm pretty sure that's what this does, but I can't say for certain without seeing what tick() is.
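One way I tried to convince myself, working through my own 0.0025 figure (so this is only as good as that measurement): delta counts elapsed time in units of tick-lengths, so

    0.0025 tick-lengths per iteration × 16,666,667 ns per tick ≈ 41,667 ns per loop iteration
    delta reaches 1 after 1 / 0.0025 = 400 iterations
    400 iterations × 41,667 ns ≈ 16,666,800 ns ≈ 1/60 of a second per tick

which would suggest tick() still ends up firing about 60 times per second no matter how fast the outer loop spins, but I'd appreciate confirmation that this is the intent.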