I want to know more about Arduino Nano timers. How are delay() and delayMicroseconds() implemented...
The best way to think about the Arduino Nano timers is to think about the timers in the underlying chip: the ATmega328. It has three timers:

- timer0: an 8-bit timer, used by the Arduino core for delay(), millis() and micros()
- timer1: a 16-bit timer, used by the Servo library
- timer2: an 8-bit timer, used by tone()
All of these timers can produce two kinds of interrupts:

- An "overflow" interrupt, which fires when the counter wraps around to zero after reaching its maximum value.
- A "compare match" interrupt, which fires when the counter reaches a preset value stored in an output-compare register.
Unfortunately, there is no Arduino function to attach interrupts to timers. To use timer interrupts you will need to write slightly more low-level code. Basically, you will need to declare an interrupt routine something like this:
ISR(TIMER1_OVF_vect) {
...
}
This will declare a function to service the timer1 overflow interrupt. Then you will need to enable the timer overflow interrupt using the TIMSK1 register. In the above example this might look like this:
TIMSK1 |= (1<<TOIE1);
or
TIMSK1 |= _BV(TOIE1);
This sets the TOIE1 (generate timer1 overflow interrupt, please) flag in the TIMSK1 register. Assuming that your interrupts are enabled, your ISR(TIMER1_OVF_vect) will get called every time timer1 overflows.
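To make this concrete, here is a minimal sketch that counts timer1 overflows. It assumes a 16 MHz Nano with the standard Arduino core; the prescaler choice is just for illustration:

#include <avr/interrupt.h>

volatile unsigned long overflowCount = 0;

ISR(TIMER1_OVF_vect) {
    overflowCount++;           // runs each time timer1 wraps from 0xFFFF to 0
}

void setup() {
    Serial.begin(9600);
    noInterrupts();            // don't let a half-configured timer fire
    TCCR1A = 0;                // normal mode: count from 0 up to 0xFFFF
    TCCR1B = (1 << CS11);      // prescaler 8: 16 MHz / 8 = 2 MHz timer clock
    TCNT1 = 0;                 // start counting from zero
    TIMSK1 |= (1 << TOIE1);    // enable the timer1 overflow interrupt
    interrupts();
}

void loop() {
    // At 2 MHz the 16-bit counter overflows every 65536 / 2000000 s ≈ 32.8 ms.
    noInterrupts();
    unsigned long count = overflowCount;   // atomic copy of the 4-byte counter
    interrupts();
    Serial.println(count);
    delay(1000);
}

If you need an interrupt at an exact rate instead, the compare-match interrupt is enabled the same way: set OCIE1A in TIMSK1, put the compare value in OCR1A, and declare ISR(TIMER1_COMPA_vect).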
The Arduino delay() function looks as follows in the source code (wiring.c):
void delay(unsigned long ms)
{
    uint16_t start = (uint16_t)micros();

    while (ms > 0) {
        if (((uint16_t)micros() - start) >= 1000) {
            ms--;
            start += 1000;
        }
    }
}
So internally it uses the micros() function, which relies on the timer0 count. The Arduino framework uses timer0 to count milliseconds; indeed, the timer0 count is where the millis() function gets its value.
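Note the subtraction trick: because unsigned arithmetic is modular, (uint16_t)micros() - start yields the correct elapsed time even when micros() wraps around. The same idiom is the usual way to write non-blocking delays in a sketch; a small illustrative example (the 500 ms period is arbitrary):

unsigned long start = 0;

void setup() {
    pinMode(LED_BUILTIN, OUTPUT);
}

void loop() {
    // millis() wraps after about 49.7 days, but the unsigned subtraction
    // still gives the correct elapsed time across the wrap.
    if (millis() - start >= 500) {
        start += 500;          // advance by the period to avoid drift
        digitalWrite(LED_BUILTIN, !digitalRead(LED_BUILTIN));
    }
}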
The delayMicroseconds() function, on the other hand, uses certain well-timed processor operations to create the delay. Which instructions are used depends on the processor and the clock speed; the most common is nop() (no operation), which takes exactly one clock cycle. The Arduino Nano uses a 16 MHz clock, and here's what the source code looks like for that case:
// For a one-microsecond delay, simply return. The overhead
// of the function call yields a delay of approximately 1 1/8 µs.
if (--us == 0)
return;
// The following loop takes a quarter of a microsecond (4 cycles)
// per iteration, so execute it four times for each microsecond of
// delay requested.
us <<= 2;
// Account for the time taken in the preceding commands.
us -= 2;
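To check the arithmetic: a call of delayMicroseconds(10) leaves us at 9 after the initial decrement, 36 after the shift, and 34 after the correction. The busy-wait loop that follows (not shown in this excerpt) then runs 34 iterations of 4 cycles each, i.e. 136 cycles or 8.5 µs at 16 MHz; the call overhead and the setup instructions above account for roughly the remaining 1.5 µs.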
What we learn from this: delayMicroseconds() does not use any timer at all. It is a pure busy-wait that counts CPU cycles, so its accuracy depends directly on the clock frequency, and any interrupt that fires during the wait will stretch the delay.