This is an example to illustrate my question which involves some much more complicated code that I can't post here.
#include <stdio.h>

int main()
{
    int a = 0;
    for (int i = 0; i < 3; i++)
    {
        printf("Hello\n");
        a = a + 1000000000;
    }
}
On my platform this program contains undefined behavior, because a overflows on the third iteration of the loop (int is 32 bits here, so the third addition would produce 3000000000, which exceeds INT_MAX).
Does that make the whole program have undefined behavior, or is the behavior undefined only once the overflow actually happens? Could the compiler work out in advance that a will overflow, declare the whole loop undefined, and not bother to run the printfs, even though they all happen before the overflow?
(Tagged both C and C++ even though they are different languages, because I'd be interested in the answer for each if they differ.)
If you're interested in a purely theoretical answer: the C++ standard allows undefined behaviour to "time travel". Quoting
[intro.execution]/5:
A conforming implementation executing a well-formed program shall produce the same observable behavior as one of the possible executions of the corresponding instance of the abstract machine with the same program and the same input. However, if any such execution contains an undefined operation, this International Standard places no requirement on the implementation executing that program with that input (not even with regard to operations preceding the first undefined operation).
As such, if your program contains undefined behaviour, then the behaviour of your whole program is undefined.