Some time ago, I noted that undefined behavior can result in time travel. Specifically, that once undefined behavior occurs, the undefined behavior extends to the entire program, even the parts that executed before the undefined behavior occurred.
The citation I made was from the C++ standard, which grants blanket time travel permission. The C standard does impose some limits on how far back in time undefined behavior can extend. Specifically, undefined behavior cannot alter observable behavior that has already occurred. The definition of observable behavior appears in section 5.1.2.4.
The least requirements on a conforming implementation are:
- Volatile accesses to objects are evaluated strictly according to the rules of the abstract machine.
- At program termination, all data written into files shall be identical to the result that execution of the program according to the abstract semantics would have produced.
- The input and output dynamics of interactive devices shall take place as specified in 7.23.3. The intent of these requirements is that unbuffered or line-buffered output appear as soon as possible, to ensure that prompting messages appear prior to a program waiting for input.
We can ignore the “At program termination” clause. It says that the behavior must match the abstract semantics, but to reach program termination, the program has already proceeded past the undefined behavior, and undefined behavior specifies no abstract semantics. Anything is possible, so that clause imposes no constraints.
That leaves the other two.
Volatile accesses prior to the undefined behavior remain valid, so you cannot time-travel away a volatile access.
And interactive device activity is also not subject to time travel. So anything you printed to the console or read from the console cannot be recalled. This is already true in a metaphysical sense: The output was already printed to the screen. You can’t go back and un-print it. (Mind you, the undefined behavior is permitted to erase the previous output from the screen.)
But everything else is still in play. You can time-travel away normal (non-volatile) memory accesses and non-interactive file access.
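As a concrete sketch (a hypothetical program with made-up names, assuming stdout is line-buffered to an interactive device): the volatile store and the printed line below must survive the undefined behavior, but the ordinary store to `scratch` is fair game.

```c
#include <stdio.h>

volatile int device_register;   /* volatile object: accesses are observable behavior */
int scratch;                    /* ordinary object: stores to it are not observable */

int main(void)
{
    device_register = 1;  /* volatile access: cannot be time-traveled away */
    scratch = 2;          /* non-volatile store: may be reordered or removed */
    printf("about to dereference\n");  /* interactive output: cannot be recalled */

    int *p = NULL;
    return *p;            /* undefined behavior from here on */
}
```

A compiler is entitled to delete the store to `scratch` (it is not observable behavior), but the volatile store and the already-printed line are beyond the reach of the time machine.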
I never understood this kind of undefined behaviour. I mean, I understand that if you dereference an uninitialized pointer, then you can get an access violation, a garbage value, or even (by chance) a valid value. It would be even better if the compiler warned you about it, though (at least you have valgrind).
But this undefined behaviour you get when you set the compiler to maximum optimization and have a bug in your...
It's not about utility to the person. It's about utility to the optimizer.
The "time travel" happens when the optimizer moves parts of your program around, and winds up moving something around an instance of undefined behavior. You may see the effect of something before the undefined behavior happens that is in your code after the offending instruction, because the optimizer moved(or removed) it.
The optimizer is allowed to do whatever it wants with code that...
Because it can't always tell if there's undefined behavior. Sometimes it's only at runtime that it becomes undefined.
Like this code from the time travel example:
```c
int value_or_fallback(int *p)
{
    /* dereference p only when it is non-null; otherwise use a fallback */
    return p ? *p : 42;
}
```
Is this undefined behavior? No. You check if it's null, and only use the value if it's not null.
And then you add a printf for debugging purposes.
```c
#include <stdio.h>

int value_or_fallback(int *p)
{
    /* the debugging printf dereferences p unconditionally */
    printf("The value of *p is %d\n", *p);
    return p ? *p : 42;
}
```
Is THAT undefined?
Well, if you pass a NULL, then yes, it's undefined behavior.
But if you never pass a NULL, then no, it's fully well-defined.
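And here is where the optimizer comes in. A plausible transformation (a sketch; no particular compiler is guaranteed to emit exactly this): since `*p` is now read on every path through the function, the compiler may assume `p` is never null and delete the check along with the fallback.

```c
#include <stdio.h>

int value_or_fallback(int *p)
{
    /* the printf dereferences p unconditionally, so the compiler
       may assume p != NULL and drop the null check and the 42 fallback */
    printf("The value of *p is %d\n", *p);
    return *p;
}
```

If a caller does pass NULL, the dereference in the printf is the undefined behavior, and the null check you wrote has been time-traveled away.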
What...
Sure. My point is: why “allow” this undefined behaviour in the first place? Wouldn’t it be better to stop the compilation with an error? The outcome of all of this is that above -O2, anything can happen. I don’t miss these quirks of C++.