> Perhaps I should have phrased it as "all implementation-defined behaviour is whatever the hardware happens to do when executing whatever code the compiler happens to generate".
Even with this definition, the important part is that compilers would no longer be able to ignore control flow paths that invoke undefined behavior. Signed integer overflow/null pointer dereference/etc. may be documented to produce arbitrary results, and that documentation may be so vague as to be useless, but those overflow/null pointer checks are staying put.
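To make that concrete, here's a sketch of the kind of check in question (the struct and function names are purely illustrative):

```c
#include <stddef.h>

struct widget { int count; };

/* Hypothetical example: the null check appears only *after* the dereference. */
int read_count(struct widget *w) {
    int value = w->count;   /* dereferencing a null w is UB under current rules */
    if (w == NULL)          /* ...so a compiler may delete this check: on any
                               path reaching it, w "cannot" be null */
        return -1;
    return value;
}
```

Today a compiler is entitled to drop the `if` entirely. If the dereference instead produced "whatever the hardware does", that result may be useless, but the check itself has to stay.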
Err, that's not a definition; that's an example of pathologically useless 'documentation' that a perverse implementation might provide if it were allowed to 'define' implementation-defined behaviour by deferring to the hardware. Deferring to the hardware is what undefined behaviour is; the point of implementation-defined behaviour is to be less vague than that.
> may be documented to produce arbitrary results, and that documentation may be so vague as to be useless, but those overflow/null pointer checks are staying put. [emphasis added]
Yes, exactly; that is what undefined behaviour is. That is what "the standard imposes no requirements" means.
> Deferring to the hardware is what undefined behaviour is
If that were the case, the Standard would say so. The entire reason people argue over this in the first place is because the Standard's definition of undefined behavior allows for multiple interpretations.
In any case, you're still missing the point. It doesn't matter how good or bad the documentation of implementation-defined behavior may or may not be; the important part is that compilers cannot optimize under the assumption that control flow paths containing implementation-defined behavior are never reached. Null-pointer checks, overflow checks, etc. would remain in place.
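As a sketch (the function name is just illustrative), consider a hand-rolled overflow guard written in terms of the signed result:

```c
#include <limits.h>

int add_clamped(int a, int b) {
    int sum = a + b;                 /* signed overflow is UB under current rules */
    if (a > 0 && b > 0 && sum < 0)   /* ...so a compiler may fold this test to
                                        false and drop the saturation path */
        return INT_MAX;
    return sum;
}
```

Make the overflow implementation-defined (e.g. "wraps however the hardware wraps") and the compiler can no longer reason the branch away.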
> Yes, exactly; that is what undefined behaviour is. That is what "the standard imposes no requirements" means.
I think you're mixing standardese-undefined-behavior with colloquial-undefined-behavior here. For example, if reading an uninitialized variable were implementation-defined behavior, and an implementation documented the result as "whatever the hardware returns", you're going to get some arbitrary value, but your program is still going to be well-defined in the eyes of the Standard.
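Under that hypothetical (to be clear, not what the Standard says today), something like this prints an arbitrary number, nothing more:

```c
#include <stdio.h>

int main(void) {
    int x;              /* deliberately left uninitialized */
    printf("%d\n", x);  /* hypothetically implementation-defined: prints *some*
                           int, but the program as a whole stays well-defined */
    return 0;
}
```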
When I said implementation-defined, I meant implementation-defined. This is because the applicability of UB-based optimization to implementation-defined behavior - namely, the lack thereof - is wholly uncontroversial. Thus, the diversion into the quality of an implementation's documentation is not directly relevant here; the mere act of changing something from undefined behavior to implementation-defined behavior neatly renders irrelevant any argument about whether any particular UB-based optimization is valid.
> Compilers cannot assume that, because (in the general case) it is not true.
This is not necessarily true. For example, consider the semantics of the restrict keyword. The guarantees promised by a restrict-qualified pointer aren't true in the general case, but preventing optimizations because of that rather defeats the entire purpose of restrict-qualifying a pointer in the first place.
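Roughly (a sketch; nothing here matters beyond the qualifier itself):

```c
/* With restrict, the compiler may assume dst and src never alias, so it can
 * keep src[i] in a register or vectorize the loop without re-loading after
 * each store to dst. That assumption is not true of pointers in general;
 * the programmer is promising it. */
void scale_add(float *restrict dst, const float *restrict src, int n) {
    for (int i = 0; i < n; i++)
        dst[i] += 2.0f * src[i];
}
```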
More generally, the entire discussion about UB-based optimizations exists precisely because the Standard permits a reading such that compilers can make optimizations that don't hold true in the general case, precisely because the Standard imposes no requirements on programs that violate those assumptions.
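The canonical small example (a sketch):

```c
/* Under the "imposes no requirements" reading, a compiler may fold this
 * comparison to 1: if x + 1 overflows, the behaviour is undefined, so the only
 * executions the compiler must honour are the non-overflowing ones, where
 * x + 1 > x always holds. With wrapping semantics (x == INT_MAX) the result
 * would be 0, so the transformation is not valid "in general". */
int always_true(int x) {
    return x + 1 > x;
}
```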