Pretty much no property of a C program, not even the most trivial one, can be relied upon without assuming no-UB. A compiler can't even assume that a variable keeps its value from one statement to the next, since it could be changed asynchronously by a signal handler or another thread.
Exactly - so the problem is perhaps best thought of from a different perspective: not that the compiler only considers defined behavior when rewriting code (what else would it do?), but that certain behavior could have a definition, yet doesn't.
It's a lot easier to reason about code, for instance, when the domain of signed integer addition is all pairs of integers, not just a subset of them.
Ideally, buffer overflows would also be defined - but without lifetime analysis à la Rust, or runtime costs, that's going to be hard. Then again, given how many stack-guarding techniques already exist, perhaps we're closer to this than I think?