Hacker News

> Wouldn’t all sequence points executed before undefined behavior is encountered be required to occur as if the undefined behavior wasn’t there? It would seem so:

No. Code optimization is a series of logic proofs. It is like playing Minesweeper. If a revealed square shows a count of 1 and one neighboring mine has already been identified, then you know all 7 other neighboring squares are safe. In other Minesweeper situations you build a much more complex proof that lets you clear squares many steps away from a revealed mine. If you make a false assumption about where a mine is, via a faulty proof, then you explode.

The compiler is exactly like that. "If there is only one possible code path through this function, then I can assume the range of inputs to this function, then I can assume which function generated those inputs..."

You can see how the compiler's optimization proof goes "back in time" proving further facts about the program's valid behavior.

If the only valid array indexes are 0 and 1, then the only valid values used to compute those indexes are the values that produce 0 or 1.

This isn't even program execution. In many cases the code is collapsed into precomputed results, which is why code benchmarking is complicated and not for beginners. Many naive benchmark programs collapse 500 lines of code and loops into "xor eax, eax; ret". A series of putchar, printf, and puts calls can be reduced to a single fwrite, and a malloc/free pair can be replaced with an implicit stack allocation, because all Standard Library functions are known and defined and there is no need to actually call them as written.


