
Legitimate bugs in hardware are probably out of scope for compilers. That would be an unreasonable ask, yes. But I believe it should be the job of the compiler to make sure that a correctly executed program on a reference machine strictly adhering to the architecture specs does not result in UB.

I'm not saying that a computer architecture should be UB-free. That would be awesome if it could be done, but in practice it's probably a bridge too far. But a compiler should map high-level directives into low-level implementations on a specific architecture using constructs that do not result in UB. That is not too much to ask.

A compiler can't reasonably protect you from rowhammer attacks. But it should guarantee that, barring hardware errors, accessing uninitialized memory does nothing worse than something sensible: returning unspecified contents, raising a memory access exception, or the like. The behavior should be defined up front, even if some of the runtime values are unpredictable.
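
A minimal sketch (my own illustration, not from the thread) of the kind of surprise this is about. Actual results depend on the compiler and optimization level, so treat this as illustrative:

  #include <stdio.h>

  int main(void) {
      int x;  /* never initialized */
      /* Reading x here is UB in C. Under the rule proposed above, x would
         simply hold some unspecified value, so exactly one of the two lines
         would print. Under UB, an optimizer may treat each read of x as an
         independent unspecified value, so both lines, or neither, could
         print. */
      if (x == 0) puts("x is zero");
      if (x != 0) puts("x is nonzero");
      return 0;
  }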

As a more concrete example, most languages these days clearly define what happens on signed integer overflow: the thing you would expect on any two's complement machine, e.g. (char)127 + (char)1 == -128. C treats this as undefined behavior, and as mentioned in my link above, that can cause what should be a finite loop (with or without overflow) to compile into an infinite loop. This "optimization" step by the compiler should never have happened. C resists changing this because C compilers exist for non-two's-complement architectures where the behavior would differ. IMHO the correct approach would be to require THOSE weird, esoteric architectures to compile in extra overflow checks (possibly disabled with opt-in compiler flags that explicitly violate the standard), rather than burden every C developer everywhere with the mess that is signed arithmetic overflow UB.
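
Here's a rough sketch of the loop case (hypothetical code, not the exact example from the link; whether the transformation happens depends on the compiler and flags):

  #include <stdio.h>

  int main(void) {
      int count = 0;
      /* If signed overflow simply wrapped (two's complement), i would
         eventually become negative and the loop would terminate after
         31 iterations. Because signed overflow is UB in C, an optimizer
         is allowed to assume i > 0 always holds and may emit an
         infinite loop instead. */
      for (int i = 1; i > 0; i += i)
          count++;
      printf("%d\n", count);
      return 0;
  }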

It's a matter of expectations. Any but the most junior programmers expect that signed integer overflow will not be portable. That's fine. But they expect a sensible result (e.g. wrap around on two's complement machines), even if it is non-portable. They don't expect the compiler to silently and sneakily change the logic of their program to something fundamentally different, because UB means the compiler can do whatever tf it wants.



> Legitimate bugs in hardware are probably out of scope for compilers.

But that's exactly the point. The "bug" of RowHammer was that it occurred slightly on the "allowed" side of the envelope, at acceptably-low refresh rates. The "UB" of RowHammer and a hundred other observable effects is that, on the "disallowed" side of the envelope, the behavior is undefined. The system designer gets to choose at what probability they are on each side of the envelope, and the trade-offs are very much optimization opportunities.

Writing software in C that may exhibit undefined behavior is exactly this -- it's choosing, as a software engineer, to be on the far side of the specification envelope. In exchange, you get access to several powerful optimizations, some at the compiler level and some at the career level (if you consider not needing to learn to use your language properly a time optimization, at least).



