As a kind of extreme example, I've gone off and duplicated a whole computing stack because I think C is the wrong abstraction. For example, the way signed and unsigned numbers are defined in the C standard really over-complicates simple programs. We often don't care about portability in these days of instruction set monoculture.
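A small sketch of what I mean (plain C, illustrative only, nothing to do with Mu): the usual arithmetic conversions silently turn -1 into a huge unsigned value, so an innocent-looking comparison goes the wrong way.

    #include <stdio.h>

    int main(void) {
        int      a = -1;
        unsigned b = 1;
        /* The usual arithmetic conversions convert a to unsigned,
         * so -1 becomes UINT_MAX and the comparison takes the
         * "a >= b" branch. */
        if (a < b)
            printf("a < b\n");
        else
            printf("a >= b\n");   /* this is what actually prints */
        return 0;
    }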
Presumably something like the representation of signed numbers being implementation-defined (instead of two's complement, as is virtually always the case nowadays).
Yeah. The standard avoids obvious guarantees because of some computer you've never heard of. And then compiler writers use the standard as license to mess with the guarantee on _your_ computer.
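To make that concrete (plain C again, and the exact behavior depends on the compiler and optimization level): signed overflow is undefined, so an optimizer is allowed to assume it never happens and quietly delete an after-the-fact wraparound check, even though the two's-complement hardware underneath would wrap just fine.

    #include <stdio.h>
    #include <limits.h>

    void report(int x) {
        int y = x + 1;
        /* On two's-complement hardware, x == INT_MAX wraps y to INT_MIN.
         * But signed overflow is undefined in C, so an optimizing compiler
         * may assume y > x always holds and remove this branch entirely. */
        if (y < x)
            printf("wrapped\n");
        else
            printf("did not wrap\n");
    }

    int main(void) {
        report(INT_MAX);   /* typically "wrapped" at -O0, "did not wrap" at -O2 */
        return 0;
    }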
Here's how I render the silhouette of the Mandelbrot set using fixed-point math on my computer. Each statement translates to a single x86 instruction. To detect overflow in a computation, I don't perform extra computation; I just check the processor's overflow flag, which C "abstracts" away from me.
http://akkartik.github.io/mu/html/mandelbrot-fixed.mu.html
Main project page: https://github.com/akkartik/mu
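For comparison, a sketch in C (not Mu, and the helper names are mine) of the two styles of overflow check: the strictly portable one has to do extra arithmetic before the add because signed overflow is undefined, while the GCC/Clang __builtin_add_overflow extension typically compiles to an add followed by a check of the overflow flag, which is close to what the Mu code above does directly.

    #include <limits.h>
    #include <stdbool.h>

    /* Illustrative helpers, not part of Mu. */

    /* Strictly portable: check before adding, because signed overflow
     * is undefined behavior in C. */
    bool add_checked_portable(int a, int b, int *out) {
        if ((b > 0 && a > INT_MAX - b) || (b < 0 && a < INT_MIN - b))
            return false;              /* would overflow */
        *out = a + b;
        return true;
    }

    /* GCC/Clang extension: one add, then the processor's overflow flag. */
    bool add_checked_builtin(int a, int b, int *out) {
        return !__builtin_add_overflow(a, b, out);
    }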