Incredible idea. Could it be applied to compute-heavy visual applications too? Say you're playing a game at 4K at 120 Hz: what if the game stopped rendering and the display turned off for 100 ms every time you blink, while the game state proceeded as normal?
Well, that assertion leaves a lot to the imagination. If you're running a 1080 Ti, it draws a maximum of 250 W. Worst case, suppose you can cut power to its onboard processors in a way that doesn't require re-initialization when power is restored 100 ms later. Even a 5% improvement would translate to a reduction of 12.5 W on the GPU front alone.
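For what it's worth, here's a minimal back-of-the-envelope sketch of that estimate in Python. The blink rate, the 100 ms window, and the fraction of power you could actually shed during a blink are all assumptions, not measurements:

```python
# Rough estimate of average GPU power saved by pausing rendering during blinks.
# All figures below are assumptions: blink rate and duration vary per person,
# and 250 W is the 1080 Ti's rated board power, not a measured draw.

GPU_POWER_W = 250.0        # GTX 1080 Ti rated board power
BLINKS_PER_MINUTE = 15.0   # typical adult blink rate (assumption)
BLINK_DURATION_S = 0.100   # the 100 ms window from the comment above
POWER_CUT_FRACTION = 1.0   # fraction of GPU power shed per blink (optimistic)

def avg_savings_watts() -> float:
    """Average power saved if rendering halts during every blink."""
    eyes_closed_fraction = BLINKS_PER_MINUTE * BLINK_DURATION_S / 60.0
    return GPU_POWER_W * eyes_closed_fraction * POWER_CUT_FRACTION

if __name__ == "__main__":
    eyes_closed_fraction = BLINKS_PER_MINUTE * BLINK_DURATION_S / 60.0
    print(f"Eyes closed {eyes_closed_fraction:.1%} of the time")
    print(f"Average GPU power saved: {avg_savings_watts():.1f} W")
```

Under those assumptions your eyes are closed about 2.5% of the time, so even a perfect power cut during blinks saves only ~6 W on average; the 5%/12.5 W figure above is the more optimistic ceiling.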