
Heh. Does anyone remember when, almost 25 years ago, ATI (AMD) was caught manipulating the Quake III benchmarks by renaming the executable to ‘quack’?

https://web.archive.org/web/20230929180112/https://techrepor...

https://web.archive.org/web/20011108190056/https://hardocp.c...

https://web.archive.org/web/20011118183932/www.3dcenter.de/a...



Just in case anyone else parsed that sentence the same way I did: ATI's driver detected "quake" as the executable name and changed things like texture quality to increase benchmark performance. People discovered this when they renamed the executable to "quack" and the image quality improved while the benchmark numbers dropped, proving that the ATI drivers "optimised" by reducing quality.

ATI did not rename quake to quack, as I originally thought from that sentence! :)


The story was that they used a lower mipmap level (blurrier textures) when the process was named Quake, but used the normal mipmap level (standard textures) when the process was named Quack.
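Roughly the kind of check people inferred was happening, as a purely illustrative sketch: the struct, field and function names below are invented, since the real driver internals were never published.

    /* Illustrative sketch only, not actual ATI driver code.
     * Idea: key off the executable name and, for the benchmarked game,
     * bias mipmap selection toward smaller, blurrier (cheaper) levels. */
    #include <string.h>
    #include <strings.h>

    struct render_state {
        float mip_lod_bias;   /* > 0 selects blurrier mip levels */
    };

    static void apply_app_profile(struct render_state *rs, const char *exe)
    {
        if (strcasecmp(exe, "quake3.exe") == 0)
            rs->mip_lod_bias = 1.5f;   /* degrade textures, gain fps */
        else                           /* e.g. renamed to "quack3.exe" */
            rs->mip_lod_bias = 0.0f;   /* normal quality */
    }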


Thank you for explaining. I was so confused at how AMD was improving Quake performance with duck-like monikers.


Well, if it _looks_ like a high-performance texture renderer, and it _walks_ like a high-performance texture renderer...


It's probably been duck typed


shocked quack


If it looks like a benchmark and it quacks like a benchmark… duck?


So the additional performance came with a large bill?


Or ICC emitting code that checks for "GenuineIntel" at runtime and takes a slower path on other vendors' CPUs: https://en.wikipedia.org/wiki/Intel_C%2B%2B_Compiler#Support...
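The check itself is just the CPUID vendor string. A minimal sketch of reading it with GCC/Clang's <cpuid.h>; the dispatcher ICC actually emits is of course more involved:

    /* Read the CPU vendor string via CPUID leaf 0 and see if it says
     * "GenuineIntel", the string the ICC runtime dispatcher keyed on. */
    #include <cpuid.h>
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        unsigned int eax, ebx, ecx, edx;
        char vendor[13];

        if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
            return 1;

        /* vendor string is packed into EBX, EDX, ECX, in that order */
        memcpy(vendor + 0, &ebx, 4);
        memcpy(vendor + 4, &edx, 4);
        memcpy(vendor + 8, &ecx, 4);
        vendor[12] = '\0';

        if (strcmp(vendor, "GenuineIntel") == 0)
            puts("fast code path");
        else
            puts("generic (slower) code path");
        return 0;
    }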


Or Win 3.1 looking for whatever shibboleth was in MS-DOS and popping up a scary-looking message if it found another DOS? https://en.wikipedia.org/wiki/AARD_code


I don’t think anybody remembers this since that code never shipped in retail.


It was only disabled in the final retail version after the tech press of the day exposed what Microsoft had done.


It did ship in the final retail version, in a way. It was disabled, but the code was still there, and a flag was all that was needed to enable it.


Every vendor does this to this day, and it's a morally grey practice: drivers hijack and modify the rendering loops of popular games, fixing bugs, replacing shaders with more optimized versions, enabling faster codepaths in the driver, etc.

These changes are supposed to have minimal to no impact on the actual output, but sometimes vendors are really aggressive, and significantly degrade the outputs so that the game can run faster on their hardware.


Sadly it's built into the Vulkan API [1]. Even a fully userspace driver arrangement with a microkernel ends up giving the driver access to the client's identity. Of course the field is forgeable as specified, so you could opt out if you really wanted to.

[1]: https://github.com/KhronosGroup/Vulkan-Headers/blob/main/inc...
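These are the fields in question: an application hands the driver its own name and engine name at instance creation, and nothing stops it from lying. A minimal sketch (the wrapper function and the string values are mine, just for illustration):

    /* Sketch: the application/engine name strings a Vulkan driver sees
     * at instance creation. A driver can key per-app behavior off these,
     * and an app can just as easily forge them. */
    #include <vulkan/vulkan.h>

    VkInstance create_instance(void)
    {
        VkApplicationInfo app_info = {
            .sType              = VK_STRUCTURE_TYPE_APPLICATION_INFO,
            .pApplicationName   = "quack3",      /* or whatever you like */
            .applicationVersion = VK_MAKE_VERSION(1, 0, 0),
            .pEngineName        = "id Tech 3",   /* also self-reported */
            .engineVersion      = VK_MAKE_VERSION(1, 0, 0),
            .apiVersion         = VK_API_VERSION_1_0,
        };

        VkInstanceCreateInfo create_info = {
            .sType            = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
            .pApplicationInfo = &app_info,
        };

        VkInstance instance = VK_NULL_HANDLE;
        vkCreateInstance(&create_info, NULL, &instance);  /* check result in real code */
        return instance;
    }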


I mean, Khronos put that in for a reason. If drivers didn't get explicit information about the application being run, they would fall back to silly heuristics like the Quake 3 executable check to squeeze out performance.


> but sometimes vendors are really aggressive, and significantly degrade the outputs so that the game can run faster on their hardware.

Do you have a source for this? I’d like to see some examples


Nvidia ships a control panel with its drivers. Open it up -> Manage 3D settings -> Program Settings. Scroll through and see how every single program/game you have installed openly has different defaults in it based on application name. As someone noted above, others do the same thing.

E.g. Frostpunk has antialiasing for transparency layers turned on; Slay the Spire does not. I never set these settings. Nvidia literally does a lookup on first run for what they judge as the best defaults and sets these accordingly.

Every single game/program you install has different options from a huge list of possible optimizations.


Applying different standard settings is pretty different from "hijacking and modifying the rendering loop", though.


In what sense? The render loop is modified from "the" default without user or program opt-in, and "hijacking" is what it would be called if anyone but Nvidia did it, so Nvidia isn't exempt from the term. Then again: runtime patch, haxie, hijack, LD_PRELOAD, system extension... the noun changes every few years, so perhaps it's time for a new one. Override?
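For anyone unfamiliar, an LD_PRELOAD shim is the simplest userland flavor of this. A hedged sketch of intercepting one GL call; the choice of glTexImage2D is just for illustration, and real drivers do the equivalent inside their own code rather than via preloading:

    /* Minimal LD_PRELOAD-style interposer: wrap glTexImage2D, do whatever
     * "optimization" you like, then call the real function.
     * Build: gcc -shared -fPIC shim.c -o shim.so -ldl
     * Run:   LD_PRELOAD=./shim.so ./game */
    #define _GNU_SOURCE
    #include <dlfcn.h>
    #include <GL/gl.h>

    typedef void (*tex_image_2d_fn)(GLenum, GLint, GLint, GLsizei, GLsizei,
                                    GLint, GLenum, GLenum, const void *);

    void glTexImage2D(GLenum target, GLint level, GLint internalformat,
                      GLsizei width, GLsizei height, GLint border,
                      GLenum format, GLenum type, const void *pixels)
    {
        static tex_image_2d_fn real;
        if (!real)
            real = (tex_image_2d_fn)dlsym(RTLD_NEXT, "glTexImage2D");

        /* an aggressive driver might swap internalformat for a cheaper one here */
        real(target, level, internalformat, width, height, border,
             format, type, pixels);
    }

(Calls the game resolves through glXGetProcAddress won't pass through such a shim; directly linked ones will.)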


But the comment I replied to wasn’t talking about runtime patching or any of the other settings you mentioned. It was talking about changing GPU settings for specific programs. Not changing anything about the program itself.


That’s not what @torginus was referring to. There’s nothing wrong with having and exposing application specific settings. There’s nothing wrong with drivers having application specific optimization patches either, but that’s a very different thing.


For more context and deeper discussion on the subject, see https://news.ycombinator.com/item?id=44531107

Funnily enough, it's under an older submission of the same CUTLASS optimizations.


This is weirdly common; phone chipset manufacturers did it with phone benchmarks [0], VW with emissions [1], nVidia did it with 3DMark [2], Intel with the SPEC benchmark for its Xeon processors [3], etc.

When it comes to computer graphics, iirc it's pretty normalized now - graphics drivers all seem to have tweaks, settings, optimizations and workarounds for every game.

(As an aside, I hate that I have to link to archive.org; there are a lot of dead links nowadays, but these are important things to remember.)

[0] https://web.archive.org/web/20250306120819/https://www.anand...

[1] https://en.wikipedia.org/wiki/Volkswagen_emissions_scandal

[2] https://web.archive.org/web/20051218120547/http://techreport...

[3] https://www.servethehome.com/impact-of-intel-compiler-optimi...


> When it comes to computer graphics, iirc it's pretty normalized now - graphics drivers all seem to have tweaks, settings, optimizations and workarounds for every game.

Even Mesa has them: https://gitlab.freedesktop.org/mesa/mesa/-/blob/main/src/uti...
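For reference, entries in that file look roughly like this; the application and executable names below are illustrative, not quoted from the file:

    <driconf>
      <device>
        <application name="Some Game" executable="somegame">
          <option name="mesa_glthread" value="true"/>
        </application>
      </device>
    </driconf>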


> graphics drivers all seem to have tweaks, settings, optimizations and workarounds for every game.

Maybe that's hyperbole, but obviously they can't do this for literally every game; that would require huge personnel resources. At least looking at Mesa (linked elsewhere), only ~200 games are patched, out of what, 100k PC games? So <1%.


Mesa is a lot more conservative about this than the proprietary drivers.


Well, pretty much every large AAA game launch is accompanied by a GPU driver update that adds support for that game. It's in the patch notes.


Goodhart's law: when a measure becomes a target, it ceases to be a good measure.


Are there any archives of that techreport article with images intact?



Omitting AMD feels relevant here for anyone who doesn't know the acquisition history of ATI: AMD had no involvement with this.



