
Desktop GPUs are borrowing ideas from mobile GPU designs, though - most modern desktop GPUs have (partial) tiling support, just like mobile GPUs: https://www.realworldtech.com/tile-based-rasterization-nvidi...

Even though it's not strictly a mobile GPU design, the hybrid architecture is definitely a shift.




Indeed. Nvidia is tile based:

https://www.realworldtech.com/tile-based-rasterization-nvidi...

AMD is tile based:

https://pcper.com/2017/01/amd-vega-gpu-architecture-preview-...

Intel is tile based (section 5.2):

https://www.intel.com/content/dam/develop/external/us/en/doc...

The technique originated with PowerVR in 1996 - today known as Imagination Technologies, which designed the ancestor of Apple's current GPU architecture.
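To make "tile based" concrete, here's a toy sketch (in Python, with made-up names - real GPUs do this in fixed-function hardware) of the binning pass those architectures share: each primitive is sorted into the screen tiles its bounding box touches, and rasterization then runs per tile out of fast on-chip memory.

```python
# Toy illustration of the binning step in a tile-based rasterizer.
# All names and the tile size are illustrative, not any vendor's actual design.

TILE = 16  # tile edge in pixels; real hardware uses various sizes

def bin_triangles(triangles, width, height):
    """Assign each triangle (a list of (x, y) vertices) to every
    screen tile its axis-aligned bounding box overlaps."""
    tiles_x = (width + TILE - 1) // TILE
    tiles_y = (height + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for tri in triangles:
        xs = [v[0] for v in tri]
        ys = [v[1] for v in tri]
        # Clamp the bounding box to the screen.
        x0, x1 = max(min(xs), 0), min(max(xs), width - 1)
        y0, y1 = max(min(ys), 0), min(max(ys), height - 1)
        # Append the triangle to every tile the box overlaps.
        for ty in range(int(y0) // TILE, int(y1) // TILE + 1):
            for tx in range(int(x0) // TILE, int(x1) // TILE + 1):
                bins[(tx, ty)].append(tri)
    return bins
```

The payoff is bandwidth: per-pixel depth/color traffic stays in on-chip tile memory instead of hitting DRAM, which is why the idea took off on mobile first.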


Imagination released the driver and simulator sources for PowerVR Series 1 last year [0], which is a fun resource for seeing how tiling GPUs started. Series 1 is particularly weird because it doesn't use triangles: it uses infinite planes, which may be clipped by other planes - meaning that convex polygons with more sides are faster than the equivalent set of triangles. Sadly there's no RTL, but I'm guessing the simulator is based on it, given some of the weird coding patterns.

[0] - https://github.com/powervr-graphics/PowerVR-Series1
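The "more sides is faster" point falls out of the half-space view of a convex polygon. A rough sketch (Python, my own illustrative code - not the actual Series 1 math): a convex n-gon is the intersection of n edge half-planes, so a coverage test costs n edge evaluations, whereas the same polygon split into n-2 triangles costs 3*(n-2).

```python
# Illustrative half-plane coverage test for a convex polygon.
# A convex n-gon needs n edge evaluations per point; triangulating
# it first would need 3*(n-2). This is a concept sketch only.

def edge(a, b, p):
    # Signed cross product: >= 0 when p lies on or left of edge a->b
    # (counter-clockwise winding).
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def inside_convex(poly, p):
    # One half-plane test per edge of the CCW convex polygon.
    n = len(poly)
    return all(edge(poly[i], poly[(i + 1) % n], p) >= 0 for i in range(n))
```

For a hexagon that's 6 evaluations instead of 12, so representing surfaces as clipped planes rewards polygons with more sides.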


Thanks so much for sharing that - I had no idea something this cool had been made public. Though when you wrote "simulator", I initially thought the Verilog/VHDL source code and the simulator for the HDL design would be there, but that's not it.

Someone correct me if I'm wrong, but I assume it's a simulator for the driver, probably to simulate that there's a PowerVR GPU in the system.

Just looking through the driver source code and seeing all those #ifdef blocks for every possible combination of HW, SW, APIs, etc. scattered everywhere makes my head spin.

Hopefully they had an IDE that could collapse all the unused blocks; it reminds me of one of my embedded gigs where the IDE didn't have such a feature and following the code was brutal.


I haven’t tried to run the simulator, but I believe you should be able to run the HW driver with it - and get the frame buffer output. I imagine it’d take some work to get it working on a modern OS.

It’s the simulator that I suspect is derived from the RTL - hwsabsim.c is a good example…





