
In theory, this should have been possible long ago with WebGL Compute, had Google not given up on it and removed it from Chrome, citing WebGPU as the future and the lack of OpenGL support on Apple platforms (an excuse, because they ended up running WebGL on top of Metal anyway, just as they use DirectX on Windows).


WebGL compute was not viable, and only existed as an engineering prototype with lots of rough edges. There were a bunch of hard problems that needed to get solved to ship WebGPU. Perhaps in an alternate universe that work could have been done in the context of WebGL, but it wasn't.

I'll give one example (as it's one I was personally invested in). Doing a barrier in non-uniform control flow is wildly unsafe undefined behavior (I've had it reboot my computer, and it's easy to believe it could be exploited by malicious actors). To make these barriers safe, WebGPU does a "uniformity analysis." However, that in turn required adding a uniform broadcast intrinsic to the shader language; otherwise a whole class of algorithms would be impossible to express.
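
To make this concrete, here is a minimal sketch of the pattern in question (my own illustration, not code from Chrome or the WebGPU spec; the variable names are made up). The loop bound is read from workgroup memory, so a plain load would not be provably uniform and the barrier inside the loop would be rejected by the analysis; WGSL's workgroupUniformLoad built-in both synchronizes the workgroup and broadcasts the value, which lets the analysis accept the barrier:

    // Sketch only: assumes a browser with WebGPU enabled (and, for
    // TypeScript, the @webgpu/types definitions).
    const wgsl = `
      var<workgroup> bound: u32;

      @compute @workgroup_size(64)
      fn main(@builtin(local_invocation_index) lid: u32) {
        if (lid == 0u) {
          bound = 8u; // in practice this could be data-dependent
        }
        // workgroupUniformLoad is the uniform broadcast intrinsic:
        // it acts as a barrier and guarantees every invocation
        // observes the same value.
        let n = workgroupUniformLoad(&bound);
        for (var i = 0u; i < n; i = i + 1u) {
          workgroupBarrier(); // accepted: n is provably uniform
        }
      }
    `;

    const adapter = await navigator.gpu.requestAdapter();
    const device = await adapter!.requestDevice();
    // Validation (including the uniformity analysis) runs here, so a
    // barrier under non-uniform control flow is reported as an error
    // instead of reaching the GPU as undefined behavior.
    device.createShaderModule({ code: wgsl });

Replace workgroupUniformLoad with a plain read of bound and the module fails validation, even though the loop count is in fact the same for every invocation; that's the class of algorithms that would otherwise be impossible to express.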

As I say, it's plausible this kind of work could have been done as extensions to WebGL, but I think the end result would have been a lot less compelling than what we have now.


The fact is that Intel did the work and it was about to ship in Chrome. As interesting as your explanation is, it wasn't the official reason the Chrome team gave for dropping support for WebGL 2.0 Compute.

Rather, the stated reasons were WebGPU and the lack of compute shader support in Apple's OpenGL.

Which became irrelevant the moment Chrome decided to move WebGL on top of Metal via ANGLE, just as it does with DirectX on Windows.


The official deprecation [1] cites "some technical barriers, including ... [macOS support]". I can't speak for Chrome, but my understanding is that those technical barriers included serious challenges in making it safe, and that is exactly where a significant amount of the engineering in WebGPU went from the beginning.

[1]: https://issues.chromium.org/issues/40150444


The reality is that device-specific render output signatures are already demonstrably unique.

So it is likely impossible to make these GPU APIs anonymous, and thus they can never really be considered "secure".
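
For anyone who wants to see what that means in practice, the classic demonstration is canvas fingerprinting: draw deterministic content, read the pixels back, and hash them. Differences in GPU, driver, OS, and font rasterization produce different digests on different machines. A minimal sketch (my own illustration; the probe string and sizes are arbitrary):

    // Sketch: hash the pixels of a deterministic render.
    const canvas = document.createElement("canvas");
    canvas.width = 200;
    canvas.height = 40;
    const ctx = canvas.getContext("2d")!;
    ctx.textBaseline = "top";
    ctx.font = "16px Arial";
    ctx.fillStyle = "#f60";
    ctx.fillRect(100, 5, 80, 30);
    ctx.fillStyle = "#069";
    ctx.fillText("fingerprint probe", 2, 2); // arbitrary probe string
    const pixels = ctx.getImageData(0, 0, canvas.width, canvas.height).data;
    const digest = await crypto.subtle.digest("SHA-256", pixels);
    const signature = [...new Uint8Array(digest)]
      .map((b) => b.toString(16).padStart(2, "0"))
      .join("");
    // Stable per device, distinct across devices: a render signature.
    console.log(signature);

The same readback trick applies to WebGL (gl.readPixels) and WebGPU output, which is why fully anonymizing these APIs is so hard.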

Have a nice day, =3



