A Ray-Box Intersection Algorithm and Efficient Dynamic Voxel Rendering (jcgt.org)
139 points by edroche on Sept 20, 2018 | 22 comments



From the paper: "We use the rasterizer as a potentially-visible set optimization to iterate only over pixels for which rays might intersect a voxel, and then execute a small ray tracer in the pixel shader. That is, we “splat” billboards that give coarse visibility and compute exact visibility in a pixel shader. This works for any pinhole perspective projection, including eye rays and shadow rays, so we use it for the shadow map rendering pass as well"

Neat - I used a similar technique [1] in the last iteration of Voxel Quest, but I only rasterized single points/pixels and then raytraced them into filled cubes (it was not efficient except in terms of fill rate). However, I did not invent the technique (I'm not sure who did, but I first got the idea from talking to Florian Boesch [2]).

[1] https://twitter.com/gavanw/status/717265068086308865 [2] http://codeflow.org
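
For anyone who wants the gist of the per-pixel step without opening the paper: the core operation is a ray vs. axis-aligned box test run in the pixel shader for each splatted billboard pixel. Below is a minimal C++ sketch of the standard slab-method version of that test - the paper presents a more optimized variant, and the function names and example values here are made up for illustration:

    #include <algorithm>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // Classic slab-method ray/AABB intersection. Returns true on a hit and
    // writes the entry/exit distances along the ray. invDir holds 1/dir per
    // component, so the divisions are hoisted out of the test.
    bool rayBoxIntersect(const Vec3& org, const Vec3& invDir,
                         const Vec3& boxMin, const Vec3& boxMax,
                         float& tNear, float& tFar) {
        float tx0 = (boxMin.x - org.x) * invDir.x, tx1 = (boxMax.x - org.x) * invDir.x;
        float ty0 = (boxMin.y - org.y) * invDir.y, ty1 = (boxMax.y - org.y) * invDir.y;
        float tz0 = (boxMin.z - org.z) * invDir.z, tz1 = (boxMax.z - org.z) * invDir.z;
        tNear = std::max({std::min(tx0, tx1), std::min(ty0, ty1), std::min(tz0, tz1)});
        tFar  = std::min({std::max(tx0, tx1), std::max(ty0, ty1), std::max(tz0, tz1)});
        return tFar >= std::max(tNear, 0.0f);  // slabs overlap and the box is in front of the origin
    }

    int main() {
        Vec3 org{0.0f, 0.0f, -5.0f};
        Vec3 dir{0.1f, 0.1f, 1.0f};  // need not be normalized for the hit test itself
        Vec3 invDir{1.0f / dir.x, 1.0f / dir.y, 1.0f / dir.z};
        float tNear, tFar;
        if (rayBoxIntersect(org, invDir, Vec3{-1, -1, -1}, Vec3{1, 1, 1}, tNear, tFar))
            std::printf("hit: tNear=%f tFar=%f\n", tNear, tFar);
        return 0;
    }

In a real renderer this would run per pixel, with the ray reconstructed from the pixel position and the box taken from the voxel the billboard was splatted for.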


> We use the rasterizer as a potentially-visible set optimization to iterate only over pixels for which rays might intersect a voxel, and then execute a small ray tracer in the pixel shader.

I was just thinking about doing this last weekend (in the context of drawing exact reflections off reflective surfaces defined by arbitrary meshes). Nice to see it actually works, at least for voxels!

The fact that raytracing (onto a 2D plane) and rasterization are essentially isomorphic to one another is a powerful observation.
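
To make that concrete: a rasterized pixel already identifies an eye ray, so you can invert the camera projection per pixel and trace from there. A rough C++ sketch, assuming a pinhole camera looking down -z; the field of view, resolution, and naming are my own conventions, not something from the paper:

    #include <cmath>
    #include <cstdio>

    struct Vec3 { float x, y, z; };

    // Map a pixel center back to a camera-space eye-ray direction. This is just
    // the inverse of the mapping the rasterizer's projection applies when it
    // decides which pixel a camera-space point lands on.
    Vec3 eyeRayDir(int px, int py, int width, int height, float vfovRadians) {
        float tanHalf = std::tan(0.5f * vfovRadians);
        float aspect  = float(width) / float(height);
        float x = (2.0f * (px + 0.5f) / width - 1.0f) * aspect * tanHalf;
        float y = (1.0f - 2.0f * (py + 0.5f) / height) * tanHalf;
        return Vec3{x, y, -1.0f};  // un-normalized; normalize before tracing
    }

    int main() {
        Vec3 d = eyeRayDir(640, 360, 1280, 720, 1.0f);  // near the image center -> nearly straight ahead
        std::printf("%f %f %f\n", d.x, d.y, d.z);
        return 0;
    }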


> The fact that raytracing (onto a 2D plane) and rasterization are essentially isomorphic to one another is a powerful observation.

Hasn't this been a pretty well-known perspective (no pun intended) for a couple of decades now?


Of course, but it's only relatively recently that we've had the graphics APIs to make full use of that observation.


<notices username>

Will this somehow turn into faster CSS and/or SVG render times for Firefox? :)


I really liked the concept of Voxel Quest. Did you stop the project?

Recently I saw two posts related to volume rendering: this and DreamWorks' VDB. I immediately thought about Voxel Quest. I liked that it supports smooth surfaces, unlike Minecraft.


Still working on it actively, although in the past two years I also dabbled with two other side projects (a Node.js / WebGL / Electron side-scroller game and a compiler). Progress is very slow due to real-life constraints, but at least it's steady now. I will be presenting the newest progress soon. :)


Wow, I am really happy to hear that! I have been following Voxel Quest for a few years now, and I was really sad when you stopped working on the project. Looking forward to seeing your new developments!


As an original KS backer, that is awesome news to hear! You produced a wonderful code base and I learned a ton from your blog posts and streams. Any thoughts of streaming/blogging about the progress in the future?


I will definitely blog soon, and may stream soon as well, although it would be more of a coding stream than anything else.


This is more useful than it might seem, because many rendering techniques are implemented using voxel representations behind the scenes, like global illumination (indirect bounced light, so light bounces off a red wall to create a pale red glow on the ground, that sort of thing), shadowing, fluids, etc. The Minecraft and LEGO examples given are direct applications, but I suspect this will improve a lot of other stuff.


There are lots of different GI methods; most of them use spatial acceleration structures such as BVHs and kd-trees after triangulation. Fluids use a grid for discretisation, and behind the scenes the PDEs are solved by means of finite-difference methods, which has nothing to do with ray intersection tests.
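
To illustrate the fluid side of that: a grid-based solver updates each cell from its neighbours rather than intersecting rays with anything. Here is a toy explicit finite-difference step for 1D diffusion; it's purely illustrative, with made-up parameters, and not taken from any particular solver:

    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // One explicit finite-difference step of 1D diffusion:
    //   u_new[i] = u[i] + dt * nu * (u[i-1] - 2*u[i] + u[i+1]) / dx^2
    void diffuseStep(std::vector<float>& u, float nu, float dt, float dx) {
        std::vector<float> next = u;  // boundary cells stay fixed
        for (std::size_t i = 1; i + 1 < u.size(); ++i)
            next[i] = u[i] + dt * nu * (u[i - 1] - 2.0f * u[i] + u[i + 1]) / (dx * dx);
        u = next;
    }

    int main() {
        std::vector<float> u(16, 0.0f);
        u[8] = 1.0f;                           // a single spike of "heat"
        for (int step = 0; step < 10; ++step)
            diffuseStep(u, 0.1f, 0.1f, 1.0f);  // dt*nu/dx^2 = 0.01, comfortably stable
        std::printf("u[8] after 10 steps: %f\n", u[8]);
        return 0;
    }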


This is exactly what we do at Roblox.


Just wanted to say that Roblox is a masterful system. My daughter plays it, and it's an amazing piece of software. Your map editor is great too. My daughter is 8 and she picked it right up.

Curious, how big is the team for the game and editor?


Anyone hiring engineers to implement stuff like this? That would be my dream job.


We do heavy-duty graphics/CUDA; feel free to ping [email protected]. We are currently looking.


Most big tech companies, AR/VR startups, game companies have graphics programmer roles that might do similar things.

Teams that specifically have the need for voxel-ish rendering include Minecraft (Microsoft), Roblox, Medium (Oculus), Media Molecule (Sony).

Outside of games, art tools and consumer apps, voxels are used in many medical and scientific applications.


We're not hiring right now, but in the future we may have similar needs. We're all WebGL; check out https://ayvri.com and drop me a line at pete[at]ayvri.com

We have some solid game dev tech on the team already.


Be sure to check out the video in the "sample code" supplement.


Thanks for the heads-up! Oddly enough, the dynamic scene near the end reminds me of a recent video by Jon Burton [0] where he explains how his game company managed to get an incredibly high number of particle effects on the PlayStation 2. It looks like in both cases, the idea is to make the implementation functional and fully computable on a VPU/GPU, with as little overhead as possible from moving data between RAM, CPU, and GPU.

[0] https://www.youtube.com/watch?v=JK1aV_mzH3A


For people who don't want to hand a random website an RCE vulnerability just to display some HTML, the supplement and actual article are at http://www.jcgt.org/published/0007/03/04/supplement.zip and http://www.jcgt.org/published/0007/03/04/paper.pdf respectively.


Could someone here please put it on youtube/vimeo/... and link?



