From the paper: "We use the rasterizer as a potentially-visible set optimization to iterate only over pixels for which rays might intersect a voxel, and then execute a small ray tracer in the pixel shader. That is, we “splat” billboards that give coarse visibility and compute exact visibility in a pixel shader. This works for any pinhole perspective projection, including eye rays and shadow rays, so we use it for the shadow map rendering pass as well."
Neat - I used a similar technique [1] in the last iteration of Voxel Quest, but I only rasterized single points/pixels then raytraced them into filled cubes (was not efficient other than on fill rate). However I did not invent the technique (I'm not sure who did, but I first got the idea from talking to Florian Boesch [2])
[1] https://twitter.com/gavanw/status/717265068086308865 [2] http://codeflow.org
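For anyone curious what the per-pixel half of this kind of approach looks like, here is a rough C++-style sketch of the classic ray/AABB slab test a pixel shader would run once the rasterizer has splatted a billboard covering the voxel's screen footprint. This is my own illustration of the general idea, not code from the paper or from Voxel Quest, and all the names are made up:

  // My own illustration (not the paper's code): the rasterizer has already
  // splatted a billboard over the voxel's screen footprint, so this only runs
  // for pixels whose rays *might* hit the voxel; it computes the exact hit.
  #include <cmath>      // INFINITY
  #include <utility>    // std::swap
  #include <algorithm>  // std::min, std::max

  struct Vec3 { float x, y, z; };

  // Classic slab test: true (and entry distance tHit) if origin + t*dir, t >= 0,
  // intersects the axis-aligned voxel [boxMin, boxMax].
  bool rayVoxelHit(Vec3 origin, Vec3 dir, Vec3 boxMin, Vec3 boxMax, float& tHit)
  {
      float tNear = 0.0f, tFar = INFINITY;
      const float o[3]  = { origin.x, origin.y, origin.z };
      const float d[3]  = { dir.x,    dir.y,    dir.z    };
      const float lo[3] = { boxMin.x, boxMin.y, boxMin.z };
      const float hi[3] = { boxMax.x, boxMax.y, boxMax.z };
      for (int i = 0; i < 3; ++i) {
          float inv = 1.0f / d[i];            // relies on IEEE inf when d[i] == 0
          float t0 = (lo[i] - o[i]) * inv;
          float t1 = (hi[i] - o[i]) * inv;
          if (t0 > t1) std::swap(t0, t1);
          tNear = std::max(tNear, t0);
          tFar  = std::min(tFar,  t1);
          if (tNear > tFar) return false;     // slabs don't overlap: discard pixel
      }
      tHit = tNear;                           // use to write depth / shade the voxel
      return true;
  }

The splatted billboard only guarantees coarse visibility; a per-pixel test like the one above is what decides whether the fragment is kept and at what depth.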
> We use the rasterizer as a potentially-visible set optimization to iterate only over pixels for which rays might intersect a voxel, and then execute a small ray tracer in the pixel shader.
I was just thinking about doing this last weekend (in the context of drawing exact reflections off reflective surfaces defined by arbitrary meshes). Nice to see it actually works, at least for voxels!
The fact that raytracing (onto a 2D plane) and rasterization are essentially isomorphic to one another is a powerful observation.
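A toy way to see that (my own example, nothing from the paper or thread): with a pinhole camera, projecting a point onto the image plane and casting a ray back through the resulting image point are inverse operations, so both directions have to agree on which surface point covers which pixel.

  // Toy check: project a point to an image coordinate (rasterization direction),
  // then cast the camera ray through that coordinate (raytracing direction)
  // and confirm it lands back on the same point.
  #include <cassert>
  #include <cmath>

  struct Vec3 { float x, y, z; };

  int main()
  {
      Vec3 p = { 0.3f, -0.2f, 2.5f };   // point in camera space, +z forward
      float focal = 1.0f;               // pinhole focal length

      // Rasterization direction: perspective-project p onto the plane z = focal.
      float u = focal * p.x / p.z;
      float v = focal * p.y / p.z;

      // Raytracing direction: ray from the origin through image point (u, v, focal).
      Vec3 dir = { u, v, focal };
      float t  = p.z / dir.z;           // advance the ray to depth p.z

      assert(std::fabs(t * dir.x - p.x) < 1e-5f);   // same x, up to rounding
      assert(std::fabs(t * dir.y - p.y) < 1e-5f);   // same y, up to rounding
      return 0;
  }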
I really liked the concept of Voxel Quest. Did you stop the project?
Recently I saw two posts related to volume rendering, this one and DreamWorks' VDB, and I immediately thought of Voxel Quest. I liked that it supported smooth surfaces, unlike Minecraft.
Still working on it actively, although in the past two years I also dabbled with two other side projects (a Node.js / WebGL / Electron side-scroller game and a compiler). Progress is very slow due to real-life constraints, but at least it is steady now. I will be presenting the newest progress soon. :)
Wow, I am really happy to hear that! I have been following Voxel Quest for a few years now, and I was really sad when you stopped working on the project. Looking forward to seeing your new developments!
As an original KS backer, that is awesome news to hear! You produced a wonderful code base and I learned a ton from your blog posts and streams. Any thoughts of streaming/blogging about the progress in the future?
This is more useful than it might seem, because several rendering techniques are sometimes implemented with voxel representations behind the scenes: global illumination (indirect bounced light, so light bounces off a red wall to create a pale red glow on the ground, that sort of thing), shadowing, fluids, and so on. The examples of Minecraft and LEGO given are direct applications, but I suspect this will improve a lot of other stuff.
There are lots of different GI methods; most of them use spatial acceleration structures such as BVHs and KD-trees after triangulation. Fluids use a grid for discretisation: behind the scenes the PDEs are solved by means of finite difference methods, which has nothing to do with ray intersection tests.
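To make the fluid point concrete, here is a minimal generic sketch (my own, not tied to any particular engine) of one explicit finite-difference step for the 1D diffusion equation u_t = k * u_xx on a grid; note there is no ray intersection anywhere in it:

  // Generic sketch: one explicit finite-difference step of 1D diffusion, using
  //   u_new[i] = u[i] + (k*dt/dx^2) * (u[i-1] - 2*u[i] + u[i+1]).
  #include <cstddef>
  #include <vector>

  void diffuseStep(std::vector<float>& u, float k, float dt, float dx)
  {
      std::vector<float> next(u);
      const float a = k * dt / (dx * dx);   // explicit scheme is stable for a <= 0.5
      for (std::size_t i = 1; i + 1 < u.size(); ++i)
          next[i] = u[i] + a * (u[i - 1] - 2.0f * u[i] + u[i + 1]);
      u.swap(next);                         // endpoints held fixed (Dirichlet boundary)
  }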
Just wanted to say that Roblox is a masterful system. My daughter plays it and it's an amazing piece of software. Your map editor is great too. My daughter is 8 and she picked it right up.
Curious, how big is the team for the game and editor?
We're not hiring right now, but in the future we may have similar needs. We're all WebGL; check out https://ayvri.com and drop me a line at pete[at]ayvri.com
We have some solid game dev tech on the team already.
Thanks for the heads-up! Oddly enough, the dynamic scene near the end reminds me of a recent video by Jon Burton where he explains how his game company managed to get an incredibly high number of particle effects on the PlayStation 2. It looks like in both cases the idea is to make the implementation functional and fully computable on a VPU/GPU, with as little overhead as possible from moving data between RAM, CPU, and GPU.