Saturday, March 2, 2013

UberRay - Uber's real-time raytracing API

We had the opportunity a few years ago to do some cutting-edge implementation work on real-time raytracing.  The idea was to solve the practical problems involved in making a commercially viable ray tracer for actual games.  The goal was to release a fully raytraced game as well as the UberRay API itself.  Think of it as OpenGL or DirectX, but specifically for raytracing.  That way other people could ship ray-traced versions of their games without worrying about the underlying implementation.


A shot from our renderer



Now, first off, I'm not saying that raytracing is the holy grail.  I've come to believe that it has a place in graphics, but that hybrid techniques may be the more likely path we take.  I believe that, as always, we will hack together a variety of techniques for any particular use case.

That being said, I'm very concerned about the amount of artist time it takes to create content.  How does this relate?  When creating a modern engine it can be difficult to make everything work together in every case.  We have issues with sorting, issues with techniques that take full-screen passes and don't combine well, and so on.  Most of the time we hack something together that works "good enough" and tell the artists to be careful about building certain types of things that break.  This is an inherent inefficiency, and it also means that artists have to understand the technical limitations of the tools.

What if we had an engine where every shader could play nicely with the others without worrying as much about composition?  In other words, we make the rendering regular so that it's more consistent and easier to work with.  Can we get there with rasterization?  Probably, but I do think raytracing may become the model just because it's regular.  Keep in mind that today performance is a top priority, but as computing power increases it may become less of an issue.
Funhouse mirrors!

So let's talk about some high-level architectural details.  First off, raytracing is a scene-level algorithm: the program needs to know about the current state of the entire scene at once.  Normal graphics APIs like OpenGL don't include scene-level functions; they are lower level.  You typically combine render calls together into a set of frame buffers and build up the image directly out of those calls.

With ray tracing it's more like:
- Update the scene graph with current information about the scene
- Build the acceleration data structures
- Trace rays from the camera into a specific buffer
- Do any post-processing we want, including things like AA and tone mapping
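
Sketched as calls into a scene-level API, a frame would look roughly like this.  All of the names below are made up for illustration; they are not the actual UberRay interface.

```cpp
// Hypothetical per-frame flow for a scene-level raytracing API.
// These are placeholder declarations, not the real UberRay calls.

struct Scene;        // mesh instances, materials, lights
struct Camera;       // position, orientation, field of view
struct Framebuffer;  // the pixel buffer we trace into

void UpdateScene(Scene& scene);                               // push new transforms, lights, materials
void BuildAcceleration(Scene& scene);                         // rebuild or refit the acceleration structure
void Trace(const Scene& s, const Camera& c, Framebuffer& fb); // fire camera rays into the buffer
void PostProcess(Framebuffer& fb);                            // resolve AA, tone map, etc.

void RenderFrame(Scene& scene, const Camera& camera, Framebuffer& fb)
{
    UpdateScene(scene);        // 1. update the scene graph
    BuildAcceleration(scene);  // 2. build the acceleration data structures
    Trace(scene, camera, fb);  // 3. trace rays from the camera into a buffer
    PostProcess(fb);           // 4. AA resolve, tone mapping, other post effects
}
```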

Whereas a rasterizer looks like:
- On the CPU, figure out which meshes we want to draw and in what order
- Make draw calls to render the meshes, culling them and potentially multi-passing them
- Lighting could be deferred or not; most of the rendering code is similar either way
- Then when we are finished, do a bunch of post-processing, again including resolving AA and tone mapping

The main difference between them is really the order in which you iterate over the data.  The ray tracer iterates over the pixels and asks what covers them.  The rasterizer iterates over the set of meshes and triangles and calculates which pixels they overlap.
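
Here's a minimal sketch of those two iteration orders, with placeholder scene types and helper functions standing in for the real thing:

```cpp
#include <cstdint>
#include <vector>

// Placeholder scene types and helpers, assumed for illustration only.
struct Ray          { float origin[3]; float dir[3]; };
struct Hit          { bool valid; float t; int materialId; };
struct Triangle     { float v0[3], v1[3], v2[3]; };
struct ScreenBounds { int minX, minY, maxX, maxY; };

Ray          CameraRay(int x, int y);                 // primary ray through pixel (x, y)
Hit          IntersectScene(const Ray& r);            // walk the acceleration structure, find the nearest hit
uint32_t     ShadeHit(const Hit& h);                  // run the material shader at the hit point
ScreenBounds ProjectToScreen(const Triangle& t);      // screen-space bounding box of the triangle
bool         Covers(const Triangle& t, int x, int y); // does the triangle cover this pixel?
bool         PassesDepthTest(const Triangle& t, int x, int y, std::vector<float>& depth);
uint32_t     ShadeFragment(const Triangle& t, int x, int y);

// Ray tracer: the outer loop is over pixels; each pixel asks "what do I see?"
void TraceImage(int w, int h, std::vector<uint32_t>& image)
{
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            image[y * w + x] = ShadeHit(IntersectScene(CameraRay(x, y)));
}

// Rasterizer: the outer loop is over triangles; each triangle asks "which pixels do I cover?"
void RasterizeImage(const std::vector<Triangle>& tris, int w,
                    std::vector<uint32_t>& image, std::vector<float>& depth)
{
    for (const Triangle& tri : tris)
    {
        ScreenBounds b = ProjectToScreen(tri);
        for (int y = b.minY; y <= b.maxY; ++y)
            for (int x = b.minX; x <= b.maxX; ++x)
                if (Covers(tri, x, y) && PassesDepthTest(tri, x, y, depth))
                    image[y * w + x] = ShadeFragment(tri, x, y);
    }
}
```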

Another angle

So it's obvious that UberRay has to be built around a scene-level graph.  I hesitate to just call it a scene graph because it's a very simple one.  For the most part it was simply a list of mesh instances, shaders, and lights.  However, as you'll see, it did start to get a bit more complex by supporting things like volumetric effects, proper depth of field, particle systems, mathematically defined surfaces, and so on.
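
Something along these lines, with invented names standing in for the actual UberRay data structures:

```cpp
#include <vector>

// A guess at what a "very simple scene graph" like this could look like:
// mostly flat lists of instances and lights, plus the extras mentioned above.
// All names here are invented for illustration, not the real UberRay types.

struct Matrix4x4 { float m[16]; };

struct MeshInstance
{
    int       meshId;      // index into a mesh library
    int       materialId;  // which material (set of shaders) to use
    Matrix4x4 transform;   // object-to-world
    bool      isStatic;    // static geometry can be pre-processed more aggressively
};

struct Light
{
    float position[3];
    float color[3];
    float intensity;
};

struct SceneDescription
{
    std::vector<MeshInstance> instances;
    std::vector<Light>        lights;
    std::vector<int>          volumes;          // volumetric effects (placeholder handles)
    std::vector<int>          particleSystems;
    std::vector<int>          analyticSurfaces; // mathematically defined surfaces
};
```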

One of the areas where we spent the most time was the shader system.  William created an amazing interpreted shader language that could be used to create really complex modern-era shaders.  The language also had a code-gen back-end that could target SSE.  One of the base-level shader ops was simply to spawn another ray and return its result, so you could do things like reflections.  A material was actually made up of multiple shaders.  For example, you would have a shader that directly computes the frame buffer color, but you would also have another shader that describes the transmission properties of the material.  Emission was another shader.  We could do real refraction and even chromatic aberration by tracing three rays separately.
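
As a rough sketch of that idea, with invented names rather than the real shader language: a material bundles several shader programs, and the spawn-a-ray op is what makes an effect like chromatic aberration possible, tracing one refracted ray per color channel.

```cpp
// Invented names for illustration; the real shader system was an interpreted
// language with an SSE code-gen back-end, which this sketch does not attempt to show.

struct ShaderHandle { int id; };

// A material bundles several shader programs, one per property.
struct Material
{
    ShaderHandle surfaceColor; // directly computes the frame buffer color
    ShaderHandle transmission; // how light passes through the material
    ShaderHandle emission;     // light the surface emits on its own
};

struct Color { float r, g, b; };
struct Ray   { float origin[3]; float dir[3]; };

// Assumed helpers: trace a single ray and return its color, and refract a ray
// with a given index of refraction.
Color TraceRay(const Ray& ray);
Ray   Refract(const Ray& incoming, float indexOfRefraction);

// Chromatic aberration by tracing three rays separately: one refracted ray per
// color channel, each with a slightly different (illustrative) index of refraction.
Color ChromaticRefraction(const Ray& incoming)
{
    Color red   = TraceRay(Refract(incoming, 1.510f));
    Color green = TraceRay(Refract(incoming, 1.515f));
    Color blue  = TraceRay(Refract(incoming, 1.520f));
    return { red.r, green.g, blue.b };
}
```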

The light on his face is transmission from a white light going through a film.

Anyway, we never got to actually release the thing, so I wanted to talk about it a little bit and show some old screenshots.  Hopefully we can revive this project down the road when appropriate hardware is available.  I don't think it will be that long until someone does a raytraced game just to get a different look going.

At this point the code is just sitting there doing nothing and has been for a couple of years.  C'est la vie.


12 comments:

  1. Just have to love a post titled Mavor's Rants! I may have to start doing the same on mine.

  2. Any chance you had something to do with the (now closed) ompf.org or the IGAD people (http://igad2.nhtv.nl/ompf2/)? Those guys worked on real-time ray tracing for years, and last time I checked they got a few hundred million rays per second out of NV GPUs a couple of generations old.

  3. Copyright it now!!! Thanks for the article. It's a pretty daunting task; editing code for ray tracing is still over my head.

  4. Wow, impressive stuff Jon. Thanks for sharing your thoughts and experience. Ray tracing as you describe it seems to be a heavily customised solution compared to traditional ray tracing as I understand it, in order to mimic a set of effects typical 3D rendering packages produce. How does this implementation overcome issues such as blotching and "flickering" due to interpolation of the scene? Is there some degree of baking the radiosity and a second layer of instruction for any modifications on the fly (i.e. fixed lighting is baked to the scene or similar)? I'll admit I'm no expert on the subject, but what kind of frame rates and poly counts are you seeing in a typical scene? Are you using secondary bounces at all? I'm used to Lightwave settings, so is this in any way comparable?

    Replies
    1. The big downside of raytracing is that it takes a lot of rays to get rid of the aliasing. We didn't have any super secret sauce for that; you just need a lot of rays. You can do things like MLAA post-processing, but at some level you just need a ton of rays. This is one reason I think hybrid techniques are more viable.

  5. Very interesting and the graphics look really impressive! How did you deal with dynamic scenes, i.e. how many characters could you animate in real-time?

    Replies
    1. Characters were definitely slower, but we could still do a bunch of them. Basically, with a character you re-adjust the BVH instead of building it from scratch. Dynamic and static items are actually separate in the scene, as we can do more pre-processing on the static stuff (throwing it all into one big tree instead of sub-trees).

  6. I loved the article; it was a very interesting read. Also, it looks really amazing. I currently go to DigiPen Institute of Technology and love reading your articles, hoping to one day implement something of the same caliber as the things you write about. Nice article!

  7. Thank you for sharing this. I am a raytracing (and path tracing) enthusiast and would love to hear more about this.

  8. Super interesting. Can you share any rough performance numbers on any sample hardware? Just curious where we are on the viability graph for this tech.

  9. Really good, but what I was thinking is that we don't have to wait for hardware, because we can still improve the speed of ray tracing with much better algorithm implementations. To bring ray tracing to the gaming level we need dedicated ray tracing hardware, just like Imagination Technologies' Caustic series, but current ray tracer hardware is not powerful enough to do ray tracing to a much larger extent (games). Even NVIDIA has showcased its GPU roadmap, which clearly shows that up to 2016 they are not ready for ray tracing. Rasterization-based hardware will never make it possible.
