Scene sorting with numpy...
You often want to be able to calculate the eye-space coordinate of a given model-space coordinate. One of the most common reasons is to calculate the distance to a given object from the camera (depth in the scene) so that you can do sorting of the scenegraph.
Why would you want to sort a scenegraph?
For opaque objects, doing a front-to-back sort will tend to cause closer objects to occlude further-away objects. This reduces the number of times you run your fragment shaders, which are the "heavy" part of most shader setups.
For transparent objects, many rendering algorithms require you to render in back-to-front order so that the farther-away geometry is already in-buffer when you go to render the closer geometry.
Anyway, the gluProject function allows you to calculate these coordinates by passing in three pieces of data: the model-view matrix, the projection matrix, and the viewport setup. Ever so convenient. Thing is, the calculations are pretty trivial once you have those pieces of data, so why bother with the GLU function?
The code looks something like this:
import numpy
M = numpy.dot( modelView, projection )  # combined model-view-projection matrix
v = numpy.dot( points, M )              # points is an Nx4 array of homogeneous (row-vector) coordinates
v /= v[:,3].reshape( (-1,1) )           # perspective divide; v[:,2] now holds depth
This returns the same distance (Z) value as gluProject given the same matrices and input points, so it can give you the distances for an entire array of points without calling gluProject once per point. It also does less work per point than gluProject, since it never needs to mix in the viewport values at all (not a huge deal, it just removes a single multiply and add per coordinate).
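To make that concrete, here is a hedged sketch of turning those depths into a sort order with numpy.argsort. The function name `depth_sort_indices` and the toy identity-matrix example are my own illustration, not part of the original code; the row-vector convention matches the snippet above:

```python
import numpy

def depth_sort_indices(points, modelView, projection, back_to_front=True):
    """Return indices sorting homogeneous row-vector points by scene depth.

    points     -- Nx4 array of homogeneous model-space coordinates
    modelView  -- 4x4 model-view matrix (row-vector convention)
    projection -- 4x4 projection matrix (row-vector convention)
    """
    M = numpy.dot(modelView, projection)  # combined transform
    v = numpy.dot(points, M)              # transform the whole array at once
    depths = v[:, 2] / v[:, 3]            # perspective-divided Z per point
    order = numpy.argsort(depths)         # ascending: nearest first (front-to-back)
    return order[::-1] if back_to_front else order

# Toy example: identity matrices, three points at increasing depth
identity = numpy.identity(4)
pts = numpy.array([
    [0.0, 0.0, 1.0, 1.0],
    [0.0, 0.0, 3.0, 1.0],
    [0.0, 0.0, 2.0, 1.0],
])
print(depth_sort_indices(pts, identity, identity))  # back-to-front: [1 2 0]
```

Front-to-back for opaque geometry is just the same argsort without the reversal.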
You do the same thing with the individual polygons/triangles in transparent geometry, incidentally. If you want to get funky, you can even do it across objects to get proper sorted ordering for everything you want to render.
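For the per-triangle case, one common approach (a sketch under my own assumptions, not the original code: triangles as an Nx3x3 vertex array, sorted by the depth of each triangle's centroid) looks like this:

```python
import numpy

def sort_triangles_back_to_front(triangles, modelView, projection):
    """Sort an Nx3x3 array of model-space triangles by centroid depth.

    Vertices are plain (x, y, z); a homogeneous w=1 column is appended
    before transforming.  Returns the triangles reordered with the
    farthest centroid first (back-to-front, for blended rendering).
    """
    centroids = triangles.mean(axis=1)              # Nx3 triangle centroids
    ones = numpy.ones((len(centroids), 1))
    hom = numpy.hstack([centroids, ones])           # Nx4 homogeneous coordinates
    M = numpy.dot(modelView, projection)
    v = numpy.dot(hom, M)
    depths = v[:, 2] / v[:, 3]                      # perspective-divided Z
    return triangles[numpy.argsort(depths)[::-1]]   # farthest first

# Toy example: two triangles at z=1 and z=5, identity matrices
tris = numpy.array([
    [[0, 0, 1], [1, 0, 1], [0, 1, 1]],
    [[0, 0, 5], [1, 0, 5], [0, 1, 5]],
], dtype=float)
sorted_tris = sort_triangles_back_to_front(tris, numpy.identity(4), numpy.identity(4))
# the z=5 triangle now comes first
```

Sorting by centroid is an approximation (intersecting or very large triangles can still sort incorrectly), but it is the cheap, vectorizable version of the idea.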