Sphere rendering with numpy...
So, you ask: "how do you generate the coordinates necessary to render a (unit) sphere?" (you do ask the strangest questions). Well, since you insist, here's a quick rundown of one way to go about it. We'll assume that we're going to use a pair of vertex/index VBOs to render the geometry. We won't try to make the most heavily optimized sphere in the world, but we'll try to keep it reasonably lightweight...
The sphere is modeled as a series of latitude lines around the sphere; at the very poles the latitude lines are infinitely small circles. We construct each latitude line as a series of points sampled around it, and (for convenience) we use the same sampling frequency for all of the latitude lines. This creates longitude lines through the corresponding sampling points on each latitude line. We'll assume evenly spaced lines.
The basic sphere can then be drawn by drawing quads (two triangles) between each sampling point, the next point along its latitude line, and the corresponding pair of points on the latitude line below it. For the special case of the poles, we can drop one of the two triangles from each quad, since it degenerates to a line.
The latitude lines run from angle 0 through angle pi (in radians), while the longitude lines run from angle 0 through angle 2pi. We divide each of these into a series of equal-angled steps of size phi. We're going to be doing some pointer-style index math with the results, so we'll grab the lengths for our calculations too...
from numpy import arange, pi, zeros, sin, cos, concatenate
phi = pi/16.0 # angular step between samples (example value; any divisor of pi works)
latsteps = arange( 0,pi+0.000003, phi )
longsteps = arange( 0,pi*2+0.000003, phi )
ystep = len(longsteps) # index offset to move down one latitude line
zstep = len(latsteps)
xstep = 1 # index offset to move along a latitude line
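To make the index arithmetic concrete: with a (purely illustrative) coarse step of pi/2 you get three latitude lines and five samples around each of them, so zstep would be 3 and ystep would be 5:
coarse_lat = arange( 0, pi+0.000003, pi/2 ) # [0, pi/2, pi]
coarse_long = arange( 0, pi*2+0.000003, pi/2 ) # [0, pi/2, pi, 3*pi/2, 2*pi]
print( len(coarse_lat), len(coarse_long) ) # -> 3 5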
The x,z coordinates of a (unit) circle are easily calculated with cos() and sin() of the longitude. To texture map the sphere, we want the fraction-of-total for latitude and longitude for each coordinate...
coords = zeros((zstep,ystep,5), 'f') # x,y,z plus 2 texture coordinates per point
coords[:,:,0] = sin(longsteps) # unit-circle dim
coords[:,:,1] = cos(latsteps).reshape( (-1,1)) # height of each latitude line
coords[:,:,2] = cos(longsteps) # unit-circle dim
coords[:,:,3] = longsteps/(2*pi) # s texture coordinate
coords[:,:,4] = latsteps.reshape( (-1,1))/ pi # t texture coordinate
The y coordinate at which each circle is placed is the cos() of the latitude angle, and we then multiply the unit-circle x,z coordinates by the sin() of the latitude angle (the cos is the height, so the sin is the radius of the latitude circle at that height, which is currently 1) to get the final (x,y,z) coordinate of a given longitude/latitude point on the sphere.
scale = sin(latsteps).reshape( (-1,1)) # radius of each latitude circle
coords[:,:,0] *= scale
coords[:,:,2] *= scale
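A quick, purely illustrative check that the math is right: every resulting point should now sit on the unit sphere...
lengths = (coords[:,:,0]**2 + coords[:,:,1]**2 + coords[:,:,2]**2)**0.5
assert abs(lengths - 1.0).max() < 1e-5 # all points are (very nearly) unit-length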
The indices are a set of triangles which draw the n, n+ystep, n+ystep+xstep and n, n+ystep+xstep, n+xstep triangles for each point n around the sphere. Each subsequent rectangle around a latitude line has its offsets increased by xstep (1), and each subsequent latitude line has its offsets increased by ystep.
indices = zeros( (zstep-1,ystep-1,6),dtype='H' ) # 2 triangles (6 indices) per quad; uint16 matches GL_UNSIGNED_SHORT
indices[:] = (0,0+ystep,0+ystep+xstep, 0,0+ystep+xstep,0+xstep)
# all indices now render the first rectangle...
xoffsets = arange(0,ystep-1,1,dtype='H').reshape( (-1,1))
indices += xoffsets
# now all indices render the first row...
yoffsets = arange(0,zstep-1,1,dtype='H').reshape( (-1,1,1))
indices += (yoffsets * ystep)
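As a quick sanity check (again, just illustrative and not part of the rendering code), the largest index generated should be exactly the last vertex in the coordinate array:
assert int(indices.max()) == zstep*ystep - 1 # references the final vertex, nothing beyond it
assert int(indices.min()) == 0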
With that we have the coordinates and indices to render the sphere, but we have a lot of degenerate polygons at the poles. To compress them out, we take the stepped slice of the two polar "slices" in the index arrays:
if len(indices) >= 2:
    indices = concatenate(
        (
            indices[0].reshape( (-1,3))[::2], # north cap: keep only the non-degenerate triangle of each quad
            indices[1:-1].reshape( (-1,3) ),
            indices[-1].reshape( (-1,3) )[1::2], # south cap: keep only the non-degenerate triangle of each quad
        )
    )
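Counting triangles is another cheap (and again purely illustrative) way to confirm the compression did what we expect: the full grid had 2*(zstep-1)*(ystep-1) triangles and each polar band loses (ystep-1) of them...
assert len(indices) == 2*(ystep-1)*(zstep-2) # e.g. 960 triangles for phi = pi/16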
And there we have it, the coordinates and indices required to render a sphere as a pair of VBOs, like so:
from OpenGL.GL import *
from OpenGL.arrays import vbo
indexCount = indices.size # total number of index values for glDrawElements
self.coords = vbo.VBO( coords )
self.indices = vbo.VBO( indices, target = 'GL_ELEMENT_ARRAY_BUFFER' )
glEnableClientState(GL_VERTEX_ARRAY)
glEnableClientState(GL_NORMAL_ARRAY)
glEnableClientState(GL_TEXTURE_COORD_ARRAY)
self.coords.bind()
glVertexPointer( 3, GL_FLOAT, 20, self.coords ) # x,y,z at offset 0, 20-byte stride
glTexCoordPointer( 2, GL_FLOAT, 20, self.coords+12 ) # s,t at byte offset 12
glNormalPointer( GL_FLOAT, 20, self.coords ) # unit sphere: normals == coordinates
self.indices.bind()
glDrawElements( GL_TRIANGLES, indexCount, GL_UNSIGNED_SHORT, self.indices )
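In a real render loop you would normally also unbind the VBOs and disable the client state afterwards; a minimal sketch of that cleanup (not part of the original snippet) looks like:
self.indices.unbind()
self.coords.unbind()
glDisableClientState(GL_TEXTURE_COORD_ARRAY)
glDisableClientState(GL_NORMAL_ARRAY)
glDisableClientState(GL_VERTEX_ARRAY)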
Obviously that relies on this being a unit sphere (since the normals are just the coordinates). You could easily record the x,y,z coordinates again as a separate set of normal elements and make a larger data-array.
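A rough sketch of what that larger record might look like (a hypothetical 8-float layout of position, normal and texture coordinate, with the strides and offsets adjusted to match):
full = zeros( (zstep,ystep,8), 'f' ) # x,y,z, nx,ny,nz, s,t -> 32-byte stride
full[:,:,0:3] = coords[:,:,0:3] # positions
full[:,:,3:6] = coords[:,:,0:3] # normals (identical to positions for a unit sphere)
full[:,:,6:8] = coords[:,:,3:5] # texture coordinates
# glVertexPointer( 3, GL_FLOAT, 32, <vbo> )
# glNormalPointer( GL_FLOAT, 32, <vbo>+12 )
# glTexCoordPointer( 2, GL_FLOAT, 32, <vbo>+24 )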
If we wanted to restrict the quadric so that, for instance, it was just a small spherical patch, we could easily do it by restricting the values of latsteps and longsteps (sketched below), but if the latitudes do not run from 0 to pi then we cannot compress the first/last rings of the result-set. The observant will note that there are duplicate points in the coordinate array (the last sample on each latitude line coincides with the first); we could drop those out by referencing the first coordinate of each line as a special case (i.e. when the total longitude angle is 2pi). This has simply been omitted to make sphere patches easier to implement. We don't compress the poles to a single vertex, incidentally, because then the texture coordinates would get messed up even more on the pole triangles than they already are.
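Such a patch is just a matter of substituting restricted ranges (the bounds below are made up for illustration) and skipping the polar compression step:
latsteps = arange( pi/4, pi/2+0.000003, phi ) # hypothetical latitude band
longsteps = arange( 0, pi/2+0.000003, phi ) # hypothetical quarter of longitude
zstep, ystep = len(latsteps), len(longsteps)
# ...build coords and indices exactly as above, but skip the slicing
# of the first/last bands, since no ring here is degenerate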
The code for the summary here is available from the OpenGLContext bzr repository.
[Update] And now you can see all of Cone, Cylinder and Sphere rendering in the scenegraph library of OpenGLContext. The Cone and Cylinder use the same basic algorithm as the Sphere, incidentally.