After being multiplied by the ModelView matrix, each vertex lives in the
Eye Coordinate space in which the viewer's eye position has been transformed
to (0,0,0). We convert both the eye and the vertex coordinates into texture space
like this:
vec4 vxyz = uModelViewMatrix*aVertex; // vertex -> eye coords
vec3 vstp = ( vxyz.xyz + 1. ) / 2.; // vertex -> tex coords
vec3 eye = ( vec3(0.,0.,0.) + 1. ) / 2.; // eye -> tex coords
So, a vector from the eye through the vertex will be
vec3 stpvec = vstp - eye;
Depending on how the volume has been rotated and translated, vstp
and eye could be well outside the range [0.,1.], even though they are supposed
to be in texture coordinates. This is OK. We really aren't going to use their
values, except to get the vector between them, which we will eventually scale
to something smaller.
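Collected together, the computation so far might look like the following vertex-shader sketch. The uniform and attribute names are the ones used above; the declarations outside main() are assumptions about how they would be declared:

```glsl
uniform mat4 uModelViewMatrix;
attribute vec4 aVertex;

// ...inside main():
vec4 vxyz = uModelViewMatrix * aVertex;        // vertex -> eye coordinates
vec3 vstp = ( vxyz.xyz + 1. ) / 2.;            // vertex -> texture coordinates
vec3 eye  = ( vec3( 0., 0., 0. ) + 1. ) / 2.;  // eye -> texture coordinates
vec3 stpvec = vstp - eye;                      // only this difference is used,
                                               // so out-of-range values are fine
```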
Now comes the tricky part. The vertex shader, shown below, takes its
scene rotation from the ModelView matrix. It uses this in two ways. It rotates
the cube quadrilaterals forward. This makes sense—we want the faces of the
volume to appear to rotate.
But the tricky part is that the vertex shader also rotates the casting
direction backward. Why is this? When we rotate the volume, we want it to
appear that the 3D data texture is rotating along with the cube faces. But in
OpenGL, textures themselves don't transform; only the texture coordinates
do. Fortunately, transforming the texture is the inverse of transforming the
texture coordinates. So, if you want to make it look like the data texture is
rotating forward, you need to transform its texture coordinates backward.
Since the casting direction is in texture coordinate space, its coordinates must
be changed by the inverse of the desired texture transformation. In GLSL, to
rotate the casting direction backward, we multiply it by the inverse of the
ModelView matrix, encoded in the mat4 variable uModelViewMatrixInverse.
This multiplication operates on a vector, which has direction and magnitude,
but no position. So, during that multiplication, we force the w component
of the casting direction to be zero, so that we don't pick up any of the
uModelViewMatrixInverse translations.
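That backward rotation might be written as the following sketch, where stpvec is the eye-to-vertex vector computed above and the output name dir is a hypothetical:

```glsl
// Multiply by the inverse ModelView matrix with w forced to 0., so the
// vector is rotated backward but picks up none of the translation:
vec3 dir = ( uModelViewMatrixInverse * vec4( stpvec, 0. ) ).xyz;
```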
The longest possible flight path through the 3D data texture is from cor-
ner to opposite corner, which would be √3 long in texture coordinates, so the
normalized casting distance is multiplied by √3. Then vDirSTP is divided by
uNumSteps, the number of steps at which we want to take samples along the
casting ray.
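A sketch of that scaling, assuming the backward-rotated casting direction is held in a vec3 named dir (a hypothetical name) and that vDirSTP is the varying passed on to the fragment shader:

```glsl
// Scale the normalized casting direction to the longest possible path
// through the unit texture cube, sqrt(3.), then split that path into
// uNumSteps equal sample steps:
vDirSTP = normalize( dir ) * sqrt( 3. ) / float( uNumSteps );
```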