        SpriteStream.Append( output );
    }

    SpriteStream.RestartStrip();
}
Listing 12.12. Expanding points into quads with the geometry shader.
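Stripped to its essentials, the expansion in Listing 12.12 follows the pattern sketched below. The structure layouts, constant-buffer contents, particle size, and fade distance used here are illustrative assumptions rather than the listing's exact declarations.

// Sketch of the point-to-quad expansion; names and constants are assumed.
cbuffer ParticleParameters
{
    matrix ProjMatrix;          // view space -> clip space
    float3 ConsumerLocation;    // black hole position, view space (assumed)
    float  ParticleSize;        // half-width of the emitted quad (assumed)
};

struct GS_INPUT { float3 position : POSITION; };   // particle center, view space

struct PS_INPUT
{
    float4 position  : SV_Position;
    float2 texcoords : TEXCOORD0;
    float4 color     : COLOR;
};

// Static view-space offsets and matching texture coordinates for one quad,
// ordered for a two-triangle strip.
static const float3 Offsets[4] =
{
    float3( -1.0f,  1.0f, 0.0f ),
    float3(  1.0f,  1.0f, 0.0f ),
    float3( -1.0f, -1.0f, 0.0f ),
    float3(  1.0f, -1.0f, 0.0f )
};

static const float2 TexCoords[4] =
{
    float2( 0.0f, 0.0f ),
    float2( 1.0f, 0.0f ),
    float2( 0.0f, 1.0f ),
    float2( 1.0f, 1.0f )
};

[maxvertexcount(4)]
void GSMAIN( point GS_INPUT input[1], inout TriangleStream<PS_INPUT> SpriteStream )
{
    // Shade the particle according to its distance from the black hole.
    float dist  = saturate( length( input[0].position - ConsumerLocation ) / 100.0f );
    float4 tint = lerp( float4( 1.0f, 0.6f, 0.2f, 1.0f ),
                        float4( 0.2f, 0.2f, 1.0f, 1.0f ), dist );

    PS_INPUT output;

    for ( int i = 0; i < 4; i++ )
    {
        // Offset the point in view space, then project into clip space.
        float3 viewPos   = input[0].position + Offsets[i] * ParticleSize;
        output.position  = mul( float4( viewPos, 1.0f ), ProjMatrix );
        output.texcoords = TexCoords[i];
        output.color     = tint;

        SpriteStream.Append( output );
    }

    SpriteStream.RestartStrip();
}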
As seen in Listing 12.12, we expand the points into quads in view space, which lets
us use a static set of offsets for the new vertex locations. These offset positions are then
transformed into clip space for use in the rasterizer stage. Similarly, we use a static array
of texture coordinates to map a full texture onto the created quad. We also generate a
color value based on the distance from the black hole and pass it to the output vertices
as well. These output vertices are then rasterized and passed to the pixel shader, which
samples a particle texture, modulates it by the interpolated color, and writes the result to
the output merger. This is shown in Listing 12.13.
Texture2D ParticleTexture : register( t0 );
SamplerState LinearSampler : register( s0 );

float4 PSMAIN( in PS_INPUT input ) : SV_Target
{
    float4 color = ParticleTexture.Sample( LinearSampler, input.texcoords );
    color = color * input.color;

    return( color );
}
Listing 12.13. Applying a texture to the output pixels.
After the pixel shader generates the color that is destined for the render target, we
must also configure the output merger to accommodate our desired rendering style. Our
particle system uses what is called additive blending, which means that we modify the
blend state so that each pixel produced by the pixel shader is added to the contents of the
render target. This composites overlapping particles onto one another and creates a glowing
effect when many particles occupy the same location simultaneously. In addition, since we
are blending additively, it is important to disable depth-buffer writing in the depth-stencil
state while still using the depth buffer for depth testing. Disabling depth writes requires
that any objects in the scene that partially occlude the particle system be rendered before
it, so that the resulting image is properly sorted according to depth. The resulting rendering
can be seen in Figure 12.14.
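A minimal sketch of this output-merger configuration with the core Direct3D 11 API is shown below; the function name, depth comparison function, and the choice of one-to-one additive blend factors are illustrative assumptions rather than the book's exact settings, and error handling is omitted.

// Sketch: additive blend state plus a depth-test-only depth-stencil state.
// In practice the state objects would be created once and released at shutdown.
#include <d3d11.h>

void ConfigureParticleOutputMerger( ID3D11Device* pDevice,
                                    ID3D11DeviceContext* pContext )
{
    // Additive blending: result = src * 1 + dest * 1.
    D3D11_BLEND_DESC blendDesc = {};
    blendDesc.RenderTarget[0].BlendEnable           = TRUE;
    blendDesc.RenderTarget[0].SrcBlend              = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].DestBlend             = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].BlendOp               = D3D11_BLEND_OP_ADD;
    blendDesc.RenderTarget[0].SrcBlendAlpha         = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].DestBlendAlpha        = D3D11_BLEND_ONE;
    blendDesc.RenderTarget[0].BlendOpAlpha          = D3D11_BLEND_OP_ADD;
    blendDesc.RenderTarget[0].RenderTargetWriteMask = D3D11_COLOR_WRITE_ENABLE_ALL;

    ID3D11BlendState* pBlendState = nullptr;
    pDevice->CreateBlendState( &blendDesc, &pBlendState );
    pContext->OMSetBlendState( pBlendState, nullptr, 0xFFFFFFFF );

    // Depth testing stays enabled, but depth writes are turned off so that
    // overlapping particles do not reject one another.
    D3D11_DEPTH_STENCIL_DESC dsDesc = {};
    dsDesc.DepthEnable    = TRUE;
    dsDesc.DepthWriteMask = D3D11_DEPTH_WRITE_MASK_ZERO;
    dsDesc.DepthFunc      = D3D11_COMPARISON_LESS_EQUAL;

    dsDesc.StencilEnable    = FALSE;
    dsDesc.StencilReadMask  = D3D11_DEFAULT_STENCIL_READ_MASK;
    dsDesc.StencilWriteMask = D3D11_DEFAULT_STENCIL_WRITE_MASK;
    dsDesc.FrontFace.StencilFunc        = D3D11_COMPARISON_ALWAYS;
    dsDesc.FrontFace.StencilPassOp      = D3D11_STENCIL_OP_KEEP;
    dsDesc.FrontFace.StencilFailOp      = D3D11_STENCIL_OP_KEEP;
    dsDesc.FrontFace.StencilDepthFailOp = D3D11_STENCIL_OP_KEEP;
    dsDesc.BackFace = dsDesc.FrontFace;

    ID3D11DepthStencilState* pDepthState = nullptr;
    pDevice->CreateDepthStencilState( &dsDesc, &pDepthState );
    pContext->OMSetDepthStencilState( pDepthState, 0 );
}

With both states bound, overlapping particles accumulate brightness in the render target while still being rejected wherever previously rendered scene geometry is closer to the camera.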