float2 texcoords = input.tex.xy;
// Unpack clip position from texcoords and depth.
float depth = depthBuffer.SampleLevel(pointSampler, texcoords, 0.0f);
float4 currClip = unpackClipPos(texcoords, depth);
// Transform into previous homogeneous coordinates:
// inverse(view projection) * previous(view projection).
float4 prevHomogenous = mul(currClip, invViewProjPrevViewProjMatrix);
// Perspective divide into the previous clip space coordinate.
float4 prevClip = float4(prevHomogenous.xyz / prevHomogenous.w, 1.0f);
// Convert the clip space coordinate from [-1, 1] into the [0, 1]
// screen range and flip the y coordinate.
float3 prevScreen = float3(prevClip.xy * float2(0.5f, -0.5f)
                           + float2(0.5f, 0.5f), prevClip.z);
// Return the corresponding color from the previous frame.
return prevColorBuffer.SampleLevel(linearSampler, prevScreen.xy, 0.0f);
Listing 4.9. Re-projecting a pixel into its corresponding location in the previous frame's color image. Implementation details for rejecting pixels are omitted; the accompanying demo contains the full code.
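The helper unpackClipPos is not shown in the listing. As a rough sketch of what it might look like, assuming the usual Direct3D convention in which texture coordinates run top-down while clip space y runs bottom-up, it could simply remap the texture coordinates into [-1, 1] and attach the sampled depth:

// Hypothetical helper, not part of the original listing: remap [0, 1]
// texture coordinates into [-1, 1] clip space, flipping y, and combine
// them with the sampled depth value.
float4 unpackClipPos(float2 texcoords, float depth)
{
    float2 clipXY = texcoords * float2(2.0f, -2.0f) + float2(-1.0f, 1.0f);
    return float4(clipXY, depth, 1.0f);
}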
If you have a motion blur velocity pass, more specifically a 2D instantaneous velocity buffer, you can use it instead of re-projecting with the code above. Using 2D instantaneous velocity is more stable, but it is beyond the scope of this chapter.
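As an illustration of that alternative, the sketch below samples a hypothetical velocityBuffer instead of re-projecting through the matrices; the buffer name and the convention that it stores the offset from the previous frame's pixel position to the current one are assumptions, not part of the chapter's code:

// Hypothetical variant using a 2D instantaneous velocity buffer.
// velocityBuffer is assumed to store, per pixel, the screen-space offset
// from the previous frame's position to the current one.
float2 velocity = velocityBuffer.SampleLevel(pointSampler, texcoords, 0.0f).xy;
float2 prevTexcoords = texcoords - velocity;
return prevColorBuffer.SampleLevel(linearSampler, prevTexcoords, 0.0f);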
4.6.5 Temporal Filtering
Temporal filtering is another enhancer that helps the algorithm produce even more accurate and stable results by recovering and reusing pixels over several frames, hence the name temporal, i.e., over time. The idea is to keep a history buffer that stores the old reflection computation; we run a re-projection pass over it, just as in Section 4.6.4, and reject any invalid pixels. This history buffer is the same buffer we write our final reflection computation to, so it acts like an accumulation buffer in which we keep accumulating valid reflection colors. If the ray-marching phase fails to find a proper intersection, because the ray falls behind an object or leaves the screen, we can rely on the previously re-projected result already stored in the history buffer and have a chance of recovering the missing pixel.
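A minimal sketch of such an accumulation step is shown below, assuming a historyBuffer holding the previous result, a reflectionBuffer holding the current ray-marched result, a boolean isReprojectionValid from the rejection test, and an illustrative blend weight; none of these names or values come from the chapter's actual implementation:

// Hypothetical temporal accumulation. prevScreen.xy is the re-projected
// position from Listing 4.9.
float4 history = historyBuffer.SampleLevel(linearSampler, prevScreen.xy, 0.0f);
float4 current = reflectionBuffer.SampleLevel(pointSampler, texcoords, 0.0f);
// Blend toward the accumulated history when the re-projection is valid;
// otherwise fall back entirely on the current frame's result.
float blendWeight = isReprojectionValid ? 0.9f : 0.0f;
return lerp(current, history, blendWeight);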
Temporal filtering helps stabilize the result because a pixel that fails in frame N due to occlusion or missing information might be recovered from frame N − 1, which was accumulated over several frames by re-projection of pixels. Having the