Graphics Reference
In-Depth Information
The compositing example is one of many cases where a buffer is intended
as input for an algorithm rather than for direct display to a human as an image,
and α is only one of many common quantities found in buffers that has no direct
visible representation. For example, it is common to store “depth” in a buffer
that corresponds 1:1 to the color buffer. A depth buffer stores some value that
maps monotonically to distance from the center of projection to the surface seen
at a pixel (we motivate and show how to implement and use a depth buffer in
Chapter 15, and evaluate variations on the method and alternatives extensively in
Chapter 36 and Section 36.3 in particular).
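The core idea of a depth buffer can be sketched in a few lines. The names and layout below are hypothetical illustrations, not the book's implementation: each incoming fragment is written only if its depth value is smaller (nearer) than the value already stored at that pixel.

```c
#include <assert.h>

#define W 4
#define H 4

/* Hypothetical sketch: a color buffer paired 1:1 with a depth buffer. */
typedef struct {
    float color[W * H];  /* stand-in for a color buffer            */
    float depth[W * H];  /* value monotonic in distance to surface */
} Buffers;

void buffers_clear(Buffers *b, float far_value) {
    for (int i = 0; i < W * H; ++i) {
        b->color[i] = 0.0f;
        b->depth[i] = far_value;  /* "nothing seen yet" = farthest */
    }
}

/* Write (color, z) at pixel (x, y) only if z is nearer than the stored
 * depth -- the classic "less than" depth test. Returns 1 if the
 * fragment was visible and written, 0 if it was hidden. */
int depth_test_write(Buffers *b, int x, int y, float color, float z) {
    int i = y * W + x;
    if (z < b->depth[i]) {
        b->depth[i] = z;
        b->color[i] = color;
        return 1;
    }
    return 0;
}
```

Note that the test makes the result independent of the order in which surfaces are drawn: a far fragment arriving after a near one is simply discarded.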
Another example is a stencil buffer, which stores arbitrary bit codes that are
frequently used to mask out parts of an image during processing in the way that a
physical stencil (see Figure 14.7) does during painting.
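The painting analogy translates directly into code. In this hypothetical sketch (the function and buffer names are illustrative, not a real API), a pixel receives "paint" only where its stencil value matches a reference value, just as paint passes only through the holes in a paper stencil.

```c
#include <stdint.h>

#define W 4
#define H 4

/* Hypothetical sketch: an 8-bit stencil buffer used as a mask.
 * Each pixel is written only where the stencil equals the
 * reference value -- the "holes" in the stencil. */
void apply_where_stencil_equals(uint32_t *color, const uint8_t *stencil,
                                uint8_t ref, uint32_t paint) {
    for (int i = 0; i < W * H; ++i) {
        if (stencil[i] == ref) {
            color[i] = paint;  /* paint through the hole */
        }
    }
}
```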
Stencil buffers typically use very few bits, so it is common to pack them into
some other buffer. For example, Figure 14.8 shows a 3 × 3 combined depth-and-stencil buffer in the GL_DEPTH24STENCIL8 format.
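One plausible bit-level layout for such a combined texel is sketched below: 24 bits of depth in the high bits of a 32-bit word and 8 bits of stencil in the low bits. This mirrors the semantic format named by GL_DEPTH24STENCIL8; the physical layout a particular GPU uses may differ.

```c
#include <stdint.h>

/* Sketch of a GL_DEPTH24STENCIL8-style texel packed into one 32-bit
 * word: depth in the high 24 bits, stencil in the low 8 bits.
 * (An illustration of the semantic format, not a hardware layout.) */
static uint32_t pack_d24s8(uint32_t depth24, uint8_t stencil) {
    return ((depth24 & 0xFFFFFFu) << 8) | stencil;
}

static uint32_t unpack_depth(uint32_t texel) {
    return texel >> 8;
}

static uint8_t unpack_stencil(uint32_t texel) {
    return (uint8_t)(texel & 0xFFu);
}
```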
A framebuffer¹ is an array of buffers with the same dimensions. For example,
a framebuffer might contain a GL_RGBA8 color buffer and a GL_DEPTH24STENCIL8
depth-and-stencil buffer. The individual buffers act as parallel arrays of fields at
each pixel. A program might have multiple framebuffers with many-to-many rela-
tionships to the individual buffers.
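The "parallel arrays of fields" idea can be made concrete with a small struct. This is a hypothetical sketch, not the book's or any API's definition: the framebuffer owns no interleaved pixel data of its own; it just groups references to same-sized buffers, and one index addresses the corresponding field in each of them.

```c
#include <stdint.h>

/* Hypothetical sketch: a framebuffer as a bundle of parallel
 * per-pixel arrays with common dimensions. */
typedef struct {
    int       width, height;
    uint32_t *color;          /* e.g., a GL_RGBA8-style buffer           */
    uint32_t *depth_stencil;  /* e.g., a GL_DEPTH24STENCIL8-style buffer */
} Framebuffer;

/* All per-pixel fields at (x, y) share one index into the
 * parallel arrays. */
static int pixel_index(const Framebuffer *fb, int x, int y) {
    return y * fb->width + x;
}
```

Because the struct holds only pointers, several framebuffers can reference the same underlying buffer, which is the many-to-many relationship described above.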
Why create the framebuffer level of abstraction at all? In the previous example,
instead of two buffers, one storing four channels and one with two, why not sim-
ply store a single six-channel buffer? One reason for framebuffers is the many-to-
many relationship. Consider a 3D modeling program that shows two views of the
same object with a common camera but different rendering styles. The left view
is wireframe with hidden lines removed, which allows the artist to see the tes-
sellation of the meshes involved. The right view has full, realistic shading. These
images can be rendered with two framebuffers. The framebuffers share a single
depth buffer but have different color buffers.
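The shared-depth arrangement is easy to express when buffers are held by reference. In this hypothetical sketch (names are illustrative), each view's framebuffer owns a color buffer but both point at one depth array, so depth values written while rendering either view are visible to the other.

```c
#include <stdint.h>

/* Hypothetical sketch of the two-view modeler: each view has its
 * own color buffer, but both reference a single shared depth buffer. */
typedef struct {
    int       width, height;
    uint32_t *color;  /* owned per view          */
    float    *depth;  /* shared between the views */
} View;

/* Fill a view's color buffer and the (shared) depth buffer. */
void clear_view(View *v, uint32_t color, float depth) {
    for (int i = 0; i < v->width * v->height; ++i) {
        v->color[i] = color;
        v->depth[i] = depth;
    }
}
```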
Another reason for framebuffers is that the semantic model of channels of
specific-bit widths might not match the true implementation, even though it
was motivated by implementation details. For example, depth buffers are highly
amenable to lossless spatial compression because of how they are computed from
continuous surfaces and the spatial-coherence characteristics of typically ren-
dered scenes. Thus, a compressed representation of the depth buffer might take
significantly less space (and correspondingly take less time to access because
doing so consumes less memory bandwidth) than a naive representation. Yet the
compressed representation in this case still maintains the full precision required
by the semantic buffer format requested through an API. Unsurprisingly given
these observations, it is common practice to store depth buffers in compressed
form but present them with the semantics of uncompressed buffers [HAM06].
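To see why depth values compress so well, consider a toy lossless scheme: delta-encode each scanline. This is only an illustration of the spatial-coherence argument, not the scheme of [HAM06] or any real hardware. Across a planar surface, depth varies nearly linearly, so successive deltas are small and nearly constant and could be stored in far fewer bits, while decoding recovers every value exactly.

```c
#include <stdint.h>

/* Illustrative delta encoding of a scanline of depth values.
 * On coherent surfaces most deltas are tiny, so they need far
 * fewer bits than the raw values; the round trip is lossless. */
void delta_encode(const int32_t *depth, int32_t *delta, int n) {
    delta[0] = depth[0];
    for (int i = 1; i < n; ++i)
        delta[i] = depth[i] - depth[i - 1];
}

void delta_decode(const int32_t *delta, int32_t *depth, int n) {
    depth[0] = delta[0];
    for (int i = 1; i < n; ++i)
        depth[i] = depth[i - 1] + delta[i];
}
```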
Taking advantage of this compressibility, especially using dedicated circuitry
in a hardware renderer, requires storing the depth values separately from the
Figure 14.6: The GL_RGBA8 buffer format packs four 8-bit normalized fixed-point values representing red, green, blue, and coverage, each on [0, 1], into every 32-bit pixel. This format allows efficient, word-aligned access to an entire pixel for a memory system with 32-bit words. A 64-bit system might fetch two pixels at once and mask off the unneeded bits, although if processing multiple pixels of an image in parallel, both pixels likely need to be read anyway.
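The packing described in Figure 14.6 can be written out directly. The byte order chosen here (red in the high byte) is an assumption for illustration; real APIs expose several orderings.

```c
#include <stdint.h>

/* Sketch of a GL_RGBA8-style texel: four 8-bit normalized channels
 * in one 32-bit word. Byte order is an assumption for illustration. */
static uint32_t pack_rgba8(uint8_t r, uint8_t g, uint8_t b, uint8_t a) {
    return ((uint32_t)r << 24) | ((uint32_t)g << 16) |
           ((uint32_t)b << 8)  |  (uint32_t)a;
}

/* Map an 8-bit channel to its normalized value on [0, 1]. */
static float to_unit(uint8_t c) {
    return c / 255.0f;
}
```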
Figure 14.7: A real “stencil” is
a piece of paper with a shape
cut out of it. The stencil is
placed against a surface and then
painted over. When the stencil
is removed, the surface is only
painted where the holes were. A
computer graphics stencil is a
buffer of data that provides sim-
ilar functionality.
1. The framebuffer is an abstraction of an older idea called the “frame buffer,” which
was a buffer that held the pixels of the frame. The modern parallel-rendering term
is “framebuffer” as a nod to history, but note that it is no longer an actual buffer. It
stores the other buffers (depth, color, stencil, etc.). Old “frame buffers” stored multiple
“planes,” or kinds of values, at each pixel, but they often stored these values interleaved
within each pixel, using an array-of-structs model. Parallel processors don't work as well
with an array of structs, so a struct of arrays became preferred for the modern “framebuffer.”