Pixel unpack buffer objects can be used to stream texture data to the GPU.
The application can allocate a pixel unpack buffer and then map regions
of the buffer for updates. When the calls to load the data into OpenGL are
made (e.g., glTexSubImage*), these functions can return immediately
because the data already resides in GPU-accessible memory (or can be
copied later; an immediate copy does not need to be made as it does with
client-side data). We recommend using pixel unpack buffer objects in
situations where the performance and memory utilization of texture upload
operations are important to the application.
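As a rough sketch of how this might look in practice, the following code
streams an update to a 2D texture through a pixel unpack buffer. The
256 x 256 RGBA8 size, the assumption that the texture was already allocated
at that size, and the placeholder fill of the mapped memory are illustrative
choices, not code from this chapter; a real application would write its own
texel data into the mapped range.

   // Sketch: streaming a texture update through a pixel unpack buffer.
   // Assumes `texture` is a 2D texture already allocated as 256x256 RGBA8.
   #include <GLES3/gl3.h>
   #include <string.h>

   void UpdateTextureWithPBO(GLuint texture)
   {
      const GLint width = 256, height = 256;          // placeholder size
      const GLsizeiptr imageSize = width * height * 4; // RGBA, 1 byte/channel
      GLuint pbo;

      // Allocate a pixel unpack buffer large enough for one image
      glGenBuffers(1, &pbo);
      glBindBuffer(GL_PIXEL_UNPACK_BUFFER, pbo);
      glBufferData(GL_PIXEL_UNPACK_BUFFER, imageSize, NULL, GL_STREAM_DRAW);

      // Map a region of the buffer and write the new texel data into it
      void *ptr = glMapBufferRange(GL_PIXEL_UNPACK_BUFFER, 0, imageSize,
                                   GL_MAP_WRITE_BIT |
                                   GL_MAP_INVALIDATE_BUFFER_BIT);
      memset(ptr, 0xFF, imageSize);   // placeholder for real image data
      glUnmapBuffer(GL_PIXEL_UNPACK_BUFFER);

      // With a pixel unpack buffer bound, the last parameter of
      // glTexSubImage2D is interpreted as a byte offset into the buffer,
      // so no copy of client memory is required and the call can return
      // immediately.
      glBindTexture(GL_TEXTURE_2D, texture);
      glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, width, height,
                      GL_RGBA, GL_UNSIGNED_BYTE, (void *)0);

      glBindBuffer(GL_PIXEL_UNPACK_BUFFER, 0);
   }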
Summary
This chapter covered how to use textures in OpenGL ES 3.0. We
introduced the various types of textures: 2D, 3D, cubemaps, and 2D
texture arrays. For each texture type, we showed how the texture can be
loaded with data either in full, in subimages, or by copying data from
the framebuffer. We detailed the wide range of texture formats available
in OpenGL ES 3.0, which include normalized texture formats, floating-
point textures, integer textures, shared exponent textures, sRGB textures,
and depth textures. We covered all of the texture parameters that can be
set for texture objects, including filter modes, wrap modes, depth texture
comparison, and level-of-detail settings. We explored how to set texture
parameters using the more efficient sampler objects. Finally, we showed
how to create immutable textures, which can help reduce the draw-time
overhead of using textures. Along the way, several example programs
demonstrated how textures are read in the fragment shader. With all this information
under your belt, you are well on your way toward using OpenGL ES 3.0
for many advanced rendering effects. Next, we cover more details of the
fragment shader that will help you further understand how textures can
be used to achieve a wide range of rendering techniques.
 
 