24.6 Rendering
Visual presentation of the data is the last stage of the pipeline before involving the
user. The basic B-mode ultrasound images can be depicted on a screen in a straight-
forward manner, with pixel intensities varying according to the echo amplitude. Doppler
information can be added as a color encoding of the blood-flow direction. Other
data, such as tissue strain, can also be included in the 2D image as overlays. Another example
of an overlay is the CycleStack Plot, which superimposes the respiratory signal onto a
selected feature of interest in the ultrasound image [41]. Doctors use this information
to account for the respiration-induced motion of the tumor in order to minimize the
damage done by certain tumor treatments.
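As a concrete illustration of this mapping, the following sketch converts a log-compressed echo envelope to gray values and overlays a color-coded flow direction wherever a Doppler estimate is available. The function name, the inputs, and the 60 dB dynamic range are illustrative assumptions rather than part of any particular scanner pipeline.

import numpy as np

def bmode_with_doppler(envelope, flow=None, dynamic_range_db=60.0):
    """Map echo amplitudes to gray levels and overlay color-coded flow.

    envelope: 2D array of non-negative echo amplitudes.
    flow: optional 2D array of flow velocities (NaN where no Doppler
          estimate exists); positive values mean flow towards the probe.
    Returns an RGB image with values in [0, 1].
    """
    # Log compression: echo amplitudes span a very large dynamic range.
    db = 20.0 * np.log10(envelope / (envelope.max() + 1e-12) + 1e-12)
    gray = np.clip(1.0 + db / dynamic_range_db, 0.0, 1.0)
    rgb = np.repeat(gray[..., None], 3, axis=-1)

    if flow is not None:
        valid = np.isfinite(flow)
        vmax = np.nanmax(np.abs(flow)) + 1e-12
        mag = np.clip(np.abs(flow) / vmax, 0.0, 1.0)
        towards = valid & (flow > 0)   # conventionally shown in red
        away = valid & (flow < 0)      # conventionally shown in blue
        rgb[towards] = np.stack([mag[towards],
                                 np.zeros_like(mag[towards]),
                                 np.zeros_like(mag[towards])], axis=-1)
        rgb[away] = np.stack([np.zeros_like(mag[away]),
                              np.zeros_like(mag[away]),
                              mag[away]], axis=-1)
    return rgb

In practice the color overlay is usually restricted to a user-defined color box and blended with the underlying gray values rather than simply replacing them, as done here for brevity.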
Freehand ultrasound In Sect. 24.3.1, we discussed how freehand ultrasound
systems can be used to create large volumes by putting the images into a 3D spatial context.
Garrett et al. presented a technique for correct visibility ordering of the images using
binary space partitioning (BSP) trees [18]. Visualization of large volumes, however, leads to visual clutter.
Therefore, Gee et al. extended existing re-slicing tools to create narrow-band volumes,
which contain fewer elements and are easier to present [20].
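The ordering problem can be made concrete with a small sketch. The cited approach builds parallel BSP trees to handle arbitrarily oriented and intersecting slices; the snippet below shows only the simpler underlying idea, a painter's-style back-to-front compositing of non-intersecting slices, and all names and inputs are hypothetical.

import numpy as np

def composite_freehand_slices(slices, view_dir):
    """Back-to-front compositing of tracked freehand ultrasound slices.

    slices: list of (center, rgba) pairs, where center is the 3D position
            of a slice and rgba a premultiplied float RGBA image of that
            slice already resampled to a common screen-space grid.
    view_dir: unit vector pointing from the scene towards the viewer.

    This plain painter's sort is only correct when slices do not
    intersect; the cited work [18] uses BSP trees to obtain a correct
    ordering even for intersecting slices.
    """
    view_dir = np.asarray(view_dir, dtype=float)
    # Sort slices from farthest to nearest along the viewing direction.
    ordered = sorted(slices, key=lambda s: np.dot(s[0], view_dir))
    frame = np.zeros_like(ordered[0][1])
    for _, rgba in ordered:
        alpha = rgba[..., 3:4]
        # "Over" operator, applied back to front.
        frame = rgba + (1.0 - alpha) * frame
    return frame

A BSP tree removes the non-intersection assumption by splitting the slice polygons along each other's planes before traversal, so every fragment receives an unambiguous depth order.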
3D ultrasound data is less straightforward to present because of its inherent properties. In an early
work, Nelson and Elvins discussed the effectiveness of existing techniques for presenting
3D ultrasound data, such as surface fitting and volume rendering [46]. Later, seven
ultrasound-dedicated volume projection techniques were evaluated by Steen and
Olstad [72]. These included maximum intensity projection (MIP), average intensity
projection (AIP) and gradient magnitude projection (GMP). The techniques were
applied to 3D fetal data, where GMP was found to give the best detail and the greatest
robustness to changes in the viewing parameters.
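The three projections can be stated compactly. The sketch below assumes the volume has already been resampled along the viewing rays into a hypothetical ray_samples array; MIP and AIP follow their standard definitions, while the GMP branch is one simple gradient-weighted variant and not necessarily the exact formulation evaluated in [72].

import numpy as np

def project_rays(ray_samples, mode="mip"):
    """Reduce rays through an ultrasound volume to a 2D image.

    ray_samples: array of shape (H, W, N); for every output pixel,
                 N intensity samples taken along the viewing ray.
    mode: "mip", "aip" or "gmp".
    """
    if mode == "mip":
        # Maximum intensity projection: the brightest sample wins.
        return ray_samples.max(axis=-1)
    if mode == "aip":
        # Average intensity projection: mean intensity along the ray.
        return ray_samples.mean(axis=-1)
    if mode == "gmp":
        # Gradient magnitude projection (one simple variant): weight each
        # sample by the magnitude of the intensity change along the ray,
        # which emphasises tissue boundaries.
        grad = np.abs(np.gradient(ray_samples, axis=-1))
        weights = grad / (grad.sum(axis=-1, keepdims=True) + 1e-12)
        return (weights * ray_samples).sum(axis=-1)
    raise ValueError("unknown projection mode: %s" % mode)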
Data definition in the polar coordinate system is another challenge for ultra-
sound volume rendering. Kuo et al. presented a technique for quick on-the-fly scan-
conversion [39]. To reduce the costs of the functional evaluation of tan(φ) and tan(ψ),
the functional values were pre-calculated and stored in a texture as a look-up table.
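The idea of trading per-frame trigonometry for memory look-ups can be sketched as follows. Instead of storing only the tangent values, the snippet below bakes the entire polar-to-Cartesian mapping of a 2D curvilinear scan into a table that is computed once and reused for every frame; the parameter names and the nearest-neighbour interpolation are purely illustrative and do not reproduce the exact scheme of [39].

import numpy as np

def build_scan_conversion_lut(out_shape, pixel_size, apex_distance,
                              r_min, dr, dtheta, n_samples, n_beams):
    """Pre-compute, per output pixel, which (sample, beam) to read."""
    h, w = out_shape
    z, x = np.meshgrid(np.arange(h) * pixel_size,
                       (np.arange(w) - w / 2) * pixel_size, indexing="ij")
    z = z + apex_distance            # distance from the virtual apex
    r = np.sqrt(x * x + z * z)       # radius of each output pixel
    theta = np.arctan2(x, z)         # beam angle of each output pixel
    sample_idx = np.round((r - r_min) / dr).astype(int)
    beam_idx = np.round(theta / dtheta).astype(int) + n_beams // 2
    valid = ((sample_idx >= 0) & (sample_idx < n_samples) &
             (beam_idx >= 0) & (beam_idx < n_beams))
    return sample_idx, beam_idx, valid

def scan_convert(polar_frame, lut):
    """Apply a pre-computed LUT to one polar frame (nearest neighbour)."""
    sample_idx, beam_idx, valid = lut
    out = np.zeros(sample_idx.shape, dtype=polar_frame.dtype)
    out[valid] = polar_frame[sample_idx[valid], beam_idx[valid]]
    return out

The table depends only on the probe geometry and the output image size, so the per-frame work reduces to memory look-ups.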
Surface rendering is a common tool for many imaging modalities. In ultrasound,
the low signal-to-noise ratio and the discontinuity of tissue boundaries that run parallel
to the ultrasound beam make it difficult to define smooth surfaces. Smoothing of a surface can be performed at the
rendering stage. Fattal et al. presented an approach to render smooth surfaces from
3D ultrasound [15]. The surface is extracted based on a variational principle. Fuzzy
surface rendering is then done by a technique called oriented splatting, which
creates triangles aligned with the gradient of the surface function. Each triangle is
colored with a Gaussian function and rendered in back-to-front order. Wang
et al. proposed an improved surface rendering technique for 3D ultrasound data of
fetuses [77]. To remove noise while preserving edges, a modified anisotropic diffusion
is first applied to the dataset. To enhance the low intensities that appear due
to signal loss as the sound wave propagates through the tissue, a light absorption
function based on the distance from a point is applied to the data. Finally, a texture-
based surface rendering is used, where the texture is extracted from images of infants.
The textures are warped and blended with the surface of the fetus face.
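The two preprocessing steps can be illustrated with a short sketch: a Perona-Malik-style anisotropic diffusion and an exponential distance-based compensation term. Both are simplified stand-ins for the modified formulations used in [77], and every parameter value below is illustrative.

import numpy as np

def anisotropic_diffusion(volume, n_iter=10, kappa=30.0, lam=0.15):
    """Perona-Malik-style diffusion: smooth speckle, preserve edges.

    The diffusion coefficient decays with the local gradient magnitude,
    so strong edges diffuse little while homogeneous regions are
    smoothed. Boundaries are handled by wrap-around for brevity.
    """
    vol = volume.astype(float).copy()
    for _ in range(n_iter):
        for axis in range(vol.ndim):
            fwd = np.roll(vol, -1, axis=axis) - vol   # forward difference
            bwd = np.roll(vol, 1, axis=axis) - vol    # backward difference
            c_fwd = np.exp(-(fwd / kappa) ** 2)       # edge-stopping term
            c_bwd = np.exp(-(bwd / kappa) ** 2)
            vol += lam * (c_fwd * fwd + c_bwd * bwd)
    return vol

def compensate_depth_attenuation(volume, source_point, spacing, alpha=0.05):
    """Boost intensities that are weak only because of depth-dependent
    signal loss, using a simple exponential model of absorption with
    distance from a reference point (not the exact model of [77])."""
    idx = np.indices(volume.shape).astype(float)
    dist = np.sqrt(sum(((idx[d] - source_point[d]) * spacing[d]) ** 2
                       for d in range(volume.ndim)))
    return volume * np.exp(alpha * dist)

In a full pipeline, the diffused and compensated volume would then be passed on to the surface extraction and texturing steps described above.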