Image Processing Reference
In-Depth Information
FIGURE 1.8
Capturing the sequence of moving pictures. (Diagram: successive exposures along the time axis in integration mode; digitization of the time coordinate by sampling with integration.)
In capturing moving images, still images are taken at a constant time interval. Although the physical quantity "time" has an essentially continuous analog distribution, images are picked up at timings and for durations determined by the imaging system. This is the digitization of the time coordinate. The output at pixel r_k, color c_l, and time t_f* is S(r_k, c_l, t_f), and a set of such outputs S makes up one moving picture. As a result of the formation of built-in six-dimensional coordinate points, the information to be captured, which is essentially seven-dimensional, is compressed to the one remaining dimension of light intensity information only. This is also a significant data compression.
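The sampling-with-integration idea can be sketched numerically. In this minimal illustration, the continuous intensity signal, the frame interval, and the exposure length are invented placeholders, not values from the text:

```python
import numpy as np

def intensity(t):
    """Hypothetical continuous (analog) light intensity at one pixel."""
    return 1.0 + 0.5 * np.sin(2.0 * np.pi * 3.0 * t)

frame_interval = 1.0 / 30.0  # still images taken at a constant time interval
exposure = 1.0 / 60.0        # exposure (integration) length within each frame
n_frames = 5

frames = []
for f in range(n_frames):
    t0 = f * frame_interval
    # Digitize the time coordinate: integrate the analog intensity
    # over the exposure period of frame f (approximated by the mean
    # value times the exposure length).
    t = np.linspace(t0, t0 + exposure, 1000)
    s = float(intensity(t).mean() * exposure)
    frames.append(s)

print(frames)  # one integrated sample S per frame
```

Each element of `frames` is one sample S at a built-in time coordinate point; the continuous signal between exposures is discarded, which is the compression described above.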
As was discussed, on one hand, the information that constructs optical images is a set of continuous analog quantities (intensity, space, wavelength, and time), as shown in Figure 1.9a. On the other hand, an image signal captured by an image sensor is only the light intensity information arriving at the built-in coordinate points of space, wavelength, and time, as shown in Figure 1.9b. This is how image sensors work. The physical entity of each built-in coordinate point is a pixel, covered with a one-color filter having the corresponding spectral distribution, observed during the exposure period, as shown in the graphics in Figure 1.9b. The pixel output S(r_mn) at address r_mn is expressed by
    S(r_mn) = ∫∫∫_{Δx Δy, Δλ, Δt} i(r, λ, t) f(r, λ) A(r, λ) dλ dr dt        (1.1)
where i, f, and A are the light intensity distribution of the optical image, the spectral distribution of the color filter, and the spectral sensitivity distribution of the image sensor, respectively. By integrating over the ranges Δx Δy in space, Δλ in wavelength, and Δt in time, the light intensity signal is sampled. Therefore, the quality of the information on space, color, and time is determined by the sampling frequency and the width of the built-in coordinate points in (r, c, t) space; that is, it is fixed by the system design. What remains is the accuracy and the dynamic range of the light intensity signal S.
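Equation (1.1) can be approximated by discrete sums over the four integration ranges. In this toy sketch, the grids and the distributions i, f, and A are all illustrative assumptions (a sinusoidal intensity, a Gaussian "green" filter, and a flat sensitivity), not measured data:

```python
import numpy as np

# Discretized integration ranges for one built-in coordinate point
# (all grids below are illustrative choices).
x = np.linspace(0.0, 1.0, 8)            # Δx across the pixel aperture
y = np.linspace(0.0, 1.0, 8)            # Δy
lam = np.linspace(400e-9, 700e-9, 16)   # Δλ, visible band in meters
t = np.linspace(0.0, 1.0 / 60.0, 12)    # Δt, exposure period

dx, dy = x[1] - x[0], y[1] - y[0]
dlam, dt = lam[1] - lam[0], t[1] - t[0]

def i_dist(x, y, lam, t):
    """Light intensity distribution of the optical image (toy model)."""
    return 1.0 + 0.1 * np.sin(2.0 * np.pi * 30.0 * t)

def f_filter(lam):
    """Spectral distribution of a 'green' color filter (Gaussian placeholder)."""
    return np.exp(-((lam - 550e-9) / 50e-9) ** 2)

def A_sensor(lam):
    """Spectral sensitivity of the image sensor (flat placeholder)."""
    return 0.8

# S(r_mn) ≈ Σ_x Σ_y Σ_λ Σ_t  i · f · A · dx dy dλ dt
X, Y, L, T = np.meshgrid(x, y, lam, t, indexing="ij")
S = float(np.sum(i_dist(X, Y, L, T) * f_filter(L) * A_sensor(L)) * dx * dy * dlam * dt)
print(S)
```

Narrowing any range (a smaller aperture Δx Δy, a narrower filter band Δλ, or a shorter exposure Δt) reduces the integrated signal S, which is the trade-off between resolution in (r, c, t) and the strength of the sampled intensity.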
The general coordinate points of space, wavelength, and time digitized in (r, c, t) space are shown in Figure 1.10a. The position coordinate r, the color coordinate c, and the time coordinate t correspond to the pixel at that position, the color of the filter at that pixel, and the frame number of the exposure in chronological order, respectively. The number of coordinate points of the space coordinate r, the color coordinate, and the time coordinate is the same as the
* As will be discussed in Chapter 4, a single exposure time in moving pictures is called a frame.