5 Camera Calibration
Several aspects of the camera network need to be calibrated:

• Intrinsic calibration determines camera parameters such as focal length and color response.
• Extrinsic spatial calibration determines the position of the camera relative to the scene that it views and, in the case of camera networks, to the other cameras in the system.
• Temporal calibration determines the time at which frames are captured and, in the case of camera networks, how the cameras are synchronized relative to each other.
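The intrinsic and extrinsic parameters above can be summarized by the standard pinhole projection model, which is what calibration ultimately recovers. The following sketch (with illustrative parameter values, not from the text) projects a world point into pixel coordinates using intrinsics K and extrinsics R, t:

```python
# Sketch of the pinhole model that calibration recovers. K holds the
# intrinsics (focal lengths, principal point, skew); R and t are the
# extrinsics mapping world coordinates into the camera frame.
# All numeric values below are illustrative, not from the text.
import numpy as np

def project(K, R, t, X_world):
    """Project a 3-D world point into 2-D pixel coordinates."""
    X_cam = R @ X_world + t          # extrinsic: world -> camera frame
    x = K @ X_cam                    # intrinsic: camera frame -> image
    return x[:2] / x[2]              # perspective divide

K = np.array([[800.0,   0.0, 320.0],   # fx, skew, cx
              [  0.0, 800.0, 240.0],   # fy, cy
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                          # camera aligned with world axes
t = np.array([0.0, 0.0, 5.0])          # camera 5 units from the origin

# A point on the optical axis projects to the principal point (cx, cy).
print(project(K, R, t, np.array([0.0, 0.0, 0.0])))  # -> [320. 240.]
```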
Some calibration problems for camera networks are similar to the calibration
problems for a single camera. Camera networks also introduce new calibration
problems. Without proper calibration, we cannot properly compare data or analysis
results between cameras.
The book by Hartley and Zisserman [ 10 ] provides a thorough discussion of camera calibration. A full review of the subject is beyond the scope of this article; we concentrate here on distributed algorithms to solve calibration problems.
However, we can identify a few basic techniques for calibration. Intrinsic calibration
is necessary to determine the relationship between image features and objects in
world coordinates. When we have multiple cameras, we need to determine the
relationships between the cameras as well as the internal parameters of each camera.
The fundamental matrix describes the relationship between two views of a point
in space. The three-view geometry problem determines the relationships among three cameras; it is based on the correspondence between two lines and a point.
The trifocal tensor describes the required relationship, along with a set of internal
constraints. The four-view geometry problem is yet more complex and is often
solved using the simpler affine camera model. This problem can be generalized to
the n -camera case.
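The two-view relationship described by the fundamental matrix can be made concrete with a small numerical check. In this sketch (all camera parameters are illustrative), F is assembled from known K, R, t purely for demonstration; in practice F is estimated from point correspondences:

```python
# Illustrative check of the two-view epipolar constraint x2^T F x1 = 0.
# F is built here from known K, R, t for demonstration only; in practice
# it is estimated from point correspondences. Values are made up.
import numpy as np

def skew(v):
    """Cross-product matrix [v]_x, so that skew(v) @ u == np.cross(v, u)."""
    return np.array([[   0.0, -v[2],  v[1]],
                     [  v[2],   0.0, -v[0]],
                     [ -v[1],  v[0],   0.0]])

K = np.diag([800.0, 800.0, 1.0])        # shared intrinsics, zero skew
R = np.eye(3)                           # second camera: pure translation
t = np.array([1.0, 0.0, 0.0])           # baseline along the x axis

# F = K^-T [t]_x R K^-1 (essential matrix E = [t]_x R, lifted to pixels)
Kinv = np.linalg.inv(K)
F = Kinv.T @ skew(t) @ R @ Kinv

X = np.array([0.3, -0.2, 4.0])          # a world point in front of both
x1 = K @ X;           x1 /= x1[2]       # its projection in camera 1
x2 = K @ (R @ X + t); x2 /= x2[2]       # its projection in camera 2

print(abs(x2 @ F @ x1))                 # ~0: the epipolar constraint holds
```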
Radke et al. [ 23 ] developed a distributed algorithm for the metric calibration
of camera networks. External calibration of a camera determines the position of
the camera in world coordinates using a rotation matrix and translation vector.
Intrinsic parameters include focal length, location of the principal point, and skew.
Calibrating cameras through image processing is usually more feasible than by
directly measuring camera position. Devarajan et al. model the camera network
using a communication graph and a vision graph. The communication graph is based
on network connectivity—it has an edge between nodes that directly communicate;
this graph can be constructed using standard ad-hoc network techniques. The vision
graph is based on signal characteristics—it has an edge between two nodes that have
overlapping fields-of-view; this graph needs to be constructed during the calibration
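A minimal sketch of constructing such a vision graph (the `min_shared` threshold and feature-set representation are hypothetical, not from Devarajan et al.): an edge joins two cameras whose views share enough corresponding features, in contrast to the communication graph, whose edges come from network connectivity.

```python
# Hypothetical sketch of vision-graph construction: cameras become
# neighbors when their views share enough corresponding features.
from itertools import combinations

def build_vision_graph(features_per_camera, min_shared=4):
    """features_per_camera: dict camera_id -> set of feature identifiers.
    Returns the vision graph as a set of undirected edges (id pairs)."""
    edges = set()
    for a, b in combinations(features_per_camera, 2):
        shared = features_per_camera[a] & features_per_camera[b]
        if len(shared) >= min_shared:   # enough overlap to calibrate a pair
            edges.add((a, b))
    return edges

features = {
    "cam0": {"p1", "p2", "p3", "p4", "p5"},
    "cam1": {"p2", "p3", "p4", "p5", "p6"},  # overlaps cam0's view
    "cam2": {"p9", "p10"},                   # sees a disjoint region
}
print(build_vision_graph(features))          # -> {('cam0', 'cam1')}
```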
process. Each camera is described by a 3 × 4 matrix P_i that gives the rotation matrix and optical center of the camera. The intrinsic parameter matrix for camera i is known as K_i. A set of points X = {X_1, ..., X_n} is used as reference points for