6.5.3.2 Numerical Optimization
At this point, we assume we have an initial estimate for the minimizer of
Equation (6.57), parameterized appropriately. In this section, we sketch how to
actually solve the bundle adjustment optimization problem numerically. These optimization algorithms are "under the hood" of the software packages visual effects
artists use to solve the matchmoving problem. While Appendix A.4 goes into more
detail on general nonlinear optimization, we give the main ideas here.
First, we note that Equation (6.57) is the sum of squared functions of the camera
parameters and scene points. For the moment, let's denote the vector of parameters
to be optimized as $\theta$, and a stacked vector of all the feature point observations as
$\mathbf{x}$. For a given estimate of the parameters $\theta$, we compute the reprojections of all the
feature points to produce an estimate of $\mathbf{x}$; we denote this estimate as a multi-valued
function $\hat{\mathbf{x}} = f(\theta)$. Therefore, we can rewrite Equation (6.57) concisely as:

\[
F(\theta) = (\mathbf{x} - f(\theta))^\top (\mathbf{x} - f(\theta)) \tag{6.60}
\]
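To make this concrete, here is a minimal sketch of evaluating Equation (6.60); the reprojection function f and the observation vector x are assumed to be supplied by the caller (neither is spelled out in the text, so the names here are purely illustrative):

```python
import numpy as np

def cost(theta, x, f):
    """Evaluate Equation (6.60): F(theta) = (x - f(theta))^T (x - f(theta)).

    theta : stacked vector of camera parameters and scene points
    x     : stacked vector of observed feature positions
    f     : hypothetical reprojection function mapping theta to
            predicted feature positions (assumed supplied by the caller)
    """
    r = np.asarray(x) - np.asarray(f(theta))  # residual vector x - f(theta)
    return r @ r                              # sum of squared residuals
```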
where $F$ is the function we want to minimize. Let's expand the cost function $F$ in
a Taylor series approximation about some point $\theta_t$:

\[
F(\theta) \approx F(\theta_t) + \frac{\partial F}{\partial \theta}(\theta_t)\,(\theta - \theta_t) + \frac{1}{2}(\theta - \theta_t)^\top \frac{\partial^2 F}{\partial \theta^2}(\theta_t)\,(\theta - \theta_t) \tag{6.61}
\]
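As a small illustration, the right-hand side of Equation (6.61) can be evaluated as follows, assuming the value, gradient, and Hessian of F at θ_t have already been computed (all names are hypothetical):

```python
import numpy as np

def quadratic_model(theta, theta_t, F_t, g_t, H_t):
    """Evaluate the second-order Taylor model of Equation (6.61).

    F_t : F(theta_t), a scalar
    g_t : the gradient dF/dtheta at theta_t, a vector
    H_t : the Hessian d^2F/dtheta^2 at theta_t, a matrix
    """
    d = np.asarray(theta) - np.asarray(theta_t)  # displacement theta - theta_t
    return F_t + g_t @ d + 0.5 * d @ H_t @ d
```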
The minimizer $\theta$ of this function is given by setting the gradient of Equation (6.61)
to zero:

\[
\theta = \theta_t - \left[ \frac{\partial^2 F}{\partial \theta^2}(\theta_t) \right]^{-1} \frac{\partial F}{\partial \theta}(\theta_t) \tag{6.62}
\]
This suggests an iterative process for minimizing $F$: we start with a good estimate of the minimizer (obtained at the end of Euclidean reconstruction), form the
quadratic approximation in Equation (6.61) around this point, and iterate

\[
\theta_{t+1} = \theta_t - \left[ \frac{\partial^2 F}{\partial \theta^2}(\theta_t) \right]^{-1} \frac{\partial F}{\partial \theta}(\theta_t) \tag{6.63}
\]
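A minimal sketch of this iteration, assuming the caller supplies callables for the gradient and Hessian of F, and using a fixed iteration count in place of a proper convergence test:

```python
import numpy as np

def newton_minimize(theta0, grad, hess, num_iters=20):
    """Iterate Equation (6.63) starting from the estimate theta0.

    grad(theta) and hess(theta) are assumed to return dF/dtheta and
    d^2F/dtheta^2 at theta; both callables are hypothetical stand-ins
    for whatever derivative code a real implementation provides.
    """
    theta = np.asarray(theta0, dtype=float)
    for _ in range(num_iters):
        # Solve H(theta_t) * step = dF/dtheta(theta_t) rather than
        # forming an explicit matrix inverse.
        step = np.linalg.solve(hess(theta), grad(theta))
        theta = theta - step
    return theta
```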
We can compute that

\[
\frac{\partial F}{\partial \theta}(\theta_t) = -2\, J(\theta_t)^\top (\mathbf{x} - f(\theta_t)) \tag{6.64}
\]
where $J$ is the Jacobian matrix defined by

\[
J(\theta_t) = \frac{\partial f}{\partial \theta}(\theta_t) \tag{6.65}
\]
That is, the $(j,k)$th element of $J$ is the partial derivative of the $j$th reprojection $\hat{x}_j$ with
respect to the $k$th parameter $\theta_k$. We will return to the structure of this Jacobian matrix
in a moment, since it has critical implications for designing a fast bundle adjustment
algorithm.
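For illustration only, J can be approximated by finite differences, and Equation (6.64) then gives the gradient. Real bundle adjusters use analytic derivatives and exploit the Jacobian's structure, so this dense sketch (with the same hypothetical f and x as above) is purely pedagogical:

```python
import numpy as np

def jacobian_fd(f, theta, eps=1e-6):
    """Approximate J(theta), whose (j, k) entry is the partial derivative
    of the j-th reprojection with respect to the k-th parameter, using
    central finite differences. f is assumed to return a NumPy vector."""
    theta = np.asarray(theta, dtype=float)
    m = np.asarray(f(theta)).size
    J = np.empty((m, theta.size))
    for k in range(theta.size):
        d = np.zeros_like(theta)
        d[k] = eps
        J[:, k] = (f(theta + d) - f(theta - d)) / (2.0 * eps)
    return J

def grad_F(theta, x, f):
    """Gradient from Equation (6.64): dF/dtheta = -2 J(theta)^T (x - f(theta))."""
    J = jacobian_fd(f, theta)
    return -2.0 * J.T @ (np.asarray(x) - np.asarray(f(theta)))
```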
The other important quantity in Equation (6.63) is the matrix of second partial
derivatives, also called the Hessian. This matrix is impractical (and generally unnecessary) to compute exactly, and optimization algorithms differ in how they approximate it.
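One common approximation, sketched here for concreteness rather than as the definitive scheme, is the Gauss-Newton choice: dropping the second-derivative terms of f gives ∂²F/∂θ² ≈ 2J(θ_t)ᵀJ(θ_t), and substituting this and Equation (6.64) into Equation (6.63) yields the familiar normal-equations step:

```python
import numpy as np

def gauss_newton_step(theta, x, f, jacobian):
    """One iteration of Equation (6.63) under the Gauss-Newton
    approximation d^2F/dtheta^2 ~ 2 J^T J; the factor of 2 cancels
    against the -2 in the gradient of Equation (6.64)."""
    J = jacobian(theta)                       # e.g., jacobian_fd from above
    r = np.asarray(x) - np.asarray(f(theta))  # residual x - f(theta)
    # Normal equations: (J^T J) step = J^T r
    step = np.linalg.solve(J.T @ J, J.T @ r)
    return theta + step
```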
 