Edges that result from view recognitions are called observation edges. Edges that are formed by combining other edges (described below in Sect. 2.8.2) are called hybrid edges.
2.7.2 Graph Construction
After each image is processed by the front end, the graph representation is updated:
A new pose node is added for the current pose, and the new node is connected to
the preceding pose node by a motion edge, encoding the accumulated differential
motion estimate between the two poses.
If an existing view has been observed, an observation edge is created from the
observed view node to the pose node, encoding the observation constraint. When
the back end is operating in SE(2), the relative pose estimate (in SE(3)) is first
projected into SE(2) before creating the observation edge.
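A minimal sketch of such a projection is shown below, assuming the SE(3) relative pose is given as a 4x4 homogeneous transform and that the SE(2) heading is taken from the yaw of the rotation block; the function name and conventions are illustrative and not the system's actual API.

```python
import numpy as np

def project_se3_to_se2(T):
    """Project a 4x4 SE(3) relative pose onto SE(2) as (x, y, theta).

    Illustrative convention: keep the in-plane translation (x, y) and take
    theta as the yaw of the rotation block; roll, pitch, and z are dropped.
    """
    R = T[:3, :3]
    t = T[:3, 3]
    # Yaw extracted from the rotation matrix (Z-Y-X Euler convention).
    theta = np.arctan2(R[1, 0], R[0, 0])
    return np.array([t[0], t[1], theta])
```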
If a new view has been created, one of the recent pose nodes corresponding to the
view is promoted to a view node.
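The per-image graph update described above can be summarized in a small sketch. The node and edge classes below are hypothetical stand-ins for the system's actual graph structures; only the sequence of operations (new pose node, motion edge, optional observation edge, optional view promotion) follows the text.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    id: int
    pose: tuple            # (x, y, theta) estimate
    is_view: bool = False  # view nodes anchor observation edges

@dataclass
class Edge:
    source: int
    dest: int
    mean: tuple            # constraint mean mu(e)
    cov: list              # constraint covariance Sigma(e), 3x3
    kind: str              # "motion" or "observation"

@dataclass
class PoseGraph:
    nodes: dict = field(default_factory=dict)
    edges: list = field(default_factory=list)

    def add_pose(self, prev_id, new_id, new_pose, motion, motion_cov):
        """Add a pose node for the current pose and connect it to the
        preceding pose node by a motion edge encoding the accumulated
        differential motion estimate."""
        self.nodes[new_id] = Node(new_id, pose=new_pose)
        self.edges.append(Edge(prev_id, new_id, motion, motion_cov, "motion"))

    def add_observation(self, view_id, pose_id, rel_pose, rel_cov):
        """Add an observation edge from an observed view node to a pose node."""
        self.edges.append(Edge(view_id, pose_id, rel_pose, rel_cov, "observation"))

    def promote_to_view(self, node_id):
        """Promote a recent pose node to a view node when a new view is created."""
        self.nodes[node_id].is_view = True
```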
2.7.3 Incremental Optimization
The graph flexibly represents the GMRF corresponding to the SLAM estimation
problem. The negative log-likelihood of the parameter estimates (encoded by the
nodes) is the sum of the residuals of the edges. Denote the edge set by E = {e_i}. For an
edge e ∈ E, the source and destination nodes are given by s(e) and d(e), respectively. The
edge's constraint mean is denoted by μ(e) and the covariance by Σ(e). Then the
negative log-likelihood L of the graph (up to a constant offset) is given in terms of
the residuals v_i by
v_i = μ(e_i) · s(e_i) · d(e_i)^{-1}    (2.1)

L = ∑_i v_iᵀ Σ(e_i)^{-1} v_i    (2.2)
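Reusing the hypothetical Edge fields from the earlier sketch, Eqs. (2.1)-(2.2) can be evaluated as below, with SE(2) poses treated as 3x3 homogeneous matrices, the residual transform flattened to a 3-vector, and the constraint convention d ≈ μ·s implied by Eq. (2.1); the helper names are illustrative only.

```python
import numpy as np

def se2_matrix(x, y, theta):
    """3x3 homogeneous matrix for an SE(2) pose (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x], [s, c, y], [0.0, 0.0, 1.0]])

def se2_vector(T):
    """(x, y, theta) vector for a 3x3 SE(2) matrix."""
    return np.array([T[0, 2], T[1, 2], np.arctan2(T[1, 0], T[0, 0])])

def negative_log_likelihood(node_poses, edges):
    """Sum of Mahalanobis residual terms over all edges (Eqs. 2.1-2.2).

    node_poses: dict mapping node id -> (x, y, theta)
    edges: iterable with source, dest, mean (x, y, theta), cov (3x3) fields
    """
    total = 0.0
    for e in edges:
        S = se2_matrix(*node_poses[e.source])
        D = se2_matrix(*node_poses[e.dest])
        M = se2_matrix(*e.mean)
        # Residual transform v = mu(e) * s(e) * d(e)^-1 (Eq. 2.1), as a 3-vector.
        v = se2_vector(M @ S @ np.linalg.inv(D))
        # Accumulate v^T Sigma(e)^-1 v (Eq. 2.2).
        total += v @ np.linalg.inv(np.asarray(e.cov)) @ v
    return total
```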
When the node pose estimates better satisfy the constraints encoded in the edges,
the negative log-likelihood
L is lower. Graph optimization increases the likelihood
of the GMRF parameters by minimizing the negative log-likelihood as a function of
the node parameters.
Because computation time must be bounded and the graph is continually grow-
ing and changing, any feasible graph optimization technique must be incremental.
Several methods are described in, e.g., [7]. Any general method for incremental
nonlinear optimization can be applied successfully to the graph.
We employ spanning tree and blob-based optimizations, which are run for a fixed
number of iterations at each time step following the graph update.
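The spanning-tree and blob-based optimizers themselves are not reproduced here; as a stand-in, the sketch below runs a fixed number of crude numerical gradient-descent iterations on the negative_log_likelihood of the previous sketch, simply to illustrate bounded, incremental refinement of the node parameters at each time step.

```python
import numpy as np

def incremental_optimize(node_poses, edges, fixed_id, iterations=5,
                         step=0.01, eps=1e-6):
    """Run a fixed number of gradient-descent iterations on the graph.

    Illustrative only: lowers the negative log-likelihood (Eq. 2.2) by
    central-difference gradient descent on every non-fixed node pose.
    """
    ids = [i for i in node_poses if i != fixed_id]
    for _ in range(iterations):
        for i in ids:
            pose = np.asarray(node_poses[i], dtype=float)
            grad = np.zeros(3)
            for k in range(3):
                bumped = pose.copy()
                bumped[k] += eps
                node_poses[i] = tuple(bumped)
                hi = negative_log_likelihood(node_poses, edges)
                bumped[k] -= 2 * eps
                node_poses[i] = tuple(bumped)
                lo = negative_log_likelihood(node_poses, edges)
                grad[k] = (hi - lo) / (2 * eps)
            # Take one small step downhill and keep the updated estimate.
            node_poses[i] = tuple(pose - step * grad)
    return node_poses
```

Anchoring one node (fixed_id) removes the gauge freedom of the graph; the step size and iteration count are arbitrary choices for the sketch, not values used by the system.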