RPROP. The same network was trained to perform different tasks, as specified by
different image degradation procedures.
The networks reconstruct images iteratively and are able to resolve local ambi-
guities by integrating partial results as context. This is similar to the recently demon-
strated belief propagation in graphical networks with cycles. The main difference is
that the approach proposed here learns horizontal and vertical feedback loops that
produce rich multiscale representations of the images, whereas current belief
propagation approaches use trees or arrays, which capture only the vertical or
only the horizontal dependencies, respectively.
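The iterative, context-integrating reconstruction described above can be sketched as a simple refinement loop. The update function below is a hypothetical hand-crafted stand-in for the learned recurrent pyramid update, used only to illustrate the control flow:

```python
import numpy as np

def reconstruct(degraded, step, iterations=10):
    """Iteratively refine an estimate of the clean image.

    At each step the previous estimate is fed back as context, so
    local ambiguities can be resolved using partial results from
    earlier iterations (and, in the real network, from coarser
    pyramid levels)."""
    estimate = degraded.copy()
    for _ in range(iterations):
        estimate = step(degraded, estimate)
    return estimate

def smooth_step(observed, estimate):
    """Toy update standing in for the learned recurrent update:
    average each pixel with its 4-neighborhood (lateral context),
    then blend with the observed input (bottom-up evidence)."""
    padded = np.pad(estimate, 1, mode="edge")
    neighbors = (padded[:-2, 1:-1] + padded[2:, 1:-1]
                 + padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    return 0.5 * observed + 0.5 * neighbors
```

Calling `reconstruct(noisy, smooth_step)` suppresses isolated pixel noise while the blend with the observed input keeps the estimate anchored to the evidence; the trained network replaces this fixed update with learned horizontal and vertical feedback loops.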
Furthermore, the proposed network can be trained to compute an objective func-
tion directly, while inference in belief networks with cycles is only approximate due
to the multiple counting of evidence. Recently, Yedidia, Freeman, and Weiss pro-
posed generalized belief propagation [247], which allows for better approximations of
the inference process. It would be interesting to investigate the relationship between
this approach and the hierarchical recurrent neural networks.
The iterative reconstruction is not restricted to static images. In this chapter it
was shown that the recurrent Neural Abstraction Pyramid network is able to inte-
grate information over time in order to reconstruct digits from a sequence of images
that were degraded by random background level, contrast reduction, occlusion, and
pixel noise. The training method also allows the desired output to change at
each time step. Thus, the networks should be able to reconstruct video sequences.
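The four degradations listed above can be sketched as follows; the specific magnitudes and the occlusion size are illustrative assumptions, not the settings used in the experiments:

```python
import numpy as np

def degrade(img, rng):
    """Apply the four degradations from the text to an image in [0, 1]:
    contrast reduction, random background level, occlusion, and pixel
    noise. All parameter values below are illustrative."""
    out = img.astype(float)
    out = 0.5 * out                          # contrast reduction
    out += rng.uniform(0.0, 0.4)             # random background level
    r = rng.integers(0, img.shape[0] - 4)
    c = rng.integers(0, img.shape[1] - 4)
    out[r:r + 4, c:c + 4] = 0.0              # occlusion: blank a square patch
    out += rng.normal(0.0, 0.05, img.shape)  # additive pixel noise
    return np.clip(out, 0.0, 1.0)

def degrade_sequence(img, rng, length=5):
    """Produce a sequence of independently degraded views of one digit,
    as used when reconstructing by integrating information over time."""
    return [degrade(img, rng) for _ in range(length)]
```

Because each frame is degraded independently, different frames occlude different regions, so an estimate that integrates the sequence over time can recover detail missing from any single frame.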