optical axis of camera i lies along the x_w axis. Camera i is then moved in accordance
with Table 19.3.
Figures 19.5 and 19.6 show time profiles of generalized coordinates and visible
image features. The control mode was switched from take-off control to normal
tracking control at 10 s and from normal tracking control to landing control at 20
s. It is seen from Figure 19.5 that the helicopter tracked the reference well. In particular,
the errors in x and y remained within ±0.1 m. Figure 19.6 shows that occlusion took
place during the take-off and the landing. The bottom plot of Figure 19.6 is a close-up
of the middle plot. It is seen from the bottom plot that the set of visible image features
changed frequently because of occlusion caused by the rotors. The experimental result
also demonstrates the robustness of the visual servo control method with occlusion
handling proposed in [2].
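The occlusion-handling feature extraction itself is described in [2]. As a rough illustration of why dropping occluded features need not upset the controller, the following Python sketch (not the authors' implementation; the function names, the feature dictionaries, and the gain are hypothetical) evaluates a classical image-based visual servo law over the currently visible features only, so that points momentarily hidden by the rotors simply drop out of the control computation.

import numpy as np

def interaction_matrix(u, v, z, f=1.0):
    # Classical 2x6 interaction (image Jacobian) matrix for one point feature
    # at normalized image coordinates (u, v) with estimated depth z.
    return np.array([
        [-f / z, 0.0, u / z, u * v / f, -(f * f + u * u) / f, v],
        [0.0, -f / z, v / z, (f * f + v * v) / f, -u * v / f, -u],
    ])

def visual_servo_velocity(features, visible, gain=0.5):
    # features: list of dicts with keys 'u', 'v', 'z', 'u_ref', 'v_ref'
    # visible:  list of booleans, False where the feature is occluded
    rows, errors = [], []
    for feat, vis in zip(features, visible):
        if not vis:                       # occluded, e.g. by a rotor blade:
            continue                      # the feature is simply left out
        rows.append(interaction_matrix(feat['u'], feat['v'], feat['z']))
        errors.append(feat['u'] - feat['u_ref'])
        errors.append(feat['v'] - feat['v_ref'])
    if len(rows) < 3:                     # fewer than 3 visible points cannot
        return np.zeros(6)                # constrain 6 DOF; command zero velocity
    L = np.vstack(rows)                   # (2k x 6) stacked interaction matrix
    e = np.array(errors)                  # (2k,) image-space error vector
    return -gain * np.linalg.pinv(L) @ e  # 6-DOF camera velocity command

In this formulation, discarding the rows of occluded features leaves the error terms of the visible ones unchanged, so brief rotor occlusions only reweight the pseudo-inverse solution. This is consistent with the experiment, in which the visible set changes frequently during take-off and landing while tracking is maintained.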
19.9 Conclusions
This chapter presented an automatic control method for take-offs, landings, and
reference tracking of an unmanned helicopter using visual servoing with occlusion
handling. The proposed control system provides high accuracy in take-offs, hovering,
and landings. In the reported experiment, the helicopter stayed within 0.1 m of the
reference, while the helicopter itself is only 0.4 m long.
Several movies can be found at http://www.ic.is.tohoku.ac.jp/E/research/helicopter/.
They illustrate the stability, convergence, and robustness of the system more directly
than the figures in this chapter can.
References
[1] Chesi, G., Hashimoto, K.: Configuration and robustness in visual servo. Journal of Robotics and Mechatronics 16(2), 178-185 (2004)
[2] Iwatani, Y., Watanabe, K., Hashimoto, K.: Image feature extraction with occlusion handling for visual servo control. In: IEEE International Conference on Robotics and Automation (2008)
[3] Ludington, B., Johnson, E., Vachtsevanos, G.: Augmenting UAV autonomy: vision-based navigation and target tracking for unmanned aerial vehicles. IEEE Robotics & Automation Magazine 13(3), 63-71 (2006)
[4] Merino, L., Wiklund, J., Caballero, F., Moe, A., Dios, J., Forssén, P., Nordberg, K., Ollero, A.: Vision-based multi-UAV position estimation: Localization based on blob features for exploration missions. IEEE Robotics & Automation Magazine 13(3), 53-62 (2006)
[5] Ruffier, F., Franceschini, N.: Visually guided micro-aerial vehicle: automatic take off, terrain following, landing and wind reaction. In: IEEE International Conference on Robotics and Automation (2004)