in parallel (e.g., a force-based guidance mode along the needle axis combined with a
vision-based virtual fixture to position the needle and a position-based alignment
fixture), and to sequence them. Ideas along these lines can be found in [17].
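As a rough illustration of how such guidance modes can act on a single commanded motion, the admittance-type virtual fixture law scales down motion components outside a preferred direction while passing preferred-direction motion through. The sketch below is a minimal, generic version of that idea, not the specific controller of the cited work; the needle-axis direction, commanded velocity, and attenuation gain are illustrative values.

```python
import numpy as np

def fixture_projection(preferred_dirs):
    """Orthogonal projector onto the span of the preferred motion directions."""
    D = np.atleast_2d(preferred_dirs).T   # columns span the preferred subspace
    return D @ np.linalg.pinv(D)          # D (D^T D)^{-1} D^T

def apply_fixture(u, preferred_dirs, k):
    """Attenuate components of the commanded motion u outside the fixture.

    k = 1 leaves u unchanged (no guidance); k = 0 enforces a hard fixture
    that permits motion only along the preferred directions.
    """
    P = fixture_projection(preferred_dirs)
    return P @ u + k * (np.eye(len(u)) - P) @ u

# Illustrative guidance fixture along an assumed needle (tool z) axis:
needle_axis = np.array([0.0, 0.0, 1.0])
u = np.array([0.3, 0.2, 1.0])             # operator's commanded velocity
v = apply_fixture(u, needle_axis, k=0.1)  # off-axis motion scaled by 0.1
```

Running two such fixtures in parallel amounts to intersecting (or stacking) their preferred subspaces; sequencing them corresponds to switching which fixture's projector is active as the task progresses.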
Acknowledgements. We would like to acknowledge the contributions of Panadda Maray-
ong and Allison Okamura to this work. This material is based upon work supported by the
National Science Foundation under Grant Nos. IIS-0099770 and EEC-9731478. Any opin-
ions, findings, and conclusions or recommendations expressed in this material are those of
the author and do not necessarily reflect the views of the National Science Foundation.
References
[1] Bettini, A., Lang, S., Okamura, A., Hager, G.: Vision assisted control for manipulation using virtual fixtures. In: Proc. IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1171-1176 (2001)
[2] Bettini, A., Lang, S., Okamura, A., Hager, G.: Vision assisted control for manipulation using virtual fixtures: Experiments at macro and micro scales. In: Proc. IEEE International Conference on Robotics and Automation, pp. 3354-3361 (2002)
[3] Chaumette, F., Rives, P., Espiau, B.: Classification and realization of the different vision-based tasks. In: Hashimoto, K. (ed.) Visual Servoing, pp. 199-228. World Scientific, Singapore (1994)
[4] Corke, P., Hutchinson, S.: A new partitioned approach to image-based visual servo control. IEEE Trans. Robot. Autom. 17(4), 507-515 (2001)
[5] Cowan, N., Weingarten, J., Koditschek, D.: Vision-based control via navigation functions. IEEE Trans. Robot. Autom. (2002) (to appear)
[6] Dewan, M., Marayong, P., Okamura, A., Hager, G.D.: Vision-based assistance for ophthalmic micro-surgery. In: Proceedings of the Seventh International Conference on Medical Image Computing and Computer-Assisted Intervention (MICCAI), vol. 2, pp. 49-57 (2004)
[7] Dodds, Z.: Task specification languages for uncalibrated visual servoing. Ph.D. thesis, Yale University (2000)
[8] Dodds, Z., Hager, G., Morse, A., Hespanha, J.: Task specification and monitoring for uncalibrated hand/eye coordination. In: Proc. IEEE International Conference on Robotics and Automation, pp. 1607-1613 (1999)
[9] Forsyth, D., Ponce, J.: Computer Vision: A Modern Approach. Prentice Hall, Englewood Cliffs (2002)
[10] Hager, G.D.: A modular system for robust hand-eye coordination. IEEE Trans. Robot. Autom. 13(4), 582-595 (1997)
[11] Hager, G.D., Toyama, K.: The "XVision" system: A general purpose substrate for real-time vision applications. Computer Vision and Image Understanding 69(1), 23-27 (1998)
[12] Haralick, R.M., Shapiro, L.G.: Computer and Robot Vision. Addison-Wesley, Reading (1993)
[13] Hespanha, J., Dodds, Z., Hager, G.D., Morse, A.S.: What tasks can be performed with an uncalibrated stereo vision system? International Journal of Computer Vision 35(1), 65-85 (1999)
[14] Hutchinson, S., Hager, G.D., Corke, P.: A tutorial on visual servo control. IEEE Trans. Robot. Autom. 12(5), 651-670 (1996)