12. Cheng Y (2010) Real-time surface slope estimation by homography alignment for spacecraft
safe landing. In: Proceedings of the IEEE international conference on robotics and automation,
pp 2280-2286
13. Chroust SG, Vincze M (2004) Fusion of vision and inertial data for motion and structure
estimation. J Robot Syst 21(2):73-83
14. Conroy J, Gremillion G, Ranganathan B, Humbert J (2009) Implementation of wide-field
integration of optic flow for autonomous quadrotor navigation. Auton Robot 27(3):189-198
15. Corke P (2004) An inertial and visual sensing system for a small autonomous helicopter. J Robot Syst 21(2):43-51
16. Crazyflie Micro Quadrotor. http://www.bitcraze.se/crazyflie/
17. Di K, Li R (2004) CAHVOR camera model and its photogrammetric conversion for planetary
applications. J Geophys Res 109:E04004
18. Fraundorfer F, Heng L, Honegger D, Lee GH, Meier L, Tanskanen P, Pollefeys M (2012) Vision-based autonomous mapping and exploration using a quadrotor MAV. In: IROS, pp 4557-4564
19. Gemeiner P, Einramhof P, Vincze M (2007) Simultaneous motion and structure estimation by
fusion of inertial and vision data. Int J Robot Res 26(6):591-605
20. Goldberg SB, Matthies L (2011) Stereo and IMU assisted visual odometry on an OMAP3530
for small robots. In: 2011 IEEE computer society conference on computer vision and pattern
recognition workshops (CVPRW), pp 169-176
21. Hardkernel. http://www.hardkernel.com
22. How JP, Bethke B, Frank A, Dale D, Vian J (2008) Real-time indoor autonomous vehicle test
environment. IEEE Control Syst Mag 28(2):51-64
23. Hrabar S, Sukhatme GS, Corke P, Usher K, Roberts J (2005) Combined optic-flow and stereo-based navigation of urban canyons for a UAV. In: IROS
24. Huster A, Frew EW, Rock SM (2002) Relative position estimation for AUVs by fusing bearing and inertial rate sensor measurements. In: Proceedings of the oceans conference, vol 3. MTS/IEEE, Biloxi, pp 1857-1864
25. Hyslop AM, Humbert JS (2010) Autonomous navigation in three-dimensional urban environments using wide-field integration of optic flow. J Guid Control Dyn 33(1):147
26. Johnson A, Montgomery J, Matthies L (2005) Vision guided landing of an autonomous helicopter in hazardous terrain. In: Proceedings of the IEEE international conference on robotics and automation, pp 3966-3971
27. Jones E (2009) Large scale visual navigation and community map building. PhD thesis, University of California at Los Angeles
28. Jones E, Soatto S (2010) Visual-inertial navigation, mapping and localization: a scalable real-time causal approach. Int J Robot Res 30:407-430
29. Kelly J, Sukhatme GS (2011) Visual-inertial sensor fusion: localization, mapping and sensor-to-sensor self-calibration. Int J Robot Res 30(1):56-79
30. Klein G, Murray D (2007) Parallel tracking and mapping for small AR workspaces. In: Proceedings of the 2007 6th IEEE and ACM international symposium on mixed and augmented reality, ISMAR'07. IEEE Computer Society, p 110
31. Kuwata Y, Teo J, Fiore G, Karaman S, Frazzoli E, How JP (2009) Real-time motion planning with applications to autonomous urban driving. IEEE Trans Control Syst Technol 17(5):1105-1118
32. Luders B, Karaman S, Frazzoli E, How J (2010) Bounds on tracking error using closed-loop rapidly-exploring random trees. In: American control conference, Baltimore, MD, pp 5406-5412
33. Lupton T, Sukkarieh S (2008) Removing scale biases and ambiguity from 6DoF monocular SLAM using inertial. In: International conference on robotics and automation, Pasadena, California
34. Lupton T, Sukkarieh S (2009) Efficient integration of inertial observations into visual SLAM
without initialization. In: IEEE/RSJ international conference on intelligent robots and systems,
St. Louis