Sophisticated solutions using active sensors (lidar, radar, etc.) exist for large aircraft,
but they are in general unsuitable for small platforms with limited power, payload,
and computational resources. To cope with these limitations, we developed a novel
stereo vision-based obstacle avoidance approach that is especially suited for onboard
implementation on small aerial vehicles. Our approach is inspired by bird vision [36],
using a forward-looking stereo camera system to provide depth information in the
direction of flight, which can be expanded by range estimates from peripheral monocular
optical flow. In Sect. 4.3, we explain our stereo vision-based obstacle avoidance
system, which is designed for fast execution with a small memory footprint by using
(1) a polar-perspective world representation in disparity space, (2) configuration space
(C-space) expansion in image space, and (3) collision checking implemented as a
z-buffer-like operation in disparity space. For motion planning, we use a closed-loop
RRT approach that incorporates a vehicle model to plan local avoidance maneuvers
in full 3D, which we believe to be scalable to flights at higher speeds.
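The z-buffer analogy can be made concrete with a short sketch: a candidate waypoint is projected into the image, its expected disparity is computed from the stereo geometry, and that value is compared against the C-space-expanded disparity map. The following C++ fragment illustrates the idea under simplifying assumptions (a rectified pinhole camera and a dense, pre-expanded disparity image); the data structure and function names are illustrative only and are not taken from our implementation.

#include <cstddef>
#include <vector>

// Minimal sketch of a z-buffer-like collision test in disparity space.
// Assumes a rectified pinhole stereo camera and a dense disparity image that
// has already been expanded by the vehicle's C-space radius in image space.
struct DisparityMap {
  int width, height;
  float fx, fy, cx, cy;    // intrinsics of the rectified left camera
  float baseline;          // stereo baseline in meters
  std::vector<float> disp; // per-pixel disparity (pixels), C-space expanded
  float at(int u, int v) const {
    return disp[static_cast<std::size_t>(v) * width + u];
  }
};

// Returns true if an obstacle in the map lies at or in front of the waypoint
// (x, y, z), given in the camera frame with z pointing along the optical axis.
bool inCollision(const DisparityMap& m, float x, float y, float z) {
  if (z <= 0.0f) return true;                      // behind the camera: treat as blocked
  const int u = static_cast<int>(m.fx * x / z + m.cx + 0.5f);
  const int v = static_cast<int>(m.fy * y / z + m.cy + 0.5f);
  if (u < 0 || u >= m.width || v < 0 || v >= m.height)
    return false;                                  // outside the field of view
  const float d_wp = m.fx * m.baseline / z;        // expected waypoint disparity
  return m.at(u, v) >= d_wp;                       // larger disparity = closer obstacle
}

Because the C-space expansion is already applied in image space, checking a waypoint reduces to a single per-pixel comparison, which is what keeps the collision test fast enough for onboard execution.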
As an example of a high-level navigation task, we explain autonomous landing
with our MAV platform in Sect. 4.4. Autonomous landing is especially important not
only for safety reasons, but also for mission endurance. Small rotorcraft inherently
suffer from short mission endurance, since payload restrictions do not allow
carrying large batteries. For surveillance or exploration tasks, endurance can be
greatly improved by not requiring the platform to be airborne at all times. Instead,
such tasks may even favor a steady, quiet observer at a strategic location (e.g., high
vantage points such as rooftops or the tops of telephone poles), still with the ability to
move if required, which could also include recharging while in sleep mode (e.g.,
from solar cells).
4.1.1 Embedded Hardware Platforms
To evaluate the performance of our algorithms on an embedded system, we tested our
framework with two different MAV platforms: an Asctec Pelican quadrotor equipped
with an Asctec Mastermind flight computer (Core2Duo, 2 × 1.86 GHz CPU [4]; total
weight: 1.3 kg), and an Asctec Hummingbird quadrotor equipped with either an
Odroid-X2 or a modified Odroid-U2 flight computer (total weight: 500 g; Fig. 4.1).
Both Asctec MAV platforms share the same low-level autopilot boards that include
a MEMS IMU, and were equipped with a downward-looking Matrix Vision camera
(mvBlueFOX-MLC200wG, CMOS, 752 × 480, grayscale, global shutter, up to
90 fps, 18.3 g with 100° FOV lens) that is connected to the flight computer.
The Odroid board (manufactured by Hardkernel [21]) is based on the Samsung
Exynos 4412 system-on-a-chip (SoC), a quad-core processor for mobile applications
that provides four ARM Cortex-A9 cores for parallel computation while consuming
only 2.2 W (CPU only). For our implementation, we removed all unnecessary
hardware components from the U2 in order to save weight, including various
connectors and the original heat sink. The final weight of the U2 flight computer was
12 g, including the SD card that hosts the operating system.