Biomedical Engineering Reference
In-Depth Information
gradient index (GRIN) lenses with overlapping
fields of view. The system was designated the
Multi-Aperture Vision System (MAVS) [82].
Bruckner et al. [83] developed an array of sen-
sors based on the apposition compound eye. In
their design, an array of pinhole aperture photo-
sensors was coupled with a microlens array in a
layered arrangement. The pitch of the two arrays
was dissimilar, which enabled different viewing
directions of the separate optical channels. The
Gaussian overlapped response of neighboring
sensors in the array provided for the localization
of a point source with hyperacuity.
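The hyperacuity effect described above can be sketched numerically: for two equal-width Gaussian acceptance profiles, the log-ratio of the two channel responses is linear in the source position, so a noise-free point source can be located far more finely than the channel spacing. The parameters below (SIGMA, C1, C2) are illustrative assumptions, not values from the cited sensor.

```python
import math

# Illustrative parameters (assumed, not from the cited sensor):
SIGMA = 1.0         # Gaussian acceptance-profile width (degrees)
C1, C2 = -0.5, 0.5  # optical axes of two neighboring channels (degrees)

def response(center, x):
    """Gaussian angular sensitivity of one channel to a source at x."""
    return math.exp(-(x - center) ** 2 / (2 * SIGMA ** 2))

def localize(r1, r2):
    """Recover source position from the response ratio of two channels.

    For equal-width Gaussian profiles, ln(r1/r2) is linear in the
    source position, so a noise-free source is located far more
    finely than the 1-degree channel spacing (hyperacuity).
    """
    d = C2 - C1
    return (C1 + C2) / 2 - SIGMA ** 2 * math.log(r1 / r2) / d

# A source placed well inside the inter-channel spacing:
x_true = 0.123
x_est = localize(response(C1, x_true), response(C2, x_true))
print(abs(x_est - x_true) < 1e-9)  # True: exact recovery without noise
```

In practice photoreceptor noise limits the attainable precision, but the linear log-ratio relation is what makes sub-channel localization possible at all.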
Ogata et al. [84] also employed a layered sen-
sor technique to develop an apposition-style
8 × 8 sensor. The sensor consisted of microlens,
pinhole, and photodiode arrays in tightly cou-
pled layers. Jeong et al. [85] developed tech-
niques to manufacture a biologically inspired
three-dimensional, spherical compound eye
employing microlens technology. They noted
that light from a distant point source
impinging on the omnidirectional array would
have a different coupling efficiency with each
ommatidium [85].
Tanida et al. [86-88] developed a compact
image-capturing system designated TOMBO
(Thin Observation Module for Bound Optics).
The TOMBO system employs compound-eye
imaging optics to capture a set of unit
images from which the object image is
reconstructed by post-processing.
Hoshino et al. [89] developed an insect-
inspired retina chip that integrates a microlens
array, a photodiode array, and an electrostatically
driven scanning slit. This system can image a
contrast grating with high temporal resolution.
Considerable work has been devoted to the
development of a sensor based on neural super-
position compound eyes [37, 39, 40]. In neural
superposition eyes, the overlapping Gaussian
photoreceptor acceptance profiles provide for
motion hyperacuity. To achieve the overlapped
response, a variety of optical configurations have
been employed, including optical fibers equipped
with ball lenses, off-the-shelf photodiodes, and
optical fibers equipped with small lenslets [40].
Each of these configurations is depicted sche-
matically in Figure 1.20, with the corresponding
physical sensor prototypes shown in Figure 1.21.
Wilcox et al. developed a VLSI-based array of
neural superposition sensors that demonstrated
hyperacuity [77, 78].
Several research groups have developed a
number of bio-inspired processors that provide
for optic flow and aerial vehicle navigation.
Harrison et al. compiled a noteworthy body of
work based on the fly's flow-field processing.
After studying the fly's system in detail,
Harrison implemented flow-field generators in
silicon.
He developed a single-chip analog VLSI sensor
that detects imminent vehicle collisions by
measuring radially expanding optic flow based
on a delay-and-correlate scheme similar to
that first proposed by Reichardt [90, 91]. Pudas
et al. also developed a bio-inspired optic flow-
field sensor based on low-temperature co-fired
ceramics (LTCC) technology. The process pro-
vides reliable, small-profile optic-flow sensors
that are largely invariant to both contrast and
spatial frequency [92].
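The delay-and-correlate scheme underlying these sensors can be sketched as a minimal Reichardt detector: each photoreceptor signal is delayed (here by a first-order low-pass filter, a common modeling choice) and multiplied with its undelayed neighbor, and the two mirror-symmetric products are subtracted. All names and parameter values below are illustrative assumptions, not taken from the cited chips.

```python
import math

def lowpass(signal, alpha):
    """First-order low-pass filter; serves as the EMD 'delay' stage."""
    out, y = [], 0.0
    for s in signal:
        y += alpha * (s - y)
        out.append(y)
    return out

def reichardt_emd(left, right, alpha=0.2):
    """Delay-and-correlate each input with its undelayed neighbor and
    subtract the mirror-symmetric products; the sign of the output
    indicates the direction of motion across the receptor pair."""
    dl, dr = lowpass(left, alpha), lowpass(right, alpha)
    return [a * r - b * l for a, r, b, l in zip(dl, right, dr, left)]

# A sinusoidal grating drifting from the left receptor to the right
# one: the right receptor sees the same signal a quarter-period later.
n, period = 200, 40
left = [math.sin(2 * math.pi * t / period) for t in range(n)]
right = [math.sin(2 * math.pi * (t - period / 4) / period) for t in range(n)]

out = reichardt_emd(left, right)
print(sum(out[50:]) > 0)  # True: positive mean for this direction
```

Swapping the two inputs negates the output exactly, which is the directional antisymmetry that radially arranged EMDs exploit to detect expanding flow.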
Netter and Franceschini have demonstrated
the ability to control a model unmanned aerial
vehicle (UAV) with biologically inspired opti-
cal flow processes [93], based on earlier work
by Aubépart and Franceschini [48]. A model
UAV was equipped with a 20-photoreceptor
linear array. The photoreceptor outputs were
processed by 19 analog elementary motion
detectors (EMDs). Each of the EMDs detects
motion in a particular direction within a limited
field of view. The overall output from each EMD
is a pulse whose voltage is proportional
to the detected speed. Terrain-following capa-
bility was achieved in the model setup by vary-
ing thrust such that the measured optical flow
was adjusted to the reference optical flow. Other
approaches to hardware sensors that are specifi-
cally sensitive to optical flow have been
designed by researchers such as Chahl and
Mizutani [49], discussed further in Chapter 9.
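The terrain-following principle described above, adjusting thrust until the measured optic flow matches a reference, can be sketched with a deliberately simplified model in which the flow error directly drives the rate of climb or descent. The ventral optic flow of a vehicle at forward speed v and height h is v/h, so regulating the flow to a setpoint fixes the ratio of speed to height. The gains, setpoints, and first-order dynamics below are illustrative assumptions, not values from the cited experiments.

```python
def regulate_height(v=2.0, h0=10.0, flow_ref=0.5, k=2.0,
                    dt=0.01, steps=4000):
    """Adjust height so the measured ventral optic flow v/h is driven
    to flow_ref: when the flow is below the reference the vehicle is
    too high and descends, and vice versa (simplified point model)."""
    h = h0
    for _ in range(steps):
        flow = v / h                       # measured optic flow (rad/s)
        h += -k * (flow_ref - flow) * dt   # descend when flow is too low
    return h

# Equilibrium is h* = v / flow_ref = 4 m, reached from 10 m:
print(abs(regulate_height() - 4.0) < 0.05)  # True
```

Because the equilibrium height scales with forward speed, the same loop automatically flies lower when slowing down and climbs when speeding up, a behavior also observed in insects.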