Mono camera sensors. The imaging sensors cover the visible and near-infrared
spectrum and have a flexible field of view (FOV). Most ADAS functions require
an imaging-sensor dynamic range of at least 115 dB, i.e., more than 19 bits per pixel.
A typical frame rate for an imaging sensor in driver assistance is 15 or more frames/s.
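As a quick sanity check on the 115 dB figure: each bit of pixel depth doubles the representable intensity ratio, adding about 6.02 dB of dynamic range. A minimal sketch of the conversion (function name is illustrative, not from the text):

```python
import math

def dynamic_range_bits(db: float) -> float:
    """Convert a dynamic range in dB to the equivalent pixel bit depth.

    Each additional bit doubles the representable intensity ratio,
    contributing 20 * log10(2) ~= 6.02 dB of dynamic range.
    """
    return db / (20.0 * math.log10(2.0))

# 115 dB works out to roughly 19.1 bits, matching
# the "more than 19 bits per pixel" requirement above.
```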
A stereo camera pair can generate a scene depth map from the disparity
between the views of the two cameras. The stereo camera is the most complex
and complete sensor for driving assistance.
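The disparity-to-depth relation for a rectified stereo pair is the standard pinhole result Z = f·B/d. A minimal sketch, with illustrative parameter values (focal length, baseline, and disparity are assumptions, not figures from the text):

```python
def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth of a scene point from a rectified stereo pair: Z = f * B / d.

    focal_px     : focal length expressed in pixels
    baseline_m   : distance between the two camera centers, in meters
    disparity_px : horizontal pixel shift of the point between the two views
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (zero means point at infinity)")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: f = 1000 px, baseline = 0.3 m, disparity = 10 px -> 30 m.
```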
Time-of-flight sensors give accurate depth information by measuring the time
it takes for emitted energy to return to the sensor:
- Long-range radar with a range of 1-200 m and a response time of about 40 ms.
Long-range radar is suitable for detecting objects in a highway environment.
- Short-range radar with a working range of 0-80 m. Short-range radars are suitable
for near-range detection of vehicles in crowded urban scenarios.
- Lidar (light detection and ranging) scanners are used in combination with a camera
for object detection and tracking in functions such as Adaptive Cruise Control,
Collision Warning, or Pedestrian Detection.
- Ultrasound sensors are used for park assist functions (to calculate distances to
objects to assist the driver in parking the car).
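All of the sensors above share the same underlying calculation: distance is the propagation speed times the round-trip time, divided by two because the pulse travels out and back. A minimal sketch (the constants and helper name are illustrative):

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0   # radar and lidar pulses
SPEED_OF_SOUND_M_S = 343.0           # ultrasound, in air at ~20 degrees C

def tof_distance_m(round_trip_s: float,
                   speed_m_s: float = SPEED_OF_LIGHT_M_S) -> float:
    """Distance from a time-of-flight measurement.

    The emitted pulse covers the sensor-to-object distance twice
    (out and back), so the round-trip path is halved.
    """
    return speed_m_s * round_trip_s / 2.0

# A lidar/radar echo after 1 microsecond corresponds to roughly 150 m;
# an ultrasound echo after 10 ms to about 1.7 m, a typical parking distance.
```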
Near- and far-infrared sensors are used for night vision.
Camera or Radar?
Camera-based systems can offer multiple convenience and safety functions,
including steering control and automatic emergency braking, giving them
a cost advantage over radar- or lidar-based systems. Camera-based systems have a
maximum range of about 50-100 m (depending on the function) and a wider field of view
than long-range radar systems. Imaging technology can categorize object type,
estimate object size, and its wider FOV enables better tracking. Radar is vulnerable
to false positives, especially around road curves, due to its inability to recognize
object type; therefore, in harsh weather conditions with poor visibility, radar
might give the driver a false sense of security.
Mono cameras also have their limitations. Imaging sensors give a 2D perspective of
the 3D scene, losing valuable depth information in the process (depth information
offers valuable clues for separating objects from the background). Obtaining 3D
information from a single camera is an ill-posed problem, which can be solved with a
stereo camera pair. In stereo camera systems, depth uncertainty is a quadratic function
of distance, while distance does not affect the accuracy of radar and lidar systems.
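The quadratic growth of stereo depth uncertainty follows directly from Z = f·B/d: differentiating gives |dZ/dd| = Z²/(f·B), so a fixed disparity error costs quadratically more depth accuracy at long range. A minimal sketch with illustrative parameters (the 0.25 px matching error, focal length, and baseline are assumptions):

```python
def stereo_depth_error_m(z_m: float, focal_px: float, baseline_m: float,
                         disparity_err_px: float = 0.25) -> float:
    """First-order depth uncertainty of a rectified stereo pair.

    From Z = f * B / d it follows that |dZ/dd| = Z^2 / (f * B), so a
    fixed disparity matching error grows quadratically with distance.
    """
    return z_m ** 2 * disparity_err_px / (focal_px * baseline_m)

# With f = 1000 px and B = 0.3 m (illustrative values), the depth error
# is ~8 cm at 10 m but ~2 m at 50 m: 25x worse for 5x the distance.
```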
Relative vibrations between the cameras and large temperature swings drive the need
for online dynamic calibration of the stereo camera system while the vehicle is moving.
This online calibration needs to work without test patterns, must run in the background
while the system is operating, and must be guaranteed for the lifetime of the vehicle.
Small calibration errors should not affect disparity estimation.
Radar offers advantages such as long detection range, good resolution, and the sensing
performance necessary for higher-speed systems. Radar can also operate off-road and
under extreme weather conditions.