Our approach follows a behavioral methodology, performing non-invasive monitoring of external cues that describe the driver's level of drowsiness. We look at this complex problem from a systems engineering point of view: how to go from a proof-of-concept prototype to a stable software framework that can provide a solid basis for future research in this field.
5.2.1 System Initialization: Preparation
The initialization stage consists of analyzing the environment and optimizing several parameters for best performance. Our current prototype uses the camera of an Android-based smartphone to observe the driver. The smartphone should be positioned so as to ensure an appropriate distance and viewing angle between the driver and the camera. Moreover, the lighting conditions must be adequate (above a minimum threshold), and any potential occlusions of the camera view by the vehicle's interior components should be avoided.
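The minimum-lighting requirement can be illustrated with a short sketch. The code below uses the OpenCV Java API, which an Android prototype could plausibly rely on but which is not named in the text; the threshold value, class name, and method name are illustrative assumptions rather than details of the actual system.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public class LightingCheck {
    // Illustrative threshold on the 0-255 grayscale range; the actual
    // minimum used by the prototype is not given in the text.
    private static final double MIN_MEAN_BRIGHTNESS = 60.0;

    /** Returns true if the camera frame is bright enough for reliable detection. */
    public static boolean isLightingAdequate(Mat rgbaFrame) {
        Mat gray = new Mat();
        Imgproc.cvtColor(rgbaFrame, gray, Imgproc.COLOR_RGBA2GRAY);
        Scalar mean = Core.mean(gray);   // average pixel intensity of the frame
        gray.release();
        return mean.val[0] >= MIN_MEAN_BRIGHTNESS;
    }
}
```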
Our system extracts user-specific features during the initialization stage. These features include skin color, head position, and eye features. Once these features have been extracted, the key components (head and eyes) are localized.
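As a minimal sketch of how the skin-color feature could be sampled once the face region has been located (using the detector described next), the code below computes a mean HSV tone inside the detected face rectangle. The single-mean representation and all names are assumptions; the text does not specify how the skin-color feature is actually modeled.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.Rect;
import org.opencv.core.Scalar;
import org.opencv.imgproc.Imgproc;

public class SkinColorModel {
    /**
     * Samples the average skin tone (in HSV space) from a detected face
     * rectangle of an RGBA camera frame. Returned as a user-specific
     * feature for later frames.
     */
    public static Scalar sampleSkinColor(Mat rgbaFrame, Rect faceRect) {
        Mat face = new Mat(rgbaFrame, faceRect);   // ROI view, no pixel copy
        Mat rgb = new Mat();
        Mat hsv = new Mat();
        Imgproc.cvtColor(face, rgb, Imgproc.COLOR_RGBA2RGB);
        Imgproc.cvtColor(rgb, hsv, Imgproc.COLOR_RGB2HSV);
        Scalar meanHsv = Core.mean(hsv);           // average H, S, V over the face
        rgb.release();
        hsv.release();
        return meanHsv;
    }
}
```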
We use the Viola-Jones face/eye detection algorithm [18] because of its speed and simplicity. The algorithm performs well when the user is facing the camera, but its performance deteriorates quickly as the user's gaze moves further away from the camera. Initial tests have shown that the algorithm performs within satisfactory boundaries in terms of both quality and speed. Typical limitations are shown in Figs. 5.1 and 5.2.
Tests have shown that the face detection rate remains high as long as the driver's gaze does not deviate by more than 30 degrees relative to the camera. Conveniently, this allows the phone to be positioned around the dashboard area of the car, which is the most commonly used place for mounting a phone. This position gives us a clear line of sight to the driver without the steering wheel occluding the view.
Fig. 5.1 Eye detection algorithm: limitations due to horizontal angle change. (a) Head rotating to the right: eyes still located properly. (b) Head rotating to the right: eyes lost in the following frame