both in space and in time (heat, pressure, sound, eye perception and so on).
Behavioural strategies are often formulated in response to changes in
environmental conditions, which can in turn be detected as discontinuities in
such signals.
Refining the techniques for perceptual and behavioural processing, together
with the use of effective cognitive modules, is key to the good design of
adaptive interfaces. In this way it is possible to obtain personalized GUIs by
automatically detecting the user's features.
Feature extraction is a model-driven process. The model starts with an
initial configuration that is adapted by detecting the displacement of the
body, the eyes and the mouth [7].
The research literature in perceptual processing includes face and body
detection [8] and automatic extraction of face boundaries [9].
Input measures such as colour, brightness, boundaries and motion can be
regarded as first-order features in a hierarchical scheme. Such measures are
merged to estimate body profiles, eye and mouth movements, and the location
of the facial regions and of the upper limbs.
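As an illustration of how such first-order measures might be computed, the sketch below derives brightness, average colour, a boundary (edge) map and a motion-energy map from a single video frame. The function name, the frame-differencing choice for motion, and the gradient-magnitude choice for boundaries are assumptions for this example, not the method of the cited works.

```python
import numpy as np

def first_order_features(frame_rgb, prev_gray=None):
    """First-order measures from one video frame: colour, brightness,
    boundary strength and motion energy. Illustrative sketch only;
    frame_rgb is an H x W x 3 float array with values in [0, 1]."""
    gray = frame_rgb.mean(axis=2)                    # per-pixel brightness
    mean_colour = frame_rgb.reshape(-1, 3).mean(0)   # average RGB of the frame
    gy, gx = np.gradient(gray)                       # spatial derivatives
    edges = np.hypot(gx, gy)                         # boundary (edge) strength
    # motion energy via frame differencing against the previous frame
    motion = np.abs(gray - prev_gray) if prev_gray is not None \
        else np.zeros_like(gray)
    return {"brightness": gray, "colour": mean_colour,
            "edges": edges, "motion": motion}, gray
```

Higher levels of the hierarchy would then pool these maps over candidate face and limb regions rather than over the whole frame.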
The next level in the hierarchy is a parametric description of the figure, in
which the motion fields of the eye and mouth regions are computed. In
general, such parameter vectors are recorded over time so that the system can
learn numerical indices of the emotional status. A suitable learning
algorithm for this purpose is learning vector quantization (LVQ).
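A minimal LVQ1 sketch is shown below: prototype vectors, each tagged with an emotion class, are moved toward same-class parameter vectors and away from different-class ones. The feature vectors here stand in for the recorded motion-field parameters; the class labels and hyperparameters are illustrative assumptions.

```python
import numpy as np

def train_lvq(samples, labels, prototypes, proto_labels, lr=0.1, epochs=20):
    """LVQ1: for each sample, find the nearest prototype and move it
    toward the sample if the classes match, away otherwise."""
    protos = prototypes.astype(float).copy()
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            i = np.argmin(np.linalg.norm(protos - x, axis=1))  # best match
            step = lr * (x - protos[i])
            protos[i] += step if proto_labels[i] == y else -step
    return protos

def classify(x, protos, proto_labels):
    """Assign x the class of its nearest prototype."""
    return proto_labels[np.argmin(np.linalg.norm(protos - x, axis=1))]
```

After training, an incoming parameter vector is labelled with the emotional class of its nearest prototype, which keeps classification cheap at run time.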
Finally, the shape and motion of the upper part of the body are detected.
As an example, it is possible to evaluate the 3D position of the head and the
shoulders along with their movements.
Status information is related to the eye gaze, while transitions are related
to eye movements. Changes in the eye displacements are classified as gazes or
movements depending on their intensity and direction.
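One simple way to realise this classification is a velocity threshold over successive eye-position samples, as sketched below. The threshold value and the velocity-based rule are assumptions for illustration (the text specifies only that intensity and direction are used).

```python
import math

# Assumed velocity threshold separating fixations from movements;
# the actual value would be calibrated per tracker and per user.
MOVEMENT_VELOCITY = 30.0

def classify_eye_samples(positions, dt):
    """Label each inter-sample displacement as 'gaze' (low velocity)
    or 'movement' (high velocity), with its direction in degrees.
    positions: list of (x, y) samples taken dt seconds apart."""
    events = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        dx, dy = x1 - x0, y1 - y0
        velocity = math.hypot(dx, dy) / dt           # intensity
        direction = math.degrees(math.atan2(dy, dx))  # direction
        label = "movement" if velocity > MOVEMENT_VELOCITY else "gaze"
        events.append((label, velocity, direction))
    return events
```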
Behavioural processing is related to key pressing and mouse data. Key
pressing involves the choice of a particular key and the pressure time. Mouse
data are the pointer coordinates, the strength and frequency of clicks, and
the acceleration of movements. These are the primal features used to update
the user's cognitive model.
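These primal features could be aggregated from raw event logs as in the sketch below. The event record layout and the feature names are hypothetical; click strength is omitted because it depends on hardware not described in the text.

```python
from dataclasses import dataclass

@dataclass
class KeyEvent:
    key: str          # which key was chosen
    hold_time: float  # pressure time in seconds

@dataclass
class MouseSample:
    t: float          # timestamp in seconds
    x: float
    y: float
    clicked: bool

def behavioural_features(keys, mouse):
    """Aggregate key and pointer logs into primal behavioural features
    for the cognitive model (feature names are illustrative)."""
    mean_hold = sum(k.hold_time for k in keys) / len(keys)
    span = mouse[-1].t - mouse[0].t
    click_rate = sum(m.clicked for m in mouse) / span  # clicks per second
    # pointer speed between successive samples
    speeds = []
    for a, b in zip(mouse, mouse[1:]):
        dist = ((b.x - a.x) ** 2 + (b.y - a.y) ** 2) ** 0.5
        speeds.append(dist / (b.t - a.t))
    # movement acceleration: change in speed per unit time
    accels = [(s1 - s0) / (b.t - a.t)
              for s0, s1, (a, b) in zip(speeds, speeds[1:],
                                        zip(mouse[1:], mouse[2:]))]
    return {"mean_key_hold": mean_hold,
            "click_rate": click_rate,
            "mean_speed": sum(speeds) / len(speeds),
            "mean_accel": sum(accels) / len(accels) if accels else 0.0}
```

The resulting feature dictionary is what would be fed, alongside the perceptual indices, into the update step of the user's cognitive model.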
One might think of devising straightforward rules to adapt the interface
layout to the user's emotional status (e.g. if the user gets confused, the
layout should be made as simple as possible). However, this approach is
unlikely to be effective because it does not take the cognitive status of the
user into account. The best interface arrangement depends on the task the
user is engaged in. The system can correctly remove elements from the
interface layout only if, given the nature of the task being performed, the
user will not need them.