disease, type II diabetes, colon cancer, and osteoporosis, and symptoms associated with mental health conditions such as depression and anxiety. Researchers [78] have developed UbiFit Garden, which uses on-body sensing, activity inference, and a novel personal mobile display to encourage physical activity. The UbiFit Garden system consists of three components: a fitness device, an interactive application, and a glanceable display. The fitness device automatically infers several types of physical activity and communicates this information to the glanceable display and the interactive application. The interactive application provides detailed information about the individual's physical activities. The glanceable display, which resides on the background screen of a mobile phone, uses a non-literal, aesthetic representation of physical activities and goal attainment to motivate behavior. The UbiFit application continuously monitors different fitness parameters and builds statistical models of them to compute and project trends, providing better information to users. Several other fitness and physical activity monitoring applications are presented in [73].
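As a rough illustration of the trend computation mentioned above, the sketch below fits a linear trend to a history of daily activity counts and projects it forward. The function name, the sample data, and the choice of a simple least-squares fit are illustrative assumptions, not the actual UbiFit implementation described in [78].

```python
import numpy as np

def project_activity_trend(daily_counts, horizon_days=7):
    """Fit a linear trend to daily activity totals (e.g., inferred
    walking minutes) and project it horizon_days into the future.
    Hypothetical helper, not part of UbiFit."""
    days = np.arange(len(daily_counts))
    # Least-squares linear fit: counts ~ slope * day + intercept
    slope, intercept = np.polyfit(days, daily_counts, deg=1)
    future_days = np.arange(len(daily_counts),
                            len(daily_counts) + horizon_days)
    return slope * future_days + intercept

# Illustrative data: two weeks of inferred daily walking minutes
history = [22, 30, 25, 34, 28, 40, 35, 38, 41, 36, 45, 43, 48, 50]
print(project_activity_trend(history).round(1))
```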
The authors in [79] have shown how body movements and eye movements can be used to provide contextual information to an adaptive hearing instrument to distinguish different hearing needs in various acoustic environments. The authors record body movements, eye movements (using electrooculography), and hearing instrument sound in different simulated acoustic environments. They then use an SVM-based classifier with person-independent training to show that these sensor readings can accurately (in some cases up to 92%) determine the acoustic environment characteristics and modify the settings of the hearing instrument appropriately.
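To make the evaluation setup concrete, the sketch below shows one common way to realize person-independent training with an SVM: leave-one-subject-out cross-validation, where all windows from one subject are held out per fold. The features, labels, and subject grouping here are synthetic placeholders, not the data or exact pipeline of [79].

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Placeholder feature matrix: one row per window of body/eye-movement
# and sound features; the 12 columns are illustrative, not those of [79].
X = rng.normal(size=(300, 12))
y = rng.integers(0, 4, size=300)         # 4 acoustic-environment classes
subjects = np.repeat(np.arange(10), 30)  # 10 subjects, 30 windows each

# Person-independent evaluation: each fold trains on 9 subjects and
# tests on the held-out subject, so no individual appears in both sets.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
scores = cross_val_score(clf, X, y, groups=subjects, cv=LeaveOneGroupOut())
print("per-subject accuracy:", scores.round(2))
```

Holding out whole subjects avoids the optimistic bias that arises when windows from the same person appear in both the training and test sets.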
Correlating different body sensors to monitor dietary activities has been demonstrated in [84]. The authors capture dietary parameters such as the rate of intake (in grams per second), the number of chews per food piece, etc., which capture palatability, satiety, and speed of eating. In particular, three core aspects of dietary activity were investigated using sensors: characteristic arm and trunk movements captured using inertial sensors, chewing of foods and food breakdown sounds using an ear microphone, and swallowing activity using a sensor collar containing surface electromyography (EMG) electrodes and a stethoscope microphone. The authors then build a recognition algorithm using time- and frequency-domain features that addresses multiple challenges of continuous activity recognition, including dynamic adaptability to variable-length activities and flexible deployment by supporting one to many independent classes. The approach uses a sensitive activity event search followed by a selective refinement of the detection using
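As a rough sketch of the kind of time- and frequency-domain features such a recognizer might compute over one sensor window, the function below is illustrative only; the specific features, window length, and sampling rate are assumptions, not those of [84].

```python
import numpy as np

def window_features(signal, fs):
    """Extract simple time- and frequency-domain features from one
    window of a sensor stream (e.g., inertial or microphone data).
    Hypothetical feature set, not the one used in [84]."""
    x = np.asarray(signal, dtype=float)
    feats = {
        "mean": x.mean(),                       # time domain
        "std": x.std(),
        "rms": np.sqrt(np.mean(x ** 2)),
        "zero_crossings": int(np.sum(np.signbit(x[:-1]) != np.signbit(x[1:]))),
    }
    spectrum = np.abs(np.fft.rfft(x))           # frequency domain
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    feats["dominant_freq_hz"] = freqs[np.argmax(spectrum[1:]) + 1]  # skip DC
    feats["spectral_energy"] = float(np.sum(spectrum ** 2) / len(x))
    return feats

# Example: a 1-second window at 100 Hz with a 3 Hz chewing-like component
fs = 100
t = np.arange(fs) / fs
window = np.sin(2 * np.pi * 3 * t) + 0.1 * np.random.randn(fs)
print(window_features(window, fs))
```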