(up to a factor of two). This is true even if one hand remains on the starting point [135,
136]. Furthermore, if people move one arm against a constant force, the other arm can
estimate its position appropriately, but if the force varies during the movement people
make systematic errors [137-139]. This also holds true for actively induced force
changes, for example, when matching the indentation depth of two springs with different
compliances [140].
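To make the compliance-matching paradigm concrete, the following sketch works through the linear spring relation x = c * F (indentation depth x, compliance c, applied force F); the compliance values and function names are illustrative assumptions, not taken from the experiment reported in [140].

```python
# Illustrative only: linear spring model x = c * F, as used in compliance-matching
# experiments like the one described above. All numeric values are made up.

def indentation_depth(force_n: float, compliance_mm_per_n: float) -> float:
    """Indentation depth (mm) of a linear spring under a given force (N)."""
    return compliance_mm_per_n * force_n

def force_for_depth(depth_mm: float, compliance_mm_per_n: float) -> float:
    """Force (N) needed to indent a linear spring to a given depth (mm)."""
    return depth_mm / compliance_mm_per_n

# Two springs with different compliances (hypothetical values).
soft_spring = 2.0   # mm per newton
stiff_spring = 0.5  # mm per newton

# Physically matching a 10 mm indentation requires very different forces,
# which is why varying force cues can bias the perceived depth.
print(force_for_depth(10.0, soft_spring))   # 5.0 N
print(force_for_depth(10.0, stiff_spring))  # 20.0 N
```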
Altogether, the number of illusions in the perception of manipulatory haptic space is
large, and these illusions may depend in particular on the way the kinesthetic system derives
the position and movement of the limbs from the muscle receptors. As noted above, this
relationship is not well understood at present. Thus, although the illusions are quite
interesting for basic research, it is not yet easy to derive general rules from them and apply
those rules to haptic display design. Consequently, the following suggestions remain highly
speculative and will need to be subjected to further investigation.
One general rule may be derived from the fact that in everyday life, when we touch
and look around in space, we are not aware of all these distortions. Visual distortions in
the same direction can explain this only to a small extent [141]. Primarily, it may mean
that when vision is available we do not care much about coarse haptic spatial relationships,
and consequently it may be of minor importance to reproduce these exactly in the haptic part
of a virtual reality. Moreover, specific distortions may suggest specific and potentially useful
applications or directions of research. For example, virtual realities that visually stretch
empty space along haptically overestimated directions may be able to enlarge the virtual
haptic workspace somewhat without the user noticing. Similarly, the fact that space perception
by the two hands is especially inconsistent suggests that the synchronization of bimanual
interfaces could be simplified.
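As a rough illustration of the workspace-stretching idea, the sketch below applies a simple, hypothetical linear mapping between haptic device coordinates and visually displayed coordinates, with an anisotropic gain along one axis that is assumed to be haptically overestimated; the gain value, axis choice, and function names are assumptions for illustration, not a validated design.

```python
import numpy as np

# Hypothetical visuo-haptic mapping: positions measured by the haptic device
# are stretched along one axis before being rendered visually, so that a
# smaller physical workspace covers a larger visual (and perceived) space.
# The 20% gain along the depth axis is an arbitrary illustrative value.

GAIN = np.diag([1.0, 1.0, 1.2])  # stretch the z (depth) axis by 20%

def haptic_to_visual(p_haptic: np.ndarray) -> np.ndarray:
    """Map a 3-D haptic device position to the visually displayed position."""
    return GAIN @ p_haptic

def visual_to_haptic(p_visual: np.ndarray) -> np.ndarray:
    """Inverse mapping, e.g. for placing visual targets inside the reachable workspace."""
    return np.linalg.solve(GAIN, p_visual)

# A 10 cm physical movement in depth is displayed as a 12 cm visual movement.
print(haptic_to_visual(np.array([0.0, 0.0, 0.10])))  # [0.   0.   0.12]
```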
Finally, space distortions stemming from varying forces have been shown to have a
promising counterpart in shape perception, which will be discussed.
In VEs, as in real haptic systems, the perception of objects (and surfaces) in haptic space
can be split into the perception of their different haptic properties, such as material and
geometry. Compared with the other senses, haptics is the only one that can determine the
material properties of an object, such as its weight and its surface properties. Surface
perception has itself been further differentiated into perceptual categories such as roughness,
softness, stickiness, and apparent temperature [142], and most people appear to base their
perception of a surface primarily on a subset of these dimensions, in particular softness and
stickiness [143]. This may mean that the perception resulting from a surface is well described
in terms of these two or three dimensions. Likewise, the display of haptic material properties
in VEs will probably be limited, but should in any case concentrate on these two or three
dimensions.
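If the display of haptic material is indeed reduced to a few perceptual dimensions, a haptic rendering engine only needs to carry those few parameters per surface. The sketch below is a minimal, hypothetical material record along these lines; the field names, value ranges, and the mapping to low-level rendering parameters are assumptions for illustration, not an established standard.

```python
from dataclasses import dataclass

@dataclass
class HapticMaterial:
    """Hypothetical per-surface material record reduced to a few perceptual dimensions.

    All values are normalized to [0, 1]; the choice of dimensions follows the
    perceptual categories discussed above (roughness, softness, stickiness),
    with apparent temperature as an optional extra.
    """
    roughness: float = 0.0
    softness: float = 0.0
    stickiness: float = 0.0
    temperature: float = 0.5  # 0 = cold, 1 = warm (optional dimension)

def to_render_params(m: HapticMaterial) -> dict:
    """Map the perceptual dimensions to illustrative low-level rendering parameters."""
    return {
        "texture_amplitude_mm": 0.5 * m.roughness,              # texture/vibration depth
        "contact_stiffness_n_per_mm": 2.0 * (1.0 - m.softness), # softer -> lower stiffness
        "tangential_friction_coeff": 0.2 + 0.8 * m.stickiness,  # stickier -> more friction
    }

rubber = HapticMaterial(roughness=0.3, softness=0.7, stickiness=0.6)
print(to_render_params(rubber))
```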