Haptics

Haptic perception of force magnitude and force direction

Femke E. van Beek, Wouter M. Bergmann Tiest and Astrid M.L. Kappers

In visual perception, stimulus direction often influences perceived intensity. In haptic perception, this phenomenon has also been observed, for instance in the radial-tangential illusion [Davidon et al., 1964, Quarterly Journal of Experimental Psychology, 16(3), 277-281]. Haptic devices are increasingly being used in tele-operation systems, but they still lack haptic force feedback that feels 'natural'. To improve this, it would be useful to know the relation between perceived force magnitude and force direction. This relation has been investigated in discrimination tasks showing the resolution of perceived force direction [Barbagli et al., 2006, ACM Transactions on Applied Perception, 3(2), 125-135] and magnitude [Vicentini et al., 2010, ACM Transactions on Applied Perception, 8(1), 1-16]. However, tasks showing the (an)isotropy of force perception have not yet been performed. Therefore, the goal of this experiment was to establish the relation between perceived force direction and magnitude. Subjects were presented with a range of force magnitudes in a range of directions and had to estimate both the magnitude and the direction of each force. Preliminary results suggest that force direction influences perceived force magnitude, and that the perception of force direction itself is distorted. [This research is supported by the Dutch Technology Foundation STW]
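
As a purely illustrative sketch of how such (an)isotropy could be quantified (an assumption on our part, not the authors' analysis), one could model the reported magnitude as the true magnitude times a gain that varies sinusoidally with force direction; the model, parameters, and data below are hypothetical.

```python
# Illustrative sketch (not the authors' analysis): model the reported force
# magnitude as the true magnitude times a gain that varies sinusoidally with
# force direction, so g1 captures the degree of anisotropy.
import numpy as np
from scipy.optimize import curve_fit

def model(X, g0, g1, phi):
    """Reported magnitude = true magnitude * direction-dependent gain."""
    magnitude, direction = X  # direction in radians
    return magnitude * (g0 + g1 * np.cos(2 * (direction - phi)))

# Hypothetical responses to forces of 1 N and 2 N in four directions:
true_mag = np.tile([1.0, 2.0], 4)
direction = np.repeat([0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4], 2)
reported = np.array([1.15, 2.3, 1.05, 2.1, 0.9, 1.8, 0.95, 1.95])

(g0, g1, phi), _ = curve_fit(model, (true_mag, direction), reported,
                             p0=(1.0, 0.1, 0.0))
print(f"baseline gain {g0:.2f}, anisotropy {g1:.2f}, "
      f"preferred axis {np.degrees(phi):.0f} deg")
```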

The development of position and force cues in haptic shape perception

Gabriel Baud-Bovy, Giulio Sandini and Monica Gori

Both the finger position and the contact force provide information about the shape of a surface that is manually explored with the fingertip (Robles-de-la-Torre & Hayward, 2000). Previous research has linked relatively late age-related performance improvement in various perceptual tasks to the development of cue integration, both across sensory modalities (Gori et al., 2008) and within the same sensory modality (Nardini et al., 2010). In this study, we measured the absolute thresholds of 7- to 10-year-old children and adults in a haptic curvature perception task. Participants judged the convexity/concavity of a virtual surface rendered with a haptic device in which the position and force cues were systematically manipulated. We found that performance improved markedly with age in all conditions and that the position cue dominated the force cue in all age groups. Notably, the relative weights of the force and position cues varied little across age groups. These results suggest that the marked age-related improvement in performance is probably due to an increase in the reliability of both cues with age.
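
As background for the cue-integration framework cited here (Gori et al., 2008; Nardini et al., 2010), the sketch below shows the standard maximum-likelihood prediction that cue weights follow the cues' relative reliabilities; the threshold numbers are hypothetical, not the study's data.

```python
# Standard maximum-likelihood cue-combination prediction: each cue is weighted
# by its reliability (inverse variance). Threshold values are hypothetical.
import numpy as np

def mle_prediction(sigma_position, sigma_force):
    """Predicted position-cue weight and combined threshold."""
    r_pos, r_force = 1 / sigma_position**2, 1 / sigma_force**2  # reliabilities
    w_pos = r_pos / (r_pos + r_force)
    sigma_combined = np.sqrt(1 / (r_pos + r_force))
    return w_pos, sigma_combined

# Hypothetical single-cue curvature thresholds for a child and an adult:
for label, s_pos, s_force in [("child", 4.0, 8.0), ("adult", 1.5, 3.0)]:
    w_pos, s_comb = mle_prediction(s_pos, s_force)
    print(f"{label}: position weight {w_pos:.2f}, "
          f"predicted combined threshold {s_comb:.2f}")
```

With the ratio of the two thresholds held fixed, the predicted weights are identical for both age groups even though the combined threshold improves, mirroring the reported pattern of stable weights alongside increasing reliability.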

Quantifying haptic exploratory procedures by characterizing hand dynamics and forces

Sander E.M. Jansen, Wouter M. Bergmann Tiest and Astrid M.L. Kappers

Numerous studies investigating eye movements during visual scene exploration propose a strong link between the type of movement made by observers and the task they were given. Analogous to this approach in vision, one can study hand movements during haptic scene exploration. Lederman and Klatzky (1987, Cognitive Psychology, 19, 342-368) proposed a number of exploratory procedures (EPs) that describe links between desired object knowledge and the hand movements that are performed to gather the appropriate information (e.g., 'lateral motion' to establish roughness). From these qualitative descriptions we propose a model that can be used to discriminate between several EPs using hand forces and dynamics. The model is based on variables such as orientation, speed, force, and exploration area. From these, objective measures are devised such that a weighted sum of the scores on each can discriminate between the following EPs: 'contour following', 'lateral motion', 'pressure', and 'static contact'. With such a model, it is possible to examine the link between a certain EP and the desired object knowledge it is presumed to assess. We evaluate its ability to predict the correct object property based solely on participants' hand forces and dynamics during a discrimination task in a 2D haptic scene. [This work was supported by the European Commission with the Collaborative Project no. 248587, 'THE Hand Embodied'.]
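
A minimal sketch of such a weighted-sum discriminator follows; the feature names and weights are hypothetical stand-ins, as the abstract does not specify the actual variables or weighting.

```python
# Illustrative sketch of the kind of model described above: each exploratory
# procedure (EP) is scored as a weighted sum of movement features, and the
# highest-scoring EP is chosen. Feature names and weights are hypothetical.
import numpy as np

FEATURES = ["speed", "normal_force", "exploration_area", "orientation_change"]

# Hypothetical weights: one vector per EP, columns ordered as in FEATURES.
WEIGHTS = {
    "contour following": np.array([ 0.6, -0.2,  0.8,  0.9]),
    "lateral motion":    np.array([ 0.9, -0.1, -0.5,  0.2]),
    "pressure":          np.array([-0.7,  1.0, -0.8, -0.3]),
    "static contact":    np.array([-1.0, -0.4, -0.9, -0.8]),
}

def classify_ep(feature_scores):
    """Return the EP whose weighted feature sum is largest."""
    scores = {ep: float(w @ feature_scores) for ep, w in WEIGHTS.items()}
    return max(scores, key=scores.get), scores

# Example: fast, low-force movement covering a small area (z-scored features).
ep, scores = classify_ep(np.array([1.2, -0.5, -0.8, 0.1]))
print(ep, scores)
```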

From which viewpoint is hand movement represented in 3D object manipulation?

Juan Liu and Hiroshi Ando

People can effortlessly integrate visual and somatosensory information to produce the right motor command, but how the different sensory inputs are combined into a coherent interpretation is not yet clear. Investigating situations in which visual and haptic information is spatially separated and/or rotationally misaligned may shed light on this issue. In our previous work (Liu and Ando, IMRF 2011), we studied 3D object manipulation performance in a misalignment setting in which subjects were asked to look forward while the hand was at the height of the stomach. Subjects' performance was closer to that in the co-location condition if the visual image of their hand was provided from the looking-down viewpoint. To investigate whether the looking-down view is critical to object manipulation performance in any eye-hand de-located setting, the eye and hand positions were reversed; that is, subjects were asked to look down while the hand was at the height of the eyes. The results showed that oriented hand movements were facilitated more by the image from the straight-forward viewpoint than by that from the looking-down viewpoint, which suggests that the mental representation of hand movement may be constructed from the viewpoint from which people would look at the hand in their current posture.

Signal detection study of the effect of sound on the discrimination of hardness

Yuika Suzuki, Takeshi Okuyama and Mami Tanaka

We aimed to explore the contribution of perceptual and decisional factors to the effects of sounds on the haptic perception of hardness by using a signal detection analysis. Previous research has shown that a contact sound conveys the material of an object, and that a valid determinant of the perceived material is the sound's decay parameter. We investigated whether stimulus congruency between haptic and auditory information modulates the effect of sounds on perceived hardness, using two pairs of silicone pieces that varied in stiffness and two synthesized contact sounds with different decay parameters. Participants were asked to differentiate between the harder and softer silicone samples. When they tapped on the silicone, an auditory stimulus with a long or a short decay time was presented simultaneously, or no sound was presented. The results showed that the sounds modulated discrimination sensitivity (d') and shifted the criterion (c), depending on the difference between the decay parameters of the sounds but not on the stimulus congruency. Our results indicate that sounds affect haptic hardness discrimination at both the perceptual and the decisional level, depending on the relationship between the time at which hardness is sensed and the decay time of the sounds.
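
For reference, the standard equal-variance signal detection computation of d' and c from hit and false-alarm rates is sketched below; the trial counts are made up for illustration.

```python
# Standard equal-variance signal detection measures: sensitivity (d') and
# criterion (c) from hit and false-alarm rates. Counts are hypothetical.
from scipy.stats import norm

def sdt_measures(hits, misses, false_alarms, correct_rejections):
    """d' = z(H) - z(FA); c = -(z(H) + z(FA)) / 2."""
    h = hits / (hits + misses)                               # hit rate
    fa = false_alarms / (false_alarms + correct_rejections)  # false-alarm rate
    d_prime = norm.ppf(h) - norm.ppf(fa)
    criterion = -(norm.ppf(h) + norm.ppf(fa)) / 2
    return d_prime, criterion

# Hypothetical 'harder vs. softer' judgments under a long-decay sound:
d, c = sdt_measures(hits=38, misses=12, false_alarms=18, correct_rejections=32)
print(f"d' = {d:.2f}, c = {c:.2f}")
```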

Haptic spatial constancy for object motion

Lucile Dupin, Vincent Hayward and Mark Wexler

In vision, spatial constancy is the phenomenon that when our eyes or body move, we perceive objects in an external or spatiotopic reference frame, independent of our own movement, even though visual information is initially retinotopic. Spatial constancy seems to require some sort of compensation of retinotopic signals to take the observer's movements into account. In haptic perception, a similar problem can be posed: are objects perceived in a cutaneous reference frame, or are their positions and velocities compensated by information about our body's configuration and movement, resulting in haptic spatial constancy, i.e., perception in a spatiotopic reference frame? Here, we investigated haptic spatial constancy for motion. We used a tactile display in contact with a fingertip, mounted on a mobile platform. While the finger moved along a line, an object moved in various directions with respect to the fingertip. Subjects reported the perceived orientation of the motion, from which we could deduce the degree of compensation for the motion of the finger in space. We found some evidence for compensation, but this compensation is partial at best, implying strong haptic spatial under-constancy.
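
A minimal sketch of the compensation logic described here, assuming a simple linear model in which a fraction k of the finger's motion in space is added back to the skin-relative motion (k = 1 would be full spatial constancy, k = 0 a purely cutaneous percept); the vectors are hypothetical.

```python
# Sketch of a simple compensation model: the perceived motion vector is the
# skin-relative object motion plus a fraction k of the finger's motion in
# space (k = 1 -> full constancy, k = 0 -> cutaneous). Numbers are made up.
import numpy as np

def perceived_direction(obj_motion_on_skin, finger_motion, k):
    """Direction (deg) of the compensated motion vector."""
    v = np.asarray(obj_motion_on_skin) + k * np.asarray(finger_motion)
    return np.degrees(np.arctan2(v[1], v[0]))

finger = (1.0, 0.0)        # finger moves rightward along the line
on_skin = (-1.0, 1.0)      # object motion relative to the fingertip
for k in (0.0, 0.5, 1.0):  # no, partial, and full compensation
    print(f"k={k:.1f}: perceived direction "
          f"{perceived_direction(on_skin, finger, k):.0f} deg")
```

In this model, the "partial at best" compensation reported above corresponds to a fitted k well below 1.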

Grasping a shape with uncertain location

Massimiliano Di Luca, Timothy E. Vivian-Griffiths, Jeremy L. Wyatt and Claudio Zito

Successful grasp planning requires an appropriate finger placement, for which object geometry and location need to be known. Here we investigate how position uncertainty and shape influence the selection of a two-finger pinch grasp. Elliptical cylinders were stereoscopically presented in rapid succession. The position of each cylinder was randomly selected using two orthogonal Gaussian distributions whose orientation changed on each trial. The axes of the elliptical base were aligned with these orthogonal directions. Participants reported the grasp they deemed most likely to be successful. The variance of the Gaussian distributions and the shape of the cylinders varied across randomized trials. Results show an interaction between position uncertainty and shape, resulting in a combination of two strategies: (1) the grasp is aligned with the direction of maximum position uncertainty, and (2) the grasp is aligned with the minor axis of the elliptical base. In conditions where the ellipses are aligned with the maximum uncertainty, there is a trade-off between the two strategies that depends on the magnitude and uniformity of the position variability and on the cylinder's eccentricity. Overall, participants seem to maximize the chance of actually reaching the objects while also trying to pinch them along the minor axis (i.e., as in a stable grasp).
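
Purely as an illustration of the reported trade-off (an assumption, not the study's model), the sketch below scores candidate pinch orientations by a Monte-Carlo reachability term plus a stability bonus for grasping along the ellipse's minor axis; all parameters are hypothetical.

```python
# Illustrative sketch (an assumption, not the study's model): score candidate
# pinch orientations by the probability that the object ends up between the
# fingers plus a bonus for pinching along the ellipse's minor axis.
import numpy as np

rng = np.random.default_rng(0)

def expected_utility(theta, sigma_x, sigma_y, minor_axis_angle,
                     finger_width=0.02, stability_weight=0.5, n=10_000):
    """Reachability (Monte Carlo) + stability bonus for grasp angle theta."""
    x = rng.normal(0.0, sigma_x, n)  # object centre, max-uncertainty axis
    y = rng.normal(0.0, sigma_y, n)  # object centre, min-uncertainty axis
    # Displacement perpendicular to the grasp axis makes the fingers miss;
    # displacement along the axis is absorbed by the finger opening.
    perp = -x * np.sin(theta) + y * np.cos(theta)
    p_reach = np.mean(np.abs(perp) < finger_width / 2)
    stability = np.cos(theta - minor_axis_angle) ** 2
    return p_reach + stability_weight * stability

# Ellipse minor axis perpendicular to the max-uncertainty direction:
thetas = np.linspace(0.0, np.pi, 37)
best = max(thetas, key=lambda t: expected_utility(t, 0.05, 0.01, np.pi / 2))
print(f"chosen grasp orientation: {np.degrees(best):.0f} deg")
```

With these particular numbers the reachability term wins and the grasp aligns with the maximum-uncertainty axis; raising stability_weight (standing in for the cylinder's eccentricity) tips the choice toward the minor axis, qualitatively reproducing the trade-off described above.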