Decoding visual objects in early somatosensory cortex

Fraser Smith and Melvyn Goodale

Neurons, even in the earliest sensory areas of cortex, are subject to a great deal of contextual influence from both within-modality and cross-modality connections. In the present work we investigated whether the earliest regions of somatosensory cortex (S1 and S2) contain content-specific information about visual object categories. We reasoned that this might be possible owing to the associative links, formed through experience, between the different sensory aspects of a given object. Participants were presented with visual images of different object categories in a block-design fMRI experiment. Multivariate pattern analysis (MVPA) revealed reliable decoding of visual object category in bilateral S1 (i.e. the post-central gyri) and right S2. In addition, a whole-brain searchlight decoding analysis revealed several areas in the parietal lobe that could mediate the observed context effects between vision and somatosensation. These results demonstrate that even the first cortical stages of somatosensory processing carry information about the category of visually presented objects.

Bimanual integration of curvature

Virjanand Panday, Wouter Bergmann Tiest and Astrid Kappers

When holding a basketball or a volleyball, we can not only see but also feel which one we are holding in our hands, owing to a difference in diameter and curvature. So far, the haptic integration of distance and curvature information between two hands has received little attention. In Experiment 1, distance discrimination thresholds were determined for unimanual and bimanual exploration of flat surfaces. In Experiment 2, curvature was added. In the unimanual condition, subjects were asked to indicate whether the distance between a curved surface and the midsagittal plane was larger or smaller than the corresponding radius of the curved surface. In the bimanual condition, subjects were asked to indicate whether the distance between two curved surfaces was larger or smaller than the corresponding diameter of the curved surfaces. We found no significant difference between unimanual and bimanual distance discrimination thresholds for flat surfaces. In contrast, bimanual exploration of curved surfaces resulted in lower discrimination thresholds than unimanual exploration. We conclude that haptic perception of distance is not integrated in bimanual exploration, whereas haptic perception of curvature is. [This work has been partially supported by the European Commission with the Collaborative Project no. 248587, "THE Hand Embodied", within the FP7-ICT-2009-4-2-1 program "Cognitive Systems and Robotics"]

The effect of compliance on haptic volume perception

Wouter M. Bergmann Tiest, Kassahun Bogale Sirna and Astrid M. L. Kappers

In perception, size judgements may often be influenced by irrelevant object features. This study investigated how object compliance affects the haptic perception of object size (volume). In a two-alternative forced-choice discrimination experiment, eight blindfolded participants were asked to select the larger of two cubes presented on stands. There were two experimental conditions: the cubes were to be either completely enclosed with the hand, or pinch-grasped between thumb and index finger. In each trial, one of the cubes was made out of hard synthetic material, whereas the other was made out of soft foam. Points of subjective equality were derived from psychometric curves. On average, a soft cube of 8.0 cm³ was perceived to be equal in volume to a hard cube of 6.4 or 6.7 cm³ for enclosure or pinch-grasp, respectively. These significant biases indicate that volume perception is influenced by material properties, in this case compliance. The biases for the two conditions were not significantly different from each other, indicating that the method of touch does not play a large role. We hypothesise that hardness as a salient feature causes an overestimation of the object's size. [This work has been supported by the European Commission with the Collaborative Project no. 248587, "THE Hand Embodied", within the FP7-ICT-2009-4-2-1 program "Cognitive Systems and Robotics"]
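How a point of subjective equality (PSE) is read off a psychometric curve can be sketched as follows. The response proportions and the simple grid-search fit below are illustrative assumptions, not the study's measurements or fitting procedure: a cumulative Gaussian is fitted to hypothetical 2AFC data, and the PSE is the comparison volume at which the hard cube is judged larger on 50% of trials.

```python
# Sketch: deriving a PSE by fitting a cumulative Gaussian psychometric
# function to 2AFC data with a least-squares grid search.
# The data below are made up for illustration.
from math import erf

# Hard-cube volumes tested (cm^3) against a fixed 8.0 cm^3 soft cube, and
# the (hypothetical) proportion of trials on which "hard larger" was chosen.
volumes  = [5.0, 5.5, 6.0, 6.5, 7.0, 7.5, 8.0]
p_larger = [0.05, 0.15, 0.35, 0.55, 0.80, 0.92, 0.98]

def cum_gauss(x, pse, sigma):
    """Cumulative Gaussian: P("hard judged larger") as a function of volume."""
    return 0.5 * (1.0 + erf((x - pse) / (sigma * 2 ** 0.5)))

# least-squares grid search over (pse, sigma)
best = min(
    (sum((cum_gauss(v, m, s) - p) ** 2 for v, p in zip(volumes, p_larger)), m, s)
    for m in [5.0 + 0.01 * i for i in range(400)]   # candidate PSEs, 5.0 .. 9.0
    for s in [0.2 + 0.02 * j for j in range(140)]   # candidate slopes, 0.2 .. 3.0
)
_, pse, sigma = best
print(f"PSE = {pse:.2f} cm^3")  # the hard volume perceived equal to the soft cube
```

A PSE below the 8.0 cm³ standard, as in these toy data, is exactly the kind of bias the abstract reports: the soft cube must be physically larger to feel equal.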

The saliency of compliance in a haptic search task

Vonne van Polanen, Wouter M. Bergmann Tiest and Astrid M. L. Kappers

Visual search has proven to be a valid method to investigate feature saliency. Similarly, haptic search reveals efficient perception of haptic properties. In this study, the saliency of hardness and softness was investigated in a search task. In Experiment 1, participants had to grasp a bundle of spheres and determine whether a hard target was present among soft spheres or vice versa. When the difference in compliance between target and distractors was small, a serial strategy was found and reaction times increased with the number of items. With a large difference in compliance, the reaction times did not depend on the number of items and a parallel strategy was found. In Experiment 2, participants had to press their hand on a display filled with hard and soft spheres. In the search for a soft target the slopes of reaction times against the number of items were high, but the locations of target and distractors had a large influence on the search difficulty. With a hard target, the reaction time was independent of the number of items. This showed that weight cues did not cause the finding in Experiment 1 and that both hardness and softness are salient features. [This work was supported by the European Commission with the Collaborative Project no. 248587, "THE Hand Embodied".]
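The serial-versus-parallel distinction rests on the slope of reaction time against set size: a steep slope indicates item-by-item (serial) search, a near-zero slope indicates "pop-out" (parallel) search. The sketch below, with made-up reaction times, shows how such slopes are estimated by ordinary least squares.

```python
# Sketch: search slopes (reaction time vs. set size) for serial vs. parallel
# search, via an ordinary least-squares fit. Reaction times are illustrative.

def slope(set_sizes, rts):
    """OLS slope of reaction time (s) against number of items (s/item)."""
    n = len(set_sizes)
    mx = sum(set_sizes) / n
    my = sum(rts) / n
    num = sum((x - mx) * (y - my) for x, y in zip(set_sizes, rts))
    den = sum((x - mx) ** 2 for x in set_sizes)
    return num / den

set_sizes = [3, 5, 7, 9]
rt_small_diff = [1.2, 1.8, 2.4, 3.0]   # small compliance difference
rt_large_diff = [0.9, 0.95, 0.9, 1.0]  # large compliance difference

print(slope(set_sizes, rt_small_diff))  # ~0.30 s/item: serial search
print(slope(set_sizes, rt_large_diff))  # near 0 s/item: parallel "pop-out"
```

In the visual-search literature, slopes near zero are taken as the signature of a salient, preattentively available feature, which is the interpretation the abstract applies to compliance.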

Using two hands is better than one

Myrthe A. Plaisier and Marc O. Ernst

From multisensory studies it is known that redundant information, for instance between the haptic and visual modalities, is integrated such that the combined percept becomes more precise. We often handle and explore objects with both hands. Is information across the two hands integrated as well? Note that we usually do not touch an object at the same location with both hands, and it is also quite common to touch different objects with the two hands simultaneously. In such a case perceptual integration would not be beneficial, because the two estimates do not necessarily have a shared source. One might therefore speculate that bimanual haptic information is never integrated. Here, however, we show that bimanual stiffness information is integrated such that the bimanual percept is more precise than the unimanual percepts. Furthermore, to our surprise, integration did not break down when visual information indicated that the two hands were touching separate objects. Clearly, visual information alone was not enough to break the assumption that both inputs had a shared source. These results show that bimanual information is combined according to maximum likelihood estimation. Consequently, there is a clear benefit, in terms of the precision of the percept, to exploring an object with both hands.
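The maximum-likelihood prediction can be stated compactly: each hand's estimate is weighted by its inverse variance, and the variance of the combined estimate is lower than either unimanual variance. A minimal sketch with illustrative numbers (not the study's data):

```python
# Sketch of maximum-likelihood (inverse-variance-weighted) cue combination.
# The combined estimate is a weighted average of the two hands' estimates,
# and its variance is smaller than either single-hand variance.

def mle_combine(est1, var1, est2, var2):
    """Combine two independent estimates by inverse-variance weighting."""
    w1 = (1 / var1) / (1 / var1 + 1 / var2)
    w2 = 1 - w1
    combined = w1 * est1 + w2 * est2
    combined_var = (var1 * var2) / (var1 + var2)
    return combined, combined_var

# e.g. stiffness estimates from the left and right hand (arbitrary units)
est, var = mle_combine(10.0, 4.0, 12.0, 4.0)
print(est, var)  # 11.0 2.0 -- combined variance is half of either hand's
```

With equal unimanual variances the predicted bimanual variance is exactly half the unimanual one, i.e. discrimination thresholds should drop by a factor of √2; this is the quantitative benchmark against which integration is usually tested.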

Plastic reorganization to form dissociated networks for perceptual encoding and memory recall in congenital blindness

Lora Likova

Can short-term training produce rapid reorganization of cognitive networks in the adult brain? This question was addressed by means of our novel Cognitive-Kinesthetic training method in congenital blindness. Functional MRI was run before and after one week of training in a drawing paradigm. The specific tasks were: tactile exploration/memory encoding of complex raised-line images, drawing the images based solely on tactile memory retrieval, and scribbling, each of 20 s duration, separated by 20 s rest intervals. After training, dissociated networks of temporal-lobe regions emerged that were activated exclusively either (a) during perceptual encoding or (b) during memory retrieval. Pre/post-training analysis revealed that the memory retrieval network had undergone dramatic plastic reorganization relative to its undifferentiated state before the training. The hippocampus, which was strongly deactivated during the memory retrieval task before training, reversed its sign to become strongly activated after training. Interestingly, this reversal was correlated with the emergence of the memory retrieval network extending through the inferotemporal cortex. These findings provide novel insights into the evolution of rapid experience-based functional segregation as a consequence of active spatiomotor learning involving complex memory encoding and retrieval.