Multisensory processing

Early interaction between vision and touch during binocular rivalry: characteristics and constraints

Claudia Lunghi, David Burr, David Alais and Concetta Morrone
Growing evidence shows that cross-modal signals can interact at the earliest stages of sensory processing. Haptic signals have been shown to influence the dynamics of binocular rivalry, a form of perceptual bistability (Lunghi et al., 2010, Current Biology, 20(4), R143-144). We investigated the role of spatial proximity, type of tactile stimulation (active or passive) and cross-modal attention in promoting fusion between vision and touch during binocular rivalry of orthogonally oriented gratings (±45°, size 2.5°, SF 2 c/cm). By varying the tactile orientation (±7.5°, ±15°, ±30°), we further investigated the orientation selectivity of the interaction. We found that both active exploration and passive tactile stimulation boosted the visual stimulus congruent with the tactile one (an engraved grating), both prolonging its conscious perception and restoring it from binocular rivalry suppression. The interaction was strictly tuned, requiring the visual and tactile stimuli to match in spatial frequency (to within less than one octave, i.e. a factor of two) and in orientation. We also found that voluntary and stimulus-driven attention played a minor role in mediating the interaction, while spatial proximity between the visual and tactile stimuli was necessary for fusion of the cross-sensory signals, suggesting that the visual and somatosensory spatial maps are aligned. Taken together, our results point to a very early multisensory interaction site, possibly V1.

Not glass but plastic: audiovisual integration in human material perception

Shin'ya Nishida, Waka Fujisaki, Naokazu Goda, Isamu Motoyoshi and Hidehiko Komatsu
Vision provides useful information about the material of objects. We are able to judge from visual appearance alone whether an object is likely to be made of metal, glass, ceramic or wood. Likewise, we are able to judge an object's material from auditory information alone, such as the impact sound made by striking the object (e.g., Giordano & McAdams, 2006, The Journal of the Acoustical Society of America, 119, 1171-1181). What kind of material, then, do we perceive when the visual appearance of one material is combined with the impact sound of another? We tested this question for a variety of materials and found a strong audiovisual interaction in material perception. For example, a visual glass appearance paired with a paprika sound was perceived as transparent plastic, and a visual bark appearance paired with a metal sound was perceived as coated ceramic. Rating data suggest that observers perceive such materials as being consistent with both the optical properties (e.g., transparency, roughness, texture) given by vision and the internal properties (e.g., hardness, heaviness, emptiness) suggested by audition. Material perception thus integrates complementary information from different sensory modules. (Supported by a Grant-in-Aid for Scientific Research on Innovative Areas (No. 22135007) from the Ministry of Education, Science, Sports and Culture, Japan.)

Content cues can constrain AV temporal recalibration regardless of spatial overlap

Warrick Roseboom, Takahiro Kawabe and Shin'ya Nishida
Several studies have shown that the point of perceptual synchrony for audio (A) and visual (V) events can be shifted by exposure to asynchronous AV relationships. Recently, it has been demonstrated that it is possible to concurrently maintain two different, and opposing, estimates of AV temporal synchrony (Roseboom & Arnold, 2011, Psychological Science). However, some uncertainty remains over precisely what defines a given AV pair such that it can maintain a temporal relationship distinct from that of other pairs. Another recent study (Heron et al., 2012, Experimental Brain Research) suggested that, at least for ecologically arbitrary AV pairs, spatial separation was necessary to achieve multiple distinct AV synchrony estimates. Here we investigated whether this is necessarily true, or whether the difference between the AV pairs in that study was simply insufficient to promote independent grouping of each pair. Using a similar paradigm, we examined whether it is possible to obtain distinct temporal recalibrations for two oriented Gabor patches arbitrarily matched with high- or low-frequency pure tones. We found concurrent, and opposite, recalibrations despite the two stimuli being presented in the same spatial location. This result indicates that the content of an AV pair can be used to derive distinct AV synchrony estimates regardless of spatial overlap.

Visual search driven by audiovisual synchrony shows a right visual field bias

David Alais, Erik van der Burg, John Cass and Jan Theeuwes
Visual search for a modulating target in a modulating array is much easier when the target is synchronized with an auditory transient. Here we show an asymmetry in synchrony-driven search efficiency across the visual field. Participants viewed a ring of 19 luminance-modulating discs while hearing a modulating tone. The modulating discs had unique temporal phases (–380 to +380 ms; 40 ms steps), with one synchronized to the tone. Participants performed a speeded visual search for the synchronized disc, with the auditory and visual modulations either both sinusoidal or both square-wave at 1.3 Hz. Target position was randomized and spatial distributions of search efficiency were compiled. Sine modulations did not facilitate search (chance performance at all target phases), but square-wave modulations did: the target (phase = 0 ms) was frequently chosen, with tight error distributions (~120 ms wide) centred on zero phase lag. Search efficiency also varied over the visual field: error distributions were more tightly tuned temporally on the right side, especially in the upper-right quadrant. These results show that synchrony-driven visual search: (i) requires synchronized transient signals, (ii) has a narrow integration window (±60 ms), and (iii) is spatially biased to the right visual field, suggesting a hemispheric specialization for synchrony-driven visual search.

Small rewards modulate the latency of stimulus-driven eye movements

Stephen Dunne, Daniel Smith and Amanda Ellison
Extensive research has demonstrated how rewards can influence the programming and execution of saccades in non-human primates. However, little is known about the effects of reward on eye movements in healthy human participants. Here, we investigated the effects of instrumental conditioning of eye movements in human participants: specifically, participants were given a small financial reward for making one particular eye movement. Consistent with the primate data, participants exhibited faster saccadic reaction times to the rewarded hemifield, and this effect persisted after reward was removed. The pattern of participants' errors also indicated a sustained biasing of the oculomotor system even after reward was removed. A second experiment, using the same paradigm but pairing an auditory tone with reward, found that saccades to the rewarded hemifield were significantly slower than those to the unrewarded hemifield, in contrast to the findings of the first experiment. Small monetary rewards were thus able to induce a sustained bias in the oculomotor system. This result may have real-world applications for patients with visual biases against exploring the full visual field.

Multi-sensory integration of audio-visual temporal signals in children with cochlear implants

Monica Gori, Anna Chilosi, Giulio Sandini and David Burr
Animal studies (e.g., Stein, 1993) suggest that multi-sensory stimulation is necessary for the normal development of multi-sensory neural mechanisms. We investigated how abnormal auditory input (severe hearing loss) during early infancy affects human multisensory integration of temporal information. We measured temporal bisection thresholds for auditory, visual and conflicting audio-visual stimuli in 57 typical children and adults, and in 13 children and adults fitted with cochlear implants at various ages. Subjects were required to judge whether the second pulse of a three-pulse sequence was closer in time to the first or the third pulse (the first and third pulses were separated by 1 s). In the audio-visual condition, the visual and auditory stimuli were either simultaneous or in conflict. The results show that typically developing adults and children rely primarily on audition for the audio-visual temporal bisection task. However, children with implants gave more weight to visual information. Interestingly, in this group auditory dominance was restored after about 12 years of auditory exposure, suggesting that cross-sensory calibration takes some time, but can also occur relatively late in life.