The human ability to follow speech gestures through the visual modality is a core component of speech perception. Remarkably, speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt through manual tactile contact with the speaker’s face. In the present study, cross-modal interactions were investigated by comparing early auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in natural dyadic interactions between a listener and a speaker. Although participants had no prior experience with audio-haptic speech perception, shortened latencies of auditory evoked potentials were observed in both the audio-visual and audio-haptic modalities compared to the auditory-only modality. These results demonstrate early cross-modal interactions during face-to-face and hand-to-face speech perception and highlight a predictive role of visual and haptic information in auditory speech processing during dyadic interactions.
Index Terms: audio-visual speech perception, audio-haptic speech perception, EEG.