The effects of audiovisual (AV) asynchrony on the visual bias of auditory localization and on the McGurk phenomenon were examined within a single experimental situation. On each trial, the face of a talker articulating one of the two trisyllables /ama/ or /ana/, or remaining still, was shown on a screen, and his voice saying one of the two tokens was delivered through a hidden loudspeaker to the left or the right of the screen. The subject pointed to the apparent origin of the voice and repeated the utterance heard. With synchronous presentations or short auditory lags, identification responses were influenced by the nature of the visual input (McGurk effect), and pointing responses were attracted toward the talker's face when it moved, compared with trials on which it did not (visual localization bias). Both effects tended to disappear with larger positive auditory lags or with negative ones. However, the relation to lag depended on the particular token presented for localization responses, but not for identification responses.